NASA Astrophysics Data System (ADS)
Shin, Yung C.; Bailey, Neil; Katinas, Christopher; Tan, Wenda
2018-05-01
This paper presents an overview of vertically integrated comprehensive predictive modeling capabilities for directed energy deposition processes, which have been developed at Purdue University. The overall predictive framework consists of several vertically integrated modules, including a powder flow model, a molten pool model, a microstructure prediction model and a residual stress model, which can be used for predicting the mechanical properties of parts additively manufactured by directed energy deposition with blown powder, as well as by other additive manufacturing processes. The critical governing equations of each model and the way the various modules are connected are illustrated. Illustrative results, along with corresponding experimental validation, are presented to demonstrate the capabilities and fidelity of the models. The good correlation with experimental results shows that the integrated models can be used to design metal additive manufacturing processes and predict the resultant microstructure and mechanical properties.
Integrating in silico models to enhance predictivity for developmental toxicity.
Marzo, Marco; Kulkarni, Sunil; Manganaro, Alberto; Roncaglioni, Alessandra; Wu, Shengde; Barton-Maclaren, Tara S; Lester, Cathy; Benfenati, Emilio
2016-08-31
Application of in silico models to predict developmental toxicity has demonstrated limited success, particularly when they are employed as a single source of information. It is acknowledged that modelling the complex outcomes related to this endpoint is a challenge; however, such models have been developed and reported in the literature. The current study explored the possibility of integrating selected public domain models (CAESAR, SARpy and the P&G model) with selected commercial modelling suites (Multicase, Leadscope and Derek Nexus) to assess whether there is an increase in overall predictive performance. The results varied according to the data sets used for assessment, but performance improved upon model integration relative to the individual models. Moreover, because the different models are based on different specific developmental toxicity effects, integrating them increased the applicable chemical and biological spaces. It is suggested that this approach reduces the uncertainty associated with in silico predictions by achieving a consensus among a battery of models. The use of tools to assess the applicability domain also improves the interpretation of the predictions. This has been verified in the case of the VEGA software, which makes QSAR models freely available together with a measurement of the applicability domain.
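As a rough illustration of the consensus idea described above, here is a minimal sketch (with an invented data layout, not the VEGA or Derek Nexus APIs) of a majority vote over a battery of models, where predictions falling outside a model's applicability domain are discarded before voting:

```python
# Hypothetical sketch of a consensus ("battery") prediction scheme: each model
# votes "toxic" / "non-toxic"; predictions outside a model's applicability
# domain are discarded, and the majority of the remaining votes wins.
def consensus(predictions):
    """predictions: list of (label, in_domain) tuples, one per model."""
    votes = [label for label, in_domain in predictions if in_domain]
    if not votes:
        return None  # no model is applicable: abstain rather than guess
    toxic = votes.count("toxic")
    # Ties default to "non-toxic" in this toy version.
    return "toxic" if toxic * 2 > len(votes) else "non-toxic"
```

Abstaining when no model is in-domain mirrors the paper's point that applicability-domain tools improve the interpretation of predictions.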
Integrating System Dynamics and Bayesian Networks with Application to Counter-IED Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, Kenneth D.; Brothers, Alan J.; Whitney, Paul D.
2010-06-06
The practice of choosing a single modeling paradigm for predictive analysis can limit the scope and relevance of predictions and their utility to decision-making processes. Considering multiple modeling methods simultaneously may improve this situation, but a better solution provides a framework for directly integrating different, potentially complementary modeling paradigms to enable more comprehensive modeling and predictions, and thus better-informed decisions. The primary challenges of this kind of model integration are to bridge language and conceptual gaps between modeling paradigms, and to determine whether natural and useful linkages can be made in a formal mathematical manner. To address these challenges in the context of two specific modeling paradigms, we explore mathematical and computational options for linking System Dynamics (SD) and Bayesian network (BN) models and incorporating data into the integrated models. We demonstrate that integrated SD/BN models can naturally be described as either state space equations or Dynamic Bayes Nets, which enables the use of many existing computational methods for simulation and data integration. To demonstrate, we apply our model integration approach to techno-social models of insurgent-led attacks and security force counter-measures centered on improvised explosive devices.
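One way to picture the state-space reading of an integrated SD/BN model is a single stock-update step with Gaussian process noise, i.e. one transition of a linear-Gaussian Dynamic Bayes Net. The stock equation and parameters below are illustrative assumptions, not taken from the paper:

```python
import random

# Illustrative sketch: a System Dynamics stock equation,
#   x_{t+1} = x_t + dt * (inflow - k * x_t) + process noise,
# read as one transition of a linear-Gaussian Dynamic Bayes Net. With
# sigma = 0 it reduces to the deterministic SD update.
def sd_step(x, inflow, k, dt=1.0, sigma=0.1, rng=None):
    rng = rng or random.Random(0)  # fixed seed keeps the sketch reproducible
    return x + dt * (inflow - k * x) + rng.gauss(0.0, sigma)
```

Stacking many such noisy transitions is what lets standard simulation and filtering machinery for state-space models be reused on the integrated model.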
Quinn, Francis; Johnston, Marie; Johnston, Derek W
2013-01-01
Previous research has supported an integrated biomedical and behavioural model explaining activity limitations. However, further tests of this model are required at the within-person level: while it proposes that the constructs are related within individuals, it has primarily been tested between individuals in large group studies. We aimed to test the integrated model at the within-person level. Six correlational N-of-1 studies were carried out in participants with arthritis, chronic pain and walking limitations. Daily measures of the theoretical constructs were collected using a hand-held computer (PDA), activity was assessed by self-report and accelerometer, and the data were analysed using time-series analysis. The biomedical model was not supported, as pain impairment did not predict activity, so the integrated model was only partially supported. Impairment predicted intention to move around, while perceived behavioural control (PBC) and intention predicted activity. PBC did not predict activity limitation in the expected direction. The integrated model of disability was partially supported within individuals, especially its behavioural elements. However, the results suggest that different elements of the model may drive activity (limitations) for different individuals. The integrated model provides a useful framework for understanding disability and suggests interventions, and the utility of N-of-1 methodology for testing theory is illustrated.
Koshkina, Vira; Wang, Yang; Gordon, Ascelin; Dorazio, Robert; White, Matthew; Stone, Lewi
2017-01-01
Two main sources of data for species distribution models (SDMs) are site-occupancy (SO) data from planned surveys, and presence-background (PB) data from opportunistic surveys and other sources. SO surveys give high quality data about presences and absences of the species in a particular area. However, due to their high cost, they often cover a smaller area relative to PB data, and are usually not representative of the geographic range of a species. In contrast, PB data is plentiful, covers a larger area, but is less reliable due to the lack of information on species absences, and is usually characterised by biased sampling. Here we present a new approach for species distribution modelling that integrates these two data types. We have used an inhomogeneous Poisson point process as the basis for constructing an integrated SDM that fits both PB and SO data simultaneously. It is the first implementation of an Integrated SO–PB Model which uses repeated survey occupancy data and also incorporates detection probability. The Integrated Model's performance was evaluated using simulated data and compared to approaches using PB or SO data alone. It was found to be superior, improving the predictions of species spatial distributions, even when SO data is sparse and collected in a limited area. The Integrated Model was also found effective when environmental covariates were significantly correlated. Our method was demonstrated with real SO and PB data for the Yellow-bellied glider (Petaurus australis) in south-eastern Australia, with the predictive performance of the Integrated Model again found to be superior. PB models are known to produce biased estimates of species occupancy or abundance. The small sample size of SO datasets often results in poor out-of-sample predictions. Integrated models combine data from these two sources, providing superior predictions of species abundance compared to using either data source alone.
Unlike conventional SDMs which have restrictive scale-dependence in their predictions, our Integrated Model is based on a point process model and has no such scale-dependency. It may be used for predictions of abundance at any spatial-scale while still maintaining the underlying relationship between abundance and area.
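The shared-intensity idea can be sketched as follows. This is a hedged toy version, assuming a log-linear intensity with a single covariate, a grid approximation of the point-process integral, and a made-up site-occupancy data layout; the authors' actual likelihood also handles repeated surveys and detection probability:

```python
import math

# Toy joint likelihood: an intensity surface lambda(s) = exp(a + b * x(s)) is
# shared by both data types. PB points contribute an inhomogeneous Poisson
# point-process term; each surveyed SO site of area A contributes a
# presence/absence term with P(occupied) = 1 - exp(-lambda * A).
def joint_nll(a, b, pb_x, grid_x, cell_area, so_sites):
    lam = lambda x: math.exp(a + b * x)
    # Point-process term: -sum log lambda(points) + integral of lambda
    # (approximated by summing over grid cells of equal area).
    nll = -sum(math.log(lam(x)) for x in pb_x)
    nll += sum(lam(x) * cell_area for x in grid_x)
    # Site-occupancy term: so_sites holds (covariate, detected, area) triples.
    for x, detected, area in so_sites:
        p = 1.0 - math.exp(-lam(x) * area)
        nll -= math.log(p if detected else 1.0 - p)
    return nll
```

Minimising this joint objective over (a, b) is what ties the two data sources to one underlying intensity, which is also why the resulting model has no scale-dependency: abundance over any region is the integral of the same lambda.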
North Atlantic climate model bias influence on multiyear predictability
NASA Astrophysics Data System (ADS)
Wu, Y.; Park, T.; Park, W.; Latif, M.
2018-01-01
The influences of North Atlantic biases on multiyear predictability of unforced surface air temperature (SAT) variability are examined in the Kiel Climate Model (KCM). By employing a freshwater flux correction over the North Atlantic to the model, which strongly alleviates both North Atlantic sea surface salinity (SSS) and sea surface temperature (SST) biases, the freshwater flux-corrected integration depicts significantly enhanced multiyear SAT predictability in the North Atlantic sector in comparison to the uncorrected one. The enhanced SAT predictability in the corrected integration is due to a stronger and more variable Atlantic Meridional Overturning Circulation (AMOC) and its enhanced influence on North Atlantic SST. Results obtained from preindustrial control integrations of models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) support the findings obtained from the KCM: models with large North Atlantic biases tend to have a weak AMOC influence on SAT and exhibit a smaller SAT predictability over the North Atlantic sector.
Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim
2016-01-01
Objective: Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods: Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results: When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion: Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions: We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458
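The contrast between the two strategies can be sketched in a few lines. The feature vectors and the weighted-average combiner below are placeholders for the paper's trained per-source models and meta-learner:

```python
# Early integration: features from all sources are concatenated into one
# vector and fed to a single model.
def early_features(structured, text):
    return structured + text  # one feature vector, one downstream model

# Late integration: each source gets its own model; their output
# probabilities are combined by a meta-learner (here a weighted average
# stands in for the trained combiner).
def late_predict(per_source_probs, weights):
    s = sum(w * p for w, p in zip(weights, per_source_probs))
    return s / sum(weights)
```

In the late scheme the meta-learner can learn, per code and specialty, how much to trust each source, which is consistent with the paper's finding that late integration is the more reliable of the two.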
Integrated Wind Power Planning Tool
NASA Astrophysics Data System (ADS)
Rosgaard, Martin; Giebel, Gregor; Skov Nielsen, Torben; Hahmann, Andrea; Sørensen, Poul; Madsen, Henrik
2013-04-01
This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the title "Integrated Wind Power Planning Tool". The goal is to integrate a mesoscale numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. Using the state-of-the-art Weather Research & Forecasting (WRF) mesoscale NWP model, the forecast error is quantified as a function of the time scale involved. This task constitutes a preparative study for the later implementation of features accounting for NWP forecast errors in the DTU Wind Energy maintained Corwind code, a long term wind power planning tool. Within the framework of PSO 10464, research related to operational short term wind power prediction will be carried out, including a comparison of forecast quality at different mesoscale NWP model resolutions and the development of a statistical wind power prediction tool taking input from WRF. The short term prediction part of the project is carried out in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. The integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies, and depending on the prevailing weather conditions defined by the WRF output. The output from the integrated short term prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision making problem to be solved.
The high resolution of the WRF results loaded into the integrated prediction model will ensure a high accuracy data basis is available for use in the decision making process of the Danish transmission system operator. The need for high accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2025 from the current 20%.
Satellite Remote Sensing is Key to Water Cycle Integrator
NASA Astrophysics Data System (ADS)
Koike, T.
2016-12-01
To promote effective multi-sectoral, interdisciplinary collaboration based on coordinated and integrated efforts, the Global Earth Observation System of Systems (GEOSS) is now developing a "GEOSS Water Cycle Integrator (WCI)", which integrates "Earth observations", "modeling", "data and information", "management systems" and "education systems". GEOSS/WCI sets up "work benches" by which partners can share data, information and applications in an interoperable way, exchange knowledge and experiences, deepen mutual understanding and work together effectively to respond to issues of both mitigation and adaptation. (A work bench is a virtual geographical or phenomenological space where experts and managers collaborate to use information to address a problem within that space.) GEOSS/WCI enhances the coordination of efforts to strengthen individual, institutional and infrastructure capacities, especially for effective interdisciplinary coordination and integration. GEOSS/WCI archives various satellite data to provide hydrological information such as cloud, rainfall, soil moisture, or land-surface snow. These satellite products were validated using in-situ land observation data. Water cycle models can be developed by coupling in-situ and satellite data. River flows and other hydrological parameters can be simulated and validated against in-situ data. Model outputs from weather-prediction, seasonal-prediction, and climate-prediction models are archived. Some of these model outputs are archived online, while others, e.g., those of climate-prediction models, are archived offline. After the models are evaluated and their biases corrected, the outputs can be used as inputs to the hydrological models for predicting hydrological parameters. Additionally, we have already developed a data-assimilation system that combines satellite data with the models; this system improves our capability to predict hydrological phenomena.
The WCI can provide better predictions of the hydrological parameters for integrated water resources management (IWRM) and also assess the impact of climate change and calculate adaptation needs.
Roche, Daniel B; Buenavista, Maria T; Tetchner, Stuart J; McGuffin, Liam J
2011-07-01
The IntFOLD server is a novel independent server that integrates several cutting edge methods for the prediction of structure and function from sequence. Our guiding principles behind the server development were as follows: (i) to provide a simple unified resource that makes our prediction software accessible to all and (ii) to produce integrated output for predictions that can be easily interpreted. The output for predictions is presented as a simple table that summarizes all results graphically via plots and annotated 3D models. The raw machine readable data files for each set of predictions are also provided for developers, which comply with the Critical Assessment of Methods for Protein Structure Prediction (CASP) data standards. The server comprises an integrated suite of five novel methods: nFOLD4, for tertiary structure prediction; ModFOLD 3.0, for model quality assessment; DISOclust 2.0, for disorder prediction; DomFOLD 2.0 for domain prediction; and FunFOLD 1.0, for ligand binding site prediction. Predictions from the IntFOLD server were found to be competitive in several categories in the recent CASP9 experiment. The IntFOLD server is available at the following web site: http://www.reading.ac.uk/bioinf/IntFOLD/.
ERIC Educational Resources Information Center
Lee, Chien-Sing
2007-01-01
Models represent a set of generic patterns to test hypotheses. This paper presents the CogMoLab student model in the context of an integrated learning environment. Three aspects are discussed: diagnostic and predictive modeling with respect to the issues of credit assignment and scalability and compositional modeling of the student profile in the…
Shi, Xiaohu; Zhang, Jingfen; He, Zhiquan; Shang, Yi; Xu, Dong
2011-09-01
One of the major challenges in protein tertiary structure prediction is structure quality assessment. In many cases, protein structure prediction tools generate good structural models, but fail to select the best models from a huge number of candidates as the final output. In this study, we developed a sampling-based machine-learning method to rank protein structural models by integrating multiple scores and features. First, features such as predicted secondary structure, solvent accessibility and residue-residue contact information are integrated by two Radial Basis Function (RBF) models trained on different datasets. Then, the two RBF scores and five selected scoring functions developed by others, i.e., Opus-CA, Opus-PSP, DFIRE, RAPDF, and Cheng Score, are synthesized by a sampling method. Finally, another integrated RBF model ranks the structural models according to the features of the sampling distribution. We tested the proposed method on two different datasets: the CASP server prediction models for all CASP8 targets, and a set of models generated by our in-house software MUFOLD. The test results show that our method outperforms any individual scoring function in both best-model selection and overall correlation between the predicted and actual rankings of structural quality.
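The RBF combination step might look roughly like the sketch below, with invented centres and weights standing in for the trained models:

```python
import math

# Toy sketch of RBF-based score integration: each candidate structural
# model's feature vector is scored against trained centres with Gaussian
# radial basis functions, and candidates are ranked by the summed response.
def rbf_score(features, centres, weights, gamma=1.0):
    score = 0.0
    for c, w in zip(centres, weights):
        d2 = sum((f - ci) ** 2 for f, ci in zip(features, c))
        score += w * math.exp(-gamma * d2)
    return score

def rank_models(models, centres, weights):
    # models: list of dicts with a "features" vector; best-scoring first.
    return sorted(models,
                  key=lambda m: rbf_score(m["features"], centres, weights),
                  reverse=True)
```

In the actual method the centres and weights are fitted to training data, and the RBF outputs are further combined with the five external scoring functions via sampling.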
The 3-D CFD modeling of gas turbine combustor-integral bleed flow interaction
NASA Technical Reports Server (NTRS)
Chen, D. Y.; Reynolds, R. S.
1993-01-01
An advanced 3-D Computational Fluid Dynamics (CFD) model was developed to analyze the flow interaction between a gas turbine combustor and an integral bleed plenum. In this model, the elliptic governing equations of continuity, momentum and the k-e turbulence model were solved on a boundary-fitted, curvilinear, orthogonal grid system. The model was first validated against test data from public literature and then applied to a gas turbine combustor with integral bleed. The model predictions agreed well with data from combustor rig testing. The model predictions also indicated strong flow interaction between the combustor and the integral bleed. Integral bleed flow distribution was found to have a great effect on the pressure distribution around the gas turbine combustor.
On the effects of alternative optima in context-specific metabolic model predictions
Robaina-Estévez, Semidán; Nikoloski, Zoran
2017-01-01
The integration of experimental data into genome-scale metabolic models can greatly improve flux predictions. This is achieved by restricting predictions to a more realistic context-specific domain, like a particular cell or tissue type. Several computational approaches to integrate data have been proposed—generally obtaining context-specific (sub)models or flux distributions. However, these approaches may lead to a multitude of equally valid but potentially different models or flux distributions, due to possible alternative optima in the underlying optimization problems. Although this issue introduces ambiguity in context-specific predictions, it has not been generally recognized, especially in the case of model reconstructions. In this study, we analyze the impact of alternative optima in four state-of-the-art context-specific data integration approaches, providing both flux distributions and/or metabolic models. To this end, we present three computational methods and apply them to two particular case studies: leaf-specific predictions from the integration of gene expression data in a metabolic model of Arabidopsis thaliana, and liver-specific reconstructions derived from a human model with various experimental data sources. The application of these methods allows us to obtain the following results: (i) we sample the space of alternative flux distributions in the leaf- and the liver-specific case and quantify the ambiguity of the predictions. In addition, we show how the inclusion of ℓ1-regularization during data integration reduces the ambiguity in both cases. (ii) We generate sets of alternative leaf- and liver-specific models that are optimal to each one of the evaluated model reconstruction approaches. We demonstrate that alternative models of the same context contain a marked fraction of disparate reactions. 
Further, we show that a careful balance between model sparsity and metabolic functionality helps in reducing the discrepancies between alternative models. Finally, our findings indicate that alternative optima must be taken into account for rendering the context-specific metabolic model predictions less ambiguous. PMID:28557990
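The effect of the ℓ1 term can be illustrated in one line: among alternative optima that achieve the same objective value, it prefers the flux distribution with the smallest total absolute flux. The snippet below is a toy selection over an explicit candidate list, not an actual FBA solver:

```python
# Conceptual sketch (toy flux vectors, hypothetical units) of why an l1
# penalty disambiguates: given several flux distributions that are all
# optimal for the original objective, pick the sparsest one, i.e. the one
# minimising sum |v_i| over all reactions.
def pick_sparse_optimum(alternative_optima):
    return min(alternative_optima, key=lambda v: sum(abs(x) for x in v))
```

In a real implementation the ℓ1 term is added to the optimisation problem itself (e.g. as a secondary LP objective), but the selection principle is the same.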
Integrated Wind Power Planning Tool
NASA Astrophysics Data System (ADS)
Rosgaard, M. H.; Giebel, G.; Nielsen, T. S.; Hahmann, A.; Sørensen, P.; Madsen, H.
2012-04-01
This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with purely statistical tools in order to assess wind power fluctuations, with focus on long term power system planning for future wind farms as well as short term forecasting for existing wind farms. Currently, wind power fluctuation models are either purely statistical or integrated with NWP models of limited resolution. With regard to the latter, one such simulation tool has been developed at the Wind Energy Division, Risø DTU, intended for long term power system planning. As part of the PSO project, the inferior NWP model used at present will be replaced by the state-of-the-art Weather Research & Forecasting (WRF) model. Furthermore, the integrated simulation tool will be improved so that it can simultaneously handle 10-50 times more turbines than the present ~300, and additional atmospheric parameters will be included in the model. The WRF data will also be input for a statistical short term prediction model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting for its spatio-temporal dependencies, and depending on the prevailing weather conditions defined by the WRF output. The output from the integrated prediction tool constitutes scenario forecasts for the coming period, which can then be fed into any type of system model or decision making problem to be solved.
The high resolution of the WRF results loaded into the integrated prediction model will ensure a high accuracy data basis is available for use in the decision making process of the Danish transmission system operator, and the need for high accuracy predictions will only increase over the next decade as Denmark approaches the goal of 50% wind power based electricity in 2020, from the current 20%.
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2015-01-01
Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM), based on the massive integration of 14 diverse complementary quality assessment methods, that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of the Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods in identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. PMID:26369671
Xue, Fangzheng; Li, Qian; Li, Xiumin
2017-01-01
Recently, the echo state network (ESN) has attracted a great deal of attention due to its high accuracy and efficient learning performance. Compared with the traditional random structure and classical sigmoid units, simple circle topology and leaky integrator neurons have advantages for the reservoir computing of an ESN. In this paper, we propose a new ESN model with both a circle reservoir structure and leaky integrator units. By comparing the prediction capability on the Mackey-Glass chaotic time series of four ESN models (classical ESN, circle ESN, traditional leaky integrator ESN, and circle leaky integrator ESN), we find that our circle leaky integrator ESN shows significantly better performance than the other ESNs, with a roughly two-order-of-magnitude reduction in predictive error. Moreover, this model has a stronger ability to approximate nonlinear dynamics and resist noise than the conventional ESN and ESNs with only a simple circle structure or leaky integrator neurons. Our results show that the combination of circle topology and leaky integrator neurons can remarkably increase dynamical diversity while decreasing the correlation of reservoir states, which contributes to the significant improvement in the computational performance of the echo state network on time series prediction.
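A minimal sketch of the reservoir update for a circle-topology, leaky-integrator ESN follows, with hypothetical sizes and gains (the paper's exact formulation may differ):

```python
import math

# Reservoir update sketch: in a circle topology each reservoir unit receives
# recurrent input only from its ring neighbour, and the leaky integrator
# low-pass filters the state with leak rate a:
#   x[i] <- (1 - a) * x[i] + a * tanh(r * x[i - 1] + w_in[i] * u)
# (Python's x[-1] gives the last element, which closes the ring for i = 0.)
def circle_leaky_step(x, u, w_in, r=0.9, a=0.3):
    return [(1 - a) * x[i] + a * math.tanh(r * x[i - 1] + w_in[i] * u)
            for i in range(len(x))]
```

A readout trained on such states (e.g. by ridge regression) completes the ESN; only the readout weights are learned, the ring weights stay fixed.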
ERIC Educational Resources Information Center
Fürst, Guillaume; Ghisletta, Paolo; Lubart, Todd
2016-01-01
The present work proposes an integrative model of creativity that includes personality traits and cognitive processes. This model hypothesizes that three high-order personality factors predict two main process factors, which in turn predict intensity and achievement of creative activities. The personality factors are: "Plasticity" (high…
Huang, Yanqi; He, Lan; Dong, Di; Yang, Caiyun; Liang, Cuishan; Chen, Xin; Ma, Zelan; Huang, Xiaomei; Yao, Su; Liang, Changhong; Tian, Jie; Liu, Zaiyi
2018-02-01
To develop and validate a radiomics prediction model for individualized prediction of perineural invasion (PNI) in colorectal cancer (CRC). After computed tomography (CT) radiomics feature extraction, a radiomics signature was constructed in a derivation cohort of 346 CRC patients. A prediction model was developed to integrate the radiomics signature and clinical candidate predictors [age, sex, tumor location, and carcinoembryonic antigen (CEA) level]. Apparent prediction performance was assessed. After internal validation, independent temporal validation (separate from the cohort used to build the model) was conducted in 217 CRC patients. The final model was converted to an easy-to-use nomogram. The developed radiomics nomogram, which integrated the radiomics signature and CEA level, showed good calibration and discrimination performance [Harrell's concordance index (c-index): 0.817; 95% confidence interval (95% CI): 0.811-0.823]. Application of the nomogram in the validation cohort gave comparable calibration and discrimination (c-index: 0.803; 95% CI: 0.794-0.812). Integrating the radiomics signature and CEA level into a radiomics prediction model enables easy and effective risk assessment of PNI in CRC. This stratification of patients according to their PNI status may provide a basis for individualized auxiliary treatment.
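The discrimination metric quoted above, Harrell's concordance index, reduces in the uncensored binary-outcome case to the fraction of correctly ordered (event, non-event) pairs. The sketch below illustrates that reduced form with invented scores and labels; survival analyses with censoring need the full pairwise definition from a tested package.

```python
import numpy as np

def c_index(risk, event):
    """Concordance index for a binary outcome with no censoring.

    Counts, over all (event, non-event) pairs, how often the event case
    gets the higher risk score; ties count 0.5.
    """
    risk, event = np.asarray(risk, float), np.asarray(event, bool)
    pos, neg = risk[event], risk[~event]
    diff = pos[:, None] - neg[None, :]
    return float((np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size)

# Invented risk scores and outcomes, purely for illustration.
score = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
label = [1,   1,   0,   1,   0,   0]
cx = c_index(score, label)   # 8 of the 9 pairs are ordered correctly
```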
Sweet, Shane N.; Fortier, Michelle S.; Strachan, Shaelyn M.; Blanchard, Chris M.; Boulay, Pierre
2014-01-01
Self-determination theory and self-efficacy theory are prominent theories in the physical activity literature, and studies have begun integrating their concepts. Sweet, Fortier, Strachan and Blanchard (2012) integrated these two theories in a cross-sectional study. Building on that work, this study sought to test a longitudinal integrated model predicting physical activity at the end of a 4-month cardiac rehabilitation program, based on theory, prior research and Sweet et al.'s cross-sectional model. Participants from two cardiac rehabilitation programs (N=109) answered validated self-report questionnaires at baseline, two and four months. Data were analyzed using Amos to assess the path analysis and model fit. Prior to integration, perceived competence and self-efficacy were combined and labeled as confidence. After controlling for 2-month physical activity and cardiac rehabilitation site, no motivational variables significantly predicted residual change in 4-month physical activity. Although confidence at two months did not predict residual change in 4-month physical activity, it had a strong positive relationship with 2-month physical activity (β=0.30, P<0.001). The overall model retained good fit indices. In conclusion, the results diverged from theoretical predictions of physical activity, but self-determination theory and self-efficacy theory were still partially supported. Because the model had good fit, this study demonstrates that theoretical integration is feasible. PMID:26973926
USDA-ARS?s Scientific Manuscript database
Representing the performance of cattle finished on an all forage diet in process-based whole farm system models has presented a challenge. To address this challenge, a study was done to evaluate average daily gain (ADG) predictions of the Integrated Farm System Model (IFSM) for steers consuming all-...
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2016-09-01
Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed to recognize specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and, consequently, tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM), based on the massive integration of 14 diverse complementary quality assessment methods, that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of the Cα trace, local all-atom fitness, side-chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model methods in identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. Proteins 2016; 84(Suppl 1):247-259. © 2015 Wiley Periodicals, Inc.
Integrated multiscale biomaterials experiment and modelling: a perspective
Buehler, Markus J.; Genin, Guy M.
2016-01-01
Advances in multiscale models and computational power have enabled a broad toolset to predict how molecules, cells, tissues and organs behave and develop. A key theme in biological systems is the emergence of macroscale behaviour from collective behaviours across a range of length and timescales, and a key element of these models is therefore hierarchical simulation. However, this predictive capacity has far outstripped our ability to validate predictions experimentally, particularly when multiple hierarchical levels are involved. The state of the art represents careful integration of multiscale experiment and modelling, and yields not only validation, but also insights into deformation and relaxation mechanisms across scales. We present here a sampling of key results that highlight both challenges and opportunities for integrated multiscale experiment and modelling in biological systems. PMID:28981126
Multi-scale modeling of tsunami flows and tsunami-induced forces
NASA Astrophysics Data System (ADS)
Qin, X.; Motley, M. R.; LeVeque, R. J.; Gonzalez, F. I.
2016-12-01
The modeling of tsunami flows and tsunami-induced forces in coastal communities, with the constructed environment incorporated, is challenging for many numerical modelers because of the scale and complexity of the physical problem. A two-dimensional (2D) depth-averaged model can be efficient for modeling waves offshore but may not be accurate enough to predict the complex flow, with its transient variation in the vertical direction, around constructed environments on land. On the other hand, a more complex three-dimensional model is much more computationally expensive and can become impractical due to the size of the problem and the meshing requirements near the built environment. In this study, a 2D depth-integrated model and a 3D Reynolds-Averaged Navier-Stokes (RANS) model are built to model a 1:50 model-scale, idealized community representative of Seaside, OR, USA, for which experimental data are available for comparison. Numerical results from the two models are compared with each other as well as with the experimental measurements. Both models predict the flow parameters (water level, velocity, and momentum flux in the vicinity of the buildings) accurately in general, except near the initial impact, where the depth-averaged model can fail to capture the complexities in the flow. Forces predicted using direct integration of the predicted pressure on structural surfaces from the 3D model and using momentum flux from the 2D model with the constructed environment are compared, which indicates that force prediction from the 2D model is not always reliable in such a complicated case. Force predictions from integration of the pressure are also compared with forces predicted from bare-earth momentum flux calculations to reveal the importance of incorporating the constructed environment in force prediction models.
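A hedged sketch of how a depth-averaged model's output (depth h, depth-averaged speed u) is commonly converted into a load estimate on a structure of width b: a momentum-flux term plus a hydrostatic term. This expression and all numbers are our illustration, not the study's formulation.

```python
# F = rho * b * (h*u^2 + 0.5*g*h^2): momentum flux plus hydrostatic pressure
# integrated over a wall face of width b, a common shallow-water estimate.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def face_load(h, u, b):
    """Force (N) on a wall of width b from depth h (m) and speed u (m/s)."""
    return RHO * b * (h * u**2 + 0.5 * G * h**2)

# Model-scale numbers for a 1:50 experiment (invented values); under Froude
# similarity, force scales with the cube of the length ratio.
f_model = face_load(h=0.1, u=0.5, b=0.2)
f_proto = f_model * 50**3
```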
Identification of chemical vascular disruptors during development using an integrative predictive toxicity model and zebrafish and in vitro functional angiogenesis assays. Chemically induced vascular toxicity during embryonic development can result in a wide range of adverse pre...
Weiler, Gabriele; Schwarz, Ulf; Rauch, Jochen; Rohm, Kerstin; Lehr, Thorsten; Theobald, Stefan; Kiefer, Stephan; Götz, Katharina; Och, Katharina; Pfeifer, Nico; Handl, Lisa; Smola, Sigrun; Ihle, Matthias; Turki, Amin T; Beelen, Dietrich W; Rissland, Jürgen; Bittenbring, Jörg; Graf, Norbert
2018-01-01
Predictive models can support physicians in tailoring interventions and treatments to individual patients based on their predicted response and risk of disease, and in this way help put personalized medicine into practice. In allogeneic stem cell transplantation, risk assessment needs to be enhanced in order to respond to emerging viral infections and transplantation reactions. However, developing predictive models requires harmonizing and integrating large amounts of heterogeneous medical data stored in different health information systems. Driven by the demand for predictive instruments in allogeneic stem cell transplantation, we present in this paper an ontology-based platform that supports data owners and model developers in sharing and harmonizing their data for model development while respecting data privacy.
Choi, Ickwon; Kattan, Michael W; Wells, Brian J; Yu, Changhong
2012-01-01
In the medical community, prognostic models that use clinicopathologic features to predict prognosis after a given treatment have been externally validated and used in practice. In recent years, most research has focused on high-dimensional genomic data and small sample sizes. Since clinically similar but molecularly heterogeneous tumors may produce different clinical outcomes, the combination of clinical and genomic information, which may be complementary, is crucial to improve the quality of prognostic predictions. However, there is a lack of integration schemes for clinico-genomic models due to the P ≥ N problem, in particular for a parsimonious model. We propose a methodology to build a reduced yet accurate integrative model using a hybrid approach based on the Cox regression model, which uses several dimension reduction techniques, L₂ penalized maximum likelihood estimation (PMLE), and resampling methods to tackle the problem. The predictive accuracy of the modeling approach is assessed by several metrics via an independent and thorough scheme for comparing competing methods. In breast cancer data studies of a metastasis and death event, we show that the proposed methodology can improve prediction accuracy and build a final model with a hybrid signature that is parsimonious when integrating both types of variables.
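A toy version of the L₂ (ridge) penalized Cox fitting step that the abstract builds on can be written directly in numpy. This is a didactic sketch on synthetic, uncensored data with no tied event times; a real analysis would use a tested survival package, and the data-generating model, sample sizes, penalty, step size, and iteration count here are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 120, 5
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
# Exponential event times with hazard exp(X beta); no censoring for simplicity.
t = rng.exponential(1.0 / np.exp(X @ beta_true))
X = X[np.argsort(t)]            # sort subjects by ascending event time

def neg_pll_grad(beta, lam):
    """Gradient of the penalized negative Cox partial log-likelihood (no ties)."""
    w = np.exp(X @ beta)
    # With sorted times, the risk set of subject i is subjects i..n-1.
    risk_w = np.cumsum(w[::-1])[::-1]                       # sum_j>=i w_j
    risk_wx = np.cumsum((w[:, None] * X)[::-1], axis=0)[::-1]
    grad = -(X - risk_wx / risk_w[:, None]).sum(axis=0)
    return grad + lam * beta                                # ridge term

beta = np.zeros(p)
for _ in range(2000):
    beta -= 0.001 * neg_pll_grad(beta, lam=1.0)  # plain gradient descent
```

The ridge penalty shrinks the estimates toward zero, which is what makes the fit stable when P approaches or exceeds N, the regime the abstract addresses via dimension reduction plus PMLE.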
ERIC Educational Resources Information Center
Mello, Susan; Hovick, Shelly R.
2016-01-01
There is a growing body of evidence linking childhood exposure to environmental toxins and a range of adverse health outcomes, including preterm birth, cognitive deficits, and cancer. Little is known, however, about what drives mothers to engage in health behaviors to reduce such risks. Guided by the integrative model of behavioral prediction,…
Low, Yen S.; Sedykh, Alexander; Rusyn, Ivan; Tropsha, Alexander
2017-01-01
Cheminformatics approaches such as Quantitative Structure-Activity Relationship (QSAR) modeling have traditionally been used for predicting chemical toxicity. In recent years, high-throughput biological assays have been increasingly employed to elucidate mechanisms of chemical toxicity and predict the toxic effects of chemicals in vivo. The data generated in such assays can be considered biological descriptors of chemicals that can be combined with molecular descriptors and employed in QSAR modeling to improve the accuracy of toxicity prediction. In this review, we discuss several approaches for integrating chemical and biological data to predict the biological effects of chemicals in vivo and compare their performance across several data sets. We conclude that while no method consistently shows superior performance, the integrative approaches rank consistently among the best and offer enriched interpretation of models over those built with either chemical or biological data alone. We discuss the outlook for such interdisciplinary methods and offer recommendations to further improve the accuracy and interpretability of computational models that predict chemical toxicity. PMID:24805064
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.
2016-12-01
Advances in computational resources and modeling techniques are opening the path to effectively integrating existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions among different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area, including the New Jersey Transit's main storage and maintenance facility.
The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about the associated uncertainties, thus improving the assessment of risks compared with a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies, and best management practices under climate change scenarios.
Ansari, Mozafar; Othman, Faridah; Abunama, Taher; El-Shafie, Ahmed
2018-04-01
The function of a sewage treatment plant is to treat sewage to acceptable standards before it is discharged into the receiving waters. To design and operate such plants, it is necessary to measure and predict the influent flow rate. In this research, the influent flow rate of a sewage treatment plant (STP) was modelled and predicted by autoregressive integrated moving average (ARIMA), nonlinear autoregressive network (NAR) and support vector machine (SVM) regression time series algorithms. To evaluate the models' accuracy, the root mean square error (RMSE) and coefficient of determination (R²) were calculated as initial assessment measures, while relative error (RE), peak flow criterion (PFC) and low flow criterion (LFC) were calculated as final evaluation measures to demonstrate the detailed accuracy of the selected models. An integrated model was developed based on the individual models' prediction ability for low, average and peak flow. An initial assessment of the results showed that the ARIMA model was the least accurate and the NAR model was the most accurate. The RE results also show that the SVM model's frequency of errors above 10% or below −10% was greater than the NAR model's. The influent was also forecasted up to 44 weeks ahead by both models. The graphical results indicate that the NAR model made better predictions than the SVM model. The final evaluation of NAR and SVM demonstrated that SVM made better predictions at peak flow, while NAR fit the low and average inflow ranges well. The integrated model developed uses the NAR model for low and average influent and the SVM model for peak inflow.
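The initial assessment measures named above (RMSE, R²) and the regime-switching step behind the "integrated model" can be sketched as follows. The two stand-in predictors, their noise levels, and the 90th-percentile peak threshold are all invented for illustration; they are not the paper's NAR/SVM results.

```python
import numpy as np

def rmse(obs, pred):
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def r2(obs, pred):
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - np.mean(obs)) ** 2)
    return float(1.0 - ss_res / ss_tot)

rng = np.random.default_rng(3)
# Synthetic weekly inflow series with a seasonal cycle plus noise.
obs = 100 + 30 * np.sin(np.linspace(0, 6 * np.pi, 200)) + rng.normal(0, 3, 200)

peak = obs > np.quantile(obs, 0.9)           # assumed definition of "peak flow"
pred_low = obs + rng.normal(0, 2, 200)       # accurate except at peaks...
pred_low = pred_low - 15 * peak              # ...where it underpredicts
pred_peak = obs + rng.normal(0, 5, 200)      # noisier overall, unbiased at peaks

# Regime-switching integration: one model per flow regime.
integrated = np.where(peak, pred_peak, pred_low)
```

Here the integrated series inherits the better model in each regime, so its RMSE falls below that of either ingredient alone.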
Lee, Yong Ju; Jung, Byeong Su; Kim, Kee-Tae; Paik, Hyun-Dong
2015-09-01
A predictive model was developed to describe the growth of Staphylococcus aureus in raw pork by using the Integrated Pathogen Modeling Program (IPMP) 2013, with a polynomial model as a secondary predictive model. S. aureus requires approximately 180 h to reach 5-6 log CFU/g at 10 °C. At 15 °C and 25 °C, approximately 48 and 20 h, respectively, are required to cause food poisoning. Predictions using the Gompertz model were the most accurate in this study. For the lag time (LT) model, the bias factor (Bf) and accuracy factor (Af) values were both 1.014, showing that the predictions were within a reliable range. For the specific growth rate (SGR) model, Bf and Af were 1.188 and 1.190, respectively. Additionally, both the Bf and Af values of the LT and SGR models were close to 1, indicating that the IPMP Gompertz model is more adequate for predicting the growth of S. aureus on raw pork than the other models. Copyright © 2015 Elsevier Ltd. All rights reserved.
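For reference, the modified Gompertz growth curve commonly used in predictive microbiology (here in the Zwietering parameterization, an assumption on our part), together with the bias (Bf) and accuracy (Af) factors quoted above, can be sketched as follows. The curve parameters are illustrative, not the fitted S. aureus/pork values.

```python
import numpy as np

def gompertz(t, y0, ymax, mu, lam):
    """log10 count at time t (Zwietering-style modified Gompertz).

    y0/ymax: initial and maximum log10 counts; mu: max specific growth
    rate (log10 units/h); lam: lag time (h).
    """
    A = ymax - y0
    return y0 + A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1))

def bias_factor(pred, obs):
    """Bf = 10^(mean log10(pred/obs)); 1.0 means no systematic bias."""
    return float(10 ** np.mean(np.log10(np.asarray(pred) / np.asarray(obs))))

def accuracy_factor(pred, obs):
    """Af = 10^(mean |log10(pred/obs)|); 1.0 means perfect agreement."""
    return float(10 ** np.mean(np.abs(np.log10(np.asarray(pred) / np.asarray(obs)))))

# Illustrative 10 C-like scenario: slow growth over 180 h.
t = np.linspace(0, 180, 50)
y = gompertz(t, y0=2.0, ymax=8.0, mu=0.05, lam=24.0)
```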
ERIC Educational Resources Information Center
Sutton, Jazmyne A.; Walsh-Buhi, Eric R.
2017-01-01
Objective: This study investigated variables within the Integrative Model of Behavioral Prediction (IMBP) as well as differences across socioeconomic status (SES) levels within the context of inconsistent contraceptive use among college women. Participants: A nonprobability sample of 515 female college students completed an Internet-based survey…
ERIC Educational Resources Information Center
Putwain, Dave; Deveney, Carolyn
2009-01-01
The aim of this study was to examine an expanded integrative hierarchical model of test emotions and achievement goal orientations in predicting the examination performance of undergraduate students. Achievement goals were theorised as mediating the relationship between test emotions and performance. 120 undergraduate students completed…
Tank System Integrated Model: A Cryogenic Tank Performance Prediction Program
NASA Technical Reports Server (NTRS)
Bolshinskiy, L. G.; Hedayat, A.; Hastings, L. J.; Sutherlin, S. G.; Schnell, A. R.; Moder, J. P.
2017-01-01
Accurate predictions of the thermodynamic state of the cryogenic propellants, pressurization rate, and performance of pressure control techniques in cryogenic tanks are required for development of cryogenic fluid long-duration storage technology and planning for future space exploration missions. This Technical Memorandum (TM) presents the analytical tool, Tank System Integrated Model (TankSIM), which can be used for modeling pressure control and predicting the behavior of cryogenic propellant during long-term storage for future space missions. Utilizing TankSIM, the following processes can be modeled: tank self-pressurization, boiloff, ullage venting, mixing, and condensation on the tank wall. This TM also includes comparisons of TankSIM program predictions with test data and examples of multiphase mission calculations.
Bayesian Integration of Information in Hippocampal Place Cells
Madl, Tamas; Franklin, Stan; Chen, Ke; Montaldi, Daniela; Trappl, Robert
2014-01-01
Accurate spatial localization requires a mechanism that corrects for errors, which might arise from inaccurate sensory information or neuronal noise. In this paper, we propose that hippocampal place cells might implement such an error correction mechanism by integrating different sources of information in an approximately Bayes-optimal fashion. We compare the predictions of our model with physiological data from rats. Our results suggest that useful predictions regarding the firing fields of place cells can be made from a single underlying principle, Bayesian cue integration, and that such predictions are possible using a remarkably small number of model parameters. PMID:24603429
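The Bayes-optimal integration principle invoked above has a closed form for two Gaussian cues: each estimate is weighted by its inverse variance (its reliability). A minimal sketch with invented numbers, e.g. a noisy path-integration estimate combined with a sharper landmark-based estimate:

```python
def integrate(mu1, var1, mu2, var2):
    """Bayes-optimal fusion of two independent Gaussian estimates.

    Returns the posterior mean (inverse-variance-weighted average)
    and posterior variance (harmonic combination of the two variances).
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return mu, var

# Path integration says x = 0.40 m (var 0.04); a landmark says 0.50 m (var 0.01).
mu, var = integrate(mu1=0.40, var1=0.04, mu2=0.50, var2=0.01)
```

The combined estimate (0.48 m) lies closer to the more reliable cue, and its variance (0.008) is smaller than either cue's alone, which is the signature of optimal integration.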
Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.
2010-06-07
Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.
The Use of Behavior Models for Predicting Complex Operations
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2010-01-01
Modeling and simulation (M&S) plays an important role when complex human-system concepts are being proposed, developed and tested within the system design process. The National Aeronautics and Space Administration (NASA) uses many different types of M&S approaches for predicting human-system interactions, especially early in the development phase of a conceptual design. NASA Ames Research Center possesses a number of M&S capabilities, ranging from airflow, flight path, aircraft, scheduling, human performance (HPM), and bioinformatics models to a host of other kinds of M&S capabilities, which are used for predicting whether proposed designs will meet specific mission criteria. The Man-Machine Integration Design and Analysis System (MIDAS) is a NASA ARC HPM software tool that integrates many models of human behavior with environment models, equipment models, and procedural/task models. The challenge to model comprehensibility is heightened as the number of integrated models and the requisite fidelity of the procedural sets increase. Model transparency is needed for some of the more complex HPMs to maintain comprehensibility of the integrated model performance. This is exemplified in a recent MIDAS v5 application model, and plans for future model refinements are presented.
Snitkin, Evan S; Dudley, Aimée M; Janse, Daniel M; Wong, Kaisheen; Church, George M; Segrè, Daniel
2008-01-01
Background: Understanding the response of complex biochemical networks to genetic perturbations and environmental variability is a fundamental challenge in biology. Integration of high-throughput experimental assays and genome-scale computational methods is likely to produce insight otherwise unreachable, but specific examples of such integration have only begun to be explored. Results: In this study, we measured growth phenotypes of 465 Saccharomyces cerevisiae gene deletion mutants under 16 metabolically relevant conditions and integrated them with the corresponding flux balance model predictions. We first used discordance between experimental results and model predictions to guide a stage of experimental refinement, which resulted in a significant improvement in the quality of the experimental data. Next, we used discordance still present in the refined experimental data to assess the reliability of yeast metabolism models under different conditions. In addition to estimating predictive capacity based on growth phenotypes, we sought to explain these discordances by examining predicted flux distributions visualized through a new, freely available platform. This analysis led to insight into the glycerol utilization pathway and the potential effects of metabolic shortcuts on model results. Finally, we used model predictions and experimental data to discriminate between alternative raffinose catabolism routes. Conclusions: Our study demonstrates how a new level of integration between high-throughput measurements and flux balance model predictions can improve understanding of both experimental and computational results. The added value of a joint analysis is a more reliable platform for specific testing of biological hypotheses, such as the catabolic routes of different carbon sources. PMID:18808699
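The flux balance predictions referred to above come from solving a linear program: maximize a biomass flux subject to steady-state stoichiometry S·v = 0 and flux bounds. Below is a toy three-reaction network (our invention, not the yeast genome-scale model) solved with scipy: uptake v1, conversion v2, and biomass production v3.

```python
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix for the internal metabolites A and B.
S = np.array([
    [1, -1,  0],   # A: produced by v1, consumed by v2
    [0,  1, -1],   # B: produced by v2, consumed by v3
])
c = [0, 0, -1]                             # linprog minimizes, so minimize -v3
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 flux units

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
fluxes = res.x   # optimal flux distribution; all flow is routed to biomass
```

At the optimum the uptake bound is binding and every reaction carries flux 10, which is the FBA analogue of a growth-rate prediction for this tiny network.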
Key Issues for Seamless Integrated Chemistry–Meteorology Modeling
Online coupled meteorology–atmospheric chemistry models have greatly evolved in recent years. Although mainly developed by the air quality modeling community, these integrated models are also of interest for numerical weather prediction and climate modeling, as they can con...
ERIC Educational Resources Information Center
Collado-Rivera, Maria; Branscum, Paul; Larson, Daniel; Gao, Haijuan
2018-01-01
Objective: The objective of this study was to evaluate the determinants of sugary drink consumption among overweight and obese adults attempting to lose weight using the Integrative Model of Behavioural Prediction (IMB). Design: Cross-sectional design. Method: Determinants of behavioural intentions (attitudes, perceived norms and perceived…
DW-75-92243901
Title: Integrating Earth Observation and Field Data into a Lyme Disease Model to Map and Predict Risks to Biodiversity and Human Health. Durland Fish, Maria Diuk-Wasser, Joe Roman, Yongtao Guan, Brad Lobitz, Rama Nemani, Joe Piesman, Montira J. Pongsiri, F...
ERIC Educational Resources Information Center
Xu, Xiaohe; Tung, Yuk-Ying; Dunaway, R. Gregory
2000-01-01
This article constructs a model to predict the likelihood of parental use of corporal punishment on children in two-parent families. Reports that corporal punishment is primarily determined by cultural, human, and social capital that are available to, or already acquired by parents. Discusses an integrated, resource-based theory for predicting use…
Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System
2010-09-13
model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of
NASA Technical Reports Server (NTRS)
Perry, Bruce; Anderson, Molly
2015-01-01
The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station (ISS) Water Processor Assembly (WPA) to form a complete Water Recovery System (WRS) for future missions. Independent chemical process simulations with varying levels of detail have previously been developed using Aspen Custom Modeler (ACM) to aid in the analysis of the CDS and several WPA components. The existing CDS simulation could not model behavior during thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. The first part of this paper describes modifications to the ACM model of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version of the model can accurately predict behavior during thermal startup for both NaCl solution and pretreated urine feeds. The model is used to predict how changing operating parameters and design features of the CDS affects its performance, and conclusions from these predictions are discussed. The second part of this paper describes the integration of the modified CDS model and the existing WPA component models into a single WRS model. The integrated model is used to demonstrate the effects that changes to one component can have on the dynamic behavior of the system as a whole.
Akbaş, Halil; Bilgen, Bilge; Turhan, Aykut Melih
2015-11-01
This study proposes an integrated prediction and optimization model using multi-layer perceptron neural network and particle swarm optimization techniques. Three different objective functions are formulated. The first is the maximization of methane percentage with a single output. The second is the maximization of biogas production with a single output. The last is the maximization of both biogas quality and biogas production, with two outputs. Methane percentage, carbon dioxide percentage, and the percentages of other contents are used as the biogas quality criteria. Based on the formulated models and data from a wastewater treatment facility, optimal values of the input variables and their corresponding maximum output values are determined for each model. It is expected that applying the integrated prediction and optimization models will increase biogas production and biogas quality, and contribute to the quantity of electricity produced at the wastewater treatment facility. Copyright © 2015 Elsevier Ltd. All rights reserved.
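The "predict with a neural network, then optimize the inputs" loop the abstract describes can be illustrated with a bare-bones particle swarm optimizer. The surrogate here is a known quadratic with its peak at (1, 2), standing in for a trained MLP; the swarm size and coefficients are conventional choices, not the paper's settings.

```python
import numpy as np

def surrogate(x):
    """Stand-in for a trained predictor; maximal at x = (1, 2)."""
    return -np.sum((x - np.array([1.0, 2.0])) ** 2, axis=-1)

rng = np.random.default_rng(7)
n_particles, dim, iters = 30, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5             # inertia and acceleration coefficients

x = rng.uniform(-5, 5, (n_particles, dim))
v = np.zeros_like(x)
pbest = x.copy()                      # each particle's best position so far
pbest_val = surrogate(x)
gbest = pbest[np.argmax(pbest_val)].copy()   # swarm-wide best position

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = surrogate(x)
    better = val > pbest_val
    pbest[better], pbest_val[better] = x[better], val[better]
    gbest = pbest[np.argmax(pbest_val)].copy()
```

In the abstract's scheme, `surrogate` would be the trained MLP mapping operating conditions to predicted biogas quality/quantity, and `gbest` would be the recommended operating point.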
Stegen, James C
2018-01-01
To improve predictions of ecosystem function in future environments, we need to integrate the ecological and environmental histories experienced by microbial communities with hydrobiogeochemistry across scales. A key issue is whether we can derive generalizable scaling relationships that describe this multiscale integration. There is a strong foundation for addressing these challenges. We have the ability to infer ecological history with null models and reveal impacts of environmental history through laboratory and field experimentation. Recent developments also provide opportunities to inform ecosystem models with targeted omics data. A major next step is coupling knowledge derived from such studies with multiscale modeling frameworks that are predictive under non-steady-state conditions. This is particularly true for systems spanning dynamic interfaces, which are often hot spots of hydrobiogeochemical function. We can advance predictive capabilities through a holistic perspective focused on the nexus of history, ecology, and hydrobiogeochemistry.
Wombacher, Kevin; Dai, Minhao; Matig, Jacob J; Harrington, Nancy Grant
2018-03-22
To identify salient behavioral determinants of STI testing among college students by testing a model based on the integrative model of behavioral prediction (IMBP). Participants were 265 undergraduate students from a large university in the southeastern US. Formative and survey research was used to test an IMBP-based model exploring the relationships between determinants and STI testing intention and behavior. Results of path analyses supported a model in which attitudinal beliefs predicted intention and intention predicted behavior. Normative beliefs and behavioral control beliefs were not significant in the model; however, select individual normative and control beliefs were significantly correlated with intention and behavior. Attitudinal beliefs are the strongest predictor of STI testing intention and behavior. Future efforts to increase STI testing rates should identify and target salient attitudinal beliefs.
COMPARING MID-INFRARED GLOBULAR CLUSTER COLORS WITH POPULATION SYNTHESIS MODELS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barmby, P.; Jalilian, F. F.
2012-04-15
Several population synthesis models now predict integrated colors of simple stellar populations in the mid-infrared bands. To date, the models have not been extensively tested in this wavelength range. In a comparison of the predictions of several recent population synthesis models, the integrated colors are found to cover approximately the same range but to disagree in detail, for example, on the effects of metallicity. To test against observational data, globular clusters (GCs) are used as the closest objects to idealized groups of stars with a single age and single metallicity. Using recent mass estimates, we have compiled a sample of massive, old GCs in M31 which contain enough stars to guard against the stochastic effects of small-number statistics, and measured their integrated colors in the Spitzer/IRAC bands. Comparison of the cluster photometry in the IRAC bands with the model predictions shows that the models reproduce the cluster colors reasonably well, except for a small (not statistically significant) offset in [4.5] - [5.8]. In this color, models without circumstellar dust emission predict bluer values than are observed. Model predictions of colors formed from the V band and the IRAC 3.6 and 4.5 μm bands are redder than the observed data at high metallicities and we discuss several possible explanations. In agreement with model predictions, V - [3.6] and V - [4.5] colors are found to have metallicity sensitivity similar to or slightly better than V - K_s.
NASA Astrophysics Data System (ADS)
Norton, Andrew S.
An integral component of managing game species is an understanding of population dynamics and relative abundance. Harvest data are frequently used to estimate abundance of white-tailed deer. Unless harvest age-structure is representative of the population age-structure and harvest vulnerability remains constant from year to year, however, these data alone are of limited value. Additional model structure and auxiliary information have accommodated this shortcoming. Specifically, integrated age-at-harvest (AAH) state-space population models can formally combine multiple sources of data, and regularization via hierarchical model structure can increase the flexibility of model parameters. I collected known-fates data, which I evaluated and used to inform trends in survival parameters for an integrated AAH model. I used temperature and snow depth covariates to predict survival outside of the hunting season, and opening-weekend temperature and percent-of-corn-harvest covariates to predict hunting season survival. When auxiliary empirical data were unavailable for the AAH model, moderately informative priors provided sufficient information for convergence and parameter estimates. The AAH model was most sensitive to errors in initial abundance, but this error was calibrated after 3 years. Among vital rates, the AAH model was most sensitive to reporting rates (the percentage of mortality during the hunting season attributable to harvest). The AAH model, using only harvest data, was able to track changing abundance trends due to changes in survival rates even when prior models did not inform these changes (i.e., prior models were constant when truth varied). I also compared AAH model results with estimates from the Wisconsin Department of Natural Resources (WIDNR). Trends in abundance estimates from both models were similar, although AAH model predictions were systematically higher than WIDNR estimates in the East study area. When I incorporated auxiliary information (i.e., the integrated AAH model) about survival outside the hunting season from known-fates data, predicted trends appeared more closely related to what was expected. Disagreements between the AAH model and WIDNR estimates in the East were likely related to biased predictions for reporting and survival rates from the AAH model.
Integration of Tuyere, Raceway and Shaft Models for Predicting Blast Furnace Process
NASA Astrophysics Data System (ADS)
Fu, Dong; Tang, Guangwu; Zhao, Yongfu; D'Alessio, John; Zhou, Chenn Q.
2018-06-01
A novel modeling strategy is presented for simulating the blast furnace iron making process. Such physical and chemical phenomena are taking place across a wide range of length and time scales, and three models are developed to simulate different regions of the blast furnace, i.e., the tuyere model, the raceway model and the shaft model. This paper focuses on the integration of the three models to predict the entire blast furnace process. Mapping output and input between models and an iterative scheme are developed to establish communications between models. The effects of tuyere operation and burden distribution on blast furnace fuel efficiency are investigated numerically. The integration of different models provides a way to realistically simulate the blast furnace by improving the modeling resolution on local phenomena and minimizing the model assumptions.
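The iterative scheme described above, in which the tuyere, raceway, and shaft models exchange mapped outputs and inputs until they agree, can be sketched as a simple fixed-point coupling loop. This is an illustrative sketch only, with two generic stand-in models exchanging a single scalar; it is not the paper's actual blast furnace coupling.

```python
def couple(model_a, model_b, x0, tol=1e-8, max_iter=100):
    """Iterate two coupled models, feeding each one's output to the other,
    until the exchanged variable stops changing (fixed-point iteration)."""
    x = x0
    for _ in range(max_iter):
        y = model_a(x)        # e.g., raceway model consumes tuyere output
        x_new = model_b(y)    # e.g., shaft model feeds back boundary values
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Two toy models whose coupled fixed point is x = 2.0
result = couple(lambda x: 0.5 * x + 1.0, lambda y: y, 0.0)
```

In practice each "model" would be a full CFD solve, and the mapped quantities would be fields (temperatures, gas compositions) rather than a scalar, but the convergence logic is the same.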
Integrated and spectral energetics of the GLAS general circulation model
NASA Technical Reports Server (NTRS)
Tenenbaum, J.
1981-01-01
Integrated and spectral error energetics of the Goddard Laboratory for Atmospheric Sciences (GLAS) general circulation model are compared with observations for periods in January 1975, 1976, and 1977. For two cases the model shows significant skill in predicting integrated energetics quantities out to two weeks, and for all three cases, the integrated monthly mean energetics show qualitative improvements over previous versions of the model in eddy kinetic energy and barotropic conversions. Fundamental difficulties remain with leakage of energy to the stratospheric level. General circulation model spectral energetics predictions are compared with the corresponding observational spectra on a day by day basis. Eddy kinetic energy can be correct while significant errors occur in the kinetic energy of wavenumber three. Single wavenumber dominance in eddy kinetic energy and the correlation of spectral kinetic and potential energy are demonstrated.
New Integrated Modeling Capabilities: MIDAS' Recent Behavioral Enhancements
NASA Technical Reports Server (NTRS)
Gore, Brian F.; Jarvis, Peter A.
2005-01-01
The Man-machine Integration Design and Analysis System (MIDAS) is an integrated human performance modeling software tool that is based on mechanisms that underlie and cause human behavior. A PC-Windows version of MIDAS has been created that integrates the anthropometric character "Jack (TM)" with MIDAS' validated perceptual and attention mechanisms. MIDAS now models multiple simulated humans engaging in goal-related behaviors. New capabilities include the ability to predict situations in which errors and/or performance decrements are likely due to a variety of factors including concurrent workload and performance influencing factors (PIFs). This paper describes a new model that predicts the effects of microgravity on a mission specialist's performance, and its first application to simulating the task of conducting a Life Sciences experiment in space according to a sequential or parallel schedule of performance.
Genomic signals of selection predict climate-driven population declines in a migratory bird.
Bay, Rachael A; Harrigan, Ryan J; Underwood, Vinh Le; Gibbs, H Lisle; Smith, Thomas B; Ruegg, Kristen
2018-01-05
The ongoing loss of biodiversity caused by rapid climatic shifts requires accurate models for predicting species' responses. Despite evidence that evolutionary adaptation could mitigate climate change impacts, evolution is rarely integrated into predictive models. Integrating population genomics and environmental data, we identified genomic variation associated with climate across the breeding range of the migratory songbird, the yellow warbler (Setophaga petechia). Populations requiring the greatest shifts in allele frequencies to keep pace with future climate change have experienced the largest population declines, suggesting that failure to adapt may have already negatively affected populations. Broadly, our study suggests that the integration of genomic adaptation can increase the accuracy of future species distribution models and ultimately guide more effective mitigation efforts. Copyright © 2018, American Association for the Advancement of Science.
Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information
Wang, Xiaohong; Wang, Lizhi
2017-01-01
Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system. PMID:28926930
Modeling of BN Lifetime Prediction of a System Based on Integrated Multi-Level Information.
Wang, Jingbin; Wang, Xiaohong; Wang, Lizhi
2017-09-15
Predicting system lifetime is important to ensure safe and reliable operation of products, which requires integrated modeling based on multi-level, multi-sensor information. However, lifetime characteristics of equipment in a system are different and failure mechanisms are inter-coupled, which leads to complex logical correlations and the lack of a uniform lifetime measure. Based on a Bayesian network (BN), a lifetime prediction method for systems that combine multi-level sensor information is proposed. The method considers the correlation between accidental failures and degradation failure mechanisms, and achieves system modeling and lifetime prediction under complex logic correlations. This method is applied in the lifetime prediction of a multi-level solar-powered unmanned system, and the predicted results can provide guidance for the improvement of system reliability and for the maintenance and protection of the system.
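The correlation between accidental failures and degradation described above can be illustrated, in miniature, by enumerating a small Bayesian network. The sketch below uses a hypothetical three-node chain (shock → degradation → failure) with assumed conditional probabilities; it is not the paper's multi-level model, only the enumeration mechanics a BN relies on.

```python
def p_failure(p_shock, p_deg_given_shock, p_fail_given_deg):
    """Exact inference by enumeration over a 3-node chain BN S -> D -> F.

    p_deg_given_shock[s] is P(D=1 | S=s); p_fail_given_deg[d] is P(F=1 | D=d).
    Returns the marginal probability of failure, P(F=1)."""
    total = 0.0
    for s in (0, 1):
        ps = p_shock if s else 1.0 - p_shock
        for d in (0, 1):
            pd = p_deg_given_shock[s] if d else 1.0 - p_deg_given_shock[s]
            # chain rule: P(S, D, F=1) = P(S) * P(D|S) * P(F=1|D)
            total += ps * pd * p_fail_given_deg[d]
    return total

# Degradation follows shocks deterministically, failure follows degradation:
# failure probability then equals the shock probability.
p = p_failure(0.5, [0.0, 1.0], [0.0, 1.0])
```

Real lifetime models would add sensor-evidence nodes and perform conditional queries, but the joint factorization is the same idea.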
USDA-ARS?s Scientific Manuscript database
Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...
Predictive models of moth development
USDA-ARS?s Scientific Manuscript database
Degree-day models link ambient temperature to insect life-stages, making such models valuable tools in integrated pest management. These models increase management efficacy by predicting pest phenology. In Wisconsin, the top insect pest of cranberry production is the cranberry fruitworm, Acrobasis v...
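The degree-day accumulation that drives such phenology predictions can be computed very simply. The sketch below uses the basic daily-average method with an assumed base temperature; real IPM programs often use refinements such as the single-sine method and crop-specific thresholds.

```python
def degree_days(tmin, tmax, base=10.0):
    """Heat units for one day via the simple-average method:
    max(0, mean daily temperature minus the developmental base)."""
    return max(0.0, (tmin + tmax) / 2.0 - base)

def accumulate(days, base=10.0):
    """Running degree-day total over a season.
    `days` is a sequence of (tmin, tmax) pairs in deg C."""
    total, out = 0.0, []
    for tmin, tmax in days:
        total += degree_days(tmin, tmax, base)
        out.append(total)
    return out

# Three days: means of 12, 15, and 7 deg C against a base of 10 deg C
season = accumulate([(8, 16), (10, 20), (5, 9)])  # -> [2.0, 7.0, 7.0]
```

A pest event would then be predicted when the running total crosses a species-specific threshold.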
NASA Astrophysics Data System (ADS)
Yeo, I. Y.; Lang, M.; Lee, S.; Huang, C.; Jin, H.; McCarty, G.; Sadeghi, A.
2017-12-01
Wetland ecosystems play crucial roles in improving hydrological function and ecological integrity for downstream waters and the surrounding landscape. However, the changing behaviours and functioning of wetland ecosystems are poorly understood and extremely difficult to characterize. Improved understanding of the hydrological behaviour of wetlands, considering their interaction with surrounding landscapes and impacts on downstream waters, is an essential first step toward closing the knowledge gap. We present an integrated wetland-catchment modelling study that capitalizes on recently developed inundation maps and other geospatial data. The aim of the data-model integration is to improve spatial prediction of wetland inundation and to evaluate cumulative hydrological benefits at the catchment scale. In this paper, we highlight problems arising from data preparation, parameterization, and process representation in simulating wetlands within a distributed catchment model, and report recent progress on mapping of wetland dynamics (i.e., inundation) using multiple remotely sensed data. We demonstrate the value of spatially explicit inundation information for developing site-specific wetland parameters and for evaluating model predictions at multiple spatial and temporal scales. This integrated spatial data-model framework is tested using the Soil and Water Assessment Tool (SWAT) with an improved wetland extension, and applied to an agricultural watershed in the Mid-Atlantic Coastal Plain, USA. This study illustrates the necessity of spatially distributed information and a data-integrated modelling approach to predict wetland inundation and hydrologic function at the local landscape scale, where monitoring and conservation decision making take place.
Statistical Modeling and Prediction for Tourism Economy Using Dendritic Neural Network
Yu, Ying; Wang, Yirui; Tang, Zheng
2017-01-01
With the impact of global internationalization, the tourism economy has also developed rapidly. The increasing interest aroused by more advanced forecasting methods leads us to innovate. In this paper, the seasonal trend autoregressive integrated moving average with dendritic neural network model (SA-D model) is proposed for tourism demand forecasting. First, we use the seasonal trend autoregressive integrated moving average model (SARIMA model) to remove the long-term linear trend, and then train the residual data with the dendritic neural network model to make a short-term prediction. As the results in this paper show, the SA-D model achieves considerably better predictive performance. To demonstrate the effectiveness of the SA-D model, we also apply it to data that other authors used with other models and compare the results. The SA-D model again achieved good predictive performance in terms of normalized mean square error, absolute percentage error, and correlation coefficient. PMID:28246527
Statistical Modeling and Prediction for Tourism Economy Using Dendritic Neural Network.
Yu, Ying; Wang, Yirui; Gao, Shangce; Tang, Zheng
2017-01-01
With the impact of global internationalization, the tourism economy has also developed rapidly. The increasing interest aroused by more advanced forecasting methods leads us to innovate. In this paper, the seasonal trend autoregressive integrated moving average with dendritic neural network model (SA-D model) is proposed for tourism demand forecasting. First, we use the seasonal trend autoregressive integrated moving average model (SARIMA model) to remove the long-term linear trend, and then train the residual data with the dendritic neural network model to make a short-term prediction. As the results in this paper show, the SA-D model achieves considerably better predictive performance. To demonstrate the effectiveness of the SA-D model, we also apply it to data that other authors used with other models and compare the results. The SA-D model again achieved good predictive performance in terms of normalized mean square error, absolute percentage error, and correlation coefficient.
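The decompose-then-correct structure of such hybrid forecasters (fit a trend component, then model the residuals separately) can be sketched in a few lines. In this sketch the dendritic neural network is replaced by a much simpler seasonal-naive residual rule, so this shows the hybrid architecture, not the SA-D model itself; the series and season length are illustrative assumptions.

```python
def fit_trend(y):
    """Ordinary least-squares line a + b*t through the series."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    num = sum((i - t_mean) * (yi - y_mean) for i, yi in enumerate(y))
    den = sum((i - t_mean) ** 2 for i in range(n))
    b = num / den
    return y_mean - b * t_mean, b

def hybrid_forecast(y, season, steps):
    """Linear trend extrapolation plus a seasonal-naive residual correction
    (the residual model stands in for the dendritic neural network)."""
    a, b = fit_trend(y)
    resid = [yi - (a + b * i) for i, yi in enumerate(y)]
    n = len(y)
    preds = []
    for h in range(1, steps + 1):
        t = n + h - 1
        # reuse the residual observed one season earlier
        correction = resid[n - season + ((h - 1) % season)]
        preds.append(a + b * t + correction)
    return preds

# A purely linear series has zero residuals, so the forecast is the trend line.
forecast = hybrid_forecast([1, 2, 3, 4, 5, 6, 7, 8], season=4, steps=2)
```

Swapping the residual rule for a learned model (neural network, nearest neighbours, etc.) changes only the `correction` step.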
A computationally efficient modelling of laminar separation bubbles
NASA Technical Reports Server (NTRS)
Maughmer, Mark D.
1988-01-01
The goal of this research is to accurately predict the characteristics of the laminar separation bubble and its effects on airfoil performance. To this end, a model of the bubble is under development and will be incorporated into the analysis section of the Eppler and Somers program. As a first step in this direction, an existing bubble model was inserted into the program. It was decided to address the problem of the short bubble before attempting to predict the long bubble. Second, an integral boundary-layer method was judged more desirable than a finite-difference approach: while the two methods achieve similar prediction accuracy, finite-difference methods tend to require significantly longer computer run times than integral methods. Finally, as the boundary-layer analysis in the Eppler and Somers program employs the momentum and kinetic-energy integral equations, a short-bubble model compatible with these equations is most preferable.
Assessment of the hybrid propagation model, Volume 2: Comparison with the Integrated Noise Model
DOT National Transportation Integrated Search
2012-08-31
This is the second of two volumes of the report on the Hybrid Propagation Model (HPM), an advanced prediction model for aviation noise propagation. This volume presents comparisons of the HPM and the Integrated Noise Model (INM) for conditions of une...
ERIC Educational Resources Information Center
Neumann, Yoram; Neumann, Edith; Lewis, Shelia
2017-01-01
This study integrated the Spiral Curriculum approach into the Robust Learning Model as part of a continuous improvement process that was designed to improve educational effectiveness and then assessed the differences between the initial and integrated models as well as the predictability of the first course in the integrated learning model on a…
ERIC Educational Resources Information Center
Knezek, Gerald; Christensen, Rhonda
2016-01-01
An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…
Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework
Talluto, Matthew V.; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C. Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A.; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique
2016-01-01
Aim: Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Location: Eastern North America (as an example). Methods: Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. Results: For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. Main conclusions: We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software. 
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making. PMID:27499698
Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework.
Talluto, Matthew V; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique
2016-02-01
Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Eastern North America (as an example). Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software. 
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making.
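The intuition behind constraining a metamodel with several sub-models can be seen in the conjugate-Gaussian case, where combining independent predictions amounts to precision-weighted pooling. This is only the simplest illustration of that intuition, with made-up numbers; the paper's framework uses full hierarchical Bayesian inference, not a closed-form product.

```python
def pool_predictions(means, variances):
    """Combine independent Gaussian predictions of the same quantity.

    Each source model contributes weight proportional to its precision
    (inverse variance); the pooled variance is always smaller than any
    single source's, reflecting the gain from integration."""
    precisions = [1.0 / v for v in variances]
    pooled_var = 1.0 / sum(precisions)
    pooled_mean = pooled_var * sum(m * p for m, p in zip(means, precisions))
    return pooled_mean, pooled_var

# Two equally confident models that disagree: the pooled estimate splits
# the difference, and the pooled variance is halved.
mean, var = pool_predictions([0.0, 2.0], [1.0, 1.0])
```

Note this closed form cannot by itself widen uncertainty where models diverge; capturing that behaviour is precisely what the hierarchical metamodel adds.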
NASA Astrophysics Data System (ADS)
Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.
2015-12-01
Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. 
Finally, we contend that creating believable soil carbon predictions requires a robust, transparent, and community-available benchmarking framework. I will present an ILAMB evaluation of several of the above-mentioned approaches in ACME, and attempt to motivate community adoption of this evaluation approach.
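The Equilibrium Chemistry Approximation mentioned above modifies classic Michaelis-Menten kinetics so that the consumer pool, not just the substrate, limits uptake, which is what lets it represent competition among microbes and plants for shared nutrients. The sketch below contrasts the two rate laws; the rate constants and pool sizes are illustrative assumptions, not values from the ACME land model.

```python
def michaelis_menten(k, consumer, substrate, K):
    """Classic MM flux: rate saturates in substrate only."""
    return k * consumer * substrate / (K + substrate)

def eca_flux(k, consumer, substrate, K):
    """ECA flux: the consumer pool also enters the denominator, so uptake
    per unit consumer declines as consumers become abundant (competition)."""
    return k * consumer * substrate / (K + consumer + substrate)

# With abundant consumers, ECA predicts markedly less total uptake than MM.
mm = michaelis_menten(1.0, 100.0, 10.0, 1.0)
eca = eca_flux(1.0, 100.0, 10.0, 1.0)
```

In a multi-nutrient setting, each consumer-substrate pair gets such a term, which is what makes the approximation tractable at ESM grid scales.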
The Use of a Block Diagram Simulation Language for Rapid Model Prototyping
NASA Technical Reports Server (NTRS)
Whitlow, Johnathan E.; Engrand, Peter
1996-01-01
The research performed this summer was a continuation of work performed during the 1995 NASA/ASEE Summer Fellowship. The focus of the work was to expand previously generated predictive models for liquid oxygen (LOX) loading into the external fuel tank of the shuttle. The models, which were developed using a block-diagram simulation language known as VisSim, were evaluated on numerous shuttle flights and found to perform well in most cases. Once the models were refined and validated, the predictive methods were integrated into the existing Rockwell software propulsion advisory tool (PAT). Although time was not sufficient to completely integrate the developed models into PAT, the ability to predict flows and pressures in the orbiter section and graphically display the results was accomplished.
Which Working Memory Functions Predict Intelligence?
ERIC Educational Resources Information Center
Oberauer, Klaus; Sub, Heinz-Martin; Wilhelm, Oliver; Wittmann, Werner W.
2008-01-01
Investigates the relationship between three factors of working memory (storage and processing, relational integration, and supervision) and four factors of intelligence (reasoning, speed, memory, and creativity) using structural equation models. Relational integration predicted reasoning ability at least as well as the storage-and-processing…
Automatically updating predictive modeling workflows support decision-making in drug design.
Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O
2016-09-01
Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound optimization related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles the performance deteriorates within weeks. Frequent automated updates of predictive models ensure best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
NASA Astrophysics Data System (ADS)
Zhu, Linqi; Zhang, Chong; Zhang, Chaomo; Wei, Yang; Zhou, Xueqing; Cheng, Yuan; Huang, Yuyang; Zhang, Le
2018-06-01
There is increasing interest in shale gas reservoirs due to their abundant reserves. As a key evaluation criterion, the total organic carbon content (TOC) of a reservoir reflects its hydrocarbon generation potential. Existing TOC calculation models are not very accurate, and there is still room for improvement. In this paper, an integrated hybrid neural network (IHNN) model is proposed for predicting TOC. This is motivated by the fact that TOC information for low-TOC reservoirs, where TOC is easy to evaluate, comes from a prediction problem, which is an inherent limitation of existing algorithms. Comparing prediction models established on 132 rock samples from the shale gas reservoir in the Jiaoshiba area shows that the accuracy of the proposed IHNN model is much higher than that of the other models: the mean square error on samples not used in establishing the models was reduced from 0.586 to 0.442. The results show that TOC is easier to predict once the logging-based prediction has been improved. Furthermore, this paper puts forward the next research direction for the prediction model. The IHNN algorithm can help evaluate the TOC of shale gas reservoirs.
Magnotti, John F; Beauchamp, Michael S
2017-02-01
Audiovisual speech integration combines information from auditory speech (talker's voice) and visual speech (talker's mouth movements) to improve perceptual accuracy. However, if the auditory and visual speech emanate from different talkers, integration decreases accuracy. Therefore, a key step in audiovisual speech perception is deciding whether auditory and visual speech have the same source, a process known as causal inference. A well-known illusion, the McGurk Effect, consists of incongruent audiovisual syllables, such as auditory "ba" + visual "ga" (AbaVga), that are integrated to produce a fused percept ("da"). This illusion raises two fundamental questions: first, given the incongruence between the auditory and visual syllables in the McGurk stimulus, why are they integrated; and second, why does the McGurk effect not occur for other, very similar syllables (e.g., AgaVba). We describe a simplified model of causal inference in multisensory speech perception (CIMS) that predicts the perception of arbitrary combinations of auditory and visual speech. We applied this model to behavioral data collected from 60 subjects perceiving both McGurk and non-McGurk incongruent speech stimuli. The CIMS model successfully predicted both the audiovisual integration observed for McGurk stimuli and the lack of integration observed for non-McGurk stimuli. An identical model without causal inference failed to accurately predict perception for either form of incongruent speech. The CIMS model uses causal inference to provide a computational framework for studying how the brain performs one of its most important tasks, integrating auditory and visual speech cues to allow us to communicate with others.
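The causal inference step described above (decide whether two cues share one source, then integrate or segregate accordingly) can be sketched with a standard Bayesian model comparison over a gridded latent variable. This is a generic Körding-style causal inference sketch in one abstract cue dimension, not the CIMS model itself; the noise levels, prior, and grid are assumptions.

```python
import math

def gauss(x, mu, sigma):
    """Gaussian density of observation x around latent value mu."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_common(xa, xv, sigma_a=1.0, sigma_v=1.0, prior=0.5, lo=-10.0, hi=10.0, n=401):
    """Posterior probability that auditory cue xa and visual cue xv
    were generated by a single common cause (grid approximation)."""
    step = (hi - lo) / (n - 1)
    grid = [lo + i * step for i in range(n)]
    u = 1.0 / (hi - lo)  # uniform prior density over the latent cue value
    # Common cause: one shared latent s generates both cues.
    like_c1 = sum(gauss(xa, s, sigma_a) * gauss(xv, s, sigma_v) * u for s in grid) * step
    # Independent causes: each cue gets its own latent.
    like_a = sum(gauss(xa, s, sigma_a) * u for s in grid) * step
    like_v = sum(gauss(xv, s, sigma_v) * u for s in grid) * step
    like_c2 = like_a * like_v
    return prior * like_c1 / (prior * like_c1 + (1 - prior) * like_c2)
```

Nearby cues yield a high common-cause posterior (integrate, as with McGurk syllables); widely separated cues yield a low one (segregate), mirroring why some incongruent pairs fuse and others do not.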
Towards Assessing the Human Trajectory Planning Horizon
Nitsch, Verena; Meinzer, Dominik; Wollherr, Dirk
2016-01-01
Mobile robots are envisioned to cooperate closely with humans and to integrate seamlessly into a shared environment. For locomotion, these environments resemble traversable areas which are shared between multiple agents such as humans and robots. The seamless integration of mobile robots into these environments requires accurate predictions of human locomotion. This work considers optimal control and model predictive control approaches for accurate trajectory prediction and proposes to integrate aspects of human behavior to improve their performance. Recently developed models are not able to accurately reproduce trajectories that result from sudden avoidance maneuvers. In particular, human locomotion behavior when handling disturbances from other agents poses a problem. The goal of this work is to investigate whether humans alter their trajectory planning horizon in order to resolve abruptly emerging collision situations. By modeling humans as model predictive controllers, the influence of the planning horizon is investigated in simulations. Based on these results, an experiment is designed to identify whether humans initiate a change in their locomotion planning behavior while moving in a complex environment. The results support the hypothesis that humans employ a shorter planning horizon to avoid collisions that are triggered by unexpected disturbances. Observations presented in this work are expected to further improve the generalizability and accuracy of prediction methods based on dynamic models. PMID:27936015
3D Printed Organ Models with Physical Properties of Tissue and Integrated Sensors.
Qiu, Kaiyan; Zhao, Zichen; Haghiashtiani, Ghazaleh; Guo, Shuang-Zhuang; He, Mingyu; Su, Ruitao; Zhu, Zhijie; Bhuiyan, Didarul B; Murugan, Paari; Meng, Fanben; Park, Sung Hyun; Chu, Chih-Chang; Ogle, Brenda M; Saltzman, Daniel A; Konety, Badrinath R; Sweet, Robert M; McAlpine, Michael C
2018-03-01
The design and development of novel methodologies and customized materials to fabricate patient-specific 3D printed organ models with integrated sensing capabilities could yield advances in smart surgical aids for preoperative planning and rehearsal. Here, we demonstrate 3D printed prostate models with physical properties of tissue and integrated soft electronic sensors using custom-formulated polymeric inks. The models show high quantitative fidelity in static and dynamic mechanical properties, optical characteristics, and anatomical geometries to patient tissues and organs. The models offer tissue-mimicking tactile sensation and behavior and thus can be used for the prediction of organ physical behavior under deformation. The prediction results show good agreement with values obtained from simulations. The models also allow the application of surgical and diagnostic tools to their surface and inner channels. Finally, via the conformal integration of 3D printed soft electronic sensors, pressure applied to the models with surgical tools can be quantitatively measured.
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.
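Test/analysis correlation of mode shapes is commonly quantified with the Modal Assurance Criterion (MAC); the abstract does not detail the specific correlation metric used, so the following is a standard-technique sketch rather than the paper's procedure:

```python
def mac(phi_a, phi_t):
    """Modal Assurance Criterion between an analytical and a test mode shape (0..1).
    1.0 means the shapes are identical up to scaling; 0.0 means they are orthogonal."""
    dot_at = sum(a * t for a, t in zip(phi_a, phi_t))
    dot_aa = sum(a * a for a in phi_a)
    dot_tt = sum(t * t for t in phi_t)
    return dot_at ** 2 / (dot_aa * dot_tt)
```

Model updating then adjusts component parameters until MAC values between predicted and measured modes approach unity.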
ERIC Educational Resources Information Center
Liao, Jung-Yu; Chang, Li-Chun; Hsu, Hsiao-Pei; Huang, Chiu-Mieh; Huang, Su-Fei; Guo, Jong-Long
2017-01-01
This study assessed the effects of a model that integrated the theory of planned behavior (TPB) with extrinsic motivation (EM) in predicting the intentions of fifth-grade students to not use illicit drugs. A cluster-sampling design was adopted in a cross-sectional survey (N = 571). The structural equation modeling results showed that the model…
Reid, Allecia E.; Aiken, Leona S.
2011-01-01
The purpose of this research was to select from the health belief model (HBM), theories of reasoned action (TRA) and planned behaviour (TPB), information-motivation-behavioural skills model (IMB), and social cognitive theory (SCT) the strongest longitudinal predictors of women’s condom use and to combine these constructs into a single integrated model of condom use. The integrated model was evaluated for prediction of condom use among young women who had steady versus casual partners. At Time 1, all constructs of the five models and condom use were assessed in an initial and a replication sample (n= 193, n= 161). Condom use reassessed 8 weeks later (Time 2) served as the main outcome. Information from IMB, perceived susceptibility, benefits, and barriers from HBM, self-efficacy and self-evaluative expectancies from SCT, and partner norm and attitudes from TPB served as indirect or direct predictors of condom use. All paths replicated across samples. Direct predictors of behaviour varied with relationship status: self-efficacy significantly predicted condom use for women with casual partners, while attitude and partner norm predicted for those with steady partners. Integrated psychosocial models, rich in constructs and relationships drawn from multiple theories of behaviour, may provide a more complete characterization of health protective behaviour. PMID:21678166
Improving of local ozone forecasting by integrated models.
Gradišar, Dejan; Grašič, Boštjan; Božnar, Marija Zlata; Mlakar, Primož; Kocijan, Juš
2016-09-01
This paper discusses the problem of forecasting maximum ozone concentrations in urban microlocations, where reliable alerting of the local population when thresholds are surpassed is necessary. To improve the forecast, a methodology of integrated models is proposed. The model is based on multilayer perceptron neural networks that use as inputs all available information from the QualeAria air-quality model, the WRF numerical weather prediction model, and on-site measurements of meteorology and air pollution. While air-quality and meteorological models cover a large 3-dimensional geographical space, their local resolution is often not satisfactory. On the other hand, empirical methods have the advantage of good local forecasts. In this paper, integrated models are used for improved 1-day-ahead forecasting of the maximum hourly value of ozone within each day for representative locations in Slovenia. The WRF meteorological model is used for forecasting meteorological variables and the QualeAria air-quality model for gas concentrations. Their predictions, together with measurements from ground stations, are used as inputs to a neural network. The model validation results show that integrated models noticeably improve ozone forecasts and provide better alert systems.
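The integrated-model idea, feeding air-quality-model forecasts, weather-model forecasts, and station measurements jointly into a multilayer perceptron, reduces at prediction time to a single forward pass. The feature values and weights below are placeholders; in practice the network is trained on historical data.

```python
import math

def mlp_forecast(features, w_hidden, b_hidden, w_out, b_out):
    """One forward pass of a single-hidden-layer perceptron with tanh hidden units."""
    hidden = [math.tanh(sum(w * x for w, x in zip(ws, features)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    return b_out + sum(w * h for w, h in zip(w_out, hidden))

# Hypothetical feature vector mixing heterogeneous sources:
# [WRF forecast temperature, QualeAria ozone forecast, yesterday's measured ozone max]
features = [28.0, 155.0, 140.0]
forecast = mlp_forecast(features,
                        w_hidden=[[0.02, 0.004, 0.003], [0.01, -0.002, 0.005]],
                        b_hidden=[-0.5, 0.1],
                        w_out=[60.0, 40.0],
                        b_out=100.0)
```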
NASA Astrophysics Data System (ADS)
Gottschalk, Ian P.; Hermans, Thomas; Knight, Rosemary; Caers, Jef; Cameron, David A.; Regnery, Julia; McCray, John E.
2017-12-01
Geophysical data have proven to be very useful for lithological characterization. However, quantitatively integrating the information gained from acquiring geophysical data generally requires colocated lithological and geophysical data for constructing a rock-physics relationship. In this contribution, the issue of integrating noncolocated geophysical and lithological data is addressed, and the results are applied to simulate groundwater flow in a heterogeneous aquifer in the Prairie Waters Project North Campus aquifer recharge site, Colorado. Two methods of constructing a rock-physics transform between electrical resistivity tomography (ERT) data and lithology measurements are assessed. In the first approach, a maximum likelihood estimation (MLE) is used to fit a bimodal lognormal distribution to horizontal cross-sections of the ERT resistivity histogram. In the second approach, a spatial bootstrap is applied to approximate the rock-physics relationship. The rock-physics transforms provide soft data for multiple-point statistics (MPS) simulations. Subsurface models are used to run groundwater flow and tracer test simulations. Each model's uncalibrated, predicted breakthrough time is evaluated based on its agreement with measured subsurface travel time values from infiltration basins to selected groundwater recovery wells. We find that incorporating geophysical information into uncalibrated flow models reduces the difference with observed values, as compared to flow models without geophysical information incorporated. The integration of geophysical data also narrows the variance of predicted tracer breakthrough times substantially. Accuracy is highest and variance is lowest in breakthrough predictions generated by the MLE-based rock-physics transform. Calibrating the ensemble of geophysically constrained models would help produce a suite of realistic flow models for predictive purposes at the site. We find that the success of breakthrough predictions is highly sensitive to the definition of the rock-physics transform; it is therefore important to model this transfer function accurately.
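A rock-physics transform of the kind described maps each resistivity value to a soft lithology probability. A minimal sketch with a two-component lognormal mixture follows; all distribution parameters are hypothetical, and the MLE fitting step itself is omitted:

```python
import math

def lognorm_pdf(x, mu, sigma):
    """Density of a lognormal with log-mean mu and log-std sigma."""
    return math.exp(-0.5 * ((math.log(x) - mu) / sigma) ** 2) / (x * sigma * math.sqrt(2.0 * math.pi))

def facies_probability(resistivity, params_sand, params_clay, p_sand=0.5):
    """Soft probability that a resistivity value belongs to the coarse (sand) mode."""
    like_sand = lognorm_pdf(resistivity, *params_sand)
    like_clay = lognorm_pdf(resistivity, *params_clay)
    num = like_sand * p_sand
    return num / (num + like_clay * (1.0 - p_sand))
```

These per-cell probabilities are exactly the kind of "soft data" that condition MPS simulations.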
Fast integration-based prediction bands for ordinary differential equation models.
Hass, Helge; Kreutz, Clemens; Timmer, Jens; Kaschek, Daniel
2016-04-15
To gain a deeper understanding of biological processes and their relevance in disease, mathematical models are built upon experimental data. Uncertainty in the data leads to uncertainties in the model's parameters and in turn to uncertainties of predictions. Mechanistic dynamic models of biochemical networks are frequently based on nonlinear differential equation systems and feature a large number of parameters, sparse observations of the model components and lack of information in the available data. Due to the curse of dimensionality, classical and sampling approaches for propagating parameter uncertainties to predictions are hardly feasible and insufficient. However, for experimental design and to discriminate between competing models, prediction and confidence bands are essential. To circumvent the hurdles of the former methods, an approach to calculate a profile likelihood on arbitrary observations for a specific time point has been introduced, which provides accurate confidence and prediction intervals for nonlinear models and is computationally feasible for high-dimensional models. In this article, reliable and smooth point-wise prediction and confidence bands to assess the model's uncertainty on the whole time-course are achieved via explicit integration with elaborate correction mechanisms. The corresponding system of ordinary differential equations is derived and tested on three established models for cellular signalling. An efficiency analysis is performed to illustrate the computational benefit compared with repeated profile likelihood calculations at multiple time points. The integration framework and the examples used in this article are provided with the software package Data2Dynamics, which is based on MATLAB and freely available at http://www.data2dynamics.org (contact: helge.hass@fdm.uni-freiburg.de). Supplementary data are available at Bioinformatics online.
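As a far simpler stand-in for the integration-based profile-likelihood bands the paper derives, first-order (delta-method) propagation already illustrates how parameter uncertainty widens into a point-wise prediction band; here for a toy exponential-decay observable with a hypothetical parameter covariance:

```python
import math

def prediction_band(t, A, k, cov, z=1.96):
    """Point-wise band for y = A*exp(-k*t) by first-order uncertainty propagation.
    cov is the 2x2 covariance of the parameter estimates (A, k)."""
    y = A * math.exp(-k * t)
    s = [math.exp(-k * t), -A * t * math.exp(-k * t)]   # sensitivities dy/dA, dy/dk
    var = sum(s[i] * cov[i][j] * s[j] for i in range(2) for j in range(2))
    half = z * math.sqrt(var)
    return y - half, y + half
```

The paper's approach replaces this linearization with explicit integration of an ODE system, which remains accurate for strongly nonlinear models.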
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.
2008-09-01
The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g. ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e. fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain necessary microstructural data to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e. thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate constitutive models. Such a process-linked-structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research will be illustrated to show the usefulness and applications of this integrated approach.
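The fiber orientation state passed from process simulation to structural analysis is typically summarized by a second-order orientation tensor. A minimal planar sketch (illustrative only, not the ORIENT/Moldflow implementation):

```python
import math

def orientation_tensor(angles):
    """Planar second-order fiber orientation tensor a_ij = <p_i p_j>, p = (cos t, sin t).
    Fully aligned fibers give a_xx = 1; an isotropic set gives a_xx = a_yy = 0.5."""
    n = len(angles)
    axx = sum(math.cos(t) ** 2 for t in angles) / n
    ayy = sum(math.sin(t) ** 2 for t in angles) / n
    axy = sum(math.sin(t) * math.cos(t) for t in angles) / n
    return [[axx, axy], [axy, ayy]]
```

Constitutive models in the structural step then take such tensors as input to orient the anisotropic stiffness.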
The US EPA National Exposure Research Laboratory (NERL) is currently developing an integrated human exposure source-to-dose modeling system (HES2D). This modeling system will incorporate models that use a probabilistic approach to predict population exposures to environmental ...
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions: A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface.
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
Wang, Kewei; Song, Wentao; Li, Jinping; Lu, Wu; Yu, Jiangang; Han, Xiaofeng
2016-05-01
The aim of this study is to forecast the incidence of bacillary dysentery with a prediction model. We collected the annual and monthly laboratory data of confirmed cases from January 2004 to December 2014. In this study, we applied an autoregressive integrated moving average (ARIMA) model to forecast bacillary dysentery incidence in Jiangsu, China. The ARIMA(1, 1, 1) × (1, 1, 2)₁₂ model fitted the number of cases during January 2004 to December 2014 well. The fitted model was then used to predict bacillary dysentery incidence during the period January to August 2015, and the observed number of cases during January-August 2015 fell within the model's confidence interval. This study shows that the ARIMA model fits the fluctuations in bacillary dysentery frequency, and it can be used for future forecasting when applied to bacillary dysentery prevention and control.
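The structure of such a seasonal model, regular differencing d = 1 plus seasonal differencing D = 1 with period 12, can be sketched without the full ARIMA estimation machinery. The AR(1) coefficient below is a placeholder, not a fitted value:

```python
def difference(series, lag=1):
    """First difference of a series at the given lag."""
    return [series[i] - series[i - lag] for i in range(lag, len(series))]

def seasonal_arima_step(series, period=12, phi=0.5):
    """One-step-ahead forecast mimicking the d=1, D=1 differencing of a seasonal ARIMA.
    The doubly differenced series w_t = (y_t - y_{t-1}) - (y_{t-period} - y_{t-period-1})
    is modelled by a placeholder AR(1) core, w_{t+1} ~ phi * w_t, then undifferenced."""
    w = difference(difference(series, 1), period)
    w_next = phi * w[-1]
    # Invert both differencing steps to return to the original scale.
    return w_next + series[-1] + series[-period] - series[-period - 1]
```

In practice the AR/MA orders and coefficients are estimated (e.g. by maximum likelihood) rather than fixed as here.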
NASA Astrophysics Data System (ADS)
To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian
2008-09-01
Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence, predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling, and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and the state-of-the-art model identification strategies where atomistic properties are calibrated by quantum calculations.
We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed. The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.
The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct
ERIC Educational Resources Information Center
Knezek, Gerald; Christensen, Rhonda
2015-01-01
An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…
A System for Integrated Reliability and Safety Analyses
NASA Technical Reports Server (NTRS)
Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Coumeri, Marc; Scheidler, Peter, Jr.; Bonesteel, Charles
1999-01-01
We present an integrated reliability and aviation safety analysis tool. The reliability models for selected infrastructure components of the air traffic control system are described. The results of this model are used to evaluate the likelihood of seeing outcomes predicted by simulations with failures injected. We discuss the design of the simulation model, and the user interface to the integrated toolset.
This study will provide a general methodology for integrating threshold information from multiple species ecological metrics, allow for prediction of changes of alternative stable states, and provide a risk assessment tool that can be applied to adaptive management. The integr...
Knecht, Carolin; Mort, Matthew; Junge, Olaf; Cooper, David N.; Krawczak, Michael
2017-01-01
The in silico prediction of the functional consequences of mutations is an important goal of human pathogenetics. However, bioinformatic tools that classify mutations according to their functionality employ different algorithms, so that predictions may vary markedly between tools. We therefore integrated nine popular prediction tools (PolyPhen-2, SNPs&GO, MutPred, SIFT, MutationTaster2, Mutation Assessor and FATHMM, as well as the conservation-based Grantham Score and PhyloP) into a single predictor. The optimal combination of these tools was selected by means of a wide range of statistical modeling techniques, drawing upon 10 029 disease-causing single nucleotide variants (SNVs) from the Human Gene Mutation Database and 10 002 putatively ‘benign’ non-synonymous SNVs from UCSC. Predictive performance was found to be markedly improved by model-based integration, whilst maximum predictive capability was obtained with either random forest, decision tree or logistic regression analysis. A combination of PolyPhen-2, SNPs&GO, MutPred, MutationTaster2 and FATHMM was found to perform as well as all tools combined. Comparison of our approach with other integrative approaches such as Condel, CoVEC, CAROL, CADD, MetaSVM and MetaLR using an independent validation dataset revealed the superiority of our newly proposed integrative approach. An online implementation of this approach, IMHOTEP (‘Integrating Molecular Heuristics and Other Tools for Effect Prediction’), is provided at http://www.uni-kiel.de/medinfo/cgi-bin/predictor/. PMID:28180317
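Model-based integration of tool scores, for instance via the logistic regression evaluated in the study, reduces at prediction time to a weighted logistic combination. The weights, bias, and scores below are hypothetical:

```python
import math

def logistic_ensemble(scores, weights, bias):
    """Combine per-tool pathogenicity scores into a single probability (logistic model)."""
    z = bias + sum(w * s for w, s in zip(weights, scores))
    return 1.0 / (1.0 + math.exp(-z))
```

A variant classifies as deleterious when the returned probability exceeds 0.5; the study's actual coefficients were learned from the HGMD/UCSC training sets.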
Sadique, Z; Grieve, R; Harrison, D A; Jit, M; Allen, E; Rowan, K M
2013-12-01
This article proposes an integrated approach to the development, validation, and evaluation of new risk prediction models, illustrated with the Fungal Infection Risk Evaluation study, which developed risk models to identify non-neutropenic, critically ill adult patients at high risk of invasive fungal disease (IFD). Our decision-analytical model compared alternative strategies for preventing IFD at up to three clinical decision time points (critical care admission, after 24 hours, and end of day 3), followed with antifungal prophylaxis for those judged "high" risk versus "no formal risk assessment." We developed prognostic models to predict the risk of IFD before critical care unit discharge, with data from 35,455 admissions to 70 UK adult, critical care units, and validated the models externally. The decision model was populated with positive predictive values and negative predictive values from the best-fitting risk models. We projected lifetime cost-effectiveness and expected value of partial perfect information for groups of parameters. The risk prediction models performed well in internal and external validation. Risk assessment and prophylaxis at the end of day 3 was the most cost-effective strategy at the 2% and 1% risk threshold. Risk assessment at each time point was the most cost-effective strategy at a 0.5% risk threshold. Expected values of partial perfect information were high for positive predictive values or negative predictive values (£11 million-£13 million) and quality-adjusted life-years (£11 million). It is cost-effective to formally assess the risk of IFD for non-neutropenic, critically ill adult patients. This integrated approach to developing and evaluating risk models is useful for informing clinical practice and future research investment.
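The decision-analytic comparison rests on expected-cost arithmetic over the models' predictive values. A hedged sketch with invented costs, probabilities, and prophylaxis efficacy, not the study's figures:

```python
def expected_cost(p_high, ppv, npv, c_test, c_proph, c_ifd, efficacy=0.8):
    """Expected per-patient cost of 'assess risk, give prophylaxis if flagged high'.
    p_high: fraction flagged high risk; ppv/npv: predictive values of the risk model."""
    ifd_treated = p_high * ppv                   # true positives: prophylaxis cuts IFD cost
    ifd_missed = (1.0 - p_high) * (1.0 - npv)    # false negatives: full IFD cost
    return (c_test
            + p_high * c_proph
            + ifd_treated * (1.0 - efficacy) * c_ifd
            + ifd_missed * c_ifd)

def cost_no_assessment(p_ifd, c_ifd):
    """Expected per-patient cost with no formal risk assessment and no prophylaxis."""
    return p_ifd * c_ifd
```

The full analysis additionally discounts lifetime costs and quality-adjusted life-years, and varies the risk threshold as in the abstract.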
A High Precision Prediction Model Using Hybrid Grey Dynamic Model
ERIC Educational Resources Information Center
Li, Guo-Dong; Yamaguchi, Daisuke; Nagai, Masatake; Masuda, Shiro
2008-01-01
In this paper, we propose a new prediction analysis model which combines the first order one variable Grey differential equation Model (abbreviated as GM(1,1) model) from grey system theory and time series Autoregressive Integrated Moving Average (ARIMA) model from statistics theory. We abbreviate the combined GM(1,1) ARIMA model as ARGM(1,1)…
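The GM(1,1) component can be stated compactly: accumulate the series, fit the whitening equation dx1/dt + a*x1 = b by least squares, and forecast by undoing the accumulation. A minimal sketch (the ARIMA half of the hybrid model is omitted):

```python
import math

def gm11_forecast(x0, steps=1):
    """Grey GM(1,1): accumulate x0, fit dx1/dt + a*x1 = b, forecast future x0 values."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # accumulated series (AGO)
    z = [-(x1[i - 1] + x1[i]) / 2.0 for i in range(1, n)]    # negated background values
    y = x0[1:]
    m = n - 1
    # Least squares for y ~ a*z + b via the 2x2 normal equations
    szz = sum(v * v for v in z)
    sz, sy = sum(z), sum(y)
    szy = sum(v * w for v, w in zip(z, y))
    det = szz * m - sz * sz
    a = (szy * m - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    c = x0[0] - b / a
    x1_hat = lambda k: c * math.exp(-a * k) + b / a          # fitted accumulated series
    return [x1_hat(n - 1 + s) - x1_hat(n - 2 + s) for s in range(1, steps + 1)]
```

GM(1,1) is exact for exponential trends; the hybrid ARGM(1,1) model then lets ARIMA absorb the residual structure.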
Predictive Models and Computational Embryology
EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...
We developed a numerical model to predict chemical concentrations in indoor environments resulting from soil vapor intrusion and volatilization from groundwater. The model, which integrates new and existing algorithms for chemical fate and transport, was originally...
Integrated and spectral energetics of the GLAS general circulation model
NASA Technical Reports Server (NTRS)
Tenenbaum, J.
1982-01-01
Integrated and spectral error energetics of the GLAS general circulation model are compared with observations for periods in January 1975, 1976, and 1977. For two cases the model shows significant skill in predicting integrated energetics quantities out to two weeks, and for all three cases, the integrated monthly mean energetics show qualitative improvements over previous versions of the model in eddy kinetic energy and barotropic conversions. Fundamental difficulties remain with leakage of energy to the stratospheric level, particularly above strong initial jet streams associated in part with regions of steep terrain. The spectral error growth study represents the first comparison of general circulation model spectral energetics predictions with the corresponding observational spectra on a day-by-day basis. The major conclusion is that eddy kinetic energy can be correct while significant errors occur in the kinetic energy of wavenumber 3. Both the model and observations show evidence of single-wavenumber dominance in eddy kinetic energy and the correlation of spectral kinetic and potential energy.
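Spectral energetics comparisons of this kind reduce, at their core, to decomposing a periodic field into zonal wavenumbers and summing kinetic energy per wavenumber. A naive-DFT sketch for a 1-D velocity sample (illustrative; the GLAS analysis operates on full atmospheric fields):

```python
import cmath, math

def eddy_kinetic_energy_spectrum(u):
    """Kinetic energy per zonal wavenumber for a periodic 1-D velocity sample (naive DFT)."""
    n = len(u)
    spec = []
    for k in range(n // 2 + 1):
        coef = sum(u[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n)) / n
        # Double non-extreme wavenumbers to account for the conjugate negative-k half.
        mult = 1.0 if k in (0, n // 2) else 2.0
        spec.append(mult * 0.5 * abs(coef) ** 2)
    return spec
```

Single-wavenumber dominance, as reported for wavenumber 3, shows up as one spectral bin carrying most of the eddy kinetic energy.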
Integrating Conceptual Knowledge Within and Across Representational Modalities
McNorgan, Chris; Reid, Jackie; McRae, Ken
2011-01-01
Research suggests that concepts are distributed across brain regions specialized for processing information from different sensorimotor modalities. Multimodal semantic models fall into one of two broad classes differentiated by the assumed hierarchy of convergence zones over which information is integrated. In shallow models, communication within- and between-modality is accomplished using either direct connectivity, or a central semantic hub. In deep models, modalities are connected via cascading integration sites with successively wider receptive fields. Four experiments provide the first direct behavioral tests of these models using speeded tasks involving feature inference and concept activation. Shallow models predict no within-modal versus cross-modal difference in either task, whereas deep models predict a within-modal advantage for feature inference, but a cross-modal advantage for concept activation. Experiments 1 and 2 used relatedness judgments to tap participants’ knowledge of relations for within- and cross-modal feature pairs. Experiments 3 and 4 used a dual feature verification task. The pattern of decision latencies across Experiments 1 to 4 is consistent with a deep integration hierarchy. PMID:21093853
Todd, Robert G.; van der Zee, Lucas
2016-01-01
The eukaryotic cell cycle is robustly designed, with interacting molecules organized within a definite topology that ensures temporal precision of its phase transitions. Its underlying dynamics are regulated by molecular switches, for which remarkable insights have been provided by genetic and molecular biology efforts. In a number of cases, this information has been made predictive through computational models. These models have allowed for the identification of novel molecular mechanisms, later validated experimentally. Logical modeling represents one of the youngest approaches to address cell cycle regulation. We summarize the advances that this type of modeling has achieved to reproduce and predict cell cycle dynamics. Furthermore, we present the challenge that this type of modeling is now ready to tackle: its integration with intracellular networks, and its formalisms, to understand crosstalk underlying system-level properties, the ultimate aim of multi-scale models. Specifically, we discuss and illustrate how such an integration may be realized, by integrating a minimal logical model of the cell cycle with a metabolic network. PMID:27993914
Methods for exploring uncertainty in groundwater management predictions
Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S; Jakeman, Anthony J.; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew
2016-01-01
Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.
Hansen, James W
2005-01-01
Interest in integrating crop simulation models with dynamic seasonal climate forecast models is expanding in response to a perceived opportunity to add value to seasonal climate forecasts for agriculture. Integrated modelling may help to address some obstacles to effective agricultural use of climate information. First, modelling can address the mismatch between farmers' needs and available operational forecasts. Probabilistic crop yield forecasts are directly relevant to farmers' livelihood decisions and, at a different scale, to early warning and market applications. Second, credible ex ante evidence of livelihood benefits, using integrated climate–crop–economic modelling in a value-of-information framework, may assist in the challenge of obtaining institutional, financial and political support, and inform targeting for greatest benefit. Third, integrated modelling can reduce the risk and learning time associated with adaptation and adoption, and related uncertainty on the part of advisors and advocates. It can provide insights to advisors, and enhance site-specific interpretation of recommendations when driven by spatial data. Model-based ‘discussion support systems’ contribute to learning and farmer–researcher dialogue. Integrated climate–crop models may play a genuine but limited role in efforts to support climate risk management in agriculture, but only if they are used appropriately, with understanding of their capabilities and limitations, and with cautious evaluation of model predictions and of the insights that arise from model-based decision analysis. PMID:16433092
Meng, Jun; Shi, Lin; Luan, Yushi
2014-01-01
Background: Confident identification of microRNA-target interactions is important for studying the function of microRNA (miRNA). Although several computational miRNA target prediction methods have been proposed for plants, their results tend to be inconsistent and often include false positives. To address these issues, we developed an integrated model for identifying plant miRNA–target interactions. Results: Three online miRNA target prediction toolkits and machine learning algorithms were integrated to identify and analyze Arabidopsis thaliana miRNA-target interactions. Principal component analysis (PCA) feature extraction and self-training technology were introduced to improve the performance. Results showed that the proposed model outperformed previously existing methods. The results were validated using degradome sequencing supported Arabidopsis thaliana miRNA-target interactions. The proposed model, constructed on Arabidopsis thaliana, was run over Oryza sativa and Vitis vinifera to demonstrate that it is effective for other plant species. Conclusions: The integrated model of online predictors and a local PCA-SVM classifier yielded credible, high-quality miRNA-target interactions. The supervised learning algorithm of the PCA-SVM classifier was employed in plant miRNA target identification for the first time. Its performance can be substantially improved if more experimentally proven training samples are provided. PMID:25051153
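The PCA-SVM step described in this abstract can be sketched as below. This is an illustrative reconstruction, not the authors' code: the feature matrix, dimensions, labels, and PCA component count are all synthetic stand-ins for the miRNA-target descriptors used in the paper.

```python
# Hypothetical sketch of a PCA-SVM classifier for candidate miRNA-target
# pairs: features are reduced with PCA, then classified with an SVM.
# All data below are synthetic; only the pipeline shape mirrors the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))           # 200 candidate pairs, 30 raw features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # 1 = "true interaction" (synthetic)

model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X[:150], y[:150])              # train on 150 pairs
acc = model.score(X[150:], y[150:])      # held-out accuracy on the rest
```

The self-training step in the paper would then iteratively add confidently classified unlabeled pairs to the training set; that loop is omitted here.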
2016-01-01
Modeling and prediction of polar organic chemical integrative sampler (POCIS) sampling rates (Rs) for 73 compounds using artificial neural networks (ANNs) is presented for the first time. Two models were constructed: the first was developed ab initio using a genetic algorithm (GSD-model) to shortlist 24 descriptors covering constitutional, topological, geometrical and physicochemical properties and the second model was adapted for Rs prediction from a previous chromatographic retention model (RTD-model). Mechanistic evaluation of descriptors showed that models did not require comprehensive a priori information to predict Rs. Average predicted errors for the verification and blind test sets were 0.03 ± 0.02 L d–1 (RTD-model) and 0.03 ± 0.03 L d–1 (GSD-model) relative to experimentally determined Rs. Prediction variability in replicated models was the same or less than for measured Rs. Networks were externally validated using a measured Rs data set of six benzodiazepines. The RTD-model performed best in comparison to the GSD-model for these compounds (average absolute errors of 0.0145 ± 0.008 L d–1 and 0.0437 ± 0.02 L d–1, respectively). Improvements to generalizability of modeling approaches will be reliant on the need for standardized guidelines for Rs measurement. The use of in silico tools for Rs determination represents a more economical approach than laboratory calibrations. PMID:27363449
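The ANN regression described above can be sketched with a small feed-forward network mapping molecular descriptors to sampling rates. The descriptor matrix, Rs values, and network size here are illustrative assumptions, not the paper's data or architecture.

```python
# Illustrative sketch of ANN-based Rs prediction: a small feed-forward
# network maps per-compound descriptors to POCIS sampling rates (L/day).
# Synthetic data; 73 compounds and 24 descriptors echo the abstract only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(size=(73, 24))           # 73 compounds x 24 descriptors
rs = 0.05 + 0.1 * X[:, :3].mean(axis=1)  # synthetic Rs values in L/day

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=1)
net.fit(X[:60], rs[:60])                 # train on 60 compounds
err = np.abs(net.predict(X[60:]) - rs[60:]).mean()  # mean absolute error
```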
An integrated approach to reconstructing genome-scale transcriptional regulatory networks
Imam, Saheed; Noguera, Daniel R.; Donohue, Timothy J.; ...
2015-02-27
Transcriptional regulatory networks (TRNs) program cells to dynamically alter their gene expression in response to changing internal or environmental conditions. In this study, we develop a novel workflow for generating large-scale TRN models that integrates comparative genomics data, global gene expression analyses, and intrinsic properties of transcription factors (TFs). An assessment of this workflow using benchmark datasets for the well-studied γ-proteobacterium Escherichia coli showed that it outperforms expression-based inference approaches, having a significantly larger area under the precision-recall curve. Further analysis indicated that this integrated workflow captures different aspects of the E. coli TRN than expression-based approaches, potentially making them highly complementary. We leveraged this new workflow and observations to build a large-scale TRN model for the α-proteobacterium Rhodobacter sphaeroides that comprises 120 gene clusters, 1211 genes (including 93 TFs), 1858 predicted protein-DNA interactions and 76 DNA binding motifs. We found that ~67% of the predicted gene clusters in this TRN are enriched for functions ranging from photosynthesis or central carbon metabolism to environmental stress responses. We also found that members of many of the predicted gene clusters were consistent with prior knowledge in R. sphaeroides and/or other bacteria. Experimental validation of predictions from this R. sphaeroides TRN model showed that high precision and recall were also obtained for TFs involved in photosynthesis (PpsR), carbon metabolism (RSP_0489) and iron homeostasis (RSP_3341). In addition, this integrative approach enabled generation of TRNs with increased information content relative to R. sphaeroides TRN models built via other approaches. We also show how this approach can be used to simultaneously produce TRN models for each related organism used in the comparative genomics analysis.
Our results highlight the advantages of integrating comparative genomics of closely related organisms with gene expression data to assemble large-scale TRN models with high-quality predictions.
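The benchmark metric this abstract relies on, area under the precision-recall curve, can be computed as sketched below. The predicted scores and gold-standard labels are synthetic placeholders for the TF-gene interaction benchmark the study used.

```python
# Minimal sketch of area under the precision-recall curve (AUPR) for a
# set of predicted regulatory interactions, scored against a synthetic
# gold standard. average_precision_score approximates the AUPR.
import numpy as np
from sklearn.metrics import average_precision_score

rng = np.random.default_rng(2)
truth = rng.integers(0, 2, size=500)           # 1 = known TF-gene interaction
scores = truth * 0.5 + rng.uniform(size=500)   # predictions favoring true pairs
aupr = average_precision_score(truth, scores)  # area under precision-recall
```

AUPR is preferred over ROC AUC here because gold-standard regulatory interactions are sparse relative to all possible TF-gene pairs.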
Marshall, Amy D.; Jones, Damon E.; Feinberg, Mark E.
2011-01-01
We tested an integrative model of individual and dyadic variables contributing to intimate partner violence (IPV) perpetration. Based on the vulnerability-stress-adaptation (VSA) model, we hypothesized that three “enduring vulnerabilities” (i.e., antisocial behavior, hostility, and depressive symptoms) would be associated with a “maladaptive process” (i.e., negative relationship attributions) that would lead to difficulties in couple conflict resolution, thus leading to IPV. Among a community sample of 167 heterosexual couples who were expecting their first child, we used an actor-partner interdependence model to account for the dyadic nature of conflict and IPV, as well as a hurdle count model to improve upon prior methods for modeling IPV data. Study results provided general support for the integrative model, demonstrating the importance of considering couple conflict in the prediction of IPV and showing the relative importance of multiple predictor variables. Gender symmetry was observed for the prediction of IPV occurrence, with gender differences emerging in the prediction of IPV frequency. Relatively speaking, the prediction of IPV frequency appeared to be a function of enduring vulnerabilities among men, but a function of couple conflict among women. Results also revealed important cross-gender effects in the prediction of IPV, reflecting the inherently dyadic nature of IPV, particularly in the case of “common couple violence.” Future research using longitudinal designs is necessary to verify the conclusions suggested by the current results. PMID:21875196
NASA Astrophysics Data System (ADS)
Riley, W. J.; Zhu, Q.; Tang, J.
2016-12-01
The land models integrated in Earth System Models (ESMs) are critical components necessary to predict soil carbon dynamics and carbon-climate interactions under a changing climate. Yet, these models have been shown to have poor predictive power when compared with observations and ignore many processes known to the observational communities to influence above and belowground carbon dynamics. Here I will report work to tightly couple observations and perturbation experiment results with development of an ESM land model (ALM), focusing on nutrient constraints of the terrestrial C cycle. Using high-frequency flux tower observations and short-term nitrogen and phosphorus perturbation experiments, we show that conceptualizing plant and soil microbe interactions as a multi-substrate, multi-competitor kinetic network allows for accurate prediction of nutrient acquisition. Next, using multiple-year FACE and fertilization response observations at many forest sites, we show that capturing the observed responses requires representation of dynamic allocation to respond to the resulting stresses. Integrating the mechanisms implied by these observations into ALM leads to much lower observational bias and to very different predictions of long-term soil and aboveground C stocks and dynamics, and therefore C-climate feedbacks. I describe how these types of observational constraints are being integrated into the open-source International Land Model Benchmarking (ILAMB) package, and end with the argument that consolidating as many observations of all sorts for easy use by modelers is an important goal to improve C-climate feedback predictions.
Comparison of free-piston Stirling engine model predictions with RE1000 engine test data
NASA Technical Reports Server (NTRS)
Tew, R. C., Jr.
1984-01-01
Predictions of a free-piston Stirling engine model are compared with RE1000 engine test data taken at NASA-Lewis Research Center. The model validation and the engine testing are being done under a joint interagency agreement between the Department of Energy's Oak Ridge National Laboratory and NASA-Lewis. A kinematic code developed at Lewis was upgraded to permit simulation of free-piston engine performance; it was further upgraded and modified at Lewis and is currently being validated. The model predicts engine performance by numerical integration of equations for each control volume in the working space. Piston motions are determined by numerical integration of the force balance on each piston or can be specified as Fourier series. In addition, the model Fourier analyzes the various piston forces to permit the construction of phasor force diagrams. The paper compares predicted and experimental values of power and efficiency and shows phasor force diagrams for the RE1000 engine displacer and piston. Further development plans for the model are also discussed.
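The force-balance integration described above can be sketched in miniature. A linear spring-damper stands in for the gas-pressure and gas-spring forces that the real engine model computes per control volume; the mass, damping, stiffness, and time step are all illustrative, not RE1000 values.

```python
# Toy analogue of piston motion by numerical integration of a force
# balance: m*a = -c*v - k*x, advanced with semi-implicit Euler. In the
# engine model, the right-hand side comes from control-volume pressures.
m, c, k = 1.0, 0.5, 100.0   # mass (kg), damping (N s/m), stiffness (N/m)
x, v, dt = 0.01, 0.0, 1e-4  # initial displacement (m), velocity (m/s), step (s)

positions = []
for _ in range(10000):        # 1 s of simulated time
    a = (-c * v - k * x) / m  # acceleration from the force balance
    v += a * dt               # update velocity first (semi-implicit Euler)
    x += v * dt               # then position with the new velocity
    positions.append(x)
```

A Fourier analysis of the individual force terms over one cycle, as the paper describes, would then yield the phasor force diagrams.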
Integrated computational model of the bioenergetics of isolated lung mitochondria
Zhang, Xiao; Dash, Ranjan K.; Jacobs, Elizabeth R.; Camara, Amadou K. S.; Clough, Anne V.; Audi, Said H.
2018-01-01
Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. 
In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria. PMID:29889855
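The coupling to a pharmacokinetic model for rhodamine 123 uptake can be caricatured as first-order exchange between buffer and mitochondria, biased by membrane potential. The rate constant, potential, and Nernst-style accumulation factor below are illustrative assumptions, not the paper's fitted parameters.

```python
# Hypothetical sketch of potential-driven dye uptake: first-order
# exchange of rhodamine 123 between buffer and mitochondria, with the
# equilibrium ratio set by a Nernst-style factor. Illustrative values.
import math

dpsi = 180.0                  # membrane potential (mV), illustrative
bias = math.exp(dpsi / 61.5)  # Nernst-style accumulation factor (~37 C)
k = 0.01                      # exchange rate constant (1/s)
c_buf, c_mito, dt = 1.0, 0.0, 0.1  # concentrations (a.u.), step (s)

for _ in range(6000):                     # ~10 min of uptake, explicit Euler
    flux = k * (c_buf - c_mito / bias)    # net uptake flux into mitochondria
    c_buf -= flux * dt
    c_mito += flux * dt                   # mass is conserved between pools
```

In the actual integrated model, `bias` would itself be a dynamic output of the bioenergetics submodel rather than a constant.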
Stegen, James C.
2018-04-10
To improve predictions of ecosystem function in future environments, we need to integrate the ecological and environmental histories experienced by microbial communities with hydrobiogeochemistry across scales. A key issue is whether we can derive generalizable scaling relationships that describe this multiscale integration. There is a strong foundation for addressing these challenges. We have the ability to infer ecological history with null models and reveal impacts of environmental history through laboratory and field experimentation. Recent developments also provide opportunities to inform ecosystem models with targeted omics data. A major next step is coupling knowledge derived from such studies with multiscale modeling frameworks that are predictive under non-steady-state conditions. This is particularly true for systems spanning dynamic interfaces, which are often hot spots of hydrobiogeochemical function. Here, we can advance predictive capabilities through a holistic perspective focused on the nexus of history, ecology, and hydrobiogeochemistry.
An Integrated Finite Element-based Simulation Framework: From Hole Piercing to Hole Expansion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Xiaohua; Sun, Xin; Golovashchenko, Sergey F.
An integrated finite element-based modeling framework is developed to predict the hole expansion ratio (HER) of AA6111-T4 sheet by considering the piercing-induced damage around the hole edge. Using damage models and parameters calibrated from previously reported tensile stretchability studies, the predicted HER correlates well with experimentally measured HER values for different hole piercing clearances. The hole piercing model shows burrs are not generated on the sheared surface for clearances less than 20%, which corresponds well with the experimental data on pierced-hole cross-sections. Finite-element-calculated HER also is not especially sensitive to piercing clearances less than this value. However, as clearances increase to 30% and further to 40%, the HER values are predicted to be considerably smaller, also consistent with experimental measurements. Upon validation, the integrated modeling framework is used to examine the effects of different hole piercing and hole expansion conditions on the critical HERs for AA6111-T4.
Predictive Models and Computational Toxicology (II IBAMTOX)
EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...
The spatial structure of a nonlinear receptive field.
Schwartz, Gregory W; Okawa, Haruhisa; Dunn, Felice A; Morgan, Josh L; Kerschensteiner, Daniel; Wong, Rachel O; Rieke, Fred
2012-11-01
Understanding a sensory system implies the ability to predict responses to a variety of inputs from a common model. In the retina, this includes predicting how the integration of signals across visual space shapes the outputs of retinal ganglion cells. Existing models of this process generalize poorly to predict responses to new stimuli. This failure arises in part from properties of the ganglion cell response that are not well captured by standard receptive-field mapping techniques: nonlinear spatial integration and fine-scale heterogeneities in spatial sampling. Here we characterize a ganglion cell's spatial receptive field using a mechanistic model based on measurements of the physiological properties and connectivity of only the primary excitatory circuitry of the retina. The resulting simplified circuit model successfully predicts ganglion-cell responses to a variety of spatial patterns and thus provides a direct correspondence between circuit connectivity and retinal output.
Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model
Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance
2014-01-01
Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
Uncertainty prediction for PUB
NASA Astrophysics Data System (ADS)
Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.
2003-04-01
IAHS’ Prediction in Ungauged Basins (PUB) initiative attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative ways of uncertainty prediction which could be linked with new blueprints for PUB, thereby showing how equifinality-based models could be grasped using practical gauging strategies like the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed from observations of the Potiribu Project, an NCE layout in representative basins of a subtropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope scale (0.125 km2) and at the mesoscale (0.125-560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables, with changing likelihood surfaces of experiments using the Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale and (2) the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation from point (distributed) to area (lumped) processes. In this way, the STS addresses the uncertainty bounds of model parameters in an upscaling process at the hillslope. On the other hand, the IPH approach regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidence from the Potiribu NCE layout shows novel pathways for uncertainty prediction under a PUB perspective in representative basins of world biomes.
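The Green-Ampt infiltration model invoked above has a compact closed form: infiltration capacity f = K(1 + ψΔθ/F), where F is cumulative infiltration. The sketch below advances it with an explicit time step; the soil parameters are textbook-style placeholders, not Potiribu calibrations.

```python
# Sketch of Green-Ampt infiltration under ponded conditions:
#   f = K * (1 + psi * d_theta / F),  dF/dt = f.
# Parameters are illustrative, not fitted to the Potiribu plots.
K = 1.0            # saturated hydraulic conductivity (cm/h)
psi = 11.0         # wetting-front suction head (cm)
d_theta = 0.3      # initial soil moisture deficit (-)
F, dt = 0.1, 0.01  # cumulative infiltration (cm), time step (h)

rates = []
for _ in range(500):                   # 5 h of ponded infiltration
    f = K * (1.0 + psi * d_theta / F)  # infiltration capacity (cm/h)
    F += f * dt                        # accumulate infiltrated depth
    rates.append(f)
```

As F grows, f decays toward K, which is the behavior whose likelihood surface the microscale experiments constrain.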
Rahman, Rahbel; Pinto, Rogério M.; Wall, Melanie M.
2017-01-01
Integration of health education and welfare services in primary care systems is a key strategy to solve the multiple determinants of chronic diseases, such as Human Immunodeficiency Virus Infection and Acquired Immune Deficiency Syndrome (HIV/AIDS). However, there is a scarcity of conceptual models from which to build integration strategies. We provide a model based on cross-sectional data from 168 Community Health Agents, 62 nurses, and 32 physicians in two municipalities in Brazil’s Unified Health System (UHS). The outcome, service integration, comprised HIV education, community activities (e.g., health walks and workshops), and documentation services (e.g., obtainment of working papers and birth certificates). Predictors included individual factors (provider confidence, knowledge/skills, perseverance, efficacy); job characteristics (interprofessional collaboration, work-autonomy, decision-making autonomy, skill variety); and organizational factors (work conditions and work resources). Structural equation modeling was used to identify factors associated with service integration. Knowledge and skills, skill variety, confidence, and perseverance predicted greater integration of HIV education alongside community activities and documentation services. Job characteristics and organizational factors did not predict integration. Our study offers an explanatory model that can be adapted to examine other variables that may influence integration of different services in global primary healthcare systems. Findings suggest that practitioner trainings to improve integration should focus on cognitive constructs—confidence, perseverance, knowledge, and skills. PMID:28335444
A model for prediction of STOVL ejector dynamics
NASA Technical Reports Server (NTRS)
Drummond, Colin K.
1989-01-01
A semi-empirical control-volume approach to ejector modeling for transient performance prediction is presented. This new approach is motivated by the need for a predictive real-time ejector subsystem simulation for Short Take-Off and Vertical Landing (STOVL) integrated flight and propulsion control design applications. Emphasis is placed on discussion of the approximate characterization of the mixing process central to thrust-augmenting ejector operation. The proposed ejector model suggests transient flow predictions are possible with a model based on steady-flow data. A practical test case is presented to illustrate model calibration.
Kim, Dokyoon; Joung, Je-Gun; Sohn, Kyung-Ah; Shin, Hyunjung; Park, Yu Rang; Ritchie, Marylyn D; Kim, Ju Han
2015-01-01
Objective: Cancer can involve gene dysregulation via multiple mechanisms, so no single level of genomic data fully elucidates tumor behavior due to the presence of numerous genomic variations within or between levels in a biological system. We have previously proposed a graph-based integration approach that combines multi-omics data including copy number alteration, methylation, miRNA, and gene expression data for predicting clinical outcome in cancer. However, genomic features likely interact with other genomic features in complex signaling or regulatory networks, since cancer is caused by alterations in pathways or complete processes. Methods: Here we propose a new graph-based framework for integrating multi-omics data and genomic knowledge to improve power in predicting clinical outcomes and elucidate interplay between different levels. To highlight the validity of our proposed framework, we used an ovarian cancer dataset from The Cancer Genome Atlas for predicting stage, grade, and survival outcomes. Results: Integrating multi-omics data with genomic knowledge to construct pre-defined features resulted in higher performance in clinical outcome prediction and higher stability. For the grade outcome, the model with gene expression data produced an area under the receiver operating characteristic curve (AUC) of 0.7866. However, models of the integration with pathway, Gene Ontology, chromosomal gene set, and motif gene set consistently outperformed the model with genomic data only, attaining AUCs of 0.7873, 0.8433, 0.8254, and 0.8179, respectively. Conclusions: Integrating multi-omics data and genomic knowledge to improve understanding of molecular pathogenesis and underlying biology in cancer should improve diagnostic and prognostic indicators and the effectiveness of therapies. PMID:25002459
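The AUC figures reported in this abstract come from the standard receiver operating characteristic computation, sketched below with synthetic labels and predicted probabilities standing in for the TCGA-derived features and outcomes.

```python
# Minimal sketch of the evaluation metric used above: area under the
# ROC curve for a binary clinical outcome. Labels and probabilities
# are synthetic, chosen only to exercise roc_auc_score.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
y = rng.integers(0, 2, size=300)               # e.g. tumor grade (binary)
prob = 0.6 * y + 0.4 * rng.uniform(size=300)   # predicted probability per case
auc = roc_auc_score(y, prob)                   # 0.5 = chance, 1.0 = perfect
```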
Qian, Liwei; Zheng, Haoran; Zhou, Hong; Qin, Ruibin; Li, Jinlong
2013-01-01
The increasing availability of time series expression datasets, although promising, raises a number of new computational challenges. Accordingly, the development of suitable classification methods to make reliable and sound predictions is becoming a pressing issue. We propose, here, a new method to classify time series gene expression via integration of biological networks. We evaluated our approach on 2 different datasets and showed that the use of a hidden Markov model/Gaussian mixture models hybrid explores the time-dependence of the expression data, thereby leading to better prediction results. We demonstrated that the biclustering procedure identifies function-related genes as a whole, giving rise to high accordance in prognosis prediction across independent time series datasets. In addition, we showed that integration of biological networks into our method significantly improves prediction performance. Moreover, we compared our approach with several state-of-the-art algorithms and found that our method outperformed previous approaches with regard to various criteria. Finally, our approach achieved better prediction results on early-stage data, implying the potential of our method for practical prediction. PMID:23516469
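As a hedged sketch of the likelihood-based classification idea above: the HMM/GMM hybrid is reduced here to independent Gaussian emissions per time point (no state transitions or mixtures), and the two prognosis classes and all data are invented for illustration.

```python
import numpy as np

def fit_class_model(series):
    # series: (n_samples, T) expression trajectories for one outcome class
    mu = series.mean(axis=0)
    var = series.var(axis=0) + 1e-6   # floor to avoid zero variance
    return mu, var

def log_likelihood(x, mu, var):
    # independent Gaussian emission at each time point, a crude stand-in
    # for the HMM/GMM hybrid's time-dependent emission densities
    return float(np.sum(-0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)))

def classify(x, models):
    # assign the trajectory to the class with the highest log-likelihood
    return max(models, key=lambda c: log_likelihood(x, *models[c]))

rng = np.random.default_rng(0)
rising = rng.normal(np.linspace(0, 3, 8), 0.3, size=(20, 8))  # hypothetical "poor" profile
flat = rng.normal(np.zeros(8), 0.3, size=(20, 8))             # hypothetical "good" profile
models = {"poor": fit_class_model(rising), "good": fit_class_model(flat)}
label = classify(np.linspace(0, 3, 8), models)
```

A full reimplementation would add hidden-state transitions over time and mixture emissions; this sketch keeps only the class-conditional likelihood comparison.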
We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...
Smith, Rachel A; Kim, Youllee; Zhu, Xun; Doudou, Dimi Théodore; Sternberg, Eleanore D; Thomas, Matthew B
2018-01-01
This study documents an investigation into the adoption and diffusion of eave tubes, a novel mosquito vector control, during a large-scale scientific field trial in West Africa. The diffusion of innovations (DOI) and the integrated model of behavior (IMB) were integrated (i.e., innovation attributes with attitudes and social pressures with norms) to predict participants' (N = 329) diffusion intentions. The findings showed that positive attitudes about the innovation's attributes were a consistent positive predictor of diffusion intentions: adopting it, maintaining it, and talking with others about it. As expected by the DOI and the IMB, the social pressure created by a descriptive norm positively predicted intentions to adopt and maintain the innovation. Drawing upon sharing research, we argued that the descriptive norm may dampen future talk about the innovation, because it may no longer be seen as a novel, useful topic to discuss. As predicted, the results showed that as the descriptive norm increased, the intention to talk about the innovation decreased. These results provide broad support for integrating the DOI and the IMB to predict diffusion and for efforts to draw on other research to understand motivations for social diffusion.
Bedoya, David; Manolakos, Elias S; Novotny, Vladimir
2011-03-01
Indices of Biological Integrity (IBI) are considered valid indicators of the overall health of a water body because the biological community is an endpoint within natural systems. However, prediction of biological integrity using information from multi-parameter environmental observations is a challenging problem due to the hierarchical organization of the natural environment, the existence of nonlinear inter-dependencies among variables as well as natural stochasticity and measurement noise. We present a method for predicting the Fish Index of Biological Integrity (IBI) using multiple environmental observations at the state-scale in Ohio. Instream (chemical and physical quality) and offstream parameters (regional and local upstream land uses, stream fragmentation, and point source density and intensity) are used for this purpose. The IBI predictions are obtained using the environmental site-similarity concept and following a simple to implement leave-one-out cross validation approach. An IBI prediction for a sampling site is calculated by averaging the observed IBI scores of observations clustered in the most similar branch of a dendrogram (a hierarchical clustering tree of environmental observations) built using the rest of the observations. The standardized Euclidean distance is used to assess dissimilarity between observations. The constructed predictive model was able to explain 61% of the IBI variability statewide. Stream fragmentation and regional land use explained 60% of the variability; the remaining 1% was explained by instream habitat quality. Metrics related to local land use, water quality, and point source density and intensity did not improve the predictive model at the state-scale. The impact of local environmental conditions was evaluated by comparing local characteristics between well- and mispredicted sites. Significant differences in local land use patterns and upstream fragmentation density explained some of the model's over-predictions.
Local land use conditions explained some of the model's IBI under-predictions at the state-scale since none of the variables within this group were included in the best final predictive model. Under-predicted sites also had higher levels of downstream fragmentation. The proposed variables ranking and predictive modeling methodology is very well suited for the analysis of hierarchical environments, such as natural fresh water systems, with many cross-correlated environmental variables. It is computationally efficient, can be fully automated, does not make any pre-conceived assumptions on the variables interdependency structure (such as linearity), and it is able to rank variables in a database and generate IBI predictions using only non-parametric easy to implement hierarchical clustering. Copyright © 2011 Elsevier Ltd. All rights reserved.
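The site-similarity scheme above can be sketched as follows. This is not the paper's dendrogram-branch averaging but a simplified k-nearest-site variant that keeps the same ingredients: standardized Euclidean distance and leave-one-out prediction by averaging observed IBI scores of similar sites. The data and variable roles are synthetic.

```python
import numpy as np

def loo_predict_ibi(X, y, k=3):
    # standardized Euclidean distance = plain Euclidean on z-scored variables
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    preds = np.empty(len(y))
    for i in range(len(y)):
        d = np.sqrt(((Z - Z[i]) ** 2).sum(axis=1))
        d[i] = np.inf                    # leave-one-out: exclude the site itself
        nearest = np.argsort(d)[:k]      # most environmentally similar sites
        preds[i] = y[nearest].mean()     # average their observed IBI scores
    return preds

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))                    # invented predictors (e.g. land use, fragmentation)
y = 30 + 10 * X[:, 0] + rng.normal(0, 1, 60)    # synthetic IBI driven mainly by one variable
pred = loo_predict_ibi(X, y)
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

Replacing the k-nearest average with averaging over the most similar dendrogram branch recovers the paper's scheme; the leave-one-out structure is identical.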
NASA Astrophysics Data System (ADS)
Chen, Chaochao; Vachtsevanos, George; Orchard, Marcos E.
2012-04-01
Machine prognosis can be considered as the generation of long-term predictions that describe the evolution in time of a fault indicator, with the purpose of estimating the remaining useful life (RUL) of a failing component/subsystem so that timely maintenance can be performed to avoid catastrophic failures. This paper proposes an integrated RUL prediction method using adaptive neuro-fuzzy inference systems (ANFIS) and high-order particle filtering, which forecasts the time evolution of the fault indicator and estimates the probability density function (pdf) of RUL. The ANFIS is trained and integrated in a high-order particle filter as a model describing the fault progression. The high-order particle filter is used to estimate the current state and carry out p-step-ahead predictions via a set of particles. These predictions are used to estimate the RUL pdf. The performance of the proposed method is evaluated via the real-world data from a seeded fault test for a UH-60 helicopter planetary gear plate. The results demonstrate that it outperforms both the conventional ANFIS predictor and the particle-filter-based predictor where the fault growth model is a first-order model that is trained via the ANFIS.
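The RUL-prediction stage described above can be sketched as below. A made-up exponential fault-growth rule stands in for the trained ANFIS, and only the p-step-ahead particle propagation is shown; state estimation and resampling are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def growth(x):
    # hypothetical fault-progression model; the paper trains an ANFIS for this
    return 1.05 * x

def rul_distribution(particles, threshold, max_steps=200, noise=0.01):
    # Propagate each particle forward until its fault indicator crosses the
    # failure threshold; the crossing steps approximate the RUL pdf.
    x = particles.copy()
    rul = np.full(len(x), max_steps)
    for step in range(1, max_steps + 1):
        x = growth(x) + rng.normal(0.0, noise, len(x))
        just_crossed = (x >= threshold) & (rul == max_steps)
        rul[just_crossed] = step
    return rul

particles = rng.normal(1.0, 0.05, 1000)   # posterior particles of the current fault state
rul = rul_distribution(particles, threshold=2.0)
```

The spread of `rul` across particles is what yields an RUL probability density rather than a single point estimate.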
Owens, Robert L; Edwards, Bradley A; Eckert, Danny J; Jordan, Amy S; Sands, Scott A; Malhotra, Atul; White, David P; Loring, Stephen H; Butler, James P; Wellman, Andrew
2015-06-01
Both anatomical and nonanatomical traits are important in obstructive sleep apnea (OSA) pathogenesis. We have previously described a model combining these traits, but have not determined its diagnostic accuracy to predict OSA. A valid model, and knowledge of the published effect sizes of trait manipulation, would also allow us to predict the number of patients with OSA who might be effectively treated without using positive airway pressure (PAP). Fifty-seven subjects with and without OSA underwent standard clinical and research sleep studies to measure OSA severity and the physiological traits important for OSA pathogenesis, respectively. The traits were incorporated into a physiological model to predict OSA. The model validity was determined by comparing the model prediction of OSA to the clinical diagnosis of OSA. The effect of various trait manipulations was then simulated to predict the proportion of patients treated by each intervention. The model had good sensitivity (80%) and specificity (100%) for predicting OSA. A single intervention on one trait would be predicted to treat OSA in approximately one quarter of all patients. Combination therapy with two interventions was predicted to treat OSA in ∼50% of patients. An integrative model of physiological traits can be used to predict population-wide and individual responses to non-PAP therapy. Many patients with OSA would be expected to be treated based on known trait manipulations, making a strong case for the importance of non-anatomical traits in OSA pathogenesis and the effectiveness of non-PAP therapies. © 2015 Associated Professional Sleep Societies, LLC.
Du, Tianchuan; Liao, Li; Wu, Cathy H
2016-12-01
Identifying the residues in a protein that are involved in protein-protein interaction and identifying the contact matrix for a pair of interacting proteins are two computational tasks at different levels of an in-depth analysis of protein-protein interaction. Various methods for solving these two problems have been reported in the literature. However, the interacting residue prediction and contact matrix prediction were handled by and large independently in those existing methods, though intuitively good prediction of interacting residues will help with predicting the contact matrix. In this work, we developed a novel protein interacting residue prediction system, contact matrix-interaction profile hidden Markov model (CM-ipHMM), with the integration of contact matrix prediction and the ipHMM interaction residue prediction. We propose to leverage what is learned from the contact matrix prediction and utilize the predicted contact matrix as "feedback" to enhance the interaction residue prediction. The CM-ipHMM model showed significant improvement over the previous method that uses the ipHMM for predicting interaction residues only. It indicates that the downstream contact matrix prediction could help the interaction site prediction.
Time series modelling of increased soil temperature anomalies during long period
NASA Astrophysics Data System (ADS)
Shirvani, Amin; Moradi, Farzad; Moosavi, Ali Akbar
2015-10-01
Soil temperature just beneath the soil surface is highly dynamic, has a direct impact on plant seed germination, and is probably the most distinct and recognisable factor governing emergence. Autoregressive integrated moving average as a stochastic model was developed to predict the weekly soil temperature anomalies at 10 cm depth, one of the most important soil parameters. The weekly soil temperature anomalies for the periods of January 1986-December 2011 and January 2012-December 2013 were taken into consideration to construct and test autoregressive integrated moving average models. The proposed model autoregressive integrated moving average (2,1,1) had a minimum value of Akaike information criterion and its estimated coefficients were different from zero at 5% significance level. The prediction of the weekly soil temperature anomalies during the test period using this proposed model indicated a high correlation coefficient between the observed and predicted data, which was 0.99 for a lead time of 1 week. Linear trend analysis indicated that the soil temperature anomalies warmed up significantly by 1.8°C during the period of 1986-2011.
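The differencing-plus-autoregression idea behind the model above can be sketched with least squares on synthetic anomalies. This is a simplification: the paper's model is ARIMA(2,1,1), while the MA(1) term is omitted here, making it effectively an ARIMA(2,1,0) fit.

```python
import numpy as np

def fit_arima_210(y):
    # difference once (the "I(1)" part), then fit AR(2) coefficients by
    # ordinary least squares; the MA(1) term of the paper's ARIMA(2,1,1)
    # is omitted to keep the sketch self-contained
    d = np.diff(y)
    X = np.column_stack([d[1:-1], d[:-2]])
    phi, *_ = np.linalg.lstsq(X, d[2:], rcond=None)
    return phi

def forecast_one_step(y, phi):
    # one-week-ahead forecast: last level plus predicted next difference
    d = np.diff(y)
    return y[-1] + phi[0] * d[-1] + phi[1] * d[-2]

rng = np.random.default_rng(3)
d = np.zeros(400)                        # synthetic differenced anomaly series
for t in range(2, 400):
    d[t] = 0.5 * d[t - 1] - 0.3 * d[t - 2] + rng.normal(0, 0.2)
y = np.cumsum(d)                         # integrate back to the anomaly level
phi = fit_arima_210(y[:-1])
pred = forecast_one_step(y[:-1], phi)
```

In practice one would select the (p,d,q) orders by minimising an information criterion such as AIC, as the abstract describes, rather than fixing them in advance.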
Boundary layer integral matrix procedure: Verification of models
NASA Technical Reports Server (NTRS)
Bonnett, W. S.; Evans, R. M.
1977-01-01
The three turbulence models currently available in the JANNAF version of the Aerotherm Boundary Layer Integral Matrix Procedure (BLIMP-J) code were studied. The BLIMP-J program is the standard prediction method for boundary layer effects in liquid rocket engine thrust chambers. Experimental data from flow fields with large edge-to-wall temperature ratios are compared to the predictions of the three turbulence models contained in BLIMP-J. In addition, test conditions necessary to generate additional data on a flat plate or in a nozzle are given. It is concluded that the Cebeci-Smith turbulence model should be the recommended model for the prediction of boundary layer effects in liquid rocket engines. In addition, the effects of homogeneous chemical reaction kinetics were examined for a hydrogen/oxygen system. Results show that for most flows, kinetics are probably only significant at stoichiometric mixture ratios.
Roth, Jenny; Steffens, Melanie C; Vignoles, Vivian L
2018-01-01
The present article introduces a model based on cognitive consistency principles to predict how new identities become integrated into the self-concept, with consequences for intergroup attitudes. The model specifies four concepts (self-concept, stereotypes, identification, and group compatibility) as associative connections. The model builds on two cognitive principles, balance-congruity and imbalance-dissonance, to predict identification with social groups that people currently belong to, belonged to in the past, or newly belong to. More precisely, the model suggests that the relative strength of self-group associations (i.e., identification) depends in part on the (in)compatibility of the different social groups. Combining insights into cognitive representation of knowledge, intergroup bias, and explicit/implicit attitude change, we further derive predictions for intergroup attitudes. We suggest that intergroup attitudes alter depending on the relative associative strength between the social groups and the self, which in turn is determined by the (in)compatibility between social groups. This model unifies existing models on the integration of social identities into the self-concept by suggesting that basic cognitive mechanisms play an important role in facilitating or hindering identity integration and thus contribute to reducing or increasing intergroup bias.
Models for short term malaria prediction in Sri Lanka
Briët, Olivier JT; Vounatsou, Penelope; Gunawardena, Dissanayake M; Galappaththy, Gawrie NL; Amerasinghe, Priyanie H
2008-01-01
Background: Malaria in Sri Lanka is unstable and fluctuates in intensity both spatially and temporally. Although the case counts are dwindling at present, given the past history of resurgence of outbreaks despite effective control measures, the control programmes have to stay prepared. The availability of long time series of monitored/diagnosed malaria cases allows for the study of forecasting models, with an aim to developing a forecasting system which could assist in the efficient allocation of resources for malaria control. Methods: Exponentially weighted moving average models, autoregressive integrated moving average (ARIMA) models with seasonal components, and seasonal multiplicative autoregressive integrated moving average (SARIMA) models were compared on monthly time series of district malaria cases for their ability to predict the number of malaria cases one to four months ahead. The addition of covariates such as the number of malaria cases in neighbouring districts or rainfall was assessed for their ability to improve prediction of selected (seasonal) ARIMA models. Results: The best model for forecasting and the forecasting error varied strongly among the districts. The addition of rainfall as a covariate improved prediction of selected (seasonal) ARIMA models modestly in some districts but worsened prediction in other districts. Improvement by adding rainfall was more frequent at larger forecasting horizons. Conclusion: Heterogeneity of patterns of malaria in Sri Lanka requires regionally specific prediction models. Prediction error was large at a minimum of 22% (for one of the districts) for one month ahead predictions. The modest improvement made in short term prediction by adding rainfall as a covariate to these prediction models may not be sufficient to merit investing in a forecasting system for which rainfall data are routinely processed. PMID:18460204
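The one- to four-month-ahead evaluation described above can be sketched on a synthetic monthly case series. A seasonal-naive forecaster (same month, previous year) stands in for the (S)ARIMA models here; the rolling-origin loop and per-horizon error summary are the point of the sketch.

```python
import numpy as np

def mape(actual, forecast):
    # mean absolute percentage error, a common forecast-error summary
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

def seasonal_naive_eval(cases, horizon, season=12, start=48):
    # Rolling-origin evaluation: at each origin t, forecast h months ahead
    # using the value from the same month one year earlier (a stand-in for
    # the fitted SARIMA forecast).
    actual, fc = [], []
    for t in range(start, len(cases) - horizon):
        fc.append(cases[t + horizon - season])
        actual.append(cases[t + horizon])
    return mape(np.array(actual), np.array(fc))

rng = np.random.default_rng(4)
months = np.arange(120)                                    # ten years of monthly counts
cases = 100 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 120)
errors = {h: seasonal_naive_eval(cases, h) for h in range(1, 5)}
```

Comparing such per-horizon errors across candidate models and districts is how one would choose regionally specific forecasters, as the conclusion recommends.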
Crystal plasticity assisted prediction on the yield locus evolution and forming limit curves
NASA Astrophysics Data System (ADS)
Lian, Junhe; Liu, Wenqi; Shen, Fuhui; Münstermann, Sebastian
2017-10-01
The aim of this study is to predict the plastic anisotropy evolution and its associated forming limit curves of bcc steels purely based on their microstructural features by establishing an integrated multiscale modelling approach. Crystal plasticity models are employed to describe the micro deformation mechanism and correlate the microstructure with mechanical behaviour on micro and mesoscale. A virtual laboratory is performed, considering the statistical information of the microstructure, which serves as the input for the phenomenological plasticity model on the macroscale. For both scales, the evolving features induced by microstructure evolution, such as anisotropic hardening, r-value and yield locus evolution, are seamlessly integrated. The plasticity behaviour predicted by the numerical simulations is compared with experiments. These evolutionary features of the material deformation behaviour are eventually considered for the prediction of formability.
The brain, self and society: a social-neuroscience model of predictive processing.
Kelly, Michael P; Kriznik, Natasha M; Kinmonth, Ann Louise; Fletcher, Paul C
2018-05-10
This paper presents a hypothesis about how social interactions shape and influence predictive processing in the brain. The paper integrates concepts from neuroscience and sociology, where a gulf presently exists between the ways that each describes the same phenomenon: how thinking humans engage with the social world. We combine the concepts of predictive processing models (also called predictive coding models in the neuroscience literature) with ideal types, typifications and social practice (concepts from the sociological literature). This generates a unified hypothetical framework integrating the social world and hypothesised brain processes. The hypothesis combines aspects of neuroscience and psychology with social theory to show how social behaviors may be "mapped" onto brain processes. It outlines a conceptual framework that connects the two disciplines and that may enable creative dialogue and potential future research.
A model of interval timing by neural integration.
Simen, Patrick; Balci, Fuat; de Souza, Laura; Cohen, Jonathan D; Holmes, Philip
2011-06-22
We show that simple assumptions about neural processing lead to a model of interval timing as a temporal integration process, in which a noisy firing-rate representation of time rises linearly on average toward a response threshold over the course of an interval. Our assumptions include: that neural spike trains are approximately independent Poisson processes, that correlations among them can be largely cancelled by balancing excitation and inhibition, that neural populations can act as integrators, and that the objective of timed behavior is maximal accuracy and minimal variance. The model accounts for a variety of physiological and behavioral findings in rodents, monkeys, and humans, including ramping firing rates between the onset of reward-predicting cues and the receipt of delayed rewards, and universally scale-invariant response time distributions in interval timing tasks. It furthermore makes specific, well-supported predictions about the skewness of these distributions, a feature of timing data that is usually ignored. The model also incorporates a rapid (potentially one-shot) duration-learning procedure. Human behavioral data support the learning rule's predictions regarding learning speed in sequences of timed responses. These results suggest that simple, integration-based models should play as prominent a role in interval timing theory as they do in theories of perceptual decision making, and that a common neural mechanism may underlie both types of behavior.
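The rise-to-threshold account above can be simulated directly. The sketch below uses a drift-diffusion accumulator whose noise variance scales with its drift (as for balanced Poisson inputs); that scaling is what makes the crossing-time distributions scale-invariant, with a roughly constant coefficient of variation across target intervals. All parameters are illustrative, not fitted to the paper's data.

```python
import numpy as np

rng = np.random.default_rng(5)

def crossing_times(interval, n=3000, dt=0.005, c=0.2):
    # Drift A is tuned so the mean crossing time of a unit threshold equals
    # the target interval; the noise term scales as sqrt(A), mimicking
    # approximately independent Poisson spike-train inputs.
    A = 1.0 / interval
    x = np.zeros(n)
    t = np.zeros(n)
    done = np.zeros(n, dtype=bool)
    step = 0
    while not done.all() and step < int(10 * interval / dt):
        step += 1
        x += A * dt + c * np.sqrt(A * dt) * rng.normal(size=n)
        crossed = (~done) & (x >= 1.0)
        t[crossed] = step * dt
        done |= crossed
    return t[done]

means, cvs = {}, {}
for interval in (1.0, 2.0, 4.0):
    rt = crossing_times(interval)
    means[interval] = rt.mean()
    cvs[interval] = rt.std() / rt.mean()   # roughly constant: scale invariance
```

The resulting crossing-time distributions are right-skewed (inverse-Gaussian-like), consistent with the skewness prediction the abstract highlights.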
Woodhouse, Mark J; Behnke, Sonja A
Observations of volcanic lightning made using a lightning mapping array during the 2010 eruption of Eyjafjallajökull allow the trajectory and growth of the volcanic plume to be determined. The lightning observations are compared with predictions of an integral model of volcanic plumes that includes descriptions of the interaction with wind and the effects of moisture. We show that the trajectory predicted by the integral model closely matches the observational data and the model well describes the growth of the plume downwind of the vent. Analysis of the lightning signals reveals information on the dominant charge structure within the volcanic plume. During the Eyjafjallajökull eruption both monopole and dipole charge structures were observed in the plume. By using the integral plume model, we propose the varying charge structure is connected to the availability of condensed water and low temperatures at high altitudes in the plume, suggesting ice formation may have contributed to the generation of a dipole charge structure via thunderstorm-style ice-based charging mechanisms, though overall this charging mechanism is believed to have had only a weak influence on the production of lightning.
Jordanian Pre-Service Teachers' and Technology Integration: A Human Resource Development Approach
ERIC Educational Resources Information Center
Al-Ruz, Jamal Abu; Khasawneh, Samer
2011-01-01
The purpose of this study was to test a model in which technology integration of pre-service teachers was predicted by a number of university-based and school-based factors. Initially, factors affecting technology integration were identified, and a research-based path model was developed to explain causal relationships between these factors. The…
Overview of MSFC AMSD Integrated Modeling and Analysis
NASA Technical Reports Server (NTRS)
Cummings, Ramona; Russell, Kevin (Technical Monitor)
2002-01-01
Structural, thermal, dynamic, and optical models of the NGST AMSD mirror assemblies are being finalized and integrated for predicting cryogenic vacuum test performance of the developing designs. Analyzers in use by the MSFC Modeling and Analysis Team are identified, with overview of approach to integrate simulated effects. Guidelines to verify the individual models and calibration cases for comparison with the vendors' analyses are presented. In addition, baseline and proposed additional scenarios for the cryogenic vacuum testing are briefly described.
NASA Astrophysics Data System (ADS)
Latif, M.
2017-12-01
We investigate the influence of the Atlantic Meridional Overturning Circulation (AMOC) on the North Atlantic sector surface air temperature (SAT) in two multi-millennial control integrations of the Kiel Climate Model (KCM). One model version employs a freshwater flux correction over the North Atlantic, while the other does not. A clear influence of the AMOC on North Atlantic sector SAT is simulated only in the corrected model, which depicts much reduced upper ocean salinity and temperature biases in comparison to the uncorrected model. Further, the model with much reduced biases depicts significantly enhanced multiyear SAT predictability in the North Atlantic sector relative to the uncorrected model. The enhanced SAT predictability in the corrected model is due to a stronger and more variable AMOC and its enhanced influence on North Atlantic sea surface temperature (SST). Results obtained from preindustrial control integrations of models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) support the findings obtained from the KCM: models with large North Atlantic biases tend to have a weak AMOC influence on SST and exhibit a smaller SAT predictability over the North Atlantic sector.
NASA Technical Reports Server (NTRS)
Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.
2004-01-01
This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
Kattou, Panayiotis; Lian, Guoping; Glavin, Stephen; Sorrell, Ian; Chen, Tao
2017-10-01
The development of a new two-dimensional (2D) model to predict follicular permeation, with integration into a recently reported multi-scale model of transdermal permeation is presented. The follicular pathway is modelled by diffusion in sebum. The mass transfer and partition properties of solutes in lipid, corneocytes, viable dermis, dermis and systemic circulation are calculated as reported previously [Pharm Res 33 (2016) 1602]. The mass transfer and partition properties in sebum are collected from existing literature. None of the model input parameters was fit to the clinical data with which the model prediction is compared. The integrated model has been applied to predict the published clinical data of transdermal permeation of caffeine. The relative importance of the follicular pathway is analysed. Good agreement of the model prediction with the clinical data has been obtained. The simulation confirms that for caffeine the follicular route is important; the maximum bioavailable concentration of caffeine in systemic circulation with open hair follicles is predicted to be 20% higher than that when hair follicles are blocked. The follicular pathway contributes to not only short time fast penetration, but also the overall systemic bioavailability. With such in silico model, useful information can be obtained for caffeine disposition and localised delivery in lipid, corneocytes, viable dermis, dermis and the hair follicle. Such detailed information is difficult to obtain experimentally.
NASA Astrophysics Data System (ADS)
Honti, Mark; Schuwirth, Nele; Rieckermann, Jörg; Stamm, Christian
2017-03-01
The design and evaluation of solutions for integrated surface water quality management requires an integrated modelling approach. Integrated models have to be comprehensive enough to cover the aspects relevant for management decisions, allowing for mapping of larger-scale processes such as climate change to the regional and local contexts. Besides this, models have to be sufficiently simple and fast to apply proper methods of uncertainty analysis, covering model structure deficits and error propagation through the chain of sub-models. Here, we present a new integrated catchment model satisfying both conditions. The conceptual iWaQa model was developed to support the integrated management of small streams. It can be used to predict traditional water quality parameters, such as nutrients and a wide set of organic micropollutants (plant and material protection products), by considering all major pollutant pathways in urban and agricultural environments. Due to its simplicity, the model allows for a full, propagative analysis of predictive uncertainty, including certain structural and input errors. The usefulness of the model is demonstrated by predicting future surface water quality in a small catchment with mixed land use in the Swiss Plateau. We consider climate change, population growth or decline, socio-economic development, and the implementation of management strategies to tackle urban and agricultural point and non-point sources of pollution. Our results indicate that input and model structure uncertainties are the most influential factors for certain water quality parameters. In these cases model uncertainty is already high for present conditions. Nevertheless, accounting for today's uncertainty makes management fairly robust to the foreseen range of potential changes in the next decades. The assessment of total predictive uncertainty allows for selecting management strategies that show small sensitivity to poorly known boundary conditions. The identification of important sources of uncertainty helps to guide future monitoring efforts and pinpoints key indicators, whose evolution should be closely followed to adapt management. The possible impact of climate change is clearly demonstrated by water quality substantially changing depending on single climate model chains. However, when all climate trajectories are combined, the human land use and management decisions have a larger influence on water quality against a time horizon of 2050 in the study.
Yang, Jie; Weng, Wenguo; Wang, Faming; Song, Guowen
2017-05-01
This paper aims to integrate a human thermoregulatory model with a clothing model to predict core and skin temperatures. The human thermoregulatory model, consisting of an active system and a passive system, was used to determine the thermoregulation and heat exchanges within the body. The clothing model simulated heat and moisture transfer from the human skin to the environment through the microenvironment and fabric. In this clothing model, the air gap between skin and clothing, as well as clothing properties such as thickness, thermal conductivity, density, porosity, and tortuosity were taken into consideration. The simulated core and mean skin temperatures were compared to the published experimental results of subject tests at three levels of ambient temperatures of 20 °C, 30 °C, and 40 °C. Although a lower signal-to-noise ratio was observed, the developed model demonstrated positive performance at predicting core temperatures with a maximum difference between the simulations and measurements of no more than 0.43 °C. Generally, the current model predicted the mean skin temperatures with reasonable accuracy. It could be applied to predict human physiological responses and assess thermal comfort and heat stress. Copyright © 2017 Elsevier Ltd. All rights reserved.
Chen, Tao; Lian, Guoping; Kattou, Panayiotis
2016-07-01
The purpose was to develop a mechanistic mathematical model for predicting the pharmacokinetics of topically applied solutes penetrating through the skin and into the blood circulation. The model could be used to support the design of transdermal drug delivery systems and skin care products, and risk assessment of occupational or consumer exposure. A recently reported skin penetration model [Pharm Res 32 (2015) 1779] was integrated with the kinetic equations for dermis-to-capillary transport and systemic circulation. All model parameters were determined separately from the molecular, microscopic and physiological bases, without fitting to the in vivo data to be predicted. Published clinical studies of nicotine were used for model demonstration. The predicted plasma kinetics is in good agreement with observed clinical data. The simulated two-dimensional concentration profile in the stratum corneum vividly illustrates the local sub-cellular disposition kinetics, including tortuous lipid pathway for diffusion and the "reservoir" effect of the corneocytes. A mechanistic model for predicting transdermal and systemic kinetics was developed and demonstrated with published clinical data. The integrated mechanistic approach has significantly extended the applicability of a recently reported microscopic skin penetration model by providing prediction of solute concentration in the blood.
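The dermis-to-capillary and systemic stages described above can be sketched as two linked compartments: a skin depot feeding plasma, which is cleared by elimination. The rate constants below are illustrative placeholders, not the paper's physiologically derived parameters.

```python
import numpy as np

# Two-compartment sketch: skin depot -> plasma -> elimination.
# k_in and k_el are illustrative rates (1/h), not fitted physiology.
k_in, k_el = 0.8, 0.4        # dermis-to-capillary and elimination rates
dt, n = 0.01, 2000           # 0.01 h Euler step, 20 h simulated
a_skin, a_plasma = 1.0, 0.0  # compartment amounts (arbitrary units)
plasma = np.empty(n)

for i in range(n):
    flux = k_in * a_skin                       # transport into the capillaries
    a_skin -= dt * flux
    a_plasma += dt * (flux - k_el * a_plasma)  # input minus systemic clearance
    plasma[i] = a_plasma

# plasma rises to a peak and then declines: the qualitative shape compared
# against clinical plasma-kinetics data in the paper
```

In a full assessment the skin stage would be the microscopic penetration model itself; this sketch only shows how its output couples to systemic kinetics.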
SERVIR-Africa: Developing an Integrated Platform for Floods Disaster Management in Africa
NASA Technical Reports Server (NTRS)
Macharia, Daniel; Korme, Tesfaye; Policelli, Fritz; Irwin, Dan; Adler, Bob; Hong, Yang
2010-01-01
SERVIR-Africa is an ambitious regional visualization and monitoring system that integrates remotely sensed data with predictive models and field-based data to monitor ecological processes and respond to natural disasters. It aims to address societal benefit areas, including floods, by turning data into actionable information for decision-makers. Floods are exogenous disasters that affect many parts of Africa, probably second only to drought in terms of socio-economic losses. This paper looks at SERVIR-Africa's approach to flood disaster management through the establishment of an integrated platform, flood prediction models, post-event flood mapping and monitoring, as well as flood map dissemination in support of flood disaster management.
Initial Integration of Noise Prediction Tools for Acoustic Scattering Effects
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Burley, Casey L.; Tinetti, Ana; Rawls, John W.
2008-01-01
This effort provides an initial glimpse at NASA capabilities available in predicting the scattering of fan noise from a non-conventional aircraft configuration. The Aircraft NOise Prediction Program, the Fast Scattering Code, and the Rotorcraft Noise Model were coupled to provide increased-fidelity models of scattering effects on engine fan noise sources. The integration of these codes led to the identification of several key issues entailed in applying such multi-fidelity approaches. In particular, for prediction at noise certification points, the inclusion of distributed sources leads to complications with the source semi-sphere approach. Computational resource requirements limit the use of the higher fidelity scattering code to predict radiated sound pressure levels for full-scale configurations at relevant frequencies. Finally, the ability to more accurately represent complex shielding surfaces in current lower fidelity models is necessary for general application to scattering predictions. This initial step in determining the potential benefits/costs of these new methods over the existing capabilities illustrates a number of the issues that must be addressed in the development of next-generation aircraft system noise prediction tools.
Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan
2012-01-01
Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission-critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way both to perform upstream root cause analysis (RCA) and to predict downstream effects (impact analysis). The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
Integrating conceptual knowledge within and across representational modalities.
McNorgan, Chris; Reid, Jackie; McRae, Ken
2011-02-01
Research suggests that concepts are distributed across brain regions specialized for processing information from different sensorimotor modalities. Multimodal semantic models fall into one of two broad classes differentiated by the assumed hierarchy of convergence zones over which information is integrated. In shallow models, within- and between-modality communication is accomplished using either direct connectivity or a central semantic hub. In deep models, modalities are connected via cascading integration sites with successively wider receptive fields. Four experiments provide the first direct behavioral tests of these models using speeded tasks involving feature inference and concept activation. Shallow models predict no within-modal versus cross-modal difference in either task, whereas deep models predict a within-modal advantage for feature inference, but a cross-modal advantage for concept activation. Experiments 1 and 2 used relatedness judgments to tap participants' knowledge of relations for within- and cross-modal feature pairs. Experiments 3 and 4 used a dual-feature verification task. The pattern of decision latencies across Experiments 1-4 is consistent with a deep integration hierarchy.
Slade, Karen; Edelman, Robert
2014-01-01
Each year approximately 110,000 people are imprisoned in England and Wales, and new prisoners remain one of the highest-risk groups for suicide worldwide. The reduction of suicide in prisoners remains difficult, as assessments and interventions tend to rely on static risk factors, with few theoretical or integrated models yet evaluated. The aim was to identify the dynamic factors that contribute to suicide ideation in this population, based on Williams and Pollock's (2001) Cry of Pain (CoP) model. New arrivals into prison (N = 198) were asked to complete measures derived from the CoP model plus clinical and prison-specific factors. It was hypothesized that the factors of the CoP model would be predictive of suicide ideation. Support was provided for the defeat and entrapment aspects of the CoP model, with previous self-harm, repeated times in prison, and suicide-permissive cognitions also key in predicting suicide ideation for prisoners on entry to prison. An integrated and dynamic model was developed that has utility in predicting suicide in early-stage prisoners. Implications for both theory and practice are discussed, along with recommendations for future research.
Health-aware Model Predictive Control of Pasteurization Plant
NASA Astrophysics Data System (ADS)
Karimi Pour, Fatemeh; Puig, Vicenç; Ocampo-Martinez, Carlos
2017-01-01
In order to optimize the trade-off between component life and energy consumption, the integration of system health management and control modules is required. This paper proposes the integration of model predictive control (MPC) with a fatigue estimation approach that minimizes the damage to the components of a pasteurization plant. The fatigue estimation is assessed with the rainflow counting algorithm. Using data from this algorithm, a simplified model that characterizes the health of the system is developed and integrated with MPC. The MPC controller objective is modified by adding an extra criterion that takes into account the accumulated damage. However, adding this extra criterion introduces a steady-state offset. Finally, by including an integral action in the MPC controller, the steady-state error is eliminated for regulation purposes. The proposed control scheme is validated in simulation using a simulator of a utility-scale pasteurization plant.
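The interplay described above, where an added damage criterion creates a steady-state offset that an integral action then removes, can be illustrated on a toy scalar plant. The one-step controller and all numbers below are illustrative assumptions, not the paper's MPC formulation or plant model.

```python
# Toy scalar plant x[k+1] = a*x[k] + u[k]. The 'damage' criterion is modeled
# as a quadratic penalty on actuator effort (a stand-in for accumulated
# fatigue); all values are illustrative.
a, r, gamma, ki = 0.9, 1.0, 1.0, 0.05  # plant pole, setpoint, damage weight, integral gain

def run(steps=300, integral_action=False):
    x, integ = 0.0, 0.0
    for _ in range(steps):
        # one-step MPC: u minimizes (a*x + u - r)**2 + gamma*u**2
        u = (r - a * x) / (1.0 + gamma)
        if integral_action:
            integ += r - x          # accumulate tracking error
            u += ki * integ         # integral action removes the offset
        x = a * x + u
    return x

x_offset = run(integral_action=False)    # settles at r/1.1: steady-state offset
x_integral = run(integral_action=True)   # settles at r: offset eliminated
```

The damage penalty makes the controller unwilling to hold the steady-state input the plant needs, hence the offset; the integral state supplies that input asymptotically.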
Liu, Zhongyang; Guo, Feifei; Gu, Jiangyong; Wang, Yong; Li, Yang; Wang, Dan; Lu, Liang; Li, Dong; He, Fuchu
2015-06-01
The Anatomical Therapeutic Chemical (ATC) classification system, widely applied in almost all drug utilization studies, is currently the most widely recognized classification system for drugs. At present, new drug entries are added into the system only on users' requests, which leads to seriously incomplete drug coverage, and bioinformatics prediction is helpful during this process. Here we propose a novel prediction model of drug-ATC code associations, using logistic regression to integrate multiple heterogeneous data sources including chemical structures, target proteins, gene expression, side-effects and chemical-chemical associations. The model obtains good performance for the prediction not only of ATC codes of unclassified drugs but also of new ATC codes of classified drugs, as assessed by cross-validation and independent test sets, and its efficacy exceeds previous methods. Further, to facilitate its use, the model has been developed into a user-friendly web service, SPACE (Similarity-based Predictor of ATC CodE), which for each submitted compound gives candidate ATC codes (ranked according to the decreasing probability_score predicted by the model) together with corresponding supporting evidence. This work not only contributes to understanding drugs' therapeutic, pharmacological and chemical properties, but also provides clues for drug repositioning and side-effect discovery. In addition, the construction of the prediction model provides a general framework for similarity-based data integration which is suitable for other drug-related studies such as target and side-effect prediction. The web service SPACE is available at http://www.bprc.ac.cn/space.
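The integration step, one logistic-regression weight per heterogeneous evidence source yielding a ranked probability score, can be sketched on synthetic data. The features and labels below are fabricated for illustration and do not reflect the SPACE training set.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fabricated data: each row holds similarity scores for one (drug, ATC code)
# candidate pair from three heterogeneous sources (e.g. chemical structure,
# target proteins, side effects). Labels follow a known logistic rule + noise.
X = rng.uniform(0.0, 1.0, size=(500, 3))
true_w, true_b = np.array([3.0, 2.0, 1.0]), -3.0
y = (1.0 / (1.0 + np.exp(-(X @ true_w + true_b))) > rng.uniform(size=500)).astype(float)

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Gradient-descent logistic regression: one learned weight per evidence
    source, combining them into a single association probability."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

w, b = fit_logistic(X, y)
scores = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # probability scores for ranking codes
```

Sorting candidate codes by `scores` mirrors the ranked output the web service returns for each submitted compound.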
Ohshiro, Tomokazu; Angelaki, Dora E; DeAngelis, Gregory C
2017-07-19
Studies of multisensory integration by single neurons have traditionally emphasized empirical principles that describe nonlinear interactions between inputs from two sensory modalities. We previously proposed that many of these empirical principles could be explained by a divisive normalization mechanism operating in brain regions where multisensory integration occurs. This normalization model makes a critical diagnostic prediction: a non-preferred sensory input from one modality, which activates the neuron on its own, should suppress the response to a preferred input from another modality. We tested this prediction by recording from neurons in macaque area MSTd that integrate visual and vestibular cues regarding self-motion. We show that many MSTd neurons exhibit the diagnostic form of cross-modal suppression, whereas unisensory neurons in area MT do not. The normalization model also fits population responses better than a model based on subtractive inhibition. These findings provide strong support for a divisive normalization mechanism in multisensory integration.
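A minimal numerical sketch of the diagnostic prediction, a non-preferred input suppressing the response to a preferred one under divisive normalization, assuming an illustrative single-neuron form rather than the authors' population model:

```python
def msc_response(v_pref, v_nonpref, w_pref=1.0, w_nonpref=0.1, sigma=0.5):
    """Illustrative divisive-normalization response: the numerator weights the
    two modality drives by the neuron's preferences, while the normalization
    pool in the denominator weights them equally, so even a weakly weighted
    (non-preferred) input enlarges the denominator."""
    num = w_pref * v_pref + w_nonpref * v_nonpref
    den = sigma + (v_pref + v_nonpref)
    return num / den

r_alone = msc_response(1.0, 0.0)  # preferred (e.g. vestibular) cue alone
r_both = msc_response(1.0, 1.0)   # add a non-preferred (e.g. visual) cue
assert r_both < r_alone           # cross-modal suppression, as in MSTd
```

A subtractive-inhibition alternative would need an explicit inhibitory term to produce the same suppression; here it falls out of the shared denominator.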
An Integrated Model of Suicidal Ideation in Transcultural Populations of Chinese Adolescents.
Leung, Cyrus L K; Kwok, Sylvia Y C L; Ling, Chloe C Y
2016-07-01
This study extended a model of suicidal ideation, incorporating family and personal factors to predict suicidal ideation with hopelessness as a mediating factor, from a Hong Kong sample to a sample in Shanghai. Using multigroup structural equation modeling (MGSEM), the study investigated the personal and family correlates of suicidal ideation in Hong Kong and Shanghai adolescents. We integrated the family ecological and diathesis-stress-hopelessness models of suicidal ideation in connecting the correlates. A cross-sectional design was used. The full model achieved metric invariance and partial path-loading invariance. Family functioning and social problem solving negatively predicted hopelessness or suicidal ideation in both the Hong Kong and Shanghai adolescents. The results supported an integrative approach of facilitating parent-adolescent communication, strengthening family functioning, and reducing the use of negative social problem-solving styles in adolescent suicide prevention.
Weather Forecasting From Woolly Art to Solid Science
NASA Astrophysics Data System (ADS)
Lynch, P.
THE PREHISTORY OF SCIENTIFIC FORECASTING: Vilhelm Bjerknes; Lewis Fry Richardson; Richardson's Forecast
THE BEGINNING OF MODERN NUMERICAL WEATHER PREDICTION: John von Neumann and the Meteorology Project; The ENIAC Integrations; The Barotropic Model; Primitive Equation Models
NUMERICAL WEATHER PREDICTION TODAY: ECMWF; HIRLAM
CONCLUSIONS
REFERENCES
Supermodeling With A Global Atmospheric Model
NASA Astrophysics Data System (ADS)
Wiegerinck, Wim; Burgers, Willem; Selten, Frank
2013-04-01
In weather and climate prediction studies, the multi-model ensemble mean prediction often has the best prediction skill scores. One possible explanation is that the major part of the model error is random and is averaged out in the ensemble mean. In the standard multi-model ensemble approach, the models are integrated in time independently and the predicted states are combined a posteriori. Recently an alternative ensemble prediction approach has been proposed in which the models exchange information during the simulation and synchronize on a common solution that is closer to the truth than any of the individual model solutions in the standard multi-model ensemble approach, or a weighted average of these. This approach is called the supermodeling approach (SUMO). The potential of the SUMO approach has been demonstrated in the context of simple, low-order, chaotic dynamical systems. The information exchange takes the form of linear nudging terms in the dynamical equations that nudge the solution of each model to the solution of all other models in the ensemble. With a suitable choice of the connection strengths, the models synchronize on a common solution that is indeed closer to the true system than any of the individual model solutions without nudging. This approach is called connected SUMO. An alternative approach is to integrate a weighted averaged model: weighted SUMO. At each time step, all models in the ensemble calculate their tendencies, these tendencies are weight-averaged, and the state is integrated one time step into the future with this weighted averaged tendency. It was shown that in case the connected SUMO synchronizes perfectly, it follows the weighted averaged trajectory and both approaches yield the same solution.
In this study we pioneer both approaches in the context of a global, quasi-geostrophic, three-level atmosphere model that is capable of simulating quite realistically the extra-tropical circulation in the Northern Hemisphere winter.
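The weighted-SUMO idea, each model computing its tendency from a shared state and the state being advanced with the weighted-average tendency, can be sketched with two imperfect Lorenz-63 "models"; this toy stands in for the quasi-geostrophic atmosphere model used in the study.

```python
import numpy as np

def lorenz_tendency(state, sigma, rho, beta):
    """Tendency (time derivative) of the Lorenz-63 system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def weighted_sumo_step(state, models, weights, dt):
    """One Euler step of weighted SUMO: every imperfect model evaluates its
    tendency from the SAME shared state; the tendencies are weight-averaged
    and the shared state is advanced once with the averaged tendency."""
    tend = sum(w * lorenz_tendency(state, *p) for w, p in zip(weights, models))
    return state + dt * tend

# Two 'imperfect' models with oppositely biased parameters; the 'truth' is
# sigma=10, rho=28, beta=8/3. Because the tendency is linear in each
# parameter, equal weights make the averaged tendency exactly the true one.
models = [(9.0, 26.0, 8.0 / 3.0), (11.0, 30.0, 8.0 / 3.0)]
weights = [0.5, 0.5]

state = np.array([1.0, 1.0, 1.0])
for _ in range(1000):
    state = weighted_sumo_step(state, models, weights, dt=0.005)
```

In the real SUMO scheme the weights are trained rather than fixed, and connected SUMO instead adds nudging terms between separately integrated models.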
Accurate and dynamic predictive model for better prediction in medicine and healthcare.
Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S
2018-05-01
Information and communication technologies (ICTs) have brought new integrated operations and methods to all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care are likewise influenced by new technologies to predict different disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. In order to improve predictive performance, this paper proposes a predictive model that classifies disease predictions into different categories. To evaluate the model, this paper uses traumatic brain injury (TBI) datasets. TBI is one of the most serious diseases worldwide and needs more attention due to its severe impacts on human life. The proposed predictive model improves the predictive performance for TBI. The TBI data set was developed and approved by neurologists to set its features. The experiment results show that the proposed model has achieved significant results in accuracy, sensitivity, and specificity.
Validation of the Integrated Medical Model Using Historical Space Flight Data
NASA Technical Reports Server (NTRS)
Kerstman, Eric L.; Minard, Charles G.; FreiredeCarvalho, Mary H.; Walton, Marlei E.; Myers, Jerry G., Jr.; Saile, Lynn G.; Lopez, Vilma; Butler, Douglas J.; Johnson-Throop, Kathy A.
2010-01-01
The Integrated Medical Model (IMM) utilizes Monte Carlo methodologies to predict the occurrence of medical events, utilization of resources, and clinical outcomes during space flight. Real-world data may be used to demonstrate the accuracy of the model. For this analysis, IMM predictions were compared to data from historical shuttle missions, not yet included as model source input. Initial goodness of fit testing on International Space Station data suggests that the IMM may overestimate the number of occurrences for three of the 83 medical conditions in the model. The IMM did not underestimate the occurrence of any medical condition. Initial comparisons with shuttle data demonstrate the importance of understanding crew preference (i.e., preferred analgesic) for accurately predicting the utilization of resources. The initial analysis demonstrates the validity of the IMM for its intended use and highlights areas for improvement.
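The Monte Carlo style of occurrence prediction can be sketched with a simple Poisson occurrence model. The condition names and incidence rates below are invented for illustration and are not IMM inputs.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented per-mission incidence rates (events/mission) for a few conditions;
# the actual IMM covers 83 conditions with much richer clinical sub-models.
rates = {"back pain": 0.8, "skin rash": 0.3, "headache": 1.5}

def simulate_missions(rates, n_missions=10_000):
    """Monte Carlo draws of per-mission event counts under a Poisson
    occurrence model, the style of sampling used to forecast medical
    events and resource utilization."""
    return {cond: rng.poisson(lam, n_missions) for cond, lam in rates.items()}

draws = simulate_missions(rates)
# Summaries of the kind compared against historical mission data: the mean
# predicted count and a 95th-percentile bound for resource provisioning.
summary = {cond: (counts.mean(), np.percentile(counts, 95))
           for cond, counts in draws.items()}
```

Comparing such predicted distributions against observed mission counts is the essence of the goodness-of-fit validation described above.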
Bridge Structure Deformation Prediction Based on GNSS Data Using Kalman-ARIMA-GARCH Model.
Xin, Jingzhou; Zhou, Jianting; Yang, Simon X; Li, Xiaoqing; Wang, Yu
2018-01-19
Bridges are an essential part of the ground transportation system. Health monitoring is fundamentally important for the safety and service life of bridges. A large amount of structural information is obtained from various sensors using sensing technology, and the data processing has become a challenging issue. To improve the prediction accuracy of bridge structure deformation based on data mining and to accurately evaluate the time-varying characteristics of bridge structure performance evolution, this paper proposes a new method for bridge structure deformation prediction, which integrates the Kalman filter, autoregressive integrated moving average model (ARIMA), and generalized autoregressive conditional heteroskedasticity (GARCH). Firstly, the raw deformation data is directly pre-processed using the Kalman filter to reduce the noise. After that, the linear recursive ARIMA model is established to analyze and predict the structure deformation. Finally, the nonlinear recursive GARCH model is introduced to further improve the accuracy of the prediction. Simulation results based on measured sensor data from the Global Navigation Satellite System (GNSS) deformation monitoring system demonstrated that: (1) the Kalman filter is capable of denoising the bridge deformation monitoring data; (2) the prediction accuracy of the proposed Kalman-ARIMA-GARCH model is satisfactory, where the mean absolute error increases only from 3.402 mm to 5.847 mm with the increment of the prediction step; and (3) in comparison to the Kalman-ARIMA model, the Kalman-ARIMA-GARCH model results in superior prediction accuracy as it includes partial nonlinear characteristics (heteroscedasticity); the mean absolute error of five-step prediction using the proposed model is improved by 10.12%.
This paper provides a new way for structural behavior prediction based on data processing, which can lay a foundation for the early warning of bridge health monitoring system based on sensor data using sensing technology.
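The first stage of the pipeline, Kalman-filter denoising of raw deformation data, can be sketched for a scalar random-walk state model. The noise variances and the synthetic deformation series below are illustrative assumptions, not GNSS monitoring data.

```python
import numpy as np

def kalman_denoise(z, q=1e-2, r=0.25):
    """Scalar random-walk Kalman filter, standing in for the pipeline's
    denoising stage (q: process noise variance, r: measurement noise
    variance; both values are illustrative assumptions)."""
    x, p = z[0], 1.0                     # initial state estimate and variance
    out = np.empty(len(z))
    for i, zi in enumerate(z):
        p = p + q                        # predict under the random-walk model
        k = p / (p + r)                  # Kalman gain
        x = x + k * (zi - x)             # correct with the new measurement
        p = (1.0 - k) * p
        out[i] = x
    return out

rng = np.random.default_rng(0)
truth = np.linspace(0.0, 5.0, 200)                # slow synthetic drift (mm)
noisy = truth + rng.normal(0.0, 0.5, truth.size)  # GNSS-like measurement noise
smooth = kalman_denoise(noisy)                    # input for the ARIMA stage
```

The smoothed series would then feed the ARIMA stage, with GARCH modeling the residual heteroscedasticity.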
Integrating Articulatory Constraints into Models of Second Language Phonological Acquisition
ERIC Educational Resources Information Center
Colantoni, Laura; Steele, Jeffrey
2008-01-01
Models such as Eckman's markedness differential hypothesis, Flege's speech learning model, and Brown's feature-based theory of perception seek to explain and predict the relative difficulty second language (L2) learners face when acquiring new or similar sounds. In this paper, we test their predictive adequacy as concerns native English speakers'…
Orientation-dependent integral equation theory for a two-dimensional model of water
NASA Astrophysics Data System (ADS)
Urbič, T.; Vlachy, V.; Kalyuzhnyi, Yu. V.; Dill, K. A.
2003-03-01
We develop an integral equation theory that applies to strongly associating orientation-dependent liquids, such as water. In an earlier treatment, we developed a Wertheim integral equation theory (IET) that we tested against NPT Monte Carlo simulations of the two-dimensional Mercedes Benz model of water. The main approximation in the earlier calculation was an orientational averaging in the multidensity Ornstein-Zernike equation. Here we improve the theory by explicit introduction of an orientation dependence in the IET, based upon expanding the two-particle angular correlation function in orthogonal basis functions. We find that the new orientation-dependent IET (ODIET) yields a considerable improvement of the predicted structure of water, when compared to the Monte Carlo simulations. In particular, ODIET predicts more long-range order than the original IET, with hexagonal symmetry, as expected for the hydrogen bonded ice in this model. The new theoretical approximation still errs in some subtle properties; for example, it does not predict liquid water's density maximum with temperature or the negative thermal expansion coefficient.
Widder, Stefanie; Allen, Rosalind J; Pfeiffer, Thomas; Curtis, Thomas P; Wiuf, Carsten; Sloan, William T; Cordero, Otto X; Brown, Sam P; Momeni, Babak; Shou, Wenying; Kettle, Helen; Flint, Harry J; Haas, Andreas F; Laroche, Béatrice; Kreft, Jan-Ulrich; Rainey, Paul B; Freilich, Shiri; Schuster, Stefan; Milferstedt, Kim; van der Meer, Jan R; Großkopf, Tobias; Huisman, Jef; Free, Andrew; Picioreanu, Cristian; Quince, Christopher; Klapper, Isaac; Labarthe, Simon; Smets, Barth F; Wang, Harris; Soyer, Orkun S
2016-01-01
The importance of microbial communities (MCs) cannot be overstated. MCs underpin the biogeochemical cycles of the earth's soil, oceans and the atmosphere, and perform ecosystem functions that impact plants, animals and humans. Yet our ability to predict and manage the function of these highly complex, dynamically changing communities is limited. Building predictive models that link MC composition to function is a key emerging challenge in microbial ecology. Here, we argue that addressing this challenge requires close coordination of experimental data collection and method development with mathematical model building. We discuss specific examples where model–experiment integration has already resulted in important insights into MC function and structure. We also highlight key research questions that still demand better integration of experiments and models. We argue that such integration is needed to achieve significant progress in our understanding of MC dynamics and function, and we make specific practical suggestions as to how this could be achieved.
Liu, Yushan; Ge, Baoming; Abu-Rub, Haitham; ...
2016-06-14
In this study, an active power filter (APF) consisting of a half-bridge leg and an ac capacitor is integrated into the single-phase quasi-Z-source inverter (qZSI) to avoid the second harmonic power flowing into the dc side. The capacitor of the APF buffers the second harmonic power of the load, and the ac capacitor allows highly pulsating ac voltage, so that the capacitances of both the dc and ac sides can be small. A model predictive direct power control (DPC) is further proposed to achieve the purpose of this new topology by predicting the capacitor voltage of the APF at each sampling period and ensuring that the APF power tracks the second harmonic power of the single-phase qZSI. Simulation and experimental results verify the model predictive DPC for the APF-integrated single-phase qZSI.
Active lifestyles in older adults: an integrated predictive model of physical activity and exercise
Galli, Federica; Chirico, Andrea; Mallia, Luca; Girelli, Laura; De Laurentiis, Michelino; Lucidi, Fabio; Giordano, Antonio; Botti, Gerardo
2018-01-01
Physical activity and exercise have been identified as behaviors that preserve physical and mental health in older adults. The aim of the present study was to test the Integrated Behavior Change model in exercise and physical activity behaviors. The study evaluated two different samples of older adults: the first engaged in an exercise class, the second doing spontaneous physical activity. The key analyses relied on variance-based structural equation modeling, performed by means of the WarpPLS 6.0 statistical software. The analyses estimated the Integrated Behavior Change model in predicting exercise and physical activity, in a longitudinal design across two months of assessment. The tested models exhibited a good fit with the observed data derived from the model focusing on exercise, as well as with those derived from the model focusing on physical activity. Results also showed some effects and relations specific to each behavioral context. Results may form a starting point for future experimental and intervention research.
Estrada, Mica; Woodcock, Anna; Hernandez, Paul R.; Schultz, P. Wesley
2010-01-01
Students from several ethnic minority groups are underrepresented in the sciences, such that minority students more frequently drop out of the scientific career path than non-minority students. Viewed from a perspective of social influence, this pattern suggests that minority students do not integrate into the scientific community at the same rate as non-minority students. Kelman (1958, 2006) describes a tripartite integration model of social influence (TIMSI) by which a person orients to a social system. To test if this model predicts integration into the scientific community, we conducted analyses of data from a national panel of minority science students. A structural equation model framework showed that self-efficacy (operationalized consistent with Kelman’s ‘rule-orientation’) predicted student intentions to pursue a scientific career. However, when identification as a scientist and internalization of values are added to the model, self-efficacy becomes a poorer predictor of intention. Additional mediation analyses support the conclusion that while having scientific self-efficacy is important, identifying with and endorsing the values of the social system reflect a deeper integration and more durable motivation to persist as a scientist.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant, Claire; Ewart, Lorna; Muthas, Daniel
Nausea and vomiting are components of a complex mechanism that signals food avoidance and protection of the body against the absorption of ingested toxins. This response can also be triggered by pharmaceuticals. Predicting clinical nausea and vomiting liability for pharmaceutical agents based on pre-clinical data can be problematic as no single animal model is a universal predictor. Moreover, efforts to improve models are hampered by the lack of translational animal and human data in the public domain. AZD3514 is a novel, orally-administered compound that inhibits androgen receptor signaling and down-regulates androgen receptor expression. Here we have explored the utility ofmore » integrating data from several pre-clinical models to predict nausea and vomiting in the clinic. Single and repeat doses of AZD3514 resulted in emesis, salivation and gastrointestinal disturbances in the dog, and inhibited gastric emptying in rats after a single dose. AZD3514, at clinically relevant exposures, induced dose-responsive “pica” behaviour in rats after single and multiple daily doses, and induced retching and vomiting behaviour in ferrets after a single dose. We compare these data with the clinical manifestation of nausea and vomiting encountered in patients with castration-resistant prostate cancer receiving AZD3514. Our data reveal a striking relationship between the pre-clinical observations described and the experience of nausea and vomiting in the clinic. In conclusion, the emetic nature of AZD3514 was predicted across a range of pre-clinical models, and the approach presented provides a valuable framework for predicition of clinical nausea and vomiting. - Highlights: • Integrated pre-clinical data can be used to predict clinical nausea and vomiting. • Data integrated from standard toxicology studies is sufficient to make a prediction. • The use of the nausea algorithm developed by Parkinson (2012) aids the prediction. 
• Additional pre-clinical studies can be used to confirm and quantify the risk.
Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R
2017-01-01
Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
Integrating WEPP into the WEPS infrastructure
USDA-ARS?s Scientific Manuscript database
The Wind Erosion Prediction System (WEPS) and the Water Erosion Prediction Project (WEPP) share a common modeling philosophy, that of moving away from primarily empirically based models based on indices or "average conditions", and toward a more process based approach which can be evaluated using ac...
NASA Astrophysics Data System (ADS)
Wu, M. Q.; Pan, C. K.; Chan, V. S.; Li, G. Q.; Garofalo, A. M.; Jian, X.; Liu, L.; Ren, Q. L.; Chen, J. L.; Gao, X.; Gong, X. Z.; Ding, S. Y.; Qian, J. P.; Cfetr Physics Team
2018-04-01
Time-dependent integrated modeling of DIII-D ITER-like and high bootstrap current plasma ramp-up discharges has been performed with the equilibrium code EFIT, and the transport codes TGYRO and ONETWO. Electron and ion temperature profiles are simulated by TGYRO with the TGLF (SAT0 or VX model) turbulent and NEO neoclassical transport models. The VX model is a new empirical extension of the TGLF turbulent model [Jian et al., Nucl. Fusion 58, 016011 (2018)], which captures the physics of multi-scale interaction between low-k and high-k turbulence from nonlinear gyro-kinetic simulation. This model is demonstrated to accurately model low Ip discharges from the EAST tokamak. Time evolution of the plasma current density profile is simulated by ONETWO with the experimental current ramp-up rate. The general trend of the predicted evolution of the current density profile is consistent with that obtained from the equilibrium reconstruction with Motional Stark effect constraints. The predicted evolution of βN, li, and βP also agrees well with the experiments. For the ITER-like cases, the predicted electron and ion temperature profiles using TGLF_Sat0 agree closely with the experimentally measured profiles, and are demonstrably better than those from other proposed transport models. For the high bootstrap current case, the electron and ion temperature profiles are predicted better by the VX model. It is found that the SAT0 model works well at high Ip (>0.76 MA) while the VX model covers a wider range of plasma current (Ip > 0.6 MA). The results reported in this paper suggest that the developed integrated modeling could be a candidate for ITER and CFETR ramp-up engineering design modeling.
NASA Astrophysics Data System (ADS)
Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Chang, Kyung Hwan; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie
2017-08-01
The aim of this preliminary study is to integrate text-based data mining with a toxicity prediction modeling system for a big-data clinical decision support system in radiation oncology. The structured data were prepared from treatment plans, and the unstructured data were extracted by image pattern recognition of prostate-cancer dose-volume data from research articles crawled on the internet. We built an artificial neural network as a predictor of toxicity for organs at risk, using a text-based data mining approach to construct the model for bladder and rectum complication predictions. The pattern recognition method mined the unstructured dose-volume toxicity data with a detection accuracy of 97.9%. The confusion matrix and training of the neural network were obtained with 50 modeled plans (n = 50) for validation. The toxicity level was analyzed and the risk factors for 25% bladder, 50% bladder, 20% rectum, and 50% rectum were calculated by the artificial neural network algorithm. As a result, among the 50 modeled plans, 32 were predicted to cause complications and 18 to be complication-free. We integrated data mining and a toxicity modeling method for toxicity prediction using prostate cancer cases. It is shown that a preprocessing analysis using text-based data mining and prediction modeling can be expanded to personalized patient treatment decision support based on big data.
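The toxicity predictor described above is, at its core, a small neural network trained on dose-volume features. A minimal sketch of that idea follows, assuming synthetic dose-volume features and a toy complication label; none of the names, data, or thresholds reflect the authors' actual model:

```python
# Hedged sketch: train a tiny neural network on (synthetic) dose-volume
# features to classify plans as complication vs. non-complication.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for dose-volume histogram features (e.g., bladder V25%,
# bladder V50%, rectum V20%, rectum V50%) for 50 modeled plans.
X = rng.uniform(0.0, 1.0, size=(50, 4))
y = (X.sum(axis=1) > 2.0).astype(float)   # toy complication label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, trained by full-batch gradient descent on cross-entropy.
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(H @ W2 + b2).ravel()         # predicted complication probability
    g = (p - y)[:, None] / len(y)            # output-layer gradient
    gH = (g @ W2.T) * (1 - H**2)             # backprop through tanh
    W2 -= 0.5 * H.T @ g;  b2 -= 0.5 * g.sum(0)
    W1 -= 0.5 * X.T @ gH; b1 -= 0.5 * gH.sum(0)

pred = (p > 0.5).astype(float)
accuracy = (pred == y).mean()
```

On this separable toy data the network fits the training plans almost perfectly; a real model would of course be validated on held-out plans, as the study does with its confusion matrix.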
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paret, Paul P; DeVoto, Douglas J; Narumanchi, Sreekant V
Sintered silver has proven to be a promising candidate for use as a die-attach and substrate-attach material in automotive power electronics components. It holds promise of greater reliability than lead-based and lead-free solders, especially at higher temperatures (less than 200 degrees Celsius). Accurate predictive lifetime models of sintered silver need to be developed and its failure mechanisms thoroughly characterized before it can be deployed as a die-attach or substrate-attach material in wide-bandgap device-based packages. We present a finite element method (FEM) modeling methodology that can offer greater accuracy in predicting the failure of sintered silver under accelerated thermal cycling. A fracture mechanics-based approach is adopted in the FEM model, and J-integral/thermal cycle values are computed. In this paper, we outline the procedures for obtaining the J-integral/thermal cycle values in a computational model and report on the possible advantage of using these values as modeling parameters in a predictive lifetime model.
NASA Astrophysics Data System (ADS)
Hogue, T. S.; He, M.; Franz, K. J.; Margulis, S. A.; Vrugt, J. A.
2010-12-01
The current study presents an integrated uncertainty analysis and data assimilation approach to improve streamflow predictions while simultaneously providing meaningful estimates of the associated uncertainty. Study models include the National Weather Service (NWS) operational snow model (SNOW17) and rainfall-runoff model (SAC-SMA). The proposed approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. An ensemble Kalman filter (EnKF) is configured with the DREAM-identified uncertainty structure and applied to assimilating snow water equivalent data into the SNOW17 model for improved snowmelt simulations. Snowmelt estimates then serve as input to the SAC-SMA model to provide streamflow predictions at the basin outlet. The robustness and usefulness of the approach are evaluated for a snow-dominated watershed in the northern Sierra Mountains. This presentation describes the implementation of DREAM and EnKF into the coupled SNOW17 and SAC-SMA models and summarizes study results and findings.
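The assimilation step at the heart of this approach can be illustrated with a minimal stochastic ensemble Kalman filter update for a directly observed state. The variable names and numbers below are illustrative assumptions, not SNOW17/SAC-SMA internals:

```python
# Minimal sketch of an EnKF analysis step assimilating one snow water
# equivalent (SWE) observation into an ensemble of model states.
import numpy as np

rng = np.random.default_rng(1)

n_ens = 100
swe_forecast = rng.normal(250.0, 30.0, n_ens)   # prior ensemble of SWE (mm)
obs, obs_err = 300.0, 10.0                      # observed SWE and its std dev

# Perturbed observations, one per member (stochastic EnKF).
obs_pert = obs + rng.normal(0.0, obs_err, n_ens)

# Kalman gain from the ensemble variance (state observed directly, H = I).
var_f = swe_forecast.var(ddof=1)
K = var_f / (var_f + obs_err**2)

# Analysis ensemble: pulled toward the observation, with reduced spread.
swe_analysis = swe_forecast + K * (obs_pert - swe_forecast)
```

The analysis mean moves toward the (more certain) observation and the ensemble spread shrinks; in the study this updated state then feeds SAC-SMA for streamflow prediction.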
2017-01-01
Background Influenza is a viral respiratory disease capable of causing epidemics that represent a threat to communities worldwide. The rapidly growing availability of electronic “big data” from diagnostic and prediagnostic sources in health care and public health settings permits advance of a new generation of methods for local detection and prediction of winter influenza seasons and influenza pandemics. Objective The aim of this study was to present a method for integrated detection and prediction of influenza virus activity in local settings using electronically available surveillance data and to evaluate its performance by retrospective application on authentic data from a Swedish county. Methods An integrated detection and prediction method was formally defined based on a design rationale for influenza detection and prediction methods adapted for local surveillance. The novel method was retrospectively applied on data from the winter influenza season 2008-09 in a Swedish county (population 445,000). Outcome data represented individuals who met a clinical case definition for influenza (based on International Classification of Diseases version 10 [ICD-10] codes) from an electronic health data repository. Information from calls to a telenursing service in the county was used as syndromic data source. Results The novel integrated detection and prediction method is based on nonmechanistic statistical models and is designed for integration in local health information systems. The method is divided into separate modules for detection and prediction of local influenza virus activity. The function of the detection module is to alert for an upcoming period of increased load of influenza cases on local health care (using influenza-diagnosis data), whereas the function of the prediction module is to predict the timing of the activity peak (using syndromic data) and its intensity (using influenza-diagnosis data). 
For detection modeling, exponential regression was used based on the assumption that the beginning of a winter influenza season has an exponential growth of infected individuals. For prediction modeling, linear regression was applied to 7-day periods at a time in order to find the peak timing, whereas a derivative of a normal distribution density function was used to find the peak intensity. We found that the integrated detection and prediction method detected the 2008-09 winter influenza season on its starting day (optimal timeliness 0 days), whereas the predicted peak was estimated to occur 7 days ahead of the factual peak and the predicted peak intensity was estimated to be 26% lower than the factual intensity (6.3 compared with 8.5 influenza-diagnosis cases/100,000). Conclusions Our detection and prediction method is one of the first integrated methods specifically designed for local application on influenza data electronically available for surveillance. The performance of the method in a retrospective study indicates that further prospective evaluations of the methods are justified. PMID:28619700
Spreco, Armin; Eriksson, Olle; Dahlström, Örjan; Cowling, Benjamin John; Timpka, Toomas
2017-06-15
Influenza is a viral respiratory disease capable of causing epidemics that represent a threat to communities worldwide. The rapidly growing availability of electronic "big data" from diagnostic and prediagnostic sources in health care and public health settings permits advance of a new generation of methods for local detection and prediction of winter influenza seasons and influenza pandemics. The aim of this study was to present a method for integrated detection and prediction of influenza virus activity in local settings using electronically available surveillance data and to evaluate its performance by retrospective application on authentic data from a Swedish county. An integrated detection and prediction method was formally defined based on a design rationale for influenza detection and prediction methods adapted for local surveillance. The novel method was retrospectively applied on data from the winter influenza season 2008-09 in a Swedish county (population 445,000). Outcome data represented individuals who met a clinical case definition for influenza (based on International Classification of Diseases version 10 [ICD-10] codes) from an electronic health data repository. Information from calls to a telenursing service in the county was used as syndromic data source. The novel integrated detection and prediction method is based on nonmechanistic statistical models and is designed for integration in local health information systems. The method is divided into separate modules for detection and prediction of local influenza virus activity. The function of the detection module is to alert for an upcoming period of increased load of influenza cases on local health care (using influenza-diagnosis data), whereas the function of the prediction module is to predict the timing of the activity peak (using syndromic data) and its intensity (using influenza-diagnosis data). 
For detection modeling, exponential regression was used based on the assumption that the beginning of a winter influenza season has an exponential growth of infected individuals. For prediction modeling, linear regression was applied to 7-day periods at a time in order to find the peak timing, whereas a derivative of a normal distribution density function was used to find the peak intensity. We found that the integrated detection and prediction method detected the 2008-09 winter influenza season on its starting day (optimal timeliness 0 days), whereas the predicted peak was estimated to occur 7 days ahead of the factual peak and the predicted peak intensity was estimated to be 26% lower than the factual intensity (6.3 compared with 8.5 influenza-diagnosis cases/100,000). Our detection and prediction method is one of the first integrated methods specifically designed for local application on influenza data electronically available for surveillance. The performance of the method in a retrospective study indicates that further prospective evaluations of the methods are justified. ©Armin Spreco, Olle Eriksson, Örjan Dahlström, Benjamin John Cowling, Toomas Timpka. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.06.2017.
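The detection module's exponential-growth assumption can be sketched as a linear regression on log counts: a sustained positive fitted growth rate signals the start of a season. The daily counts and the alert threshold below are invented for illustration:

```python
# Hedged sketch: fit exponential growth to early-season daily influenza-
# diagnosis counts via least squares on log counts, then alert on the rate.
import math

# Toy daily influenza-diagnosis counts per 100,000 at the start of a season.
counts = [1.0, 1.3, 1.8, 2.4, 3.1, 4.2]
days = list(range(len(counts)))
logs = [math.log(c) for c in counts]

# Ordinary least-squares slope of log(count) vs. day.
n = len(days)
mean_x = sum(days) / n
mean_y = sum(logs) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, logs)) / \
        sum((x - mean_x) ** 2 for x in days)

growth_rate = slope                       # per-day exponential growth rate
doubling_time = math.log(2) / growth_rate # days for cases to double
alert = growth_rate > 0.05                # illustrative alert threshold
```

With these toy counts the fitted growth rate is about 0.29 per day (cases doubling roughly every 2.4 days), which would trigger the alert.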
Predicting Athletes' Pre-Exercise Fluid Intake: A Theoretical Integration Approach.
Li, Chunxiao; Sun, Feng-Hua; Zhang, Liancheng; Chan, Derwin King Chung
2018-05-21
Pre-exercise fluid intake is an important healthy behavior for maintaining athletes’ sports performance and health. However, athletes’ behavioral adherence to fluid intake and its underlying psychological mechanisms have not been investigated. This prospective study aimed to use a health psychology model that integrates the self-determination theory and the theory of planned behavior for understanding pre-exercise fluid intake among athletes. Participants (n = 179) were athletes from college sport teams who completed surveys at two time points. Baseline (Time 1) assessment comprised psychological variables of the integrated model (i.e., autonomous and controlled motivation, attitude, subjective norm, perceived behavioral control, and intention) and fluid intake (i.e., behavior) was measured prospectively at one month (Time 2). Path analysis showed that the positive association between autonomous motivation and intention was mediated by subjective norm and perceived behavioral control. Controlled motivation positively predicted the subjective norm. Intentions positively predicted pre-exercise fluid intake behavior. Overall, the pattern of results was generally consistent with the integrated model, and it was suggested that athletes’ pre-exercise fluid intake behaviors were associated with the motivational and social cognitive factors of the model. The research findings could be informative for coaches and sport scientists to promote athletes’ pre-exercise fluid intake behaviors.
A changing climate: impacts on human exposures to O3 using an integrated modeling methodology
Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposu...
ERIC Educational Resources Information Center
Moore, Corey L.; Wang, Ningning; Washington, Janique Tynez
2017-01-01
Purpose: This study assessed and demonstrated the efficacy of two select empirical forecast models (i.e., autoregressive integrated moving average [ARIMA] model vs. grey model [GM]) in accurately predicting state vocational rehabilitation agency (SVRA) rehabilitation success rate trends across six different racial and ethnic population cohorts…
Integrating models to predict regional haze from wildland fire.
D. McKenzie; S.M. O' Neill; N. Larkin; R.A. Norheim
2006-01-01
Visibility impairment from regional haze is a significant problem throughout the continental United States. A substantial portion of regional haze is produced by smoke from prescribed and wildland fires. Here we describe the integration of four simulation models, an array of GIS raster layers, and a set of algorithms for fire-danger calculations into a modeling...
An Exercise Health Simulation Method Based on Integrated Human Thermophysiological Model
Chen, Xiaohui; Yu, Liang; Yang, Kaixing
2017-01-01
Healthy exercise has garnered keen research interest over the past few years. It is known that participation in a regular exercise program can help improve various aspects of cardiovascular function and reduce the risk of illness. However, exercise accidents such as dehydration, exertional heatstroke, and even sudden death deserve attention. If these accidents could be analyzed and predicted before they happen, disease or mortality could be alleviated or avoided. To achieve this objective, an exercise health simulation approach is proposed, in which an integrated human thermophysiological model consisting of a human thermal regulation model and a nonlinear heart rate regulation model is reported. The human thermoregulatory mechanism as well as the heart rate response mechanism during exercise can be simulated. On the basis of the simulated physiological indicators, a fuzzy finite state machine is constructed to obtain the possible health transition sequence and predict the exercise health status. The experimental results show that our integrated exercise thermophysiological model can numerically simulate the thermal and physiological processes of the human body during exercise, and that the predicted exercise health transition sequence from the finite state machine can be used in healthcare. PMID:28702074
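The state-machine idea above can be illustrated with a crisp (non-fuzzy) sketch that maps simulated indicators to a health transition sequence. The states, thresholds, and readings are assumptions for illustration, not the paper's calibrated fuzzy rules:

```python
# Hedged sketch: map simulated physiological indicators (core temperature,
# heart rate) to discrete health states, and record the transition sequence.
def health_state(core_temp_c, heart_rate_bpm):
    # Illustrative thresholds only; the paper uses fuzzy membership functions.
    if core_temp_c >= 40.0 or heart_rate_bpm >= 190:
        return "danger"      # e.g., risk of exertional heatstroke
    if core_temp_c >= 38.5 or heart_rate_bpm >= 160:
        return "warning"
    return "normal"

# Simulated (temperature °C, heart rate bpm) readings over an exercise session.
readings = [(36.9, 90), (37.8, 130), (38.7, 165), (39.2, 175), (40.1, 192)]

sequence = []                # health transition sequence
for temp, hr in readings:
    state = health_state(temp, hr)
    if not sequence or sequence[-1] != state:
        sequence.append(state)
```

For these toy readings the transition sequence is normal → warning → danger, the kind of trajectory the paper's fuzzy machine would flag before an accident occurs.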
A model of interval timing by neural integration
Simen, Patrick; Balci, Fuat; deSouza, Laura; Cohen, Jonathan D.; Holmes, Philip
2011-01-01
We show that simple assumptions about neural processing lead to a model of interval timing as a temporal integration process, in which a noisy firing-rate representation of time rises linearly on average toward a response threshold over the course of an interval. Our assumptions include: that neural spike trains are approximately independent Poisson processes; that correlations among them can be largely cancelled by balancing excitation and inhibition; that neural populations can act as integrators; and that the objective of timed behavior is maximal accuracy and minimal variance. The model accounts for a variety of physiological and behavioral findings in rodents, monkeys and humans, including ramping firing rates between the onset of reward-predicting cues and the receipt of delayed rewards, and universally scale-invariant response time distributions in interval timing tasks. It furthermore makes specific, well-supported predictions about the skewness of these distributions, a feature of timing data that is usually ignored. The model also incorporates a rapid (potentially one-shot) duration-learning procedure. Human behavioral data support the learning rule’s predictions regarding learning speed in sequences of timed responses. These results suggest that simple, integration-based models should play as prominent a role in interval timing theory as they do in theories of perceptual decision making, and that a common neural mechanism may underlie both types of behavior. PMID:21697374
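The core mechanism, a noisy firing-rate accumulator rising linearly to threshold, can be simulated directly. With Poisson-like noise whose variance scales with the drift (in the spirit of the model's spiking assumptions), first-passage times show the scale-invariant coefficient of variation the paper emphasizes. Parameter values here are illustrative:

```python
# Hedged sketch: noisy linear accumulation to a threshold as an interval timer.
import numpy as np

rng = np.random.default_rng(2)

def first_passage_times(drift, threshold=1.0, c=0.1, dt=0.002, n=500):
    """Simulate n accumulator paths; return time of first threshold crossing."""
    noise = c * np.sqrt(drift)            # Poisson-like: noise variance ∝ rate
    steps = int(5 * threshold / drift / dt)   # generous simulation horizon
    incr = drift * dt + noise * np.sqrt(dt) * rng.normal(size=(n, steps))
    paths = np.cumsum(incr, axis=1)
    idx = (paths >= threshold).argmax(axis=1)  # first index at/above threshold
    return (idx + 1) * dt

rt_short = first_passage_times(drift=1.0)  # times a ~1 s interval
rt_long = first_passage_times(drift=0.5)   # halving the drift doubles the mean

cv_short = rt_short.std() / rt_short.mean()
cv_long = rt_long.std() / rt_long.mean()
```

Mean response time scales as threshold/drift, while the coefficient of variation stays roughly constant across the two intervals, i.e. the scale-invariant timing distributions described in the abstract.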
Eppinger, Ben; Walter, Maik; Li, Shu-Chen
2017-04-01
In this study, we investigated the interplay of habitual (model-free) and goal-directed (model-based) decision processes by using a two-stage Markov decision task in combination with event-related potentials (ERPs) and computational modeling. To manipulate the demands on model-based decision making, we applied two experimental conditions with different probabilities of transitioning from the first to the second stage of the task. As we expected, when the stage transitions were more predictable, participants showed greater model-based (planning) behavior. Consistent with this result, we found that stimulus-evoked parietal (P300) activity at the second stage of the task increased with the predictability of the state transitions. However, the parietal activity also reflected model-free information about the expected values of the stimuli, indicating that at this stage of the task both types of information are integrated to guide decision making. Outcome-related ERP components only reflected reward-related processes: Specifically, a medial prefrontal ERP component (the feedback-related negativity) was sensitive to negative outcomes, whereas a component that is elicited by reward (the feedback-related positivity) increased as a function of positive prediction errors. Taken together, our data indicate that stimulus-locked parietal activity reflects the integration of model-based and model-free information during decision making, whereas feedback-related medial prefrontal signals primarily reflect reward-related decision processes.
Baquero, Oswaldo Santos; Santana, Lidia Maria Reis; Chiaravalloti-Neto, Francisco
2018-01-01
Globally, the number of dengue cases has been on the increase since 1990 and this trend has also been found in Brazil and its most populated city, São Paulo. Surveillance systems based on predictions allow for timely decision making processes, and in turn, timely and efficient interventions to reduce the burden of the disease. We conducted a comparative study of dengue predictions in São Paulo city to test the performance of trained seasonal autoregressive integrated moving average models, generalized additive models and artificial neural networks. We also used a naïve model as a benchmark. A generalized additive model with lags of the number of cases and meteorological variables had the best performance, predicted epidemics of unprecedented magnitude and its performance was 3.16 times higher than the benchmark and 1.47 times higher than the next best performing model. The predictive models captured the seasonal patterns but differed in their capacity to anticipate large epidemics, and all outperformed the benchmark. In addition to being able to predict epidemics of unprecedented magnitude, the best model had computational advantages, since its training and tuning were straightforward and required seconds or at most a few minutes. These are desired characteristics to provide timely results for decision makers. However, it should be noted that predictions are made just one month ahead and this is a limitation that future studies could try to reduce.
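The benchmark comparison described above can be sketched against a naïve forecaster that predicts next month's cases as this month's count. The case counts, model predictions, and resulting skill ratio below are made up for illustration:

```python
# Hedged sketch: score a one-month-ahead forecast against a naïve last-value
# benchmark using a ratio of mean squared errors (higher than 1 = model wins).
cases = [120, 150, 210, 400, 380, 260, 180, 140]    # toy monthly case counts
model_preds = [140, 190, 350, 390, 300, 200, 150]   # some model, 1 month ahead

naive_preds = cases[:-1]     # naïve benchmark: repeat the last observed value
actual = cases[1:]           # what each one-month-ahead forecast targets

def mse(preds, truth):
    return sum((p - t) ** 2 for p, t in zip(preds, truth)) / len(truth)

# Relative skill: naïve error divided by model error.
skill_ratio = mse(naive_preds, actual) / mse(model_preds, actual)
```

A skill ratio above 1 means the model outperforms the benchmark; the paper reports relative performances of its candidate models in the same spirit.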
Designing and benchmarking the MULTICOM protein structure prediction system
2013-01-01
Background Predicting protein structure from sequence is one of the most significant and challenging problems in bioinformatics. Numerous bioinformatics techniques and tools have been developed to tackle almost every aspect of protein structure prediction ranging from structural feature prediction, template identification and query-template alignment to structure sampling, model quality assessment, and model refinement. How to synergistically select, integrate and improve the strengths of the complementary techniques at each prediction stage and build a high-performance system is becoming a critical issue for constructing a successful, competitive protein structure predictor. Results Over the past several years, we have constructed a standalone protein structure prediction system MULTICOM that combines multiple sources of information and complementary methods at all five stages of the protein structure prediction process including template identification, template combination, model generation, model assessment, and model refinement. The system was blindly tested during the ninth Critical Assessment of Techniques for Protein Structure Prediction (CASP9) in 2010 and yielded very good performance. In addition to studying the overall performance on the CASP9 benchmark, we thoroughly investigated the performance and contributions of each component at each stage of prediction. Conclusions Our comprehensive and comparative study not only provides useful and practical insights about how to select, improve, and integrate complementary methods to build a cutting-edge protein structure prediction system but also identifies a few new sources of information that may help improve the design of a protein structure prediction system. Several components used in the MULTICOM system are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/. PMID:23442819
MolIDE: a homology modeling framework you can click with.
Canutescu, Adrian A; Dunbrack, Roland L
2005-06-15
Molecular Integrated Development Environment (MolIDE) is an integrated application designed to provide homology modeling tools and protocols under a uniform, user-friendly graphical interface. Its main purpose is to combine the most frequent modeling steps in a semi-automatic, interactive way, guiding the user from the target protein sequence to the final three-dimensional protein structure. The typical basic homology modeling process is composed of building sequence profiles of the target sequence family, secondary structure prediction, sequence alignment with PDB structures, assisted alignment editing, side-chain prediction and loop building. All of these steps are available through a graphical user interface. MolIDE's user-friendly and streamlined interactive modeling protocol allows the user to focus on the important modeling questions, hiding from the user the raw data generation and conversion steps. MolIDE was designed from the ground up as an open-source, cross-platform, extensible framework. This allows developers to integrate additional third-party programs to MolIDE. http://dunbrack.fccc.edu/molide/molide.php rl_dunbrack@fccc.edu.
Improving wave forecasting by integrating ensemble modelling and machine learning
NASA Astrophysics Data System (ADS)
O'Donncha, F.; Zhang, Y.; James, S. C.
2017-12-01
Modern smart-grid networks use technologies to instantly relay information on supply and demand to support effective decision making. Integration of renewable-energy resources with these systems demands accurate forecasting of energy production (and demand) capacities. For wave-energy converters, this requires wave-condition forecasting to enable estimates of energy production. Current operational wave forecasting systems exhibit substantial errors with wave-height RMSEs of 40 to 60 cm being typical, which limits the reliability of energy-generation predictions thereby impeding integration with the distribution grid. In this study, we integrate physics-based models with statistical learning aggregation techniques that combine forecasts from multiple, independent models into a single "best-estimate" prediction of the true state. The Simulating Waves Nearshore physics-based model is used to compute wind- and currents-augmented waves in the Monterey Bay area. Ensembles are developed based on multiple simulations perturbing input data (wave characteristics supplied at the model boundaries and winds) to the model. A learning-aggregation technique uses past observations and past model forecasts to calculate a weight for each model. The aggregated forecasts are compared to observation data to quantify the performance of the model ensemble and aggregation techniques. The appropriately weighted ensemble model outperforms an individual ensemble member with regard to forecasting wave conditions.
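The learning-aggregation idea, weighting each ensemble member by its skill against past observations, can be sketched with a simple inverse-MSE weighting rule. This is a stand-in for the paper's actual aggregation technique, and all numbers are invented:

```python
# Hedged sketch: combine ensemble members into a single best-estimate
# wave-height forecast, weighting each member by past skill (inverse MSE).
import numpy as np

# Past wave-height observations (m) and three members' hindcasts at those times.
obs = np.array([1.2, 1.5, 1.1, 1.8, 2.0])
hindcasts = np.array([
    [1.0, 1.4, 1.0, 1.6, 1.8],   # member 1: biased low
    [1.3, 1.6, 1.2, 1.9, 2.1],   # member 2: close to observations
    [1.8, 2.2, 1.7, 2.5, 2.9],   # member 3: biased high
])

mse = ((hindcasts - obs) ** 2).mean(axis=1)
weights = (1.0 / mse) / (1.0 / mse).sum()    # skill-based weights, sum to 1

# Aggregate a new forecast from the three members' latest predictions.
new_member_forecasts = np.array([1.4, 1.7, 2.3])
best_estimate = weights @ new_member_forecasts
```

The historically most accurate member dominates the weighted combination, mirroring how the aggregated forecast can outperform any individual ensemble member.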
Foliage Density Distribution and Prediction of Intensively Managed Loblolly Pine
Yujia Zhang; Bruce E. Borders; Rodney E. Will; Hector De Los Santos Posadas
2004-01-01
The pipe model theory says that foliage biomass is proportional to the sapwood area at the base of the live crown. This knowledge was incorporated in an effort to develop a foliage biomass prediction model by integrating a stipulated foliage biomass distribution function within the crown. This model was parameterized using data collected from intensively managed...
Xu, Yifang; Collins, Leslie M
2004-04-01
The incorporation of low levels of noise into an electrical stimulus has been shown to improve auditory thresholds in some human subjects (Zeng et al., 2000). In this paper, thresholds for noise-modulated pulse-train stimuli are predicted utilizing a stochastic neural-behavioral model of ensemble fiber responses to bi-phasic stimuli. The neural refractory effect is described using a Markov model for a noise-free pulse-train stimulus and a closed-form solution for the steady-state neural response is provided. For noise-modulated pulse-train stimuli, a recursive method using the conditional probability is utilized to track the neural responses to each successive pulse. A neural spike count rule has been presented for both threshold and intensity discrimination under the assumption that auditory perception occurs via integration over a relatively long time period (Bruce et al., 1999). An alternative approach originates from the hypothesis of the multilook model (Viemeister and Wakefield, 1991), which argues that auditory perception is based on several shorter time integrations and may suggest an NofM model for prediction of pulse-train threshold. This motivates analyzing the neural response to each individual pulse within a pulse train, which is considered to be the brief look. A logarithmic rule is hypothesized for pulse-train threshold. Predictions from the multilook model are shown to match trends in psychophysical data for noise-free stimuli that are not always matched by the long-time integration rule. Theoretical predictions indicate that threshold decreases as noise variance increases. Theoretical models of the neural response to pulse-train stimuli not only reduce calculational overhead but also facilitate utilization of signal detection theory and are easily extended to multichannel psychophysical tasks.
NASA Astrophysics Data System (ADS)
Quinn Thomas, R.; Brooks, Evan B.; Jersild, Annika L.; Ward, Eric J.; Wynne, Randolph H.; Albaugh, Timothy J.; Dinon-Aldridge, Heather; Burkhart, Harold E.; Domec, Jean-Christophe; Fox, Thomas R.; Gonzalez-Benecke, Carlos A.; Martin, Timothy A.; Noormets, Asko; Sampson, David A.; Teskey, Robert O.
2017-07-01
Predicting how forest carbon cycling will change in response to climate change and management depends on the collective knowledge from measurements across environmental gradients, ecosystem manipulations of global change factors, and mathematical models. Formally integrating these sources of knowledge through data assimilation, or model-data fusion, allows the use of past observations to constrain model parameters and estimate prediction uncertainty. Data assimilation (DA) focused on the regional scale has the opportunity to integrate data from both environmental gradients and experimental studies to constrain model parameters. Here, we introduce a hierarchical Bayesian DA approach (Data Assimilation to Predict Productivity for Ecosystems and Regions, DAPPER) that uses observations of carbon stocks, carbon fluxes, water fluxes, and vegetation dynamics from loblolly pine plantation ecosystems across the southeastern US to constrain parameters in a modified version of the Physiological Principles Predicting Growth (3-PG) forest growth model. The observations included major experiments that manipulated atmospheric carbon dioxide (CO2) concentration, water, and nutrients, along with nonexperimental surveys that spanned environmental gradients across an 8.6 × 10⁵ km² region. We optimized regionally representative posterior distributions for model parameters, which dependably predicted data from plots withheld from the data assimilation. While the mean bias in predictions of nutrient fertilization experiments, irrigation experiments, and CO2 enrichment experiments was low, future work needs to focus on modifications to model structures that decrease the bias in predictions of drought experiments. Predictions of how growth responded to elevated CO2 strongly depended on whether ecosystem experiments were assimilated and whether the assimilated field plots in the CO2 study were allowed to have different mortality parameters than the other field plots in the region. 
We present predictions of stem biomass productivity under elevated CO2, decreased precipitation, and increased nutrient availability that include estimates of uncertainty for the southeastern US. Overall, we (1) demonstrated how three decades of research in southeastern US planted pine forests can be used to develop DA techniques that use multiple locations, multiple data streams, and multiple ecosystem experiment types to optimize parameters and (2) developed a tool that natural resource managers can use to generate future predictions of forest productivity, leveraging a rich dataset of integrated ecosystem observations across a region.
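The core parameter-constraint step described above, using observations to update model parameters and carry the resulting uncertainty into predictions, can be sketched with a minimal random-walk Metropolis sampler. The one-parameter toy "growth model", the observation values, and the noise level below are illustrative assumptions, not DAPPER's actual structure or data:

```python
import math
import random

def log_likelihood(theta, obs, predict, sigma):
    # Gaussian log-likelihood of observations given the model prediction
    return sum(-0.5 * ((y - predict(theta, x)) / sigma) ** 2 for x, y in obs)

def metropolis(obs, predict, sigma, theta0, step, n_iter, seed=1):
    """Random-walk Metropolis: draw posterior samples of one parameter
    under a flat prior (a caricature of the hierarchical Bayesian DA step)."""
    random.seed(seed)
    theta, ll = theta0, log_likelihood(theta0, obs, predict, sigma)
    samples = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        ll_prop = log_likelihood(prop, obs, predict, sigma)
        if math.log(random.random()) < ll_prop - ll:  # accept/reject
            theta, ll = prop, ll_prop
        samples.append(theta)
    return samples

# Toy "growth model": productivity proportional to the parameter theta.
obs = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]          # (driver, observed stock)
predict = lambda theta, x: theta * x
samples = metropolis(obs, predict, sigma=0.3, theta0=1.0, step=0.1, n_iter=5000)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # discard burn-in
```

The retained samples approximate the posterior distribution; predictions made with each sample give the prediction uncertainty the abstract refers to.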
NASA Technical Reports Server (NTRS)
Gracey, Renee; Bartoszyk, Andrew; Cofie, Emmanuel; Comber, Brian; Hartig, George; Howard, Joseph; Sabatke, Derek; Wenzel, Greg; Ohl, Raymond
2016-01-01
The James Webb Space Telescope includes the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI), including a Guider. We performed extensive structural, thermal, and optical performance (STOP) modeling in support of all phases of ISIM development. In this paper, we focus on modeling and results associated with test and verification. ISIM's test program is bounded by ground environments, most notably the 1g and test-chamber thermal environments. This paper describes STOP modeling used to predict ISIM system performance in 0g and at various on-orbit temperature environments. The predictions are used to project results obtained during testing to on-orbit performance.
Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan
2016-01-01
Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.
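The central idea above, scoring each metabolite by the abundance-weighted net capacity of community members to produce or consume it, can be sketched in a few lines. The species names, metabolites, and ±1 capacities below are illustrative stand-ins, not the framework's actual genomic reconstructions:

```python
def community_metabolic_potential(abundances, species_net):
    """Score each metabolite by summing, over species, relative abundance
    times net production capacity (+1 synthesis, -1 degradation per genome).
    A toy analogue of the community-wide metabolite-turnover prediction."""
    scores = {}
    for sp, ab in abundances.items():
        for met, net in species_net.get(sp, {}).items():
            scores[met] = scores.get(met, 0.0) + ab * net
    return scores

# Hypothetical per-genome capacities for two vaginal taxa.
species_net = {
    "L_crispatus": {"lactate": +1, "putrescine": 0},
    "G_vaginalis": {"lactate": -1, "putrescine": +1},
}
healthy = community_metabolic_potential(
    {"L_crispatus": 0.9, "G_vaginalis": 0.1}, species_net)
dysbiotic = community_metabolic_potential(
    {"L_crispatus": 0.1, "G_vaginalis": 0.9}, species_net)
```

Comparing such predicted scores against measured metabolite abundances across samples is what lets the framework flag well-predicted metabolites and, conversely, candidates for environmental control.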
Postnova, Svetlana; Robinson, Peter A; Postnov, Dmitry D
2013-01-01
Shift work has become an integral part of our life, with almost 20% of the population in developed countries being involved in different shift schedules. However, the atypical work times, especially the night shifts, are associated with reduced quality and quantity of sleep, which leads to increased sleepiness, often culminating in accidents. It has been demonstrated that shift workers' sleepiness can be improved by proper scheduling of light exposure and optimized shift timing. Here, an integrated physiologically-based model of sleep-wake cycles is used to predict adaptation to shift work in different light conditions and for different shift start times for a schedule of four consecutive days of work. The integrated model combines a model of the ascending arousal system in the brain that controls the sleep-wake switch and a human circadian pacemaker model. To validate the application of the integrated model and demonstrate its utility, its dynamics are adjusted to achieve a fit to published experimental results showing adaptation of night shift workers (n = 8) in conditions of either bright or regular lighting. Further, the model is used to predict the shift workers' adaptation to the same shift schedule, but for conditions not considered in the experiment. The model demonstrates that the intensity of shift light can be reduced fourfold from that used in the experiment and still produce good adaptation to night work. The model predicts that sleepiness of the workers during night shifts on a protocol with either bright or regular lighting can be significantly improved by starting the shift earlier in the night, e.g., at 21:00 instead of 00:00. Finally, the study predicts that people of the same chronotype, i.e., with identical sleep times in normal conditions, can have drastically different responses to shift work depending on their intrinsic circadian and homeostatic parameters.
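The qualitative interplay of homeostatic and circadian drives behind the shift-start result can be caricatured with a generic two-process estimate. This is not the published arousal-system/pacemaker model; the rate constants, circadian peak time, and weights below are made-up illustrative values:

```python
import math

def sleepiness(t, wake=7.0, tau=18.0):
    """Toy two-process sleepiness estimate for a worker treated as
    continuously awake since 07:00 on day 1. t is hours from midnight,
    day 1 (so t > 24 is the next calendar day)."""
    homeostatic = 1.0 - math.exp(-(t - wake) / tau)  # rises while awake
    # circadian sleepiness peaking near 04:00 on day 2 (t = 28)
    circadian = 0.5 * math.cos(2.0 * math.pi * (t - 28.0) / 24.0)
    return homeostatic + circadian

def mean_shift_sleepiness(start, hours=8.0, steps=64):
    """Average predicted sleepiness over one 8 h shift."""
    return sum(sleepiness(start + hours * k / steps) for k in range(steps)) / steps

early = mean_shift_sleepiness(21.0)   # shift 21:00-05:00
late = mean_shift_sleepiness(24.0)    # shift 00:00-08:00
# The earlier start overlaps less of the circadian sleepiness peak
# and ends before homeostatic pressure grows as large.
```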
[Application of ARIMA model on prediction of malaria incidence].
Jing, Xia; Hua-Xun, Zhang; Wen, Lin; Su-Jian, Pei; Ling-Cong, Sun; Xiao-Rong, Dong; Mu-Min, Cao; Dong-Ni, Wu; Shunxiang, Cai
2016-01-29
To predict the incidence of local malaria in Hubei Province, an Autoregressive Integrated Moving Average (ARIMA) model was applied. SPSS 13.0 software was used to construct the ARIMA model based on the monthly local malaria incidence in Hubei Province from 2004 to 2009. The local malaria incidence data of 2010 were used for model validation and evaluation. The ARIMA(1, 1, 1)(1, 1, 0)₁₂ model was found to be relatively optimal, with an AIC of 76.085 and an SBC of 84.395. All the actual incidence data fell within the 95% CI of the model's predicted values, so the prediction performance of the model was acceptable. The ARIMA model could effectively fit and predict the incidence of local malaria in Hubei Province.
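The differencing structure behind a seasonal model such as ARIMA(1, 1, 1)(1, 1, 0)₁₂ can be sketched without a statistics package: difference once at lag 1 and once at lag 12, fit an autoregressive term to what remains, then invert both differences. The sketch below omits the MA(1) term and uses synthetic monthly data, so it only illustrates the mechanics, not the paper's fitted model:

```python
import math
import random

def difference(series, lag):
    return [series[i] - series[i - lag] for i in range(lag, len(series))]

def fit_ar1(z):
    """Least-squares AR(1) coefficient of a (near) zero-mean series."""
    den = sum(v * v for v in z[:-1])
    if den == 0.0:
        return 0.0
    return sum(z[i] * z[i - 1] for i in range(1, len(z))) / den

def forecast_next(series, season=12):
    """One-step forecast: first + seasonal differencing, AR(1) on the
    remainder, then undo the seasonal and first differences."""
    z = difference(difference(series, 1), season)
    z_next = fit_ar1(z) * z[-1]
    d_next = z_next + series[-season] - series[-season - 1]  # undo seasonal diff
    return series[-1] + d_next                               # undo first diff

# Synthetic monthly counts: linear trend + annual cycle + noise.
random.seed(0)
y = [10 + 0.2 * t + 3 * math.sin(2 * math.pi * t / 12) + random.gauss(0, 0.3)
     for t in range(72)]
pred = forecast_next(y)   # forecast for month 72 (true trend value: 24.4)
```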
NASA Astrophysics Data System (ADS)
Aye, S. A.; Heyns, P. S.
2017-02-01
This paper proposes an optimal Gaussian process regression (GPR) for the prediction of remaining useful life (RUL) of slow speed bearings based on a novel degradation assessment index obtained from acoustic emission signal. The optimal GPR is obtained from an integration or combination of existing simple mean and covariance functions in order to capture the observed trend of the bearing degradation as well as the irregularities in the data. The resulting integrated GPR model provides an excellent fit to the data and improves over the simple GPR models that are based on simple mean and covariance functions. In addition, it achieves a low percentage error prediction of the remaining useful life of slow speed bearings. These findings are robust under varying operating conditions such as loading and speed and can be applied to nonlinear and nonstationary machine response signals useful for effective preventive machine maintenance purposes.
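The idea of combining simple covariance functions so one GP tracks both a degradation trend and local irregularity can be sketched as a sum of a linear kernel (trend) and a squared-exponential kernel (irregularity). The hyperparameters, noise level, and synthetic degradation index below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

def kernel(x1, x2, ell=1.0, sf=1.0, sl=0.2):
    """Composite covariance: squared-exponential (local irregularity)
    plus a linear term (overall degradation trend)."""
    d = x1[:, None] - x2[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2) + sl ** 2 * np.outer(x1, x2)

def gpr_predict(x, y, xs, noise=1e-2):
    """Standard GP posterior mean and pointwise std at test inputs xs."""
    K = kernel(x, x) + noise * np.eye(len(x))
    Ks = kernel(xs, x)
    mean = Ks @ np.linalg.solve(K, y)
    cov = kernel(xs, xs) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

x = np.linspace(0.0, 5.0, 20)
y = 0.5 * x + 0.2 * np.sin(3.0 * x)      # synthetic rising degradation index
mean, std = gpr_predict(x, y, np.array([2.5]))
```

Extrapolating the posterior mean until it crosses a failure threshold, and reading the uncertainty off the posterior std, is the usual route from such a fit to an RUL estimate.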
CHENG, JIANLIN; EICKHOLT, JESSE; WANG, ZHENG; DENG, XIN
2013-01-01
After decades of research, protein structure prediction remains a very challenging problem. In order to address the different levels of complexity of structural modeling, two types of modeling techniques — template-based modeling and template-free modeling — have been developed. Template-based modeling can often generate a moderate- to high-resolution model when a similar, homologous template structure is found for a query protein but fails if no template or only incorrect templates are found. Template-free modeling, such as fragment-based assembly, may generate models of moderate resolution for small proteins of low topological complexity. Seldom have the two techniques been integrated to improve protein modeling. Here we develop a recursive protein modeling approach to selectively and collaboratively apply template-based and template-free modeling methods to model template-covered (i.e., certain) and template-free (i.e., uncertain) regions of a protein. A preliminary implementation of the approach was tested on a number of hard modeling cases during the 9th Critical Assessment of Techniques for Protein Structure Prediction (CASP9) and successfully improved the quality of modeling in most of these cases. Recursive modeling can significantly reduce the complexity of protein structure modeling and integrate template-based and template-free modeling to improve the quality and efficiency of protein structure prediction. PMID:22809379
Amézquita, A; Weller, C L; Wang, L; Thippareddi, H; Burson, D E
2005-05-25
Numerous small meat processors in the United States have difficulties complying with the stabilization performance standards for preventing growth of Clostridium perfringens by 1 log10 cycle during cooling of ready-to-eat (RTE) products. These standards were established by the Food Safety and Inspection Service (FSIS) of the US Department of Agriculture in 1999. In recent years, several attempts have been made to develop predictive models for growth of C. perfringens within the range of cooling temperatures included in the FSIS standards. Those studies mainly focused on microbiological aspects, using hypothesized cooling rates. Conversely, studies dealing with heat transfer models to predict cooling rates in meat products do not address microbial growth. Integration of heat transfer relationships with C. perfringens growth relationships during cooling of meat products has been very limited. Therefore, a computer simulation scheme was developed to analyze heat transfer phenomena and temperature-dependent C. perfringens growth during cooling of cooked boneless cured ham. The temperature history of ham was predicted using a finite element heat diffusion model. Validation of heat transfer predictions used experimental data collected in commercial meat-processing facilities. For C. perfringens growth, a dynamic model was developed using Baranyi's nonautonomous differential equation. The bacterium's growth model was integrated into the computer program using predicted temperature histories as input values. For cooling cooked hams from 66.6 degrees C to 4.4 degrees C using forced air, the maximum deviation between predicted and experimental core temperature data was 2.54 degrees C. Predicted C. perfringens growth curves obtained from dynamic modeling showed good agreement with validated results for three different cooling scenarios. 
Mean absolute values of relative errors were below 6%, and deviations between predicted and experimental cell counts were within 0.37 log10 CFU/g. For a cooling process which was in exact compliance with the FSIS stabilization performance standards, a mean net growth of 1.37 log10 CFU/g was predicted. This study introduced the combination of engineering modeling and microbiological modeling as a useful quantitative tool for general food safety applications, such as risk assessment and hazard analysis and critical control points (HACCP) plans.
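The coupling described above, feeding a predicted cooling temperature history into a temperature-dependent growth model, can be sketched by integrating a growth rate along a cooling path. The sketch uses a cardinal-temperature (Rosso-type) rate curve with made-up parameters and a linear cooling path; it is not the paper's Baranyi model, finite element temperature prediction, or fitted C. perfringens parameters:

```python
import math

def growth_rate(temp_c):
    """Illustrative cardinal-temperature growth response (1/h): zero outside
    15-52 degrees C, peaking at 43 degrees C. Parameters are invented."""
    t_min, t_opt, t_max, mu_opt = 15.0, 43.0, 52.0, 2.5
    if temp_c <= t_min or temp_c >= t_max:
        return 0.0
    num = (temp_c - t_max) * (temp_c - t_min) ** 2
    den = (t_opt - t_min) * ((t_opt - t_min) * (temp_c - t_opt)
          - (t_opt - t_max) * (t_opt + t_min - 2.0 * temp_c))
    return mu_opt * num / den

def log_growth_during_cooling(t_cool_h, start_c=66.6, end_c=4.4, dt=0.01):
    """Euler-integrate log10 growth over a (here linear) cooling path;
    in the paper the path would come from the heat transfer model."""
    steps = int(t_cool_h / dt)
    log10_growth = 0.0
    for k in range(steps):
        temp = start_c + (end_c - start_c) * (k / steps)
        log10_growth += growth_rate(temp) * dt / math.log(10.0)
    return log10_growth

slow = log_growth_during_cooling(15.0)   # slow cooling: more time in the growth range
fast = log_growth_during_cooling(6.0)
```

The integral makes the stabilization trade-off explicit: the predicted log10 growth scales with the time the product spends inside the organism's growth temperature range.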
A Family of Well-Clear Boundary Models for the Integration of UAS in the NAS
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Narkawicz, Anthony; Chamberlain, James; Consiglio, Maria; Upchurch, Jason
2014-01-01
The FAA-sponsored Sense and Avoid Workshop for Unmanned Aircraft Systems (UAS) defines the concept of sense and avoid for remote pilots as "the capability of a UAS to remain well clear from and avoid collisions with other airborne traffic." Hence, a rigorous definition of well clear is fundamental to any separation assurance concept for the integration of UAS into civil airspace. This paper presents a family of well-clear boundary models based on the TCAS II Resolution Advisory logic. For these models, algorithms that predict well-clear violations along the aircraft's current trajectories are provided. These algorithms are analogous to conflict detection algorithms, but instead of predicting loss of separation, they predict whether well-clear violations will occur during a given lookahead time interval. Analytical techniques are used to study the properties and relationships satisfied by the models.
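A minimal version of such a violation predictor checks, under straight-line trajectories, whether the horizontal distance at the (clipped) time of closest approach falls below a threshold within the lookahead interval. This distance-only criterion is a deliberate simplification of the paper's TCAS-based well-clear definitions, and the threshold and lookahead values are illustrative:

```python
def predicts_violation(rel_pos, rel_vel, dthr=1.0, lookahead=300.0):
    """Predict a well-clear-style violation within the lookahead time (s),
    assuming straight-line trajectories. rel_pos/rel_vel are
    ownship-minus-intruder 2-D vectors (NM, NM/s); the criterion is
    horizontal distance below dthr at the time of closest approach."""
    sx, sy = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    # time of horizontal closest approach, clipped to [0, lookahead]
    tca = 0.0 if v2 == 0 else max(0.0, min(lookahead, -(sx * vx + sy * vy) / v2))
    cx, cy = sx + tca * vx, sy + tca * vy
    return (cx * cx + cy * cy) ** 0.5 < dthr

# Head-on geometry: 5 NM apart, closing at 0.1 NM/s -> violation predicted.
head_on = predicts_violation((5.0, 0.0), (-0.1, 0.0))
# Diverging traffic -> closest approach is now; no violation predicted.
diverging = predicts_violation((5.0, 0.0), (0.1, 0.0))
```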
Marsh, Herbert W; Pekrun, Reinhard; Murayama, Kou; Arens, A Katrin; Parker, Philip D; Guo, Jiesi; Dicke, Theresa
2018-02-01
Our newly proposed integrated academic self-concept model integrates 3 major theories of academic self-concept formation and developmental perspectives into a unified conceptual and methodological framework. Relations among math self-concept (MSC), school grades, test scores, and school-level contextual effects over 6 years, from the end of primary school through the first 5 years of secondary school (a representative sample of 3,370 German students, 42 secondary schools, 50% male, M age at grade 5 = 11.75) support the (1) internal/external frame of reference model: Math school grades had positive effects on MSC, but the effects of German grades were negative; (2) reciprocal effects (longitudinal panel) model: MSC was predictive of and predicted by math test scores and school grades; (3) big-fish-little-pond effect: The effects on MSC were negative for school-average achievement based on 4 indicators (primary school grades in math and German, school-track prior to the start of secondary school, math test scores in the first year of secondary school). Results for all 3 theoretical models were consistent across the 5 secondary school years: This supports the prediction of developmental equilibrium. This integration highlights the robustness of support over the potentially volatile early to middle adolescent period; the interconnectedness and complementarity of 3 ASC models; their counterbalancing strengths and weaknesses; and new theoretical, developmental, and substantive implications at their intersections. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
An integrated weather and sea-state forecasting system for the Arabian Peninsula (WASSF)
NASA Astrophysics Data System (ADS)
Kallos, George; Galanis, George; Spyrou, Christos; Mitsakou, Christina; Solomos, Stavros; Bartsotas, Nikolaos; Kalogrei, Christina; Athanaselis, Ioannis; Sofianos, Sarantis; Vervatis, Vassios; Axaopoulos, Panagiotis; Papapostolou, Alexandros; Qahtani, Jumaan Al; Alaa, Elyas; Alexiou, Ioannis; Beard, Daniel
2013-04-01
Nowadays, large industrial conglomerates such as Saudi ARAMCO require a series of weather and sea-state forecasting products that cannot be found in state meteorological offices or even from commercial data providers. The two major objectives of the system are prevention and mitigation of environmental problems, and early warning of local conditions associated with extreme weather events. The management and operations part is related to early warning of weather and sea-state events that affect operations of various facilities. The environmental part is related to air quality and especially the desert dust levels in the atmosphere. The components of the integrated system include: (i) a weather and desert dust prediction system with a forecasting horizon of 5 days, (ii) a wave analysis and prediction component for the Red Sea and Arabian Gulf, (iii) an ocean circulation and tidal analysis and prediction component for both the Red Sea and Arabian Gulf, and (iv) an aviation part specializing in the vertical structure of the atmosphere and extreme events that affect air transport and other operations. Specialized data sets required for on/offshore operations are provided on a regular basis. State-of-the-art modeling components are integrated into a unique system that distributes the produced analyses and forecasts to each department. The weather and dust prediction system is SKIRON/Dust, the wave analysis and prediction system is based on the WAM cycle 4 model from ECMWF, and the ocean circulation model is MICOM, while the tidal analysis and prediction component is a development of the Ocean Physics and Modeling Group of the University of Athens, incorporating the Tidal Model Driver. A nowcasting subsystem is included. An interactive system based on Google Maps gives the capability to extract and display the necessary information for any location on the Arabian Peninsula, the Red Sea and the Arabian Gulf.
Validation of Aircraft Noise Prediction Models at Low Levels of Exposure
NASA Technical Reports Server (NTRS)
Page, Juliet A.; Hobbs, Christopher M.; Plotkin, Kenneth J.; Stusnick, Eric; Shepherd, Kevin P. (Technical Monitor)
2000-01-01
Aircraft noise measurements were made at Denver International Airport for a period of four weeks. Detailed operational information was provided by airline operators which enabled noise levels to be predicted using the FAA's Integrated Noise Model. Several thrust prediction techniques were evaluated. Measured sound exposure levels for departure operations were found to be 4 to 10 dB higher than predicted, depending on the thrust prediction technique employed. Differences between measured and predicted levels are shown to be related to atmospheric conditions present at the aircraft altitude.
Development of the AFRL Aircrew Performance and Protection Data Bank
2007-12-01
Growth model and statistical model of hypobaric chamber simulations. It offers a quick and readily accessible online DCS risk assessment tool for ... are used for the DCS prediction instead of the original model. ADRAC is based on more than 20 years of hypobaric chamber studies using human ... prediction based on the combined Bubble Growth model and statistical model of hypobaric chamber simulations was integrated into the Data Bank. It
USDA-ARS?s Scientific Manuscript database
The Earth is a complex system comprised of many interacting spatial and temporal scales. Understanding, predicting, and managing for these dynamics requires a trans-disciplinary integrated approach. Although there have been calls for this integration, a general approach is needed. We developed a Tra...
On the effect of acoustic coupling on random and harmonic plate vibrations
NASA Technical Reports Server (NTRS)
Frendi, A.; Robinson, J. H.
1993-01-01
The effect of acoustic coupling on random and harmonic plate vibrations is studied using two numerical models. In the coupled model, the plate response is obtained by integration of the nonlinear plate equation coupled with the nonlinear Euler equations for the surrounding acoustic fluid. In the uncoupled model, the nonlinear plate equation with an equivalent linear viscous damping term is integrated to obtain the response of the plate subject to the same excitation field. For a low-level, narrow-band excitation, the two models predict the same plate response spectra. As the excitation level is increased, the response power spectrum predicted by the uncoupled model becomes broader and more shifted towards the high frequencies than that obtained by the coupled model. In addition, the difference in response between the coupled and uncoupled models at high frequencies becomes larger. When a high intensity harmonic excitation is used, causing a nonlinear plate response, both models predict the same frequency content of the response. However, the levels of the harmonics and subharmonics are higher for the uncoupled model. Comparisons to earlier experimental and numerical results show that acoustic coupling has a significant effect on the plate response at high excitation levels. Its absence in previous models may explain the discrepancy between predicted and measured responses.
Wang, Junbai; Wu, Qianqian; Hu, Xiaohua Tony; Tian, Tianhai
2016-11-01
Investigating the dynamics of genetic regulatory networks through high throughput experimental data, such as microarray gene expression profiles, is a very important but challenging task. One of the major hindrances in building detailed mathematical models for genetic regulation is the large number of unknown model parameters. To tackle this challenge, a new integrated method is proposed by combining a top-down approach and a bottom-up approach. First, the top-down approach uses probabilistic graphical models to predict the network structure of DNA repair pathway that is regulated by the p53 protein. Two networks are predicted, namely a network of eight genes with eight inferred interactions and an extended network of 21 genes with 17 interactions. Then, the bottom-up approach using differential equation models is developed to study the detailed genetic regulations based on either a fully connected regulatory network or a gene network obtained by the top-down approach. Model simulation error, parameter identifiability and robustness property are used as criteria to select the optimal network. Simulation results together with permutation tests of input gene network structures indicate that the prediction accuracy and robustness property of the two predicted networks using the top-down approach are better than those of the corresponding fully connected networks. In particular, the proposed approach reduces computational cost significantly for inferring model parameters. Overall, the new integrated method is a promising approach for investigating the dynamics of genetic regulation. Copyright © 2016 Elsevier Inc. All rights reserved.
HIGH TIME-RESOLVED COMPARISONS FOR IN-DEPTH PROBING OF CMAQ FINE-PARTICLES AND GAS PREDICTIONS
Model evaluation is important to develop confidence in models and develop an understanding of their predictions. Most comparisons in the U.S. involve time-integrated measurements of 24-hours or longer. Comparisons against continuous or semi-continuous particle and gaseous measur...
Web-based Interspecies Correlation Estimation (Web-ICE) for Acute Toxicity: User Manual Version 3.1
Predictive toxicological models are integral to ecological risk assessment because data for most species are limited. Web-based Interspecies Correlation Estimation (Web-ICE) models are least square regressions that predict acute toxicity (LC50/LD50) of a chemical to a species, ge...
WEB-BASED INTERSPECIES CORRELATION ESTIMATION (WEB-ICE) FOR ACUTE TOXICITY: USER MANUAL V2
Predictive toxicological models are integral to environmental risk assessment, where data for most species are limited. Web-based Interspecies Correlation Estimation (Web-ICE) models are least square regressions that predict acute toxicity (LC50/LD50) of a chemical to a species, ...
NASA Astrophysics Data System (ADS)
Jia, Song; Xu, Tian-he; Sun, Zhang-zhen; Li, Jia-jing
2017-02-01
UT1-UTC is an important part of the Earth Orientation Parameters (EOP). High-precision predictions of UT1-UTC play a key role in practical applications of deep space exploration, spacecraft tracking, and satellite navigation and positioning. In this paper, a new prediction method combining the Grey Model (GM(1, 1)) and the Autoregressive Integrated Moving Average (ARIMA) model is developed. The main idea is as follows. Firstly, the UT1-UTC data are preprocessed by removing the leap seconds and the Earth's zonal harmonic tidal terms to get UT1R-TAI data. Periodic terms are estimated and removed by least squares to get UT2R-TAI. Then the linear terms of the UT2R-TAI data are modeled by GM(1, 1), and the residual terms are modeled by ARIMA. Finally, the UT2R-TAI prediction is performed with the combined GM(1, 1) and ARIMA model, and the UT1-UTC predictions are obtained by adding back the corresponding periodic terms, the leap second correction, and the Earth's zonal harmonic tidal correction. The results show that the proposed model can predict UT1-UTC effectively, with higher medium- and long-term (from 32 to 360 days) accuracy than that of LS + AR, LS + MAR and WLS + MAR.
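The GM(1, 1) stage of such a combined predictor fits the grey differential equation dx1/dt + a·x1 = b on the accumulated (AGO) series and extrapolates. The sketch below implements only that trend stage on a synthetic exponential series; in the paper the residuals of this stage are then handed to ARIMA:

```python
import math

def gm11_forecast(x0, steps=1):
    """Grey model GM(1,1): accumulate the series, fit a and b by least
    squares on trapezoidal background values, extrapolate, de-accumulate."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]                 # accumulated (AGO) series
    z = [0.5 * (x1[k] + x1[k + 1]) for k in range(n - 1)]    # background values
    m = n - 1
    sz = sum(z)
    szz = sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(z[k] * x0[k + 1] for k in range(m))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det                            # development coefficient
    b = (szz * sy - sz * szy) / det                          # grey input
    x1_hat = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# A clean exponential trend is recovered almost exactly by GM(1,1).
series = [100 * 1.05 ** k for k in range(8)]
(next_val,) = gm11_forecast(series)   # true next value: 100 * 1.05 ** 8
```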
Evaluation of SSME test data reduction methods
NASA Technical Reports Server (NTRS)
Santi, L. Michael
1994-01-01
Accurate prediction of hardware and flow characteristics within the Space Shuttle Main Engine (SSME) during transient and main-stage operation requires a significant integration of ground test data, flight experience, and computational models. The process of integrating SSME test measurements with physical model predictions is commonly referred to as data reduction. Uncertainties within both test measurements and simplified models of the SSME flow environment compound the data integration problem. The first objective of this effort was to establish an acceptability criterion for data reduction solutions. The second objective of this effort was to investigate the data reduction potential of the ROCETS (Rocket Engine Transient Simulation) simulation platform. A simplified ROCETS model of the SSME was obtained from the MSFC Performance Analysis Branch. This model was examined and tested for physical consistency. Two modules were constructed and added to the ROCETS library to independently check the mass and energy balances of selected engine subsystems including the low pressure fuel turbopump, the high pressure fuel turbopump, the low pressure oxidizer turbopump, the high pressure oxidizer turbopump, the fuel preburner, the oxidizer preburner, the main combustion chamber coolant circuit, and the nozzle coolant circuit. A sensitivity study was then conducted to determine the individual influences of forty-two hardware characteristics on fourteen high pressure region prediction variables as returned by the SSME ROCETS model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.; ...
2017-11-26
Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
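The rate-summation idea can be pictured with a toy discretized integral projection step: each individual's development index advances by a random increment, and the population density is pushed through the resulting kernel by midpoint-rule quadrature. All parameters below are illustrative, not from the mountain pine beetle model.

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def project(n, grid, mean_rate, sd_rate, dx):
    """One time step of a toy development IPM.

    Each individual at development stage x advances by a normally distributed
    increment (mean_rate, sd_rate), so the new density at stage y is the
    midpoint-rule integral of gaussian(y; x + mean_rate, sd_rate) * n(x) dx.
    """
    out = []
    for y in grid:
        total = 0.0
        for x, density in zip(grid, n):
            total += gaussian(y, x + mean_rate, sd_rate) * density * dx
        out.append(total)
    return out
```

Iterating `project` moves the whole cohort distribution forward in one quadrature sweep, which is the computational saving over simulating every individual.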
Hieke, Stefanie; Benner, Axel; Schlenl, Richard F; Schumacher, Martin; Bullinger, Lars; Binder, Harald
2016-08-30
High-throughput technology allows for genome-wide measurements at different molecular levels for the same patient, e.g., single nucleotide polymorphisms (SNPs) and gene expression. Correspondingly, it might be beneficial to also integrate complementary information from different molecular levels when building multivariable risk prediction models for a clinical endpoint, such as treatment response or survival. Unfortunately, such a high-dimensional modeling task will often be complicated by a limited overlap of molecular measurements at different levels between patients, i.e., measurements from all molecular levels are available for only a smaller proportion of patients. We propose a sequential strategy for building clinical risk prediction models that integrate genome-wide measurements from two molecular levels in a complementary way. To deal with partial overlap, we develop an imputation approach that allows us to use all available data. This approach is investigated in two acute myeloid leukemia applications combining gene expression with either SNP or DNA methylation data. After obtaining a sparse risk prediction signature from one level, e.g., an automatically selected set of prognostic SNPs identified by componentwise likelihood-based boosting, imputation is performed for the corresponding linear predictor by a linking model that incorporates, e.g., gene expression measurements. The imputed linear predictor is then used for adjustment when building a prognostic signature from the gene expression data. For evaluation, we consider stability, as quantified by inclusion frequencies across resampling data sets. Despite an extremely small overlap in the application example with gene expression and SNPs, several genes are seen to be more stably identified when taking the (imputed) linear predictor from the SNP data into account.
In the application with gene expression and DNA methylation, prediction performance with respect to survival also indicates that the proposed approach might work well. We consider imputation of linear predictor values to be a feasible and sensible approach for dealing with partial overlap in complementary integrative analysis of molecular measurements at different levels. More generally, these results indicate that a complementary strategy for integrating different molecular levels can result in more stable risk prediction signatures, potentially providing a more reliable insight into the underlying biology.
An integrated model of soil, hydrology, and vegetation for carbon dynamics in wetland ecosystems
Yu Zhang; Changsheng Li; Carl C. Trettin; Harbin Li; Ge Sun
2002-01-01
Wetland ecosystems are an important component in global carbon (C) cycles and may exert a large influence on global climate change. Predictions of C dynamics require us to consider interactions among many critical factors of soil, hydrology, and vegetation. However, few such integrated C models exist for wetland ecosystems. In this paper, we report a simulation model...
Xia, Kai; Dong, Dong; Han, Jing-Dong J
2006-01-01
Background Although protein-protein interaction (PPI) networks have been explored by various experimental methods, the maps so built are still limited in coverage and accuracy. To further expand the PPI network and to extract more accurate information from existing maps, studies have been carried out to integrate various types of functional relationship data. A frequently updated database of computationally analyzed potential PPIs to provide biological researchers with rapid and easy access to analyze original data as a biological network is still lacking. Results By applying a probabilistic model, we integrated 27 heterogeneous genomic, proteomic and functional annotation datasets to predict PPI networks in human. In addition to previously studied data types, we show that phenotypic distances and genetic interactions can also be integrated to predict PPIs. We further built an easy-to-use, updatable integrated PPI database, the Integrated Network Database (IntNetDB) online, to provide automatic prediction and visualization of PPI network among genes of interest. The networks can be visualized in SVG (Scalable Vector Graphics) format for zooming in or out. IntNetDB also provides a tool to extract topologically highly connected network neighborhoods from a specific network for further exploration and research. Using the MCODE (Molecular Complex Detections) algorithm, 190 such neighborhoods were detected among all the predicted interactions. The predicted PPIs can also be mapped to worm, fly and mouse interologs. Conclusion IntNetDB includes 180,010 predicted protein-protein interactions among 9,901 human proteins and represents a useful resource for the research community. Our study has increased prediction coverage by five-fold. IntNetDB also provides easy-to-use network visualization and analysis tools that allow biological researchers unfamiliar with computational biology to access and analyze data over the internet. 
The web interface of IntNetDB is freely accessible at . Visualization requires Mozilla version 1.8 (or higher) or Internet Explorer with SVGviewer installed. PMID:17112386
Tan, Ting; Chen, Lizhang; Liu, Fuqiang
2014-11-01
To establish a multiplicative seasonal autoregressive integrated moving average (ARIMA) model for the hand-foot-mouth disease incidence in Changsha, and to explore the feasibility of seasonal ARIMA modeling for predicting hand-foot-mouth disease incidence. EVIEWS 6.0 was used to fit a seasonal ARIMA model to the hand-foot-mouth disease incidence from May 2008 to August 2013 in Changsha; the incidence data from September 2013 to February 2014 served as test samples for the model, and the errors between the forecasted incidence and the observed values were compared. Finally, the incidence of hand-foot-mouth disease from March 2014 to August 2014 was predicted by the model. After stationarity processing of the data sequence, model identification and model diagnosis, the seasonal model ARIMA(1, 0, 1)×(0, 1, 1)12 was established. The R2 value of the model fit was 0.81, the root mean square prediction error was 8.29 and the mean absolute error was 5.83. The seasonal ARIMA model is a good prediction model with a good fit, and it can provide a reference for prevention and control work on hand-foot-mouth disease.
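The seasonal (0, 1, 1)12 part of such a model rests on lag-12 differencing, which removes a stable yearly cycle before the moving-average terms are fitted. A minimal illustration on a synthetic monthly series, not the Changsha surveillance data:

```python
import math

def seasonal_difference(series, lag=12):
    """Lag-`lag` (seasonal) difference: y[t] - y[t - lag]."""
    return [series[i] - series[i - lag] for i in range(lag, len(series))]

# synthetic monthly incidence: linear trend plus an annual sinusoidal cycle
series = [10 + 0.1 * t + 5 * math.sin(2 * math.pi * t / 12) for t in range(60)]
# after lag-12 differencing, the cycle cancels and only the trend step remains
diffed = seasonal_difference(series)
```

Because the sinusoid has exact period 12, every differenced value collapses to the constant trend increment 0.1 × 12 = 1.2, leaving a stationary series suitable for ARMA fitting.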
NASA Astrophysics Data System (ADS)
Atieh, M.; Mehltretter, S. L.; Gharabaghi, B.; Rudra, R.
2015-12-01
One of the most uncertain modeling tasks in hydrology is the prediction of ungauged stream sediment load and concentration statistics. This study presents integrated artificial neural networks (ANN) models for prediction of sediment rating curve parameters (rating curve coefficient α and rating curve exponent β) for ungauged basins. The ANN models integrate a comprehensive list of input parameters to improve the accuracy achieved; the input parameters used include: soil, land use, topographic, climatic, and hydrometric data sets. The ANN models were trained on the randomly selected 2/3 of the dataset of 94 gauged streams in Ontario, Canada and validated on the remaining 1/3. The developed models have high correlation coefficients of 0.92 and 0.86 for α and β, respectively. The ANN model for the rating coefficient α is directly proportional to rainfall erosivity factor, soil erodibility factor, and apportionment entropy disorder index, whereas it is inversely proportional to vegetation cover and mean annual snowfall. The ANN model for the rating exponent β is directly proportional to mean annual precipitation, the apportionment entropy disorder index, main channel slope, standard deviation of daily discharge, and inversely proportional to the fraction of basin area covered by wetlands and swamps. Sediment rating curves are essential tools for the calculation of sediment load, concentration-duration curve (CDC), and concentration-duration-frequency (CDF) analysis for more accurate assessment of water quality for ungauged basins.
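Once the ANN predicts α and β for an ungauged site, applying the rating curve is straightforward: concentration follows C = αQ^β, and daily load follows from unit conversion. A small sketch with invented coefficient values, not outputs of the study's models:

```python
def sediment_concentration(discharge, alpha, beta):
    """Suspended sediment concentration in mg/L from discharge in m^3/s
    via the rating curve C = alpha * Q**beta."""
    return alpha * discharge ** beta

def daily_load_tonnes(discharge, alpha, beta):
    """Daily sediment load in tonnes.

    C in mg/L is numerically equal to g/m^3, so C * Q gives g/s;
    multiplying by 86400 s/day and dividing by 1e6 g/tonne gives tonnes/day.
    """
    c = sediment_concentration(discharge, alpha, beta)
    return c * discharge * 86400.0 / 1e6
```

Evaluating the curve over a flow-duration series is what yields the concentration-duration curve (CDC) mentioned above.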
Prediction and Informative Risk Factor Selection of Bone Diseases.
Li, Hui; Li, Xiaoyi; Ramanathan, Murali; Zhang, Aidong
2015-01-01
With the booming of the healthcare industry and the overwhelming amount of electronic health records (EHRs) shared by healthcare institutions and practitioners, we take advantage of EHR data to develop an effective disease risk management model that not only models the progression of the disease, but also predicts the risk of the disease for early disease control or prevention. Existing models for answering these questions usually fall into two categories: expert knowledge based models or handcrafted feature set based models. To fully utilize the whole EHR data, we build a framework to construct an integrated representation of features from all available risk factors in the EHR data and use these integrated features to effectively predict osteoporosis and bone fractures. We also develop a framework for informative risk factor selection of bone diseases. A pair of models for two contrast cohorts (e.g., diseased patients versus non-diseased patients) is established to discriminate their characteristics and find the most informative risk factors. Several empirical results on a real bone disease data set show that the proposed framework can successfully predict bone diseases and select informative risk factors that are beneficial and useful to guide clinical decisions.
Bayesian Integration of Isotope Ratios for Geographic Sourcing of Castor Beans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webb-Robertson, Bobbie-Jo M.; Kreuzer, Helen W.; Hart, Garret L.; Ehleringer, James; West, Jason; Gill, Gary; Duckworth, Douglas
2012-01-01
Recent years have seen an increase in the forensic interest associated with the poison ricin, which is extracted from the seeds of the Ricinus communis plant. Both light element (C, N, O, and H) and strontium (Sr) isotope ratios have previously been used to associate organic material with geographic regions of origin. We present a Bayesian integration methodology that can more accurately predict the region of origin for a castor bean than individual models developed independently for light element stable isotopes or Sr isotope ratios. Our results demonstrate a clear improvement in the ability to correctly classify regions based on the integrated model, with a class accuracy of 60.9 ± 2.1% versus 55.9 ± 2.1% and 40.2 ± 1.8% for the light element and strontium (Sr) isotope ratios, respectively. In addition, we show graphically the strengths and weaknesses of each dataset with respect to class prediction and how the integration of these datasets strengthens the overall model. PMID:22919270
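The integration step can be pictured as Bayesian fusion of per-class likelihoods from the two isotope models, assuming the data sources are conditionally independent given the region. The class labels and numbers below are invented for illustration, not the paper's values:

```python
def fuse_posteriors(prior, lik_a, lik_b):
    """Posterior over classes from two data sources, assuming conditional
    independence given the class: p(c | a, b) ∝ p(c) * p(a | c) * p(b | c)."""
    unnorm = {c: prior[c] * lik_a[c] * lik_b[c] for c in prior}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}
```

Two weakly informative sources that agree reinforce each other: a region favored 60/40 by one model and 70/30 by the other ends up favored roughly 78/22 after fusion, which mirrors how the integrated model outperforms either isotope model alone.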
Continuous track paths reveal additive evidence integration in multistep decision making.
Buc Calderon, Cristian; Dewulf, Myrtille; Gevers, Wim; Verguts, Tom
2017-10-03
Multistep decision making pervades daily life, but its underlying mechanisms remain obscure. We distinguish four prominent models of multistep decision making, namely serial stage, hierarchical evidence integration, hierarchical leaky competing accumulation (HLCA), and probabilistic evidence integration (PEI). To empirically disentangle these models, we design a two-step reward-based decision paradigm and implement it in a reaching task experiment. In a first step, participants choose between two potential upcoming choices, each associated with two rewards. In a second step, participants choose between the two rewards selected in the first step. Strikingly, as predicted by the HLCA and PEI models, the first-step decision dynamics were initially biased toward the choice representing the highest sum/mean before being redirected toward the choice representing the maximal reward (i.e., initial dip). Only HLCA and PEI predicted this initial dip, suggesting that first-step decision dynamics depend on additive integration of competing second-step choices. Our data suggest that potential future outcomes are progressively unraveled during multistep decision making.
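A single leaky competing accumulation step of the kind underlying the HLCA model can be sketched as a race between two mutually inhibiting, noisy accumulators. Parameter values are illustrative, not fitted to the reaching data:

```python
import random

def lca_trial(inputs, leak=0.2, inhibition=0.4, noise=0.1, thresh=1.0,
              dt=0.01, max_steps=5000, rng=None):
    """Race two mutually inhibiting accumulators; return (choice, RT in s)."""
    rng = rng or random.Random(0)
    x = [0.0, 0.0]
    for step in range(1, max_steps + 1):
        # compute both increments from the current state before updating
        d0 = (inputs[0] - leak * x[0] - inhibition * x[1]) * dt \
             + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
        d1 = (inputs[1] - leak * x[1] - inhibition * x[0]) * dt \
             + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
        x[0] = max(0.0, x[0] + d0)   # activations clipped at zero, as in LCA
        x[1] = max(0.0, x[1] + d1)
        if max(x) >= thresh:
            return (0 if x[0] >= x[1] else 1), step * dt
    return None, max_steps * dt
```

With unequal inputs the stronger alternative wins on most trials; in the hierarchical version, the first-step accumulators would additionally receive summed input from the second-step alternatives, producing the initial dip described above.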
Establishment of the Northeast Coastal Watershed Geospatial Data Network (NECWGDN)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hannigan, Robyn
The goals of NECWGDN were to establish integrated geospatial databases that interfaced with existing open-source environmental data server technologies (e.g., HydroDesktop) and included ecological and human data to enable evaluation, prediction, and adaptation in coastal environments to climate- and human-induced threats to the coastal marine resources within the Gulf of Maine. We have completed the development and testing of a "test bed" architecture that is compatible with HydroDesktop and have identified key metadata structures that will enable seamless integration and delivery of environmental, ecological, and human data as well as models to predict threats to end-users. Uniquely, this database integrates point as well as model data and so offers capacities to end-users that are unique among databases. Future efforts will focus on the development of integrated environmental-human dimension models that can serve, in near real time, visualizations of threats to coastal resources and habitats.
Planner-Based Control of Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott
2005-01-01
The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This will allow for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity we can use faster, simpler responses at the lowest level and predictive but complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.
Reaction Time Correlations during Eye–Hand Coordination: Behavior and Modeling
Dean, Heather L.; Martí, Daniel; Tsui, Eva; Rinzel, John; Pesaran, Bijan
2011-01-01
During coordinated eye–hand movements, saccade reaction times (SRTs) and reach reaction times (RRTs) are correlated in humans and monkeys. Reaction times (RTs) measure the degree of movement preparation and can correlate with movement speed and accuracy. However, RTs can also reflect effector nonspecific influences, such as motivation and arousal. We use a combination of behavioral psychophysics and computational modeling to identify plausible mechanisms for correlations in SRTs and RRTs. To disambiguate nonspecific mechanisms from mechanisms specific to movement coordination, we introduce a dual-task paradigm in which a reach and a saccade are cued with a stimulus onset asynchrony (SOA). We then develop several variants of integrate-to-threshold models of RT, which postulate that responses are initiated when the neural activity encoding effector-specific movement preparation reaches a threshold. The integrator models formalize hypotheses about RT correlations and make predictions for how each RT should vary with SOA. To test these hypotheses, we trained three monkeys to perform the eye–hand SOA task and analyzed their SRTs and RRTs. In all three subjects, RT correlations decreased with increasing SOA duration. Additionally, mean SRT decreased with decreasing SOA, revealing facilitation of saccades with simultaneous reaches, as predicted by the model. These results are not consistent with the predictions of the models with common modulation or common input but are compatible with the predictions of a model with mutual excitation between two effector-specific integrators. We propose that RT correlations are not simply attributable to motivation and arousal and are a signature of coordination. PMID:21325507
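The mutual-excitation account can be sketched by coupling two integrate-to-threshold units (one per effector) and measuring the reaction-time correlation across trials: with positive coupling, each unit's noise leaks into the other, correlating their crossing times. All parameters below are illustrative, not fitted to the monkey data:

```python
import random

def paired_rts(coupling, n_trials=300, drive=1.0, leak=0.3, noise=0.3,
               thresh=1.0, dt=0.01, max_steps=2000, seed=1):
    """Simulate trials of two mutually excitatory integrators; return a list
    of (rt_unit0, rt_unit1) threshold-crossing times in seconds."""
    rng = random.Random(seed)
    rts = []
    for _ in range(n_trials):
        x = [0.0, 0.0]
        crossed = [None, None]
        for step in range(1, max_steps + 1):
            d0 = (drive - leak * x[0] + coupling * x[1]) * dt \
                 + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
            d1 = (drive - leak * x[1] + coupling * x[0]) * dt \
                 + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
            x[0] += d0
            x[1] += d1
            for i in (0, 1):
                if crossed[i] is None and x[i] >= thresh:
                    crossed[i] = step * dt
            if crossed[0] is not None and crossed[1] is not None:
                break
        if crossed[0] is not None and crossed[1] is not None:
            rts.append((crossed[0], crossed[1]))
    return rts

def pearson(pairs):
    """Pearson correlation of paired samples."""
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    cov = sum((a - mx) * (b - my) for a, b in pairs) / n
    vx = sum((a - mx) ** 2 for a, _ in pairs) / n
    vy = sum((b - my) ** 2 for _, b in pairs) / n
    return cov / (vx * vy) ** 0.5
```

Setting `coupling=0` removes the shared pathway and the RT correlation collapses toward zero, which is the qualitative contrast the paper uses against common-input accounts.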
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, R. Quinn; Brooks, Evan B.; Jersild, Annika L.; ...
2017-07-26
Predicting how forest carbon cycling will change in response to climate change and management depends on the collective knowledge from measurements across environmental gradients, ecosystem manipulations of global change factors, and mathematical models. Formally integrating these sources of knowledge through data assimilation, or model–data fusion, allows the use of past observations to constrain model parameters and estimate prediction uncertainty. Data assimilation (DA) focused on the regional scale has the opportunity to integrate data from both environmental gradients and experimental studies to constrain model parameters. Here, we introduce a hierarchical Bayesian DA approach (Data Assimilation to Predict Productivity for Ecosystems and Regions, DAPPER) that uses observations of carbon stocks, carbon fluxes, water fluxes, and vegetation dynamics from loblolly pine plantation ecosystems across the southeastern US to constrain parameters in a modified version of the Physiological Principles Predicting Growth (3-PG) forest growth model. The observations included major experiments that manipulated atmospheric carbon dioxide (CO2) concentration, water, and nutrients, along with nonexperimental surveys that spanned environmental gradients across an 8.6 × 10^5 km^2 region. We optimized regionally representative posterior distributions for model parameters, which dependably predicted data from plots withheld from the data assimilation. While the mean bias in predictions of nutrient fertilization experiments, irrigation experiments, and CO2 enrichment experiments was low, future work needs to focus on modifications to model structures that decrease the bias in predictions of drought experiments. Predictions of how growth responded to elevated CO2 strongly depended on whether ecosystem experiments were assimilated and whether the assimilated field plots in the CO2 study were allowed to have different mortality parameters than the other field plots in the region. We present predictions of stem biomass productivity under elevated CO2, decreased precipitation, and increased nutrient availability that include estimates of uncertainty for the southeastern US. Overall, we (1) demonstrated how three decades of research in southeastern US planted pine forests can be used to develop DA techniques that use multiple locations, multiple data streams, and multiple ecosystem experiment types to optimize parameters and (2) developed a tool for the development of future predictions of forest productivity for natural resource managers that leverages a rich dataset of integrated ecosystem observations across a region.
NASA Astrophysics Data System (ADS)
Ansari, R.; Torabi, J.; Norouzzadeh, A.
2018-04-01
Due to the capability of Eringen's nonlocal elasticity theory to capture the small length scale effect, it is widely used to study the mechanical behaviors of nanostructures. Previous studies have indicated that in some cases, the differential form of this theory cannot correctly predict the behavior of the structure, and the integral form should be employed to avoid obtaining inconsistent results. The present study deals with the bending analysis of nanoplates resting on an elastic foundation based on the integral formulation of Eringen's nonlocal theory. Since the formulation is presented in a general form, arbitrary kernel functions can be used. The first order shear deformation plate theory is considered to model the nanoplates, and the governing equations for both integral and differential forms are presented. Finally, the finite element method is applied to solve the problem. Selected results are given to investigate the effects of the elastic foundation and to compare the predictions of the integral nonlocal model with those of its differential nonlocal and local counterparts. It is found that by the use of the proposed integral formulation of Eringen's nonlocal model, the paradox observed for the cantilever nanoplate is resolved.
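For reference, the two formulations being compared can be written in standard notation (this is the textbook form of Eringen's relations, not reproduced verbatim from the paper):

```latex
% Integral (original) form of Eringen's nonlocal constitutive relation:
% the nonlocal stress t at x is a kernel-weighted average of the local
% Hookean stress over the whole body V.
t_{ij}(\mathbf{x}) \;=\; \int_{V} K\!\left(\lvert \mathbf{x}-\mathbf{x}'\rvert,\,\tau\right)\,
  C_{ijkl}\,\varepsilon_{kl}(\mathbf{x}')\,\mathrm{d}V(\mathbf{x}')

% Differential form, obtained only for the specific Helmholtz-type kernel
% (e_0 a is the nonlocal parameter):
\left(1 - (e_0 a)^2 \nabla^2\right) t_{ij} \;=\; C_{ijkl}\,\varepsilon_{kl}
```

The paradox mentioned above arises because the differential form is equivalent to the integral one only for that particular kernel and under boundary conditions the kernel implies; the integral formulation with an arbitrary kernel avoids this restriction.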
Cryo-EM Data Are Superior to Contact and Interface Information in Integrative Modeling.
de Vries, Sjoerd J; Chauvot de Beauchêne, Isaure; Schindler, Christina E M; Zacharias, Martin
2016-02-23
Protein-protein interactions carry out a large variety of essential cellular processes. Cryo-electron microscopy (cryo-EM) is a powerful technique for the modeling of protein-protein interactions at a wide range of resolutions, and recent developments have caused a revolution in the field. At low resolution, cryo-EM maps can drive integrative modeling of the interaction, assembling existing structures into the map. Other experimental techniques can provide information on the interface or on the contacts between the monomers in the complex. This inevitably raises the question regarding which type of data is best suited to drive integrative modeling approaches. Systematic comparison of the prediction accuracy and specificity of the different integrative modeling paradigms is unavailable to date. Here, we compare EM-driven, interface-driven, and contact-driven integrative modeling paradigms. Models were generated for the protein docking benchmark using the ATTRACT docking engine and evaluated using the CAPRI two-star criterion. At 20 Å resolution, EM-driven modeling achieved a success rate of 100%, outperforming the other paradigms even with perfect interface and contact information. Therefore, even very low resolution cryo-EM data are superior in predicting heterodimeric and heterotrimeric protein assemblies. Our study demonstrates that a force field is not necessary; cryo-EM data alone are sufficient to accurately guide the monomers into place. The resulting rigid models successfully identify regions of conformational change, opening up perspectives for targeted flexible remodeling. PMID:26846888
Evolution of flowering strategies in Oenothera glazioviana: an integral projection model approach.
Rees, Mark; Rose, Karen E
2002-01-01
The timing of reproduction is a key determinant of fitness. Here, we develop parameterized integral projection models of size-related flowering for the monocarpic perennial Oenothera glazioviana and use these to predict the evolutionarily stable strategy (ESS) for flowering. For the most part there is excellent agreement between the model predictions and the results of quantitative field studies. However, the model predicts a much steeper relationship between plant size and the probability of flowering than observed in the field, indicating selection for a 'threshold size' flowering function. Elasticity and sensitivity analysis of population growth rate lambda and net reproductive rate R(0) are used to identify the critical traits that determine fitness and control the ESS for flowering. Using the fitted model we calculate the fitness landscape for invading genotypes and show that this is characterized by a ridge of approximately equal fitness. The implications of these results for the maintenance of genetic variation are discussed. PMID:12137582
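The integral projection approach described above can be sketched numerically: discretise the kernel K(y, x) = s(x)(1 − f(x))g(y, x) + f(x)F(y, x) on a midpoint mesh and take the population growth rate λ as its dominant eigenvalue. The vital-rate functions below are illustrative placeholders, not the fitted Oenothera glazioviana rates from the paper.

```python
import math

def make_kernel(n_mesh=50, size_min=0.0, size_max=10.0):
    """Discretise an integral projection kernel K(y, x) on a midpoint mesh.

    Vital rates (survival, flowering, growth, fecundity) are illustrative
    placeholders for a monocarpic perennial: flowering plants reproduce and die.
    """
    h = (size_max - size_min) / n_mesh
    mesh = [size_min + h * (i + 0.5) for i in range(n_mesh)]

    def logistic(z):
        return 1.0 / (1.0 + math.exp(-z))

    def survival(x):            # probability of surviving one year
        return logistic(-1.0 + 0.5 * x)

    def flowering(x):           # size-dependent probability of flowering
        return logistic(-6.0 + 1.2 * x)

    def growth(y, x):           # Gaussian growth kernel, mean depends on size x
        mu, sd = 1.0 + 0.9 * x, 1.0
        return math.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

    def fecundity(y, x):        # seed production times recruit-size distribution
        seeds = 0.5 * math.exp(0.8 * x)
        mu, sd = 1.0, 0.5
        recruit = math.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
        return seeds * 0.01 * recruit      # 0.01 = establishment probability

    # K(y, x) = survival * (1 - flowering) * growth + flowering * fecundity
    K = [[h * (survival(x) * (1 - flowering(x)) * growth(y, x)
               + flowering(x) * fecundity(y, x))
          for x in mesh] for y in mesh]
    return K, mesh

def dominant_eigenvalue(K, iters=200):
    """Population growth rate lambda via power iteration on the kernel matrix."""
    n = len(K)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(K[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = sum(w)            # eigenvalue estimate, since v sums to one
        v = [wi / lam for wi in w]
    return lam
```

An ESS analysis of the kind described would then perturb the flowering function's parameters and recompute λ (or R0) for invading strategies.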
NASA Technical Reports Server (NTRS)
Foyle, David C.
1993-01-01
Based on existing integration models in the psychological literature, an evaluation framework is developed to assess sensor fusion displays as might be implemented in an enhanced/synthetic vision system. The proposed evaluation framework for evaluating the operator's ability to use such systems is a normative approach: The pilot's performance with the sensor fusion image is compared to models' predictions based on the pilot's performance when viewing the original component sensor images prior to fusion. This allows for the determination as to when a sensor fusion system leads to: poorer performance than one of the original sensor displays, clearly an undesirable system in which the fused sensor system causes some distortion or interference; better performance than with either single sensor system alone, but at a sub-optimal level compared to model predictions; optimal performance compared to model predictions; or, super-optimal performance, which may occur if the operator were able to use some highly diagnostic 'emergent features' in the sensor fusion display, which were unavailable in the original sensor displays.
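One concrete normative benchmark of this kind is probability summation over independent sensor channels. The sketch below classifies observed fused-display performance into the four bands listed above; the probability-summation rule and the tolerance parameter are assumptions for illustration, not necessarily the integration model used in the framework.

```python
def probability_summation(p1, p2):
    """Predicted detection probability if the two component sensor images
    provide independent evidence (probability-summation benchmark)."""
    return 1.0 - (1.0 - p1) * (1.0 - p2)

def classify_fusion(p_fused_observed, p1, p2, tol=0.02):
    """Place observed fused-display performance into the evaluation bands:
    interference, sub-optimal, optimal, or super-optimal (emergent features)."""
    predicted = probability_summation(p1, p2)
    if p_fused_observed < max(p1, p2):
        return "worse than best single sensor (interference)"
    if p_fused_observed < predicted - tol:
        return "better than single sensors, but sub-optimal"
    if p_fused_observed <= predicted + tol:
        return "optimal (matches model prediction)"
    return "super-optimal (emergent features)"
```

For example, with single-sensor detection rates of 0.7 and 0.8, the independence benchmark predicts 0.94 for the fused display, and observed performance is judged against that figure.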
Research on reverse logistics location under uncertainty environment based on grey prediction
NASA Astrophysics Data System (ADS)
Zhenqiang, Bao; Congwei, Zhu; Yuqin, Zhao; Quanke, Pan
This article constructs a reverse logistics network under an uncertain environment, integrates the reverse logistics network with the distribution network, and forms a closed-loop network. A cost-based optimization model is established to help the intermediate center, manufacturing center and remanufacturing center make location decisions. A grey model GM (1, 1) is used to predict the product holdings at the collection points; the prediction results are then carried into the cost optimization model and a solution is obtained. Finally, an example is given to verify the effectiveness and feasibility of the model.
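A GM(1, 1) forecast of the kind used for the collection-point holdings can be sketched as follows, using the standard accumulated-generating-operation formulation; the exact parameterisation in the paper may differ.

```python
import math

def gm11_forecast(x0, steps=1):
    """Forecast future values of a short positive series with grey model GM(1, 1).

    Standard formulation: accumulate the series (1-AGO), estimate the grey
    differential equation x0(k) + a*z1(k) = b by least squares, solve the
    whitened equation, then de-accumulate. Assumes non-constant data (a != 0).
    """
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1, s = [], 0.0
    for v in x0:
        s += v
        x1.append(s)
    # background values z1 (entered with a minus sign) and targets y
    z = [-(x1[k - 1] + x1[k]) / 2.0 for k in range(1, n)]
    y = x0[1:]
    # least-squares estimate of (a, b) via the 2x2 normal equations
    m = len(z)
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(zi * yi for zi, yi in zip(z, y))
    a = (m * szy - sz * sy) / (m * szz - sz * sz)
    b = (sy - a * sz) / m
    # time-response function of the whitened equation, then de-accumulate
    xhat1 = [(x0[0] - b / a) * math.exp(-a * k) + b / a for k in range(n + steps)]
    xhat0 = [x0[0]] + [xhat1[k] - xhat1[k - 1] for k in range(1, n + steps)]
    return xhat0[n:]
```

GM(1, 1) is suited to short, roughly exponential series such as growing product holdings: on the geometric series 2.0, 2.2, 2.42, 2.662 it forecasts the next value close to 2.93.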
NASA Astrophysics Data System (ADS)
Jepsen, S. M.; Harmon, T. C.; Ficklin, D. L.; Molotch, N. P.; Guan, B.
2018-01-01
Changes in long-term, montane actual evapotranspiration (ET) in response to climate change could impact future water supplies and forest species composition. For scenarios of atmospheric warming, predicted changes in long-term ET tend to differ between studies using space-for-time substitution (STS) models and integrated watershed models, and the influence of spatially varying factors on these differences is unclear. To examine this, we compared warming-induced (+2 to +6 °C) changes in ET simulated by an STS model and an integrated watershed model across zones of elevation, substrate available water capacity, and slope in the snow-influenced upper San Joaquin River watershed, Sierra Nevada, USA. We used the Soil Water and Assessment Tool (SWAT) for the watershed modeling and a Budyko-type relationship for the STS modeling. Spatially averaged increases in ET from the STS model increasingly surpassed those from the SWAT model in the higher elevation zones of the watershed, resulting in 2.3-2.6 times greater values from the STS model at the watershed scale. In sparse, deep colluvium or glacial soils on gentle slopes, the SWAT model produced ET increases exceeding those from the STS model. However, watershed areas associated with these conditions were too localized for SWAT to produce spatially averaged ET-gains comparable to the STS model. The SWAT model results nevertheless demonstrate that such soils on high-elevation, gentle slopes will form ET "hot spots" exhibiting disproportionately large increases in ET, and concomitant reductions in runoff yield, in response to warming. Predicted ET responses to warming from STS models and integrated watershed models may, in general, substantially differ (e.g., factor of 2-3) for snow-influenced watersheds exhibiting an elevational gradient in substrate water holding capacity and slope. Long-term water supplies in these settings may therefore be more resilient to warming than STS model predictions would suggest.
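A Budyko-type space-for-time calculation of the sort compared against SWAT above can be illustrated with Fu's curve, ET/P = 1 + PET/P − (1 + (PET/P)^ω)^(1/ω). The choice of Fu's equation and the parameter ω below are assumptions for illustration, not necessarily the specific relationship used in the study.

```python
def fu_budyko_et(precip_mm, pet_mm, omega=2.6):
    """Long-term actual ET (mm) from Fu's Budyko-type curve.

    precip_mm: long-term precipitation; pet_mm: potential ET; omega: landscape
    parameter (the value 2.6 is an illustrative assumption).
    """
    phi = pet_mm / precip_mm                      # aridity index PET/P
    return precip_mm * (1.0 + phi - (1.0 + phi ** omega) ** (1.0 / omega))

def warming_et_change(precip_mm, pet_mm, pet_increase_frac=0.05):
    """Space-for-time style sensitivity: perturb PET (e.g. by warming) and
    report the implied change in long-term actual ET."""
    return (fu_budyko_et(precip_mm, pet_mm * (1.0 + pet_increase_frac))
            - fu_budyko_et(precip_mm, pet_mm))
```

Because such a curve carries no information about substrate water capacity or slope, it cannot reproduce the localized "hot spot" behavior that the integrated watershed model exposes, which is the crux of the comparison above.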
Evaluation of blocking performance in ensemble seasonal integrations
NASA Astrophysics Data System (ADS)
Casado, M. J.; Doblas-Reyes, F. J.; Pastor, M. A.
2003-04-01
Climate models have shown a robust inability to reliably predict blocking onset and frequency. This systematic error has been evaluated using multi-model ensemble seasonal integrations carried out in the framework of the Prediction Of climate Variations On Seasonal and interannual Timescales (PROVOST) project, compared against an assessment of blocking features in the NCEP re-analyses. The PROVOST GCMs adequately reproduce the spatial NCEP teleconnection patterns over the Northern Hemisphere, with notably high spatial correlation coefficients against the corresponding NCEP patterns. In spite of that, the different models show a consistent underestimation of blocking frequency, which may impact the ability to predict the seasonal amplitude of the leading modes of variability over the Northern Hemisphere.
Preclinical models used for immunogenicity prediction of therapeutic proteins.
Brinks, Vera; Weinbuch, Daniel; Baker, Matthew; Dean, Yann; Stas, Philippe; Kostense, Stefan; Rup, Bonita; Jiskoot, Wim
2013-07-01
All therapeutic proteins are potentially immunogenic. Antibodies formed against these drugs can decrease efficacy, leading to drastically increased therapeutic costs and in rare cases to serious and sometimes life threatening side-effects. Many efforts are therefore undertaken to develop therapeutic proteins with minimal immunogenicity. For this, immunogenicity prediction of candidate drugs during early drug development is essential. Several in silico, in vitro and in vivo models are used to predict immunogenicity of drug leads, to modify potentially immunogenic properties and to continue development of drug candidates with expected low immunogenicity. Despite the extensive use of these predictive models, their actual predictive value varies. Important reasons for this uncertainty are the limited/insufficient knowledge on the immune mechanisms underlying immunogenicity of therapeutic proteins, the fact that different predictive models explore different components of the immune system and the lack of an integrated clinical validation. In this review, we discuss the predictive models in use, summarize aspects of immunogenicity that these models predict and explore the merits and the limitations of each of the models.
Balistrieri, Laurie S.; Nimick, David A.; Mebane, Christopher A.
2012-01-01
Evaluating water quality and the health of aquatic organisms is challenging in systems with systematic diel (24 hour) or less predictable runoff-induced changes in water composition. To advance our understanding of how to evaluate environmental health in these dynamic systems, field studies of diel cycling were conducted in two streams (Silver Bow Creek and High Ore Creek) affected by historical mining activities in southwestern Montana. A combination of sampling and modeling tools were used to assess the toxicity of metals in these systems. Diffusive Gradients in Thin Films (DGT) samplers were deployed at multiple time intervals during diel sampling to confirm that DGT integrates time-varying concentrations of dissolved metals. Thermodynamic speciation calculations using site specific water compositions, including time-integrated dissolved metal concentrations determined from DGT, and a competitive, multiple-metal biotic ligand model incorporated into the Windemere Humic Aqueous Model Version 6.0 (WHAM VI) were used to determine the chemical speciation of dissolved metals and biotic ligands. The model results were combined with previously collected toxicity data on cutthroat trout to derive a relationship that predicts the relative survivability of these fish at a given site. This integrative approach may prove useful for assessing water quality and toxicity of metals to aquatic organisms in dynamic systems and evaluating whether potential changes in environmental health of aquatic systems are due to anthropogenic activities or natural variability.
Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.
Durdu, Omer Faruk
2010-10-01
In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here develops adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA), to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, different ARIMA models are identified from the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the boron data series. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predicted results from the best ARIMA models are compared to the observed data; the predicted data show reasonably good agreement with the actual data.
A comparison of the mean and variance of the 3-year (2002-2004) observed vs predicted data from the selected best models shows that the boron models from the ARIMA approach can be used reliably, since their predicted values preserve the basic statistics of the observed data in terms of the mean. The ARIMA modeling approach is recommended for predicting boron concentration series of a river.
[Predictive model based multimetric index of macroinvertebrates for river health assessment].
Chen, Kai; Yu, Hai Yan; Zhang, Ji Wei; Wang, Bei Xin; Chen, Qiu Wen
2017-06-18
Improving the stability of the index of biotic integrity (IBI; i.e., multi-metric index, MMI) across temporal and spatial scales is one of the most important issues in bioassessment of water ecosystem integrity and in water environment management. Using datasets of field-based macroinvertebrate and physicochemical variables and GIS-based natural predictors (e.g., geomorphology and climate) and land use variables collected at 227 river sites from 2004 to 2011 across Zhejiang Province, China, we used random forests (RF) to adjust for the effects of natural variation at temporal and spatial scales on macroinvertebrate metrics. We then developed natural-variation-adjusted (predictive) and unadjusted (null) MMIs and compared their performance. The core metrics selected for the predictive and null MMIs differed from each other, and the natural variation within core metrics of the predictive MMI explained by the RF models ranged between 11.4% and 61.2%. The predictive MMI was more precise and accurate, but less responsive and sensitive, than the null MMI. The multivariate nearest-neighbor test determined that 9 test sites and the 1 most degraded site were flagged outside of the environmental space of the reference site network. We found that combining a predictive MMI developed with a predictive model and the nearest-neighbor test performed best and decreased the risks of type I errors (designating a water body as being in poor biological condition when it was actually in good condition) and type II errors (designating a water body as being in good biological condition when it was actually in poor condition). Our results provide an effective method to improve the stability and performance of indices of biotic integrity.
Xu, Yadong; Serre, Marc L; Reyes, Jeanette; Vizuete, William
2016-04-19
To improve ozone exposure estimates for ambient concentrations at a national scale, we introduce our novel Regionalized Air Quality Model Performance (RAMP) approach to integrate chemical transport model (CTM) predictions with the available ozone observations using the Bayesian Maximum Entropy (BME) framework. The framework models the nonlinear and nonhomoscedastic relation between air pollution observations and CTM predictions and for the first time accounts for variability in CTM model performance. A validation analysis using only noncollocated data outside of a validation radius rv was performed, and the R(2) between observations and re-estimated values for two daily metrics, the daily maximum 8-h average (DM8A) and the daily 24-h average (D24A) ozone concentrations, was obtained for the OBS scenario using ozone observations only, in contrast with the RAMP and Constant Air Quality Model Performance (CAMP) scenarios. We show that, by accounting for the spatial and temporal variability in model performance, our novel RAMP approach extracts more information from CTM predictions than the CAMP approach, which assumes that model performance does not change across space and time: the percentage increase in R(2) is over 12 times larger for the DM8A and over 3.5 times larger for the D24A ozone concentrations.
Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu
2012-02-01
In this paper, to precisely characterize the spatial distribution of regional soil quality, main factors affecting soil quality such as soil type, land use pattern, lithology type, topography, road, and industry type were considered: mutual information theory was adopted to select the main environmental factors, and the decision tree algorithm See 5.0 was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was markedly higher than that of the model with all variables, and the former model, whether expressed as a decision tree or as decision rules, achieved a prediction accuracy above 80%. For continuous and categorical data alike, mutual information theory integrated with a decision tree could not only reduce the number of input parameters for the decision tree algorithm, but also predict and assess regional soil quality effectively.
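The factor-screening step described above can be sketched as follows: compute the mutual information between each (discretised) environmental factor and the soil-quality grade, and keep the top-ranked factors as inputs to the tree. This is a generic discrete-MI implementation, not the See 5.0 pipeline itself.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (in bits) between two discrete variables, used here
    to rank candidate environmental factors against the soil-quality grade."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        mi += pj * math.log2(pj / ((px[x] / n) * (py[y] / n)))
    return mi

def rank_factors(factors, target):
    """Return factor names sorted by decreasing MI with the target grade.

    factors: dict mapping factor name -> list of discretised values.
    """
    return sorted(factors,
                  key=lambda name: mutual_information(factors[name], target),
                  reverse=True)
```

A perfectly informative factor carries the full entropy of the grade (1 bit for a balanced binary grade), while an uninformative one scores zero, so the ranking directly reflects how much each factor narrows down the soil-quality class.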
Integrative sensing and prediction of urban water for sustainable cities (iSPUW)
NASA Astrophysics Data System (ADS)
Seo, D. J.; Fang, N. Z.; Yu, X.; Zink, M.; Gao, J.; Kerkez, B.
2014-12-01
We describe a newly launched project in the Dallas-Fort Worth Metroplex (DFW) area to develop a cyber-physical prototype system that integrates advanced sensing, modeling and prediction of urban water, to support its early adoption by a spectrum of users and stakeholders, and to educate a new generation of future sustainability scientists and engineers. The project utilizes the very high-resolution precipitation and other sensing capabilities uniquely available in DFW, as well as crowdsourcing and cloud computing, to advance understanding of the urban water cycle and to improve urban resilience to transient shocks of heavy-to-extreme precipitation under climate change and urbanization. All available water information from observations and models will be fused objectively via advanced data assimilation to produce the best estimate of the state of the uncertain system. Modeling, prediction and decision support tools will be developed in an ensemble framework to increase the information content of the analysis and prediction and to support risk-based decision making.
Helbling, Ignacio M; Ibarra, Juan C D; Luna, Julio A
2012-02-28
A mathematical modeling of controlled release of drug from one-layer torus-shaped devices is presented. Analytical solutions based on Refined Integral Method (RIM) are derived. The validity and utility of the model are ascertained by comparison of the simulation results with matrix-type vaginal rings experimental release data reported in the literature. For the comparisons, the pair-wise procedure is used to measure quantitatively the fit of the theoretical predictions to the experimental data. A good agreement between the model prediction and the experimental data is observed. A comparison with a previously reported model is also presented. More accurate results are achieved for small A/C(s) ratios. Copyright © 2011 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Branscum, Paul; Bhochhibhoya, Amir
2016-01-01
Background: The integrated behavioral model (IBM) is a new and emerging theory in the field of health promotion and health education, and more applications are needed to test the usefulness of the model for research and practice. Purpose: The purpose of this study was to operationalize the IBM as it relates to physical activity (PA) among children…
NASA Astrophysics Data System (ADS)
Lobuglio, Joseph N.; Characklis, Gregory W.; Serre, Marc L.
2007-03-01
Sparse monitoring data and error inherent in water quality models make the identification of waters not meeting regulatory standards uncertain. Additional monitoring can be implemented to reduce this uncertainty, but it is often expensive. These costs are currently a major concern, since developing total maximum daily loads, as mandated by the Clean Water Act, will require assessing tens of thousands of water bodies across the United States. This work uses the Bayesian maximum entropy (BME) method of modern geostatistics to integrate water quality monitoring data together with model predictions to provide improved estimates of water quality in a cost-effective manner. This information includes estimates of uncertainty and can be used to aid probabilistic-based decisions concerning the status of a water (i.e., impaired or not impaired) and the level of monitoring needed to characterize the water for regulatory purposes. This approach is applied to the Catawba River reservoir system in western North Carolina as a means of estimating seasonal chlorophyll a concentration. Mean concentration and confidence intervals for chlorophyll a are estimated for 66 reservoir segments over an 11-year period (726 values) based on 219 measured seasonal averages and 54 model predictions. Although the model predictions had a high degree of uncertainty, integration of modeling results via BME methods reduced the uncertainty associated with chlorophyll estimates compared with estimates made solely with information from monitoring efforts. Probabilistic predictions of future chlorophyll levels on one reservoir are used to illustrate the cost savings that can be achieved by less extensive and rigorous monitoring methods within the BME framework. 
While BME methods have been applied in several environmental contexts, employing these methods as a means of integrating monitoring and modeling results, as well as application of this approach to the assessment of surface water monitoring networks, represent unexplored areas of research.
Taylor, J M; Law, N
1998-10-30
We investigate the importance of the assumed covariance structure for longitudinal modelling of CD4 counts. We examine how individual predictions of future CD4 counts are affected by the covariance structure. We consider four covariance structures: one based on an integrated Ornstein-Uhlenbeck stochastic process; one based on Brownian motion, and two derived from standard linear and quadratic random-effects models. Using data from the Multicenter AIDS Cohort Study and from a simulation study, we show that there is a noticeable deterioration in the coverage rate of confidence intervals if we assume the wrong covariance. There is also a loss in efficiency. The quadratic random-effects model is found to be the best in terms of correctly calibrated prediction intervals, but is substantially less efficient than the others. Incorrectly specifying the covariance structure as linear random effects gives too narrow prediction intervals with poor coverage rates. Fitting using the model based on the integrated Ornstein-Uhlenbeck stochastic process is the preferred one of the four considered because of its efficiency and robustness properties. We also use the difference between the future predicted and observed CD4 counts to assess an appropriate transformation of CD4 counts; a fourth root, cube root and square root all appear reasonable choices.
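The integrated Ornstein-Uhlenbeck covariance commonly used in this longitudinal CD4 literature has a closed form, sketched below. The parameterisation shown is the standard textbook one and may differ in detail from the authors' formulation.

```python
import math

def iou_cov(t, s, alpha, sigma2):
    """Covariance of an integrated Ornstein-Uhlenbeck process at times t, s:

        sigma2 / (2*alpha**3) * (2*alpha*min(t, s) + exp(-alpha*t)
                                 + exp(-alpha*s) - 1 - exp(-alpha*|t - s|))

    alpha controls mean reversion of the underlying OU velocity process;
    sigma2 is its infinitesimal variance.
    """
    return sigma2 / (2.0 * alpha ** 3) * (
        2.0 * alpha * min(t, s) + math.exp(-alpha * t) + math.exp(-alpha * s)
        - 1.0 - math.exp(-alpha * abs(t - s)))

def iou_cov_matrix(times, alpha, sigma2):
    """Covariance matrix of repeated measurements at the given times."""
    return [[iou_cov(t, s, alpha, sigma2) for s in times] for t in times]
```

The matrix built this way would be plugged into a multivariate-normal likelihood for each subject's (transformed) CD4 trajectory; the variance grows with time, capturing the accumulating uncertainty that makes this structure more robust than linear random effects.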
Harnessing Big Data for Systems Pharmacology
Xie, Lei; Draizen, Eli J.; Bourne, Philip E.
2017-01-01
Systems pharmacology aims to holistically understand mechanisms of drug actions to support drug discovery and clinical practice. Systems pharmacology modeling (SPM) is data driven. It integrates an exponentially growing amount of data at multiple scales (genetic, molecular, cellular, organismal, and environmental). The goal of SPM is to develop mechanistic or predictive multiscale models that are interpretable and actionable. The current explosions in genomics and other omics data, as well as the tremendous advances in big data technologies, have already enabled biologists to generate novel hypotheses and gain new knowledge through computational models of genome-wide, heterogeneous, and dynamic data sets. More work is needed to interpret and predict a drug response phenotype, which is dependent on many known and unknown factors. To gain a comprehensive understanding of drug actions, SPM requires close collaborations between domain experts from diverse fields and integration of heterogeneous models from biophysics, mathematics, statistics, machine learning, and semantic webs. This creates challenges in model management, model integration, model translation, and knowledge integration. In this review, we discuss several emergent issues in SPM and potential solutions using big data technology and analytics. The concurrent development of high-throughput techniques, cloud computing, data science, and the semantic web will likely allow SPM to be findable, accessible, interoperable, reusable, reliable, interpretable, and actionable. PMID:27814027
Jovanovic, Milos; Radovanovic, Sandro; Vukicevic, Milan; Van Poucke, Sven; Delibasic, Boris
2016-09-01
Quantification and early identification of unplanned readmission risk have the potential to improve the quality of care during hospitalization and after discharge. However, high dimensionality, sparsity, and class imbalance of electronic health data and the complexity of risk quantification, challenge the development of accurate predictive models. Predictive models require a certain level of interpretability in order to be applicable in real settings and create actionable insights. This paper aims to develop accurate and interpretable predictive models for readmission in a general pediatric patient population, by integrating a data-driven model (sparse logistic regression) and domain knowledge based on the international classification of diseases 9th-revision clinical modification (ICD-9-CM) hierarchy of diseases. Additionally, we propose a way to quantify the interpretability of a model and inspect the stability of alternative solutions. The analysis was conducted on >66,000 pediatric hospital discharge records from California, State Inpatient Databases, Healthcare Cost and Utilization Project between 2009 and 2011. We incorporated domain knowledge based on the ICD-9-CM hierarchy in a data driven, Tree-Lasso regularized logistic regression model, providing the framework for model interpretation. This approach was compared with traditional Lasso logistic regression resulting in models that are easier to interpret by fewer high-level diagnoses, with comparable prediction accuracy. The results revealed that the use of a Tree-Lasso model was as competitive in terms of accuracy (measured by area under the receiver operating characteristic curve-AUC) as the traditional Lasso logistic regression, but integration with the ICD-9-CM hierarchy of diseases provided more interpretable models in terms of high-level diagnoses. Additionally, interpretations of models are in accordance with existing medical understanding of pediatric readmission. 
The best performing models have similar performance, reaching AUC values of 0.783 and 0.779 for traditional Lasso and Tree-Lasso, respectively. However, the information loss of the Lasso models is 0.35 bits higher than that of the Tree-Lasso model. We propose a method for building predictive models applicable to the detection of readmission risk based on electronic health records. Integration of domain knowledge (in the form of the ICD-9-CM taxonomy) with a data-driven, sparse predictive algorithm (Tree-Lasso logistic regression) increased the interpretability of the resulting model. The models are interpreted for the readmission prediction problem in the general pediatric population in California, as well as several important subpopulations, and the interpretations comply with existing medical understanding of pediatric readmission. Finally, a quantitative assessment of the interpretability of the models is given that goes beyond simple counts of selected low-level features. Copyright © 2016 Elsevier B.V. All rights reserved.
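The AUC values compared above rest on the rank-sum (Mann-Whitney) identity: the AUC is the probability that a randomly chosen positive case is scored above a randomly chosen negative one. A generic implementation, not tied to the authors' pipeline:

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum identity.

    labels: 0/1 class labels; scores: predicted risk scores.
    Ties between a positive and a negative score count as half a win.
    """
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

On this scale, the reported 0.783 vs 0.779 means the two models rank a random readmitted/non-readmitted pair correctly at nearly the same rate, which is why the interpretability gain of Tree-Lasso comes at essentially no accuracy cost.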
Dos Santos Vasconcelos, Crhisllane Rafaele; de Lima Campos, Túlio; Rezende, Antonio Mauro
2018-03-06
Systematic analysis of a parasite interactome is a key approach to understanding different biological processes. It makes it possible to elucidate disease mechanisms, to predict protein functions and to select promising targets for drug development. Currently, several approaches for protein interaction prediction for non-model species incorporate only small fractions of the entire proteomes and their interactions. Based on this perspective, this study presents an integration of computational methodologies, protein network predictions and comparative analysis of the protozoan species Leishmania braziliensis and Leishmania infantum. These parasites cause Leishmaniasis, a worldwide distributed and neglected disease, with limited treatment options using currently available drugs. The predicted interactions were obtained from a meta-approach, applying rigid body docking tests and template-based docking on protein structures predicted by different comparative modeling techniques. In addition, we trained a machine-learning algorithm (Gradient Boosting) using docking information performed on a curated set of positive and negative protein interaction data. Our final model obtained an AUC = 0.88, with recall = 0.69, specificity = 0.88 and precision = 0.83. Using this approach, it was possible to confidently predict 681 protein structures and 6198 protein interactions for L. braziliensis, and 708 protein structures and 7391 protein interactions for L. infantum. The predicted networks were integrated with protein interaction data already available, analyzed using several topological features and used to classify proteins as essential for network stability. The present study demonstrates the importance of integrating different interaction-prediction methodologies to increase the coverage of the predicted protein interactomes of the studied parasites, and it makes available protein structures and interactions not previously reported.
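The recall, specificity and precision figures reported for the docking-interaction classifier are confusion-matrix ratios. A minimal sketch of how such numbers are computed from binary predictions (generic, not the authors' evaluation code):

```python
def classification_metrics(y_true, y_pred):
    """Recall, specificity and precision for a binary classifier.

    recall      = TP / (TP + FN)  -- fraction of true interactions recovered
    specificity = TN / (TN + FP)  -- fraction of non-interactions rejected
    precision   = TP / (TP + FP)  -- fraction of predicted interactions that are real
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {"recall": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "precision": tp / (tp + fp)}
```

The reported profile (recall 0.69 but precision 0.83, specificity 0.88) indicates a classifier tuned to keep false positive interactions rare at the cost of missing some true ones, a sensible trade-off when the predicted network feeds downstream topological analysis.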
NASA Technical Reports Server (NTRS)
Gorski, Krzysztof M.
1993-01-01
Simple, easy-to-implement elementary-function approximations are introduced for the spectral window functions needed in calculations of model predictions of the cosmic microwave background (CMB) anisotropy. These approximations allow the investigator to obtain model delta T/T predictions in terms of single integrals over the power spectrum of cosmological perturbations, avoiding the need to perform the additional integrations. The high accuracy of these approximations is demonstrated here for CDM theory-based calculations of the expected delta T/T signal in several experiments searching for the CMB anisotropy.
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
Vergu, Elisabeta; Mallet, Alain; Golmard, Jean-Louis
2004-02-01
Because treatment failure in many HIV-infected persons may be due to multiple causes, including resistance to antiretroviral agents, it is important to better tailor drug therapy to individual patients. This improvement requires the prediction of treatment outcome from baseline immunological or virological factors, and from results of resistance tests. Here, we review briefly the available clinical factors that have an impact on therapy outcome, and discuss the role of a predictive modelling approach integrating these factors proposed in a previous work. Mathematical and statistical models could become essential tools to address questions that are difficult to study clinically and experimentally, thereby guiding decisions in the choice of individualized drug regimens.
NASA Astrophysics Data System (ADS)
Xu, Yadong; Serre, Marc L.; Reyes, Jeanette M.; Vizuete, William
2017-10-01
We have developed a Bayesian Maximum Entropy (BME) framework that integrates observations from a surface monitoring network and predictions from a Chemical Transport Model (CTM) to create improved exposure estimates that can be resolved into any spatial and temporal resolution. The flexibility of the framework allows for input of data in any choice of time scales and CTM predictions of any spatial resolution, with varying associated degrees of estimation error and cost in terms of implementation and computation. This study quantifies the impact on exposure estimation error due to these choices by first comparing estimation errors when BME relied on ozone concentration data either as an hourly average, the daily maximum 8-h average (DM8A), or the daily 24-h average (D24A). Our analysis found that the use of DM8A and D24A data, although less computationally intensive, reduced estimation error more when compared to the use of hourly data. This was primarily due to the poorer CTM model performance in the hourly average predicted ozone. Our second analysis compared spatial variability and estimation errors when BME relied on CTM predictions with a grid cell resolution of 12 × 12 km² versus a coarser resolution of 36 × 36 km². Our analysis found that integrating the finer grid resolution CTM predictions not only reduced estimation error, but also increased the spatial variability in daily ozone estimates fivefold. This improvement was due to the improved spatial gradients and model performance found in the finer resolved CTM simulation. The integration of observational and model predictions that is permitted in a BME framework continues to be a powerful approach for improving exposure estimates of ambient air pollution. The results of this analysis demonstrate the importance of also understanding model performance variability and its implications for exposure error.
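The two daily metrics compared in the study can be computed directly from hourly data. A simplified sketch; the concentrations below are hypothetical ppb values, and regulatory DM8A definitions additionally handle 8-h windows that cross midnight:

```python
# Daily 24-h average (D24A) and daily maximum 8-h average (DM8A)
# from one day of hourly ozone concentrations.
def d24a(hourly):
    return sum(hourly) / len(hourly)

def dm8a(hourly):
    # Slide an 8-h window over the day and take the largest window mean.
    windows = (hourly[i:i + 8] for i in range(len(hourly) - 7))
    return max(sum(w) / 8 for w in windows)

hourly = [30, 28, 25, 24, 22, 25, 30, 38, 45, 52, 58, 63,
          66, 68, 67, 64, 60, 55, 48, 42, 38, 35, 33, 31]  # ppb, hypothetical
print(round(d24a(hourly), 1), round(dm8a(hourly), 1))  # 43.6 62.6
```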
Multi-omics facilitated variable selection in Cox-regression model for cancer prognosis prediction.
Liu, Cong; Wang, Xujun; Genchev, Georgi Z; Lu, Hui
2017-07-15
New developments in high-throughput genomic technologies have enabled the measurement of diverse types of omics biomarkers in a cost-efficient and clinically-feasible manner. Developing computational methods and tools for analysis and translation of such genomic data into clinically-relevant information is an ongoing and active area of investigation. For example, several studies have utilized an unsupervised learning framework to cluster patients by integrating omics data. Despite such recent advances, predicting cancer prognosis using integrated omics biomarkers remains a challenge. There is also a shortage of computational tools for predicting cancer prognosis by using supervised learning methods. The current standard approach is to fit a Cox regression model by concatenating the different types of omics data in a linear manner, while a penalty can be added for feature selection. A more powerful approach, however, would be to incorporate data by considering relationships among omics datatypes. Here we developed two methods, SKI-Cox and wLASSO-Cox, to incorporate the association among different types of omics data. Both methods fit the Cox proportional hazards model and predict a risk score based on mRNA expression profiles. SKI-Cox borrows the information generated by these additional types of omics data to guide variable selection, while wLASSO-Cox incorporates this information as a penalty factor during model fitting. We show that SKI-Cox and wLASSO-Cox models select more true variables than a LASSO-Cox model in simulation studies. We assess the performance of SKI-Cox and wLASSO-Cox using TCGA glioblastoma multiforme and lung adenocarcinoma data. In each case, mRNA expression, methylation, and copy number variation data are integrated to predict the overall survival time of cancer patients. Our methods achieve better performance in predicting patients' survival in glioblastoma and lung adenocarcinoma. Copyright © 2017. Published by Elsevier Inc.
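The weighted-penalty idea behind wLASSO-Cox can be sketched as a Cox partial-likelihood objective plus an L1 term whose per-gene weights come from the auxiliary omics evidence. A minimal pure-Python sketch (Breslow-style likelihood assuming no tied event times; the data, weights and function names are illustrative, not the authors' implementation):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def wlasso_cox_objective(beta, X, time, event, lam, weights):
    # Negative log partial likelihood (Breslow form, no tied event times).
    nll = 0.0
    n = len(X)
    for i in range(n):
        if event[i]:
            risk_set = sum(math.exp(dot(X[j], beta)) for j in range(n)
                           if time[j] >= time[i])
            nll += math.log(risk_set) - dot(X[i], beta)
    # Weighted L1 penalty: genes with strong support from methylation/CNV
    # evidence would receive smaller weights and hence less shrinkage.
    return nll + lam * sum(w * abs(b) for w, b in zip(weights, beta))

X = [[0.5, 1.0], [1.5, 0.2], [0.1, 0.9]]   # toy expression profiles
obj = wlasso_cox_objective([0.0, 0.0], X, time=[1, 2, 3],
                           event=[1, 1, 1], lam=0.5, weights=[1.0, 2.0])
print(round(obj, 4))  # at beta = 0: log(3) + log(2) + log(1) = log(6) ≈ 1.7918
```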
Thermal Testing and Model Correlation for Advanced Topographic Laser Altimeter Instrument (ATLAS)
NASA Technical Reports Server (NTRS)
Patel, Deepak
2016-01-01
The Advanced Topographic Laser Altimeter System (ATLAS), part of the Ice, Cloud and Land Elevation Satellite-2 (ICESat-2), is an upcoming Earth Science mission focusing on the effects of climate change. The flight instrument passed all environmental testing at GSFC (Goddard Space Flight Center) and is now ready to be shipped to the spacecraft vendor for integration and testing. This paper covers the analysis leading up to the test setup for ATLAS thermal testing, as well as model correlation to flight predictions. The test setup analysis section includes areas where ATLAS could not meet flight-like conditions and describes the resulting limitations. The model correlation section walks through the changes that had to be made to the thermal model to match test results. The correlated model will then be integrated with the spacecraft model for on-orbit predictions.
An integrative formal model of motivation and decision making: The MGPM*.
Ballard, Timothy; Yeo, Gillian; Loft, Shayne; Vancouver, Jeffrey B; Neal, Andrew
2016-09-01
We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Gliozzi, T M; Turri, F; Manes, S; Cassinelli, C; Pizzi, F
2017-11-01
Within recent years, there has been growing interest in the prediction of bull fertility through in vitro assessment of semen quality. A model for fertility prediction based on early evaluation of semen quality parameters, to exclude sires with potentially low fertility from breeding programs, would therefore be useful. The aim of the present study was to identify the most suitable parameters that would provide reliable prediction of fertility. Frozen semen from 18 Italian Holstein-Friesian proven bulls was analyzed using computer-assisted semen analysis (CASA) (motility and kinetic parameters) and flow cytometry (FCM) (viability, acrosomal integrity, mitochondrial function, lipid peroxidation, plasma membrane stability and DNA integrity). Bulls were divided into two groups (low and high fertility) based on the estimated relative conception rate (ERCR). Significant differences were found between fertility groups for total motility, active cells, straightness, linearity, viability and percentage of DNA-fragmented sperm. Correlations were observed between ERCR and some kinetic parameters, membrane instability and some DNA integrity indicators. In order to define a model with a strong relationship between semen quality parameters and ERCR, backward stepwise multiple regression analysis was applied. Thus, we obtained a prediction model that explained almost half (R²=0.47, P<0.05) of the variation in the conception rate and included nine variables: five kinetic parameters measured by CASA (total motility, active cells, beat cross frequency, curvilinear velocity and amplitude of lateral head displacement) and four parameters related to DNA integrity evaluated by FCM (degree of chromatin structure abnormality (Alpha-T), extent of chromatin structure abnormality (Alpha-T standard deviation), percentage of DNA-fragmented sperm and percentage of sperm with high green fluorescence, representative of immature cells).
A significant relationship (R²=0.84, P<0.05) was observed between real and predicted fertility. Once the accuracy of fertility prediction has been confirmed, the model developed in the present study could be used by artificial insemination centers for bull selection or for the elimination of poor-fertility ejaculates.
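Both reported statistics (R² = 0.47 for the regression fit, R² = 0.84 between real and predicted fertility) are coefficients of determination. A minimal sketch with hypothetical data, not the study's measurements:

```python
# Coefficient of determination between observed and predicted values.
def r_squared(y, yhat):
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))   # residual sum of squares
    ss_tot = sum((a - mean_y) ** 2 for a in y)            # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical ERCR values (observed) vs. model predictions.
observed = [2.0, -1.0, 0.5, 3.0, -2.5]
predicted = [1.6, -0.8, 0.9, 2.5, -2.0]
print(round(r_squared(observed, predicted), 2))  # 0.96
```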
Performance Enhancements Under Dual-task Conditions
NASA Technical Reports Server (NTRS)
Kramer, A. F.; Wickens, C. D.; Donchin, E.
1984-01-01
Research on dual-task performance has been concerned with delineating the antecedent conditions which lead to dual-task decrements. Capacity models of attention, which propose that a hypothetical resource structure underlies performance, have been employed as predictive devices. These models predict that tasks which require different processing resources can be more successfully time shared than tasks which require common resources. The conditions under which such dual-task integrality can be fostered were assessed in a study in which three factors likely to influence the integrality between tasks were manipulated: inter-task redundancy, the physical proximity of tasks and the task relevant objects. Twelve subjects participated in three experimental sessions in which they performed both single and dual-tasks. The primary task was a pursuit step tracking task. The secondary tasks required the discrimination between different intensities or different spatial positions of a stimulus. The results are discussed in terms of a model of dual-task integrality.
NASA Astrophysics Data System (ADS)
Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C.; Coletti, Janaine Z.; Read, Jordan S.; Ibelings, Bas W.; Valesini, Fiona J.; Brookes, Justin D.
2015-09-01
Maintaining the health of aquatic systems is an essential component of sustainable catchment management; however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts, aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework comprises a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.
Using the Integrated Behavioral Model to Predict High-Risk Drinking among College Students
ERIC Educational Resources Information Center
Braun, Robert E.; Glassman, Tavis; Sheu, Jiunn-Jye; Dake, Joseph; Jordan, Tim; Yingling, Faith
2014-01-01
This study assessed the Integrated Behavioral Model's (IBM) utility in explaining high-risk drinking among college students. A total of 356 participants completed a four-page questionnaire based on the IBM and their drinking behavior. The results from a path analysis revealed three significant constructs (p<0.05) which predicted…
Predicting Attrition in a Military Special Program Training Command
2016-05-20
management, infrequency, and acquiescence. The 16PF Protective Services Dimensions include: emotional adjustment, integrity/control, intellectual...outcome (p = .10) Model 5: 16PF Protective Services Dimensions Only Predictors included emotional adjustment, integrity/control, intellectual...consciousness were not significantly related to GPA (ps = .01). Model 2A.5: 16PF Protective Services Dimensions Only Predictors included: emotional
ERIC Educational Resources Information Center
Bleakley, Amy; Hennessy, Michael; Fishbein, Martin; Jordan, Amy
2011-01-01
Published research demonstrates an association between exposure to media sexual content and a variety of sex-related outcomes for adolescents. What is not known is the mechanism through which sexual content produces this "media effect" on adolescent beliefs, attitudes, and behavior. Using the Integrative Model of Behavioral Prediction, this…
In this study, indirect aerosol effects on grid-scale clouds were implemented in the integrated WRF3.3-CMAQ5.0 modeling system by including parameterizations for both cloud droplet and ice number concentrations calculated from the CMAQ-predicted aerosol particles. The resulting c...
[Prediction of schistosomiasis infection rates of population based on ARIMA-NARNN model].
Ke-Wei, Wang; Yu, Wu; Jin-Ping, Li; Yu-Yu, Jiang
2016-07-12
To explore the effect of the autoregressive integrated moving average model-nonlinear auto-regressive neural network (ARIMA-NARNN) model on predicting schistosomiasis infection rates of a population. The ARIMA model, NARNN model and ARIMA-NARNN model were established based on monthly schistosomiasis infection rates from January 2005 to February 2015 in Jiangsu Province, China. The fitting and prediction performances of the three models were compared. Compared to the ARIMA model and NARNN model, the mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model were the lowest, with values of 0.0111, 0.0900 and 0.2824, respectively. The ARIMA-NARNN model could effectively fit and predict schistosomiasis infection rates of a population, which might have great application value for the prevention and control of schistosomiasis.
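The hybrid structure can be sketched as a linear stage plus a nonlinear stage fit to the linear stage's residuals, with the two forecasts summed. A toy sketch using an AR(1) fit in place of the full ARIMA, and an arbitrary callable standing in for the trained NARNN (all names and data are illustrative):

```python
def ar1_coef(y):
    # Least-squares AR(1) coefficient: regress y[t] on y[t-1] (no intercept).
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

def hybrid_forecast(y, nonlinear_model):
    phi = ar1_coef(y)
    residuals = [y[t] - phi * y[t - 1] for t in range(1, len(y))]
    # In ARIMA-NARNN the neural network is trained on `residuals`; here
    # `nonlinear_model` is any callable predicting the next residual.
    return phi * y[-1] + nonlinear_model(residuals)

# Purely linear series: residuals vanish, so the hybrid equals the AR forecast.
print(hybrid_forecast([1, 2, 4, 8, 16], lambda res: 0.0))  # 32.0
```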
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, J.H.; Roy, D.M.; Mann, B.
1995-12-31
This paper describes an integrated approach to developing a predictive computer model for long-term performance of concrete engineered barriers utilized in LLRW and ILRW disposal facilities. The model development concept consists of three major modeling schemes: hydration modeling of the binder phase, pore solution speciation, and transport modeling in the concrete barrier and service environment. Although still in its inception, the model development approach demonstrated that the chemical and physical properties of complex cementitious materials and their interactions with service environments can be described quantitatively. Applying the integrated model development approach to modeling alkali (Na and K) leaching from a concrete pad barrier in an above-grade tumulus disposal unit, it is predicted that, in a near-surface land disposal facility where water infiltration through the facility is normally minimal, the alkalis control the pore solution pH of the concrete barriers for much longer than most previous concrete barrier degradation studies assumed. The results also imply that a highly alkaline condition created by the alkali leaching will result in alteration of the soil mineralogy in the vicinity of the disposal facility.
NASA Technical Reports Server (NTRS)
Murch, Austin M.; Foster, John V.
2007-01-01
A simulation study was conducted to investigate aerodynamic modeling methods for prediction of post-stall flight dynamics of large transport airplanes. The research approach involved integrating dynamic wind tunnel data from rotary balance and forced oscillation testing with static wind tunnel data to predict aerodynamic forces and moments during highly dynamic departure and spin motions. Several state-of-the-art aerodynamic modeling methods were evaluated and predicted flight dynamics using these various approaches were compared. Results showed the different modeling methods had varying effects on the predicted flight dynamics and the differences were most significant during uncoordinated maneuvers. Preliminary wind tunnel validation data indicated the potential of the various methods for predicting steady spin motions.
Significance of Landsat-7 Spacecraft Level Thermal Balance and Thermal Test for ETM+Instrument
NASA Technical Reports Server (NTRS)
Choi, Michael K.
1999-01-01
The thermal design and the instrument thermal vacuum (T/V) test of the Landsat-7 Enhanced Thematic Mapper Plus (ETM+) instrument were based on the Landsat-4, 5 and 6 heritage. The ETM+ scanner thermal model was also inherited from Landsat-4, 5 and 6. The temperature predictions of many scanner components in the original thermal model had poor agreement with the spacecraft and instrument integrated sun-pointing safehold (SPSH) thermal balance (T/B) test results. The spacecraft and instrument integrated T/B test led to a change of the Full Aperture Calibrator (FAC) motor stack "solar shield" coating from MIL-C-5541 to multi-layer insulation (MLI) thermal blanket. The temperature predictions of the Auxiliary Electronics Module (AEM) in the thermal model also had poor agreement with the T/B test results. Modifications to the scanner and AEM thermal models were performed to give good agreement between the temperature predictions and the test results. The correlated ETM+ thermal model was used to obtain flight temperature predictions. The flight temperature predictions in the nominal 15-orbit mission profile, plus margins, were used as the yellow limits for most of the ETM+ components. The spacecraft and instrument integrated T/B and T/V test also revealed that the standby heater capacity on the Scan Mirror Assembly (SMA) was insufficient when the Earth Background Simulator (EBS) was 1 50C or colder, and that the baffle heater possibly caused the coherent noise in the narrow band data when it was on. Also, the cooler cool-down was significantly faster than that in the instrument T/V test, and the coldest Cold Focal Plane Array (CFPA) temperature achieved was colder.
Predicting sugar consumption: Application of an integrated dual-process, dual-phase model.
Hagger, Martin S; Trost, Nadine; Keech, Jacob J; Chan, Derwin K C; Hamilton, Kyra
2017-09-01
Excess consumption of added dietary sugars is related to multiple metabolic problems and adverse health conditions. Identifying the modifiable social cognitive and motivational constructs that predict sugar consumption is important to inform behavioral interventions aimed at reducing sugar intake. We tested the efficacy of an integrated dual-process, dual-phase model derived from multiple theories to predict sugar consumption. Using a prospective design, university students (N = 90) completed initial measures of the reflective (autonomous and controlled motivation, intentions, attitudes, subjective norm, perceived behavioral control), impulsive (implicit attitudes), volitional (action and coping planning), and behavioral (past sugar consumption) components of the proposed model. Self-reported sugar consumption was measured two weeks later. A structural equation model revealed that intentions, implicit attitudes, and, indirectly, autonomous motivation to reduce sugar consumption had small, significant effects on sugar consumption. Attitudes, subjective norm, and, indirectly, autonomous motivation to reduce sugar consumption predicted intentions. There were no effects of the planning constructs. Model effects were independent of the effects of past sugar consumption. The model identified the relative contribution of reflective and impulsive components in predicting sugar consumption. Given the prominent role of the impulsive component, interventions that assist individuals in managing cues-to-action and behavioral monitoring are likely to be effective in regulating sugar consumption. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Chai, Tianfeng; Crawford, Alice; Stunder, Barbara; Pavolonis, Michael J.; Draxler, Roland; Stein, Ariel
2017-02-01
Currently, the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) runs the HYSPLIT dispersion model with a unit mass release rate to predict the transport and dispersion of volcanic ash. The model predictions provide information for the Volcanic Ash Advisory Centers (VAAC) to issue advisories to meteorological watch offices, area control centers, flight information centers, and others. This research aims to provide quantitative forecasts of ash distributions generated by objectively and optimally estimating the volcanic ash source strengths, vertical distribution, and temporal variations using an observation-modeling inversion technique. In this top-down approach, a cost functional is defined to quantify the differences between the model predictions and the satellite measurements of column-integrated ash concentrations weighted by the model and observation uncertainties. Minimizing this cost functional by adjusting the sources provides the volcanic ash emission estimates. As an example, MODIS (Moderate Resolution Imaging Spectroradiometer) satellite retrievals of the 2008 Kasatochi volcanic ash clouds are used to test the HYSPLIT volcanic ash inverse system. Because the satellite retrievals include the ash cloud top height but not the bottom height, there are different model diagnostic choices for comparing the model results with the observed mass loadings. Three options are presented and tested. Although the emission estimates vary significantly with different options, the subsequent model predictions with the different release estimates all show decent skill when evaluated against the unassimilated satellite observations at later times. Among the three options, integrating over three model layers yields slightly better results than integrating from the surface up to the observed volcanic ash cloud top or using a single model layer. 
Inverse tests also show that including the ash-free region to constrain the model is not beneficial for the current case. In addition, extra constraints on the source terms can be given by explicitly enforcing no-ash for the atmosphere columns above or below the observed ash cloud top height. However, in this case such extra constraints are not helpful for the inverse modeling. It is also found that simultaneously assimilating observations at different times produces better hindcasts than only assimilating the most recent observations.
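The cost functional described (model-observation mismatch weighted by uncertainties, minimized over the source terms) can be sketched for a linear source-receptor relationship, where a matrix H maps the emission vector q to column-integrated loadings. The matrix, observations, uncertainties and prior below are all hypothetical, and a coarse grid search stands in for the real minimization:

```python
def cost(q, H, obs, sigma_obs, q_prior, sigma_q):
    # Predicted column-integrated loadings from the source-receptor matrix.
    pred = [sum(h * qi for h, qi in zip(row, q)) for row in H]
    # Model-observation mismatch weighted by observation uncertainty...
    j_obs = sum(((p - o) / s) ** 2 for p, o, s in zip(pred, obs, sigma_obs))
    # ...plus deviation from the first-guess sources, weighted likewise.
    j_src = sum(((qi - qp) / s) ** 2 for qi, qp, s in zip(q, q_prior, sigma_q))
    return 0.5 * (j_obs + j_src)

H = [[1.0, 0.5], [0.2, 1.0]]   # hypothetical source-receptor matrix
obs = [2.0, 1.3]               # "observed" mass loadings, consistent with q = (1.5, 1.0)

best_q, best_cost = None, float("inf")
for a in range(31):            # coarse grid search over two source strengths
    for b in range(31):
        q = (a / 10, b / 10)
        c = cost(q, H, obs, [0.1, 0.1], [0.0, 0.0], [10.0, 10.0])
        if c < best_cost:
            best_q, best_cost = q, c
print(best_q)  # (1.5, 1.0)
```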
He, Lian; Wu, Stephen G.; Wan, Ni; ...
2015-12-24
Genome-scale models (GSMs) are widely used to predict cyanobacterial phenotypes in photobioreactors (PBRs). However, stoichiometric GSMs mainly focus on the fluxome that results in maximal yields. Cyanobacterial metabolism is controlled by both intracellular enzymes and photobioreactor conditions. To connect both intracellular and extracellular information and achieve a better understanding of PBR productivities, this study integrates a genome-scale metabolic model of Synechocystis 6803 with growth kinetics, cell movements, and a light distribution function. The hybrid platform not only maps flux dynamics in cells of sub-populations but also predicts overall production titer and rate in PBRs. Analysis of the integrated GSM demonstrates several results. First, cyanobacteria are capable of reaching high biomass concentration (>20 g/L in 21 days) in PBRs without light and CO2 mass transfer limitations. Second, fluxome in a single cyanobacterium may show stochastic changes due to random cell movements in PBRs. Third, insufficient light due to cell self-shading can activate the oxidative pentose phosphate pathway in subpopulation cells. Fourth, the model indicates that the removal of the glycogen synthesis pathway may not improve cyanobacterial bio-production in large-size PBRs, because glycogen can support cell growth in the dark zones. Based on experimental data, the integrated GSM estimates that Synechocystis 6803 in shake-flask conditions has a photosynthesis efficiency of ~2.7%. Conclusions: The multiple-scale integrated GSM, which examines both intracellular and extracellular domains, can be used to predict production yield/rate/titer in large-size PBRs. More importantly, genetic engineering strategies predicted by a traditional GSM may work well only in optimal growth conditions. In contrast, the integrated GSM may reveal mutant physiologies in diverse bioreactor conditions, leading to the design of robust strains with high chances of success in industrial settings.
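The light-distribution component can be illustrated with a Beer-Lambert attenuation function, in which irradiance decays exponentially with biomass density and depth, producing the self-shaded dark zones the abstract mentions. A generic sketch with illustrative parameter values, not the paper's calibrated function:

```python
import math

def irradiance(I0, biomass, depth, k=0.2):
    # Beer-Lambert attenuation: self-shading grows with biomass * depth,
    # so cells deep in a dense culture see little light (dark zones).
    # I0 in umol photons/m2/s, biomass in g/L, depth in m; k is an
    # illustrative attenuation coefficient.
    return I0 * math.exp(-k * biomass * depth)

surface = irradiance(100.0, biomass=20.0, depth=0.0)
deep = irradiance(100.0, biomass=20.0, depth=0.05)
print(surface, round(deep, 1))  # 100.0 81.9
```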
Degroeve, Sven; Maddelein, Davy; Martens, Lennart
2015-07-01
We present an MS² peak intensity prediction server that computes MS² charge 2+ and 3+ spectra from peptide sequences for the most common fragment ions. The server integrates the Unimod public domain post-translational modification database for modified peptides. The prediction model is an improvement of the previously published MS²PIP model for Orbitrap-LTQ CID spectra. Predicted MS² spectra can be downloaded as a spectrum file and can be visualized in the browser for comparisons with observations. In addition, we added prediction models for HCD fragmentation (Q-Exactive Orbitrap) and show that these models compute accurate intensity predictions on par with CID performance. We also show that training prediction models for CID and HCD separately improves the accuracy for each fragmentation method. The MS²PIP prediction server is accessible from http://iomics.ugent.be/ms2pip. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Optimization of processing parameters of UAV integral structural components based on yield response
NASA Astrophysics Data System (ADS)
Chen, Yunsheng
2018-05-01
In order to improve the overall strength of unmanned aerial vehicles (UAVs), the machining parameters of UAV structural components must be optimized, since machining is affected by initial residual stress and machining errors occur easily. An optimization model for the machining parameters of UAV integral structural components based on yield response is therefore proposed. The finite element method is used to simulate the machining of UAV integral structural components. A prediction model for workpiece surface machining error is established, and the influence of the tool path on the residual stress of the UAV integral structure is studied according to the stress state of the component. The yield response of the time-varying stiffness is analyzed, together with the stress evolution mechanism of the UAV integral structure. The simulation results show that this method optimizes the machining parameters of UAV integral structural components and improves the precision of UAV milling. Machining error is reduced, and deformation prediction and error compensation of UAV integral structural parts are realized, thus improving machining quality.
NASA Astrophysics Data System (ADS)
Wasserman, Richard Marc
The radiation therapy treatment planning (RTTP) process may be subdivided into three planning stages: gross tumor delineation, clinical target delineation, and modality dependent target definition. The research presented will focus on the first two planning tasks. A gross tumor target delineation methodology is proposed which focuses on the integration of MRI, CT, and PET imaging data towards the generation of a mathematically optimal tumor boundary. The solution to this problem is formulated within a framework integrating concepts from the fields of deformable modelling, region growing, fuzzy logic, and data fusion. The resulting fuzzy fusion algorithm can integrate both edge and region information from multiple medical modalities to delineate optimal regions of pathological tissue content. The subclinical boundaries of an infiltrating neoplasm cannot be determined explicitly via traditional imaging methods and are often defined to extend a fixed distance from the gross tumor boundary. In order to improve the clinical target definition process an estimation technique is proposed via which tumor growth may be modelled and subclinical growth predicted. An in vivo, macroscopic primary brain tumor growth model is presented, which may be fit to each patient undergoing treatment, allowing for the prediction of future growth and consequently the ability to estimate subclinical local invasion. Additionally, the patient specific in vivo tumor model will be of significant utility in multiple diagnostic clinical applications.
Rodrigo, Guillermo; Jaramillo, Alfonso; Blázquez, Miguel A
2011-08-17
The interplay between hormone signaling and gene regulatory networks is instrumental in promoting the development of living organisms. In particular, plants have evolved mechanisms to sense gravity and orient themselves accordingly. Here, we present a mathematical model that reproduces plant gravitropic responses based on known molecular genetic interactions for auxin signaling coupled with a physical description of plant reorientation. The model allows one to analyze the spatiotemporal dynamics of the system, triggered by an auxin gradient that induces differential growth of the plant with respect to the gravity vector. Our model predicts two important features with strong biological implications: 1), robustness of the regulatory circuit as a consequence of integral control; and 2), a higher degree of plasticity generated by the molecular interplay between two classes of hormones. Our model also predicts the ability of gibberellins to modulate the tropic response and supports the integration of the hormonal role at the level of gene regulation. Copyright © 2011 Biophysical Society. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2010-05-23
The increasing asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.
Roberts, David W; Patlewicz, Grace
2018-01-01
There is an expectation that to meet regulatory requirements, and avoid or minimize animal testing, integrated approaches to testing and assessment will be needed that rely on assays representing key events (KEs) in the skin sensitization adverse outcome pathway. Three non-animal assays have been formally validated and adopted by regulatory agencies: the direct peptide reactivity assay (DPRA), the KeratinoSens™ assay and the human cell line activation test (h-CLAT). There have been many efforts to develop integrated approaches to testing and assessment, with the "two out of three" approach attracting much attention. Here a set of 271 chemicals with mouse, human and non-animal sensitization test data was evaluated to compare the predictive performances of the three individual non-animal assays, their binary combinations and the "two out of three" approach in predicting skin sensitization potential. The most predictive approach was to use both the DPRA and h-CLAT as follows: (1) perform DPRA - if positive, classify as sensitizing, and (2) if negative, perform h-CLAT - a positive outcome denotes a sensitizer, a negative, a non-sensitizer. With this approach, 85% (local lymph node assay) and 93% (human) of non-sensitizer predictions were correct, whereas the "two out of three" approach had 69% (local lymph node assay) and 79% (human) of non-sensitizer predictions correct. The findings are consistent with the argument, supported by published quantitative mechanistic models, that only the first KE needs to be modeled. All three assays model this KE to an extent. The value of using more than one assay depends on how the different assays compensate for each other's technical limitations. Copyright © 2017 John Wiley & Sons, Ltd.
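The two decision strategies compared in this abstract are simple enough to state as code. The sketch below (hypothetical function names; assay outcomes reduced to booleans) contrasts the sequential DPRA-then-h-CLAT rule with the "two out of three" majority vote:

```python
def sequential_dpra_hclat(dpra_positive, hclat_positive):
    """Sequential strategy: run DPRA first; if positive, classify as a
    sensitizer; if negative, defer to the h-CLAT result."""
    if dpra_positive:
        return "sensitizer"
    return "sensitizer" if hclat_positive else "non-sensitizer"

def two_out_of_three(dpra_positive, keratinosens_positive, hclat_positive):
    """Majority vote across the three validated non-animal assays."""
    votes = sum([dpra_positive, keratinosens_positive, hclat_positive])
    return "sensitizer" if votes >= 2 else "non-sensitizer"
```

Note how a single positive DPRA suffices for a "sensitizer" call in the sequential rule, whereas the majority vote requires agreement from a second assay.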
Assessment of SWE data assimilation for ensemble streamflow predictions
NASA Astrophysics Data System (ADS)
Franz, Kristie J.; Hogue, Terri S.; Barik, Muhammad; He, Minxue
2014-11-01
An assessment of data assimilation (DA) for Ensemble Streamflow Prediction (ESP) using seasonal water supply hindcasting in the North Fork of the American River Basin (NFARB) and the National Weather Service (NWS) hydrologic forecast models is undertaken. Two parameter sets, one from the California Nevada River Forecast Center (RFC) and one from the Differential Evolution Adaptive Metropolis (DREAM) algorithm, are tested. For each parameter set, hindcasts are generated using initial conditions derived with and without the inclusion of a DA scheme that integrates snow water equivalent (SWE) observations. The DREAM-DA scenario uses an Integrated Uncertainty and Ensemble-based data Assimilation (ICEA) framework that also considers model and parameter uncertainty. Hindcasts are evaluated using deterministic and probabilistic forecast verification metrics. In general, the impact of DA on the skill of the seasonal water supply predictions is mixed. For deterministic (ensemble mean) predictions, the Percent Bias (PBias) is improved with integration of the DA. DREAM-DA and the RFC-DA have the lowest biases and the RFC-DA has the lowest Root Mean Squared Error (RMSE). However, the RFC and DREAM-DA have similar RMSE scores. For the probabilistic predictions, the RFC and DREAM have the highest Continuous Ranked Probability Skill Scores (CRPSS) and the RFC has the best discrimination for low flows. Reliability results are similar between the non-DA and DA tests and the DREAM and DREAM-DA have better reliability than the RFC and RFC-DA for forecast dates February 1 and later. Despite producing improved streamflow simulations in previous studies, the hindcast analysis suggests that the DA method tested may not result in obvious improvements in streamflow forecasts. We advocate that integration of hindcasting and probabilistic metrics provides more rigorous insight on model performance for forecasting applications, such as in this study.
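The deterministic verification metrics reported above, PBias and RMSE, can be computed as follows. This is a generic sketch; the sign convention for PBias (positive meaning overestimation) is an assumption, and conventions vary between forecast centers:

```python
import numpy as np

def percent_bias(sim, obs):
    """Percent bias of simulated vs. observed volumes; positive values
    indicate overestimation under the convention assumed here."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 100.0 * (sim - obs).sum() / obs.sum()

def rmse(sim, obs):
    """Root mean squared error of the ensemble-mean prediction."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))
```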
Yeari, Menahem; van den Broek, Paul
2016-09-01
It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
Integrated Optical Design Analysis (IODA): New Test Data and Modeling Features
NASA Technical Reports Server (NTRS)
Moore, Jim; Troy, Ed; Patrick, Brian
2003-01-01
A general overview of the capabilities of the IODA ("Integrated Optical Design Analysis") software is presented. IODA promotes efficient exchange of data and modeling results between the thermal, structures, optical design, and testing engineering disciplines. This presentation focuses on new features added to the software that allow measured test data to be imported into the IODA environment for post-processing or comparison with pretest model predictions.
A review on machine learning principles for multi-view biological data integration.
Li, Yifeng; Wu, Fang-Xiang; Ngom, Alioune
2018-03-01
Driven by high-throughput sequencing techniques, modern genomic and clinical studies are in strong need of integrative machine learning models that make better use of vast volumes of heterogeneous information for the deep understanding of biological systems and the development of predictive models. How data from multiple sources (called multi-view data) are incorporated in a learning system is a key step for successful analysis. In this article, we provide a comprehensive review of omics and clinical data integration techniques, from a machine learning perspective, for various analyses such as prediction, clustering, dimension reduction and association. We show that Bayesian models are able to use prior information and model measurements with various distributions; tree-based methods can either build a tree with all features or collectively make a final decision based on trees learned from each view; kernel methods fuse the similarity matrices learned from individual views into a final similarity matrix or learning model; network-based fusion methods are capable of inferring direct and indirect associations in a heterogeneous network; matrix factorization models have the potential to learn interactions among features from different views; and a range of deep neural networks can be integrated in multi-modal learning for capturing the complex mechanisms of biological systems.
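As a minimal illustration of the kernel-fusion scheme this review mentions, the sketch below averages per-view RBF Gram matrices into a single fused kernel. The RBF kernel, uniform weights and `gamma` value are illustrative choices, not the review's prescription:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of an RBF kernel for one view (rows are samples)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fuse_kernels(views, weights=None):
    """Weighted (default: unweighted) average of per-view kernel
    matrices -- the simplest multi-view kernel-fusion scheme."""
    kernels = [rbf_kernel(X) for X in views]
    weights = weights or [1.0 / len(kernels)] * len(kernels)
    return sum(w * K for w, K in zip(weights, kernels))
```

The fused matrix remains symmetric positive semidefinite, so it can be passed directly to any kernel method (e.g. an SVM or spectral clustering).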
High-Performance Integrated Control of water quality and quantity in urban water reservoirs
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.; Goedbloed, A.
2015-11-01
This paper contributes a novel High-Performance Integrated Control framework to support the real-time operation of urban water supply storages affected by water quality problems. We use a 3-D, high-fidelity simulation model to predict the main water quality dynamics and inform a real-time controller based on Model Predictive Control. The integration of the simulation model into the control scheme is performed by a model reduction process that identifies a low-order, dynamic emulator running 4 orders of magnitude faster. The model reduction, which relies on a semiautomatic procedural approach integrating time series clustering and variable selection algorithms, generates a compact and physically meaningful emulator that can be coupled with the controller. The framework is used to design the hourly operation of Marina Reservoir, a 3.2 Mm3 storm-water-fed reservoir located in the center of Singapore, operated for drinking water supply and flood control. Because of its recent formation from a former estuary, the reservoir suffers from high salinity levels, whose behavior is modeled with Delft3D-FLOW. Results show that our control framework reduces the minimum salinity levels by nearly 40% and cuts the average annual deficit of drinking water supply by about 2 times the active storage of the reservoir (about 4% of the total annual demand).
Learning and inference using complex generative models in a spatial localization task.
Bejjanki, Vikranth R; Knill, David C; Aslin, Richard N
2016-01-01
A large body of research has established that, under relatively simple task conditions, human observers integrate uncertain sensory information with learned prior knowledge in an approximately Bayes-optimal manner. However, in many natural tasks, observers must perform this sensory-plus-prior integration when the underlying generative model of the environment consists of multiple causes. Here we ask if the Bayes-optimal integration seen with simple tasks also applies to such natural tasks when the generative model is more complex, or whether observers rely instead on a less efficient set of heuristics that approximate ideal performance. Participants localized a "hidden" target whose position on a touch screen was sampled from a location-contingent bimodal generative model with different variances around each mode. Over repeated exposure to this task, participants learned the a priori locations of the target (i.e., the bimodal generative model), and integrated this learned knowledge with uncertain sensory information on a trial-by-trial basis in a manner consistent with the predictions of Bayes-optimal behavior. In particular, participants rapidly learned the locations of the two modes of the generative model, but the relative variances of the modes were learned much more slowly. Taken together, our results suggest that human performance in a more complex localization task, which requires the integration of sensory information with learned knowledge of a bimodal generative model, is consistent with the predictions of Bayes-optimal behavior, but involves a much longer time-course than in simpler tasks.
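The Bayes-optimal computation described in this abstract can be sketched numerically: a Gaussian likelihood centred on the noisy sensory cue is multiplied by a two-component Gaussian-mixture prior (the learned bimodal generative model) on a grid, and the posterior mean is read off. The equal mixture weights, mode locations and variances below are illustrative, not the study's fitted values:

```python
import numpy as np

def gauss(x, mu, sigma):
    """Gaussian density evaluated on a grid."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def posterior_mean(cue, cue_sigma, modes, mode_sigmas, grid):
    """Posterior mean target location: Gaussian likelihood centred on
    the sensory cue times an equal-weight bimodal Gaussian prior."""
    prior = (0.5 * gauss(grid, modes[0], mode_sigmas[0])
             + 0.5 * gauss(grid, modes[1], mode_sigmas[1]))
    posterior = prior * gauss(grid, cue, cue_sigma)
    posterior /= posterior.sum()
    return float((grid * posterior).sum())
```

A cue falling near one mode yields an estimate pulled from the cue toward that mode, the signature of sensory-plus-prior integration the participants exhibited.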
Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark
2017-12-01
The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. Framework for Infectious Disease Analysis automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing from social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease predictions.
Predicting Pilot Behavior in Medium Scale Scenarios Using Game Theory and Reinforcement Learning
NASA Technical Reports Server (NTRS)
Yildiz, Yildiray; Agogino, Adrian; Brat, Guillaume
2013-01-01
Effective automation is critical in achieving the capacity and safety goals of the Next Generation Air Traffic System. Unfortunately, creating integration and validation tools for such automation is difficult, as the interactions between automation and its human counterparts are complex and unpredictable. This validation becomes even more difficult as we integrate wide-reaching technologies that affect the behavior of different decision makers in the system, such as pilots, controllers and airlines. While overt short-term behavior changes can be explicitly modeled with traditional agent modeling systems, subtle behavior changes caused by the integration of new technologies may snowball into larger problems and be very hard to detect. To overcome these obstacles, we show how integration of new technologies can be validated by learning behavior models based on goals. In this framework, human participants are not modeled explicitly. Instead, their goals are modeled, and through reinforcement learning their actions are predicted. The main advantage of this approach is that modeling is done within the context of the entire system, allowing for accurate modeling of all participants as they interact as a whole. In addition, such an approach allows for efficient trade studies and feasibility testing on a wide range of automation scenarios. The goal of this paper is to test that such an approach is feasible. To do this, we implement this approach using a simple discrete-state learning system on a scenario where 50 aircraft need to self-navigate using Automatic Dependent Surveillance-Broadcast (ADS-B) information. In this scenario, we show how the approach can be used to predict the ability of pilots to adequately balance aircraft separation and fly efficient paths. We present results with several levels of complexity and airspace congestion.
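As a rough sketch of the kind of simple discrete-state learner this paper employs, the following implements tabular Q-learning on a toy chain task. The task, reward structure and hyperparameters are illustrative stand-ins, not the paper's actual pilot model:

```python
import random

def q_learning(n_states=4, n_actions=2, episodes=2000,
               alpha=0.1, gamma=0.9, eps=0.3, seed=0):
    """Tabular Q-learning on a toy chain: action 1 moves right, action 0
    moves left, and choosing 'right' in the last state pays reward 1.
    Returns the learned state-action value table."""
    rng = random.Random(seed)
    q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = rng.randrange(n_states)              # random start state
        for _ in range(50):
            if rng.random() < eps:               # epsilon-greedy exploration
                a = rng.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda i: q[s][i])
            s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
            r = 1.0 if (s == n_states - 1 and a == 1) else 0.0
            # one-step temporal-difference update
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q
```

In the paper's framework the states would encode the air-traffic situation and the reward would encode pilot goals (separation vs. path efficiency), with actions emerging from the learned values rather than from an explicit human model.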
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.
Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate-summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
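The rate-summation idea underlying this derivation can be sketched simply: daily temperature-dependent development rates are accumulated until they reach 1, at which point the life stage completes, and a per-individual rate multiplier stands in for phenotypic rate variability. The linear rate function and 5 °C threshold below are illustrative, not the mountain pine beetle parameterization:

```python
def development_rate(temp_c):
    """Toy linear development rate above a 5 C threshold (illustrative;
    roughly a degree-day model with a 100 degree-day requirement)."""
    return max(0.0, (temp_c - 5.0) / 100.0)

def days_to_emergence(daily_temps, rate_multiplier=1.0):
    """Rate summation: accumulate daily rates until cumulative
    development reaches 1; the multiplier models rate variability
    between individuals."""
    cum = 0.0
    for day, t in enumerate(daily_temps, start=1):
        cum += rate_multiplier * development_rate(t)
        if cum >= 1.0:
            return day
    return None  # stage not completed within the temperature record
```

Averaging emergence over a distribution of multipliers is what the integral projection model does analytically, avoiding the per-individual simulation loop.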
USDA-ARS?s Scientific Manuscript database
Relevant data about subsurface water flow and solute transport at relatively large scales that are of interest to the public are inherently laborious and in most cases simply impossible to obtain. Upscaling in which fine-scale models and data are used to predict changes at the coarser scales is the...
NASA Astrophysics Data System (ADS)
Dutta, Tanima
This dissertation focuses on the link between seismic amplitudes and reservoir properties. Prediction of reservoir properties, such as sorting, sand/shale ratio, and cement-volume from seismic amplitudes improves by integrating knowledge from multiple disciplines. The key contribution of this dissertation is to improve the prediction of reservoir properties by integrating sequence stratigraphy and rock physics. Sequence stratigraphy has been successfully used for qualitative interpretation of seismic amplitudes to predict reservoir properties. Rock physics modeling allows quantitative interpretation of seismic amplitudes. However, often there is uncertainty about selecting geologically appropriate rock physics model and its input parameters, away from the wells. In the present dissertation, we exploit the predictive power of sequence stratigraphy to extract the spatial trends of sedimentological parameters that control seismic amplitudes. These spatial trends of sedimentological parameters can serve as valuable constraints in rock physics modeling, especially away from the wells. Consequently, rock physics modeling, integrated with the trends from sequence stratigraphy, become useful for interpreting observed seismic amplitudes away from the wells in terms of underlying sedimentological parameters. We illustrate this methodology using a comprehensive dataset from channelized turbidite systems, deposited in minibasin settings in the offshore Equatorial Guinea, West Africa. First, we present a practical recipe for using closed-form expressions of effective medium models to predict seismic velocities in unconsolidated sandstones. We use an effective medium model that combines perfectly rough and smooth grains (the extended Walton model), and use that model to derive coordination number, porosity, and pressure relations for P and S wave velocities from experimental data. 
Our recipe provides reasonable fits to other experimental and borehole data, and specifically improves the predictions of shear wave velocities. In addition, we provide empirical relations on normal compaction depth trends of porosity, velocities, and VP/VS ratio for shale and clean sands in shallow, supra-salt sediments in the Gulf of Mexico. Next, we identify probable spatial trends of sand/shale ratio and sorting as predicted by the conventional sequence stratigraphic model in minibasin settings (spill-and-fill model). These spatial trends are evaluated using well data from offshore West Africa, and the same well data are used to calibrate rock physics models (modified soft-sand model) that provide links between P-impedance and quartz/clay ratio, and sorting. The spatial increase in sand/shale ratio and sorting corresponds to an overall increase in P-impedance, and AVO intercept and gradient. The results are used as a guide to interpret sedimentological parameters from seismic attributes, away from the well locations. We present a quantitative link between carbonate cement and seismic attributes by combining stratigraphic cycles and the rock physics model (modified differential effective medium model). The variation in carbonate cement volume in West Africa can be linked with two distinct stratigraphic cycles: the coarsening-upward cycles and the fining-upward cycles. Cemented sandstones associated with these cycles exhibit distinct signatures on P-impedance vs. porosity and AVO intercept vs. gradient crossplots. These observations are important for assessing reservoir properties in West Africa as well as in other analogous depositional environments. Finally, we investigate the relationship between seismic velocities and time temperature index (TTI) using basin and petroleum system modeling at Rio Muni basin, West Africa. We find that both VP and VS increase exponentially with TTI.
The results can be applied to predict TTI, and thereby thermal maturity, from observed velocities.
Corron, Louise; Marchal, François; Condemi, Silvana; Telmon, Norbert; Chaumoitre, Kathia; Adalian, Pascal
2018-05-31
Subadult age estimation should rely on sampling and statistical protocols that capture developmental variability for more accurate age estimates. In this perspective, measurements were taken on the fifth lumbar vertebrae and/or clavicles of 534 French males and females aged 0-19 years and the ilia of 244 males and females aged 0-12 years. These variables were fitted in nonparametric multivariate adaptive regression splines (MARS) models with 95% prediction intervals (PIs) of age. The models were tested on two independent samples from Marseille and the Luis Lopes reference collection from Lisbon. Models using ilium width and module, maximum clavicle length, and lateral vertebral body heights were more than 92% accurate. Precision was lower for postpubertal individuals. Integrating punctual nonlinearities in the relationship between age and the variables, together with dynamic prediction intervals, incorporated the normal increase in interindividual growth variability with age (heteroscedasticity of variance) for more biologically accurate predictions. © 2018 American Academy of Forensic Sciences.
Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C.; Coletti, Janaine Z; Read, Jordan S.; Ibelings, Bas W; Valensini, Fiona J; Brookes, Justin D
2015-01-01
Maintaining the health of aquatic systems is an essential component of sustainable catchment management; however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts, aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework comprises a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.
The GP problem: quantifying gene-to-phenotype relationships.
Cooper, Mark; Chapman, Scott C; Podlich, Dean W; Hammer, Graeme L
2002-01-01
In this paper we refer to the gene-to-phenotype modeling challenge as the GP problem. Integrating information across levels of organization within a genotype-environment system is a major challenge in computational biology. However, resolving the GP problem is a fundamental requirement if we are to understand and predict phenotypes given knowledge of the genome and model dynamic properties of biological systems. Organisms are consequences of this integration, and it is a major property of biological systems that underlies the responses we observe. We discuss the E(NK) model as a framework for investigation of the GP problem and the prediction of system properties at different levels of organization. We apply this quantitative framework to an investigation of the processes involved in genetic improvement of plants for agriculture. In our analysis, N genes determine the genetic variation for a set of traits that are responsible for plant adaptation to E environment-types within a target population of environments. The N genes can interact in epistatic NK gene-networks through the way that they influence plant growth and development processes within a dynamic crop growth model. We use a sorghum crop growth model, available within the APSIM agricultural production systems simulation model, to integrate the gene-environment interactions that occur during growth and development and to predict genotype-to-phenotype relationships for a given E(NK) model. Directional selection is then applied to the population of genotypes, based on their predicted phenotypes, to simulate the dynamic aspects of genetic improvement by a plant-breeding program. The outcomes of the simulated breeding are evaluated across cycles of selection in terms of the changes in allele frequencies for the N genes and the genotypic and phenotypic values of the populations of genotypes.
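A minimal sketch of an NK-style epistatic network in the spirit of the E(NK) framework described above: each of N genes contributes to a trait through a lookup table indexed by its own allele and the alleles of K interacting genes. The circular neighbourhood and random contribution tables are illustrative assumptions, not the paper's sorghum parameterization:

```python
import itertools
import random

def make_nk_network(n_genes, k, seed=0):
    """Random NK structure: each gene interacts epistatically with K
    neighbours (circular layout here, for simplicity), and its
    contribution is a random function of those k+1 alleles."""
    rng = random.Random(seed)
    neighbours = [[(g + j + 1) % n_genes for j in range(k)]
                  for g in range(n_genes)]
    tables = [{alleles: rng.random()
               for alleles in itertools.product((0, 1), repeat=k + 1)}
              for _ in range(n_genes)]
    return neighbours, tables

def trait_value(genotype, neighbours, tables):
    """Mean contribution across the N genes for a 0/1 allele vector."""
    total = 0.0
    for g, table in enumerate(tables):
        key = (genotype[g],) + tuple(genotype[j] for j in neighbours[g])
        total += table[key]
    return total / len(tables)
```

Simulated selection then amounts to ranking genotypes by `trait_value` (or by a crop model driven by it) and propagating the best alleles, which is the loop the paper runs inside APSIM.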
UK Environmental Prediction - integration and evaluation at the convective scale
NASA Astrophysics Data System (ADS)
Lewis, Huw; Brunet, Gilbert; Harris, Chris; Best, Martin; Saulter, Andrew; Holt, Jason; Bricheno, Lucy; Brerton, Ashley; Reynard, Nick; Blyth, Eleanor; Martinez de la Torre, Alberto
2015-04-01
It has long been understood that accurate prediction and warning of the impacts of severe weather requires an integrated approach to forecasting. This was well demonstrated in the UK throughout winter 2013/14, when an exceptional run of severe winter storms, often with damaging high winds and intense rainfall, led to significant damage from the large waves and storm surge along coastlines, and from saturated soils, high river flows and significant flooding inland. The substantial impacts on individuals, businesses and infrastructure indicate a pressing need to better understand the value that might be delivered through more integrated environmental prediction. To address this need, the Met Office, Centre for Ecology & Hydrology and National Oceanography Centre have begun to develop the foundations of a coupled high-resolution probabilistic forecast system for the UK at km-scale. This links together existing model components of the atmosphere, coastal ocean, land surface and hydrology. Our initial focus, a 2-year Prototype project, will demonstrate the UK coupled prediction concept in research mode, including an analysis of the winter 2013/14 storms and their impacts. By linking science development to operational collaborations such as the UK Natural Hazards Partnership, we can ensure that science priorities are rooted in user requirements. This presentation will provide an overview of UK environmental prediction activities and an update on progress during the first year of the Prototype project. We will present initial results from the coupled model development and discuss the challenges of realising the potential of integrated regional coupled forecasting for improving predictions and applications.
Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model
NASA Astrophysics Data System (ADS)
Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna
2017-06-01
Evaluation of software quality is an important aspect of controlling and managing software. Through such evaluation, improvements in the software process can be made. Software quality is significantly dependent on software usability. Many researchers have proposed a number of usability models. Each model considers a set of usability factors but does not cover all usability aspects. Practical implementation of these models is still missing, as there is a lack of a precise definition of usability. It is also very difficult to integrate these models into current software engineering practices. To overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, which brings together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. To validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also focuses on a detailed comparison of the proposed model with existing usability models.
Adler, Philipp; Hugen, Thorsten; Wiewiora, Marzena; Kunz, Benno
2011-03-07
An unstructured model for an integrated fermentation/membrane extraction process for the production of the aroma compounds 2-phenylethanol and 2-phenylethylacetate by Kluyveromyces marxianus CBS 600 was developed. The extent to which this model, based only on data from the conventional fermentation and separation processes, provided an estimation of the integrated process was evaluated. The effect of product inhibition on specific growth rate and on biomass yield by both aroma compounds was approximated by multivariate regression. Simulations of the respective submodels for fermentation and the separation process matched well with experimental results. With respect to the in situ product removal (ISPR) process, the effect of reduced product inhibition due to product removal on specific growth rate and biomass yield was predicted adequately by the model simulations. Overall product yields were increased considerably in this process (4.0 g/L 2-PE+2-PEA vs. 1.4 g/L in conventional fermentation) and were even higher than predicted by the model. To describe the effect of product concentration on product formation itself, the model was extended using results from the conventional and the ISPR process, thus agreement between model and experimental data improved notably. Therefore, this model can be a useful tool for the development and optimization of an efficient integrated bioprocess. Copyright © 2010 Elsevier Inc. All rights reserved.
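The product-inhibition structure described above can be sketched as a small ODE model integrated with Euler steps: the specific growth rate declines with product concentration, and in situ product removal (ISPR) is approximated as first-order removal. The linear inhibition form and all parameter values are illustrative assumptions, not the paper's fitted model:

```python
def simulate_fermentation(mu_max=0.3, p_max=3.0, y_px=0.8,
                          x0=0.1, p0=0.0, dt=0.1, hours=48.0,
                          removal_rate=0.0):
    """Euler integration of product-inhibited growth:
    mu = mu_max * (1 - P / P_max), clamped at zero.
    ISPR is modelled as first-order product removal at removal_rate.
    Returns final biomass and (in-broth) product concentrations."""
    x, p = x0, p0
    for _ in range(int(hours / dt)):
        mu = mu_max * max(0.0, 1.0 - p / p_max)   # inhibited growth rate
        dx = mu * x                                # biomass growth
        dp = y_px * dx - removal_rate * p          # formation minus removal
        x += dx * dt
        p += dp * dt
    return x, p
```

Running the model with and without removal reproduces the qualitative result reported here: removing product relieves inhibition, so the ISPR run accumulates more biomass while holding a lower in-broth product concentration.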
NASA Astrophysics Data System (ADS)
Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.
2016-12-01
While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both in order to predict future pest problems under a changing climate as well as to develop effective integrated management plans. Here we present Tiko'n, Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. 
We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs determine their behaviour and allow users to evaluate different integrated management options.
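Tiko'n's actual API is not reproduced in the abstract, but the Bayesian calibration step it relies on can be sketched generically. Below is a minimal sketch using approximate Bayesian computation (rejection sampling) against a toy logistic pest-growth model; every name and constant here (simulate_pest, the carrying capacity, the tolerance) is an illustrative assumption, not part of the Tiko'n package.

```python
import random

def simulate_pest(r, n0=10.0, steps=8):
    """Toy logistic pest-population model (a stand-in for a trophic-web model)."""
    k = 100.0  # assumed carrying capacity
    n = n0
    traj = []
    for _ in range(steps):
        n = n + r * n * (1.0 - n / k)
        traj.append(n)
    return traj

def abc_calibrate(field_data, n_samples=20000, tol=15.0, seed=1):
    """Rejection-sampling approximation of the posterior of the growth rate r."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        r = rng.uniform(0.0, 1.0)  # uniform prior on r
        sim = simulate_pest(r)
        dist = sum((s - d) ** 2 for s, d in zip(sim, field_data)) ** 0.5
        if dist < tol:  # keep only parameters whose simulation fits the field data
            accepted.append(r)
    return accepted

# "Field" data generated with a known rate, so the posterior should recover it.
observed = simulate_pest(0.4)
posterior = abc_calibrate(observed)
post_mean = sum(posterior) / len(posterior)
```

The reuse of literature data that the abstract mentions corresponds, in this picture, to widening the pool of `field_data` series the distance is computed against.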
Improved prediction of biochemical recurrence after radical prostatectomy by genetic polymorphisms.
Morote, Juan; Del Amo, Jokin; Borque, Angel; Ars, Elisabet; Hernández, Carlos; Herranz, Felipe; Arruza, Antonio; Llarena, Roberto; Planas, Jacques; Viso, María J; Palou, Joan; Raventós, Carles X; Tejedor, Diego; Artieda, Marta; Simón, Laureano; Martínez, Antonio; Rioja, Luis A
2010-08-01
Single nucleotide polymorphisms are inherited genetic variations that can predispose or protect individuals against clinical events. We hypothesized that single nucleotide polymorphism profiling may improve the prediction of biochemical recurrence after radical prostatectomy. We performed a retrospective, multi-institutional study of 703 patients treated with radical prostatectomy for clinically localized prostate cancer who had at least 5 years of followup after surgery. All patients were genotyped for 83 prostate cancer related single nucleotide polymorphisms using a low density oligonucleotide microarray. Baseline clinicopathological variables and single nucleotide polymorphisms were analyzed to predict biochemical recurrence within 5 years using stepwise logistic regression. Discrimination was measured by ROC curve AUC, specificity, sensitivity, predictive values, net reclassification improvement and integrated discrimination index. The overall biochemical recurrence rate was 35%. The model with the best fit combined 8 covariates, including the 5 clinicopathological variables prostate specific antigen, Gleason score, pathological stage, lymph node involvement and margin status, and 3 single nucleotide polymorphisms at the KLK2, SULT1A1 and TLR4 genes. Model predictive power was defined by 80% positive predictive value, 74% negative predictive value and an AUC of 0.78. The model based on clinicopathological variables plus single nucleotide polymorphisms showed significant improvement over the model without single nucleotide polymorphisms, as indicated by 23.3% net reclassification improvement (p = 0.003), integrated discrimination index (p <0.001) and likelihood ratio test (p <0.001). Internal validation proved model robustness (bootstrap corrected AUC 0.78, range 0.74 to 0.82). The calibration plot showed close agreement between biochemical recurrence observed and predicted probabilities. 
Predicting biochemical recurrence after radical prostatectomy based on clinicopathological data can be significantly improved by including patient genetic information. Copyright (c) 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
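The abstract reports a 23.3% net reclassification improvement. The standard categorical NRI definition behind such a figure can be sketched as follows; this is a generic illustration with made-up risk categories, not the study's code.

```python
def net_reclassification_improvement(events, nonevents):
    """Categorical NRI: each element is (old_risk_category, new_risk_category),
    with categories ordered so that a higher value means higher predicted risk."""
    def up_down(pairs):
        up = sum(1 for old, new in pairs if new > old)
        down = sum(1 for old, new in pairs if new < old)
        return up / len(pairs), down / len(pairs)
    up_e, down_e = up_down(events)      # events should be reclassified upward
    up_n, down_n = up_down(nonevents)   # non-events should be reclassified downward
    return (up_e - down_e) + (down_n - up_n)

# Toy example: 4 recurrences, 4 non-recurrences, risk categories 0/1/2.
events = [(0, 1), (1, 2), (1, 1), (2, 2)]     # two reclassified upward
nonevents = [(2, 1), (1, 0), (1, 1), (0, 0)]  # two reclassified downward
nri = net_reclassification_improvement(events, nonevents)
```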
Forecasting influenza in Hong Kong with Google search queries and statistical model fusion.
Xu, Qinneng; Gel, Yulia R; Ramirez Ramirez, L Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung
2017-01-01
The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, to forecast new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, in our approach we fuse multiple offline and online data sources. Four individual models: generalized linear model (GLM), least absolute shrinkage and selection operator (LASSO), autoregressive integrated moving average (ARIMA), and deep learning (DL) with Feedforward Neural Networks (FNN) are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strengths of the individual forecasting models, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and covariates. DL with FNN appears to deliver the most competitive predictive performance among the four considered individual models. Combining all four models in a comprehensive BMA framework further improves predictive evaluation metrics such as root mean squared error (RMSE) and mean absolute percentage error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting locations of influenza peaks. The proposed approach can be viewed as a feasible alternative to forecast ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited.
The proposed methodology is easily tractable and computationally efficient.
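The BMA fusion step can be illustrated with a minimal sketch. It approximates posterior model weights from BIC values, treating all candidate models as equally complex so BIC reduces to a function of each model's error sum of squares; the study's actual BMA implementation is more involved, and the four error values below are invented for illustration.

```python
import math

def bma_weights(sse_per_model, n_obs):
    """Approximate BMA weights from per-model BIC (equal-complexity shortcut)."""
    bics = [n_obs * math.log(sse / n_obs) for sse in sse_per_model]
    best = min(bics)
    raw = [math.exp(-0.5 * (b - best)) for b in bics]  # relative model evidence
    total = sum(raw)
    return [r / total for r in raw]

def fuse(forecasts, weights):
    """Weighted combination of the individual model forecasts."""
    return sum(w * f for w, f in zip(weights, forecasts))

# Four stand-ins for GLM, LASSO, ARIMA, and the neural network.
sse = [40.0, 35.0, 50.0, 20.0]  # assumed in-sample squared errors over one year
w = bma_weights(sse, n_obs=52)
ili_forecast = fuse([110.0, 120.0, 100.0, 130.0], w)
```

The best-fitting model (here the fourth) dominates the weights, which matches the abstract's observation that the DL model remains preferred even within the fused framework.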
Predictive models of forest dynamics.
Purves, Drew; Pacala, Stephen
2008-06-13
Dynamic global vegetation models (DGVMs) have shown that forest dynamics could dramatically alter the response of the global climate system to increased atmospheric carbon dioxide over the next century. But there is little agreement between different DGVMs, making forest dynamics one of the greatest sources of uncertainty in predicting future climate. DGVM predictions could be strengthened by integrating the ecological realities of biodiversity and height-structured competition for light, facilitated by recent advances in the mathematics of forest modeling, ecological understanding of diverse forest communities, and the availability of forest inventory data.
NASA Technical Reports Server (NTRS)
Golubev, Vladimir; Mankbadi, Reda R.; Dahl, Milo D.; Kiraly, L. James (Technical Monitor)
2002-01-01
This paper provides preliminary results of the study of the acoustic radiation from the source model representing spatially-growing instability waves in a round jet at high speeds. The source model is briefly discussed first, followed by an analysis of the produced acoustic directivity pattern. Two integral surface techniques are discussed and compared for prediction of the jet acoustic radiation field.
Legehar, Ashenafi; Xhaard, Henri; Ghemtio, Leo
2016-01-01
The disposition of a pharmaceutical compound within an organism, i.e. its Absorption, Distribution, Metabolism, Excretion, Toxicity (ADMET) properties and adverse effects, critically affects late stage failure of drug candidates and has led to the withdrawal of approved drugs. Computational methods are effective approaches to reduce the number of safety issues by analyzing possible links between chemical structures and ADMET or adverse effects, but this is limited by the size, quality, and heterogeneity of the data available from individual sources. Thus, large, clean and integrated databases of approved drug data, associated with fast and efficient predictive tools are desirable early in the drug discovery process. We have built a relational database (IDAAPM) to integrate available approved drug data such as drug approval information, ADMET and adverse effects, chemical structures and molecular descriptors, targets, bioactivity and related references. The database has been coupled with a searchable web interface and modern data analytics platform (KNIME) to allow data access, data transformation, initial analysis and further predictive modeling. Data were extracted from FDA resources and supplemented from other publicly available databases. Currently, the database contains information regarding about 19,226 FDA approval applications for 31,815 products (small molecules and biologics) with their approval history, 2505 active ingredients, together with as many ADMET properties, 1629 molecular structures, 2.5 million adverse effects and 36,963 experimental drug-target bioactivity data. IDAAPM is a unique resource that, in a single relational database, provides detailed information on FDA approved drugs including their ADMET properties and adverse effects, the corresponding targets with bioactivity data, coupled with a data analytics platform. It can be used to perform basic to complex drug-target ADMET or adverse effects analysis and predictive modeling. 
IDAAPM is freely accessible at http://idaapm.helsinki.fi and can be exploited through a KNIME workflow connected to the database. Graphical abstract: FDA approved drug data integration for predictive modeling.
NASA Astrophysics Data System (ADS)
Wang, Y. P.; Lu, Z. P.; Sun, D. S.; Wang, N.
2016-01-01
In order to better express the characteristics of satellite clock bias (SCB) and improve SCB prediction precision, this paper proposes a new SCB prediction model that takes the physical characteristics of the space-borne atomic clock, the cyclic variation, and the random part of SCB into consideration. First, the new model employs a quadratic polynomial model with periodic terms to fit and extract the trend and cyclic terms of SCB; then, based on the characteristics of the fitting residuals, a time series ARIMA (Auto-Regressive Integrated Moving Average) model is used to model the residuals; finally, the results from the two models are combined to obtain the final SCB prediction values. Prediction tests using precise SCB data from the IGS (International GNSS Service) show that the proposed model is effective and has better prediction performance than the quadratic polynomial model, the grey model, and the ARIMA model. In addition, the new method also overcomes the insufficiency of the ARIMA model in model recognition and order determination.
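The two-stage idea can be sketched numerically: a least-squares fit of a quadratic trend plus one sinusoidal cycle, followed by a model of the fitting residuals. Here an AR(1) process stands in for the ARIMA residual model, and all function names, the 24-hour period, and the synthetic series are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fit_trend_periodic(t, y, period):
    """Least-squares fit of a quadratic trend plus one sinusoidal cycle."""
    w = 2.0 * np.pi / period
    X = np.column_stack([np.ones_like(t), t, t**2, np.sin(w * t), np.cos(w * t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, X @ coef

def ar1_forecast(resid, steps):
    """AR(1) stand-in for the ARIMA residual model."""
    phi = np.dot(resid[:-1], resid[1:]) / np.dot(resid[:-1], resid[:-1])
    out, last = [], resid[-1]
    for _ in range(steps):
        last = phi * last  # each step decays toward zero for |phi| < 1
        out.append(last)
    return np.array(out), phi

# Synthetic clock-bias series: quadratic drift plus a daily cycle plus noise.
t = np.arange(200.0)
true = 5.0 + 0.01 * t + 1e-4 * t**2 + 0.3 * np.sin(2 * np.pi * t / 24.0)
rng = np.random.default_rng(0)
y = true + 0.01 * rng.standard_normal(t.size)

coef, fitted = fit_trend_periodic(t, y, period=24.0)
resid = y - fitted
resid_fc, phi = ar1_forecast(resid, steps=12)
# Final prediction for a future epoch = trend/periodic extrapolation + resid_fc.
```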
An integrative approach to ortholog prediction for disease-focused and other functional studies.
Hu, Yanhui; Flockhart, Ian; Vinayagam, Arunachalam; Bergwitz, Clemens; Berger, Bonnie; Perrimon, Norbert; Mohr, Stephanie E
2011-08-31
Mapping of orthologous genes among species serves an important role in functional genomics by allowing researchers to develop hypotheses about gene function in one species based on what is known about the functions of orthologs in other species. Several tools for predicting orthologous gene relationships are available. However, these tools can give different results and identification of predicted orthologs is not always straightforward. We report a simple but effective tool, the Drosophila RNAi Screening Center Integrative Ortholog Prediction Tool (DIOPT; http://www.flyrnai.org/diopt), for rapid identification of orthologs. DIOPT integrates existing approaches, facilitating rapid identification of orthologs among human, mouse, zebrafish, C. elegans, Drosophila, and S. cerevisiae. As compared to individual tools, DIOPT shows increased sensitivity with only a modest decrease in specificity. Moreover, the flexibility built into the DIOPT graphical user interface allows researchers with different goals to appropriately 'cast a wide net' or limit results to highest confidence predictions. DIOPT also displays protein and domain alignments, including percent amino acid identity, for predicted ortholog pairs. This helps users identify the most appropriate matches among multiple possible orthologs. To facilitate using model organisms for functional analysis of human disease-associated genes, we used DIOPT to predict high-confidence orthologs of disease genes in Online Mendelian Inheritance in Man (OMIM) and genes in genome-wide association study (GWAS) data sets. The results are accessible through the DIOPT diseases and traits query tool (DIOPT-DIST; http://www.flyrnai.org/diopt-dist). DIOPT and DIOPT-DIST are useful resources for researchers working with model organisms, especially those who are interested in exploiting model organisms such as Drosophila to study the functions of human disease genes.
NASA Astrophysics Data System (ADS)
Niu, Mingfei; Wang, Yufang; Sun, Shaolong; Li, Yongwu
2016-06-01
To enhance prediction reliability and accuracy, a hybrid model based on the promising principle of "decomposition and ensemble" and a recently proposed meta-heuristic called grey wolf optimizer (GWO) is introduced for daily PM2.5 concentration forecasting. Compared with existing PM2.5 forecasting methods, this proposed model has improved the prediction accuracy and hit rates of directional prediction. The proposed model involves three main steps, i.e., decomposing the original PM2.5 series into several intrinsic mode functions (IMFs) via complementary ensemble empirical mode decomposition (CEEMD) for simplifying the complex data; individually predicting each IMF with support vector regression (SVR) optimized by GWO; integrating all predicted IMFs for the ensemble result as the final prediction by another SVR optimized by GWO. Seven benchmark models, including single artificial intelligence (AI) models, other decomposition-ensemble models with different decomposition methods and models with the same decomposition-ensemble method but optimized by different algorithms, are considered to verify the superiority of the proposed hybrid model. The empirical study indicates that the proposed hybrid decomposition-ensemble model is remarkably superior to all considered benchmark models for its higher prediction accuracy and hit rates of directional prediction.
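CEEMD and GWO-tuned SVR are beyond a short sketch, but the decomposition-ensemble structure itself (decompose, predict each component, recombine) can be illustrated with stand-ins: a moving average takes the place of CEEMD, and a naive last-value predictor takes the place of the optimized SVR. Only the pipeline shape mirrors the paper; every function and number below is an assumption.

```python
def moving_average(series, window):
    """Trailing moving average, a stand-in for CEEMD's smooth-component extraction."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        chunk = series[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def decompose(series, window=5):
    """Split the PM2.5 series into a smooth component and a residual component."""
    trend = moving_average(series, window)
    resid = [s - t for s, t in zip(series, trend)]
    return trend, resid

def predict_component(component):
    """Naive per-component predictor (the paper uses GWO-optimized SVR here)."""
    return component[-1]

pm25 = [35.0, 40.0, 38.0, 50.0, 55.0, 53.0, 60.0, 58.0]
trend, resid = decompose(pm25)
# Ensemble step: recombine the component-level predictions into one forecast.
forecast = predict_component(trend) + predict_component(resid)
```

With the naive predictor the forecast collapses to the last observation; real per-component models (SVR, here) are what make the decomposition pay off, since each simplified component is easier to learn than the raw series.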
Winchell, Michael F; Peranginangin, Natalia; Srinivasan, Raghavan; Chen, Wenlin
2018-05-01
Recent national regulatory assessments of potential pesticide exposure of threatened and endangered species in aquatic habitats have led to increased need for watershed-scale predictions of pesticide concentrations in flowing water bodies. This study was conducted to assess the ability of the uncalibrated Soil and Water Assessment Tool (SWAT) to predict annual maximum pesticide concentrations in the flowing water bodies of highly vulnerable small- to medium-sized watersheds. The SWAT was applied to 27 watersheds, largely within the midwest corn belt of the United States, ranging from 20 to 386 km², and evaluated using consistent input data sets and an uncalibrated parameterization approach. The watersheds were selected from the Atrazine Ecological Exposure Monitoring Program and the Heidelberg Tributary Loading Program, both of which contain high temporal resolution atrazine sampling data from watersheds with exceptionally high vulnerability to atrazine exposure. The model performance was assessed based upon predictions of annual maximum atrazine concentrations in 1-d and 60-d durations, predictions critical in pesticide-threatened and endangered species risk assessments when evaluating potential acute and chronic exposure to aquatic organisms. The simulation results showed that for nearly half of the watersheds simulated, the uncalibrated SWAT model was able to predict annual maximum pesticide concentrations within a narrow range of uncertainty resulting from atrazine application timing patterns. An uncalibrated model's predictive performance is essential for the assessment of pesticide exposure in flowing water bodies, the majority of which have insufficient monitoring data for direct calibration, even in data-rich countries.
In situations in which SWAT over- or underpredicted the annual maximum concentrations, the magnitude of the over- or underprediction was commonly less than a factor of 2, indicating that the model and uncalibrated parameterization approach provide a capable method for predicting the aquatic exposure required to support pesticide regulatory decision making. Integr Environ Assess Manag 2018;14:358-368. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
Interoceptive predictions in the brain
Barrett, Lisa Feldman; Simmons, W. Kyle
2016-01-01
Intuition suggests that perception follows sensation and therefore bodily feelings originate in the body. However, recent evidence goes against this logic: interoceptive experience may largely reflect limbic predictions about the expected state of the body that are constrained by ascending visceral sensations. In this Opinion article, we introduce the Embodied Predictive Interoception Coding model, which integrates an anatomical model of corticocortical connections with Bayesian active inference principles, to propose that agranular visceromotor cortices contribute to interoception by issuing interoceptive predictions. We then discuss how disruptions in interoceptive predictions could function as a common vulnerability for mental and physical illness. PMID:26016744
Model-based influences on humans’ choices and striatal prediction errors
Daw, Nathaniel D.; Gershman, Samuel J.; Seymour, Ben; Dayan, Peter; Dolan, Raymond J.
2011-01-01
The mesostriatal dopamine system is prominently implicated in model-free reinforcement learning, with fMRI BOLD signals in ventral striatum notably covarying with model-free prediction errors. However, latent learning and devaluation studies show that behavior also shows hallmarks of model-based planning, and the interaction between model-based and model-free values, prediction errors and preferences is underexplored. We designed a multistep decision task in which model-based and model-free influences on human choice behavior could be distinguished. By showing that choices reflected both influences we could then test the purity of the ventral striatal BOLD signal as a model-free report. Contrary to expectations, the signal reflected both model-free and model-based predictions in proportions matching those that best explained choice behavior. These results challenge the notion of a separate model-free learner and suggest a more integrated computational architecture for high-level human decision-making. PMID:21435563
Validation of Fatigue Modeling Predictions in Aviation Operations
NASA Technical Reports Server (NTRS)
Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin
2017-01-01
Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet most models have not been rigorously evaluated and independently validated for the operations to which they are being applied, and many users are not fully aware of the limitations within which model results should be interpreted and applied.
Semi-empirical model for prediction of unsteady forces on an airfoil with application to flutter
NASA Technical Reports Server (NTRS)
Mahajan, Aparajit J.; Kaza, Krishna Rao V.
1992-01-01
A semi-empirical model is described for predicting unsteady aerodynamic forces on arbitrary airfoils under mildly stalled and unstalled conditions. Aerodynamic forces are modeled using second order ordinary differential equations for lift and moment with airfoil motion as the input. This model is simultaneously integrated with structural dynamics equations to determine flutter characteristics for a two degrees-of-freedom system. Results for a number of cases are presented to demonstrate the suitability of this model to predict flutter. Comparison is made to the flutter characteristics determined by a Navier-Stokes solver and also the classical incompressible potential flow theory.
Semi-empirical model for prediction of unsteady forces on an airfoil with application to flutter
NASA Technical Reports Server (NTRS)
Mahajan, A. J.; Kaza, K. R. V.; Dowell, E. H.
1993-01-01
A semi-empirical model is described for predicting unsteady aerodynamic forces on arbitrary airfoils under mildly stalled and unstalled conditions. Aerodynamic forces are modeled using second order ordinary differential equations for lift and moment with airfoil motion as the input. This model is simultaneously integrated with structural dynamics equations to determine flutter characteristics for a two degrees-of-freedom system. Results for a number of cases are presented to demonstrate the suitability of this model to predict flutter. Comparison is made to the flutter characteristics determined by a Navier-Stokes solver and also the classical incompressible potential flow theory.
O'Brien, Nicola; Philpott-Morgan, Siôn; Dixon, Diane
2016-02-01
First, this study compares the ability of an integrated model of activity and activity limitations, the International Classification of Functioning, Disability and Health (ICF), and the Theory of Planned Behaviour (TPB) to predict walking within individuals with osteoarthritis. Second, the effectiveness of a walking intervention in these individuals is determined. A series of n-of-1 studies with an AB intervention design was used. Diary methods were used to study four community-dwelling individuals with lower-limb osteoarthritis. Data on impairment symptoms (pain, pain on movement, and joint stiffness), cognitions (intention, self-efficacy, and perceived controllability), and walking (step count) were collected twice daily for 12 weeks. At 6 weeks, an individually tailored, data-driven walking intervention using action planning or a control cognition manipulation was delivered. Simulation modelling analysis examined cross-correlations and differences in baseline and intervention phase means. Post-hoc mediation analyses examined theoretical relationships and multiple regression analyses compared theoretical models. Cognitions, intention in particular, were better and more consistent within individual predictors of walking than impairment. The walking intervention did not increase walking in any of the three participants receiving it. The integrated model and the TPB, which recognize a predictive role for cognitions, were significant predictors of walking variance in all participants, whilst the biomedical ICF model was only predictive for one participant. Despite the lack of evidence for an individually tailored walking intervention, predictive data suggest that interventions for people with osteoarthritis that address cognitions are likely to be more effective than those that address impairment only. Further within-individual investigation, including testing mediational relationships, is warranted. What is already known on this subject? 
N-of-1 methods have been used to study within-individual predictors of walking in healthy and chronic pain populations. An integrated biomedical and behavioural model of activity and activity limitations recognizes the roles of impairment and psychology (cognitions). Interventions modifying cognitions can increase physical activity in people with mobility limitations. What does this study add? N-of-1 methods are suitable to study within-individual predictors of walking and interventions in osteoarthritis. An integrated and a psychological model are better predictors of walking in osteoarthritis than a biomedical model. There was no support for an individually tailored, data-driven walking intervention. © 2015 The British Psychological Society.
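The lagged relationships that simulation modelling analysis examines can be sketched as cross-correlations between a cognition series (here, intention) and subsequent step counts. The diary data below are invented, and the functions are a generic illustration, not the study's analysis software.

```python
def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def cross_correlations(cognition, steps, max_lag=3):
    """Correlate cognition at time t with steps at time t+lag, for each lag."""
    out = {}
    for lag in range(max_lag + 1):
        x = cognition[:len(cognition) - lag] if lag else cognition
        y = steps[lag:]
        out[lag] = pearson(x, y)
    return out

# Toy diary data: step counts echo intention one observation later.
intention = [3, 5, 4, 6, 7, 5, 6, 8, 7, 9, 8, 10]
steps = [4000] + [1000 * v for v in intention[:-1]]
ccs = cross_correlations(intention, steps)
best_lag = max(ccs, key=lambda k: ccs[k])
```

A peak at a positive lag, as constructed here, is the pattern consistent with cognitions predicting later walking rather than merely reflecting it.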
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2011-01-01
As automation and advanced technologies are introduced into transport systems, ranging from the Next Generation Air Transportation System (NextGen), to advanced surface transportation systems as exemplified by Intelligent Transportation Systems, to future systems designed for space exploration, there is an increased need to validly predict how the future systems will be vulnerable to error given the demands imposed by the assistive technologies. One formalized approach to studying the impact of assistive technologies on the human operator in a safe and non-obtrusive manner is through the use of human performance models (HPMs). HPMs play an integral role when complex human-system designs are proposed, developed, and tested. One HPM tool, the Man-machine Integration Design and Analysis System (MIDAS), is a NASA Ames Research Center HPM software tool that has been applied to predict human-system performance in various domains since 1986. MIDAS is a dynamic, integrated HPM and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments. The paper will discuss a range of aviation-specific applications, including an approach used to model human error for NASA's Aviation Safety Program, and what-if analyses to evaluate flight deck technologies for NextGen operations. This chapter will culminate by raising two challenges for the field of predictive HPMs for complex human-system designs that evaluate assistive technologies: those of (1) model transparency and (2) model validation.
Incorporating learning goals about modeling into an upper-division physics laboratory experiment
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin M.; Finkelstein, Noah; Lewandowski, H. J.
2014-09-01
Implementing a laboratory activity involves a complex interplay among learning goals, available resources, feedback about the existing course, best practices for teaching, and an overall philosophy about teaching labs. Building on our previous work, which described a process of transforming an entire lab course, we now turn our attention to how an individual lab activity on the polarization of light was redesigned to include a renewed emphasis on one broad learning goal: modeling. By using this common optics lab as a concrete case study of a broadly applicable approach, we highlight many aspects of the activity development and show how modeling is used to integrate sophisticated conceptual and quantitative reasoning into the experimental process through the various aspects of modeling: constructing models, making predictions, interpreting data, comparing measurements with predictions, and refining models. One significant outcome is a natural way to integrate an analysis and discussion of systematic error into a lab activity.
Azadi, Sama; Karimi-Jashni, Ayoub
2016-02-01
Predicting the mass of solid waste generation plays an important role in integrated solid waste management plans. In this study, the performance of two predictive models, Artificial Neural Network (ANN) and Multiple Linear Regression (MLR), was evaluated for predicting the mean Seasonal Municipal Solid Waste Generation (SMSWG) rate. The accuracy of the proposed models is illustrated through a case study of 20 cities located in Fars Province, Iran. Four performance measures (MAE, MAPE, RMSE and R) were used to evaluate the performance of these models. The MLR, as a conventional model, showed poor prediction performance. On the other hand, the results indicated that the ANN model, as a non-linear model, has a higher predictive accuracy when it comes to prediction of the mean SMSWG rate. As a result, in order to develop a more cost-effective strategy for waste management in the future, the ANN model could be used to predict the mean SMSWG rate. Copyright © 2015 Elsevier Ltd. All rights reserved.
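The four performance measures can be computed as follows; this is a generic sketch in which R is taken as the Pearson correlation coefficient, and the waste-generation numbers are invented.

```python
def evaluate(pred, obs):
    """MAE, MAPE (%), RMSE, and Pearson R between predictions and observations."""
    n = len(pred)
    mae = sum(abs(p - o) for p, o in zip(pred, obs)) / n
    mape = 100.0 * sum(abs(p - o) / abs(o) for p, o in zip(pred, obs)) / n
    rmse = (sum((p - o) ** 2 for p, o in zip(pred, obs)) / n) ** 0.5
    mp, mo = sum(pred) / n, sum(obs) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(pred, obs))
    sp = sum((p - mp) ** 2 for p in pred) ** 0.5
    so = sum((o - mo) ** 2 for o in obs) ** 0.5
    r = cov / (sp * so)
    return {"MAE": mae, "MAPE": mape, "RMSE": rmse, "R": r}

# Toy seasonal waste-generation rates (tonnes/day) for one city.
observed = [120.0, 150.0, 90.0, 110.0]
predicted = [115.0, 160.0, 95.0, 100.0]
metrics = evaluate(predicted, observed)
```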
Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka
2017-04-09
Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 is achieved over half hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of Semantic Sensor Web.
Adeleke, Jude Adekunle; Moodley, Deshendran; Rens, Gavin; Adewumi, Aderemi Oluyinka
2017-01-01
Proactive monitoring and control of our natural and built environments is important in various application scenarios. Semantic Sensor Web technologies have been well researched and used for environmental monitoring applications to expose sensor data for analysis in order to provide responsive actions in situations of interest. While these applications provide quick response to situations, to minimize their unwanted effects, research efforts are still necessary to provide techniques that can anticipate the future to support proactive control, such that unwanted situations can be averted altogether. This study integrates a statistical machine learning based predictive model in a Semantic Sensor Web using stream reasoning. The approach is evaluated in an indoor air quality monitoring case study. A sliding window approach that employs the Multilayer Perceptron model to predict short term PM2.5 pollution situations is integrated into the proactive monitoring and control framework. Results show that the proposed approach can effectively predict short term PM2.5 pollution situations: precision of up to 0.86 and sensitivity of up to 0.85 is achieved over half hour prediction horizons, making it possible for the system to warn occupants or even to autonomously avert the predicted pollution situations within the context of Semantic Sensor Web. PMID:28397776
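The sliding-window setup can be sketched as follows, with a simple trend-extrapolation rule standing in for the Multilayer Perceptron and precision/sensitivity computed against a pollution threshold. All numbers are illustrative, and the 35 µg/m³ cutoff is an assumption, not the study's configuration.

```python
def sliding_windows(series, width):
    """Build (window, next_value) pairs from a PM2.5 measurement stream."""
    return [(series[i:i + width], series[i + width])
            for i in range(len(series) - width)]

def predict_pollution(window, threshold=35.0):
    """Stand-in for the Multilayer Perceptron: flag a pollution situation when
    a linear extrapolation of the window's trend exceeds the threshold."""
    trend = window[-1] + (window[-1] - window[0]) / (len(window) - 1)
    return trend > threshold

def precision_sensitivity(pairs, threshold=35.0):
    """Score the predictor the way the abstract reports its results."""
    tp = fp = fn = 0
    for window, actual in pairs:
        pred = predict_pollution(window, threshold)
        real = actual > threshold
        if pred and real:
            tp += 1
        elif pred:
            fp += 1
        elif real:
            fn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    return precision, sensitivity

pm25 = [20, 22, 25, 30, 36, 40, 38, 33, 28, 24, 30, 37, 41, 39]
pairs = sliding_windows(pm25, width=4)
p, s = precision_sensitivity(pairs)
```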
Wave Energy Potential in the Eastern Mediterranean Levantine Basin. An Integrated 10-year Study
2014-01-01
Analysis of bacterial migration. 2: Studies with multiple attractant gradients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strauss, I.; Frymier, P.D.; Hahn, C.M.
1995-02-01
Many motile bacteria exhibit chemotaxis, the ability to bias their random motion toward or away from increasing concentrations of chemical substances which benefit or inhibit their survival, respectively. Since bacteria encounter numerous chemical concentration gradients simultaneously in natural surroundings, it is necessary to know quantitatively how a bacterial population responds in the presence of more than one chemical stimulus in order to develop predictive mathematical models describing bacterial migration in natural systems. This work evaluates three hypothetical models describing the integration of chemical signals from multiple stimuli: high sensitivity, maximum signal, and simple additivity. An expression for the tumbling probability for individual stimuli is modified according to the proposed models and incorporated into the cell balance equation for a 1-D attractant gradient. Random motility and chemotactic sensitivity coefficients, required input parameters for the model, are measured for single-stimulus responses. Theoretical predictions with the three signal integration models are compared to the net chemotactic response of Escherichia coli to co- and antidirectional gradients of D-fucose and α-methylaspartate in the stopped-flow diffusion chamber assay. Results eliminate the high-sensitivity model and favor simple additivity over the maximum signal. None of the simple models, however, accurately predicts the observed behavior, suggesting that a more complex model with more steps in the signal processing mechanism is required to predict responses to multiple stimuli.
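The competing signal-integration hypotheses can be illustrated with a toy sketch. The actual tumbling-probability expression is in the paper; the functional forms and numbers below are invented stand-ins meant only to show how "additive" and "maximum signal" rules diverge for antidirectional gradients.

```python
def integrate_signals(signals, rule):
    """Combine per-attractant chemotactic signals into one net signal.

    'additive' - simple additivity of the individual signals
    'maximum'  - the strongest single signal dominates
    """
    if rule == "additive":
        return sum(signals)
    if rule == "maximum":
        return max(signals, key=abs)
    raise ValueError(rule)

def tumble_probability(baseline, net_signal):
    """A favorable net signal suppresses tumbling (clamped to [0, 1])."""
    return min(1.0, max(0.0, baseline * (1.0 - net_signal)))

# Codirectional gradients of two attractants (positive = up-gradient swimming).
codirectional = [0.3, 0.2]
# Antidirectional gradients: the second attractant pulls the other way.
antidirectional = [0.3, -0.2]

p_add = tumble_probability(0.5, integrate_signals(codirectional, "additive"))
p_max = tumble_probability(0.5, integrate_signals(antidirectional, "maximum"))
```

Under additivity the opposing gradient nearly cancels the net signal, while under the maximum-signal rule it is ignored entirely; comparing measured responses against exactly this kind of divergence is what lets the assay discriminate between the hypotheses.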
Hsu, David
2015-09-27
Clustering methods are often used to model energy consumption for two reasons. First, clustering is often used to process data and to improve the predictive accuracy of subsequent energy models. Second, stable clusters that are reproducible with respect to non-essential changes can be used to group, target, and interpret observed subjects. However, it is well known that clustering methods are highly sensitive to the choice of algorithms and variables. This can lead to misleading assessments of predictive accuracy and misinterpretation of clusters in policymaking. This paper therefore introduces two methods to the modeling of energy consumption in buildings: clusterwise regression, also known as latent class regression, which integrates clustering and regression simultaneously; and cluster validation methods to measure stability. Using a large dataset of multifamily buildings in New York City, clusterwise regression is compared to common two-stage algorithms that use K-means and model-based clustering with linear regression. Predictive accuracy is evaluated using 20-fold cross validation, and the stability of the perturbed clusters is measured using the Jaccard coefficient. These results show that there seems to be an inherent tradeoff between prediction accuracy and cluster stability. This paper concludes by discussing which clustering methods may be appropriate for different analytical purposes.
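The Jaccard-coefficient stability check described above can be sketched as follows. Matching each original cluster to its best counterpart in the perturbed clustering follows a common stability heuristic; that matching rule is our assumption, not a detail given in the abstract:

```python
def jaccard(a, b):
    # Jaccard coefficient between two clusters given as collections of member ids.
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def cluster_stability(original, perturbed):
    # For each original cluster, the Jaccard similarity to its best-matching
    # cluster found on the perturbed data; values near 1 indicate stability.
    return [max(jaccard(c, p) for p in perturbed) for c in original]

orig = [{1, 2, 3, 4}, {5, 6, 7}]       # clusters on the full data
pert = [{1, 2, 3}, {4, 5, 6, 7}]       # clusters after perturbation
print(cluster_stability(orig, pert))   # [0.75, 0.75]
```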
NASA Astrophysics Data System (ADS)
Kariniotakis, G.; Anemos Team
2003-04-01
Objectives: Accurate forecasting of wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. In a liberalized electricity market especially, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project, supported by the European Commission, that assembles research organizations and end-users with extensive experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations such as complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models emphasizes techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics, or high-resolution meteorological information. Statistical models (e.g., based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy.
Appropriate physical and statistical prediction models are also developed for offshore wind farms, taking into account advances in marine meteorology (interaction between wind and waves, coastal effects). The benefits of using satellite radar images for modeling local weather patterns are investigated. A next-generation forecasting software platform, ANEMOS, will be developed to integrate the various models. The tool is enhanced by advanced Information and Communication Technology (ICT) functionality and can operate in stand-alone or remote mode, or be interfaced with standard Energy or Distribution Management Systems (EMS/DMS). Contribution: The project provides an advanced technology for wind resource forecasting applicable at large scale: at single wind farm, regional, or national level, and for both interconnected and island systems. A major milestone is the online operation of the developed software by the participating utilities for onshore and offshore wind farms and the demonstration of the economic benefits. The outcome of the ANEMOS project will support increased wind integration on two levels: operationally, through better management of wind farms, and strategically, by contributing to growth in installed wind farm capacity. This is because accurate prediction of the resource reduces the risk to wind farm developers, who are then more willing to undertake new wind farm installations, especially in a liberalized electricity market environment.
Li, Jian; Wu, Huan-Yu; Li, Yan-Ting; Jin, Hui-Ming; Gu, Bao-Ke; Yuan, Zheng-An
2010-01-01
To explore the feasibility of establishing and applying an autoregressive integrated moving average (ARIMA) model to predict the incidence rate of dysentery in Shanghai, so as to provide a theoretical basis for prevention and control of dysentery. An ARIMA model was established based on the monthly incidence rate of dysentery in Shanghai from 1990 to 2007. The parameters of the model were estimated through the unconditional least squares method, the structure was determined according to criteria of residual un-correlation and conclusion, and the model goodness-of-fit was assessed through the Akaike information criterion (AIC) and Schwarz Bayesian criterion (SBC). The constructed optimal model was applied to predict the incidence rate of dysentery in Shanghai in 2008, and the validity of the model was evaluated by comparing the predicted incidence rate with the actual one. The incidence rate in 2010 was then predicted by the ARIMA model based on the incidence rate from January 1990 to June 2009. The model ARIMA(1, 1, 1)(0, 1, 2)_12 fit the incidence rate well, with the autoregressive coefficient (AR1 = 0.443), moving-average coefficient (MA1 = 0.806), and seasonal moving-average coefficients (SMA1 = 0.543, SMA2 = 0.321) all statistically significant (P < 0.01). AIC and SBC were 2.878 and 16.131 respectively, and the prediction error was white noise. The model equation was (1 - 0.443B)(1 - B)(1 - B^12)Z_t = (1 - 0.806B)(1 - 0.543B^12)(1 - 0.321B^24)mu_t. The predicted incidence rate in 2008 was consistent with the actual one, with a relative error of 6.78%. The predicted incidence rate of dysentery in 2010, based on the incidence rate from January 1990 to June 2009, was 9.390 per 100 thousand. The ARIMA model can be used to fit changes in the incidence rate of dysentery and to forecast the future incidence rate in Shanghai. It is a prediction model of high precision for short-term forecasting.
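The multiplicative backshift polynomials in a seasonal ARIMA model can be expanded numerically by convolving their coefficient vectors. A minimal sketch of the autoregressive/differencing side, using the coefficients quoted above (the moving-average side expands the same way):

```python
import numpy as np

# Expand (1 - 0.443B)(1 - B)(1 - B^12) by multiplying the backshift
# polynomials; polynomial multiplication is convolution of the
# coefficient vectors (index k holds the coefficient of B^k).
ar = np.array([1.0, -0.443])        # (1 - 0.443B)
diff = np.array([1.0, -1.0])        # (1 - B), regular differencing
sdiff = np.zeros(13)
sdiff[0], sdiff[12] = 1.0, -1.0     # (1 - B^12), seasonal differencing

lhs = np.convolve(np.convolve(ar, diff), sdiff)
# Leading terms: 1 - 1.443B + 0.443B^2 - B^12 + 1.443B^13 - 0.443B^14
print(lhs[:3])
```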
Hertäg, Loreen; Hass, Joachim; Golovko, Tatiana; Durstewitz, Daniel
2012-01-01
For large-scale network simulations, it is often desirable to have computationally tractable, yet in a defined sense still physiologically valid neuron models. In particular, these models should be able to reproduce physiological measurements, ideally in a predictive sense, and under different input regimes in which neurons may operate in vivo. Here we present an approach to parameter estimation for a simple spiking neuron model mainly based on standard f-I curves obtained from in vitro recordings. Such recordings are routinely obtained in standard protocols and assess a neuron's response under a wide range of mean-input currents. Our fitting procedure makes use of closed-form expressions for the firing rate derived from an approximation to the adaptive exponential integrate-and-fire (AdEx) model. The resulting fitting process is simple and about two orders of magnitude faster compared to methods based on numerical integration of the differential equations. We probe this method on different cell types recorded from rodent prefrontal cortex. After fitting to the f-I current-clamp data, the model cells are tested on completely different sets of recordings obtained by fluctuating ("in vivo-like") input currents. For a wide range of different input regimes, cell types, and cortical layers, the model could predict spike times on these test traces quite accurately within the bounds of physiological reliability, although no information from these distinct test sets was used for model fitting. Further analyses delineated some of the empirical factors constraining model fitting and the model's generalization performance. An even simpler adaptive LIF neuron was also examined in this context. Hence, we have developed a "high-throughput" model fitting procedure which is simple and fast, with good prediction performance, and which relies only on firing rate information and standard physiological data widely and easily available.
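The closed-form fitting idea can be illustrated with a much simpler stand-in: a rectified-linear f-I curve f(I) = gain · max(I − I_th, 0), fitted by scanning thresholds and solving for the gain in closed form. This toy form is an assumption for illustration, not the closed-form AdEx rate expression used in the paper:

```python
import numpy as np

# Fit f(I) = gain * max(I - I_th, 0) to a measured f-I curve: for each
# candidate threshold the optimal gain has a closed form (one-variable
# least squares), so no numerical integration of model ODEs is needed.

def fit_fi(I, f, thresholds):
    best = None
    for th in thresholds:
        x = np.maximum(I - th, 0.0)
        denom = np.dot(x, x)
        gain = np.dot(x, f) / denom if denom > 0 else 0.0
        err = np.sum((f - gain * x) ** 2)
        if best is None or err < best[0]:
            best = (err, th, gain)
    return best[1], best[2]

I = np.linspace(0.0, 1.0, 21)              # injected currents (a.u.)
f_obs = 40.0 * np.maximum(I - 0.3, 0.0)    # synthetic cell: I_th = 0.3, gain = 40
th, gain = fit_fi(I, f_obs, np.arange(0.0, 0.6, 0.05))
print(th, gain)   # recovers threshold ≈ 0.3 and gain ≈ 40
```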
Neale, Patrick J; Thomas, Brian C
2017-01-01
Phytoplankton photosynthesis is often inhibited by ultraviolet (UV) and intense photosynthetically available radiation (PAR), but the effects on ocean productivity have received little consideration aside from polar areas subject to periodic enhanced UV-B due to depletion of stratospheric ozone. A more comprehensive assessment is important for understanding the contribution of phytoplankton production to the global carbon budget, present and future. Here, we consider responses in the temperate and tropical mid-ocean regions typically dominated by picophytoplankton including the prokaryotic lineages, Prochlorococcus and Synechococcus. Spectral models of photosynthetic response for each lineage were constructed using model strains cultured at different growth irradiances and temperatures. In the model, inhibition becomes more severe once exposure exceeds a threshold (E_max) related to repair capacity. Model parameters are presented for Prochlorococcus, adding to those previously presented for Synechococcus. The models were applied to estimate midday, water column photosynthesis based on an atmospheric model of spectral radiation, satellite-derived spectral water transparency and temperature. Based on a global survey of inhibitory exposure severity, a full-latitude section of the mid-Pacific and near-equatorial region of the east Pacific were identified as representative regions for prediction of responses over the entire water column. Comparing predictions integrated over the water column including versus excluding inhibition, production was 7-28% lower due to inhibition depending on strain and site conditions. Inhibition was consistently greater for Prochlorococcus compared to two strains of Synechococcus. Considering only the surface mixed layer, production was inhibited 7-73%. On average, including inhibition lowered estimates of midday productivity around 20% for the modeled region of the Pacific with UV accounting for two-thirds of the reduction. 
In contrast, most other productivity models either ignore inhibition or include only PAR inhibition. Incorporation of E_max model responses into an existing spectral model of depth-integrated, daily production will enable efficient global predictions of picophytoplankton productivity including inhibition. © 2016 John Wiley & Sons Ltd.
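An E_max-style threshold response can be sketched as below; the specific functional form (no inhibition until exposure exceeds E_max, then decline proportional to the excess) is a simplification for illustration, not the paper's fitted kinetics:

```python
# Illustrative E_max-style response: photosynthesis is uninhibited while
# biologically weighted exposure stays below a repair-related threshold
# E_max, then declines in proportion to the excess.

def relative_photosynthesis(exposure, e_max):
    if exposure <= e_max:
        return 1.0              # repair keeps pace with damage
    return e_max / exposure     # inhibition once repair capacity is exceeded

print(relative_photosynthesis(0.5, 1.0))   # 1.0 (below threshold)
print(relative_photosynthesis(2.0, 1.0))   # 0.5 (twice the threshold)
```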
How health leaders can benefit from predictive analytics.
Giga, Aliyah
2017-11-01
Predictive analytics can support a better-integrated health system providing continuous, coordinated, and comprehensive person-centred care to those who could benefit most. In addition to dollars saved, using a predictive model in healthcare can generate opportunities for meaningful improvements in efficiency, productivity, and costs, as well as better population health through targeted interventions for patients at risk.
Comparison of across-frequency integration strategies in a binaural detection model.
Breebaart, Jeroen
2013-11-01
Breebaart et al. [J. Acoust. Soc. Am. 110, 1089-1104 (2001)] reported that the masker bandwidth dependence of detection thresholds for an out-of-phase signal and an in-phase noise masker (N0Sπ) can be explained by principles of integration of information across critical bands. In this paper, different methods for such across-frequency integration process are evaluated as a function of the bandwidth and notch width of the masker. The results indicate that an "optimal detector" model assuming independent internal noise in each critical band provides a better fit to experimental data than a best filter or a simple across-frequency integrator model. Furthermore, the exponent used to model peripheral compression influences the accuracy of predictions in notched conditions.
Wang, Zhuo; Danziger, Samuel A; Heavner, Benjamin D; Ma, Shuyi; Smith, Jennifer J; Li, Song; Herricks, Thurston; Simeonidis, Evangelos; Baliga, Nitin S; Aitchison, John D; Price, Nathan D
2017-05-01
Gene regulatory and metabolic network models have been used successfully in many organisms, but inherent differences between them make networks difficult to integrate. Probabilistic Regulation Of Metabolism (PROM) provides a partial solution, but it does not incorporate network inference and underperforms in eukaryotes. We present an Integrated Deduced REgulation And Metabolism (IDREAM) method that combines statistically inferred Environment and Gene Regulatory Influence Network (EGRIN) models with the PROM framework to create enhanced metabolic-regulatory network models. We used IDREAM to predict phenotypes and genetic interactions between transcription factors and genes encoding metabolic activities in the eukaryote Saccharomyces cerevisiae. IDREAM models contain many fewer interactions than PROM and yet produce significantly more accurate growth predictions. IDREAM consistently outperformed PROM using any of three popular yeast metabolic models and across three experimental growth conditions. Importantly, IDREAM's enhanced accuracy makes it possible to identify subtle synthetic growth defects. With experimental validation, these novel genetic interactions involving the pyruvate dehydrogenase complex suggested a new role for the fatty acid-responsive factor Oaf1 in regulating acetyl-CoA production in glucose-grown cells.
UK Environmental Prediction - integration and evaluation at the convective scale
NASA Astrophysics Data System (ADS)
Fallmann, Joachim; Lewis, Huw; Castillo, Juan Manuel; Pearson, David; Harris, Chris; Saulter, Andy; Bricheno, Lucy; Blyth, Eleanor
2016-04-01
Traditionally, the simulation of regional ocean, wave and atmosphere components of the Earth System have been considered separately, with some information on other components provided by means of boundary or forcing conditions. More recently, the potential value for regional short-term applications of the more integrated approach required for global climate and Earth System prediction has begun to attract increasing research effort. In the UK, this activity is motivated by an understanding that accurate prediction and warning of the impacts of severe weather requires an integrated approach to forecasting. The substantial impacts on individuals, businesses and infrastructure of such events indicate a pressing need to understand better the value that might be delivered through more integrated environmental prediction. To address this need, the Met Office, NERC Centre for Ecology & Hydrology and NERC National Oceanography Centre have begun to develop the foundations of a coupled high-resolution probabilistic forecast system for the UK at km-scale. This links together existing model components of the atmosphere, coastal ocean, land surface and hydrology. Our initial focus has been on a 2-year Prototype project to demonstrate the UK coupled prediction concept in research mode. This presentation will provide an update on UK environmental prediction activities. We will present the results from the initial implementation of an atmosphere-land-ocean coupled system, including a new eddy-permitting resolution ocean component, and discuss progress and initial results from further development to integrate wave interactions in this relatively high-resolution system. We will discuss future directions and opportunities for collaboration in environmental prediction, and the challenges to realise the potential of integrated regional coupled forecasting for improving predictions and applications.
Crawford, Brian A.; Moore, Clinton; Norton, Terry M.; Maerz, John C.
2018-01-01
A challenge for making conservation decisions is predicting how wildlife populations respond to multiple, concurrent threats and potential management strategies, usually under substantial uncertainty. Integrated modeling approaches can improve estimation of demographic rates necessary for making predictions, even for rare or cryptic species with sparse data, but their use in management applications is limited. We developed integrated models for a population of diamondback terrapins (Malaclemys terrapin) impacted by road-associated threats to (i) jointly estimate demographic rates from two mark-recapture datasets, while directly estimating road mortality and the impact of management actions deployed during the study; and (ii) project the population using population viability analysis under simulated management strategies to inform decision-making. Without management, population extirpation was nearly certain due to demographic impacts of road mortality, predators, and vegetation. Installation of novel flashing signage increased survival of terrapins that crossed roads by 30%. Signage, along with small roadside barriers installed during the study, increased population persistence probability, but the population was still predicted to decline. Management strategies that included actions targeting multiple threats and demographic rates resulted in the highest persistence probability, and roadside barriers, which increased adult survival, were predicted to increase persistence more than other actions. Our results support earlier findings showing mitigation of multiple threats is likely required to increase the viability of declining populations. Our approach illustrates how integrated models may be adapted to use limited data efficiently, represent system complexity, evaluate impacts of threats and management actions, and provide decision-relevant information for conservation of at-risk populations.
2017-02-01
Reports an error in "An integrative formal model of motivation and decision making: The MGPM*" by Timothy Ballard, Gillian Yeo, Shayne Loft, Jeffrey B. Vancouver and Andrew Neal (Journal of Applied Psychology, 2016[Sep], Vol 101[9], 1240-1265). Equation A3 contained an error. The correct equation is provided in the erratum. (The following abstract of the original article appeared in record 2016-28692-001.) We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. 
Presence of indicator plant species as a predictor of wetland vegetation integrity
Stapanian, Martin A.; Adams, Jean V.; Gara, Brian
2013-01-01
We fit regression and classification tree models to vegetation data collected from Ohio (USA) wetlands to determine (1) which species best predict Ohio vegetation index of biotic integrity (OVIBI) score and (2) which species best predict high-quality wetlands (OVIBI score >75). The simplest regression tree model predicted OVIBI score based on the occurrence of three plant species: skunk-cabbage (Symplocarpus foetidus), cinnamon fern (Osmunda cinnamomea), and swamp rose (Rosa palustris). The lowest OVIBI scores were best predicted by the absence of the selected plant species rather than by the presence of other species. The simplest classification tree model predicted high-quality wetlands based on the occurrence of two plant species: skunk-cabbage and marsh-fern (Thelypteris palustris). The overall misclassification rate from this tree was 13 %. Again, low-quality wetlands were better predicted than high-quality wetlands by the absence of selected species rather than the presence of other species using the classification tree model. Our results suggest that a species’ wetland status classification and coefficient of conservatism are of little use in predicting wetland quality. A simple, statistically derived species checklist such as the one created in this study could be used by field biologists to quickly and efficiently identify wetland sites likely to be regulated as high-quality, and requiring more intensive field assessments. Alternatively, it can be used for advanced determinations of low-quality wetlands. Agencies can save considerable money by screening wetlands for the presence/absence of such “indicator” species before issuing permits.
Li, Longhai; Feng, Cindy X; Qiu, Shi
2017-06-30
An important statistical task in disease mapping problems is to identify divergent regions with unusually high or low risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is the gold standard for estimating predictive p-values that can flag such divergent regions. However, actual LOOCV is time-consuming because one needs to rerun a Markov chain Monte Carlo analysis for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called integrated importance sampling (iIS), for estimating LOOCV predictive p-values with only Markov chain samples drawn from the posterior based on the full data set. The key step in iIS is that we integrate away the latent variables associated with the test observation with respect to their conditional distribution without reference to the actual observation. Following the general theory of importance sampling, the formula used by iIS can be proved to be equivalent to the LOOCV predictive p-value. We compare iIS with three other existing methods in the literature on two disease-mapping datasets. Our empirical results show that the predictive p-values estimated with iIS are almost identical to the predictive p-values estimated with actual LOOCV and outperform those given by the three existing methods, namely, the posterior predictive checking, the ordinary importance sampling, and the ghosting method by Marshall and Spiegelhalter (2003). Copyright © 2017 John Wiley & Sons, Ltd.
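For contrast with iIS, the baseline ordinary importance sampling estimator it improves on can be sketched for a toy normal model: posterior draws from the full data are reweighted by the inverse likelihood of the held-out observation. The model, numbers, and function names here are illustrative assumptions:

```python
import math, random

def norm_pdf(y, mu, sigma=1.0):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def norm_cdf(y, mu, sigma=1.0):
    return 0.5 * (1.0 + math.erf((y - mu) / (sigma * math.sqrt(2.0))))

def ois_loocv_pvalue(y_i, mu_draws):
    # Ordinary importance sampling: reweight full-data posterior draws by
    # 1 / p(y_i | theta) to mimic the posterior with y_i held out, then
    # average P(Y <= y_i | theta) under those weights.
    w = [1.0 / max(norm_pdf(y_i, mu), 1e-300) for mu in mu_draws]
    num = sum(wi * norm_cdf(y_i, mu) for wi, mu in zip(w, mu_draws))
    return num / sum(w)

random.seed(0)
mu_draws = [random.gauss(0.0, 0.1) for _ in range(2000)]  # mock posterior for a region's mean
print(round(ois_loocv_pvalue(2.5, mu_draws), 3))  # near 1: flags a divergent observation
```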
Expanded modeling of temperature-dependent dielectric properties for microwave thermal ablation
Ji, Zhen; Brace, Christopher L
2011-01-01
Microwaves are a promising source for thermal tumor ablation due to their ability to rapidly heat dispersive biological tissues, often to temperatures in excess of 100 °C. At these high temperatures, tissue dielectric properties change rapidly and, thus, so do the characteristics of energy delivery. Precise knowledge of how tissue dielectric properties change during microwave heating promises to facilitate more accurate simulation of device performance and helps optimize device geometry and energy delivery parameters. In this study, we measured the dielectric properties of liver tissue during high-temperature microwave heating. The resulting data were compiled into either a sigmoidal function of temperature or an integration of the time–temperature curve for both relative permittivity and effective conductivity. Coupled electromagnetic–thermal simulations of heating produced by a single monopole antenna using the new models were then compared to simulations with existing linear and static models, and experimental temperatures in liver tissue. The new sigmoidal temperature-dependent model more accurately predicted experimental temperatures when compared to temperature–time integrated or existing models. The mean percent differences between simulated and experimental temperatures over all times were 4.2% for sigmoidal, 10.1% for temperature–time integration, 27.0% for linear and 32.8% for static models at the antenna input power of 50 W. Correcting for tissue contraction improved agreement for powers up to 75 W. The sigmoidal model also predicted substantial changes in heating pattern due to dehydration. We can conclude from these studies that a sigmoidal model of tissue dielectric properties improves prediction of experimental results. More work is needed to refine and generalize this model. PMID:21791728
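A sigmoidal temperature dependence of the kind described can be sketched with a generic logistic transition between a low-temperature plateau and a dehydrated high-temperature plateau. The parameter values below are illustrative assumptions, not the paper's fitted values for liver tissue:

```python
import math

# Sigmoidal temperature dependence of a dielectric property: the value
# transitions between two plateaus around a transition temperature T0,
# over a characteristic width. Parameters are illustrative only.

def sigmoidal_property(T, low_T_value, high_T_value, T0, width):
    frac = 1.0 / (1.0 + math.exp((T - T0) / width))
    return high_T_value + (low_T_value - high_T_value) * frac

# Relative permittivity dropping as tissue water is lost near 100 C
eps = [sigmoidal_property(T, 48.0, 5.0, 100.0, 5.0) for T in (37, 100, 150)]
print([round(e, 1) for e in eps])   # [48.0, 26.5, 5.0]
```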
NASA Astrophysics Data System (ADS)
Mundher Yaseen, Zaher; Abdulmohsin Afan, Haitham; Tran, Minh-Tung
2018-04-01
It has been scientifically evidenced that beam-column joints are a critical point in reinforced concrete (RC) structures under fluctuating load effects. In this study, a novel hybrid data-intelligence model is developed to predict the joint shear behavior of exterior beam-column structure frames. The hybrid data-intelligence model, called GA-DLNN, integrates a genetic algorithm with a deep learning neural network. The genetic algorithm is used in a prior modelling phase for input approximation, whereas the DLNN predictive model is used in the prediction phase. To demonstrate this structural problem, experimental data defining the dimensional and specimen properties are collected from the literature. The attained findings evidenced the effectiveness of the hybrid GA-DLNN in modelling the beam-column joint shear problem. In addition, accurate prediction was achieved with fewer input variables owing to the feasibility of the evolutionary phase.
NASA Technical Reports Server (NTRS)
Johnston, John D.; Howard, Joseph M.; Mosier, Gary E.; Parrish, Keith A.; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical (STOP) analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. Temperatures predicted using geometric and thermal math models are mapped to a structural finite element model in order to predict thermally induced deformations. Motions and deformations at optical surfaces are then input to optical models, and optical performance is predicted using either an optical ray trace or a linear optical analysis tool. In addition to baseline performance predictions, a process for performing sensitivity studies to assess modeling uncertainties is described.
Deep Kalman Filter: Simultaneous Multi-Sensor Integration and Modelling; A GNSS/IMU Case Study
Hosseinyalamdary, Siavash
2018-01-01
Bayes filters, such as the Kalman and particle filters, have been used in sensor fusion to integrate two sources of information and obtain the best estimate of unknowns. The efficient integration of multiple sensors requires deep knowledge of their error sources. Some sensors, such as Inertial Measurement Unit (IMU), have complicated error sources. Therefore, IMU error modelling and the efficient integration of IMU and Global Navigation Satellite System (GNSS) observations has remained a challenge. In this paper, we developed deep Kalman filter to model and remove IMU errors and, consequently, improve the accuracy of IMU positioning. To achieve this, we added a modelling step to the prediction and update steps of the Kalman filter, so that the IMU error model is learned during integration. The results showed our deep Kalman filter outperformed the conventional Kalman filter and reached a higher level of accuracy. PMID:29695119
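The extra "modelling step" can be illustrated with a deliberately simple stand-in: while a GNSS-like reference is available, estimate the IMU's systematic error, then feed bias-corrected measurements to a standard 1D Kalman filter. The mean-bias model here replaces the paper's learned deep model and is purely illustrative:

```python
import random

# Standard 1D Kalman filter for a (nearly) constant state.
def kalman_1d(zs, r=0.09, q=1e-4):
    x, p = zs[0], 1.0
    for z in zs[1:]:
        p += q                  # predict
        k = p / (p + r)         # gain
        x += k * (z - x)        # update
        p *= (1.0 - k)
    return x

random.seed(1)
true_state, imu_bias = 5.0, 0.8

# Modelling step: while a GNSS-like reference (true_state) is available,
# learn the sensor's systematic error. A mean bias stands in for the
# learned deep error model of the paper.
calib = [true_state + imu_bias + random.gauss(0.0, 0.3) for _ in range(200)]
learned_bias = sum(z - true_state for z in calib) / len(calib)

zs = [true_state + imu_bias + random.gauss(0.0, 0.3) for _ in range(200)]
naive = kalman_1d(zs)                                   # settles near 5.8 (biased)
corrected = kalman_1d([z - learned_bias for z in zs])   # settles near 5.0
print(round(naive, 2), round(corrected, 2))
```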
Monolithic integrated circuit charge amplifier and comparator for MAMA readout
NASA Technical Reports Server (NTRS)
Cole, Edward H.; Smeins, Larry G.
1991-01-01
Prototype ICs for the Solar and Heliospheric Observatory's Multi-Anode Microchannel Array (MAMA) have been developed; the ICs' charge-amplifier and comparator components were then tested for pulse response and noise performance. All model performance predictions were exceeded. Electrostatic discharge protection has been included on all IC connections, and device operation over temperature has been consistent with model predictions.
Behavioral Change Theories Can Inform the Prediction of Young Adults' Adoption of a Plant-Based Diet
ERIC Educational Resources Information Center
Wyker, Brett A.; Davison, Kirsten K.
2010-01-01
Objective: Drawing on the Theory of Planned Behavior (TPB) and the Transtheoretical Model (TTM), this study (1) examines links between stages of change for following a plant-based diet (PBD) and consuming more fruits and vegetables (FV); (2) tests an integrated theoretical model predicting intention to follow a PBD; and (3) identifies associated…
NASA Technical Reports Server (NTRS)
Molthan, Andrew; Case, Jonathan; Venner, Jason; Moreno-Madrinan, Max J.; Delgado, Francisco
2012-01-01
Two projects at NASA Marshall Space Flight Center have collaborated to develop a high-resolution weather forecast model for Mesoamerica: the NASA Short-term Prediction Research and Transition (SPoRT) Center, which integrates unique NASA satellite and weather forecast modeling capabilities into the operational weather forecasting community, and NASA's SERVIR Program, which integrates satellite observations, ground-based data, and forecast models to improve disaster response in Central America, the Caribbean, Africa, and the Himalayas.
An updated view of global water cycling
NASA Astrophysics Data System (ADS)
Houser, P. R.; Schlosser, A.; Lehr, J.
2009-04-01
With unprecedented new observation capacities combined with revolutions in modeling, we are poised to make huge advances in water cycle assessment, understanding, and prediction. To realize this goal, we must develop a discipline of prediction and verification through the integration of water and energy cycle observations and models, verifying model predictions against observed phenomena to ensure that research delivers reliable improvements in prediction skill. Accomplishing these goals will require, in part, an accurate accounting of the key reservoirs and fluxes associated with the global water and energy cycle, including their spatial and temporal variability, through integration of all necessary observations and research tools. A brief history of the lineage of the conventional water balance and a summary accounting of all major parameters of the water balance, drawn from highly respected secondary sources, will be presented. Principally, recently published peer-reviewed papers reporting results of original work involving direct measurements and new data generated by high-tech devices (e.g. satellite/airborne instruments, supercomputers, geophysical tools) will be employed. This work lends credence to the conventional water balance ideas, but also reveals anachronistic scientific concepts/models, questionable underlying data, longstanding oversights, and outright errors in the water balance.
Predicting Seagrass Occurrence in a Changing Climate Using Random Forests
NASA Astrophysics Data System (ADS)
Aydin, O.; Butler, K. A.
2017-12-01
Seagrasses are marine plants that can quickly sequester vast amounts of carbon (up to 100 times more, and 12 times faster, than tropical forests). In this work, we present an integrated GIS and machine learning approach to build a data-driven model of seagrass presence-absence. We outline a random forest approach that avoids the prevalence bias in many ecological presence-absence models. One of our goals is to predict global seagrass occurrence from a spatially limited training sample. In addition, we conduct a sensitivity study that investigates the vulnerability of seagrass to changing climate conditions. We integrate multiple data sources, including fine-scale seagrass data from MarineCadastre.gov and the recently released, globally extensive Ecological Marine Units (EMU) dataset. These data are used to train a model for seagrass occurrence along the U.S. coast. In situ ocean data are interpolated using Empirical Bayesian Kriging (EBK) to produce globally extensive prediction variables. A neural network is used to estimate probable future values of prediction variables, such as ocean temperature, to assess the impact of a warming climate on seagrass occurrence. The proposed workflow can be generalized to many presence-absence models.
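The abstract does not specify how the prevalence bias is avoided; one standard correction (not necessarily the authors' scheme, shown here only as a hedged sketch) rescales a probability predicted by a model trained at an artificial sampling prevalence back to the true population prevalence:

```python
def correct_prevalence(p, train_prev, true_prev):
    """Rescale a predicted presence probability from a model trained at
    sampling prevalence `train_prev` (e.g. a balanced 50/50 training set)
    to reflect the true population prevalence `true_prev`."""
    # Multiply the predicted odds by the ratio of population odds
    # to training-sample odds, then convert back to a probability.
    c = (true_prev / (1.0 - true_prev)) * ((1.0 - train_prev) / train_prev)
    odds = p / (1.0 - p)
    adjusted = c * odds
    return adjusted / (1.0 + adjusted)

# A model trained on a balanced sample that outputs 0.5 for an "average"
# site maps back to the population base rate (about 0.1 if seagrass
# covers 10% of sites) once corrected.
example = correct_prevalence(0.5, 0.5, 0.1)
```

The correction is the identity when training and population prevalence match, so it only reshapes probabilities, never the ranking of sites.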
NASA Technical Reports Server (NTRS)
Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl
2004-01-01
This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools for the aerospace vehicle design process that take into account lifecycle uncertainties. It recognizes that discrepancy between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty and to combine these with the analytical tools used with integrated modeling. In this manner, system uncertainty analysis becomes part of the design process and can motivate redesign. The specific program objectives were: 1. To incorporate uncertainty modeling, propagation and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions. 2. To apply modern optimization tools to guide the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing subsystem reliability and redundancy. The results from the second program objective are described elsewhere. This report describes the work and results for the first objective: uncertainty modeling, propagation, and synthesis with integrated modeling.
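The report's actual tools are not reproduced here; as a minimal sketch of the first objective (the performance function, parameters, and uncertainties below are all invented), Monte Carlo propagation yields an error bar on a performance prediction, and one-at-a-time perturbations give a crude ranking of the uncertainty sources:

```python
import numpy as np

rng = np.random.default_rng(1)

def performance(params):
    # Hypothetical system-level performance metric, e.g. a jitter-like
    # quantity driven by stiffness k, damping c and disturbance level d.
    k, c, d = params
    return d / (k * c)

nominal = np.array([100.0, 0.02, 1.0])   # assumed nominal k, c, d
sigma = np.array([5.0, 0.004, 0.1])      # assumed 1-sigma uncertainties

# Monte Carlo propagation: sample the uncertain parameters, evaluate the
# integrated model, and report the prediction with an error bar.
samples = rng.normal(nominal, sigma, size=(20000, 3))
perf = np.array([performance(s) for s in samples])
mean, err_bar = perf.mean(), perf.std()

# One-at-a-time perturbations rank which uncertainty source drives the
# error bar most strongly (here the damping term c).
contrib = {}
for i, name in enumerate(["k", "c", "d"]):
    bumped = nominal.copy()
    bumped[i] += sigma[i]
    contrib[name] = abs(performance(bumped) - performance(nominal))
```

The ranking step is exactly the kind of output that "can motivate redesign": money spent tightening the dominant source buys the largest reduction in the error bar.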
Modeling the effect of shroud contact and friction dampers on the mistuned response of turbopumps
NASA Technical Reports Server (NTRS)
Griffin, Jerry H.; Yang, M.-T.
1994-01-01
The contract has been revised. Under the revised scope of work a reduced order model has been developed that can be used to predict the steady-state response of mistuned bladed disks. The approach has been implemented in a computer code, LMCC. It is concluded that: the reduced order model displays structural fidelity comparable to that of a finite element model of an entire bladed disk system with significantly improved computational efficiency; and, when the disk is stiff, both the finite element model and LMCC predict significantly more amplitude variation than was predicted by earlier models. This second result may have important practical ramifications, especially in the case of integrally bladed disks.
NASA Astrophysics Data System (ADS)
Habibi, H.; Norouzi, A.; Habib, A.; Seo, D. J.
2016-12-01
To produce accurate predictions of flooding in urban areas, it is necessary to model both natural channel and storm drain networks. While there exist many urban hydraulic models of varying sophistication, most of them are not practical for real-time application for large urban areas. On the other hand, most distributed hydrologic models developed for real-time applications lack the ability to explicitly simulate storm drains. In this work, we develop a storm drain model that can be coupled with distributed hydrologic models, such as the National Weather Service Hydrology Laboratory's Distributed Hydrologic Model, for real-time flash flood prediction in large urban areas to improve prediction and to advance the understanding of the integrated response of natural channels and storm drains to rainfall events of varying magnitude and spatiotemporal extent in urban catchments of varying sizes. The initial study area is the Johnson Creek Catchment (40.1 km²) in the City of Arlington, TX. For observed rainfall, the high-resolution (500 m, 1 min) precipitation data from the Dallas-Fort Worth Demonstration Network of the Collaborative Adaptive Sensing of the Atmosphere radars are used.
Stempler, Shiri; Yizhak, Keren; Ruppin, Eytan
2014-01-01
Accumulating evidence links numerous abnormalities in cerebral metabolism with the progression of Alzheimer's disease (AD), beginning in its early stages. Here, we integrate transcriptomic data from AD patients with a genome-scale computational human metabolic model to characterize the altered metabolism in AD, and employ state-of-the-art metabolic modelling methods to predict metabolic biomarkers and drug targets in AD. The metabolic descriptions derived are first tested and validated on a large scale against existing AD proteomics and metabolomics data. Our analysis shows a significant decrease in the activity of several key metabolic pathways, including the carnitine shuttle, folate metabolism and mitochondrial transport. We predict several metabolic biomarkers of AD progression in the blood and the CSF, including succinate and prostaglandin D2. Vitamin D and steroid metabolism pathways are enriched with predicted drug targets that could mitigate the metabolic alterations observed. Taken together, this study provides the first network-wide view of the metabolic alterations associated with AD progression. Most importantly, it offers a cohort of new metabolic leads for the diagnosis of AD and its treatment. PMID:25127241
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Hugh D.; Eisfeld, Amie J.; Sims, Amy
Respiratory infections stemming from influenza viruses and the Severe Acute Respiratory Syndrome coronavirus (SARS-CoV) represent a serious public health threat as emerging pandemics. Despite efforts to identify the critical interactions of these viruses with host machinery, the key regulatory events that lead to disease pathology remain poorly targeted with therapeutics. Here we implement an integrated network interrogation approach, in which proteome and transcriptome datasets from infection of both viruses in human lung epithelial cells are utilized to predict regulatory genes involved in the host response. We take advantage of a novel "crowd-based" approach to identify and combine ranking metrics that isolate genes/proteins likely related to the pathogenicity of SARS-CoV and influenza virus. Subsequently, a multivariate regression model is used to compare predicted lung epithelial regulatory influences with data derived from other respiratory virus infection models. We predicted a small set of regulatory factors with conserved behavior for consideration as important components of viral pathogenesis that might also serve as therapeutic targets for intervention. Our results demonstrate the utility of integrating diverse 'omic datasets to predict and prioritize regulatory features conserved across multiple pathogen infection models.
NASA Astrophysics Data System (ADS)
Luo, Y.; Huang, Y.; Jiang, J.; MA, S.; Saruta, V.; Liang, G.; Hanson, P. J.; Ricciuto, D. M.; Milcu, A.; Roy, J.
2017-12-01
The past two decades have witnessed rapid development in sensor technology. Building on this sensor development, large research infrastructure facilities, such as the National Ecological Observatory Network (NEON) and FLUXNET, have been established. Through networking different kinds of sensors and other data collections at many locations all over the world, those facilities generate large volumes of ecological data every day. The big data from those facilities offer an unprecedented opportunity for advancing our understanding of ecological processes, educating teachers and students, supporting decision-making, and testing ecological theory. The big data from the major research infrastructure facilities also provide a foundation for developing predictive ecology. Indeed, the capability to predict future changes in our living environment and natural resources is critical to decision making in a world where the past is no longer a clear guide to the future. We are living in a period marked by rapid climate change, profound alteration of biogeochemical cycles, unsustainable depletion of natural resources, and deterioration of air and water quality. Projecting changes in future ecosystem services to society becomes essential not only for science but also for policy making. We will use this panel format to outline major opportunities and challenges in integrating research infrastructure and ecosystem models toward developing predictive ecology. Meanwhile, we will also show results from an interactive model-experiment system - the Ecological Platform for Assimilating Data into models (EcoPAD) - that has been implemented at the Spruce and Peatland Responses Under Climatic and Environmental change (SPRUCE) experiment in Northern Minnesota and the Montpellier Ecotron, France. EcoPAD is developed by integrating web technology, eco-informatics, data assimilation techniques, and ecosystem modeling.
EcoPAD is designed to streamline data transfer seamlessly from research infrastructure facilities to model simulation, data assimilation, and ecological forecasting.
Conditional dissipation of scalars in homogeneous turbulence: Closure for MMC modelling
NASA Astrophysics Data System (ADS)
Wandel, Andrew P.
2013-08-01
While the mean and unconditional variance should be predicted well by any reasonable turbulent combustion model, these alone are generally not sufficient for the accurate modelling of complex phenomena such as extinction/reignition. An additional criterion has recently been introduced: accurate modelling of the dissipation timescales associated with fluctuations of scalars about their conditional mean (conditional dissipation timescales). Analysis of Direct Numerical Simulation (DNS) results for a passive scalar shows that the conditional dissipation timescale is of the order of the integral timescale and smaller than the unconditional dissipation timescale. A model is proposed in which the conditional dissipation timescale is proportional to the integral timescale. This model is used in Multiple Mapping Conditioning (MMC) modelling for a passive scalar case and a reactive scalar case, comparing to DNS results for both. The results show that this model improves the accuracy of MMC predictions, matching the DNS results more closely while using a relatively coarse spatial resolution compared to other turbulent combustion models.
Integration of Air Quality & Exposure Models for Health Studies
The presentation describes a new community-scale tool called exposure model for individuals (EMI), which predicts five tiers of individual-level exposure metrics for ambient PM using outdoor concentrations, questionnaires, weather, and time-location information. In this modeling ...
UK Environmental Prediction - integration and evaluation at the convective scale
NASA Astrophysics Data System (ADS)
Fallmann, Joachim; Lewis, Huw; Castillo, Juan Manuel; Pearson, David; Harris, Chris; Saulter, Andy; Bricheno, Lucy; Blyth, Eleanor
2016-04-01
It has long been understood that accurate prediction and warning of the impacts of severe weather require an integrated approach to forecasting. High impact weather is typically manifested through various interactions and feedbacks between different components of the Earth System. For example, damaging high winds can drive large waves and storm surge that devastate coastlines, while the impact of intense rainfall is translated through saturated soils and land surface processes into high river flows and inland flooding. The substantial impacts of such events on individuals, businesses and infrastructure indicate a pressing need to better understand the value that might be delivered through more integrated environmental prediction. To address this need, the Met Office, NERC Centre for Ecology & Hydrology and NERC National Oceanography Centre have begun to develop the foundations of a coupled high resolution probabilistic forecast system for the UK at km-scale, linking together existing model components of the atmosphere, coastal ocean, land surface and hydrology. Our initial focus has been a 2-year Prototype project to demonstrate the UK coupled prediction concept in research mode. This presentation will provide an update on UK environmental prediction activities. We will present results from the initial implementation of an atmosphere-land-ocean coupled system and discuss progress and initial results from further development to integrate wave interactions. We will also discuss future directions and opportunities for collaboration in environmental prediction, and the challenges to realising the potential of integrated regional coupled forecasting for improving predictions and applications.
Integrated Data-Archive and Distributed Hydrological Modelling System for Optimized Dam Operation
NASA Astrophysics Data System (ADS)
Shibuo, Yoshihiro; Jaranilla-Sanchez, Patricia Ann; Koike, Toshio
2013-04-01
In 2012, typhoon Bopha, which passed through the southern part of the Philippines, devastated the nation, leaving hundreds dead and causing significant destruction across the country. Indeed, deadly cyclone-related events occur almost every year in the region, and such extremes are expected to increase in both frequency and magnitude around Southeast Asia during the course of global climate change. Our ability to confront such hazardous events is limited by the best available engineering infrastructure and the performance of weather prediction. One countermeasure strategy is, for instance, early release of reservoir water (lowering the dam water level) during the flood season to protect the downstream region from an impending flood. However, over-release of reservoir water affects the regional economy adversely by losing water resources that still have value for power generation and agricultural and industrial water use. Furthermore, accurate precipitation forecasting is itself a formidable task, because the chaotic nature of the atmosphere yields uncertainty in model prediction over time. Under these circumstances we present a novel approach to optimize the contradicting objectives of preventing flood damage via a priori dam release while sustaining sufficient water supply during predicted storm events. By evaluating the forecast performance of the Meso-Scale Model Grid Point Value (GPV) against observed rainfall, uncertainty in model prediction is probabilistically taken into account, and it is then applied to the next GPV issuance to generate ensemble rainfalls. The ensemble rainfalls drive the coupled land-surface and distributed-hydrological model to derive the ensemble flood forecast. With dam status information also taken into account, our integrated system estimates the most desirable a priori dam release through the shuffled complex evolution algorithm.
The strength of the optimization system is further magnified by the online link to the Data Integration and Analysis System, a Japanese national project for collecting, integrating and analyzing massive amounts of global-scale observation data, meaning that the present system is applicable worldwide. We demonstrate the integrated system with observed extreme events in the Angat Watershed, the Philippines, and the Upper Tone River basin, Japan. The results show promising performance for operational use of the system to support river and dam managers' decision-making.
An Integrated Software Package to Enable Predictive Simulation Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang
The power grid is increasing in complexity due to the deployment of smart grid technologies, which vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance-computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that combines HPC applications and a web-based visualization tool based on a middleware framework that supports data communication between different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in a live mode. Test results validate the effectiveness and usability of the integrated software package.
Using Pareto points for model identification in predictive toxicology
2013-01-01
Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649
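As a toy illustration of the Pareto idea (the criteria names and scores below are invented; the paper's actual ranking features are not reproduced here), a model from the collection is worth considering only if no other model beats it on every criterion:

```python
def pareto_front(models):
    """Return the models not dominated on (accuracy, relevance): a model
    is dominated if another model is at least as good on both criteria
    and strictly better on at least one."""
    front = []
    for m in models:
        dominated = any(
            o["accuracy"] >= m["accuracy"] and o["relevance"] >= m["relevance"]
            and (o["accuracy"] > m["accuracy"] or o["relevance"] > m["relevance"])
            for o in models
        )
        if not dominated:
            front.append(m)
    return front

# Hypothetical collection: historical accuracy vs. relevance to the new
# chemical compound (e.g. similarity to a model's training domain).
collection = [
    {"name": "M1", "accuracy": 0.90, "relevance": 0.40},
    {"name": "M2", "accuracy": 0.75, "relevance": 0.80},
    {"name": "M3", "accuracy": 0.70, "relevance": 0.60},  # dominated by M2
    {"name": "M4", "accuracy": 0.85, "relevance": 0.70},
]
front = pareto_front(collection)   # M1, M2 and M4 survive
```

Automated identification then reduces to picking one point from the front, for instance by weighting the criteria for the endpoint at hand.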
NASA Technical Reports Server (NTRS)
Koenig, Herbert A.; Chan, Kwai S.; Cassenti, Brice N.; Weber, Richard
1988-01-01
A unified numerical method for the integration of stiff time-dependent constitutive equations is presented. The solution process is directly applied to a constitutive model proposed by Bodner. The theory addresses time-dependent inelastic behavior coupled with both isotropic hardening and directional hardening. Predicted stress-strain responses from this model are compared to experimental data from cyclic tests on uniaxial specimens. An algorithm is developed for the efficient integration of the Bodner flow equation, and a comparison is made with the Euler integration method. An analysis of computational time is presented for the three algorithms.
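Bodner's flow equations are not reproduced here, but the stiffness issue they raise can be seen on a scalar test problem (the rate constant and step size below are invented): an explicit Euler step blows up once the step exceeds the fast timescale, while an implicit step remains stable at the same step size.

```python
# Scalar stiff test problem y' = -lam * y, y(0) = 1, exact solution
# exp(-lam * t). lam and h are chosen so that h * lam = 10, far outside
# the explicit-Euler stability limit h * lam < 2.
lam, h, steps = 1000.0, 0.01, 100

def explicit_euler():
    y = 1.0
    for _ in range(steps):
        y = y + h * (-lam * y)      # amplification factor 1 - h*lam = -9
    return y

def implicit_euler():
    y = 1.0
    for _ in range(steps):
        y = y / (1.0 + h * lam)     # amplification factor 1/(1 + h*lam)
    return y
```

This is why stiff constitutive models need implicit (or otherwise specially constructed) integration schemes: the explicit method would force impractically small time steps to stay stable.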
NASA Astrophysics Data System (ADS)
Schafbuch, Paul Jay
1991-02-01
The boundary element method (BEM) is used to numerically simulate the interaction of ultrasonic waves with material defects such as voids, inclusions, and open cracks. The time harmonic formulation is in 3D and therefore allows flaws of arbitrary shape to be modeled. The BEM makes such problems feasible because the underlying boundary integral equation only requires a surface (2D) integration and difficulties associated with the seemingly infinite extent of the host domain are not encountered. The computer code utilized in this work is built upon recent advances in elastodynamic boundary element theory such as a scheme for self-adjusting integration order and singular integration regularization. Incident fields may be taken as compressional or shear plane waves or predicted by an approximate Gauss-Hermite beam model. The code is highly optimized for voids and has been coupled with computer aided engineering packages for automated flaw shape definition and mesh generation. Subsequent graphical display of intermediate results supports model refinement and physical interpretation. Final results are typically cast in a nondestructive evaluation (NDE) context as either scattering amplitudes or flaw signals (via a measurement model based on a reciprocity integral). The near field is also predicted which allows for improved physical insight into the scattering process and the evaluation of certain modeling approximations. The accuracy of the BEM approach is first examined by comparing its predictions to those of other models for single, isolated scatterers. The comparisons are with the predictions of analytical solutions for spherical defects and with MOOT and T-matrix calculations for axisymmetric flaws. Experimental comparisons are also made for volumetric shapes with different characteristic dimensions in all three directions, since no other numerical approach has yet produced results of this type.
Theoretical findings regarding the fictitious eigenfrequency difficulty are substantiated through the analytical solution of a fundamental elastodynamics problem and corresponding BEM studies. Given the confidence in the BEM technique engendered by these comparisons, it is then used to investigate the modeling of "open", cracklike defects amenable to a volumetric formulation. The limits of applicability of approximate theories (e.g., quasistatic, Kirchhoff, and geometric theory of diffraction) are explored for elliptical cracks, from this basis. The problem of two interacting scatterers is then considered. Results from a fully implicit approach and from a more efficient hybrid scheme are compared with generalized Born and farfield approximate interaction theories.
Kukona, Anuenue; Cho, Pyeong Whan; Magnuson, James S.; Tabor, Whitney
2014-01-01
Psycholinguistic research spanning a number of decades has produced diverging results with regard to the nature of constraint integration in online sentence processing. For example, evidence that language users anticipatorily fixate likely upcoming referents in advance of evidence in the speech signal supports rapid context integration. By contrast, evidence that language users activate representations that conflict with contextual constraints, or only indirectly satisfy them, supports non-integration or late integration. Here, we report on a self-organizing neural network framework that addresses one aspect of constraint integration: the integration of incoming lexical information (i.e., an incoming word) with sentence context information (i.e., from preceding words in an unfolding utterance). In two simulations, we show that the framework predicts both classic results concerned with lexical ambiguity resolution (Swinney, 1979; Tanenhaus, Leiman, & Seidenberg, 1979), which suggest late context integration, and results demonstrating anticipatory eye movements (e.g., Altmann & Kamide, 1999), which support rapid context integration. We also report two experiments using the visual world paradigm that confirm a new prediction of the framework. Listeners heard sentences like “The boy will eat the white…,” while viewing visual displays with objects like a white cake (i.e., a predictable direct object of “eat”), white car (i.e., an object not predicted by “eat,” but consistent with “white”), and distractors. Consistent with our simulation predictions, we found that while listeners fixated white cake most, they also fixated white car more than unrelated distractors in this highly constraining sentence (and visual) context. PMID:24245535
Probability-based collaborative filtering model for predicting gene-disease associations.
Zeng, Xiangxiang; Ding, Ningxiang; Rodríguez-Patón, Alfonso; Zou, Quan
2017-12-28
Accurately predicting pathogenic human genes has been challenging in recent research. Considering the extensive gene-disease data verified by biological experiments, we can apply computational methods to perform accurate predictions with reduced time and expense. We propose a probability-based collaborative filtering model (PCFM) to predict pathogenic human genes. Several kinds of data sets, containing data of humans and data of other nonhuman species, are integrated in our model. Firstly, on the basis of a typical latent factorization model, we propose model I with an average heterogeneous regularization. Secondly, we develop modified model II with personal heterogeneous regularization to enhance the accuracy of the aforementioned models. In this model, vector space similarity or Pearson correlation coefficient metrics and data on related species are also used. We compared the results of PCFM with those of four state-of-the-art approaches; the results show that PCFM performs better than the other advanced approaches. The PCFM model can be leveraged for predictions of disease genes, especially for new human genes or diseases with no known relationships.
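PCFM's heterogeneous regularizations are not reproduced here; as a plain-vanilla sketch of the underlying latent factorization idea (toy data, uniform L2 regularization), genes and diseases each get low-dimensional factor vectors whose inner products score candidate associations:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy gene-disease matrix: 1 = experimentally verified association.
R = np.array([[1, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

k, lr, reg = 2, 0.05, 0.01
P = 0.1 * rng.standard_normal((R.shape[0], k))   # gene latent factors
Q = 0.1 * rng.standard_normal((R.shape[1], k))   # disease latent factors

def loss():
    return np.sum((R - P @ Q.T) ** 2) + reg * (np.sum(P**2) + np.sum(Q**2))

start = loss()
for _ in range(1000):
    E = R - P @ Q.T                   # residuals
    P = P + lr * (E @ Q - reg * P)    # gradient step with L2 penalty
    E = R - P @ Q.T                   # refresh residuals after updating P
    Q = Q + lr * (E.T @ P - reg * Q)
end = loss()

# P @ Q.T now scores every gene-disease pair; unobserved pairs with
# high scores are candidate associations to prioritize.
scores = P @ Q.T
```

PCFM layers species data and heterogeneous (per-entity) regularization on top of this basic machinery; the sketch only shows the factorization core.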
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
White, J.D.; Running, S.W.; Thornton, P.E.; Keane, R.E.; Ryan, K.C.; Fagre, D.B.; Key, C.H.
1998-01-01
Glacier National Park served as a test site for ecosystem analyses that involved a suite of integrated models embedded within a geographic information system. The goal of the exercise was to provide managers with maps that could illustrate probable shifts in vegetation, net primary production (NPP), and hydrologic responses associated with two selected climatic scenarios. The climatic scenarios were (a) a recent 12-yr record of weather data, and (b) a reconstituted set that sequentially introduced, in repeated 3-yr intervals, wetter-cooler, drier-warmer, and typical conditions. To extrapolate the implications of changes in ecosystem processes and the resulting growth and distribution of vegetation and snowpack, the model incorporated geographic data. With underlying digital elevation maps, soil depth and texture, extrapolated climate, and current information on vegetation types and satellite-derived estimates of leaf area indices, simulations were extended to envision how the park might look after 120 yr. The predictions of change included underlying processes affecting the availability of water and nitrogen. Considerable field data were acquired to compare with model predictions under current climatic conditions. In general, the integrated landscape models of ecosystem processes had good agreement with measured NPP, snowpack, and streamflow, but the exercise revealed the difficulty and necessity of averaging point measurements across landscapes to achieve comparable results with modeled values. Under the extremely variable climate scenario, significant changes in vegetation composition and growth as well as hydrologic responses were predicted across the park. In particular, a general rise in both the upper and lower limits of treeline was predicted.
These shifts would probably occur along with a variety of disturbances (fire, insect, and disease outbreaks) as predictions of physiological stress (water, nutrients, light) altered competitive relations and hydrologic responses. The use of integrated landscape models applied in this exercise should provide managers with insights into the underlying processes important in maintaining community structure, and at the same time, locate where changes on the landscape are most likely to occur.
Berlinguer, Fiammetta; Madeddu, Manuela; Pasciu, Valeria; Succu, Sara; Spezzigu, Antonio; Satta, Valentina; Mereu, Paolo; Leoni, Giovanni G; Naitana, Salvatore
2009-01-01
Currently, the assessment of sperm function in a raw or processed semen sample is not able to reliably predict sperm ability to withstand freezing and thawing procedures and in vivo fertility and/or assisted reproductive biotechnologies (ART) outcome. The aim of the present study was to investigate which parameters among a battery of analyses could predict subsequent spermatozoa in vitro fertilization ability and hence blastocyst output in a goat model. Ejaculates were obtained by artificial vagina from 3 adult goats (Capra hircus) aged 2 years (A, B and C). In order to assess the predictive value of viability, computer assisted sperm analyzer (CASA) motility parameters and ATP intracellular concentration before and after thawing and of DNA integrity after thawing on subsequent embryo output after an in vitro fertility test, a logistic regression analysis was used. Individual differences in semen parameters were evident for semen viability after thawing and DNA integrity. Results of the IVF test showed that spermatozoa collected from A and B led to higher cleavage rates (p < 0.01) and blastocyst output (p < 0.05) compared with C. The logistic regression model explained a deviance of 72% (p < 0.0001), directly related to the mean percentage of rapid spermatozoa in fresh semen (p < 0.01), semen viability after thawing (p < 0.01), and to two of the three comet parameters considered, i.e., tail DNA percentage and comet length (p < 0.0001). DNA integrity alone had a high predictive value on IVF outcome with frozen/thawed semen (deviance explained: 57%). The model proposed here represents one of the many possible ways to explain differences found in embryo output following IVF with different semen donors and may represent a useful tool to select the most suitable donors for semen cryopreservation. PMID:19900288
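The deviance-explained statistic reported above can be reproduced on synthetic data. A minimal sketch, assuming invented predictor names and effect sizes (not values from the study):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(0)
n = 300
# hypothetical predictors mirroring the study: % rapid sperm, post-thaw viability, tail DNA %
rapid = rng.uniform(20, 80, n)
viability = rng.uniform(30, 90, n)
tail_dna = rng.uniform(5, 40, n)
logit = -3 + 0.04 * rapid + 0.05 * viability - 0.08 * tail_dna   # illustrative true model
cleaved = rng.binomial(1, 1 / (1 + np.exp(-logit)))              # 1 = oocyte cleaved after IVF

X = np.column_stack([rapid, viability, tail_dna])
model = LogisticRegression().fit(X, cleaved)

# deviance explained = 1 - D_model / D_null (log_loss returns the mean, so scale by 2n)
p_model = model.predict_proba(X)[:, 1]
d_model = 2 * n * log_loss(cleaved, p_model)
d_null = 2 * n * log_loss(cleaved, np.full(n, cleaved.mean()))
print(f"deviance explained: {1 - d_model / d_null:.2f}")
```

The study's 72% figure corresponds to this ratio computed on the goat IVF data rather than on simulated values.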
Malloy, Timothy; Zaunbrecher, Virginia; Beryt, Elizabeth; Judson, Richard; Tice, Raymond; Allard, Patrick; Blake, Ann; Cote, Ila; Godwin, Hilary; Heine, Lauren; Kerzic, Patrick; Kostal, Jakub; Marchant, Gary; McPartland, Jennifer; Moran, Kelly; Nel, Andre; Ogunseitan, Oladele; Rossi, Mark; Thayer, Kristina; Tickner, Joel; Whittaker, Margaret; Zarker, Ken
2017-09-01
Alternatives analysis (AA) is a method used in regulation and product design to identify, assess, and evaluate the safety and viability of potential substitutes for hazardous chemicals. It requires toxicological data for the existing chemical and potential alternatives. Predictive toxicology uses in silico and in vitro approaches, computational models, and other tools to expedite toxicological data generation in a more cost-effective manner than traditional approaches. The present article briefly reviews the challenges associated with using predictive toxicology in regulatory AA, then presents 4 recommendations for its advancement. It recommends using case studies to advance the integration of predictive toxicology into AA, adopting a stepwise process to employing predictive toxicology in AA beginning with prioritization of chemicals of concern, leveraging existing resources to advance the integration of predictive toxicology into the practice of AA, and supporting transdisciplinary efforts. The further incorporation of predictive toxicology into AA would advance the ability of companies and regulators to select alternatives to harmful ingredients, and potentially increase the use of predictive toxicology in regulation more broadly. Integr Environ Assess Manag 2017;13:915-925. © 2017 SETAC.
Predicting Chemical Toxicity from Proteomics and Computational Chemistry
2008-07-30
similarity spaces, BD Gute and SC Basak, SAR QSAR Environ. Res., 17, 37-51 (2006). Predicting pharmacological and toxicological activity of heterocyclic...affinity of dibenzofurans: a hierarchical QSAR approach, authored jointly by Basak and Mills; Division of Chemical Toxicology iii. Prediction of blood...biodescriptors vis-à-vis chemodescriptors in predictive toxicology e) Development of integrated QSTR models using the combined set of chemodescriptors and
A watershed model of individual differences in fluid intelligence.
Kievit, Rogier A; Davis, Simon W; Griffiths, John; Correia, Marta M; Cam-Can; Henson, Richard N
2016-10-01
Fluid intelligence is a crucial cognitive ability that predicts key life outcomes across the lifespan. Strong empirical links exist between fluid intelligence and processing speed on the one hand, and white matter integrity and processing speed on the other. We propose a watershed model that integrates these three explanatory levels in a principled manner in a single statistical model, with processing speed and white matter figuring as intermediate endophenotypes. We fit this model in a large (N=555) adult lifespan cohort from the Cambridge Centre for Ageing and Neuroscience (Cam-CAN) using multiple measures of processing speed, white matter health and fluid intelligence. The model fit the data well, outperforming competing models and providing evidence for a many-to-one mapping between white matter integrity, processing speed and fluid intelligence. The model can be naturally extended to integrate other cognitive domains, endophenotypes and genotypes. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Toward “optimal” integration of terrestrial biosphere models
Schwalm, Christopher R.; Huntzinger, Deborah N.; Fisher, Joshua B.; ...
2015-06-10
Multimodel ensembles (MME) are commonplace in Earth system modeling. Here we perform MME integration using a 10-member ensemble of terrestrial biosphere models (TBMs) from the Multiscale synthesis and Terrestrial Model Intercomparison Project (MsTMIP). We contrast optimal (skill based for present-day carbon cycling) versus naive (one model-one vote) integration. MsTMIP optimal and naive mean land sink strength estimates (-1.16 versus -1.15 Pg C per annum, respectively) are statistically indistinguishable. This holds also for grid cell values and extends to gross uptake, biomass, and net ecosystem productivity. TBM skill is similarly indistinguishable. The added complexity of skill-based integration does not materially change MME values. This suggests that carbon metabolism has predictability limits and/or that all models and references are misspecified. Finally, resolving this issue requires addressing specific uncertainty types (initial conditions, structure, and references) and a change in model development paradigms currently dominant in the TBM community.
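The contrast between naive and skill-based integration can be sketched in a few lines; the ensemble values are invented, and inverse-squared-error weights against a benchmark stand in for MsTMIP's skill metric:

```python
import numpy as np

rng = np.random.default_rng(1)
truth = -1.15                                    # benchmark land sink strength, Pg C/yr
tbms = truth + rng.normal(0.0, 0.3, size=10)     # invented estimates from 10 TBMs

naive = tbms.mean()                              # one model-one vote
skill = 1.0 / (tbms - truth) ** 2                # inverse squared error vs the benchmark
optimal = np.average(tbms, weights=skill)        # skill-based integration

print(round(naive, 3), round(optimal, 3))
```

The paper's point is that when both means land on essentially the same value, the extra machinery of skill weighting buys little.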
Cloud Based Metalearning System for Predictive Modeling of Biomedical Data
Vukićević, Milan
2014-01-01
Rapid growth and storage of biomedical data enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selection of the best predictive algorithms for the data at hand and open source big data technologies for analysis of biomedical data. PMID:24892101
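The core of algorithm ranking and selection can be illustrated with cross-validation; this sketch uses a synthetic stand-in dataset and three arbitrary candidate learners (a real metalearner would rank using dataset meta-features rather than full cross-validation):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# stand-in "biomedical" dataset
X, y = make_classification(n_samples=400, n_features=20, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
}
# rank candidate algorithms by cross-validated accuracy and pick the best
mean_acc = {name: cross_val_score(est, X, y, cv=5).mean() for name, est in candidates.items()}
ranking = sorted(mean_acc, key=mean_acc.get, reverse=True)
print(ranking)
```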
Interpreting incremental value of markers added to risk prediction models.
Pencina, Michael J; D'Agostino, Ralph B; Pencina, Karol M; Janssens, A Cecile J W; Greenland, Philip
2012-09-15
The discrimination of a risk prediction model measures that model's ability to distinguish between subjects with and without events. The area under the receiver operating characteristic curve (AUC) is a popular measure of discrimination. However, the AUC has recently been criticized for its insensitivity in model comparisons in which the baseline model has performed well. Thus, 2 other measures have been proposed to capture improvement in discrimination for nested models: the integrated discrimination improvement and the continuous net reclassification improvement. In the present study, the authors use mathematical relations and numerical simulations to quantify the improvement in discrimination offered by candidate markers of different strengths as measured by their effect sizes. They demonstrate that the increase in the AUC depends on the strength of the baseline model, which is true to a lesser degree for the integrated discrimination improvement. On the other hand, the continuous net reclassification improvement depends only on the effect size of the candidate variable and its correlation with other predictors. These measures are illustrated using the Framingham model for incident atrial fibrillation. The authors conclude that the increase in the AUC, integrated discrimination improvement, and net reclassification improvement offer complementary information and thus recommend reporting all 3 alongside measures characterizing the performance of the final model.
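The three measures can be computed directly from predicted risks. A small sketch with synthetic nested models; the marker effect size of 0.5 is arbitrary:

```python
import numpy as np

def auc(score, event):
    """AUC as the probability that an event subject outranks a non-event subject."""
    pos, neg = score[event == 1], score[event == 0]
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

def continuous_nri(p_old, p_new, event):
    """Continuous NRI: net upward risk movement in events plus net downward in non-events."""
    up, down = p_new > p_old, p_new < p_old
    ev, ne = event == 1, event == 0
    return (up[ev].mean() - down[ev].mean()) + (down[ne].mean() - up[ne].mean())

# baseline score plus a candidate marker of effect size 0.5 (synthetic)
rng = np.random.default_rng(2)
n = 2000
base, marker = rng.normal(size=n), rng.normal(size=n)
y = rng.binomial(1, 1 / (1 + np.exp(-(base + 0.5 * marker))))
p_old = 1 / (1 + np.exp(-base))                   # model without the marker
p_new = 1 / (1 + np.exp(-(base + 0.5 * marker)))  # nested model with the marker

print(auc(p_old, y), auc(p_new, y), continuous_nri(p_old, p_new, y))
```

Consistent with the article's argument, shrinking the baseline effect leaves the continuous NRI roughly unchanged while the AUC increment grows, since the NRI depends only on the candidate marker's effect size and correlation structure.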
Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri
2014-01-01
Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the mainstream of clinical oncology. © 2014 Wiley Periodicals, Inc.
Earing Prediction in Cup Drawing using the BBC2008 Yield Criterion
NASA Astrophysics Data System (ADS)
Vrh, Marko; Halilovič, Miroslav; Starman, Bojan; Štok, Boris; Comsa, Dan-Sorin; Banabic, Dorel
2011-08-01
The paper deals with constitutive modelling of highly anisotropic sheet metals. It presents FEM based earing predictions in cup drawing simulation of highly anisotropic aluminium alloys where more than four ears occur. For that purpose the BBC2008 yield criterion, which is a plane-stress yield criterion formulated in the form of a finite series, is used. The criterion thus defined can be expanded to retain more or fewer terms, depending on the amount of given experimental data. In order to use the model in sheet metal forming simulations we have implemented it in a general purpose finite element code ABAQUS/Explicit via VUMAT subroutine, considering alternatively eight or sixteen parameters (8p and 16p version). For the integration of the constitutive model the explicit NICE (Next Increment Corrects Error) integration scheme has been used. Due to the scheme effectiveness the CPU time consumption for a simulation is comparable to the time consumption of built-in constitutive models. Two aluminium alloys, namely AA5042-H2 and AA2090-T3, have been used for validation of the model. For both alloys the parameters of the BBC2008 model have been identified with a numerical procedure based on minimization of a purpose-built cost function. For both materials, the predictions of the BBC2008 model prove to be in very good agreement with the experimental results. The flexibility and the accuracy of the model together with the identification and integration procedure guarantee the applicability of the BBC2008 yield criterion in industrial applications.
Wave models for turbulent free shear flows
NASA Technical Reports Server (NTRS)
Liou, W. W.; Morris, P. J.
1991-01-01
New predictive closure models for turbulent free shear flows are presented. They are based on an instability wave description of the dominant large scale structures in these flows using a quasi-linear theory. Three models were developed to study the structural dynamics of turbulent motions of different scales in free shear flows. The local characteristics of the large scale motions are described using linear theory. Their amplitude is determined from an energy integral analysis. The models were applied to the study of an incompressible free mixing layer. In all cases, predictions are made for the development of the mean flow field. In the last model, predictions of the time dependent motion of the large scale structure of the mixing region are made. The predictions show good agreement with experimental observations.
Vivek-Ananth, R P; Samal, Areejit
2016-09-01
A major goal of systems biology is to build predictive computational models of cellular metabolism. Availability of complete genome sequences and a wealth of legacy biochemical information have led to the reconstruction of genome-scale metabolic networks in the last 15 years for several organisms across the three domains of life. Due to the paucity of information on kinetic parameters associated with metabolic reactions, the constraint-based modelling approach, flux balance analysis (FBA), has proved to be a vital alternative for investigating the capabilities of reconstructed metabolic networks. In parallel, the advent of high-throughput technologies has led to the generation of massive amounts of omics data on transcriptional regulation, comprising mRNA transcript levels and genome-wide binding profiles of transcriptional regulators. A frontier area in metabolic systems biology has been the development of methods to integrate the available transcriptional regulatory information into constraint-based models of reconstructed metabolic networks in order to increase the predictive capabilities of computational models and understand the regulation of cellular metabolism. Here, we review the existing methods to integrate transcriptional regulatory information into constraint-based models of metabolic networks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
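FBA reduces to a linear program: maximize an objective flux subject to steady-state mass balance Sv = 0 and flux bounds. A minimal sketch on an invented three-reaction network:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network -- metabolites A, B (rows); reactions (cols): uptake, v1: A -> B, v2: B -> biomass
S = np.array([
    [1, -1,  0],   # A: produced by uptake, consumed by v1
    [0,  1, -1],   # B: produced by v1, consumed by v2 (biomass)
])
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units
c = np.array([0.0, 0.0, -1.0])             # linprog minimizes, so negate the biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)   # optimal flux distribution
```

Transcriptional regulation can be layered on top in the spirit of the methods the review covers, e.g. by shrinking a reaction's upper bound when its gene is lowly expressed (`bounds[1] = (0, expression_cap)`), which is the E-Flux idea in one line.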
A grey NGM(1,1, k) self-memory coupling prediction model for energy consumption prediction.
Guo, Xiaojun; Liu, Sifeng; Wu, Lifeng; Tang, Lingling
2014-01-01
Energy consumption prediction is an important issue for governments, energy sector investors, and other related corporations. Although there are several prediction techniques, selection of the most appropriate technique is of vital importance. For the approximate nonhomogeneous exponential data sequences that often emerge in energy systems, a novel grey NGM(1,1,k) self-memory coupling prediction model is put forward to improve predictive performance. It achieves organic integration of the self-memory principle of dynamic systems and the grey NGM(1,1,k) model. The traditional grey model's weakness of sensitivity to the initial value can be overcome by the self-memory principle. In this study, total energy, coal, and electricity consumption of China are used for demonstration of the proposed coupling prediction technique. The results show the superiority of the NGM(1,1,k) self-memory coupling prediction model when compared with results from the literature. Its excellent prediction performance lies in the fact that the proposed coupling model can take full advantage of the systematic multitime historical data and capture the stochastic fluctuation tendency. This work also makes a significant contribution to the enrichment of grey prediction theory and the extension of its application span.
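The grey-model core can be sketched as follows. This is the plain GM(1,1); the NGM(1,1,k) variant adds a time-proportional term and the self-memory coupling is omitted, and the consumption series is invented:

```python
import numpy as np

def gm11(x0, n_forecast=3):
    """Plain GM(1,1): fit x0(k) = -a*z1(k) + b on the accumulated series, then
    difference the continuous solution back to the original scale."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # mean-generated background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # development and grey input coefficients
    k = np.arange(len(x0) + n_forecast)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time response function
    return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])

consumption = 2.0 * 1.1 ** np.arange(6)   # invented near-exponential annual series
pred = gm11(consumption, n_forecast=3)    # 6 fitted values + 3 forecasts
print(pred.round(3))
```

The initial-value sensitivity the abstract mentions is visible in the time response function, which pins the solution to x0[0]; the self-memory principle replaces that single anchor with a weighted memory of the whole history.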
Potter, Adam W; Blanchard, Laurie A; Friedl, Karl E; Cadarette, Bruce S; Hoyt, Reed W
2017-02-01
Physiological models provide useful summaries of complex interrelated regulatory functions. These can often be reduced to simple input requirements and simple predictions for pragmatic applications. This paper demonstrates this modeling efficiency by tracing the development of one such simple model, the Heat Strain Decision Aid (HSDA), originally developed to address Army needs. The HSDA, which derives from the Givoni-Goldman equilibrium body core temperature prediction model, uses 16 inputs from four elements: individual characteristics, physical activity, clothing biophysics, and environmental conditions. These inputs are used to mathematically predict core temperature (Tc) rise over time and can estimate water turnover from sweat loss. Based on a history of military applications such as derivation of training and mission planning tools, we conclude that the HSDA model is a robust integration of physiological rules that can guide a variety of useful predictions. The HSDA model is limited to generalized predictions of thermal strain and does not provide individualized predictions that could be obtained from physiological sensor data-driven predictive models. This fully transparent physiological model should be improved and extended with new findings and new challenging scenarios. Published by Elsevier Ltd.
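The Givoni-Goldman family of models predicts an equilibrium core temperature from the activity, clothing, and environment inputs, with Tc rising toward it roughly exponentially. A minimal sketch of that structure; the equilibrium value, time constant, and starting temperature are illustrative placeholders, not HSDA outputs:

```python
import numpy as np

def core_temp_trajectory(tc0, tc_eq, tau_min, minutes):
    """Exponential approach to an equilibrium core temperature; in HSDA, tc_eq and
    the time constant would be computed from the 16 individual/activity/clothing/
    environment inputs rather than supplied directly."""
    t = np.arange(minutes + 1, dtype=float)
    return tc_eq - (tc_eq - tc0) * np.exp(-t / tau_min)

traj = core_temp_trajectory(tc0=37.0, tc_eq=38.6, tau_min=35.0, minutes=120)
print(round(traj[-1], 2))   # Tc after 2 h of sustained work, degrees C
```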
An Interoceptive Predictive Coding Model of Conscious Presence
Seth, Anil K.; Suzuki, Keisuke; Critchley, Hugo D.
2011-01-01
We describe a theoretical model of the neurocognitive mechanisms underlying conscious presence and its disturbances. The model is based on interoceptive prediction error and is informed by predictive models of agency, general models of hierarchical predictive coding and dopaminergic signaling in cortex, the role of the anterior insular cortex (AIC) in interoception and emotion, and cognitive neuroscience evidence from studies of virtual reality and of psychiatric disorders of presence, specifically depersonalization/derealization disorder. The model associates presence with successful suppression by top-down predictions of informative interoceptive signals evoked by autonomic control signals and, indirectly, by visceral responses to afferent sensory signals. The model connects presence to agency by allowing that predicted interoceptive signals will depend on whether afferent sensory signals are determined, by a parallel predictive-coding mechanism, to be self-generated or externally caused. Anatomically, we identify the AIC as the likely locus of key neural comparator mechanisms. Our model integrates a broad range of previously disparate evidence, makes predictions for conjoint manipulations of agency and presence, offers a new view of emotion as interoceptive inference, and represents a step toward a mechanistic account of a fundamental phenomenological property of consciousness. PMID:22291673
Vlot, Anna H C; de Witte, Wilhelmus E A; Danhof, Meindert; van der Graaf, Piet H; van Westen, Gerard J P; de Lange, Elizabeth C M
2017-12-04
Selectivity is an important attribute of effective and safe drugs, and prediction of in vivo target and tissue selectivity would likely improve drug development success rates. However, a lack of understanding of the underlying (pharmacological) mechanisms and availability of directly applicable predictive methods complicates the prediction of selectivity. We explore the value of combining physiologically based pharmacokinetic (PBPK) modeling with quantitative structure-activity relationship (QSAR) modeling to predict the influence of the target dissociation constant (KD) and the target dissociation rate constant (koff) on target and tissue selectivity. The KD values of CB1 ligands in the ChEMBL database are predicted by QSAR random forest (RF) modeling for the CB1 receptor and known off-targets (TRPV1, mGlu5, 5-HT1a). Of these CB1 ligands, rimonabant, CP-55940, and Δ8-tetrahydrocannabinol, one of the active ingredients of cannabis, were selected for simulations of target occupancy for CB1, TRPV1, mGlu5, and 5-HT1a in three brain regions, to illustrate the principles of the combined PBPK-QSAR modeling. Our combined PBPK and target binding modeling demonstrated that the optimal values of KD and koff for target and tissue selectivity were dependent on target concentration and tissue distribution kinetics. Interestingly, if the target concentration is high and the perfusion of the target site is low, the optimal KD value is often not the lowest KD value, suggesting that optimization towards high drug-target affinity can decrease the benefit-risk ratio. The presented integrative structure-pharmacokinetic-pharmacodynamic modeling provides an improved understanding of tissue and target selectivity.
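The target-binding component underlying such occupancy simulations can be sketched with simple association/dissociation kinetics; the rate constants and the assumption of a constant free drug concentration are illustrative, whereas the paper couples this to full PBPK tissue kinetics:

```python
def occupancy(conc, kd, koff, rtot=1.0, t_end=10.0, dt=1e-3):
    """Euler integration of target binding: dRL/dt = kon*C*(Rtot - RL) - koff*RL,
    with kon = koff / KD and free drug concentration C held constant (illustrative)."""
    kon = koff / kd
    rl = 0.0
    for _ in range(int(t_end / dt)):
        rl += dt * (kon * conc * (rtot - rl) - koff * rl)
    return rl / rtot   # fractional occupancy

print(occupancy(conc=1.0, kd=1.0, koff=5.0))   # approaches C / (C + KD) = 0.5 at C = KD
```

Varying `kd` and `koff` across the on-target and off-target values, and letting `conc` follow a PBPK-predicted tissue profile, reproduces the paper's point that the selectivity optimum depends on target concentration and distribution kinetics.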
Uncertainty aggregation and reduction in structure-material performance prediction
NASA Astrophysics Data System (ADS)
Hu, Zhen; Mahadevan, Sankaran; Ao, Dan
2018-02-01
An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
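The segment-wise validation gate can be illustrated with a grid-based Bayesian update of a single calibration parameter; the toy linear model, noise level, and three observation segments below are invented stand-ins for the structural-analysis and material models:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = np.linspace(-1.0, 1.0, 201)            # grid over a model bias parameter
post = np.full_like(theta, 1.0 / theta.size)   # uniform prior
sigma, true_theta = 0.2, 0.3                   # observation noise and true bias

def predict(x, th):
    return 2.0 * x + th                        # toy structure-material model

for lo, hi in [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0)]:   # discretized observation segments
    x = rng.uniform(lo, hi, 5)
    y = predict(x, true_theta) + rng.normal(0.0, sigma, 5)
    th_hat = theta @ post                      # current posterior-mean parameter
    if np.abs(y - predict(x, th_hat)).mean() > 3 * sigma:
        continue                               # validation gate: model unreliable here, skip
    like = np.exp(-0.5 * ((y[:, None] - predict(x[:, None], theta)) / sigma) ** 2).prod(axis=0)
    post = post * like
    post /= post.sum()                         # Bayesian update on a reliable segment only

print(theta @ post)                            # posterior mean of the bias parameter
```

In the paper the gate is a formal model-validation metric per segment rather than a fixed residual threshold, but the flow is the same: update only where the prediction is deemed reliable.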
A multi-model framework for simulating wildlife population response to land-use and climate change
McRae, B.H.; Schumaker, N.H.; McKane, R.B.; Busing, R.T.; Solomon, A.M.; Burdick, C.A.
2008-01-01
Reliable assessments of how human activities will affect wildlife populations are essential for making scientifically defensible resource management decisions. A principal challenge of predicting effects of proposed management, development, or conservation actions is the need to incorporate multiple biotic and abiotic factors, including land-use and climate change, that interact to affect wildlife habitat and populations through time. Here we demonstrate how models of land-use, climate change, and other dynamic factors can be integrated into a coherent framework for predicting wildlife population trends. Our framework starts with land-use and climate change models developed for a region of interest. Vegetation changes through time under alternative future scenarios are predicted using an individual-based plant community model. These predictions are combined with spatially explicit animal habitat models to map changes in the distribution and quality of wildlife habitat expected under the various scenarios. Animal population responses to habitat changes and other factors are then projected using a flexible, individual-based animal population model. As an example application, we simulated animal population trends under three future land-use scenarios and four climate change scenarios in the Cascade Range of western Oregon. We chose two birds with contrasting habitat preferences for our simulations: winter wrens (Troglodytes troglodytes), which are most abundant in mature conifer forests, and song sparrows (Melospiza melodia), which prefer more open, shrubby habitats. We used climate and land-use predictions from previously published studies, as well as previously published predictions of vegetation responses using FORCLIM, an individual-based forest dynamics simulator. Vegetation predictions were integrated with other factors in PATCH, a spatially explicit, individual-based animal population simulator. 
Through incorporating effects of landscape history and limited dispersal, our framework predicted population changes that typically exceeded those expected based on changes in mean habitat suitability alone. Although land-use had greater impacts on habitat quality than did climate change in our simulations, we found that small changes in vital rates resulting from climate change or other stressors can have large consequences for population trajectories. The ability to integrate bottom-up demographic processes like these with top-down constraints imposed by climate and land-use in a dynamic modeling environment is a key advantage of our approach. The resulting framework should allow researchers to synthesize existing empirical evidence, and to explore complex interactions that are difficult or impossible to capture through piecemeal modeling approaches. © 2008 Elsevier B.V.
Gu, Deqing; Jian, Xingxing; Zhang, Cheng; Hua, Qiang
2017-01-01
Genome-scale metabolic network models (GEMs) have played important roles in the design of genetically engineered strains and have helped biologists to decipher metabolism. However, due to the complex gene-reaction relationships that exist in model systems, most algorithms have limited capabilities with respect to directly predicting accurate genetic designs for metabolic engineering. In particular, methods that predict reaction knockout strategies leading to overproduction are often impractical in terms of gene manipulations. Recently, we proposed a method named logical transformation of model (LTM) to simplify the gene-reaction associations by introducing intermediate pseudo reactions, which makes it possible to generate genetic designs. Here, we propose an alternative method that relieves researchers from deciphering complex gene-reaction associations by adding pseudo gene-controlling reactions. In comparison to LTM, this new method introduces fewer pseudo reactions and generates a much smaller model system, named gModel. We showed that gModel allows two seldom-reported applications: identification of minimal genomes and design of minimal cell factories within a modified OptKnock framework. In addition, gModel could be used to integrate expression data directly and improve the performance of the E-Fmin method for predicting fluxes. In conclusion, the model transformation procedure will facilitate genetic research based on GEMs, extending their applications.
A process-based model for cattle manure compost windrows: Model performance and application
USDA-ARS?s Scientific Manuscript database
A model was developed and incorporated in the Integrated Farm System Model (IFSM, v.4.3) that simulates important processes occurring during windrow composting of manure. The model, documented in an accompanying paper, predicts changes in windrow properties and conditions and the resulting emissions...
Bayesian Knowledge Fusion in Prognostics and Health Management—A Case Study
NASA Astrophysics Data System (ADS)
Rabiei, Masoud; Modarres, Mohammad; Mohammad-Djafari, Ali
2011-03-01
In the past few years, a research effort has been in progress at the University of Maryland to develop a Bayesian framework based on Physics of Failure (PoF) for risk assessment and fleet management of aging airframes. Despite significant achievements in modelling of crack growth behavior using fracture mechanics, it is still of great interest to find practical techniques for monitoring crack growth using nondestructive inspection and to integrate such inspection results with the fracture mechanics models to improve the predictions. The ultimate goal of this effort is to develop an integrated probabilistic framework for utilizing all of the available information to come up with enhanced (less uncertain) predictions for the structural health of the aircraft in future missions. Such information includes material level fatigue models and test data, health monitoring measurements and inspection field data. In this paper, a case study of using a Bayesian fusion technique for integrating information from multiple sources in a structural health management problem is presented.
Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor
2016-01-01
A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher at-risk population rates were found more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a possible decrease in the expected number of casualties.
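Folding effect measures into a baseline casualty probability via logistic regression equations can be sketched as an odds-ratio adjustment on the logit scale; the baseline probability and the odds-ratio values below are illustrative, not figures from the study:

```python
import math

def adjusted_probability(p_baseline, odds_ratios):
    """Apply human-related risk factors to a baseline (HAZUS-style) injury probability
    by multiplying odds ratios on the logit scale."""
    logit = math.log(p_baseline / (1.0 - p_baseline))
    logit += sum(math.log(or_) for or_ in odds_ratios.values())
    return 1.0 / (1.0 + math.exp(-logit))

# hypothetical effect measures for one census tract
p = adjusted_probability(0.02, {"age_over_65": 1.8, "disability": 1.5, "low_ses": 1.3})
print(round(p, 4))
```

With no risk factors present the function returns the structural baseline unchanged, so the structural and human-related contributions stay separable.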
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Boyun; Duguid, Andrew; Nygaard, Ronar
The objective of this project is to develop a computerized statistical model with the Integrated Neural-Genetic Algorithm (INGA) for predicting the probability of long-term leak of wells in CO2 sequestration operations. This objective has been accomplished by conducting research in three phases: 1) data mining of CO2-exposed wells, 2) INGA computer model development, and 3) evaluation of the predictive performance of the computer model with data from field tests. Data mining was conducted for 510 wells in two CO2 sequestration projects in the Texas Gulf Coast region: the Hasting West field and the Oyster Bayou field in southern Texas. Missing wellbore integrity data were estimated using an analytical and Finite Element Method (FEM) model. The INGA was first tested for convergence and computational efficiency with the obtained data set of high dimension. It was concluded that the INGA can handle the gathered data set with good accuracy and reasonable computing time after a reduction of dimension with a grouping mechanism. A computerized statistical model with the INGA was then developed based on data pre-processing and grouping. Comprehensive training and testing of the model were carried out to ensure that the model is accurate and efficient enough for predicting the probability of long-term leak of wells in CO2 sequestration operations. The Cranfield site in southern Mississippi was selected as the test site. Observation wells CFU31F2 and CFU31F3 were used for pressure-testing, formation-logging, and cement-sampling. Tools run in the wells include Isolation Scanner, Slim Cement Mapping Tool (SCMT), Cased Hole Formation Dynamics Tester (CHDT), and Mechanical Sidewall Coring Tool (MSCT). Analyses of the obtained data indicate no leak of CO2 across the cap zone, while it is evident that the well cement sheath was invaded by CO2 from the storage zone. 
This observation is consistent with the result predicted by the INGA model, which indicates the well has a CO2 leak-safe probability of 72%. This comparison implies that the developed INGA model is valid for future use in predicting well leak probability.
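The neural-genetic idea behind an INGA-style model can be sketched as follows: a genetic algorithm searches the weights of a small neural network instead of gradient descent. The population size, mutation scale, network shape, and toy well-feature data are all illustrative assumptions; the project's actual INGA and its grouping mechanism are far richer.

```python
# Sketch: genetic algorithm optimizing neural-network weights (toy INGA).
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 4))              # toy well-integrity features
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # toy leak-safe label

def predict(w, X):
    """Tiny 4-5-1 network; weight vector w has 30 entries."""
    W1, b1, W2 = w[:20].reshape(4, 5), w[20:25], w[25:30]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2)))  # leak-safe probability

def fitness(w):
    return -np.mean((predict(w, X) - y) ** 2)  # negative MSE: higher is better

pop = rng.normal(size=(40, 30))
for _ in range(60):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]           # keep the 10 fittest
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.1, (30, 30))
    pop = np.vstack([parents, children])              # elitism + mutation

best = pop[np.argmax([fitness(w) for w in pop])]
accuracy = np.mean((predict(best, X) > 0.5) == y)
```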
NASA Astrophysics Data System (ADS)
Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef
2017-04-01
Hydrogeophysics is an interdisciplinary field of sciences aiming at a better understanding of subsurface hydrological processes. While geophysical surveys have been successfully used to qualitatively characterize the subsurface, two important challenges remain for a better quantification of hydrological processes: (1) the inversion of geophysical data and (2) their integration in hydrological subsurface models. The classical inversion approach using regularization suffers from spatially and temporally varying resolution and yields geologically unrealistic solutions without uncertainty quantification, making their utilization for hydrogeological calibration less consistent. More advanced techniques such as coupled inversion allow for a direct use of geophysical data for conditioning groundwater and solute transport model calibration. However, the technique is difficult to apply in complex cases and remains computationally demanding to estimate uncertainty. In a recent study, we investigate a prediction-focused approach (PFA) to directly estimate subsurface physical properties from geophysical data, circumventing the need for classic inversions. In PFA, we seek a direct relationship between the data and the subsurface variables we want to predict (the forecast). This relationship is obtained through a prior set of subsurface models for which both data and forecast are computed. A direct relationship can often be derived through dimension reduction techniques. PFA offers a framework for both hydrogeophysical "inversion" and hydrogeophysical data integration. For hydrogeophysical "inversion", the considered forecast variable is the subsurface variable, such as the salinity. An ensemble of possible solutions is generated, allowing uncertainty quantification. For hydrogeophysical data integration, the forecast variable becomes the prediction we want to make with our subsurface models, such as the concentration of contaminant in a drinking water production well. 
Geophysical and hydrological data are combined to derive a direct relationship between data and forecast. We illustrate the process for the design of an aquifer thermal energy storage (ATES) system. An ATES system can theoretically recover in winter the heat stored in the aquifer during summer. In practice, the energy efficiency is often lower than expected due to spatial heterogeneity of hydraulic properties combined with an unfavorable hydrogeological gradient. A proper design of ATES systems should consider the uncertainty of the prediction related to those parameters. With a global sensitivity analysis, we identify sensitive parameters for heat storage prediction and validate the use of a short-term heat tracing experiment monitored with geophysics to generate informative data. First, we illustrate how PFA can be used to successfully derive the distribution of temperature in the aquifer from ERT during the heat tracing experiment. Then, we successfully integrate the geophysical data to predict medium-term heat storage in the aquifer using PFA. The result is a full quantification of the posterior distribution of the prediction conditioned to observed data in a relatively limited time budget.
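The dimension-reduction route to a direct data-forecast relationship can be sketched as follows, assuming (purely for illustration) a linear data-forecast relationship, PCA as the reduction technique, and a synthetic prior ensemble; the actual PFA workflow uses richer reduction and regression machinery.

```python
# Sketch: prediction-focused approach -- learn a direct mapping from reduced
# geophysical data to the forecast variable, skipping classical inversion.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)
hidden = rng.normal(size=(200, 3))            # prior subsurface parameters
data = hidden @ rng.normal(size=(3, 50))      # simulated geophysical data
forecast = hidden @ rng.normal(size=(3, 1))   # simulated prediction variable

pca = PCA(n_components=3).fit(data)
scores = pca.transform(data)                  # low-dimensional data summary
reg = LinearRegression().fit(scores, forecast)

new_obs = pca.transform(data[:1])             # a "field" observation
predicted = reg.predict(new_obs)              # direct forecast, no inversion
```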
1978-09-01
AWACS EMP Guidelines presents two different models to predict the damage power of the device and the circuit damage EMP voltage (VEMP). Neither of... The damage power is calculated as P = K I V_BD. 6. The damage EMP voltage (VEMP) is calculated as V_EMP = I(Z_EMP + Z_D) + V_BD. 7. The damage EMP voltage is calculated for collector
NASA Technical Reports Server (NTRS)
Jones, J. E.; Richmond, J. H.
1974-01-01
An integral equation formulation is applied to predict pitch- and roll-plane radiation patterns of a thin VHF/UHF (very high frequency/ultra high frequency) annular slot communications antenna operating at several locations in the nose region of the space shuttle orbiter. Digital computer programs used to compute radiation patterns are given and the use of the programs is illustrated. Experimental verification of computed patterns is given from measurements made on 1/35-scale models of the orbiter.
Scherbaum, Stefan; Dshemuchadse, Maja; Goschke, Thomas
2012-01-01
Temporal discounting denotes the fact that individuals prefer smaller rewards delivered sooner over larger rewards delivered later, often to a higher extent than suggested by normative economic theories. In this article, we identify three lines of research studying this phenomenon which aim (i) to describe temporal discounting mathematically, (ii) to explain observed choice behavior psychologically, and (iii) to predict the influence of specific factors on intertemporal decisions. We then opt for an approach integrating postulated mechanisms and empirical findings from these three lines of research. Our approach focuses on the dynamical properties of decision processes and is based on computational modeling. We present a dynamic connectionist model of intertemporal choice focusing on the role of self-control and time framing as two central factors determining choice behavior. Results of our simulations indicate that the two influences interact with each other, and we present experimental data supporting this prediction. We conclude that computational modeling of the decision process dynamics can advance the integration of different strands of research in intertemporal choice. PMID:23181048
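The mathematical description of temporal discounting (line of research (i) above) is commonly a hyperbolic discount function; a minimal sketch, with an illustrative discount rate and reward values that are not from the study:

```python
# Sketch: hyperbolic temporal discounting, V = A / (1 + k*D).

def hyperbolic_value(amount, delay, k):
    """Subjective value of a reward of `amount` received after `delay`."""
    return amount / (1.0 + k * delay)

# A smaller-sooner reward can beat a larger-later one:
sooner = hyperbolic_value(50.0, delay=0.0, k=0.05)
later = hyperbolic_value(100.0, delay=30.0, k=0.05)
assert sooner > later  # the immediate $50 is preferred over the delayed $100
```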
The Global Integrated Drought Monitoring and Prediction System (GIDMaPS): Overview and Capabilities
NASA Astrophysics Data System (ADS)
AghaKouchak, A.; Hao, Z.; Farahmand, A.; Nakhjiri, N.
2013-12-01
Development of reliable monitoring and prediction indices and tools is fundamental to drought preparedness and management. Motivated by the Global Drought Information Systems (GDIS) activities, this paper presents the Global Integrated Drought Monitoring and Prediction System (GIDMaPS), which provides near real-time drought information using both remote sensing observations and model simulations. The monthly data from the NASA Modern-Era Retrospective analysis for Research and Applications (MERRA-Land), North American Land Data Assimilation System (NLDAS), and remotely sensed precipitation data are used as input to GIDMaPS. Numerous indices have been developed for drought monitoring based on various indicator variables (e.g., precipitation, soil moisture, water storage). Defining droughts based on a single variable (e.g., precipitation, soil moisture or runoff) may not be sufficient for reliable risk assessment and decision making. GIDMaPS provides drought information based on multiple indices including the Standardized Precipitation Index (SPI), the Standardized Soil Moisture Index (SSI) and the Multivariate Standardized Drought Index (MSDI), which combines SPI and SSI probabilistically. In other words, MSDI incorporates the meteorological and agricultural drought conditions for overall characterization of droughts. The seasonal prediction component of GIDMaPS is based on a persistence model which requires historical data and near-past observations. The seasonal drought prediction component is based on two input data sets (MERRA and NLDAS) and three drought indicators (SPI, SSI and MSDI). The drought prediction model provides the empirical probability of drought for different severity levels. In this presentation, both monitoring and prediction components of GIDMaPS will be discussed, and the results from several major droughts including the 2013 Namibia, 2012-2013 United States, 2011-2012 Horn of Africa, and 2010 Amazon Droughts will be presented. 
The results indicate that GIDMaPS advances our drought monitoring and prediction capabilities through integration of multiple data and indicators.
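The SPI indicator mentioned above can be sketched as follows: fit a gamma distribution to precipitation totals and map the fitted cumulative probabilities to standard-normal quantiles. This is a simplified single-timescale illustration on synthetic data (omitting the usual zero-precipitation correction), not the operational GIDMaPS implementation.

```python
# Sketch: Standardized Precipitation Index via a fitted gamma distribution.
import numpy as np
from scipy import stats

def spi(precip):
    """Map precipitation totals to SPI values via a fitted gamma CDF."""
    precip = np.asarray(precip, dtype=float)
    shape, loc, scale = stats.gamma.fit(precip, floc=0)  # fit gamma to totals
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)  # standard-normal quantiles are the SPI

rng = np.random.default_rng(0)
sample = rng.gamma(2.0, 30.0, size=360)  # 30 years of synthetic monthly totals
z = spi(sample)
# SPI near 0 means near-normal conditions; values below -1.0 indicate drought.
```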
Hughes, Anne K; Rostant, Ola S; Curran, Paul G
2014-07-01
Talking about sexual health can be a challenge for some older women. This project was initiated to identify key factors that improve communication between aging women and their primary care providers. A sample of women (aged 60+) completed an online survey regarding their intent to communicate with a provider about sexual health. Using the integrative model of behavioral prediction as a guide, the survey instrument captured data on attitudes, perceived norms, self-efficacy, and intent to communicate with a provider about sexual health. Data were analyzed using structural equation modeling. Self-efficacy and perceived norms were the most important factors predicting intent to communicate for this sample of women. Intent did not vary with race, but mean scores of the predictors of intent varied for African American and White women. Results can guide practice and intervention with ethnically diverse older women who may be struggling to communicate about their sexual health concerns. © The Author(s) 2013.
Passenger Flow Forecasting Research for Airport Terminal Based on SARIMA Time Series Model
NASA Astrophysics Data System (ADS)
Li, Ziyu; Bi, Jun; Li, Zhiyin
2017-12-01
Based on operational data from Kunming Changshui International Airport during 2016, this paper proposes a Seasonal Autoregressive Integrated Moving Average (SARIMA) model to predict passenger flow. The model considers not only the non-stationarity and autocorrelation of the sequence but also its daily periodicity. The prediction results can accurately describe the change trend of airport passenger flow and provide scientific decision support for the optimal allocation of airport resources and optimization of the departure process. The results show that this model is applicable to short-term prediction of airport terminal departure passenger traffic, with an average error ranging from 1% to 3%. The difference between the predicted and true values of passenger traffic flow is quite small, which indicates that the model has fairly good prediction ability.
Briët, Olivier J T; Amerasinghe, Priyanie H; Vounatsou, Penelope
2013-01-01
With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions' impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during "consolidation" and "pre-elimination" phases. Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non-Gaussian, non-stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low.
Briët, Olivier J. T.; Amerasinghe, Priyanie H.; Vounatsou, Penelope
2013-01-01
Introduction With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions' impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during "consolidation" and "pre-elimination" phases. Methods Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non-Gaussian, non-stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. Results The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. Conclusions G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low. PMID:23785448
Wang, Shuangquan; Sun, Huiyong; Liu, Hui; Li, Dan; Li, Youyong; Hou, Tingjun
2016-08-01
Blockade of human ether-à-go-go related gene (hERG) channel by compounds may lead to drug-induced QT prolongation, arrhythmia, and Torsades de Pointes (TdP), and therefore reliable prediction of hERG liability in the early stages of drug design is quite important to reduce the risk of cardiotoxicity-related attritions in the later development stages. In this study, pharmacophore modeling and machine learning approaches were combined to construct classification models to distinguish hERG active from inactive compounds based on a diverse data set. First, an optimal ensemble of pharmacophore hypotheses that had good capability to differentiate hERG active from inactive compounds was identified by the recursive partitioning (RP) approach. Then, the naive Bayesian classification (NBC) and support vector machine (SVM) approaches were employed to construct classification models by integrating multiple important pharmacophore hypotheses. The integrated classification models showed improved predictive capability over any single pharmacophore hypothesis, suggesting that the broad binding polyspecificity of hERG can only be well characterized by multiple pharmacophores. The best SVM model achieved the prediction accuracies of 84.7% for the training set and 82.1% for the external test set. Notably, the accuracies for the hERG blockers and nonblockers in the test set reached 83.6% and 78.2%, respectively. Analysis of significant pharmacophores helps to understand the multimechanisms of action of hERG blockers. We believe that the combination of pharmacophore modeling and SVM is a powerful strategy to develop reliable theoretical models for the prediction of potential hERG liability.
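The classification step can be sketched with scikit-learn: an SVM trained on binary pharmacophore-match features to separate blockers from nonblockers. The random fingerprints and labels below are synthetic stand-ins for real pharmacophore hypothesis matches; any accuracy obtained here is illustrative only.

```python
# Sketch: SVM classifier over binary pharmacophore-hypothesis features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.integers(0, 2, size=(300, 12)).astype(float)  # pharmacophore hits
w = rng.normal(size=12)
y = (X @ w > np.median(X @ w)).astype(int)  # synthetic blocker labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)  # train on the training split
accuracy = clf.score(X_te, y_te)                # evaluate on held-out data
```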
Mohr, Philip; Golley, Sinéad
2016-01-25
This study examined community responses to use of genetically modified (GM) content in food in the context of responses to familiar food additives by testing an empirically and theoretically derived model of the predictors of responses to both GM content and food integrity issues generally. A nationwide sample of 849 adults, selected at random from the Australian Electoral Roll, responded to a postal Food and Health Survey. Structural equation modelling analyses confirmed that ratings of general concern about food integrity (related to the presence of preservatives and other additives) strongly predicted negativity towards GM content. Concern about food integrity was, in turn, predicted by environmental concern and health engagement. In addition, both concern about food integrity generally and responses to GM content specifically were weakly predicted by attitudes to benefits of science and an intuitive (i.e., emotionally-based) reasoning style. Data from a follow-up survey conducted under the same conditions (N=1184) revealed that ratings of concern were significantly lower for use of genetic engineering in food than for four other common food integrity issues examined. Whereas the question of community responses to GM is often treated as a special issue, these findings support the conclusion that responses to the concept of GM content in food in Australia are substantially a specific instance of a general sensitivity towards the integrity of the food supply. They indicate that the origins of responses to GM content may be largely indistinguishable from those of general responses to preservatives and other common food additives. Copyright © 2015 Elsevier B.V. All rights reserved.
Robust PBPK/PD-Based Model Predictive Control of Blood Glucose.
Schaller, Stephan; Lippert, Jorg; Schaupp, Lukas; Pieber, Thomas R; Schuppert, Andreas; Eissing, Thomas
2016-07-01
Automated glucose control (AGC) has not yet reached the point where it can be applied clinically [3]. Challenges are the accuracy of subcutaneous (SC) glucose sensors, physiological lag times, and both inter- and intraindividual variability. To address these issues, we developed a novel scheme for model predictive control (MPC) that can be applied to AGC. An individualizable generic whole-body physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) model of glucose, insulin, and glucagon metabolism has been used as the predictive kernel. The high level of mechanistic detail represented by the model takes full advantage of the potential of MPC and may make long-term prediction possible, as it captures at least some relevant sources of variability [4]. Robustness against uncertainties was increased by a control cascade relying on proportional-integral-derivative-based offset control. The performance of this AGC scheme was evaluated in silico and retrospectively using data from clinical trials. This analysis revealed that our approach handles sensor noise with a mean absolute relative difference (MARD) of 10%-14%, as well as model uncertainties and disturbances. The results suggest that PBPK/PD models are well suited for MPC in a glucose control setting, and that their predictive power, in combination with the integrated database-driven (a priori individualizable) model framework, will help overcome current challenges in the development of AGC systems. This study provides a new, generic, and robust mechanistic approach to AGC using a PBPK platform with extensive a priori (database) knowledge for individualization.
Dynamic Modeling, Controls, and Testing for Electrified Aircraft
NASA Technical Reports Server (NTRS)
Connolly, Joseph; Stalcup, Erik
2017-01-01
Electrified aircraft have the potential to provide significant benefits for efficiency and emissions reductions. To assess these potential benefits, modeling tools are needed to provide rapid evaluation of diverse concepts and to ensure safe operability and peak performance over the mission. The modeling challenge for these vehicles is the ability to show significant benefits over the current highly refined aircraft systems. The STARC-ABL (single-aisle turbo-electric aircraft with an aft boundary layer propulsor) is a new test proposal that builds upon previous N3-X team hybrid designs. This presentation describes the STARC-ABL concept, the NASA Electric Aircraft Testbed (NEAT) which will allow testing of the STARC-ABL powertrain, and the related modeling and simulation efforts to date. Modeling and simulation includes a turbofan simulation, the Numerical Propulsion System Simulation (NPSS), which has been integrated with NEAT, and a power systems and control model for predicting testbed performance and evaluating control schemes. Model predictions provide good comparisons with testbed data for an NPSS-integrated test of the single-string configuration of NEAT.
Dynamic substrate preferences predict metabolic properties of a simple microbial consortium
Erbilgin, Onur; Bowen, Benjamin P.; Kosina, Suzanne M.; ...
2017-01-23
Mixed cultures of different microbial species are increasingly being used to carry out a specific biochemical function in lieu of engineering a single microbe to do the same task. However, knowing how different species' metabolisms will integrate to reach a desired outcome is a difficult problem that has been studied in great detail using steady-state models. Many biotechnological processes, as well as natural habitats, represent a more dynamic system. Examining how individual species use resources in their growth medium or environment (exometabolomics) over time in batch culture conditions can provide rich phenotypic data that encompasses regulation and transporters, creating an opportunity to integrate the data into a predictive model of resource use by a mixed community. Here we use exometabolomic profiling to examine the time-varying substrate depletion from a mixture of 19 amino acids and glucose by two Pseudomonas and one Bacillus species isolated from ground water. Contrary to studies in model organisms, we found surprisingly few correlations between resource preferences and maximal growth rate or biomass composition. We then modeled patterns of substrate depletion, and used these models to examine whether substrate usage preferences and substrate depletion kinetics of individual isolates can be used to predict the metabolism of a co-culture of the isolates. We found that most of the substrates fit the model predictions, except for glucose and histidine, which were depleted more slowly than predicted, and proline, glycine, glutamate, lysine and arginine, which were all consumed significantly faster. Our results indicate that a significant portion of a model community's overall metabolism can be predicted based on the metabolism of the individuals. 
Based on the nature of our model, the resources that significantly deviate from the prediction highlight potential metabolic pathways affected by species-species interactions, which, when further studied, can potentially be used to modulate microbial community structure and/or function.
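Fitting substrate-depletion kinetics of the kind modeled above can be sketched with scipy; the first-order (exponential) kinetic form and the synthetic concentration series are illustrative assumptions, since the study fit its own depletion models per substrate.

```python
# Sketch: fit first-order substrate-depletion kinetics to noisy time points.
import numpy as np
from scipy.optimize import curve_fit

def depletion(t, c0, k):
    """First-order substrate depletion: c(t) = c0 * exp(-k*t)."""
    return c0 * np.exp(-k * t)

t = np.linspace(0, 24, 13)  # sampling times, hours
rng = np.random.default_rng(4)
obs = depletion(t, 100.0, 0.2) + rng.normal(0, 1.0, t.size)  # noisy uM data
(c0_hat, k_hat), _ = curve_fit(depletion, t, obs, p0=(90.0, 0.1))
# c0_hat and k_hat estimate the initial concentration and depletion rate
```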
NASA Astrophysics Data System (ADS)
Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.
2017-12-01
US agricultural and government lands have a unique co-dependent relationship, particularly in the western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,759 km2) of USFS-managed lands. Likewise, National Forest lands are the headwaters of many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits on natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary, multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation," focusing on the many integrated links between economic factors, agricultural production and management, natural resource availability, and key social aspects of government policy recommendations. By enhancing tools like these with predictive weather and other related data (such as in situ measurements, hydrologic models, remotely sensed data sets, and, in the near future, links to agro-economic and life cycle assessment models), this work demonstrates an integrated, data-driven future vision of inter-meshed dynamic systems that can address challenging multi-system problems. We will present the current state of the work and opportunities for future involvement.
Scheinfeld, Emily; Shim, Minsun
2017-05-01
Emerging adulthood (EA) is an important yet overlooked period for developing long-term health behaviors. During these years, emerging adults adopt health behaviors that persist throughout life. This study applies the Integrative Model of Behavioral Prediction (IMBP) to examine the role of childhood parental communication in predicting engagement in healthful eating during EA. Participants included 239 college students, ages 18 to 25, from a large university in the southern United States. Participants were recruited and data collection occurred in spring 2012. Participants responded to measures to assess perceived parental communication, eating behaviors, attitudes, subjective norms, and behavioral control over healthful eating. Structural equation modeling (SEM) and mediation analyses were used to address the posited hypotheses. Data demonstrated that perceived parent-child communication - specifically, its quality and target-specific content - significantly predicted emerging adults' eating behaviors, mediated through subjective norm and perceived behavioral control. This study sets the stage for further exploration and understanding of different ways parental communication influences emerging adults' healthy behavior enactment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salajegheh, Nima; Abedrabbo, Nader; Pourboghrat, Farhang
An efficient integration algorithm for continuum damage based elastoplastic constitutive equations is implemented in LS-DYNA. The isotropic damage parameter is defined as the ratio of the damaged surface area over the total cross section area of the representative volume element. This parameter is incorporated into the integration algorithm as an internal variable. The developed damage model is then implemented in the FEM code LS-DYNA as a user material subroutine (UMAT). Pure stretch experiments with a hemispherical punch are carried out for copper sheets and the results are compared against the predictions of the implemented damage model. Evaluation of damage parameters is carried out, and the optimized values that correctly predicted the failure in the sheet are reported. Prediction of failure in the numerical analysis is performed through element deletion using the critical damage value. The set of failure parameters which accurately predicts the failure behavior in copper sheets compared to experimental data is reported as well.
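The isotropic-damage-with-element-deletion idea can be sketched as follows: damage D grows with accumulated plastic strain, degrades the carried stress, and the element is deleted when D reaches a critical value. The linear damage-evolution law, the strain increments, and all numbers are illustrative assumptions, not the UMAT's actual evolution equations.

```python
# Sketch: isotropic damage accumulation with element deletion at D_critical.

def damage_step(d, delta_eps_p, eps_failure, d_critical=0.3):
    """Advance damage by a plastic-strain increment; report element deletion."""
    d = min(1.0, d + delta_eps_p / eps_failure)  # linear damage evolution
    return d, d >= d_critical

sigma_undamaged = 250.0  # MPa, effective (undamaged) flow stress
d = 0.0
deleted = False
for _ in range(10):
    d, deleted = damage_step(d, delta_eps_p=0.02, eps_failure=0.5)
    sigma = (1.0 - d) * sigma_undamaged  # degraded stress carried by element
    if deleted:
        break  # element removed from the mesh once damage is critical
```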
Information-theoretic approach to interactive learning
NASA Astrophysics Data System (ADS)
Still, S.
2009-01-01
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer's world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process's causal organization in the presence of the learner's actions. A fundamental consequence of the proposed principle is that the learner's optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.
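The predictive power that the objective function above maximizes is an information-theoretic quantity; a minimal sketch of computing the mutual information a model retains about the next observation, using an illustrative two-state joint distribution rather than anything derived in the paper:

```python
# Sketch: mutual information I(X;Y) in bits from a joint probability table.
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table (rows: X, cols: Y)."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0  # skip zero cells, whose contribution is zero
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# A deterministic copy channel carries exactly one bit of predictive power:
ident = np.array([[0.5, 0.0], [0.0, 0.5]])
one_bit = mutual_information(ident)  # = 1.0
```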
Exploring the social dimension of sandy beaches through predictive modelling.
Domínguez-Tejo, Elianny; Metternicht, Graciela; Johnston, Emma L; Hedge, Luke
2018-05-15
Sandy beaches are unique ecosystems increasingly exposed to human-induced pressures. Consistent with emerging frameworks promoting a holistic approach to beach management is the need to improve the integration of social data into management practices. This paper aims to increase understanding of the links between demographics, community values, and preferred beach activities, as key components of the social dimension of the beach environment. A mixed-method approach was adopted to elucidate users' opinions on beach preferences and community values through a survey carried out in the Manly Local Government Area in Sydney Harbour, Australia. A proposed conceptual model was used to frame demographic models (using age, education, employment, household income and residence status) as predictors of these two community responses. All possible regression-model combinations were compared using Akaike's information criterion. The best models were then used to calculate quantitative likelihoods of the responses, presented as heat maps. Findings concur with international research indicating the relevance of social and restful activities as important social links between the community and the beach environment. Participants' age was a significant variable in the four predictive models. The use of predictive models informed by demographics could potentially increase our understanding of interactions between the social and ecological systems of the beach environment, as a prelude to integrated beach management approaches. The research represents a practical demonstration of how demographic predictive models could support proactive approaches to beach management. Copyright © 2018 Elsevier Ltd. All rights reserved.
The Application of Modeling and Simulation in Capacity Management within the ITIL Framework
NASA Technical Reports Server (NTRS)
Rahmani, Sonya; vonderHoff, Otto
2010-01-01
Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition for M&S implementation into the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement each technique's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.
Developing a predictive tropospheric ozone model for Tabriz
NASA Astrophysics Data System (ADS)
Khatibi, Rahman; Naghipour, Leila; Ghorbani, Mohammad A.; Smith, Michael S.; Karimi, Vahid; Farhoudi, Reza; Delafrouz, Hadi; Arvanaghi, Hadi
2013-04-01
Predictive ozone models are becoming indispensable tools by providing a capability for pollution alerts to serve people who are vulnerable to the risks. We have developed a tropospheric ozone prediction capability for Tabriz, Iran, using the following five modeling strategies: three regression-type methods, Multiple Linear Regression (MLR), Artificial Neural Networks (ANNs), and Gene Expression Programming (GEP); and two auto-regression-type models, Nonlinear Local Prediction (NLP), which implements chaos theory, and Auto-Regressive Integrated Moving Average (ARIMA). The regression-type strategies explain the data in terms of temperature, solar radiation, dew point temperature, and wind speed, whereas the auto-regression-type models regress present ozone values on their past values. The ozone time series are available at various time intervals, including hourly intervals, from August 2010 to March 2011. The results for the MLR, ANN and GEP models are not particularly good, but those produced by NLP and ARIMA are promising for establishing a forecasting capability.
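As a concrete illustration of the simplest of these strategies, the sketch below fits a multiple linear regression of ozone on the four meteorological drivers named in the abstract. The data and coefficients are synthetic assumptions for illustration, not values from the study.

```python
import numpy as np

# Hypothetical MLR sketch: regress ozone on temperature, solar radiation,
# dew point temperature, and wind speed. Data are synthetic.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))              # the four meteorological drivers
true_beta = np.array([2.0, 1.5, -0.5, -1.0])
y = X @ true_beta + 10.0 + rng.normal(scale=0.1, size=n)  # "observed" ozone

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta, 2))  # approximately [10, 2, 1.5, -0.5, -1]
```

The auto-regression-type strategies (NLP, ARIMA) would instead build the design matrix from lagged ozone values rather than external drivers.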
Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf
2016-02-01
Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, the model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) with model-based reward-related input. Using the concept of the reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turn to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by the efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS). (c) 2016 APA, all rights reserved.
Predicting growth of the healthy infant using a genome scale metabolic model.
Nilsson, Avlant; Mardinoglu, Adil; Nielsen, Jens
2017-01-01
An estimated 165 million children globally have stunted growth, and extensive growth data are available. Genome scale metabolic models allow the simulation of molecular flux over each metabolic enzyme, and are well adapted to analyze biological systems. We used a human genome scale metabolic model to simulate the mechanisms of growth and integrate data about breast-milk intake and composition with the infant's biomass and energy expenditure of major organs. The model predicted daily metabolic fluxes from birth to age 6 months, and accurately reproduced standard growth curves and changes in body composition. The model corroborates the finding that essential amino and fatty acids do not limit growth, but that energy is the main growth limiting factor. Disruptions to the supply and demand of energy markedly affected the predicted growth, indicating that elevated energy expenditure may be detrimental. The model was used to simulate the metabolic effect of mineral deficiencies, and showed the greatest growth reduction for deficiencies in copper, iron, and magnesium ions, which affect energy production through oxidative phosphorylation. The model and simulation method were integrated into a platform and shared with the research community. The growth model constitutes another step towards the complete representation of human metabolism, and may further help improve the understanding of the mechanisms underlying stunting.
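Genome-scale metabolic models of this kind are typically solved by flux balance analysis: a linear program over steady-state reaction fluxes. The toy sketch below uses a hypothetical three-reaction network to show the general shape of that computation; it is not the model from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal flux-balance-analysis sketch (hypothetical 3-reaction network):
# maximize the growth flux v3 subject to steady state (S v = 0) and a cap
# on nutrient uptake, analogous to limiting breast-milk energy intake.
S = np.array([
    [1, -1,  0],   # metabolite A: produced by uptake v1, consumed by v2
    [0,  1, -1],   # metabolite B: produced by v2, consumed by growth v3
])
c = [0, 0, -1]                            # maximize v3 -> minimize -v3
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 units
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # growth flux limited by the uptake bound: [10, 10, 10]
```

Steady state forces v1 = v2 = v3 here, so growth is pinned to the uptake limit, mirroring the paper's conclusion that energy supply is the main growth-limiting factor.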
Wilkin, John L.; Rosenfeld, Leslie; Allen, Arthur; Baltes, Rebecca; Baptista, Antonio; He, Ruoying; Hogan, Patrick; Kurapov, Alexander; Mehra, Avichal; Quintrell, Josie; Schwab, David; Signell, Richard; Smith, Jane
2017-01-01
This paper outlines strategies that would advance coastal ocean modelling, analysis and prediction as a complement to the observing and data management activities of the coastal components of the US Integrated Ocean Observing System (IOOS®) and the Global Ocean Observing System (GOOS). The views presented are the consensus of a group of US-based researchers with a cross-section of coastal oceanography and ocean modelling expertise and community representation drawn from Regional and US Federal partners in IOOS. Priorities for research and development are suggested that would enhance the value of IOOS observations through model-based synthesis, deliver better model-based information products, and assist the design, evaluation, and operation of the observing system itself. The proposed priorities are: model coupling, data assimilation, nearshore processes, cyberinfrastructure and model skill assessment, modelling for observing system design, evaluation and operation, ensemble prediction, and fast predictors. Approaches are suggested to accomplish substantial progress in a 3–8-year timeframe. In addition, the group proposes steps to promote collaboration between research and operations groups in Regional Associations, US Federal Agencies, and the international ocean research community in general that would foster coordination on scientific and technical issues, and strengthen federal–academic partnerships benefiting IOOS stakeholders and end users.
Analysis and Modeling of DIII-D Experiments With OMFIT and Neural Networks
NASA Astrophysics Data System (ADS)
Meneghini, O.; Luna, C.; Smith, S. P.; Lao, L. L.; GA Theory Team
2013-10-01
The OMFIT integrated modeling framework is designed to facilitate experimental data analysis and enable integrated simulations. This talk introduces the framework and presents a selection of its applications to the DIII-D experiment. Examples include kinetic equilibrium reconstruction analysis; evaluation of MHD stability in the core and in the edge; and self-consistent predictive steady-state transport modeling. The OMFIT framework also provides the platform for an innovative neural-network-based approach to predicting electron and ion energy fluxes. In our study a multi-layer feed-forward back-propagation neural network is built and trained over a database of DIII-D data. It is found that, given the same parameters that the highest-fidelity models use, the neural network model is able to predict to a large degree the heat transport profiles observed in the DIII-D experiments. Once the network is built, the numerical cost of evaluating the transport coefficients is virtually nonexistent, making the neural network model particularly well suited for plasma control and quick exploration of operational scenarios. The implementation of the neural network model and benchmarks against experimental results and gyro-kinetic models will be discussed. Work supported in part by the US DOE under DE-FG02-95ER54309.
Mathematical Modeling Of A Nuclear/Thermionic Power Source
NASA Technical Reports Server (NTRS)
Vandersande, Jan W.; Ewell, Richard C.
1992-01-01
Report discusses mathematical modeling to predict performance and lifetime of spacecraft power source that is integrated combination of nuclear-fission reactor and thermionic converters. Details of nuclear reaction, thermal conditions in core, and thermionic performance combined with model of swelling of fuel.
Energy-Efficient Integration of Continuous Context Sensing and Prediction into Smartwatches.
Rawassizadeh, Reza; Tomitsch, Martin; Nourizadeh, Manouchehr; Momeni, Elaheh; Peery, Aaron; Ulanova, Liudmila; Pazzani, Michael
2015-09-08
As the availability and use of wearables increases, they are becoming a promising platform for context sensing and context analysis. Smartwatches are a particularly interesting platform for this purpose, as they offer salient advantages, such as their proximity to the human body. However, they also have limitations associated with their small form factor, such as processing power and battery life, which makes it difficult to simply transfer smartphone-based context sensing and prediction models to smartwatches. In this paper, we introduce an energy-efficient, generic, integrated framework for continuous context sensing and prediction on smartwatches. Our work extends previous approaches for context sensing and prediction on wrist-mounted wearables that perform predictive analytics outside the device. We offer a generic sensing module and a novel energy-efficient, on-device prediction module that is based on a semantic abstraction approach to convert sensor data into meaningful information objects, similar to human perception of a behavior. Through six evaluations, we analyze the energy efficiency of our framework modules, identify the optimal file structure for data access and demonstrate an increase in accuracy of prediction through our semantic abstraction method. The proposed framework is hardware independent and can serve as a reference model for implementing context sensing and prediction on small wearable devices beyond smartwatches, such as body-mounted cameras.
Using integrated modeling for generating watershed-scale dynamic flood maps for Hurricane Harvey
NASA Astrophysics Data System (ADS)
Saksena, S.; Dey, S.; Merwade, V.; Singhofen, P. J.
2017-12-01
Hurricane Harvey, which was categorized as a 1000-year return period event, produced unprecedented rainfall and flooding in Houston. Although the expected rainfall was forecasted much before the event, there was no way to identify which regions were at higher risk of flooding, the magnitude of flooding, and when the impacts of rainfall would be highest. The inability to predict the location, duration, and depth of flooding created uncertainty over evacuation planning and preparation. This catastrophic event highlighted that the conventional approach to managing flood risk using 100-year static flood inundation maps is inadequate because of its inability to predict flood duration and extents for 500-year or 1000-year return period events in real-time. The purpose of this study is to create models that can dynamically predict the impacts of rainfall and subsequent flooding, so that necessary evacuation and rescue efforts can be planned in advance. This study uses a 2D integrated surface water-groundwater model called ICPR (Interconnected Channel and Pond Routing) to simulate both the hydrology and hydrodynamics for Hurricane Harvey. The methodology involves using the NHD stream network to create a 2D model that incorporates rainfall, land use, vadose zone properties and topography to estimate streamflow and generate dynamic flood depths and extents. The results show that dynamic flood mapping captures the flood hydrodynamics more accurately and is able to predict the magnitude, extent and time of occurrence for extreme events such as Hurricane Harvey. Therefore, integrated modeling has the potential to identify regions that are more susceptible to flooding, which is especially useful for large-scale planning and allocation of resources for protection against future flood risk.
Comparison of simulator fidelity model predictions with in-simulator evaluation data
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Mckissick, B. T.; Ashworth, B. R.
1983-01-01
A full factorial in-simulator experiment on a single-axis, multiloop, compensatory pitch tracking task is described. The experiment was conducted to provide data to validate extensions to an analytic, closed-loop model of a real-time digital simulation facility. The results of the experiment, encompassing various simulation fidelity factors such as visual delay, digital integration algorithms, computer iteration rates, control loading bandwidths and proprioceptive cues, and g-seat kinesthetic cues, are compared with predictions obtained from the analytic model incorporating an optimal control model of the human pilot. The in-simulator results demonstrate more sensitivity to the g-seat and to the control loader conditions than was predicted by the model. However, the model predictions are generally upheld, although the predicted magnitudes of the states and of the error terms are sometimes off considerably. Of particular concern are the large sensitivity difference for one control loader condition and the model/in-simulator mismatch in the magnitude of the plant states when the other states match.
Predicting breast cancer using an expression values weighted clinical classifier.
Thomas, Minta; De Brabanter, Kris; Suykens, Johan A K; De Moor, Bart
2014-12-31
Clinical data, such as patient history, laboratory analysis, and ultrasound parameters, which are the basis of day-to-day clinical decision support, are often used to guide the clinical management of cancer in the presence of microarray data. Several data fusion techniques are available to integrate genomics or proteomics data, but only a few studies have created a single prediction model using both gene expression and clinical data. These studies often remain inconclusive regarding an obtained improvement in prediction performance. To improve clinical management, these data should be fully exploited. This requires efficient algorithms to integrate these data sets and design a final classifier. LS-SVM classifiers and generalized eigenvalue/singular value decompositions are successfully used in many bioinformatics applications for prediction tasks. Drawing on the benefits of these two techniques, we propose a machine learning approach, a weighted LS-SVM classifier, to integrate two data sources: microarray and clinical parameters. We compared and evaluated the proposed methods on five breast cancer case studies. Compared to the LS-SVM classifier on individual data sets, generalized eigenvalue decomposition (GEVD) and kernel GEVD, the proposed weighted LS-SVM classifier offers good prediction performance, in terms of test area under the ROC curve (AUC), on all breast cancer case studies. Thus a clinical classifier weighted with microarray data results in significantly improved diagnosis, prognosis and prediction of response to therapy. The proposed model has been shown to be a promising mathematical framework for both data fusion and non-linear classification problems.
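The LS-SVM core of such a classifier reduces to one linear system. The sketch below is a hedged illustration only: it combines the two data sources as a weighted sum of kernels, which is one common fusion scheme but not necessarily the exact weighting used in the paper, and all data, names, and parameters are synthetic.

```python
import numpy as np

# Hedged sketch: LS-SVM classifier on a weighted combination of two
# kernels (clinical + microarray sources). Data are synthetic.
def rbf(X, gamma=0.5):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
n = 40
y = np.repeat([1.0, -1.0], n // 2)                     # class labels
X_clin = y[:, None] + 0.3 * rng.normal(size=(n, 3))    # clinical features
X_micro = y[:, None] + 0.3 * rng.normal(size=(n, 10))  # expression features

mu, gam = 0.4, 10.0                     # source weight, regularization
K = mu * rbf(X_clin) + (1 - mu) * rbf(X_micro)

# LS-SVM dual system: [[0, 1^T], [1, K + I/gam]] [b; alpha] = [0; y]
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gam
sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
b, alpha = sol[0], sol[1:]

train_pred = np.sign(K @ alpha + b)     # decision values on training data
print((train_pred == y).mean())
```

Unlike a standard SVM, the LS-SVM uses equality constraints and a squared loss, so training is a single dense solve rather than a quadratic program.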
The Satellite Clock Bias Prediction Method Based on Takagi-Sugeno Fuzzy Neural Network
NASA Astrophysics Data System (ADS)
Cai, C. L.; Yu, H. G.; Wei, Z. C.; Pan, J. D.
2017-05-01
The continuous improvement of the prediction accuracy of Satellite Clock Bias (SCB) is a key problem of precision navigation. In order to improve the precision of SCB prediction and better reflect the change characteristics of SCB, this paper proposes an SCB prediction method based on the Takagi-Sugeno fuzzy neural network. Firstly, the SCB values are pre-treated based on their characteristics. Then, an accurate Takagi-Sugeno fuzzy neural network model is established based on the preprocessed data to predict SCB. This paper uses precise SCB data with different sampling intervals provided by the IGS (International Global Navigation Satellite System Service) to carry out short-term prediction experiments, and the results are compared with the ARIMA (Auto-Regressive Integrated Moving Average) model, the GM(1,1) model, and the quadratic polynomial model. The results show that the Takagi-Sugeno fuzzy neural network model is feasible and effective for short-term SCB prediction and performs well for different types of clocks. The prediction results of the proposed method are clearly better than those of the conventional methods.
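The quadratic-polynomial baseline the paper compares against can be sketched in a few lines: fit a degree-2 polynomial to past clock bias samples and extrapolate. The clock data below are synthetic (a hypothetical drifting clock), not IGS products.

```python
import numpy as np

# Quadratic-polynomial SCB prediction sketch on synthetic clock data:
# bias = offset + drift * t + aging * t^2 + noise.
rng = np.random.default_rng(2)
t = np.arange(48.0)                      # 48 past epochs
bias = 1e-6 + 2e-9 * t + 5e-13 * t**2 + rng.normal(scale=1e-12, size=t.size)

coef = np.polyfit(t, bias, deg=2)        # least-squares quadratic fit
t_future = np.arange(48.0, 60.0)         # extrapolate 12 epochs ahead
pred = np.polyval(coef, t_future)

truth = 1e-6 + 2e-9 * t_future + 5e-13 * t_future**2
print(np.max(np.abs(pred - truth)))      # small extrapolation error
```

Methods such as ARIMA or the Takagi-Sugeno network aim to beat this baseline by also capturing the stochastic, clock-type-dependent behavior that a fixed polynomial cannot.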
Ling, Qi; Liu, Jimin; Zhuo, Jianyong; Zhuang, Runzhou; Huang, Haitao; He, Xiangxiang; Xu, Xiao; Zheng, Shusen
2018-04-27
Donor characteristics and graft quality were recently reported to play an important role in the recurrence of hepatocellular carcinoma after liver transplantation. Our aim was to establish a prognostic model by using both donor and recipient variables. Data of 1,010 adult patients (training/validation: 2/1) undergoing primary liver transplantation for hepatocellular carcinoma were extracted from the China Liver Transplant Registry database and analyzed retrospectively. A multivariate competing risk regression model was developed and used to generate a nomogram predicting the likelihood of post-transplant hepatocellular carcinoma recurrence. Of 673 patients in the training cohort, 70 (10.4%) had hepatocellular carcinoma recurrence with a median recurrence time of 6 months (interquartile range: 4-25 months). Cold ischemia time was the only independent donor prognostic factor for predicting hepatocellular carcinoma recurrence (hazard ratio = 2.234, P = .007). The optimal cutoff value was 12 hours when patients were grouped according to cold ischemia time at 2-hour intervals. Integrating cold ischemia time into the Milan criteria (liver transplantation candidate selection criteria) improved the accuracy for predicting hepatocellular carcinoma recurrence in both training and validation sets (P < .05). A nomogram composed of cold ischemia time, tumor burden, differentiation, and α-fetoprotein level proved to be accurate and reliable in predicting the likelihood of 1-year hepatocellular carcinoma recurrence after liver transplantation. Additionally, donor anti-hepatitis B core antibody positivity, prolonged cold ischemia time, and anhepatic time were linked to the intrahepatic recurrence, whereas older donor age, prolonged donor warm ischemia time, cold ischemia time, and ABO incompatibility were relevant to the extrahepatic recurrence. 
The graft quality integrated models exhibited considerable predictive accuracy in early hepatocellular carcinoma recurrence risk assessment. The identification of donor risks can further help understand the mechanism of different patterns of recurrence. Copyright © 2018 Elsevier Inc. All rights reserved.
Proposals for enhanced health risk assessment and stratification in an integrated care scenario
Dueñas-Espín, Ivan; Vela, Emili; Pauws, Steffen; Bescos, Cristina; Cano, Isaac; Cleries, Montserrat; Contel, Joan Carles; de Manuel Keenoy, Esteban; Garcia-Aymerich, Judith; Gomez-Cabrero, David; Kaye, Rachelle; Lahr, Maarten M H; Lluch-Ariet, Magí; Moharra, Montserrat; Monterde, David; Mora, Joana; Nalin, Marco; Pavlickova, Andrea; Piera, Jordi; Ponce, Sara; Santaeugenia, Sebastià; Schonenberg, Helen; Störk, Stefan; Tegner, Jesper; Velickovski, Filip; Westerteicher, Christoph; Roca, Josep
2016-01-01
Objectives: Population-based health risk assessment and stratification are considered highly relevant for large-scale implementation of integrated care by facilitating services design and case identification. The principal objective of the study was to analyse five health-risk assessment strategies and health indicators used in the five regions participating in the Advancing Care Coordination and Telehealth Deployment (ACT) programme (http://www.act-programme.eu). The second purpose was to elaborate on strategies toward enhanced health risk predictive modelling in the clinical scenario. Settings: The five ACT regions: Scotland (UK), Basque Country (ES), Catalonia (ES), Lombardy (I) and Groningen (NL). Participants: Responsible teams for regional data management in the five ACT regions. Primary and secondary outcome measures: We characterised and compared risk assessment strategies among ACT regions by analysing operational health risk predictive modelling tools for population-based stratification, as well as available health indicators at regional level. The analysis of the risk assessment tool deployed in Catalonia in 2015 (GMAs, Adjusted Morbidity Groups) was used as a basis to propose how population-based analytics could contribute to clinical risk prediction. Results: There was consensus on the need for a population health approach to generate health risk predictive modelling. However, this strategy was fully in place only in two ACT regions: Basque Country and Catalonia. We found marked differences among regions in health risk predictive modelling tools and health indicators, and identified key factors constraining their comparability. The research proposes means to overcome current limitations and the use of population-based health risk prediction for enhanced clinical risk assessment. Conclusions: The results indicate the need for further efforts to improve both comparability and flexibility of current population-based health risk predictive modelling approaches. 
Applicability and impact of the proposals for enhanced clinical risk assessment require prospective evaluation. PMID:27084274
Because of natural environmental and faunal differences and scientific perspectives, numerous indices of biological integrity (IBIs) have been developed at local state, and regional scales in the USA. These multiple IBIs, plus different criteria for judging impairment, hinder ri...
Model-based influences on humans' choices and striatal prediction errors.
Daw, Nathaniel D; Gershman, Samuel J; Seymour, Ben; Dayan, Peter; Dolan, Raymond J
2011-03-24
The mesostriatal dopamine system is prominently implicated in model-free reinforcement learning, with fMRI BOLD signals in ventral striatum notably covarying with model-free prediction errors. However, latent learning and devaluation studies show that behavior also shows hallmarks of model-based planning, and the interaction between model-based and model-free values, prediction errors, and preferences is underexplored. We designed a multistep decision task in which model-based and model-free influences on human choice behavior could be distinguished. By showing that choices reflected both influences we could then test the purity of the ventral striatal BOLD signal as a model-free report. Contrary to expectations, the signal reflected both model-free and model-based predictions in proportions matching those that best explained choice behavior. These results challenge the notion of a separate model-free learner and suggest a more integrated computational architecture for high-level human decision-making. Copyright © 2011 Elsevier Inc. All rights reserved.
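The hybrid architecture the results point to can be caricatured in a few lines: a net value that mixes model-based and model-free estimates with a weight w, with a reward prediction error driving the model-free component. Everything below (the task, the uniform transition model, and all parameters) is a toy assumption for illustration, not the paper's two-step task.

```python
import numpy as np

# Toy hybrid of model-free and model-based valuation, weighted by w,
# in the spirit of the integrated architecture the paper suggests.
rng = np.random.default_rng(3)
n_states, n_actions = 3, 2
Q_mf = np.zeros((n_states, n_actions))      # learned from prediction errors
T = np.full((n_states, n_actions, n_states), 1.0 / n_states)  # known model
R = rng.uniform(size=n_states)              # known state rewards

alpha, w = 0.1, 0.6                         # learning rate, MB weight
for _ in range(500):
    s, a = rng.integers(n_states), rng.integers(n_actions)
    s2 = rng.choice(n_states, p=T[s, a])    # sampled transition
    delta = R[s2] - Q_mf[s, a]              # model-free prediction error
    Q_mf[s, a] += alpha * delta

Q_mb = T @ R                                # one-step model-based values
Q_net = w * Q_mb + (1 - w) * Q_mf           # hybrid value guiding choice
print(np.abs(Q_net - R.mean()).max())       # all values end up near E[R]
```

With uniform transitions both systems converge to the mean reward, so the interesting empirical question, which the paper addresses with fMRI, is how the prediction error delta itself reflects the weighted mixture.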
Predicting Flory-Huggins χ from Simulations
NASA Astrophysics Data System (ADS)
Zhang, Wenlin; Gomez, Enrique D.; Milner, Scott T.
2017-07-01
We introduce a method, based on a novel thermodynamic integration scheme, to extract Flory-Huggins χ parameters as small as 10^-3 kBT for polymer blends from molecular dynamics (MD) simulations. We obtain χ for the archetypal coarse-grained model of nonpolar polymer blends: flexible bead-spring chains with different Lennard-Jones interactions between A and B monomers. Using these χ values and a lattice version of self-consistent field theory (SCFT), we predict the shape of planar interfaces for phase-separated binary blends. Our SCFT results agree with MD simulations, validating both the predicted χ values and our thermodynamic integration method. Combined with atomistic simulations, our method can be applied to predict χ for new polymers from their chemical structures.
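For context, this is standard Flory-Huggins theory rather than an equation quoted from the abstract: χ enters the free energy of mixing per lattice site, which is the quantity SCFT works with when predicting interface shapes,

```latex
\frac{\Delta F_{\mathrm{mix}}}{k_B T} =
\frac{\phi}{N_A}\ln\phi + \frac{1-\phi}{N_B}\ln(1-\phi) + \chi\,\phi(1-\phi)
```

where φ is the volume fraction of A monomers and N_A, N_B are the chain lengths of the two species. Because the entropic terms scale as 1/N, even χ values of order 10^-3 can drive phase separation for long chains, which is why resolving such small χ from simulation matters.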
Sahoo, Sudhakar; Świtnicki, Michał P; Pedersen, Jakob Skou
2016-09-01
Recently, new RNA secondary structure probing techniques have been developed, including Next Generation Sequencing based methods capable of probing transcriptome-wide. These techniques hold great promise for improving structure prediction accuracy. However, each new data type comes with its own signal properties and biases, which may even be experiment specific. There is therefore a growing need for RNA structure prediction methods that can be automatically trained on new data types and readily extended to integrate and fully exploit multiple types of data. Here, we develop and explore a modular probabilistic approach for integrating probing data in RNA structure prediction. It can be automatically trained given a set of known structures with probing data. The approach is demonstrated on SHAPE datasets, where we evaluate and selectively model specific correlations. The approach often makes superior use of the probing data signal compared to other methods. We illustrate the use of ProbFold on multiple data types using both simulations and a small set of structures with SHAPE, DMS and CMCT data. Technically, the approach combines stochastic context-free grammars (SCFGs) with probabilistic graphical models. This approach allows rapid adaptation and integration of new probing data types. ProbFold is implemented in C++. Models are specified using simple textual formats. Data reformatting is done using separate C++ programs. Source code, statically compiled binaries for x86 Linux machines, C++ programs, example datasets and a tutorial are available from http://moma.ki.au.dk/prj/probfold/. Contact: jakob.skou@clin.au.dk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Adhikari, Badri; Hou, Jie; Cheng, Jianlin
2018-03-01
In this study, we report the evaluation of the residue-residue contacts predicted by our three different methods in the CASP12 experiment, focusing on studying the impact of multiple sequence alignment, residue coevolution, and machine learning on contact prediction. The first method (MULTICOM-NOVEL) uses only traditional features (sequence profile, secondary structure, and solvent accessibility) with deep learning to predict contacts and serves as a baseline. The second method (MULTICOM-CONSTRUCT) uses our new alignment algorithm to generate deep multiple sequence alignment to derive coevolution-based features, which are integrated by a neural network method to predict contacts. The third method (MULTICOM-CLUSTER) is a consensus combination of the predictions of the first two methods. We evaluated our methods on 94 CASP12 domains. On a subset of 38 free-modeling domains, our methods achieved an average precision of up to 41.7% for top L/5 long-range contact predictions. The comparison of the three methods shows that the quality and effective depth of multiple sequence alignments, coevolution-based features, and machine learning integration of coevolution-based features and traditional features drive the quality of predicted protein contacts. On the full CASP12 dataset, the coevolution-based features alone can improve the average precision from 28.4% to 41.6%, and the machine learning integration of all the features further raises the precision to 56.3%, when top L/5 predicted long-range contacts are evaluated. The correlation between the precision of contact prediction and the logarithm of the number of effective sequences in alignments is 0.66. © 2017 Wiley Periodicals, Inc.
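The top L/5 long-range precision metric used throughout can be sketched directly. The contact maps below are toy data; "long-range" follows the usual CASP convention of sequence separation of at least 24 residues.

```python
# Sketch of the CASP-style evaluation metric: precision of the top L/5
# predicted long-range contacts (sequence separation >= 24).
def top_l5_precision(scores, native, L, min_sep=24):
    """scores: dict mapping a residue pair (i, j) to a predicted probability;
    native: set of true contact pairs; L: sequence length."""
    long_range = [(p, ij) for ij, p in scores.items()
                  if abs(ij[0] - ij[1]) >= min_sep]
    top = sorted(long_range, reverse=True)[: max(L // 5, 1)]
    hits = sum(1 for _, ij in top if ij in native)
    return hits / len(top)   # precision over the available top predictions

# Toy example: four long-range predictions, three of them correct.
scores = {(1, 30): 0.9, (5, 40): 0.8, (10, 60): 0.7, (2, 90): 0.6,
          (3, 10): 0.99}                  # (3, 10) is short-range: ignored
native = {(1, 30), (5, 40), (2, 90)}
print(top_l5_precision(scores, native, L=100))  # 0.75
```

Ranking by predicted probability before truncating to L/5 is what makes the metric sensitive to how well a method calibrates its most confident contacts.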
PBPK models for the prediction of in vivo performance of oral dosage forms.
Kostewicz, Edmund S; Aarons, Leon; Bergstrand, Martin; Bolger, Michael B; Galetin, Aleksandra; Hatley, Oliver; Jamei, Masoud; Lloyd, Richard; Pepin, Xavier; Rostami-Hodjegan, Amin; Sjögren, Erik; Tannergren, Christer; Turner, David B; Wagner, Christian; Weitschies, Werner; Dressman, Jennifer
2014-06-16
Drug absorption from the gastrointestinal (GI) tract is a highly complex process dependent upon numerous factors including the physicochemical properties of the drug, characteristics of the formulation and interplay with the underlying physiological properties of the GI tract. The ability to accurately predict oral drug absorption during drug product development is becoming more relevant given the current challenges facing the pharmaceutical industry. Physiologically-based pharmacokinetic (PBPK) modeling provides an approach that enables the plasma concentration-time profiles to be predicted from preclinical in vitro and in vivo data and can thus provide a valuable resource to support decisions at various stages of the drug development process. Whilst there have been quite a few successes with PBPK models identifying key issues in the development of new drugs in vivo, there are still many aspects that need to be addressed in order to maximize the utility of the PBPK models to predict drug absorption, including improving our understanding of conditions in the lower small intestine and colon, taking the influence of disease on GI physiology into account and further exploring the reasons behind population variability. Importantly, there is also a need to create more appropriate in vitro models for testing dosage form performance and to streamline data input from these into the PBPK models. As part of the Oral Biopharmaceutical Tools (OrBiTo) project, this review provides a summary of the current status of PBPK models available. The current challenges in PBPK set-ups for oral drug absorption including the composition of GI luminal contents, transit and hydrodynamics, permeability and intestinal wall metabolism are discussed in detail. 
Further, the challenges regarding the appropriate integration of results from in vitro models, such as consideration of appropriate integration/estimation of solubility and the complexity of the in vitro release and precipitation data, are also highlighted as important steps to advancing the application of PBPK models in drug development. It is expected that the "innovative" integration of in vitro data from more appropriate in vitro models and the enhancement of the GI physiology component of PBPK models, arising from the OrBiTo project, will lead to a significant enhancement in the ability of PBPK models to successfully predict oral drug absorption and advance their role in preclinical and clinical development, as well as for regulatory applications. Copyright © 2013 Elsevier B.V. All rights reserved.
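At the opposite end of the complexity scale from full PBPK, a one-compartment model with first-order absorption illustrates the basic plasma concentration-time prediction these models generalize. All parameter values below are illustrative assumptions, not data from the OrBiTo project.

```python
import numpy as np

# Toy one-compartment oral-absorption model (Bateman function): first-order
# absorption ka and elimination ke. A far simpler cousin of PBPK models.
def plasma_conc(t, dose=100.0, F=0.8, V=50.0, ka=1.2, ke=0.2):
    """Plasma concentration after an oral dose; F = bioavailable fraction,
    V = volume of distribution. Units are illustrative."""
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0, 24, 241)            # 24 h at 0.1 h resolution
C = plasma_conc(t)
tmax = t[np.argmax(C)]                 # time of peak concentration
# numeric tmax tracks the analytic ln(ka/ke) / (ka - ke)
print(tmax, np.log(1.2 / 0.2) / (1.2 - 0.2))
```

A PBPK model replaces this single compartment with physiologically parameterized GI segments, organs, and blood flows, which is precisely where the review identifies the open challenges (luminal composition, transit, permeability, gut-wall metabolism).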
TU-G-210-02: TRANS-FUSIMO - An Integrative Approach to Model-Based Treatment Planning of Liver FUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preusser, T.
Modeling can play a vital role in predicting, optimizing and analyzing the results of therapeutic ultrasound treatments. Simulating the propagating acoustic beam in various targeted regions of the body allows for the prediction of the resulting power deposition and temperature profiles. In this session we will apply various modeling approaches to breast, abdominal organ and brain treatments. Of particular interest is the effectiveness of procedures for correcting for phase aberrations caused by intervening irregular tissues, such as the skull in transcranial applications or inhomogeneous breast tissues. Also described are methods to compensate for motion in targeted abdominal organs such as the liver or kidney. Douglas Christensen – Modeling for Breast and Brain HIFU Treatment Planning Tobias Preusser – TRANS-FUSIMO – An Integrative Approach to Model-Based Treatment Planning of Liver FUS Learning Objectives: Understand the role of acoustic beam modeling for predicting the effectiveness of therapeutic ultrasound treatments. Apply acoustic modeling to specific breast, liver, kidney and transcranial anatomies. Determine how to obtain appropriate acoustic modeling parameters from clinical images. Understand the separate role of absorption and scattering in energy delivery to tissues. See how organ motion can be compensated for in ultrasound therapies. Compare simulated data with clinical temperature measurements in transcranial applications. Supported by NIH R01 HL172787 and R01 EB013433 (DC); EU Seventh Framework Programme (FP7/2007-2013) under 270186 (FUSIMO) and 611889 (TRANS-FUSIMO)(TP); and P01 CA159992, GE, FUSF and InSightec (UV)
Competitive assessment of aerospace systems using system dynamics
NASA Astrophysics Data System (ADS)
Pfaender, Jens Holger
Aircraft design has recently experienced a trend away from performance-centric design towards a more balanced approach with increased emphasis on engineering an economically successful system. This approach focuses on bringing forward a comprehensive economic and life-cycle cost analysis. Since the success of any system also depends on many external factors outside of the control of the designer, this traditionally has been modeled as noise affecting the uncertainty of the design. However, this approach currently lacks a strategic treatment of necessary early decisions affecting the probability of success of a given concept in a dynamic environment. This suggests that the introduction of a dynamic method into a life-cycle cost analysis should allow the analysis of the future attractiveness of such a concept in the presence of uncertainty. One way of addressing this is through the use of a competitive market model. However, existing market models do not focus on the dynamics of the market. Instead, they focus on modeling and predicting market share through logit regression models. The resulting models exhibit relatively poor predictive capabilities. The method proposed here focuses on a top-down approach that integrates a competitive model based on work in the field of system dynamics into the aircraft design process. Demonstrating such an integration, which had not previously been achieved, is one of the primary contributions of this work. This integration is achieved through the use of surrogate models, in this case neural networks. This enabled not only the practical integration of analysis techniques, but also reduced the computational requirements enough to make the envisioned interactive exploration possible. The example demonstration of this integration is built on the competition in the 250-seat large commercial aircraft market, exemplified by the Boeing 767-400ER and the Airbus A330-200. 
Both aircraft models were calibrated to existing performance and certification data and then integrated into the system dynamics market model. The market model was then calibrated with historical market data. This calibration showed a much improved predictive capability as compared to the conventional logit regression models. An additional advantage of this dynamic model is that no additional explanatory variables were required to realize this improved capability. Furthermore, the resulting market model was then integrated into a prediction profiler environment with a time-variant Monte-Carlo analysis, resulting in a unique trade-off environment. This environment was shown to allow interactive trade-offs between aircraft design decisions and economic considerations while allowing the exploration of potential market success in the light of varying external market conditions and scenarios. The resulting method is capable of reducing decision-support uncertainty and identifying robust design decisions in future scenarios with a high likelihood of occurrence, with special focus on the path-dependent nature of the future implications of decisions. Furthermore, it was possible to demonstrate the increased importance of design and technology choices on competitiveness in scenarios with drastic increases in commodity prices during the time period modeled. Another use of the existing outputs of the Monte-Carlo analysis was then realized by showing them on a multivariate scatter plot. This plot was shown, by appropriate grouping of variables, to enable the top-down definition of an aircraft design, also known as inverse design. In other words, this enables the designer to define strategic market and return-on-investment goals for a number of scenarios, for example the development of fuel prices, and then directly see which specific aircraft designs meet these goals.
Zoccolotti, Pierluigi; De Luca, Maria; Marinelli, Chiara V.; Spinelli, Donatella
2014-01-01
This study was aimed at predicting individual differences in text reading fluency. The basic proposal included two factors, i.e., the ability to decode letter strings (measured by discrete pseudo-word reading) and integration of the various sub-components involved in reading (measured by Rapid Automatized Naming, RAN). Subsequently, a third factor was added to the model, i.e., naming of discrete digits. In order to use homogeneous measures, all contributing variables considered the entire processing of the item, including pronunciation time. The model, which was based on commonality analysis, was applied to data from a group of 43 typically developing readers (11- to 13-year-olds) and a group of 25 chronologically matched dyslexic children. In typically developing readers, both orthographic decoding and integration of reading sub-components contributed significantly to the overall prediction of text reading fluency. The model prediction was higher (from ca. 37 to 52% of the explained variance) when we included the naming of discrete digits variable, which had a suppressive effect on pseudo-word reading. In the dyslexic readers, the variance explained by the two-factor model was high (69%) and did not change when the third factor was added. The lack of a suppression effect was likely due to the prominent individual differences in poor orthographic decoding of the dyslexic children. Analyses on data from both groups of children were replicated by using patches of colors as stimuli (both in the RAN task and in the discrete naming task), obtaining similar results. We conclude that it is possible to predict much of the variance in text-reading fluency using basic processes, such as orthographic decoding and integration of reading sub-components, even without taking into consideration higher-order linguistic factors such as lexical, semantic and contextual abilities. The validity of using proximal vs. distal causes to predict reading fluency is discussed. 
PMID:25477856
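Commonality analysis, the method named in the abstract above, partitions a regression's explained variance into unique and shared components; for two predictors it needs only the R-squared values of the three sub-models. A minimal sketch on synthetic data (all variable names and coefficients are invented, not the study's data):

```python
import numpy as np

def r_squared(X, y):
    """R-squared of an OLS fit of y on the columns of X (plus intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def commonality_two(x1, x2, y):
    """Partition the R-squared of y ~ x1 + x2 into unique and common parts."""
    r2_full = r_squared(np.column_stack([x1, x2]), y)
    r2_1 = r_squared(x1[:, None], y)
    r2_2 = r_squared(x2[:, None], y)
    unique1 = r2_full - r2_2        # explained only by x1
    unique2 = r2_full - r2_1        # explained only by x2
    common = r2_1 + r2_2 - r2_full  # shared; negative indicates suppression
    return unique1, unique2, common

# Synthetic data: two correlated predictors of a fluency-like outcome
rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=500)
y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=500)
u1, u2, shared = commonality_two(x1, x2, y)
```

A negative `common` component is the suppression effect the abstract describes: adding the suppressor raises the unique contributions above the total R-squared.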
O'Neill, Colette M; Kazantzidis, Andreas; Kiely, Mairead; Cox, Lorna; Meadows, Sarah; Goldberg, Gail; Prentice, Ann; Kift, Richard; Webb, Ann R; Cashman, Kevin D
2017-10-01
Within Europe, dark-skinned ethnic groups have been shown to be at much increased risk of vitamin D deficiency compared to their white counterparts. Increasing the dietary supply of vitamin D is potentially the only modifiable environmental component that can be used to prevent vitamin D deficiency among dark-skinned ethnic groups living at high latitude. Empirical data to support development of such strategies is largely lacking. This paper presents the development and validation of an integrated model that may be adapted within the UK population to design fortification strategies for vitamin D, for application in both white and black and Asian minority ethnic (BAME) population groups. Using a step-wise approach, models based on available ultraviolet B (UVB) data, hours of sunlight and two key components (the dose-response of serum 25-hydroxyvitamin D [25(OH)D] to UVB in white and BAME persons and the dose-response of 25(OH)D to vitamin D intake) were used to predict changes in population serum 25(OH)D concentrations throughout the year, stratified by ethnicity, via increases in dietary intake arising from food fortification simulations. The integrated model successfully predicted measured average wintertime 25(OH)D concentrations in addition to the prevalence of serum 25(OH)D <30 nmol/L in adult white and BAME individuals (18-70 y) in the UK-based National Diet and Nutrition Survey, both separately (21.7% and 49.3% predicted versus 20.2% and 50.5% measured, for white and BAME, respectively) and when combined at UK population-relevant proportions of 93% white and 7% BAME (23.2% predicted versus 23.1% measured). Thus, this integrated model presents a viable approach to estimating changes in the population concentrations of 25(OH)D that may arise from various dietary fortification approaches. Copyright © 2016 Elsevier Ltd. All rights reserved.
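The core of such a fortification simulation is a dose-response that shifts the population 25(OH)D distribution and a prevalence readout against the 30 nmol/L deficiency threshold. The sketch below assumes a normal population distribution and a linear intake slope; both the distribution parameters and the slope are hypothetical placeholders, not the paper's fitted values:

```python
import numpy as np

def prevalence_below(threshold, mean, sd, n=100_000, seed=1):
    """Monte-Carlo estimate of the fraction of a normally distributed
    population 25(OH)D concentration (nmol/L) below a threshold."""
    rng = np.random.default_rng(seed)
    return float(np.mean(rng.normal(mean, sd, n) < threshold))

def fortified_mean(base_mean, extra_intake_ug, slope=1.2):
    """Shift the population mean by a linear intake dose-response;
    the slope (nmol/L per microgram/day) is a hypothetical value."""
    return base_mean + slope * extra_intake_ug

# Hypothetical wintertime scenario: fortification adds 10 ug/day vitamin D
before = prevalence_below(30.0, mean=40.0, sd=15.0)
after = prevalence_below(30.0, mean=fortified_mean(40.0, 10.0), sd=15.0)
```

Stratifying by ethnicity amounts to running this with group-specific baseline means (UVB-driven) and combining prevalences at the population proportions.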
Control theory for scanning probe microscopy revisited.
Stirling, Julian
2014-01-01
We derive a theoretical model for studying SPM feedback in the context of control theory. Previous models presented in the literature that apply standard models for proportional-integral-derivative controllers predict a highly unstable feedback environment. This model uses features specific to the SPM implementation of the proportional-integral controller to give realistic feedback behaviour. As such the stability of SPM feedback for a wide range of feedback gains can be understood. Further consideration of mechanical responses of the SPM system gives insight into the causes of exciting mechanical resonances of the scanner during feedback operation.
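The stability question above can be illustrated with a toy discrete-time PI loop closed around a first-order plant. The plant, gains and time constants below are arbitrary stand-ins for the tip-height response, not the paper's SPM model:

```python
import numpy as np

def simulate_pi(kp, ki, setpoint=1.0, dt=1e-4, steps=2000, tau=5e-4):
    """Discrete PI feedback on a first-order plant z' = (u - z)/tau,
    a toy stand-in for an SPM feedback loop (illustrative only)."""
    z, integral = 0.0, 0.0
    history = np.empty(steps)
    for k in range(steps):
        error = setpoint - z
        integral += error * dt
        u = kp * error + ki * integral   # PI control signal
        z += dt * (u - z) / tau          # plant update (forward Euler)
        history[k] = z
    return history

traj = simulate_pi(kp=0.5, ki=800.0)   # settles to the setpoint
```

Sweeping `kp` and `ki` in such a loop maps out the stable and oscillatory gain regions that the paper analyses formally.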
Arnautovska, Urska; Fleig, Lena; O'Callaghan, Frances; Hamilton, Kyra
2017-02-01
To assess the effects of conscious and non-conscious processes for prediction of older adults' physical activity (PA), we tested a dual-process model that integrated motivational (behavioural intention) and volitional (action planning and coping planning) processes with non-conscious, automatic processes (habit). Participants (N = 215) comprised community-dwelling older adults (M = 73.8 years). A longitudinal design was adopted to investigate direct and indirect effects of intentions, habit strength (Time 1), and action planning and coping planning (Time 2) on PA behaviour (Time 3). Structural equation modelling was used to evaluate the model. The model provided a good fit to the data, accounting for 44% of the variance in PA behaviour at Time 3. PA was predicted by intentions, action planning, and habit strength, with action planning mediating the intention-behaviour relationship. An effect of sex was also found where males used fewer planning strategies and engaged in more PA than females. By investigating an integration of conscious and non-conscious processes, this study provides a novel understanding of older adults' PA. Interventions aiming to promote PA behaviour of older adults should target the combination of psychological processes.
USE OF PHARMACOKINETIC MODELING TO DESIGN STUDIES FOR PATHWAY-SPECIFIC EXPOSURE MODEL EVALUATION
Validating an exposure pathway model is difficult because the biomarker, which is often used to evaluate the model prediction, is an integrated measure of exposures from all the exposure routes/pathways. The purpose of this paper is to demonstrate a method to use pharmacokineti...
Risk adjustment alternatives in paying for behavioral health care under Medicaid.
Ettner, S L; Frank, R G; McGuire, T G; Hermann, R C
2001-01-01
OBJECTIVE: To compare the performance of various risk adjustment models in behavioral health applications such as setting mental health and substance abuse (MH/SA) capitation payments or overall capitation payments for populations including MH/SA users. DATA SOURCES/STUDY DESIGN: The 1991-93 administrative data from the Michigan Medicaid program were used. We compared mean absolute prediction error for several risk adjustment models and simulated the profits and losses that behavioral health care carve outs and integrated health plans would experience under risk adjustment if they enrolled beneficiaries with a history of MH/SA problems. Models included basic demographic adjustment, Adjusted Diagnostic Groups, Hierarchical Condition Categories, and specifications designed for behavioral health. PRINCIPAL FINDINGS: Differences in predictive ability among risk adjustment models were small and generally insignificant. Specifications based on relatively few MH/SA diagnostic categories did as well as or better than models controlling for additional variables such as medical diagnoses at predicting MH/SA expenditures among adults. Simulation analyses revealed that among both adults and minors considerable scope remained for behavioral health care carve outs to make profits or losses after risk adjustment based on differential enrollment of severely ill patients. Similarly, integrated health plans have strong financial incentives to avoid MH/SA users even after adjustment. CONCLUSIONS: Current risk adjustment methodologies do not eliminate the financial incentives for integrated health plans and behavioral health care carve-out plans to avoid high-utilizing patients with psychiatric disorders. PMID:11508640
Hybrid perturbation methods based on statistical time series models
NASA Astrophysics Data System (ADS)
San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario
2016-04-01
In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies: in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered; moreover, mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
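The additive Holt-Winters method named above smooths a series into level, trend and seasonal components and extrapolates them. A minimal sketch follows; the smoothing constants and the toy series (a linear trend plus a period-4 term, loosely mimicking a secular drift with a periodic perturbation) are illustrative choices, not the paper's tuned propagator:

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=1):
    """Additive Holt-Winters smoothing of series y with seasonal period m;
    returns forecasts for `horizon` steps beyond the end of the series."""
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]
    for t in range(len(y)):
        s = season[t % m]
        prev_level = level
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s
    return [level + h * trend + season[(len(y) + h - 1) % m]
            for h in range(1, horizon + 1)]

# Toy series: linear growth plus a period-4 periodic term
y = [0.5 * t + [0.0, 2.0, -1.0, -1.0][t % 4] for t in range(80)]
forecast = holt_winters_additive(y, m=4, horizon=4)
```

In the hybrid propagator the series being smoothed is the residual between the analytical theory and the reference dynamics, and the forecast corrects future propagations.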
Human performance cognitive-behavioral modeling: a benefit for occupational safety.
Gore, Brian F
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
Barutta, Joaquin; Guex, Raphael; Ibáñez, Agustín
2010-06-01
From everyday cognition to scientific discovery, analogical processes play an important role: bringing connection, integration, and interrelation of information. Recently, a PFC model of analogy has been proposed to explain many cognitive processes and integrate general functional properties of the PFC. We argue here that analogical processes do not suffice to explain the cognitive processes and functions of the PFC. Moreover, the model does not satisfactorily integrate the specific explanatory mechanisms required for the different processes involved. Its relevance would be improved if fewer cognitive phenomena were considered and more specific predictions and explanations about those processes were stated.
Integrated modeling applications for tokamak experiments with OMFIT
NASA Astrophysics Data System (ADS)
Meneghini, O.; Smith, S. P.; Lao, L. L.; Izacard, O.; Ren, Q.; Park, J. M.; Candy, J.; Wang, Z.; Luna, C. J.; Izzo, V. A.; Grierson, B. A.; Snyder, P. B.; Holland, C.; Penna, J.; Lu, G.; Raum, P.; McCubbin, A.; Orlov, D. M.; Belli, E. A.; Ferraro, N. M.; Prater, R.; Osborne, T. H.; Turnbull, A. D.; Staebler, G. M.
2015-08-01
One Modeling Framework for Integrated Tasks (OMFIT) is a comprehensive integrated modeling framework which has been developed to enable physics codes to interact in complicated workflows, and support scientists at all stages of the modeling cycle. The OMFIT development follows a unique bottom-up approach, where the framework design and capabilities organically evolve to support progressive integration of the components that are required to accomplish physics goals of increasing complexity. OMFIT provides a workflow for easily generating full kinetic equilibrium reconstructions that are constrained by magnetic and motional Stark effect measurements, and kinetic profile information that includes fast-ion pressure modeled by a transport code. It was found that magnetic measurements can be used to quantify the amount of anomalous fast-ion diffusion that is present in DIII-D discharges, and provide an estimate that is consistent with what would be needed for transport simulations to match the measured neutron rates. OMFIT was used to streamline edge-stability analyses, and evaluate the effect of resonant magnetic perturbations (RMPs) on pedestal stability, which was found to be consistent with the experimental observations. The framework also supported the development of a five-dimensional numerical fluid model for estimating the effects of the interaction between magnetohydrodynamics (MHD) and microturbulence, and its systematic verification against analytic models. OMFIT was used for optimizing an innovative high-harmonic fast wave system proposed for DIII-D. For a parallel refractive index n∥ > 3, the conditions for strong electron-Landau damping were found to be independent of the launched n∥ and poloidal angle. OMFIT has been the platform of choice for developing a neural-network based approach to efficiently perform a non-linear multivariate regression of local transport fluxes as a function of local dimensionless parameters. 
Transport predictions for thousands of DIII-D discharges showed excellent agreement with the power balance calculations across the whole plasma radius and over a broad range of operating regimes. Concerning predictive transport simulations, the framework made possible the design and automation of a workflow that enables self-consistent predictions of kinetic profiles and the plasma equilibrium. It is found that the feedback between the transport fluxes and plasma equilibrium can significantly affect the kinetic profiles predictions. Such a rich set of results provide tangible evidence of how bottom-up approaches can potentially provide a fast track to integrated modeling solutions that are functional, cost-effective, and in sync with the research effort of the community.
Gaussian functional regression for output prediction: Model assimilation and experimental design
NASA Astrophysics Data System (ADS)
Nguyen, N. C.; Peraire, J.
2016-03-01
In this paper, we introduce a Gaussian functional regression (GFR) technique that integrates multi-fidelity models with model reduction to efficiently predict the input-output relationship of a high-fidelity model. The GFR method combines the high-fidelity model with a low-fidelity model to provide an estimate of the output of the high-fidelity model in the form of a posterior distribution that can characterize uncertainty in the prediction. A reduced basis approximation is constructed upon the low-fidelity model and incorporated into the GFR method to yield an inexpensive posterior distribution of the output estimate. As this posterior distribution depends crucially on a set of training inputs at which the high-fidelity models are simulated, we develop a greedy sampling algorithm to select the training inputs. Our approach results in an output prediction model that inherits the fidelity of the high-fidelity model and has the computational complexity of the reduced basis approximation. Numerical results are presented to demonstrate the proposed approach.
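The idea of combining a cheap low-fidelity model with a statistical correction can be illustrated by a plain Gaussian-process regression whose prior mean is the low-fidelity model, so the GP only has to learn the discrepancy. This is a simplified stand-in for the paper's GFR formulation; the kernel, length scale and test functions below are made up:

```python
import numpy as np

def rbf(a, b, ell=0.3, s2=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return s2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_multifidelity(x_train, y_hi, lofi, x_test, noise=1e-6):
    """GP posterior for a high-fidelity output, using a cheap low-fidelity
    model `lofi` as the prior mean (illustrative simplification of GFR)."""
    resid = y_hi - lofi(x_train)                 # model only the discrepancy
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    alpha = np.linalg.solve(K, resid)
    mean = lofi(x_test) + Ks @ alpha
    var = rbf(x_test, x_test).diagonal() - np.einsum(
        'ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var

# Toy setting: hi-fi output = sin(2*pi*x), lo-fi = crude linear surrogate
hifi = lambda x: np.sin(2 * np.pi * x)
lofi = lambda x: 1.0 - 2.0 * x                   # hypothetical cheap model
x_train = np.linspace(0, 1, 8)
mean, var = gp_multifidelity(x_train, hifi(x_train), lofi,
                             np.array([0.25, 0.75]))
```

The posterior variance shrinks near the training inputs, which is what drives greedy training-input selection of the kind the paper describes.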
TAMPA BAY MODEL EVALUATION AND ASSESSMENT
A long term goal of multimedia environmental management is to achieve sustainable ecological resources. Progress towards this goal rests on a foundation of science-based methods and data integrated into predictive multimedia, multi-stressor open architecture modeling systems. The...
Uzun, Harun; Yıldız, Zeynep; Goldfarb, Jillian L; Ceylan, Selim
2017-06-01
As biomass becomes more integrated into our energy feedstocks, the ability to predict its combustion enthalpies from routine data such as carbon, ash, and moisture content enables rapid decisions about utilization. The present work constructs a novel artificial neural network model with a 3-3-1 tangent sigmoid architecture to predict biomasses' higher heating values from only their proximate analyses, requiring minimal specificity as compared to models based on elemental composition. The model presented has a considerably higher correlation coefficient (0.963) and lower root mean square (0.375), mean absolute (0.328), and mean bias errors (0.010) than other models presented in the literature which, at least when applied to the present data set, tend to under-predict the combustion enthalpy. Copyright © 2017 Elsevier Ltd. All rights reserved.
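A 3-3-1 tangent-sigmoid network of the kind described is just three tanh hidden units feeding one linear output. The sketch below shows the forward pass only; the weights are random placeholders, not the paper's fitted values, and real use would require normalizing the proximate-analysis inputs and training on measured HHV data:

```python
import numpy as np

def ann_331_forward(x, W1, b1, W2, b2):
    """Forward pass of a 3-3-1 network with a tangent-sigmoid hidden layer.
    Inputs x are proximate-analysis features (e.g. fixed carbon, volatile
    matter, ash); output is a scaled HHV prediction."""
    h = np.tanh(W1 @ x + b1)   # 3 hidden tansig units
    return W2 @ h + b2         # single linear output unit

rng = np.random.default_rng(42)
W1, b1 = rng.normal(size=(3, 3)), rng.normal(size=3)   # placeholder weights
W2, b2 = rng.normal(size=3), 0.0
x = np.array([0.45, 0.35, 0.05])   # illustrative mass fractions
hhv_scaled = ann_331_forward(x, W1, b1, W2, b2)
```

With tanh hidden units the output is bounded by b2 plus the absolute sum of the output weights, which is why targets are typically scaled before training.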
Frost Growth CFD Model of an Integrated Active Desiccant Rooftop Unit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geoghegan, Patrick J; Petrov, Andrei Y; Vineyard, Edward Allan
2008-01-01
A frost growth model is incorporated into a Computational Fluid Dynamics (CFD) simulation of a heat pump by means of a user-defined function in FLUENT, a commercial CFD code. The transient model is applied to the outdoor section of an Integrated Active Desiccant Rooftop (IADR) unit in heating mode. IADR is a hybrid vapor compression and active desiccant unit capable of handling 100% outdoor air (dedicated outdoor air system) or acting as a total conditioning system, handling both outdoor air and space cooling or heating loads. The predicted increase in flow resistance and loss in heat transfer capacity due to frost build-up are compared to experimental pressure drop readings and thermal imaging. The purpose of this work is to develop a CFD model that is capable of predicting frost growth, an invaluable tool in evaluating the effectiveness of defrost-on-demand cycles.
A simple two-stage model predicts response time distributions.
Carpenter, R H S; Reddi, B A J; Anderson, A J
2009-08-15
The neural mechanisms underlying reaction times have previously been modelled in two distinct ways. When stimuli are hard to detect, response time tends to follow a random-walk model that integrates noisy sensory signals. But studies investigating the influence of higher-level factors such as prior probability and response urgency typically use highly detectable targets, and response times then usually correspond to a linear rise-to-threshold mechanism. Here we show that a model incorporating both types of element in series - a detector integrating noisy afferent signals, followed by a linear rise-to-threshold performing decision - successfully predicts not only mean response times but, much more stringently, the observed distribution of these times and the rate of decision errors over a wide range of stimulus detectability. By reconciling what previously may have seemed to be conflicting theories, we are now closer to having a complete description of reaction time and the decision processes that underlie it.
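The serial two-stage mechanism can be sketched by simulating a noisy random-walk detector whose output triggers a LATER-style linear rise-to-threshold decision stage. All parameter values below are illustrative, not the fitted values from the paper:

```python
import numpy as np

def simulate_rt(n=2000, drift=0.05, noise=0.5, detect_thresh=5.0,
                rate_mean=5.0, rate_sd=1.0, decide_thresh=1.0,
                dt=1e-3, seed=0):
    """Monte-Carlo sketch of the serial two-stage model: a noisy random-walk
    detector followed by a LATER-style linear rise-to-threshold decision
    stage. Parameter values are illustrative only."""
    rng = np.random.default_rng(seed)
    rts = np.empty(n)
    for i in range(n):
        # Stage 1: integrate noisy sensory evidence to a detection threshold
        evidence, steps = 0.0, 0
        while evidence < detect_thresh:
            evidence += drift + noise * rng.standard_normal()
            steps += 1
        # Stage 2: linear rise to a decision threshold at a rate drawn
        # afresh each trial from a Gaussian (the LATER model)
        rate = max(rng.normal(rate_mean, rate_sd), 1e-6)
        rts[i] = steps * dt + decide_thresh / rate
    return rts

rts = simulate_rt()
```

Lowering `drift` (harder detection) makes the skewed first stage dominate the distribution; raising it makes the distribution approach the LATER component, mirroring the detectability trade-off the paper describes.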
Integrating Biological and Chemical Data for Hepatotoxicity Prediction (SOT)
The U.S. EPA ToxCastTM program is screening thousands of environmental chemicals for bioactivity using hundreds of high-throughput in vitro assays to build predictive models of toxicity. A set of 677 chemicals were represented by 711 bioactivity descriptors (from ToxCast assays),...
Quality of Community Life among Rural Residents: An Integrated Model
ERIC Educational Resources Information Center
Auh, Seongyeon; Cook, Christine C.
2009-01-01
The purpose of this research was to explore the relationships among housing satisfaction, community attachment and community satisfaction and the complex mechanisms involved in predicting community satisfaction among residents in rural communities. The role of housing satisfaction and community attachment in predicting community satisfaction was…
Category-Specific Neural Oscillations Predict Recall Organization During Memory Search
Morton, Neal W.; Kahana, Michael J.; Rosenberg, Emily A.; Baltuch, Gordon H.; Litt, Brian; Sharan, Ashwini D.; Sperling, Michael R.; Polyn, Sean M.
2013-01-01
Retrieved-context models of human memory propose that as material is studied, retrieval cues are constructed that allow one to target particular aspects of past experience. We examined the neural predictions of these models by using electrocorticographic/depth recordings and scalp electroencephalography (EEG) to characterize category-specific oscillatory activity, while participants studied and recalled items from distinct, neurally discriminable categories. During study, these category-specific patterns predict whether a studied item will be recalled. In the scalp EEG experiment, category-specific activity during study also predicts whether a given item will be recalled adjacent to other same-category items, consistent with the proposal that a category-specific retrieval cue is used to guide memory search. Retrieved-context models suggest that integrative neural circuitry is involved in the construction and maintenance of the retrieval cue. Consistent with this hypothesis, we observe category-specific patterns that rise in strength as multiple same-category items are studied sequentially, and find that individual differences in this category-specific neural integration during study predict the degree to which a participant will use category information to organize memory search. Finally, we track the deployment of this retrieval cue during memory search: Category-specific patterns are stronger when participants organize their responses according to the category of the studied material. PMID:22875859
Prediction of tautomer ratios by embedded-cluster integral equation theory
NASA Astrophysics Data System (ADS)
Kast, Stefan M.; Heil, Jochen; Güssregen, Stefan; Schmidt, K. Friedemann
2010-04-01
The "embedded cluster reference interaction site model" (EC-RISM) approach combines statistical-mechanical integral equation theory and quantum-chemical calculations for predicting thermodynamic data for chemical reactions in solution. The electronic structure of the solute is determined self-consistently with the structure of the solvent that is described by 3D RISM integral equation theory. The continuous solvent-site distribution is mapped onto a set of discrete background charges ("embedded cluster") that represent an additional contribution to the molecular Hamiltonian. The EC-RISM analysis of the SAMPL2 challenge set of tautomers proceeds in three stages. Firstly, the group of compounds for which quantitative experimental free energy data was provided was taken to determine appropriate levels of quantum-chemical theory for geometry optimization and free energy prediction. Secondly, the resulting workflow was applied to the full set, allowing for chemical interpretations of the results. Thirdly, disclosure of experimental data for parts of the compounds facilitated a detailed analysis of methodical issues and suggestions for future improvements of the model. Without specifically adjusting parameters, the EC-RISM model yields the smallest value of the root mean square error for the first set (0.6 kcal mol⁻¹) as well as for the full set of quantitative reaction data (2.0 kcal mol⁻¹) among the SAMPL2 participants.
Killeen, G F; McKenzie, F E; Foy, B D; Schieffelin, C; Billingsley, P F; Beier, J C
2000-05-01
We have used a relatively simple but accurate model for predicting the impact of integrated transmission control on the malaria entomologic inoculation rate (EIR) at four endemic sites from across sub-Saharan Africa and the southwest Pacific. The simulated campaign incorporated modestly effective vaccine coverage, bed net use, and larval control. The results indicate that such campaigns would reduce EIRs at all four sites by 30- to 50-fold. Even without the vaccine, 15- to 25-fold reductions of EIR were predicted, implying that integrated control with a few modestly effective tools can meaningfully reduce malaria transmission in a range of endemic settings. The model accurately predicts the effects of bed nets and indoor spraying and demonstrates that they are the most effective tools available for reducing EIR. However, the impact of domestic adult vector control is amplified by measures for reducing the rate of emergence of vectors or the level of infectiousness of the human reservoir. We conclude that available tools, including currently neglected methods for larval control, can reduce malaria transmission intensity enough to alleviate mortality. Integrated control programs should be implemented to the fullest extent possible, even in areas of intense transmission, using simple models as decision-making tools. However, we also conclude that to eliminate malaria in many areas of intense transmission is beyond the scope of methods which developing nations can currently afford. New, cost-effective, practical tools are needed if malaria is ever to be eliminated from highly endemic areas.
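The amplification the abstract describes, where reducing adult vector survival cuts transmission far more than proportionally, falls out of Ross-Macdonald-style arithmetic because survival enters as a power of the sporogonic period. The sketch below is an illustrative simplification with invented parameter values, not the authors' actual model:

```python
import math

def eir_sketch(emergence, bite_rate, daily_survival, sporogony_days, kappa):
    """Ross-Macdonald-style EIR sketch: vectorial capacity scaled by the
    infectiousness of the human reservoir, kappa (illustrative only)."""
    return (emergence * bite_rate ** 2
            * daily_survival ** sporogony_days   # survive sporogony
            / -math.log(daily_survival)          # infectious lifespan
            * kappa)

baseline = eir_sketch(1.0, 0.5, 0.9, 11, 0.05)
# Integrated control: nets cut biting by 40% and daily survival to 0.82,
# larval control halves emergence, a modest vaccine cuts kappa by 30%
controlled = eir_sketch(0.5, 0.3, 0.82, 11, 0.035)
fold_reduction = baseline / controlled
```

Even though each intervention is individually modest, the survival exponent and the squared biting rate multiply them into a reduction of tens of fold, matching the qualitative behaviour reported.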
Integrating Stomach Content and Stable Isotope Analyses to Quantify the Diets of Pygoscelid Penguins
Polito, Michael J.; Trivelpiece, Wayne Z.; Karnovsky, Nina J.; Ng, Elizabeth; Patterson, William P.; Emslie, Steven D.
2011-01-01
Stomach content analysis (SCA) and more recently stable isotope analysis (SIA) integrated with isotopic mixing models have become common methods for dietary studies and provide insight into the foraging ecology of seabirds. However, both methods have drawbacks and biases that may result in difficulties in quantifying inter-annual and species-specific differences in diets. We used these two methods to simultaneously quantify the chick-rearing diet of Chinstrap (Pygoscelis antarctica) and Gentoo (P. papua) penguins and highlight methods of integrating SCA data to increase accuracy of diet composition estimates using SIA. SCA biomass estimates were highly variable and underestimated the importance of soft-bodied prey such as fish. Two-source isotopic mixing model predictions were less variable and identified inter-annual and species-specific differences in the relative amounts of fish and krill in penguin diets not readily apparent using SCA. In contrast, multi-source isotopic mixing models had difficulty estimating the dietary contribution of fish species occupying similar trophic levels without refinement using SCA-derived otolith data. Overall, our ability to track inter-annual and species-specific differences in penguin diets using SIA was enhanced by integrating SCA data into isotopic mixing models in three ways: 1) selecting appropriate prey sources, 2) weighting combinations of isotopically similar prey in two-source mixing models and 3) refining predicted contributions of isotopically similar prey in multi-source models. PMID:22053199
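A two-source isotopic mixing model of the kind described above reduces to a linear mass balance in one tracer; a minimal sketch with hypothetical δ15N end-member values (the function and its arguments are illustrative, not taken from the study):

```python
def two_source_mixing(delta_consumer, delta_a, delta_b, enrichment=0.0):
    """Fraction of source A in the diet from a single isotope tracer.

    The consumer value is corrected for trophic enrichment, then the
    linear two-source mass balance is solved for the source-A fraction.
    """
    corrected = delta_consumer - enrichment
    f_a = (corrected - delta_b) / (delta_a - delta_b)
    return min(max(f_a, 0.0), 1.0)  # clamp to the feasible [0, 1] range

# Hypothetical delta-15N values (per mil): fish vs. krill end members.
f_fish = two_source_mixing(delta_consumer=10.2, delta_a=9.0,
                           delta_b=4.5, enrichment=3.4)
```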
An integrated Modelling framework to monitor and predict trends of agricultural management (iMSoil)
NASA Astrophysics Data System (ADS)
Keller, Armin; Della Peruta, Raneiro; Schaepman, Michael; Gomez, Marta; Mann, Stefan; Schulin, Rainer
2014-05-01
Agricultural systems lie at the interface between natural ecosystems and the anthroposphere. Various drivers induce pressures on the agricultural systems, leading to changes in farming practice. The limitation of available land and the socio-economic drivers are likely to result in further intensification of agricultural land management, with implications for fertilization practices, soil and pest management, as well as crop and livestock production. In order to steer the development in desired directions, tools are required by which the effects of these pressures on agricultural management and resulting impacts on soil functioning can be detected as early as possible, future scenarios predicted and suitable management options and policies defined. In this context, the use of integrated models can play a major role in providing long-term predictions of soil quality and assessing the sustainability of agricultural soil management. Significant progress has been made in this field over the last decades. Some of these integrated modelling frameworks include biophysical parameters, but often the inherent characteristics and detailed processes of the soil system have been greatly simplified. The development of such tools has been hampered in the past by a lack of spatially explicit soil and land management information at regional scale. The iMSoil project, funded by the Swiss National Science Foundation in the national research programme NRP68 "soil as a resource" (www.nrp68.ch), aims at developing and implementing an integrated modelling framework (IMF) that can overcome the limitations mentioned above by combining socio-economic, agricultural land management, and biophysical models, in order to predict the long-term impacts of different socio-economic scenarios on soil quality.
In our presentation we briefly outline the approach that is based on an interdisciplinary modular framework that builds on already existing monitoring tools and model components that are currently in development: (i) the socio-economic agent-based model SWISSland; (ii) a land management downscaling approach that provides crop rotation, fertilisers and pesticides application rates for each land management unit, and (iii) the agro-ecosystem model EPIC, which is currently being calibrated with long-term soil measurements and agricultural management data provided by the Swiss Soil Monitoring Network. Moreover, the IMF will make use of land cover information derived from remote sensing to continuously update predictions. The IMF will be tested on two case study regions to develop indicators of sustainable soil management that can be implemented into Swiss policies.
The experience of agency: an interplay between prediction and postdiction
Synofzik, Matthis; Vosgerau, Gottfried; Voss, Martin
2013-01-01
The experience of agency, i.e., the registration that I am the initiator of my actions, is a basic and constant underpinning of our interaction with the world. Whereas several accounts have underlined predictive processes as the central mechanism (e.g., the comparator model by C. Frith), others emphasized postdictive inferences (e.g., the post-hoc inference account by D. Wegner). Based on increasing evidence that both predictive and postdictive processes contribute to the experience of agency, we here present a unifying but at the same time parsimonious approach that reconciles these accounts: predictive and postdictive processes are both integrated by the brain according to the principles of optimal cue integration. According to this framework, predictive and postdictive processes each serve as authorship cues that are continuously integrated and weighted depending on their availability and reliability in a given situation. Both sensorimotor and cognitive signals can serve as predictive cues (e.g., internal predictions based on an efference copy of the motor command or cognitive anticipations based on priming). Similarly, other sensorimotor and cognitive cues can each serve as post-hoc cues (e.g., visual feedback of the action or the affective valence of the action outcome). Integration and weighting of these cues might not only differ between contexts and individuals, but also between different subject and disease groups. For example, schizophrenia patients with delusions of influence seem to rely less on (probably imprecise) predictive motor signals of the action and more on post-hoc action cues such as visual feedback and, possibly, the affective valence of the action outcome. Thus, the framework of optimal cue integration offers a promising approach that directly stimulates a wide range of experimentally testable hypotheses on agency processing in different subject groups. PMID:23508565
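Optimal cue integration is usually formalized as inverse-variance (reliability) weighting, under which the combined estimate is more reliable than any single cue. A minimal sketch with hypothetical cue estimates and variances:

```python
def integrate_cues(cues):
    """Reliability-weighted (inverse-variance) cue combination.

    cues: list of (estimate, variance) pairs. More reliable cues
    (smaller variance) receive larger weights, as in optimal cue
    integration; the combined variance is smaller than any input's.
    """
    precisions = [1.0 / var for _, var in cues]
    total = sum(precisions)
    estimate = sum(p * x for (x, _), p in zip(cues, precisions)) / total
    variance = 1.0 / total
    return estimate, variance

# Hypothetical: a precise predictive motor cue vs. noisy visual feedback.
est, var = integrate_cues([(1.0, 0.1), (2.0, 0.4)])
```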
A Simple Exercise Reveals the Way Students Think about Scientific Modeling
ERIC Educational Resources Information Center
Ruebush, Laura; Sulikowski, Michelle; North, Simon
2009-01-01
Scientific modeling is an integral part of contemporary science, yet many students have little understanding of how models are developed, validated, and used to predict and explain phenomena. A simple modeling exercise led to significant gains in understanding key attributes of scientific modeling while revealing some stubborn misconceptions.…
Bhhatarai, Barun; Wilson, Daniel M.; Price, Paul S.; Marty, Sue; Parks, Amanda K.; Carney, Edward
2016-01-01
Background: Integrative testing strategies (ITSs) for potential endocrine activity can use tiered in silico and in vitro models. Each component of an ITS should be thoroughly assessed. Objectives: We used the data from three in vitro ToxCast™ binding assays to assess OASIS, a quantitative structure-activity relationship (QSAR) platform covering both estrogen receptor (ER) and androgen receptor (AR) binding. For stronger binders (described here as AC50 < 1 μM), we also examined the relationship of QSAR predictions of ER or AR binding to the results from 18 ER and 10 AR transactivation assays, 72 ER-binding reference compounds, and the in vivo uterotrophic assay. Methods: NovaScreen binding assay data for ER (human, bovine, and mouse) and AR (human, chimpanzee, and rat) were used to assess the sensitivity, specificity, concordance, and applicability domain of two OASIS QSAR models. The binding strength relative to the QSAR-predicted binding strength was examined for the ER data. The relationship of QSAR predictions of binding to transactivation- and pathway-based assays, as well as to in vivo uterotrophic responses, was examined. Results: The QSAR models had both high sensitivity (> 75%) and specificity (> 86%) for ER as well as both high sensitivity (92–100%) and specificity (70–81%) for AR. For compounds within the domains of the ER and AR QSAR models that bound with AC50 < 1 μM, the QSAR models accurately predicted the binding for the parent compounds. The parent compounds were active in all transactivation assays where metabolism was incorporated and, except for those compounds known to require metabolism to manifest activity, all assay platforms where metabolism was not incorporated. Compounds in-domain and predicted to bind by the ER QSAR model that were positive in ToxCast™ ER binding at AC50 < 1 μM were active in the uterotrophic assay. 
Conclusions: We used the extensive ToxCast™ HTS binding data set to show that OASIS ER and AR QSAR models had high sensitivity and specificity when compounds were in-domain of the models. Based on this research, we recommend a tiered screening approach wherein a) QSAR is used to identify compounds in-domain of the ER or AR binding models and predicted to bind; b) those compounds are screened in vitro to assess binding potency; and c) the stronger binders (AC50 < 1 μM) are screened in vivo. This scheme prioritizes compounds for integrative testing and risk assessment. Importantly, compounds that are not in-domain, that are predicted either not to bind or to bind weakly, that are not active in vitro, that require metabolism to manifest activity, or for which in vivo AR testing is in order, need to be assessed differently. Citation: Bhhatarai B, Wilson DM, Price PS, Marty S, Parks AK, Carney E. 2016. Evaluation of OASIS QSAR models using ToxCast™ in vitro estrogen and androgen receptor binding data and application in an integrated endocrine screening approach. Environ Health Perspect 124:1453–1461; http://dx.doi.org/10.1289/EHP184 PMID:27152837
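The sensitivity, specificity, and concordance statistics reported above derive from a standard 2x2 confusion table; a minimal sketch with hypothetical counts, not the ToxCast™ tallies:

```python
def binary_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and concordance from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    concordance = (tp + tn) / (tp + fp + tn + fn)  # overall agreement
    return sensitivity, specificity, concordance

# Hypothetical counts for in-domain QSAR binding calls vs. assay results.
sens, spec, conc = binary_metrics(tp=46, fp=12, tn=74, fn=8)
```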
Temperature-mediated growth thresholds of Acrobasis vaccinii (Lepidoptera: Pyralidae)
USDA-ARS?s Scientific Manuscript database
Degree-day models link ambient temperature to the development of insects, making such models valuable tools in integrated pest management. Phenology models increase management efficacy by quantifying and predicting pest phenology. In Wisconsin, the top pest of cranberry production is the cranberry f...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.
State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend in enabling the control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.
Automated System Checkout to Support Predictive Maintenance for the Reusable Launch Vehicle
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Deb, Somnath; Kulkarni, Deepak; Wang, Yao; Lau, Sonie (Technical Monitor)
1998-01-01
The Propulsion Checkout and Control System (PCCS) is a predictive maintenance software system. The real-time checkout procedures and diagnostics are designed to detect components that need maintenance based on their condition, rather than using more conventional approaches such as scheduled or reliability-centered maintenance. Predictive maintenance can reduce turn-around time and cost and increase safety as compared to conventional maintenance approaches. Real-time sensor validation, limit checking, statistical anomaly detection, and failure prediction based on simulation models are employed. Multi-signal models, useful for testability analysis during system design, are used during the operational phase to detect and isolate degraded or failed components. The TEAMS-RT real-time diagnostic engine was developed by Qualtech Systems, Inc. to utilize the multi-signal models. Capability of predicting the maintenance condition was successfully demonstrated with a variety of data, from simulation to actual operation on the Integrated Propulsion Technology Demonstrator (IPTD) at Marshall Space Flight Center (MSFC). Playback of IPTD valve actuations for feature recognition updates identified an otherwise undetectable Main Propulsion System 12-inch prevalve degradation. The algorithms were loaded into the Propulsion Checkout and Control System for further development and are the first known application of predictive Integrated Vehicle Health Management to an operational cryogenic testbed. The software performed successfully in real-time, meeting the required performance goal of a 1-second cycle time.
Macmillan, Donna S; Canipa, Steven J; Chilton, Martyn L; Williams, Richard V; Barber, Christopher G
2016-04-01
There is a pressing need for non-animal methods to predict skin sensitisation potential and a number of in chemico and in vitro assays have been designed with this in mind. However, some compounds can fall outside the applicability domain of these in chemico/in vitro assays and may not be predicted accurately. Rule-based in silico models such as Derek Nexus are expert-derived from animal and/or human data and the mechanism-based alert domain can take a number of factors into account (e.g. abiotic/biotic activation). Therefore, Derek Nexus may be able to predict for compounds outside the applicability domain of in chemico/in vitro assays. To this end, an integrated testing strategy (ITS) decision tree using Derek Nexus and a maximum of two assays (from DPRA, KeratinoSens, LuSens, h-CLAT and U-SENS) was developed. Generally, the decision tree improved upon other ITS evaluated in this study with positive and negative predictivity calculated as 86% and 81%, respectively. Our results demonstrate that an ITS using an in silico model such as Derek Nexus with a maximum of two in chemico/in vitro assays can predict the sensitising potential of a number of chemicals, including those outside the applicability domain of existing non-animal assays. Copyright © 2016 Elsevier Inc. All rights reserved.
Putting mechanisms into crop production models.
Boote, Kenneth J; Jones, James W; White, Jeffrey W; Asseng, Senthold; Lizaso, Jon I
2013-09-01
Crop growth models dynamically simulate processes of C, N and water balance on daily or hourly time-steps to predict crop growth and development and at season-end, final yield. Their ability to integrate effects of genetics, environment and crop management have led to applications ranging from understanding gene function to predicting potential impacts of climate change. The history of crop models is reviewed briefly, and their level of mechanistic detail for assimilation and respiration, ranging from hourly leaf-to-canopy assimilation to daily radiation-use efficiency is discussed. Crop models have improved steadily over the past 30-40 years, but much work remains. Improvements are needed for the prediction of transpiration response to elevated CO₂ and high temperature effects on phenology and reproductive fertility, and simulation of root growth and nutrient uptake under stressful edaphic conditions. Mechanistic improvements are needed to better connect crop growth to genetics and to soil fertility, soil waterlogging and pest damage. Because crop models integrate multiple processes and consider impacts of environment and management, they have excellent potential for linking research from genomics and allied disciplines to crop responses at the field scale, thus providing a valuable tool for deciphering genotype by environment by management effects. © 2013 John Wiley & Sons Ltd.
Stenling, Andreas; Tafvelin, Susanne
2016-10-01
Leadership development programs are common in sports, but seldom evaluated; hence, we have limited knowledge about what the participants actually learn and the impact these programs have on sports clubs' daily operations. The purpose of the current study was to integrate a transfer of training model with self-determination theory to understand predictors of learning and training transfer, following a leadership development program among organizational leaders in Swedish sports clubs. Bayesian multilevel path analysis showed that autonomous motivation and an autonomy-supportive implementation of the program positively predicted near transfer (i.e., immediately after the training program) and that perceiving an autonomy-supportive climate in the sports club positively predicted far transfer (i.e., 1 year after the training program). This study extends previous research by integrating a transfer of training model with self-determination theory and identified important motivational factors that predict near and far training transfer.
OʼHara, Susan
2014-01-01
Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach and a standard for integrating the sciences with real client data to offer solutions for improving patient care.
Huang, Yongzhi; Green, Alexander L; Hyam, Jonathan; Fitzgerald, James; Aziz, Tipu Z; Wang, Shouyan
2018-01-01
Understanding the function of sensory thalamic neural activity is essential for developing and improving interventions for neuropathic pain. However, there is a lack of investigation of the relationship between sensory thalamic oscillations and pain relief in patients with neuropathic pain. This study aims to identify the oscillatory neural characteristics correlated with pain relief induced by deep brain stimulation (DBS), and to develop a quantitative model to predict pain relief by integrating characteristic measures of the neural oscillations. Measures of sensory thalamic local field potentials (LFPs) in thirteen patients with neuropathic pain were screened in three-dimensional feature space according to the rhythm, balancing, and coupling neural behaviours, and correlated with pain relief. An integrated approach based on principal component analysis (PCA) and multiple regression analysis is proposed to integrate the multiple measures and provide a predictive model. This study reveals distinct thalamic rhythms of theta, alpha, high beta and high gamma oscillations correlating with pain relief. The balancing and coupling measures between these neural oscillations were also significantly correlated with pain relief. The study extends the series of investigations into the function of thalamic neural oscillations in neuropathic pain and its relief, and provides a quantitative approach for predicting pain relief by DBS using thalamic neural oscillations. Copyright © 2017 Elsevier Inc. All rights reserved.
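A PCA-plus-multiple-regression pipeline of the general kind proposed can be sketched as follows; the data here are simulated stand-ins, not the patients' LFP measures, and the choice of three components is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature matrix: rows = recordings, columns = LFP measures
# (rhythm, balancing, coupling); y = simulated pain-relief scores.
X = rng.normal(size=(13, 6))
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=13)

# PCA via SVD on centered data; keep the three leading components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T  # principal-component scores, shape (13, 3)

# Multiple regression of the outcome on the component scores.
A = np.column_stack([np.ones(len(y)), scores])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```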
R. Quinn Thomas; Evan B. Brooks; Annika L. Jersild; Eric J. Ward; Randolph H. Wynne; Timothy J. Albaugh; Heather Dinon-Aldridge; Harold E. Burkhart; Jean-Christophe Domec; Timothy R. Fox; Carlos A. Gonzalez-Benecke; Timothy A. Martin; Asko Noormets; David A. Sampson; Robert O. Teskey
2017-01-01
Predicting how forest carbon cycling will change in response to climate change and management depends on the collective knowledge from measurements across environmental gradients, ecosystem manipulations of global change factors, and mathematical models. Formally integrating these sources of knowledge through data assimilation, or model-data fusion, allows the use of...
Miranian, A; Abdollahzade, M
2013-02-01
Local modeling approaches, owing to their ability to model different operating regimes of nonlinear systems and processes by independent local models, seem appealing for modeling, identification, and prediction applications. In this paper, we propose a local neuro-fuzzy (LNF) approach based on the least-squares support vector machines (LSSVMs). The proposed LNF approach employs LSSVMs, which are powerful in modeling and predicting time series, as local models and uses hierarchical binary tree (HBT) learning algorithm for fast and efficient estimation of its parameters. The HBT algorithm heuristically partitions the input space into smaller subdomains by axis-orthogonal splits. In each partitioning, the validity functions automatically form a unity partition and therefore normalization side effects, e.g., reactivation, are prevented. Integration of LSSVMs into the LNF network as local models, along with the HBT learning algorithm, yield a high-performance approach for modeling and prediction of complex nonlinear time series. The proposed approach is applied to modeling and predictions of different nonlinear and chaotic real-world and hand-designed systems and time series. Analysis of the prediction results and comparisons with recent and old studies demonstrate the promising performance of the proposed LNF approach with the HBT learning algorithm for modeling and prediction of nonlinear and chaotic systems and time series.
Tani, Yuji; Ogasawara, Katsuhiko
2012-01-01
This study aimed to contribute to the management of a healthcare organization by providing management information through time-series analysis of business data accumulated in the hospital information system, which had not been utilized thus far. We examined the performance of a prediction method using the auto-regressive integrated moving-average (ARIMA) model, applied to business data from the Radiology Department. We built the model from the number of radiological examinations over the past 9 years and predicted the number of examinations in the final year, then compared the actual values with the forecast values. The prediction method proved simple and cost-effective, requiring only free software. In addition, we were able to keep the model simple by removing trend components from the data in pre-processing. The difference between predicted and actual values was 10%; however, understanding the chronological change was more important than the individual time-series values. Furthermore, our method was highly versatile and adaptable to general time-series data. Therefore, different healthcare organizations can use our method for the analysis and forecasting of their business data.
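The ARIMA family combines differencing (to remove trend) with autoregressive terms. A minimal hand-rolled ARIMA(1,1,0)-style sketch, using hypothetical monthly counts rather than the Radiology Department data:

```python
def arima_110_forecast(series):
    """One-step forecast from a minimal ARIMA(1,1,0)-style fit.

    First-difference the series to remove trend, fit an AR(1)
    coefficient to the differences by least squares (no intercept),
    then integrate the predicted difference back onto the last level.
    """
    diffs = [b - a for a, b in zip(series, series[1:])]
    num = sum(x * y for x, y in zip(diffs, diffs[1:]))
    den = sum(x * x for x in diffs[:-1])
    phi = num / den                 # AR(1) coefficient on the differences
    next_diff = phi * diffs[-1]     # predicted next change
    return series[-1] + next_diff   # integrate: add change to last value

# Hypothetical monthly examination counts with an upward trend.
counts = [100, 104, 109, 115, 118, 124, 131]
forecast = arima_110_forecast(counts)
```

A production analysis would use a fitted statistical package rather than this hand-rolled estimator, but the differencing-then-AR structure is the same.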
Modeling human vestibular responses during eccentric rotation and off vertical axis rotation
NASA Technical Reports Server (NTRS)
Merfeld, D. M.; Paloski, W. H. (Principal Investigator)
1995-01-01
A mathematical model has been developed to help explain human multi-sensory interactions. The most important constituent of the model is the hypothesis that the nervous system incorporates knowledge of sensory dynamics into an "internal model" of these dynamics. This internal model allows the nervous system to integrate the sensory information from many different sensors into a coherent estimate of self-motion. The essence of the model is unchanged from a previously published model of monkey eye movement responses; only a few variables have been adjusted to yield the prediction of human responses. During eccentric rotation, the model predicts that the axis of eye rotation shifts slightly toward alignment with gravito-inertial force. The model also predicts that the time course of the perception of tilt following the acceleration phase of eccentric rotation is much slower than that during deceleration. During off vertical axis rotation (OVAR) the model predicts a small horizontal bias along with small horizontal, vertical, and torsional oscillations. Following OVAR stimulation, when stopped right- or left-side down, a small vertical component is predicted that decays with the horizontal post-rotatory response. All of the predictions are consistent with measurements of human responses.
Kesorn, Kraisak; Ongruk, Phatsavee; Chompoosri, Jakkrawarn; Phumee, Atchara; Thavara, Usavadee; Tawatsin, Apiwat; Siriyasatien, Padet
2015-01-01
Background In the past few decades, several researchers have proposed highly accurate prediction models that have typically relied on climate parameters. However, climate factors can be unreliable and can lower the effectiveness of prediction when they are applied in locations where climate factors do not differ significantly. The purpose of this study was to improve a dengue surveillance system in areas with similar climate by exploiting the infection rate in the Aedes aegypti mosquito and using the support vector machine (SVM) technique for forecasting the dengue morbidity rate. Methods and Findings Areas with high incidence of dengue outbreaks in central Thailand were studied. The proposed framework consisted of the following three major parts: 1) data integration, 2) model construction, and 3) model evaluation. We discovered that the Ae. aegypti female and larvae mosquito infection rates were significantly positively associated with the morbidity rate. Thus, the increasing infection rate of female mosquitoes and larvae led to a higher number of dengue cases, and the prediction performance increased when those predictors were integrated into a predictive model. In this research, we applied the SVM with the radial basis function (RBF) kernel to forecast the high morbidity rate and take precautions to prevent the development of pervasive dengue epidemics. The experimental results showed that the introduced parameters significantly increased the prediction accuracy to 88.37% when used on the test set data, and these parameters led to the highest performance compared to state-of-the-art forecasting models. Conclusions The infection rates of the Ae. aegypti female mosquitoes and larvae improved the morbidity rate forecasting efficiency better than the climate parameters used in classical frameworks. 
We demonstrated that the SVM-R-based model has high generalization performance and obtained the highest prediction performance compared to classical models as measured by the accuracy, sensitivity, specificity, and mean absolute error (MAE). PMID:25961289
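The RBF kernel underlying the SVM maps similar feature vectors to values near 1 and dissimilar ones toward 0; a minimal sketch (the feature values for female and larval infection rates are hypothetical):

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Radial basis function kernel: exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

# Hypothetical feature vectors: [female infection rate, larval infection rate].
k_close = rbf_kernel([0.30, 0.12], [0.28, 0.15])  # similar conditions
k_far = rbf_kernel([0.30, 0.12], [0.90, 0.60])    # dissimilar conditions
```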
Relational Integration as a Predictor of Academic Achievement
ERIC Educational Resources Information Center
Krumm, Stefan; Lipnevich, Anastasiya A.; Schmidt-Atzert, Lothar; Buhner, Markus
2012-01-01
The current study aimed at applying a broad model of cognitive functions to predict performance in science and language courses at school as well as performance in a science course at university. We hypothesized that performance in science courses was predominantly related to the cognitive function known as relational integration, whereas…
A productivity model for parasitized, multibrooded songbirds
Powell, L.A.; Knutson, M.G.
2006-01-01
We present an enhancement of a simulation model to predict annual productivity for Wood Thrushes (Hylocichla mustelina) and American Redstarts (Setophaga ruticilla); the model includes effects of Brown-headed Cowbird (Molothrus ater) parasitism. We used species-specific data from the Driftless Area Ecoregion of Wisconsin, Minnesota, and Iowa to parameterize the model as a case study. The simulation model predicted annual productivity of 2.03 ± 1.60 SD for Wood Thrushes and 1.56 ± 1.31 SD for American Redstarts. Our sensitivity analysis showed that high parasitism lowered Wood Thrush annual productivity more than American Redstart productivity, even though parasitism affected individual nests of redstarts more severely. Annual productivity predictions are valuable for habitat managers, but productivity is not easily obtained from field studies. Our model provides a useful means of integrating complex life history parameters to predict productivity for songbirds that experience nest parasitism. © The Cooper Ornithological Society 2006.
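A productivity simulation of this general kind can be sketched as a Monte Carlo loop over pairs and nesting attempts; all rates below are hypothetical placeholders, not the parameterized Driftless Area values:

```python
import random

def simulate_annual_productivity(n_pairs, nest_success, brood_size,
                                 parasitism_rate, parasitism_loss,
                                 attempts=2, seed=42):
    """Monte Carlo annual productivity (fledglings per pair).

    Each pair makes several nesting attempts; a successful nest fledges
    brood_size young, reduced by parasitism_loss if the nest is
    parasitized. Rates here are illustrative only.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_pairs):
        for _ in range(attempts):
            if rng.random() < nest_success:
                young = brood_size
                if rng.random() < parasitism_rate:
                    young -= parasitism_loss
                total += max(young, 0.0)
    return total / n_pairs

productivity = simulate_annual_productivity(
    n_pairs=1000, nest_success=0.45, brood_size=3.0,
    parasitism_rate=0.4, parasitism_loss=1.5)
```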
NASA Astrophysics Data System (ADS)
Yin, Yip Chee; Hock-Eam, Lim
2012-09-01
This paper investigates the forecasting ability of Mallows Model Averaging (MMA) by conducting an empirical analysis of the GDP growth rates of five Asian countries: Malaysia, Thailand, the Philippines, Indonesia and China. Results reveal that MMA shows no noticeable difference in predictive ability compared to the general autoregressive fractionally integrated moving average (ARFIMA) model, and its predictive ability is sensitive to the effect of financial crises. MMA could be an alternative forecasting method for samples without recent outliers such as financial crises.
Technosocial Modeling of IED Threat Scenarios and Attacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitney, Paul D.; Brothers, Alan J.; Coles, Garill A.
2009-03-23
This paper describes an approach for integrating sociological and technical models to develop more complete threat assessments. Current approaches to analyzing and addressing threats tend to focus on the technical factors. This paper addresses development of predictive models that encompass behavioral as well as these technical factors. Using improvised explosive device (IED) attacks as motivation, this model supports identification of intervention activities 'left of boom' as well as prioritizing attack modalities. We show how Bayes nets integrate social factors associated with IED attacks into a general threat model containing technical and organizational steps from planning through obtaining the IED to initiation of the attack. The social models are computationally-based representations of relevant social science literature that describes human decision making and physical factors. When combined with technical models, the resulting model provides improved knowledge integration into threat assessment for monitoring. This paper discusses the construction of IED threat scenarios, integration of diverse factors into an analytical framework for threat assessment, indicator identification for future threats, and future research directions.
Toward “optimal” integration of terrestrial biosphere models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwalm, Christopher R.; Huntzinger, Deborah; Fisher, Joshua B.
2015-06-10
Multi-model ensembles (MME) are commonplace in Earth system modeling. Here we perform MME integration using a 10-member ensemble of terrestrial biosphere models (TBMs) from the Multi-scale synthesis and Terrestrial Model Intercomparison Project (MsTMIP). We contrast optimal (skill-based for present-day carbon cycling) versus naïve ("one model – one vote") integration. MsTMIP optimal and naïve mean land sink strength estimates (–1.16 vs. –1.15 Pg C per annum respectively) are statistically indistinguishable. This holds also for grid cell values and extends to gross uptake, biomass, and net ecosystem productivity. TBM skill is similarly indistinguishable. The added complexity of skill-based integration does not materially change MME values. This suggests that carbon metabolism has predictability limits and/or that all models and references are misspecified. Resolving this issue requires addressing specific uncertainty types (initial conditions, structure, references) and a change in model development paradigms currently dominant in the TBM community.
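The contrast between naïve ("one model, one vote") and skill-based integration amounts to an equal-weight versus a skill-weighted mean; a minimal sketch with hypothetical sink estimates and skill scores, not the MsTMIP ensemble:

```python
def ensemble_means(estimates, skills):
    """Naive (equal-weight) vs. skill-weighted ensemble means."""
    naive = sum(estimates) / len(estimates)
    total_skill = sum(skills)
    weighted = sum(e * s for e, s in zip(estimates, skills)) / total_skill
    return naive, weighted

# Hypothetical land-sink estimates (Pg C / yr) from four TBMs,
# with hypothetical skill scores in [0, 1].
sinks = [-1.3, -1.1, -1.0, -1.2]
skills = [0.9, 0.8, 0.6, 0.7]
naive, weighted = ensemble_means(sinks, skills)
```

With these illustrative numbers the two means differ by only about 0.01 Pg C per annum, echoing the near-indistinguishable estimates reported above.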
Multi-scale predictions of massive conifer mortality due to chronic temperature rise
NASA Astrophysics Data System (ADS)
McDowell, N. G.; Williams, A. P.; Xu, C.; Pockman, W. T.; Dickman, L. T.; Sevanto, S.; Pangle, R.; Limousin, J.; Plaut, J.; Mackay, D. S.; Ogee, J.; Domec, J. C.; Allen, C. D.; Fisher, R. A.; Jiang, X.; Muss, J. D.; Breshears, D. D.; Rauscher, S. A.; Koven, C.
2016-03-01
Global temperature rise and extremes accompanying drought threaten forests and their associated climatic feedbacks. Our ability to accurately simulate drought-induced forest impacts remains highly uncertain in part owing to our failure to integrate physiological measurements, regional-scale models, and dynamic global vegetation models (DGVMs). Here we show consistent predictions of widespread mortality of needleleaf evergreen trees (NET) within Southwest USA by 2100 using state-of-the-art models evaluated against empirical data sets. Experimentally, dominant Southwest USA NET species died when they fell below predawn water potential (Ψpd) thresholds (April-August mean) beyond which photosynthesis, hydraulic and stomatal conductance, and carbohydrate availability approached zero. The evaluated regional models accurately predicted NET Ψpd, and 91% of predictions (10 out of 11) exceeded mortality thresholds within the twenty-first century due to temperature rise. The independent DGVMs predicted >=50% loss of Northern Hemisphere NET by 2100, consistent with the NET findings for Southwest USA. Notably, the global models underestimated future mortality within Southwest USA, highlighting that predictions of future mortality within global models may be underestimates. Taken together, the validated regional predictions and the global simulations predict widespread conifer loss in coming decades under projected global warming.
Multi-scale predictions of massive conifer mortality due to chronic temperature rise
McDowell, Nathan G.; Williams, A.P.; Xu, C.; Pockman, W. T.; Dickman, L. T.; Sevanto, Sanna; Pangle, R.; Limousin, J.; Plaut, J.J.; Mackay, D.S.; Ogee, J.; Domec, Jean-Christophe; Allen, Craig D.; Fisher, Rosie A.; Jiang, X.; Muss, J.D.; Breshears, D.D.; Rauscher, Sara A.; Koven, C.
2016-01-01
Global temperature rise and extremes accompanying drought threaten forests and their associated climatic feedbacks. Our ability to accurately simulate drought-induced forest impacts remains highly uncertain in part owing to our failure to integrate physiological measurements, regional-scale models, and dynamic global vegetation models (DGVMs). Here we show consistent predictions of widespread mortality of needleleaf evergreen trees (NET) within Southwest USA by 2100 using state-of-the-art models evaluated against empirical data sets. Experimentally, dominant Southwest USA NET species died when they fell below predawn water potential (Ψpd) thresholds (April–August mean) beyond which photosynthesis, hydraulic and stomatal conductance, and carbohydrate availability approached zero. The evaluated regional models accurately predicted NET Ψpd, and 91% of predictions (10 out of 11) exceeded mortality thresholds within the twenty-first century due to temperature rise. The independent DGVMs predicted ≥50% loss of Northern Hemisphere NET by 2100, consistent with the NET findings for Southwest USA. Notably, the global models underestimated future mortality within Southwest USA, highlighting that predictions of future mortality within global models may be underestimates. Taken together, the validated regional predictions and the global simulations predict widespread conifer loss in coming decades under projected global warming.
Latent feature decompositions for integrative analysis of multi-platform genomic data
Gregory, Karl B.; Momin, Amin A.; Coombes, Kevin R.; Baladandayuthapani, Veerabhadran
2015-01-01
Increased availability of multi-platform genomics data on matched samples has sparked research efforts to discover how diverse molecular features interact both within and between platforms. In addition, simultaneous measurements of genetic and epigenetic characteristics illuminate the roles their complex relationships play in disease progression and outcomes. However, integrative methods for diverse genomics data are faced with the challenges of ultra-high dimensionality and the existence of complex interactions both within and between platforms. We propose a novel modeling framework for integrative analysis based on decompositions of the large number of platform-specific features into a smaller number of latent features. Subsequently we build a predictive model for clinical outcomes accounting for both within- and between-platform interactions based on Bayesian model averaging procedures. Principal components, partial least squares and non-negative matrix factorization as well as sparse counterparts of each are used to define the latent features, and the performance of these decompositions is compared both on real and simulated data. The latent feature interactions are shown to preserve interactions between the original features and not only aid prediction but also allow explicit selection of outcome-related features. The methods are motivated by, and applied to, a glioblastoma multiforme dataset from The Cancer Genome Atlas to predict patient survival times integrating gene expression, microRNA, copy number and methylation data. For the glioblastoma data, we find a high concordance between our selected prognostic genes and genes with known associations with glioblastoma. In addition, our model discovers several relevant cross-platform interactions such as copy number variation associated gene dosing and epigenetic regulation through promoter methylation.
On simulated data, we show that our proposed method successfully incorporates interactions within and between genomic platforms to aid accurate prediction and variable selection. Our methods perform best when principal components are used to define the latent features. PMID:26146492
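The latent-feature idea can be sketched in a few lines: project each platform onto its top principal components, then form between-platform interaction terms from the latent scores. The data, dimensions, and plain least-squares fit below are illustrative stand-ins for the paper's Bayesian model-averaging machinery:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p1, p2, k = 100, 50, 40, 3   # samples, features per platform, latent dims

# Two synthetic "platforms" (e.g. expression and methylation); invented data.
X1 = rng.normal(size=(n, p1))
X2 = rng.normal(size=(n, p2))

def pca_scores(X, k):
    """Top-k principal-component scores via SVD of the centered matrix."""
    Xc = X - X.mean(axis=0)
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * s[:k]

Z1, Z2 = pca_scores(X1, k), pca_scores(X2, k)
# Between-platform interactions: pairwise products of latent features.
inter = np.column_stack([Z1[:, i] * Z2[:, j] for i in range(k) for j in range(k)])
design = np.column_stack([np.ones(n), Z1, Z2, inter])

y = rng.normal(size=n)  # stand-in clinical outcome
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
```

The key point is dimensionality: 90 raw features collapse to 6 latent features plus 9 interaction terms, making the interaction model tractable.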
Forecasting influenza in Hong Kong with Google search queries and statistical model fusion
Ramirez Ramirez, L. Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung
2017-01-01
Background The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, to forecast new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, our approach fuses multiple offline and online data sources. Methods Four individual models: generalized linear model (GLM), least absolute shrinkage and selection operator (LASSO), autoregressive integrated moving average (ARIMA), and deep learning (DL) with feedforward neural networks (FNN) are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strengths of each individual forecasting model, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and covariates. Results DL with FNN appears to deliver the most competitive predictive performance among the four individual models considered. Combining all four models in a comprehensive BMA framework further improves predictive evaluation metrics such as root mean squared error (RMSE) and mean absolute predictive error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting the locations of influenza peaks. Conclusions The proposed approach can be viewed as a feasible alternative for forecasting ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited.
The proposed methodology is easily tractable and computationally efficient. PMID:28464015
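The BMA fusion step reduces to a weighted average of the individual forecasts under posterior model probabilities. The forecasts and weights below are invented numbers for illustration, not the study's results:

```python
import numpy as np

# Hypothetical one-week-ahead ILI forecasts from four models
# (GLM, LASSO, ARIMA, DL/FNN); numbers are invented.
forecasts = np.array([120.0, 135.0, 110.0, 128.0])
# BMA weights, e.g. proportional to each model's recent predictive likelihood.
weights = np.array([0.15, 0.20, 0.10, 0.55])
weights = weights / weights.sum()   # posterior model probabilities sum to 1

bma_forecast = float(weights @ forecasts)
```

In the adaptive setting the weights are re-estimated each week from recent forecast errors, so the fusion tracks whichever model is currently performing best.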
NASA Technical Reports Server (NTRS)
Luvall, Jeffrey C.; Sprigg, William A.; Huete, Alfredo; Pejanovic, Goran; Nickovic, Slobodan; Ponce-Campos, Guillermo; Krapfl, Heide; Budge, Amy; Zelicoff, Alan; VandeWater, Peter K.
2011-01-01
This slide presentation reviews a study that used a model to forecast pollen in order to assist in warnings for asthma populations. Using MODIS daily reflectances as input to a model, PREAM, adapted from the Dust REgional Atmospheric Modeling (DREAM) system, a predicted-pollen product is produced. Using pollen from juniper, the PREAM model was shown to assist in alerting the public to pollen bursts and to reduce the health impact on asthma populations.
van den Berg, Ronald; Roerdink, Jos B T M; Cornelissen, Frans W
2010-01-22
An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
Portal dosimetry for VMAT using integrated images obtained during treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bedford, James L., E-mail: James.Bedford@icr.ac.uk; Hanson, Ian M.; Hansen, Vibeke Nordmark
2014-02-15
Purpose: Portal dosimetry provides an accurate and convenient means of verifying dose delivered to the patient. A simple method for carrying out portal dosimetry for volumetric modulated arc therapy (VMAT) is described, together with phantom measurements demonstrating the validity of the approach. Methods: Portal images were predicted by projecting dose in the isocentric plane through to the portal image plane, with exponential attenuation and convolution with a double-Gaussian scatter function. Appropriate parameters for the projection were selected by fitting the calculation model to portal images measured on an iViewGT portal imager (Elekta AB, Stockholm, Sweden) for a variety of phantom thicknesses and field sizes. This model was then used to predict the portal image resulting from each control point of a VMAT arc. Finally, all these control point images were summed to predict the overall integrated portal image for the whole arc. The calculated and measured integrated portal images were compared for three lung and three esophagus plans delivered to a thorax phantom, and three prostate plans delivered to a homogeneous phantom, using a gamma index for 3% and 3 mm. A 0.6 cm³ ionization chamber was used to verify the planned isocentric dose. The sensitivity of this method to errors in monitor units, field shaping, gantry angle, and phantom position was also evaluated by means of computer simulations. Results: The calculation model for portal dose prediction was able to accurately compute the portal images due to simple square fields delivered to solid water phantoms. The integrated images of VMAT treatments delivered to phantoms were also correctly predicted by the method. The proportion of the images with a gamma index of less than unity was 93.7% ± 3.0% (1 SD) and the difference between isocenter dose calculated by the planning system and measured by the ionization chamber was 0.8% ± 1.0%.
The method was highly sensitive to errors in monitor units and field shape, but less sensitive to errors in gantry angle or phantom position. Conclusions: This method of predicting integrated portal images provides a convenient means of verifying dose delivered using VMAT, with minimal image acquisition and data processing requirements.
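The prediction model in the Methods (exponential attenuation followed by convolution with a double-Gaussian scatter kernel) can be sketched in one dimension. The attenuation coefficient, kernel weights, and widths below are assumed values, not the fitted parameters from the paper:

```python
import numpy as np

x = np.linspace(-10, 10, 201)             # cm, image-plane coordinate
fluence = (np.abs(x) <= 5).astype(float)  # profile of a 10 cm square field

# Primary component: exponential attenuation through the phantom.
mu, thickness = 0.05, 20.0                # cm^-1 and cm; illustrative values
primary = fluence * np.exp(-mu * thickness)

def gauss(x, sigma):
    """Unit-area Gaussian sampled on the grid."""
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

# Scatter: narrow plus broad Gaussian; weights/widths are assumptions.
kernel = 0.8 * gauss(x, 0.5) + 0.2 * gauss(x, 3.0)
portal = np.convolve(primary, kernel, mode="same")
```

For a VMAT arc, one such image would be computed per control point and the set summed to give the predicted integrated portal image.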
Simulations of photochemical smog formation in complex urban areas
NASA Astrophysics Data System (ADS)
Muilwijk, C.; Schrijvers, P. J. C.; Wuerz, S.; Kenjereš, S.
2016-12-01
In the present study we numerically investigated the dispersion of photochemical reactive pollutants in complex urban areas by applying an integrated Computational Fluid Dynamics (CFD) and Computational Reaction Dynamics (CRD) approach. To model chemical reactions involved in smog generation, the Generic Reaction Set (GRS) approach is used. The GRS model was selected since it does not require detailed modeling of a large set of reactive components. Smog formation is modeled first in the case of an intensive traffic emission, subjected to low to moderate wind conditions in an idealized two-dimensional street canyon with a building aspect ratio (height/width) of one. It is found that Reactive Organic Components (ROC) play an important role in the chemistry of smog formation. In contrast to the NOx/O3 photochemical steady state model that predicts a depletion of the (ground level) ozone, the GRS model predicts generation of ozone. Secondly, the effect of direct sunlight and shadow within the street canyon on the chemical reaction dynamics is investigated for three characteristic solar angles (morning, midday and afternoon). Large differences of up to one order of magnitude are found in the ozone production for different solar angles. As a proof of concept for real urban areas, the integrated CFD/CRD approach is applied for a real scale (1 × 1 km²) complex urban area (a district of the city of Rotterdam, The Netherlands) with high traffic emissions. The predicted pollutant concentration levels give realistic values that correspond to moderate to heavy smog. It is concluded that the integrated CFD/CRD method with the GRS model of chemical reactions is both accurate and numerically robust, and can be used for modeling of smog formation in complex urban areas.
Wind farms production: Control and prediction
NASA Astrophysics Data System (ADS)
El-Fouly, Tarek Hussein Mostafa
Wind energy resources, unlike dispatchable central station generation, produce power that depends on an external, irregular source, the incident wind, which does not always blow when electricity is needed. This results in the variability, unpredictability, and uncertainty of wind resources. Therefore, the integration of wind facilities into the utility electrical grid presents a major challenge to power system operators. Such integration has significant impact on optimum power flow, transmission congestion, power quality issues, system stability, load dispatch, and economic analysis. Due to the irregular nature of wind power production, accurate prediction represents the major challenge to power system operators. Therefore, in this thesis two novel models are proposed for wind speed and wind power prediction: one dedicated to short-term prediction (one hour ahead) and the other to medium-term prediction (one day ahead). The accuracy of the proposed models is revealed by comparing their results with the corresponding values of a reference prediction model referred to as the persistent model. Utility grid operation is impacted not only by the uncertainty of the future production of wind farms, but also by the variability of their current production and how the active and reactive power exchange with the grid is controlled. To address this particular task, a control technique for wind turbines driven by doubly-fed induction generators (DFIGs) is developed to regulate the terminal voltage by equally sharing the generated/absorbed reactive power between the rotor-side and the grid-side converters. To highlight the impact of the newly developed technique in reducing power loss in the generator set, an economic analysis is carried out. Moreover, a new aggregated model for wind farms is proposed that accounts for the irregularity of the incident wind distribution throughout the farm layout.
Specifically, this model includes the wake effect and the time delay of the incident wind speed across the different turbines on the farm, in order to simulate the fluctuation in the generated power more accurately and closer to real-time operation. Recently, wind farms with considerable output power ratings have been installed, and their integration into the utility grid will substantially affect electricity markets. This thesis investigates the possible impact of wind power variability, wind farm control strategy, wind energy penetration level, wind farm location, and wind power prediction accuracy on total generation costs and close-to-real-time electricity market prices. These issues are addressed by developing a single-auction market model for determining real-time electricity market prices.
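The persistent (persistence) reference model mentioned above is simple to state: the forecast for the next interval equals the last observed value. A minimal sketch on synthetic wind-speed data:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic hourly wind-speed series (m/s); a random walk for illustration.
wind = 8.0 + np.cumsum(rng.normal(0, 0.5, size=48))

persistence_forecast = wind[:-1]   # one-hour-ahead: v_hat(t+1) = v(t)
actual = wind[1:]
rmse = float(np.sqrt(np.mean((actual - persistence_forecast) ** 2)))
```

A proposed short-term prediction model is judged useful only if its error beats this baseline, which is exactly the comparison the thesis makes.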
NASA Astrophysics Data System (ADS)
Shibuo, Yoshihiro; Ikoma, Eiji; Lawford, Peter; Oyanagi, Misa; Kanauchi, Shizu; Koudelova, Petra; Kitsuregawa, Masaru; Koike, Toshio
2014-05-01
While the availability of hydrological and hydrometeorological data shows a growing tendency and advanced modeling techniques are emerging, such newly available data and advanced models may not always be applied in the field of decision-making. In this study we present an integrated system of ensemble streamflow prediction (ESP) and a virtual dam simulator, designed to support river and dam managers' decision making. The system consists of three main functions: a real-time hydrological model, an ESP model, and a dam simulator model. In the real-time model, the system simulates the current condition of river basins, such as soil moisture and river discharges, using an LSM-coupled distributed hydrological model. The ESP model takes initial conditions from the real-time model's output and generates ESP based on numerical weather prediction. The dam simulator model provides virtual dam operation, and users can experience the impact of dam control on remaining reservoir volume and downstream flooding under the anticipated flood forecast. Thus river and dam managers can evaluate the benefit of prior dam release and flood risk reduction at the same time, on a real-time basis. Furthermore, the system has been developed under the concept of data and model integration, and it is coupled with the Data Integration and Analysis System (DIAS) - a Japanese national project for integrating and analyzing massive amounts of observational and model data. Therefore it has the advantage of direct use of miscellaneous data, from point/radar-derived observations and numerical weather prediction output to satellite imagery stored in the data archive. Output of the system is accessible over a web interface, making information available with relative ease, e.g. from ordinary PCs to mobile devices. We have been applying the system to the Upper Tone region, located northwest of the Tokyo metropolitan area, and we show an application example of the system in recent flood events caused by typhoons.
NASA Astrophysics Data System (ADS)
Magombeyi, M. S.; Taigbenu, A. E.
Computerised integrated models from science contribute to better informed and holistic assessments of multifaceted policies and technologies than individual models. This view has led to considerable effort being devoted to developing integrated models to support decision-making under integrated water resources management (IWRM). Nevertheless, an appraisal of previous and ongoing efforts to develop such decision support systems shows considerable deficiencies in attempts to address the hydro-socio-economic effects on livelihoods. To date, no universal standard integration method or framework is in use. For the existing integrated models, their application failures have pointed to the lack of stakeholder participation. In an endeavour to close this gap, development and application of a seasonal time-step integrated model with prediction capability is presented in this paper. This model couples existing hydrology, agronomy and socio-economic models with feedbacks to link livelihoods of resource-constrained smallholder farmers to water resources at catchment level in the semi-arid Olifants subbasin in South Africa. These three models, prior to coupling, were calibrated and validated using observed data and participation of local stakeholders. All the models gave good representation of the study conditions, as indicated by the statistical indicators. The integrated model is of general applicability, hence can be extended to other catchments. The impacts of untied ridges, planting basins and supplemental irrigation were compared to conventional rainfed tillage under maize crop production and for different farm typologies. Over the 20 years of simulation, the predicted benefit of untied ridges and planting basins versus conventional rainfed tillage on surface runoff (Mm³/year) reduction was 14.3% and 19.8%, respectively, and about 41-46% sediment yield (t/year) reduction in the catchment.
Under supplemental irrigation, maize yield improved by up to 500% from the long-term average yield of 0.5 t/ha. At the 90% confidence interval, family savings improved from between US$4 and US$270 under conventional rainfed tillage to between US$233 and US$1140 under supplemental irrigation. These results highlight the economic and environmental benefits that could be achieved by adopting these improved crop management practices. However, the application of various crop management practices is site-specific and depends on both physical and socio-economic characteristics of the farmers.
Modelling proteins' hidden conformations to predict antibiotic resistance
NASA Astrophysics Data System (ADS)
Hart, Kathryn M.; Ho, Chris M. W.; Dutta, Supratik; Gross, Michael L.; Bowman, Gregory R.
2016-10-01
TEM β-lactamase confers bacteria with resistance to many antibiotics and rapidly evolves activity against new drugs. However, functional changes are not easily explained by differences in crystal structures. We employ Markov state models to identify hidden conformations and explore their role in determining TEM's specificity. We integrate these models with existing drug-design tools to create a new technique, called Boltzmann docking, which better predicts TEM specificity by accounting for conformational heterogeneity. Using our MSMs, we identify hidden states whose populations correlate with activity against cefotaxime. To experimentally detect our predicted hidden states, we use rapid mass spectrometric footprinting and confirm our models' prediction that increased cefotaxime activity correlates with reduced Ω-loop flexibility. Finally, we design novel variants to stabilize the hidden cefotaximase states, and find their populations predict activity against cefotaxime in vitro and in vivo. Therefore, we expect this framework to have numerous applications in drug and protein design.
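The core of a Markov state model, estimating a transition matrix from a discretized trajectory and reading state populations from its stationary distribution, can be sketched on a toy trajectory (the state labels below are invented):

```python
import numpy as np

# Toy discretized conformational trajectory with three states.
traj = [0, 0, 1, 1, 1, 2, 1, 0, 0, 1, 2, 2, 1, 1, 0]
n_states = 3

# Count observed transitions and row-normalize to a transition matrix.
counts = np.zeros((n_states, n_states))
for a, b in zip(traj[:-1], traj[1:]):
    counts[a, b] += 1
T = counts / counts.sum(axis=1, keepdims=True)

# Stationary distribution: left eigenvector of T for eigenvalue 1,
# interpreted as the equilibrium population of each (possibly hidden) state.
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()
```

In the paper's setting, it is these state populations whose correlation with cefotaxime activity motivates stabilizing the hidden states by design.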
NASA Technical Reports Server (NTRS)
Mcbeath, Giorgio; Ghorashi, Bahman; Chun, Kue
1993-01-01
A thermal NO(x) prediction model is developed to interface with a CFD, k-epsilon based code. A converged solution from the CFD code is the input to the postprocessing model for prediction of thermal NO(x). The model uses a decoupled analysis to estimate the equilibrium level (NO(x))e, which is the constant-rate limit. This value is used to estimate the flame NO(x) and in turn predict the rate of formation at each node using a two-step Zeldovich mechanism. The rate is located on the NO(x) production-rate curve by estimating the time to reach equilibrium through a differential analysis based on the reaction O + N2 = NO + N. The rate is then integrated over the nonequilibrium time span based on the residence time at each node in the computational domain. The sum of all nodal predictions yields the total NO(x) level.
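The nodal integration described above can be sketched as follows; the rate constant is a representative literature value for the O + N2 reaction, and all concentrations, the temperature, and the equilibrium cap are illustrative, not the paper's inputs:

```python
import numpy as np

def thermal_no_rate(T, O, N2):
    """d[NO]/dt ~ 2 k1 [O][N2], assuming steady-state atomic nitrogen
    in the two-step Zeldovich mechanism."""
    k1 = 1.8e14 * np.exp(-38370.0 / T)  # O + N2 -> NO + N; cm^3 mol^-1 s^-1
    return 2.0 * k1 * O * N2

# Integrate toward the equilibrium limit (NOx)e at one computational node.
T, O, N2 = 2000.0, 1e-9, 1e-5   # K and mol/cm^3; illustrative values
NO_eq = 2e-7                     # assumed equilibrium cap, mol/cm^3
dt, NO = 1e-4, 0.0
for _ in range(100):             # residence time = 100 * dt at this node
    NO = min(NO + thermal_no_rate(T, O, N2) * dt, NO_eq)
```

Summing such nodal results over the computational domain gives the total thermal NO(x), mirroring the postprocessing step the abstract describes.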
Mathieu, Romain; Vartolomei, Mihai D; Mbeutcha, Aurélie; Karakiewicz, Pierre I; Briganti, Alberto; Roupret, Morgan; Shariat, Shahrokh F
2016-08-01
The aim of this review was to provide an overview of current biomarkers and risk stratification models in urothelial cancer of the upper urinary tract (UTUC). A non-systematic Medline/PubMed literature search was performed using the terms "biomarkers", "preoperative models", "postoperative models", and "risk stratification", together with "upper tract urothelial carcinoma". Original articles published between January 2003 and August 2015 were included based on their clinical relevance. Additional references were collected by cross-referencing the bibliographies of the selected articles. Various promising predictive and prognostic biomarkers have been identified in UTUC owing to increasing knowledge of the different biological pathways involved in UTUC tumorigenesis. These biomarkers may help identify tumors with aggressive biology and worse outcomes. Current tools aim at predicting muscle-invasive or non-organ-confined disease, renal failure after radical nephroureterectomy, and survival outcomes. These models are still mainly based on imaging and clinicopathological features, and none has integrated biomarkers. Risk stratification in UTUC is still suboptimal, especially in the preoperative setting, due to current limitations in staging and grading. Identification of novel biomarkers and external validation of current prognostic models may help improve risk stratification to allow evidence-based counselling for kidney-sparing approaches, perioperative chemotherapy, and/or risk-based surveillance. Despite growing understanding of the biology underlying UTUC, management of this disease remains difficult due to the lack of validated biomarkers and the limitations of current predictive and prognostic tools. Further efforts and collaborations are necessary to allow their integration into daily practice.
Efficient Reduction and Analysis of Model Predictive Error
NASA Astrophysics Data System (ADS)
Doherty, J.
2006-12-01
Most groundwater models are calibrated against historical measurements of head and other system states before being used to make predictions in a real-world context. Through the calibration process, parameter values are estimated or refined such that the model is able to reproduce historical behaviour of the system at pertinent observation points reasonably well. Predictions made by the model are deemed to have greater integrity because of this. Unfortunately, predictive integrity is not as easy to achieve as many groundwater practitioners would like to think. The level of parameterisation detail estimable through the calibration process (especially where estimation takes place on the basis of heads alone) is strictly limited, even where full use is made of modern mathematical regularisation techniques such as those encapsulated in the PEST calibration package. (Use of these mechanisms allows more information to be extracted from a calibration dataset than is possible using simpler regularisation devices such as zones of piecewise constancy.) Where a prediction depends on aspects of parameterisation detail that are simply not inferable through the calibration process (which is often the case for predictions related to contaminant movement, and/or many aspects of groundwater/surface water interaction), then that prediction may be just as much in error as it would have been if the model had not been calibrated at all. Model predictive error arises from two sources. These are (a) the presence of measurement noise within the calibration dataset through which linear combinations of parameters spanning the "calibration solution space" are inferred, and (b) the sensitivity of the prediction to members of the "calibration null space" spanned by linear combinations of parameters which are not inferable through the calibration process. The magnitude of the former contribution depends on the level of measurement noise. 
The magnitude of the latter contribution (which often dominates the former) depends on the "innate variability" of hydraulic properties within the model domain. Knowledge of both of these is a prerequisite for characterisation of the magnitude of possible model predictive error. Unfortunately, in most cases, such knowledge is incomplete and subjective. Nevertheless, useful analysis of model predictive error can still take place. The present paper briefly discusses the means by which mathematical regularisation can be employed in the model calibration process in order to extract as much information as possible on hydraulic property heterogeneity prevailing within the model domain, thereby reducing predictive error to the lowest that can be achieved on the basis of that dataset. It then demonstrates the means by which predictive error variance can be quantified based on information supplied by the regularised inversion process. Both linear and nonlinear predictive error variance analysis is demonstrated using a number of real-world and synthetic examples.
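The solution-space/null-space split can be illustrated with a linear sketch: the SVD of the sensitivity (Jacobian) matrix separates parameter combinations that calibration constrains from those it cannot, and the null-space projection of a prediction's sensitivity vector gives that prediction's uncalibratable contribution to predictive error variance. All matrices, dimensions, and variances below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_par, k = 20, 50, 10   # underdetermined: more parameters than data

X = rng.normal(size=(n_obs, n_par))  # sensitivities of observations to parameters
y = rng.normal(size=n_par)           # sensitivities of the prediction to parameters

# Full SVD: the first k right singular vectors span the calibration solution
# space; the remainder span the calibration null space.
_, _, Vt = np.linalg.svd(X)
V1, V2 = Vt[:k].T, Vt[k:].T

sigma_p = 1.0   # assumed innate parameter variability (subjective, as noted)
# Null-space term of predictive error variance: the part of the prediction's
# sensitivity that calibration cannot constrain.
null_term = sigma_p**2 * float(np.sum((V2.T @ y) ** 2))
```

The measurement-noise term over the solution space would be added to this; as the text notes, the null-space term often dominates it.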
Modeling Stationary Lithium-Ion Batteries for Optimization and Predictive Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri A; Shi, Ying; Christensen, Dane T
Accurately modeling stationary battery storage behavior is crucial to understand and predict its limitations in demand-side management scenarios. In this paper, a lithium-ion battery model was derived to estimate lifetime and state-of-charge for building-integrated use cases. The proposed battery model aims to balance speed and accuracy when modeling battery behavior for real-time predictive control and optimization. In order to achieve these goals, a mixed modeling approach was taken, which incorporates regression fits to experimental data and an equivalent circuit to model battery behavior. A comparison of the proposed battery model output to actual data from the manufacturer validates the modeling approach taken in the paper. Additionally, a dynamic test case demonstrates the effects of using regression models to represent internal resistance and capacity fading.
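The mixed modeling approach, a regression fit feeding an equivalent circuit, can be sketched minimally. The resistance and open-circuit-voltage coefficients below are hypothetical, not the paper's fitted values:

```python
# Minimal equivalent-circuit sketch: terminal voltage = OCV(soc) - I * R(soc),
# with state of charge tracked by coulomb counting. All coefficients invented.
def internal_resistance(soc):
    return 0.05 + 0.02 * (1.0 - soc) ** 2   # ohms; hypothetical regression fit

def ocv(soc):
    return 3.2 + 0.9 * soc                  # volts; hypothetical OCV curve

capacity_ah, soc, dt_h = 10.0, 0.9, 1.0 / 60.0  # 10 Ah cell, 1-minute steps
current = 5.0                                    # amps, discharging

voltages = []
for _ in range(60):                              # one hour of discharge
    v = ocv(soc) - current * internal_resistance(soc)
    voltages.append(v)
    soc -= current * dt_h / capacity_ah          # coulomb counting
```

Because both functions are cheap closed-form fits, a model of this shape can sit inside a real-time optimization loop, which is the speed/accuracy balance the paper targets.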
Modeling Stationary Lithium-Ion Batteries for Optimization and Predictive Control: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raszmann, Emma; Baker, Kyri; Shi, Ying
Accurately modeling stationary battery storage behavior is crucial to understand and predict its limitations in demand-side management scenarios. In this paper, a lithium-ion battery model was derived to estimate lifetime and state-of-charge for building-integrated use cases. The proposed battery model aims to balance speed and accuracy when modeling battery behavior for real-time predictive control and optimization. In order to achieve these goals, a mixed modeling approach was taken, which incorporates regression fits to experimental data and an equivalent circuit to model battery behavior. A comparison of the proposed battery model output to actual data from the manufacturer validates the modeling approach taken in the paper. Additionally, a dynamic test case demonstrates the effects of using regression models to represent internal resistance and capacity fading.
NASA Astrophysics Data System (ADS)
Fekete, Tamás
2018-05-01
Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed fit for long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific-engineering paradigm, structural integrity, has been developing that is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which rests on a weakly coupled model of thermomechanics and fracture mechanics. Research has recently been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity.
The research aims to develop a better understanding of the physical problems inherent in the structural integrity of reactor pressure vessels, and ultimately to find a theoretical framework that could serve as a well-grounded foundation for a new structural integrity modeling framework. This paper presents the first findings of the research project.
A Grey NGM(1,1, k) Self-Memory Coupling Prediction Model for Energy Consumption Prediction
Guo, Xiaojun; Liu, Sifeng; Wu, Lifeng; Tang, Lingling
2014-01-01
Energy consumption prediction is an important issue for governments, energy sector investors, and other related corporations. Although several prediction techniques exist, selecting the most appropriate one is of vital importance. For the approximately nonhomogeneous exponential data sequences that often emerge in energy systems, a novel grey NGM(1,1, k) self-memory coupling prediction model is put forward to improve predictive performance. It achieves an organic integration of the self-memory principle of dynamic systems with the grey NGM(1,1, k) model; the self-memory principle overcomes the traditional grey model's sensitivity to initial values. In this study, the total energy, coal, and electricity consumption of China are used to demonstrate the proposed coupling prediction technique. The results show the superiority of the NGM(1,1, k) self-memory coupling prediction model when compared with results from the literature. Its excellent prediction performance stems from the coupling model's ability to take full advantage of systematic multi-time historical data and to capture stochastic fluctuation tendencies. This work also contributes significantly to the enrichment of grey prediction theory and the extension of its application span. PMID:25054174
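For orientation, the classic GM(1,1) grey model underlying this family of methods can be sketched in a few lines. Note that this is the plain GM(1,1), not the paper's NGM(1,1, k) or its self-memory coupling extension; the data series is an illustrative invented example.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Classic GM(1,1) grey forecast: a simplified relative of the paper's
    NGM(1,1,k) self-memory coupling model (which this sketch does NOT
    implement). Fits a first-order grey differential equation to a short
    series and extrapolates the accumulated sequence."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                         # 1-AGO accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])              # background (mean) values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time response
    x0_hat = np.diff(x1_hat, prepend=0.0)              # inverse accumulation
    return x0_hat[len(x0):]

data = [100.0, 110.0, 121.0, 133.1]   # illustrative ~10% growth series
pred = gm11_forecast(data)[0]
print(round(pred, 1))                 # next value, close to the exponential trend
```

The NGM(1,1, k) variant replaces the constant grey input `b` with a time-varying term, and the self-memory coupling then weights multiple historical states instead of relying on the single initial value, which is exactly the sensitivity the abstract says is overcome.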
Metabolic network modeling with model organisms.
Yilmaz, L Safak; Walhout, Albertha Jm
2017-02-01
Flux balance analysis (FBA) with genome-scale metabolic network models (GSMNM) allows systems level predictions of metabolism in a variety of organisms. Different types of predictions with different accuracy levels can be made depending on the applied experimental constraints ranging from measurement of exchange fluxes to the integration of gene expression data. Metabolic network modeling with model organisms has pioneered method development in this field. In addition, model organism GSMNMs are useful for basic understanding of metabolism, and in the case of animal models, for the study of metabolic human diseases. Here, we discuss GSMNMs of most highly used model organisms with the emphasis on recent reconstructions. Published by Elsevier Ltd.
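FBA as described above reduces to a linear program: maximize an objective flux subject to steady-state mass balance (S·v = 0) and flux bounds. A minimal sketch follows, using a hypothetical three-reaction toy network rather than a genome-scale reconstruction; the stoichiometry and bounds are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix (rows: metabolites A, B; columns: reactions R1-R3).
# R1: uptake -> A;  R2: A -> B;  R3: B -> biomass/export.
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A: made by R1, consumed by R2
    [0.0,  1.0, -1.0],   # metabolite B: made by R2, consumed by R3
])
bounds = [(0, 10), (0, None), (0, None)]   # uptake R1 capped at 10 flux units
c = [0.0, 0.0, -1.0]                       # maximize v3 => minimize -v3

# Steady state (S v = 0) forces v1 = v2 = v3, so the uptake cap is binding.
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x[2])   # optimal "biomass" flux, limited by the uptake bound
```

In a real GSMNM the same structure holds with thousands of reactions, and the experimental constraints mentioned in the abstract (exchange-flux measurements, expression data) enter as tightened bounds on individual fluxes.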
NASA Astrophysics Data System (ADS)
Sinner, K.; Teasley, R. L.
2016-12-01
Groundwater models serve as integral tools for understanding flow processes and informing stakeholders and policy makers in management decisions. Historically, these models have tended toward a deterministic nature, relying on historical data to predict and inform future decisions based on model outputs. This research works toward developing a stochastic method of modeling recharge inputs from pipe main break predictions in an existing groundwater model, which subsequently generates desired outputs incorporating future uncertainty rather than deterministic data. The case study for this research is the Barton Springs segment of the Edwards Aquifer near Austin, Texas. Researchers and water resource professionals have modeled the Edwards Aquifer for decades due to its high water quality, fragile ecosystem, and stakeholder interest. The original case study and model that this research builds upon was developed as a co-design problem with regional stakeholders, and the model outcomes are generated specifically for communication with policy makers and managers. Recently, research in the Barton Springs segment demonstrated a significant contribution of urban, or anthropogenic, recharge to the aquifer, particularly during dry periods, using deterministic data sets. Due to the social and ecological importance of urban water loss to recharge, this study develops an evaluation method to help predict pipe breaks and their related recharge contribution within the Barton Springs segment of the Edwards Aquifer. To benefit groundwater management decision processes, the performance measures captured in the model results, such as springflow, head levels, storage, and others, were determined by previous work in elicitation of problem framing to determine stakeholder interests and concerns. The results of the previous deterministic model and the stochastic model are compared to determine gains to stakeholder knowledge through the additional modeling effort.
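The stochastic treatment of pipe-break recharge can be sketched as a simple Monte Carlo simulation that yields a distribution of annual recharge instead of a single deterministic number. The distributions and parameter values below are hypothetical stand-ins, not the study's calibrated Barton Springs values.

```python
import random

random.seed(42)   # reproducible draws for this illustration

def sample_annual_recharge(mean_breaks=120.0, leak_mu=8.0, leak_sigma=0.6):
    """One Monte Carlo draw of annual recharge volume from pipe main breaks.
    Break count ~ Poisson(mean_breaks); leak volume per break ~ lognormal.
    All parameters are hypothetical, not Barton Springs calibration values."""
    # Poisson draw via unit-rate exponential inter-arrival times (stdlib only).
    count, t = 0, random.expovariate(1.0)
    while t <= mean_breaks:
        count += 1
        t += random.expovariate(1.0)
    return sum(random.lognormvariate(leak_mu, leak_sigma) for _ in range(count))

# Empirical distribution of annual recharge across many simulated years.
draws = sorted(sample_annual_recharge() for _ in range(2000))
median = draws[len(draws) // 2]
p90 = draws[int(0.9 * len(draws))]
print(median < p90)   # the spread a deterministic recharge input cannot show
```

Each draw would feed the groundwater model as a recharge input, so stakeholder-facing outputs such as springflow and head levels come out as ranges with probabilities rather than point values.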
NASA Technical Reports Server (NTRS)
Steele, W. G.; Molder, K. J.; Hudson, S. T.; Vadasy, K. V.; Rieder, P. T.; Giel, T.
2005-01-01
NASA and the U.S. Air Force are working on a joint project to develop a new hydrogen-fueled, full-flow, staged combustion rocket engine. The initial testing and modeling work for the Integrated Powerhead Demonstrator (IPD) project is being performed by NASA Marshall and Stennis Space Centers. A key factor in the testing of this engine is the ability to predict and measure the transient fluid flow during engine start and shutdown phases of operation. A model built by NASA Marshall in the ROCket Engine Transient Simulation (ROCETS) program is used to predict transient engine fluid flows. The model is initially calibrated to data from previous tests on the Stennis E1 test stand and is then used to predict the next run. Data from this run can then be used to recalibrate the model, providing a tool to guide the test program in incremental steps and reduce the risk to the prototype engine. In this paper, we define this type of model as a calibrated model. This paper proposes a method to estimate the uncertainty of a model calibrated to a set of experimental test data. The method is similar to that used in the calibration of experiment instrumentation. For the IPD example used in this paper, the model uncertainty is determined for both LOX and LH flow rates using previous data. The model is then shown to successfully predict another similar test run within the uncertainty bounds. The paper summarizes the uncertainty methodology when a model is continually recalibrated with new test data. The methodology is general and can be applied to other calibrated models.
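The instrument-calibration analogy can be sketched numerically: estimate the calibrated model's uncertainty band from its residuals against past test data, then check whether a new run falls inside that band. The residual values below are hypothetical illustrations, not IPD test data, and the 2-sigma coverage factor is a common large-sample approximation.

```python
import statistics

# Hypothetical calibration residuals (measured minus model, as percent of
# reading) for a flow rate, collected from earlier test-stand runs.
residuals = [0.8, -1.1, 0.4, -0.3, 1.2, -0.9, 0.5, -0.6]

# Treat the calibrated model like a calibrated instrument: the systematic
# part is the mean residual, the random part its standard deviation.
bias = statistics.mean(residuals)
s = statistics.stdev(residuals)
u95 = 2.0 * s                       # ~95% coverage (large-sample approximation)
lower, upper = bias - u95, bias + u95

# A new test run is "predicted within uncertainty" if its residual lies
# inside the band; recalibration would then fold it into the residual pool.
new_residual = 0.7
print(lower < new_residual < upper)
```

As new runs are folded in, the residual pool grows and the band is recomputed, which mirrors the continual-recalibration methodology the abstract describes.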