Sample records for model development time

  1. Reliability of Degree-Day Models to Predict the Development Time of Plutella xylostella (L.) under Field Conditions.

    PubMed

    Marchioro, C A; Krechemer, F S; de Moraes, C P; Foerster, L A

    2015-12-01

    The diamondback moth, Plutella xylostella (L.), is a cosmopolitan pest of brassicaceous crops occurring in regions with highly distinct climate conditions. Several studies have investigated the relationship between temperature and P. xylostella development rate, providing degree-day models for populations from different geographical regions. However, there are no data available to date to demonstrate the suitability of such models to make reliable projections on the development time for this species in field conditions. In the present study, 19 models available in the literature were tested regarding their ability to accurately predict the development time of two cohorts of P. xylostella under field conditions. Only 11 out of the 19 models tested accurately predicted the development time for the first cohort of P. xylostella, but only seven for the second cohort. Five models correctly predicted the development time for both cohorts evaluated. Our data demonstrate that the accuracy of the models available for P. xylostella varies widely and therefore should be used with caution for pest management purposes.
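    A degree-day model of the kind tested above can be sketched in a few lines: daily heat units above a lower developmental threshold are accumulated until a thermal constant is reached. The threshold (7.0 °C) and thermal constant (160 degree-days) below are illustrative placeholders, not parameters from any of the 19 published P. xylostella models.

```python
# Hypothetical degree-day sketch: accumulate heat units above a lower
# developmental threshold until a thermal constant is reached.
# Threshold and thermal constant are illustrative values only.

def degree_days(t_min, t_max, t_base):
    """Simple averaging method for one day's heat units."""
    t_mean = (t_min + t_max) / 2.0
    return max(0.0, t_mean - t_base)

def predict_development_time(daily_temps, t_base=7.0, thermal_constant=160.0):
    """Return the day on which accumulated degree-days reach the
    thermal constant, or None if development is not completed."""
    accumulated = 0.0
    for day, (t_min, t_max) in enumerate(daily_temps, start=1):
        accumulated += degree_days(t_min, t_max, t_base)
        if accumulated >= thermal_constant:
            return day
    return None

# Example: constant 12-22 degC days give 10 degree-days per day,
# so development completes on day 16.
days = predict_development_time([(12.0, 22.0)] * 30)
```

    Comparing such predictions against observed field development times for each cohort is, in essence, the validation exercise the paper performs.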

  2. Nonstandard working schedules and health: the systematic search for a comprehensive model.

    PubMed

    Merkus, Suzanne L; Holte, Kari Anne; Huysmans, Maaike A; van Mechelen, Willem; van der Beek, Allard J

    2015-10-23

    Theoretical models on shift work fall short of describing relevant health-related pathways associated with the broader concept of nonstandard working schedules. Shift work models neither combine relevant working time characteristics applicable to nonstandard schedules nor include the role of rest periods and recovery in the development of health complaints. Therefore, this paper aimed to develop a comprehensive model on nonstandard working schedules to address these shortcomings. A literature review was conducted using a systematic search and selection process. Two searches were performed: one associating the working time characteristics time-of-day and working time duration with health and one associating recovery after work with health. Data extracted from the models were used to develop a comprehensive model on nonstandard working schedules and health. For models on the working time characteristics, the search strategy yielded 3044 references, of which 26 met the inclusion criteria that contained 22 distinctive models. For models on recovery after work, the search strategy yielded 896 references, of which seven met the inclusion criteria containing seven distinctive models. Of the models on the working time characteristics, three combined time-of-day with working time duration, 18 were on time-of-day (i.e. shift work), and one was on working time duration. The model developed in the paper has a comprehensive approach to working hours and other work-related risk factors and proposes that they should be balanced by positive non-work factors to maintain health. Physiological processes leading to health complaints are circadian disruption, sleep deprivation, and activation that should be counterbalanced by (re-)entrainment, restorative sleep, and recovery, respectively, to maintain health. A comprehensive model on nonstandard working schedules and health was developed. 
The model proposes that work and non-work, as well as their associated physiological processes, need to be balanced to maintain good health. The model gives researchers a useful overview of the various risk factors and pathways associated with health that should be considered when studying any form of nonstandard working schedule.

  3. Modeling Rabbit Responses to Single and Multiple Aerosol ...

    EPA Pesticide Factsheets

    Survival models are developed here to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple dose dataset to predict the probability of death through specifying dose-response functions and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) has an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed employ different underlying dose-response functions and use the assumption that, in a multiple dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this paper. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit datasets. More accurate survival models depend upon future development of dose-response datasets specifically designed to assess potential multiple dose effects on response and time-to-response. The process used in this paper to dev...
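    The baseline structure described above, an exponential dose-response combined with a Weibull time-to-death distribution, can be sketched as follows; the parameter values are invented for illustration and are not the fitted values from the paper.

```python
import math
import random

# Illustrative sketch of the baseline survival-model structure:
# exponential dose-response plus Weibull time-to-death (TTD).
# k, shape, and scale below are made-up illustrative parameters.

def p_death(dose, k=1e-6):
    """Exponential dose-response: probability of death given a spore dose."""
    return 1.0 - math.exp(-k * dose)

def sample_time_to_death(shape=2.0, scale=4.0, rng=random):
    """Draw a time-to-death (days) from a Weibull distribution
    via inverse-CDF sampling."""
    return scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)

risk = p_death(2.0e5)  # mortality probability for a hypothetical dose
```

    In a full model, the dose-response function sets whether an animal responds, and the Weibull distribution sets when.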

  4. Automated time activity classification based on global positioning system (GPS) tracking data

    PubMed Central

    2011-01-01

    Background Air pollution epidemiological studies are increasingly using global positioning system (GPS) to collect time-location data because they offer continuous tracking, high temporal resolution, and minimum reporting burden for participants. However, substantial uncertainties in the processing and classifying of raw GPS data create challenges for reliably characterizing time activity patterns. We developed and evaluated models to classify people's major time activity patterns from continuous GPS tracking data. Methods We developed and evaluated two automated models to classify major time activity patterns (i.e., indoor, outdoor static, outdoor walking, and in-vehicle travel) based on GPS time activity data collected under free living conditions for 47 participants (N = 131 person-days) from the Harbor Communities Time Location Study (HCTLS) in 2008 and supplemental GPS data collected from three UC-Irvine research staff (N = 21 person-days) in 2010. Time activity patterns used for model development were manually classified by research staff using information from participant GPS recordings, activity logs, and follow-up interviews. We evaluated two models: (a) a rule-based model that developed user-defined rules based on time, speed, and spatial location, and (b) a random forest decision tree model. Results Indoor, outdoor static, outdoor walking and in-vehicle travel activities accounted for 82.7%, 6.1%, 3.2% and 7.2% of manually-classified time activities in the HCTLS dataset, respectively. The rule-based model classified indoor and in-vehicle travel periods reasonably well (Indoor: sensitivity > 91%, specificity > 80%, and precision > 96%; in-vehicle travel: sensitivity > 71%, specificity > 99%, and precision > 88%), but the performance was moderate for outdoor static and outdoor walking predictions. No striking differences in performance were observed between the rule-based and the random forest models. 
The random forest model was fast and easy to execute, but was likely less robust than the rule-based model under the condition of biased or poor quality training data. Conclusions Our models can successfully identify indoor and in-vehicle travel points from the raw GPS data, but challenges remain in developing models to distinguish outdoor static points and walking. Accurate training data are essential in developing reliable models in classifying time-activity patterns. PMID:22082316

  5. Automated time activity classification based on global positioning system (GPS) tracking data.

    PubMed

    Wu, Jun; Jiang, Chengsheng; Houston, Douglas; Baker, Dean; Delfino, Ralph

    2011-11-14

    Air pollution epidemiological studies are increasingly using global positioning system (GPS) to collect time-location data because they offer continuous tracking, high temporal resolution, and minimum reporting burden for participants. However, substantial uncertainties in the processing and classifying of raw GPS data create challenges for reliably characterizing time activity patterns. We developed and evaluated models to classify people's major time activity patterns from continuous GPS tracking data. We developed and evaluated two automated models to classify major time activity patterns (i.e., indoor, outdoor static, outdoor walking, and in-vehicle travel) based on GPS time activity data collected under free living conditions for 47 participants (N = 131 person-days) from the Harbor Communities Time Location Study (HCTLS) in 2008 and supplemental GPS data collected from three UC-Irvine research staff (N = 21 person-days) in 2010. Time activity patterns used for model development were manually classified by research staff using information from participant GPS recordings, activity logs, and follow-up interviews. We evaluated two models: (a) a rule-based model that developed user-defined rules based on time, speed, and spatial location, and (b) a random forest decision tree model. Indoor, outdoor static, outdoor walking and in-vehicle travel activities accounted for 82.7%, 6.1%, 3.2% and 7.2% of manually-classified time activities in the HCTLS dataset, respectively. The rule-based model classified indoor and in-vehicle travel periods reasonably well (Indoor: sensitivity > 91%, specificity > 80%, and precision > 96%; in-vehicle travel: sensitivity > 71%, specificity > 99%, and precision > 88%), but the performance was moderate for outdoor static and outdoor walking predictions. No striking differences in performance were observed between the rule-based and the random forest models. 
The random forest model was fast and easy to execute, but was likely less robust than the rule-based model under the condition of biased or poor quality training data. Our models can successfully identify indoor and in-vehicle travel points from the raw GPS data, but challenges remain in developing models to distinguish outdoor static points and walking. Accurate training data are essential in developing reliable models in classifying time-activity patterns.
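    A rule-based classifier in the spirit of model (a) can be sketched with simple speed thresholds; the thresholds and the signal-loss rule below are illustrative assumptions, not the study's actual rules.

```python
# Minimal rule-based sketch: classify each GPS fix by speed.
# Thresholds and the "indoor on signal loss" rule are assumptions
# for illustration, not the HCTLS study's actual decision rules.

WALK_MAX_KMH = 6.0      # assumed upper bound for walking speed
STATIC_MAX_KMH = 0.5    # assumed upper bound for standing still

def classify_fix(speed_kmh, has_satellite_fix=True):
    """Classify one GPS point into a coarse time-activity category."""
    if not has_satellite_fix:
        return "indoor"          # signal loss often indicates being indoors
    if speed_kmh <= STATIC_MAX_KMH:
        return "outdoor static"
    if speed_kmh <= WALK_MAX_KMH:
        return "outdoor walking"
    return "in-vehicle travel"

labels = [classify_fix(s) for s in (0.0, 4.0, 45.0)]
```

    The study's harder cases, separating outdoor static from slow walking, are exactly where such fixed thresholds break down and where additional features (time, spatial context) are needed.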

  6. Development of a semi-automated model identification and calibration tool for conceptual modelling of sewer systems.

    PubMed

    Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick

    2013-01-01

    Applications such as real-time control, uncertainty analysis and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not sufficient for these applications due to the excessive computation time. Simplifications are consequently required. A lumped conceptual modelling approach results in a much faster calculation. The process of identifying and calibrating the conceptual model structure could, however, be time-consuming. Moreover, many conceptual models lack accuracy, or do not account for backwater effects. To overcome these problems, a modelling methodology was developed which is suited for semi-automatic calibration. The methodology is tested for the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink® tool was developed to guide the modeller through the step-wise model construction, significantly reducing the time required for the conceptual modelling process.

  7. Modeling hurricane evacuation traffic : development of a time-dependent hurricane evacuation demand model.

    DOT National Transportation Integrated Search

    2008-04-01

    The objective of this research is to develop alternative time-dependent travel demand models of hurricane evacuation travel and to compare the performance of these models with each other and with the state-of-the-practice models in current use. Speci...

  8. Individual Differences in Boys’ and Girls’ Timing and Tempo of Puberty: Modeling Development With Nonlinear Growth Models

    PubMed Central

    Marceau, Kristine; Ram, Nilam; Houts, Renate M.; Grimm, Kevin J.; Susman, Elizabeth J.

    2014-01-01

    Pubertal development is a nonlinear process progressing from prepubescent beginnings through biological, physical, and psychological changes to full sexual maturity. To tether theoretical concepts of puberty with sophisticated longitudinal, analytical models capable of articulating pubertal development more accurately, we used nonlinear mixed-effects models to describe both the timing and tempo of pubertal development in the sample of 364 White boys and 373 White girls measured across 6 years as part of the National Institute of Child Health and Human Development Study of Early Child Care and Youth Development. Individual differences in timing and tempo were extracted with models of logistic growth. Differential relations emerged for how boys’ and girls’ timing and tempo of development were related to physical characteristics (body mass index, height, and weight) and psychological outcomes (internalizing problems, externalizing problems, and risky sexual behavior). Timing and tempo are associated in boys but not girls. Pubertal timing and tempo are particularly important for predicting psychological outcomes in girls but only sparsely related to boys’ psychological outcomes. Results highlight the importance of considering the nonlinear nature of puberty and expand the repertoire of possibilities for examining important aspects of how and when pubertal processes contribute to development. PMID:21639623
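    The logistic growth model used to extract timing and tempo can be sketched as follows: "timing" is the age at the curve's inflection point and "tempo" is the growth rate there. The stage bounds and parameter values are illustrative, not estimates from the NICHD sample.

```python
import math

# Logistic growth sketch of pubertal status: "timing" is the age at
# the inflection point, "tempo" the growth rate at that point.
# All parameter values here are illustrative placeholders.

def pubertal_status(age, timing=12.0, tempo=1.5, lower=1.0, upper=5.0):
    """Logistic curve rising from stage `lower` to stage `upper`."""
    return lower + (upper - lower) / (1.0 + math.exp(-tempo * (age - timing)))

# At the timing parameter the curve sits exactly halfway between
# the lower and upper bounds: 1 + 4/2 = 3.0.
midpoint = pubertal_status(12.0)
```

    In a nonlinear mixed-effects analysis, each child gets individual-level timing and tempo parameters, which can then be related to physical and psychological outcomes.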

  9. Road safety forecasts in five European countries using structural time series models.

    PubMed

    Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George

    2014-01-01

    Modeling road safety development is a complex task and needs to consider both the quantifiable impact of specific parameters as well as the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and the (2) latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology is proved to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
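    The local linear trend model referred to above treats the observed series as a stochastic level whose slope also drifts over time. A minimal simulation sketch of those state equations, with invented variances rather than fitted values, is:

```python
import random

# Sketch of the local linear trend state equations:
#   y_t  = mu_t + eps_t          (observation)
#   mu_t = mu_{t-1} + nu_{t-1} + xi_t   (stochastic level)
#   nu_t = nu_{t-1} + zeta_t     (stochastic slope)
# Starting values and standard deviations are illustrative only.

def simulate_local_linear_trend(n, mu0=100.0, nu0=-1.0,
                                sd_obs=2.0, sd_level=1.0, sd_slope=0.1,
                                seed=42):
    rng = random.Random(seed)
    mu, nu, series = mu0, nu0, []
    for _ in range(n):
        series.append(mu + rng.gauss(0.0, sd_obs))
        mu, nu = mu + nu + rng.gauss(0.0, sd_level), nu + rng.gauss(0.0, sd_slope)
    return series

fatality_risk = simulate_local_linear_trend(20)
```

    Fitting such a model (e.g. by Kalman filtering) inverts this simulation: it estimates the latent level and slope, and their variances, from an observed fatality-risk series.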

  10. The optimal manufacturing batch size with rework under time-varying demand process for a finite time horizon

    NASA Astrophysics Data System (ADS)

    Musa, Sarah; Supadi, Siti Suzlin; Omar, Mohd

    2014-07-01

    Rework is one of the solutions to some of the main issues in reverse logistics and green supply chains, as it reduces production cost and environmental problems. Many researchers have focused on developing rework models but, to the authors' knowledge, none has developed a model for a time-varying demand rate. In this paper, we extend previous works and develop a multiple-batch production system for a time-varying demand rate with rework. In this model, the rework is done within the same production cycle.

  11. Real-Time Simulation of the X-33 Aerospace Engine

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert

    1999-01-01

    This paper discusses the development and performance of the X-33 Aerospike Engine Real-Time Model. This model was developed for the purposes of control law development, six degree-of-freedom trajectory analysis, vehicle system integration testing, and hardware-in-the-loop controller verification. The Real-Time Model uses a time-step marching solution of non-linear differential equations representing the physical processes involved in the operation of a liquid propellant rocket engine, albeit in a simplified form. These processes include heat transfer, fluid dynamics, combustion, and turbomachine performance. Two engine models are typically employed in order to accurately model maneuvering and the powerpack-out condition, where the power section of one engine is used to supply propellants to both engines if one engine malfunctions. The X-33 Real-Time Model is compared to actual hot fire test data and has been found to be in good agreement.

  12. Sequentially Executed Model Evaluation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  13. Hot-bench simulation of the active flexible wing wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Buttrill, Carey S.; Houck, Jacob A.

    1990-01-01

    Two simulations, one batch and one real-time, of an aeroelastically-scaled wind-tunnel model were developed. The wind-tunnel model was a full-span, free-to-roll model of an advanced fighter concept. The batch simulation was used to generate and verify the real-time simulation and to test candidate control laws prior to implementation. The real-time simulation supported hot-bench testing of a digital controller, which was developed to actively control the elastic deformation of the wind-tunnel model. Time scaling was required for hot-bench testing. The wind-tunnel model, the mathematical models for the simulations, the techniques employed to reduce the hot-bench time-scale factors, and the verification procedures are described.

  14. Green Pea and Garlic Puree Model Food Development for Thermal Pasteurization Process Quality Evaluation.

    PubMed

    Bornhorst, Ellen R; Tang, Juming; Sablani, Shyam S; Barbosa-Cánovas, Gustavo V; Liu, Fang

    2017-07-01

    Development and selection of model foods is a critical part of microwave thermal process development, simulation validation, and optimization. Previously developed model foods for pasteurization process evaluation utilized Maillard reaction products as the time-temperature integrators, which resulted in similar temperature sensitivity among the models. The aim of this research was to develop additional model foods based on different time-temperature integrators, determine their dielectric properties and color change kinetics, and validate the optimal model food in hot water and microwave-assisted pasteurization processes. Color, quantified using the a* value, was selected as the time-temperature indicator for green pea and garlic puree model foods. Results showed 915 MHz microwaves had a greater penetration depth into the green pea model food than the garlic. a* value reaction rates for the green pea model were approximately 4 times slower than in the garlic model food; slower reaction rates were preferred for the application of model food in this study, that is, quality evaluation for a target process of 90 °C for 10 min at the cold spot. Pasteurization validation used the green pea model food and results showed that there were quantifiable differences between the color of the unheated control, hot water pasteurization, and microwave-assisted thermal pasteurization system. Both model foods developed in this research could be utilized for quality assessment and optimization of various thermal pasteurization processes. © 2017 Institute of Food Technologists®.

  15. Development of constitutive model for composites exhibiting time dependent properties

    NASA Astrophysics Data System (ADS)

    Pupure, L.; Joffe, R.; Varna, J.; Nyström, B.

    2013-12-01

    Regenerated cellulose fibres and their composites exhibit highly nonlinear behaviour. The mechanical response of these materials can be successfully described by the model developed by Schapery for time-dependent materials. However, this model requires input parameters that are experimentally determined via large number of time-consuming tests on the studied composite material. If, for example, the volume fraction of fibres is changed we have a different material and new series of experiments on this new material are required. Therefore the ultimate objective of our studies is to develop model which determines the composite behaviour based on behaviour of constituents of the composite. This paper gives an overview of problems and difficulties, associated with development, implementation and verification of such model.

  16. Real-Time Global Nonlinear Aerodynamic Modeling for Learn-To-Fly

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2016-01-01

    Flight testing and modeling techniques were developed to accurately identify global nonlinear aerodynamic models for aircraft in real time. The techniques were developed and demonstrated during flight testing of a remotely-piloted subscale propeller-driven fixed-wing aircraft using flight test maneuvers designed to simulate a Learn-To-Fly scenario. Prediction testing was used to evaluate the quality of the global models identified in real time. The real-time global nonlinear aerodynamic modeling algorithm will be integrated and further tested with learning adaptive control and guidance for NASA Learn-To-Fly concept flight demonstrations.

  17. Exploratory Study for Continuous-time Parameter Estimation of Ankle Dynamics

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.; Boyle, Richard D.

    2014-01-01

    Recently, a parallel pathway model to describe ankle dynamics was proposed. This model provides a relationship between ankle angle and net ankle torque as the sum of a linear and nonlinear contribution. A technique to identify parameters of this model in discrete-time has been developed. However, these parameters are a nonlinear combination of the continuous-time physiology, making insight into the underlying physiology impossible. The stable and accurate estimation of continuous-time parameters is critical for accurate disease modeling, clinical diagnosis, robotic control strategies, development of optimal exercise protocols for long-term space exploration, sports medicine, etc. This paper explores the development of a system identification technique to estimate the continuous-time parameters of ankle dynamics. The effectiveness of this approach is assessed via simulation of a continuous-time model of ankle dynamics with typical parameters found in clinical studies. The results show that although this technique improves estimates, it does not provide robust estimates of continuous-time parameters of ankle dynamics. Due to this, we conclude that alternative modeling strategies and more advanced estimation techniques should be considered for future work.

  18. REAL-TIME MODELING OF MOTOR VEHICLE EMISSIONS FOR ESTIMATING HUMAN EXPOSURES NEAR ROADWAYS

    EPA Science Inventory

    The United States Environmental Protection Agency's (EPA) National Exposure Research Laboratory is developing a real-time model of motor vehicle emissions to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop ...

  19. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.

  20. Time delays, population, and economic development

    NASA Astrophysics Data System (ADS)

    Gori, Luca; Guerrini, Luca; Sodini, Mauro

    2018-05-01

    This research develops an augmented Solow model with population dynamics and time delays. The model produces either a single stationary state or multiple stationary states (able to characterise different development regimes). The existence of time delays may cause persistent fluctuations in both economic and demographic variables. In addition, the work identifies in a simple way the reasons why economics affects demographics and vice versa.

  1. Analysis of EDZ Development of Columnar Jointed Rock Mass in the Baihetan Diversion Tunnel

    NASA Astrophysics Data System (ADS)

    Hao, Xian-Jie; Feng, Xia-Ting; Yang, Cheng-Xiang; Jiang, Quan; Li, Shao-Jun

    2016-04-01

    Due to the time dependency of crack propagation, columnar jointed rock masses exhibit marked time-dependent behaviour. In this study, in situ measurements, scanning electron microscopy (SEM), a back-analysis method and numerical simulations are used to study the time-dependent development of the excavation damaged zone (EDZ) around underground diversion tunnels in a columnar jointed rock mass. In situ measurements of crack propagation and EDZ development show that their extent increased over time, even after the advancing face had passed. Similar to creep behaviour, the time-dependent EDZ development curve also consists of three stages: a deceleration stage, a stabilization stage, and an acceleration stage. A corresponding constitutive model of columnar jointed rock mass considering time-dependent behaviour is proposed. The time-dependent degradation coefficients of the roughness coefficient and residual friction angle in the Barton-Bandis strength criterion are taken into account. An intelligent back-analysis method is adopted to obtain the unknown time-dependent degradation coefficients for the proposed constitutive model. The numerical modelling results are in good agreement with the measured EDZ. Furthermore, the failure pattern simulated by this time-dependent constitutive model is consistent with that observed by SEM and in situ, indicating that the model can accurately simulate the failure pattern and time-dependent EDZ development of columnar joints. Moreover, the effects of the support system provided and the in situ stress on the time-dependent coefficients are studied. Finally, a long-term stability analysis of diversion tunnels excavated in columnar jointed rock masses is performed.

  2. Development of time-trend model for analysing and predicting case pattern of dog bite injury induced rabies-like-illness in Liberia, 2014-2017.

    PubMed

    Jomah, N D; Ojo, J F; Odigie, E A; Olugasa, B O

    2014-12-01

    The post-civil war records of dog bite injuries (DBI) and rabies-like illness (RLI) among humans in Liberia are a vital epidemiological resource for developing a predictive model to guide the allocation of resources towards human rabies control. Whereas DBI and RLI are high, they are largely under-reported. The objective of this study was to develop a time-trend model of the case pattern and apply it to derive predictors of the time-trend point distribution of DBI-RLI cases. Six years of retrospective, countrywide data on DBI distribution among humans were converted to quarterly series using the Minimizing Squared First Difference transformation technique. The generated dataset was used to train a time-trend model of the DBI-RLI syndrome in Liberia. An additive deterministic time-trend model was selected owing to its performance compared with a multiplicative model of trend and seasonal movement. Parameters were estimated by the least squares method to predict DBI cases for a prospective 4-year period covering 2014-2017. The two-stage predictive model of the DBI case pattern between 2014 and 2017 was characterised by a uniform upward trend within Liberia's coastal and hinterland counties over the forecast period. This paper describes a translational application of the time-trend distribution pattern of the DBI epidemics of 2008-2013 reported in Liberia, on which a predictive model was developed. A computationally feasible two-stage time-trend permutation approach is proposed to estimate the time-trend parameters and conduct predictive inference on DBI-RLI in Liberia.
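    An additive deterministic trend-plus-seasonal model of the kind described can be sketched with ordinary least squares. The two-step fit below (a linear trend, then the mean residual per quarter) is a simplified illustration, not the paper's exact estimation procedure.

```python
# Sketch of an additive deterministic time-trend model with quarterly
# seasonality: y_t = a + b*t + s_q + e_t. Fitted in two simplified
# steps: OLS linear trend, then mean residual per quarter.

def fit_additive_trend_seasonal(y, period=4):
    n = len(y)
    t = list(range(n))
    t_bar, y_bar = sum(t) / n, sum(y) / n
    b = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y)) / \
        sum((ti - t_bar) ** 2 for ti in t)
    a = y_bar - b * t_bar
    resid = [yi - (a + b * ti) for ti, yi in zip(t, y)]
    seasonal = [sum(resid[q::period]) / len(resid[q::period])
                for q in range(period)]
    return a, b, seasonal

def predict(a, b, seasonal, t):
    """Forecast quarter t by extrapolating trend plus seasonal effect."""
    return a + b * t + seasonal[t % len(seasonal)]
```

    Forecasting then amounts to evaluating `predict` at future quarter indices, which is how an upward trend over 2014-2017 would be extrapolated from the 2008-2013 series.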

  3. Clinical Predictive Modeling Development and Deployment through FHIR Web Services.

    PubMed

    Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng

    2015-01-01

    Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction.

  4. Clinical Predictive Modeling Development and Deployment through FHIR Web Services

    PubMed Central

    Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng

    2015-01-01

    Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction. PMID:26958207

  5. Light-weight Parallel Python Tools for Earth System Modeling Workflows

    NASA Astrophysics Data System (ADS)

    Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.

    2015-12-01

    With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x to an expected 25PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the required download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and to (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second of these challenges, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight, be easy to install, have very few dependencies, and can be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
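
    The first tool's core operation, transposing per-timestep records into per-variable series, can be sketched as follows; the variable names and toy data are illustrative, not taken from CESM output.

```python
# Toy "time-slice" output: one record per time step holding every variable,
# mirroring how a climate model writes one file per step.
time_slices = [
    {"temperature": 15.2, "precip": 0.1},
    {"temperature": 15.8, "precip": 0.0},
    {"temperature": 14.9, "precip": 0.3},
]

def slices_to_series(slices):
    """Transpose time-slice records into per-variable time series so a
    user can retrieve a single variable's full history at once."""
    series = {}
    for step in slices:
        for var, value in step.items():
            series.setdefault(var, []).append(value)
    return series

time_series = slices_to_series(time_slices)
```

    The parallel tools distribute exactly this transposition (and the subsequent time-averaging) across variables and files.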

  6. Analyzing developmental processes on an individual level using nonstationary time series modeling.

    PubMed

    Molenaar, Peter C M; Sinclair, Katerina O; Rovine, Michael J; Ram, Nilam; Corneal, Sherry E

    2009-01-01

    Individuals change over time, often in complex ways. Generally, studies of change over time have combined individuals into groups for analysis, which is inappropriate in most, if not all, studies of development. The authors explain how to identify appropriate levels of analysis (individual vs. group) and demonstrate how to estimate changes in developmental processes over time using a multivariate nonstationary time series model. They apply this model to describe the changing relationships between a biological son and father and a stepson and stepfather at the individual level. The authors also explain how to use an extended Kalman filter with iteration and smoothing estimator to capture how dynamics change over time. Finally, they suggest further applications of the multivariate nonstationary time series model and detail the next steps in the development of statistical models used to analyze individual-level data.
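
    A minimal stand-in for the filtering idea, assuming a scalar random-walk state rather than the paper's multivariate extended Kalman filter with iteration and smoothing:

```python
def kalman_track(observations, process_var=0.1, obs_var=1.0):
    """Track a time-varying level with a scalar Kalman filter.

    State model: level_t = level_{t-1} + noise (random walk), which lets
    the estimate drift over time -- a toy version of the nonstationary
    dynamics the authors estimate.
    """
    estimate, variance = observations[0], 1.0
    path = [estimate]
    for y in observations[1:]:
        variance += process_var            # predict: uncertainty grows
        gain = variance / (variance + obs_var)
        estimate += gain * (y - estimate)  # update toward the observation
        variance *= (1.0 - gain)
        path.append(estimate)
    return path

# An invented relationship measure that shifts midway through the series.
data = [1.0, 1.1, 0.9, 1.0, 3.0, 3.1, 2.9, 3.0]
estimates = kalman_track(data)
```

    The filtered path stays near 1.0 early on and then tracks the shift toward 3.0, which is the kind of individual-level change-over-time the model is built to capture.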

  7. Development and parameter identification of a visco-hyperelastic model for the periodontal ligament.

    PubMed

    Huang, Huixiang; Tang, Wencheng; Tan, Qiyan; Yan, Bin

    2017-04-01

    The present study developed and implemented a new visco-hyperelastic model that is capable of predicting the time-dependent biomechanical behavior of the periodontal ligament (PDL). The constitutive model was implemented in the finite element package ABAQUS by means of a user-defined material subroutine (UMAT). The stress response is decomposed into two constitutive parts in parallel: a hyperelastic and a time-dependent viscoelastic stress response. To identify the model parameters, the indentation equation based on the V-W hyperelastic model and the indentation creep model were developed. The parameters were then determined by fitting them to the corresponding nanoindentation experimental data for the PDL. The nanoindentation experiment was simulated by finite element analysis to validate the visco-hyperelastic model. The simulated results are in good agreement with the experimental data, which demonstrates that the developed visco-hyperelastic model is able to accurately predict the time-dependent mechanical behavior of the PDL. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Development of three-dimensional patient face model that enables real-time collision detection and cutting operation for a dental simulator.

    PubMed

    Yamaguchi, Satoshi; Yamada, Yuya; Yoshida, Yoshinori; Noborio, Hiroshi; Imazato, Satoshi

    2012-01-01

    The virtual reality (VR) simulator is a useful tool for developing dental hand skills. However, VR simulations that include patient reactions are limited by the computational time required to reproduce a face model. Our aim was to develop a patient face model that enables real-time collision detection and cutting operation by using stereolithography (STL) and deterministic finite automaton (DFA) data files. We evaluated the dependence of computational cost and constructed the patient face model using the optimum condition for combining STL and DFA data files, then assessed the computational costs of do-nothing, collision, cutting, and combined collision and cutting operations. The face model was successfully constructed with low computational costs of 11.3, 18.3, 30.3, and 33.5 ms for do-nothing, collision, cutting, and collision and cutting, respectively. The patient face model could be useful for developing dental hand skills with VR.

  9. Real Time Data Management for Estimating Probabilities of Incidents and Near Misses

    NASA Astrophysics Data System (ADS)

    Stanitsas, P. D.; Stephanedes, Y. J.

    2011-08-01

    Advances in real-time data collection, data storage, and computational systems have led to the development of algorithms for transport administrators and engineers that improve traffic safety and reduce the cost of road operations. Despite these advances, problems remain in effectively integrating real-time data acquisition, processing, modelling, and road-use strategies at complex intersections and motorways. These relate to increasing system performance in the identification, analysis, detection, and prediction of traffic state in real time. This research develops dynamic models to estimate the probability of road incidents, such as crashes and conflicts, and of incident-prone conditions based on real-time data. The models support the integration of anticipatory information and fee-based road-use strategies in traveller information and management. Development includes macroscopic/microscopic probabilistic models, neural networks, and vector autoregressions tested via machine vision at EU and US sites.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Jim Bouchard

    Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant-specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for:
    • Development of time-dependent fire heat release rate profiles (required as input to CFAST),
    • Calculation of fire severity factors based on CFAST detailed fire modeling, and
    • Calculation of fire non-suppression probabilities.
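
    The time-dependent non-suppression model can be illustrated with the exponential form outlined in NUREG/CR-6850; the suppression rate and times below are made-up numbers, not values from the facility PRA.

```python
import math

def non_suppression_probability(damage_time_min, actuation_time_min,
                                suppression_rate_per_min):
    """P(fire is not suppressed before component damage).

    Exponential suppression model: P_ns = exp(-lambda * t), with t the
    window between suppression-system actuation and component damage.
    A negative window (damage before actuation) is clamped to zero,
    i.e. suppression cannot help and P_ns = 1.
    """
    window = max(damage_time_min - actuation_time_min, 0.0)
    return math.exp(-suppression_rate_per_min * window)

# Illustrative case: damage predicted at 20 min, system actuates at 5 min,
# suppression rate 0.1 per minute.
p_ns = non_suppression_probability(20.0, 5.0, 0.1)
```

    In the PRA, the damage and actuation times feeding this formula come from the CFAST/LHS runs, one pair per sample.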

  11. Motivation and timing: Clues for modeling the reward system

    PubMed Central

    Galtress, Tiffany; Marshall, Andrew T.; Kirkpatrick, Kimberly

    2012-01-01

    There is growing evidence that a change in reward magnitude or value alters interval timing, indicating that motivation and timing are not independent processes as was previously believed. The present paper reviews several recent studies and presents some new evidence from further manipulations of reward value during training vs. testing on a peak procedure. The combined results cannot be accounted for by any of the current psychological timing theories. However, in examining the neural circuitry of the reward system, it is not surprising that motivation has an impact on timing, because the motivation/valuation system directly interfaces with the timing system. A new approach is proposed for the development of the next generation of timing models, one that utilizes knowledge of the neuroanatomy and neurophysiology of the reward system to guide the development of a neurocomputational model of the reward system. The initial foundation, along with heuristics for developing such a model, is unveiled in an attempt to stimulate new theoretical approaches in the field. PMID:22421220

  12. Study of mathematical modeling of communication systems transponders and receivers

    NASA Technical Reports Server (NTRS)

    Walsh, J. R.

    1972-01-01

    The modeling of communication receivers is described at both the circuit detail level and at the block level. The largest effort was devoted to developing new models at the block modeling level. The available effort did not permit full development of all of the block modeling concepts envisioned, but idealized blocks were developed for signal sources, a variety of filters, limiters, amplifiers, mixers, and demodulators. These blocks were organized into an operational computer simulation of communications receiver circuits identified as the frequency and time circuit analysis technique (FATCAT). The simulation operates in both the time and frequency domains, and permits output plots or listings of either frequency spectra or time waveforms from any model block. Transfer between domains is handled with a fast Fourier transform algorithm.
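
    The time/frequency transfer at the heart of FATCAT can be illustrated with a naive discrete Fourier transform (the simulation itself used an FFT algorithm); the test signal here is arbitrary.

```python
import cmath
import math

def dft(samples):
    """Naive O(n^2) discrete Fourier transform; the direct form makes the
    time-domain to frequency-domain mapping explicit."""
    n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * m / n)
                for m, x in enumerate(samples))
            for k in range(n)]

# A pure cosine occupying bin 1 of an 8-sample window.
wave = [math.cos(2 * math.pi * m / 8) for m in range(8)]
spectrum = dft(wave)
magnitudes = [abs(c) for c in spectrum]  # a real tone peaks at bins 1 and n-1
```
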

  13. Guidelines and Procedures for Computing Time-Series Suspended-Sediment Concentrations and Loads from In-Stream Turbidity-Sensor and Streamflow Data

    USGS Publications Warehouse

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.

    2009-01-01

    In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
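
    The first computation step, fitting a simple linear regression of SSC on turbidity and applying it to a continuous record, can be sketched as follows. The calibration pairs are invented for illustration, and the MSPE criterion and transformation details of the USGS procedure are omitted.

```python
def fit_ols(x, y):
    """Ordinary least squares fit of y = intercept + slope * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

# Paired calibration samples: turbidity (FNU) vs. measured SSC (mg/L).
turbidity = [10.0, 20.0, 40.0, 80.0]
ssc = [25.0, 45.0, 85.0, 165.0]

intercept, slope = fit_ols(turbidity, ssc)
# Apply the model to a continuous turbidity record to get an SSC time series,
# which would then be paired with streamflow to compute loads.
ssc_series = [intercept + slope * t for t in [15.0, 30.0, 60.0]]
```

    If this simple model failed the MSPE criterion, the guidelines call for trying a multiple regression that adds streamflow as a second explanatory variable.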

  14. Developing and Validating Path-Dependent Uncertainty Estimates for use with the Regional Seismic Travel Time (RSTT) Model

    NASA Astrophysics Data System (ADS)

    Begnaud, M. L.; Anderson, D. N.; Phillips, W. S.; Myers, S. C.; Ballard, S.

    2016-12-01

    The Regional Seismic Travel Time (RSTT) tomography model has been developed to improve travel time predictions for regional phases (Pn, Sn, Pg, Lg) in order to increase seismic location accuracy, especially for explosion monitoring. The RSTT model is specifically designed to exploit regional phases for location, especially when combined with teleseismic arrivals. The latest RSTT model (version 201404um) has been released (http://www.sandia.gov/rstt). Travel time uncertainty estimates for RSTT are determined using one-dimensional (1D), distance-dependent error models, which have the benefit of being very fast to use in standard location algorithms but do not account for path-dependent variations in error or for structural inadequacy of the RSTT model (i.e., model error). Although global in extent, the RSTT tomography model is only defined in areas where data exist. A simple 1D error model does not accurately model areas where RSTT has not been calibrated. We are developing and validating a new error model for RSTT phase arrivals by mathematically deriving this multivariate model directly from a unified model of RSTT embedded into a statistical random effects model that captures distance, path, and model error effects. An initial method developed is a two-dimensional path-distributed method using residuals. The goals for any RSTT uncertainty method are for it to be both readily useful for the standard RSTT user and an improvement in travel time uncertainty estimates for location. We have successfully tested the new error model for Pn phases and will demonstrate the method and validation of the error model for Sn, Pg, and Lg phases.

  15. Toward a comprehensive model of antisocial development: a dynamic systems approach.

    PubMed

    Granic, Isabela; Patterson, Gerald R

    2006-01-01

    The purpose of this article is to develop a preliminary comprehensive model of antisocial development based on dynamic systems principles. The model is built on the foundations of behavioral research on coercion theory. First, the authors focus on the principles of multistability, feedback, and nonlinear causality to reconceptualize real-time parent-child and peer processes. Second, they model the mechanisms by which these real-time processes give rise to negative developmental outcomes, which in turn feed back to determine real-time interactions. Third, they examine mechanisms of change and stability in early- and late-onset antisocial trajectories. Finally, novel clinical designs and predictions are introduced. The authors highlight new predictions and present studies that have tested aspects of the model.

  16. Alternative approach to modeling bacterial lag time, using logistic regression as a function of time, temperature, pH, and sodium chloride concentration.

    PubMed

    Koseki, Shige; Nonaka, Junko

    2012-09-01

    The objective of this study was to develop a probabilistic model to predict the end of lag time (λ) during the growth of Bacillus cereus vegetative cells as a function of temperature, pH, and salt concentration using logistic regression. The developed λ model was subsequently combined with a logistic differential equation to simulate bacterial numbers over time. To develop a novel model for λ, we determined whether bacterial growth had begun, i.e., whether λ had ended, at each time point during the growth kinetics. The growth of B. cereus was evaluated by optical density (OD) measurements in culture media for various pHs (5.5 ∼ 7.0) and salt concentrations (0.5 ∼ 2.0%) at static temperatures (10 ∼ 20°C). The probability of the end of λ was modeled using dichotomous judgments obtained at each OD measurement point concerning whether a significant increase had been observed. The probability of the end of λ was described as a function of time, temperature, pH, and salt concentration and showed a high goodness of fit. The λ model was validated with independent data sets of B. cereus growth in culture media and foods, indicating acceptable performance. Furthermore, the λ model, in combination with a logistic differential equation, enabled a simulation of the population of B. cereus in various foods over time at static and/or fluctuating temperatures with high accuracy. Thus, this newly developed modeling procedure enables the description of λ using observable environmental parameters without any conceptual assumptions and the simulation of bacterial numbers over time with the use of a logistic differential equation.
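
    A sketch of the two-part procedure: a logistic model for the probability that lag has ended, driving a logistic growth equation. All coefficients here are hypothetical, not the fitted B. cereus values.

```python
import math

def p_lag_ended(t_h, temp_c, ph, nacl_pct):
    """Probability that the lag phase has ended at time t_h (hours).
    Coefficients are hypothetical, chosen only so the probability rises
    with time, temperature, and pH and falls with salt."""
    z = -8.0 + 0.15 * t_h + 0.20 * temp_c + 0.50 * ph - 0.80 * nacl_pct
    return 1.0 / (1.0 + math.exp(-z))

def simulate_growth(hours, temp_c, ph, nacl_pct,
                    n0=2.0, nmax=8.0, mu=0.25):
    """Euler integration of a logistic growth equation in which the
    growth term is scaled by the probability that lag has ended."""
    log_n = n0  # log10 CFU/g
    for t in range(hours):
        p = p_lag_ended(t, temp_c, ph, nacl_pct)
        log_n += p * mu * (1.0 - log_n / nmax)
    return log_n

final_count = simulate_growth(48, temp_c=15.0, ph=6.5, nacl_pct=1.0)
```

    Early on the lag probability suppresses growth almost entirely; as it approaches one, the simulation reduces to ordinary logistic growth, which is how the λ model and the differential equation combine in the paper.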

  17. Time on Your Hands: Modeling Time

    ERIC Educational Resources Information Center

    Finson, Kevin; Beaver, John

    2007-01-01

    Building physical models relative to a concept can be an important activity to help students develop and manipulate abstract ideas and mental models that often prove difficult to grasp. One such concept is "time". A method for helping students understand the cyclical nature of time involves the construction of a Time Zone Calculator through a…

  18. Modeling Pubertal Timing and Tempo and Examining Links to Behavior Problems

    ERIC Educational Resources Information Center

    Beltz, Adriene M.; Corley, Robin P.; Bricker, Josh B.; Wadsworth, Sally J.; Berenbaum, Sheri A.

    2014-01-01

    Research on the role of puberty in adolescent psychological development requires attention to the meaning and measurement of pubertal development. Particular questions concern the utility of self-report, the need for complex models to describe pubertal development, the psychological significance of pubertal timing vs. tempo, and sex differences in…

  19. A Dynamical System Approach Explaining the Process of Development by Introducing Different Time-scales.

    PubMed

    Hashemi Kamangar, Somayeh Sadat; Moradimanesh, Zahra; Mokhtari, Setareh; Bakouie, Fatemeh

    2018-06-11

    A developmental process can be described as changes through time within a complex dynamic system. The self-organized changes and emergent behaviour during development can be described and modeled as a dynamical system. We propose a dynamical system approach to the main question in human cognitive development, i.e., whether the changes during development happen continuously or in discontinuous stages. Within this approach there is a concept, the size of time-scales, that can be used to address this question. We introduce a framework, based on the concept of time-scale, in which "fast" and "slow" are defined by the size of time-scales. According to our suggested model, the overall pattern of development can be seen as one continuous function with different time-scales in different time intervals.

  20. Comparison of Conventional and ANN Models for River Flow Forecasting

    NASA Astrophysics Data System (ADS)

    Jain, A.; Ganti, R.

    2011-12-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydro power generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation, and maintenance activities. River flow is generally estimated using time series or rainfall-runoff models. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been extensively adopted in operational hydrological forecasts. There is a strong need to develop ANN models based on real catchment data and compare them with the conventional models. In this paper, a comparative study has been carried out for river flow forecasting using conventional and ANN models. Among the conventional models, multiple linear and nonlinear regression models and time series models of the autoregressive (AR) type have been developed. A feed-forward neural network structure trained using the back propagation algorithm, a gradient search method, was adopted. The daily river flow data derived from the Godavari Basin at Polavaram, Andhra Pradesh, India, were employed to develop all the models included here. Two inputs, the flows at the two previous time steps (Q(t-1) and Q(t-2)), were selected using partial autocorrelation analysis for forecasting the flow at time t, Q(t). A wide range of error statistics has been used to evaluate the performance of all the models developed in this study. It was found that the regression and AR models performed comparably, and the ANN model performed the best among all the models investigated. It is concluded that the ANN model should be adopted in real catchments for hydrological modeling and forecasting.
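
    The conventional AR benchmark with the two selected lags can be sketched as follows, using synthetic flows in place of the Godavari data:

```python
def fit_ar2(flows):
    """Least-squares AR(2) fit: Q(t) ~ b1*Q(t-1) + b2*Q(t-2).

    Solves the 2x2 normal equations by Cramer's rule; the two lag inputs
    mirror those selected by partial autocorrelation analysis.
    """
    y = flows[2:]
    x1 = flows[1:-1]   # Q(t-1)
    x2 = flows[:-2]    # Q(t-2)
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    sy1 = sum(a * b for a, b in zip(x1, y))
    sy2 = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    b1 = (sy1 * s22 - sy2 * s12) / det
    b2 = (s11 * sy2 - s12 * sy1) / det
    return b1, b2

# Synthetic daily flows generated by Q(t) = 0.6*Q(t-1) + 0.3*Q(t-2).
flows = [100.0, 90.0]
for _ in range(20):
    flows.append(0.6 * flows[-1] + 0.3 * flows[-2])

b1, b2 = fit_ar2(flows)
forecast = b1 * flows[-1] + b2 * flows[-2]  # one-step-ahead Q(t)
```

    On this noiseless series the fit recovers the generating coefficients; the study's point is that an ANN can outperform this benchmark on real, noisy catchment data.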

  1. Real-time Social Internet Data to Guide Forecasting Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Del Valle, Sara Y.

    Our goal is to improve decision support by monitoring and forecasting events using social media, mathematical models, and quantified model uncertainty. Our approach is real-time, data-driven forecasts with quantified uncertainty: not just for weather anymore. Information flow from human observations of events through an Internet system and classification algorithms is used to produce quantitatively uncertain forecasts. In summary, we want to develop new tools to extract useful information from Internet data streams, develop new approaches to assimilate real-time information into predictive models, and validate these approaches by forecasting events; our ultimate goal is to develop an event forecasting system using mathematical approaches and heterogeneous data streams.

  2. Are Model Transferability And Complexity Antithetical? Insights From Validation of a Variable-Complexity Empirical Snow Model in Space and Time

    NASA Astrophysics Data System (ADS)

    Lute, A. C.; Luce, Charles H.

    2017-11-01

    The related challenges of predictions in ungauged basins and predictions in ungauged climates point to the need to develop environmental models that are transferable across both space and time. Hydrologic modeling has historically focused on modeling one or only a few basins using highly parameterized conceptual or physically based models. However, model parameters and structures have been shown to change significantly when calibrated to new basins or time periods, suggesting that model complexity and model transferability may be antithetical. Empirical space-for-time models provide a framework within which to assess model transferability and any tradeoff with model complexity. Using 497 SNOTEL sites in the western U.S., we develop space-for-time models of April 1 SWE and Snow Residence Time based on mean winter temperature and cumulative winter precipitation. The transferability of the models to new conditions (in both space and time) is assessed using non-random cross-validation tests, with consideration of the influence of model complexity on transferability. As others have noted, the algorithmic empirical models transfer best when minimal extrapolation in input variables is required. Temporal split-sample validations use pseudoreplicated samples, resulting in the selection of overly complex models, which has implications for the design of hydrologic model validation tests. Finally, we show that low to moderate complexity models transfer most successfully to new conditions in space and time, providing empirical confirmation of the parsimony principle.
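
    The non-random cross-validation idea, calibrating in one climate and validating in another, can be sketched with an invented SWE-temperature relation:

```python
def fit_line(x, y):
    """OLS fit of y = a + b * x (here: April 1 SWE vs. winter temperature)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Invented sites: (mean winter temperature C, April 1 SWE mm).
sites = [(-8, 900), (-6, 760), (-5, 710), (-4, 640), (-2, 500), (-1, 430)]

# Non-random split: calibrate on the cold sites, validate on the warm ones,
# so the test probes transfer to conditions outside the calibration range.
train, test = sites[:4], sites[4:]
a, b = fit_line([t for t, _ in train], [s for _, s in train])
errors = [abs((a + b * t) - s) for t, s in test]
mean_abs_err = sum(errors) / len(errors)
```

    A random split would mix climates between calibration and validation and thus overstate transferability; the deliberate cold/warm split is the paper's safeguard against that.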

  3. The practical use of simplicity in developing ground water models

    USGS Publications Warehouse

    Hill, M.C.

    2006-01-01

    The advantages of starting with simple models and building complexity slowly can be significant in the development of ground water models. In many circumstances, simpler models are characterized by fewer defined parameters and shorter execution times. In this work, the number of parameters is used as the primary measure of simplicity and complexity; the advantages of shorter execution times also are considered. The ideas are presented in the context of constructing ground water models but are applicable to many fields. Simplicity first is put in perspective as part of the entire modeling process using 14 guidelines for effective model calibration. It is noted that neither very simple nor very complex models generally produce the most accurate predictions and that determining the appropriate level of complexity is an ill-defined process. It is suggested that a thorough evaluation of observation errors is essential to model development. Finally, specific ways are discussed to design useful ground water models that have fewer parameters and shorter execution times.

  4. Analysis of Time-Series Quasi-Experiments. Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; Maguire, Thomas O.

    The objective of this project was to investigate the adequacy of statistical models developed by G. E. P. Box and G. C. Tiao for the analysis of time-series quasi-experiments: (1) The basic model developed by Box and Tiao is applied to actual time-series experiment data from two separate experiments, one in psychology and one in educational…

  5. Predicting the Risk of Attrition for Undergraduate Students with Time Based Modelling

    ERIC Educational Resources Information Center

    Chai, Kevin E. K.; Gibson, David

    2015-01-01

    Improving student retention is an important and challenging problem for universities. This paper reports on the development of a student attrition model for predicting which first year students are most at-risk of leaving at various points in time during their first semester of study. The objective of developing such a model is to assist…

  6. Real-time monitoring of a microbial electrolysis cell using an electrical equivalent circuit model.

    PubMed

    Hussain, S A; Perrier, M; Tartakovsky, B

    2018-04-01

    Efforts in developing microbial electrolysis cells (MECs) resulted in several novel approaches for wastewater treatment and bioelectrosynthesis. Practical implementation of these approaches necessitates the development of an adequate system for real-time (on-line) monitoring and diagnostics of MEC performance. This study describes a simple MEC equivalent electrical circuit (EEC) model and a parameter estimation procedure, which enable such real-time monitoring. The proposed approach involves MEC voltage and current measurements during its operation with periodic power supply connection/disconnection (on/off operation) followed by parameter estimation using either numerical or analytical solution of the model. The proposed monitoring approach is demonstrated using a membraneless MEC with flow-through porous electrodes. Laboratory tests showed that changes in the influent carbon source concentration and composition significantly affect MEC total internal resistance and capacitance estimated by the model. Fast response of these EEC model parameters to changes in operating conditions enables the development of a model-based approach for real-time monitoring and fault detection.
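
    A sketch of the parameter-estimation step, assuming a single R-C relaxation after a power-supply switching event; the circuit values and the log-linear fitting shortcut are illustrative, not the paper's procedure.

```python
import math

def estimate_eec(time_s, voltage_v, current_a):
    """Estimate total internal resistance and capacitance from a voltage
    transient, assuming V(t) = V_inf - (V_inf - V0) * exp(-t / (R * C))."""
    v0, v_inf = voltage_v[0], voltage_v[-1]
    r_total = (v_inf - v0) / current_a  # steady-state voltage change / current
    xs, ys = [], []
    for t, v in zip(time_s, voltage_v):
        frac = (v_inf - v) / (v_inf - v0)  # normalized remaining transient
        if frac > 1e-3:                    # skip the noisy, settled tail
            xs.append(t)
            ys.append(math.log(frac))
    # Through-origin regression of log(frac) on t gives -1/tau.
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    tau = -1.0 / slope
    return r_total, tau / r_total          # C = tau / R

# Synthetic transient for R = 10 ohm, C = 0.5 F at 0.05 A (tau = 5 s).
ts = list(range(51))
vs = [0.3 + 0.05 * 10.0 * (1.0 - math.exp(-t / 5.0)) for t in ts]
r_est, c_est = estimate_eec(ts, vs, 0.05)
```

    Tracking r_est and c_est across repeated on/off cycles is the monitoring signal: the study reports that both respond quickly to changes in influent carbon source concentration and composition.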

  7. Evaluation of a Multi-Axial, Temperature, and Time Dependent (MATT) Failure Model

    NASA Technical Reports Server (NTRS)

    Richardson, D. E.; Anderson, G. L.; Macon, D. J.; Rudolphi, Michael (Technical Monitor)

    2002-01-01

    To obtain a better understanding of the response of the structural adhesives used in the Space Shuttle's Reusable Solid Rocket Motor (RSRM) nozzle, an extensive effort has been conducted to characterize in detail the failure properties of these adhesives. This effort involved the development of a failure model that includes the effects of multi-axial loading, temperature, and time. An understanding of the effects of these parameters on the failure of the adhesive is crucial to understanding and predicting the safety of the RSRM nozzle. This paper documents the use of this newly developed multi-axial, temperature, and time (MATT) dependent failure model for modeling failure of the adhesives TIGA 321, EA913NA, and EA946. The development of the mathematical failure model using constant-load-rate normal and shear test data is presented. The accuracy of the failure model is verified through comparisons between predictions and measured creep and multi-axial failure data. The verification indicates that the failure model performs well over a wide range of conditions (loading, temperature, and time) for the three adhesives. The failure criterion is shown to be accurate through the glass transition for the adhesive EA946. Though this failure model has been developed and evaluated with adhesives, the concepts are applicable to other isotropic materials.

  8. Predicting Drug Concentration‐Time Profiles in Multiple CNS Compartments Using a Comprehensive Physiologically‐Based Pharmacokinetic Model

    PubMed Central

    Yamamoto, Yumi; Välitalo, Pyry A.; Huntjens, Dymphy R.; Proost, Johannes H.; Vermeulen, An; Krauwinkel, Walter; Beukers, Margot W.; van den Berg, Dirk‐Jan; Hartman, Robin; Wong, Yin Cheong; Danhof, Meindert; van Hasselt, John G. C.

    2017-01-01

    Drug development targeting the central nervous system (CNS) is challenging due to poor predictability of drug concentrations in various CNS compartments. We developed a generic physiologically based pharmacokinetic (PBPK) model for prediction of drug concentrations in physiologically relevant CNS compartments. System‐specific and drug‐specific model parameters were derived from literature and in silico predictions. The model was validated using detailed concentration‐time profiles from 10 drugs in rat plasma, brain extracellular fluid, 2 cerebrospinal fluid sites, and total brain tissue. These drugs, all small molecules, were selected to cover a wide range of physicochemical properties. The concentration‐time profiles for these drugs were adequately predicted across the CNS compartments (symmetric mean absolute percentage error for the model prediction was <91%). In conclusion, the developed PBPK model can be used to predict temporal concentration profiles of drugs in multiple relevant CNS compartments, which we consider valuable information for efficient CNS drug development. PMID:28891201
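
    The compartmental idea can be illustrated with a drastically reduced two-compartment sketch (plasma and brain); the rate constants are invented, not the model's system- or drug-specific parameters, and the real model resolves several distinct CNS compartments.

```python
def simulate_plasma_brain(dose_mg, hours, dt=0.01,
                          k_elim=0.3, k_in=0.1, k_out=0.2):
    """Euler integration of a two-compartment plasma <-> brain sketch.

    k_elim: elimination from plasma (1/h); k_in/k_out: transfer across
    the blood-brain barrier (1/h). All values are illustrative.
    """
    plasma, brain = dose_mg, 0.0
    curve = []
    for _ in range(int(hours / dt)):
        d_plasma = -k_elim * plasma - k_in * plasma + k_out * brain
        d_brain = k_in * plasma - k_out * brain
        plasma += d_plasma * dt
        brain += d_brain * dt
        curve.append((plasma, brain))
    return curve

profile = simulate_plasma_brain(100.0, 24.0)
peak_brain = max(b for _, b in profile)
```

    The brain curve lags and undershoots the plasma curve, which is the qualitative behavior that makes CNS concentrations hard to infer from plasma alone and motivates the full PBPK model.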

  9. Remote sensing of aquatic vegetation distribution in Taihu Lake using an improved classification tree with modified thresholds.

    PubMed

    Zhao, Dehua; Jiang, Hao; Yang, Tangwu; Cai, Ying; Xu, Delin; An, Shuqing

    2012-03-01

    Classification trees (CT) have been used successfully in the past to classify aquatic vegetation from spectral indices (SI) obtained from remotely-sensed images. However, applying CT models developed for certain image dates to other time periods within the same year or among different years can reduce the classification accuracy. In this study, we developed CT models with modified thresholds using extreme SI values (CT(m)) to improve the stability of the models when applying them to different time periods. A total of 903 ground-truth samples were obtained in September of 2009 and 2010 and classified as emergent, floating-leaf, or submerged vegetation or other cover types. Classification trees were developed for 2009 (Model-09) and 2010 (Model-10) using field samples and a combination of two images from winter and summer. Overall accuracies of these models were 92.8% and 94.9%, respectively, which confirmed the ability of CT analysis to map aquatic vegetation in Taihu Lake. However, Model-10 had only 58.9-71.6% classification accuracy and 31.1-58.3% agreement (i.e., pixels classified the same in the two maps) for aquatic vegetation when it was applied to image pairs from both a different time period in 2010 and a similar time period in 2009. We developed a method to estimate the effects of extrinsic (EF) and intrinsic (IF) factors on model uncertainty using Modis images. Results indicated that 71.1% of the instability in classification between time periods was due to EF, which might include changes in atmospheric conditions, sun-view angle and water quality. The remainder was due to IF, such as phenological and growth status differences between time periods. The modified version of Model-10 (i.e. CT(m)) performed better than traditional CT with different image dates. 
When applied to 2009 images, the CT(m) version of Model-10 had very similar thresholds and performance as Model-09, with overall accuracies of 92.8% and 90.5% for Model-09 and the CT(m) version of Model-10, respectively. CT(m) decreased the variability related to EF and IF and thereby improved the applicability of the models to different time periods. In both practice and theory, our results suggested that CT(m) was more stable than traditional CT models and could be used to map aquatic vegetation in time periods other than the one for which the model was developed. Copyright © 2011 Elsevier Ltd. All rights reserved.
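An illustrative sketch (not the paper's code or thresholds) of the classification-tree step: a small decision tree separating aquatic vegetation classes from two hypothetical spectral-index features, in the spirit of the CT models described above. All class centres and index values are invented.

```python
# Sketch only: classify vegetation types from two hypothetical spectral
# indices (NDVI-like and FAI-like) with a shallow decision tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)

# Hypothetical (NDVI-like, FAI-like) class centres; not from the study.
classes = {"emergent": (0.6, 0.05), "floating": (0.4, 0.12),
           "submerged": (0.1, -0.02), "other": (-0.2, -0.1)}
X, y = [], []
for label, (ndvi, fai) in classes.items():
    X.append(rng.normal([ndvi, fai], [0.05, 0.02], size=(50, 2)))
    y += [label] * 50
X = np.vstack(X)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.score(X, y))              # training accuracy
print(tree.predict([[0.62, 0.05]]))  # likely "emergent"
```

The paper's CT(m) refinement would then replace the learned thresholds with values derived from extreme SI statistics to stabilize the tree across image dates.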

  10. Lessons learned in the transition to Ada from FORTRAN at NASA/Goddard

    NASA Technical Reports Server (NTRS)

    Brophy, Carolyn Elizabeth

    1989-01-01

Two dynamics satellite simulators were developed from the same requirements, one in Ada and the other in FORTRAN. The purpose of the research was to find out how well the prescriptive Ada development model worked for developing the Ada simulator. The FORTRAN simulator development, as well as past FORTRAN developments, provided a baseline for comparison. Since this was the first simulator developed in Ada, the prescriptive Ada development model had many similarities to the usual FORTRAN development model. However, it was modified to include longer design and shorter testing phases, which is generally expected with Ada developments. One result was that the percentage of time the Ada project spent in the various development activities was very similar to the percentage of time spent in these activities on a FORTRAN project. Another finding was the difficulty the Ada team had with unit testing as well as with integration. It was realized that adding steps to the design phase, such as an abstract data type analysis, and certain guidelines to the implementation phase, such as using primarily library units and nesting sparingly, would have made development easier. These are among the recommendations to be incorporated into a new Ada development model for future projects.

  11. Hierarchical Diffusion Models for Two-Choice Response Times

    ERIC Educational Resources Information Center

    Vandekerckhove, Joachim; Tuerlinckx, Francis; Lee, Michael D.

    2011-01-01

    Two-choice response times are a common type of data, and much research has been devoted to the development of process models for such data. However, the practical application of these models is notoriously complicated, and flexible methods are largely nonexistent. We combine a popular model for choice response times--the Wiener diffusion…

  12. Competing risks models and time-dependent covariates

    PubMed Central

    Barnett, Adrian; Graves, Nick

    2008-01-01

    New statistical models for analysing survival data in an intensive care unit context have recently been developed. Two models that offer significant advantages over standard survival analyses are competing risks models and multistate models. Wolkewitz and colleagues used a competing risks model to examine survival times for nosocomial pneumonia and mortality. Their model was able to incorporate time-dependent covariates and so examine how risk factors that changed with time affected the chances of infection or death. We briefly explain how an alternative modelling technique (using logistic regression) can more fully exploit time-dependent covariates for this type of data. PMID:18423067

  13. Characterization of anomalous relaxation using the time-fractional Bloch equation and multiple echo T2 *-weighted magnetic resonance imaging at 7 T.

    PubMed

    Qin, Shanlin; Liu, Fawang; Turner, Ian W; Yu, Qiang; Yang, Qianqian; Vegh, Viktor

    2017-04-01

To study the utility of fractional calculus in modeling gradient-recalled echo MRI signal decay in the normal human brain. We solved analytically the extended time-fractional Bloch equations resulting in five model parameters, namely, the amplitude, relaxation rate, order of the time-fractional derivative, frequency shift, and constant offset. Voxel-level temporal fitting of the MRI signal was performed using the classical monoexponential model, a previously developed anomalous relaxation model, and using our extended time-fractional relaxation model. Nine brain regions segmented from multiple echo gradient-recalled echo 7 Tesla MRI data acquired from five participants were then used to investigate the characteristics of the extended time-fractional model parameters. We found that the extended time-fractional model is able to fit the experimental data with smaller mean squared error than the classical monoexponential relaxation model and the anomalous relaxation model, which do not account for frequency shift. We were able to fit multiple echo time MRI data with high accuracy using the developed model. Parameters of the model likely capture information on microstructural and susceptibility-induced changes in the human brain. Magn Reson Med 77:1485-1494, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
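The voxel-level fitting step can be sketched with the classical monoexponential comparator model, S(t) = A·exp(-R2*·t) + C; the time-fractional model would replace the exponential with a Mittag-Leffler function, which is not shown here since it has no standard SciPy implementation. Echo times, parameters, and noise level are illustrative only.

```python
# Sketch of per-voxel curve fitting with the monoexponential model;
# the paper's time-fractional model adds a fractional-order parameter.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, amp, r2s, offset):
    return amp * np.exp(-r2s * te) + offset

te = np.linspace(0.002, 0.030, 10)      # echo times in seconds (assumed)
true = (1000.0, 40.0, 50.0)             # amp, R2* (1/s), offset (assumed)
rng = np.random.default_rng(1)
signal = mono_exp(te, *true) + rng.normal(0, 5, te.size)

params, _ = curve_fit(mono_exp, te, signal, p0=(800.0, 30.0, 0.0))
print(params)  # close to (1000, 40, 50)
```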

  14. Motivation and timing: clues for modeling the reward system.

    PubMed

    Galtress, Tiffany; Marshall, Andrew T; Kirkpatrick, Kimberly

    2012-05-01

    There is growing evidence that a change in reward magnitude or value alters interval timing, indicating that motivation and timing are not independent processes as was previously believed. The present paper reviews several recent studies, as well as presenting some new evidence with further manipulations of reward value during training vs. testing on a peak procedure. The combined results cannot be accounted for by any of the current psychological timing theories. However, in examining the neural circuitry of the reward system, it is not surprising that motivation has an impact on timing because the motivation/valuation system directly interfaces with the timing system. A new approach is proposed for the development of the next generation of timing models, which utilizes knowledge of the neuroanatomy and neurophysiology of the reward system to guide the development of a neurocomputational model of the reward system. The initial foundation along with heuristics for proceeding with developing such a model is unveiled in an attempt to stimulate new theoretical approaches in the field. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Refining Time-Activity Classification of Human Subjects Using the Global Positioning System.

    PubMed

    Hu, Maogui; Li, Wei; Li, Lianfa; Houston, Douglas; Wu, Jun

    2016-01-01

Detailed spatial location information is important in accurately estimating personal exposure to air pollution. The Global Positioning System (GPS) has been widely used in tracking personal paths and activities. Previous researchers have developed time-activity classification models based on GPS data, but most were developed for specific regions. An adaptive model for time-location classification can be widely applied to air pollution studies that use GPS to track individual-level time-activity patterns. Time-activity data were collected for seven days using GPS loggers and accelerometers from thirteen adult participants from Southern California under free-living conditions. We developed an automated model based on random forests to classify major time-activity patterns (i.e., indoor, outdoor-static, outdoor-walking, and in-vehicle travel). Sensitivity analysis was conducted to examine the contribution of the accelerometer data and the supplemental spatial data (i.e., roadway and tax parcel data) to the accuracy of time-activity classification. Our model was evaluated using both leave-one-fold-out and leave-one-subject-out methods. Maximum speeds in averaging time intervals of 7 and 5 minutes, and distance to primary highways with limited access, were found to be the three most important variables in the classification model. Leave-one-fold-out cross-validation showed an overall accuracy of 99.71%. Sensitivities varied from 84.62% (outdoor-walking) to 99.90% (indoor). Specificities varied from 96.33% (indoor) to 99.98% (outdoor-static). The exclusion of accelerometer and ambient light sensor variables caused a slight loss in sensitivity for outdoor-walking, but little loss in overall accuracy. However, leave-one-subject-out cross-validation showed considerable loss in sensitivity for the outdoor-static and outdoor-walking conditions. The random forests classification model can achieve high accuracy for the four major time-activity categories. 
The model also performed well with just GPS, road and tax parcel data. However, caution is warranted when generalizing the model developed from a small number of subjects to other populations.
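The random-forest structure described above can be sketched on synthetic data: a classifier over GPS-derived features such as maximum speed and distance to the nearest highway. The feature values, class centres, and sample sizes below are invented, not the study's data.

```python
# Illustrative random-forest time-activity classifier on synthetic
# (max speed km/h, distance-to-highway m) features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

def make_samples(n, speed_kmh, dist_m, label):
    X = np.column_stack([rng.normal(speed_kmh, speed_kmh * 0.2 + 0.1, n),
                         rng.normal(dist_m, 50, n)])
    return X, [label] * n

blocks = [make_samples(100, 0.2, 500, "indoor"),
          make_samples(100, 1.0, 300, "outdoor_static"),
          make_samples(100, 5.0, 200, "outdoor_walking"),
          make_samples(100, 60.0, 20, "in_vehicle")]
X = np.vstack([b[0] for b in blocks])
y = sum((b[1] for b in blocks), [])

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(rf.score(X, y))  # in-sample accuracy (optimistic, as the paper notes)
```

As the abstract's leave-one-subject-out results warn, in-sample accuracy like this overstates how well the model transfers to new subjects.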

  16. A high fidelity real-time simulation of a small turboshaft engine

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1988-01-01

    A high-fidelity component-type model and real-time digital simulation of the General Electric T700-GE-700 turboshaft engine were developed for use with current generation real-time blade-element rotor helicopter simulations. A control system model based on the specification fuel control system used in the UH-60A Black Hawk helicopter is also presented. The modeling assumptions and real-time digital implementation methods particular to the simulation of small turboshaft engines are described. The validity of the simulation is demonstrated by comparison with analysis-oriented simulations developed by the manufacturer, available test data, and flight-test time histories.

  17. Development of In Vitro-In Vivo Correlation for Potassium Chloride Extended Release Tablet Formulation Using Urinary Pharmacokinetic Data.

    PubMed

    Mittapalli, Rajendar K; Marroum, Patrick; Qiu, Yihong; Apfelbaum, Kathleen; Xiong, Hao

    2017-07-01

To develop and validate a Level A in vitro-in vivo correlation (IVIVC) for potassium chloride extended-release (ER) formulations. Three prototype ER formulations of potassium chloride with different in vitro release rates were developed and their urinary pharmacokinetic profiles were evaluated in healthy subjects. A mathematical model between in vitro dissolution and in vivo urinary excretion, a surrogate for measuring in vivo absorption, was developed using time-scale and time-shift parameters. The IVIVC model was then validated based on internal and external predictability. With the established IVIVC model, there was a good correlation between the observed fraction of dose excreted in urine and the time-scaled and time-shifted fraction of the drug dissolved, and between the in vitro dissolution time and the in vivo urinary excretion time for the ER formulations. The percent prediction error (%PE) on cumulative urinary excretion over the 24 h interval (Ae0-24h) and maximum urinary excretion rate (Rmax) was less than 15% for the individual formulations and less than 10% for the average of the two formulations used to develop the model. Further, the %PE values using external predictability were below 10%. A novel Level A IVIVC was successfully developed and validated for the new potassium chloride ER formulations using urinary pharmacokinetic data. This successful IVIVC may facilitate future development or manufacturing changes to the potassium chloride ER formulation.
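A hedged sketch of the time-scale/time-shift idea: one simple form maps in vivo time onto in vitro dissolution time as t_vitro = a·t_vivo + b, with a (scale) and b (shift) estimated by linear regression of matched times at equal fractions dissolved/excreted. The times below are invented; the paper's actual model form may differ.

```python
# Illustrative time-scaling/time-shifting fit between hypothetical
# in vitro and in vivo times (h) at which 20/40/60/80% is reached.
import numpy as np

t_vitro = np.array([1.0, 2.2, 3.9, 6.1])   # invented dissolution times
t_vivo = np.array([2.0, 4.5, 8.0, 12.3])   # invented excretion times

a, b = np.polyfit(t_vivo, t_vitro, 1)      # time-scale a, time-shift b
print(a, b)

t_scaled = a * t_vivo + b                  # in vivo time mapped to in vitro
print(np.round(t_scaled, 2))               # ~ t_vitro if the relation is linear
```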

  18. Predicting seed dormancy loss and germination timing for Bromus tectorum in a semi-arid environment using hydrothermal time models

    Treesearch

    Susan E. Meyer; Phil S. Allen

    2009-01-01

    A principal goal of seed germination modelling for wild species is to predict germination timing under fluctuating field conditions. We coupled our previously developed hydrothermal time, thermal and hydrothermal afterripening time, and hydration-dehydration models for dormancy loss and germination with field seed zone temperature and water potential measurements from...
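A minimal hydrothermal-time sketch (an assumed simplified form, not the authors' fitted model): germination progress accrues as hydrothermal time theta_HT = (psi - psi_b)·(T - T_b)·t, so the predicted time to germinate is t = theta_HT / ((psi - psi_b)·(T - T_b)) when both the water potential and temperature are above their base thresholds. Parameter values below are invented.

```python
# Simplified hydrothermal-time calculation; parameters are illustrative.

def germination_time(theta_ht, psi, psi_b, temp, temp_b):
    """Predicted days to germination; None if conditions are sub-threshold."""
    hydro = psi - psi_b          # water potential above base (MPa)
    thermal = temp - temp_b      # temperature above base (deg C)
    if hydro <= 0 or thermal <= 0:
        return None              # no progress toward germination
    return theta_ht / (hydro * thermal)

# Hypothetical theta_HT = 60 MPa-degree-days
print(germination_time(60.0, -0.5, -1.0, 15.0, 0.0))  # 60/(0.5*15) = 8.0
print(germination_time(60.0, -1.2, -1.0, 15.0, 0.0))  # None (too dry)
```

Under fluctuating field conditions, the same accumulation is done incrementally over measured seed-zone temperature and water-potential time series rather than with constants.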

  19. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

Modern, model-based river quality management strongly relies on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method to set up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFR) and Continuously Stirred Tank Reactors (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (a factor of 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
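The conceptual-reservoir idea can be sketched as a chain of CSTRs with first-order decay, stepped with explicit Euler. The residence times and decay rate below are invented; in the paper they are derived from hydrodynamic results and calibrated to the detailed water quality simulations.

```python
# Illustrative CSTR-chain river model: steady upstream input, first-order
# decay, dC/dt = (C_upstream - C)/tau - k*C per reservoir.
import numpy as np

def cstr_chain(c_in, residence_times, k_decay, dt=0.01, t_end=50.0):
    """Returns the (near steady-state) concentration in each reservoir."""
    c = np.zeros(len(residence_times))
    for _ in range(int(t_end / dt)):
        upstream = np.concatenate(([c_in], c[:-1]))
        c += dt * ((upstream - c) / np.array(residence_times) - k_decay * c)
    return c

conc = cstr_chain(c_in=10.0, residence_times=[1.0, 2.0, 1.5], k_decay=0.3)
print(np.round(conc, 3))  # concentrations decline downstream
```

At steady state the first reservoir satisfies C1 = C_in / (1 + k·tau1), i.e. 10/1.3 ≈ 7.69 here, and each further reservoir attenuates the signal again.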

  20. Real-Time Onboard Global Nonlinear Aerodynamic Modeling from Flight Data

    NASA Technical Reports Server (NTRS)

    Brandon, Jay M.; Morelli, Eugene A.

    2014-01-01

    Flight test and modeling techniques were developed to accurately identify global nonlinear aerodynamic models onboard an aircraft. The techniques were developed and demonstrated during piloted flight testing of an Aermacchi MB-326M Impala jet aircraft. Advanced piloting techniques and nonlinear modeling techniques based on fuzzy logic and multivariate orthogonal function methods were implemented with efficient onboard calculations and flight operations to achieve real-time maneuver monitoring and analysis, and near-real-time global nonlinear aerodynamic modeling and prediction validation testing in flight. Results demonstrated that global nonlinear aerodynamic models for a large portion of the flight envelope were identified rapidly and accurately using piloted flight test maneuvers during a single flight, with the final identified and validated models available before the aircraft landed.

  1. Real time wind farm emulation using SimWindFarm toolbox

    NASA Astrophysics Data System (ADS)

    Topor, Marcel

    2016-06-01

This paper presents a wind farm emulation solution using an open source Matlab/Simulink toolbox and the National Instruments cRIO platform. This work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg University, Denmark. Using the Matlab Simulink models developed in SWF, the modeling code can be exported to a real-time model using the NI VeriStand model framework, and the resulting code is integrated as hardware-in-the-loop control on the NI 9068 platform.

  2. Development and Validation of a New Air Carrier Block Time Prediction Model and Methodology

    NASA Astrophysics Data System (ADS)

    Litvay, Robyn Olson

Commercial airline operations rely on predicted block times as the foundation for critical, successive decisions that include fuel purchasing, crew scheduling, and airport facility usage planning. Small inaccuracies in the predicted block times have the potential to result in huge financial losses and, with profit margins for airline operations currently almost nonexistent, potentially negate any possible profit. Although optimization techniques have resulted in many models targeting airline operations, the challenge of accurately predicting and quantifying variables months in advance remains elusive. The objective of this work is the development of an airline block time prediction model and methodology that is practical, easily implemented, and easily updated. Actual U.S. domestic flight data from a major airline were utilized to develop a model that predicts airline block times with increased accuracy and smaller variance of the actual times from the predicted times. This reduction in variance represents tens of millions of dollars (U.S.) per year in operational cost savings for an individual airline. A new methodology for block time prediction is constructed using a regression model as the base, as it has both deterministic and probabilistic components, together with historic block time distributions. The estimation of block times for commercial domestic airline operations requires a probabilistic, general model that can be easily customized for a specific airline's network. As individual block times vary by season, by day, and by time of day, the challenge is to make general, long-term estimations representing the average actual block times while minimizing the variation. Predictions of block times for the third-quarter months of July and August of 2011 were calculated using this new model. 
The resulting actual block times were obtained from the Research and Innovative Technology Administration, Bureau of Transportation Statistics (Airline On-time Performance Data, 2008-2011) for comparison and analysis. Future block times are shown to be predicted with greater accuracy, without exception and network-wide, for a major U.S. domestic airline.
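A hedged sketch of combining the two components named above: a deterministic regression base for the mean block time plus a percentile drawn from the historic distribution of residuals to pad the schedule. All data, feature names, and the 70th-percentile choice are invented for illustration.

```python
# Illustrative block-time prediction: regression mean + historic-residual
# percentile as schedule padding. Synthetic data, invented effect sizes.
import numpy as np

rng = np.random.default_rng(3)
n = 500
distance_nm = rng.uniform(200, 2000, n)          # stage length
departure_hr = rng.uniform(6, 22, n)             # local departure hour
true_block = 0.08 * distance_nm + 2.0 * (departure_hr > 16) + 25.0
actual = true_block + rng.gamma(2.0, 3.0, n)     # right-skewed delays (min)

X = np.column_stack([distance_nm, departure_hr > 16, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, actual, rcond=None)
predicted = X @ coef

# Scheduled block = regression mean + 70th percentile of historic residuals
pad = np.percentile(actual - predicted, 70)
scheduled = predicted + pad
print(np.mean(actual <= scheduled))  # roughly 0.70 on-time by construction
```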

  3. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer high computation cost for simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo Methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
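The Sequential Monte Carlo assimilation step can be sketched with a minimal bootstrap particle filter: particles carry an occupant count, a random-walk transition moves people in and out, and noisy count-sensor readings reweight and resample the particles. All dynamics and noise levels here are invented; the paper's graph-based agent-oriented model is far richer.

```python
# Toy bootstrap particle filter for a single room's occupancy.
import numpy as np

rng = np.random.default_rng(0)
n_particles, steps, true_count = 2000, 30, 12

particles = rng.integers(0, 40, n_particles).astype(float)
for _ in range(steps):
    # Transition: occupancy drifts by a small random amount
    particles = np.clip(particles + rng.normal(0, 1.0, n_particles), 0, None)
    # Observation: sensor reads the true count with Gaussian noise (sd = 3)
    z = true_count + rng.normal(0, 3.0)
    weights = np.exp(-0.5 * ((z - particles) / 3.0) ** 2)
    weights /= weights.sum()
    # Multinomial resampling (systematic resampling is the usual refinement)
    idx = rng.choice(n_particles, n_particles, p=weights)
    particles = particles[idx]

estimate = particles.mean()
print(round(estimate, 1))  # close to the true count of 12
```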

  4. Interactions of timing and prediction error learning.

    PubMed

    Kirkpatrick, Kimberly

    2014-01-01

Timing and prediction error learning have historically been treated as independent processes, but growing evidence has indicated that they are not orthogonal. Timing emerges at the earliest time point when conditioned responses are observed, and temporal variables modulate prediction error learning in both simple conditioning and cue competition paradigms. In addition, prediction errors, through changes in reward magnitude or value, alter the timing of behavior. Thus, there appears to be a bi-directional interaction between timing and prediction error learning. Modern theories have attempted to integrate the two processes with mixed success. A neurocomputational approach to theory development is espoused, which draws on neurobiological evidence to guide and constrain computational model development. Heuristics for future model development are presented with the goal of sparking new approaches to theory development in the timing and prediction error fields. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Bayesian spatiotemporal crash frequency models with mixture components for space-time interactions.

    PubMed

    Cheng, Wen; Gill, Gurdiljot Singh; Zhang, Yongping; Cao, Zhong

    2018-03-01

    The traffic safety research has developed spatiotemporal models to explore the variations in the spatial pattern of crash risk over time. Many studies observed notable benefits associated with the inclusion of spatial and temporal correlation and their interactions. However, the safety literature lacks sufficient research for the comparison of different temporal treatments and their interaction with spatial component. This study developed four spatiotemporal models with varying complexity due to the different temporal treatments such as (I) linear time trend; (II) quadratic time trend; (III) Autoregressive-1 (AR-1); and (IV) time adjacency. Moreover, the study introduced a flexible two-component mixture for the space-time interaction which allows greater flexibility compared to the traditional linear space-time interaction. The mixture component allows the accommodation of global space-time interaction as well as the departures from the overall spatial and temporal risk patterns. This study performed a comprehensive assessment of mixture models based on the diverse criteria pertaining to goodness-of-fit, cross-validation and evaluation based on in-sample data for predictive accuracy of crash estimates. The assessment of model performance in terms of goodness-of-fit clearly established the superiority of the time-adjacency specification which was evidently more complex due to the addition of information borrowed from neighboring years, but this addition of parameters allowed significant advantage at posterior deviance which subsequently benefited overall fit to crash data. The Base models were also developed to study the comparison between the proposed mixture and traditional space-time components for each temporal model. The mixture models consistently outperformed the corresponding Base models due to the advantages of much lower deviance. 
For cross-validation comparison of predictive accuracy, linear time trend model was adjudged the best as it recorded the highest value of log pseudo marginal likelihood (LPML). Four other evaluation criteria were considered for typical validation using the same data for model development. Under each criterion, observed crash counts were compared with three types of data containing Bayesian estimated, normal predicted, and model replicated ones. The linear model again performed the best in most scenarios except one case of using model replicated data and two cases involving prediction without including random effects. These phenomena indicated the mediocre performance of linear trend when random effects were excluded for evaluation. This might be due to the flexible mixture space-time interaction which can efficiently absorb the residual variability escaping from the predictable part of the model. The comparison of Base and mixture models in terms of prediction accuracy further bolstered the superiority of the mixture models as the mixture ones generated more precise estimated crash counts across all four models, suggesting that the advantages associated with mixture component at model fit were transferable to prediction accuracy. Finally, the residual analysis demonstrated the consistently superior performance of random effect models which validates the importance of incorporating the correlation structures to account for unobserved heterogeneity. Copyright © 2017 Elsevier Ltd. All rights reserved.
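The simplest temporal treatment compared above, a linear time trend in a Poisson crash-frequency model (with no spatial or interaction terms), can be sketched as follows. The data are simulated, and sklearn's PoissonRegressor stands in for the Bayesian estimation used in the paper.

```python
# Minimal Poisson crash-frequency model with a linear time trend.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(5)
years = np.arange(10)
# Simulated site-year crash counts with a mild downward trend (invented)
lam = np.exp(2.0 - 0.08 * years)            # expected counts per year
counts = rng.poisson(np.tile(lam, 50))      # 50 sites sharing the trend

X = np.tile(years, 50).reshape(-1, 1).astype(float)
model = PoissonRegressor(alpha=0.0).fit(X, counts)
print(model.intercept_, model.coef_)        # near (2.0, -0.08)
```

The paper's models add spatial random effects, space-time interactions, and mixture components on top of this kind of trend term.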

  6. Real-time individualization of the unified model of performance.

    PubMed

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited for mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects subjected to 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use in a mobile computing platform. © 2017 European Sleep Research Society.
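The real-time individualization idea can be illustrated with a scalar Kalman filter: a single trait-like parameter (here, a hypothetical baseline response speed) is re-estimated each time a new noisy PVT-like measurement arrives. The UMP itself has several parameters and a nonlinear structure, hence the paper's extended (linearized) Kalman filter; this toy keeps everything linear and scalar.

```python
# Toy scalar Kalman filter adapting one parameter online; all values
# (true parameter, noise levels, priors) are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
true_param, obs_sd = 4.2, 0.8

est, var = 3.0, 4.0                 # prior mean and variance
history = []
for _ in range(40):                 # one update per new measurement
    z = true_param + rng.normal(0, obs_sd)
    var += 0.001                    # small process noise: parameter may drift
    k = var / (var + obs_sd**2)     # Kalman gain
    est += k * (z - est)            # measurement update
    var *= (1 - k)
    history.append(est)

print(round(history[-1], 2))        # converges toward 4.2
```

As in the paper, fewer or sparser measurements simply slow this convergence rather than breaking the update rule.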

  7. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2016-01-01

This presentation documents Kennedy Space Center's Independent Assessment work on three assessments for the Ground Systems Development and Operations (GSDO) Program, completed to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews, and provides the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine whether a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.

  8. A whole-body physiologically based pharmacokinetic (WB-PBPK) model of ciprofloxacin: a step towards predicting bacterial killing at sites of infection.

    PubMed

    Sadiq, Muhammad W; Nielsen, Elisabet I; Khachman, Dalia; Conil, Jean-Marie; Georges, Bernard; Houin, Georges; Laffont, Celine M; Karlsson, Mats O; Friberg, Lena E

    2017-04-01

The purpose of this study was to develop a whole-body physiologically based pharmacokinetic (WB-PBPK) model of ciprofloxacin for ICU patients, based on plasma concentration data alone. In a subsequent step, tissue and organ concentration-time profiles in patients were predicted using the developed model. The WB-PBPK model was built using a non-linear mixed effects approach based on data from 102 adult intensive care unit patients. Tissue-to-plasma distribution coefficients (Kp) were available from the literature and used as informative priors. The developed WB-PBPK model successfully characterized both the typical trends and variability of the available ciprofloxacin plasma concentration data. The WB-PBPK model was thereafter combined with a pharmacokinetic-pharmacodynamic (PKPD) model, developed from in vitro time-kill data of ciprofloxacin and Escherichia coli, to illustrate the potential of this type of approach to predict the time-course of bacterial killing at different sites of infection. The predicted unbound concentration-time profile in extracellular tissue drove the bacterial killing in the PKPD model, and the rate and extent of take-over of mutant bacteria in different tissues were explored. The bacterial killing was predicted to be most efficient in lung and kidney, which corresponds well to ciprofloxacin's indications of pneumonia and urinary tract infections. Furthermore, a function based on available information on bacterial killing by the immune system in vivo was incorporated. This work demonstrates the development and application of a WB-PBPK-PD model to compare killing of bacteria with different antibiotic susceptibility, of value for drug development and the optimal use of antibiotics.
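A hedged sketch of coupling a drug concentration profile to a bacterial killing model: here a one-compartment IV-bolus PK model (not the paper's WB-PBPK) drives an Emax killing term in dN/dt = (k_growth - k_max·C/(EC50 + C))·N. Every parameter value below is invented for illustration.

```python
# Toy PK-PD coupling: exponential concentration decay drives Emax killing.
import numpy as np

def simulate(dose=400.0, vd=100.0, ke=0.3, k_growth=0.7,
             k_max=4.0, ec50=0.5, n0=1e6, dt=0.01, t_end=12.0):
    t = np.arange(0.0, t_end, dt)
    conc = (dose / vd) * np.exp(-ke * t)       # mg/L after an IV bolus
    log_n = np.log(n0)
    for c in conc:
        kill = k_max * c / (ec50 + c)          # Emax killing term
        log_n += dt * (k_growth - kill)        # integrate in log space
    return np.exp(log_n)

n_treated = simulate()
n_control = simulate(dose=0.0)
print(f"{n_treated:.3e} vs {n_control:.3e}")   # treated << control
```

In the paper, the driving concentration is the predicted unbound extracellular profile at each infection site, so the same PD model yields site-specific killing curves.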

  9. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…

  10. Nonlinear Maps for Design of Discrete Time Models of Neuronal Network Dynamics

    DTIC Science & Technology

    2016-02-29

Performance/Technical report, 02-01-2016 to 02-29-2016. ...a neuronal model in the form of difference equations that generates neuronal states in discrete moments of time. In this approach, the time step can be made... We propose to use modern DSP ideas to develop new efficient approaches to the design of such discrete-time models for studies of large-scale neuronal...

  11. A New Design Method of Automotive Electronic Real-time Control System

    NASA Astrophysics Data System (ADS)

    Zuo, Wenying; Li, Yinguo; Wang, Fengjuan; Hou, Xiaobo

The structure and functionality of automotive electronic control systems are becoming increasingly complex, and the traditional manual-programming development mode can no longer satisfy development needs. To meet the demands for diversity and rapid development of real-time control systems, this paper proposes a new design method for automotive electronic control systems based on Simulink/RTW, combining a model-based design approach with automatic code generation technology. First, the control algorithms are designed and a control system model is built in Matlab/Simulink; embedded code is then generated automatically by RTW, and the automotive real-time control system is developed in an OSEK/VDX operating system environment. The new development mode can significantly shorten the development cycle of automotive electronic control systems, improve the portability, reusability, and scalability of the resulting programs, and has practical value for the development of real-time control systems.

  12. HIGH TIME-RESOLVED COMPARISONS FOR IN-DEPTH PROBING OF CMAQ FINE-PARTICLES AND GAS PREDICTIONS

    EPA Science Inventory

Model evaluation is important to develop confidence in models and an understanding of their predictions. Most comparisons in the U.S. involve time-integrated measurements of 24 hours or longer. Comparisons against continuous or semi-continuous particle and gaseous measur...

  13. Functional Fault Modeling Conventions and Practices for Real-Time Fault Isolation

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Brown, Barbara

    2010-01-01

    The purpose of this paper is to present the conventions, best practices, and processes that were established based on the prototype development of a Functional Fault Model (FFM) for a Cryogenic System that would be used for real-time Fault Isolation in a Fault Detection, Isolation, and Recovery (FDIR) system. The FDIR system is envisioned to perform health management functions for both a launch vehicle and the ground systems that support the vehicle during checkout and launch countdown by using a suite of complementary software tools that alert operators to anomalies and failures in real-time. The FFMs were created offline but would eventually be used by a real-time reasoner to isolate faults in a Cryogenic System. Through their development and review, a set of modeling conventions and best practices was established. The prototype FFM development also provided a pathfinder for future FFM development processes. This paper documents the rationale and considerations for robust FFMs that can easily be transitioned to a real-time operating environment.

  14. Development of cost-effective pavement treatment selection and treatment performance models : [tech summary].

    DOT National Transportation Integrated Search

    2015-09-01

    The overall objective of this study was to develop pavement treatment performance models in support of cost-effective selection of pavement treatment type, project boundaries, and time of treatment. The development of the proposed models was ba...

  15. Development of a time-trend model for analyzing and predicting case-pattern of Lassa fever epidemics in Liberia, 2013-2017.

    PubMed

    Olugasa, Babasola O; Odigie, Eugene A; Lawani, Mike; Ojo, Johnson F

    2015-01-01

    The objective was to develop a case-pattern model for Lassa fever (LF) among humans and derive predictors of the time-trend point distribution of LF cases in Liberia, in view of the prevailing under-reporting and the public health challenge posed by the disease in the country. Retrospective data covering five years of countrywide LF distribution among humans were used to train a time-trend model of the disease in Liberia. A time-trend quadratic model was selected for its goodness-of-fit (R2 = 0.89, P < 0.05) and best performance compared with linear and exponential models. Parameter predictors were estimated by the least-squares method to predict LF cases for a prospective 5-year period covering 2013-2017. The two-stage predictive model of LF case-pattern between 2013 and 2017 was characterized by a prospective decline within the South-coast County of Grand Bassa over the forecast period and an upward case-trend within the Northern County of Nimba. A case-specific exponential increase was predicted for the first 2 years (2013-2014), with a geometric increase over the next 3 years (2015-2017) in Nimba County. This paper describes a translational application of the space-time distribution pattern of LF epidemics reported in Liberia during 2008-2012, on which a predictive model was developed. We propose a computationally feasible two-stage space-time permutation approach to estimate the time-trend parameters and conduct predictive inference on LF in Liberia.
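
    A quadratic time-trend fit of the kind described above can be reproduced with ordinary least squares. The sketch below (Python, using hypothetical annual case counts, not the Liberian data) solves the normal equations for y = b0 + b1·t + b2·t² and extrapolates one step ahead:

```python
def fit_quadratic(ts, ys):
    """Least-squares fit of y = b0 + b1*t + b2*t**2 via the normal equations."""
    # Build the 3x3 system A @ b = c from power sums of t.
    A = [[sum(t ** (i + j) for t in ts) for j in range(3)] for i in range(3)]
    c = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for k in range(col, 3):
                A[r][k] -= f * A[col][k]
            c[r] -= f * c[col]
    # Back substitution.
    b = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        b[r] = (c[r] - sum(A[r][k] * b[k] for k in range(r + 1, 3))) / A[r][r]
    return b

# Hypothetical annual case counts for years t = 0..4 (illustration only).
years = [0, 1, 2, 3, 4]
cases = [12, 18, 30, 48, 72]
b0, b1, b2 = fit_quadratic(years, cases)
forecast_t5 = b0 + b1 * 5 + b2 * 5 ** 2  # one-step-ahead extrapolation
```

    In practice the fitted trend would be judged on R2 and residual diagnostics before being used for prospective forecasting, as the study does.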

  16. A comparative study of generalized linear mixed modelling and artificial neural network approach for the joint modelling of survival and incidence of Dengue patients in Sri Lanka

    NASA Astrophysics Data System (ADS)

    Hapugoda, J. C.; Sooriyarachchi, M. R.

    2017-09-01

    Survival time of patients with a disease and the incidence of that disease (a count) are frequently observed in medical studies with data of a clustered nature. In many cases, the survival times and the count can be correlated such that diseases that occur rarely have shorter survival times, or vice versa. Because of this, jointly modelling these two variables provides more interesting, and certainly improved, results than modelling them separately. The authors previously proposed a methodology using Generalized Linear Mixed Models (GLMM), joining the Discrete Time Hazard model with the Poisson Regression model to jointly model survival and count. As the Artificial Neural Network (ANN) has become one of the most powerful computational tools for modelling complex non-linear systems, it was proposed to develop a new joint model of survival and count of Dengue patients in Sri Lanka using that approach. Thus, the objective of this study was to develop a model using the ANN approach and compare the results with the previously developed GLMM model. As the response variables are continuous in nature, the Generalized Regression Neural Network (GRNN) approach was adopted to model the data. To compare model fit, measures such as root mean square error (RMSE), absolute mean error (AME), and the correlation coefficient (R) were used. The measures indicate that the GRNN model fits the data better than the GLMM model.

  17. Verification of models for ballistic movement time and endpoint variability.

    PubMed

    Lin, Ray F; Drury, Colin G

    2013-01-01

    A hand control movement is composed of several ballistic movements. The time required to perform a ballistic movement and its endpoint variability are two important properties in developing movement models. The purpose of this study was to test potential models for predicting these two properties. Twelve participants conducted ballistic movements of specific amplitudes using a drawing tablet. The measured movement times and endpoint variability were then used to verify the models. Hoffmann and Gan's movement time model (Hoffmann, 1981; Gan and Hoffmann, 1988) predicted more than 90.7% of data variance across 84 individual measurements. A new, theoretically developed ballistic movement variability model proved better than Howarth, Beggs, and Bowden's (1971) model, predicting on average 84.8% of stopping-variable error and 88.3% of aiming-variable error. These two validated models will help build solid theoretical movement models and evaluate input devices. This article provides better models for predicting the end accuracy and movement time of ballistic movements, which are desirable in rapid aiming tasks such as keying in numbers on a smartphone. The models allow better design of aiming tasks, for example button sizes on mobile phones for different user populations.
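
    The ballistic movement time model cited above is commonly summarised as a square-root law in movement amplitude, MT = a + b·√A. A minimal sketch, with illustrative regression constants rather than the values fitted in this study:

```python
import math

def ballistic_mt(amplitude_mm, a=68.0, b=12.0):
    """Square-root law for ballistic movement time: MT = a + b*sqrt(A).

    a (ms) and b (ms/mm^0.5) are illustrative constants; in practice they
    are fitted to each participant's or population's measured data.
    """
    return a + b * math.sqrt(amplitude_mm)
```

    The key qualitative prediction is that movement time grows with the square root of amplitude, more slowly than the logarithmic growth of Fitts-style visually controlled movements.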

  18. Pan-European stochastic flood event set

    NASA Astrophysics Data System (ADS)

    Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav

    2017-04-01

    Impact Forecasting (IF), the model development center of Aon Benfield, has been developing a large suite of catastrophe flood models on probabilistic bases for individual countries in Europe. Such natural catastrophes do not follow national boundaries: for example, the major flood in 2016 was responsible for Europe's largest insured loss of USD 3.4bn and affected Germany, France, Belgium, Austria, and parts of several other countries. Reflecting such needs, IF initiated a pan-European flood event set development which combines cross-country exposures with country-based loss distributions to provide more insightful data to re/insurers. Because observed discharge data are not available across the whole of Europe in sufficient quantity and quality to permit detailed loss evaluation, a top-down approach was chosen. This approach is based on simulating precipitation from a GCM/RCM model chain, followed by a calculation of discharges using rainfall-runoff modelling. IF set up this project in close collaboration with Karlsruhe Institute of Technology (KIT) regarding the precipitation estimates and with the University of East Anglia (UEA) in terms of the rainfall-runoff modelling. KIT's main objective is to provide high-resolution daily historical and stochastic time series of key meteorological variables. A purely dynamical downscaling approach with the regional climate model COSMO-CLM (CCLM) is used to generate the historical time series, using re-analysis data as boundary conditions. The resulting time series are validated against the gridded observational dataset E-OBS, and different bias-correction methods are employed. The generation of the stochastic time series requires transfer functions between large-scale atmospheric variables and regional temperature and precipitation fields.
These transfer functions are developed for the historical time series using reanalysis data as predictors and bias-corrected CCLM simulated precipitation and temperature as predictands. Finally, the transfer functions are applied to a large ensemble of GCM simulations with forcing corresponding to present day climate conditions to generate highly resolved stochastic time series of precipitation and temperature for several thousand years. These time series form the input for the rainfall-runoff model developed by the UEA team. It is a spatially distributed model adapted from the HBV model and will be calibrated for individual basins using historical discharge data. The calibrated model will be driven by the precipitation time series generated by the KIT team to simulate discharges at a daily time step. The uncertainties in the simulated discharges will be analysed using multiple model parameter sets. A number of statistical methods will be used to assess return periods, changes in the magnitudes, changes in the characteristics of floods such as time base and time to peak, and spatial correlations of large flood events. The Pan-European flood stochastic event set will permit a better view of flood risk for market applications.

  19. Bioeconomic of profit maximization of red tilapia (Oreochromis sp.) culture using polynomial growth model

    NASA Astrophysics Data System (ADS)

    Wijayanto, D.; Kurohman, F.; Nugroho, RA

    2018-03-01

    The purpose of this research was to develop a bioeconomic model of profit maximization that can be applied to red tilapia culture. The fish growth model used a polynomial growth function, and profit maximization set the first derivative of the profit equation with respect to culture time equal to zero. The research also developed equations to estimate the culture time needed to reach the target harvest size. The model proved applicable to red tilapia culture: in the case studied, red tilapia culture achieves its maximum profit at 584 days, with a profit of Rp. 28,605,731 per culture cycle. If a target size of 250 g is used, red tilapia culture needs 82 days.
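
    The optimisation step described above has a closed form when growth is quadratic: with W(t) = w0 + g1·t + g2·t² and profit(t) = p·n·W(t) − fixed − cost_rate·t, setting the first derivative to zero gives t* directly. A sketch with hypothetical parameters (not the paper's fitted values or Rp figures):

```python
import math

def weight(t, w0, g1, g2):
    """Polynomial (quadratic) growth model W(t) = w0 + g1*t + g2*t**2, in grams."""
    return w0 + g1 * t + g2 * t * t

def optimal_harvest_time(p, n, g1, g2, cost_rate):
    """Solve d(profit)/dt = p*n*(g1 + 2*g2*t) - cost_rate = 0 for t.

    p: price per gram, n: number of fish, cost_rate: variable cost per day.
    Requires g2 < 0 so the stationary point is a maximum.
    """
    return (cost_rate / (p * n) - g1) / (2 * g2)

def time_to_target(w0, g1, g2, target):
    """Smaller positive root of W(t) = target (quadratic formula)."""
    a, b, c = g2, g1, w0 - target
    disc = math.sqrt(b * b - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    return min(r for r in roots if r > 0)

# Hypothetical culture parameters for illustration only.
t_star = optimal_harvest_time(p=0.05, n=1000, g1=2.0, g2=-0.002,
                              cost_rate=40.0)  # about 300 days here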

  20. Development and evaluation of a reservoir model for the Chain of Lakes in Illinois

    USGS Publications Warehouse

    Domanski, Marian M.

    2017-01-27

    Forecasts of flows entering and leaving the Chain of Lakes reservoir on the Fox River in northeastern Illinois are critical information to water-resource managers who determine the optimal operation of the dam at McHenry, Illinois, to help minimize damages to property and loss of life because of flooding on the Fox River. In 2014, the U.S. Geological Survey; the Illinois Department of Natural Resources, Office of Water Resources; and National Weather Service, North Central River Forecast Center began a cooperative study to develop a system to enable engineers and planners to simulate and communicate flows and to prepare proactively for precipitation events in near real time in the upper Fox River watershed. The purpose of this report is to document the development and evaluation of the Chain of Lakes reservoir model developed in this study.The reservoir model for the Chain of Lakes was developed using the Hydrologic Engineering Center–Reservoir System Simulation program. Because of the complex relation between the dam headwater and reservoir pool elevations, the reservoir model uses a linear regression model that relates dam headwater elevation to reservoir pool elevation. The linear regression model was developed using 17 U.S. Geological Survey streamflow measurements, along with the gage height in the reservoir pool and the gage height at the dam headwater. The Nash-Sutcliffe model efficiency coefficients for all three linear regression model variables ranged from 0.90 to 0.98.The reservoir model performance was evaluated by graphically comparing simulated and observed reservoir pool elevation time series during nine periods of high pool elevation. In addition, the peak elevations during these time periods were graphically compared to the closest-in-time observed pool elevation peak. The mean difference in the simulated and observed peak elevations was -0.03 feet, with a standard deviation of 0.19 feet. 
The Nash-Sutcliffe coefficient for peak prediction was calculated as 0.94. Evaluation of the model based on accuracy of peak prediction and the ability to simulate an elevation time series showed the performance of the model was satisfactory.

  1. A Kinetic Model Describing Injury-Burden in Team Sports.

    PubMed

    Fuller, Colin W

    2017-12-01

    Injuries in team sports are normally characterised by the incidence, severity, and location and type of injuries sustained: these measures, however, do not provide an insight into the variable injury-burden experienced during a season. Injury burden varies according to the team's match and training loads, the rate at which injuries are sustained and the time taken for these injuries to resolve. At the present time, this time-based variation of injury burden has not been modelled. To develop a kinetic model describing the time-based injury burden experienced by teams in elite team sports and to demonstrate the model's utility. Rates of injury were quantified using a large eight-season database of rugby injuries (5253) and exposure (60,085 player-match-hours) in English professional rugby. Rates of recovery from injury were quantified using time-to-recovery analysis of the injuries. The kinetic model proposed for predicting a team's time-based injury burden is based on a composite rate equation developed from the incidence of injury, a first-order rate of recovery from injury and the team's playing load. The utility of the model was demonstrated by examining common scenarios encountered in elite rugby. The kinetic model developed describes and predicts the variable injury-burden arising from match play during a season of rugby union based on the incidence of match injuries, the rate of recovery from injury and the playing load. The model is equally applicable to other team sports and other scenarios.

  2. Development of a Clinical Forecasting Model to Predict Comorbid Depression Among Diabetes Patients and an Application in Depression Screening Policy Making.

    PubMed

    Jin, Haomiao; Wu, Shinyi; Di Capua, Paul

    2015-09-03

    Depression is a common but often undiagnosed comorbid condition of people with diabetes. Mass screening can detect undiagnosed depression but may require significant resources and time. The objectives of this study were 1) to develop a clinical forecasting model that predicts comorbid depression among patients with diabetes and 2) to evaluate a model-based screening policy that saves resources and time by screening only patients considered as depressed by the clinical forecasting model. We trained and validated 4 machine learning models by using data from 2 safety-net clinical trials; we chose the one with the best overall predictive ability as the ultimate model. We compared model-based policy with alternative policies, including mass screening and partial screening, on the basis of depression history or diabetes severity. Logistic regression had the best overall predictive ability of the 4 models evaluated and was chosen as the ultimate forecasting model. Compared with mass screening, the model-based policy can save approximately 50% to 60% of provider resources and time but will miss identifying about 30% of patients with depression. Partial-screening policy based on depression history alone found only a low rate of depression. Two other heuristic-based partial screening policies identified depression at rates similar to those of the model-based policy but cost more in resources and time. The depression prediction model developed in this study has compelling predictive ability. By adopting the model-based depression screening policy, health care providers can use their resources and time better and increase their efficiency in managing their patients with depression.

  3. Direct use of linear time-domain aerodynamics in aeroservoelastic analysis: Aerodynamic model

    NASA Technical Reports Server (NTRS)

    Woods, J. A.; Gilbert, Michael G.

    1990-01-01

    The work presented here is the first part of a continuing effort to expand existing capabilities in aeroelasticity by developing the methodology which is necessary to utilize unsteady time-domain aerodynamics directly in aeroservoelastic design and analysis. The ultimate objective is to define a fully integrated state-space model of an aeroelastic vehicle's aerodynamics, structure and controls which may be used to efficiently determine the vehicle's aeroservoelastic stability. Here, the current status of developing a state-space model for linear or near-linear time-domain indicial aerodynamic forces is presented.

  4. Evaluation of Lightning Induced Effects in a Graphite Composite Fairing Structure. Parts 1 and 2

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Stanley, James E.; Wahid, Parveen F.

    2011-01-01

    Defining the electromagnetic environment inside a graphite composite fairing due to lightning is of interest to spacecraft developers. This paper is the first in a two part series and studies the shielding effectiveness of a graphite composite model fairing using derived equivalent properties. A frequency domain Method of Moments (MoM) model is developed and comparisons are made with shielding test results obtained using a vehicle-like composite fairing. The comparison results show that the analytical models can adequately predict the test results. Both measured and model data indicate that graphite composite fairings provide significant attenuation to magnetic fields as frequency increases. Diffusion effects are also discussed. Part 2 examines the time domain based effects through the development of a loop based induced field testing and a Transmission-Line-Matrix (TLM) model is developed in the time domain to study how the composite fairing affects lightning induced magnetic fields. Comparisons are made with shielding test results obtained using a vehicle-like composite fairing in the time domain. The comparison results show that the analytical models can adequately predict the test and industry results.

  5. Development of cost-effective pavement treatment selection and treatment performance models : research project capsule.

    DOT National Transportation Integrated Search

    2010-09-10

    The overall goal of this study is to develop pavement treatment performance models in support of the : cost-effective selection of pavement treatment types, project boundaries, and time of treatment. The : development of the proposed models will be b...

  6. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists in an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in which includes a large number of modelling capabilities, data management tools and calibration capacity.

  7. 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Dame, L. T.; Chen, P. C.; Hartle, M. S.; Huang, H. T.

    1985-01-01

    The objective is to develop analytical tools capable of economically evaluating the cyclic time dependent plasticity which occurs in hot section engine components in areas of strain concentration resulting from the combination of both mechanical and thermal stresses. Three models were developed. A simple model performs time dependent inelastic analysis using the power law creep equation. The second model is the classical model of Professors Walter Haisler and David Allen of Texas A and M University. The third model is the unified model of Bodner, Partom, et al. All models were customized for linear variation of loads and temperatures with all material properties and constitutive models being temperature dependent.

  8. Application of technology developed for flight simulation at NASA. Langley Research Center

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1991-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.

  9. Use of the Box and Jenkins time series technique in traffic forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nihan, N.L.; Holmesland, K.O.

    The use of recently developed time series techniques for short-term traffic volume forecasting is examined. A data set containing monthly volumes on a freeway segment for 1968-76 is used to fit a time series model. The resultant model is used to forecast volumes for 1977. The forecast volumes are then compared with actual volumes in 1977. Time series techniques can be used to develop highly accurate and inexpensive short-term forecasts. The feasibility of using these models to evaluate the effects of policy changes or other outside impacts is considered. (1 diagram, 1 map, 14 references,2 tables)

  10. Draft Forecasts from Real-Time Runs of Physics-Based Models - A Road to the Future

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2008-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. After consultations with NOAA/SEC and with AFWA, CCMC has developed a set of tools as a first step to make real-time model output useful to forecast centers. In this presentation, we will discuss the motivation for this activity, the actions taken so far, and options for future tools from model output.

  11. Time-varying BRDFs.

    PubMed

    Sun, Bo; Sunkavalli, Kalyan; Ramamoorthi, Ravi; Belhumeur, Peter N; Nayar, Shree K

    2007-01-01

    The properties of virtually all real-world materials change with time, causing their bidirectional reflectance distribution functions (BRDFs) to be time varying. However, none of the existing BRDF models and databases take time variation into consideration; they represent the appearance of a material at a single time instance. In this paper, we address the acquisition, analysis, modeling, and rendering of a wide range of time-varying BRDFs (TVBRDFs). We have developed an acquisition system that is capable of sampling a material's BRDF at multiple time instances, with each time sample acquired within 36 sec. We have used this acquisition system to measure the BRDFs of a wide range of time-varying phenomena, which include the drying of various types of paints (watercolor, spray, and oil), the drying of wet rough surfaces (cement, plaster, and fabrics), the accumulation of dusts (household and joint compound) on surfaces, and the melting of materials (chocolate). Analytic BRDF functions are fit to these measurements and the model parameters' variations with time are analyzed. Each category exhibits interesting and sometimes nonintuitive parameter trends. These parameter trends are then used to develop analytic TVBRDF models. The analytic TVBRDF models enable us to apply effects such as paint drying and dust accumulation to arbitrary surfaces and novel materials.

  12. Neuroanatomical basis for recognition primed decision making.

    PubMed

    Hudson, Darren

    2013-01-01

    Effective decision making under time constraints is often overlooked in medical decision making. The recognition primed decision making (RPDM) model was developed by Gary Klein based on previous recognized situations to develop a satisfactory solution to the current problem. Bayes Theorem is the most popular decision making model in medicine but is limited by the need for adequate time to consider all probabilities. Unlike other decision making models, there is a potential neurobiological basis for RPDM. This model has significant implication for health informatics and medical education.

  13. Finite difference time domain (FDTD) modeling of implanted deep brain stimulation electrodes and brain tissue.

    PubMed

    Gabran, S R I; Saad, J H; Salama, M M A; Mansour, R R

    2009-01-01

    This paper demonstrates the electromagnetic modeling and simulation of an implanted Medtronic deep brain stimulation (DBS) electrode using finite difference time domain (FDTD). The model is developed using Empire XCcel and represents the electrode surrounded with brain tissue assuming homogenous and isotropic medium. The model is created to study the parameters influencing the electric field distribution within the tissue in order to provide reference and benchmarking data for DBS and intra-cortical electrode development.

  14. Stochastic model for fatigue crack size and cost effective design decisions. [for aerospace structures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Uppaluri, B.

    1975-01-01

    This paper describes a methodology for making cost effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of minimum expected cost or risk function and reliability bounds. Selections of initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals are discussed.

  15. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  16. Modeling stream fish distributions using interval-censored detection times.

    PubMed

    Ferreira, Mário; Filipe, Ana Filipa; Bardos, David C; Magalhães, Maria Filomena; Beja, Pedro

    2016-08-01

    Controlling for imperfect detection is important for developing species distribution models (SDMs). Occupancy-detection models based on the time needed to detect a species can be used to address this problem, but this is hindered when times to detection are not known precisely. Here, we extend the time-to-detection model to deal with detections recorded in time intervals and illustrate the method using a case study on stream fish distribution modeling. We collected electrofishing samples of six fish species across a Mediterranean watershed in Northeast Portugal. Based on a Bayesian hierarchical framework, we modeled the probability of water presence in stream channels, and the probability of species occupancy conditional on water presence, in relation to environmental and spatial variables. We also modeled time-to-first detection conditional on occupancy in relation to local factors, using modified interval-censored exponential survival models. Posterior distributions of occupancy probabilities derived from the models were used to produce species distribution maps. Simulations indicated that the modified time-to-detection model provided unbiased parameter estimates despite interval-censoring. There was a tendency for spatial variation in detection rates to be primarily influenced by depth and, to a lesser extent, stream width. Species occupancies were consistently affected by stream order, elevation, and annual precipitation. Bayesian P-values and AUCs indicated that all models had adequate fit and high discrimination ability, respectively. Mapping of predicted occupancy probabilities showed widespread distribution by most species, but uncertainty was generally higher in tributaries and upper reaches. The interval-censored time-to-detection model provides a practical solution to model occupancy-detection when detections are recorded in time intervals. 
This modeling framework is useful for developing SDMs while controlling for variation in detection rates, as it uses simple data that can be readily collected by field ecologists.

  17. Effect of Time Varying Gravity on DORIS processing for ITRF2013

    NASA Astrophysics Data System (ADS)

    Zelensky, N. P.; Lemoine, F. G.; Chinn, D. S.; Beall, J. W.; Melachroinos, S. A.; Beckley, B. D.; Pavlis, D.; Wimert, J.

    2013-12-01

    Computations are under way to develop a new time series of DORIS SINEX solutions to contribute to the development of the new realization of the terrestrial reference frame (c.f. ITRF2013). One of the improvements that are envisaged is the application of improved models of time-variable gravity in the background orbit modeling. At GSFC we have developed a time series of spherical harmonics to degree and order 5 (using the GOC02S model as a base), based on the processing of SLR and DORIS data to 14 satellites from 1993 to 2013. This is compared with the standard approach used in ITRF2008, based on the static model EIGEN-GL04S1 which included secular variations in only a few select coefficients. Previous work on altimeter satellite POD (c.f. TOPEX/Poseidon, Jason-1, Jason-2) has shown that the standard model is not adequate and orbit improvements are observed with application of more detailed models of time-variable gravity. In this study, we quantify the impact of TVG modeling on DORIS satellite POD, and ascertain the impact on DORIS station positions estimated weekly from 1993 to 2013. The numerous recent improvements to SLR and DORIS processing at GSFC include a more complete compliance to IERS2010 standards, improvements to SLR/DORIS measurement modeling, and improved non-conservative force modeling to DORIS satellites. These improvements will affect gravity coefficient estimates, POD, and the station solutions. Tests evaluate the impact of time varying gravity on tracking data residuals, station consistency, and the geocenter and scale reference frame parameters.

  18. An approach to developing user interfaces for space systems

    NASA Astrophysics Data System (ADS)

    Shackelford, Keith; McKinney, Karen

    1993-08-01

    Inherent weaknesses in the traditional waterfall model of software development have led to the definition of the spiral model. The spiral software development lifecycle model, however, has not been applied to NASA projects. This paper describes its use in developing real-time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.

  19. Hybrid automata models of cardiac ventricular electrophysiology for real-time computational applications.

    PubMed

    Andalam, Sidharta; Ramanna, Harshavardhan; Malik, Avinash; Roop, Parthasarathi; Patel, Nitish; Trew, Mark L

    2016-08-01

    Virtual heart models have been proposed for closed loop validation of safety-critical embedded medical devices, such as pacemakers. These models must react in real-time to off-the-shelf medical devices. Real-time performance can be obtained by implementing models in computer hardware, and methods of compiling classes of Hybrid Automata (HA) onto FPGA have been developed. Models of ventricular cardiac cell electrophysiology have been described using HA which capture the complex nonlinear behavior of biological systems. However, many models that have been used for closed-loop validation of pacemakers are highly abstract and do not capture important characteristics of the dynamic rate response. We developed a new HA model of cardiac cells which captures dynamic behavior and we implemented the model in hardware. This potentially enables modeling the heart with over 1 million dynamic cells, making the approach ideal for closed loop testing of medical devices.
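
A hybrid automaton couples discrete operating modes with continuous (here, simply timed) dynamics. As a toy illustration only, and far simpler than the HA cardiac-cell models the abstract describes, a three-mode timed automaton with made-up mode durations can be sketched as:

```python
# Toy timed automaton for a cardiac cell: resting -> excited -> refractory.
# Mode durations (2 and 5 ticks) and the pacing period are hypothetical.
def step(mode, timer, stim):
    if mode == "resting":
        return ("excited", 0) if stim else ("resting", timer + 1)
    if mode == "excited":
        return ("refractory", 0) if timer >= 2 else ("excited", timer + 1)
    # refractory: stimuli are ignored until the timer expires
    return ("resting", 0) if timer >= 5 else ("refractory", timer + 1)

mode, timer = "resting", 0
trace = []
for t in range(20):
    stim = (t % 10 == 0)          # periodic pacing stimulus
    mode, timer = step(mode, timer, stim)
    trace.append(mode)
```

Because each transition is a simple guard on a counter, this kind of structure maps naturally onto hardware, which is what makes FPGA compilation of HA models attractive.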

  20. Documentation Driven Development for Complex Real-Time Systems

    DTIC Science & Technology

    2004-12-01

    This paper presents a novel approach for the development of complex real-time systems, called the documentation-driven development (DDD) approach. This … time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main … stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real-time systems.

  1. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy, and charge throughout the battery has been developed. The computation is coupled with Faraday's law, and solutions for the species concentrations, electrical potential, and current are produced in a time-marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational and electrochemical models used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
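
The lumped, time-marching idea can be illustrated with a minimal explicit-Euler sketch in which Faraday's law links cell current to the consumption rate of the active species. All parameter values below are hypothetical, and a real model would also couple energy and charge transport:

```python
# Minimal lumped time-marching sketch: Faraday's law dn = I dt / (z F).
F = 96485.0        # Faraday constant, C/mol
z = 2              # electrons transferred per reaction (assumed)
n = 1.0            # mol of active species in the control volume (assumed)
I = 10.0           # constant discharge current, A (assumed)
dt = 60.0          # time step, s

consumed, t = 0.0, 0.0
while n > 0.5:                 # march until half the species is consumed
    dn = I * dt / (z * F)      # moles consumed this step (Faraday's law)
    n -= dn
    consumed += dn
    t += dt
# sanity check: total charge passed equals z * F * (moles consumed)
```

In the actual model the current and properties would be updated each step from the evolving composition rather than held constant.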

  2. Deriving Tools from Real-time Runs: A New CCMC Support for SEC and AFWA

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2008-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers the use of space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products, which may be useful for space weather operators. After consultations with NOAA/SEC and with AFWA, CCMC has developed a set of tools as a first step to make real-time model output useful to forecast centers. In this presentation, we will discuss the motivation for this activity, the actions taken so far, and options for future tools from model output.

  3. Refining Time-Activity Classification of Human Subjects Using the Global Positioning System

    PubMed Central

    Hu, Maogui; Li, Wei; Li, Lianfa; Houston, Douglas; Wu, Jun

    2016-01-01

    Background Detailed spatial location information is important in accurately estimating personal exposure to air pollution. The Global Positioning System (GPS) has been widely used in tracking personal paths and activities. Previous researchers have developed time-activity classification models based on GPS data, but most were developed for specific regions. An adaptive model for time-location classification could be widely applied to air pollution studies that use GPS to track individual-level time-activity patterns. Methods Time-activity data were collected for seven days using GPS loggers and accelerometers from thirteen adult participants from Southern California under free-living conditions. We developed an automated model based on random forests to classify major time-activity patterns (i.e., indoor, outdoor-static, outdoor-walking, and in-vehicle travel). Sensitivity analysis was conducted to examine the contribution of the accelerometer data and the supplemental spatial data (i.e., roadway and tax parcel data) to the accuracy of time-activity classification. Our model was evaluated using both leave-one-fold-out and leave-one-subject-out methods. Results Maximum speeds in averaging time intervals of 7 and 5 minutes, and distance to primary highways with limited access, were found to be the three most important variables in the classification model. Leave-one-fold-out cross-validation showed an overall accuracy of 99.71%. Sensitivities varied from 84.62% (outdoor walking) to 99.90% (indoor). Specificities varied from 96.33% (indoor) to 99.98% (outdoor static). The exclusion of accelerometer and ambient light sensor variables caused a slight loss in sensitivity for outdoor walking, but little loss in overall accuracy. However, leave-one-subject-out cross-validation showed considerable loss in sensitivity for the outdoor-static and outdoor-walking conditions.
Conclusions The random forests classification model can achieve high accuracy for the four major time-activity categories. The model also performed well with just GPS, road and tax parcel data. However, caution is warranted when generalizing the model developed from a small number of subjects to other populations. PMID:26919723
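
The ensemble idea behind random forests (bootstrap resampling plus majority voting over weak learners) can be sketched with the standard library alone. This is a loose toy stand-in, not the authors' model or data: it uses one synthetic "maximum speed" feature, decision stumps that threshold at the bootstrap-sample mean, and only two of the four activity classes:

```python
import random

random.seed(0)
# synthetic one-feature data: maximum speed (km/h) for two activity classes
data = [(random.uniform(0, 2), "indoor") for _ in range(50)] + \
       [(random.uniform(60, 100), "in-vehicle") for _ in range(50)]

def train_stump(sample):
    # weak learner: a single threshold placed at the sample mean of the feature
    return sum(x for x, _ in sample) / len(sample)

def train_forest(data, n_trees=25):
    stumps = []
    for _ in range(n_trees):
        boot = [random.choice(data) for _ in data]   # bootstrap resample
        stumps.append(train_stump(boot))
    return stumps

def predict(stumps, x):
    votes = sum(x > thr for thr in stumps)           # majority vote
    return "in-vehicle" if votes > len(stumps) / 2 else "indoor"

stumps = train_forest(data)
accuracy = sum(predict(stumps, x) == y for x, y in data) / len(data)
```

A real random forest grows full decision trees over random feature subsets; the bootstrap-plus-vote mechanics shown here are the part that makes the ensemble robust to individual weak learners.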

  4. Crash Frequency Modeling Using Real-Time Environmental and Traffic Data and Unbalanced Panel Data Models

    PubMed Central

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2016-01-01

    Traffic and environmental conditions (e.g., weather conditions), which frequently change with time, have a significant impact on crash occurrence. Traditional crash frequency models with large temporal scales and aggregated variables are not sufficient to capture the time-varying nature of driving environmental factors, causing significant loss of critical information in crash frequency modeling. This paper aims to develop crash frequency models with refined temporal scales for complex driving environments, with such an effort providing more detailed and accurate crash risk information that can allow for more effective and proactive traffic management and law enforcement intervention. Zero-inflated negative binomial (ZINB) models with site-specific random effects are developed with unbalanced panel data to analyze hourly crash frequency on highway segments. The real-time driving environment information, including traffic, weather, and road surface condition data, sourced primarily from the Road Weather Information System, is incorporated into the models along with site-specific road characteristics. The estimation results of the unbalanced panel data ZINB models suggest there are a number of factors influencing crash frequency, including time-varying factors (e.g., visibility and hourly traffic volume) and site-varying factors (e.g., speed limit). The study confirms the unique significance of real-time weather, road surface condition, and traffic data to crash frequency modeling. PMID:27322306
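
The ZINB likelihood at the heart of such models mixes a point mass at zero (the "excess zeros" from hours with no crash exposure) with a negative binomial count distribution. A minimal probability-mass-function sketch, with illustrative parameters rather than estimates from the paper, is:

```python
from math import comb

def zinb_pmf(k, pi, r, p):
    """P(K = k) for a ZINB mixture: with probability pi an excess zero,
    otherwise NB(r, p) parameterized as k failures before the r-th success."""
    nb = comb(k + r - 1, k) * (p ** r) * ((1 - p) ** k)
    return (pi if k == 0 else 0.0) + (1 - pi) * nb

pi, r, p = 0.3, 2, 0.5          # made-up zero-inflation and NB parameters
total = sum(zinb_pmf(k, pi, r, p) for k in range(200))
mean = sum(k * zinb_pmf(k, pi, r, p) for k in range(200))
```

The mixture mean is (1 - pi) * r * (1 - p) / p, here 0.7 * 2 = 1.4, which is what makes the zero-inflated form deflate expected counts relative to a plain NB fit.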

  5. Nonlinear Maps for Design of Discrete-Time Models of Neuronal Network Dynamics

    DTIC Science & Technology

    2016-03-31

    Nonlinear Maps for Design of Discrete-Time Models of … simulations is to design a neuronal model in the form of difference equations that generates neuronal states in discrete moments of time. In this … responsive firing patterns. We propose to use modern DSP ideas to develop new efficient approaches to the design of such discrete-time models for

  6. A longitudinal model for disease progression was developed and applied to multiple sclerosis

    PubMed Central

    Lawton, Michael; Tilling, Kate; Robertson, Neil; Tremlett, Helen; Zhu, Feng; Harding, Katharine; Oger, Joel; Ben-Shlomo, Yoav

    2015-01-01

    Objectives To develop a model of disease progression using multiple sclerosis (MS) as an exemplar. Study Design and Settings Two observational cohorts with longitudinal disability data [the Expanded Disability Status Scale (EDSS)] were used: the University of Wales MS (UoWMS) database, UK (1976), and the British Columbia MS (BCMS) database, Canada (1980); individuals potentially eligible for MS disease-modifying drug treatments, but who were unexposed, were selected. Multilevel modeling was used to estimate the EDSS trajectory over time in one data set and validated in the other; challenges addressed included the choice and function of the time axis, complex observation-level variation, adjustments for MS relapses, and autocorrelation. Results The best-fitting model for the UoWMS cohort (404 individuals, 2,290 EDSS observations) included a nonlinear function of time since onset. Measurement error decreased over time, and ad hoc methods reduced autocorrelation and the effect of relapse. Replication within the BCMS cohort (978 individuals, 7,335 EDSS observations) led to a model with similar time (years) coefficients, time [0.22 (95% confidence interval {CI}: 0.19, 0.26), 0.16 (95% CI: 0.10, 0.22)] and log time [−0.13 (95% CI: −0.39, 0.14), −0.15 (95% CI: −0.70, 0.40)] for BCMS and UoWMS, respectively. Conclusion It is possible to develop robust models of disability progression for chronic disease. However, explicit validation is important given the complex methodological challenges faced. PMID:26071892

  7. Robust Real-Time Musculoskeletal Modeling Driven by Electromyograms.

    PubMed

    Durandau, Guillaume; Farina, Dario; Sartori, Massimo

    2018-03-01

    Current clinical biomechanics involves lengthy data acquisition and time-consuming offline analyses with biomechanical models not operating in real-time for man-machine interfacing. We developed a method that enables online analysis of neuromusculoskeletal function in vivo in the intact human. We used electromyography (EMG)-driven musculoskeletal modeling to simulate all transformations from muscle excitation onset (EMGs) to mechanical moment production around multiple lower-limb degrees of freedom (DOFs). We developed a calibration algorithm that enables adjusting musculoskeletal model parameters specifically to an individual's anthropometry and force-generating capacity. We incorporated the modeling paradigm into a computationally efficient, generic framework that can be interfaced in real-time with any movement data collection system. The framework demonstrated the ability of computing forces in 13 lower-limb muscle-tendon units and resulting moments about three joint DOFs simultaneously in real-time. Remarkably, it was capable of extrapolating beyond calibration conditions, i.e., predicting accurate joint moments during six unseen tasks and one unseen DOF. The proposed framework can dramatically reduce evaluation latency in current clinical biomechanics and open up new avenues for establishing prompt and personalized treatments, as well as for establishing natural interfaces between patients and rehabilitation systems. The integration of EMG with numerical modeling will enable simulating realistic neuromuscular strategies in conditions including muscular/orthopedic deficit, which could not be robustly simulated via pure modeling formulations. This will enable translation to clinical settings and development of healthcare technologies including real-time bio-feedback of internal mechanical forces and direct patient-machine interfacing.

  8. Yield model development project implementation plan

    NASA Technical Reports Server (NTRS)

    Ambroziak, R. A.

    1982-01-01

    Tasks remaining to be completed are summarized for the following major project elements: (1) evaluation of crop yield models; (2) crop yield model research and development; (3) data acquisition, processing, and storage; (4) related yield research: defining spectral and/or remote sensing data requirements, developing input for driving and testing crop growth/yield models, and real-time testing of wheat plant process models; and (5) project management and support.

  9. Identification of aerodynamic models for maneuvering aircraft

    NASA Technical Reports Server (NTRS)

    Lan, C. Edward; Hu, C. C.

    1992-01-01

    A Fourier analysis method was developed to analyze harmonic forced-oscillation data at high angles of attack as functions of the angle of attack and its time rate of change. The resulting aerodynamic responses at different frequencies are used to build up aerodynamic models involving time integrals of the indicial type. An efficient numerical method was also developed to evaluate these time integrals for arbitrary motions based on a concept of equivalent harmonic motion. The method was verified by first using results from two-dimensional and three-dimensional linear theories. The developed models for C_L, C_D, and C_M based on high-alpha data for a 70 deg delta wing in harmonic motions showed accurate results in reproducing hysteresis. The aerodynamic models are further verified by comparing with test data using ramp-type motions.
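
The harmonic-analysis step can be illustrated by recovering the first-harmonic in-phase and quadrature components from a sampled periodic response. The signal, frequency, and coefficients below are synthetic stand-ins, not the paper's wind-tunnel data:

```python
import math

omega = 2 * math.pi           # rad/s, assuming a 1 Hz forced oscillation
C0, A, B = 1.2, 0.8, -0.3     # offset and harmonic amplitudes (made up)

def response(t):
    # synthetic aerodynamic response with a single first harmonic
    return C0 + A * math.cos(omega * t) + B * math.sin(omega * t)

N = 1000                                   # samples over one full period
ts = [i / N for i in range(N)]
# discrete Fourier projections onto the first harmonic
a1 = 2 / N * sum(response(t) * math.cos(omega * t) for t in ts)
b1 = 2 / N * sum(response(t) * math.sin(omega * t) for t in ts)
```

With equally spaced samples over a whole period the discrete projections are exact, so a1 and b1 recover A and B; the actual method builds frequency-dependent responses like these into indicial-integral models.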

  10. Individual Differences in Boys' and Girls' Timing and Tempo of Puberty: Modeling Development with Nonlinear Growth Models

    ERIC Educational Resources Information Center

    Marceau, Kristine; Ram, Nilam; Houts, Renate M.; Grimm, Kevin J.; Susman, Elizabeth J.

    2011-01-01

    Pubertal development is a nonlinear process progressing from prepubescent beginnings through biological, physical, and psychological changes to full sexual maturity. To tether theoretical concepts of puberty with sophisticated longitudinal, analytical models capable of articulating pubertal development more accurately, we used nonlinear…

  11. Hybrid Model of IRT and Latent Class Models.

    ERIC Educational Resources Information Center

    Yamamoto, Kentaro

    This study developed a hybrid of item response theory (IRT) models and latent class models, which combined the strengths of each type of model. The primary motivation for developing the new model is to describe characteristics of examinees' knowledge at the time of the examination. Hence, the application of the model lies mainly in so-called…

  12. The Use of Leaf Functional Traits for Modeling the Timing and Rate of Canopy Development

    NASA Astrophysics Data System (ADS)

    Savoy, P.; Mackay, D. S.

    2015-12-01

    Leaves vary in their habit, with some being short lived and possessing high intrinsic photosynthetic rates and others being long lived with lower photosynthetic capacity. Longer-lived leaves thus tend to cost more to produce but can assimilate carbon over a longer period of time. The timing and seasonality of forest canopies reflect a cost-benefit strategy for the exploitation of favorable environmental conditions and avoidance of unfavorable conditions. Because of the selective pressure for plants to gather a return on leaf investment in relation to their leaf habit, we propose that there is a relationship between plant functional traits and the timing and rate of canopy development. A recent study showed that errors in predicted canopy dynamics could be reduced via a single parameter (τ) that modified the timing and rate of canopy development (Savoy & Mackay 2015). If τ is related to underlying mechanisms of plant physiology, then it should vary predictably. To test this, we will first examine the relationship between τ and observable biophysical variables that vary in ecologically meaningful ways. Then we will develop a model based on leaf traits that regulates the timing and rate at which vegetation reaches peak rates of assimilation. The model will then be tested at eddy covariance sites that span a range of environmental conditions. Preliminary results demonstrate a strong relationship (R2 = 0.58) between estimated values of τ and leaf carbon-to-nitrogen ratio, which is important for representing the costs of leaf construction and nitrogen investment in the photosynthetic machinery of leaves. By developing a canopy seasonality model based on plant functional traits and rooted in the framework of leaf economics, it is possible to have a more flexible and generalized model. Such a model will be more adept at making predictions under novel environmental conditions than purely correlative empirical models.

  13. Development and validation of a set of six adaptable prognosis prediction (SAP) models based on time-series real-world big data analysis for patients with cancer receiving chemotherapy: A multicenter case crossover study

    PubMed Central

    Kanai, Masashi; Okamoto, Kazuya; Yamamoto, Yosuke; Yoshioka, Akira; Hiramoto, Shuji; Nozaki, Akira; Nishikawa, Yoshitaka; Yamaguchi, Daisuke; Tomono, Teruko; Nakatsui, Masahiko; Baba, Mika; Morita, Tatsuya; Matsumoto, Shigemi; Kuroda, Tomohiro; Okuno, Yasushi; Muto, Manabu

    2017-01-01

    Background We aimed to develop an adaptable prognosis prediction model that could be applied at any time point during the treatment course for patients with cancer receiving chemotherapy, by applying time-series real-world big data. Methods Between April 2004 and September 2014, 4,997 patients with cancer who had received systemic chemotherapy were registered in a prospective cohort database at the Kyoto University Hospital. Of these, 2,693 patients with a death record were eligible for inclusion and were divided into training (n = 1,341) and test (n = 1,352) cohorts. In total, 3,471,521 laboratory data at 115,738 time points, representing 40 laboratory items [e.g., white blood cell counts and albumin (Alb) levels] monitored for 1 year before the death event, were applied for constructing prognosis prediction models. All possible prediction models comprising three different items from the 40 laboratory items (40C3 = 9,880) were generated in the training cohort, and model selection was performed in the test cohort. The fitness of the selected models was externally validated in the validation cohort from three independent settings. Results A prognosis prediction model utilizing Alb, lactate dehydrogenase, and neutrophils was selected based on a strong ability to predict death events within 1–6 months, and a set of six prediction models corresponding to 1, 2, 3, 4, 5, and 6 months was developed. The area under the curve (AUC) ranged from 0.852 for the 1-month model to 0.713 for the 6-month model. External validation supported the performance of these models. Conclusion By applying time-series real-world big data, we successfully developed a set of six adaptable prognosis prediction models for patients with cancer receiving chemotherapy. PMID:28837592
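
The exhaustive model search over all three-item subsets of 40 laboratory items is a straightforward combinatorial enumeration; the count matches the 40C3 = 9,880 figure reported above (the item names here are placeholders):

```python
import itertools
import math

items = [f"lab_{i:02d}" for i in range(40)]             # 40 laboratory items
candidate_models = list(itertools.combinations(items, 3))  # every 3-item subset
n_models = len(candidate_models)                           # 40 choose 3
```

Each of these subsets would then be fit and scored in the training cohort before selection in the test cohort.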

  14. TISK 1.0: An easy-to-use Python implementation of the time-invariant string kernel model of spoken word recognition.

    PubMed

    You, Heejo; Magnuson, James S

    2018-06-01

    This article describes a new Python distribution of TISK, the time-invariant string kernel model of spoken word recognition (Hannagan et al. in Frontiers in Psychology, 4, 563, 2013). TISK is an interactive-activation model similar to the TRACE model (McClelland & Elman in Cognitive Psychology, 18, 1-86, 1986), but TISK replaces most of TRACE's reduplicated, time-specific nodes with theoretically motivated time-invariant, open-diphone nodes. We discuss the utility of computational models as theory development tools, the relative merits of TISK as compared to other models, and the ways in which researchers might use this implementation to guide their own research and theory development. We describe a TISK model that includes features that facilitate in-line graphing of simulation results, integration with standard Python data formats, and graph and data export. The distribution can be downloaded from https://github.com/maglab-uconn/TISK1.0 .

  15. DEVELOPMENT AND PEER REVIEW OF TIME-TO-EFFECT MODELS FOR THE ANALYSIS OF NEUROTOXICITY AND OTHER TIME DEPENDENT DATA

    EPA Science Inventory

    Neurobehavioral studies pose unique challenges for dose-response modeling, including small sample size and relatively large intra-subject variation, repeated measurements over time, multiple endpoints with both continuous and ordinal scales, and time dependence of risk characteri...

  16. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processor (CPU) that services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study supporting this application of queueing models are presented.
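
The baseline M/M/1 quantities underlying such a model follow from standard closed forms. The rates below are illustrative, and the paper's model additionally layers deterministic, time-critical interrupts on top of this baseline:

```python
# Textbook M/M/1 steady-state metrics (example arrival/service rates).
lam, mu = 8.0, 10.0          # arrival and service rates, jobs/s (assumed)
rho = lam / mu               # server utilization; must be < 1 for stability
W = 1.0 / (mu - lam)         # mean time in system
L = lam * W                  # mean number in system, via Little's law
Wq = W - 1.0 / mu            # mean time waiting in queue
Lq = lam * Wq                # mean queue length, Little's law on the queue
```

Periodic deterministic interrupts effectively steal service capacity from the background stream, so the paper's transform-based analysis generalizes these formulas rather than replacing them.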

  17. Thermodynamics constrains allometric scaling of optimal development time in insects.

    PubMed

    Dillon, Michael E; Frazier, Melanie R

    2013-01-01

    Development time is a critical life-history trait that has profound effects on organism fitness and on population growth rates. For ectotherms, development time is strongly influenced by temperature and is predicted to scale with body mass to the quarter power, based on (1) the ontogenetic growth model of the metabolic theory of ecology, which describes a bioenergetic balance between tissue maintenance and growth given the scaling relationship between metabolism and body size, and (2) numerous studies, primarily of vertebrate endotherms, that largely support this prediction. However, few studies have investigated the allometry of development time among invertebrates, including insects. Abundant data on the development of diverse insects provide an ideal opportunity to better understand the scaling of development time in this ecologically and economically important group. Insects develop more quickly at warmer temperatures until reaching a minimum development time at some optimal temperature, after which development slows. We evaluated the allometry of insect development time by compiling estimates of minimum development time and optimal developmental temperature for 361 insect species from 16 orders, with body mass varying over nearly 6 orders of magnitude. Allometric scaling exponents varied with the statistical approach: standardized major axis regression supported the predicted quarter-power scaling relationship, but ordinary and phylogenetic generalized least squares did not. Regardless of the statistical approach, body size alone explained less than 28% of the variation in development time. Models that also included optimal temperature explained over 50% of the variation in development time. Warm-adapted insects developed more quickly, regardless of body size, supporting the "hotter is better" hypothesis, which posits that ectotherms have a limited ability to evolutionarily compensate for the depressing effects of low temperatures on rates of biological processes. The remaining unexplained variation in development time likely reflects additional ecological and evolutionary differences among insect species.
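
Setting aside the standardized-major-axis vs. least-squares distinction the abstract highlights, the basic log-log slope estimate used in allometric analyses can be sketched on synthetic data generated to follow an exact quarter-power law (the coefficient 5.0 and mass range are made up):

```python
import math
import random

random.seed(1)
# synthetic body masses spanning ~6 orders of magnitude, as in the compilation
masses = [10 ** random.uniform(-3, 3) for _ in range(50)]
dev_times = [5.0 * m ** 0.25 for m in masses]   # exact t = a * M^(1/4) law

# ordinary least squares on log-transformed data recovers the exponent
xs = [math.log(m) for m in masses]
ys = [math.log(t) for t in dev_times]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
```

With real, noisy data the OLS and standardized-major-axis slopes diverge (SMA divides the OLS slope by the correlation coefficient), which is exactly why the study's conclusion depended on the statistical approach.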

  18. Generalized semiparametric varying-coefficient models for longitudinal data

    NASA Astrophysics Data System (ADS)

    Qi, Li

    In this dissertation, we investigate generalized semiparametric varying-coefficient models for longitudinal data that can flexibly model three types of covariate effects: time-constant effects, time-varying effects, and covariate-varying effects, i.e., covariate effects that depend on other, possibly time-dependent, exposure variables. First, we consider the model that assumes the time-varying effects are unspecified functions of time while the covariate-varying effects are parametric functions of an exposure variable specified up to a finite number of unknown parameters. The estimation procedures are developed using multivariate local linear smoothing and generalized weighted least squares estimation techniques. The asymptotic properties of the proposed estimators are established. Simulation studies show that the proposed methods have satisfactory finite-sample performance. The methods are applied to the ACTG 244 clinical trial of HIV-infected patients to examine the effects of antiretroviral treatment switching before and after development of the 215 mutation; our analysis shows a benefit of treatment switching before the 215 mutation develops. The proposed methods are also applied to the STEP study with MITT cases, showing that they have broad applications in medical research.

  19. Nonlinear modeling of chaotic time series: Theory and applications

    NASA Astrophysics Data System (ADS)

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
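
The two-step recipe described above (reconstruct a state space from delayed observations, then approximate the dynamics locally) can be sketched with a delay embedding of the logistic map and a nearest-neighbor predictor. The map, embedding dimension, and series lengths are arbitrary choices for illustration:

```python
def logistic(x):
    return 4.0 * x * (1.0 - x)       # standard chaotic test map on [0, 1]

xs = [0.3]
for _ in range(602):
    xs.append(logistic(xs[-1]))

train = xs[:500]
# delay embedding: pair each 2-dimensional delay vector with the value after it
library = [((train[i], train[i + 1]), train[i + 2])
           for i in range(len(train) - 2)]

def forecast(v):
    # local approximation: value following the nearest reconstructed state
    _, nxt = min(library,
                 key=lambda e: (e[0][0] - v[0]) ** 2 + (e[0][1] - v[1]) ** 2)
    return nxt

pred = forecast((xs[500], xs[501]))   # one-step-ahead, out-of-sample
err = abs(pred - xs[502])
```

Because the deterministic attractor is densely sampled, the nearest neighbor yields a far better short-term forecast than any linear stochastic model of the same series would, which is the point the review makes.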

  20. The SMART-NAS Testbed

    NASA Technical Reports Server (NTRS)

    Aquilina, Rudolph A.

    2015-01-01

    The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management and Trajectory Based Operations are developing concepts and models. SMART-NAS Test-bed, System Assurance Technologies and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system, accept data feeds, allowing shadowing of actual operations in either real-time, fast-time and/or hybrid modes of operations in distributed environments, and enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post operations, will be driven towards "real-time" assessments in the 2035 time frame.

  1. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  2. Novel hybrid GPU-CPU implementation of parallelized Monte Carlo parametric expectation maximization estimation method for population pharmacokinetic data analysis.

    PubMed

    Ng, C M

    2013-10-01

    The development of a population PK/PD model, an essential component of model-based drug development, is both time- and labor-intensive. Graphics-processing unit (GPU) computing technology has been proposed and used to accelerate many scientific computations. The objective of this study was to develop a hybrid GPU-CPU implementation of a parallelized Monte Carlo parametric expectation maximization (MCPEM) estimation algorithm for population PK data analysis. A hybrid GPU-CPU implementation of the MCPEM algorithm (MCPEMGPU) and an identical algorithm designed for a single CPU (MCPEMCPU) were developed using MATLAB on a single computer equipped with dual Xeon 6-core E5690 CPUs and an NVIDIA Tesla C2070 GPU parallel computing card containing 448 stream processors. Two different PK models with rich/sparse sampling design schemes were used to simulate population data in assessing the performance of MCPEMCPU and MCPEMGPU. Results were analyzed by comparing the parameter estimates and model computation times. A speedup factor was used to assess the relative benefit of the parallelized MCPEMGPU over MCPEMCPU in shortening model computation time. The MCPEMGPU consistently achieved shorter computation times than the MCPEMCPU and can offer more than 48-fold speedup using a single GPU card. The novel hybrid GPU-CPU implementation of the parallelized MCPEM algorithm developed in this study holds great promise for serving as the core of the next generation of modeling software for population PK/PD analysis.

  3. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    NASA Technical Reports Server (NTRS)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out through a time-consuming process of mission-by-mission model implementation and integration. A project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To this end, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  4. Modeling Real-Time Applications with Reusable Design Patterns

    NASA Astrophysics Data System (ADS)

    Rekhis, Saoussen; Bouassida, Nadia; Bouaziz, Rafik

    Real-Time (RT) applications, which manipulate large volumes of data, need to be managed with RT databases that deal with time-constrained data and time-constrained transactions. In spite of their numerous advantages, RT database development remains a complex task, since developers must study many design issues related to the RT domain. In this paper, we tackle this problem by proposing RT design patterns that allow the modeling of structural and behavioral aspects of RT databases. We show how RT design patterns can provide design assistance through architectural reuse for recurring design problems. In addition, we present a UML profile that represents the patterns and further facilitates their reuse. This profile proposes, on the one hand, UML extensions for modeling the variability of patterns in the RT context and, on the other hand, extensions inspired by the MARTE (Modeling and Analysis of Real-Time Embedded systems) profile.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Huys, Otti, E-mail: otti.dhuys@phy.duke.edu; Haynes, Nicholas D.; Lohmann, Johannes

    Autonomous Boolean networks are commonly used to model the dynamics of gene regulatory networks and allow for the prediction of stable dynamical attractors. However, most models do not account for time delays along the network links or for noise, which are crucial features of real biological systems. Concentrating on two paradigmatic motifs, the toggle switch and the repressilator, we develop an experimental testbed that explicitly includes both inter-node time delays and noise, using digital logic elements on field-programmable gate arrays. We observe transients that last millions to billions of characteristic time scales and that scale exponentially with the time delays between nodes, a phenomenon known as super-transient scaling. We develop a hybrid model that includes time delays along network links and allows for stochastic variation in the delays. Using this model, we explain the observed super-transient scaling of both motifs and recreate the experimentally measured transient distributions.

  6. Task allocation model for minimization of completion time in distributed computer systems

    NASA Astrophysics Data System (ADS)

    Wang, Jai-Ping; Steidley, Carl W.

    1993-08-01

    A task in a distributed computing system consists of a set of related modules. Each module executes on one of the processors of the system and communicates with some other modules. In addition, precedence relationships may exist among the modules. Task allocation is an essential activity in distributed-software design, and it is important to all phases of the development of a distributed system. This paper establishes task completion-time models and task allocation models for minimizing task completion time. Current work in this area either remains at the experimental level or does not consider precedence relationships among modules. The development of mathematical models for the computation of task completion time and for task allocation will benefit many real-time computer applications such as radar systems, navigation systems, industrial process control systems, image processing systems, and artificial-intelligence-oriented systems.
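For small problems, the allocation that minimizes completion time can be found by exhaustive search over module-to-processor assignments. The sketch below uses a simplified makespan objective (per-processor execution load plus a penalty for modules that communicate across processors) and ignores precedence relationships; all costs are invented for illustration:

```python
from itertools import product

def completion_time(assign, exec_cost, comm_cost):
    """Simplified makespan: the heaviest per-processor execution
    load plus a delay for every pair of communicating modules
    placed on different processors (precedence ignored)."""
    load = {}
    for m, p in enumerate(assign):
        load[p] = load.get(p, 0.0) + exec_cost[m][p]
    comm = sum(c for (i, j), c in comm_cost.items() if assign[i] != assign[j])
    return max(load.values()) + comm

def best_allocation(n_modules, n_procs, exec_cost, comm_cost):
    """Exhaustive search over all n_procs**n_modules assignments."""
    return min(product(range(n_procs), repeat=n_modules),
               key=lambda a: completion_time(a, exec_cost, comm_cost))

exec_cost = [[4, 6], [5, 3], [2, 2]]   # execution cost: module x processor
comm_cost = {(0, 1): 3, (1, 2): 1}     # traffic between module pairs
alloc = best_allocation(3, 2, exec_cost, comm_cost)
```

Exhaustive search scales as n_procs**n_modules, which is why the literature (including this paper) develops analytic models and heuristics instead of enumeration for realistic problem sizes.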

  7. Development of a Threshold Model to Predict Germination of Populus tomentosa Seeds after Harvest and Storage under Ambient Condition

    PubMed Central

    Wang, Wei-Qing; Cheng, Hong-Yan; Song, Song-Quan

    2013-01-01

    Effects of temperature, storage time, and their combination on germination of aspen (Populus tomentosa) seeds were investigated. Aspen seeds were germinated at 5 to 30°C at 5°C intervals after storage for a period of time at 28°C and 75% relative humidity. The effect of temperature on aspen seed germination could not be effectively described by the thermal time (TT) model, which underestimated the germination rate at 5°C and poorly predicted the time courses of germination at 10, 20, 25 and 30°C. A modified TT model (MTT), which assumed a two-phased linear relationship between germination rate and temperature, was more accurate in predicting germination rate and percentage and had a higher likelihood of being correct than the TT model. The maximum lifetime threshold (MLT) model accurately described the effect of storage time on seed germination across all germination temperatures. An aging thermal time (ATT) model combining the TT and MLT models was developed to describe the effect of both temperature and storage time on seed germination. When the ATT model was applied to germination data across all temperatures and storage times, it produced a relatively poor fit. Adjusting the ATT model to separately fit germination data at low and high temperatures in the suboptimal range increased the model's accuracy for predicting seed germination. Both the MLT and ATT models indicate that germination of aspen seeds has distinct physiological responses to temperature within the suboptimal range. PMID:23658654
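A minimal sketch of the thermal time idea described above: in the suboptimal range, germination rate is proportional to the temperature excess over a base temperature, and the MTT variant splits this relationship into two linear segments. All parameter values below are invented for illustration, not the paper's fitted values:

```python
def thermal_time_rate(T, T_base, theta):
    """Classic thermal time (TT) model in the suboptimal range:
    germination rate (1/days) = (T - T_base) / theta, and zero at
    or below the base temperature T_base."""
    return max(0.0, T - T_base) / theta

def modified_tt_rate(T, T_break, Tb1, theta1, Tb2, theta2):
    """Modified TT (MTT) sketch: a two-phased linear relationship
    between rate and temperature, switching parameters at T_break."""
    if T <= T_break:
        return max(0.0, T - Tb1) / theta1
    return max(0.0, T - Tb2) / theta2

rate = thermal_time_rate(20.0, 2.0, 36.0)   # germination rate, 1/days
t50 = 1.0 / rate                            # predicted days to germination
```

The reciprocal of the rate gives the predicted time to germination, which is how such models are compared against observed germination time courses.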

  8. Ontological Model of Business Process Management Systems

    NASA Astrophysics Data System (ADS)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring, and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at development time and run time. In this article, an ontological model of a BPMS for the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.

  9. An architecture for the development of real-time fault diagnosis systems using model-based reasoning

    NASA Technical Reports Server (NTRS)

    Hall, Gardiner A.; Schuetzle, James; Lavallee, David; Gupta, Uday

    1992-01-01

    Presented here is an architecture for implementing real-time, telemetry-based diagnostic systems using model-based reasoning. First, we describe Paragon, a knowledge acquisition tool for offline entry and validation of physical system models. Paragon provides domain experts with a structured editing capability to capture a physical component's structure, behavior, and causal relationships. We next describe the architecture of the run-time diagnostic system. The diagnostic system, written entirely in Ada, uses the behavioral model developed offline with Paragon to simulate expected component states as reflected in the telemetry stream. The diagnostic algorithm traces causal relationships contained within the model to isolate system faults. Since the diagnostic process relies exclusively on the behavioral model and is implemented without the use of heuristic rules, it can be used to isolate unpredicted faults in a wide variety of systems. Finally, we discuss the implementation of a prototype system constructed using this technique for diagnosing faults in a science instrument. The prototype demonstrates the use of model-based reasoning to develop maintainable systems with greater diagnostic capabilities at a lower cost.

  10. MSW Time to Tumor Model and Supporting Documentation

    EPA Science Inventory

    The multistage Weibull (MSW) time-to-tumor model and related documentation were developed principally (but not exclusively) for conducting time-to-tumor analyses to support risk assessments under the IRIS program. These programs and related docum...

  11. Additive hazards regression and partial likelihood estimation for ecological monitoring data across space.

    PubMed

    Lin, Feng-Chang; Zhu, Jun

    2012-01-01

    We develop continuous-time models for the analysis of environmental or ecological monitoring data in which subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take on flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for the parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with a homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that the statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a Wisconsin plantation.
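To make the additive hazards structure concrete, the sketch below simulates event times under a hazard of the form b0 + b1*z in discrete time and recovers the coefficients by pooled least squares on per-interval event indicators. This is a crude stand-in for the paper's partial-likelihood machinery, with a constant baseline and no spatial dependence; all values are illustrative:

```python
import random

def simulate(n, b0, b1, dt=0.01, max_steps=500, seed=1):
    """Simulate event times under the additive hazard
    lambda(t | z) = b0 + b1*z (time-constant baseline here),
    discretized into Bernoulli steps of width dt; subjects still
    event-free after max_steps are censored."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z = rng.random()
        for k in range(max_steps):
            if rng.random() < (b0 + b1 * z) * dt:
                out.append((z, k + 1, 1))   # event during interval k
                break
        else:
            out.append((z, max_steps, 0))   # censored
    return out

def aalen_ls(data, dt=0.01):
    """Pooled least squares of the per-interval event indicator
    (scaled by 1/dt) on z: its conditional mean is b0 + b1*z,
    so the fitted line estimates the hazard coefficients."""
    n = sx = sxx = sy = sxy = 0.0
    for z, steps, event in data:
        for k in range(steps):
            y = (1.0 / dt) if (event and k == steps - 1) else 0.0
            n += 1.0
            sx += z
            sxx += z * z
            sy += y
            sxy += z * y
    b1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b0 = (sy - b1 * sx) / n
    return b0, b1

data = simulate(4000, b0=0.2, b1=0.8)
b0_hat, b1_hat = aalen_ls(data)
```

The additivity is what makes this regression linear: the covariate shifts the hazard rather than multiplying it as in the Cox model.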

  12. Fast Algorithms for Mining Co-evolving Time Series

    DTIC Science & Technology

    2011-09-01

    Keogh et al., 2001, 2004] and (b) forecasting, like an autoregressive integrated moving average model (ARIMA) and related methods [Box et al., 1994...computing hardware? We develop models to mine time series with missing values, to extract compact representations from time sequences, to segment the...sequences, and to do forecasting. For large-scale data, we propose algorithms for learning time series models, in particular including Linear Dynamical

  13. Incorporating Auditory Models in Speech/Audio Applications

    NASA Astrophysics Data System (ADS)

    Krishnamoorthi, Harish

    2011-12-01

    Following the success in incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve the minimization of an objective function that directly or indirectly incorporates properties of human perception. This dissertation primarily investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes solutions to overcome high-complexity issues for use in real-time speech/audio algorithms. Specific problems addressed in this dissertation include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics, and 2) the development of a mapping scheme that allows synthesizing a time/frequency domain representation from its equivalent auditory model output. The first problem is aimed at addressing the high computational complexity involved in solving perceptual objective functions that require repeated application of the auditory model to evaluate different candidate solutions. In this dissertation, a frequency-pruning and a detector-pruning algorithm are developed that efficiently implement the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to an 80-90% reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for use with sinusoidal signals; it employs the proposed auditory pattern combining technique together with a look-up table that stores representative auditory patterns.
The second problem obtains an estimate of the auditory representation that minimizes a perceptual objective function and transforms the auditory pattern back to its equivalent time/frequency representation. This avoids the repeated application of auditory model stages to test different candidate time/frequency vectors in minimizing perceptual objective functions. In this dissertation, a constrained mapping scheme is developed by linearizing certain auditory model stages that ensures obtaining a time/frequency mapping corresponding to the estimated auditory representation. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.

  14. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on complete blood count (CBC) panel data from 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of mean absolute prediction error and absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline, and a 5.25% average improvement when only short-term predictions were considered. A new hierarchical dynamical system framework that can model irregularly sampled time series data is a promising direction for modeling clinical time series and improving their predictive performance.
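The linear-dynamical-system half of such a hierarchy can be illustrated with a scalar Kalman filter producing one-step-ahead predictions. This is a generic LDS sketch with made-up parameters, not the paper's hierarchical model (which layers Gaussian processes underneath the LDS):

```python
def kalman_one_step(ys, a, c, q, r, x0=0.0, p0=1.0):
    """Scalar linear dynamical system: x_t = a*x_{t-1} + w, w~N(0,q);
    y_t = c*x_t + v, v~N(0,r). Returns the one-step-ahead prediction
    of each observation, updating the state estimate as data arrive."""
    x, p = x0, p0
    preds = []
    for y in ys:
        x, p = a * x, a * a * p + q          # time update (predict)
        preds.append(c * x)                  # predicted observation
        s = c * c * p + r                    # innovation variance
        k = p * c / s                        # Kalman gain
        x = x + k * (y - c * x)              # measurement update
        p = (1.0 - k * c) * p
    return preds

ys = [1.0, 0.9, 0.8, 0.75, 0.7]              # e.g. successive lab values
preds = kalman_one_step(ys, a=0.95, c=1.0, q=0.01, r=0.05)
```

In the paper's framework, the role of the LDS is to carry hidden state between irregularly sampled windows, while Gaussian processes model the observations within each window.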

  15. A PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODEL FOR TRICHLOROETHYLENE WITH SPECIFICITY FOR THE LONG EVANS RAT

    EPA Science Inventory

    A PBPK model for TCE with specificity for the male LE rat that accurately predicts TCE tissue time-course data has not been developed, although other PBPK models for TCE exist. Development of such a model was the present aim. The PBPK model consisted of 5 compartments: fat; slowl...

  16. Enhancements to the EPANET-RTX (Real-Time Analytics) ...

    EPA Pesticide Factsheets

    Technical brief and software The U.S. Environmental Protection Agency (EPA) developed EPANET-RTX as a collection of object-oriented software libraries comprising the core data access, data transformation, and data synthesis (real-time analytics) components of a real-time hydraulic and water quality modeling system. While EPANET-RTX uses the hydraulic and water quality solvers of EPANET, the object libraries are a self-contained set of building blocks for software developers. “Real-time EPANET” promises to change the way water utilities, commercial vendors, engineers, and the water community think about modeling.

  17. Continuing Development of a Hybrid Model (VSH) of the Neutral Thermosphere

    NASA Technical Reports Server (NTRS)

    Burns, Alan

    1996-01-01

    We propose to continue the development of a new operational model of neutral thermospheric density, composition, temperatures, and winds to improve current engineering environment definitions of the neutral thermosphere. This model will be based on simulations made with the National Center for Atmospheric Research (NCAR) Thermosphere-Ionosphere-Electrodynamic General Circulation Model (TIEGCM) and on empirical data. It will be capable of using real-time geophysical indices or data from ground-based and satellite inputs, and it will provide neutral variables at specified locations and times. This "hybrid" model will be based on a Vector Spherical Harmonic (VSH) analysis technique, developed over the last 8 years at the University of Michigan, that permits the incorporation of the TIEGCM outputs and data into the model. The VSH model will be a more accurate version of existing models of the neutral thermosphere, and will thus improve density specification for satellites flying in low Earth orbit (LEO).

  18. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    NASA Technical Reports Server (NTRS)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  19. An Immuno-epidemiological Model of Paratuberculosis

    NASA Astrophysics Data System (ADS)

    Martcheva, M.

    2011-11-01

    The primary objective of this article is to introduce an immuno-epidemiological model of paratuberculosis (Johne's disease). To develop the immuno-epidemiological model, we first develop an immunological model and an epidemiological model. Then, we link the two models through time-since-infection structure and parameters of the epidemiological model. We use the nested approach to compose the immuno-epidemiological model. Our immunological model captures the switch between the T-cell immune response and the antibody response in Johne's disease. The epidemiological model is a time-since-infection model and captures the variability of transmission rate and the vertical transmission of the disease. We compute the immune-response-dependent epidemiological reproduction number. Our immuno-epidemiological model can be used for investigation of the impact of the immune response on the epidemiology of Johne's disease.
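In time-since-infection models of this kind, the reproduction number is typically an integral over infection age; a generic form (not the paper's exact immune-response-dependent expression) is

```latex
\mathcal{R}_0 = \int_0^{\infty} \beta(\tau)\, \pi(\tau)\, \mathrm{d}\tau ,
```

where \beta(\tau) is the transmission rate at time-since-infection \tau (shaped, in the nested approach, by the within-host immune model) and \pi(\tau) is the probability of remaining infectious at infection age \tau.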

  20. Real-Time System for Water Modeling and Management

    NASA Astrophysics Data System (ADS)

    Lee, J.; Zhao, T.; David, C. H.; Minsker, B.

    2012-12-01

    Working closely with the Texas Commission on Environmental Quality (TCEQ) and the University of Texas at Austin (UT-Austin), we are developing a real-time system for water modeling and management using advanced cyberinfrastructure, data integration and geospatial visualization, and numerical modeling. The state of Texas suffered a severe drought in 2011 that cost the state $7.62 billion in agricultural losses (crops and livestock). Devastating situations such as this could potentially be avoided with better water modeling and management strategies that incorporate state of the art simulation and digital data integration. The goal of the project is to prototype a near-real-time decision support system for river modeling and management in Texas that can serve as a national and international model to promote more sustainable and resilient water systems. The system uses National Weather Service current and predicted precipitation data as input to the Noah-MP Land Surface model, which forecasts runoff, soil moisture, evapotranspiration, and water table levels given land surface features. These results are then used by a river model called RAPID, along with an error model currently under development at UT-Austin, to forecast stream flows in the rivers. Model forecasts are visualized as a Web application for TCEQ decision makers, who issue water diversion (withdrawal) permits and any needed drought restrictions; permit holders; and reservoir operation managers. Users will be able to adjust model parameters to predict the impacts of alternative curtailment scenarios or weather forecasts. A real-time optimization system under development will help TCEQ to identify optimal curtailment strategies to minimize impacts on permit holders and protect health and safety. To develop the system we have implemented RAPID as a remotely-executed modeling service using the Cyberintegrator workflow system with input data downloaded from the North American Land Data Assimilation System. 
    The Cyberintegrator workflow system provides RESTful web services for users to provide inputs, execute workflows, and retrieve outputs. Along with REST endpoints, PAW (Publishable Active Workflows) provides the web user interface toolkit for us to develop web applications with scientific workflows. The prototype web application is built on top of workflows with PAW, so that users will have a user-friendly web environment to provide input parameters, execute the model, and visualize/retrieve the results using geospatial mapping tools. In future work the optimization model will be developed and integrated into the workflow.

  1. Optimizing and controlling earthmoving operations using spatial technologies

    NASA Astrophysics Data System (ADS)

    Alshibani, Adel

    This thesis presents a model designed for optimizing, tracking, and controlling earthmoving operations. The proposed model utilizes Genetic Algorithms (GA), Linear Programming (LP), and spatial technologies, including Global Positioning Systems (GPS) and Geographic Information Systems (GIS), to support its management functions. The model assists engineers and contractors in selecting near-optimum crew formations in the planning phase and during construction, using GA and LP supported by a Pathfinder Algorithm developed in a GIS environment. GA is used in conjunction with a set of rules developed to accelerate the optimization process and to avoid generating and evaluating unrealistic crew formations. LP is used to determine the quantities of earth to be moved from different borrow pits and placed at different landfill sites so as to meet project constraints and minimize the cost of the earthmoving operations. GPS is used for onsite data collection and for tracking construction equipment in near real-time, while GIS is employed to automate data acquisition and to analyze the collected spatial data. The model is also capable of reconfiguring crew formations dynamically during the construction phase while site operations are in progress. The optimization of the crew formation considers: (1) construction time, (2) construction direct cost, or (3) construction total cost. The model can also generate crew formations that meet, as closely as possible, specified time and/or cost constraints. In addition, the model supports tracking and reporting of project progress utilizing the earned-value concept and the project ratio method, with modifications that allow for more accurate forecasting of project time and cost at set future dates and at completion. The model is capable of generating graphical and tabular reports.
    The developed model has been implemented in prototype software using object-oriented programming and the Microsoft Foundation Classes (MFC), coded in Visual C++ 6.0, with Microsoft Access as the database management system. The software operates in the Microsoft Windows environment. Three example applications were analyzed to validate the developments made and to illustrate the essential features of the developed model.
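A toy version of the GA component can be sketched in a few lines: a crew formation is encoded as a vector of discrete equipment choices, and tournament selection, one-point crossover, and point mutation search for the cheapest formation. The cost function and all constants below are hypothetical, not the thesis's model:

```python
import random

def ga_minimize(cost, choices, pop=30, gens=60, pm=0.2, seed=3):
    """Minimal GA over integer vectors: position i takes a value in
    range(choices[i]). Tournament selection, one-point crossover,
    point mutation; the best individual ever seen is kept."""
    rng = random.Random(seed)
    popn = [[rng.randrange(n) for n in choices] for _ in range(pop)]
    best = min(popn, key=cost)
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            p1 = min(rng.sample(popn, 2), key=cost)  # tournament of two
            p2 = min(rng.sample(popn, 2), key=cost)
            cut = rng.randrange(1, len(choices))
            child = p1[:cut] + p2[cut:]              # one-point crossover
            if rng.random() < pm:                    # point mutation
                i = rng.randrange(len(choices))
                child[i] = rng.randrange(choices[i])
            nxt.append(child)
        popn = nxt
        best = min(popn + [best], key=cost)
    return best

def cost(ind):
    # hypothetical crew: loader type, truck count (1-8), haul route
    loader, trucks, route = ind[0], ind[1] + 1, ind[2]
    rate = [60.0, 90.0][loader] * min(trucks / 6.0, 1.0)  # m3/h, capped
    hourly = [80.0, 120.0][loader] + 50.0 * trucks + [0.0, 10.0, 25.0][route]
    return hourly * (10000.0 / rate)   # cost to move 10,000 m3

best = ga_minimize(cost, choices=[2, 8, 3])
```

Domain rules like those described in the thesis would enter here as constraints that discard or repair unrealistic children before they are evaluated.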

  2. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
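The imputation idea can be sketched for a left-censored AR(1) series: values recorded at the detection limit are replaced by draws from a Gaussian truncated at that limit, after which the autocovariance is estimated nonparametrically. This marginal, single-imputation version (with the mean and standard deviation assumed known) is a simplification of the paper's method:

```python
import random
import statistics

def ar1(n, phi, sigma, seed=5):
    """Generate a zero-mean AR(1) series x_t = phi*x_{t-1} + noise."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)
        out.append(x)
    return out

def impute_censored(ys, c, mu, sd, seed=6):
    """Replace left-censored values (recorded at the limit c) with
    draws from N(mu, sd) truncated above at c, via rejection
    sampling; mu and sd are assumed known here for simplicity."""
    rng = random.Random(seed)
    out = []
    for y in ys:
        if y > c:
            out.append(y)
        else:
            z = rng.gauss(mu, sd)
            while z > c:
                z = rng.gauss(mu, sd)
            out.append(z)
    return out

def autocov(xs, lag):
    """Nonparametric (biased, 1/n) sample autocovariance at a lag."""
    m = statistics.fmean(xs)
    return sum((xs[t] - m) * (xs[t + lag] - m)
               for t in range(len(xs) - lag)) / len(xs)

x = ar1(5000, phi=0.6, sigma=1.0)   # true variance: 1/(1 - 0.36) = 1.5625
c = -0.5
obs = [max(v, c) for v in x]        # left-censoring at the limit c
imp = impute_censored(obs, c, mu=0.0, sd=1.5625 ** 0.5)
```

Censoring collapses the lower tail and deflates the variance; imputation restores the marginal distribution, so the lag-0 autocovariance of the imputed series is much closer to the truth than that of the censored series.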

  3. Real-time quality monitoring in debutanizer column with regression tree and ANFIS

    NASA Astrophysics Data System (ADS)

    Siddharth, Kumar; Pathak, Amey; Pani, Ajaya Kumar

    2018-05-01

    A debutanizer column is an integral part of any petroleum refinery. Online composition monitoring of debutanizer column outlet streams is highly desirable in order to maximize the production of liquefied petroleum gas. In this article, data-driven models for the debutanizer column are developed for real-time composition monitoring. The dataset used has seven process variables as inputs, and the output is the butane concentration in the debutanizer column bottom product. The input-output dataset is divided equally into a training (calibration) set and a validation (testing) set. The training set was used to develop fuzzy inference, adaptive neuro-fuzzy inference system (ANFIS), and regression tree models for the debutanizer column. The accuracy of the developed models was evaluated by simulating the models with the validation dataset. It is observed that the ANFIS model has better estimation accuracy than the other models developed in this work and than many data-driven models proposed in the literature for the debutanizer column.

  4. Modeling of Principal Flank Wear: An Empirical Approach Combining the Effect of Tool, Environment and Workpiece Hardness

    NASA Astrophysics Data System (ADS)

    Mia, Mozammel; Al Bashir, Mahmood; Dhar, Nikhil Ranjan

    2016-10-01

    Hard turning is increasingly employed in machining to replace the time-consuming process of conventional turning followed by grinding. An excessive amount of tool wear in hard turning is one of the main hurdles to be overcome. Many researchers have developed tool wear models, but most of them apply to a particular work-tool-environment combination; no aggregate model has been developed that can predict the amount of principal flank wear for a specific machining time. An empirical model of principal flank wear (VB) has been developed for different workpiece hardnesses (HRC40, HRC48 and HRC56) in turning by coated carbide inserts with different configurations (SNMM and SNMG) under both dry and high-pressure coolant conditions. Unlike other models, this model includes dummy variables along with the base empirical equation to capture the effect of changes in the input conditions on the response. The base empirical equation for principal flank wear is formulated by adopting the exponential associate function and fitting it to the experimental results. The coefficient of a dummy variable reflects the shift of the response from one set of machining conditions to another, and is determined by simple linear regression. The independent cutting parameters (cutting speed, feed rate, depth of cut) are kept constant while formulating and analyzing this model. The developed model is validated with different sets of machining responses in turning hardened medium carbon steel with coated carbide inserts. For any particular set, the model can be used to predict the amount of principal flank wear for a specific machining time. Since the predicted results exhibit good agreement with the experimental data and the average percentage error is below 10%, the model can be used to predict principal flank wear under the stated conditions.
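A sketch of the model form described above: an exponential associate base curve for VB versus machining time, plus additive dummy-variable terms that shift the curve between machining conditions. All coefficients are invented for illustration, not the paper's fitted values:

```python
import math

def flank_wear(t, A, B, dummies=(), coeffs=()):
    """Exponential associate base curve with additive dummy-variable
    shifts: VB(t) = A * (1 - exp(-t / B)) + sum(coeff * dummy)."""
    base = A * (1.0 - math.exp(-t / B))
    return base + sum(c * d for c, d in zip(coeffs, dummies))

# hypothetical dummies: (high-pressure coolant, HRC56 workpiece, SNMG insert)
vb_dry = flank_wear(30.0, A=0.35, B=18.0)                 # mm, dry baseline
vb_hpc = flank_wear(30.0, A=0.35, B=18.0,
                    dummies=(1, 0, 0), coeffs=(-0.06, 0.05, 0.02))
```

Setting a dummy to 1 shifts the whole wear curve by its coefficient, which is exactly how the paper encodes a change of coolant, hardness, or insert geometry without refitting the base equation.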

  5. Advanced Combustion Numerics and Modeling - FY18 First Quarter Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitesides, R. A.; Killingsworth, N. J.; McNenly, M. J.

    This project is focused on early-stage research and development of numerical methods and models to improve advanced engine combustion concepts and systems. The current focus is on the development of new mathematics and algorithms to reduce the time to solution for advanced combustion engine design using detailed fuel chemistry. The research is prioritized toward the most time-consuming workflow bottlenecks (computer and human) and the accuracy gaps that slow ACS program members. Zero-RK, the fast and accurate chemical kinetics solver software developed in this project, is central to the research efforts and continues to be developed to address the current and emerging needs of engine designers, engine modelers, and fuel mechanism developers.

  6. Development of a Geomagnetic Storm Correction to the International Reference Ionosphere E-Region Electron Densities Using TIMED/SABER Observations

    NASA Technical Reports Server (NTRS)

    Mertens, C. J.; Xu, X.; Fernandez, J. R.; Bilitza, D.; Russell, J. M., III; Mlynczak, M. G.

    2009-01-01

    Auroral infrared emission observed from the TIMED/SABER broadband 4.3 micron channel is used to develop an empirical geomagnetic storm correction to the International Reference Ionosphere (IRI) E-region electron densities. The observation-based proxy used to develop the storm model is SABER-derived NO+(v) 4.3 micron volume emission rates (VER). A correction factor is defined as the ratio of storm-time NO+(v) 4.3 micron VER to a quiet-time climatological averaged NO+(v) 4.3 micron VER, which is linearly fit to available geomagnetic activity indices. The initial version of the E-region storm model, called STORM-E, is most applicable within the auroral oval region. The STORM-E predictions of E-region electron densities are compared to incoherent scatter radar electron density measurements during the Halloween 2003 storm events. Future STORM-E updates will extend the model outside the auroral oval.
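The core of the STORM-E approach is a ratio-based correction factor fit linearly to geomagnetic activity. A minimal sketch of that idea, with entirely synthetic VER ratios and index values (not SABER data), looks like:

```python
import numpy as np

# Synthetic illustration: storm-to-quiet VER ratio regressed linearly on a
# geomagnetic activity index (values below are made up for the sketch).
ap = np.array([10.0, 40.0, 80.0, 120.0, 200.0, 300.0])
ratio = np.array([1.1, 1.8, 3.0, 4.2, 6.9, 10.1])  # storm VER / quiet VER

slope, intercept = np.polyfit(ap, ratio, 1)

def storm_correction(ap_now):
    """Factor multiplying quiet-time E-region densities; never below 1."""
    return max(1.0, intercept + slope * ap_now)
```

During quiet conditions the correction stays near unity, while large index values scale the climatological densities upward, mirroring the ratio definition in the abstract.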

  7. Minimization of vibration in elastic beams with time-variant boundary conditions

    NASA Technical Reports Server (NTRS)

    Amirouche, F. M. L.; Xie, Mingjun

    1992-01-01

    This paper presents an innovative method for minimizing the vibration of structures with time-variant boundary conditions (supports). The elastic body is modeled in two ways: (1) as a beam shaped like the numeral seven with a movable mass that does not extend beyond the lower tip; (2) as an arm consisting of a hollow beam with an internal mass of adjustable position. Complete solutions to both problems are carried out for a body undergoing large rotation. A quasi-static procedure is used for the time-variant boundary conditions. The method employs the partial differential equations governing the motion of the beam, including the effects of rigid-body motion and time-variant boundary conditions, together with the calculus of variations. The analytical solution is developed using Laplace and Fourier transforms. Examples of elastic robotic arms illustrate the effectiveness of the methods developed.

  8. V/STOL tilt rotor study. Volume 5: A mathematical model for real time flight simulation of the Bell model 301 tilt rotor research aircraft

    NASA Technical Reports Server (NTRS)

    Harendra, P. B.; Joglekar, M. J.; Gaffey, T. M.; Marr, R. L.

    1973-01-01

    A mathematical model for real-time flight simulation of a tilt rotor research aircraft was developed. The mathematical model was used to support the aircraft design, pilot training, and proof-of-concept aspects of the development program. The structure of the mathematical model is indicated by a block diagram. The mathematical model differs from that for a conventional fixed wing aircraft principally in the added requirement to represent the dynamics and aerodynamics of the rotors, the interaction of the rotor wake with the airframe, and the rotor control and drive systems. The constraints imposed on the mathematical model are defined.

  9. Research and development program for non-linear structural modeling with advanced time-temperature dependent constitutive relationships

    NASA Technical Reports Server (NTRS)

    Walker, K. P.

    1981-01-01

    Results of a 20-month research and development program for nonlinear structural modeling with advanced time-temperature constitutive relationships are reported. The program included: (1) the evaluation of a number of viscoplastic constitutive models in the published literature; (2) incorporation of three of the most appropriate constitutive models into the MARC nonlinear finite element program; (3) calibration of the three constitutive models against experimental data using Hastelloy-X material; and (4) application of the most appropriate constitutive model to a three dimensional finite element analysis of a cylindrical combustor liner louver test specimen to establish the capability of the viscoplastic model to predict component structural response.

  10. Faster than Real-Time Dynamic Simulation for Large-Size Power System with Detailed Dynamic Models using High-Performance Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Jin, Shuangshuang; Chen, Yousu

    This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACKTM high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.

  11. Model Eliciting Activities: Fostering 21st Century Learners

    ERIC Educational Resources Information Center

    Stohlmann, Micah

    2013-01-01

    Real world mathematical modeling activities can develop needed and valuable 21st century skills. The knowledge and skills to become adept at mathematical modeling need to develop over time, and students in the elementary grades should have experiences with mathematical modeling. For this to occur, elementary teachers need to have positive…

  12. A Multivariate Model for the Meta-Analysis of Study Level Survival Data at Multiple Times

    ERIC Educational Resources Information Center

    Jackson, Dan; Rollins, Katie; Coughlin, Patrick

    2014-01-01

    Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and…

  13. Review of Thawing Time Prediction Models Depending on Process Conditions and Product Characteristics

    PubMed Central

    Kluza, Franciszek; Spiess, Walter E. L.; Kozłowicz, Katarzyna

    2016-01-01

    Determining the thawing times of frozen foods is a challenging problem because the thermophysical properties of the product change during thawing. A number of calculation models and solutions have been developed, ranging from relatively simple analytical equations based on a number of assumptions to a group of empirical approaches that sometimes require complex calculations. In this paper, analytical, empirical and graphical models are presented and critically reviewed, and the conditions of solution, limitations and possible applications of the models are discussed. The graphical and semi-graphical models are derived from numerical methods. Numerical methods themselves are not always practical, since the calculations are time-consuming and the required specialized software and equipment can be costly. For these reasons, analytical-empirical models are more useful for engineering. It is demonstrated that there is no simple, accurate and feasible analytical method for thawing time prediction; consequently, simplified methods are needed for estimating the thawing time of agricultural and food products. The review reveals the need for further improvement of the existing solutions, or development of new ones, that will enable accurate determination of thawing time within a wide range of practical heat transfer conditions during processing. PMID:27904387

  14. Space logistics simulation: Launch-on-time

    NASA Technical Reports Server (NTRS)

    Nii, Kendall M.

    1990-01-01

    During 1989-1990 the Center for Space Construction developed the Launch-On-Time (L-O-T) Model to help assess and improve the likelihood of successfully supporting space construction requiring multi-logistic delivery flights. The model establishes a reference against which the L-O-T probability, and improvements to it, can be judged. The measure of improvement was chosen as the percent reduction in E(S(sub N)), the total expected amount of unscheduled 'hold' time. We have also previously developed an approach to determining the reduction in E(S(sub N)) by reducing some of the causes of unscheduled holds and increasing the speed at which the problems causing the holds may be 'fixed.' We provided a mathematical (binary linear programming) model for measuring the percent reduction in E(S(sub N)) given such improvements. In this presentation we exercise the model that was developed, draw conclusions about the methods used and the data available and needed, and make suggestions for areas of improvement in 'real world' application of the model.

  15. Modelling breast cancer tumour growth for a stable disease population.

    PubMed

    Isheden, Gabriel; Humphreys, Keith

    2017-01-01

    Statistical models of breast cancer tumour progression have been used to further our knowledge of the natural history of breast cancer, to evaluate mammography screening in terms of mortality, to estimate overdiagnosis, and to estimate the impact of lead-time bias when comparing survival times between screen detected cancers and cancers found outside of screening programs. Multi-state Markov models have been widely used, but several research groups have proposed other modelling frameworks based on specifying an underlying biological continuous tumour growth process. These continuous models offer some advantages over multi-state models and have been used, for example, to quantify screening sensitivity in terms of mammographic density, and to quantify the effect of body size covariates on tumour growth and time to symptomatic detection. As of yet, however, the continuous tumour growth models are not sufficiently developed and require extensive computing to obtain parameter estimates. In this article, we provide a detailed description of the underlying assumptions of the continuous tumour growth model, derive new theoretical results for the model, and show how these results may help the development of this modelling framework. In illustrating the approach, we develop a model for mammography screening sensitivity, using a sample of 1901 post-menopausal women diagnosed with invasive breast cancer.

  16. Modeling the Secondary Drying Stage of Freeze Drying: Development and Validation of an Excel-Based Model.

    PubMed

    Sahni, Ekneet K; Pikal, Michael J

    2017-03-01

    Although several mathematical models of primary drying have been developed over the years, with significant impact on the efficiency of process design, models of secondary drying have been confined to highly complex formulations. The simple-to-use Excel-based model developed here is, in essence, a series of steady state calculations of heat and mass transfer in the two halves of the dry layer, in which drying time is divided into a large number of time steps with steady state conditions prevailing within each step. Water desorption isotherm and mass transfer coefficient data are required. We use the Excel "Solver" to estimate the parameters that define the mass transfer coefficient by minimizing the deviations in water content between the calculation and a calibration drying experiment. This tool allows the user to input the parameters specific to the product, process, container, and equipment. Temporal variations in average moisture content and product temperature are outputs and are compared with experiment. We observe good agreement between experiments and calculations, generally well within experimental error, for sucrose at various concentrations, temperatures, and ice nucleation temperatures. We conclude that this model can serve as an important process development tool for process design and manufacturing problem-solving. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
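The stepwise steady-state idea behind such a model can be sketched in a few lines: within each time step the desorption flux is treated as constant, so the average moisture content relaxes toward its equilibrium value. The rate constant and moisture values below are synthetic, not the paper's fitted parameters.

```python
# Minimal sketch of stepwise steady-state secondary drying: over each step of
# length dt the flux -k*(W - W_eq) is held constant (all values synthetic).
def secondary_drying(W0, W_eq, k, dt, n_steps):
    W, history = W0, [W0]
    for _ in range(n_steps):
        W = W + dt * (-k * (W - W_eq))  # steady-state flux over this step
        history.append(W)
    return history

# W in % moisture, k in 1/h, dt in h
profile = secondary_drying(W0=5.0, W_eq=0.5, k=0.3, dt=0.5, n_steps=40)
```

In the actual model the coefficient playing the role of `k` is what Solver calibrates against a drying experiment; here it is simply assumed.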

  17. Compressed Sensing for Metrics Development

    NASA Astrophysics Data System (ADS)

    McGraw, R. L.; Giangrande, S. E.; Liu, Y.

    2012-12-01

    Models by their very nature tend to be sparse in the sense that they are designed, with a few optimally selected key parameters, to provide simple yet faithful representations of a complex observational dataset or computer simulation output. This paper seeks to apply methods from compressed sensing (CS), a new area of applied mathematics currently undergoing a very rapid development (see for example Candes et al., 2006), to FASTER needs for new approaches to model evaluation and metrics development. The CS approach will be illustrated for a time series generated using a few-parameter (i.e. sparse) model. A seemingly incomplete set of measurements, taken at a just few random sampling times, is then used to recover the hidden model parameters. Remarkably there is a sharp transition in the number of required measurements, beyond which both the model parameters and time series are recovered exactly. Applications to data compression, data sampling/collection strategies, and to the development of metrics for model evaluation by comparison with observation (e.g. evaluation of model predictions of cloud fraction using cloud radar observations) are presented and discussed in context of the CS approach. Cited reference: Candes, E. J., Romberg, J., and Tao, T. (2006), Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, 52, 489-509.
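The "sharp transition" experiment described above can be reproduced in miniature: a sparse coefficient vector is measured through a small random matrix and recovered exactly once enough measurements are taken. This sketch uses orthogonal matching pursuit, one standard CS recovery algorithm, with synthetic data; it is an illustration of the principle, not the authors' implementation.

```python
import numpy as np

# A length-n signal with k nonzero coefficients, observed through m random
# Gaussian measurements; OMP recovers the sparse vector from y = A @ x.
rng = np.random.default_rng(1)
n, k, m = 64, 3, 32
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = [1.5, -2.0, 0.8]
A = rng.normal(0, 1, (m, n)) / np.sqrt(m)   # random measurement matrix
y = A @ x_true

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily grow the support, re-fit each time."""
    support, r = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

x_hat = omp(A, y, k)
```

With m well below n but comfortably above the sparsity level, the recovery is exact, which is the phenomenon the abstract exploits for model evaluation.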

  18. Toward a Time-Domain Fractal Lightning Simulation

    NASA Astrophysics Data System (ADS)

    Liang, C.; Carlson, B. E.; Lehtinen, N. G.; Cohen, M.; Lauben, D.; Inan, U. S.

    2010-12-01

    Electromagnetic simulations of lightning are useful for prediction of lightning properties and exploration of the underlying physical behavior. Fractal lightning models predict the spatial structure of the discharge, but thus far do not provide much information about discharge behavior in time and therefore cannot predict electromagnetic wave emissions or current characteristics. Here we develop a time-domain fractal lightning simulation from Maxwell's equations, the method of moments with the thin wire approximation, an adaptive time-stepping scheme, and a simplified electrical model of the lightning channel. The model predicts current pulse structure and electromagnetic wave emissions and can be used to simulate the entire duration of a lightning discharge. The model can be used to explore the electrical characteristics of the lightning channel, the temporal development of the discharge, and the effects of these characteristics on observable electromagnetic wave emissions.

  19. Real-time control data wrangling for development of mathematical control models of technological processes

    NASA Astrophysics Data System (ADS)

    Vasilyeva, N. V.; Koteleva, N. I.; Fedorova, E. R.

    2018-05-01

    This research is motivated by the need to stabilize the composition of the products of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Its goal is to identify the most suitable methods of aggregating real-time data for the development of a mathematical model to control the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Statistical methods for analyzing the historical data of a real technological object, together with correlation analysis of the process parameters, are described. Factors that exert the greatest influence on the main output parameter (copper content in matte) and drive the physical-chemical transformations are identified. An approach to processing real-time data for the development of a mathematical control model of the melting process is proposed, and the stages of processing real-time information are considered. The adopted data-aggregation methodology, suitable for developing a control model of the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace, allows the obtained results to be interpreted for further practical application.

  20. Time dependent data, time independent models: challenges of updating Australia's National Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Griffin, J.; Clark, D.; Allen, T.; Ghasemi, H.; Leonard, M.

    2017-12-01

    Standard probabilistic seismic hazard assessment (PSHA) simulates earthquake occurrence as a time-independent process. However, paleoseismic studies in slowly deforming regions such as Australia show compelling evidence that large earthquakes on individual faults cluster within active periods, followed by long periods of quiescence. Therefore the instrumental earthquake catalog, which forms the basis of PSHA earthquake recurrence calculations, may only capture the state of the system over the period of the catalog. Together this means that the data informing our PSHA may not be truly time-independent. This poses challenges in developing PSHAs for typical design probabilities (such as a 10% in 50 years probability of exceedance): Is the present state observed through the instrumental catalog useful for estimating the next 50 years of earthquake hazard? Can paleo-earthquake data, which show variations in earthquake frequency over time scales of tens of thousands of years or more, be robustly included in such PSHA models? Can a single PSHA logic tree be useful over a range of different probabilities of exceedance? In developing an updated PSHA for Australia, decadal-scale data based on instrumental earthquake catalogs (i.e., alternative area-based source models and smoothed seismicity models) are integrated with paleo-earthquake data through inclusion of a fault source model. Use of time-dependent non-homogeneous Poisson models allows earthquake clustering to be modeled on fault sources with sufficient paleo-earthquake data. This study assesses the performance of alternative models by extracting decade-long segments of the instrumental catalog, developing earthquake probability models based on the remaining catalog, and testing performance against the extracted component of the catalog. Although this provides insights into model performance over the short term, for longer timescales it is recognised that model choice is subject to considerable epistemic uncertainty. Therefore, a formal expert elicitation process has been used to assign weights to alternative models for the 2018 update to Australia's national PSHA.

  1. MoPCoM Methodology: Focus on Models of Computation

    NASA Astrophysics Data System (ADS)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  2. Catalytic ignition model in a monolithic reactor with in-depth reaction

    NASA Technical Reports Server (NTRS)

    Tien, Ta-Ching; Tien, James S.

    1990-01-01

    Two transient models have been developed to study the catalytic ignition in a monolithic catalytic reactor. The special feature in these models is the inclusion of thermal and species structures in the porous catalytic layer. There are many time scales involved in the catalytic ignition problem, and these two models are developed with different time scales. In the full transient model, the equations are non-dimensionalized by the shortest time scale (mass diffusion across the catalytic layer). It is therefore accurate but is computationally costly. In the energy-integral model, only the slowest process (solid heat-up) is taken as nonsteady. It is approximate but computationally efficient. In the computations performed, the catalyst is platinum and the reactants are rich mixtures of hydrogen and oxygen. One-step global chemical reaction rates are used for both gas-phase homogeneous reaction and catalytic heterogeneous reaction. The computed results reveal the transient ignition processes in detail, including the structure variation with time in the reactive catalytic layer. An ignition map using reactor length and catalyst loading is constructed. The comparison of computed results between the two transient models verifies the applicability of the energy-integral model when the time is greater than the second largest time scale of the system. It also suggests that a proper combined use of the two models can catch all the transient phenomena while minimizing the computational cost.

  3. Ecological monitoring in a discrete-time prey-predator model.

    PubMed

    Gámez, M; López, I; Rodríguez, C; Varga, Z; Garay, J

    2017-09-21

    The paper is aimed at the methodological development of ecological monitoring in discrete-time dynamic models. In earlier papers, in the framework of continuous-time models, we have shown how a systems-theoretical methodology can be applied to the monitoring of the state process of a system of interacting populations, also estimating certain abiotic environmental changes such as pollution, climatic or seasonal changes. In practice, however, there may be good reasons to use discrete-time models. (For instance, there may be discrete cycles in the development of the populations, or observations can be made only at discrete time steps.) Therefore the present paper is devoted to the development of the monitoring methodology in the framework of discrete-time models of population ecology. By monitoring we mean that, observing only certain component(s) of the system, we reconstruct the whole state process. This may be necessary, e.g., when in a complex ecosystem the observation of the densities of certain species is impossible, or too expensive. For the first presentation of the offered methodology, we have chosen a discrete-time version of the classical Lotka-Volterra prey-predator model. This is a minimal but not trivial system where the methodology can still be presented. We also show how this methodology can be applied to estimate the effect of an abiotic environmental change, using a component of the population system as an environmental indicator. Although this approach is illustrated in a simplest possible case, it can be easily extended to larger ecosystems with several interacting populations and different types of abiotic environmental effects. Copyright © 2017 Elsevier Ltd. All rights reserved.
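The monitoring idea, reconstructing an unobserved component from observations of another, can be shown on a discrete-time Lotka-Volterra map in a few lines. This is a toy illustration with made-up parameters, not the paper's observer design: here the prey update can be inverted algebraically, so two consecutive prey observations determine the unobserved predator density.

```python
# Discrete Lotka-Volterra map (synthetic parameters):
#   x_{t+1} = x_t (1 + a - b y_t),  y_{t+1} = y_t (1 - c + d x_t)
a, b, c, d = 0.5, 0.02, 0.4, 0.01

def step(x, y):
    return x * (1 + a - b * y), y * (1 - c + d * x)

# generate a "true" trajectory near the equilibrium (x* = c/d, y* = a/b)
xs, ys = [38.0], [24.0]
for _ in range(25):
    x, y = step(xs[-1], ys[-1])
    xs.append(x)
    ys.append(y)

def reconstruct_predator(x_t, x_next):
    """Invert the prey update x_{t+1} = x_t (1 + a - b y_t) for y_t."""
    return (1 + a - x_next / x_t) / b

# monitoring: observe only prey, recover the predator state process
y_hat = [reconstruct_predator(xs[t], xs[t + 1]) for t in range(25)]
```

In the paper the reconstruction is done with systems-theoretic observers rather than direct inversion, but the goal is the same: recover the whole state process from partial observations.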

  4. Modeling commodity salam contract between two parties for discrete and continuous time series

    NASA Astrophysics Data System (ADS)

    Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd

    2017-08-01

    For Islamic finance to remain competitive with conventional finance, new syariah-compliant products, such as Islamic derivatives for managing risk, need to be developed. Under syariah principles and regulations, however, financial instruments must not conflict with five prohibited elements: riba (interest), rishwah (corruption), gharar (uncertainty or unnecessary risk), maysir (speculation or gambling) and jahl (taking advantage of the counterparty's ignorance). This study proposes building a traditional Islamic contract, salam, into an Islamic derivative product. Although many studies have discussed and proposed implementations of the salam contract as an Islamic product, they are mostly qualitative and concerned with legal issues. Since quantitative treatments of the salam contract are lacking, this study introduces mathematical models that value the appropriate salam price for a commodity salam contract between two parties. In modeling the contract, the study modifies the existing conventional derivative model, with adjustments to comply with syariah rules and regulations. The cost of carry model is chosen as the foundation for developing the commodity salam model between two parties for discrete and continuous time series. Because the conventional time value of money arises from the concept of interest, which is prohibited in Islam, the study instead adopts the Islamic time value of money, known as positive time preference, in modeling the commodity salam contract for discrete and continuous time series.
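The cost-of-carry foundation mentioned above has a simple closed form in both the discrete and continuous cases: a deferred-delivery price grows from the spot price at a compounding rate over the maturity. The sketch below states only that textbook relation with synthetic numbers; in the study's setting the rate would represent the positive time preference rather than an interest rate, and further syariah adjustments apply.

```python
import math

# Textbook cost-of-carry deferred price from spot S, rate r, maturity T years
# (illustrative values; r stands in for a generic time-preference rate).
def deferred_price_discrete(S, r, T):
    return S * (1 + r) ** T        # discrete annual compounding

def deferred_price_continuous(S, r, T):
    return S * math.exp(r * T)     # continuous compounding

p_d = deferred_price_discrete(100.0, 0.05, 2)
p_c = deferred_price_continuous(100.0, 0.05, 2)
```

For the same rate and maturity, continuous compounding gives a slightly higher deferred price than discrete compounding, which is why the paper treats the two time-series cases separately.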

  5. Time optimal control of a jet engine using a quasi-Hermite interpolation model. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Comiskey, J. G.

    1979-01-01

    This work made preliminary efforts to generate nonlinear numerical models of a two-spooled turbofan jet engine, and subject these models to a known method of generating global, nonlinear, time optimal control laws. The models were derived numerically, directly from empirical data, as a first step in developing an automatic modelling procedure.

  6. Improving Gastric Cancer Outcome Prediction Using Single Time-Point Artificial Neural Network Models

    PubMed Central

    Nilsaz-Dezfouli, Hamid; Abu-Bakar, Mohd Rizam; Arasan, Jayanthi; Adam, Mohd Bakri; Pourhoseingholi, Mohamad Amin

    2017-01-01

    In cancer studies, the prediction of cancer outcome based on a set of prognostic variables has been a long-standing topic of interest. Current statistical methods for survival analysis offer the possibility of modelling cancer survivability but require unrealistic assumptions about the survival time distribution or proportionality of hazard. Therefore, attention must be paid in developing nonlinear models with less restrictive assumptions. Artificial neural network (ANN) models are primarily useful in prediction when nonlinear approaches are required to sift through the plethora of available information. The applications of ANN models for prognostic and diagnostic classification in medicine have attracted a lot of interest. The applications of ANN models in modelling the survival of patients with gastric cancer have been discussed in some studies without completely considering the censored data. This study proposes an ANN model for predicting gastric cancer survivability, considering the censored data. Five separate single time-point ANN models were developed to predict the outcome of patients after 1, 2, 3, 4, and 5 years. The performance of ANN model in predicting the probabilities of death is consistently high for all time points according to the accuracy and the area under the receiver operating characteristic curve. PMID:28469384

  7. On the Space-Time Structure of Sheared Turbulence

    NASA Astrophysics Data System (ADS)

    de Maré, Martin; Mann, Jakob

    2016-09-01

    We develop a model that predicts all two-point correlations in high Reynolds number turbulent flow, in both space and time. This is accomplished by combining the design philosophies behind two existing models, the Mann spectral velocity tensor, in which isotropic turbulence is distorted according to rapid distortion theory, and Kristensen's longitudinal coherence model, in which eddies are simultaneously advected by larger eddies as well as decaying. The model is compared with data from both observations and large-eddy simulations and is found to predict spatial correlations comparable to the Mann spectral tensor and temporal coherence better than any known model. Within the developed framework, Lagrangian two-point correlations in space and time are also predicted, and the predictions are compared with measurements of isotropic turbulence. The required input to the models, which are formulated as spectral velocity tensors, can be estimated from measured spectra or be derived from the rate of dissipation of turbulent kinetic energy, the friction velocity and the mean shear of the flow. The developed models can, for example, be used in wind-turbine engineering, in applications such as lidar-assisted feed forward control and wind-turbine wake modelling.

  8. Development of a subway operation incident delay model using accelerated failure time approaches.

    PubMed

    Weng, Jinxian; Zheng, Yang; Yan, Xuedong; Meng, Qiang

    2014-12-01

    This study aims to develop a subway operational incident delay model using the parametric accelerated failure time (AFT) approach. Six parametric AFT models, including log-logistic, lognormal and Weibull models with fixed and random parameters, are built based on Hong Kong subway operation incident data from 2005 to 2012. In addition, a Weibull model with gamma heterogeneity is considered for comparison of model performance. The goodness-of-fit test results show that the log-logistic AFT model with random parameters is most suitable for estimating subway incident delay. The results show that a longer subway operation incident delay is highly correlated with the following factors: power cable failure, signal cable failure, turnout communication disruption and crashes involving a casualty. Vehicle failure has the least impact on the increase in subway operation incident delay. Based on these results, several measures, such as the use of short-distance wireless communication technology (e.g., Wifi and Zigbee), are suggested to shorten the delays caused by subway operation incidents. Finally, the temporal transferability test results show that the developed log-logistic AFT model with random parameters is stable over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
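In a log-logistic AFT model, covariates act multiplicatively on the time scale: the survival function is S(t) = 1 / (1 + (t/λ)^p) with λ = exp(x·β), so the median delay is exactly λ and each coefficient translates into an acceleration factor exp(β). The coefficients below are hypothetical, chosen only to show the mechanics, not the paper's estimates.

```python
import math

# Log-logistic AFT survival function: S(t) = 1 / (1 + (t/lam)**p)
def survival(t, lam, p):
    return 1.0 / (1.0 + (t / lam) ** p)

beta0, beta_cable = math.log(30.0), 0.6   # hypothetical: 30-min baseline median
p = 2.0                                   # hypothetical shape parameter

lam_base = math.exp(beta0)                 # no cable failure
lam_cable = math.exp(beta0 + beta_cable)   # power-cable failure stretches time

# S(lam) = 0.5, so lam is the median delay; the ratio lam_cable / lam_base
# = exp(beta_cable) is the multiplicative effect on every delay quantile.
```

This multiplicative reading is what lets the paper rank factors like power cable failure by how much they stretch the incident delay distribution.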

  9. A Distributed Online Curriculum and Courseware Development Model

    ERIC Educational Resources Information Center

    Durdu, Pinar Onay; Yalabik, Nese; Cagiltay, Kursat

    2009-01-01

    A distributed online curriculum and courseware development model (DONC[superscript 2]) is developed and tested in this study. Courseware development teams whose members may work in different institutions and need to develop high-quality, reduced-cost, on-time products will be the users of DONC[superscript 2]. The related features from the disciplines of…

  10. Empirical modeling of environment-enhanced fatigue crack propagation in structural alloys for component life prediction

    NASA Technical Reports Server (NTRS)

    Richey, Edward, III

    1995-01-01

    This research aims to develop the methods and understanding needed to incorporate time- and loading-variable-dependent environmental effects on fatigue crack propagation (FCP) into computerized fatigue life prediction codes such as NASA FLAGRO (NASGRO). In particular, the effect of loading frequency on FCP rates in alpha + beta titanium alloys exposed to an aqueous chloride solution is investigated. The approach couples empirical modeling of environmental FCP with corrosion fatigue experiments. Three different computer models have been developed and incorporated into the DOS executable program UVAFAS. A multiple power law model is available that can fit a set of fatigue data to a multiple power law equation. A model has also been developed that implements the Wei and Landes linear superposition model, as well as an interpolative model that can be used to interpolate trends in fatigue behavior based on changes in loading characteristics (stress ratio, frequency, and hold times).
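Each segment of a multiple power law is a Paris-type relation, da/dN = C·ΔK^m, which is linear in log-log space and can therefore be fit by ordinary least squares. The sketch below fits one such segment to synthetic growth-rate data; the constants are illustrative, not UVAFAS output.

```python
import numpy as np

# Paris-type segment da/dN = C * dK**m fit as a line in log-log coordinates
# (synthetic, noise-free data for the sketch).
dK = np.array([10.0, 15.0, 20.0, 30.0, 40.0])  # stress intensity range, MPa*sqrt(m)
dadn = 1e-11 * dK ** 3.0                        # crack growth rate, m/cycle

m, logC = np.polyfit(np.log(dK), np.log(dadn), 1)
C = np.exp(logC)
```

A multiple power law simply repeats this fit over adjacent ΔK ranges, stitching several (C, m) pairs into one piecewise curve.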

  11. Predictive Microbiology and Food Safety Applications

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling is the science of systematic study of recurrent events or phenomena. When models are properly developed, their applications may save costs and time. For microbial food safety research and applications, predictive microbiology models may be developed based on the fact that most ...

  12. Particle simulation of Coulomb collisions: Comparing the methods of Takizuka and Abe and Nanbu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Chiaming; Lin, Tungyou; Caflisch, Russel

    2008-04-20

    The interactions of charged particles in a plasma are governed by long-range Coulomb collisions. We compare two widely used Monte Carlo models for Coulomb collisions: one developed by Takizuka and Abe in 1977, the other by Nanbu in 1997. We perform deterministic and statistical error analysis with respect to particle number and time step. The two models produce similar stochastic errors, but Nanbu's model gives smaller time step errors. Error comparisons between the two methods are presented.

  13. Analysis of hourly crash likelihood using unbalanced panel data mixed logit model and real-time driving environmental big data.

    PubMed

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2018-06-01

    Driving environment, including road surface conditions and traffic states, often changes over time and influences crash probability considerably. Traditional crash frequency models developed at large temporal scales struggle to capture the time-varying characteristics of these factors, which may cause a substantial loss of critical driving-environment information for crash prediction. Crash prediction models with refined temporal data (hourly records) are developed to characterize the time-varying nature of these contributing factors. Unbalanced panel data mixed logit models are developed to analyze the hourly crash likelihood of highway segments. The refined temporal driving environmental data, including road surface and traffic conditions, obtained from the Road Weather Information System (RWIS), are incorporated into the models. Model estimation results indicate that traffic speed, traffic volume, curvature and a chemically wet road surface indicator are better modeled as random parameters. The estimation results of the mixed logit models based on unbalanced panel data show that a number of factors are related to crash likelihood on I-25. Specifically, a weekend indicator, a November indicator, a low speed limit and a long remaining service life of rutting indicator are found to increase crash likelihood, while a 5-am indicator and the number of merging ramps per lane per mile are found to decrease crash likelihood. The study underscores and confirms the unique and significant impacts on crashes imposed by real-time weather, road surface, and traffic conditions. With the unbalanced panel data structure, the rich information from real-time driving environmental big data can be well incorporated. Copyright © 2018 National Safety Council and Elsevier Ltd. All rights reserved.

  14. Estimating water temperatures in small streams in western Oregon using neural network models

    USGS Publications Warehouse

    Risley, John C.; Roehl, Edwin A.; Conrads, Paul

    2003-01-01

    Artificial neural network models were developed to estimate water temperatures in small streams using data collected at 148 sites throughout western Oregon from June to September 1999. The sites were located on 1st-, 2nd-, or 3rd-order streams having undisturbed or minimally disturbed conditions. Data collected at each site for model development included continuous hourly water temperature and a description of riparian habitat. Additional data pertaining to the landscape characteristics of the basins upstream of the sites were assembled using geographic information system (GIS) techniques. Hourly meteorological time series data collected at 25 locations within the study region also were assembled. Clustering analysis was used to partition 142 sites into 3 groups, and separate models were developed for each group. The riparian habitat, basin characteristic, and meteorological time series data were the models' independent variables, and the water temperature time series were the dependent variables. Approximately one-third of the data vectors were used for model training, and the remaining two-thirds were used for model testing. Critical input variables included riparian shade, site elevation, and percentage of forested area of the basin. Coefficient of determination and root mean square error for the models ranged from 0.88 to 0.99 and 0.05 to 0.59 °C, respectively. The models also were tested and validated using temperature time series, habitat, and basin landscape data from 6 sites that were separate from the 142 sites used to develop the models. The models are capable of estimating water temperatures at locations along 1st-, 2nd-, and 3rd-order streams in western Oregon. The model user must assemble riparian habitat and basin landscape characteristics data for a site of interest. These data, in addition to meteorological data, are model inputs. Output from the models includes simulated hourly water temperatures for the June to September period. Adjustments can be made to the shade input data to simulate the effects of minimum or maximum shade on water temperatures.

  15. Spatial heterogeneity in the timing of birch budburst in response to future climate warming in Ireland

    NASA Astrophysics Data System (ADS)

    Caffarra, Amelia; Zottele, Fabio; Gleeson, Emily; Donnelly, Alison

    2014-05-01

    In order to predict the impact of future climate warming on trees it is important to quantify the effect climate has on their development. Our understanding of the phenological response to environmental drivers has given rise to various mathematical models of the annual growth cycle of plants. These models simulate the timing of phenophases by quantifying the relationship between development and its triggers, typically temperature. In addition, other environmental variables have an important role in determining the timing of budburst. For example, photoperiod has been shown to have a strong influence on phenological events of a number of tree species, including Betula pubescens (birch). A recently developed model for birch (DORMPHOT), which integrates the effects of temperature and photoperiod on budburst, was applied to future temperature projections from a 19-member ensemble of regional climate simulations (on a 25 km grid) generated as part of the ENSEMBLES project, to simulate the timing of birch budburst in Ireland each year up to the end of the present century. Gridded temperature time series data from the climate simulations were used as input to the DORMPHOT model to simulate future budburst timing. The results showed an advancing trend in the timing of birch budburst over most regions in Ireland up to 2100. Interestingly, this trend appeared greater in the northeast of the country than in the southwest, where budburst is currently relatively early. These results could have implications for future forest planning, species distribution modeling, and the birch allergy season.
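Models of this kind quantify development against accumulated temperature, here modulated by photoperiod. A minimal thermal-time sketch in that spirit, with illustrative thresholds rather than DORMPHOT parameter values:

```python
def budburst_day(daily_temps, daylengths, base_temp=5.0,
                 crit_photoperiod=11.0, forcing_target=150.0):
    """Day of year on which accumulated forcing reaches the target.

    Forcing units are degree-days above base_temp, scaled by a simple
    photoperiod factor (longer days count more). All thresholds are
    illustrative, not DORMPHOT parameter values.
    """
    state = 0.0
    for day, (t, dl) in enumerate(zip(daily_temps, daylengths), start=1):
        photo = min(1.0, max(0.0, dl / crit_photoperiod))
        state += max(0.0, t - base_temp) * photo
        if state >= forcing_target:
            return day
    return None  # target never reached within the series

# A warmer spring reaches the forcing threshold sooner (earlier budburst)
cool = budburst_day([8.0] * 120, [12.0] * 120)
warm = budburst_day([12.0] * 120, [12.0] * 120)
```

Running the model on gridded daily temperature projections, as the study does, amounts to applying such a function to each grid cell and year.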

  16. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
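The autoregressive model named above is simple to state; as a sketch, an AR(1) process and a least-squares recovery of its coefficient (pure Python, illustrative only):

```python
import random

def simulate_ar1(phi, n, seed=0, sigma=1.0):
    """AR(1): x[t] = phi * x[t-1] + e[t], with e ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)
        out.append(x)
    return out

def estimate_phi(series):
    """Least-squares estimate of phi from a lag-1 regression through the origin."""
    num = sum(a * b for a, b in zip(series[1:], series[:-1]))
    den = sum(a * a for a in series[:-1])
    return num / den

xs = simulate_ar1(0.7, 5000)
phi_hat = estimate_phi(xs)  # should be close to the true value 0.7
```

A moving average model would instead sum a few recent noise terms; combinations of the two give the ARMA family discussed in the paper.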

  17. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this method often lacks real-time data retrieval and sharing/interoperating capability. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed in this study: the real-time GIS data model manages real-time data, and the Sensor Web service platform, built on Sensor Web technologies, is implemented to support the realization of the data model. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total processing times of the two experiments are 3.7 s and 9.2 s. The experimental results show that integrating the real-time GIS data model and the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.

  18. An adaptive time-stepping strategy for solving the phase field crystal model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Zhengru, E-mail: zrzhang@bnu.edu.cn; Ma, Yuan, E-mail: yuner1022@gmail.com; Qiao, Zhonghua, E-mail: zqiao@polyu.edu.hk

    2013-09-15

    In this work, we propose an adaptive time step method for simulating the dynamics of the phase field crystal (PFC) model. Numerical simulation of the PFC model needs a long time to reach steady state, so a large time-stepping method is necessary. Unconditionally energy stable schemes are used to solve the PFC model. The time steps are adaptively determined based on the time derivative of the corresponding energy. The proposed time step adaptivity resolves not only the steady state solution but also the dynamical development of the solution efficiently and accurately. The numerical experiments demonstrate that CPU time is significantly reduced for long time simulations.
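The abstract ties the step size to the time derivative of the energy. One common rule in the adaptive time-stepping literature for gradient flows (assumed here as an illustration, not quoted from the paper) takes small steps while the energy decays quickly and approaches a maximum step near steady state:

```python
import math

def adaptive_dt(dE_dt, dt_min=1e-3, dt_max=1.0, alpha=1e3):
    """Adaptive step size from the energy's time derivative:
    dt = max(dt_min, dt_max / sqrt(1 + alpha * |dE/dt|^2)).

    Small steps when the energy changes fast; dt approaches dt_max as
    the solution nears steady state. alpha and the bounds are tunable
    and illustrative.
    """
    return max(dt_min, dt_max / math.sqrt(1.0 + alpha * dE_dt * dE_dt))

fast = adaptive_dt(10.0)  # rapid energy decay -> small step
slow = adaptive_dt(0.0)   # near steady state -> dt_max
```

Paired with an unconditionally energy stable scheme, such a rule lets long runs take large steps without losing the transient dynamics.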

  19. Analysis of Parametric Adaptive Signal Detection with Applications to Radars and Hyperspectral Imaging

    DTIC Science & Technology

    2010-02-01

    Several important issues associated with the proposed parametric model are discussed, including model order selection, training screening, and time-series-based whitening for the NS-AR model.

  20. Logical-Rule Models of Classification Response Times: A Synthesis of Mental-Architecture, Random-Walk, and Decision-Bound Approaches

    ERIC Educational Resources Information Center

    Fific, Mario; Little, Daniel R.; Nosofsky, Robert M.

    2010-01-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli…

  1. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology for training the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model performs a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
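The decompose-then-model pipeline (STL followed by ARIMA) can be sketched in simplified form: remove a seasonal component, fit a low-order autoregression to the remainder, and add the season back when forecasting. This toy version substitutes per-phase seasonal means for STL and an AR(1) for ARIMA:

```python
def seasonal_ar_forecast(series, period, steps):
    """Toy decompose-then-model forecast in the spirit of STL + ARIMA.

    Remove per-phase seasonal means, fit AR(1) to the remainder by
    least squares, then add the seasonal component back at each
    forecast step. A simplification of the paper's pipeline.
    """
    n = len(series)
    seasonal = [0.0] * period
    for phase in range(period):
        vals = series[phase::period]
        seasonal[phase] = sum(vals) / len(vals)
    remainder = [x - seasonal[i % period] for i, x in enumerate(series)]
    num = sum(a * b for a, b in zip(remainder[1:], remainder[:-1]))
    den = sum(a * a for a in remainder[:-1]) or 1.0  # guard a flat remainder
    phi = num / den
    last = remainder[-1]
    out = []
    for h in range(1, steps + 1):
        last = phi * last
        out.append(last + seasonal[(n - 1 + h) % period])
    return out

# Purely seasonal history: the forecast reproduces the seasonal pattern
hist = [0.0, 10.0] * 50
fc = seasonal_ar_forecast(hist, period=2, steps=4)
```

On real SNMP utilization data one would use a proper STL fit and an ARIMA order chosen from the data, as the paper does.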

  2. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE PAGES

    Yoo, Wucherl; Sim, Alex

    2016-06-24

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate the ever increasing data volume of large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach such as the Box-Jenkins methodology for training the ARIMA model, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model performs a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.

  3. NAPL source zone depletion model and its application to railroad-tank-car spills.

    PubMed

    Marruffo, Amanda; Yoon, Hongkyu; Schaeffer, David J; Barkan, Christopher P L; Saat, Mohd Rapik; Werth, Charles J

    2012-01-01

    We developed a new semi-analytical source zone depletion model (SZDM) for multicomponent light nonaqueous phase liquids (LNAPLs) and incorporated this into an existing screening model for estimating cleanup times for chemical spills from railroad tank cars that previously considered only single-component LNAPLs. Results from the SZDM compare favorably to those from a three-dimensional numerical model, and from another semi-analytical model that does not consider source zone depletion. The model was used to evaluate groundwater contamination and cleanup times for four complex mixtures of concern in the railroad industry. Among the petroleum hydrocarbon mixtures considered, the cleanup time of diesel fuel was much longer than E95, gasoline, and crude oil. This is mainly due to the high fraction of low solubility components in diesel fuel. The results demonstrate that the updated screening model with the newly developed SZDM is computationally efficient, and provides valuable comparisons of cleanup times that can be used in assessing the health and financial risk associated with chemical mixture spills from railroad-tank-car accidents. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
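The core mechanism of a multicomponent source-zone depletion model can be sketched with Raoult's law: each component dissolves in proportion to its mole fraction times its pure-component solubility, so low-solubility components (as in diesel fuel) persist longest. This is an illustrative discretization, not the SZDM equations:

```python
def deplete_source(moles, solubilities, water_flux, dt, steps):
    """Toy multicomponent NAPL source-zone depletion.

    Each step, a component's effective aqueous concentration follows
    Raoult's law (mole fraction times pure-component solubility), and
    the dissolved mass is carried away by the passing water. Units and
    values are illustrative only.
    """
    moles = list(moles)
    for _ in range(steps):
        total = sum(moles)
        if total <= 0.0:
            break
        for i, (n, s) in enumerate(zip(moles, solubilities)):
            conc = (n / total) * s  # Raoult's law effective solubility
            moles[i] = max(0.0, n - conc * water_flux * dt)
    return moles

# A low-solubility component (diesel-like) persists far longer than a
# soluble one, mirroring the long diesel cleanup times reported.
left = deplete_source([1.0, 1.0], [0.1, 0.001], water_flux=1.0, dt=1.0, steps=50)
```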

  4. Investigation on the Practicality of Developing Reduced Thermal Models

    NASA Technical Reports Server (NTRS)

    Lombardi, Giancarlo; Yang, Kan

    2015-01-01

    Throughout the spacecraft design and development process, detailed instrument thermal models are created to simulate their on-orbit behavior and to ensure that they do not exceed any thermal limits. These detailed models, while generating highly accurate predictions, can sometimes lead to long simulation run times, especially when integrated with a spacecraft observatory model. Therefore, reduced models containing less detail are typically produced in tandem with the detailed models so that results may be more readily available, albeit less accurate. In the current study, both reduced and detailed instrument models are integrated with their associated spacecraft bus models to examine the impact of instrument model reduction on run time and accuracy. Preexisting instrument and bus thermal model pairs from several projects were used to determine trends between detailed and reduced thermal models: the Mirror Optical Bench (MOB) on the Gravity and Extreme Magnetism Small Explorer (GEMS) spacecraft, the Advanced Topographic Laser Altimeter System (ATLAS) on the Ice, Cloud, and land Elevation Satellite-2 (ICESat-2), and the Neutral Mass Spectrometer (NMS) on the Lunar Atmosphere and Dust Environment Explorer (LADEE). Hot and cold cases were run for each model to capture behavior at both thermal extremes. It was found that, though decreasing the number of nodes from a detailed to a reduced model reduced the run time, the time savings were not large, nor was the relationship between the percentage of nodes reduced and the time saved linear. However, significant losses in accuracy were observed with greater model reduction. Reduced models are thus useful in decreasing run time, but there exists a threshold of reduction beyond which the loss in accuracy outweighs the benefit of reduced model runtime.

  5. Delay functions in trip assignment for transport planning process

    NASA Astrophysics Data System (ADS)

    Leong, Lee Vien

    2017-10-01

    In the transportation planning process, volume-delay and turn-penalty functions are needed in traffic assignment to determine travel times on road network links. The volume-delay function describes the speed-flow relationship, while the turn-penalty function describes the delay associated with making a turn at an intersection. The volume-delay function used in this study is the revised Bureau of Public Roads (BPR) function with constant parameters α and β of 0.8298 and 3.361, while the turn-penalty functions for signalized intersections were developed based on uniform, random and overflow delay models. Parameters such as green time, cycle time and saturation flow were used in the development of the turn-penalty functions. In order to assess the accuracy of the delay functions, the road network in the areas of Nibong Tebal, Penang and Parit Buntar, Perak was developed and modelled using transportation demand forecasting software. In order to calibrate the models, phase times and traffic volumes at fourteen signalised intersections within the study area were collected during morning and evening peak hours. The assigned volumes predicted using the revised BPR function and the developed turn-penalty functions show close agreement with the actual recorded traffic volumes, with accuracies ranging from 80.08% to 93.04% for the morning peak model and from 75.59% to 95.33% for the evening peak model. For the yield left-turn lanes, the lowest accuracies obtained for the morning and evening peak models were 60.94% and 69.74% respectively, while the highest accuracy obtained for both models was 100%. Therefore, it can be concluded that the development and use of delay functions based on local road conditions is important, as localised delay functions can produce better estimates of link travel times and hence better planning for future scenarios.
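The revised BPR function with the fitted parameters quoted above, together with a Webster-style uniform delay term (one common basis for a turn-penalty function; the paper's fitted forms may differ), can be sketched as:

```python
def bpr_travel_time(free_flow_time, volume, capacity, alpha=0.8298, beta=3.361):
    """Revised BPR volume-delay function with the study's fitted
    parameters (alpha = 0.8298, beta = 3.361)."""
    return free_flow_time * (1.0 + alpha * (volume / capacity) ** beta)

def uniform_delay(cycle, green, volume, saturation_flow):
    """Webster-style uniform delay at a signal (s/veh), a common basis
    for a turn-penalty function. Illustrative, not the paper's fit."""
    g_over_c = green / cycle
    x = min(volume / (saturation_flow * g_over_c), 0.99)  # degree of saturation
    return 0.5 * cycle * (1.0 - g_over_c) ** 2 / (1.0 - g_over_c * x)

t_free = bpr_travel_time(60.0, 0.0, 1800.0)     # empty link: free-flow time
t_cap = bpr_travel_time(60.0, 1800.0, 1800.0)   # at capacity: 1 + alpha times longer
d = uniform_delay(cycle=90.0, green=30.0, volume=500.0, saturation_flow=1800.0)
```

In assignment, the link travel time is the BPR value plus the applicable turn penalty at the downstream intersection.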

  6. Predictive Modeling and Concentration of the Risk of Suicide: Implications for Preventive Interventions in the US Department of Veterans Affairs

    PubMed Central

    McCarthy, John F.; Katz, Ira R.; Thompson, Caitlin; Kemp, Janet; Hannemann, Claire M.; Nielson, Christopher; Schoenbaum, Michael

    2015-01-01

    Objectives. The Veterans Health Administration (VHA) evaluated the use of predictive modeling to identify patients at risk for suicide and to supplement ongoing care with risk-stratified interventions. Methods. Suicide data came from the National Death Index. Predictors were measures from VHA clinical records incorporating patient-months from October 1, 2008, to September 30, 2011, for all suicide decedents and 1% of living patients, divided randomly into development and validation samples. We used data on all patients alive on September 30, 2010, to evaluate predictions of suicide risk over 1 year. Results. Modeling demonstrated that suicide rates were 82 and 60 times greater than the rate in the overall sample in the highest 0.01% stratum for calculated risk for the development and validation samples, respectively; 39 and 30 times greater in the highest 0.10%; 14 and 12 times greater in the highest 1.00%; and 6.3 and 5.7 times greater in the highest 5.00%. Conclusions. Predictive modeling can identify high-risk patients who were not identified on clinical grounds. VHA is developing modeling to enhance clinical care and to guide the delivery of preventive interventions. PMID:26066914

  7. Predictive Modeling and Concentration of the Risk of Suicide: Implications for Preventive Interventions in the US Department of Veterans Affairs.

    PubMed

    McCarthy, John F; Bossarte, Robert M; Katz, Ira R; Thompson, Caitlin; Kemp, Janet; Hannemann, Claire M; Nielson, Christopher; Schoenbaum, Michael

    2015-09-01

    The Veterans Health Administration (VHA) evaluated the use of predictive modeling to identify patients at risk for suicide and to supplement ongoing care with risk-stratified interventions. Suicide data came from the National Death Index. Predictors were measures from VHA clinical records incorporating patient-months from October 1, 2008, to September 30, 2011, for all suicide decedents and 1% of living patients, divided randomly into development and validation samples. We used data on all patients alive on September 30, 2010, to evaluate predictions of suicide risk over 1 year. Modeling demonstrated that suicide rates were 82 and 60 times greater than the rate in the overall sample in the highest 0.01% stratum for calculated risk for the development and validation samples, respectively; 39 and 30 times greater in the highest 0.10%; 14 and 12 times greater in the highest 1.00%; and 6.3 and 5.7 times greater in the highest 5.00%. Predictive modeling can identify high-risk patients who were not identified on clinical grounds. VHA is developing modeling to enhance clinical care and to guide the delivery of preventive interventions.
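The concentration-of-risk figures above amount to comparing the event rate in the top predicted-risk stratum with the rate in the overall sample. A minimal sketch with toy data:

```python
def rate_ratio_top(scores, outcomes, fraction):
    """Event-rate ratio in the top `fraction` of predicted risk
    relative to the overall sample, mirroring how concentration of
    risk is reported (e.g., the highest 0.01% stratum). Toy helper,
    not the VHA model."""
    pairs = sorted(zip(scores, outcomes), reverse=True)
    k = max(1, int(len(pairs) * fraction))
    top_rate = sum(o for _, o in pairs[:k]) / k
    overall = sum(outcomes) / len(outcomes)
    return top_rate / overall

# Perfectly informative scores concentrate all events in the top stratum
scores = [0.9, 0.8, 0.7, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]
outcomes = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
rr = rate_ratio_top(scores, outcomes, 0.2)  # top 2 of 10 hold both events
```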

  8. Comparative study of predicted and experimentally detected interplanetary shocks

    NASA Astrophysics Data System (ADS)

    Kartalev, M. D.; Grigorov, K. G.; Smith, Z.; Dryer, M.; Fry, C. D.; Sun, Wei; Deehr, C. S.

    2002-03-01

    We compare the real-time space weather predictions of shock arrival times at 1 AU made by the USAF/NOAA Shock Time of Arrival (STOA) and Interplanetary Shock Propagation Model (ISPM) models, and by the Exploration Physics International/University of Alaska Hakamada-Akasofu-Fry Solar Wind Model (HAF-v2), with a real-time analysis of plasma and field ACE data. The comparison is made using an algorithm developed on the basis of wavelet data analysis and an MHD identification procedure. The shock parameters are estimated for selected "candidate events". An automated Web-based interface periodically processes solar wind observations made by ACE at L1. Near-real-time results, as well as an archive of the registered events of interest, are available on a specially developed web site. A number of events are considered. These studies are essential for the validation of real-time space weather forecasts made from solar data.

  9. Some methodological issues in the longitudinal analysis of demographic data.

    PubMed

    Krishinan, P

    1982-12-01

    Most demographic data are macro (or aggregate) in nature. Some relevant methodological issues are presented here in a time series study using aggregate data. The micro-macro distinction is relative, and time enters into the micro and macro variables in different ways. A simple micro model of rural-urban migration is given. Method 1 is to assume homogeneity in behavior; Method 2 is a Bayesian estimation. A discussion of the results follows. Time series models of aggregate data are given. The nature of the model, predictive or explanatory, must be decided on. Explanatory models in longitudinal studies have been developed. Ways to go from the macro level to the micro level are discussed. The aggregation-disaggregation problem in demography is not similar to that in econometrics. To understand small populations, separate micro-level data have to be collected and analyzed, and appropriate models developed. Both types of models have their uses.

  10. A simple dynamic engine model for use in a real-time aircraft simulation with thrust vectoring

    NASA Technical Reports Server (NTRS)

    Johnson, Steven A.

    1990-01-01

    A simple dynamic engine model was developed at the NASA Ames Research Center, Dryden Flight Research Facility, for use in thrust vectoring control law development and real-time aircraft simulation. The simple dynamic engine model of the F404-GE-400 engine (General Electric, Lynn, Massachusetts) operates within the aircraft simulator. It was developed using tabular data generated from a complete nonlinear dynamic engine model supplied by the manufacturer. Engine dynamics were simulated using a throttle rate limiter and a low-pass filter. A method to account for the axial thrust loss resulting from thrust vectoring is also described, along with the development of the simple dynamic engine model and its incorporation into the F-18 high alpha research vehicle (HARV) thrust vectoring simulation. The simple dynamic engine model was evaluated at Mach 0.2, 35,000 ft altitude and at Mach 0.7, 35,000 ft altitude. The simple dynamic engine model is within 3 percent of the steady state response, and within 25 percent of the transient response, of the complete nonlinear dynamic engine model.
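The throttle rate limiter followed by a low-pass filter can be sketched in discrete time; the rate limit and time constant below are illustrative values, not F404 data:

```python
def engine_response(commanded, dt, rate_limit=5000.0, tau=0.5, thrust0=0.0):
    """Discrete-time sketch of the simple-dynamic-engine-model idea:
    a rate limiter on the commanded thrust followed by a first-order
    low-pass filter. Numeric values (rate limit in thrust units/s,
    time constant tau in s) are illustrative."""
    limited = thrust0
    thrust = thrust0
    out = []
    for cmd in commanded:
        step = max(-rate_limit * dt, min(rate_limit * dt, cmd - limited))
        limited += step                    # rate-limited command
        a = dt / (tau + dt)                # first-order filter gain
        thrust += a * (limited - thrust)   # low-pass filter
        out.append(thrust)
    return out

# Step command: the response rises smoothly toward the commanded value
resp = engine_response([10000.0] * 100, dt=0.05)
```

A vectoring term would then scale the axial component of this thrust by the nozzle deflection, as the paper's thrust-loss method does.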

  11. Response-Time Tests of Logical-Rule Models of Categorization

    ERIC Educational Resources Information Center

    Little, Daniel R.; Nosofsky, Robert M.; Denton, Stephen E.

    2011-01-01

    A recent resurgence in logical-rule theories of categorization has motivated the development of a class of models that predict not only choice probabilities but also categorization response times (RTs; Fific, Little, & Nosofsky, 2010). The new models combine mental-architecture and random-walk approaches within an integrated framework and…

  12. Improved theory of time domain reflectometry with variable coaxial cable length for electrical conductivity measurements

    USDA-ARS?s Scientific Manuscript database

    Although empirical models have been developed previously, a mechanistic model is needed for estimating electrical conductivity (EC) using time domain reflectometry (TDR) with variable lengths of coaxial cable. The goals of this study are to: (1) derive a mechanistic model based on multisection tra...

  13. Time series analysis of monthly pulpwood use in the Northeast

    Treesearch

    James T. Bones

    1980-01-01

    Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.

  14. DEVELOPING EMISSION INVENTORIES FOR BIOMASS BURNING FOR REAL-TIME AND RETROSPECTIVE MODELING

    EPA Science Inventory

    The EPA uses chemical transport models to simulate historic meteorological episodes for developing air quality management strategies. In addition, chemical transport models are now being used operationally to create air quality forecasts. There are currently a number of methods a...

  15. Development of a unified constitutive model for an isotropic nickel base superalloy Rene 80

    NASA Technical Reports Server (NTRS)

    Ramaswamy, V. G.; Vanstone, R. H.; Laflen, J. H.; Stouffer, D. C.

    1988-01-01

    Accurate analysis of stress-strain behavior is of critical importance in the evaluation of life capabilities of hot section turbine engine components such as turbine blades and vanes. The constitutive equations used in the finite element analysis of such components must be capable of modeling a variety of complex behavior exhibited at high temperatures by cast superalloys. The classical separation of plasticity and creep employed in most of the finite element codes in use today is known to be deficient in modeling elevated temperature time dependent phenomena. Rate dependent, unified constitutive theories can overcome many of these difficulties. A new unified constitutive theory was developed to model the high temperature, time dependent behavior of Rene' 80 which is a cast turbine blade and vane nickel base superalloy. Considerations in model development included the cyclic softening behavior of Rene' 80, rate independence at lower temperatures and the development of a new model for static recovery.

  16. Simscape Modeling Verification in the Simulink Development Environment

    NASA Technical Reports Server (NTRS)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly because of the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced because no custom code would need to be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  17. Use of high performance networks and supercomputers for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  18. Application of troposphere model from NWP and GNSS data into real-time precise positioning

    NASA Astrophysics Data System (ADS)

    Wilgan, Karina; Hadas, Tomasz; Kazmierski, Kamil; Rohm, Witold; Bosy, Jaroslaw

    2016-04-01

    Empirical tropospheric delay models are usually functions of meteorological parameters (temperature, pressure and humidity). The application of standard atmosphere parameters or global models, such as the GPT (global pressure/temperature) model or the UNB3 (University of New Brunswick, version 3) model, may not be sufficient, especially for positioning in non-standard weather conditions. A possible solution is to use regional troposphere models based on real-time or near-real-time measurements. We implement a regional troposphere model into the PPP (Precise Point Positioning) software GNSS-WARP (Wroclaw Algorithms for Real-time Positioning) developed at Wroclaw University of Environmental and Life Sciences. The software is capable of processing static and kinematic multi-GNSS data in real-time and post-processing modes and takes advantage of final IGS (International GNSS Service) products as well as IGS RTS (Real-Time Service) products. A shortcoming of the PPP technique is the time required for the solution to converge. One of the reasons is the high correlation among the estimated parameters: troposphere delay, receiver clock offset and receiver height. To efficiently decorrelate these parameters, a significant change in satellite geometry is required. An alternative solution is to introduce an external, high-quality regional troposphere delay model to constrain the troposphere estimates. The proposed model consists of zenith total delays (ZTD) and mapping functions calculated from meteorological parameters from the Numerical Weather Prediction model WRF (Weather Research and Forecasting) and ZTDs from ground-based GNSS stations using the least-squares collocation software COMEDIE (Collocation of Meteorological Data for Interpretation and Estimation of Tropospheric Pathdelays) developed at ETH Zurich.

  19. Nonlinear modeling of chaotic time series: Theory and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casdagli, M.; Eubank, S.; Farmer, J.D.

    1990-01-01

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.
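    The two-step recipe in this abstract — reconstructing a state space by time-delay embedding, then approximating the dynamics locally — can be sketched in a few lines. The logistic map below stands in for an observed chaotic series, and the single-nearest-neighbor predictor is the simplest of the function approximation methods the review surveys:

```python
import numpy as np

# Chaotic series from the logistic map (illustrative stand-in for data).
x = np.empty(600)
x[0] = 0.4
for t in range(599):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

def embed(series, dim=2):
    """Reconstruct a state space with time-delay coordinates."""
    n = len(series) - dim + 1
    return np.column_stack([series[i:i + n] for i in range(dim)])

def nn_forecast(train, query_state, dim=2):
    """Predict the next value from the nearest neighbor in the
    reconstructed state space (the simplest local approximation)."""
    states = embed(train[:-1], dim)   # states whose successor is known
    nxt = train[dim:]                 # successor of each embedded state
    i = np.argmin(np.linalg.norm(states - query_state, axis=1))
    return nxt[i]

train, test = x[:500], x[500:]
state = train[-2:]                # last embedded state of the training set
pred = nn_forecast(train, state)  # forecast of test[0]
```

    With a deterministic source and enough data, even this one-neighbor forecast can be far more accurate over short horizons than a linear stochastic fit; accuracy degrades as the horizon grows, which is the hallmark of chaos.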

  20. Using features of Arden Syntax with object-oriented medical data models for guideline modeling.

    PubMed

    Peleg, M; Ogunyemi, O; Tu, S; Boxwala, A A; Zeng, Q; Greenes, R A; Shortliffe, E H

    2001-01-01

    Computer-interpretable guidelines (CIGs) can deliver patient-specific decision support at the point of care. CIGs base their recommendations on eligibility and decision criteria that relate medical concepts to patient data. CIG models use expression languages for specifying these criteria, and define models for medical data to which the expressions can refer. In developing version 3 of the GuideLine Interchange Format (GLIF3), we used existing standards as the medical data model and expression language. We investigated the object-oriented HL7 Reference Information Model (RIM) as a default data model. We developed an expression language, called GEL, based on Arden Syntax's logic grammar. Together with other GLIF constructs, GEL reconciles incompatibilities between the data models of Arden Syntax and the HL7 RIM. These incompatibilities include Arden's lack of support for complex data types and time intervals, and the mismatch between Arden's single primary time and multiple time attributes of the HL7 RIM.

  1. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to that of existing large-scale, high-fidelity operational wave models, but it provides higher spatial resolution output at low computational expense. The constructed time-series can support a variety of applications, including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
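    The core lookup step can be illustrated with a toy version, assuming a hypothetical archive in which each offshore (height, period) scenario maps to one pre-computed inshore wave height; each offshore observation is then assigned the output of the nearest archived scenario:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical archive: each scenario maps an offshore (height, period)
# pair to a pre-computed inshore wave height at one model grid point.
offshore_scenarios = np.array([[h, p] for h in (0.5, 1.0, 2.0, 3.0)
                                      for p in (4.0, 6.0, 8.0)])
inshore_height = 0.7 * offshore_scenarios[:, 0]   # stand-in model output

def construct_series(offshore_obs):
    """Build an inshore time series by assigning each offshore
    observation the output of the closest archived scenario."""
    out = np.empty(len(offshore_obs))
    for t, obs in enumerate(offshore_obs):
        i = np.argmin(np.linalg.norm(offshore_scenarios - obs, axis=1))
        out[t] = inshore_height[i]
    return out

# A day of hourly offshore observations (synthetic).
obs = np.column_stack([rng.uniform(0.5, 3.0, 24), rng.uniform(4.0, 8.0, 24)])
series = construct_series(obs)
```

    The real method additionally carries period, direction, and scenario-derived confidence limits through the lookup; the design point is the same — all expensive numerical modeling happens once, offline, when the archive is built.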

  2. A Box-Cox normal model for response times.

    PubMed

    Klein Entink, R H; van der Linden, W J; Fox, J-P

    2009-11-01

    The log-transform has been a convenient choice in response time modelling on test items. However, motivated by a dataset from the Medical College Admission Test in which the lognormal model violated the normality assumption, the possibilities of the broader class of Box-Cox transformations for response time modelling are investigated. After an introduction and an outline of a broader framework for analysing responses and response times simultaneously, the performance of a Box-Cox normal model for describing response times is investigated using simulation studies and a real data example. A transformation-invariant implementation of the deviance information criterion (DIC) is developed that allows for comparing model fit between models with different transformation parameters. Because the model provides an enhanced description of the shape of response time distributions, its application in an educational measurement context is discussed at length.
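    The Box-Cox family indexes transformations by a parameter lambda, with lambda = 0 recovering the log-transform the abstract starts from. A minimal sketch with simulated right-skewed response times, using scipy's maximum-likelihood estimate of lambda rather than the paper's Bayesian machinery:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical right-skewed response times (seconds).
rt = rng.lognormal(mean=3.0, sigma=0.5, size=500)

# Estimate the Box-Cox transformation parameter by maximum likelihood;
# lambda = 0 corresponds to the log-transform as a special case.
transformed, lmbda = stats.boxcox(rt)

skew_before = stats.skew(rt)
skew_after = stats.skew(transformed)
```

    When the estimated lambda sits near 0, the lognormal model is adequate; values clearly away from 0 signal the kind of normality violation that motivates the broader family.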

  3. NASA AVOSS Fast-Time Models for Aircraft Wake Prediction: User's Guide (APA3.8 and TDP2.1)

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew J.; Limon Duparcmeur, Fanny M.

    2016-01-01

    NASA's current distribution of fast-time wake vortex decay and transport models includes APA (Version 3.8) and TDP (Version 2.1). This User's Guide provides detailed information on the model inputs, file formats, and model outputs. A brief description of the Memphis 1995, Dallas/Fort Worth 1997, and the Denver 2003 wake vortex datasets is given along with the evaluation of models. A detailed bibliography is provided which includes publications on model development, wake field experiment descriptions, and applications of the fast-time wake vortex models.

  4. USE OF TRANS-CONTEXTUAL MODEL-BASED PHYSICAL ACTIVITY COURSE IN DEVELOPING LEISURE-TIME PHYSICAL ACTIVITY BEHAVIOR OF UNIVERSITY STUDENTS.

    PubMed

    Müftüler, Mine; İnce, Mustafa Levent

    2015-08-01

    This study examined how a physical activity course based on the Trans-Contextual Model affected the variables of perceived autonomy support, autonomous motivation, determinants of leisure-time physical activity behavior, basic psychological needs satisfaction, and leisure-time physical activity behaviors. The participants were 70 Turkish university students (M age=23.3 yr., SD=3.2). A pre-test/post-test control group design was used. Initially, the participants were randomly assigned into an experimental (n=35) and a control (n=35) group. The experimental group followed a 12 wk. Trans-Contextual Model-based intervention. The participants were pre- and post-tested in terms of Trans-Contextual Model constructs and of self-reported leisure-time physical activity behaviors. Multivariate analyses showed significant increases over the 12 wk. period for perceived autonomy support from instructor and peers, autonomous motivation in the leisure-time physical activity setting, positive intention and perceived behavioral control over leisure-time physical activity behavior, more fulfillment of psychological needs, and more engagement in leisure-time physical activity behavior in the experimental group. These results indicate that the intervention was effective in developing leisure-time physical activity behavior and that the Trans-Contextual Model is a useful way to conceptualize these relationships.

  5. Study on Development of 1D-2D Coupled Real-time Urban Inundation Prediction model

    NASA Astrophysics Data System (ADS)

    Lee, Seungsoo

    2017-04-01

    In recent years, abnormal weather conditions due to climate change have been occurring around the world, making countermeasures for flood defense an urgent task. This research develops a 1D-2D coupled real-time urban inundation prediction model using precipitation data predicted with remote sensing technology. The 1-dimensional (1D) sewerage system analysis model introduced by Lee et al. (2015) is used to simulate inlet and overflow phenomena by interacting with surface flow as well as flows in conduits. A 2-dimensional (2D) grid mesh refinement method is applied to depict road networks while keeping calculation times practical. The 2D surface model is coupled with the 1D sewerage analysis model in order to consider bi-directional flow between the two. A parallel computing method, OpenMP, is also applied to reduce calculation time. The model is evaluated against the extreme rainfall event of 25 August 2014, which caused severe inundation damage in Busan, Korea. The Oncheoncheon basin is selected as the study basin, and observed radar data are treated as predicted rainfall data. The model shows acceptable calculation speed and accuracy, and it is therefore expected that the model can be used in a real-time urban inundation forecasting system to minimize damages.

  6. Modelling land use change with generalized linear models--a multi-model analysis of change between 1860 and 2000 in Gallatin Valley, Montana.

    PubMed

    Aspinall, Richard

    2004-08-01

    This paper develops an approach to modelling land use change that links model selection and multi-model inference with empirical models and GIS. Land use change is frequently studied, and understanding gained, through a process of modelling that is an empirical analysis of documented changes in land cover or land use patterns. The approach here is based on analysis and comparison of multiple models of land use patterns using model selection and multi-model inference. The approach is illustrated with a case study of rural housing as it has developed for part of Gallatin County, Montana, USA. A GIS contains the location of rural housing on a yearly basis from 1860 to 2000. The database also documents a variety of environmental and socio-economic conditions. A general model of settlement development describes the evolution of drivers of land use change and their impacts in the region. This model is used to develop a series of different models reflecting drivers of change at different periods in the history of the study area. These period specific models represent a series of multiple working hypotheses describing (a) the effects of spatial variables as a representation of social, economic and environmental drivers of land use change, and (b) temporal changes in the effects of the spatial variables as the drivers of change evolve over time. Logistic regression is used to calibrate and interpret these models and the models are then compared and evaluated with model selection techniques. Results show that different models are 'best' for the different periods. The different models for different periods demonstrate that models are not invariant over time which presents challenges for validation and testing of empirical models. 
The research demonstrates (i) model selection as a mechanism for choosing among many plausible models that describe land cover or land use patterns, (ii) inference from a set of models rather than from a single model, (iii) that models can be developed from hypothesised relationships reflecting the underlying and proximate causes of change, and (iv) that models are not invariant over time.
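    The workflow described — calibrating several candidate logistic models and comparing them with model selection techniques — can be sketched as follows. The simulated data, the two candidate drivers, and the use of AIC as the selection criterion are illustrative assumptions, not the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical parcels: distance to road and elevation as spatial drivers.
n = 400
dist_road = rng.uniform(0, 5, n)
elevation = rng.uniform(0, 1, n)
logit = 1.5 - 1.0 * dist_road            # only distance truly matters here
p = 1 / (1 + np.exp(-logit))
developed = rng.binomial(1, p)           # 1 = parcel has rural housing

def fit_logistic(X, y, iters=50):
    """Newton-Raphson logistic fit; returns coefficients and log-likelihood."""
    X = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ beta))
        W = mu * (1 - mu)
        H = X.T @ (X * W[:, None])               # observed information
        beta += np.linalg.solve(H, X.T @ (y - mu))
    mu = 1 / (1 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(mu) + (1 - y) * np.log(1 - mu))
    return beta, ll

def aic(ll, k):
    return 2 * k - 2 * ll

b1, ll1 = fit_logistic(dist_road[:, None], developed)
b2, ll2 = fit_logistic(np.column_stack([dist_road, elevation]), developed)
aic1, aic2 = aic(ll1, 2), aic(ll2, 3)
```

    With the second predictor adding no real signal, the penalty term typically favors the simpler model — the sense in which different periods, with different active drivers, can each have a different 'best' model.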

  7. Personalized long-term prediction of cognitive function: Using sequential assessments to improve model performance.

    PubMed

    Chi, Chih-Lin; Zeng, Wenjun; Oh, Wonsuk; Borson, Soo; Lenskaia, Tatiana; Shen, Xinpeng; Tonellato, Peter J

    2017-12-01

    Prediction of onset and progression of cognitive decline and dementia is important both for understanding the underlying disease processes and for planning health care for populations at risk. Predictors identified in research studies are typically assessed at one point in time. In this manuscript, we argue that an accurate model for predicting cognitive status over relatively long periods requires inclusion of time-varying components that are sequentially assessed at multiple time points (e.g., in multiple follow-up visits). We developed a pilot model to test the feasibility of using either estimated or observed risk factors to predict cognitive status. We developed two models, the first using a sequential estimation of risk factors originally obtained 8 years prior, then improved by optimization. This model can predict how cognition will change over relatively long time periods. The second model uses observed rather than estimated time-varying risk factors and, as expected, results in better prediction. This model can be applied when newly observed data are acquired in a follow-up visit. The performance of both models, evaluated in 10-fold cross-validation and in various patient subgroups, provides supporting evidence for these pilot models. Each model consists of multiple base prediction units (BPUs), which were trained using the same set of data. The difference in usage and function between the two models is the source of input data: either estimated or observed data. In the next step of model refinement, we plan to integrate the two types of data together to flexibly predict dementia status and changes over time, when some time-varying predictors are measured only once and others are measured repeatedly. Computationally, the two data sources provide upper and lower bounds on predictive performance. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Program of research in severe storms

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Two modeling areas, the development of a mesoscale chemistry-meteorology interaction model, and the development of a combined urban chemical kinetics-transport model are examined. The problems associated with developing a three dimensional combined meteorological-chemical kinetics computer program package are defined. A similar three dimensional hydrostatic real time model which solves the fundamental Navier-Stokes equations for nonviscous flow is described. An urban air quality simulation model, developed to predict the temporal and spatial distribution of reactive and nonreactive gases in and around an urban area and to support a remote sensor evaluation program is reported.

  9. Assessing patient risk of central line-associated bacteremia via machine learning.

    PubMed

    Beeler, Cole; Dbeibo, Lana; Kelley, Kristen; Thatcher, Levi; Webb, Douglas; Bah, Amadou; Monahan, Patrick; Fowler, Nicole R; Nicol, Spencer; Judy-Malcolm, Alisa; Azar, Jose

    2018-04-13

    Central line-associated bloodstream infections (CLABSIs) contribute to increased morbidity, length of hospital stay, and cost. Despite progress in understanding the risk factors, there remains a need to accurately predict the risk of CLABSIs and, in real time, prevent them from occurring. A predictive model was developed using retrospective data from a large academic healthcare system. Models were developed with machine learning via construction of random forests using validated input variables. Fifteen variables accounted for the most significant effect on CLABSI prediction based on a retrospective study of 70,218 unique patient encounters between January 1, 2013, and May 31, 2016. The area under the receiver operating characteristic curve for the best-performing model was 0.82 in production. This model has multiple applications for resource allocation for CLABSI prevention, including serving as a tool to target patients at highest risk for potentially cost-effective but otherwise time-limited interventions. Machine learning can be used to develop accurate models to predict the risk of CLABSI in real time prior to the development of infection. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
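    The reported area under the receiver operating characteristic curve (0.82) is the standard summary for such risk models, and it can be computed directly from predicted scores via the Mann-Whitney rank identity. A small self-contained sketch with hypothetical scores and labels:

```python
import numpy as np

def auroc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U identity: the
    probability that a random positive outscores a random negative,
    counting ties as one half."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical risk scores for infected (1) and uninfected (0) encounters.
labels = np.array([1, 1, 1, 0, 0, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.2, 0.1])
auc = auroc(scores, labels)
```

    An AUC of 0.82 means that for a randomly chosen infected/uninfected pair, the model ranks the infected encounter higher about 82% of the time — a useful property when scores are used to target limited prevention resources.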

  10. A conceptual model for the development process of confirmatory adaptive clinical trials within an emergency research network.

    PubMed

    Mawocha, Samkeliso C; Fetters, Michael D; Legocki, Laurie J; Guetterman, Timothy C; Frederiksen, Shirley; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J

    2017-06-01

    Adaptive clinical trials use accumulating data from enrolled subjects to alter trial conduct in pre-specified ways based on quantitative decision rules. In this research, we sought to characterize the perspectives of key stakeholders during the development process of confirmatory-phase adaptive clinical trials within an emergency clinical trials network and to build a model to guide future development of adaptive clinical trials. We used an ethnographic, qualitative approach to evaluate key stakeholders' views about the adaptive clinical trial development process. Stakeholders participated in a series of multidisciplinary meetings during the development of five adaptive clinical trials and completed a Strengths-Weaknesses-Opportunities-Threats questionnaire. In the analysis, we elucidated overarching themes across the stakeholders' responses to develop a conceptual model. Four major overarching themes emerged during the analysis of stakeholders' responses to questioning: the perceived statistical complexity of adaptive clinical trials and the roles of collaboration, communication, and time during the development process. Frequent and open communication and collaboration were viewed by stakeholders as critical during the development process, as were the careful management of time and logistical issues related to the complexity of planning adaptive clinical trials. The Adaptive Design Development Model illustrates how statistical complexity, time, communication, and collaboration are moderating factors in the adaptive design development process. The intensity and iterative nature of this process underscores the need for funding mechanisms for the development of novel trial proposals in academic settings.

  11. A General and Efficient Method for Incorporating Precise Spike Times in Globally Time-Driven Simulations

    PubMed Central

    Hanuschkin, Alexander; Kunkel, Susanne; Helias, Moritz; Morrison, Abigail; Diesmann, Markus

    2010-01-01

    Traditionally, event-driven simulations have been limited to the very restricted class of neuronal models for which the timing of future spikes can be expressed in closed form. Recently, the class of models that is amenable to event-driven simulation has been extended by the development of techniques to accurately calculate firing times for some integrate-and-fire neuron models that do not enable the prediction of future spikes in closed form. The motivation of this development is the general perception that time-driven simulations are imprecise. Here, we demonstrate that a globally time-driven scheme can calculate firing times that cannot be discriminated from those calculated by an event-driven implementation of the same model; moreover, the time-driven scheme incurs lower computational costs. The key insight is that time-driven methods are based on identifying a threshold crossing in the recent past, which can be implemented by a much simpler algorithm than the techniques for predicting future threshold crossings that are necessary for event-driven approaches. As run time is dominated by the cost of the operations performed at each incoming spike, which includes spike prediction in the case of event-driven simulation and retrospective detection in the case of time-driven simulation, the simple time-driven algorithm outperforms the event-driven approaches. Additionally, our method is generally applicable to all commonly used integrate-and-fire neuronal models; we show that a non-linear model employing a standard adaptive solver can reproduce a reference spike train with a high degree of precision. PMID:21031031
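    The key insight — detecting a threshold crossing in the recent past and recovering a precise spike time by interpolation — can be illustrated with a leaky integrate-and-fire neuron on a fixed grid. The parameters and the use of linear (rather than higher-order) interpolation are illustrative choices:

```python
import numpy as np

def lif_spike_times(i_ext=1.5, v_th=1.0, tau=10.0, dt=0.5, t_end=100.0):
    """Leaky integrate-and-fire on a fixed time grid.  A threshold
    crossing is detected *after* the step, and the precise spike time is
    recovered by linear interpolation between the two grid points --
    the retrospective detection used by time-driven schemes."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        v_new = v + dt * (-v + i_ext) / tau    # forward Euler step
        t_new = t + dt
        if v_new >= v_th:                      # crossing lies in (t, t_new]
            frac = (v_th - v) / (v_new - v)    # linear interpolation
            spikes.append(t + frac * dt)
            v_new = 0.0                        # reset after the spike
        v, t = v_new, t_new
    return spikes

spikes = lif_spike_times()
```

    The crossing test looks backward over the step just taken, so no prediction of future crossings is ever needed — which is what keeps the per-step cost lower than the event-driven alternative.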

  12. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. 
Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671
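    The lower level of the proposed hierarchy uses Gaussian processes precisely because they handle irregular sampling naturally. A minimal numpy sketch of that single ingredient — posterior-mean interpolation with a squared-exponential kernel and illustrative hyperparameters, not the paper's full hierarchical GP/linear-dynamical-system model:

```python
import numpy as np

def gp_posterior_mean(t_obs, y_obs, t_query, length=1.5, sig_n=0.1):
    """Posterior mean of a zero-mean Gaussian process with a squared-
    exponential kernel -- one way to interpolate an irregularly
    sampled series onto a regular grid."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(t_obs, t_obs) + sig_n ** 2 * np.eye(len(t_obs))  # noisy Gram matrix
    return k(t_query, t_obs) @ np.linalg.solve(K, y_obs)

# Hypothetical lab values taken at irregular times (days).
t_obs = np.array([0.0, 0.7, 2.9, 3.1, 6.0, 9.5])
y_obs = np.sin(t_obs)                     # stand-in for measured values
t_query = np.linspace(0.0, 9.5, 20)       # regular grid for the upper level
y_hat = gp_posterior_mean(t_obs, y_obs, t_query)
```

    In the paper's framework, sequences like `y_hat` feed the linear dynamical system at the upper level, which models the transitions between Gaussian process segments.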

  13. Batch statistical process control of a fluid bed granulation process using in-line spatial filter velocimetry and product temperature measurements.

    PubMed

    Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T

    2011-04-18

    Fluid bed granulation is a batch process, which is characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration, using a partial least squares (PLS) approach. Via the continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts, in which the average batch trajectory and upper and lower control limits are displayed, were developed. Next, these control charts were used to monitor 4 new test batches in real-time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts and by computation of contribution plots of deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate some granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to the density and flowability, respectively, as Y-matrices were developed.
The scores of the 4 test batches were used to examine the predictive ability of the model. Copyright © 2011 Elsevier B.V. All rights reserved.
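    The score control charts described can be reproduced in outline: average the reference-batch trajectories, set per-time-point control limits, and flag any new batch that leaves the band. The simulated trajectories and the ±3-sigma limit are illustrative choices, not the paper's exact chart construction:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical score trajectories: 10 reference batches x 60 time points.
ref = np.sin(np.linspace(0, 3, 60)) + 0.1 * rng.standard_normal((10, 60))

mean_traj = ref.mean(axis=0)
sd_traj = ref.std(axis=0, ddof=1)
upper = mean_traj + 3 * sd_traj          # +/- 3 sigma control limits
lower = mean_traj - 3 * sd_traj

def out_of_control(batch):
    """Flag time points where a new batch leaves the control region."""
    return np.flatnonzero((batch > upper) | (batch < lower))

good = np.sin(np.linspace(0, 3, 60))                 # on-trajectory batch
bad = good + np.where(np.arange(60) >= 40, 1.5, 0)   # drifts after t=40
flagged = out_of_control(bad)
```

    The first flagged index tells the operator *when* the batch left its expected trajectory; contribution plots (not sketched here) then indicate *which* process variables drove the deviation.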

  14. Summary of the key features of seven biomathematical models of human fatigue and performance.

    PubMed

    Mallis, Melissa M; Mejdal, Sig; Nguyen, Tammy T; Dinges, David F

    2004-03-01

    Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. 
All modelers provided published papers describing their models, with three of the models being proprietary. Although all models appear to have been fundamentally influenced by the two-process model of sleep regulation by Borbély, there is considerable diversity among them in the number and type of input and output variables, and their stated goals and capabilities.

  15. Summary of the key features of seven biomathematical models of human fatigue and performance

    NASA Technical Reports Server (NTRS)

    Mallis, Melissa M.; Mejdal, Sig; Nguyen, Tammy T.; Dinges, David F.

    2004-01-01

    BACKGROUND: Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. METHODS: An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. RESULTS: Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. 
All modelers provided published papers describing their models, although three of the models are proprietary. CONCLUSIONS: Although all models appear to be fundamentally influenced by Borbely's two-process model of sleep regulation, there is considerable diversity among them in the number and type of input and output variables, and in their stated goals and capabilities.
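
    As an illustration of the two-process framework the conclusions refer to, the sketch below combines a homeostatic process S (rising during wakefulness) with a circadian process C (a 24 h sinusoid). The parameter values and the simple additive combination rule are placeholder assumptions for illustration; none of the seven surveyed models is reproduced here.

```python
import math

def two_process_alertness(t_awake_h, clock_h,
                          tau_rise=18.2, s0=0.2, upper=1.0,
                          amp=0.12, peak_h=16.0):
    """Illustrative two-process sketch: homeostatic pressure S rises
    exponentially toward `upper` during wakefulness; circadian process C
    is a sinusoid peaking at `peak_h`. Predicted alertness falls as S
    rises and rises with C. All parameter values are made-up placeholders."""
    s = upper - (upper - s0) * math.exp(-t_awake_h / tau_rise)   # Process S
    c = amp * math.cos(2 * math.pi * (clock_h - peak_h) / 24.0)  # Process C
    return c - s  # larger value -> higher predicted alertness

# Alertness after 2 h awake (09:00) vs. after 18 h awake (01:00):
early = two_process_alertness(2.0, clock_h=9.0)
late = two_process_alertness(18.0, clock_h=1.0)
```

    Under these assumptions, extended wakefulness combined with the circadian trough yields a markedly lower predicted alertness, the qualitative behavior all seven models share.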

  16. Diagnosis of delay-deadline failures in real time discrete event models.

    PubMed

    Biswas, Santosh; Sarkar, Dipankar; Bhowal, Prodip; Mukhopadhyay, Siddhartha

    2007-10-01

    In this paper a method for fault detection and diagnosis (FDD) of real time systems is developed. A modeling framework termed the real time discrete event system (RTDES) model is presented, and a mechanism for FDD within it is developed. The use of the RTDES framework for FDD extends work reported in the discrete event system (DES) literature, which is based on finite state machines (FSMs). FDD of RTDES models is well suited to real time systems because RTDES can represent timing faults that lead to failures in terms of erroneous delays and deadlines, which FSM-based models cannot address. The concept of measurement restriction of variables is introduced for RTDES, and the consequent equivalence of states and indistinguishability of transitions are characterized. Faults are modeled in terms of an unmeasurable condition variable in the state map. Diagnosability is defined and a procedure for constructing a diagnoser is provided. A checkable property of the diagnoser is shown to be a necessary and sufficient condition for diagnosability. The methodology is illustrated with an example of a hydraulic cylinder.

  17. Extrapolation of a predictive model for growth of a low inoculum size of Salmonella typhimurium DT104 on chicken skin to higher inoculum sizes

    USDA-ARS's Scientific Manuscript database

    Validation of model predictions for independent variables not included in model development can save time and money by identifying conditions for which new models are not needed. A single strain of Salmonella Typhimurium DT104 was used to develop a general regression neural network model for growth...

  18. Development of Aeroservoelastic Analytical Models and Gust Load Alleviation Control Laws of a SensorCraft Wind-Tunnel Model Using Measured Data

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Vartio, Eric; Shimko, Anthony; Kvaternik, Raymond G.; Eure, Kenneth W.; Scott, Robert C.

    2007-01-01

    Aeroservoelastic (ASE) analytical models of a SensorCraft wind-tunnel model are generated using measured data. The data were acquired during the ASE wind-tunnel test of the HiLDA (High Lift-to-Drag Active) Wing model, tested in the NASA Langley Transonic Dynamics Tunnel (TDT) in late 2004. Two time-domain system identification techniques are applied to the development of the ASE analytical models: the impulse response (IR) method and the Generalized Predictive Control (GPC) method. Using measured control surface inputs (frequency sweeps) and the associated sensor responses, the IR method is used to extract corresponding input/output impulse response pairs. These impulse responses are then transformed into state-space models for use in ASE analyses. Similarly, the GPC method transforms measured random control surface inputs and the associated sensor responses into an AutoRegressive with eXogenous input (ARX) model. The ARX model is then used to develop the gust load alleviation (GLA) control law. For the IR method, comparisons of measured and simulated responses are presented to assess the accuracy of the ASE analytical models developed. For the GPC method, comparisons of simulated open-loop and closed-loop (GLA) time histories are presented.
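
    The ARX identification step described above can be sketched in miniature. The snippet below fits a first-order ARX model by ordinary least squares and recovers known parameters from simulated data; the model order, the noiseless simulation, and all numeric values are simplifying assumptions for illustration, not the HiLDA formulation.

```python
import random

def fit_arx11(u, y):
    """Least-squares fit of a first-order ARX model
        y[t] = a*y[t-1] + b*u[t-1],
    a minimal sketch of ARX identification (the actual GPC workflow uses
    higher orders and noise terms). Solves the 2x2 normal equations."""
    syy = syu = suu = sy1y = su1y = 0.0
    for t in range(1, len(y)):
        y1, u1 = y[t - 1], u[t - 1]
        syy += y1 * y1
        syu += y1 * u1
        suu += u1 * u1
        sy1y += y1 * y[t]
        su1y += u1 * y[t]
    det = syy * suu - syu * syu
    a = (suu * sy1y - syu * su1y) / det
    b = (syy * su1y - syu * sy1y) / det
    return a, b

# Simulate a known system driven by random input, then recover it.
random.seed(1)
u = [random.uniform(-1, 1) for _ in range(500)]
y = [0.0]
for t in range(1, 500):
    y.append(0.8 * y[t - 1] + 0.5 * u[t - 1])
a, b = fit_arx11(u, y)  # recovers a = 0.8, b = 0.5 exactly (no noise)
```

    With noiseless data the least-squares estimates reproduce the generating parameters exactly; with measured wind-tunnel responses, noise and unmodeled dynamics make order selection and validation the hard part.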

  19. Development of Aeroservoelastic Analytical Models and Gust Load Alleviation Control Laws of a SensorCraft Wind-Tunnel Model Using Measured Data

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Shimko, Anthony; Kvaternik, Raymond G.; Eure, Kenneth W.; Scott, Robert C.

    2006-01-01

    Aeroservoelastic (ASE) analytical models of a SensorCraft wind-tunnel model are generated using measured data. The data were acquired during the ASE wind-tunnel test of the HiLDA (High Lift-to-Drag Active) Wing model, tested in the NASA Langley Transonic Dynamics Tunnel (TDT) in late 2004. Two time-domain system identification techniques are applied to the development of the ASE analytical models: the impulse response (IR) method and the Generalized Predictive Control (GPC) method. Using measured control surface inputs (frequency sweeps) and the associated sensor responses, the IR method is used to extract corresponding input/output impulse response pairs. These impulse responses are then transformed into state-space models for use in ASE analyses. Similarly, the GPC method transforms measured random control surface inputs and the associated sensor responses into an AutoRegressive with eXogenous input (ARX) model. The ARX model is then used to develop the gust load alleviation (GLA) control law. For the IR method, comparisons of measured and simulated responses are presented to assess the accuracy of the ASE analytical models developed. For the GPC method, comparisons of simulated open-loop and closed-loop (GLA) time histories are presented.

  20. The promise of the state space approach to time series analysis for nursing research.

    PubMed

    Levy, Janet A; Elser, Heather E; Knobel, Robin B

    2012-01-01

    Nursing research, particularly related to physiological development, often depends on the collection of time series data. The state space approach to time series analysis has great potential to answer exploratory questions relevant to physiological development but has not been used extensively in nursing. The aim of this study was to introduce the state space approach to time series analysis and demonstrate its potential applicability to neonatal monitoring and physiology. We present a set of univariate state space models, each describing a process that generates a variable of interest over time. Each model is presented algebraically, and a realization of the process is presented graphically from simulated data. This is followed by a discussion of how the model has been or may be used in two nursing projects on neonatal physiological development. The defining feature of the state space approach is the decomposition of the series into components that are functions of time; specifically, a slowly varying level, a faster varying periodic component, and an irregular component. State space models can potentially represent developmental processes in which a phenomenon emerges and disappears before stabilizing, the periodic component becomes more regular with time, or the developmental trajectory of a phenomenon is irregular. The ultimate contribution of this approach to nursing science will require close collaboration and cross-disciplinary education between nurses and statisticians.
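
    The level/periodic/irregular decomposition described above can be illustrated by simulating one realization of a toy structural model: a random-walk level plus a sinusoidal cycle plus white noise. All parameters are invented for illustration and do not correspond to any neonatal dataset.

```python
import math
import random

def simulate_state_space(n, level_sd=0.05, obs_sd=0.3,
                         period=24, amp=1.0, seed=42):
    """Simulate one realization of a simple structural state space model:
    observation = slowly varying level (random walk) + periodic component
    + irregular noise. A toy sketch of the decomposition; values are
    arbitrary illustrative choices."""
    rng = random.Random(seed)
    level = 0.0
    series = []
    for t in range(n):
        level += rng.gauss(0, level_sd)                      # slowly varying level
        periodic = amp * math.sin(2 * math.pi * t / period)  # periodic component
        irregular = rng.gauss(0, obs_sd)                     # irregular component
        series.append(level + periodic + irregular)
    return series

y = simulate_state_space(200)  # one realization of the generating process
```

    In practice the estimation step runs the other way: a Kalman filter or smoother recovers the unobserved level and periodic components from the observed series.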

  1. Is questionnaire-based sitting time inaccurate and can it be improved? A cross-sectional investigation using accelerometer-based sitting time

    PubMed Central

    Gupta, Nidhi; Christiansen, Caroline Stordal; Hanisch, Christiana; Bay, Hans; Burr, Hermann; Holtermann, Andreas

    2017-01-01

    Objectives To investigate the differences between questionnaire-based and accelerometer-based sitting time, and to develop a model for improving the accuracy of questionnaire-based sitting time in predicting accelerometer-based sitting time. Methods 183 workers in a cross-sectional study reported sitting time per day using a single question during the measurement period, and wore two Actigraph GT3X+ accelerometers on the thigh and trunk for 1–4 working days to determine their actual sitting time per day using the validated Acti4 software. Least squares regression models were fitted with questionnaire-based sitting time and other self-reported predictors to predict accelerometer-based sitting time. Results Questionnaire-based and accelerometer-based average sitting times were ≈272 and ≈476 min/day, respectively. A low Pearson correlation (r=0.32), a high mean bias (204.1 min) and wide limits of agreement (549.8 to −139.7 min) between questionnaire-based and accelerometer-based sitting time were found. The prediction model based on questionnaire-based sitting time explained 10% of the variance in accelerometer-based sitting time. Inclusion of nine self-reported predictors in the model increased the explained variance to 41%, with 10% optimism under a resampling bootstrap validation. In a split validation analysis, the prediction model developed on ≈75% of the workers (n=132) reduced the mean and the SD of the difference between questionnaire-based and accelerometer-based sitting time by 64% and 42%, respectively, in the remaining 25% of the workers. Conclusions This study indicates that questionnaire-based sitting time has low validity and that a prediction model can be one solution to materially improve the precision of questionnaire-based sitting time. PMID:28093433
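
    A minimal sketch of the calibration idea, assuming fabricated data and a single predictor (the study's stronger model added nine further self-reported predictors):

```python
def fit_simple_regression(x, y):
    """Ordinary least squares for y = intercept + slope*x, a one-predictor
    sketch of calibrating self-reported sitting time against an
    accelerometer criterion. The data pairs below are fabricated."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Hypothetical pairs: (questionnaire min/day, accelerometer min/day),
# mimicking the systematic under-reporting found in the study.
quest = [120, 200, 260, 300, 360, 420]
accel = [310, 390, 440, 470, 520, 560]
intercept, slope = fit_simple_regression(quest, accel)

def predict(q):
    """Calibrated estimate of accelerometer-based sitting time."""
    return intercept + slope * q
```

    The fitted line passes through the sample means, so the calibrated predictions remove the mean bias by construction; reducing the spread of the errors is what the additional predictors were needed for.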

  2. Predicting the cumulative risk of death during hospitalization by modeling weekend, weekday and diurnal mortality risks.

    PubMed

    Coiera, Enrico; Wang, Ying; Magrabi, Farah; Concha, Oscar Perez; Gallego, Blanca; Runciman, William

    2014-05-21

    Current prognostic models factor in patient- and disease-specific variables but do not consider the cumulative risk of hospitalization over time. We developed risk models of the likelihood of death associated with cumulative exposure to hospitalization, based on time-varying risks of hospitalization over any given day as well as day of the week. Model performance was evaluated alone and in combination with simple disease-specific models. Patients admitted between 2000 and 2006 to 501 public and private hospitals in NSW, Australia were used for training, and 2007 data were used for evaluation. The impact of hospital care delivered over different days of the week and times of the day was modeled by separating hospitalization risk into 21 time periods (morning, day, night across the days of the week). Three models were developed to predict death up to 7 days post-discharge: (1) a simple background risk model using age and gender; (2) a time-varying risk model for exposure to hospitalization (admission time, days in hospital); and (3) disease-specific models (Charlson co-morbidity index, DRG). Combining these three generated a full model. Models were evaluated by accuracy, AUC, and Akaike and Bayesian information criteria. There was a clear diurnal rhythm to hospital mortality in the data set, peaking in the evening, as well as the well-known 'weekend effect' in which mortality peaks with weekend admissions. Individual models had modest performance on the test data set (AUC 0.71, 0.79 and 0.79, respectively). The combined model, which included time-varying risk, yielded an average AUC of 0.92. This model performed best for stays of up to 7 days (93% of admissions), peaking at days 3 to 5 (AUC 0.94). Risks of hospitalization vary not just with the day of the week but also with the time of the day, and can be used to make predictions about the cumulative risk of death associated with an individual's hospitalization. 
Combining disease-specific models with such time-varying estimates appears to result in robust predictive performance. Such risk exposure models should find utility both in enhancing standard prognostic models and in estimating the risk of continued hospitalization.

  3. Real-time simulation of a Doubly-Fed Induction Generator based wind power system on the eMEGASim™ Real-Time Digital Simulator

    NASA Astrophysics Data System (ADS)

    Boakye-Boateng, Nasir Abdulai

    The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the tools and procedures needed to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems, as a basis for more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform's real-time simulation model: an average SimPowerSystems™ DFIG wind turbine and a detailed DFIG-based wind turbine using ARTEMiS™ components. The platform model implemented here consists of a high voltage transmission system with four integrated wind farm models comprising in total 65 DFIG-based wind turbines; it was developed and tested on OPAL-RT's eMEGASim™ Real-Time Digital Simulator.

  4. Time-Dependent Moment Tensors of the First Four Source Physics Experiments (SPE) Explosions

    NASA Astrophysics Data System (ADS)

    Yang, X.

    2015-12-01

    We use mainly vertical-component geophone data within 2 km of the epicenter to invert for time-dependent moment tensors of the first four SPE explosions: SPE-1, SPE-2, SPE-3 and SPE-4Prime. We employ a one-dimensional (1D) velocity model developed from P- and Rg-wave travel times for Green's function calculations. The attenuation structure of the model is developed from P- and Rg-wave amplitudes. We select data for the inversion based on the criterion that they show travel times and amplitude behavior consistent with those predicted by the 1D model. Due to the limited azimuthal coverage of the sources and the mostly vertical-component-only nature of the dataset, only the long-period, diagonal components of the moment tensors are well constrained. Nevertheless, the moment tensors, particularly their isotropic components, provide reasonable estimates of the long-period source amplitudes as well as estimates of corner frequencies, albeit with larger uncertainties. The estimated corner frequencies are nonetheless consistent with estimates from ratios of seismogram spectra from different explosions. These long-period source amplitudes and corner frequencies cannot be fit by classical P-wave explosion source models. The results motivate the development of new P-wave source models suitable for these chemical explosions. To that end, we fit the inverted moment-tensor spectra by modifying the classical explosion model using regressions of estimated source parameters. Although the number of data points used in the regression is small, the approach suggests a way forward for new-model development as more data are collected.

  5. Development of a nonlinear model for the prediction of response times of glucose affinity sensors using concanavalin A and dextran and the development of a differential osmotic glucose affinity sensor

    NASA Astrophysics Data System (ADS)

    Reis, Louis G.

    With the increasing prevalence of diabetes in the United States and worldwide, blood glucose monitoring must be accurate and reliable. Current enzymatic sensors have numerous disadvantages that make them unreliable and unfavorable among patients. Recent research in glucose affinity sensors corrects some of the problems that enzymatic sensors experience. Dextran and concanavalin A are two of the more common components used in glucose affinity sensors. When these sensors were first explored, a model was derived to predict the response time of a glucose affinity sensor using concanavalin A and dextran. However, the model assumed the system was linear and fell short of calculating times representative of the response times determined through experimental tests with the sensors. In this work, a new model that uses the Stokes-Einstein equation to capture the nonlinear behavior of the glucose affinity assay was developed to predict the response times of similar glucose affinity sensors. In addition to the device tested by the original linear model, additional devices were identified and tested with the proposed model. The nonlinear model was designed to accommodate the many variations between systems. The proposed model was able to accurately calculate response times for sensors using the concanavalin A-dextran affinity assay with respect to the times reported experimentally by the independent research groups. Parameter studies using the nonlinear model identified possible setbacks that could compromise the response of the system. Specifically, the model showed that the improper use of asymmetrical membranes could increase the response time by 20% or more as the device is miniaturized. The model also demonstrated that systems using the concanavalin A-dextran assay would experience longer response times in the hypoglycemic range. This work attempted to replicate and improve an osmotic glucose affinity sensor. 
The system was designed to negate additional effects that could cause artifacts or irregular readings such as external osmotic differences and external pressure differences. However, the experimental setup and execution faced numerous setbacks that highlighted the additional difficulty that sensors using asymmetrical ceramic membranes and the concanavalin A-dextran affinity assay may experience.
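
    The Stokes-Einstein relation at the core of the nonlinear model is simple to state: D = kT/(6πηr). The sketch below evaluates it for a glucose-sized solute; the radius and viscosity values are generic illustrative numbers, not parameters taken from the thesis.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_d(temp_k, viscosity_pa_s, radius_m):
    """Stokes-Einstein diffusion coefficient D = kT / (6*pi*eta*r).
    As the assay binds and thickens, eta rises and D falls, which is the
    nonlinearity the model above couples to sensor response time."""
    return K_B * temp_k / (6 * math.pi * viscosity_pa_s * radius_m)

# Glucose-sized solute (r ~ 0.36 nm) in water at body temperature:
d_water = stokes_einstein_d(310.15, 6.9e-4, 3.6e-10)
# The same solute in a fivefold more viscous assay diffuses 5x slower:
d_viscous = stokes_einstein_d(310.15, 5 * 6.9e-4, 3.6e-10)
```

    Because D is inversely proportional to viscosity, any condition that raises assay viscosity, such as the hypoglycemic range noted above, lengthens diffusion-limited response times proportionally.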

  6. The Interaction Between Pubertal Timing and Peer Popularity for Boys and Girls: An Integration of Biological and Interpersonal Perspectives on Adolescent Depression.

    PubMed

    Teunissen, Hanneke A; Adelman, Caroline B; Prinstein, Mitchell J; Spijkerman, Renske; Poelen, Evelien A P; Engels, Rutger C M E; Scholte, Ron H J

    2011-04-01

    The transition to adolescence marks a time of sharply increased vulnerability to the development of depression, particularly among girls. Past research has examined isolated risk factors from individual theoretical models (e.g., biological, interpersonal, and cognitive) of depression, but few studies have examined integrative models. This study investigated the conjoint effects of early pubertal timing and popularity in the longitudinal prediction of depressive symptoms. A total of 319 girls and 294 boys (ages 11-14) provided information on their pubertal status, depressive symptoms, and the social status (i.e., popularity) of their peers. Adolescents completed a second measure of depressive symptoms 11 months after the initial time point. Findings supported an integrated biological-interpersonal model in explaining the development of depressive symptoms during adolescence. Early pubertal development was associated with an increase in depressive symptoms only when accompanied by low levels of popularity; high levels of popularity buffered the association between early pubertal development and later depressive symptoms. Unexpectedly, these results were significant for both girls and boys. Results are discussed in terms of dynamic systems theories.

  7. DEVELOPMENT OF A MICROSCALE EMISSION FACTOR MODEL FOR CO FOR PREDICTING REAL-TIME MOTOR VEHICLE EMISSIONS

    EPA Science Inventory

    The United States Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) has initiated a project to improve the methodology for modeling human exposure to motor vehicle emission. The overall project goal is to develop improved methods for modeling...

  8. Developing Local Scale, High Resolution, Data to Interface with Numerical Storm Models

    NASA Astrophysics Data System (ADS)

    Witkop, R.; Becker, A.; Stempel, P.

    2017-12-01

    High resolution, physical storm models that can rapidly predict storm surge, inundation, rainfall, wind velocity and wave height at the intra-facility scale for any storm affecting Rhode Island have been developed by researchers at the University of Rhode Island's (URI's) Graduate School of Oceanography (GSO) (Ginis et al., 2017). At the same time, URI's Marine Affairs Department has developed methods that incorporate individual geographic points into GSO's models and enable the models to accurately use local scale, high resolution data (Stempel et al., 2017). This combination allows URI's storm models to predict any storm's impacts on individual Rhode Island facilities in near real time. The research presented here determines how a coastal Rhode Island town's critical facility managers (FMs) perceive their assets as being vulnerable to quantifiable hurricane-related forces at the individual facility scale, and explores methods to elicit this information from FMs in a format usable for incorporation into URI's storm models.

  9. Generic Raman-based calibration models enabling real-time monitoring of cell culture bioreactors.

    PubMed

    Mehdizadeh, Hamidreza; Lauri, David; Karry, Krizia M; Moshgbar, Mojgan; Procopio-Melino, Renee; Drapeau, Denis

    2015-01-01

    Raman-based multivariate calibration models have been developed for real-time in situ monitoring of multiple process parameters within cell culture bioreactors. The developed models are generic, in the sense that they are applicable to various products, media, and cell lines based on Chinese Hamster Ovary (CHO) host cells, and are scalable to large pilot and manufacturing scales. Several batches using different CHO-based cell lines and corresponding proprietary media and process conditions were used to generate calibration datasets, and the models have been validated using independent datasets from separate batch runs. All models have been shown to be generic and capable of predicting process parameters with acceptable accuracy. The developed models allow monitoring of multiple key bioprocess metabolic variables, and hence can be utilized as an important enabling tool for the Quality by Design approaches that are strongly supported by the U.S. Food and Drug Administration. © 2015 American Institute of Chemical Engineers.

  10. A Rescorla-Wagner drift-diffusion model of conditioning and timing

    PubMed Central

    Alonso, Eduardo

    2017-01-01

    Computational models of classical conditioning have made significant contributions to the theoretical understanding of associative learning, yet they still struggle when the temporal aspects of conditioning are taken into account. Interval timing models have contributed a rich variety of time representations and provided accurate predictions for the timing of responses, but they usually have little to say about associative learning. In this article we present a unified model of conditioning and timing that is based on the influential Rescorla-Wagner conditioning model and the more recently developed Timing Drift-Diffusion model. We test the model by simulating 10 experimental phenomena and show that it can provide an adequate account for eight and a partial account for the other two. We argue that the model can account for more phenomena in the chosen set than other models of similar scope: CSC-TD, MS-TD, Learning to Time and Modular Theory. A comparison and analysis of the mechanisms in these models is provided, with a focus on the types of time representation and associative learning rule used. PMID:29095819
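
    The associative half of the hybrid model, the classic Rescorla-Wagner update, can be sketched in a few lines; the drift-diffusion timing component is omitted here, and the learning rate is an arbitrary illustrative choice.

```python
def rescorla_wagner(trials, alpha_beta=0.3, lam=1.0):
    """Rescorla-Wagner update for a single CS: on each trial the
    associative strength V moves toward the asymptote lam (US present)
    or 0 (US absent) in proportion to the prediction error. Only the
    associative half of the hybrid model above; timing is not modeled."""
    v = 0.0
    history = []
    for us_present in trials:
        target = lam if us_present else 0.0
        v += alpha_beta * (target - v)  # delta-rule prediction error update
        history.append(v)
    return history

# Acquisition (10 reinforced trials) followed by extinction (10 non-reinforced):
strengths = rescorla_wagner([True] * 10 + [False] * 10)
```

    The negatively accelerated acquisition curve and gradual extinction fall out of the single prediction-error rule; what this rule cannot produce on its own is the timing of the response within a trial, which is the gap the drift-diffusion component fills.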

  11. Character Development among Youth: Linking Lives in Time and Place

    ERIC Educational Resources Information Center

    Lerner, Richard M.

    2018-01-01

    This article embeds the study of character development within the two-decades-long research program framed by the Lerner and Lerner model of positive youth development. Character development involves attaining the feelings, thoughts, and skills needed to act coherently across time and place to serve self and others in mutually beneficial, positive…

  12. MEASURE: An integrated data-analysis and model identification facility

    NASA Technical Reports Server (NTRS)

    Singh, Jaidip; Iyer, Ravi K.

    1990-01-01

    The first phase of the development of MEASURE, an integrated data-analysis and model identification facility, is described. The facility takes system activity data as input and produces as output representative behavioral models of the system in near real time. In addition, a wide range of statistical characteristics of the measured system are available. The usage of the system is illustrated on data collected via software instrumentation of a network of SUN workstations at the University of Illinois. Initially, statistical clustering is used to identify high density regions of resource usage in a given environment. The identified regions form the states for building a state-transition model to evaluate system and program performance in real time. The model is then solved to obtain useful parameters such as the response-time distribution and the mean waiting time in each state. A graphical interface which displays the identified models and their characteristics (with real time updates) was also developed. The results provide an understanding of resource usage in the system under various workload conditions. This work is targeted at a testbed of UNIX workstations, with the initial phase ported to SUN workstations on the NASA Ames Research Center Advanced Automation Testbed.

  13. Sampling through time and phylodynamic inference with coalescent and birth–death models

    PubMed Central

    Volz, Erik M.; Frost, Simon D. W.

    2014-01-01

    Many population genetic models have been developed for the purpose of inferring population size and growth rates from random samples of genetic data. We examine two popular approaches to this problem, the coalescent and the birth–death-sampling model (BDM), in the context of estimating population size and birth rates in a population growing exponentially according to the birth–death branching process. For sequences sampled at a single time, we found the coalescent and the BDM gave virtually indistinguishable results in terms of the growth rates and fraction of the population sampled, even when sampling from a small population. For sequences sampled at multiple time points, we find that the birth–death model estimators are subject to large bias if the sampling process is misspecified. Since BDMs incorporate a model of the sampling process, we show how much of the statistical power of BDMs arises from the sequence of sample times and not from the genealogical tree. This motivates the development of a new coalescent estimator, which is augmented with a model of the known sampling process and is potentially more precise than the coalescent that does not use sample time information. PMID:25401173

  14. Using a composite grid approach in a complex coastal domain to estimate estuarine residence time

    USGS Publications Warehouse

    Warner, John C.; Geyer, W. Rockwell; Arango, Herman G.

    2010-01-01

    We investigate the processes that influence residence time in a partially mixed estuary using a three-dimensional circulation model. The complex geometry of the study region is not optimal for a structured grid model and so we developed a new method of grid connectivity. This involves a novel approach that allows an unlimited number of individual grids to be combined in an efficient manner to produce a composite grid. We then implemented this new method into the numerical Regional Ocean Modeling System (ROMS) and developed a composite grid of the Hudson River estuary region to investigate the residence time of a passive tracer. Results show that the residence time is a strong function of the time of release (spring vs. neap tide), the along-channel location, and the initial vertical placement. During neap tides there is a maximum in residence time near the bottom of the estuary at the mid-salt intrusion length. During spring tides the residence time is primarily a function of along-channel location and does not exhibit a strong vertical variability. This model study of residence time illustrates the utility of the grid connectivity method for circulation and dispersion studies in regions of complex geometry.

  15. Comparative analysis of neural network and regression based condition monitoring approaches for wind turbine fault detection

    NASA Astrophysics Data System (ADS)

    Schlechtingen, Meik; Ferreira Santos, Ilmar

    2011-07-01

    This paper presents the results of a comparison of three different model based approaches for wind turbine fault detection in online SCADA data, applying the developed models to five real measured faults and anomalies. The regression based model, the simplest approach to building a normal behavior model, is compared to two artificial neural network based approaches: a full signal reconstruction and an autoregressive normal behavior model. Based on a real time series containing two generator bearing damages, the capability of identifying the incipient fault prior to the actual failure is investigated. The period after the first bearing damage is used to develop the three normal behavior models. The developed or trained models are then used to investigate how the second damage manifests in the prediction error. Furthermore, the full signal reconstruction and the autoregressive approach are applied to further real time series containing gearbox bearing damages and stator temperature anomalies. The comparison revealed that all three models are capable of detecting incipient faults. However, they differ in the effort required for model development and in the remaining operational time after the first indication of damage. The general nonlinear neural network approaches outperform the regression model: the remaining seasonality in the regression model's prediction error makes it difficult to detect abnormality and leads to increased alarm levels and thus a shorter remaining operational period. For the bearing damages and stator anomalies under investigation, the full signal reconstruction neural network gave the best fault visibility and thus the highest confidence level.
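
    The regression-based normal behavior idea can be sketched as follows: fit a linear model on fault-free data, then flag samples whose prediction error exceeds a multiple of the training residual spread. The signal pairing (power vs. bearing temperature), the data, and the 3-sigma threshold are all illustrative assumptions, not the paper's configuration.

```python
def normal_behavior_monitor(train_x, train_y, threshold_sds=3.0):
    """Fit y = intercept + slope*x on fault-free training data and return
    a checker that flags samples whose prediction error exceeds
    `threshold_sds` times the training residual standard deviation."""
    n = len(train_x)
    mx = sum(train_x) / n
    my = sum(train_y) / n
    sxx = sum((x - mx) ** 2 for x in train_x)
    slope = sum((x - mx) * (y - my) for x, y in zip(train_x, train_y)) / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(train_x, train_y)]
    sd = (sum(r * r for r in resid) / n) ** 0.5

    def is_anomalous(x, y):
        return abs(y - (intercept + slope * x)) > threshold_sds * sd
    return is_anomalous

# Hypothetical fault-free operation: bearing temperature (deg C) vs. power (kW).
power = [100, 200, 300, 400, 500, 600, 700, 800]
temp = [40.2, 44.1, 47.9, 52.2, 55.8, 60.1, 63.9, 68.0]
check = normal_behavior_monitor(power, temp)
normal_sample = check(450, 54.0)   # near the fitted line -> not flagged
faulty_sample = check(450, 75.0)   # far above it -> incipient fault flagged
```

    The seasonality problem the abstract describes shows up here directly: if a seasonal driver is missing from the regression, its signature stays in the residuals, inflating the threshold and delaying detection, which is where the nonlinear neural network models gain their advantage.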

  16. Educational Aspirations: Markov and Poisson Models. Rural Industrial Development Project Working Paper Number 14, August 1971.

    ERIC Educational Resources Information Center

    Kayser, Brian D.

    The fit of educational aspirations of Illinois rural high school youths to 3 related one-parameter mathematical models was investigated. The models used were the continuous-time Markov chain model, the discrete-time Markov chain, and the Poisson distribution. The sample of 635 students responded to questionnaires from 1966 to 1969 as part of an…

  17. V/STOL tilt rotor aircraft study mathematical model for a real time simulation of a tilt rotor aircraft (Boeing Vertol Model 222), volume 8

    NASA Technical Reports Server (NTRS)

    Rosenstein, H.; Mcveigh, M. A.; Mollenkof, P. A.

    1973-01-01

    A mathematical model for a real time simulation of a tilt rotor aircraft was developed. The mathematical model is used for evaluating aircraft performance and handling qualities. The model is based on an eleven degree of freedom total force representation. The rotor is treated as a point source of forces and moments with appropriate response time lags and actuator dynamics. The aerodynamics of the wing, tail, rotors, landing gear, and fuselage are included.

  18. The time series approach to short term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagan, M.T.; Behr, S.M.

    The application of time series analysis methods to load forecasting is reviewed. It is shown that Box-Jenkins time series models, in particular, are well suited to this application. The logical and organized procedures for model development using the autocorrelation function make these models particularly attractive. One of the drawbacks of these models is their inability to accurately represent the nonlinear relationship between load and temperature. A simple procedure for overcoming this difficulty is introduced, and several Box-Jenkins models are compared with a forecasting procedure currently used by a utility company.
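
    The simplest member of the Box-Jenkins family, an AR(1), illustrates the approach in miniature. The load figures below are hypothetical, and a production model would, as the abstract notes, need seasonal terms and a load-temperature relationship.

```python
def fit_ar1(series):
    """Fit an AR(1) model y[t] = c + phi*y[t-1] + e[t] by ordinary
    least squares: a toy sketch of the Box-Jenkins approach."""
    x = series[:-1]
    y = series[1:]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    phi = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    c = my - phi * mx
    return c, phi

def forecast(series, steps, c, phi):
    """Iterate the fitted AR(1) forward for a short-term forecast."""
    out = []
    last = series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Hypothetical hourly system loads (MW):
load = [610, 623, 641, 652, 660, 671, 678, 684, 690, 693]
c, phi = fit_ar1(load)
next_hours = forecast(load, 3, c, phi)
```

    In the full Box-Jenkins procedure the autocorrelation and partial autocorrelation functions of the series (and of the residuals) guide the choice of AR and MA orders before a model like this is accepted.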

  19. Validation of Models Used to Inform Colorectal Cancer Screening Guidelines: Accuracy and Implications.

    PubMed

    Rutter, Carolyn M; Knudsen, Amy B; Marsh, Tracey L; Doria-Rose, V Paul; Johnson, Eric; Pabiniak, Chester; Kuntz, Karen M; van Ballegooijen, Marjolein; Zauber, Ann G; Lansdorp-Vogelaar, Iris

    2016-07-01

    Microsimulation models synthesize evidence about disease processes and interventions, providing a method for predicting long-term benefits and harms of prevention, screening, and treatment strategies. Because models often require assumptions about unobservable processes, assessing a model's predictive accuracy is important. We validated 3 colorectal cancer (CRC) microsimulation models against outcomes from the United Kingdom Flexible Sigmoidoscopy Screening (UKFSS) Trial, a randomized controlled trial that examined the effectiveness of one-time flexible sigmoidoscopy screening to reduce CRC mortality. The models incorporate different assumptions about the time from adenoma initiation to development of preclinical and symptomatic CRC. Analyses compare model predictions to study estimates across a range of outcomes to provide insight into the accuracy of model assumptions. All 3 models accurately predicted the relative reduction in CRC mortality 10 years after screening (predicted hazard ratios, with 95% percentile intervals: 0.56 [0.44, 0.71], 0.63 [0.51, 0.75], 0.68 [0.53, 0.83]; estimated with 95% confidence interval: 0.56 [0.45, 0.69]). Two models with longer average preclinical duration accurately predicted the relative reduction in 10-year CRC incidence. Two models with longer mean sojourn time accurately predicted the number of screen-detected cancers. All 3 models predicted too many proximal adenomas among patients referred to colonoscopy. Model accuracy can only be established through external validation. Analyses such as these are therefore essential for any decision model. Results supported the assumptions that the average time from adenoma initiation to development of preclinical cancer is long (up to 25 years), and mean sojourn time is close to 4 years, suggesting the window for early detection and intervention by screening is relatively long. Variation in dwell time remains uncertain and could have important clinical and policy implications. 
© The Author(s) 2016.

  20. A gene regulatory network model for floral transition of the shoot apex in maize and its dynamic modeling.

    PubMed

    Dong, Zhanshan; Danilevskaya, Olga; Abadie, Tabare; Messina, Carlos; Coles, Nathan; Cooper, Mark

    2012-01-01

The transition from vegetative to reproductive development is a critical event in the plant life cycle. The accurate prediction of flowering time in elite germplasm is important for decisions in maize breeding programs and best agronomic practices. The understanding of the genetic control of flowering time in maize has significantly advanced in the past decade. Through comparative genomics, mutant analysis, genetic analysis and QTL cloning, and transgenic approaches, more than 30 flowering time candidate genes in maize have been revealed and the relationships among these genes have been partially uncovered. Based on the knowledge of the flowering time candidate genes, a conceptual gene regulatory network model for the genetic control of flowering time in maize is proposed. To demonstrate the potential of the proposed gene regulatory network model, a first attempt was made to develop a dynamic gene network model to predict flowering time of maize genotypes varying for specific genes. The dynamic gene network model is composed of four genes and was built on the basis of gene expression dynamics of the two late-flowering mutants id1 and dlf1, the early-flowering landrace Gaspe Flint, and the temperate inbred B73. The model was evaluated against the phenotypic data of the id1 dlf1 double mutant and the ZMM4 overexpressed transgenic lines. The model provides a working example that leverages knowledge from model organisms for the utilization of maize genomic information to predict a whole-plant trait phenotype, flowering time, of maize genotypes.

  1. Predicting drowsy driving in real-time situations: Using an advanced driving simulator, accelerated failure time model, and virtual location-based services.

    PubMed

    Wang, Junhua; Sun, Shuaiyi; Fang, Shouen; Fu, Ting; Stipancic, Joshua

    2017-02-01

This paper aims both to identify the factors affecting driver drowsiness and to develop a real-time drowsy driving probability model based on virtual Location-Based Services (LBS) data obtained using a driving simulator. A driving simulation experiment was designed and conducted using 32 participant drivers. Collected data included the continuous driving time before detection of drowsiness and virtual LBS data related to temperature, time of day, lane width, average travel speed, driving time in heavy traffic, and driving time on different roadway types. Demographic information, such as nap habit, age, gender, and driving experience, was also collected through questionnaires distributed to the participants. An Accelerated Failure Time (AFT) model was developed to estimate the driving time before detection of drowsiness. The results of the AFT model showed driving time before drowsiness was longer during the day than at night, and was longer at lower temperatures. Additionally, drivers who identified as having a nap habit were more vulnerable to drowsiness. Generally, higher average travel speeds were correlated to a higher risk of drowsy driving, as were longer periods of low-speed driving in traffic jam conditions. Considering different road types, drivers felt drowsy more quickly on freeways compared to other facilities. The proposed model provides a better understanding of how driver drowsiness is influenced by different environmental and demographic factors. The model can be used to provide real-time data for the LBS-based drowsy driving warning system, improving on past methods based only on a fixed driving time. Copyright © 2016 Elsevier Ltd. All rights reserved.
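The AFT idea can be sketched in a few lines (the covariates and effect sizes below are invented; only their signs follow the record's qualitative findings): log time-to-event is linear in the covariates, so each covariate multiplies survival time by an acceleration factor exp(beta).

```python
import numpy as np

# Hedged sketch of an accelerated failure time (AFT) model:
# log(T) = X @ beta + sigma * eps. Covariate effects are assumed, not the
# paper's estimates; signs follow its findings (night and nap habit both
# shorten time-to-drowsiness).
rng = np.random.default_rng(1)
n = 400
night = rng.integers(0, 2, n)              # 1 = night-time driving
nap = rng.integers(0, 2, n)                # 1 = driver has a nap habit
X = np.column_stack([np.ones(n), night, nap])
beta_true = np.array([4.0, -0.4, -0.2])    # assumed magnitudes
log_T = X @ beta_true + rng.normal(0.0, 0.25, n)

# With no censoring, a log-normal AFT model reduces to OLS on log time.
beta_hat, *_ = np.linalg.lstsq(X, log_T, rcond=None)
accel_night = np.exp(beta_hat[1])          # < 1: drowsiness arrives sooner
```

Real drowsiness data are censored (many drives end before drowsiness is detected), so a production fit would maximize a censored likelihood rather than use plain least squares.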

  2. Generation of the Human Biped Stance by a Neural Controller Able to Compensate Neurological Time Delay

    PubMed Central

    Jiang, Ping; Chiba, Ryosuke; Takakusaki, Kaoru; Ota, Jun

    2016-01-01

    The development of a physiologically plausible computational model of a neural controller that can realize a human-like biped stance is important for a large number of potential applications, such as assisting device development and designing robotic control systems. In this paper, we develop a computational model of a neural controller that can maintain a musculoskeletal model in a standing position, while incorporating a 120-ms neurological time delay. Unlike previous studies that have used an inverted pendulum model, a musculoskeletal model with seven joints and 70 muscular-tendon actuators is adopted to represent the human anatomy. Our proposed neural controller is composed of both feed-forward and feedback controls. The feed-forward control corresponds to the constant activation input necessary for the musculoskeletal model to maintain a standing posture. This compensates for gravity and regulates stiffness. The developed neural controller model can replicate two salient features of the human biped stance: (1) physiologically plausible muscle activations for quiet standing; and (2) selection of a low active stiffness for low energy consumption. PMID:27655271

  3. The PEDA Model. An advocacy tool modeling the interrelationships between population, development, the environment and agriculture in Africa.

    PubMed

    1999-01-01

    This article reports on the PEDA (population changes, environment, socioeconomic development and agriculture) model and its implication for policy-making in Africa. PEDA is an interactive computer simulation model (developed for a Windows environment) demonstrating the long-term impacts of alternative national policies on food security status of the population. The model is based on multistate demographic techniques, projecting at the same time 8 different subgroups (by age and sex) in the population, and based on 3 dichotomous individual characteristics: urban/rural place of residence; literacy status; and food security status. Through the manipulation of scenario variables, the model enables the user to project the proportion of the population that will be food secure and food insecure for a chosen point in time. This model developed by Dr. W. Lutz, Director of the International Institute for Applied Systems Analysis, will serve as an advocacy tool to help convince policy-makers and country experts in Africa of the negative synergy arising from the interconnections of population growth, environmental deterioration, and declining agricultural production.

  4. Turbulence Modeling: Progress and Future Outlook

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.; Huang, George P.

    1996-01-01

    Progress in the development of the hierarchy of turbulence models for Reynolds-averaged Navier-Stokes codes used in aerodynamic applications is reviewed. Steady progress is demonstrated, but transfer of the modeling technology has not kept pace with the development and demands of the computational fluid dynamics (CFD) tools. An examination of the process of model development leads to recommendations for a mid-course correction involving close coordination between modelers, CFD developers, and application engineers. In instances where the old process is changed and cooperation enhanced, timely transfer is realized. A turbulence modeling information database is proposed to refine the process and open it to greater participation among modeling and CFD practitioners.

  5. Electron Induced Discharge Modeling, Testing, and Analysis for Scatha. Volume I. Phenomenology Study and Model Testing.

    DTIC Science & Technology

    1978-12-31

… taken to be a rise time of 10 ns and a fall time of 10 to 100 ns. In addition, a physical model of the discharge mechanism has been developed in which … a scale model of the P78-2, dubbed the SCATSAT, was constructed whose design was chosen to simulate the basic structure of the real satellite, including the …

  6. Software Tools For Building Decision-support Models For Flood Emergency Situations

    NASA Astrophysics Data System (ADS)

    Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.

The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins: Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration, and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.

  7. Review of numerical models to predict cooling tower performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, B.M.; Nomura, K.K.; Bartz, J.A.

    1987-01-01

    Four state-of-the-art computer models developed to predict the thermal performance of evaporative cooling towers are summarized. The formulation of these models, STAR and TEFERI (developed in Europe) and FACTS and VERA2D (developed in the U.S.), is summarized. A fifth code, based on Merkel analysis, is also discussed. Principal features of the codes, computation time and storage requirements are described. A discussion of model validation is also provided.

  8. Sub-seasonal-to-seasonal Reservoir Inflow Forecast using Bayesian Hierarchical Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, S.; Arumugam, S.

    2017-12-01

Sub-seasonal-to-seasonal (S2S) (15-90 days) streamflow forecasting is an emerging area of research that provides seamless information for reservoir operation from weather time scales to seasonal time scales. From an operational perspective, sub-seasonal inflow forecasts are highly valuable as these enable water managers to decide short-term releases (15-30 days), while holding water for seasonal needs (e.g., irrigation and municipal supply) and to meet end-of-the-season target storage at a desired level. We propose a Bayesian Hierarchical Hidden Markov Model (BHHMM) to develop S2S inflow forecasts for the Tennessee Valley Area (TVA) reservoir system. Here, the hidden states are predicted by relevant indices that influence the inflows at the S2S time scale. The hidden Markov model also captures both the spatial and temporal hierarchy in predictors that operate at the S2S time scale, with model parameters estimated as a posterior distribution using a Bayesian framework. We present our work in two steps, namely a single-site model and a multi-site model. For proof of concept, we consider inflows to Douglas Dam, Tennessee, in the single-site model. For the multi-site model, we consider reservoirs in the upper Tennessee valley. Streamflow forecasts are issued and updated continuously every day at the S2S time scale. We considered precipitation forecasts obtained from the NOAA Climate Forecast System (CFSv2) GCM as predictors for developing S2S streamflow forecasts, along with relevant indices for predicting hidden states. Spatial dependence of the inflow series of the reservoirs is also preserved in the multi-site model. To circumvent the non-normality of the data, we consider the HMM in a Generalized Linear Model setting. Skill of the proposed approach is tested using split-sample validation against a traditional multi-site canonical correlation model developed using the same set of predictors. From the posterior distribution of the inflow forecasts, we also highlight different system behavior under varied global and local scale climatic influences from the developed BHHMM.
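The hidden-Markov machinery underlying such a forecast model can be illustrated with the standard scaled forward recursion (a generic two-state sketch with made-up matrices, not the hierarchical Bayesian model itself):

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete-emission HMM.

    pi: initial state probabilities, A: state transition matrix,
    B[state, symbol]: emission probabilities, obs: observed symbol indices.
    """
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        loglik += np.log(s)
        alpha /= s
    return loglik

# Two hidden "climate states" emitting low (0) or high (1) inflow;
# all numbers are illustrative.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.8, 0.2], [0.3, 0.7]])
ll = forward_loglik([0, 0, 1, 1], pi, A, B)
```

In the Bayesian hierarchical setting described above, pi, A, and B would themselves carry priors and be sampled as posterior distributions rather than fixed as here.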

  9. Modeling of switching regulator power stages with and without zero-inductor-current dwell time

    NASA Technical Reports Server (NTRS)

    Lee, F. C. Y.; Yu, Y.

    1979-01-01

    State-space techniques are employed to derive accurate models for the three basic switching converter power stages: buck, boost, and buck/boost operating with and without zero-inductor-current dwell time. A generalized procedure is developed which treats the continuous-inductor-current mode without dwell time as a special case of the discontinuous-current mode when the dwell time vanishes. Abrupt changes of system behavior, including a reduction of the system order when the dwell time appears, are shown both analytically and experimentally. Merits resulting from the present modeling technique in comparison with existing modeling techniques are illustrated.
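The continuous-conduction case without dwell time can be sketched with a state-space averaged buck stage (the component values and duty cycle below are arbitrary illustrations, not from the paper):

```python
# Minimal sketch: state-space averaged model of a buck converter power stage
# in continuous conduction (no zero-inductor-current dwell time), integrated
# with forward Euler. Averaged dynamics:
#   L * diL/dt = d*Vin - vO        (inductor sees duty-weighted input)
#   C * dvO/dt = iL - vO/R         (capacitor feeds the load)
Vin, L, C, R, duty = 12.0, 100e-6, 100e-6, 5.0, 0.5  # volts, H, F, ohms
dt, steps = 1e-6, 20000                               # 20 ms of simulated time
iL, vO = 0.0, 0.0
for _ in range(steps):
    diL = (duty * Vin - vO) / L
    dvO = (iL - vO / R) / C
    iL += dt * diL
    vO += dt * dvO
# In steady state the averaged model gives vO ~= duty * Vin = 6 V.
```

When the dwell time appears (discontinuous conduction), the inductor-current state effectively drops out over part of the cycle, which is the order-reduction effect the abstract describes; the averaged equations above no longer apply unchanged.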

  10. Better Modeling of Electrostatic Discharge in an Insulator

    NASA Technical Reports Server (NTRS)

    Pekov, Mihail

    2010-01-01

An improved mathematical model has been developed for the time dependence of the buildup or decay of electric charge in a high-resistivity (nominally insulating) material. The model is intended primarily for use in extracting the DC electrical resistivity of such a material from voltage-versus-current measurements performed repeatedly on a sample of the material over a time comparable to the longest characteristic times (typically of the order of months) that govern the evolution of relevant properties of the material. This model is an alternative to a prior simplistic macroscopic model that yields results differing from those of the time-dependent measurements by two to three orders of magnitude.

  11. A meshless EFG-based algorithm for 3D deformable modeling of soft tissue in real-time.

    PubMed

    Abdi, Elahe; Farahmand, Farzam; Durali, Mohammad

    2012-01-01

    The meshless element-free Galerkin method was generalized and an algorithm was developed for 3D dynamic modeling of deformable bodies in real time. The efficacy of the algorithm was investigated in a 3D linear viscoelastic model of human spleen subjected to a time-varying compressive force exerted by a surgical grasper. The model remained stable in spite of the considerably large deformations occurred. There was a good agreement between the results and those of an equivalent finite element model. The computational cost, however, was much lower, enabling the proposed algorithm to be effectively used in real-time applications.

  12. "It's about Improving My Practice": The Learner Experience of Real-Time Coaching

    ERIC Educational Resources Information Center

    Sharplin, Erica J.; Stahl, Garth; Kehrwald, Ben

    2016-01-01

    This article reports on pre-service teachers' experience of the Real-Time Coaching model, an innovative technology-based approach to teacher training. The Real-Time Coaching model uses multiple feedback cycles via wireless technology to develop within pre-service teachers the specific skills and mindset toward continual improvement. Results of…

  13. Transfer Function Identification Using Orthogonal Fourier Transform Modeling Functions

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2013-01-01

    A method for transfer function identification, including both model structure determination and parameter estimation, was developed and demonstrated. The approach uses orthogonal modeling functions generated from frequency domain data obtained by Fourier transformation of time series data. The method was applied to simulation data to identify continuous-time transfer function models and unsteady aerodynamic models. Model fit error, estimated model parameters, and the associated uncertainties were used to show the effectiveness of the method for identifying accurate transfer function models from noisy data.

  14. Theoretical and Numerical Investigation of the Cavity Evolution in Gypsum Rock

    NASA Astrophysics Data System (ADS)

    Li, Wei; Einstein, Herbert H.

    2017-11-01

    When water flows through a preexisting cylindrical tube in gypsum rock, the nonuniform dissolution alters the tube into an enlarged tapered tube. A 2-D analytical model is developed to study the transport-controlled dissolution in an enlarged tapered tube, with explicit consideration of the tapered geometry and induced radial flow. The analytical model shows that the Graetz solution can be extended to model dissolution in the tapered tube. An alternative form of the governing equations is proposed to take advantage of the invariant quantities in the Graetz solution to facilitate modeling cavity evolution in gypsum rock. A 2-D finite volume model was developed to validate the extended Graetz solution. The time evolution of the transport-controlled and the reaction-controlled dissolution models for a single tube with time-invariant flow rate are compared. This comparison shows that for time-invariant flow rate, the reaction-controlled dissolution model produces a positive feedback between the tube enlargement and dissolution, while the transport-controlled dissolution does not.

  15. Modelling Faculty Replacement Strategies Using a Time-Dependent Finite Markov-Chain Process.

    ERIC Educational Resources Information Center

    Hackett, E. Raymond; Magg, Alexander A.; Carrigan, Sarah D.

    1999-01-01

    Describes the use of a time-dependent Markov-chain model to develop faculty-replacement strategies within a college at a research university. The study suggests that a stochastic modelling approach can provide valuable insight when planning for personnel needs in the immediate (five-to-ten year) future. (MSE)
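The time-dependent Markov-chain idea can be sketched by propagating a head-count distribution through year-specific transition matrices (states, rates, and counts below are all hypothetical, not the study's data):

```python
import numpy as np

# Hypothetical sketch: states are (assistant, associate, full, departed);
# each year has its own transition matrix, making the chain time-dependent.
P_by_year = {
    0: np.array([[0.80, 0.15, 0.00, 0.05],
                 [0.00, 0.85, 0.10, 0.05],
                 [0.00, 0.00, 0.93, 0.07],
                 [0.00, 0.00, 0.00, 1.00]]),
    1: np.array([[0.78, 0.16, 0.00, 0.06],
                 [0.00, 0.84, 0.10, 0.06],
                 [0.00, 0.00, 0.92, 0.08],
                 [0.00, 0.00, 0.00, 1.00]]),
}
state = np.array([40.0, 30.0, 30.0, 0.0])  # initial head counts
for year in range(2):
    state = state @ P_by_year[year]        # expected counts after each year
# The "departed" entry is the expected number of replacement hires needed.
```

The absorbing "departed" state is what drives replacement planning: its accumulated mass over the horizon is the hiring demand.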

  16. PARTICLE FLOW, MIXING, AND CHEMICAL REACTION IN CIRCULATING FLUIDIZED BED ABSORBERS

    EPA Science Inventory

    A mixing model has been developed to simulate the particle residence time distribution (RTD) in a circulating fluidized bed absorber (CFBA). Also, a gas/solid reaction model for sulfur dioxide (SO2) removal by lime has been developed. For the reaction model that considers RTD dis...

  17. DEVELOPMENT OF A MICROSCALE EMISSION FACTOR MODEL FOR PARTICULATE MATTER (MICROFACPM) FOR PREDICTING REAL-TIME MOTOR VEHICLE EMISSIONS

    EPA Science Inventory

    The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...

  18. DEVELOPMENT OF A MICROSCALE EMISSION FACTOR MODEL FOR CO (MICROFACCO) FOR PREDICTING REAL-TIME VEHICLE EMISSIONS

    EPA Science Inventory

    The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...

  19. DEVELOPMENT OF A MICROSCALE EMISSION FACTOR MODEL FOR PARTICULATE MATTER (MICROFACPM) FOR PREDICTING REAL-TIME MOTOR VEHICLE EMISSIONS

    EPA Science Inventory

    The United States Environmental Protection Agency's National Exposure Research Laboratory is pursuing a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project is to develop improved methods for modeling the source through...

  20. CHARACTERIZING SPATIAL AND TEMPORAL DYNAMICS: DEVELOPMENT OF A GRID-BASED WATERSHED MERCURY LOADING MODEL

    EPA Science Inventory

    A distributed grid-based watershed mercury loading model has been developed to characterize spatial and temporal dynamics of mercury from both point and non-point sources. The model simulates flow, sediment transport, and mercury dynamics on a daily time step across a diverse lan...

  1. Benchmark Dose Software Development and Maintenance Ten Berge Cxt Models

    EPA Science Inventory

    This report is intended to provide an overview of beta version 1.0 of the implementation of a concentration-time (CxT) model originally programmed and provided by Wil ten Berge (referred to hereafter as the ten Berge model). The recoding and development described here represent ...

  2. Embracing Complexity: Using Technology to Develop a Life-Long Learning Model for Non-Working Time in the Interdependent Homes for Adults with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Chiang, I-Tsun; Chen, Mei-Li

    2011-01-01

    The purpose of this study was to employ complexity theory as a theoretical framework and technology to facilitate the development of a life-long learning model for non-working time in the interdependent homes for adults with Autism Spectrum Disorders (ASD). A "Shining Star Sustainable Action Project" of the ROC Foundation for Autistic…

  3. Application of Wavelet Filters in an Evaluation of ...

    EPA Pesticide Factsheets

Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models, and its incorporation into model performance metrics leads one to devote resources to stochastic variations in model outputs. In this analysis, observations are compared with model outputs at seasonal, weekly, diurnal and intra-day time scales. Filters provide frequency specific information that can be used to compare the strength (amplitude) and timing (phase) of observations and model estimates. The National Exposure Research Laboratory's (NERL's) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the Nation's air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollu
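The frequency-band separation described above can be illustrated with a simple FFT band split (a stand-in for the wavelet filters actually used; the synthetic "ozone" series and the two-day cutoff are invented for the sketch):

```python
import numpy as np

# Illustrative only: split a signal into a low-frequency (seasonal) part and
# a high-frequency (diurnal/intra-day) part with an exact FFT band split.
rng = np.random.default_rng(7)
hours = np.arange(24 * 60)                      # 60 days of hourly samples
signal = (10 * np.sin(2 * np.pi * hours / (24 * 60))   # slow seasonal swing
          + 3 * np.sin(2 * np.pi * hours / 24)         # diurnal cycle
          + rng.normal(0, 0.5, hours.size))            # noise

spec = np.fft.rfft(signal)
freq = np.fft.rfftfreq(hours.size, d=1.0)       # cycles per hour
low = spec.copy()
low[freq > 1 / 48] = 0.0                        # keep periods > two days
high = spec - low                               # diurnal and faster
slow = np.fft.irfft(low, hours.size)
fast = np.fft.irfft(high, hours.size)
```

Amplitude and phase per band can then be compared between observed and modeled series; wavelets add time localization that this global FFT split lacks.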

  4. Real-time simulation of a spiking neural network model of the basal ganglia circuitry using general purpose computing on graphics processing units.

    PubMed

    Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi

    2011-11-01

    Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1100 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.
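The per-neuron update that such data-parallel simulations repeat each time step can be illustrated with a toy model far simpler than the conductance-based neurons above (all constants here are generic textbook values, not from the basal ganglia model):

```python
# Toy sketch: forward-Euler integration of a leaky integrate-and-fire neuron.
# Each neuron's update is independent given its inputs, which is why this
# kind of loop maps naturally onto one GPU thread per neuron.
dt, tau = 0.1, 10.0                      # ms
v_rest, v_th, v_reset = -65.0, -50.0, -65.0  # mV
I = 20.0                                 # constant drive (arbitrary units)
v = v_rest
spikes = []
for step in range(1000):                 # 1000 * 0.1 ms = 100 ms
    v += dt * (-(v - v_rest) + I) / tau  # leak toward rest plus input
    if v >= v_th:                        # threshold crossing
        spikes.append(step * dt)         # record spike time (ms)
        v = v_reset                      # reset membrane potential
```

Conductance-based models replace the single leak term with several voltage-dependent ionic currents, which is what makes them "computationally heavy" per step.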

  5. Particle Simulation of Coulomb Collisions: Comparing the Methods of Takizuka & Abe and Nanbu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C; Lin, T; Caflisch, R

    2007-05-22

The interactions of charged particles in a plasma are governed by long-range Coulomb collisions. We compare two widely used Monte Carlo models for Coulomb collisions. One was developed by Takizuka and Abe in 1977, the other by Nanbu in 1997. We perform deterministic and stochastic error analysis with respect to particle number and time step. The two models produce similar stochastic errors, but Nanbu's model gives smaller time-step errors. Error comparisons between the two methods are presented.

  6. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  7. The predictive ability of six pharmacokinetic models of rocuronium developed using a single bolus: evaluation with bolus and continuous infusion regimen.

    PubMed

    Sasakawa, Tomoki; Masui, Kenichi; Kazama, Tomiei; Iwasaki, Hiroshi

    2016-08-01

    Rocuronium concentration prediction using pharmacokinetic (PK) models would be useful for controlling rocuronium effects because neuromuscular monitoring throughout anesthesia can be difficult. This study assessed whether six different compartmental PK models developed from data obtained after bolus administration only could predict the measured plasma concentration (Cp) values of rocuronium delivered by bolus followed by continuous infusion. Rocuronium Cp values from 19 healthy subjects who received a bolus dose followed by continuous infusion in a phase III multicenter trial in Japan were used retrospectively as evaluation datasets. Six different compartmental PK models of rocuronium were used to simulate rocuronium Cp time course values, which were compared with measured Cp values. Prediction error (PE) derivatives of median absolute PE (MDAPE), median PE (MDPE), wobble, divergence absolute PE, and divergence PE were used to assess inaccuracy, bias, intra-individual variability, and time-related trends in APE and PE values. MDAPE and MDPE values were acceptable only for the Magorian and Kleijn models. The divergence PE value for the Kleijn model was lower than -10 %/h, indicating unstable prediction over time. The Szenohradszky model had the lowest divergence PE (-2.7 %/h) and wobble (5.4 %) values with negative bias (MDPE = -25.9 %). These three models were developed using the mixed-effects modeling approach. The Magorian model showed the best PE derivatives among the models assessed. A PK model developed from data obtained after single-bolus dosing can predict Cp values during bolus and continuous infusion. Thus, a mixed-effects modeling approach may be preferable in extrapolating such data.
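The prediction-error metrics named above (MDPE for bias, MDAPE for inaccuracy) are straightforward to compute; the concentration pairs below are invented for illustration, not the trial's data:

```python
import numpy as np

# Sketch of Varvel-style performance-error metrics for a PK model, assuming
# measured and model-predicted plasma concentrations at matched times.
measured = np.array([2.1, 1.6, 1.2, 0.9, 0.7])    # hypothetical Cp values
predicted = np.array([2.4, 1.5, 1.0, 0.95, 0.6])  # model-predicted Cp

pe = 100.0 * (measured - predicted) / predicted   # performance error, %
mdpe = np.median(pe)              # bias: signed median error
mdape = np.median(np.abs(pe))     # inaccuracy: median absolute error
```

Wobble and divergence extend the same per-sample errors to intra-individual variability and time trends: wobble is the median absolute deviation of PE about MDPE within a subject, and divergence is the slope of absolute PE against time.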

  8. Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.

    PubMed

    Höhna, Sebastian

    2013-06-01

    Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but also play an influential role in the assessment of model fit. In the present article, I consider as the model a global time-dependent birth-death process where each species has the same rates but rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to simulate reconstructed phylogenetic trees efficiently when conditioning on the number of species, on the time of the process, or on both. I show the usability of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied to a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
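
The i.i.d. property of speciation times makes simulation a simple inverse-transform draw. A hedged sketch for the constant-rate pure-birth (Yule) special case, not the TESS implementation: conditioned on n species at time T, the n-1 speciation ages (time before the present) are i.i.d. with CDF F(t) = (1 - exp(-lambda*t)) / (1 - exp(-lambda*T)). Parameter values are illustrative.

```python
# Hedged sketch: simulating speciation times of a reconstructed tree under a
# constant-rate pure-birth process, conditioned on n species at time T.
# Times are measured as ages before the present; parameters are illustrative.
import math
import random

def yule_speciation_times(n, T, lam, rng):
    norm = 1.0 - math.exp(-lam * T)
    times = []
    for _ in range(n - 1):
        u = rng.random()
        # invert F(t) = (1 - exp(-lam*t)) / norm for t
        t = -math.log(1.0 - u * norm) / lam
        times.append(t)
    return sorted(times)

rng = random.Random(42)
times = yule_speciation_times(n=10, T=5.0, lam=0.8, rng=rng)
```

For the time-dependent case the same recipe applies with the rate function folded into the CDF before inversion.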

  9. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    PubMed

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and by limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process of conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition, this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion, and build understanding with the goal of producing better planning outcomes.
The accessible, flexible, interactive and responsive nature of the applications described has the potential to allow complex environmental, social, and political considerations to be incorporated and visualised. Through supporting evidence-based planning, the innovative modelling practices described have the potential to help local health and emergency response planning in the developing world.
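
A minimal illustration of the kind of travel-time surface such tools compute: a least-cost (Dijkstra) traversal over a raster of per-cell travel costs from a service location. The friction values are hypothetical.

```python
# Hedged sketch: raster travel-time modelling. Each grid cell holds a traversal
# cost (e.g. minutes); Dijkstra's algorithm from the service location gives the
# least-cost travel time to every cell. Costs below are hypothetical.
import heapq

def travel_time_surface(cost, source):
    rows, cols = len(cost), len(cost[0])
    INF = float("inf")
    dist = [[INF] * cols for _ in range(rows)]
    dist[source[0]][source[1]] = 0.0
    pq = [(0.0, source)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 0.5 * (cost[r][c] + cost[nr][nc])  # average edge cost
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist

# 3x3 friction surface: the high-cost centre cell mimics difficult terrain
cost = [[1.0, 1.0, 1.0],
        [1.0, 9.0, 1.0],
        [1.0, 1.0, 1.0]]
tt = travel_time_surface(cost, source=(0, 0))
```

Scenario exploration then amounts to editing the friction surface (seasonal roads, new facilities) and recomputing.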

  10. A discrete time-varying internal model-based approach for high precision tracking of a multi-axis servo gantry.

    PubMed

    Zhang, Zhen; Yan, Peng; Jiang, Huan; Ye, Peiqing

    2014-09-01

    In this paper, we consider discrete time-varying internal model-based control design for high precision tracking of complicated reference trajectories generated by time-varying systems. Based on a novel parallel time-varying internal model structure, asymptotic tracking conditions for the design of internal model units are developed, and a low order robust time-varying stabilizer is further synthesized. In a discrete time setting, the high precision tracking control architecture is deployed on a Voice Coil Motor (VCM) actuated servo gantry system, where numerical simulations and real-time experimental results are provided, achieving tracking errors of around 3.5‰ for frequency-varying signals. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Application of a dynamic population-based model for evaluation of exposure reduction strategies in the baking industry

    NASA Astrophysics Data System (ADS)

    Meijster, Tim; Warren, Nick; Heederik, Dick; Tielemans, Erik

    2009-02-01

    Recently a dynamic population model was developed that simulates a population of bakery workers longitudinally through time and tracks the development of work-related sensitisation and respiratory symptoms in each worker. Input for this model comes from cross-sectional and longitudinal epidemiological studies, which allowed estimation of exposure-response relationships and disease transition probabilities. This model allows us to study the development of diseases and transitions between disease states over time in relation to determinants of disease, including flour dust and/or allergen exposure. Furthermore, it enables more realistic modelling of the health impact of different intervention strategies at the workplace (e.g. changes in exposure may take several years to impact on ill-health and often occur as a gradual trend). A large dataset of individual full-shift exposure measurements and real-time exposure measurements was used to obtain detailed insight into the effectiveness of control measures and other determinants of exposure. Given this information, a population-wide reduction of the median exposure by 50% was evaluated in this paper.
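
The population model described can be caricatured as a discrete-time Markov cohort moving between health states. A sketch with illustrative, not published, transition probabilities, comparing baseline against an exposure-reduction scenario that is assumed here to halve the yearly sensitisation risk:

```python
# Hedged sketch: discrete-time Markov cohort model of the kind described,
# tracking workers across states healthy -> sensitised -> symptomatic.
# All probabilities and the intervention effect are illustrative.
def simulate(years, p_sens, p_sympt, n0=1000.0):
    healthy, sens, sympt = n0, 0.0, 0.0
    for _ in range(years):
        new_sens = healthy * p_sens    # yearly sensitisation transitions
        new_sympt = sens * p_sympt     # yearly symptom-onset transitions
        healthy -= new_sens
        sens += new_sens - new_sympt
        sympt += new_sympt
    return healthy, sens, sympt

baseline = simulate(20, p_sens=0.02, p_sympt=0.10)
# intervention: halving median exposure assumed to halve sensitisation risk
reduced = simulate(20, p_sens=0.01, p_sympt=0.10)
```

The lag between an exposure change and its full health impact emerges naturally from the state dynamics.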

  12. Changing Behavior by Memory Aids: A Social Psychological Model of Prospective Memory and Habit Development Tested with Dynamic Field Data

    ERIC Educational Resources Information Center

    Tobias, Robert

    2009-01-01

    This article presents a social psychological model of prospective memory and habit development. The model is based on relevant research literature, and its dynamics were investigated by computer simulations. Time-series data from a behavior-change campaign in Cuba were used for calibration and validation of the model. The model scored well in…

  13. Analysis of an inventory model for both linearly decreasing demand and holding cost

    NASA Astrophysics Data System (ADS)

    Malik, A. K.; Singh, Parth Raj; Tomar, Ajay; Kumar, Satish; Yadav, S. K.

    2016-03-01

    This study proposes the analysis of an inventory model for non-instantaneous deteriorating items with linearly decreasing demand and holding cost. The model focuses on commodities with linearly decreasing demand and no shortages. The holding cost does not remain uniform over time because of variation in the time value of money; here we consider a holding cost that decreases with respect to time. The optimal time interval for the total profit and the optimal order quantity are determined. The developed inventory model is illustrated with a numerical example, and a sensitivity analysis is included.
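
A numerical sketch of this class of model under stated simplifications (deterioration omitted, all parameters hypothetical): evaluate profit per unit time over a cycle of length T with demand D(t) = a - b*t and holding cost h(t) = h0 - h1*t, then grid-search T.

```python
# Hedged sketch: single-cycle inventory policy with linearly decreasing demand
# and linearly decreasing holding cost, no shortages. Deterioration is omitted
# and parameter values are illustrative, so the published optimum will differ.
def profit_per_unit_time(T, a=100.0, b=2.0, h0=0.5, h1=0.01,
                         price=3.0, cost=1.0, setup=150.0, steps=1000):
    dt = T / steps
    # order quantity equals total demand over the cycle (no shortages)
    Q = sum((a - b * (i + 0.5) * dt) * dt for i in range(steps))
    inv, holding, demand_total = Q, 0.0, 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        d = a - b * t
        holding += (h0 - h1 * t) * inv * dt   # time-varying holding cost
        inv -= d * dt
        demand_total += d * dt
    revenue = price * demand_total
    return (revenue - cost * Q - setup - holding) / T

# grid search for the cycle length maximizing profit per unit time
best_T = max((0.5 + 0.1 * k for k in range(100)), key=profit_per_unit_time)
```

Sensitivity analysis then amounts to re-running the grid search while perturbing one parameter at a time.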

  14. On the Use of Calibration Explosions at the Former Semipalatinsk Test Site for Compiling a Travel-time Model of the Crust and Upper Mantle

    NASA Astrophysics Data System (ADS)

    Belyashova, N. N.; Shacilov, V. I.; Mikhailova, N. N.; Komarov, I. I.; Sinyova, Z. I.; Belyashov, A. V.; Malakhova, M. N.

    Two chemical calibration explosions, conducted at the former Semipalatinsk nuclear test site in 1998 with charges of 25 tons and 100 tons TNT, have been used for developing travel-time curves and generalized one-dimensional velocity models of the crust and upper mantle of the platform region of Kazakhstan. The explosions were recorded by a number of digital seismic stations located in Kazakhstan at distances ranging from 0 to 720 km. The travel-time tables developed in this paper cover the phases P, Pn, Pg, S, Sn, and Lg in a range of 0-740 km, and the velocity models apply to the crust down to 44 km depth and to the mantle down to 120 km. A comparison of the compiled travel-time tables with the existing CSE and IASPEI91 travel-time tables is presented.

  15. Space Weather Models at the CCMC And Their Capabilities

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2007-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers with access to space science models, even if they are not model owners themselves. The second focus of CCMC activities is on validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products that may be useful for space weather operators. In this presentation, we will provide an overview of the community-provided, space weather-relevant model suite that resides at the CCMC. We will discuss current capabilities and analyze expected future developments of space weather-related modeling.

  16. A multi-period optimization model for energy planning with CO(2) emission consideration.

    PubMed

    Mirzaesmaeeli, H; Elkamel, A; Douglas, P L; Croiset, E; Gupta, M

    2010-05-01

    A novel deterministic multi-period mixed-integer linear programming (MILP) model for the power generation planning of electric systems is described and evaluated in this paper. The model is developed with the objective of determining the optimal mix of energy supply sources and pollutant mitigation options that meet a specified electricity demand and CO(2) emission targets at minimum cost. Several time-dependent parameters are included in the model formulation; they include forecasted energy demand, fuel price variability, construction lead time, conservation initiatives, and increases in fixed operational and maintenance costs over time. The developed model is applied to two case studies. The objective of the case studies is to examine the economic, structural, and environmental effects that would result if the electricity sector were required to reduce its CO(2) emissions to a specified limit. Copyright 2009 Elsevier Ltd. All rights reserved.
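
A toy single-period analogue of such a planning problem, solved here by exhaustive search rather than a MILP solver: choose integer numbers of generation units so that demand is met and CO2 stays under a cap, at minimum cost. All technology data are invented for illustration.

```python
# Hedged sketch: capacity planning with a CO2 cap as a tiny integer program,
# solved by enumeration. Capacities, costs, and emissions are illustrative.
from itertools import product

# technology -> (capacity MW per unit, cost per unit, tCO2 per unit)
techs = {"coal": (100, 50.0, 90.0),
         "gas":  (80, 60.0, 40.0),
         "wind": (30, 45.0, 0.0)}
demand, co2_cap = 300, 200.0

best = None
for counts in product(range(6), repeat=len(techs)):
    cap  = sum(n * techs[t][0] for n, t in zip(counts, techs))
    cost = sum(n * techs[t][1] for n, t in zip(counts, techs))
    co2  = sum(n * techs[t][2] for n, t in zip(counts, techs))
    if cap >= demand and co2 <= co2_cap and (best is None or cost < best[0]):
        best = (cost, dict(zip(techs, counts)))
```

The multi-period MILP adds a time index, construction lead times, and demand growth, but the feasibility-plus-cost structure is the same.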

  17. Development of multilayer perceptron networks for isothermal time temperature transformation prediction of U-Mo-X alloys

    NASA Astrophysics Data System (ADS)

    Johns, Jesse M.; Burkes, Douglas

    2017-07-01

    In this work, a multilayered perceptron (MLP) network is used to develop predictive isothermal time-temperature-transformation (TTT) models covering a range of U-Mo binary and ternary alloys. The selected ternary alloys for model development are U-Mo-Ru, U-Mo-Nb, U-Mo-Zr, U-Mo-Cr, and U-Mo-Re. These models predict 'novel' U-Mo alloys quite well despite discrepancies between literature sources for similar alloys, which likely arise from different thermal-mechanical processing conditions. The models are developed with the primary purpose of informing experimental decisions: additional experimental insight is necessary to reduce the number of experiments required to isolate ideal alloys. The models allow test planners to evaluate areas of experimental interest; once initial tests are conducted, the model can be updated to further improve follow-on testing decisions. The model also improves analysis capabilities by reducing the number of data points needed from any particular test. For example, if one or two isotherms are measured during a test, the model can construct the rest of the TTT curve over a wide range of temperature and time. This modeling capability reduces the cost of experiments while also improving the value of the results from the tests. The reduced costs could result in improved material characterization and therefore an improved fundamental understanding of TTT dynamics. As additional understanding of the phenomena driving TTTs is acquired, this type of MLP model can be used to populate unknowns (such as material impurity and other thermal-mechanical properties) from past literature sources.
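
A minimal sketch of the approach: a one-hidden-layer perceptron trained by gradient descent on synthetic stand-in data (a C-shaped curve resembling a TTT isopleth, not U-Mo measurements).

```python
# Hedged sketch: a small MLP (one tanh hidden layer, full-batch gradient
# descent) fitted to a toy transformation-time curve. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
# toy data: log-time as a C-shaped function of normalized temperature
T = np.linspace(0.0, 1.0, 40).reshape(-1, 1)
y = ((T - 0.5) ** 2 * 8.0 + 1.0) + rng.normal(0.0, 0.02, T.shape)

W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(T)
loss0 = float(np.mean((pred0 - y) ** 2))
for _ in range(2000):
    h, pred = forward(T)
    err = pred - y
    gW2 = h.T @ err / len(T); gb2 = err.mean(0)          # output-layer grads
    dh = (err @ W2.T) * (1 - h ** 2)                     # backprop through tanh
    gW1 = T.T @ dh / len(T); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
_, pred1 = forward(T)
loss1 = float(np.mean((pred1 - y) ** 2))
```

The study's models would additionally take composition variables (e.g. Mo and ternary-element fractions) as inputs alongside temperature.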

  18. Evaluation of software maintainability with openEHR - a comparison of architectures.

    PubMed

    Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, James R

    2014-11-01

    To assess whether it is easier to maintain a clinical information system developed using openEHR model-driven development versus mainstream methods. A new open source application (GastrOS) has been developed following openEHR's multi-level modelling approach using .Net/C#, based on the same requirements as an existing clinically used application developed using Microsoft Visual Basic and an Access database. In the latter, almost all the domain knowledge was embedded in the software code and data model; in GastrOS, the same domain knowledge is expressed as a set of openEHR archetypes. We then introduced eight real-world change requests that had accumulated during live clinical usage and implemented these in both systems while measuring the time for various development tasks and the change in software size for each change request. Overall it took half the time to implement changes in GastrOS. However, it was the more difficult application to modify for one change request, suggesting that the nature of the change is also important. It was not possible to implement changes by modelling only. Comparison of relative measures of time and software size change within each application highlights how architectural differences affected maintainability across change requests. The use of openEHR model-driven development can result in better software maintainability. The degree to which openEHR affects software maintainability depends on the extent and nature of the domain knowledge involved in changes. Although we used relative measures for time and software size, confounding factors could not be totally excluded, as a controlled study design was not feasible. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. Modeling hurricane evacuation traffic : development of a time-dependent hurricane evacuation demand model.

    DOT National Transportation Integrated Search

    2006-04-01

    Little attention has been given to estimating dynamic travel demand in transportation planning in the past. However, when factors influencing travel are changing significantly over time such as with an approaching hurricane - dynamic demand and t...

  20. Development and characterization of an ex-vivo brain slice culture model of chronic wasting disease

    USDA-ARS?s Scientific Manuscript database

    Prion diseases have long incubation times in vivo, therefore, modeling the diseases ex-vivo will advance the development of rationale-based therapeutic strategies. An organotypic slice culture assay (POSCA) was recently developed for scrapie prions by inoculating mouse cerebellar brain slices with R...

  1. Developing Predictive Approaches to Characterize Adaptive Responses of the Reproductive Endocrine Axis to Aromatase Inhibition: Computational Modeling

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (DRTC)...

  2. A Biopsychosocial Model of the Development of Chronic Conduct Problems in Adolescence

    PubMed Central

    Dodge, Kenneth A.; Pettit, Gregory S.

    2009-01-01

    A biopsychosocial model of the development of adolescent chronic conduct problems is presented and supported through a review of empirical findings. This model posits that biological dispositions and sociocultural contexts place certain children at risk in early life but that life experiences with parents, peers, and social institutions increment and mediate this risk. A transactional developmental model is best equipped to describe the emergence of chronic antisocial behavior across time. Reciprocal influences among dispositions, contexts, and life experiences lead to recursive iterations across time that exacerbate or diminish antisocial development. Cognitive and emotional processes within the child, including the acquisition of knowledge and social-information-processing patterns, mediate the relation between life experiences and conduct problem outcomes. Implications for prevention research and public policy are noted. PMID:12661890

  3. Joint Models of Longitudinal and Time-to-Event Data with More Than One Event Time Outcome: A Review.

    PubMed

    Hickey, Graeme L; Philipson, Pete; Jorgensen, Andrea; Kolamunnage-Dona, Ruwanthi

    2018-01-31

    Methodological development and clinical application of joint models of longitudinal and time-to-event outcomes have grown substantially over the past two decades. However, much of this research has concentrated on a single longitudinal outcome and a single event time outcome. In clinical and public health research, patients who are followed up over time may often experience multiple, recurrent, or a succession of clinical events. Models that utilise such multivariate event time outcomes are quite valuable in clinical decision-making. We comprehensively review the literature for implementation of joint models involving more than a single event time per subject. We consider the distributional and modelling assumptions, including the association structure, estimation approaches, software implementations, and clinical applications. Research into this area is proving highly promising, but to date remains in its infancy.

  4. A joint model of persistent human papillomavirus infection and cervical cancer risk: Implications for cervical cancer screening

    PubMed Central

    Katki, Hormuzd A.; Cheung, Li C.; Fetterman, Barbara; Castle, Philip E.; Sundaram, Rajeshwari

    2014-01-01

    New cervical cancer screening guidelines in the US and many European countries recommend that women get tested for human papillomavirus (HPV). To inform decisions about screening intervals, we calculate the increase in precancer/cancer risk per year of continued HPV infection. However, both time to onset of precancer/cancer and time to HPV clearance are interval-censored, and onset of precancer/cancer strongly informatively censors HPV clearance. We analyze this bivariate informatively interval-censored data by developing a novel joint model for time to clearance of HPV and time to precancer/cancer using shared random-effects, where the estimated mean duration of each woman’s HPV infection is a covariate in the submodel for time to precancer/cancer. The model was fit to data on 9,553 HPV-positive/Pap-negative women undergoing cervical cancer screening at Kaiser Permanente Northern California, data that were pivotal to the development of US screening guidelines. We compare the implications for screening intervals of this joint model to those from population-average marginal models of precancer/cancer risk. In particular, after 2 years the marginal population-average precancer/cancer risk was 5%, suggesting a 2-year interval to control population-average risk at 5%. In contrast, the joint model reveals that almost all women exceeding 5% individual risk in 2 years also exceeded 5% in 1 year, suggesting that a 1-year interval is better to control individual risk at 5%. The example suggests that sophisticated risk models capable of predicting individual risk may have different implications than population-average risk models that are currently used for informing medical guideline development. PMID:26556961

  5. A joint model of persistent human papillomavirus infection and cervical cancer risk: Implications for cervical cancer screening.

    PubMed

    Katki, Hormuzd A; Cheung, Li C; Fetterman, Barbara; Castle, Philip E; Sundaram, Rajeshwari

    2015-10-01

    New cervical cancer screening guidelines in the US and many European countries recommend that women get tested for human papillomavirus (HPV). To inform decisions about screening intervals, we calculate the increase in precancer/cancer risk per year of continued HPV infection. However, both time to onset of precancer/cancer and time to HPV clearance are interval-censored, and onset of precancer/cancer strongly informatively censors HPV clearance. We analyze this bivariate informatively interval-censored data by developing a novel joint model for time to clearance of HPV and time to precancer/cancer using shared random-effects, where the estimated mean duration of each woman's HPV infection is a covariate in the submodel for time to precancer/cancer. The model was fit to data on 9,553 HPV-positive/Pap-negative women undergoing cervical cancer screening at Kaiser Permanente Northern California, data that were pivotal to the development of US screening guidelines. We compare the implications for screening intervals of this joint model to those from population-average marginal models of precancer/cancer risk. In particular, after 2 years the marginal population-average precancer/cancer risk was 5%, suggesting a 2-year interval to control population-average risk at 5%. In contrast, the joint model reveals that almost all women exceeding 5% individual risk in 2 years also exceeded 5% in 1 year, suggesting that a 1-year interval is better to control individual risk at 5%. The example suggests that sophisticated risk models capable of predicting individual risk may have different implications than population-average risk models that are currently used for informing medical guideline development.

  6. 3D-FE Modeling of 316 SS under Strain-Controlled Fatigue Loading and CFD Simulation of PWR Surge Line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohanty, Subhasish; Barua, Bipul; Listwan, Joseph

    In financial year 2017, we are focusing on developing a mechanistic fatigue model of surge line pipes for pressurized water reactors (PWRs). To that end, we plan to perform the following tasks: (1) conduct stress- and strain-controlled fatigue testing of surge-line base metal such as 316 stainless steel (SS) under constant, variable, and random fatigue loading, (2) develop cyclic plasticity material models of 316 SS, (3) develop a one-dimensional (1D) analytical or closed-form model to validate the material models and to understand the mechanics associated with 316 SS cyclic hardening and/or softening, (4) develop three-dimensional (3D) finite element (FE) models with implementation of evolutionary cyclic plasticity, and (5) develop a computational fluid dynamics (CFD) model for thermal stratification, thermal-mechanical stress, and fatigue of example reactor components, such as a PWR surge line under plant heat-up, cool-down, and normal operation with/without grid-load-following. This semi-annual progress report presents the work completed on the above tasks for a 316 SS laboratory-scale specimen subjected to strain-controlled cyclic loading with constant, variable, and random amplitude. This is the first time that accurate 3D-FE modeling of the specimen for its entire fatigue life, including the hardening and softening behavior, has been achieved. We anticipate that this work will pave the way for the development of a fully mechanistic computer model that can be used for fatigue evaluation of safety-critical metallic components, which are traditionally evaluated with heavy reliance on time-consuming and costly test-based approaches.
This basic research will not only help the nuclear reactor industry evaluate the fatigue of reactor components in a cost-effective and less time-consuming way, but will also help other safety-related industries, such as aerospace, that are heavily dependent on test-based approaches, where a single full-scale fatigue test can cost millions of dollars and require years of effort to conduct. Toward our goal of demonstrating fully mechanistic fatigue evaluation of reactor components, we also started work on developing a component-level computer model of reactor components, such as 316 SS surge line pipe. This requires developing a thermal-mechanical stress analysis model of the reactor surge line, which, in turn, requires time-dependent temperature and stratification information along the boundary of the pipe. Toward that goal, CFD models of surge lines are being developed. In this report, we also present some preliminary results showing the temperature conditions along the surge line wall under reactor heat-up, cool-down, and steady-state power operation.

  7. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  8. Rapid prototyping and AI programming environments applied to payload modeling

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Mendler, Andrew P.

    1987-01-01

    This effort focused on using artificial intelligence (AI) programming environments and rapid prototyping to aid in both manned and unmanned space flight payload simulation and training. Significant problems addressed are the large amount of development time required to design and implement just one of these payload simulations and the relative inflexibility of the resulting model in accepting future modifications. Results of this effort have suggested that both rapid prototyping and AI programming environments can significantly reduce development time and cost when applied to the domain of payload modeling for crew training. The techniques employed are applicable to a variety of domains where models or simulations are required.

  9. Development of Standard Fuel Models in Boreal Forests of Northeast China through Calibration and Validation

    PubMed Central

    Cai, Longyan; He, Hong S.; Wu, Zhiwei; Lewis, Benard L.; Liang, Yu

    2014-01-01

    Understanding the fire prediction capabilities of fuel models is vital to forest fire management. Various fuel models have been developed in the Great Xing'an Mountains in Northeast China. However, the performances of these fuel models have not been tested against historical occurrences of wildfires. Consequently, the applicability of these models requires further investigation. Thus, this paper aims to develop standard fuel models. Seven vegetation types were combined into three fuel models according to potential fire behaviors, which were clustered using Euclidean distance algorithms. Fuel model parameter sensitivity was analyzed by the Morris screening method. Results showed that the fuel model parameters 1-hour time-lag loading, dead heat content, live heat content, 1-hour time-lag SAV (surface-area-to-volume ratio), live shrub SAV, and fuel bed depth have high sensitivity. The two most sensitive fuel parameters, 1-hour time-lag loading and fuel bed depth, were selected as adjustment parameters because of their high spatio-temporal variability. The FARSITE model was then used to test the fire prediction capabilities of the combined (uncalibrated) fuel models. FARSITE was shown to yield an unrealistic prediction of the historical fire. However, the calibrated fuel models significantly improved the capability of the fuel models to predict the actual fire, with an accuracy of 89%. Validation results also showed that the model can estimate the actual fires with an accuracy exceeding 56% by using the calibrated fuel models. Therefore, these fuel models can be efficiently used to calculate fire behaviors, which can be helpful in forest fire management. PMID:24714164
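
The Morris screening idea, perturbing one input at a time and ranking inputs by the mean absolute elementary effect, can be sketched on a toy surrogate. The response function and parameter ranges below are illustrative, not the study's fuel parameters.

```python
# Hedged sketch: simplified Morris elementary-effects screening on a toy
# fire-behavior surrogate. The response function and ranges are illustrative;
# elementary effects are normalized to the [0, 1]-scaled input space.
import random

def surrogate(load, depth, sav):
    # toy response: sensitive to load and depth, nearly flat in sav
    return 3.0 * load * depth + 0.0001 * sav

def morris_mu_star(f, bounds, n_traj=200, delta=0.1, seed=1):
    rng = random.Random(seed)
    k = len(bounds)
    sums = [0.0] * k
    for _ in range(n_traj):
        x = [rng.uniform(lo, hi - delta * (hi - lo)) for lo, hi in bounds]
        base = f(*x)
        for i, (lo, hi) in enumerate(bounds):
            xp = list(x)
            xp[i] += delta * (hi - lo)           # perturb one input at a time
            sums[i] += abs((f(*xp) - base) / delta)
    return [s / n_traj for s in sums]            # mu*: mean |elementary effect|

bounds = [(0.1, 2.0), (0.1, 1.0), (500.0, 3500.0)]  # load, depth, sav
mu_star = morris_mu_star(surrogate, bounds)
```

Parameters with large mu* (here load and depth) would be flagged as sensitive and become candidates for calibration.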

  10. Empirical models for predicting volumes of sediment deposited by debris flows and sediment-laden floods in the transverse ranges of southern California

    USGS Publications Warehouse

    Gartner, Joseph E.; Cannon, Susan H.; Santi, Paul M

    2014-01-01

    Debris flows and sediment-laden floods in the Transverse Ranges of southern California pose severe hazards to nearby communities and infrastructure. Frequent wildfires denude hillslopes and increase the likelihood of these hazardous events. Debris-retention basins protect communities and infrastructure from the impacts of debris flows and sediment-laden floods and also provide critical data for volumes of sediment deposited at watershed outlets. In this study, we supplement existing data for the volumes of sediment deposited at watershed outlets with newly acquired data to develop new empirical models for predicting volumes of sediment produced by watersheds located in the Transverse Ranges of southern California. The sediment volume data represent a broad sample of conditions found in Ventura, Los Angeles and San Bernardino Counties, California. The measured volumes of sediment, watershed morphology, distributions of burn severity within each watershed, the time since the most recent fire, triggering storm rainfall conditions, and engineering soil properties were analyzed using multiple linear regressions to develop two models. A “long-term model” was developed for predicting volumes of sediment deposited by both debris flows and floods at various times since the most recent fire from a database of volumes of sediment deposited by a combination of debris flows and sediment-laden floods with no time limit since the most recent fire (n = 344). A subset of this database was used to develop an “emergency assessment model” for predicting volumes of sediment deposited by debris flows within two years of a fire (n = 92). Prior to developing the models, 32 volumes of sediment, and related parameters for watershed morphology, burn severity and rainfall conditions were retained to independently validate the long-term model. Ten of these volumes of sediment were deposited by debris flows within two years of a fire and were used to validate the emergency assessment model. 
The models were validated by comparing predicted and measured volumes of sediment. The same validations were performed for previously developed models and show that the models developed here predict sediment volumes for burned watersheds better than the previously developed models.
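
The modelling workflow, multiple linear regression on watershed and storm predictors followed by validation on held-out volumes, can be sketched with synthetic data standing in for the measurements.

```python
# Hedged sketch: fitting and validating an empirical ln(volume) regression of
# the same general form. The predictors and coefficients are synthetic stand-ins
# for the measured sediment-volume database, not the published model.
import numpy as np

rng = np.random.default_rng(7)
n = 120
area   = rng.uniform(0.1, 25.0, n)      # watershed area, km^2
burned = rng.uniform(0.0, 1.0, n)       # fraction burned at high severity
rain   = rng.uniform(5.0, 60.0, n)      # triggering rainfall intensity, mm/h
lnV = 4.0 + 0.6 * np.log(area) + 1.2 * burned + 0.03 * rain \
      + rng.normal(0.0, 0.4, n)         # synthetic "measured" ln(volume)

X = np.column_stack([np.ones(n), np.log(area), burned, rain])
train, hold = slice(0, 90), slice(90, None)      # hold out records to validate
beta, *_ = np.linalg.lstsq(X[train], lnV[train], rcond=None)
pred = X[hold] @ beta
ss_res = float(np.sum((lnV[hold] - pred) ** 2))
ss_tot = float(np.sum((lnV[hold] - lnV[hold].mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot                       # validation skill
```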

  11. Improving models of democracy: the example of lagged effects of economic development, education, and gender equality.

    PubMed

    Balaev, Mikhail

    2014-07-01

    The author examines how time-delayed effects of economic development, education, and gender equality influence political democracy. A review of the literature shows an inadequate understanding of lagged effects, which raises methodological and theoretical issues for current quantitative studies of democracy. Using country-years as the unit of analysis, the author estimates a series of OLS PCSE models for each predictor, with a systematic analysis of the distributions of the lagged effects. A second set of multiple OLS PCSE regressions is then estimated including all three independent variables. The results show that economic development, education, and gender equality have three distinct trajectories of time-delayed effects: economic development has long-term effects, education produces continuous effects regardless of timing, and gender equality has the most prominent immediate and short-term effects. The results call for a reassessment of model specifications and theoretical setups in quantitative studies of democracy. Copyright © 2014 Elsevier Inc. All rights reserved.
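
The lagged-effect analysis can be illustrated by regressing an outcome on a single predictor entered at successive lags and tracing out the coefficient profile. The series below is simulated (a single long series rather than a panel, and plain OLS rather than PCSE) with a true effect that peaks at lag 2.

```python
# Hedged sketch: tracing a lagged-effect profile with OLS on simulated data.
# The data-generating process has effects 0.1, 0.3, 0.8, 0.2 at lags 0..3.
import numpy as np

rng = np.random.default_rng(3)
T = 60
x = rng.normal(0.0, 1.0, T)
eps = rng.normal(0.0, 0.1, T)
y = np.full(T, np.nan)
for t in range(3, T):  # outcome depends on current and lagged predictor values
    y[t] = 0.1 * x[t] + 0.3 * x[t - 1] + 0.8 * x[t - 2] + 0.2 * x[t - 3] + eps[t]

def lag_coef(lag):
    """OLS slope of y_t on x_(t-lag), skipping undefined early years."""
    rows = [(x[t - lag], y[t]) for t in range(3, T) if t - lag >= 0]
    xs = np.array([r[0] for r in rows])
    ys = np.array([r[1] for r in rows])
    X = np.column_stack([np.ones(len(xs)), xs])
    beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
    return float(beta[1])

profile = {lag: lag_coef(lag) for lag in range(5)}
```

The coefficient profile recovers the shape of the true delayed effect, peaking at the lag where the effect is strongest.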

  12. Collaborative testing of turbulence models

    NASA Technical Reports Server (NTRS)

    Bradshaw, Peter; Launder, Brian E.; Lumley, John L.

    1991-01-01

    A review is given of an ongoing international project, in which data from experiments on, and simulations of, turbulent flows are distributed to developers of (time-averaged) engineering turbulence models. The predictions of each model are sent to the organizers and redistributed to all the modelers, plus some experimentalists and other experts (total approx. 120), for comment. The 'reaction time' of modelers has proved to be much longer than anticipated, partly because the comparisons with data have prompted many modelers to improve their models or numerics.

  13. P80 SRM low torque flex-seal development - thermal and chemical modeling of molding process

    NASA Astrophysics Data System (ADS)

    Descamps, C.; Gautronneau, E.; Rousseau, G.; Daurat, M.

    2009-09-01

    The development of the flex-seal component of the P80 nozzle gave the opportunity to set up new design and manufacturing process methods. Due to the short development lead time required by the VEGA program, the usual iterative manufacturing test workflow, which is time-consuming, had to be enhanced with a more predictive approach. A newly refined description of rubber vulcanization was built and identified on laboratory samples. This chemical model was implemented in a thermal analysis code. The complete model successfully supports the manufacturing processes. These activities were conducted with the support of ESA/CNES Research & Technologies and DGA (General Delegation for Armament).
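    A predictive vulcanization model of the general kind described above can be illustrated with nth-order Arrhenius cure kinetics coupled to a prescribed temperature. All parameter values are invented for illustration, not from the P80 program:

```python
import numpy as np

# Minimal sketch of an nth-order Arrhenius cure-kinetics model of the
# kind used to follow rubber vulcanization during molding:
#   d(alpha)/dt = A * exp(-Ea / (R*T)) * (1 - alpha)**n
# where alpha is the degree of cure. Parameters are illustrative.
A, Ea, n_order = 1.0e7, 8.0e4, 1.5      # 1/s, J/mol, reaction order
R = 8.314                                # J/(mol K)

def cure(T_kelvin, t_end, dt=1.0):
    """Integrate the degree of cure alpha at a constant temperature."""
    alpha = 0.0
    k = A * np.exp(-Ea / (R * T_kelvin))
    for _ in range(int(t_end / dt)):     # simple forward-Euler step
        alpha += dt * k * (1.0 - alpha)**n_order
        alpha = min(alpha, 1.0)
    return alpha

for T_c in (140.0, 150.0, 160.0):
    print(f"{T_c:.0f} C -> degree of cure after 1 h = "
          f"{cure(T_c + 273.15, 3600.0):.3f}")
```

In a real molding simulation, the cure equation is solved together with the transient heat-conduction problem, since the local temperature history drives the local cure state.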

  14. Multiobjective optimization model of intersection signal timing considering emissions based on field data: A case study of Beijing.

    PubMed

    Kou, Weibin; Chen, Xumei; Yu, Lei; Gong, Huibo

    2018-04-18

    Most existing signal timing models aim to minimize total delay and stops at intersections, without considering environmental factors. This paper analyzes the trade-off between vehicle emissions and traffic efficiency on the basis of field data. First, considering the different operating modes of cruising, acceleration, deceleration, and idling, field emissions and Global Positioning System (GPS) data are collected to estimate emission rates for heavy-duty and light-duty vehicles. Second, a multiobjective signal timing optimization model is established based on a genetic algorithm to minimize delay, stops, and emissions. Finally, a case study is conducted in Beijing. Nine scenarios are designed with different weights for emissions and traffic efficiency. Compared with results using the Highway Capacity Manual (HCM) 2010, signal timing optimized by the proposed model decreases vehicle delay and emissions more significantly. Vehicle emissions are heavy at signalized intersections in urban areas, so the optimization model, which can be applied in different cities, provides support for eco-signal design and development.
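    The optimization idea, a genetic algorithm searching signal settings against a weighted combination of delay and an emissions-related term, can be sketched as follows. The cost terms are illustrative stand-ins for the paper's field-calibrated delay and emission models:

```python
import numpy as np

# Toy weighted-sum genetic algorithm over a two-phase signal's green
# split. The delay term is a uniform-delay approximation and the
# emissions term is proxied by vehicles arriving on red (stops);
# all parameter values are illustrative.
rng = np.random.default_rng(2)
CYCLE = 90.0                       # s, fixed cycle length
q = np.array([0.35, 0.25])         # demand (veh/s) on the two phases
s = 0.5                            # saturation flow (veh/s)

def cost(g1, w_delay=0.5, w_emis=0.5):
    g = np.array([g1, CYCLE - g1])
    red = CYCLE - g
    delay = np.sum(red**2 / (2 * CYCLE * np.maximum(1 - q / s, 1e-6)))
    stops = np.sum(q * red)        # proxy for stop-related emissions
    return w_delay * delay + w_emis * stops

pop = rng.uniform(10.0, 80.0, 40)  # initial population of green splits
for _ in range(60):                # elitist select-and-mutate loop
    fit = np.array([cost(g) for g in pop])
    parents = pop[np.argsort(fit)[:20]]
    children = parents + rng.normal(0.0, 2.0, 20)
    pop = np.clip(np.concatenate([parents, children]), 10.0, 80.0)

best = min(pop, key=cost)
print(f"best green split: {best:.1f} s / {CYCLE - best:.1f} s")
```

Re-running with different (w_delay, w_emis) pairs reproduces the scenario analysis: each weighting traces out a different point on the delay-emissions trade-off.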

  15. Development of a human cadaver model for training in laparoscopic donor nephrectomy.

    PubMed

    Sutton, Erica R H; Billeter, Adrian; Druen, Devin; Roberts, Henry; Rice, Jonathan

    2017-06-01

    The organ procurement network recommends a surgeon record 15 cases as surgeon or assistant for laparoscopic donor nephrectomies (LDN) prior to independent practice. The literature suggests that the learning curve for improved perioperative and patient outcomes is closer to 35 cases. In this article, we describe our development of a model utilizing fresh tissue and objective, quantifiable endpoints to document surgical progress and efficiency in each of the major steps involved in LDN. Phase I of model development focused on the modifications necessary to maintain visualization for laparoscopic surgery in a human cadaver. Phase II tested proposed learner-based metrics of procedural competency for multiport LDN by timing procedural steps of LDN in a novice learner. Phases I and II required 12 and nine cadavers, respectively, with a total of 35 kidneys utilized. The following metrics improved with trial number for multiport LDN: time taken for dissection of the gonadal vein, ureter, renal hilum, adrenal and lumbar veins, simulated warm ischemic time (WIT), and operative time. Human cadavers can be used for training in LDN as evidenced by improvements in timed learner-based metrics. This simulation-based model fills a gap in available training options for surgeons. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Integrated modeling of storm drain and natural channel networks for real-time flash flood forecasting in large urban areas

    NASA Astrophysics Data System (ADS)

    Habibi, H.; Norouzi, A.; Habib, A.; Seo, D. J.

    2016-12-01

    To produce accurate predictions of flooding in urban areas, it is necessary to model both natural channel and storm drain networks. While there exist many urban hydraulic models of varying sophistication, most of them are not practical for real-time application in large urban areas. On the other hand, most distributed hydrologic models developed for real-time applications lack the ability to explicitly simulate storm drains. In this work, we develop a storm drain model that can be coupled with distributed hydrologic models, such as the National Weather Service Hydrology Laboratory's Distributed Hydrologic Model, for real-time flash flood prediction in large urban areas. The aim is to improve prediction and to advance understanding of the integrated response of natural channels and storm drains to rainfall events of varying magnitude and spatiotemporal extent in urban catchments of varying sizes. The initial study area is the Johnson Creek Catchment (40.1 km²) in the City of Arlington, TX. For observed rainfall, the high-resolution (500 m, 1 min) precipitation data from the Dallas-Fort Worth Demonstration Network of the Collaborative Adaptive Sensing of the Atmosphere radars are used.

  17. A new order splitting model with stochastic lead times for deterioration items

    NASA Astrophysics Data System (ADS)

    Sazvar, Zeinab; Akbari Jokar, Mohammad Reza; Baboli, Armand

    2014-09-01

    In unreliable supply environments, the strategy of pooling lead time risks by splitting replenishment orders among multiple suppliers simultaneously is an attractive sourcing policy that has captured the attention of academic researchers and corporate managers alike. While various assumptions are considered in the models developed, researchers tend to overlook an important inventory category in order splitting models: deteriorating items. In this paper, we study an order splitting policy for a retailer that sells a deteriorating product. The inventory system is modelled as a continuous review system (s, Q) under stochastic lead time. Demand rate per unit time is assumed to be constant over an infinite planning horizon and shortages are backordered completely. We develop two inventory models. In the first model, it is assumed that all the requirements are supplied by only one source, whereas in the second, two suppliers are available. We use sensitivity analysis to determine the situations in which each sourcing policy is the most economic. We then study a real case from the European pharmaceutical industry to demonstrate the applicability and effectiveness of the proposed models. Finally, more promising directions are suggested for future research.
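    The benefit of splitting an order between two suppliers under stochastic lead times can be illustrated with a small Monte Carlo experiment. Distributions and parameters below are invented for illustration, not taken from the paper's models:

```python
import numpy as np

# Monte Carlo sketch of a continuous-review (s, Q) item with constant
# demand, exponential deterioration, and stochastic lead times,
# comparing single sourcing against splitting the order across two
# suppliers. All parameter values are illustrative.
rng = np.random.default_rng(3)

def cycle_shortage(split, trials=20000, s=50.0, d=2.0, theta=0.01):
    """Mean units short per replenishment cycle.

    split=False: one order, lead time ~ N(20, 6) days.
    split=True : two simultaneous half orders; stock must last only
                 until the FIRST of the two arrives.
    """
    if split:
        L = np.minimum(rng.normal(20, 6, trials), rng.normal(20, 6, trials))
    else:
        L = rng.normal(20, 6, trials)
    L = np.clip(L, 0.0, None)
    # stock at arrival: reorder point s decayed by deterioration theta,
    # minus lead-time demand (a simple first-order approximation)
    on_hand = s * np.exp(-theta * L) - d * L
    return np.mean(np.maximum(-on_hand, 0.0))

print("single sourcing, mean shortage:", round(cycle_shortage(False), 2))
print("order splitting, mean shortage:", round(cycle_shortage(True), 2))
```

Because the first arrival of two independent lead times is stochastically earlier and less variable than a single lead time, the split policy cuts expected shortages, the lead-time-risk-pooling effect described above. A full comparison would also price the extra ordering cost of the second supplier.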

  18. Development of the Complex General Linear Model in the Fourier Domain: Application to fMRI Multiple Input-Output Evoked Responses for Single Subjects

    PubMed Central

    Rio, Daniel E.; Rawlings, Robert R.; Woltz, Lawrence A.; Gilman, Jodi; Hommer, Daniel W.

    2013-01-01

    A linear time-invariant model based on statistical time series analysis in the Fourier domain for single subjects is further developed and applied to functional MRI (fMRI) blood-oxygen level-dependent (BOLD) multivariate data. This methodology was originally developed to analyze multiple stimulus input evoked response BOLD data. However, to analyze clinical data generated using a repeated measures experimental design, the model has been extended to handle multivariate time series data and demonstrated on control and alcoholic subjects taken from data previously analyzed in the temporal domain. Analysis of BOLD data is typically carried out in the time domain where the data has a high temporal correlation. These analyses generally employ parametric models of the hemodynamic response function (HRF) where prewhitening of the data is attempted using autoregressive (AR) models for the noise. However, this data can be analyzed in the Fourier domain. Here, assumptions made on the noise structure are less restrictive, and hypothesis tests can be constructed based on voxel-specific nonparametric estimates of the hemodynamic transfer function (HRF in the Fourier domain). This is especially important for experimental designs involving multiple states (either stimulus or drug induced) that may alter the form of the response function. PMID:23840281

  20. [Quantitative relationship between gas chromatographic retention time and structural parameters of alkylphenols].

    PubMed

    Ruan, Xiaofang; Zhang, Ruisheng; Yao, Xiaojun; Liu, Mancang; Fan, Botao

    2007-03-01

    Alkylphenols are a group of persistent pollutants in the environment and can adversely disturb the human endocrine system. It is therefore important to effectively separate and measure alkylphenols. To guide the chromatographic analysis of these compounds in practice, the development of a quantitative relationship between molecular structure and the retention time of alkylphenols becomes necessary. In this study, topological, constitutional, geometrical, electrostatic and quantum-chemical descriptors of 44 alkylphenols were calculated using the software CODESSA, and these descriptors were pre-selected using the heuristic method. As a result, a three-descriptor linear model (LM) was developed to describe the relationship between molecular structure and the retention time of alkylphenols. Meanwhile, a non-linear regression model was also developed based on support vector machine (SVM) using the same three descriptors. The correlation coefficient (R2) for the LM and SVM was 0.98 and 0.92, and the corresponding root-mean-square error was 0.99 and 2.77, respectively. By comparing the stability and prediction ability of the two models, it was found that the linear model was the better method for describing the quantitative relationship between the retention time of alkylphenols and molecular structure. The results suggest that the linear model could be applied for the chromatographic analysis of alkylphenols with known molecular structural parameters.
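    Fitting and scoring a three-descriptor linear retention model can be sketched as follows. The descriptor values are synthetic stand-ins for the CODESSA descriptors, and leave-one-out error serves as a simple proxy for the prediction-ability comparison:

```python
import numpy as np

# Sketch of a three-descriptor linear retention-time model on
# synthetic descriptors (in the study these came from CODESSA).
rng = np.random.default_rng(4)
n = 44
D = rng.normal(size=(n, 3))                  # three pre-selected descriptors
t_ret = 5.0 + D @ np.array([2.0, -1.2, 0.8]) + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), D])
beta, *_ = np.linalg.lstsq(X, t_ret, rcond=None)
pred = X @ beta
rmse = np.sqrt(np.mean((t_ret - pred)**2))
r2 = 1 - np.sum((t_ret - pred)**2) / np.sum((t_ret - t_ret.mean())**2)

# Leave-one-out cross-validation as a crude prediction-ability check
loo_err = []
for i in range(n):
    mask = np.arange(n) != i
    b, *_ = np.linalg.lstsq(X[mask], t_ret[mask], rcond=None)
    loo_err.append(t_ret[i] - X[i] @ b)
loo_rmse = np.sqrt(np.mean(np.square(loo_err)))

print(f"fit: R^2 = {r2:.2f}, RMSE = {rmse:.2f}, LOO RMSE = {loo_rmse:.2f}")
```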

  1. Forecast of dengue incidence using temperature and rainfall.

    PubMed

    Hii, Yien Ling; Zhu, Huaiping; Ng, Nawi; Ng, Lee Ching; Rocklöv, Joacim

    2012-01-01

    An accurate early warning system to predict impending epidemics enhances the effectiveness of preventive measures against dengue fever. The aim of this study was to develop and validate a forecasting model that could predict dengue cases and provide timely early warning in Singapore. We developed a time series Poisson multivariate regression model using weekly mean temperature and cumulative rainfall over the period 2000-2010. Weather data were modeled using piecewise linear spline functions. We analyzed various lag times between dengue and weather variables to identify the optimal dengue forecasting period. Autoregression, seasonality and trend were considered in the model. We validated the model by forecasting dengue cases from week 1 of 2011 up to week 16 of 2012 using weather data alone. Model selection and validation were based on Akaike's Information Criterion, standardized Root Mean Square Error, and residual diagnostics. A Receiver Operating Characteristic curve was used to analyze the sensitivity of the forecast of epidemics. The optimal period for dengue forecasting was 16 weeks. Our model forecasted correctly with errors of 0.3 and 0.32 of the standard deviation of reported cases during the model training and validation periods, respectively. It distinguished outbreaks from non-outbreaks with a sensitivity of 96% (CI = 93-98%) in 2004-2010 and 98% (CI = 95-100%) in 2011. The model predicted the 2011 outbreak accurately, with less than a 3% possibility of a false alarm. We have developed a weather-based dengue forecasting model that allows warning 16 weeks in advance of dengue epidemics with high sensitivity and specificity. We demonstrate that models using temperature and rainfall can be simple, precise, and low-cost tools for dengue forecasting, which could be used to enhance decision making on the timing and scale of vector control operations and the utilization of limited resources.
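    The backbone of such a model, Poisson regression of weekly counts on weather lagged by 16 weeks, can be sketched with a hand-rolled Newton-Raphson fit on simulated data. The actual model additionally used splines, autoregression, seasonality and trend:

```python
import numpy as np

# Sketch: Poisson regression of weekly case counts on a weather
# variable lagged by 16 weeks, fitted by Newton-Raphson (IRLS).
# Data and coefficients are simulated for illustration.
rng = np.random.default_rng(5)
weeks = 300
temp = 27 + 2*np.sin(2*np.pi*np.arange(weeks)/52) + rng.normal(0, 0.3, weeks)

LAG = 16
x = temp[:weeks - LAG]                  # weather observed 16 weeks earlier
xc = x - x.mean()                       # centred for numerical stability
cases = rng.poisson(np.exp(0.5 + 0.12 * x))

X = np.column_stack([np.ones(len(x)), xc])
beta = np.array([np.log(cases.mean()), 0.0])
for _ in range(25):                     # Newton-Raphson for the Poisson GLM
    mu = np.exp(np.clip(X @ beta, -20, 20))
    grad = X.T @ (cases - mu)
    hess = X.T @ (X * mu[:, None])
    beta += np.linalg.solve(hess, grad)

print("lag-16 slope estimate:", round(beta[1], 3))
```

Forecasting then amounts to plugging the most recent observed weather into the fitted linear predictor, which is what makes a 16-week warning horizon possible from weather data alone.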

  2. Development of an On-board Failure Diagnostics and Prognostics System for Solid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Smelyanskiy, Vadim N.; Luchinsky, Dmitry G.; Osipov, Vyatcheslav V.; Timucin, Dogan A.; Uckun, Serdar

    2009-01-01

    We develop a case breach model for the on-board fault diagnostics and prognostics system for subscale solid-rocket boosters (SRBs). The model development was motivated by recent ground firing tests, in which a deviation of measured time-traces from the predicted time-series was observed. A modified model takes into account the nozzle ablation, including the effect of roughness of the nozzle surface, the geometry of the fault, and erosion and burning of the walls of the hole in the metal case. The derived low-dimensional performance model (LDPM) of the fault can reproduce the observed time-series data very well. To verify the performance of the LDPM, we build a FLUENT model of the case breach fault and demonstrate good agreement between theoretical predictions based on the analytical solution of the model equations and the results of the FLUENT simulations. We then incorporate the derived LDPM into an inferential Bayesian framework and verify performance of the Bayesian algorithm for the diagnostics and prognostics of the case breach fault. It is shown that the obtained LDPM allows one to track parameters of the SRB in real time during the flight, to diagnose a case breach fault, and to predict its development in the future. The application of the method to fault diagnostics and prognostics (FD&P) of other SRB fault modes is discussed.

  3. Time-lapse imaging of neural development: zebrafish lead the way into the fourth dimension.

    PubMed

    Rieger, Sandra; Wang, Fang; Sagasti, Alvaro

    2011-07-01

    Time-lapse imaging is often the only way to appreciate fully the many dynamic cell movements critical to neural development. Zebrafish possess many advantages that make them the best vertebrate model organism for live imaging of dynamic development events. This review will discuss technical considerations of time-lapse imaging experiments in zebrafish, describe selected examples of imaging studies in zebrafish that revealed new features or principles of neural development, and consider the promise and challenges of future time-lapse studies of neural development in zebrafish embryos and adults. Copyright © 2011 Wiley-Liss, Inc.

  4. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009).

    PubMed

    Nishiura, Hiroshi

    2011-02-16

    Real-time forecasting of epidemics, especially forecasting based on a likelihood approach, is understudied. This study aimed to develop a simple method that can be used for real-time epidemic forecasting. A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. The quality of forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak, with all the observed data points falling within the uncertainty bounds. Real-time forecasting using the discrete time stochastic model, with its simple computation of the uncertainty bounds, was successful. Because of the simple model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of disease surveillance.
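    The forecasting idea, propagating chains of offspring distributions forward from current incidence, can be sketched by simulating a branching process with Poisson offspring. The reproduction number and starting incidence below are illustrative, not estimates for H1N1-2009:

```python
import numpy as np

# Sketch: treat weekly incidence as a branching process with Poisson
# offspring (mean R per case) and propagate it forward by simulation
# to obtain a median forecast with uncertainty bounds. R and the
# starting incidence are illustrative.
rng = np.random.default_rng(6)

def forecast(current_cases, R, horizon, sims=5000):
    paths = np.empty((sims, horizon), dtype=int)
    for s in range(sims):
        n = current_cases
        for h in range(horizon):
            n = rng.poisson(R * n)           # next generation's case count
            paths[s, h] = n
    lo, med, hi = np.percentile(paths, [2.5, 50, 97.5], axis=0)
    return lo, med, hi

lo, med, hi = forecast(current_cases=120, R=1.3, horizon=4)
print("median path:", med)
print("95% bounds :", list(zip(lo, hi)))
```

In the paper, the analogous bounds come from analytically chaining conditional offspring distributions rather than from simulation, but the interpretation of the envelope is the same.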

  5. Univariate and multivariate spatial models of health facility utilisation for childhood fevers in an area on the coast of Kenya.

    PubMed

    Ouma, Paul O; Agutu, Nathan O; Snow, Robert W; Noor, Abdisalan M

    2017-09-18

    Precise quantification of health service utilisation is important for the estimation of disease burden and allocation of health resources. Current approaches to mapping health facility utilisation rely on spatial accessibility alone as the predictor. However, other spatially varying social, demographic and economic factors may affect the use of health services. The exclusion of these factors can lead to inaccurate estimation of health facility utilisation. Here, we compare the accuracy of a univariate spatial model, developed only from estimated travel time, to a multivariate model that also includes relevant social, demographic and economic factors. A theoretical surface of travel time to the nearest public health facility was developed, and travel times were assigned to each child reported to have had fever in the Kenya demographic and health survey of 2014 (KDHS 2014). The relationship of child treatment seeking for fever with travel time and with household and individual factors from the KDHS 2014 was determined using multilevel mixed modelling. Bayesian information criterion (BIC) and likelihood ratio (LRT) tests were carried out to measure how the selected factors improve parsimony and goodness of fit of the travel-time model. Using the mixed model, a univariate spatial model of health facility utilisation was fitted with travel time as the predictor. The mixed model was also used to compute a multivariate spatial model of utilisation, using travel time and modelled surfaces of selected household and individual factors as predictors. The univariate and multivariate spatial models were then compared using the area under the receiver operating characteristic curve (AUC) and a percent correct prediction (PCP) test. The best-fitting multivariate model had travel time, household wealth index and number of children in the household as predictors. These factors reduced the BIC of the travel-time model from 4008 to 2959, a change confirmed by the LRT.
Although the two modelled probability surfaces were highly correlated (adjusted R² = 88%), the multivariate model had a better AUC than the univariate model (0.83 versus 0.73) and a better PCP (0.61 versus 0.45). Our study shows that a model using travel time together with household- and individual-level socio-demographic factors estimates the use of health facilities for the treatment of childhood fever more accurately than one relying on travel time alone.
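    The two comparison metrics used above, AUC and PCP, can be computed from scratch. A sketch on synthetic predicted probabilities (all data illustrative), with AUC via the rank (Mann-Whitney) formulation and PCP at a 0.5 cut-off:

```python
import numpy as np

# Synthetic predicted probabilities of "sought treatment" (1) vs
# "did not" (0) from a stronger and a weaker model, for illustration.
rng = np.random.default_rng(7)
y = rng.integers(0, 2, 500)
p_multi = np.clip(0.7*y + rng.normal(0.15, 0.2, 500), 0, 1)  # stronger model
p_uni   = np.clip(0.4*y + rng.normal(0.30, 0.3, 500), 0, 1)  # weaker model

def auc(y, p):
    """AUC via the rank-sum (Mann-Whitney) formulation."""
    ranks = np.argsort(np.argsort(p)) + 1          # 1-based ranks of scores
    n1, n0 = y.sum(), (1 - y).sum()
    return (ranks[y == 1].sum() - n1*(n1 + 1)/2) / (n1 * n0)

def pcp(y, p, cut=0.5):
    """Percent correct prediction at a probability cut-off."""
    return np.mean((p >= cut) == (y == 1))

for name, p in [("multivariate", p_multi), ("univariate", p_uni)]:
    print(f"{name}: AUC = {auc(y, p):.2f}, PCP = {pcp(y, p):.2f}")
```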

  6. Melanoma Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk.

  7. New solar irradiances for use in space research

    NASA Astrophysics Data System (ADS)

    Tobiska, W.; Bouwer, D.; Jones, A.

    Space environment research applications require solar irradiances in a variety of time scales and spectral formats. We describe the development of research-grade modeled solar irradiances using four models and systems that are also used for space weather operations. The four model systems include SOLAR2000 (S2K), SOLARFLARE (SFLR), APEX, and IDAR, which are used by Space Environment Technologies (SET) to provide solar irradiances from the soft X-rays through the visible spectrum. SFLR uses the GOES 0.1-0.8 nm X-rays in combination with a Mewe model subroutine to provide 0.1-30.0 nm irradiances at 0.1 nm spectral resolution at 1 minute time resolution, and in a 6-hour XUV-EUV spectral solar flare evolution forecast with a 7 minute latency and a 2 minute cadence. These irradiances have been calibrated with the SORCE XPS observations, and we report on the inclusion of these irradiances in the S2K model. There are additional developments with S2K that we discuss, particularly the method by which S2K is emerging as a hybrid model (empirical plus physics-based) and real-time data integration platform. Numerous new solar indices have been recently developed for the operations community, and we describe their inclusion in S2K. The APEX system is a real-time data retrieval system developed under contract to the University of Southern California Space Sciences Center (SSC) to provide SOHO SEM data processing and distribution. SSC provides the updated SEM data to the research community, and SET provides the operational data to the space operations community.

  8. Statistical modelling of sea lice count data from salmon farms in the Faroe Islands.

    PubMed

    Gislason, H

    2018-06-01

    Fiskaaling regularly counts the number of sea lice in the attached development stages (chalimus, mobiles and adult) for the salmon farms in the Faroe Islands. A statistical model of the data is developed. In the model, the sea-lice infection is represented by the chalimus (or mobile) lice developing into adult lice and is used to simulate past and current levels of adult lice, including treatments, as well as to predict the adult sea lice level 1-2 months into the future. Time series of the chalimus and adult lice show cross-correlations that shift in time and grow in size with temperature. This implies in situ temperature-dependent development times of about 56 down to 42 days, and inverted development times (growth rates) of 0.018 up to 0.024 per day, at 8-10°C. The temperature dependence D(T) = α1/(T + α2)^α3 = 17,840/(T + 7.439)^2.128 is approximated by D1(T) = 105.2 - 6.578T ≈ 49 days at the mean temperature of 8.5°C, similar to Dcha(T) = 100.6 - 6.507T ≈ 45 days from EWOS data. The observed development times at four sites for a year (2010-11) were 49, 50, 51 and 52 days, respectively. Finally, we estimate the sea lice production from fish farms to discuss approaches to control the sea lice epidemics, preferably by natural means. This study is useful for understanding sea lice levels and treatments, and for in situ analysis of the sea-lice development times and growth rates. © 2017 John Wiley & Sons Ltd.
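    The fitted temperature dependence, as reconstructed here from the abstract (D(T) = α1/(T + α2)^α3 with α1 = 17,840, α2 = 7.439, α3 = 2.128, plus its linear approximation), can be evaluated directly:

```python
import numpy as np

# Chalimus-to-adult development time versus temperature, reconstructed
# from the abstract's fitted relation and its linear approximation.
a1, a2, a3 = 17_840.0, 7.439, 2.128

def dev_time(T):
    """Development time (days) at temperature T (deg C)."""
    return a1 / (T + a2)**a3

def dev_time_linear(T):
    """Linear approximation D1(T) = 105.2 - 6.578*T."""
    return 105.2 - 6.578 * T

for T in (8.0, 8.5, 10.0):
    print(f"T = {T:4.1f} C: D = {dev_time(T):5.1f} d, "
          f"linear approx = {dev_time_linear(T):5.1f} d")
```

Note that the reciprocals 1/D over 8-10°C land in the 0.018-0.024 per day range quoted for the growth rates, a useful consistency check on the reconstruction.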

  9. A heteroscedastic generalized linear model with a non-normal speed factor for responses and response times.

    PubMed

    Molenaar, Dylan; Bolsinova, Maria

    2017-05-01

    In generalized linear modelling of responses and response times, the observed response time variables are commonly transformed to make their distribution approximately normal. A normal distribution for the transformed response times is desirable as it justifies the linearity and homoscedasticity assumptions in the underlying linear model. Past research has, however, shown that the transformed response times are not always normal. Models have been developed to accommodate this violation. In the present study, we propose a modelling approach for responses and response times to test and model non-normality in the transformed response times. Most importantly, we distinguish between non-normality due to heteroscedastic residual variances, and non-normality due to a skewed speed factor. In a simulation study, we establish parameter recovery and the power to separate both effects. In addition, we apply the model to a real data set. © 2017 The Authors. British Journal of Mathematical and Statistical Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.

  10. Estimation of stochastic volatility by using Ornstein-Uhlenbeck type models

    NASA Astrophysics Data System (ADS)

    Mariani, Maria C.; Bhuiyan, Md Al Masum; Tweneboah, Osei K.

    2018-02-01

    In this study, we develop a technique for estimating the stochastic volatility (SV) of a financial time series by using Ornstein-Uhlenbeck type models. Using the daily closing prices from developed and emergent stock markets, we conclude that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via Maximum Likelihood Estimation. Furthermore, our estimation algorithm is feasible with large data sets and has good convergence properties.
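    The estimation idea can be sketched by simulating an Ornstein-Uhlenbeck process with its exact discretization (which makes the sampled path an AR(1)) and recovering the parameters by maximum likelihood. Parameter values are illustrative:

```python
import numpy as np

# Simulate an Ornstein-Uhlenbeck process
#   dX = kappa*(mu - X) dt + sigma dW
# with its exact discretization, then recover (kappa, mu, sigma) by
# maximum likelihood: under the exact scheme X is an AR(1), so the MLE
# reduces to regression-type estimates. Parameters are illustrative.
rng = np.random.default_rng(8)
kappa, mu, sigma, dt, n = 2.0, 0.1, 0.3, 1/252, 20000

a = np.exp(-kappa * dt)                         # AR(1) coefficient
sd = sigma * np.sqrt((1 - a**2) / (2 * kappa))  # exact innovation s.d.
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = mu + a * (x[t-1] - mu) + sd * rng.normal()

# MLE of the AR(1) form x[t] = c + a*x[t-1] + eps, mapped back to OU
X, y = x[:-1], x[1:]
a_hat = np.cov(X, y)[0, 1] / np.var(X)
c_hat = y.mean() - a_hat * X.mean()
kappa_hat = -np.log(a_hat) / dt
mu_hat = c_hat / (1 - a_hat)
resid = y - c_hat - a_hat * X
sigma_hat = resid.std() * np.sqrt(2*kappa_hat / (1 - a_hat**2))
print(f"kappa = {kappa_hat:.2f}, mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```

In an SV application, x would be a latent log-volatility series rather than an observed price, so the regression step sits inside a filtering or likelihood-evaluation loop, but the OU-to-AR(1) mapping is the same.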

  11. A Response-Time Approach to Comparing Generalized Rational and Take-the-Best Models of Decision Making

    ERIC Educational Resources Information Center

    Bergert, F. Bryan; Nosofsky, Robert M.

    2007-01-01

    The authors develop and test generalized versions of take-the-best (TTB) and rational (RAT) models of multiattribute paired-comparison inference. The generalized models make allowances for subjective attribute weighting, probabilistic orders of attribute inspection, and noisy decision making. A key new test involves a response-time (RT)…

  12. A risk-model for hospital mortality among patients with severe sepsis or septic shock based on German national administrative claims data

    PubMed Central

    Fleischmann-Struzek, Carolin; Rüddel, Hendrik; Reinhart, Konrad; Thomas-Rüddel, Daniel O.

    2018-01-01

    Background: Sepsis is a major cause of preventable deaths in hospitals. Feasible and valid methods for comparing quality of sepsis care between hospitals are needed. The aim of this study was to develop a risk-adjustment model suitable for comparing sepsis-related mortality between German hospitals. Methods: We developed a risk-model using national German claims data. Since these data are available with a time-lag of 1.5 years only, the stability of the model across time was investigated. The model was derived from inpatient cases with severe sepsis or septic shock treated in 2013 using logistic regression with backward selection and generalized estimating equations to correct for clustering. It was validated among cases treated in 2015. Finally, the model development was repeated in 2015. To investigate secular changes, the risk-adjusted trajectory of mortality across the years 2010–2015 was analyzed. Results: The 2013 derivation sample consisted of 113,750 cases; the 2015 validation sample consisted of 134,851 cases. The model developed in 2013 showed good validity regarding discrimination (AUC = 0.74), calibration (observed mortality in 1st and 10th risk-decile: 11%-78%), and fit (R2 = 0.16). Validity remained stable when the model was applied to 2015 (AUC = 0.74, 1st and 10th risk-decile: 10%-77%, R2 = 0.17). There was no indication of overfitting of the model. The final model developed in year 2015 contained 40 risk-factors. Between 2010 and 2015 hospital mortality in sepsis decreased from 48% to 42%. Adjusted for risk-factors, the trajectory of decrease was still significant. Conclusions: The risk-model shows good predictive validity and stability across time. The model is suitable to be used as an external algorithm for comparing risk-adjusted sepsis mortality among German hospitals or regions based on administrative claims data, but secular changes need to be taken into account when interpreting risk-adjusted mortality. PMID:29558486

  14. Structure and application of an interface program between a geographic-information system and a ground-water flow model

    USGS Publications Warehouse

    Van Metre, P.C.

    1990-01-01

    A computer-program interface between a geographic-information system and a groundwater flow model links two unrelated software systems for use in developing the flow models. The interface program allows the modeler to compile and manage geographic components of a groundwater model within the geographic information system. A significant savings of time and effort is realized in developing, calibrating, and displaying the groundwater flow model. Four major guidelines were followed in developing the interface program: (1) no changes to the groundwater flow model code were to be made; (2) a data structure was to be designed within the geographic information system that follows the same basic data structure as the groundwater flow model; (3) the interface program was to be flexible enough to support all basic data options available within the model; and (4) the interface program was to be as efficient as possible in terms of computer time used and online-storage space needed. Because some programs in the interface are written in control-program language, the interface will run only on a computer with the PRIMOS operating system. (USGS)

  15. Status on the Development of a Modeling and Simulation Framework for the Economic Assessment of Nuclear Hybrid Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epiney, Aaron Simon; Chen, Jun; Rabiti, Cristian

Continued effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year (FY) 2016. The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status of their progress. Several tasks have been accomplished. First, a synthetic time history generator has been developed in RAVEN, consisting of a Fourier series and an autoregressive moving average (ARMA) model. The former is used to capture the seasonal trend in historical data, while the latter characterizes the autocorrelation in the residual time series (i.e., measurements with the seasonal trend subtracted). As a demonstration, both synthetic wind speed and grid demand are generated, showing statistics that match the underlying database. In order to build a design and operations optimizer in RAVEN, a new type of sampler has been developed with a highly object-oriented design; in particular, the simultaneous perturbation stochastic approximation algorithm is implemented. The optimizer is capable of driving the model to optimize a scalar objective function without constraints in the input space; constraint handling is a work in progress and will be implemented to improve the optimization capability. Furthermore, a simplified cash flow model of the performance of an NHES in the electricity market has been developed in Python and used as an external model in RAVEN to confirm expectations about the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces. Finally, an example calculation is performed that shows the integration and proper data passing in RAVEN among the synthetic time history generator, the cash flow model, and the optimizer. It has been shown that the Python models developed externally to RAVEN are able to communicate with RAVEN and with each other through the newly developed RAVEN capability called “EnsembleModel”.
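
    The Fourier-plus-ARMA idea described above can be sketched in a few lines. The following is a minimal illustration (not the RAVEN implementation), assuming a single seasonal harmonic and an AR(1) residual model; all function and parameter names are hypothetical:

```python
import numpy as np

def fit_fourier_trend(y, period):
    """Least-squares fit of a mean plus one Fourier harmonic of the given period."""
    t = np.arange(len(y))
    X = np.column_stack([np.ones(len(y)),
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef, coef

def synthesize(y, period, n, seed=0):
    """Generate a synthetic series: Fourier seasonal trend + AR(1) residual noise."""
    trend, coef = fit_fourier_trend(y, period)
    resid = np.asarray(y, dtype=float) - trend
    # AR(1) coefficient estimated from the lag-1 autocorrelation of the residuals
    phi = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])
    sigma = float(np.std(resid)) * np.sqrt(max(1.0 - phi**2, 1e-12))
    rng = np.random.default_rng(seed)
    t = np.arange(n)
    X = np.column_stack([np.ones(n),
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    noise = np.zeros(n)
    for k in range(1, n):
        noise[k] = phi * noise[k - 1] + rng.normal(0.0, sigma)
    return X @ coef + noise
```

    The synthetic series reproduces the seasonal mean and the residual autocorrelation of the historical data without replaying the data itself, which is the property the report relies on for stochastic economic analysis.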

  16. Time Series Remote Sensing in Monitoring the Spatio-Temporal Dynamics of Plant Invasions: A Study of Invasive Saltcedar (Tamarix Spp.)

    NASA Astrophysics Data System (ADS)

    Diao, Chunyuan

In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand complex and dynamic systems (e.g., plant invasion). Time series remote sensing is becoming increasingly important for monitoring earth system dynamics and interactions. To date, most time series remote sensing studies have been conducted with images acquired at coarse spatial scale, owing to their relatively high temporal resolution. The construction of time series at fine spatial scale, however, has been limited to a few discrete images acquired within or across years. The objective of this research is to advance time series remote sensing at fine spatial scale, particularly to shift from discrete to continuous time series remote sensing. The objective will be achieved through the following aims: 1) Advance intra-annual time series remote sensing under the pure-pixel assumption; 2) Advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) Advance inter-annual time series remote sensing in monitoring land surface dynamics; and 4) Advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., a phenological time series remote sensing model, a temporal partial unmixing method, a multiyear spectral angle clustering model, and a time series remote sensing-based spatially explicit species distribution model) were developed to achieve the objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions by characterizing the seasonal phenological dynamics of plant species throughout the year.
The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the adequate temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repetitive saltcedar mapping over space and time. Through incorporating spatial autocorrelation, the species distribution model developed in the study could identify the suitable habitats of saltcedar at a fine spatial scale and locate appropriate areas at high risk of saltcedar infestation. Among 10 environmental variables, the distance to the river and the phenological attributes summarized by the time series remote sensing were regarded as the most important. These methods developed in the study provide new perspectives on how the continuous time series can be leveraged under various conditions to investigate the plant invasion dynamics.
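
    The spectral angle used by the multiyear clustering step is a standard similarity measure between two spectra. A minimal sketch (a generic spectral angle computation, not the authors' clustering implementation):

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two reflectance spectra.
    Smaller angles indicate more similar spectral shapes, independent of
    overall brightness, since the measure depends only on vector direction."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```

    Because the angle is insensitive to a uniform scaling of the spectrum, it is robust to illumination differences between acquisition dates, which is why it suits multiyear image comparison.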

  17. Sharpening a District's Leadership Model

    ERIC Educational Resources Information Center

    Namit, Chuck

    2008-01-01

    To create an integrated board self-assessment and superintendent evaluation process, district leaders must develop a school leadership model by adopting a coherent governance model. At the same time, they must also develop goals at the appropriate level that ensure quality governance of a school system. In the second of a two-part series, the…

  18. A Multimodel Global Drought Information System (GDIS) for Near Real-Time Monitoring of Surface Water Conditions (Invited)

    NASA Astrophysics Data System (ADS)

    Nijssen, B.

    2013-12-01

While the absolute magnitude of economic losses associated with weather and climate disasters such as droughts is greatest in the developed world, the relative impact is much larger in the developing world, where agriculture typically constitutes a much larger percentage of the labor force and food insecurity is a major concern. Nonetheless, our ability to monitor and predict the development and occurrence of droughts at a global scale in near real-time is limited, and long-term records of soil moisture are essentially non-existent globally. The problem is particularly critical given that many of the most damaging droughts occur in parts of the world that are most deficient in terms of in situ precipitation observations. In recent years, a number of near real-time drought monitoring systems have been developed with regional or global extent. While direct observations of key variables such as moisture storage are missing, the evolution of land surface models that are globally applicable provides a means of reconstructing them. The implementation of a multi-model drought monitoring system is described, which provides near real-time estimates of surface moisture storage for the global land areas between 50S and 50N with a time lag of about one day. Near real-time forcings are derived from satellite-based precipitation estimates and modeled air temperatures. The system is distinguished from other operational systems in that it uses multiple land surface models to simulate surface moisture storage, which are then combined to derive a multi-model estimate of drought. Previous work has shown that while land surface models agree in broad context, particularly in terms of soil moisture percentiles, important differences remain, which motivates a multi-model ensemble approach.
The system is an extension of similar systems developed at the University of Washington for the Pacific Northwest and for the United States, but global application of the protocols used in the U.S. systems poses new challenges, particularly with respect to the generation of the meteorological forcings that drive the land surface models. Agricultural and hydrological droughts are inherently defined in the context of a long-term climatology. Changes in observing platforms can be misinterpreted as droughts (or as excessively wet periods). This problem cannot simply be addressed through the addition of more observations or through the development of new observing platforms. Instead, it will require careful (re)construction of long-term records that are updated in near real-time in a consistent manner, so that changes in surface meteorological forcings reflect actual conditions rather than changes in methods or sources.
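
    The percentile-based multi-model combination described above can be sketched as follows. This is an illustrative reduction (hypothetical function names; the operational system works on gridded fields, not scalars): each model's current soil moisture is converted to a percentile against that model's own climatology, and the percentiles are then averaged.

```python
import numpy as np

def drought_percentile(current, climatology):
    """Empirical percentile of the current soil moisture value within a
    long-term climatology (fraction of historical values at or below it)."""
    clim = np.sort(np.asarray(climatology, dtype=float))
    return float(np.searchsorted(clim, current, side='right')) / clim.size

def multimodel_drought_index(currents, climatologies):
    """Average the per-model percentiles into a multi-model drought estimate.
    Ranking each model against its own climatology removes model-specific
    biases before the ensemble is combined."""
    return float(np.mean([drought_percentile(c, h)
                          for c, h in zip(currents, climatologies)]))
```

    Low index values indicate drought; because each model is ranked against itself, systematic offsets between land surface models do not contaminate the combined estimate.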

  19. Pharmacokinetic Modeling and Limited Sampling Strategies Based on Healthy Volunteers for Monitoring of Ertapenem in Patients with Multidrug-Resistant Tuberculosis.

    PubMed

    van Rijn, S P; Zuur, M A; van Altena, R; Akkerman, O W; Proost, J H; de Lange, W C M; Kerstjens, H A M; Touw, D J; van der Werf, T S; Kosterink, J G W; Alffenaar, J W C

    2017-04-01

Ertapenem is a broad-spectrum carbapenem antibiotic whose activity against Mycobacterium tuberculosis is being explored. Carbapenems have antibacterial activity when the plasma concentration exceeds the MIC at least 40% of the time (40% TMIC). To assess the 40% TMIC in multidrug-resistant tuberculosis (MDR-TB) patients, a limited sampling strategy was developed using a population pharmacokinetic model based on data for healthy volunteers. A two-compartment population pharmacokinetic model was developed with data for 42 healthy volunteers using an iterative two-stage Bayesian method. External validation was performed by Bayesian fitting of the model developed with data for volunteers to the data for individual MDR-TB patients (in which the fitted values of the area under the concentration-time curve from 0 to 24 h [AUC0-24, fit values] were used) using the population model developed for volunteers as a prior. A Monte Carlo simulation (n = 1,000) was used to evaluate limited sampling strategies. Additionally, the 40% TMIC with the free fraction (f40% TMIC) of ertapenem in MDR-TB patients was estimated with the population pharmacokinetic model. The population pharmacokinetic model that was developed was shown to overestimate the area under the concentration-time curve from 0 to 24 h (AUC0-24) in MDR-TB patients by 6.8% (range, -17.2 to 30.7%). The best-performing limited sampling strategy, which had a time restriction of 0 to 6 h, was found to be sampling at 1 and 5 h (r2 = 0.78, mean prediction error = -0.33%, root mean square error = 5.5%). Drug exposure was overestimated by a mean percentage of 4.2% (range, -15.2 to 23.6%). When a free fraction of 5% was considered and the MIC was set at 0.5 mg/liter, the minimum f40% TMIC would have been exceeded in 9 out of 12 patients. 
A population pharmacokinetic model and limited sampling strategy, developed using data from healthy volunteers, were shown to be adequate to predict ertapenem exposure in MDR-TB patients. Copyright © 2017 American Society for Microbiology.
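
    The T>MIC pharmacodynamic target can be illustrated with a simple mono-exponential elimination sketch. This is a hypothetical one-compartment simplification with made-up parameters, not the two-compartment population model used in the study:

```python
import numpy as np

def fraction_time_above_mic(c0, half_life_h, mic, interval_h=24.0, n=10000):
    """Fraction of the dosing interval during which concentration exceeds the MIC,
    assuming first-order (mono-exponential) decline from an initial peak c0."""
    k = np.log(2) / half_life_h            # first-order elimination rate constant
    t = np.linspace(0.0, interval_h, n)
    conc = c0 * np.exp(-k * t)             # concentration-time profile
    return float(np.mean(conc > mic))      # grid approximation of time fraction
```

    For example, with a hypothetical peak of 10 mg/liter, a 4 h half-life, and an MIC of 0.5 mg/liter, the analytic fraction is ln(c0/MIC)/k divided by the 24 h interval, roughly 0.72, so the 40% TMIC target would be met in this toy case.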

  1. Time-series-based hybrid mathematical modelling method adapted to forecast automotive and medical waste generation: Case study of Lithuania.

    PubMed

    Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras

    2018-05-01

The aim of the study was to create a hybrid forecasting method that could produce higher accuracy forecasts than previously used 'pure' time series methods. These methods had already been tested on total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it further. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time series methods, which helped increase forecast accuracy by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' abilities to produce short- and mid-term forecasts were tested over different prediction horizons.

  2. The inclusion of capillary distribution in the adiabatic tissue homogeneity model of blood flow

    NASA Astrophysics Data System (ADS)

    Koh, T. S.; Zeman, V.; Darko, J.; Lee, T.-Y.; Milosevic, M. F.; Haider, M.; Warde, P.; Yeung, I. W. T.

    2001-05-01

We have developed a non-invasive imaging tracer kinetic model for blood flow which takes into account the distribution of capillaries in tissue. Each individual capillary is assumed to follow the adiabatic tissue homogeneity model. The main strength of our new model is its ability to quantify the functional distribution of capillaries by the standard deviation in the time taken by blood to pass through the tissue. We have applied our model to the human prostate and have tested two different types of distribution functions. Both distribution functions yielded very similar predictions for the various model parameters, and in particular for the standard deviation in transit time. Our motivation for developing this model is the fact that the capillary distribution in cancerous tissue is drastically different from that in normal tissue. We believe that there is great potential for our model to be used as a prognostic tool in cancer treatment. For example, an accurate knowledge of the distribution in transit times might yield an accurate estimate of the degree of tumour hypoxia, which is crucial to the success of radiation therapy.

  3. Atmospheric density models

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An atmospheric model developed by Jacchia, quite accurate but requiring a large amount of computer storage and execution time, was found to be ill-suited for the space shuttle onboard program. The development of a simple atmospheric density model to simulate the Jacchia model was studied. Required characteristics including variation with solar activity, diurnal variation, variation with geomagnetic activity, semiannual variation, and variation with height were met by the new atmospheric density model.

  4. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    PubMed

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
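
    The first-order Gauss-Markov process underlying such AR-based GM stochastic models can be sketched as follows. The parameter values here are illustrative only; in the paper, the correlation time tau is the quantity found to be temperature dependent:

```python
import numpy as np

def simulate_gauss_markov(tau, sigma, dt, n, seed=0):
    """Simulate a first-order Gauss-Markov process
        x[k+1] = exp(-dt/tau) * x[k] + w[k],
    with the driving white noise w scaled so the steady-state standard
    deviation of the process is sigma."""
    beta = np.exp(-dt / tau)               # per-step correlation coefficient
    q = sigma * np.sqrt(1.0 - beta**2)     # driving-noise standard deviation
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = beta * x[k - 1] + q * rng.normal()
    return x
```

    In an INS error-state filter, tau and sigma would be set per temperature point from the identified AR-based GM parameters; a mismatched tau degrades the Kalman filter's process-noise model, which is the effect the paper quantifies.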

  5. A practical model of thin disk regenerative amplifier based on analytical expression of ASE lifetime

    NASA Astrophysics Data System (ADS)

    Zhou, Huang; Chyla, Michal; Nagisetty, Siva Sankar; Chen, Liyuan; Endo, Akira; Smrz, Martin; Mocek, Tomas

    2017-12-01

In this paper, a practical model of a thin disk regenerative amplifier has been developed based on an analytical approach in which Drew A. Copeland [1] evaluated the loss rate of the upper laser level due to ASE and derived an analytical expression for the effective lifetime of the upper laser level, taking the Lorentzian stimulated emission line shape and total internal reflection into account. By adopting this analytical expression for the effective lifetime in the rate equations, we have developed a less numerically intensive model for predicting and analyzing the performance of a thin disk regenerative amplifier. Thanks to the model, optimized combinations of various parameters can be obtained prior to experiments to avoid saturation, period-doubling bifurcation, or first-pulse suppression. The effective lifetime due to ASE is also analyzed against various parameters. The simulated results fit the experimental data well. By fitting more experimental results with the numerical model, we can improve the model parameters, such as the reflective factor, which determines the weight of boundary reflection within the influence of ASE. This practical model will be used to explore the scaling limits imposed by ASE on the thin disk regenerative amplifier being developed at the HiLASE Centre.

  6. Discrete time Markov chains (DTMC) susceptible infected susceptible (SIS) epidemic model with two pathogens in two patches

    NASA Astrophysics Data System (ADS)

    Lismawati, Eka; Respatiwulan; Widyaningsih, Purnami

    2017-06-01

The SIS epidemic model describes the pattern of disease spread with the characteristic that recovered individuals can be infected more than once. The numbers of susceptible and infected individuals at each time step follow a discrete time Markov process, which can be represented by a discrete time Markov chain (DTMC) SIS model. The DTMC SIS epidemic model can be developed for two pathogens in two patches. The aims of this paper are to reconstruct and apply the DTMC SIS epidemic model with two pathogens in two patches. The model is presented in terms of transition probabilities. Application of the model shows that the number of susceptible individuals decreases while the number of infected individuals increases for each pathogen in each patch.
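
    A single-patch, single-pathogen version of such a DTMC SIS chain can be sketched as follows (hypothetical infection and recovery probabilities chosen so the per-step probabilities sum to less than one; the two-pathogen, two-patch model of the paper adds coupled transition probabilities of the same birth-death form):

```python
import numpy as np

def dtmc_sis_step(i, n, beta, gamma, rng):
    """One step of a DTMC SIS chain: the infected count I can increase by 1
    (new infection), decrease by 1 (recovery), or stay put, with the standard
    birth-death transition probabilities."""
    p_inf = beta * i * (n - i) / n   # probability of one new infection
    p_rec = gamma * i                # probability of one recovery
    u = rng.random()
    if u < p_inf:
        return min(i + 1, n)
    if u < p_inf + p_rec:
        return max(i - 1, 0)
    return i

def simulate(i0, n, beta, gamma, steps, seed=0):
    """Simulate one sample path of the infected count."""
    rng = np.random.default_rng(seed)
    path = [i0]
    for _ in range(steps):
        path.append(dtmc_sis_step(path[-1], n, beta, gamma, rng))
    return path
```

    Since recovered individuals return to the susceptible pool (S = n - I), repeated infection is built into the state update, which is the defining SIS feature noted above.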

  7. Reverse time migration by Krylov subspace reduced order modeling

    NASA Astrophysics Data System (ADS)

    Basir, Hadi Mahdavi; Javaherian, Abdolrahim; Shomali, Zaher Hossein; Firouz-Abadi, Roohollah Dehghani; Gholamy, Shaban Ali

    2018-04-01

Imaging is a key step in seismic data processing. To date, a myriad of advanced pre-stack depth migration approaches have been developed; however, reverse time migration (RTM) is still considered the high-end imaging algorithm. The main limitations on the performance of reverse time migration are the intensive computation of the forward and backward simulations, time consumption, and the memory allocation required by the imaging condition. Based on reduced order modeling, we propose an algorithm that addresses all of the aforementioned factors. Our proposed method uses the Krylov subspace method to compute certain mode shapes of the velocity model, which serve as an orthogonal basis for the reduced order model. Reverse time migration by reduced order modeling is well suited to highly parallel computation and strongly reduces the memory requirement of reverse time migration. The synthetic model results showed that the suggested method can decrease the computational costs of reverse time migration by several orders of magnitude compared with reverse time migration by the finite element method.
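
    The orthogonal-basis construction at the heart of such Krylov subspace reduced order modeling can be sketched with a generic Arnoldi iteration (a textbook sketch of the basis-building step, not the authors' mode-shape computation):

```python
import numpy as np

def arnoldi(A, b, m):
    """Build an orthonormal basis Q of the Krylov subspace
    span{b, Ab, ..., A^(m-1) b} via the Arnoldi iteration.
    Returns Q (n x m) and the projected matrix H = Q^T A Q (m x m,
    upper Hessenberg), which plays the role of the reduced-order operator."""
    n = b.size
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):            # modified Gram-Schmidt orthogonalization
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:           # invariant subspace reached: stop early
            return Q[:, :j + 1], H[:j + 1, :j]
        Q[:, j + 1] = v / H[j + 1, j]
    return Q[:, :m], H[:m, :m]
```

    Projecting the wave operator onto such a small basis is what lets a reduced-order RTM propagate wavefields in m dimensions instead of the full grid size, which is the source of the memory and cost savings claimed above.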

  8. Feasibility of dynamic risk prediction for hepatocellular carcinoma development in patients with chronic hepatitis B.

    PubMed

    Jeon, Mi Young; Lee, Hye Won; Kim, Seung Up; Kim, Beom Kyung; Park, Jun Yong; Kim, Do Young; Han, Kwang-Hyub; Ahn, Sang Hoon

    2018-04-01

    Several risk prediction models for hepatocellular carcinoma (HCC) development are available. We explored whether the use of risk prediction models can dynamically predict HCC development at different time points in chronic hepatitis B (CHB) patients. Between 2006 and 2014, 1397 CHB patients were recruited. All patients underwent serial transient elastography at intervals of >6 months. The median age of this study population (931 males and 466 females) was 49.0 years. The median CU-HCC, REACH-B, LSM-HCC and mREACH-B score at enrolment were 4.0, 9.0, 10.0 and 8.0 respectively. During the follow-up period (median, 68.0 months), 87 (6.2%) patients developed HCC. All risk prediction models were successful in predicting HCC development at both the first liver stiffness (LS) measurement (hazard ratio [HR] = 1.067-1.467 in the subgroup without antiviral therapy [AVT] and 1.096-1.458 in the subgroup with AVT) and second LS measurement (HR = 1.125-1.448 in the subgroup without AVT and 1.087-1.249 in the subgroup with AVT). In contrast, neither the absolute nor percentage change in the scores from the risk prediction models predicted HCC development (all P > .05). The mREACH-B score performed similarly or significantly better than did the other scores (AUROCs at 5 years, 0.694-0.862 vs 0.537-0.875). Dynamic prediction of HCC development at different time points was achieved using four risk prediction models, but not using the changes in the absolute and percentage values between two time points. The mREACH-B score was the most appropriate prediction model of HCC development among four prediction models. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Transient modeling in simulation of hospital operations for emergency response.

    PubMed

    Paul, Jomon Aliyas; George, Santhosh K; Yi, Pengfei; Lin, Li

    2006-01-01

Rapid estimates of hospital capacity after an event that may cause a disaster can assist disaster-relief efforts. Because of the dynamics of hospitals following such an event, it is necessary to accurately model the behavior of the system. A transient modeling approach using simulation and exponential functions is presented, along with its application to an earthquake scenario. The parameters of the exponential model are regressed using outputs from designed simulation experiments. The developed model is capable of representing transient patient waiting times during a disaster. Most importantly, the modeling approach allows real-time capacity estimation for hospitals of various sizes and capabilities. Further, this research analyzes the effects of priority-based routing of patients within the hospital on patient waiting times, determined using various patient mixes. The model guides patients based on the severity of their injuries and queues patients requiring critical care according to their remaining survivability time. The model also accounts for the impact of prehospital transport time on patient waiting time.
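
    The "exponential model regressed from simulation outputs" step can be sketched as follows. The saturating form W(t) = W_inf*(1 - exp(-t/tau)) is a plausible stand-in assumed here for illustration, not necessarily the exact functional form used in the paper:

```python
import numpy as np

def fit_transient_wait(times, waits):
    """Fit W(t) = w_inf * (1 - exp(-t / tau)) to simulated waiting times by a
    coarse grid search over tau; for each candidate tau, w_inf has a
    closed-form least-squares solution."""
    times = np.asarray(times, dtype=float)
    waits = np.asarray(waits, dtype=float)
    best = (0.0, 0.0, np.inf)
    for tau in np.linspace(0.1, 50.0, 500):
        basis = 1.0 - np.exp(-times / tau)
        w_inf = float(basis @ waits / (basis @ basis))
        err = float(np.sum((w_inf * basis - waits) ** 2))
        if err < best[2]:
            best = (w_inf, tau, err)
    return best[0], best[1]
```

    Once fitted offline from designed simulation experiments, such a two-parameter curve can be evaluated instantly, which is what makes real-time capacity estimation feasible during an actual event.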

  10. Assessing the Financial Benefits of Faster Development Times: The Case of Single-source Versus Multi-vendor Outsourced Biopharmaceutical Manufacturing.

    PubMed

    DiMasi, Joseph A; Smith, Zachary; Getz, Kenneth A

    2018-05-10

    The extent to which new drug developers can benefit financially from shorter development times has implications for development efficiency and innovation incentives. We provided a real-world example of such gains by using recent estimates of drug development costs and returns. Time and fee data were obtained on 5 single-source manufacturing projects. Time and fees were modeled for these projects as if the drug substance and drug product processes had been contracted separately from 2 vendors. The multi-vendor model was taken as the base case, and financial impacts from single-source contracting were determined relative to the base case. The mean and median after-tax financial benefits of shorter development times from single-source contracting were $44.7 million and $34.9 million, respectively (2016 dollars). The after-tax increases in sponsor fees from single-source contracting were small in comparison (mean and median of $0.65 million and $0.25 million). For the data we examined, single-source contracting yielded substantial financial benefits over multi-source contracting, even after accounting for somewhat higher sponsor fees. Copyright © 2018 Elsevier HS Journals, Inc. All rights reserved.

  11. Turnaround Time Modeling for Conceptual Rocket Engines

    NASA Technical Reports Server (NTRS)

    Nix, Michael; Staton, Eric J.

    2004-01-01

Recent years have brought about a paradigm shift within NASA and the space launch community regarding the performance of conceptual design. Reliability, maintainability, supportability, and operability are no longer effects of design; they have moved to the forefront and are affecting design. A primary focus of this shift has been a planned decrease in vehicle turnaround time. Potential means of achieving this decrease include attacking the issues of removing, refurbishing, and replacing the engines after each flight. Regardless, it is important to understand the operational effects of an engine on turnaround time, ground support personnel, and equipment. One tool for visualizing this relationship is a discrete event simulation (DES). A DES model can be used to run a series of trade studies to determine whether the engine is meeting its requirements and, if not, what can be altered to bring it into compliance. Using DES, it is possible to look at the ways in which labor requirements, parallel versus serial maintenance, and maintenance scheduling affect the overall turnaround time. A detailed DES model of the Space Shuttle Main Engines (SSME) has been developed. Trades may be performed using the SSME Processing Model to see where maintenance bottlenecks occur and what the benefits (if any) are of increasing the number of personnel or the number and location of facilities, in addition to the trades previously mentioned, all with the goal of optimizing the operational turnaround time and minimizing operational cost. The SSME Processing Model was developed in such a way that it can easily be used as a foundation for developing DES models of other operational or developmental reusable engines. Performing a DES on a developmental engine during the conceptual phase makes it easier to affect the design and make changes that decrease turnaround time and costs.
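
    A toy version of the serial-versus-parallel maintenance trade can be written in a few lines. The task durations below are hypothetical, not SSME data; the sketch only illustrates the kind of question a DES trade study answers:

```python
import heapq

def turnaround(tasks, crews):
    """Minimal discrete-event sketch: process maintenance tasks (durations in
    hours) with a limited number of crews; returns total turnaround time.
    crews=1 corresponds to fully serial maintenance; crews >= len(tasks)
    corresponds to fully parallel maintenance."""
    free_at = [0.0] * crews              # next time each crew becomes available
    heapq.heapify(free_at)
    finish = 0.0
    for d in tasks:
        start = heapq.heappop(free_at)   # earliest-available crew takes the task
        end = start + d
        finish = max(finish, end)
        heapq.heappush(free_at, end)
    return finish

tasks = [8.0, 6.0, 4.0, 4.0]  # hypothetical remove/refurbish/replace durations
serial = turnaround(tasks, crews=1)      # 22.0 h with one crew
parallel = turnaround(tasks, crews=2)    # 12.0 h with two crews
```

    Sweeping the crew count (or the task list) and comparing the resulting turnaround times is exactly the shape of the labor and parallel-versus-serial trades described above, here stripped of the stochastic task durations a real DES would use.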

  12. Resource utilization model for the algorithm to architecture mapping model

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Patel, Rakesh R.

    1993-01-01

    The analytical model for resource utilization and the variable node time and conditional node model for the enhanced ATAMM model for a real-time data flow architecture are presented in this research. The Algorithm To Architecture Mapping Model, ATAMM, is a Petri net based graph theoretic model developed at Old Dominion University, and is capable of modeling the execution of large-grained algorithms on a real-time data flow architecture. Using the resource utilization model, the resource envelope may be obtained directly from a given graph and, consequently, the maximum number of required resources may be evaluated. The node timing diagram for one iteration period may be obtained using the analytical resource envelope. The variable node time model, which describes the change in resource requirement for the execution of an algorithm under node time variation, is useful to expand the applicability of the ATAMM model to heterogeneous architectures. The model also describes a method of detecting the presence of resource limited mode and its subsequent prevention. Graphs with conditional nodes are shown to be reduced to equivalent graphs with time varying nodes and, subsequently, may be analyzed using the variable node time model to determine resource requirements. Case studies are performed on three graphs for the illustration of applicability of the analytical theories.

  13. Time limited field of regard search

    NASA Astrophysics Data System (ADS)

    Flug, Eric; Maurer, Tana; Nguyen, Oanh-Tho

    2005-05-01

    Recent work by the US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) has led to the Time-Limited Search (TLS) model, which has given new formulations for the field of view (FOV) search times. The next step in the evaluation of the overall search model (ACQUIRE) is to apply these parameters to the field of regard (FOR) model. Human perception experiments were conducted using synthetic imagery developed at NVESD. The experiments were competitive player-on-player search tests with the intention of imposing realistic time constraints on the observers. FOR detection probabilities, search times, and false alarm data are analyzed and compared to predictions using both the TLS model and ACQUIRE.

  14. Path integral for equities: Dynamic correlation and empirical analysis

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Cao, Yang; Lau, Ada; Tang, Pan

    2012-02-01

    This paper develops a model to describe the unequal time correlation between rate of returns of different stocks. A non-trivial fourth order derivative Lagrangian is defined to provide an unequal time propagator, which can be fitted to the market data. A calibration algorithm is designed to find the empirical parameters for this model and different de-noising methods are used to capture the signals concealed in the rate of return. The detailed results of this Gaussian model show that the different stocks can have strong correlation and the empirical unequal time correlator can be described by the model's propagator. This preliminary study provides a novel model for the correlator of different instruments at different times.

  15. Personalized prediction of chronic wound healing: an exponential mixed effects model using stereophotogrammetric measurement.

    PubMed

    Xu, Yifan; Sun, Jiayang; Carter, Rebecca R; Bogie, Kath M

    2014-05-01

    Stereophotogrammetric digital imaging enables rapid and accurate detailed 3D wound monitoring. This rich data source was used to develop a statistically validated model to provide personalized predictive healing information for chronic wounds. 147 valid wound images were obtained from a sample of 13 category III/IV pressure ulcers from 10 individuals with spinal cord injury. Statistical comparison of several models indicated the best fit for the clinical data was a personalized mixed-effects exponential model (pMEE), with initial wound size and time as predictors and observed wound size as the response variable. Random effects capture personalized differences. Other models are only valid when wound size constantly decreases. This is often not achieved for clinical wounds. Our model accommodates this reality. Two criteria to determine effective healing time outcomes are proposed: r-fold wound size reduction time, t(r-fold), is defined as the time when wound size reduces to 1/r of initial size. t(δ) is defined as the time when the rate of the wound healing/size change reduces to a predetermined threshold δ < 0. Healing rate differs from patient to patient. Model development and validation indicates that accurate monitoring of wound geometry can adaptively predict healing progression and that larger wounds heal more rapidly. Accuracy of the prediction curve in the current model improves with each additional evaluation. Routine assessment of wounds using detailed stereophotogrammetric imaging can provide personalized predictions of wound healing time. Application of a valid model will help the clinical team to determine wound management care pathways. Published by Elsevier Ltd.
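
    The healing-time outcomes defined above follow directly from an exponential size curve. A minimal sketch, assuming the fixed-effects form w(t) = w0·exp(-k·t) (the published pMEE model adds per-patient random effects), with hypothetical parameter values:

    ```python
    import math

    def t_r_fold(k, r):
        """Time for wound size to shrink to 1/r of its initial size under
        w(t) = w0 * exp(-k * t): solve w0/r = w0*exp(-k*t)."""
        return math.log(r) / k

    def t_delta(k, w0, delta):
        """Time at which the healing rate dw/dt = -k*w0*exp(-k*t)
        slows to a predetermined threshold delta < 0."""
        return math.log(-k * w0 / delta) / k

    k = 0.05      # hypothetical healing-rate constant (1/day)
    w0 = 10.0     # hypothetical initial wound area (cm^2)
    half_time = t_r_fold(k, 2)   # time for the wound to halve in size
    ```

    With these illustrative values the 2-fold reduction time is ln(2)/k, about two weeks; a clinician-chosen threshold δ gives t(δ) analogously.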

  16. Real-time dynamic simulation of the Cassini spacecraft using DARTS. Part 2: Parallel/vectorized real-time implementation

    NASA Technical Reports Server (NTRS)

    Fijany, A.; Roberts, J. A.; Jain, A.; Man, G. K.

    1993-01-01

    Part 1 of this paper presented the requirements for the real-time simulation of the Cassini spacecraft along with some discussion of the DARTS algorithm. Here, in Part 2, we discuss the development and implementation of the parallel/vectorized DARTS algorithm and architecture for real-time simulation. Development of fast algorithms and architectures for real-time hardware-in-the-loop simulation of spacecraft dynamics is motivated by the fact that it represents a hard real-time problem, in the sense that the correctness of the simulation depends on both the numerical accuracy and the exact timing of the computation. For a given model fidelity, the computation must be completed within a predefined time period. Further reduction in computation time allows increasing the fidelity of the model (i.e., inclusion of more flexible modes) and of the integration routine.

  17. Time series evapotranspiration maps at a regional scale: A methodology, evaluation, and their use in water resources management

    NASA Astrophysics Data System (ADS)

    Gowda, P. H.

    2016-12-01

    Evapotranspiration (ET) is an important process in an ecosystem's water budget and is closely linked to its productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have great utility in studying the carbon-energy-water nexus and managing water resources. There are efforts to develop such datasets on regional to global scales, but they often face the spatial-temporal resolution tradeoffs inherent in satellite remote sensing technology. In this study, we developed frameworks for generating high and medium resolution daily ET maps from Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data, respectively. For developing high resolution (30-m) daily time series ET maps from Landsat TM data, the series version of the Two Source Energy Balance (TSEB) model was used to compute sensible and latent heat fluxes of soil and canopy separately. Landsat 5 (2000-2011) and Landsat 8 (2013-2014) imagery for rows 28/35 and 27/36 covering central Oklahoma was used. MODIS data (2001-2014) covering Oklahoma and the Texas Panhandle were used to develop medium resolution (250-m) daily time series ET maps with the SEBS (Surface Energy Balance System) model. An extensive network of weather stations managed by the Texas High Plains ET Network and the Oklahoma Mesonet was used to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. A linear interpolation sub-model was used to estimate the daily ET between image acquisition days. Accuracy assessment of the daily ET maps was done against eddy covariance data from two grassland sites at El Reno, OK. Statistical results indicated good performance by the modeling frameworks developed for deriving time series ET maps. Results indicated that the proposed ET mapping framework is suitable for deriving daily time series ET maps at regional scale with Landsat and MODIS data.
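
    The linear interpolation sub-model mentioned above can be sketched in a few lines of Python. The acquisition days and ET values below are hypothetical, standing in for retrievals on satellite overpass days (e.g., a 16-day Landsat revisit):

    ```python
    def interpolate_daily_et(obs_days, obs_et, day):
        """Linearly interpolate daily ET between satellite acquisition days.
        obs_days: sorted acquisition days-of-year; obs_et: ET on those days
        (mm/day); values outside the observed range are held constant."""
        if day <= obs_days[0]:
            return obs_et[0]
        if day >= obs_days[-1]:
            return obs_et[-1]
        for (d0, e0), (d1, e1) in zip(zip(obs_days, obs_et),
                                      zip(obs_days[1:], obs_et[1:])):
            if d0 <= day <= d1:
                frac = (day - d0) / (d1 - d0)
                return e0 + frac * (e1 - e0)

    # Hypothetical ET retrievals on three overpass days.
    days = [100, 116, 132]
    et = [4.0, 6.0, 5.0]
    mid = interpolate_daily_et(days, et, 108)  # halfway between days 100 and 116
    ```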

  18. Development of a model-based flood emergency management system in Yujiang River Basin, South China

    NASA Astrophysics Data System (ADS)

    Zeng, Yong; Cai, Yanpeng; Jia, Peng; Mao, Jiansu

    2014-06-01

    Flooding is the most frequent disaster in China. It affects people's lives and properties, causing considerable economic loss. Flood forecasting and the operation of reservoirs are important in flood emergency management. Although great progress has been achieved in flood forecasting and reservoir operation in China through the use of computer, network, and geographic information system technologies, the prediction accuracy of models is often not satisfactory due to the unavailability of real-time monitoring data. Also, real-time flood control scenario analysis is not effective in many regions and can seldom provide an online decision support function. In this research, a decision support system for real-time flood forecasting in the Yujiang River Basin, South China (DSS-YRB) is introduced. The system is based on hydrological and hydraulic mathematical models. The conceptual framework and detailed components of the proposed DSS-YRB are illustrated; the system employs real-time rainfall data conversion, model-driven hydrologic forecasting, model calibration, data assimilation methods, and reservoir operational scenario analysis. Its multi-tiered architecture offers great flexibility, portability, reusability, and reliability. The case study results show that the development and application of a decision support system for real-time flood forecasting and operation are beneficial for flood control.

  19. Minimizing patient waiting time in emergency department of public hospital using simulation optimization approach

    NASA Astrophysics Data System (ADS)

    Ibrahim, Ireen Munira; Liong, Choong-Yeun; Bakar, Sakhinah Abu; Ahmad, Norazura; Najmuddin, Ahmad Farid

    2017-04-01

    The emergency department (ED) is the main unit of a hospital that provides emergency treatment. Operating 24 hours a day with a limited number of resources adds to the already chaotic situation in some hospitals in Malaysia. Delays in receiving treatment that cause patients to wait for long periods are among the most frequent complaints against government hospitals. Therefore, ED management needs a model that can be used to examine and understand resource capacity and that can assist hospital managers in reducing patient waiting times. A simulation model was developed based on 24 hours of data collection. The model, developed using Arena simulation, replicates the actual ED operations of a public hospital in Selangor, Malaysia. The OptQuest optimization tool in Arena is used to find combinations of resource numbers that can minimize patient waiting times while increasing the number of patients served. The simulation model was then modified for improvement based on the results from OptQuest. The improved model significantly increases the ED's efficiency, with an average 32% reduction in patient waiting times and a 25% increase in the total number of patients served.
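
    As a crude analytic stand-in for the simulation-optimization search over staffing levels described above, the sketch below uses the M/M/c (Erlang C) mean-wait formula to find the smallest resource count keeping average waits under a target. The arrival and service rates are hypothetical; a real ED would need the full Arena/OptQuest treatment:

    ```python
    import math

    def erlang_c_wait(lam, mu, c):
        """Mean queue wait (hours) in an M/M/c system with arrival rate
        lam, per-server service rate mu, and c servers."""
        a = lam / mu                        # offered load (erlangs)
        if a >= c:
            return float('inf')             # unstable: demand exceeds capacity
        s = sum(a**k / math.factorial(k) for k in range(c))
        top = (a**c / math.factorial(c)) * (c / (c - a))
        p_wait = top / (s + top)            # probability an arrival must wait
        return p_wait / (c * mu - lam)

    lam, mu = 10.0, 3.0   # hypothetical: 10 arrivals/hr, 3 services/hr/server
    # Smallest number of servers keeping the mean wait under 10 minutes:
    best = next(c for c in range(1, 20) if erlang_c_wait(lam, mu, c) < 10 / 60)
    ```

    The same exhaustive scan over resource combinations is what OptQuest performs far more efficiently against the full stochastic model.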

  20. A STUDY ON TEMPORAL DISTRIBUTION OF FREIGHT TRANSPORTATION IN CONSIDERATION OF DAILY WORK-LIFE CYCLE

    NASA Astrophysics Data System (ADS)

    Kitaoka, Daiki; Hara, Hidetaka; Oeda, Yoshinao; Sumi, Tomonori

    As advanced freight services are demanded, the time-related requirements for freight transportation become more and more significant. This study, focusing on the temporal distribution of freight transportation in response to travel time, developed a shipment departure time decision model for each item, aiming to quantitatively grasp social requirements in the time domain. The model takes into account the daily work cycles of both shippers and carriers along with the travel time. The proposed model has a structure similar to those derived in previous studies that account for the daily living cycle of individuals. The model properly reproduced the temporal distribution of shipment departure times, which changes depending on the length of the lead time necessary for each item.

  1. Bio-physical modeling of time-resolved forward scattering by Listeria colonies

    NASA Astrophysics Data System (ADS)

    Bae, Euiwon; Banada, Padmapriya P.; Bhunia, Arun K.; Hirleman, E. Daniel

    2006-10-01

    We have developed a detection system and associated protocol based on optical forward scattering where the bacterial colonies of various species and strains growing on solid nutrient surfaces produced unique scatter signatures. The aim of the present investigation was to develop a bio-physical model for the relevant phenomena. In particular, we considered time-varying macroscopic morphological properties of the growing colonies and modeled the scattering using scalar diffraction theory. For the present work we performed detailed studies with three species of Listeria: L. innocua, L. monocytogenes, and L. ivanovii. The baseline experiments involved cultures grown on brain heart infusion (BHI) agar, and the scatter images were captured every six hours over an incubation period of 42 hours. The morphologies of the colonies were studied by phase contrast microscopy, including measurement of the diameter of the colony. Growth curves, represented by colony diameter as a function of time, were compared with the time-evolution of scattering signatures. Similar studies were carried out with L. monocytogenes grown on different substrates. Non-dimensionalizing incubation time in terms of the time to reach stationary phase was effective in reducing the dimensionality of the model. Bio-physical properties of the colony such as diameter, bacteria density variation, surface curvature/profile, and transmission coefficient are important parameters in predicting the features of the forward scattering signatures. These parameters are included in a baseline model that treats the colony as a concentric structure with radial variations in phase modulation. In some cases azimuthal variations and random phase inclusions were included as well. The end result is a protocol (growth media, incubation time and conditions) that produces reproducible and distinguishable scatter patterns for a variety of harmful foodborne pathogens in a short period of time.
Further, the bio-physical model we developed is very effective in predicting the dominant features of the scattering signatures required by the identification process and will be effective for informing further improvements in the instrumentation.

  2. Using a discrete-event simulation to balance ambulance availability and demand in static deployment systems.

    PubMed

    Wu, Ching-Han; Hwang, Kevin P

    2009-12-01

    To improve ambulance response time, matching ambulance availability with the emergency demand is crucial. To maintain the standard of 90% of response times within 9 minutes, the authors introduce a discrete-event simulation method to estimate the threshold for expanding the ambulance fleet when demand increases and to find the optimal dispatching strategies when provisional events create temporary decreases in ambulance availability. The simulation model was developed with information from the literature. Although the development was theoretical, the model was validated on the emergency medical services (EMS) system of Tainan City. The data are divided: one part is for model development, and the other for validation. For increasing demand, the effect was modeled on response time when call arrival rates increased. For temporary availability decreases, the authors simulated all possible alternatives of ambulance deployment in accordance with the number of out-of-routine-duty ambulances and the durations of three types of mass gatherings: marathon races (06:00-10:00 hr), rock concerts (18:00-22:00 hr), and New Year's Eve parties (20:00-01:00 hr). Statistical analysis confirmed that the model reasonably represented the actual Tainan EMS system. The response-time standard could not be reached when the incremental ratio of call arrivals exceeded 56%, which is the threshold for the Tainan EMS system to expand its ambulance fleet. When provisional events created temporary availability decreases, the Tainan EMS system could spare at most two ambulances from the standard configuration, except between 20:00 and 01:00, when it could spare three. The model also demonstrated that the current Tainan EMS has two excess ambulances that could be dropped. The authors suggest dispatching strategies to minimize the response times in routine daily emergencies. Strategies of capacity management based on this model improved response times. 
The more ambulances that are out of routine duty, the better the performance of the optimal strategies that are based on this model.

  3. Multivariate time series modeling of short-term system scale irrigation demand

    NASA Astrophysics Data System (ADS)

    Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara

    2015-12-01

    Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed multivariate time series (ARMAX) models to forecast irrigation demand with respect to aggregated service point flows (IDCGi, ASP) and off-take regulator flows (IDCGi, OTR) across 5 command areas, which included the area covered by four irrigation channels and the study area. These command-area-specific ARMAX models forecast daily IDCGi, ASP and IDCGi, OTR 1-5 days ahead using real-time flow data recorded at the service points and the uppermost regulators together with observed meteorological data collected from automatic weather stations. Model efficiency and predictive performance were quantified using the root mean squared error (RMSE), the Nash-Sutcliffe model efficiency coefficient (NSE), the anomaly correlation coefficient (ACC), and the mean square skill score (MSSS). During the evaluation period, NSE for IDCGi, ASP and IDCGi, OTR across the 5 command areas ranged from 0.78 to 0.98. The models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all 5 lead days, and these forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of these ARMAX time series models was higher than in almost all previous studies we are aware of. Further, the IDCGi, ASP and IDCGi, OTR forecasts improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models were self-adaptive, reflecting short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as changing water policy, continued development of water markets, drought, and changing technology.
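
    The Nash-Sutcliffe efficiency used to score the forecasts above is simple to compute: 1 minus the ratio of forecast error variance to the variance of the observations about their mean (1 = perfect, 0 = no better than the observed mean). A minimal sketch with hypothetical demand data:

    ```python
    def nse(obs, sim):
        """Nash-Sutcliffe model efficiency coefficient."""
        mean_obs = sum(obs) / len(obs)
        sse = sum((o - s) ** 2 for o, s in zip(obs, sim))   # forecast errors
        sst = sum((o - mean_obs) ** 2 for o in obs)         # variance about mean
        return 1 - sse / sst

    # Hypothetical observed daily demand and 1-day-ahead forecasts.
    obs = [10.0, 12.0, 15.0, 11.0, 9.0]
    sim = [10.5, 11.5, 14.0, 11.5, 9.5]
    score = nse(obs, sim)
    ```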

  4. Probabilistic models for the prediction of target growth interfaces of Listeria monocytogenes on ham and turkey breast products.

    PubMed

    Yoon, Yohan; Geornaras, Ifigenia; Scanga, John A; Belk, Keith E; Smith, Gary C; Kendall, Patricia A; Sofos, John N

    2011-08-01

    This study developed growth/no growth models for predicting growth boundaries of Listeria monocytogenes on ready-to-eat cured ham and uncured turkey breast slices as a function of lactic acid concentration (0% to 4%), dipping time (0 to 4 min), and storage temperature (4 to 10 °C). A 10-strain composite of L. monocytogenes was inoculated (2 to 3 log CFU/cm²) on slices, followed by dipping into lactic acid and storage in vacuum packages for up to 30 d. Total bacterial (tryptic soy agar plus 0.6% yeast extract) and L. monocytogenes (PALCAM agar) populations were determined on day 0 and at the endpoint of storage. The combinations of parameters that allowed increases in cell counts of L. monocytogenes of at least 1 log CFU/cm² were assigned the value of 1, while those limiting growth to <1 log CFU/cm² were given the value of 0. The binary data were used in logistic regression analysis for development of models to predict boundaries between growth and no growth of the pathogen at desired probabilities. Indices of model performance and validation with limited available data indicated that the models developed had acceptable goodness of fit. Thus, the described procedures using bacterial growth data from studies with food products may be appropriate in developing growth/no growth models to predict growth and to select lactic acid concentrations and dipping times for control of L. monocytogenes. The models developed in this study may be useful in selecting lactic acid concentrations and dipping times to control growth of Listeria monocytogenes on cured ham and uncured turkey breast during product storage, and in determining probabilities of growth under selected conditions. The modeling procedures followed may also be used for application in model development for other products, conditions, or pathogens. © 2011 Institute of Food Technologists®
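
    The growth/no growth boundary from a fitted logistic model can be read off directly. The sketch below uses HYPOTHETICAL coefficients (not the study's fitted values) to show how a treatment condition would be selected against a target growth probability:

    ```python
    import math

    def p_growth(acid, dip_min, temp, b=(-2.0, -3.0, -0.8, 0.6)):
        """Logistic growth/no-growth probability sketch:
        logit(p) = b0 + b1*acid% + b2*dip_min + b3*temp_C.
        Coefficients b are illustrative placeholders only."""
        b0, b1, b2, b3 = b
        logit = b0 + b1 * acid + b2 * dip_min + b3 * temp
        return 1 / (1 + math.exp(-logit))

    def min_acid_for_no_growth(dip_min, temp, p_max=0.1):
        """Smallest lactic acid concentration (0-4%, in 0.1% steps) keeping
        the predicted growth probability at or below p_max."""
        for acid in (i / 10 for i in range(41)):
            if p_growth(acid, dip_min, temp) <= p_max:
                return acid
        return None   # no concentration in range suffices
    ```

    For example, at a 2-min dip and 7 °C storage, scanning the acid range under these placeholder coefficients returns the lowest concentration on the safe side of the boundary.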

  5. ARTEMIS: Ares Real Time Environments for Modeling, Integration, and Simulation

    NASA Technical Reports Server (NTRS)

    Hughes, Ryan; Walker, David

    2009-01-01

    This slide presentation reviews the use of ARTEMIS in the development and testing of the Ares launch vehicles. The Ares Real Time Environment for Modeling, Simulation and Integration (ARTEMIS) is the real-time simulation supporting Ares I hardware-in-the-loop (HWIL) testing. ARTEMIS accurately models all Ares/Orion/ground subsystems that interact with Ares avionics components from pre-launch through orbit insertion. The ARTEMIS system integration lab and the STIF architecture are reviewed. The functional components of ARTEMIS are outlined, and an overview of the models and a block diagram are presented.

  6. Continuous piecewise-linear, reduced-order electrochemical model for lithium-ion batteries in real-time applications

    NASA Astrophysics Data System (ADS)

    Farag, Mohammed; Fleckenstein, Matthias; Habibi, Saeid

    2017-02-01

    Model-order reduction and minimization of the CPU run-time while maintaining the model accuracy are critical requirements for real-time implementation of lithium-ion electrochemical battery models. In this paper, an isothermal, continuous, piecewise-linear, electrode-average model is developed by using an optimal knot placement technique. The proposed model reduces the univariate nonlinear function of the electrode's open circuit potential dependence on the state of charge to continuous piecewise regions. The parameterization experiments were chosen to provide a trade-off between extensive experimental characterization techniques and purely identifying all parameters using optimization techniques. The model is then parameterized in each continuous, piecewise-linear, region. Applying the proposed technique cuts down the CPU run-time by around 20%, compared to the reduced-order, electrode-average model. Finally, the model validation against real-time driving profiles (FTP-72, WLTP) demonstrates the ability of the model to predict the cell voltage accurately with less than 2% error.
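
    The core speed trick above, replacing a nonlinear open-circuit-potential curve with continuous piecewise-linear segments, can be sketched with precomputed per-segment slopes so each evaluation is one table lookup plus a multiply-add. The knot positions and voltages below are illustrative placeholders, not the paper's optimally placed knots:

    ```python
    from bisect import bisect_right

    # Hypothetical SOC breakpoints and corresponding OCV values (V).
    SOC_KNOTS = [0.0, 0.1, 0.3, 0.6, 0.9, 1.0]
    OCV_KNOTS = [3.00, 3.45, 3.60, 3.75, 4.05, 4.20]

    # Precompute each linear segment's slope once, offline.
    SLOPES = [(v1 - v0) / (s1 - s0)
              for s0, s1, v0, v1 in zip(SOC_KNOTS, SOC_KNOTS[1:],
                                        OCV_KNOTS, OCV_KNOTS[1:])]

    def ocv(soc):
        """Continuous piecewise-linear OCV(SOC) over [0, 1]:
        clamp, locate the segment, then one multiply-add."""
        soc = min(max(soc, SOC_KNOTS[0]), SOC_KNOTS[-1])
        i = min(bisect_right(SOC_KNOTS, soc) - 1, len(SLOPES) - 1)
        return OCV_KNOTS[i] + SLOPES[i] * (soc - SOC_KNOTS[i])
    ```

    Because the segment search is a binary lookup and everything else is linear arithmetic, this form maps naturally onto the bounded CPU budgets of real-time battery-management code.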

  7. Australia's marine virtual laboratory

    NASA Astrophysics Data System (ADS)

    Proctor, Roger; Gillibrand, Philip; Oke, Peter; Rosebrock, Uwe

    2014-05-01

    In all modelling studies of realistic scenarios, a researcher has to go through a number of steps to set up a model in order to produce a simulation of value. The steps are generally the same, independent of the modelling system chosen. These steps include determining the time and space scales and processes of the required simulation; obtaining data for the initial set-up and for input during the simulation time; obtaining observation data for validation or data assimilation; implementing scripts to run the simulation(s); and running utilities or custom-built software to extract results. These steps are time consuming and resource hungry, and must be repeated for every simulation - the more complex the processes, the more effort is required to set up the simulation. The Australian Marine Virtual Laboratory (MARVL) is a new development in modelling frameworks for researchers in Australia. MARVL uses the TRIKE framework, a Java-based control system developed by CSIRO that allows a non-specialist user to configure and run a model, to automate many of the modelling preparation steps and bring the researcher more quickly to the stage of simulation and analysis. The tool is seen as enhancing the efficiency of researchers and marine managers, and is being considered as an educational aid in teaching.
    In MARVL we are developing a web-based open-source application that offers a number of model choices and supports search and recovery of relevant observations, allowing researchers to: a) efficiently configure a range of different community ocean and wave models for any region, for any historical time period, with model specifications of their choice, through a user-friendly web application, b) access data sets to force a model and to nest a model into, c) discover and assemble ocean observations from the Australian Ocean Data Network (AODN, http://portal.aodn.org.au/webportal/) in a format that is suitable for model evaluation or data assimilation, and d) run the assembled configuration in a cloud computing environment, or download the assembled configuration and packaged data to run on any other system of the user's choice. MARVL is now being applied in a number of case studies around Australia, ranging in scale from locally confined estuaries to the Tasman Sea between Australia and New Zealand. In time we expect the range of models offered to include biogeochemical models.

  8. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  16. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  17. USEPA Resistance Management Model development

    EPA Science Inventory

    The US EPA requires registrants of plant incorporated protectant (PIP) crops to provide information relating to the time frame for pest resistance development related to the control traits of the crop. Simulation models are used to evaluate the future conditions for resistance de...

  18. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  19. A Symmetric Time-Varying Cluster Rate of Descent Model

    NASA Technical Reports Server (NTRS)

    Ray, Eric S.

    2015-01-01

    A model of the time-varying rate of descent of the Orion vehicle was developed based on the observed correlation between canopy projected area and drag coefficient. This initial version of the model assumes cluster symmetry and only varies the vertical component of velocity. The cluster fly-out angle is modeled as a series of sine waves based on flight test data. The projected area of each canopy is synchronized with the primary fly-out angle mode. The sudden loss of projected area during canopy collisions is modeled at minimum fly-out angles, leading to brief increases in rate of descent. The cluster geometry is converted to drag coefficient using empirically derived constants. A more complete model is under development, which computes the aerodynamic response of each canopy to its local incidence angle.
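
    The two pieces of the descent model above, a sine-series fly-out angle and a drag balance converting area and drag coefficient to vertical speed, can be sketched as follows. The mode amplitudes, frequencies, and vehicle numbers are hypothetical stand-ins, not Orion flight-test values:

    ```python
    import math

    def fly_out_angle(t, modes=((8.0, 0.25, 0.0), (3.0, 0.9, 1.0))):
        """Cluster fly-out angle (deg) modeled as a sum of sine waves;
        each mode is (amplitude_deg, frequency_hz, phase_rad).
        The mode values here are illustrative only."""
        return sum(a * math.sin(2 * math.pi * f * t + p) for a, f, p in modes)

    def rate_of_descent(total_area_m2, cd, mass_kg, rho=1.225, g=9.81):
        """Equilibrium vertical speed from the drag balance
        m*g = 0.5*rho*v^2*Cd*S, i.e. v = sqrt(2*m*g / (rho*Cd*S))."""
        return math.sqrt(2 * mass_kg * g / (rho * cd * total_area_m2))

    # Hypothetical cluster: 1000 m^2 projected area, Cd 0.8, 9000 kg vehicle.
    v = rate_of_descent(1000.0, 0.8, 9000.0)
    ```

    Momentary losses of projected area during canopy collisions enter such a model as dips in `total_area_m2`, producing the brief rate-of-descent increases described above.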

  20. A generalized linear factor model approach to the hierarchical framework for responses and response times.

    PubMed

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-05-01

    We show how the hierarchical model for responses and response times as developed by van der Linden (2007), Fox, Klein Entink, and van der Linden (2007), Klein Entink, Fox, and van der Linden (2009), and Glas and van der Linden (2010) can be simplified to a generalized linear factor model with only the mild restriction that there is no hierarchical model at the item side. This result is valuable as it enables all well-developed modelling tools and extensions that come with these methods. We show that the restriction we impose on the hierarchical model does not influence parameter recovery under realistic circumstances. In addition, we present two illustrative real data analyses to demonstrate the practical benefits of our approach. © 2014 The British Psychological Society.

  1. Revisioning Faculty Development for Changing Times: The Foundation and Framework.

    ERIC Educational Resources Information Center

    Licklider, Barbara L.; Fulton, Carol; Schnelker, Diane L.

    1998-01-01

    Provides an interactive model of faculty development which draws from research on adult education and staff development. Argues that in order to improve the quality of undergraduate education college administrators can no longer assume that faculty will learn their craft on their own; they must provide time, opportunity and support. Contains 1…

  2. Development and application of a non-Gaussian atmospheric turbulence model for use in flight simulators

    NASA Technical Reports Server (NTRS)

    Reeves, P. M.; Campbell, G. S.; Ganzer, V. M.; Joppa, R. G.

    1974-01-01

    A method is described for generating time histories which model the frequency content and certain non-Gaussian probability characteristics of atmospheric turbulence including the large gusts and patchy nature of turbulence. Methods for time histories using either analog or digital computation are described. A STOL airplane was programmed into a 6-degree-of-freedom flight simulator, and turbulence time histories from several atmospheric turbulence models were introduced. The pilots' reactions are described.

  3. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework

    PubMed Central

    Nahum-Shani, Inbal; Hekler, Eric B.; Spruijt-Metz, Donna

    2016-01-01

Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs): suites of interventions that adapt over time to an individual’s changing status and circumstances, with the goal of addressing the individual’s need for support whenever this need arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical, and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. PMID:26651462

  4. Investigation of Fully Three-Dimensional Helical RF Field Effects on TWT Beam/Circuit Interaction

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    2000-01-01

A fully three-dimensional (3D), time-dependent, helical traveling-wave tube (TWT) interaction model has been developed using the electromagnetic particle-in-cell (PIC) code MAFIA. The model includes a short section of helical slow-wave circuit with excitation fed by RF input/output couplers, and an electron beam contained by periodic permanent magnet (PPM) focusing. All components of the model are simulated in three dimensions, allowing the effects of the fully 3D helical fields on RF circuit/beam interaction to be investigated for the first time. The development of the interaction model is presented, and predicted TWT performance using 2.5D and 3D models is compared to investigate the effect of conventional approximations used in TWT analyses.

  5. Sensitivity analysis of 1-D dynamical model for basin analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, S.

    1987-01-01

Geological processes related to petroleum generation, migration, and accumulation are very complicated in terms of the time and variables involved, and it is very difficult to simulate these processes by laboratory experiments. For these reasons, many mathematical/computer models have been developed to simulate these geological processes based on geological, geophysical, and geochemical principles. The sensitivity analysis in this study is a comprehensive examination of how geological, geophysical, and geochemical parameters influence the reconstructions of geohistory, thermal history, and hydrocarbon generation history using the 1-D fluid flow/compaction model developed by the Basin Modeling Group at the University of South Carolina. This study shows the effects of commonly used parameters such as depth, age, lithology, porosity, permeability, unconformity (eroded thickness and erosion time), temperature at the sediment surface, bottom-hole temperature, present-day heat flow, thermal gradient, thermal conductivity, and kerogen type and content on the evolution of formation thickness, porosity, permeability, and pressure with time and depth; heat flow with time; temperature with time and depth; vitrinite reflectance (Ro) and TTI with time and depth; the oil window in terms of time and depth; and the amount of hydrocarbons generated with time and depth. Lithology, present-day heat flow, and thermal conductivity are the most sensitive parameters in the reconstruction of temperature history.

  6. Mechanisms of interactive specialization and emergence of functional brain circuits supporting cognitive development in children

    NASA Astrophysics Data System (ADS)

    Battista, Christian; Evans, Tanya M.; Ngoon, Tricia J.; Chen, Tianwen; Chen, Lang; Kochalka, John; Menon, Vinod

    2018-01-01

    Cognitive development is thought to depend on the refinement and specialization of functional circuits over time, yet little is known about how this process unfolds over the course of childhood. Here we investigated growth trajectories of functional brain circuits and tested an interactive specialization model of neurocognitive development which posits that the refinement of task-related functional networks is driven by a shared history of co-activation between cortical regions. We tested this model in a longitudinal cohort of 30 children with behavioral and task-related functional brain imaging data at multiple time points spanning childhood and adolescence, focusing on the maturation of parietal circuits associated with numerical problem solving and learning. Hierarchical linear modeling revealed selective strengthening as well as weakening of functional brain circuits. Connectivity between parietal and prefrontal cortex decreased over time, while connectivity within posterior brain regions, including intra-hemispheric and inter-hemispheric parietal connectivity, as well as parietal connectivity with ventral temporal occipital cortex regions implicated in quantity manipulation and numerical symbol recognition, increased over time. Our study provides insights into the longitudinal maturation of functional circuits in the human brain and the mechanisms by which interactive specialization shapes children's cognitive development and learning.

  7. Piezoceramic devices and artificial intelligence time varying concepts in smart structures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Calise, A. J.; Glass, B. J.

    1990-01-01

The problem of the development of smart structures and their vibration control through the use of piezoceramic sensors and actuators is discussed. In particular, these structures are assumed to have time-varying model form and parameters. The model form may change significantly and suddenly. Combined identification of the model form and parameters of these structures and model-adaptive control of these structures are discussed in this paper.

  8. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  9. The Co-Development of Parenting Stress and Childhood Internalizing and Externalizing Problems.

    PubMed

    Stone, Lisanne L; Mares, Suzanne H W; Otten, Roy; Engels, Rutger C M E; Janssens, Jan M A M

Although the detrimental influence of parenting stress on child problem behavior is well established, it remains unknown how these constructs affect each other over time. In accordance with a transactional model, this study investigates how the development of internalizing and externalizing problems is related to the development of parenting stress in children aged 4-9. Mothers of 1582 children participated in three one-year interval data waves. Internalizing and externalizing problems as well as parenting stress were assessed by maternal self-report. The interrelated development of parenting stress with internalizing and externalizing problems was examined using Latent Growth Modeling. Directionality of effects was further investigated using cross-lagged models. Parenting stress and externalizing problems showed a decrease over time, whereas internalizing problems remained stable. Initial levels of parenting stress were related to initial levels of both internalizing and externalizing problems. Decreases in parenting stress were related to larger decreases in externalizing problems and to the (stable) course of internalizing problems. Some evidence for reciprocity was found, such that externalizing problems were associated with parenting stress and vice versa over time, specifically for boys. Our findings support the transactional model in explaining psychopathology.

  10. New analytic results for speciation times in neutral models.

    PubMed

    Gernhard, Tanja

    2008-05-01

In this paper, we investigate the standard Yule model and a recently studied model of speciation and extinction, the "critical branching process." We develop an analytic way, as opposed to the common simulation approach, of calculating the speciation times in a reconstructed phylogenetic tree. Simple expressions for the density and the moments of the speciation times are obtained. Methods for dating a speciation event become valuable if, for the reconstructed phylogenetic trees, no time scale is available. A missing time scale could be due to supertree methods, morphological data, or molecular data which violates the molecular clock. Our analytic approach is particularly useful for the model with extinction, since simulations of birth-death processes which are conditioned on obtaining n extant species today are quite delicate. Further, simulations are very time consuming for large n under both models.
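The pure-birth side of the model is easy to simulate for comparison with such analytic results: with k extant lineages, the waiting time to the next speciation event is exponential with rate k·λ. A minimal Python sketch of that simulation (function name and parameters are illustrative, not from the paper):

```python
import random

def yule_speciation_times(n, birth_rate=1.0, seed=42):
    """Simulate a Yule (pure-birth) process until n lineages exist,
    recording the absolute time of each speciation event.
    With k lineages, the next split occurs after Exp(k * birth_rate) time."""
    rng = random.Random(seed)
    times, t, k = [], 0.0, 1
    while k < n:
        t += rng.expovariate(k * birth_rate)  # k competing exponential clocks
        times.append(t)
        k += 1
    return times

times = yule_speciation_times(10)  # 9 speciation events yield 10 extant species
```

This is precisely the kind of simulation the closed-form densities replace, and, as the abstract notes, it becomes far more delicate once extinction is added and trees must be conditioned on n extant species today.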

  11. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real-time inferencing, and the management of large knowledge-based simulation projects.

  12. Logistic and linear regression model documentation for statistical relations between continuous real-time and discrete water-quality constituents in the Kansas River, Kansas, July 2012 through June 2015

    USGS Publications Warehouse

    Foster, Guy M.; Graham, Jennifer L.

    2016-04-06

The Kansas River is a primary source of drinking water for about 800,000 people in northeastern Kansas. Source-water supplies are treated by a combination of chemical and physical processes to remove contaminants before distribution. Advanced notification of changing water-quality conditions and cyanobacteria and associated toxin and taste-and-odor compounds provides drinking-water treatment facilities time to develop and implement adequate treatment strategies. The U.S. Geological Survey (USGS), in cooperation with the Kansas Water Office (funded in part through the Kansas State Water Plan Fund), and the City of Lawrence, the City of Topeka, the City of Olathe, and Johnson County Water One, began a study in July 2012 to develop statistical models at two Kansas River sites located upstream from drinking-water intakes. Continuous water-quality monitors have been operated and discrete water-quality samples have been collected on the Kansas River at Wamego (USGS site number 06887500) and De Soto (USGS site number 06892350) since July 2012. Continuous and discrete water-quality data collected during July 2012 through June 2015 were used to develop statistical models for constituents of interest at the Wamego and De Soto sites. Logistic models to continuously estimate the probability of occurrence above selected thresholds were developed for cyanobacteria, microcystin, and geosmin. Linear regression models to continuously estimate constituent concentrations were developed for major ions, dissolved solids, alkalinity, nutrients (nitrogen and phosphorus species), suspended sediment, indicator bacteria (Escherichia coli, fecal coliform, and enterococci), and actinomycetes bacteria. These models will be used to provide real-time estimates of the probability that cyanobacteria and associated compounds exceed thresholds and of the concentrations of other water-quality constituents in the Kansas River.
The models documented in this report are useful for characterizing changes in water-quality conditions through time, characterizing potentially harmful cyanobacterial events, and indicating changes in water-quality conditions that may affect drinking-water treatment processes.
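Logistic models of the kind documented here take the general form p = 1/(1 + exp(-(b0 + b1·x))), mapping a continuously monitored surrogate variable to the probability of exceeding a threshold. A sketch with invented coefficients and an invented turbidity surrogate (not the fitted USGS models):

```python
import math

def exceedance_probability(x, b0, b1):
    """Logistic model for the probability that a constituent
    (e.g. microcystin) exceeds a threshold, given a continuously
    monitored surrogate x: p = 1 / (1 + exp(-(b0 + b1 * x)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# Illustrative coefficients and surrogate value -- NOT the fitted USGS values.
p = exceedance_probability(x=55.0, b0=-4.0, b1=0.08)
```

Driven by a real-time monitor, such a function turns each new sensor reading into an exceedance probability that a treatment plant can act on.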

  13. Germination parameterization and development of an after-ripening thermal-time model for primary dormancy release of Lithospermum arvense seeds.

    PubMed

    Chantre, Guillermo R; Batlla, Diego; Sabbatini, Mario R; Orioli, Gustavo

    2009-06-01

    Models based on thermal-time approaches have been a useful tool for characterizing and predicting seed germination and dormancy release in relation to time and temperature. The aims of the present work were to evaluate the relative accuracy of different thermal-time approaches for the description of germination in Lithospermum arvense and to develop an after-ripening thermal-time model for predicting seed dormancy release. Seeds were dry-stored at constant temperatures of 5, 15 or 24 degrees C for up to 210 d. After different storage periods, batches of 50 seeds were incubated at eight constant temperature regimes of 5, 8, 10, 13, 15, 17, 20 or 25 degrees C. Experimentally obtained cumulative-germination curves were analysed using a non-linear regression procedure to obtain optimal population thermal parameters for L. arvense. Changes in these parameters were described as a function of after-ripening thermal-time and storage temperature. The most accurate approach for simulating the thermal-germination response of L. arvense was achieved by assuming a normal distribution of both base and maximum germination temperatures. The results contradict the widely accepted assumption of a single T(b) value for the entire seed population. The after-ripening process was characterized by a progressive increase in the mean maximum germination temperature and a reduction in the thermal-time requirements for germination at sub-optimal temperatures. The after-ripening thermal-time model developed here gave an acceptable description of the observed field emergence patterns, thus indicating its usefulness as a predictive tool to enhance weed management tactics.
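The thermal-time bookkeeping underlying such models is simple: each day contributes its temperature excess above a base temperature T_b, and a developmental event is predicted once the accumulated degree-days reach a requirement. A minimal sketch (the temperatures and T_b below are illustrative, not the fitted L. arvense parameters):

```python
def thermal_time(daily_temps, t_base):
    """Accumulate degree-days above a base temperature T_b:
    each day contributes max(T - T_b, 0)."""
    return sum(max(t - t_base, 0.0) for t in daily_temps)

# Four daily mean temperatures (degrees C) against an assumed T_b of 10.
tt = thermal_time([12.0, 15.0, 8.0, 20.0], t_base=10.0)  # 2 + 5 + 0 + 10
```

The paper's population-level refinement, treating T_b (and the maximum temperature) as normally distributed across seeds rather than as a single value, changes the parameters fed into this accumulation, not the accumulation itself.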

  14. Development of failure model for nickel cadmium cells

    NASA Technical Reports Server (NTRS)

    Gupta, A.

    1980-01-01

    The development of a method for the life prediction of nickel cadmium cells is discussed. The approach described involves acquiring an understanding of the mechanisms of degradation and failure and at the same time developing nondestructive evaluation techniques for the nickel cadmium cells. The development of a statistical failure model which will describe the mechanisms of degradation and failure is outlined.

  15. TEMPORAL CORRELATION OF CLASSIFICATIONS IN REMOTE SENSING

    EPA Science Inventory

    A bivariate binary model is developed for estimating the change in land cover from satellite images obtained at two different times. The binary classifications of a pixel at the two times are modeled as potentially correlated random variables, conditional on the true states of th...

  16. Proper Generalized Decomposition (PGD) for the numerical simulation of polycrystalline aggregates under cyclic loading

    NASA Astrophysics Data System (ADS)

    Nasri, Mohamed Aziz; Robert, Camille; Ammar, Amine; El Arem, Saber; Morel, Franck

    2018-02-01

The numerical modelling of the behaviour of materials at the microstructural scale has developed greatly over the last two decades. Unfortunately, conventional solution methods cannot simulate polycrystalline aggregates beyond tens of loading cycles, and they do not remain quantitative because of the plastic behaviour. This work presents the development of a numerical solver for finite element modelling of polycrystalline aggregates subjected to cyclic mechanical loading. The method is based on two concepts. The first consists in maintaining a constant stiffness matrix. The second uses a time/space model reduction method. In order to analyse the applicability and performance of a space-time separated representation, simulations are carried out on a three-dimensional polycrystalline aggregate under cyclic loading. Different numbers of elements per grain and two time increments per cycle are investigated. The results show a significant CPU time saving while maintaining good precision. Moreover, as the number of elements and the number of time increments per cycle increase, the model reduction method is faster than the standard solver.

  17. The University of British Columbia model of interprofessional education.

    PubMed

    Charles, Grant; Bainbridge, Lesley; Gilbert, John

    2010-01-01

The College of Health Disciplines at the University of British Columbia (UBC) has a long history of developing interprofessional learning opportunities for students and practitioners. Historically, many of the courses and programmes were developed because they intuitively made sense or because certain streams of funding were available at particular times. While each of them fits generally within our understanding of interprofessional education in the health and human service education programs, they were not systematically developed within an educational or theoretical framework. This paper discusses the model we have subsequently developed at the College for conceptualizing the various types of interprofessional experiences offered at UBC. It has been developed so that we can offer the broadest range of courses and most effective learning experiences for our students. Our model is based on the premise that there are optimal learning times for health and human services students (and practitioners) depending upon their stage of development as professionals in their respective disciplines and their readiness to learn and develop new perspectives on professional interaction.

  18. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model from the patient's own data alone. To address these problems we propose, develop, and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models, and consequently combines the advantages of population, patient-specific, and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
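The core of such a switching scheme is easy to sketch: keep a pool of candidate models and, at each prediction step, pick the one with the smallest error on the most recent observations. A minimal illustration (all model names, coefficients, and data below are hypothetical, not from the paper):

```python
def select_model(models, history, window=5):
    """Adaptive model switching: return the model whose predictions had
    the lowest mean absolute error over the most recent `window`
    (input, observed) pairs."""
    recent = history[-window:]
    def mae(model):
        return sum(abs(model(x) - y) for x, y in recent) / len(recent)
    return min(models, key=mae)

# Two hypothetical forecasters for a lab value given the previous reading.
population_model = lambda x: 0.5 * x         # pooled, population-level fit
patient_model = lambda x: 0.9 * x + 0.1      # patient-specific fit
history = [(1.0, 0.95), (2.0, 1.9), (3.0, 2.8)]
best = select_model([population_model, patient_model], history)
```

Early in a patient's record the population model may win; as patient-specific data accumulate, the switching rule naturally migrates to the individualized model, which is the behaviour the framework exploits.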

  19. A k-epsilon modeling of near wall turbulence

    NASA Technical Reports Server (NTRS)

    Yang, Z.; Shih, T. H.

    1991-01-01

    A k-epsilon model is proposed for turbulent bounded flows. In this model, the turbulent velocity scale and turbulent time scale are used to define the eddy viscosity. The time scale is shown to be bounded from below by the Kolmogorov time scale. The dissipation equation is reformulated using the time scale, removing the need to introduce the pseudo-dissipation. A damping function is chosen such that the shear stress satisfies the near wall asymptotic behavior. The model constants used are the same as the model constants in the commonly used high turbulent Reynolds number k-epsilon model. Fully developed turbulent channel flows and turbulent boundary layer flows over a flat plate at various Reynolds numbers are used to validate the model. The model predictions were found to be in good agreement with the direct numerical simulation data.
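The abstract's central construction, an eddy viscosity built from the velocity scale sqrt(k) and a turbulent time scale bounded below by the Kolmogorov time scale sqrt(nu/epsilon), can be sketched as follows. This is an illustrative reading of the abstract only: the paper's damping function is omitted, and C_mu = 0.09 is simply the standard high-Reynolds-number constant the abstract says is retained.

```python
import math

def eddy_viscosity(k, eps, nu, c_mu=0.09):
    """Sketch: nu_t = C_mu * k * T, where the turbulent time scale
    T = k/eps is not allowed to fall below the Kolmogorov time scale
    sqrt(nu/eps). Damping function omitted."""
    t = max(k / eps, math.sqrt(nu / eps))  # enforce the lower bound on T
    return c_mu * k * t

nu_t = eddy_viscosity(k=1e-4, eps=1e-3, nu=1e-5)
```

The bound matters near the wall: as k → 0 with finite dissipation, k/eps collapses while sqrt(nu/eps) stays finite, keeping the time scale physically meaningful.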

  20. A quantitative approach to developing Parkinsonian monkeys (Macaca fascicularis) with intracerebroventricular 1-methyl-4-phenylpyridinium injections.

    PubMed

    Li, Hao; Lei, Xiaoguang; Huang, Baihui; Rizak, Joshua D; Yang, Lichuan; Yang, Shangchuan; Wu, Jing; Lü, Longbao; Wang, Jianhong; Yan, Ting; Li, Hongwei; Wang, Zhengbo; Hu, Yingzhou; Le, Weidong; Deng, Xingli; Li, Jiali; Xu, Lin; Zhang, Baorong; Hu, Xintian

    2015-08-15

Non-human primate Parkinson's disease (PD) models are essential for PD research. The most extensively used PD monkey models are induced with 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). However, the modeling process of developing PD monkeys cannot be quantitatively controlled with MPTP. Therefore, a new approach to quantitatively develop chronic PD monkey models will help to advance the goals of "reduction, replacement and refinement" in animal experiments. A novel chronic PD monkey model was reported using the intracerebroventricular administration of 1-methyl-4-phenylpyridinium (MPP(+)) in cynomolgus monkeys (Macaca fascicularis). This approach successfully produced stable and consistent PD monkeys with typical motor symptoms and pathological changes. More importantly, a sigmoidal relationship (Y = 8.15801e(-0.245/X); R = 0.73) was discovered between the PD score (Y) and the cumulative dose of MPP(+) (X). This relationship was then used to develop two additional PD monkeys on a specific time schedule (4 weeks), with a planned PD score (7), by controlling the dose and frequency of MPP(+) administration as an independent validation of the formula. The ability to develop Parkinsonian monkeys within controlled time frames by regulating the cumulative dose of intracerebroventricularly administered MPP(+), while limiting the side effects often witnessed in models developed with peripheral administration of MPTP, makes this model highly suitable for treatment development. This novel approach provides an edge in evaluating the mechanisms of PD pathology associated with environmental toxins and novel treatment approaches, as the formula developed provides a "map" to control and predict the modeling process. Copyright © 2015 Elsevier B.V. All rights reserved.
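Taken at face value, the reported curve can be used both to predict a PD score from a cumulative dose and, inverted, to plan the dose for a target score, which is how the two validation animals were scheduled. A sketch using only the published formula (the function names are ours, and the abstract does not give dose units, so the numeric dose is unitless here):

```python
import math

def pd_score(cumulative_dose):
    """Reported dose-response curve: Y = 8.15801 * exp(-0.245 / X),
    where X is the cumulative MPP+ dose and Y the PD score."""
    return 8.15801 * math.exp(-0.245 / cumulative_dose)

def dose_for_score(target_score):
    """Invert the curve to plan the cumulative dose for a target score."""
    return -0.245 / math.log(target_score / 8.15801)

dose = dose_for_score(7.0)  # cumulative dose planned for a PD score of 7
```

The inversion only makes sense for target scores below the asymptote 8.15801, which the curve never reaches at finite dose.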

  1. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and … customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods … specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  2. Development of the NASA Digital Astronaut Project Muscle Model

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Pennline, James A.; Thompson, W. K.; Humphreys, B. T.; Ryder, J. W.; Ploutz-Snyder, L. L.; Mulugeta, L.

    2015-01-01

    This abstract describes development work performed on the NASA Digital Astronaut Project Muscle Model. Muscle atrophy is a known physiological response to exposure to a low gravity environment. The DAP muscle model computationally predicts the change in muscle structure and function vs. time in a reduced gravity environment. The spaceflight muscle model can then be used in biomechanical models of exercise countermeasures and spaceflight tasks to: 1) develop site specific bone loading input to the DAP bone adaptation model over the course of a mission; 2) predict astronaut performance of spaceflight tasks; 3) inform effectiveness of new exercise countermeasures concepts.

  3. Development of optimization model for sputtering process parameter based on gravitational search algorithm

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

In the RF magnetron sputtering process, the desirable layer properties are largely influenced by the process parameters and conditions. If the quality of the thin film has not reached its intended level, the experiments have to be repeated until the desired quality is met. This research proposes the Gravitational Search Algorithm (GSA) as an optimization model to reduce the time and cost spent in thin film fabrication. The optimization model's engine has been developed in Java. The model is based on the GSA concept, which is inspired by the Newtonian laws of gravity and motion. In this research, the model is expected to optimize four deposition parameters: RF power, deposition time, oxygen flow rate, and substrate temperature. The results are promising, and it can be concluded that the model performs satisfactorily on this parameter optimization problem. Future work could compare GSA with other nature-inspired algorithms and test them with various sets of data.
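A gravitational search treats candidate parameter settings as bodies that attract one another with forces proportional to fitness-derived "masses", under a gravitational constant that decays over the run. A compact, illustrative Python sketch of the idea; the authors' Java engine and sputtering objective are not reproduced, and a sphere function stands in for the deposition-quality objective:

```python
import math
import random

def gsa_minimize(objective, bounds, n_agents=20, iters=100,
                 g0=100.0, alpha=20.0, seed=1):
    """Minimal Gravitational Search Algorithm sketch: agents with lower
    objective values earn larger masses, and every agent accelerates
    toward the heavy agents under a decaying gravity g."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_agents)]
    vel = [[0.0] * dim for _ in range(n_agents)]
    best_x, best_f = None, float("inf")
    for t in range(iters):
        fit = [objective(p) for p in pos]
        fb, fw = min(fit), max(fit)
        if fb < best_f:
            best_f, best_x = fb, list(pos[fit.index(fb)])
        g = g0 * math.exp(-alpha * t / iters)          # gravity decays over time
        raw = [(f - fw) / (fb - fw - 1e-12) for f in fit]  # best -> ~1, worst -> 0
        total = sum(raw) + 1e-12
        mass = [m / total for m in raw]                # normalized masses
        for i in range(n_agents):
            acc = [0.0] * dim
            for j in range(n_agents):
                if i == j:
                    continue
                dist = math.dist(pos[i], pos[j]) + 1e-12
                for d in range(dim):
                    acc[d] += rng.random() * g * mass[j] * (pos[j][d] - pos[i][d]) / dist
            for d in range(dim):
                vel[i][d] = rng.random() * vel[i][d] + acc[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
    return best_x, best_f

# A sphere function stands in for a measured thin-film quality objective.
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = gsa_minimize(sphere, [(-5.0, 5.0)] * 2)
```

In the sputtering setting the four coordinates would be RF power, deposition time, oxygen flow rate, and substrate temperature, with the objective scoring measured film quality.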

  4. Real time detection of farm-level swine mycobacteriosis outbreak using time series modeling of the number of condemned intestines in abattoirs.

    PubMed

    Adachi, Yasumoto; Makita, Kohei

    2015-09-01

    Mycobacteriosis in swine is a common zoonosis found in abattoirs during meat inspections, and the veterinary authority is expected to inform the producer for corrective actions when an outbreak is detected. The expected value of the number of condemned carcasses due to mycobacteriosis therefore would be a useful threshold to detect an outbreak, and the present study aims to develop such an expected value through time series modeling. The model was developed using eight years of inspection data (2003 to 2010) obtained at 2 abattoirs of the Higashi-Mokoto Meat Inspection Center, Japan. The resulting model was validated by comparing the predicted time-dependent values for the subsequent 2 years with the actual data for 2 years between 2011 and 2012. For the modeling, at first, periodicities were checked using Fast Fourier Transformation, and the ensemble average profiles for weekly periodicities were calculated. An Auto-Regressive Integrated Moving Average (ARIMA) model was fitted to the residual of the ensemble average on the basis of minimum Akaike's information criterion (AIC). The sum of the ARIMA model and the weekly ensemble average was regarded as the time-dependent expected value. During 2011 and 2012, the number of whole or partial condemned carcasses exceeded the 95% confidence interval of the predicted values 20 times. All of these events were associated with the slaughtering of pigs from three producers with the highest rate of condemnation due to mycobacteriosis.
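The detection scheme can be illustrated in miniature: subtract a weekly ensemble-average profile from the daily counts, model the residual, and flag days whose counts exceed the expected value plus a 95% band. The sketch below substitutes a constant residual band for the paper's fitted ARIMA prediction interval, and all names and data are invented:

```python
import statistics

def weekly_profile(counts):
    """Ensemble average for each position in the weekly (period-7) cycle."""
    buckets = [[] for _ in range(7)]
    for i, c in enumerate(counts):
        buckets[i % 7].append(c)
    return [statistics.mean(b) for b in buckets]

def detect_outbreaks(history, new_counts):
    """Flag days whose condemnation count exceeds the expected weekly
    value plus a 95% band on the historical residuals (a constant-
    variance stand-in for the paper's ARIMA prediction interval)."""
    profile = weekly_profile(history)
    residuals = [c - profile[i % 7] for i, c in enumerate(history)]
    band = 1.96 * statistics.pstdev(residuals)
    start = len(history)
    return [i for i, c in enumerate(new_counts)
            if c > profile[(start + i) % 7] + band]

history = [5, 6, 5, 7, 6, 2, 1] * 8                        # eight weeks of counts
alerts = detect_outbreaks(history, [5, 6, 20, 7, 6, 2, 1])  # spike on day 2
```

Modelling the residual with an ARIMA process, as the paper does, lets the band track autocorrelated drift in condemnation rates instead of assuming constant variance.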

  5. Prediction of objectively measured physical activity and sedentariness among blue-collar workers using survey questionnaires.

    PubMed

    Gupta, Nidhi; Heiden, Marina; Mathiassen, Svend Erik; Holtermann, Andreas

    2016-05-01

We aimed at developing and evaluating statistical models predicting objectively measured occupational time spent sedentary or in physical activity from self-reported information available in large epidemiological studies and surveys. Two hundred and fourteen blue-collar workers responded to a questionnaire containing information about personal and work-related variables available in most large epidemiological studies and surveys. Workers also wore accelerometers for 1-4 days, measuring time spent sedentary and in physical activity, defined as non-sedentary time. Least-squares linear regression models were developed, predicting objectively measured exposures from selected predictors in the questionnaire. A full prediction model based on age, gender, body mass index, job group, self-reported occupational physical activity (OPA), and self-reported occupational sedentary time (OST) explained 63% (R(2) adjusted) of the variance of both objectively measured time spent sedentary and in physical activity, since these two exposures were complementary. Single-predictor models based only on self-reported information about either OPA or OST explained 21% and 38%, respectively, of the variance of the objectively measured exposures. Internal validation using bootstrapping suggested that the full and single-predictor models would show almost the same performance in new datasets as in the dataset used for modelling. Both full and single-predictor models based on self-reported information typically available in most large epidemiological studies and surveys were able to predict objectively measured occupational time spent sedentary or in physical activity, with explained variances ranging from 21% to 63%.
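The single-predictor models described here are plain least-squares fits, and the explained variance is the R² of the regression. A self-contained sketch with invented data standing in for self-reported OPA scores versus accelerometer-measured active minutes:

```python
def fit_simple_ols(xs, ys):
    """Least-squares fit of y = a + b*x via closed-form normal equations;
    returns (intercept, slope, R^2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                         # slope
    a = my - b * mx                       # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot    # R^2 = explained variance

# Invented data: self-reported activity score vs. measured active minutes.
a, b, r2 = fit_simple_ols([1, 2, 3, 4, 5], [50, 90, 110, 160, 190])
```

The full model in the paper simply extends this to several predictors (age, gender, BMI, job group, OPA, OST), which is why its explained variance is higher than either single-predictor fit.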

  6. Development of 3D Oxide Fuel Mechanics Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, B. W.; Casagranda, A.; Pitts, S. A.

    This report documents recent work to improve the accuracy and robustness of the mechanical constitutive models used in the BISON fuel performance code. These developments include migration of the fuel mechanics models to be based on the MOOSE Tensor Mechanics module, improving the robustness of the smeared cracking model, implementing a capability to limit the time step size based on material model response, and improving the robustness of the return mapping iterations used in creep and plasticity models.

  7. Economic modeling of HIV treatments.

    PubMed

    Simpson, Kit N

    2010-05-01

To review the general literature on microeconomic modeling and key points that must be considered in the general assessment of economic modeling reports; to discuss the evolution of HIV economic models and identify models that illustrate this development over time, as well as examples of current studies; and to recommend improvements in HIV economic modeling. Recent economic modeling studies of HIV include examinations of scaling up antiretroviral (ARV) therapy in South Africa, screening prior to use of abacavir, preexposure prophylaxis, early start of ARV therapy in developing countries, and cost-effectiveness comparisons of specific ARV drugs using data from clinical trials. These studies all used extensively published second-generation Markov models in their analyses. There have been attempts to simplify approaches to cost-effectiveness estimation by using simple decision trees or cost-effectiveness calculations with short time horizons. However, these approaches leave out important cumulative economic effects that will not appear early in a treatment. Many economic modeling studies were identified in the 'gray' literature, but limited descriptions precluded an assessment of their adherence to modeling guidelines, and thus of the validity of their findings. There is a need to develop third-generation models that accommodate new knowledge about adherence, adverse effects, and viral resistance.

  8. The application of latent curve analysis to testing developmental theories in intervention research.

    PubMed

    Curran, P J; Muthén, B O

    1999-08-01

    The effectiveness of a prevention or intervention program has traditionally been assessed using time-specific comparisons of mean levels between the treatment and the control groups. However, many times the behavior targeted by the intervention is naturally developing over time, and the goal of the treatment is to alter this natural or normative developmental trajectory. Examining time-specific mean levels can be both limiting and potentially misleading when the behavior of interest is developing systematically over time. It is argued here that there are both theoretical and statistical advantages associated with recasting intervention treatment effects in terms of normative and altered developmental trajectories. The recently developed technique of latent curve (LC) analysis is reviewed and extended to a true experimental design setting in which subjects are randomly assigned to a treatment intervention or a control condition. LC models are applied to both artificially generated and real intervention data sets to evaluate the efficacy of an intervention program. Not only do the LC models provide a more comprehensive understanding of the treatment and control group developmental processes compared to more traditional fixed-effects models, but LC models have greater statistical power to detect a given treatment effect. Finally, the LC models are modified to allow for the computation of specific power estimates under a variety of conditions and assumptions that can provide much needed information for the planning and design of more powerful but cost-efficient intervention programs for the future.
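The idea of a treatment effect as an altered developmental trajectory can be illustrated with a small simulation; this is a hedged sketch using per-subject OLS slopes on synthetic random-intercept/random-slope data, not the latent curve estimation machinery of the paper, and every parameter is invented.

```python
import random

random.seed(11)
waves = [0, 1, 2, 3]                      # four measurement occasions

def subject(slope_mean):
    """Simulate one subject's repeated measures with latent growth parameters."""
    b0 = random.gauss(10, 1)              # latent intercept
    b1 = random.gauss(slope_mean, 0.3)    # latent slope (developmental trajectory)
    return [b0 + b1 * t + random.gauss(0, 0.5) for t in waves]

def ols_slope(y):
    """Per-subject least-squares slope across the four waves."""
    mt = sum(waves) / len(waves)
    my = sum(y) / len(y)
    num = sum((t - mt) * (v - my) for t, v in zip(waves, y))
    den = sum((t - mt) ** 2 for t in waves)
    return num / den

control = [ols_slope(subject(2.0)) for _ in range(200)]
treated = [ols_slope(subject(1.0)) for _ in range(200)]   # flattened trajectory
slope_diff = sum(control) / 200 - sum(treated) / 200
```

With the intervention shifting the mean slope from 2.0 to 1.0, the group difference in estimated slopes recovers roughly 1.0, a trajectory-level effect that single-wave mean comparisons would dilute.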

  9. Numerical Models for Sound Propagation in Long Spaces

    NASA Astrophysics Data System (ADS)

    Lai, Chenly Yuen Cheung

Both reverberation time and steady-state sound field are key elements for assessing the acoustic conditions in an enclosed space. They affect noise propagation, speech intelligibility, clarity index, and definition. Since the sound field in a long space is non-diffuse, classical room acoustics theory does not apply in this situation. The ray tracing technique and the image source method are two common models used to determine both reverberation time and steady-state sound field in long enclosures. Although both models can give accurate estimates of reverberation times and steady-state sound fields, directly or indirectly, they often involve time-consuming calculations. In order to simplify the acoustic analysis, a theoretical formulation has been developed for predicting both steady-state sound fields and reverberation times in street canyons. The prediction model is further developed to predict the steady-state sound field in a long enclosure. Apart from the straight long enclosure, there are other variations such as a cross junction, a long enclosure with a T-intersection, and a U-turn long enclosure. In the present study, theoretical and experimental investigations were conducted to develop formulae for predicting reverberation times and steady-state sound fields in a junction of a street canyon and in a long enclosure with a T-intersection. The theoretical models are validated by comparing the numerical predictions with published experimental results. The theoretical results are also compared with precise indoor measurements and large-scale outdoor experimental results. Most previous acoustical studies of long enclosures have focused on monopole sound sources. Besides non-directional sources, however, many noise sources in long enclosures are dipole-like, such as train noise and fan noise. In order to study the characteristics of directional noise sources, a review of available dipole sources was conducted. A dipole was constructed and subsequently used for experimental studies. In addition, a theoretical model was developed for predicting dipole sound fields. The theoretical model can be used to study the effect of a dipole source on speech intelligibility in long enclosures.

  10. The development of personal models of diabetes in the first 2 years after diagnosis: a prospective longitudinal study.

    PubMed

    Lawson, V L; Bundy, C; Harvey, J N

    2008-04-01

    Personal models of diabetes comprise beliefs about symptoms, treatment effectiveness, consequences and emotional responses to possible future complications. They are associated with, and influence, self-care behaviour. Little work has examined potential influences on the development and maintenance of personal models. The aims of this study were: (i) to assess changes in personal models over 2 years from diagnosis of diabetes; and (ii) to examine the relative contributions of health threat communication (at diagnosis, since diagnosis, during follow-up care) and personality to personal models of diabetes 2 years post-diagnosis. Newly diagnosed patients were interviewed at diagnosis (< 3 months; time 1) and 6 months (time 2), 1 year (time 3) and 2 years (time 4) after diagnosis. Data were available for 158 patients at time 1 (32 Type 1 patients and 126 Type 2 patients), 147 at time 2, 142 at time 3 and 138 at time 4. Perceptions of symptoms, consequences, course and control of diabetes remained stable over time. Emotional responses decreased and illness coherence (perceived understanding) increased over time. Health threat communication was a stronger predictor of personal models than personality. Emotional responses to diabetes 2 years after diagnosis were predicted by perceptions of a threatening health message (at diagnosis 18%, at follow-up 5%). Health threat communication predicted perceptions of serious consequences (at diagnosis 5%, at follow-up 9%). Perceptions of a reassuring message during follow-up were related to beliefs of treatment effectiveness (26%). The communication of information and the way it is perceived is an important determinant of the patient's view of their diabetes. The initial effects of the education process at diagnosis persisted 2 years after diagnosis.

  11. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2008-09-30

novel method of simultaneous real-time measurements of ice-nucleating particle concentrations and size-resolved chemical composition of individual...is to develop a practical predictive capability for visibility and weather effects of aerosol particles for the entire globe for timely use in...prediction follows that used in numerical weather prediction, namely real-time assessment for initialization of first-principles models. The Naval

  12. Estimating Multi-Level Discrete-Time Hazard Models Using Cross-Sectional Data: Neighborhood Effects on the Onset of Adolescent Cigarette Use.

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Brennan, Robert T.; Buka, Stephen L.

    2002-01-01

    Developed procedures for constructing a retrospective person-period data set from cross-sectional data and discusses modeling strategies for estimating multilevel discrete-time event history models. Applied the methods to the analysis of cigarette use by 1,979 urban adolescents. Results show the influence of the racial composition of the…

  13. Individual Change and the Timing and Onset of Important Life Events: Methods, Models, and Assumptions

    ERIC Educational Resources Information Center

    Grimm, Kevin; Marcoulides, Katerina

    2016-01-01

    Researchers are often interested in studying how the timing of a specific event affects concurrent and future development. When faced with such research questions there are multiple statistical models to consider and those models are the focus of this paper as well as their theoretical underpinnings and assumptions regarding the nature of the…

  14. Is questionnaire-based sitting time inaccurate and can it be improved? A cross-sectional investigation using accelerometer-based sitting time.

    PubMed

    Gupta, Nidhi; Christiansen, Caroline Stordal; Hanisch, Christiana; Bay, Hans; Burr, Hermann; Holtermann, Andreas

    2017-01-16

To investigate the differences between questionnaire-based and accelerometer-based sitting time, and to develop a model for improving the accuracy of questionnaire-based sitting time in predicting accelerometer-based sitting time. 183 workers in a cross-sectional study reported sitting time per day using a single question during the measurement period, and wore two Actigraph GT3X+ accelerometers on the thigh and trunk for 1-4 working days to determine their actual sitting time per day using the validated Acti4 software. Least squares regression models were fitted with questionnaire-based sitting time and other self-reported predictors to predict accelerometer-based sitting time. Questionnaire-based and accelerometer-based average sitting times were ≈272 and ≈476 min/day, respectively. A low Pearson correlation (r=0.32), a high mean bias (204.1 min) and wide limits of agreement (549.8 to -139.7 min) between questionnaire-based and accelerometer-based sitting time were found. The prediction model based on questionnaire-based sitting time explained 10% of the variance in accelerometer-based sitting time. Inclusion of nine self-reported predictors in the model increased the explained variance to 41%, with 10% optimism using a resampling bootstrap validation. Based on a split-validation analysis, the prediction model developed on ≈75% of the workers (n=132) reduced the mean and the SD of the difference between questionnaire-based and accelerometer-based sitting time by 64% and 42%, respectively, in the remaining 25% of the workers. This study indicates that questionnaire-based sitting time has low validity and that a prediction model can be one solution to materially improve the precision of questionnaire-based sitting time.
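A regression calibration of the kind described can be sketched on synthetic data; the under-reporting factor and noise level below are invented, and only a single self-reported predictor is used where the study added nine more.

```python
import random

def fit_line(x, y):
    """Ordinary least squares for one predictor: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return my - slope * mx, slope

random.seed(1)
truth = [random.uniform(300, 600) for _ in range(100)]    # accelerometer min/day
report = [t * 0.5 + random.gauss(0, 30) for t in truth]   # systematic under-report

b0, b1 = fit_line(report, truth)                          # calibration model
pred = [b0 + b1 * r for r in report]

bias_before = sum(t - r for t, r in zip(truth, report)) / len(truth)
bias_after = sum(t - p for t, p in zip(truth, pred)) / len(truth)
```

The calibrated predictions remove the mean bias entirely (an OLS property), mirroring the study's reduction of the questionnaire-accelerometer gap on the held-out workers.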

  15. Sandia/Stanford Unified Creep Plasticity Damage Model for ANSYS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, David M.; Vianco, Paul T.; Fossum, Arlo F.

    2006-09-03

A unified creep plasticity (UCP) model was developed based upon the time-dependent and time-independent deformation properties of the 95.5Sn-3.9Ag-0.6Cu (wt.%) solder that were measured at Sandia. A damage parameter, D, was then added to the equation to develop the unified creep plasticity damage (UCPD) model. The parameter D was parameterized using data obtained at Sandia from isothermal fatigue experiments on double-lap shear test specimens. The software was validated against a BGA solder joint exposed to thermal cycling. The UCPD model was implemented as a user subroutine in the ANSYS 8.1 finite element code.
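The coupling of creep and damage can be sketched with a toy constitutive update; the power-law form, the 1/(1-D) effective-stress factor, and all constants are hypothetical stand-ins, not the Sandia/Stanford UCPD equations.

```python
def creep_with_damage(stress_mpa, hours, a=1e-6, n=3.0, c=5.0, dt=0.1):
    """Euler integration of a creep law accelerated by a scalar damage variable D."""
    strain, damage, t = 0.0, 0.0, 0.0
    while t < hours and damage < 0.99:
        # damage reduces load-bearing area, raising the effective creep rate
        rate = a * stress_mpa ** n / (1.0 - damage)
        strain += rate * dt
        damage = min(0.99, damage + c * rate * dt)   # damage driven by creep strain
        t += dt
    return strain, damage

final_strain, final_damage = creep_with_damage(20.0, 1000.0)
```

Because damage feeds back into the creep rate, the strain history accelerates toward failure; qualitatively, this is what a damage parameter adds over a plain unified creep plasticity law.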

  16. Modeling and simulation of queuing system for customer service improvement: A case study

    NASA Astrophysics Data System (ADS)

    Xian, Tan Chai; Hong, Chai Weng; Hawari, Nurul Nazihah

    2016-10-01

This study aims to develop a queuing model of UniMall using a discrete event simulation approach to analyze the service performance factors that affect customer satisfaction. The performance measures considered in this model include the average time in the system, the total number of students served, the number of students in the waiting queue, the waiting time in the queue, and the maximum buffer length. ARENA simulation software is used to develop the simulation model, and its output is analyzed. Based on the analysis of the output, it is recommended that the management of UniMall consider introducing shifts and adding another payment counter in the morning.
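The single-counter case can be sketched as a minimal discrete-event simulation; exponential interarrival and service times are assumed here purely for illustration (the study itself built its model in ARENA).

```python
import random

def simulate_queue(n_customers, lam=0.8, mu=1.0, seed=7):
    """Single-server FIFO queue with exponential arrivals (rate lam)
    and exponential service (rate mu); returns (avg wait, avg time in system)."""
    random.seed(seed)
    arrival = 0.0
    free_at = 0.0                      # time the counter next becomes free
    waits, in_system = [], []
    for _ in range(n_customers):
        arrival += random.expovariate(lam)
        start = max(arrival, free_at)  # wait if the server is busy
        service = random.expovariate(mu)
        free_at = start + service
        waits.append(start - arrival)
        in_system.append(free_at - arrival)
    return sum(waits) / n_customers, sum(in_system) / n_customers

avg_wait, avg_system = simulate_queue(5000)
```

Here `avg_wait` and `avg_system` correspond to the "waiting time in queue" and "average time in system" measures listed above; adding a second counter would be modeled by tracking two `free_at` times.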

  17. Parameter Estimation for a Model of Space-Time Rainfall

    NASA Astrophysics Data System (ADS)

    Smith, James A.; Karr, Alan F.

    1985-08-01

    In this paper, parameter estimation procedures, based on data from a network of rainfall gages, are developed for a class of space-time rainfall models. The models, which are designed to represent the spatial distribution of daily rainfall, have three components, one that governs the temporal occurrence of storms, a second that distributes rain cells spatially for a given storm, and a third that determines the rainfall pattern within a rain cell. Maximum likelihood and method of moments procedures are developed. We illustrate that limitations on model structure are imposed by restricting data sources to rain gage networks. The estimation procedures are applied to a 240-mi2 (621 km2) catchment in the Potomac River basin.

  18. Survivability Versus Time

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2014-01-01

Develop a Survivability vs Time model as a decision-evaluation tool to assess various emergency egress methods used at Launch Complex 39B (LC 39B) and in the Vehicle Assembly Building (VAB) at NASA's Kennedy Space Center. For each hazard scenario, develop probability distributions to address statistical uncertainty, resulting in survivability plots over time and composite survivability plots encompassing multiple hazard scenarios.

  19. Reciprocal Associations between Young Children's Developing Moral Judgments and Theory of Mind

    ERIC Educational Resources Information Center

    Smetana, Judith G.; Jambon, Marc; Conry-Murray, Clare; Sturge-Apple, Melissa L.

    2012-01-01

    Associations between young children's developing theory of mind (ToM) and judgments of prototypical moral transgressions were examined 3 times across 1 year in 70 American middle class 2.5- to 4-year-olds. Separate path models controlling for cross-time stability in judgments, within-time associations, and children's age at Wave 1 indicated that…

  20. Aircraft Engine Systems

    NASA Technical Reports Server (NTRS)

    Veres, Joseph

    2001-01-01

This report outlines the detailed simulation of an aircraft turbofan engine. The objectives were to develop a detailed flow model of a full turbofan engine that runs on parallel workstation clusters overnight and to develop an integrated system of codes for combustor design and analysis to enable significant reduction in design time and cost. The model will initially simulate the 3-D flow in the primary flow path, including the flow and chemistry in the combustor, and ultimately result in a multidisciplinary model of the engine. The overnight 3-D simulation capability of the primary flow path in a complete engine will enable significant reduction in the design and development time of gas turbine engines. In addition, the NPSS (Numerical Propulsion System Simulation) multidisciplinary integration and analysis are discussed.

  1. Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application

    NASA Astrophysics Data System (ADS)

    Chen, Jinduan; Boccelli, Dominic L.

    2018-02-01

    Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
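A stripped-down version of the seasonal-plus-autoregressive idea can be sketched on synthetic hourly demand; the daily profile, AR(1) coefficient, and noise level are invented, and only the daily (not weekly) season is modeled.

```python
import math, random

random.seed(3)
hours = 24 * 28                                   # four weeks of hourly demand
profile = [50 + 20 * math.sin(2 * math.pi * h / 24) for h in range(24)]

resid = [0.0]                                     # AR(1) residual process
for _ in range(hours - 1):
    resid.append(0.7 * resid[-1] + random.gauss(0, 2))
demand = [profile[h % 24] + resid[h] for h in range(hours)]

# step 1: estimate the daily seasonal profile as hour-of-day means
days = hours // 24
est_profile = [sum(demand[h] for h in range(i, hours, 24)) / days
               for i in range(24)]

# step 2: fit AR(1) to the deseasonalized residuals (lag-1 regression)
r = [demand[h] - est_profile[h % 24] for h in range(hours)]
phi = sum(a * b for a, b in zip(r[1:], r[:-1])) / sum(a * a for a in r[:-1])

# one-step-ahead forecast: seasonal part plus damped last residual
forecast = est_profile[hours % 24] + phi * r[-1]
```

The recovered `phi` lands near its true value, and the one-step forecast combines the seasonal and autoregressive parts; a double-seasonal model of the kind studied would add a weekly term in the same way.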

  2. Bayesian algorithm implementation in a real time exposure assessment model on benzene with calculation of associated cancer risks.

    PubMed

    Sarigiannis, Dimosthenis A; Karakitsios, Spyros P; Gotti, Alberto; Papaloukas, Costas L; Kassomenos, Pavlos A; Pilidis, Georgios A

    2009-01-01

The objective of the current study was the development of a reliable modeling platform to calculate in real time the personal exposure and the associated health risk for filling station employees, evaluating current environmental parameters (traffic, meteorological, and amount of fuel traded) determined by the appropriate sensor network. A set of Artificial Neural Networks (ANNs) was developed to predict the benzene exposure pattern for the filling station employees. Furthermore, a Physiologically Based Pharmacokinetic (PBPK) risk assessment model was developed in order to calculate the lifetime probability distribution of leukemia for the employees, fed by data obtained from the ANN model. A Bayesian algorithm was involved at crucial points of both model subcompartments. The application was evaluated in two filling stations (one urban and one rural). Among several algorithms available for the development of the ANN exposure model, Bayesian regularization provided the best results and seemed to be a promising technique for prediction of the exposure pattern of that occupational population group. In assessing the estimated leukemia risk with the aim of providing a distribution curve based on the exposure levels and the different susceptibility of the population, the Bayesian algorithm was a prerequisite of the Monte Carlo approach, which is integrated in the PBPK-based risk model. In conclusion, the modeling system described herein is capable of exploiting the information collected by the environmental sensors in order to estimate in real time the personal exposure and the resulting health risk for employees of gasoline filling stations.

  4. Adaptive Response in Female Modeling of the Hypothalamic-pituitary-gonadal Axis

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...

  5. Artificial Intelligence Techniques for Predicting and Mapping Daily Pan Evaporation

    NASA Astrophysics Data System (ADS)

    Arunkumar, R.; Jothiprakash, V.; Sharma, Kirty

    2017-09-01

In this study, Artificial Intelligence techniques such as Artificial Neural Network (ANN), Model Tree (MT) and Genetic Programming (GP) are used to develop daily pan evaporation time-series (TS) prediction and cause-effect (CE) mapping models. Ten years of observed daily meteorological data such as maximum temperature, minimum temperature, relative humidity, sunshine hours, dew point temperature and pan evaporation are used for developing the models. For each technique, several models are developed by changing the number of inputs and other model parameters. The performance of each model is evaluated using standard statistical measures such as Mean Square Error, Mean Absolute Error, Normalized Mean Square Error and correlation coefficient (R). The results showed that the daily TS-GP(4) model, with a correlation coefficient of 0.959, predicted better than the other TS models. Among the various CE models, CE-ANN (6-10-1), with a correlation coefficient of 0.881, performed better than the MT and GP models. Because of the complex non-linear inter-relationships among the meteorological variables, the CE mapping models could not achieve the performance of the TS models. From this study, it was found that GP performs better for recognizing a single pattern (time series modelling), whereas ANN is better for modelling multiple patterns (cause-effect modelling) in the data.

  6. Chapter 2: Fire and Fuels Extension: Model description

    Treesearch

    Sarah J. Beukema; Elizabeth D. Reinhardt; Julee A. Greenough; Donald C. E. Robinson; Werner A. Kurz

    2003-01-01

    The Fire and Fuels Extension to the Forest Vegetation Simulator is a model that simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. Existing models are used to represent forest stand development (the Forest Vegetation Simulator, Wykoff and others 1982), fire behavior (Rothermel 1972, Van Wagner 1977, and...

  7. Social Context, Self-Perceptions and Student Engagement: A SEM Investigation of the Self-System Model of Motivational Development (SSMMD)

    ERIC Educational Resources Information Center

    Dupont, Serge; Galand, Benoit; Nils, Frédéric; Hospel, Virginie

    2014-01-01

    Introduction: The present study aimed to test a theoretically-based model (the self-system model of motivational development) including at the same time the extent to which the social context provides structure, warmth and autonomy support, the students' perceived autonomy, relatedness and competence, and behavioral, cognitive and emotional…

  8. Placing a Value on Academic Work: The Development and Implementation of a Time-Based Academic Workload Model

    ERIC Educational Resources Information Center

    Kenny, John; Fluck, Andrew; Jetson, Tim

    2012-01-01

    This paper presents a detailed case study of the development and implementation of a quantifiable academic workload model in the education faculty of an Australian university. Flowing from the enterprise bargaining process, the Academic Staff Agreement required the implementation of a workload allocation model for academics that was quantifiable…

  9. An evidence accumulation model for conflict detection performance in a simulated air traffic control task.

    PubMed

    Neal, Andrew; Kwantes, Peter J

    2009-04-01

    The aim of this article is to develop a formal model of conflict detection performance. Our model assumes that participants iteratively sample evidence regarding the state of the world and accumulate it over time. A decision is made when the evidence reaches a threshold that changes over time in response to the increasing urgency of the task. Two experiments were conducted to examine the effects of conflict geometry and timing on response proportions and response time. The model is able to predict the observed pattern of response times, including a nonmonotonic relationship between distance at point of closest approach and response time, as well as effects of angle of approach and relative velocity. The results demonstrate that evidence accumulation models provide a good account of performance on a conflict detection task. Evidence accumulation models are a form of dynamic signal detection theory, allowing for the analysis of response times as well as response proportions, and can be used for simulating human performance on dynamic decision tasks.
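The accumulation-to-a-collapsing-bound mechanism can be sketched directly; the drift, noise scale, and linear urgency schedule below are hypothetical, not the fitted parameters of the article.

```python
import random

def trial(drift, seed, dt=0.01):
    """One simulated decision: accumulate noisy evidence until it crosses
    a threshold that collapses over time (increasing urgency)."""
    rng = random.Random(seed)
    evidence, t = 0.0, 0.0
    while True:
        t += dt
        evidence += drift * dt + rng.gauss(0, 0.1) * dt ** 0.5
        bound = max(0.2, 1.0 - 0.1 * t)   # collapsing threshold
        if evidence >= bound:
            return "conflict", t
        if evidence <= -bound:
            return "no conflict", t

results = [trial(0.5, s) for s in range(200)]
p_conflict = sum(1 for resp, _ in results if resp == "conflict") / 200
mean_rt = sum(rt for _, rt in results) / 200
```

Because the threshold collapses as the scenario unfolds, later decisions are made on less accumulated evidence, which is one way such models produce both response proportions and response times from a single mechanism.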

  10. Network Reduction Algorithm for Developing Distribution Feeders for Real-Time Simulators: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Nelson, Austin; Prabakar, Kumaraguru

As advanced grid-support functions (AGF) become more widely used in grid-connected photovoltaic (PV) inverters, utilities are increasingly interested in their impacts when implemented in the field. These effects can be understood by modeling feeders in real-time systems and testing PV inverters using power hardware-in-the-loop (PHIL) techniques. This paper presents a novel feeder model reduction algorithm using a Monte Carlo method that enables large feeders to be solved and operated on real-time computing platforms. Two Hawaiian Electric feeder models in Synergi Electric's load flow software were converted to reduced-order models in OpenDSS, and subsequently implemented in the OPAL-RT real-time digital testing platform. Smart PV inverters were added to the real-time model with AGF responses modeled after characterizing commercially available hardware inverters. Finally, hardware inverters were tested in conjunction with the real-time model using PHIL techniques so that the effects of AGFs on the chosen feeders could be analyzed.

11. [Predicting Incidence of Hepatitis E in China Using Fuzzy Time Series Based on Fuzzy C-Means Clustering Analysis].

    PubMed

    Luo, Yi; Zhang, Tao; Li, Xiao-song

    2016-05-01

To explore the application of a fuzzy time series model based on fuzzy c-means clustering in forecasting the monthly incidence of Hepatitis E in mainland China. A predictive model (fuzzy time series method based on fuzzy c-means clustering) was developed using Hepatitis E incidence data in mainland China between January 2004 and July 2014. The incidence data from August 2014 to November 2014 were used to test the fitness of the predictive model. The forecasting results were compared with those obtained from traditional fuzzy time series models. The fuzzy time series model based on fuzzy c-means clustering had a fitting mean squared error (MSE) of 0.0011 and a forecasting MSE of 6.9775 × 10⁻⁴, compared with 0.0017 and 0.0014, respectively, for the traditional forecasting model. The results indicate that the fuzzy time series model based on fuzzy c-means clustering has better performance in forecasting the incidence of Hepatitis E.
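The fuzzy c-means step underlying such a model can be sketched for 1-D data; the toy data and two-cluster choice are illustrative only, and the time series fuzzification/defuzzification layers are omitted.

```python
def fcm(data, c=2, m=2.0, iters=50):
    """1-D fuzzy c-means: alternate membership and center updates."""
    centers = [min(data), max(data)]              # crude initialization
    u = []
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) + 1e-12 for v in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c))
                      for i in range(c)])
        # center update: fuzzy-weighted means of the data
        centers = [sum(u[k][i] ** m * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return centers, u

data = [0.10, 0.20, 0.15, 0.90, 1.00, 0.95]      # two obvious groups
centers, u = fcm(data)
```

Each point's membership row sums to 1, and the fuzzy-weighted centers settle near the two groups; a fuzzy time series model would then define its linguistic intervals from these cluster centers.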

  12. Spherical visual system for real-time virtual reality and surveillance

    NASA Astrophysics Data System (ADS)

    Chen, Su-Shing

    1998-12-01

A spherical visual system has been developed for full-field, web-based surveillance, virtual reality, and roundtable video conferencing. The hardware is a CycloVision parabolic lens mounted on a video camera. The software was developed at the University of Missouri-Columbia. The mathematical model was developed by Su-Shing Chen and Michael Penna in the 1980s. The parabolic image, capturing the full (360-degree) hemispherical field of view (except the north pole), is transformed into the spherical model of Chen and Penna. In the spherical model, images are invariant under the rotation group and are easily mapped to the image plane tangent to any point on the sphere. The projected image is exactly what the usual camera produces at that angle. Thus a real-time full-spherical-field video camera is developed by using two parabolic lenses.

  13. A MODEL OF ESTUARY RESPONSE TO NITROGEN LOADING AND FRESHWATER RESIDENCE TIME

    EPA Science Inventory

    We have developed a deterministic model that relates average annual nitrogen loading rate and water residence time in an estuary to in-estuary nitrogen concentrations and loss rates (e.g. denitrification and incorporation in sediments), and to rates of nitrogen export across the ...

  14. A latent class multiple constraint multiple discrete-continuous extreme value model of time use and goods consumption.

    DOT National Transportation Integrated Search

    2016-06-01

    This paper develops a microeconomic theory-based multiple discrete continuous choice model that considers: (a) that both goods consumption and time allocations (to work and non-work activities) enter separately as decision variables in the utility fu...

  15. Development of Co-Extrusion Technologies for Green Manufacture of Energetics

    DTIC Science & Technology

    2006-04-01

extrusion, Co-extruded, ETPE, TPE, Energetic thermoplastic elastomer, PDMS, Polydimethyl siloxane, Fast core propellant, Co-layered, Wall slip, Shear...first opportunity possible, the steady FEM models of SIT will need to be converted into time-dependent models to allow time-dependent calculations to be

  16. Physiologically Based Pharmacokinetic Model for Terbinafine in Rats and Humans

    PubMed Central

    Hosseini-Yeganeh, Mahboubeh; McLachlan, Andrew J.

    2002-01-01

    The aim of this study was to develop a physiologically based pharmacokinetic (PB-PK) model capable of describing and predicting terbinafine concentrations in plasma and tissues in rats and humans. A PB-PK model consisting of 12 tissue and 2 blood compartments was developed using concentration-time data for tissues from rats (n = 33) after intravenous bolus administration of terbinafine (6 mg/kg of body weight). It was assumed that all tissues except skin and testis tissues were well-stirred compartments with perfusion rate limitations. The uptake of terbinafine into skin and testis tissues was described by a PB-PK model which incorporates a membrane permeability rate limitation. The concentration-time data for terbinafine in human plasma and tissues were predicted by use of a scaled-up PB-PK model, which took oral absorption into consideration. The predictions obtained from the global PB-PK model for the concentration-time profile of terbinafine in human plasma and tissues were in close agreement with the observed concentration data for rats. The scaled-up PB-PK model provided an excellent prediction of published terbinafine concentration-time data obtained after the administration of single and multiple oral doses in humans. The estimated volume of distribution at steady state (Vss) obtained from the PB-PK model agreed with the reported value of 11 liters/kg. The apparent volume of distribution of terbinafine in skin and adipose tissues accounted for 41 and 52%, respectively, of the Vss for humans, indicating that uptake into and redistribution from these tissues dominate the pharmacokinetic profile of terbinafine. The PB-PK model developed in this study was capable of accurately predicting the plasma and tissue terbinafine concentrations in both rats and humans and provides insight into the physiological factors that determine terbinafine disposition. PMID:12069977

  17. Time-dependent oral absorption models

    NASA Technical Reports Server (NTRS)

    Higaki, K.; Yamashita, S.; Amidon, G. L.

    2001-01-01

The plasma concentration-time profiles following oral administration of drugs are often irregular and cannot be interpreted easily with conventional models based on first- or zero-order absorption kinetics and lag time. Six new models were developed using a time-dependent absorption rate coefficient, ka(t), wherein the time dependency was varied to account for dynamic processes in the gastrointestinal tract, such as changes over time in fluid absorption or secretion, in absorption surface area, and in motility. In the present study, the plasma concentration profiles of propranolol obtained in human subjects following oral dosing were analyzed using the newly derived models based on mass balance and compared with the conventional models. Nonlinear regression analysis indicated that the conventional compartment model including lag time (CLAG model) could not predict the rapid initial increase in plasma concentration after dosing, and the predicted Cmax values were much lower than those observed. On the other hand, all models with the time-dependent absorption rate coefficient ka(t) were superior to the CLAG model in predicting plasma concentration profiles. Based on Akaike's Information Criterion (AIC), the fluid absorption model without lag time (FA model) exhibited the best overall fit to the data. The two-phase model including lag time (TPLAG model) was also found to be a good model, judging from the sum-of-squares values. This model also described the irregular profiles of plasma concentration with time and frequently predicted Cmax values satisfactorily. A comparison of the absorption rate profiles also suggested that the TPLAG model is better at predicting irregular absorption kinetics than the FA model. In conclusion, the incorporation of a time-dependent absorption rate coefficient ka(t) allows the prediction of nonlinear absorption characteristics in a more reliable manner.
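
A minimal mass-balance sketch of a one-compartment oral model with a time-dependent ka(t). The decaying form of ka(t) and all parameter values below are illustrative assumptions, not one of the paper's six models.

```python
import math

def simulate_oral(ka, ke=0.3, V=40.0, dose=80.0, t_end=12.0, dt=0.001):
    """Explicit-Euler mass balance for one-compartment oral absorption with a
    time-dependent absorption rate coefficient ka(t); returns the peak
    plasma concentration (Cmax)."""
    a_gut, c = dose, 0.0   # amount at absorption site, plasma concentration
    t, peak = 0.0, 0.0
    while t < t_end:
        da = -ka(t) * a_gut              # dA_gut/dt = -ka(t) * A_gut
        dc = ka(t) * a_gut / V - ke * c  # dC/dt = ka(t)*A_gut/V - ke*C
        a_gut += dt * da
        c += dt * dc
        peak = max(peak, c)
        t += dt
    return peak

# Constant ka vs. a ka(t) that decays as fluid volume / surface area change:
cmax_const = simulate_oral(lambda t: 1.0)
cmax_decay = simulate_oral(lambda t: 2.0 * math.exp(-0.5 * t))
```

With a ka(t) that starts higher and decays, early absorption is faster, so the predicted Cmax exceeds the constant-ka case even though total absorption is similar.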

  18. Early adolescence behavior problems and timing of poverty during childhood: A comparison of lifecourse models.

    PubMed

    Mazza, Julia Rachel S E; Lambert, Jean; Zunzunegui, Maria Victoria; Tremblay, Richard E; Boivin, Michel; Côté, Sylvana M

    2017-03-01

Poverty is a well-established risk factor for the development of behavior problems, yet little is known about how the timing of exposure to childhood poverty relates to behavior problems in early adolescence. We examined the differential effects of the timing of poverty between birth and late childhood on behavior problems in early adolescence by fitting lifecourse models corresponding to sensitive periods, accumulation of risk, and social mobility. We used the Quebec Longitudinal Study of Child Development (N = 2120). Poverty was defined as living below the low-income thresholds defined by Statistics Canada and was grouped into three time periods: ages 0-3 years, 5-7 years, and 8-12 years. Main outcomes were teachers' reports of hyperactivity, opposition and physical aggression at age 13 years. Structured linear regression analyses were conducted to estimate the contribution of poverty during the three selected time periods to behavior problems. Partial F-tests were used to compare nested lifecourse models to a full saturated model (all poverty main effects and possible interactions). Families who experienced poverty at all time periods made up 9.3% of the original sample; those who were poor during at least one time period made up 39.2%. The accumulation of risk model was the best-fitting model for hyperactivity and opposition. The risk for physical aggression problems was associated only with poverty between ages 0 and 3 years, supporting the sensitive-period model. Early and prolonged exposure to childhood poverty predicted higher levels of behavior problems in early adolescence. Antipoverty policies targeting the first years of life and long-term support for pregnant women living in poverty are likely to reduce behavior problems in early adolescence. Copyright © 2017 Elsevier Ltd. All rights reserved.
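
The partial F-test used to compare a nested lifecourse model against the saturated model reduces to a ratio built from residual sums of squares. A minimal sketch with hypothetical RSS values and degrees of freedom (not the study's actual fit statistics):

```python
def partial_f(rss_reduced, rss_full, df_reduced, df_full):
    """Partial F statistic comparing a nested (reduced) model to the full
    saturated model; df_* are residual degrees of freedom.
    F = [(RSS_r - RSS_f) / (df_r - df_f)] / [RSS_f / df_f]."""
    num = (rss_reduced - rss_full) / (df_reduced - df_full)
    den = rss_full / df_full
    return num / den

# Hypothetical values: the reduced lifecourse model fits nearly as well as the
# saturated model, so F is small and the simpler model would be retained.
f_stat = partial_f(rss_reduced=105.0, rss_full=100.0, df_reduced=95, df_full=92)
```

In practice the statistic would be compared against an F distribution with (df_r - df_f, df_f) degrees of freedom to get a p-value.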

  19. Coordination control of flexible manufacturing systems

    NASA Astrophysics Data System (ADS)

    Menon, Satheesh R.

One of the first attempts was made to develop a model-driven system for coordination control of Flexible Manufacturing Systems (FMS). The structure and activities of the FMS are modeled using a colored Petri net based system. This approach has the advantage of being able to model the concurrency inherent in the system. It provides a method for encoding the system state, state transitions, and the feasible transitions at any given state. Further structural analysis (for detecting conflicting actions, deadlocks which might occur during operation, etc.) can be performed. The problem of implementing and testing the behavior of existing dynamic scheduling approaches in simulations of realistic situations is also addressed. A simulation architecture was proposed, and performance evaluation was carried out to establish the correctness of the model, the stability of the system from structural (deadlocks) and temporal (boundedness of backlogs) points of view, and to collect statistics for performance measures such as machine and robot utilizations, average wait times, and idle times of resources. A real-time implementation architecture for the coordination controller was also developed and implemented in a software-simulated environment. Given the current technology of FMS control, the model-driven colored Petri net based approach promises a very flexible control environment.
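
The token-firing semantics that let a Petri net encode the system state and its feasible transitions can be sketched in a few lines. The "load" transition and place names below are a hypothetical FMS fragment, not the dissertation's actual model.

```python
# Minimal Petri-net executor: a transition is enabled when every input place
# holds at least the tokens its arc requires; firing moves tokens atomically.
def enabled(marking, transition):
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Return the new marking after firing an enabled transition."""
    assert enabled(marking, transition)
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical FMS fragment: a queued part and an idle machine combine into a
# "machining" activity; concurrency appears as multiple enabled transitions.
load = {"in": {"part_queue": 1, "machine_idle": 1}, "out": {"machining": 1}}
m0 = {"part_queue": 2, "machine_idle": 1, "machining": 0}
m1 = fire(m0, load)
```

Deadlock detection then amounts to searching the reachable markings for states in which no transition is enabled.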

  20. Estimation of the base temperature and growth phase duration in terms of thermal time for four grapevine cultivars

    NASA Astrophysics Data System (ADS)

    Zapata, D.; Salazar, M.; Chaves, B.; Keller, M.; Hoogenboom, G.

    2015-12-01

Thermal time models have been used to predict the development of many different species, including grapevine (Vitis vinifera L.). These models normally assume a linear relationship between temperature and plant development. The goal of this study was to estimate the base temperature and the duration in terms of thermal time for predicting veraison for four grapevine cultivars. Historical phenological data for four cultivars collected in the Pacific Northwest were used to develop the thermal time model. Base temperatures (Tb) of 0 and 10 °C, as well as the best estimated Tb from three different methods, were evaluated for predicting veraison in grapevine. Thermal time requirements for each individual cultivar were evaluated through analysis of variance, and means were compared using Fisher's test. The methods applied to estimate Tb for the development of wine grapes included the least standard deviation in heat units, the regression coefficient, and the development rate method. The estimated Tb varied among methods and cultivars. The development rate method provided the lowest Tb values for all cultivars. For the three methods, Chardonnay had the lowest Tb, ranging from 8.7 to 10.7 °C, while the highest Tb values were obtained for Riesling and Cabernet Sauvignon, with 11.8 and 12.8 °C, respectively. Thermal time also differed among cultivars, whether the fixed or the estimated Tb was used. Predictions of the beginning of ripening with the estimated base temperature resulted in the lowest variation in real days when compared with predictions using Tb = 0 or 10 °C, regardless of the method used to estimate Tb.
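
Of the three Tb-estimation methods mentioned, the least-standard-deviation method is the easiest to sketch: choose the base temperature that minimises the year-to-year variability of accumulated heat units at the observed event. The temperature series below are hypothetical, constructed only to illustrate the selection.

```python
def degree_days(temps, t_base):
    """Accumulated thermal time (degree-days) above a base temperature."""
    return sum(max(t - t_base, 0.0) for t in temps)

def best_base_temp(seasons, candidates):
    """Least-standard-deviation method: pick the Tb that minimises the spread
    of accumulated heat units at the observed event across seasons."""
    def sd(tb):
        dd = [degree_days(s, tb) for s in seasons]
        mean = sum(dd) / len(dd)
        return (sum((x - mean) ** 2 for x in dd) / len(dd)) ** 0.5
    return min(candidates, key=sd)

# Three hypothetical seasons of daily mean temperatures up to veraison,
# built so the heat sums coincide exactly at Tb = 10 degC:
seasons = [[15.0] * 10, [20.0] * 5, [12.0] * 25]
tb_best = best_base_temp(seasons, candidates=[0, 5, 10, 15])
```
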

  1. Long-Term Trends and Variability in Spring Development of Calanus finmarchicus in the Southeastern Norwegian Sea during 1996-2012

    NASA Astrophysics Data System (ADS)

    Dupont, N.; Bagøien, E.; Melle, W.

    2016-02-01

Calanus finmarchicus is the dominant copepod species in the Norwegian Sea in terms of biomass, playing a key role in the ecosystem by transferring energy from primary producers to higher trophic levels. This study analyses the long-term trend of a 17-year time series (1996-2012) on the abundance of adult Calanus finmarchicus in the Atlantic water mass of the southern Norwegian Sea during spring. The long-term trend in spring abundance was assessed using Generalised Additive Models, while simultaneously accounting for both the general population development and the inter-annual variation in population development throughout the study period. In one model, we focus on inter-annual changes in the timing of the Calanus spring seasonal development by including Mean Stage Composition as a measure of the state of population development. Following a short increase during the years 1996 to 2000, the abundance of Calanus finmarchicus decreased strongly until about the year 2010. For the last two years of the studied period, 2011-2012, increasing population abundances are suggested, but with less certainty. The model results suggest that the analysis is capturing the G0 generation, displaying a peak for the adults in about mid-April. Inter-annual differences in spring seasonal development are suggested, with the peak of adults shifting earlier in the season as well as a shorter generation time. Considering the importance of Calanus finmarchicus as food for planktivorous predators in the Norwegian Sea, our time series analysis suggests relevant changes both in the spring abundance and in the timing of this food source. The next step is to relate variation in the Calanus time series to environmental factors, with special emphasis on climatic drivers.

  2. Prospective memory: A comparative perspective

    PubMed Central

    Crystal, Jonathon D.; Wilson, A. George

    2014-01-01

    Prospective memory consists of forming a representation of a future action, temporarily storing that representation in memory, and retrieving it at a future time point. Here we review the recent development of animal models of prospective memory. We review experiments using rats that focus on the development of time-based and event-based prospective memory. Next, we review a number of prospective-memory approaches that have been used with a variety of non-human primates. Finally, we review selected approaches from the human literature on prospective memory to identify targets for development of animal models of prospective memory. PMID:25101562

  3. Physiological time model of Scirpophaga incertulas (Lepidoptera: Pyralidae) in rice in Guandong Province, People's Republic of China.

    PubMed

    Stevenson, Douglass E; Feng, Ge; Zhang, Runjie; Harris, Marvin K

    2005-08-01

    Scirpophaga incertulas (Walker) (Lepidoptera: Pyralidae) is autochthonous and monophagous on rice, Oryza spp., which favors the development of a physiological time model using degree-days (degrees C) to establish a well defined window during which adults will be present in fields. Model development of S. incertulas adult flight phenology used climatic data and historical field observations of S. incertulas from 1962 through 1988. Analysis of variance was used to evaluate 5,203 prospective models with starting dates ranging from 1 January (day 1) to 30 April (day 121) and base temperatures ranging from -3 through 18.5 degrees C. From six candidate models, which shared the lowest standard deviation of prediction error, a model with a base temperature of 10 degrees C starting on 19 January was selected for validation. Validation with linear regression evaluated the differences between predicted and observed events and showed the model consistently predicted phenological events of 10 to 90% cumulative flight activity within a 3.5-d prediction interval regarded as acceptable for pest management decision making. The degree-day phenology model developed here is expected to find field application in Guandong Province. Expansion to other areas of rice production will require field validation. We expect the degree-day characterization of the activity period will remain essentially intact, but the start day may vary based on climate and geographic location. The development and validation of the phenology model of the S. incertulas by using procedures originally developed for pecan nut casebearer, Acrobasis nuxvorella Neunzig, shows the fungibility of this approach to developing prediction models for other insects.
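
A degree-day phenology model of this kind predicts an event by accumulating heat units above the base temperature from the start date onward. Below is a minimal sketch using the paper's 10 °C base and 19 January (day 19) start; the linear warming season and the 100 degree-day target are illustrative assumptions, not the fitted model.

```python
def predict_event_day(daily_means, start_day, t_base, dd_target):
    """Predict the day-of-year on which degree-days accumulated above t_base,
    starting on start_day, first reach dd_target; None if never reached."""
    total = 0.0
    for day, temp in enumerate(daily_means, start=1):
        if day < start_day:
            continue
        total += max(temp - t_base, 0.0)
        if total >= dd_target:
            return day
    return None

# Hypothetical season: mean temperature rises 0.1 degC/day from 8 degC on day 1.
season = [8.0 + 0.1 * d for d in range(365)]
day = predict_event_day(season, start_day=19, t_base=10.0, dd_target=100.0)
```
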

  4. A Distributed Web-based Solution for Ionospheric Model Real-time Management, Monitoring, and Short-term Prediction

    NASA Astrophysics Data System (ADS)

    Kulchitsky, A.; Maurits, S.; Watkins, B.

    2006-12-01

With the widespread availability of the Internet today, many people can monitor various scientific research activities. It is important to accommodate this interest by providing on-line access to dynamic and illustrative Web resources that demonstrate different aspects of ongoing research. It is especially important to explain these research activities to high school and undergraduate students, thereby providing more information for making decisions concerning their future studies. Such Web resources are also important for clarifying scientific research for the general public, in order to achieve better awareness of research progress in various fields. Particularly rewarding is the dissemination of information about ongoing projects within universities and research centers to their local communities. The benefits of this type of scientific outreach are mutual, since the development of Web-based automatic systems is a prerequisite for many research projects targeting real-time monitoring and/or modeling of natural conditions. Continuous operation of such systems also provides ongoing research opportunities for statistically massive validation of the models. We have developed a Web-based system to run the University of Alaska Fairbanks Polar Ionospheric Model in real time. This model makes use of networking and computational resources at the Arctic Region Supercomputing Center. The system was designed to be portable among various operating systems and computational resources. Its components can be installed across different computers, separating Web servers and computational engines. The core of the system is a Real-Time Management module (RMM) written in Python, which facilitates the interaction of remote input data transfers, the ionospheric model runs, MySQL database filling, and PHP scripts for the Web-page preparations. The RMM downloads current geophysical inputs as soon as they become available at different on-line repositories.
This information is processed to provide inputs for the next ionospheric model time step and is then stored in a MySQL database as the first part of the time-specific record. The RMM then synchronizes the input times with the current model time, prepares a decision on initialization of the next model time step, and monitors its execution. As soon as the model completes computations for the next time step, the RMM visualizes the current model output into various short-term (about 1-2 hours) forecasting products and compares prior results with available ionospheric measurements. The RMM places the prepared images into the MySQL database, which can be located on a different computer node, and then proceeds to the next time interval, continuing the time loop. The upper-level interface of this real-time system is a PHP-based Web site (http://www.arsc.edu/SpaceWeather/new). This site provides general information about the Earth's polar and adjacent mid-latitude ionosphere, allows monitoring of current developments and short-term forecasts, and facilitates access to the comparisons archive stored in the database.
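
The RMM's cycle of fetching inputs, advancing the model one step, and archiving the result is essentially a polling loop. A skeleton sketch, with stand-in callables in place of the actual download, model, and database code (all names here are hypothetical, not the RMM's API):

```python
import time

def run_rmm(fetch_inputs, run_model_step, store_record, steps, poll=0.0):
    """Skeleton real-time management loop: fetch the latest geophysical
    inputs, advance the model one time step, archive the result, repeat.
    The three callables stand in for the data download, the ionospheric
    model run, and the database insert."""
    state = None
    for step in range(steps):
        inputs = fetch_inputs(step)
        state = run_model_step(state, inputs)
        store_record(step, state)
        time.sleep(poll)  # in production: wait until the next input interval
    return state

# Toy stand-ins: inputs are a fake ap index; the "model" just accumulates it.
log = []
final = run_rmm(
    fetch_inputs=lambda s: {"ap": 4 + s},
    run_model_step=lambda st, inp: (st or 0) + inp["ap"],
    store_record=lambda s, st: log.append((s, st)),
    steps=3,
)
```
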

  5. Development of an accident duration prediction model on the Korean Freeway Systems.

    PubMed

    Chung, Younshik

    2010-01-01

Since duration prediction is one of the most important steps in an accident management process, several approaches have been developed for modeling accident duration. This paper presents a model for accident duration prediction based on a large, accurately recorded accident dataset from the Korean Freeway Systems. To develop the duration prediction model, this study utilizes the log-logistic accelerated failure time (AFT) metric model and a 2-year accident duration dataset from 2006 to 2007. Specifically, the 2006 dataset was utilized to develop the prediction model, and the 2007 dataset was then employed to test the temporal transferability of the 2006 model. Although the duration prediction model has limitations, such as large prediction errors due to individual differences among accident treatment teams in clearing similar accidents, the results from the 2006 model yielded reasonable predictions based on the mean absolute percentage error (MAPE) scale. Additionally, the results of the statistical test for temporal transferability indicated that the estimated parameters in the duration prediction model are stable over time. This temporal stability suggests that the model may have the potential to be used as a basis for making rational diversion and dispatching decisions in the event of an accident. Ultimately, such information will help mitigate traffic congestion due to accidents.
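
The MAPE scale used to judge the 2006 model is straightforward to compute; the accident duration values below are hypothetical, not the paper's data.

```python
def mape(observed, predicted):
    """Mean absolute percentage error: the average of |obs - pred| / obs,
    expressed as a percentage. Observed values must be nonzero."""
    return 100.0 / len(observed) * sum(
        abs(o - p) / o for o, p in zip(observed, predicted))

# Hypothetical accident durations (minutes): observed vs. model predictions,
# each prediction off by 10%.
score = mape([30.0, 60.0, 120.0], [33.0, 54.0, 132.0])
```
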

  6. Computationally efficient and flexible modular modelling approach for river and urban drainage systems based on surrogate conceptual models

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Willems, Patrick

    2015-04-01

Water managers rely increasingly on mathematical simulation models that represent individual parts of the water system, such as the river, sewer system or waste water treatment plant. The current evolution towards integral water management requires the integration of these distinct components, leading to an increased model scale and scope. Besides this growing model complexity, certain applications have gained interest and importance, such as uncertainty and sensitivity analyses, auto-calibration of models and real-time control. All these applications share the need for models with a very limited calculation time, either for performing a large number of simulations, or for a long-term simulation followed by a statistical post-processing of the results. The commonly applied detailed models that solve (part of) the de Saint-Venant equations are infeasible for these applications and for such integrated modelling for several reasons, chief among them overly long simulation times and the inability to couple submodels made in different software environments. Instead, practitioners must use simplified models for these purposes. These models are characterized by empirical relationships and sacrifice model detail and accuracy for increased computational efficiency. The presented research discusses the development of a flexible integral modelling platform that complies with the following three key requirements: (1) include a modelling approach for water quantity predictions for rivers, floodplains, sewer systems and rainfall runoff routing that requires a minimal calculation time; (2) offer fast and semi-automatic model configuration, thereby making maximum use of data from existing detailed models and measurements; (3) have a calculation scheme based on open source code to allow for future extensions or coupling with other models. First, a novel and flexible modular modelling approach based on the storage cell concept was developed.
This approach divides each subcomponent, such as the river, sewer or floodplain, into an arrangement of interconnected cells, thereby lumping processes in space and time. Depending on the behaviour of the system that needs to be emulated and the desired level of accuracy, variables of interest can be predicted by adopting and calibrating one of the predefined model structures, such as weir equations, transfer functions (Wolfs et al., 2013) and self-learning structures including neural networks, model trees and fuzzy systems (Wolfs and Willems, 2013, 2014). Next, a software tool was developed to facilitate and speed up model configuration. A close integration is foreseen with the MIKE (DHI) and InfoWorks (Innovyze) software. The created software tool also automatically sets up the calculation scheme in the C programming language. The developed modelling approach and software were tested extensively on multiple case studies, including uncertainty flood mapping along a river, real-time control of hydraulic structures to prevent flooding, and the quantification of the effect of retention basins on floods in a coupled sewer-river system (De Vleeschauwer et al., 2014). Research is currently being done on the extension of the modelling approach and accompanying software tool with physicochemical water quality modules. Acknowledgments This research was supported by the Agency for Innovation by Science and Technology in Flanders (IWT). The authors would like to thank DHI and Innovyze for the MIKE and InfoWorks licenses. References • De Vleeschauwer, K., Weustenraad, J., Nolf, C., Wolfs, V., De Meulder, B., Shannon, K., Willems, P. (2014). Green - blue water in the city: quantification of impact of source control versus end-of-pipe solutions on sewer and river floods. Water Science and Technology, 70 (11), 1825-1837. • Wolfs, V., Villazon Gomez, M., Willems, P. (2013).
Development of a semi-automated model identification and calibration tool for conceptual modelling of sewer systems. Water Science and Technology, 68 (1), 167-175. • Wolfs, V., Willems, P. (2013). A data driven approach using Takagi-Sugeno models for computationally efficient lumped floodplain modeling. Journal of Hydrology, 503, 222-232. • Wolfs, V., Willems, P. (2014). Development of discharge-stage curves affected by hysteresis using time varying models, model tree and neural networks. Environmental Modelling & Software, 55, 107-119.
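
The storage cell concept lumps a river reach or sewer stretch into interconnected reservoirs. A minimal sketch using a cascade of linear storage cells (dS/dt = I - O with outflow O = S/k); the parameter values are illustrative, not one of the paper's calibrated model structures.

```python
def route(inflows, k, n_cells, dt=1.0):
    """Route an inflow series through n_cells linear storage cells in series
    (explicit Euler: S += dt * (I - S/k), outflow O = S/k).
    Returns the outflow series of the last cell."""
    storages = [0.0] * n_cells
    out = []
    for q_in in inflows:
        q = q_in
        for i in range(n_cells):
            storages[i] += dt * (q - storages[i] / k)
            q = storages[i] / k   # this cell's outflow feeds the next cell
        out.append(q)
    return out

# A square inflow pulse is attenuated and delayed by the cascade:
hydrograph = route([10.0] * 5 + [0.0] * 20, k=3.0, n_cells=3)
```

Each cell evaluates in constant time per step, which is what makes such surrogate structures orders of magnitude faster than solving the de Saint-Venant equations.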

  7. A multi-scale model for geared transmission aero-thermodynamics

    NASA Astrophysics Data System (ADS)

    McIntyre, Sean M.

A multi-scale, multi-physics computational tool for the simulation of high-performance gearbox aero-thermodynamics was developed and applied to equilibrium and pathological loss-of-lubrication performance simulation. The physical processes at play in these systems include multiphase compressible flow of the air and lubricant within the gearbox, meshing kinematics and tribology, as well as heat transfer by conduction, and free and forced convection. These physics are coupled across their representative space and time scales in the computational framework developed in this dissertation. These scales span eight orders of magnitude, from the thermal response of the full gearbox O(10^0 m; 10^2 s), through effects at the tooth passage time scale O(10^-2 m; 10^-4 s), down to tribological effects on the meshing gear teeth O(10^-6 m; 10^-6 s). Direct numerical simulation of these coupled physics and scales is intractable. Accordingly, a scale-segregated simulation strategy was developed by partitioning and treating the contributing physical mechanisms as sub-problems, each with associated space and time scales, and appropriate coupling mechanisms. These are: (1) the long time scale thermal response of the system, (2) the multiphase (air, droplets, and film) aerodynamic flow and convective heat transfer within the gearbox, (3) the high-frequency, time-periodic thermal effects of gear tooth heating while in mesh and its subsequent cooling through the rest of rotation, (4) meshing effects including tribology and contact mechanics. The overarching goal of this dissertation was to develop software and analysis procedures for gearbox loss-of-lubrication performance. To accommodate these four physical effects and their coupling, each is treated in the CFD code as a sub-problem. These physics modules are coupled algorithmically.
Specifically, the high-frequency conduction analysis derives its local heat transfer coefficient and near-wall air temperature boundary conditions from a quasi-steady cyclic-symmetric simulation of the internal flow. This high-frequency conduction solution is coupled directly with a model for the meshing friction, developed by a collaborator, which was adapted for use in a finite-volume CFD code. The local surface heat flux on solid surfaces is calculated by time-averaging the heat flux in the high-frequency analysis. This serves as a fixed-flux boundary condition in the long time scale conduction module. The temperature distribution from this long time scale heat transfer calculation serves as a boundary condition for the internal convection simulation, and as the initial condition for the high-frequency heat transfer module. Using this multi-scale model, simulations were performed for equilibrium and loss-of-lubrication operation of the NASA Glenn Research Center test stand. Results were compared with experimental measurements. In addition to the multi-scale model itself, several other specific contributions were made. Eulerian models for droplets and wall-films were developed and implemented in the CFD code. A novel approach to retaining liquid film on the solid surfaces, and strategies for its mass exchange with droplets, were developed and verified. Models for interfacial transfer between droplets and wall-film were implemented, and include the effects of droplet deposition, splashing, bouncing, as well as film breakup. These models were validated against airfoil data. To mitigate the observed slow convergence of CFD simulations of the enclosed aerodynamic flows within gearboxes, Fourier stability analysis was applied to the SIMPLE-C fractional-step algorithm. From this, recommendations to accelerate the convergence rate through enhanced pressure-velocity coupling were made. These were shown to be effective.
A fast-running finite-volume reduced-order model of the gearbox aero-thermodynamics was developed, and coupled with the tribology model to investigate the sensitivity of loss-of-lubrication predictions to various model and physical parameters. This sensitivity study was instrumental in guiding efforts toward improving the accuracy of the multi-scale model without undue increase in computational cost. In addition, the reduced-order model is now used extensively by a collaborator in tribology model development and testing. Experimental measurements of high-speed gear windage in partially and fully-shrouded configurations were performed to supplement the paucity of available validation data. This measurement program provided measurements of windage loss for a gear of design-relevant size and operating speed, as well as guidance for increasing the accuracy of future measurements.
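
The coupling step that turns the high-frequency solution into a fixed-flux boundary condition is a time average over one tooth-passage cycle. A minimal sketch, with a hypothetical sinusoidal meshing flux in place of the actual CFD-derived signal:

```python
import math

def cycle_averaged_flux(flux, period, n_samples=1000):
    """Left-Riemann time average of a periodic surface heat flux over one
    cycle; the mean becomes the fixed-flux boundary condition for the
    long-time-scale conduction module."""
    dt = period / n_samples
    return sum(flux(i * dt) for i in range(n_samples)) * dt / period

# Hypothetical tooth-passage flux (W/m^2): a mean load plus a meshing pulse
# at the O(10^-4 s) tooth-passage period.
q_mean = cycle_averaged_flux(
    lambda t: 5000.0 + 2000.0 * math.sin(2 * math.pi * t / 1e-4),
    period=1e-4)
```

The oscillatory part averages out over a full cycle, leaving only the mean load to drive the slow thermal response.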

  8. Nutritional Systems Biology Modeling: From Molecular Mechanisms to Physiology

    PubMed Central

    de Graaf, Albert A.; Freidig, Andreas P.; De Roos, Baukje; Jamshidi, Neema; Heinemann, Matthias; Rullmann, Johan A.C.; Hall, Kevin D.; Adiels, Martin; van Ommen, Ben

    2009-01-01

    The use of computational modeling and simulation has increased in many biological fields, but despite their potential these techniques are only marginally applied in nutritional sciences. Nevertheless, recent applications of modeling have been instrumental in answering important nutritional questions from the cellular up to the physiological levels. Capturing the complexity of today's important nutritional research questions poses a challenge for modeling to become truly integrative in the consideration and interpretation of experimental data at widely differing scales of space and time. In this review, we discuss a selection of available modeling approaches and applications relevant for nutrition. We then put these models into perspective by categorizing them according to their space and time domain. Through this categorization process, we identified a dearth of models that consider processes occurring between the microscopic and macroscopic scale. We propose a “middle-out” strategy to develop the required full-scale, multilevel computational models. Exhaustive and accurate phenotyping, the use of the virtual patient concept, and the development of biomarkers from “-omics” signatures are identified as key elements of a successful systems biology modeling approach in nutrition research—one that integrates physiological mechanisms and data at multiple space and time scales. PMID:19956660

  9. Three-dimensional time domain model of lightning including corona effects

    NASA Technical Reports Server (NTRS)

    Podgorski, Andrew S.

    1991-01-01

    A new 3-D lightning model that incorporates the effect of corona is described for the first time. The new model is based on a Thin Wire Time Domain Lightning (TWTDL) Code developed previously. The TWTDL Code was verified during the 1985 and 1986 lightning seasons by the measurements conducted at the 553 m CN Tower in Toronto, Ontario. The inclusion of corona in the TWTDL code allowed study of the corona effects on the lightning current parameters and the associated electric field parameters.

  10. Model building techniques for analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald

    2009-09-01

The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased for analysis to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques, which come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  11. Weight and the Future of Space Flight Hardware Cost Modeling

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.

    2003-01-01

Weight has been used as the primary input variable for cost estimating almost as long as there have been parametric cost models. While there are good reasons for using weight, serious limitations exist. These limitations have been addressed by multi-variable equations and trend analysis in models such as NAFCOM, PRICE, and SEER; however, these models have not been able to address the significant time lags that can occur between the development of similar space flight hardware systems. These time lags make the cost analyst's job difficult because insufficient data exist to perform trend analysis, and the current set of parametric models is not well suited to accommodating process improvements in space flight hardware design, development, build and test. As a result, people of good faith can have serious disagreements over the cost of new systems. To address these shortcomings, new cost modeling approaches are needed. The most promising approach is process-based (sometimes called activity-based) costing. Developing process-based models will require a detailed understanding of the functions required to produce space flight hardware, combined with innovative approaches to estimating the necessary resources. Particularly challenging will be the lack of data at the process level. One method for developing a model is to combine notional algorithms with a discrete event simulation and model changes to the total cost as perturbations to the program are introduced. Despite these challenges, the potential benefits are such that efforts should be focused on developing process-based cost models.
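
Process-based (activity) costing, as proposed here, sums per-process costs and lets program changes enter as perturbations on individual processes rather than on a single weight-driven equation. A toy sketch with hypothetical process names and cost figures:

```python
def program_cost(processes, perturbations=None):
    """Process-based costing sketch: total cost is the sum of per-process
    costs, with program changes applied as multiplicative perturbations
    on individual processes."""
    perturbations = perturbations or {}
    return sum(cost * perturbations.get(name, 1.0)
               for name, cost in processes.items())

# Hypothetical process costs (in $M) for a flight-hardware development effort:
baseline = {"design": 40.0, "build": 120.0, "test": 60.0}
nominal = program_cost(baseline)
stretched = program_cost(baseline, {"test": 1.25})  # a 25% test-phase overrun
```

A discrete event simulation would replace the fixed perturbation factors with ones sampled from the simulated program schedule.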

  12. Empirical Storm-Time Correction to the International Reference Ionosphere Model E-Region Electron and Ion Density Parameterizations Using Observations from TIMED/SABER

    NASA Technical Reports Server (NTRS)

    Mertens, Christopher J.; Winick, Jeremy R.; Russell, James M., III; Mlynczak, Martin G.; Evans, David S.; Bilitza, Dieter; Xu, Xiaojing

    2007-01-01

    The response of the ionospheric E-region to solar-geomagnetic storms can be characterized using observations of infrared 4.3 micrometer emission. In particular, we utilize nighttime TIMED/SABER measurements of broadband 4.3 micrometer limb emission and derive a new data product, the NO+(v) volume emission rate, which is our primary observation-based quantity for developing an empirical storm-time correction to the IRI E-region electron density. In this paper we describe our E-region proxy and outline our strategy for developing the empirical storm model. In our initial studies, we analyzed a six-day storm period during the Halloween 2003 event. The results of this analysis are promising and suggest that the ap-index is a viable candidate to use as a magnetic driver for our model.

  13. The application of connectionism to query planning/scheduling in intelligent user interfaces

    NASA Technical Reports Server (NTRS)

    Short, Nicholas, Jr.; Shastri, Lokendra

    1990-01-01

    In the mid-nineties, the Earth Observing System (EOS) will generate an estimated 10 terabytes of data per day. This enormous amount of data will require the use of sophisticated technologies from real-time distributed Artificial Intelligence (AI) and data management. Setting aside the broader problems of distributed AI, efficient models were developed for query planning and/or scheduling in intelligent user interfaces that reside in a network environment. Before intelligent query planning can be done, a model for real-time AI planning and/or scheduling must be developed. As Connectionist Models (CM) have shown promise in improving run times, a connectionist approach to AI planning and/or scheduling is proposed. The solution involves merging a CM rule-based system with a general spreading activation model for the generation and selection of plans. The system was implemented in the Rochester Connectionist Simulator and runs on a Sun 3/260.

  14. Analysing child mortality in Nigeria with geoadditive discrete-time survival models.

    PubMed

    Adebayo, Samson B; Fahrmeir, Ludwig

    2005-03-15

    Child mortality reflects a country's level of socio-economic development and quality of life. In developing countries, mortality rates are not only influenced by socio-economic, demographic and health variables but they also vary considerably across regions and districts. In this paper, we analysed child mortality in Nigeria with flexible geoadditive discrete-time survival models. This class of models allows us to measure small-area district-specific spatial effects simultaneously with possibly non-linear or time-varying effects of other factors. Inference is fully Bayesian and uses computationally efficient Markov chain Monte Carlo (MCMC) simulation techniques. The application is based on the 1999 Nigeria Demographic and Health Survey. Our method assesses effects at a high level of temporal and spatial resolution not available with traditional parametric models, and the results provide some evidence on how to reduce child mortality by improving socio-economic and public health conditions. Copyright (c) 2004 John Wiley & Sons, Ltd.
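    The discrete-time survival setup underlying geoadditive models like the one above can be illustrated with a simple life-table calculation: each child contributes one record per month survived, and the discrete hazard in month t is the fraction of children still at risk who die in t. The data below are synthetic; a full geoadditive model would add smooth covariate and district-level spatial effects on top of this.

```python
import numpy as np

# Discrete-time survival sketch: estimate the per-interval hazard from
# (months survived, died_flag) records. Synthetic data, for illustration only.
children = [(2, 1), (5, 0), (3, 1), (5, 0), (1, 1), (5, 0)]

def discrete_hazard(records, horizon):
    """Hazard h[t] = deaths in month t+1 / children at risk in month t+1."""
    at_risk = np.zeros(horizon)
    events = np.zeros(horizon)
    for t_obs, died in records:
        for t in range(min(t_obs, horizon)):
            at_risk[t] += 1          # child is at risk through month t_obs
        if died and t_obs <= horizon:
            events[t_obs - 1] += 1   # death occurs in the last risk month
    return events / np.maximum(at_risk, 1)

h = discrete_hazard(children, horizon=5)  # h[0] = 1/6, h[1] = 1/5, h[2] = 1/4
```

    In a regression formulation, each child-month record gets a binary outcome and the hazard is modelled with a logistic (or geoadditive) predictor.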

  15. The analysis of the possibility of using 10-minute rainfall series to determine the maximum rainfall amount with 5 minutes duration

    NASA Astrophysics Data System (ADS)

    Kaźmierczak, Bartosz; Wartalska, Katarzyna; Wdowikowski, Marcin; Kotowski, Andrzej

    2017-11-01

    Modern scientific research on heavy rainfall analysis for sewerage design indicates the need to develop and use probabilistic rain models. One of the issues that remains to be resolved is the shortest rainfall duration to be analyzed. It is commonly believed that the best duration is 5 minutes, while the shortest duration measured by national services is often 10 or even 15 minutes. The main aim of this paper is to present the difference between the results of probabilistic rainfall models derived from rainfall time series that include and exclude the 5-minute rainfall duration. Analyses were made for the long period 1961-2010 at the Polish meteorological station in Legnica. To develop the probabilistic model best fitted to the measured rainfall data, four probability distributions were used. The results clearly indicate that models including the 5-minute rainfall duration are more appropriate to use.
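    The resolution issue the paper examines can be made concrete with a moving-sum extraction: the maximum rainfall depth for a given duration is the largest moving sum over the gauge series, and a 10-minute series simply cannot resolve the 5-minute maximum. The rainfall values below are synthetic.

```python
import numpy as np

# Sketch: maximum rainfall depth for a duration = largest moving sum over the
# series. A 10-min gauge aggregates pairs of 5-min intervals, losing the
# short-duration peak. Synthetic data in mm per 5-minute interval.
rain_5min = np.array([0.0, 0.2, 3.1, 0.4, 0.0, 1.0, 0.1, 0.0])

def max_depth(series, steps):
    """Largest moving sum over a window of `steps` intervals."""
    window = np.convolve(series, np.ones(steps), mode="valid")
    return window.max()

max_5 = max_depth(rain_5min, 1)                    # 5-min maximum: 3.1 mm
rain_10min = rain_5min.reshape(-1, 2).sum(axis=1)  # what a 10-min gauge records
max_10 = max_depth(rain_10min, 1)                  # 10-min maximum: 3.5 mm
```

    Note that halving the 10-minute maximum (1.75 mm) badly underestimates the true 5-minute peak (3.1 mm), which is why series excluding the 5-minute duration give different model results.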

  16. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects of the processing parameters pressure, temperature, and time on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism by which the plies bond to themselves. Theoretical predictions from the Reptation Theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.

  17. Spatiotemporal stochastic models for earth science and engineering applications

    NASA Astrophysics Data System (ADS)

    Luo, Xiaochun

    1998-12-01

    Spatiotemporal processes occur in many areas of earth sciences and engineering. However, most of the available theoretical tools and techniques of space-time data processing have been designed to operate exclusively in time or in space, and the importance of spatiotemporal variability was not fully appreciated until recently. To address this problem, a systematic framework of spatiotemporal random field (S/TRF) models for geoscience/engineering applications is presented and developed in this thesis. The space-time continuity characterization is one of the most important aspects of S/TRF modelling, where the space-time continuity is displayed with experimental spatiotemporal variograms, summarized in terms of space-time continuity hypotheses, and modelled using spatiotemporal variogram functions. Permissible spatiotemporal covariance/variogram models are addressed through permissibility criteria appropriate to spatiotemporal processes. The estimation of spatiotemporal processes is developed in terms of spatiotemporal kriging techniques. Particular emphasis is given to the singularity analysis of spatiotemporal kriging systems. The impacts of covariance functions, trend forms, and data configurations on the singularity of spatiotemporal kriging systems are discussed. In addition, the tensorial invariance of universal spatiotemporal kriging systems is investigated in terms of the space-time trend. The conditional simulation of spatiotemporal processes is developed through sequential group Gaussian simulation (SGGS), a series of sequential simulation algorithms associated with different group sizes. The simulation error is analyzed with different covariance models and simulation grids. A simulated annealing technique honoring experimental variograms is also proposed, providing a way of conditional simulation without the covariance model fitting that is a prerequisite for most simulation algorithms. 
The proposed techniques were first applied for modelling of the pressure system in a carbonate reservoir, and then applied for modelling of springwater contents in the Dyle watershed. The results of these case studies as well as the theory suggest that these techniques are realistic and feasible.
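    The experimental spatiotemporal variogram at the heart of the continuity characterization can be sketched directly: gamma(h, tau) is half the mean squared difference between observations separated by spatial lag h and temporal lag tau. The gridded data below are synthetic white noise, for which gamma should sit near the variance of the field.

```python
import numpy as np

# Empirical spatiotemporal variogram on a regular space-time grid.
# Synthetic iid data; for real data gamma(h, tau) grows with both lags.
rng = np.random.default_rng(0)
nx, nt = 20, 30
z = rng.normal(size=(nx, nt))  # observations z(x_i, t_j)

def st_variogram(z, h, tau):
    """gamma(h, tau) = 0.5 * mean squared difference at the given lags."""
    d = z[h:, tau:] - z[:z.shape[0] - h or None, :z.shape[1] - tau or None]
    return 0.5 * np.mean(d ** 2)

gamma = st_variogram(z, h=1, tau=2)  # near 1.0 for unit-variance white noise
```

    Fitting a permissible variogram function to such empirical values is the step that precedes kriging and conditional simulation.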

  18. Placing Health Trajectories in Family and Historical Context: A Proposed Enrichment of the Life Course Health and Development Model.

    PubMed

    Jones, Marian Moser; Roy, Kevin

    2017-10-01

    Purpose This article offers constructive commentary on The Life Course Health and Development Model (LCHD) as an organizing framework for MCH research. Description The LCHD has recently been proposed as an organizing framework for MCH research. This model integrates biomedical, biopsychosocial, and life course frameworks, to explain how "individual health trajectories" develop over time. In this article, we propose that the LCHD can improve its relevance to MCH policy and practice by: (1) placing individual health trajectories within the context of family health trajectories, which unfold within communities and societies, over historical and generational time; and (2) placing greater weight on the social determinants that shape health development trajectories of individuals and families to produce greater or lesser health equity. Assessment We argue that emphasizing these nested, historically specific social contexts in life course models will enrich study design and data analysis for future developmental science research, will make the LCHD model more relevant in shaping MCH policy and interventions, and will guard against its application as a deterministic framework. Specific ways to measure these and examples of how they can be integrated into the LCHD model are articulated. Conclusion Research applying the LCHD should incorporate the specific family and socio-historical contexts in which development occurs to serve as a useful basis for policy and interventions. Future longitudinal studies of maternal and child health should include collection of time-dependent data related to family environment and other social determinants of health, and analyze the impact of historical events and trends on specific cohorts.

  19. The paradigm shift to an "open" model in drug development.

    PubMed

    Au, Regina

    2014-12-01

    The rising cost of healthcare, the rising cost of drug development, the patent cliff for Big Pharma, shorter patent protection, decreased reimbursement, and the recession have made it more difficult for the pharmaceutical and biotechnology industry to develop drugs. Due to the unsustainable amount of time and money required to develop a drug with a significant return on investment (ROI), it has become hard to sustain a robust pipeline. The industry is transforming its business model to meet these challenges. In essence, a paradigm shift is occurring; the old "closed" model is giving way to a new "open" business model.

  20. Predicting Time to Hospital Discharge for Extremely Preterm Infants

    PubMed Central

    Hintz, Susan R.; Bann, Carla M.; Ambalavanan, Namasivayam; Cotten, C. Michael; Das, Abhik; Higgins, Rosemary D.

    2010-01-01

    As extremely preterm infant mortality rates have decreased, concerns regarding resource utilization have intensified. Accurate models to predict time to hospital discharge could aid in resource planning, family counseling, and perhaps stimulate quality improvement initiatives. Objectives For infants <27 weeks estimated gestational age (EGA), to develop, validate and compare several models to predict time to hospital discharge based on time-dependent covariates, and based on the presence of 5 key risk factors as predictors. Patients and Methods This was a retrospective analysis of infants <27 weeks EGA, born 7/2002-12/2005 and surviving to discharge from a NICHD Neonatal Research Network site. Time to discharge was modeled as continuous (postmenstrual age at discharge, PMAD), and categorical variables (“Early” and “Late” discharge). Three linear and logistic regression models with time-dependent covariate inclusion were developed (perinatal factors only, perinatal+early neonatal factors, perinatal+early+later factors). Models for Early and Late discharge using the cumulative presence of 5 key risk factors as predictors were also evaluated. Predictive capabilities were compared using the coefficient of determination (R2) for linear models, and the AUC of the ROC curve for logistic models. Results Data from 2254 infants were included. Prediction of PMAD was poor, with only 38% of variation explained by linear models. However, models incorporating later clinical characteristics were more accurate in predicting “Early” or “Late” discharge (full models: AUC 0.76-0.83 vs. perinatal factor models: AUC 0.56-0.69). In simplified key risk factor models, predicted probabilities for Early and Late discharge compared favorably with observed rates. Furthermore, the AUCs (0.75-0.77) were similar to those of models including the full factor set. 
Conclusions Prediction of Early or Late discharge is poor if only perinatal factors are considered, but improves substantially with knowledge of later-occurring morbidities. Prediction using a few key risk factors is comparable to full models, and may offer a clinically applicable strategy. PMID:20008430
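    The AUC metric used to compare these discharge models has a simple rank interpretation: it equals the probability that a randomly chosen positive case receives a higher predicted risk than a randomly chosen negative case (the Mann-Whitney formulation). A minimal sketch on synthetic labels and scores:

```python
import numpy as np

def auc(labels, scores):
    """AUC via pairwise comparison of positive vs. negative scores."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum()   # positive outranks negative
    ties = (pos[:, None] == neg[None, :]).sum()  # ties count as half
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

labels = [1, 1, 0, 0, 1, 0]            # synthetic outcomes
scores = [0.9, 0.5, 0.6, 0.2, 0.8, 0.4]  # synthetic predicted risks
a = auc(labels, scores)                # 8/9, i.e. ~0.89
```

    An AUC of 0.5 corresponds to chance-level discrimination, which is why the perinatal-only models (AUC 0.56-0.69) are described as poor relative to the full models (0.76-0.83).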

  1. Anomalous diffusion for bed load transport with a physically-based model

    NASA Astrophysics Data System (ADS)

    Fan, N.; Singh, A.; Foufoula-Georgiou, E.; Wu, B.

    2013-12-01

    Diffusion of bed load particles shows both normal and anomalous behavior at different spatial-temporal scales. Understanding and quantifying these different types of diffusion is important not only for the development of theoretical models of particle transport but also for practical purposes, e.g., river management. Here we extend a recently proposed physically-based model of particle transport by Fan et al. [2013] to develop an Episodic Langevin equation (ELE) for individual particle motion that reproduces the episodic movement (start and stop) of sediment particles. Using the proposed ELE we simulate particle movements for a large number of uniform-size particles, incorporating different probability distribution functions (PDFs) of particle waiting time. For exponential PDFs of waiting times, particles reveal ballistic motion at short time scales and turn to normal diffusion at long time scales. The PDF of simulated particle travel distances also changes shape from exponential to Gamma to Gaussian with increasing timescale, implying different diffusion scaling regimes. For power-law PDFs (with exponent μ) of waiting times, the asymptotic behavior of particles at long time scales reveals both super-diffusion and sub-diffusion; however, only very heavy-tailed waiting times (i.e., 1.0 < μ < 1.5) result in sub-diffusion. We suggest that the contrast between our results and those of previous studies (e.g., studies based on fractional advection-diffusion models with thin/heavy-tailed particle hops and waiting times) could be due to the assumption in those studies that hops are achieved instantaneously; in reality, particles achieve their hops within finite times (as we simulate here), even if the hop times are much shorter than the waiting times. 
In summary, this study stresses the need to rethink alternatives to previous models, such as fractional advection-diffusion equations, for studying the anomalous diffusion of bed load particles. The implications of these results for modeling sediment transport are discussed.
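    The episodic start-and-stop motion described above can be sketched with a toy simulation: each particle alternates exponential waiting periods with hops completed over a finite hop time rather than instantaneously. All parameters are illustrative and not taken from the paper's ELE.

```python
import numpy as np

# Toy episodic particle motion: exponential rests, finite-duration hops.
# Parameters (mean wait, hop time, hop speed) are illustrative only.
rng = np.random.default_rng(42)

def displacement(t_end, mean_wait=1.0, hop_time=0.2, hop_speed=1.0):
    """Streamwise displacement of one particle up to time t_end."""
    t, x = 0.0, 0.0
    while t < t_end:
        t += rng.exponential(mean_wait)        # waiting (rest) period
        dt = min(hop_time, max(t_end - t, 0))  # hop truncated at the horizon
        x += hop_speed * dt                    # hop completed in finite time
        t += hop_time
    return x

samples = np.array([displacement(200.0) for _ in range(500)])
```

    With exponential waits, the hop count is roughly Poisson, so displacement variance grows about linearly in time (normal diffusion); swapping in heavy-tailed waits is what produces the anomalous regimes discussed above.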

  2. Behavioral Correlates of System Operational Readiness (SOR): Summary of Workshop Proceedings.

    DTIC Science & Technology

    1983-10-01

    This report describes a 2-day conference called to explore the methodology required to develop a behavioral model of system operational readiness (SOR). Participants discussed (1) the behavioral variables that should be included in the model, (2) the system-level measures that should be included, (3) ... The proceedings also reference ARIMA models (Box & Jenkins, 1976) for interrupted time series analysis.

  3. Comparison of three approaches to model grapevine organogenesis in conditions of fluctuating temperature, solar radiation and soil water content.

    PubMed

    Pallas, B; Loi, C; Christophe, A; Cournède, P H; Lecoeur, J

    2011-04-01

    There is increasing interest in the development of plant growth models representing the complex system of interactions between the different determinants of plant development. These approaches are particularly relevant for grapevine organogenesis, which is a highly plastic process dependent on temperature, solar radiation, soil water deficit and trophic competition. The extent to which three plant growth models were able to deal with the observed plasticity of axis organogenesis was assessed. In the first model, axis organogenesis was dependent solely on temperature, through thermal time. In the second model, axis organogenesis was modelled through functional relationships linking meristem activity and trophic competition. In the last model, the rate of phytomer appearance on each axis was modelled as a function of both the trophic status of the plant and the direct effect of soil water content on potential meristem activity. The model including relationships between trophic competition and meristem behaviour reduced the root mean squared error (RMSE) of the organogenesis simulations by a factor of nine compared with the thermal time-based model. Compared with the model in which axis organogenesis was driven only by trophic competition, the implementation of relationships between water deficit and meristem behaviour improved organogenesis simulation results, reducing the RMSE by a factor of three. The resulting model can be seen as a first attempt to build a comprehensive plant growth model simulating the development of the whole plant in fluctuating conditions of temperature, solar radiation and soil water content. We propose a new hypothesis concerning the effects of the different determinants of axis organogenesis. The rate of phytomer appearance according to thermal time was strongly affected by the plant trophic status and soil water deficit. 
Furthermore, the decrease in meristem activity when soil water is depleted does not result from source/sink imbalances.

  4. Forecasting wetting and drying of post-wildfire soils in response to precipitation: A time series optimization approach

    NASA Astrophysics Data System (ADS)

    Basak, A.; Kulkarni, C.; Schmidt, K. M.; Mengshoel, O. J.

    2015-12-01

    Volumetric water content (VWC) in soils is critical for forecasting thresholds for runoff-driven erosion caused by rainfall. Even though theoretical relations (e.g., the Richards equation) have been developed to quantify VWC in unsaturated granular soils, site-specific field conditions and the hysteresis of suction and VWC in soil preclude their direct use. Although attempts have previously been made to forecast VWC using various time-series models (e.g., autoregressive integrated moving average, or ARIMA), these approaches lack hydrologic foundations and perform poorly when used to forecast VWC over time periods longer than 24 hours. In this work, we extend an existing Antecedent Water Index (AWI) based model to express VWC as a function of time and rainfall. AWI models typically overfit data and cannot be used to forecast VWC over long time periods. We developed a new model to overcome this limitation, which accumulates rainfall over a time window and fits a diverse range of wetting and drying curves. Hydraulic redistribution parameters in this model bear resemblance to hydrologic processes driven by gravity and suction. This model reasonably forecasts VWC using only initial VWC values and rainfall forecasts. Experimental VWC data were collected from steep-gradient post-wildfire sites in southern California. Rapid landscape change was observed in response to small to moderate rain storms. We formulated a mean-squared error minimization problem over the model parameters and optimized it using genetic algorithms. We found that our model fits VWC data for 3 distinct soil textures, each occurring at 3 different depths below the ground surface (5 cm, 15 cm, and 30 cm). Our model successfully forecasts VWC trends, such as drying and wetting rates. To a certain extent, our model achieves spatial and seasonal generalizability. 
Our accumulative rainfall model is also applicable to continuous predictions, where VWC values are repeatedly used to predict future ones within a 12-hr time frame.
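    The parameter-fitting step can be sketched in miniature: the paper minimizes mean-squared error over model parameters with a genetic algorithm; here a plain random search stands in for the optimizer, fitting a simple exponential drying curve vwc(t) = vwc_res + (vwc_0 - vwc_res) * exp(-k t) to synthetic observations. The curve form and parameter names are illustrative, not the paper's AWI model.

```python
import numpy as np

# Fit a toy drying curve by minimizing MSE with random search
# (stand-in for the paper's genetic algorithm). Synthetic data.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 50)
true = 0.08 + (0.35 - 0.08) * np.exp(-0.4 * t)
obs = true + rng.normal(0, 0.005, size=t.size)  # noisy VWC observations

def mse(params):
    vwc_res, vwc_0, k = params
    pred = vwc_res + (vwc_0 - vwc_res) * np.exp(-k * t)
    return np.mean((pred - obs) ** 2)

best, best_err = None, np.inf
for _ in range(5000):  # random-search optimizer over a parameter box
    cand = rng.uniform([0.0, 0.2, 0.05], [0.2, 0.5, 1.0])
    err = mse(cand)
    if err < best_err:
        best, best_err = cand, err
```

    A genetic algorithm replaces the blind sampling with selection, crossover, and mutation, but the objective (MSE over the VWC time series) is the same.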

  5. Model-based economic evaluation in Alzheimer's disease: a review of the methods available to model Alzheimer's disease progression.

    PubMed

    Green, Colin; Shearer, James; Ritchie, Craig W; Zajicek, John P

    2011-01-01

    To consider the methods available to model Alzheimer's disease (AD) progression over time to inform on the structure and development of model-based evaluations, and the future direction of modelling methods in AD. A systematic search of the health care literature was undertaken to identify methods to model disease progression in AD. Modelling methods are presented in a descriptive review. The literature search identified 42 studies presenting methods or applications of methods to model AD progression over time. The review identified 10 general modelling frameworks available to empirically model the progression of AD as part of a model-based evaluation. Seven of these general models are statistical models predicting progression of AD using a measure of cognitive function. The main concerns with the models are about model structure, the limited characterization of disease progression, and the use of a limited number of health states to capture events related to disease progression over time. None of the available models has been able to present a comprehensive model of the natural history of AD. Although helpful, there are serious limitations in the methods available to model progression of AD over time. Advances are needed to better model the progression of AD and the effects of the disease on people's lives. Recent evidence supports the need for a multivariable approach to the modelling of AD progression, and indicates that a latent variable analytic approach to characterising AD progression is a promising avenue for advances in the statistical development of modelling methods. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
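    One of the frameworks such reviews cover, the state-transition (Markov cohort) model with a limited number of health states, can be sketched in a few lines: a cohort is tracked across severity states in yearly cycles via a transition matrix. The states and transition probabilities below are purely illustrative, not estimates from any AD study.

```python
import numpy as np

# Toy Markov cohort model of disease progression. Illustrative states and
# one-year transition probabilities (rows sum to 1); 'dead' is absorbing.
STATES = ["mild", "moderate", "severe", "dead"]
P = np.array([
    [0.70, 0.20, 0.05, 0.05],
    [0.00, 0.65, 0.25, 0.10],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])

cohort = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts in 'mild'
trace = [cohort]
for _ in range(10):                       # simulate ten yearly cycles
    cohort = cohort @ P
    trace.append(cohort)
```

    The review's criticism is visible even in this sketch: a handful of states and fixed transition probabilities cannot represent a continuous, multivariable decline, which motivates the latent-variable approaches mentioned above.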

  6. A novel grey-fuzzy-Markov and pattern recognition model for industrial accident forecasting

    NASA Astrophysics Data System (ADS)

    Edem, Inyeneobong Ekoi; Oke, Sunday Ayoola; Adebiyi, Kazeem Adekunle

    2017-10-01

    Industrial forecasting is a top-echelon research domain that has over the past several years experienced highly provocative research discussions. The scope of this research domain continues to expand due to the continuous knowledge ignition motivated by scholars in the area. So, more intelligent and intellectual contributions on current research issues in the accident domain will potentially spark more lively academic, value-added discussions that will be of practical significance to members of the safety community. In this communication, a new grey-fuzzy-Markov time series model, developed from a nondifferential grey interval analytical framework, is presented for the first time. This instrument forecasts future accident occurrences under a time-invariance assumption. The actual contribution made in the article is to recognise accident occurrence patterns and decompose them into grey-state principal pattern components. The architectural framework of the developed grey-fuzzy-Markov pattern recognition (GFMAPR) model has four stages: fuzzification, smoothening, defuzzification and whitenisation. The results of applying the developed novel model signify that forecasting can be effectively carried out under uncertain conditions, which positions the model as a distinctly superior tool for accident forecasting investigations. The novelty of the work lies in the model's capability to make highly accurate predictions and forecasts from small or incomplete accident data.
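    The grey-modelling step that such hybrid models build on can be sketched with the classical GM(1,1): the raw series is accumulated (AGO), a first-order grey equation is fitted by least squares, and forecasts come from the whitened solution. This is the textbook GM(1,1), not the paper's GFMAPR model, and the accident counts are synthetic.

```python
import numpy as np

# Classical GM(1,1) grey forecast, suited to short series.
def gm11_forecast(x0, steps=1):
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                # accumulated series (AGO)
    z = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # fit dx1/dt + a*x1 = b
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # whitened solution
    x0_hat = np.diff(x1_hat, prepend=0.0)              # inverse AGO
    return x0_hat[-steps:]

accidents = [10.0, 12.0, 14.4, 17.28, 20.736]  # synthetic annual counts
next_value = gm11_forecast(accidents, steps=1)[0]  # ~24.8
```

    The GFMAPR model then fuzzifies and smooths such grey estimates and applies Markov pattern recognition to correct them, which is where its claimed accuracy gain comes from.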

  7. Real-time forecasting of an epidemic using a discrete time stochastic model: a case study of pandemic influenza (H1N1-2009)

    PubMed Central

    2011-01-01

    Background Real-time forecasting of epidemics, especially those based on a likelihood-based approach, is understudied. This study aimed to develop a simple method that can be used for the real-time epidemic forecasting. Methods A discrete time stochastic model, accounting for demographic stochasticity and conditional measurement, was developed and applied as a case study to the weekly incidence of pandemic influenza (H1N1-2009) in Japan. By imposing a branching process approximation and by assuming the linear growth of cases within each reporting interval, the epidemic curve is predicted using only two parameters. The uncertainty bounds of the forecasts are computed using chains of conditional offspring distributions. Results The quality of the forecasts made before the epidemic peak appears largely to depend on obtaining valid parameter estimates. The forecasts of both weekly incidence and final epidemic size greatly improved at and after the epidemic peak with all the observed data points falling within the uncertainty bounds. Conclusions Real-time forecasting using the discrete time stochastic model with its simple computation of the uncertainty bounds was successful. Because of the simplistic model structure, the proposed model has the potential to additionally account for various types of heterogeneity, time-dependent transmission dynamics and epidemiological details. The impact of such complexities on forecasting should be explored when the data become available as part of the disease surveillance. PMID:21324153
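    The branching-process approximation behind such forecasts can be sketched by simulation: weekly incidence evolves as I(t+1) ~ Poisson(R * I(t)), and uncertainty bounds come from quantiles over many simulated chains. The reproduction number and initial count below are illustrative, not the paper's estimates, and the paper computes its bounds analytically from conditional offspring distributions rather than by simulation.

```python
import numpy as np

# Branching-process epidemic forecast with simulated uncertainty bounds.
# R and i0 are illustrative placeholders.
rng = np.random.default_rng(7)

def forecast_paths(i0, R, weeks, n_sims=2000):
    paths = np.empty((n_sims, weeks + 1), dtype=int)
    paths[:, 0] = i0
    for t in range(weeks):
        paths[:, t + 1] = rng.poisson(R * paths[:, t])  # offspring draw
    return paths

paths = forecast_paths(i0=50, R=1.3, weeks=4)
median = np.median(paths[:, -1])
lo, hi = np.percentile(paths[:, -1], [2.5, 97.5])  # 95% uncertainty bounds
```

    As the abstract notes, forecast quality hinges on the parameter estimates: a biased R shifts every simulated path, not just the spread.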

  8. NASA AVOSS Fast-Time Wake Prediction Models: User's Guide

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing and testing fast-time wake transport and decay models to safely enhance the capacity of the National Airspace System (NAS). The fast-time wake models are empirical algorithms used for real-time predictions of wake transport and decay based on aircraft parameters and ambient weather conditions. The aircraft dependent parameters include the initial vortex descent velocity and the vortex pair separation distance. The atmospheric initial conditions include vertical profiles of temperature or potential temperature, eddy dissipation rate, and crosswind. The current distribution includes the latest versions of the APA (3.4) and the TDP (2.1) models. This User's Guide provides detailed information on the model inputs, file formats, and the model output. An example of a model run and a brief description of the Memphis 1995 Wake Vortex Dataset is also provided.

  9. Informal Science and Youth Development: Creating Convergence in Out-of-School Time

    ERIC Educational Resources Information Center

    Noam, Gil G.; Shah, Ashima

    2014-01-01

    This chapter highlights the fit between youth-development-oriented programming and informal science activities in out-of-school time (OST) and illustrates how science and youth development can and should co-occur. The clover model and Dimensions of Success tool are introduced as lenses for designing and assessing science program quality in OST.…

  10. Research and development program for the development of advanced time-temperature dependent constitutive relationships. Volume 2: Programming manual

    NASA Technical Reports Server (NTRS)

    Cassenti, B. N.

    1983-01-01

    The results of a 10-month research and development program for nonlinear structural modeling with advanced time-temperature constitutive relationships are presented. The implementation of the theory in the MARC nonlinear finite element code is discussed, and instructions for the computational application of the theory are provided.

  11. "It's Worth Our Time": A Model of Culturally and Linguistically Supportive Professional Development for K-12 STEM Educators

    ERIC Educational Resources Information Center

    Hudley, Anne H. Charity; Mallinson, Christine

    2017-01-01

    Professional development on issues of language and culture is often separate from professional development on issues related to STEM education, resulting in linguistic and cultural gaps in K-12 STEM pedagogy and practice. To address this issue, we have designed a model of professional development in which we work with educators to build cultural…

  12. Application of stochastic automata networks for creation of continuous time Markov chain models of voltage gating of gap junction channels.

    PubMed

    Snipas, Mindaugas; Pranevicius, Henrikas; Pranevicius, Mindaugas; Pranevicius, Osvaldas; Paulauskas, Nerijus; Bukauskas, Feliksas F

    2015-01-01

    The primary goal of this work was to study the advantages of numerical methods used for the creation of continuous time Markov chain (CTMC) models of voltage gating of gap junction (GJ) channels composed of connexin protein. This task was accomplished by describing the gating of GJs using the formalism of stochastic automata networks (SANs), which allowed for very efficient building and storing of the infinitesimal generator of the CTMC and produced model matrices with a distinct block structure. This enabled us to develop efficient numerical methods for the steady-state solution of CTMC models, reducing the CPU time needed to solve them by a factor of ~20.
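    The steady-state computation being accelerated here has a compact linear-algebra form: for a CTMC with infinitesimal generator Q, the stationary distribution pi solves pi Q = 0 with sum(pi) = 1. A minimal sketch on a two-state open/closed channel with illustrative gating rates; the SAN formalism is what makes Q tractable for realistically large models.

```python
import numpy as np

def stationary(Q):
    """Stationary distribution of a CTMC: solve pi Q = 0, sum(pi) = 1."""
    n = Q.shape[0]
    # replace one balance equation with the normalization constraint
    A = np.vstack([Q.T[:-1], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

k_open, k_close = 2.0, 3.0            # illustrative gating rates (1/s)
Q = np.array([[-k_open, k_open],
              [k_close, -k_close]])   # states: closed, open
pi = stationary(Q)                    # [0.6, 0.4]
```

    For the two-state case the answer is the familiar k_close/(k_open + k_close) occupancy of the closed state; block structure in Q is what lets specialized solvers beat this dense approach at scale.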

  13. Modeling of switching regulator power stages with and without zero-inductor-current dwell time

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Yu, Y.; Triner, J. E.

    1976-01-01

    State space techniques are employed to derive accurate models for buck, boost, and buck/boost converter power stages operating with and without zero-inductor-current dwell time. A generalized procedure is developed which treats the continuous-inductor-current mode without the dwell time as a special case of the discontinuous-current mode, when the dwell time vanishes. An abrupt change of system behavior including a reduction of the system order when the dwell time appears is shown both analytically and experimentally.

  14. Development of a feed-forward controller for a tracking telescope

    NASA Astrophysics Data System (ADS)

    Allen, John S.; Stufflebeam, Joseph L.; Feller, Dan

    2004-07-01

    This paper develops a state space model of a feed-forward control system in the frequency domain and the time domain. The results of the mathematical model are implemented, and the responses of the elevation and azimuth servo controllers are evaluated in a tracking telescope called a Cine-Sextant, developed for the Utah Test and Training Range.

  15. Modelled hydraulic redistribution by sunflower (Helianthus annuus L.) matches observed data only after including night-time transpiration.

    PubMed

    Neumann, Rebecca B; Cardon, Zoe G; Teshera-Levye, Jennifer; Rockwell, Fulton E; Zwieniecki, Maciej A; Holbrook, N Michele

    2014-04-01

    The movement of water from moist to dry soil layers through the root systems of plants, referred to as hydraulic redistribution (HR), occurs throughout the world and is thought to influence carbon and water budgets and ecosystem functioning. The realized hydrologic, biogeochemical and ecological consequences of HR depend on the amount of redistributed water, whereas the ability to assess these impacts requires models that correctly capture HR magnitude and timing. Using several soil types and two ecotypes of sunflower (Helianthus annuus L.) in split-pot experiments, we examined how well the widely used HR modelling formulation developed by Ryel et al. matched experimental determination of HR across a range of water potential driving gradients. H. annuus carries out extensive night-time transpiration, and although over the last decade it has become more widely recognized that night-time transpiration occurs in multiple species and many ecosystems, the original Ryel et al. formulation does not include the effect of night-time transpiration on HR. We developed and added a representation of night-time transpiration into the formulation, and only then was the model able to capture the dynamics and magnitude of HR we observed as soils dried and night-time stomatal behaviour changed, both influencing HR. © 2013 John Wiley & Sons Ltd.
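    The modelling idea above can be caricatured in a toy two-layer water balance: root-mediated flow moves water from the wetter to the drier layer in proportion to their moisture difference, and night-time transpiration diverts part of that flow to the atmosphere, reducing the water delivered to the dry layer. The linear flux law and all parameters are illustrative inventions, not the Ryel et al. formulation or the authors' extension of it.

```python
# Toy two-layer hydraulic redistribution (HR) with a night-transpiration sink.
# Flux law, rates, and initial water contents are illustrative only.

def simulate_hr(night_transp, hours=10.0, dt=0.1, c=0.02):
    """Return dry-layer water content after one night of redistribution."""
    wet, dry = 0.30, 0.10              # volumetric water content per layer
    for _ in range(int(hours / dt)):
        flux = c * (wet - dry)         # root-mediated flow, wet -> dry layer
        wet -= flux * dt
        # transpiration intercepts part of the flow before it reaches the soil
        dry += max(flux - night_transp, 0.0) * dt
    return dry

hr_no_transp = simulate_hr(night_transp=0.0)
hr_with_transp = simulate_hr(night_transp=0.001)
```

    Even this caricature reproduces the paper's qualitative point: ignoring night-time transpiration overestimates the water redistributed into the dry layer.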

  16. Forecasting daily emergency department visits using calendar variables and ambient temperature readings.

    PubMed

    Marcilio, Izabel; Hajat, Shakoor; Gouveia, Nelson

    2013-08-01

This study aimed to develop different models to forecast the daily number of patients seeking emergency department (ED) care in a general hospital, based on calendar variables and ambient temperature readings, and to compare the models in terms of forecasting accuracy. The authors developed and tested six models of ED patient visits using total daily counts of visits to an ED in Sao Paulo, Brazil, from January 1, 2008, to December 31, 2010. The first 33 months of the data set were used to develop the forecasting models (the training set), leaving the last 3 months to measure each model's forecasting accuracy by the mean absolute percentage error (MAPE). Forecasting models were developed using three time-series analysis methods: generalized linear models (GLM), generalized estimating equations (GEE), and seasonal autoregressive integrated moving average (SARIMA). For each method, models were explored with and without mean daily temperature as a predictive variable. The daily mean number of ED visits was 389, ranging from 166 to 613. The data showed a weekly seasonal distribution, with the highest patient volumes on Mondays and the lowest on weekends; there was little variation in daily visits by month. GLM and GEE models showed better forecasting accuracy than SARIMA models. For instance, the MAPEs of the GLM and GEE models in the first month of forecasting (October 2010) were 11.5 and 10.8% (models with and without control for the temperature effect, respectively), while the MAPEs of the SARIMA models were 12.8 and 11.7%. For all models, controlling for the effect of temperature yielded forecasting ability that was worse than or similar to models with calendar variables alone, and forecasting accuracy was better for the short-term horizon (7 days in advance) than for the longer term (30 days in advance). 
This study indicates that time-series models can be developed to provide forecasts of daily ED patient visits, and forecasting ability was dependent on the type of model employed and the length of the time horizon being predicted. In this setting, GLM and GEE models showed better accuracy than SARIMA models. Including information about ambient temperature in the models did not improve forecasting accuracy. Forecasting models based on calendar variables alone did in general detect patterns of daily variability in ED volume and thus could be used for developing an automated system for better planning of personnel resources. © 2013 by the Society for Academic Emergency Medicine.
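The MAPE criterion used to rank the GLM, GEE, and SARIMA forecasts above is straightforward to compute. A minimal sketch in Python, using invented daily visit counts and forecasts (not the Sao Paulo data):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical daily ED visit counts for one week and two competing forecasts
actual    = [389, 410, 366, 601, 455, 322, 198]
glm_fc    = [402, 395, 380, 560, 470, 350, 210]
sarima_fc = [430, 360, 410, 520, 500, 280, 240]

print(round(mape(actual, glm_fc), 1))     # GLM-style forecast error
print(round(mape(actual, sarima_fc), 1))  # SARIMA-style forecast error
```

The model with the smaller MAPE over the held-out period is preferred, which is exactly the comparison the study performs over its final 3 months.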

  17. Limited Sampling Strategy for Accurate Prediction of Pharmacokinetics of Saroglitazar: A 3-point Linear Regression Model Development and Successful Prediction of Human Exposure.

    PubMed

    Joshi, Shuchi N; Srinivas, Nuggehally R; Parmar, Deven V

    2018-03-01

Our aim was to develop and validate the extrapolative performance of a regression model using a limited sampling strategy for accurate estimation of the area under the plasma concentration versus time curve (AUC) for saroglitazar. Healthy-subject pharmacokinetic data from a well-powered food-effect study (fasted vs fed treatments; n = 50) were used in this work. The first 25 subjects' serial plasma concentration data up to 72 hours and the corresponding AUC(0-t) (ie, 72 hours) from the fasting group comprised the training dataset used to develop the limited sampling model. The internal datasets for prediction included the remaining 25 subjects from the fasting group and all 50 subjects from the fed condition of the same study. The external datasets included pharmacokinetic data for saroglitazar from previous single-dose clinical studies. Limited sampling models correlated 1, 2, or 3 concentration-time points with the AUC(0-t) of saroglitazar. Only models with coefficients of determination (R²) >0.90 were screened for further evaluation. The best R² model was validated for its utility based on mean prediction error, mean absolute prediction error, and root mean square error. Correlation between predicted and observed AUC(0-t) of saroglitazar and verification of precision and bias using a Bland-Altman plot were also carried out. None of the evaluated 1- and 2-point models achieved R² > 0.90. Among the various 3-point models, only 4 equations passed the predefined criterion of R² > 0.90. Limited sampling models with time points 0.5, 2, and 8 hours (R² = 0.9323) and 0.75, 2, and 8 hours (R² = 0.9375) were validated. Mean prediction error, mean absolute prediction error, and root mean square error were <30% (predefined criterion), and the correlation (r) was at least 0.7950 for the consolidated internal and external datasets of 102 healthy subjects for the AUC(0-t) prediction of saroglitazar. 
When applied to the AUC(0-t) prediction of saroglitazar sulfoxide, the same models showed mean prediction error, mean absolute prediction error, and root mean square error <30%, with a correlation (r) of at least 0.9339 in the same pool of healthy subjects. A 3-point limited sampling model predicts the exposure of saroglitazar (ie, AUC(0-t)) within the predefined acceptable bias and imprecision limits. The same model was also used to predict AUC(0-∞). The limited sampling model was likewise found to predict the exposure of saroglitazar sulfoxide within the predefined criteria. This model can find utility during late-phase clinical development of saroglitazar in the patient population. Copyright © 2018 Elsevier HS Journals, Inc. All rights reserved.
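A 3-point limited sampling model of this kind is, at heart, an ordinary least-squares regression of AUC on three plasma concentrations. A hedged sketch with synthetic data (the coefficients, concentration ranges, and noise level below are invented, not the saroglitazar values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: concentrations at 3 sampling times for 25 subjects
# (hypothetical values, standing in for the 0.5, 2, and 8 h samples)
n = 25
C = rng.uniform([50, 200, 20], [150, 600, 80], size=(n, 3))
true_beta = np.array([10.0, 1.2, 2.5, 6.0])            # intercept + 3 slopes
auc = np.column_stack([np.ones(n), C]) @ true_beta + rng.normal(0, 20, n)

# Fit the 3-point limited-sampling regression by ordinary least squares
X = np.column_stack([np.ones(n), C])
beta, *_ = np.linalg.lstsq(X, auc, rcond=None)

pred = X @ beta
ss_res = np.sum((auc - pred) ** 2)
ss_tot = np.sum((auc - auc.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                               # coefficient of determination
mpe = 100 * np.mean((pred - auc) / auc)                # mean prediction error, %
print(f"R^2 = {r2:.4f}, MPE = {mpe:.2f}%")
```

In the study, a model fitted this way on the 25 training subjects is then applied to the held-out internal and external datasets, and judged by the same R², prediction-error, and Bland-Altman criteria.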

  18. Risk Prediction Models for Other Cancers or Multiple Sites

    Cancer.gov

Developing statistical models that estimate the probability of developing other cancers, or cancers at multiple sites, over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk.

  19. Geo-Engineering through Internet Informatics (GEMINI)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doveton, John H.; Watney, W. Lynn

The program was a 3-year interdisciplinary effort to develop an interactive, integrated Internet website named GEMINI (Geo-Engineering Modeling through Internet Informatics) that would build real-time geo-engineering reservoir models for the Internet using the latest Web application technology.

  20. Development of a Physiologically Based Pharmacokinetic Model for Triadimefon and Triadimenol in Rats and Humans

    EPA Science Inventory

    A physiologically based pharmacokinetic (PBPK) model was developed for the conazole fungicide triadimefon and its primary metabolite, triadimenol. Rat tissue:blood partition coefficients and metabolic constants were measured in vitro for both compounds. Kinetic time course data...

  1. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    EPA Science Inventory

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  2. Progress Towards a Thermo-Mechanical Magma Chamber Forward Model for Eruption Cycles, Applied to the Columbia River Flood Basalts

    NASA Astrophysics Data System (ADS)

    Karlstrom, L.; Ozimek, C.

    2016-12-01

Magma chamber modeling has advanced to the stage where it is now possible to develop self-consistent, predictive models that consider mechanical, thermal, and compositional magma time evolution through multiple eruptive cycles. We have developed such a thermo-mechanical-chemical model for a laterally extensive, sill-like chamber beneath a free surface, to understand physical controls on eruptive products through time at long-lived magmatic centers. This model predicts the relative importance of recharge, eruption, assimilation, and fractional crystallization (REAFC, Lee et al., 2013) on evolving chemical composition as a function of mechanical magma chamber stability regimes. We solve for the time evolution of chamber pressure, temperature, gas volume fraction, volume, elemental concentration in the melt, and the crustal temperature field, accounting for the moving boundary conditions associated with chamber inflation (and the possibility of coupled chambers at different depths). The density, volume fractions of melt and crystals, crustal assimilation, and the changing viscosity and crustal properties of the wall rock are also tracked, along with the joint solubility of water and CO2. The eventual goal is to develop an efficient forward model to invert for eruptive records at long-lived eruptive centers, where multiple types of data for eruptions are available. As a first step, we apply this model to a new compilation of eruptive data from the Columbia River Flood Basalts (CRFB), which erupted 210,000 km³ from feeder dikes in Washington, Oregon, and Idaho between 16.9 and 6 Ma. The data include volumes, timing, and geochemical composition of eruptive units, along with seismic surveys and clinopyroxene geobarometry that constrain the depth of storage through time. 
We are in the process of performing a suite of simulations varying model input parameters such as mantle melt rate, emplacement depth, wall rock composition and rheology, and volatile content to explain the volumes, eruption timescales, and trace-element chemistry of CRFB eruptions. We are particularly interested in whether the large-volume eruptions of the main-phase Grande Ronde basalts were made possible by the development of shallow crustal storage.

  3. Kuroshio Pathways in a Climatologically-Forced Model

    NASA Astrophysics Data System (ADS)

    Douglass, E. M.; Jayne, S. R.; Bryan, F. O.; Peacock, S.; Maltrud, M. E.

    2010-12-01

A high resolution ocean model forced with an annually repeating atmosphere is used to examine variability of the Kuroshio, the western boundary current in the North Pacific Ocean. A large meander in the path of the Kuroshio south of Japan develops and disappears in a highly bimodal fashion on decadal time scales. This meander is comparable in timing and spatial extent to an observed feature in the region. Various characteristics of the large meander are examined, including shear, transport, and velocity. The many similarities between the model and observations indicate that the meander results from intrinsic oceanic variability, which is represented in this climatologically-forced model. Each large meander is preceded by a smaller "trigger" meander that originates at the southern end of Kyushu, moves up the coast, and develops into the large meander. However, there are also many meanders very similar in character to the trigger meander that do not develop into large meanders. The mechanism that determines which trigger meanders develop into large meanders is as yet undetermined.

  4. Creating wavelet-based models for real-time synthesis of perceptually convincing environmental sounds

    NASA Astrophysics Data System (ADS)

    Miner, Nadine Elizabeth

    1998-09-01

This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components; this stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies provided data on multi-sensory interaction and audio-visual synchronization timing, and these results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, comprising analysis, parameterization, synthesis, and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results; these experiments are applicable for validating any sound synthesis technique.

  5. A Finite Element Model for Mixed Porohyperelasticity with Transport, Swelling, and Growth.

    PubMed

    Armstrong, Michelle Hine; Buganza Tepole, Adrián; Kuhl, Ellen; Simon, Bruce R; Vande Geest, Jonathan P

    2016-01-01

    The purpose of this manuscript is to establish a unified theory of porohyperelasticity with transport and growth and to demonstrate the capability of this theory using a finite element model developed in MATLAB. We combine the theories of volumetric growth and mixed porohyperelasticity with transport and swelling (MPHETS) to derive a new method that models growth of biological soft tissues. The conservation equations and constitutive equations are developed for both solid-only growth and solid/fluid growth. An axisymmetric finite element framework is introduced for the new theory of growing MPHETS (GMPHETS). To illustrate the capabilities of this model, several example finite element test problems are considered using model geometry and material parameters based on experimental data from a porcine coronary artery. Multiple growth laws are considered, including time-driven, concentration-driven, and stress-driven growth. Time-driven growth is compared against an exact analytical solution to validate the model. For concentration-dependent growth, changing the diffusivity (representing a change in drug) fundamentally changes growth behavior. We further demonstrate that for stress-dependent, solid-only growth of an artery, growth of an MPHETS model results in a more uniform hoop stress than growth in a hyperelastic model for the same amount of growth time using the same growth law. This may have implications in the context of developing residual stresses in soft tissues under intraluminal pressure. To our knowledge, this manuscript provides the first full description of an MPHETS model with growth. The developed computational framework can be used in concert with novel in-vitro and in-vivo experimental approaches to identify the governing growth laws for various soft tissues.

  6. A Finite Element Model for Mixed Porohyperelasticity with Transport, Swelling, and Growth

    PubMed Central

    Armstrong, Michelle Hine; Buganza Tepole, Adrián; Kuhl, Ellen; Simon, Bruce R.; Vande Geest, Jonathan P.

    2016-01-01

    The purpose of this manuscript is to establish a unified theory of porohyperelasticity with transport and growth and to demonstrate the capability of this theory using a finite element model developed in MATLAB. We combine the theories of volumetric growth and mixed porohyperelasticity with transport and swelling (MPHETS) to derive a new method that models growth of biological soft tissues. The conservation equations and constitutive equations are developed for both solid-only growth and solid/fluid growth. An axisymmetric finite element framework is introduced for the new theory of growing MPHETS (GMPHETS). To illustrate the capabilities of this model, several example finite element test problems are considered using model geometry and material parameters based on experimental data from a porcine coronary artery. Multiple growth laws are considered, including time-driven, concentration-driven, and stress-driven growth. Time-driven growth is compared against an exact analytical solution to validate the model. For concentration-dependent growth, changing the diffusivity (representing a change in drug) fundamentally changes growth behavior. We further demonstrate that for stress-dependent, solid-only growth of an artery, growth of an MPHETS model results in a more uniform hoop stress than growth in a hyperelastic model for the same amount of growth time using the same growth law. This may have implications in the context of developing residual stresses in soft tissues under intraluminal pressure. To our knowledge, this manuscript provides the first full description of an MPHETS model with growth. The developed computational framework can be used in concert with novel in-vitro and in-vivo experimental approaches to identify the governing growth laws for various soft tissues. PMID:27078495

  7. Development of multilayer perceptron networks for isothermal time temperature transformation prediction of U-Mo-X alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johns, Jesse M.; Burkes, Douglas

In this work, a multilayered perceptron (MLP) network is used to develop predictive isothermal time-temperature-transformation (TTT) models covering a range of U-Mo binary and ternary alloys. The selected ternary alloys for model development are U-Mo-Ru, U-Mo-Nb, U-Mo-Zr, U-Mo-Cr, and U-Mo-Re. These models predict 'novel' U-Mo alloys quite well despite the discrepancies between literature sources for similar alloys, which likely arise from different thermal-mechanical processing conditions. The models were developed with the primary purpose of informing experimental decisions: additional experimental insight is necessary to reduce the number of experiments required to isolate ideal alloys. These models allow test planners to evaluate areas of experimental interest; once initial tests are conducted, the model can be updated to further improve follow-on testing decisions. The model also improves analysis capabilities by reducing the number of data points needed from any particular test. For example, if one or two isotherms are measured during a test, the model can construct the rest of the TTT curve over a wide range of temperatures and times. This modeling capability reduces the cost of experiments while improving the value of the test results. The reduced costs could result in improved material characterization and therefore improved fundamental understanding of TTT dynamics. As additional understanding of the phenomena driving TTTs is acquired, this type of MLP model can be used to populate unknowns (such as material impurity and other thermal-mechanical properties) from past literature sources.
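As an illustration of the surrogate idea (not the authors' network or data), the sketch below fits an MLP-style model to a synthetic C-shaped TTT curve. To keep the example deterministic and short, it fixes a random tanh hidden layer and solves only the output weights by least squares, an extreme-learning-machine shortcut rather than the full backpropagation a production model would use:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic C-curve: log10(transformation time) vs temperature, with a
# "nose" near 450 C (illustrative shape only, not U-Mo data)
T = np.linspace(350.0, 550.0, 40)
log_t = 1.5 + ((T - 450.0) / 60.0) ** 2

# One hidden layer of 20 tanh units with fixed random weights; only the
# output layer is trained, by least squares
x = (T - 450.0) / 100.0                       # normalized input
W = rng.normal(0.0, 2.0, size=20)             # hidden weights (fixed)
b = rng.uniform(-2.0, 2.0, size=20)           # hidden biases (fixed)
H = np.tanh(np.outer(x, W) + b)               # hidden activations, (40, 20)
H1 = np.column_stack([H, np.ones_like(x)])    # append output-bias column
w_out, *_ = np.linalg.lstsq(H1, log_t, rcond=None)

pred = H1 @ w_out
mse = np.mean((pred - log_t) ** 2)
print(f"training MSE = {mse:.2e}")
```

The point is only the curve-fitting role the network plays: once trained on a few measured isotherms, such a model can interpolate the remainder of the TTT curve.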

  8. A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes

    PubMed Central

    Ma, Xin; Shen, Jianping

    2017-01-01

The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform, capable of testing interaction effects among multiple sets of time series, can be very useful in empirical research, and the authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Although intended for complicated research purposes, the resulting multilevel multiset time-series model is relatively simple to specify, run, and interpret. These advantages make its adoption relatively effortless as long as researchers have basic knowledge and skills in multilevel growth modeling. With multiple potential extensions, this analytical platform for the analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs addressing complex developmental processes. PMID:29881094

  9. Additive mixed effect model for recurrent gap time data.

    PubMed

    Ding, Jieli; Sun, Liuquan

    2017-04-01

Gap times between recurrent events are often of primary interest in medical and observational studies. The additive hazards model, focusing on risk differences rather than risk ratios, has been widely used in practice. However, the marginal additive hazards model does not take the dependence among gap times into account. In this paper, we propose an additive mixed effect model to analyze gap time data; the proposed model includes a subject-specific random effect to account for the dependence among the gap times. Estimating equation approaches are developed for parameter estimation, and the asymptotic properties of the resulting estimators are established. In addition, some graphical and numerical procedures are presented for model checking. The finite sample behavior of the proposed methods is evaluated through simulation studies, and an application to a data set from a clinical study on chronic granulomatous disease is provided.
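A toy simulation (with invented numbers) of the dependence structure that the subject-specific random effect is meant to capture: when each subject's gap times share an additive frailty in a constant hazard, gap times within a subject become positively correlated, which a purely marginal additive hazards model ignores.

```python
import numpy as np

rng = np.random.default_rng(3)

# Each subject's hazard is lambda_i = lambda0 + b_i, with b_i a
# subject-specific additive random effect (constant over time here)
m = 1000                                  # subjects
lambda0 = 0.3                             # baseline hazard
b = rng.uniform(0.0, 2.0, size=m)         # subject random effects
rate = lambda0 + b

# With a constant hazard, each gap time is exponential at the subject's rate
gaps = rng.exponential(scale=1.0 / rate[:, None], size=(m, 2))

# The shared effect induces positive within-subject correlation of gap times
r = np.corrcoef(gaps[:, 0], gaps[:, 1])[0, 1]
print(f"within-subject gap-time correlation = {r:.3f}")
```

The paper's estimating-equation machinery exists precisely to fit models of this form (with covariates and a flexible baseline) without mis-stating the uncertainty that this dependence creates.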

  10. A generalized computer code for developing dynamic gas turbine engine models (DIGTEM)

    NASA Technical Reports Server (NTRS)

    Daniele, C. J.

    1984-01-01

This paper describes DIGTEM (digital turbofan engine model), a computer program that simulates two-spool, two-stream (turbofan) engines. DIGTEM was developed to support the development of a real-time, multiprocessor-based engine simulator being designed at the Lewis Research Center. The turbofan engine model in DIGTEM contains steady-state performance maps for all the components and has control volumes where continuity and energy balances are maintained. Rotor dynamics and duct momentum dynamics are also included. DIGTEM features an implicit integration scheme for integrating stiff systems and trims the model equations to match a prescribed design point by calculating correction coefficients that balance out the dynamic equations. It uses the same coefficients at off-design points and iterates to a balanced engine condition. Transients are generated by defining the engine inputs as functions of time in a user-written subroutine (TMRSP). Closed-loop controls can also be simulated. DIGTEM is generalized in the aerothermodynamic treatment of components. This feature, along with DIGTEM's trimming at a design point, makes it a very useful tool for developing a model of a specific turbofan engine.
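DIGTEM's choice of an implicit integration scheme matters because stiff systems destabilize explicit integrators at practical step sizes. A minimal sketch on a generic stiff model problem (not DIGTEM's engine equations):

```python
import math

# Stiff model problem: dy/dt = lam * (cos(t) - y), lam = 1000
# (illustrative stand-in for stiff engine dynamics)
lam, h, t_end = 1000.0, 0.1, 5.0
n = int(t_end / h)

# Implicit (backward) Euler: y_{k+1} = (y_k + h*lam*cos(t_{k+1})) / (1 + h*lam)
# is unconditionally stable and tracks the slow solution y ~ cos(t)
y_imp = 0.0
for k in range(n):
    t_next = (k + 1) * h
    y_imp = (y_imp + h * lam * math.cos(t_next)) / (1.0 + h * lam)

# Explicit Euler at the same step size is wildly unstable: the error is
# amplified by |1 - h*lam| = 99 every step
y_exp = 0.0
for k in range(n):
    y_exp = y_exp + h * lam * (math.cos(k * h) - y_exp)

print(y_imp, math.cos(t_end), abs(y_exp))
```

With the implicit update, the solver can take steps sized to the slow transient of interest rather than to the fastest time constant in the system, which is what makes real-time simulation of stiff engine models feasible.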

  11. A generalized computer code for developing dynamic gas turbine engine models (DIGTEM)

    NASA Technical Reports Server (NTRS)

    Daniele, C. J.

    1983-01-01

This paper describes DIGTEM (digital turbofan engine model), a computer program that simulates two-spool, two-stream (turbofan) engines. DIGTEM was developed to support the development of a real-time, multiprocessor-based engine simulator being designed at the Lewis Research Center. The turbofan engine model in DIGTEM contains steady-state performance maps for all the components and has control volumes where continuity and energy balances are maintained. Rotor dynamics and duct momentum dynamics are also included. DIGTEM features an implicit integration scheme for integrating stiff systems and trims the model equations to match a prescribed design point by calculating correction coefficients that balance out the dynamic equations. It uses the same coefficients at off-design points and iterates to a balanced engine condition. Transients are generated by defining the engine inputs as functions of time in a user-written subroutine (TMRSP). Closed-loop controls can also be simulated. DIGTEM is generalized in the aerothermodynamic treatment of components. This feature, along with DIGTEM's trimming at a design point, makes it a very useful tool for developing a model of a specific turbofan engine.

  12. Simulation of multistage turbine flows

    NASA Technical Reports Server (NTRS)

    Adamczyk, John J.; Mulac, Richard A.

    1987-01-01

A flow model has been developed for analyzing multistage turbomachinery flows. This model, referred to as the average-passage flow model, describes the time-averaged flow field within a typical passage of a blade row embedded in a multistage configuration. Computer resource requirements, supporting empirical modeling, formulation, code development, and multitasking and storage are discussed. Illustrations from simulations of the space shuttle main engine (SSME) fuel turbine performed to date are given.

  13. Fault Diagnostics and Prognostics for Large Segmented SRMs

    NASA Technical Reports Server (NTRS)

    Luchinsky, Dmitry; Osipov, Viatcheslav V.; Smelyanskiy, Vadim N.; Timucin, Dogan A.; Uckun, Serdar; Hayashida, Ben; Watson, Michael; McMillin, Joshua; Shook, David; Johnson, Mont

    2009-01-01

We report progress in the development of a fault diagnostic and prognostic (FD&P) system for large segmented solid rocket motors (SRMs). The model includes the following main components: (i) a 1D dynamical model of the internal ballistics of SRMs; (ii) a surface regression model for the propellant that takes erosive burning into account; (iii) a model of the propellant geometry; (iv) a model of nozzle ablation; and (v) a model of a hole burning through the SRM steel case. The model is verified by comparing spatially resolved time traces of the flow parameters obtained in simulations with results from a high-fidelity 2D FLUENT model (developed by a third party). To develop the FD&P system for a case-breach fault in a large segmented rocket, we note [1] that the stationary zero-dimensional approximation for the nozzle stagnation pressure is surprisingly accurate even when the stagnation pressure varies significantly in time during burning tail-off. This was also found to be true for the case-breach fault [2]. These results allow us to use the FD&P developed in our earlier research [3]-[6] by substituting the nozzle stagnation pressure for the head stagnation pressure. The axial corrections to the value of the side thrust due to mass addition are taken into account by solving a system of ODEs in the spatial dimension.
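The zero-dimensional approximation mentioned above follows from standard SRM internal ballistics: equilibrium chamber pressure is where propellant mass generation (Saint-Robert burn law) balances choked nozzle outflow. A sketch with illustrative numbers (not RSRM or mission data):

```python
# Propellant mass generation rho_p * A_b * a * p^n balances nozzle mass flow
# p * A_t / c*.  All parameter values below are illustrative placeholders.
rho_p = 1800.0         # propellant density, kg/m^3
a, n = 3.6e-5, 0.35    # burn-rate law r = a * p^n  (m/s with p in Pa)
c_star = 1550.0        # characteristic velocity, m/s
A_b, A_t = 200.0, 0.8  # burning area and throat area, m^2

# Setting generation = outflow gives p^(1-n) = rho_p * a * c_star * A_b / A_t
p_eq = (rho_p * a * c_star * A_b / A_t) ** (1.0 / (1.0 - n))

# Check that the mass balance closes at the computed pressure
m_gen = rho_p * A_b * a * p_eq ** n
m_out = p_eq * A_t / c_star
print(f"p_eq = {p_eq/1e6:.2f} MPa, m_gen = {m_gen:.0f} kg/s, m_out = {m_out:.0f} kg/s")
```

In a case-breach fault, an extra outflow term through the hole enters the same balance, which is why the quasi-steady pressure relation remains the backbone of the FD&P estimates.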

  14. Space-time modeling of timber prices

    Treesearch

    Mo Zhou; Joseph Buongiorno

    2006-01-01

    A space-time econometric model was developed for pine sawtimber timber prices of 21 geographically contiguous regions in the southern United States. The correlations between prices in neighboring regions helped predict future prices. The impulse response analysis showed that although southern pine sawtimber markets were not globally integrated, local supply and demand...

  15. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for dynamic model operation was planned. The systems engineering approach was used to perform a top-down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object-Oriented Analysis, were identified.

  16. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.

    1991-01-01

A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for dynamic model operation was planned. The systems engineering approach was used to perform a top-down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object-Oriented Analysis, were identified.

  17. Mesospheric Water Vapor Retrieved from SABER/TIMED Measurements

    NASA Technical Reports Server (NTRS)

    Feofilov, Artem G.; Yankovsky, Valentine A.; Marshall, Benjamin T.; Russell, J. M., III; Pesnell, W. D.; Kutepov, Alexander A.; Goldberg, Richard A.; Gordley, Larry L.; Petelina, Svetlana; Manuilova, Rada O.

    2007-01-01

The SABER instrument on board the TIMED satellite is a limb-scanning infrared radiometer designed to measure temperature, minor-constituent vertical profiles, and energetics parameters in the mesosphere and lower thermosphere (MLT). The H2O concentrations are retrieved from 6.3 micron band radiances. Interpreting this radiance requires a non-LTE H2O model that includes energy exchange with the system of O3 and O2 vibrational levels populated in the daytime through a number of photoabsorption and photodissociation processes. We developed a research model based on an extended H2O non-LTE model of Manuilova coupled with the novel model of the electronic kinetics of the O2 and O3 photolysis products suggested by Yankovsky and Manuilova. The study of this model helped us develop and test an optimized operational model for interpreting SABER 6.3 micron band radiances. The sensitivity of the retrievals to the parameters of the model is discussed, and the H2O retrievals are compared to other measurements for different seasons and locations.

  18. PSO-Assisted Development of New Transferable Coarse-Grained Water Models.

    PubMed

    Bejagam, Karteek K; Singh, Samrendra; An, Yaxin; Berry, Carter; Deshmukh, Sanket A

    2018-02-15

We employed a two-to-one mapping scheme to develop three coarse-grained (CG) water models, namely, 1-, 2-, and 3-site CG models. Here, for the first time, particle swarm optimization (PSO) and gradient descent methods were coupled to optimize the force-field parameters of the CG models to reproduce the density, self-diffusion coefficient, and dielectric constant of real water at 300 K. CG MD simulations of these new models, conducted with various timesteps, for different system sizes, and at a range of temperatures, predict the density, self-diffusion coefficient, dielectric constant, surface tension, heat of vaporization, hydration free energy, and isothermal compressibility of real water with excellent accuracy. The 1-site model is ∼3 and ∼4.5 times computationally more efficient than the 2- and 3-site models, respectively. To combine the speed of the 1-site model with the electrostatic interactions offered by the 2- and 3-site models, CG MD simulations of 1:1 mixtures of the 1- and 2-/3-site models were performed at 300 K. These mixture simulations also predicted the properties of real water with good accuracy. Two new CG models of benzene, consisting of beads with and without partial charges, were developed. All three water models showed good capacity to solvate these benzene models.
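A generic PSO loop of the kind used to calibrate CG force-field parameters can be sketched in a few lines. Here the expensive step, running a CG-MD simulation per candidate and comparing its density and diffusion to experiment, is replaced by a toy quadratic objective with an invented optimum, so everything below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy surrogate for "run CG-MD and score against target properties":
# a smooth bowl whose minimum is a hypothetical optimal (epsilon, sigma) pair
TARGET = np.array([0.65, 3.17])

def objective(p):
    return np.sum((p - TARGET) ** 2, axis=-1)

def pso(n_particles=30, n_iters=150, w=0.7, c1=1.5, c2=1.5,
        lo=np.array([0.1, 2.0]), hi=np.array([1.5, 4.0])):
    x = rng.uniform(lo, hi, size=(n_particles, 2))   # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest, pbest_f = x.copy(), objective(x)          # personal bests
    g = pbest[np.argmin(pbest_f)].copy()             # global best
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                   # keep inside bounds
        f = objective(x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g

best = pso()
print(best)
```

In the paper's workflow, an `objective` like this would launch an MD run per particle, and the PSO result would then be refined by gradient descent.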

  19. An approach to a real-time distribution system

    NASA Technical Reports Server (NTRS)

    Kittle, Frank P., Jr.; Paddock, Eddie J.; Pocklington, Tony; Wang, Lui

    1990-01-01

    The requirements of a real-time data distribution system are to provide fast, reliable delivery of data from source to destination with little or no impact to the data source. In this particular case, the data sources are inside an operational environment, the Mission Control Center (MCC), and any workstation receiving data directly from the operational computer must conform to the software standards of the MCC. In order to supply data to development workstations outside of the MCC, it is necessary to use gateway computers that prevent unauthorized data transfer back to the operational computers. Many software programs produced on the development workstations are targeted for real-time operation. Therefore, these programs must migrate from the development workstation to the operational workstation. It is yet another requirement for the Data Distribution System to ensure smooth transition of the data interfaces for the application developers. A standard data interface model has already been set up for the operational environment, so the interface between the distribution system and the application software was developed to match that model as closely as possible. The system as a whole therefore allows the rapid development of real-time applications without impacting the data sources. In summary, this approach to a real-time data distribution system provides development users outside of the MCC with an interface to MCC real-time data sources. In addition, the data interface was developed with a flexible and portable software design. This design allows for the smooth transition of new real-time applications to the MCC operational environment.

  20. A residence-time-based transport approach for the groundwater pathway in performance assessment models

    NASA Astrophysics Data System (ADS)

    Robinson, Bruce A.; Chu, Shaoping

    2013-03-01

    This paper presents the theoretical development and numerical implementation of a new modeling approach for representing the groundwater pathway in risk assessment or performance assessment models of contaminant transport systems. The model developed in the present study, called the Residence Time Distribution (RTD) Mixing Model (RTDMM), allows an arbitrary distribution of fluid travel times to be represented, capturing the effects on the breakthrough curve of flow processes such as channelized flow, fast pathways, and complex three-dimensional dispersion. Mathematical methods for constructing the model for a given RTD are derived directly from the theory of residence time distributions in flowing systems. A simple mixing model is presented, along with the basic equations required to reproduce an arbitrary RTD with the model. The practical advantages of the RTDMM include easy incorporation into a multi-realization probabilistic simulation; a computational burden no more onerous than that of a one-dimensional model with the same number of grid cells; and straightforward implementation in available flow and transport modeling codes, enabling one to utilize the advanced transport features of those codes. For example, in this study we incorporated diffusion into the stagnant fluid in the rock matrix away from the flowing fractures, using a generalized dual-porosity model formulation. A suite of example calculations showed the utility of the RTDMM for the case of a radioactive decay chain with dual-porosity transport and sorption.
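
The central idea of an RTD-based transport model, that for a linear flow system the outlet breakthrough curve is the convolution of the inlet signal with the residence time distribution E(t), can be sketched numerically. None of this code comes from the paper: the tanks-in-series (gamma) RTD below is an assumed stand-in for an arbitrary measured RTD.

```python
import numpy as np
from math import factorial

def outlet_concentration(c_in, rtd, dt):
    """Outlet response of a flow system: convolution of the inlet signal
    with the (discretized) residence time distribution E(t)."""
    e = rtd / (rtd.sum() * dt)              # normalize so integral of E is 1
    return np.convolve(c_in, e)[: len(c_in)] * dt

def tanks_in_series_rtd(t, n, tau):
    """Gamma-shaped RTD of n equal mixed cells with mean residence time tau,
    used here as a simple stand-in for an arbitrary RTD."""
    return (t ** (n - 1) / (factorial(n - 1) * (tau / n) ** n)) * np.exp(-n * t / tau)

dt = 0.05
t = np.arange(0, 40, dt)
rtd = tanks_in_series_rtd(t, n=4, tau=5.0)

# Unit pulse injected at t = 0: the outlet response is the RTD itself,
# and its first moment recovers the mean residence time tau.
pulse = np.zeros_like(t)
pulse[0] = 1.0 / dt
bt = outlet_concentration(pulse, rtd, dt)
mean_t = np.sum(t * bt) * dt
```

The mixing-cell construction in the paper generalizes this: by choosing cell volumes and connections, an arbitrary E(t) can be reproduced rather than only the gamma family.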

  1. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  2. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  3. Developing a predictive tropospheric ozone model for Tabriz

    NASA Astrophysics Data System (ADS)

    Khatibi, Rahman; Naghipour, Leila; Ghorbani, Mohammad A.; Smith, Michael S.; Karimi, Vahid; Farhoudi, Reza; Delafrouz, Hadi; Arvanaghi, Hadi

    2013-04-01

    Predictive ozone models are becoming indispensable tools, providing pollution alerts for people who are vulnerable to the associated risks. We have developed a tropospheric ozone prediction capability for Tabriz, Iran, using five modeling strategies: three regression-type methods, Multiple Linear Regression (MLR), Artificial Neural Networks (ANNs), and Gene Expression Programming (GEP); and two auto-regression-type models, Nonlinear Local Prediction (NLP), which implements chaos theory, and Auto-Regressive Integrated Moving Average (ARIMA) models. The regression-type strategies explain the data in terms of temperature, solar radiation, dew point temperature, and wind speed, whereas the auto-regression-type models regress present ozone values on their past values. The ozone time series are available at various time intervals, including hourly intervals, from August 2010 to March 2011. The results for the MLR, ANN, and GEP models are not especially good, but those produced by NLP and ARIMA are promising for establishing a forecasting capability.
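
As a hedged illustration of the auto-regression-type strategy, the sketch below fits a pure AR(p) model, the autoregressive core of ARIMA-type models, by least squares on a synthetic series; the data, the order, and the coefficients are assumptions, not the Tabriz ozone data.

```python
import numpy as np

def fit_ar(x, p):
    """Fit an AR(p) model x[t] = c + sum_k phi_k * x[t-k] by least squares."""
    # Column k holds the lag-(k+1) values aligned with the targets x[p:].
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [c, phi_1, ..., phi_p]

def forecast_one(x, coef):
    """One-step-ahead forecast from the most recent observations."""
    p = len(coef) - 1
    return coef[0] + sum(coef[k + 1] * x[-k - 1] for k in range(p))

# Synthetic hourly series generated by an AR(1) process (illustrative only).
rng = np.random.default_rng(0)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal(0, 1)

coef = fit_ar(x, p=1)          # estimated phi_1 should be close to 0.8
```

A full ARIMA model adds differencing and a moving-average part on top of this autoregressive core; libraries such as statsmodels automate that estimation.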

  4. Construction schedules slack time minimizing

    NASA Astrophysics Data System (ADS)

    Krzemiński, Michał

    2017-07-01

    The article presents two original models for minimizing the downtime of work brigades. The models have been developed for construction schedules executed using the uniform work method. The application of flow-shop models is possible and useful for the construction of large objects that can be divided into plots. The article also presents a condition indicating which model should be used, as well as a brief example of schedule optimization. The optimization results confirm the value of the work on the newly developed models.

  5. Risk factors for the development of heterotopic ossification in seriously burned adults: A National Institute on Disability, Independent Living and Rehabilitation Research burn model system database analysis.

    PubMed

    Levi, Benjamin; Jayakumar, Prakash; Giladi, Avi; Jupiter, Jesse B; Ring, David C; Kowalske, Karen; Gibran, Nicole S; Herndon, David; Schneider, Jeffrey C; Ryan, Colleen M

    2015-11-01

    Heterotopic ossification (HO) is a debilitating complication of burn injury; however, incidence and risk factors are poorly understood. In this study, we use a multicenter database of adults with burn injuries to identify and analyze clinical factors that predict HO formation. Data from six high-volume burn centers, in the Burn Injury Model System Database, were analyzed. Univariate logistic regression models were used for model selection. Cluster-adjusted multivariate logistic regression was then used to evaluate the relationship between clinical and demographic data and the development of HO. Of 2,979 patients in the database with information on HO that addressed risk factors for development of HO, 98 (3.5%) developed HO. Of these 98 patients, 97 had arm burns, and 96 had arm grafts. When controlling for age and sex in a multivariate model, patients with greater than 30% total body surface area burn had 11.5 times higher odds of developing HO (p < 0.001), and those with arm burns that required skin grafting had 96.4 times higher odds of developing HO (p = 0.04). For each additional time a patient went to the operating room, odds of HO increased by 30% (odds ratio, 1.32; p < 0.001), and each additional ventilator day increased odds by 3.5% (odds ratio, 1.035; p < 0.001). Joint contracture, inhalation injury, and bone exposure did not significantly increase odds of HO. Risk factors for HO development include greater than 30% total body surface area burn, arm burns, arm grafts, ventilator days, and number of trips to the operating room. Future studies can use these results to identify highest-risk patients to guide deployment of prophylactic and experimental treatments. Prognostic study, level III.

  6. [Projection of prisoner numbers].

    PubMed

    Metz, Rainer; Sohn, Werner

    2015-01-01

    The past and future development of occupancy rates in prisons is of crucial importance for the judicial administration of every country. Basic factors for planning the required penal facilities are seasonal fluctuations; minimum, maximum, and average occupancy; and the present situation and potential development of certain imprisonment categories. As the number of prisoners in a country is determined by a complex set of interdependent conditions, it has proved difficult to provide any theoretical explanation. The idea, long accepted in criminology, that prisoner numbers are interdependent with criminal policy must be regarded as having failed. Statistical and time-series analyses may help, however, to identify the factors that have influenced the development of prisoner numbers in the past. The analyses presented here first describe such influencing factors from a criminological perspective and then deal with their statistical identification and modelling. Using the development of prisoner numbers in Hesse as an example, it was found that modelling methods in which the independent variables predict the dependent variable with a time lag are particularly helpful. A potential complication, however, is that the different dynamics of German and foreign prisoner numbers require the development of further models for prediction.

  7. Development, testing, and applications of site-specific tsunami inundation models for real-time forecasting

    NASA Astrophysics Data System (ADS)

    Tang, L.; Titov, V. V.; Chamberlin, C. D.

    2009-12-01

    The study describes the development, testing, and applications of site-specific tsunami inundation models (forecast models) for use in NOAA's tsunami forecast and warning system. The model development process includes sensitivity studies of tsunami wave characteristics in the nearshore and inundation zones for a range of model grid setups, resolutions, and parameters. To demonstrate the process, four forecast models in Hawaii, at Hilo, Kahului, Honolulu, and Nawiliwili, are described. The models were validated with fourteen historical tsunamis and compared with numerical results from reference inundation models of higher resolution. The accuracy of the modeled maximum wave height is greater than 80% when the observation is greater than 0.5 m; when the observation is below 0.5 m, the error is less than 0.3 m. The error of the modeled arrival time of the first peak is within 3% of the travel time. The developed forecast models were further applied to hazard assessment from simulated magnitude 7.5, 8.2, 8.7, and 9.3 tsunamis based on subduction zone earthquakes in the Pacific. The tsunami hazard assessment indicates that use of a seismic magnitude alone for tsunami source assessment is inadequate to achieve such accuracy in tsunami amplitude forecasts. The forecast models apply local bathymetric and topographic information, and utilize dynamic boundary conditions from the tsunami source function database, to provide site- and event-specific coastal predictions. Only by combining a Deep-ocean Assessment and Reporting of Tsunamis (DART)-constrained tsunami magnitude with site-specific high-resolution models can the forecasts completely cover the evolution of earthquake-generated tsunami waves: generation, deep-ocean propagation, and coastal inundation. Wavelet analysis of the tsunami waves suggests that the coastal tsunami frequency responses at different sites are dominated by the local bathymetry, yet they can be partially related to the locations of the tsunami sources. The study also demonstrates the nonlinearity between offshore and nearshore maximum wave amplitudes.

  8. The Use of Piecewise Growth Models to Estimate Learning Trajectories and RtI Instructional Effects in a Comparative Interrupted Time-Series Design

    ERIC Educational Resources Information Center

    Zvoch, Keith

    2016-01-01

    Piecewise growth models (PGMs) were used to estimate and model changes in the preliteracy skill development of kindergartners in a moderately sized school district in the Pacific Northwest. PGMs were applied to interrupted time-series (ITS) data that arose within the context of a response-to-intervention (RtI) instructional framework. During the…

  9. A time-dependent diffusion convection model for the long term modulation of cosmic rays

    NASA Technical Reports Server (NTRS)

    Gallagher, J. J.

    1974-01-01

    A model is developed which incorporates to first order the direct effects of the time dependent diffusive propagation of interstellar cosmic rays in a slowly changing interplanetary medium. The model provides a physical explanation for observed rigidity-dependent phase lags in modulated spectra (cosmic ray hysteresis). The average distance to the modulating boundary during the last solar cycle is estimated.

  10. A review and assessment of land-use change models: dynamics of space, time, and human choice

    Treesearch

    Chetan Agarwal; Glen M. Green; J. Morgan Grove; Tom P. Evans; Charles M. Schweik

    2002-01-01

    A review of different types of land-use change models incorporating human processes. Presents a framework to compare land-use change models in terms of scale (both spatial and temporal) and complexity, and how well they incorporate space, time, and human decisionmaking. Examines a summary set of 250 relevant citations and develops a bibliography of 136 papers. From...

  11. Logic models as a tool for sexual violence prevention program development.

    PubMed

    Hawkins, Stephanie R; Clinton-Sherrod, A Monique; Irvin, Neil; Hart, Laurie; Russell, Sarah Jane

    2009-01-01

    Sexual violence is a growing public health problem, and there is an urgent need to develop sexual violence prevention programs. Logic models have emerged as a vital tool in program development. The Centers for Disease Control and Prevention funded an empowerment evaluation designed to work with programs focused on the prevention of first-time male perpetration of sexual violence, and it included as one of its goals, the development of program logic models. Two case studies are presented that describe how significant positive changes can be made to programs as a result of their developing logic models that accurately describe desired outcomes. The first case study describes how the logic model development process made an organization aware of the importance of a program's environmental context for program success; the second case study demonstrates how developing a program logic model can elucidate gaps in organizational programming and suggest ways to close those gaps.

  12. High performance real-time flight simulation at NASA Langley

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1994-01-01

    In order to meet the stringent time-critical requirements of real-time man-in-the-loop flight simulation, computer processing operations must be deterministic and be completed in as short a time as possible. This includes simulation mathematical model computations and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, personnel at NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to a standard input/output system to provide high-bandwidth, low-latency data acquisition and distribution. The Computer Automated Measurement and Control (CAMAC) technology (IEEE standard 595) was extended to meet the performance requirements for real-time simulation. This technology extension increased the effective bandwidth by a factor of ten and increased the performance of modules necessary for simulator communications. The technology is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Commercial applications of this technology include nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC have also completed development of the use of supercomputers for simulation mathematical model computations to support real-time flight simulation, including the development of a real-time operating system and of specialized software and hardware for the CAMAC simulator network. This work, coupled with the use of an open-systems software architecture, has advanced the state of the art in real-time flight simulation. The data acquisition technology innovation and experience with recent developments in this technology are described.

  13. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) predict the total costs of satellite projects. An effort was conducted to extend the modeling capabilities from total budget analysis to analysis of total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model describing the historical spending patterns. The raw data consisted of dollars spent in each specific year and their 1989-dollar equivalents. These data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCMs). The data were analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollars/time space to a percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
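
The transformation to percentage-of-budget/time space and the Weibull fit can be sketched as follows. This is an illustrative reconstruction under assumed data: the synthetic spending profile and the grid-search fitting routine stand in for the nonlinear optimization and regression actually used in the study.

```python
import numpy as np

def weibull_cdf(t, lam, k):
    """Weibull CDF: fraction of total budget spent by normalized time t."""
    return 1.0 - np.exp(-((t / lam) ** k))

def fit_weibull(t, frac_spent):
    """Least-squares fit of a Weibull CDF to cumulative spend fractions via
    a coarse grid search (a simple stand-in for nonlinear regression)."""
    best = (None, None, np.inf)
    for lam in np.linspace(0.2, 1.5, 131):      # scale parameter candidates
        for k in np.linspace(0.5, 5.0, 91):     # shape parameter candidates
            err = np.sum((weibull_cdf(t, lam, k) - frac_spent) ** 2)
            if err < best[2]:
                best = (lam, k, err)
    return best

# Synthetic cumulative-spend profile: fraction of total budget versus
# fraction of total schedule (assumed data, not the GSFC project data).
t = np.linspace(0.05, 1.0, 20)
observed = weibull_cdf(t, 0.6, 2.5)
lam, k, err = fit_weibull(t, observed)
```

Working in fraction-of-budget versus fraction-of-schedule space is what makes the probability-space interpretation possible: the fitted CDF can then be inverted to read off the schedule time at which a given spending percentage is reached.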

  14. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
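
A compact wavelet representation of a parameter trend, of the kind stored in the relational database above, can be sketched with a plain Haar transform; the paper does not specify its wavelet basis, so Haar here is an assumption chosen for simplicity.

```python
import numpy as np

def haar_dwt(signal, levels):
    """Haar wavelet decomposition. Returns (approximation, details-per-level).
    The coarse approximation coefficients form a compact trend descriptor."""
    a = np.asarray(signal, dtype=float)
    details = []
    for _ in range(levels):
        even, odd = a[0::2], a[1::2]
        details.append((even - odd) / np.sqrt(2))   # detail coefficients
        a = (even + odd) / np.sqrt(2)               # approximation coefficients
    return a, details

def haar_idwt(a, details):
    """Exact inverse of haar_dwt (perfect reconstruction)."""
    for d in reversed(details):
        even = (a + d) / np.sqrt(2)
        odd = (a - d) / np.sqrt(2)
        a = np.empty(2 * len(a))
        a[0::2], a[1::2] = even, odd
    return a

# A smooth trend-like signal, e.g. a vital-sign trend sampled 64 times.
x = np.sin(np.linspace(0, 6.28, 64)) + 0.01 * np.arange(64)
approx, det = haar_dwt(x, levels=3)       # 8 coarse coefficients remain
recon = haar_idwt(approx, det)
```

Storing only the 8 coarse coefficients (plus selected details) in an indexed table is what allows multi-scale similarity queries without reloading the full time series, in the spirit of the framework described above.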

  15. A GIS-based time-dependent seismic source modeling of Northern Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear (fault) sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 kilometers around Tehran. Previous research and reports were studied to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to a uniform magnitude scale; duplicate events and dependent shocks are removed. The completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used for time-dependent probabilistic seismic hazard analysis of Northern Iran.
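
For the time-independent (Poissonian) area sources, a standard way to estimate the Gutenberg-Richter b-value and event-exceedance probabilities is sketched below; the Aki maximum-likelihood estimator and the synthetic catalog are illustrative choices, not taken from the paper.

```python
import math
import random

def b_value_aki(mags, m_c):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b-value from magnitudes at or above the completeness magnitude m_c."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_c)

def poisson_exceedance(rate_per_year, years):
    """Probability of at least one event in a time window (Poisson model)."""
    return 1.0 - math.exp(-rate_per_year * years)

# Synthetic catalog: magnitudes exponentially distributed above Mc = 4.0.
# An exponential rate of beta corresponds to b = beta / ln(10), so beta
# below is chosen to give a true b-value of 1.0.
rng = random.Random(42)
beta = 1.0 * math.log(10)
mags = [4.0 + rng.expovariate(beta) for _ in range(5000)]
b = b_value_aki(mags, 4.0)

# Example hazard question: chance of at least one M >= 7 event in 50 years
# for an assumed annual rate of 0.01.
p50 = poisson_exceedance(0.01, 50)
```

The time-dependent fault sources in the study would replace the constant Poisson rate with a renewal model whose hazard depends on the time since the last characteristic earthquake.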

  16. Computational problems in autoregressive moving average (ARMA) models

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Goodarzi, S. M.; Oneill, W. D.; Gottlieb, G. L.

    1981-01-01

    The choice of the sampling interval and the selection of the order of the model in time series analysis are considered. Band limited (up to 15 Hz) random torque perturbations are applied to the human ankle joint. The applied torque input, the angular rotation output, and the electromyographic activity using surface electrodes from the extensor and flexor muscles of the ankle joint are recorded. Autoregressive moving average models are developed. A parameter constraining technique is applied to develop more reliable models. The asymptotic behavior of the system must be taken into account during parameter optimization to develop predictive models.
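
Order selection for autoregressive models, one of the choices this study considers, is commonly done by minimizing an information criterion; the sketch below uses AIC over least-squares AR fits on a synthetic series, an assumed and simplified stand-in for full ARMA estimation with parameter constraints.

```python
import numpy as np

def ar_aic(x, p):
    """Fit AR(p) by least squares and return its AIC (Gaussian likelihood)."""
    # Column k holds the lag-(k+1) values aligned with the targets x[p:].
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    sigma2 = np.mean(resid ** 2)
    return len(y) * np.log(sigma2) + 2 * (p + 1)

def select_order(x, p_max=8):
    """Pick the AR order minimizing AIC."""
    return min(range(1, p_max + 1), key=lambda p: ar_aic(x, p))

# Synthetic AR(2) series; the selector should favor an order near 2.
rng = np.random.default_rng(3)
n = 3000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

p_hat = select_order(x)
```

The same idea extends to the sampling-interval question: resampling the series at different intervals changes the residual variance and hence which order the criterion prefers.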

  17. Cost Modeling for Space Optical Telescope Assemblies

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2011-01-01

    Parametric cost models are used to plan missions, compare concepts, and justify technology investments. This paper reviews an ongoing effort to develop cost models for space telescopes. It summarizes the methodology used to develop the cost models and documents how changes to the database have changed previously published preliminary cost models. While the cost models are evolving, the previously published findings remain valid: it costs less per square meter of collecting aperture to build a large telescope than a small telescope; technology development as a function of time reduces cost; and lower-areal-density telescopes cost more than more massive telescopes.

  18. Modeling and simulation of sexual activity daily diary data of patients with female sexual arousal disorder treated with sildenafil citrate (Viagra).

    PubMed

    Claret, Laurent; Cox, Eugene H; McFadyen, Lynn; Pidgen, Alwyn; Johnson, Patrick J; Haughie, Scott; Boolell, Mitra; Bruno, Rene

    2006-08-01

    To develop a model to explore the dose-response of sildenafil citrate in patients with female sexual arousal disorder (FSAD) based on telephone sexual activity daily diary (TSADD) data obtained in double-blind, placebo controlled clinical studies. Data were available on 614 patients with FSAD. A parametric model (Weibull distribution) was developed to describe the probability density function of the time between sexual events. Orgasm satisfaction scores and overall sexual satisfaction scores were simultaneously modeled as ordered categorical variables. Simulations were performed to evaluate the expected clinical response in patients with FSAD. The expected time between sexual events was approximately 3.5 days. Satisfaction scores increased with time to achieve a plateau after 3 to 4 weeks on treatment. The expected probability of satisfying orgasm (score of 3 and higher) ranged from 34.7% for placebo to 41.6% for 100 mg sildenafil citrate. Treatment effect (difference from placebo) was 6.9% for 100 mg sildenafil citrate, ranging from 0.6 to 24.7% for testosterone levels of 0.1 to 4.0 pg/ml. The treatment effect in postmenopausal women was larger than in premenopausal women. A modeling and simulation framework to support drug development in FSAD was developed. Sildenafil citrate demonstrated a dose-dependent effect in patients with FSAD.

  19. On use of characteristic wavelengths of track irregularities to predict track portions with deteriorated wheel/rail forces

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Zhai, Wanming; Chen, Zhaowei

    2018-05-01

    The dynamic performance of railway vehicles and their guiding tracks is mainly governed by wheel-rail interactions, particularly in the presence of track irregularities. In this work, a unified model was developed to investigate the track portions subject to violent wheel/rail forces triggered by track irregularities at middle-low frequencies. In the modeling procedure, a time-frequency unification method combining the wavelet transform and the Wigner-Ville distribution for characterizing the time-frequency characteristics of track irregularities was coupled with a three-dimensional nonlinear model describing vehicle-track interaction signatures, based on which a method for predicting track portions subject to deteriorated wheel/rail forces was proposed. The theoretical models developed in this paper were comprehensively validated by numerical investigations. The significance of the present study mainly lies in offering a new path to establish correlation and realize mutual prediction between track irregularities and railway system dynamics.

  20. A model for methane production in sewers.

    PubMed

    Chaosakul, Thitirat; Koottatep, Thammarat; Polprasert, Chongrak

    2014-09-19

    Most sewers in developing countries are combined sewers which receive stormwater and effluent from septic tanks or cesspools of households and buildings. Although the wastewater strength in these sewers is usually lower than in developed countries, due to improper construction and maintenance the hydraulic retention time (HRT) can be relatively long, resulting in considerable greenhouse gas (GHG) production. This study proposed an empirical model to predict the quantity of methane production in gravity-flow sewers based on relevant parameters such as the surface-area-to-volume ratio (A/V) of the sewer, the hydraulic retention time (HRT), and the wastewater temperature. The model was developed from field survey data of gravity-flow sewers located in a peri-urban area of central Thailand and validated with field data from a sewer system in the Gold Coast area, Queensland, Australia. Application of this model to improve the construction and maintenance of gravity-flow sewers, minimize GHG production, and reduce global warming is presented.
