Science.gov

Sample records for accident prediction model

  1. Updating outdated predictive accident models.

    PubMed

    Wood, A G; Mountain, L J; Connors, R D; Maher, M J; Ropkins, K

    2013-06-01

    Reliable predictive accident models (PAMs) (also referred to as safety performance functions (SPFs)) are essential to design and maintain safe road networks; however, ongoing changes in road and vehicle design, coupled with road safety initiatives, mean that these models can quickly become dated. Unfortunately, because the fitting of sophisticated PAMs including a wide range of explanatory variables is not a trivial task, available models tend to be based on data collected many years ago and seem unlikely to give reliable estimates of current accidents. Large, expensive studies to produce new models are likely to be, at best, only a temporary solution. This paper thus seeks to develop a practical and efficient methodology to allow currently available PAMs to be updated to give unbiased estimates of accident frequencies at any point in time. Two principal issues are examined: the extent to which the temporal transferability of predictive accident models varies with model complexity; and the practicality and efficiency of two alternative updating strategies. The models used to illustrate these issues are the suites of models developed for rural dual and single carriageway roads in the UK. These are widely used in several software packages in spite of being based on data collected during the 1980s and early 1990s. It was found that increased model complexity by no means ensures better temporal transferability and that calibration of the models using a scale factor can be a practical alternative to fitting new models. PMID:23510788
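
    A minimal sketch of the scale-factor calibration idea discussed above (not the authors' exact procedure): predictions from an existing PAM are rescaled so that their total matches recently observed accident counts. All numbers and variable names below are illustrative.

      import numpy as np

      # Hypothetical example: old-PAM predictions and recent observed counts
      # for the same set of road sections (illustrative values only).
      pam_predictions = np.array([2.1, 0.8, 3.4, 1.2, 0.5])   # predicted accidents/year
      observed_counts = np.array([3, 1, 4, 1, 1])              # recently observed counts

      # Scale factor: ratio of total observed to total predicted accidents.
      scale_factor = observed_counts.sum() / pam_predictions.sum()

      # Recalibrated predictions intended to reflect current conditions.
      updated_predictions = scale_factor * pam_predictions
      print(f"scale factor = {scale_factor:.2f}")
      print("updated predictions:", np.round(updated_predictions, 2))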

  2. Temporal transferability and updating of zonal level accident prediction models.

    PubMed

    Hadayeghi, Alireza; Shalaby, Amer S; Persaud, Bhagwant N; Cheung, Carl

    2006-05-01

    This paper examines the temporal transferability of zonal accident prediction models by using appropriate evaluation measures of predictive performance to assess whether the relationship between the dependent and independent variables holds reasonably well across time. The two temporal contexts are the years 1996 and 2001, with updated 1996 models being used to predict 2001 accidents in each traffic zone of the City of Toronto. The paper examines alternative updating methods for temporal transfer by imagining that only a sample of 2001 data is available. The sensitivity of the performance of the updated models to the 2001 sample size is explored. The updating procedures examined include the Bayesian updating approach and the application of calibration factors to the 1996 models. Models calibrated for the 2001 samples were also explored, but were found to be inadequate. The results show that the models are not transferable in a strict statistical sense. However, relative measures of transferability indicate that the transferred models yield useful information in the application context. It is also concluded that the models updated using calibration factors produce better results for predicting the number of accidents in 2001 than those updated using the Bayesian approach. PMID:16414003

  3. Accident prediction models for roads with minor junctions.

    PubMed

    Mountain, L; Fawaz, B; Jarrett, D

    1996-11-01

    The purpose of this study was to develop and validate a method for predicting expected accidents on main roads with minor junctions where traffic counts on the minor approaches are not available. The study was based on data for some 3800 km of highway in the U.K. including more than 5000 minor junctions. The highways consisted of both single and dual-carriageway roads in urban and rural areas. Generalized linear modelling was used to develop regression estimates of expected accidents for six highway categories and an empirical Bayes procedure was used to improve these estimates by combining them with accident counts. Accidents on highway sections were shown to be a non-linear function of exposure and minor junction frequency. For the purposes of estimating expected accidents, while the regression model estimates were shown to be preferable to accident counts, the best results were obtained using the empirical Bayes method. The latter was the only method that produced unbiased estimates of expected accidents for high-risk sites. PMID:9006638
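
    A minimal sketch of the empirical Bayes combination described above, assuming a negative binomial regression model with overdispersion parameter k; the weighting formula is the standard one, but the values and names are illustrative rather than taken from the paper.

      def empirical_bayes(mu_regression, observed_count, k):
          """Combine a regression (model) estimate with an observed accident count.

          mu_regression  : expected accidents from the fitted regression model
          observed_count : accidents actually recorded at the site
          k              : negative binomial overdispersion parameter of the model
          """
          w = 1.0 / (1.0 + mu_regression / k)   # weight given to the regression estimate
          return w * mu_regression + (1.0 - w) * observed_count

      # Illustrative values only: a site predicted to have 2 accidents/period
      # but observed to have 6.
      print(empirical_bayes(mu_regression=2.0, observed_count=6, k=1.5))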

  4. Methodology for fitting and updating predictive accident models with trend.

    PubMed

    Connors, Richard D; Maher, Mike; Wood, Alan; Mountain, Linda; Ropkins, Karl

    2013-07-01

    Reliable predictive accident models (PAMs) (also referred to as Safety Performance Functions (SPFs)) have a variety of important uses in traffic safety research and practice. They are used to help identify sites in need of remedial treatment, in the design of transport schemes to assess safety implications, and to estimate the effectiveness of remedial treatments. The PAMs currently in use in the UK are now quite old; the data used in their development was gathered up to 30 years ago. Many changes have occurred over that period in road and vehicle design, in road safety campaigns and legislation, and the national accident rate has fallen substantially. It seems unlikely that these ageing models can be relied upon to provide accurate and reliable predictions of accident frequencies on the roads today. This paper addresses a number of methodological issues that arise in seeking practical and efficient ways to update PAMs, whether by re-calibration or by re-fitting. Models for accidents on rural single carriageway roads have been chosen to illustrate these issues, including the choice of distributional assumption for overdispersion, the choice of goodness of fit measures, questions of independence between observations in different years, and between links on the same scheme, the estimation of trends in the models, the uncertainty of predictions, as well as considerations about the most efficient and convenient ways to fit the required models. PMID:23612560
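
    A sketch of fitting a PAM with a linear time trend as a negative binomial GLM, assuming the statsmodels package; the covariates, the dispersion parameter, and the synthetic data are placeholders, not the model form actually used in the paper.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Synthetic link-year data: traffic flow, link length, and years since study start.
      n = 200
      df = pd.DataFrame({
          "flow":   rng.uniform(2000, 20000, n),   # AADT
          "length": rng.uniform(0.5, 5.0, n),      # km
          "year":   rng.integers(0, 10, n),
      })
      mu = np.exp(-6.0 + 0.8 * np.log(df["flow"]) + np.log(df["length"]) - 0.03 * df["year"])
      df["accidents"] = rng.poisson(mu)

      # Negative binomial PAM with log(flow), log(length) and a linear annual trend.
      X = sm.add_constant(np.column_stack([np.log(df["flow"]), np.log(df["length"]), df["year"]]))
      fit = sm.GLM(df["accidents"], X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
      print(fit.params)   # intercept, flow exponent, length exponent, annual trend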

  5. Accident prediction model for railway-highway interfaces.

    PubMed

    Oh, Jutaek; Washington, Simon P; Nam, Doohee

    2006-03-01

    Considerable past research has explored relationships between vehicle accidents and the geometric design and operation of road sections, but relatively little research has examined factors that contribute to accidents at railway-highway crossings. Between 1998 and 2002 in Korea, about 95% of railway accidents occurred at highway-rail grade crossings, resulting in 402 accidents, of which about 20% resulted in fatalities. These statistics suggest that efforts to reduce crashes at these locations may significantly reduce crash costs. The objective of this paper is to examine factors associated with railroad crossing crashes. Various statistical models are used to examine the relationships between crossing accidents and features of crossings. The paper also compares accident models developed in the United States with the safety effects of crossing elements obtained using Korean data. Crashes were observed to increase with total traffic volume and average daily train volumes. The proximity of crossings to commercial areas and the distance of the train detector from crossings are associated with larger numbers of accidents, as is the time duration between the activation of warning signals and gates. The unique contributions of the paper are the application of the gamma probability model to deal with underdispersion and the insights obtained regarding railroad crossing-related vehicle crashes. PMID:16297846

  6. Accident prediction model for public highway-rail grade crossings.

    PubMed

    Lu, Pan; Tolliver, Denver

    2016-05-01

    Considerable research has focused on roadway accident frequency analysis, but relatively little research has examined safety evaluation at highway-rail grade crossings. Highway-rail grade crossings are critical spatial locations of utmost importance for transportation safety because traffic crashes at highway-rail grade crossings are often catastrophic with serious consequences. The Poisson regression model has been employed to analyze vehicle accident frequency as a good starting point for many years. The most commonly applied variants of the Poisson model include the negative binomial and zero-inflated Poisson models. These models are used to deal with common crash data issues such as over-dispersion (sample variance is larger than the sample mean) and preponderance of zeros (low sample mean and small sample size). On rare occasions traffic crash data have been shown to be under-dispersed (sample variance is smaller than the sample mean), and traditional distributions such as the Poisson or negative binomial cannot handle under-dispersion well. The objective of this study is to investigate and compare various alternative highway-rail grade crossing accident frequency models that can handle the under-dispersion issue. The contributions of the paper are two-fold: (1) the application of probability models that deal with under-dispersion issues and (2) insights regarding vehicle crashes at public highway-rail grade crossings. PMID:26922288
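
    A quick dispersion diagnostic of the kind that motivates the modelling choice above, sketched with a Poisson GLM from statsmodels; the data are synthetic, and the under-dispersion models considered in the paper (such as the gamma count model) are not reproduced here.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      y = rng.binomial(n=3, p=0.3, size=500)   # under-dispersed counts (variance < mean)
      X = np.ones((len(y), 1))                 # intercept-only design matrix

      poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

      # Pearson chi-square divided by residual degrees of freedom:
      #   ~1 equidispersion, >1 over-dispersion, <1 under-dispersion.
      dispersion = poisson_fit.pearson_chi2 / poisson_fit.df_resid
      print(f"sample mean {y.mean():.2f}, sample variance {y.var():.2f}")
      print(f"dispersion statistic: {dispersion:.2f}")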

  7. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  8. Model predictions of wind and turbulence profiles associated with an ensemble of aircraft accidents

    NASA Technical Reports Server (NTRS)

    Williamson, G. G.; Lewellen, W. S.; Teske, M. E.

    1977-01-01

    The feasibility of predicting conditions under which wind/turbulence environments hazardous to aviation operations exist is studied by examining a number of different accidents in detail. A model of turbulent flow in the atmospheric boundary layer is used to reconstruct wind and turbulence profiles which may have existed at low altitudes at the time of the accidents. The predictions are consistent with available flight recorder data, but neither the input boundary conditions nor the flight recorder observations are sufficiently precise for these studies to be interpreted as verification tests of the model predictions.

  9. Combined Prediction Model of Death Toll for Road Traffic Accidents Based on Independent and Dependent Variables

    PubMed Central

    Zhong-xiang, Feng; Shi-sheng, Lu; Wei-hua, Zhang; Nan-nan, Zhang

    2014-01-01

    In order to build a combined model that captures the variation in road traffic accident death tolls, reflects the influence of multiple factors on traffic accidents, and improves prediction accuracy, a Verhulst model was built based on the number of road traffic accident deaths in China from 2002 to 2011, and car ownership, population, GDP, highway freight volume, highway passenger transportation volume, and highway mileage were chosen as the factors for a multivariate linear regression model of the death toll. The two models were then combined into a weighted prediction model. The Shapley value method was applied to calculate the weight coefficients by assessing each model's contribution. Finally, the combined model was used to recalculate the death tolls from 2002 to 2011, and the combined model was compared with the Verhulst and multivariate linear regression models. The results showed that the new model could not only characterize the death toll data but also quantify the degree of influence of each factor on the death toll, and it had high accuracy as well as strong practicability. PMID:25610454
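
    A simplified sketch of combining two forecasts with weights, using inverse-MSE weights as a stand-in for the paper's Shapley-value weight assignment; the series below are illustrative, not the Chinese death-toll data.

      import numpy as np

      # Hypothetical forecasts of annual road-accident deaths from the two single models.
      observed   = np.array([109381, 104372, 107077, 98738, 89455])
      verhulst   = np.array([111000, 105500, 106000, 99500, 91000])
      regression = np.array([108000, 103000, 108500, 97500, 88000])

      # Weights inversely proportional to each model's mean squared error.
      # (The paper instead derives the weights from Shapley values of error contributions.)
      mse = np.array([np.mean((observed - verhulst) ** 2),
                      np.mean((observed - regression) ** 2)])
      weights = (1.0 / mse) / (1.0 / mse).sum()

      combined = weights[0] * verhulst + weights[1] * regression
      print("weights:", np.round(weights, 3))
      print("combined fit:", np.round(combined).astype(int))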

  10. Application of Gray Markov SCGM(1,1)c Model to Prediction of Accidents Deaths in Coal Mining

    PubMed Central

    Lan, Jian-yi; Zhou, Ying

    2014-01-01

    The prediction of mine accidents is the basis of mine safety assessment and decision making. Gray prediction is suitable for system objects with few data, short time spans, and little fluctuation, while Markov chain theory is suitable for forecasting stochastically fluctuating dynamic processes. By analyzing the human error causes of coal mine accidents and combining the advantages of both gray prediction and Markov theory, an amended Gray Markov SCGM(1,1)c model is proposed. The gray SCGM(1,1)c model is applied to capture the development tendency of mine safety accidents, the amended model is adopted to improve prediction accuracy, and Markov prediction is used to predict the fluctuation around the tendency. Finally, the new model is fitted to mine safety accident deaths in China from 1990 to 2010, and coal accident deaths for 2011-2014 are predicted. The results show that the new model not only captures the trend of the human-error accident death toll but also overcomes the random fluctuation of the data affecting precision. It possesses strong potential for engineering application.
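
    A minimal sketch of a basic GM(1,1) grey prediction model of the kind underlying the approach above; the paper's amended SCGM(1,1)c model and Markov correction are not reproduced, and the death-toll series is illustrative.

      import numpy as np

      def gm11_forecast(x0, steps):
          """Fit a basic GM(1,1) grey model to series x0 and forecast `steps` values ahead."""
          x1 = np.cumsum(x0)                                  # accumulated generating series
          z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
          B = np.column_stack([-z1, np.ones(len(z1))])
          a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # development coefficient, grey input
          k = np.arange(len(x0) + steps)
          x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # time response function
          x0_hat = np.diff(x1_hat, prepend=x1_hat[0])         # back to the original scale
          return x0_hat[-steps:]

      deaths = np.array([7000.0, 6600.0, 6050.0, 5700.0, 5300.0, 4900.0])  # illustrative series
      print(np.round(gm11_forecast(deaths, steps=4)))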

  11. Compartment model for long-term contamination prediction in deciduous fruit trees after a nuclear accident

    SciTech Connect

    Antonopoulos-Domis, M.; Clouvas, A.; Gagianas, A.

    1990-06-01

    Radiocesium contamination from the Chernobyl accident of different parts (fruits, leaves, and shoots) of selected apricot trees in North Greece was systematically measured in 1987 and 1988. The results are presented and discussed in the framework of a simple compartment model describing the long-term contamination uptake mechanism of deciduous fruit trees after a nuclear accident.

  12. A combined M5P tree and hazard-based duration model for predicting urban freeway traffic accident durations.

    PubMed

    Lin, Lei; Wang, Qian; Sadek, Adel W

    2016-06-01

    The duration of freeway traffic accidents is an important factor, which affects traffic congestion, environmental pollution, and secondary accidents. Among previous studies, the M5P algorithm has been shown to be an effective tool for predicting incident duration. M5P builds a tree-based model, like the traditional classification and regression tree (CART) method, but with multiple linear regression models as its leaves. The problem with M5P for accident duration prediction, however, is that whereas linear regression assumes that the conditional distribution of accident durations is normal, the distribution of a "time-to-an-event" is almost certainly nonsymmetrical. A hazard-based duration model (HBDM) is a better choice for this kind of "time-to-event" modeling scenario, and given this, HBDMs have previously been applied to analyze and predict traffic accident durations. Previous research, however, has not yet applied HBDMs for accident duration prediction in association with clustering or classification of the dataset to minimize data heterogeneity. The current paper proposes a novel approach to accident duration prediction, which improves on the original M5P tree algorithm through the construction of an M5P-HBDM model, in which the leaves of the M5P tree model are HBDMs instead of linear regression models. Such a model offers the advantage of minimizing data heterogeneity through dataset classification, and avoids the need for the incorrect assumption of normality for traffic accident durations. The proposed model was then tested on two freeway accident datasets. For each dataset, the first 500 records were used to train the following three models: (1) an M5P tree; (2) an HBDM; and (3) the proposed M5P-HBDM; the remainder of the data was used for testing. The results show that the proposed M5P-HBDM managed to identify more significant and meaningful variables than either the M5P or the HBDM. Moreover, the M5P-HBDM had the lowest overall mean
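
    A sketch of a hazard-based (Weibull accelerated failure time) duration model of the kind used in the HBDM leaves, assuming the lifelines package is available; the covariates and data are placeholders, and the M5P tree partitioning step is not shown.

      import numpy as np
      import pandas as pd
      from lifelines import WeibullAFTFitter

      rng = np.random.default_rng(2)
      n = 300
      df = pd.DataFrame({
          "duration":      rng.weibull(1.5, n) * 60,   # accident duration in minutes
          "lanes_blocked": rng.integers(0, 3, n),
          "injury":        rng.integers(0, 2, n),
          "peak_hour":     rng.integers(0, 2, n),
          "cleared":       np.ones(n, dtype=int),      # 1 = duration fully observed
      })

      # All remaining columns are treated as covariates of the duration distribution.
      aft = WeibullAFTFitter()
      aft.fit(df, duration_col="duration", event_col="cleared")
      aft.print_summary()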

  13. Predicting road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and with this model the number of road accidents was predicted using a structural time series approach. The models were developed using a stepwise method, and the residuals of each step were analyzed. The accuracy of the models was assessed using the mean absolute percentage error (MAPE), and the best model was chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model is the best model to represent the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving on the conventional time series method.
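
    A sketch of fitting a local linear trend structural time series model with statsmodels; the annual accident counts below are synthetic, not the Malaysian series used in the paper.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      years = np.arange(1970, 2011)
      accidents = 50000 + 4000 * np.arange(len(years)) + rng.normal(0, 8000, len(years))

      # Local linear trend: both the level and the slope evolve stochastically over time.
      model = sm.tsa.UnobservedComponents(accidents, level="local linear trend")
      result = model.fit(disp=False)
      print(result.summary())
      print("5-year forecast:", np.round(result.forecast(5)))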

  14. Predictions of structural integrity of steam generator tubes under normal operating, accident, and severe accident conditions

    SciTech Connect

    Majumdar, S.

    1996-09-01

    Available models for predicting failure of flawed and unflawed steam generator tubes under normal operating, accident, and severe accident conditions are reviewed. Tests conducted in the past, though limited, tended to show that the earlier flow-stress model for part-through-wall axial cracks overestimated the damaging influence of deep cracks. This observation is confirmed by further tests at high temperatures as well as by finite element analysis. A modified correlation for deep cracks can correct this shortcoming of the model. Recent tests have shown that lateral restraint can significantly increase the failure pressure of tubes with unsymmetrical circumferential cracks. This observation is confirmed by finite element analysis. The rate-independent flow stress models that are successful at low temperatures cannot predict the rate sensitive failure behavior of steam generator tubes at high temperatures. Therefore, a creep rupture model for predicting failure is developed and validated by tests under varying temperature and pressure loading expected during severe accidents.

  15. Review of models applicable to accident aerosols

    SciTech Connect

    Glissmeyer, J.A.

    1983-07-01

    Estimations of potential airborne-particle releases are essential in safety assessments of nuclear-fuel facilities. This report is a review of aerosol behavior models that have potential applications for predicting aerosol characteristics in compartments containing accident-generated aerosol sources. Such characterization of the accident-generated aerosols is a necessary step toward estimating their eventual release in any accident scenario. Existing aerosol models can predict the size distribution, concentration, and composition of aerosols as they are acted on by ventilation, diffusion, gravity, coagulation, and other phenomena. Models developed in the fields of fluid mechanics, indoor air pollution, and nuclear-reactor accidents are reviewed with this nuclear fuel facility application in mind. The various capabilities of modeling aerosol behavior are tabulated and discussed, and recommendations are made for applying the models to problems of differing complexity.

  16. Investigation of adolescent accident predictive variables in hilly regions.

    PubMed

    Mohanty, Malaya; Gupta, Ankit

    2016-09-01

    The study aims to determine the significant personal and environmental factors in predicting adolescent accidents in hilly regions, taking into account two cities, Hamirpur and Dharamshala, which lie at elevations of 700-1000 metres above mean sea level (MSL). Detailed comparisons between the results for the two cities are also presented. The results are analyzed to provide a list of the most significant factors responsible for adolescent accidents. Data were collected from different schools and colleges of each city with the help of a questionnaire survey. Around 690 responses from Hamirpur and 460 responses from Dharamshala were taken for study and analysis. Standard deviations (SD) of the various factors affecting accidents were calculated; factors with very low SD were discarded, and the remaining variables were considered for correlation analysis. Correlations were assessed using Kendall's tau and chi-square tests, and the factors found to be significant were used for modelling. These were the victim's age, the character of the road, the speed of the vehicle, and the use of a helmet for Hamirpur; for Dharamshala, the kind of vehicle involved was an additional variable found responsible for adolescent accidents. A logistic regression was performed to determine the effect of each category within a variable on the occurrence of accidents. Though age and vehicle speed are considered important factors for accident occurrence according to Indian accident data records, the use of a helmet also emerges as a major concern. The 15-18 and 18-21 year age groups were found to be more susceptible to accidents than the higher age groups. Due to the hilly terrain, the character of the road becomes a major concern as a cause of accidents, and the topography of the area makes the kind of vehicle involved a major variable in determining the severity of accidents. PMID:26077876
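
    A sketch of the kind of logistic regression described above, with categorical predictors expressed through a statsmodels formula; the variable names and data are hypothetical, not the Hamirpur/Dharamshala survey data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      n = 600
      df = pd.DataFrame({
          "age_group":  rng.choice(["15-18", "18-21", "21-24"], n),
          "road_type":  rng.choice(["straight", "curved"], n),
          "speed_band": rng.choice(["low", "medium", "high"], n),
          "helmet":     rng.choice(["yes", "no"], n),
      })
      # Synthetic outcome: higher accident probability for fast, helmetless riders.
      p = 0.15 + 0.2 * (df["speed_band"] == "high") + 0.15 * (df["helmet"] == "no")
      df["accident"] = rng.binomial(1, p)

      fit = smf.logit("accident ~ C(age_group) + C(road_type) + C(speed_band) + C(helmet)",
                      data=df).fit(disp=False)
      print(fit.summary())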

  17. Do Cognitive Models Help in Predicting the Severity of Posttraumatic Stress Disorder, Phobia, and Depression after Motor Vehicle Accidents? A Prospective Longitudinal Study

    ERIC Educational Resources Information Center

    Ehring, Thomas; Ehlers, Anke; Glucksman, Edward

    2008-01-01

    The study investigated the power of theoretically derived cognitive variables to predict posttraumatic stress disorder (PTSD), travel phobia, and depression following injury in a motor vehicle accident (MVA). MVA survivors (N = 147) were assessed at the emergency department on the day of their accident and 2 weeks, 1 month, 3 months, and 6 months…

  18. Predictions of structural integrity of steam generator tubes under normal operating, accident, and severe accident conditions

    SciTech Connect

    Majumdar, S.

    1997-02-01

    Available models for predicting failure of flawed and unflawed steam generator tubes under normal operating, accident, and severe accident conditions are reviewed. Tests conducted in the past, though limited, tended to show that the earlier flow-stress model for part-through-wall axial cracks overestimated the damaging influence of deep cracks. This observation was confirmed by further tests at high temperatures, as well as by finite-element analysis. A modified correlation for deep cracks can correct this shortcoming of the model. Recent tests have shown that lateral restraint can significantly increase the failure pressure of tubes with unsymmetrical circumferential cracks. This observation was confirmed by finite-element analysis. The rate-independent flow stress models that are successful at low temperatures cannot predict the rate-sensitive failure behavior of steam generator tubes at high temperatures. Therefore, a creep rupture model for predicting failure was developed and validated by tests under various temperature and pressure loadings that can occur during postulated severe accidents.

  19. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.

  20. Development of a model to predict flow oscillations in low-flow sodium boiling. [Loss-of-Piping Integrity accidents]

    SciTech Connect

    Levin, A.E.; Griffith, P.

    1980-04-01

    Tests performed in a small scale water loop showed that voiding oscillations, similar to those observed in sodium, were present in water, as well. An analytical model, appropriate for either sodium or water, was developed and used to describe the water flow behavior. The experimental results indicate that water can be successfully employed as a sodium simulant, and further, that the condensation heat transfer coefficient varies significantly during the growth and collapse of vapor slugs during oscillations. It is this variation, combined with the temperature profile of the unheated zone above the heat source, which determines the oscillatory behavior of the system. The analytical program has produced a model which qualitatively does a good job in predicting the flow behavior in the wake experiment. The amplitude discrepancies are attributable to experimental uncertainties and model inadequacies. Several parameters (heat transfer coefficient, unheated zone temperature profile, mixing between hot and cold fluids during oscillations) are set by the user. Criteria for the comparison of water and sodium experiments have been developed.

  1. FASTGRASS: A mechanistic model for the prediction of Xe, I, Cs, Te, Ba, and Sr release from nuclear fuel under normal and severe-accident conditions

    SciTech Connect

    Rest, J.; Zawadzki, S.A.

    1992-09-01

    The primary physical/chemical models that form the basis of the FASTGRASS mechanistic computer model for calculating fission-product release from nuclear fuel are described. Calculated results are compared with test data and the major mechanisms affecting the transport of fission products during steady-state and accident conditions are identified.

  2. An exploration of the utility of mathematical modeling predicting fatigue from sleep/wake history and circadian phase applied in accident analysis and prevention: the crash of Comair Flight 5191.

    PubMed

    Pruchnicki, Shawn A; Wu, Lora J; Belenky, Gregory

    2011-05-01

    On 27 August 2006 at 0606 eastern daylight time (EDT) at Bluegrass Airport in Lexington, KY (LEX), the flight crew of Comair Flight 5191 inadvertently attempted to take off from a general aviation runway too short for their aircraft. The aircraft crashed, killing 49 of the 50 people on board. To better understand this accident and to aid in preventing similar accidents, we applied mathematical modeling predicting fatigue-related degradation in performance for the Air Traffic Controller on duty at the time of the crash. To provide the necessary input to the model, we attempted to estimate circadian phase and sleep/wake histories for the Captain, First Officer, and Air Traffic Controller. We were able to estimate with confidence the circadian phase for each. We were able to estimate with confidence the sleep/wake history for the Air Traffic Controller, but unable to do this for the Captain and First Officer. Using the sleep/wake history estimates for the Air Traffic Controller as input, the mathematical modeling predicted moderate fatigue-related performance degradation at the time of the crash. This prediction was supported by the presence of what appeared to be fatigue-related behaviors in the Air Traffic Controller during the 30 min prior to and in the minutes after the crash. Our modeling results do not definitively establish fatigue in the Air Traffic Controller as a cause of the accident; rather, they suggest that had he been less fatigued he might have detected Comair Flight 5191's lining up on the wrong runway. We were not able to perform a similar analysis for the Captain and First Officer because we were not able to estimate with confidence their sleep/wake histories. Our estimates of sleep/wake history and circadian rhythm phase for the Air Traffic Controller might generalize to other air traffic controllers and to flight crew operating in the early morning hours at LEX. Relative to other times of day, the modeling results suggest an elevated risk of fatigue

  3. Nuclear Facilities Fire Accident Model

    Energy Science and Technology Software Center (ESTSC)

    1999-09-01

    4. NATURE OF PROBLEM SOLVED FIRAC predicts fire-induced flows, thermal and material transport, and radioactive and nonradioactive source terms in a ventilation system. It is designed to predict the radioactive and nonradioactive source terms that lead to gas dynamic, material transport, and heat transfer transients. FIRAC's capabilities are directed toward nuclear fuel cycle facilities and the primary release pathway, the ventilation system. However, it is applicable to other facilities and can be used to model other airflow pathways within a structure. The basic material transport capability of FIRAC includes estimates of entrainment, convection, deposition, and filtration of material. The interrelated effects of filter plugging, heat transfer, and gas dynamics are also simulated. A ventilation system model includes elements such as filters, dampers, ducts, and blowers connected at nodal points to form networks. A zone-type compartment fire model is incorporated to simulate fire-induced transients within a facility. 5. METHOD OF SOLUTION FIRAC solves one-dimensional, lumped-parameter, compressible flow equations by an implicit numerical scheme. The lumped-parameter method is the basic formulation that describes the gas dynamics system. No spatial distribution of parameters is considered in this approach, but an effect of spatial distribution can be approximated by noding. Network theory, using the lumped-parameter method, includes a number of system elements, called branches, joined at certain points, called nodes. Ventilation system components that exhibit flow resistance and inertia, such as dampers, ducts, valves, and filters, and those that exhibit flow potential, such as blowers, are located within the branches of the system. The connection points of branches are nodes for components that have finite volumes, such as rooms, gloveboxes, and plenums, and for boundaries where the volume is practically infinite. All internal nodes, therefore, possess some

  4. Predicting and analyzing the trend of traffic accidents deaths in Iran in 2014 and 2015

    PubMed Central

    Mehmandar, Mohammadreza; Soori, Hamid; Mehrabi, Yadolah

    2016-01-01

    Background: Predicting the trend in traffic accident deaths and analyzing it can be a useful tool for planning and policy-making, for conducting interventions appropriate to the death trend, and for taking the actions required to control and prevent future occurrences. Objective: To predict and analyze the trend of traffic accident deaths in Iran in 2014 and 2015. Settings and Design: It was a cross-sectional study. Materials and Methods: All the information related to fatal traffic accidents available in the database of the Iran Legal Medicine Organization from 2004 to the end of 2013 was used to determine the change points (multivariable time series analysis). Using an autoregressive integrated moving average (ARIMA) model, traffic accident death rates were predicted for 2014 and 2015, and the predicted values were compared with the actual rates in order to determine the efficiency of the model. Results: The actual death rate in 2014 was very close to the value predicted for that year, while for 2015 a decrease compared with the previous year (2014) was predicted for all months, with a maximum of 41% predicted for January and February 2015. Conclusion: From the prediction and analysis of the death trends, proper application and continued use of the interventions conducted in previous years for road safety improvement and motor vehicle safety improvement, particularly training and culture-fostering interventions, as well as the approval and enforcement of deterrent regulations to change organizational behaviors, can significantly decrease the losses caused by traffic accidents. PMID:27308255
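
    A sketch of an ARIMA forecast of monthly traffic-accident deaths with statsmodels; the model order and the series are illustrative, not the Legal Medicine Organization data or the order selected in the paper.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(5)
      months = pd.date_range("2004-01", periods=120, freq="MS")
      deaths = (2200 - 8 * np.arange(120)
                + 200 * np.sin(2 * np.pi * np.arange(120) / 12)
                + rng.normal(0, 80, 120))
      series = pd.Series(deaths, index=months)

      # Seasonal ARIMA fitted to the historical series, then projected two years ahead.
      fit = ARIMA(series, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit()
      forecast = fit.forecast(24)
      print(forecast.round().head(12))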

  5. Modeling secondary accidents identified by traffic shock waves.

    PubMed

    Junhua, Wang; Boya, Liu; Lanfang, Zhang; Ragland, David R

    2016-02-01

    The high potential for occurrence and the negative consequences of secondary accidents make them an issue of great concern for freeway safety. Using accident records from a three-year period together with California interstate freeway loop data, a dynamic method based on traffic shock wave detection was used to identify secondary accidents more accurately. The spatio-temporal gaps between the primary and secondary accidents were shown to fit a mixture of Weibull and normal distributions. A logistic regression model was developed to investigate the major factors contributing to secondary accident occurrence. Traffic shock wave speed and volume at the occurrence of a primary accident were explicitly considered in the model, as a secondary accident is defined as an accident that occurs within the spatio-temporal impact scope of the primary accident. Results show that the shock waves originating in the wake of a primary accident have a more significant impact on the likelihood of a secondary accident occurrence than the effects of traffic volume. Primary accidents with long durations can significantly increase the possibility of secondary accidents. Unsafe speed and weather are other factors contributing to secondary crash occurrence. It is strongly suggested that when police or rescue personnel arrive at the scene of an accident, they should not suddenly block, decrease, or unblock the traffic flow, but instead endeavor to control traffic in a smooth and controlled manner. It is also important to reduce accident processing time to reduce the risk of secondary accidents. PMID:26687540
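
    The spatio-temporal impact area of a primary accident is bounded using the kinematic-wave shock speed between the upstream and queued traffic states; a minimal worked example with illustrative flow and density values follows.

      # Shock wave speed between two traffic states (Rankine-Hugoniot condition):
      #   w = (q2 - q1) / (k2 - k1)
      # where q is flow (veh/h) and k is density (veh/km). A negative w means the
      # shock front (queue tail) propagates upstream from the primary accident.
      q1, k1 = 1800.0, 30.0   # free-flow state upstream (illustrative)
      q2, k2 = 600.0, 90.0    # congested state behind the accident (illustrative)

      w = (q2 - q1) / (k2 - k1)
      print(f"shock wave speed: {w:.1f} km/h")   # -20.0 km/h, i.e. moving upstream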

  6. Predicted spatio-temporal dynamics of radiocesium deposited onto forests following the Fukushima nuclear accident

    PubMed Central

    Hashimoto, Shoji; Matsuura, Toshiya; Nanko, Kazuki; Linkov, Igor; Shaw, George; Kaneko, Shinji

    2013-01-01

    The majority of the area contaminated by the Fukushima Dai-ichi nuclear power plant accident is covered by forest. To facilitate effective countermeasure strategies to mitigate forest contamination, we simulated the spatio-temporal dynamics of radiocesium deposited into Japanese forest ecosystems in 2011 using a model that was developed after the Chernobyl accident in 1986. The simulation revealed that the radiocesium inventories in tree and soil surface organic layer components drop rapidly during the first two years after the fallout. Over a period of one to two years, the radiocesium is predicted to move from the tree and surface organic soil to the mineral soil, which eventually becomes the largest radiocesium reservoir within forest ecosystems. Although the uncertainty of our simulations should be considered, the results provide a basis for understanding and anticipating the future dynamics of radiocesium in Japanese forests following the Fukushima accident. PMID:23995073
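
    A minimal sketch of a compartment model of the kind described above, with first-order transfer from trees through the organic layer into mineral soil; the transfer rates and initial inventories are illustrative, not the calibrated values behind the paper's simulations, and radioactive decay is omitted for brevity.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Compartments: tree, surface organic layer, mineral soil; first-order rates in 1/yr.
      k_tree_to_organic    = 1.2
      k_organic_to_mineral = 0.8

      def transfer(t, y):
          tree, organic, mineral = y
          d_tree    = -k_tree_to_organic * tree
          d_organic =  k_tree_to_organic * tree - k_organic_to_mineral * organic
          d_mineral =  k_organic_to_mineral * organic
          return [d_tree, d_organic, d_mineral]

      y0 = [0.6, 0.3, 0.1]   # initial radiocesium inventory fractions
      sol = solve_ivp(transfer, (0, 10), y0, t_eval=np.linspace(0, 10, 11))
      for t, (tree, organic, mineral) in zip(sol.t, sol.y.T):
          print(f"year {t:4.1f}: tree {tree:.2f}  organic {organic:.2f}  mineral {mineral:.2f}")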

  7. Characterizing the Severe Turbulence Environments Associated With Commercial Aviation Accidents: A Real-Time Turbulence Model (RTTM) Designed for the Operational Prediction of Hazardous Aviation Turbulence Environments

    NASA Technical Reports Server (NTRS)

    Kaplan, Michael L.; Lux, Kevin M.; Cetola, Jeffrey D.; Huffman, Allan W.; Riordan, Allen J.; Slusser, Sarah W.; Lin, Yuh-Lang; Charney, Joseph J.; Waight, Kenneth T.

    2004-01-01

    Real-time prediction of environments predisposed to producing moderate-severe aviation turbulence is studied. We describe the numerical model and its postprocessing system designed for the prediction of environments predisposed to severe aviation turbulence, and present numerous examples of its utility. The numerical model is MASS version 5.13, which is integrated over three different grid matrices in real time on a university workstation in support of NASA Langley Research Center's B-757 turbulence research flight missions. The postprocessing system includes several turbulence-related products, including four turbulence forecasting indices, winds, streamlines, turbulence kinetic energy, and Richardson numbers. Additionally, there are convective products including precipitation, cloud height, cloud mass fluxes, lifted index, and K-index. Furthermore, soundings, sounding parameters, and Froude number plots are also provided. The horizontal cross-section plot products are provided from 16,000 to 46,000 ft in 2000-ft intervals. Products are available every 3 hours at the 60- and 30-km grid intervals and every 1.5 hours at the 15-km grid interval. The model is initialized from the NWS ETA analyses and integrated two times a day.

  8. Relating aviation service difficulty reports to accident data for safety trend prediction

    SciTech Connect

    Fullwood, R.R.; Hall, R.E.; Martinez-Guridi, G.; Uryasev, S.; Sampath, S.G.

    1996-10-01

    A synthetic model of scheduled-commercial U.S. aviation fatalities was constructed from linear combinations of the time-spectra of critical systems reporting, using 5.5 years of Service Difficulty Reports (SDR) and Accident Incident Data System (AIDS) records. This model, used to predict near-future trends in aviation accidents, was tested by using the first 36 months of data to construct the synthetic model, which was then used to predict fatalities during the following eight months. These predictions were tested by comparison with the fatality data. A reliability block diagram (RBD) and third-order extrapolations were also used as predictive models and compared with actuality. The synthetic model was the best predictor because of its use of systems data. Other results of the study are a database of service difficulties for major aviation systems, and a rank ordering of systems according to their contribution to the synthesis. 4 refs., 8 figs., 3 tabs.

  9. An idealized transient model for melt dispersal from reactor cavities during pressurized melt ejection accident scenarios

    SciTech Connect

    Tutu, N.K.

    1991-06-01

    Direct Containment Heating (DCH) calculations require that the transient rate at which melt is ejected from the reactor cavity during hypothetical pressurized melt ejection accident scenarios be calculated. At present, however, no models are available that are able to predict the existing melt dispersal data from small-scale reactor cavity models. In this report, a simple idealized model of the melt dispersal process within a reactor cavity during a pressurized melt ejection accident scenario is presented. The predictions from the model agree reasonably well with the integral data obtained from melt dispersal experiments using a small-scale model of the Surry reactor cavity. 17 refs., 15 figs.

  10. A catastrophe-theory model for simulating behavioral accidents

    SciTech Connect

    Souder, W.E.

    1988-01-01

    Behavioral accidents are a particular type of accident. They are caused by inappropriate individual behaviors and faulty reactions. Catastrophe theory is a means for mathematically modeling the dynamic processes that underlie behavioral accidents. Based on a comprehensive database of mining accidents, a computerized catastrophe model has been developed by the Bureau of Mines. This model systematically links individual psychological, group behavioral, and mine environmental variables with other accident-causing factors. It answers several longstanding questions about why some normally safe-behaving persons may spontaneously engage in unsafe acts that carry high risks of serious injury. Field tests with the model indicate that it has three important uses: it can be used as an effective training aid for increasing employee safety consciousness; it can be used as a management laboratory for testing decision alternatives and policies; and it can be used to help design the most effective work teams.

  11. Weather and Dispersion Modeling of the Fukushima Daiichi Nuclear Power Station Accident

    NASA Astrophysics Data System (ADS)

    Dunn, Thomas; Businger, Steven

    2014-05-01

    The surface deposition of radioactive material from the accident at the Fukushima Daiichi nuclear power station was investigated for 11 March to 17 March 2011. A coupled weather and dispersion modeling system was developed and simulations of the accident performed using two independent source terms that differed in emission rate and height and in the total amount of radioactive material released. Observations in Japan during the first week of the accident revealed a natural grouping between periods of dry (12-14 March) and wet (15-17 March) weather. The distinct weather regimes served as convenient validation periods for the model predictions. Results show significant differences in the distribution of cumulative surface deposition of 137Cs due to wet and dry removal processes. A comparison of 137Cs deposition predicted by the model with aircraft observations of surface-deposited gamma radiation showed reasonable agreement in surface contamination patterns during the dry phase of the accident for both source terms. It is suggested that this agreement is because of the weather model's ability to simulate the extent and timing of onshore flow associated with a sea breeze circulation that developed around the time of the first reactor explosion. During the wet phase of the accident the pattern is not as well predicted. It is suggested that this discrepancy is because of differences between model predicted and observed precipitation distributions.

  12. Predicting Posttraumatic Stress Symptoms in Children after Road Traffic Accidents

    ERIC Educational Resources Information Center

    Landolt, Markus A.; Vollrath, Margarete; Timm, Karin; Gnehm, Hanspeter E.; Sennhauser, Felix H.

    2005-01-01

    Objective: To prospectively assess the prevalence, course, and predictors of posttraumatic stress symptoms (PTSSs) in children after road traffic accidents (RTAs). Method: Sixty-eight children (6.5-14.5 years old) were interviewed 4-6 weeks and 12 months after an RTA with the Child PTSD Reaction Index (response rate 58.6%). Their mothers (n = 60)…

  13. Battery Life Predictive Model

    Energy Science and Technology Software Center (ESTSC)

    2009-12-31

    The Software consists of a model used to predict battery capacity fade and resistance growth for arbitrary cycling and temperature profiles. It allows the user to extrapolate from experimental data to predict actual life cycle.

  14. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions. PMID:27286683

  15. Wind power prediction models

    NASA Technical Reports Server (NTRS)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.

  16. Usefulness of high resolution coastal models for operational oil spill forecast: the Full City accident

    NASA Astrophysics Data System (ADS)

    Broström, G.; Carrasco, A.; Hole, L. R.; Dick, S.; Janssen, F.; Mattsson, J.; Berger, S.

    2011-06-01

    Oil spill modeling is considered to be an important decision support system (DeSS) useful for remedial action in case of accidents, as well as for designing the environmental monitoring system that is frequently set up after major accidents. Many accidents take place in coastal areas, implying that low resolution basin scale ocean models are of limited use for predicting the trajectories of an oil spill. In this study, we target the oil spill in connection with the Full City accident on the Norwegian south coast and compare three different oil spill models for the area. The result of the analysis is that all models do a satisfactory job. The "standard" operational model for the area is shown to have severe flaws, but when an analysis based on a higher resolution model (1.5 km resolution) for the area is included, the model system shows results that compare well with observations. The study also shows that an ensemble using three different models is useful when predicting/analyzing oil spill in coastal areas.

  17. Usefulness of high resolution coastal models for operational oil spill forecast: the "Full City" accident

    NASA Astrophysics Data System (ADS)

    Broström, G.; Carrasco, A.; Hole, L. R.; Dick, S.; Janssen, F.; Mattsson, J.; Berger, S.

    2011-11-01

    Oil spill modeling is considered to be an important part of a decision support system (DeSS) for oil spill combatment and is useful for remedial action in case of accidents, as well as for designing the environmental monitoring system that is frequently set up after major accidents. Many accidents take place in coastal areas, implying that low resolution basin scale ocean models are of limited use for predicting the trajectories of an oil spill. In this study, we target the oil spill in connection with the "Full City" accident on the Norwegian south coast and compare operational simulations from three different oil spill models for the area. The result of the analysis is that all models do a satisfactory job. The "standard" operational model for the area is shown to have severe flaws, but by applying ocean forcing data of higher resolution (1.5 km resolution), the model system shows results that compare well with observations. The study also shows that an ensemble of results from the three different models is useful when predicting/analyzing oil spill in coastal areas.

  18. An approach to accidents modeling based on compounds road environments.

    PubMed

    Fernandes, Ana; Neves, Jose

    2013-04-01

    The most common approach to studying the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into fairly homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data on accidents and road characteristics, three compound road environments were established in which the pavement surface properties significantly influence the occurrence of accidents. The results clearly showed that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. PMID:23376544

  19. Accidents and unpleasant incidents: worry in transport and prediction of travel behavior.

    PubMed

    Backer-Grøndahl, Agathe; Fyhri, Aslak; Ulleberg, Pål; Amundsen, Astrid Helene

    2009-09-01

    Worry on nine different means of transport was measured in a Norwegian sample of 853 respondents. The main aim of the study was to investigate differences between worry about accidents and worry about unpleasant incidents, and how these two sorts of worry relate to various means of transport as well as to transport behavior. Factor analyses of worry about accidents suggested a division between rail transport, road transport, and nonmotorized transport, whereas analyses of worry about unpleasant incidents suggested a division between transport modes where you interact with other people and "private" transport modes. Moreover, mean ratings of worry showed that respondents worried more about accidents than unpleasant incidents on private transport modes, and more about unpleasant incidents than accidents on public transport modes. Support for the distinction between worry about accidents and unpleasant incidents was also found when investigating relationships between both types of worry and behavioral adaptations: worry about accidents was more important than worry about unpleasant incidents in relation to behavioral adaptations on private means of transport, whereas the opposite was true for public means of transport. Finally, predictors of worry were investigated. The models of worry about accidents and worry about unpleasant incidents differed in which predictors turned out to be significant. Knowledge about people's worries on different means of transport is important with regard to understanding and influencing transport and travel behavior, as well as attending to commuters' welfare. PMID:19645756

  20. Predictive models in urology.

    PubMed

    Cestari, Andrea

    2013-01-01

    Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in the use of predictive modeling reflects advances on different fronts such as the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. The current state of knowledge is still quite young in understanding the likely future direction of how this so-called 'machine intelligence' will evolve and therefore how current relatively sophisticated predictive models will evolve in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also into the clinical practice with the introduction of several nomograms dealing with the main fields of onco-urology. PMID:23423686

  1. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature describing the features of the various models in a systematic manner.

  2. [Predictive models for ART].

    PubMed

    Arvis, P; Guivarc'h-Levêque, A; Varlan, E; Colella, C; Lehert, P

    2013-02-01

    A predictive model is a mathematical expression estimating the probability of pregnancy by combining predictive variables, or indicators. Its development requires three successive phases: formulation of the model, its validation (internal, then external), and the impact study. Its performance is assessed by its discrimination and its calibration. Numerous models have been proposed for spontaneous pregnancies, IUI, and IVF, but with rather poor results; their external validation was seldom carried out and was mainly inconclusive. The impact study, which consists in ascertaining whether their use improves medical practice, has only exceptionally been performed. The ideal ART predictive model is a "Center-specific" model that helps physicians choose between abstention, IUI, and IVF by providing a reliable cumulative pregnancy rate for each option. Such a tool would make it possible to rationalize practice by avoiding premature, late, or hopeless treatments. The model would also allow performance to be compared between ART Centers on the basis of objective criteria. Today the best solution is to adjust the existing models to one's own practice, by considering models validated with variables describing the treated population, whilst adjusting the calculation to the Center's performance. PMID:23182786

  3. The modelling of fuel volatilisation in accident conditions

    NASA Astrophysics Data System (ADS)

    Manenc, H.; Mason, P. K.; Kissane, M. P.

    2001-04-01

    For oxidising conditions, at high temperatures, the pressure of uranium vapour species at the fuel surface is predicted to be high. These vapour species can be transported away from the fuel surface, giving rise to significant amounts of volatilised fuel, as has been observed during small-scale experiments and taken into account in different models. Hence, fuel volatilisation must be taken into account in the conduct of a simulated severe accident such as the Phebus FPT-4 experiment. A large-scale in-pile test is designed to investigate the release of fission products and actinides from irradiated UO 2 fuel in a debris bed and molten pool configuration. Best estimate predictions for fuel volatilisation were performed before the test. This analysis was used to assess the maximum possible loading of filters collecting emissions and the consequences for the filter-change schedule. Following successful completion of the experiment, blind post-test analysis is being performed; boundary conditions for the calculations are based on the preliminary post-test analysis with the core degradation code ICARE2 [J.C. Crestia, G. Repetto, S. Ederli, in: Proceedings of the Fourth Technical Seminar on the PHEBUS FP Programme, Marseille, France, 20-22 March 2000]. The general modelling approach is presented here and then illustrated by the analysis of fuel volatilisation in Phebus FPT4 (for which results are not yet available). Effort was made to reduce uncertainties in the calculations by improving the understanding of controlling physical processes and by using critically assessed thermodynamic data to determine uranium vapour pressures. The analysis presented here constitutes a preliminary, blind, post-test estimate of fuel volatilised during the test.

  4. Accident sequence precursor analysis level 2/3 model development

    SciTech Connect

    Lui, C.H.; Galyean, W.J.; Brownson, D.A.

    1997-02-01

    The US Nuclear Regulatory Commission's Accident Sequence Precursor (ASP) program currently uses simple Level 1 models to assess the conditional core damage probability for operational events occurring in commercial nuclear power plants (NPP). Since not all accident sequences leading to core damage will result in the same radiological consequences, it is necessary to develop simple Level 2/3 models that can be used to analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude of the resulting radioactive releases to the environment, and calculate the consequences associated with these releases. The simple Level 2/3 model development work was initiated in 1995, and several prototype models have been completed. Once developed, these simple Level 2/3 models are linked to the simple Level 1 models to provide risk perspectives for operational events. This paper describes the methods implemented for the development of these simple Level 2/3 ASP models and the process for linking them to the existing Level 1 models.

  5. Advanced accident sequence precursor analysis level 2 models

    SciTech Connect

    Galyean, W.J.; Brownson, D.A.; Rempe, J.L.

    1996-03-01

    The U.S. Nuclear Regulatory Commission Accident Sequence Precursor program pursues the ultimate objective of performing risk-significant evaluations on operational events (precursors) occurring in commercial nuclear power plants. To achieve this objective, the Office of Nuclear Regulatory Research is supporting the development of simple probabilistic risk assessment models for all commercial nuclear power plants (NPP) in the U.S. Presently, only simple Level 1 plant models have been developed, which estimate core damage frequencies. In order to provide a true risk perspective, the consequences associated with postulated core damage accidents also need to be considered. With the objective of performing risk evaluations in an integrated and consistent manner, a linked event tree approach which propagates the front-end results to the back end was developed. This approach utilizes simple plant models that analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude and timing of a radioactive release to the environment, and calculate the consequences for a given release. Detailed models and results from previous studies, such as the NUREG-1150 study, are used to quantify these simple models. These simple models are then linked to the existing Level 1 models, and are evaluated using the SAPHIRE code. To demonstrate the approach, prototypic models have been developed for a boiling water reactor, Peach Bottom, and a pressurized water reactor, Zion.

  6. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  7. Predicting pediatric posttraumatic stress disorder after road traffic accidents: the role of parental psychopathology.

    PubMed

    Kolaitis, Gerasimos; Giannakopoulos, George; Liakopoulou, Magda; Pervanidou, Panagiota; Charitaki, Stella; Mihas, Constantinos; Ferentinos, Spyros; Papassotiriou, Ioannis; Chrousos, George P; Tsiantis, John

    2011-08-01

    This study examined prospectively the role of parental psychopathology among other predictors in the development and persistence of posttraumatic stress disorder (PTSD) in 57 hospitalized youths aged 7-18 years immediately after a road traffic accident and 1 and 6 months later. Self-report questionnaires and semi-structured diagnostic interviews were used in all 3 assessments. Neuroendocrine evaluation was performed at the initial assessment. Maternal PTSD symptomatology predicted the development of children's PTSD 1 month after the event, OR = 6.99, 95% CI [1.049, 45.725]; the persistence of PTSD 6 months later was predicted by the child's increased evening salivary cortisol concentrations within 24 hours of the accident, OR = 1.006, 95% CI [1.001, 1.011]. Evaluation of both biological and psychosocial predictors that increase the risk for later development and maintenance of PTSD is important for appropriate early prevention and treatment. PMID:21812037

  8. A catastrophe-theory model for simulating behavioral accidents

    SciTech Connect

    Souder, W.E.

    1988-01-01

    Based on a comprehensive database of mining accidents, a computerized catastrophe model has been developed by the Bureau of Mines. This model systematically links individual psychological, group behavioral, and mine environmental variables with other accident-causing factors. It answers several longstanding questions about why some normally safe-behaving persons may spontaneously engage in unsafe acts that have high risks of serious injury. Field tests with the model indicate that it has three important uses: it can be used as an effective training aid for increasing employee safety consciousness; it can be used as a management laboratory for testing decision alternatives and policies; and it can be used to help design the most effective work teams.

  9. Modelling the oil spill track from Prestige-Nassau accident

    NASA Astrophysics Data System (ADS)

    Montero, P.; Leitao, P.; Penabad, E.; Balseiro, C. F.; Carracedo, P.; Braunschweig, F.; Fernandes, R.; Gomez, B.; Perez-Munuzuri, V.; Neves, R.

    2003-04-01

    On November 13th 2002, the tank ship Prestige-Nassau sent an SOS signal. The hull of the ship was damaged, producing an oil spill off the Galician coast (NW Spain). The damaged ship first headed north, spilling more fuel and affecting the western Galician coast, and then changed course to the south. In this first stage of the accident, the ship spilt around 10,000 metric tons on the 19th at the Galician Bank, 133 NM off the Galician coast. From the very beginning, monitoring and forecasting of the first slick were carried out. Afterwards, since southwesterly winds are frequent in wintertime, the slick from the initial spill started to move towards the Galician coast. This drift movement was followed by overflights. With the aim of forecasting the landfall location and arrival date, simulations with two different models were performed. The first was a very simple drift model forced with the surface winds generated by the ARPS operational model (1) at MeteoGalicia (the regional weather forecast service). The second was a more complex hydrodynamic model, MOHID2000 (2,3), developed by MARETEC GROUP (Instituto Superior Técnico de Lisboa) in collaboration with GFNL (Grupo de Física Non Lineal, Universidade de Santiago de Compostela). On November 28th, tarballs appeared to the south of the main slick. These observations could be explained by taking into account the subsurface water movement following Ekman dynamics. New simulations were performed to better understand the physics underlying these observations, and agreement between observations and simulations was achieved. We performed simulations with and without the slope current previously calculated by other authors, showing that this current introduces only subtle differences in the slick's point of arrival at the coast, with wind as the primary forcing. (1) A two-dimensional particle tracking model for pollution dispersion in A Coruña and Vigo Rias (NW Spain). M. Gómez-Gesteira, P. Montero, R
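
    The first of the two models described, a simple wind-forced drift model, can be sketched as below. The drift factor and deflection angle are illustrative rule-of-thumb values, not the parameters of the MeteoGalicia/MOHID systems.

```python
# Minimal wind-drift sketch: a surface slick advected at a small fraction of
# the 10-m wind and deflected to the right of the wind (illustrative parameters).
import numpy as np

WIND_FACTOR = 0.03          # assumed: slick moves at ~3% of the wind speed
DEFLECTION_DEG = 15.0       # assumed: rightward (Ekman-like) deflection

def drift_step(pos, wind_uv, dt):
    """Advance particle positions (m) one step given wind (u, v in m/s) and dt (s)."""
    theta = np.deg2rad(-DEFLECTION_DEG)            # negative angle = clockwise rotation
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    drift_velocity = WIND_FACTOR * wind_uv @ rot.T
    return pos + drift_velocity * dt

# Toy usage: 100 particles, constant south-westerly wind of ~10 m/s for 24 h.
particles = np.random.default_rng(1).normal(0, 500, (100, 2))
wind = np.array([7.1, 7.1])                        # u (east), v (north) components (m/s)
for _ in range(24):
    particles = drift_step(particles, wind, dt=3600.0)
print("mean displacement after 24 h (km):", np.round(particles.mean(axis=0) / 1e3, 1))
```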

  10. An application of probabilistic safety assessment methods to model aircraft systems and accidents

    SciTech Connect

    Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.

    1998-08-01

    A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of probabilistic safety assessment (PSA) methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS are also discussed.

  11. WHEN MODEL MEETS REALITY – A REVIEW OF SPAR LEVEL 2 MODEL AGAINST FUKUSHIMA ACCIDENT

    SciTech Connect

    Zhegang Ma

    2013-09-01

    The Standardized Plant Analysis Risk (SPAR) models are a set of probabilistic risk assessment (PRA) models used by the Nuclear Regulatory Commission (NRC) to evaluate the risk of operations at U.S. nuclear power plants and provide inputs to the risk-informed regulatory process. A small number of SPAR Level 2 models have been developed, mostly for feasibility study purposes. They extend the Level 1 models to include containment systems, group plant damage states, and model containment phenomenology and accident progression in containment event trees. A severe earthquake and tsunami hit the eastern coast of Japan in March 2011 and caused significant damage to the reactors at the Fukushima Daiichi site. Station blackout (SBO), core damage, containment damage, hydrogen explosions, and intense radioactivity release, which had previously been analyzed and assumed as postulated accident progressions in PRA models, occurred to various degrees at the multi-unit Fukushima Daiichi site. This paper reviews and compares a typical BWR SPAR Level 2 model with the “real” accident progressions and sequences that occurred in Fukushima Daiichi Units 1, 2, and 3. It shows that the SPAR Level 2 model is a robust PRA model that can reasonably describe the accident progression of a real and complicated nuclear accident. On the other hand, the comparison shows that the SPAR model could be enhanced by incorporating some accident characteristics for better representation of severe accident progression.

  12. Dynamic modelling of radionuclide uptake by marine biota: application to the Fukushima nuclear power plant accident.

    PubMed

    Vives i Batlle, Jordi

    2016-01-01

    The dynamic model D-DAT was developed to study the dynamics of radionuclide uptake and turnover in biota and sediments in the immediate aftermath of the Fukushima accident. These dynamics are determined by the interplay between the residence time of radionuclides in seawater/sediments and the biological half-lives of elimination by the biota. The model calculates time-variable activity concentrations of (131)I, (134)Cs, (137)Cs and (90)Sr in seabed sediment, fish, crustaceans, molluscs and macroalgae from surrounding activity concentrations in seawater, from which internal and external dose rates are derived. A central element of the model is the inclusion of dynamic transfer of radionuclides to/from sediments, factorising the depletion of radionuclides adsorbed onto suspended particulates, molecular diffusion, pore water mixing and bioturbation, represented by a simple set of differential equations coupled with the biological uptake/turnover processes. In this way, the model is capable of reproducing activity concentrations in sediment more realistically. The model was used to assess the radiological impact of the Fukushima accident on marine biota in the acute phase of the accident. Sediment and biota activity concentrations are within the wide range of actual monitoring data. Activity concentrations in marine biota are thus shown to be better calculated by a dynamic model than with the simpler equilibrium approach based on concentration factors, which tends to overestimate for the acute accident period. Modelled dose rates from external exposure from sediment are also significantly below equilibrium predictions. The model calculations confirm previous studies showing that radioactivity levels in marine biota have been generally below the levels necessary to cause a measurable effect on populations. The model was used in mass-balance mode to calculate total integrated releases of 103, 30 and 3 PBq for (131)I, (137)Cs and (90)Sr, reasonably in line with previous

  13. Markov Model of Severe Accident Progression and Management

    SciTech Connect

    Bari, R.A.; Cheng, L.; Cuadra, A.; Ginsberg, T.; Lehner, J.; Martinez-Guridi, G.; Mubayi, V.; Pratt, W.T.; Yue, M.

    2012-06-25

    The earthquake and tsunami that hit the nuclear power plants at the Fukushima Daiichi site in March 2011 led to extensive fuel damage, including possible fuel melting, slumping, and relocation at the affected reactors. A so-called feed-and-bleed mode of reactor cooling was initially established to remove decay heat. The plan was to eventually switch over to a recirculation cooling system. Failure of feed and bleed was a possibility during the interim period. Furthermore, even if recirculation was established, there was a possibility of its subsequent failure. Decay heat has to be sufficiently removed to prevent further core degradation. To understand the possible evolution of the accident conditions and to have a tool for potential future hypothetical evaluations of accidents at other nuclear facilities, a Markov model of the state of the reactors was constructed in the immediate aftermath of the accident and was executed under different assumptions of potential future challenges. This work was performed at the request of the U.S. Department of Energy to explore 'what-if' scenarios in the immediate aftermath of the accident. The work began in mid-March and continued until mid-May 2011. The analysis had the following goals: (1) To provide an overall framework for describing possible future states of the damaged reactors; (2) To permit an impact analysis of 'what-if' scenarios that could lead to more severe outcomes; (3) To determine approximate probabilities of alternative end-states under various assumptions about failure and repair times of cooling systems; (4) To infer the reliability requirements of closed loop cooling systems needed to achieve stable core end-states and (5) To establish the importance for the results of the various cooling system and physical phenomenological parameters via sensitivity calculations.
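
    A minimal sketch of this kind of Markov state model is given below: a handful of cooling states, assumed failure/repair/switch-over rates, and state probabilities obtained from the matrix exponential of the generator. The states and rates are illustrative placeholders, not the values used in the BNL analysis.

```python
# Sketch of a continuous-time Markov model of cooling-system states
# (states and rates are illustrative, not the values used in the study).
import numpy as np
from scipy.linalg import expm

# States: 0 feed-and-bleed, 1 recirculation, 2 cooling lost,
#         3 stable end-state (absorbing), 4 further core degradation (absorbing).
SWITCH = 1.0 / (10 * 24)     # assumed rate of switching to recirculation (per hour)
FAIL_FB = 1.0 / (30 * 24)    # assumed feed-and-bleed failure rate
FAIL_RC = 1.0 / (60 * 24)    # assumed recirculation failure rate
REPAIR = 1.0 / 12            # assumed repair rate once cooling is lost
STABLE = 1.0 / (90 * 24)     # assumed rate of reaching a stable end-state
DEGRADE = 1.0 / 24           # assumed degradation rate while cooling is lost

Q = np.zeros((5, 5))
Q[0, 1], Q[0, 2] = SWITCH, FAIL_FB
Q[1, 3], Q[1, 2] = STABLE, FAIL_RC
Q[2, 1], Q[2, 4] = REPAIR, DEGRADE
for i in range(5):                       # diagonal closes each row of the generator
    Q[i, i] = -Q[i].sum()

p0 = np.array([1.0, 0, 0, 0, 0])         # start in feed-and-bleed mode
for t in (24, 7 * 24, 30 * 24):          # hours after the start of the analysis
    p = p0 @ expm(Q * t)
    print(f"t={t:5d} h  P(stable)={p[3]:.3f}  P(degraded)={p[4]:.3f}")
```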

  14. Mathematical model to predict drivers' reaction speeds.

    PubMed

    Long, Benjamin L; Gillespie, A Isabella; Tanaka, Martin L

    2012-02-01

    Mental distractions and physical impairments can increase the risk of accidents by affecting a driver's ability to control the vehicle. In this article, we developed a linear mathematical model that can be used to quantitatively predict drivers' performance over a variety of possible driving conditions. Predictions were not limited only to the conditions tested, but also included linear combinations of these test conditions. Two groups of 12 participants were tested using a custom drivers' reaction-speed testing device to evaluate the effect of cell phone talking, texting, and a fixed knee brace on the components of drivers' reaction speed. Cognitive reaction time was found to increase by 24% for cell phone talking and 74% for texting. The fixed knee brace increased musculoskeletal reaction time by 24%. These experimental data were used to develop a mathematical model to predict reaction speed for an untested condition, talking on a cell phone with a fixed knee brace. The model was verified by comparing the predicted reaction speed to measured experimental values from an independent test. The model predicted full braking time within 3% of the measured value. Although only a few influential conditions were evaluated, we present a general approach that can be expanded to include other types of distractions, impairments, and environmental conditions. PMID:22431214
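
    The additive structure described above can be illustrated with a short sketch that applies the quoted percentage increases to assumed baseline cognitive and musculoskeletal components (the baselines are made up for illustration; the paper works with measured reaction-speed components).

```python
# Sketch of the additive reaction-time idea: condition effects measured
# separately are combined linearly to predict an untested combination.
# Baseline times below are assumed for illustration, not taken from the paper.
BASE_COGNITIVE_S = 0.50        # assumed baseline cognitive reaction time (s)
BASE_MUSCULO_S = 0.30          # assumed baseline musculoskeletal reaction time (s)

EFFECTS = {                    # fractional increases quoted in the abstract
    "talking": ("cognitive", 0.24),
    "texting": ("cognitive", 0.74),
    "knee_brace": ("musculoskeletal", 0.24),
}

def predict_reaction_time(conditions):
    """Predict total reaction time (s) for a combination of conditions."""
    cognitive, musculo = BASE_COGNITIVE_S, BASE_MUSCULO_S
    for c in conditions:
        component, increase = EFFECTS[c]
        if component == "cognitive":
            cognitive += BASE_COGNITIVE_S * increase
        else:
            musculo += BASE_MUSCULO_S * increase
    return cognitive + musculo

# Untested combination examined in the paper: talking with a fixed knee brace.
print("predicted total reaction time (s):", predict_reaction_time(["talking", "knee_brace"]))
```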

  15. Advanced accident sequence precursor analysis level 1 models

    SciTech Connect

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O.

    1996-03-01

    INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant-accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for the selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.

  16. Development of hydrogeological modelling approaches for assessment of consequences of hazardous accidents at nuclear power plants

    SciTech Connect

    Rumynin, V.G.; Mironenko, V.A.; Konosavsky, P.K.; Pereverzeva, S.A.

    1994-07-01

    This paper introduces some modeling approaches for predicting the influence of hazardous accidents at nuclear reactors on groundwater quality. Possible pathways for radioactive releases from nuclear power plants were considered to conceptualize boundary conditions for solving the subsurface radionuclide transport problems. Some approaches to incorporating physical and chemical interactions into transport simulators have been developed. The hydrogeological forecasts were based on numerical and semi-analytical scale-dependent models. They have been applied to assess the possible impact of the nuclear power plants designed in Russia on groundwater reservoirs.

  17. A simplified model for calculating atmospheric radionuclide transport and early health effects from nuclear reactor accidents

    SciTech Connect

    Madni, I.K.; Cazzoli, E.G.; Khatib-Rahbar, M.

    1995-11-01

    During certain hypothetical severe accidents in a nuclear power plant, radionuclides could be released to the environment as a plume. Prediction of the atmospheric dispersion and transport of these radionuclides is important for assessment of the risk to the public from such accidents. A simplified PC-based model was developed that predicts the time-integrated air concentration of each radionuclide at any location as a function of time-integrated source strength, using the Gaussian plume model. The solution procedure involves direct analytic integration of the air concentration equations over time and position, using simplified meteorology. The formulation allows for dry and wet deposition, radioactive decay and daughter buildup, reactor building wake effects, the inversion lid effect, plume rise due to buoyancy or momentum, release duration, and grass height. Based on air and ground concentrations of the radionuclides, the early dose to an individual is calculated via cloudshine, groundshine, and inhalation. The model also calculates early health effects based on the doses. This paper presents aspects of the model that would be of interest to the prediction of environmental flows and their public consequences.
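
    The core relation behind such a model is the Gaussian plume equation. A minimal sketch for a ground-level receptor is given below; the power-law dispersion coefficients and release parameters are illustrative stand-ins for the stability-class correlations, deposition, decay, wake and dose models the actual code includes.

```python
# Minimal Gaussian plume sketch: ground-level air concentration from a
# continuous elevated release, with a simple ground-reflection term.
import numpy as np

def gaussian_plume(q, u, x, y, z, h):
    """Air concentration (Bq/m^3) at (x, y, z) for source strength q (Bq/s),
    wind speed u (m/s), and effective release height h (m)."""
    sigma_y = 0.08 * x ** 0.9          # assumed lateral dispersion coefficient (m)
    sigma_z = 0.06 * x ** 0.9          # assumed vertical dispersion coefficient (m)
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h) ** 2 / (2 * sigma_z**2))
                + np.exp(-(z + h) ** 2 / (2 * sigma_z**2)))   # reflection at the ground
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centreline ground-level concentration at several downwind distances.
for x in (500.0, 1000.0, 5000.0):
    c = gaussian_plume(q=1e12, u=3.0, x=x, y=0.0, z=0.0, h=30.0)
    print(f"x = {x:6.0f} m   C = {c:.3e} Bq/m^3")
```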

  18. Markov Model of Accident Progression at Fukushima Daiichi

    SciTech Connect

    Cuadra, A.; Bari, R.; Cheng, L.-Y.; Ginsberg, T.; Lehner, J.; Martinez-Guridi, G.; Mubayi, V.; Pratt, T.; Yue, M.

    2012-11-11

    On March 11, 2011, a magnitude 9.0 earthquake followed by a tsunami caused loss of offsite power and disabled the emergency diesel generators, leading to a prolonged station blackout at the Fukushima Daiichi site. After successful reactor trip for all operating reactors, the inability to remove decay heat over an extended period led to boil-off of the water inventory and fuel uncovery in Units 1-3. A significant amount of metal-water reaction occurred, as evidenced by the quantities of hydrogen generated that led to hydrogen explosions in the auxiliary buildings of the Units 1 & 3, and in the de-fuelled Unit 4. Although it was assumed that extensive fuel damage, including fuel melting, slumping, and relocation was likely to have occurred in the core of the affected reactors, the status of the fuel, vessel, and drywell was uncertain. To understand the possible evolution of the accident conditions at Fukushima Daiichi, a Markov model of the likely state of one of the reactors was constructed and executed under different assumptions regarding system performance and reliability. The Markov approach was selected for several reasons: It is a probabilistic model that provides flexibility in scenario construction and incorporates time dependence of different model states. It also readily allows for sensitivity and uncertainty analyses of different failure and repair rates of cooling systems. While the analysis was motivated by a need to gain insight on the course of events for the damaged units at Fukushima Daiichi, the work reported here provides a more general analytical basis for studying and evaluating severe accident evolution over extended periods of time. This work was performed at the request of the U.S. Department of Energy to explore 'what-if' scenarios in the immediate aftermath of the accidents.

  19. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

    A microcell is a cell with a radius of 1 km or less, suitable for heavily urbanized areas such as metropolitan cities. This paper deals with a microcell propagation-loss prediction model that uses probabilistic techniques. The RSL (Received Signal Level) is the factor by which the performance of a microcell can be evaluated, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. We combine probabilistic methods to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SPC (Statistical Process Control) to obtain the parameters of the distribution. This probabilistic solution gives a better measure of the performance factors. In addition, it allows probabilistic optimization of strategies such as the number of cells, cell location, cell capacity, cell range, and so on. In particular, the probabilistic optimization techniques themselves can be applied to real-world problems such as computer networking, human resources, and manufacturing processes.

  20. ON PREDICTION AND MODEL VALIDATION

    SciTech Connect

    M. MCKAY; R. BECKMAN; K. CAMPBELL

    2001-02-01

    Quantification of prediction uncertainty is an important consideration when using mathematical models of physical systems. This paper proposes a way to incorporate "validation data" in a methodology for quantifying uncertainty of the mathematical predictions. The report outlines a theoretical framework.

  1. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
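
    The sampling-based approach used in this and the companion MACCS studies can be sketched compactly: draw a Latin hypercube sample over the uncertain inputs, evaluate a (here, toy) consequence function, and rank the inputs by rank correlation with the output. The input names and toy model below are illustrative, not MACCS variables.

```python
# Sketch of the sampling-based sensitivity approach: Latin hypercube sample,
# toy consequence function, and rank-correlation importance ranking.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

def latin_hypercube(n, d):
    """n samples in [0, 1]^d with exactly one point per stratum in each dimension."""
    cols = [(rng.permutation(n) + rng.random(n)) / n for _ in range(d)]
    return np.column_stack(cols)

names = ["deposition_velocity", "soil_to_crop_transfer", "shielding_factor"]  # illustrative
X = latin_hypercube(200, len(names))

# Toy stand-in for the consequence model: dose dominated by the first input.
dose = 5.0 * X[:, 0] + 1.0 * X[:, 1] ** 2 + 0.2 * X[:, 2] + rng.normal(0, 0.1, 200)

for i, name in enumerate(names):
    rho, _ = spearmanr(X[:, i], dose)      # rank correlation as an importance measure
    print(f"{name:22s} rank correlation = {rho:+.2f}")
```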

  2. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    SciTech Connect

    Carbajo, J.J.

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  3. ATMOSPHERIC MODELING IN SUPPORT OF A ROADWAY ACCIDENT

    SciTech Connect

    Buckley, R.; Hunter, C.

    2010-10-21

    The United States Forest Service-Savannah River (USFS) routinely performs prescribed fires at the Savannah River Site (SRS), a Department of Energy (DOE) facility located in southwest South Carolina. This facility covers approximately 800 square kilometers and is mainly wooded except for scattered industrial areas containing facilities used in managing nuclear materials for national defense and waste processing. Prescribed fires of forest undergrowth are necessary to reduce the risk of inadvertent wild fires which have the potential to destroy large areas and threaten nuclear facility operations. This paper discusses meteorological observations and numerical model simulations from a period in early 2002 of an incident involving an early-morning multicar accident caused by poor visibility along a major roadway on the northern border of the SRS. At the time of the accident, it was not clear whether the limited visibility was due solely to fog or whether smoke from a prescribed burn conducted the previous day just to the northwest of the crash site had contributed to the reduced visibility. Through use of available meteorological information and detailed modeling, it was determined that the primary reason for the low visibility on this night was fog induced by meteorological conditions.

  4. A Simplified Methodology for the Prediction of the Small Break Loss-of-Coolant Accident.

    NASA Astrophysics Data System (ADS)

    Ward, Leonard William

    1988-12-01

    This thesis describes a complete methodology which has allowed the development of a faster-than-real-time computer program designed to simulate a small break loss-of-coolant accident in the primary system of a pressurized water reactor. By developing an understanding of the major phenomena governing the small break LOCA fluid response, the system model representation can be greatly simplified, leading to a very fast-executing transient system blowdown code. Because of the fast execution times, the CULSETS code, or Columbia University Loss-of-Coolant Accident and System Excursion Transient Simulator code, is ideal for performing parametric studies of the Emergency Core Cooling System or assessing the consequences of the many operator actions performed to place the system in a long-term cooling mode following a small break LOCA. While the methodology was designed with specific application to the small break loss-of-coolant accident, it can also be used to simulate loss-of-feedwater, steam line breaks, and steam generator tube rupture events. The code is easily adaptable to a personal computer and could also be modified to provide the primary and secondary system responses to supply the required inputs to a simulator for a pressurized water reactor.

  5. Multilevel Model Prediction

    ERIC Educational Resources Information Center

    Frees, Edward W.; Kim, Jee-Seon

    2006-01-01

    Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…

  6. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    SciTech Connect

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
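
    The two-parameter Weibull form mentioned for early effects can be written as a simple risk-versus-dose curve, sketched below with placeholder parameters (the hazard-function form is a common convention for such models; the D50 and shape values are not those tabulated in the report).

```python
# Sketch of a two-parameter Weibull dose-response for an early health effect:
# risk = 1 - exp(-H), with hazard H = ln(2) * (dose / D50) ** shape.
# The D50 and shape values below are placeholders, not the report's values.
import math

D50_GY = 3.8      # assumed dose giving 50% incidence (Gy)
SHAPE = 5.0       # assumed Weibull shape parameter

def early_effect_risk(dose_gy):
    hazard = math.log(2.0) * (dose_gy / D50_GY) ** SHAPE
    return 1.0 - math.exp(-hazard)

# Risk rises steeply around D50 and equals 0.5 exactly at D50.
for dose in (1.0, 2.0, 3.8, 6.0):
    print(f"dose {dose:4.1f} Gy -> risk {early_effect_risk(dose):.3f}")
```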

  7. Simulation Study of Traffic Accidents in Bidirectional Traffic Models

    NASA Astrophysics Data System (ADS)

    Moussa, Najem

    Conditions for the occurrence of bidirectional collisions are developed based on the Simon-Gutowitz bidirectional traffic model. Three types of dangerous situations can occur in this model; we analyze those corresponding to head-on collisions, rear-end collisions, and lane-changing collisions. Using Monte Carlo simulations, we compute the probability of occurrence of these collisions for different values of the oncoming cars' density. It is found that the risk of collisions is significant when the density of cars in one lane is small and that of the other lane is high enough. The influence of different proportions of heavy vehicles is also studied. We find that heavy vehicles cause a substantial reduction of traffic flow in the home lane and increase the risk of car accidents.
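
    The Monte Carlo idea can be illustrated with a much simpler static sketch than the Simon-Gutowitz cellular automaton: estimate how often a driver who wants to overtake faces an oncoming car within an unsafe distance, as a function of the oncoming-lane density. All parameters below are illustrative.

```python
# Monte Carlo sketch of the "dangerous overtaking" idea: the probability that a
# driver who wants to pass (leader closer than PASS_GAP) finds an oncoming car
# within SAFE_GAP. Densities, gaps, and road length are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
ROAD_CELLS = 1000
PASS_GAP = 3        # a leader within this many cells triggers an overtaking wish
SAFE_GAP = 10       # an oncoming car within this distance makes the pass dangerous

def danger_probability(home_density, oncoming_density, trials=2000):
    hits = 0
    for _ in range(trials):
        home = rng.random(ROAD_CELLS) < home_density           # occupied cells, own lane
        oncoming = rng.random(ROAD_CELLS) < oncoming_density   # occupied cells, other lane
        i = rng.integers(ROAD_CELLS)                           # position of the test driver
        ahead = [(i + k) % ROAD_CELLS for k in range(1, PASS_GAP + 1)]
        window = [(i + k) % ROAD_CELLS for k in range(1, SAFE_GAP + 1)]
        if home[ahead].any() and oncoming[window].any():
            hits += 1
    return hits / trials

for rho in (0.05, 0.2, 0.5):
    p = danger_probability(home_density=0.1, oncoming_density=rho)
    print(f"oncoming density {rho:.2f} -> danger probability {p:.3f}")
```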

  8. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

  9. A statistical model for predicting muscle performance

    NASA Astrophysics Data System (ADS)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing

  10. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  11. Melanoma Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. PREDICTIVE MODELS. Enhanced Oil Recovery Model

    SciTech Connect

    Ray, R.M.

    1992-02-26

    PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: 1 chemical flooding; 2 carbon dioxide miscible flooding; 3 in-situ combustion; 4 polymer flooding; and 5 steamflood. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs, which have been previously waterflooded to residual oil saturation. Thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide miscible flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance predictive algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes. The IBM PC/AT version includes a plotting capability to produces a graphic picture of the predictive model results.

  13. VICTORIA: A mechanistic model of radionuclide behavior in the reactor coolant system under severe accident conditions

    SciTech Connect

    Heames, T.J.; Williams, D.A.; Johns, N.A.; Chown, N.M.; Bixler, N.E.; Grimley, A.J.; Wheatley, C.J.

    1990-10-01

    This document provides a description of a model of the radionuclide behavior in the reactor coolant system (RCS) of a light water reactor during a severe accident. This document serves as the user's manual for the computer code called VICTORIA, based upon the model. The VICTORIA code predicts fission product release from the fuel, chemical reactions between fission products and structural materials, vapor and aerosol behavior, and fission product decay heating. This document provides a detailed description of each part of the implementation of the model into VICTORIA, the numerical algorithms used, and the correlations and thermochemical data necessary for determining a solution. A description of the code structure, input and output, and a sample problem are provided. The VICTORIA code was developed upon a CRAY-XMP at Sandia National Laboratories in the USA and a CRAY-2 and various SUN workstations at the Winfrith Technology Centre in England. 60 refs.

  14. Predictive Models and Computational Embryology

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  15. Another Look at the Relationship Between Accident- and Encroachment-Based Approaches to Run-Off-the-Road Accidents Modeling

    SciTech Connect

    Miaou, Shaw-Pin

    1997-08-01

    The purpose of this study was to look for ways to combine the strengths of both approaches in roadside safety research. The specific objectives were (1) to present the encroachment-based approach in a more systematic and coherent way so that its limitations and strengths can be better understood from both statistical and engineering standpoints, and (2) to apply the analytical and engineering strengths of the encroachment-based thinking to the formulation of mean functions in accident-based models.

  16. Predictive Modeling in Race Walking

    PubMed Central

    Wiktorowicz, Krzysztof; Przednowek, Krzysztof; Lassota, Lesław; Krzeszowski, Tomasz

    2015-01-01

    This paper presents the use of linear and nonlinear multivariable models as tools to support the training process of race walkers. These models are calculated using data collected from race walkers' training events and are used to predict the result over a 3 km race based on training loads. The material consists of 122 training plans for 21 athletes. In order to choose the best model, the leave-one-out cross-validation method is used. The main contribution of the paper is to propose nonlinear modifications of linear models in order to achieve a smaller prediction error. It is shown that the best model is a modified LASSO regression with quadratic terms in the nonlinear part. This model has the smallest prediction error and a simplified structure obtained by eliminating some of the predictors. PMID:26339230
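
    A sketch of the modelling recipe described (quadratic feature expansion, LASSO shrinkage, leave-one-out scoring) on synthetic training-load data is given below; the feature names, sample size and noise level are placeholders, not the study's data.

```python
# Sketch of the paper's model-selection idea: a LASSO regression on training
# loads with added quadratic terms, scored by leave-one-out cross-validation.
# The synthetic data below stand in for the 122 real training plans.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
n, p = 122, 5                                  # plans x training-load features
X = rng.normal(size=(n, p))
result_3km = 800 + 20 * X[:, 0] - 10 * X[:, 1] + 5 * X[:, 2] ** 2 + rng.normal(0, 5, n)

model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      StandardScaler(),
                      Lasso(alpha=1.0))        # L1 penalty drops weak predictors
scores = cross_val_score(model, X, result_3km, cv=LeaveOneOut(),
                         scoring="neg_mean_absolute_error")
print("LOO mean absolute prediction error (s):", round(-scores.mean(), 2))
```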

  17. Prediction of the Possibility a Right-Turn Driving Behavior at Intersection Leads to an Accident by Detecting Deviation of the Situation from Usual when the Behavior is Observed

    NASA Astrophysics Data System (ADS)

    Hayashi, Toshinori; Yamada, Keiichi

    Deviation of driving behavior from usual could be a sign of human error that increases the risk of traffic accidents. This paper proposes a novel method for predicting the possibility that a driving behavior leads to an accident from information on the driving behavior and the situation. In previous work, a method was proposed that predicts this possibility by detecting the deviation of the driving behavior from the usual behavior in that situation. In contrast, the method proposed in this paper predicts the possibility by detecting the deviation of the situation from the usual one when the behavior is observed. An advantage of the proposed method is that the number of required models is independent of the variety of situations. The method was applied to the problem of predicting accidents caused by right-turn driving behavior at an intersection, and its performance was evaluated by experiments on a driving simulator.
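
    One simple way to realize the "deviation of the situation from usual" idea is to model the usual situation features observed at the moment of the behaviour as a multivariate Gaussian and flag large Mahalanobis distances, as sketched below with hypothetical features; the paper's actual models may differ.

```python
# Sketch of the deviation-detection idea: model the "usual" situation observed
# when a right-turn behaviour occurs as a multivariate Gaussian, then flag new
# situations whose Mahalanobis distance from it is large. Features are illustrative.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical situation features at the moment of turning:
# [own speed (km/h), distance to oncoming vehicle (m), oncoming speed (km/h)]
usual = rng.normal(loc=[15.0, 60.0, 40.0], scale=[3.0, 10.0, 5.0], size=(500, 3))

mean = usual.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(usual, rowvar=False))

def deviation(situation, threshold=3.5):
    """Return (Mahalanobis distance, flag) for a new situation vector."""
    d = situation - mean
    dist = float(np.sqrt(d @ cov_inv @ d))
    return dist, dist > threshold

print(deviation(np.array([16.0, 55.0, 42.0])))   # close to the usual situation
print(deviation(np.array([25.0, 20.0, 70.0])))   # unusual: fast turn, close oncoming car
```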

  18. VICTORIA: A mechanistic model of radionuclide behavior in the reactor coolant system under severe accident conditions. Revision 1

    SciTech Connect

    Heams, T J; Williams, D A; Johns, N A; Mason, A; Bixler, N E; Grimley, A J; Wheatley, C J; Dickson, L W; Osborn-Lee, I; Domagala, P; Zawadzki, S; Rest, J; Alexander, C A; Lee, R Y

    1992-12-01

    The VICTORIA model of radionuclide behavior in the reactor coolant system (RCS) of a light water reactor during a severe accident is described. It has been developed by the USNRC to define the radionuclide phenomena and processes that must be considered in systems-level models used for integrated analyses of severe accident source terms. The VICTORIA code, based upon this model, predicts fission product release from the fuel, chemical reactions involving fission products, vapor and aerosol behavior, and fission product decay heating. Also included is a detailed description of how the model is implemented in VICTORIA, the numerical algorithms used, and the correlations and thermochemical data necessary for determining a solution. A description of the code structure, input and output, and a sample problem are provided.

  19. Model aids cuttings transport prediction

    SciTech Connect

    Gavignet, A.A.; Sobey, I.J.

    1989-09-01

    Drilling of highly deviated wells can be complicated by the formation of a thick bed of cuttings at low flow rates. The model proposed in this paper shows what mechanisms control the thickness of such a bed, and the model predictions are compared with experimental results.

  20. A contrail cirrus prediction model

    NASA Astrophysics Data System (ADS)

    Schumann, U.

    2012-05-01

    A new model to simulate and predict the properties of a large ensemble of contrails as a function of given air traffic and meteorology is described. The model is designed for approximate prediction of contrail cirrus cover and analysis of contrail climate impact, e.g. within aviation system optimization processes. The model simulates the full contrail life-cycle. Contrail segments form between waypoints of individual aircraft tracks in sufficiently cold and humid air masses. The initial contrail properties depend on the aircraft. The advection and evolution of the contrails are followed with a Lagrangian Gaussian plume model. Mixing and bulk cloud processes are treated quasi-analytically or with an effective numerical scheme. Contrails disappear when the bulk ice content is sublimating or precipitating. The model has been implemented in a "Contrail Cirrus Prediction Tool" (CoCiP). This paper describes the model assumptions, the equations for individual contrails, and the analysis method for contrail-cirrus cover derived from the optical depth of the ensemble of contrails and background cirrus. The model has been applied for a case study and compared to the results of other models and in-situ contrail measurements. The simple model reproduces a considerable part of observed contrail properties. Mid-aged contrails provide the largest contributions to the product of optical depth and contrail width, which is important for climate impact.

  1. A crash-prediction model for road tunnels.

    PubMed

    Caliendo, Ciro; De Guglielmo, Maria Luisa; Guida, Maurizio

    2013-06-01

    Considerable research has been carried out on open roads to establish relationships between crashes and traffic flow, infrastructure geometry and environmental factors, whereas crash-prediction models for road tunnels have rarely been investigated. In addition, different results have sometimes been obtained regarding the effects of traffic and geometry on crashes in road tunnels. However, most research has focused on tunnels where traffic and geometric conditions, as well as driving behaviour, differ from those in Italy. Thus, in this paper crash-prediction models that had not yet been proposed for Italian road tunnels have been developed. For this purpose, a 4-year monitoring period extending from 2006 to 2009 was considered. The tunnels investigated are single-tube ones with unidirectional traffic. The Bivariate Negative Binomial regression model, jointly applied to non-severe crashes (accidents involving material damage only) and severe crashes (fatal and injury accidents only), was used to model the frequency of accident occurrence. The year effect on severe crashes was also analyzed by the Random Effects Binomial regression model and the Negative Multinomial regression model. Regression parameters were estimated by the Maximum Likelihood Method. The Cumulative Residual Method was used to test the adequacy of the regression model across the range of annual average daily traffic per lane. The candidate set of variables was: tunnel length (L), annual average daily traffic per lane (AADTL), percentage of trucks (%Tr), number of lanes (NL), and the presence of a sidewalk. Both for non-severe crashes and severe crashes, the prediction models showed that the significant variables are: L, AADTL, %Tr, and NL. A significant year effect consisting in a systematic reduction of severe crashes over time was also detected. The analysis developed in this paper appears to be useful for many applications such as the estimation of accident reductions due to improvement in existing
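
    A univariate stand-in for this type of crash-frequency model is sketched below: a negative binomial regression of yearly tunnel crash counts on log length, log traffic and truck percentage, fitted to synthetic data (the paper itself uses a bivariate negative binomial fitted jointly to severe and non-severe crashes).

```python
# Univariate stand-in for the crash-frequency idea: a negative binomial
# regression of yearly tunnel crash counts on length, traffic and truck share.
# Synthetic data; variable names follow the abstract, coefficients are assumed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
L = rng.uniform(0.3, 3.0, n)              # tunnel length (km)
AADTL = rng.uniform(5e3, 3e4, n)          # annual average daily traffic per lane
truck_pct = rng.uniform(5, 30, n)         # percentage of trucks

# Assumed data-generating process for the illustration only.
mu = np.exp(-6.0 + 1.0 * np.log(L) + 0.8 * np.log(AADTL) + 0.02 * truck_pct)
crashes = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))   # overdispersed counts

X = sm.add_constant(np.column_stack([np.log(L), np.log(AADTL), truck_pct]))
fit = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(fit.params)   # intercept, elasticities of L and AADTL, effect of % trucks
```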

  2. Estimation Of 137Cs Using Atmospheric Dispersion Models After A Nuclear Reactor Accident

    NASA Astrophysics Data System (ADS)

    Simsek, V.; Kindap, T.; Unal, A.; Pozzoli, L.; Karaca, M.

    2012-04-01

    Nuclear energy will continue to have an important role in the production of electricity in the world as the need for energy grows, but the safety of power plants will always be a question mark for people because of the accidents that have happened in the past. The Chernobyl nuclear reactor accident, which happened on 26 April 1986, was the biggest nuclear accident ever. Because of the explosion and fire, large quantities of radioactive material were released to the atmosphere. The radioactive particles released by the accident affected not only the surrounding region but the entire Northern Hemisphere, although much of the radioactive material was spread over the western USSR and Europe. There are many studies of the distribution of radioactive particles and the deposition of radionuclides all over Europe, but this is not the case for Turkey, especially regarding the deposition of radionuclides released after the Chernobyl nuclear reactor accident and the radiation doses received by people. The aim of this study is to determine the radiation doses received by people living in Turkish territory after the Chernobyl nuclear reactor accident and to provide a method that can be used in case of an emergency. For this purpose the Weather Research and Forecasting (WRF) Model was used to simulate meteorological conditions after the accident. The WRF results for the 12 days after the accident were used as input data for the HYSPLIT model. NOAA-ARL's (National Oceanic and Atmospheric Administration Air Resources Laboratory) dispersion model HYSPLIT was used to simulate the 137Cs distribution. The deposition values of 137Cs in our domain after the Chernobyl nuclear reactor accident were between 1.2E-37 Bq/m2 and 3.5E+08 Bq/m2. The results showed that Turkey, especially the Black Sea region, was affected by the accident. The doses were calculated using GENII-LIN, a multipurpose health physics code.

  3. What do saliency models predict?

    PubMed Central

    Koehler, Kathryn; Guo, Fei; Zhang, Sheng; Eckstein, Miguel P.

    2014-01-01

    Saliency models have been frequently used to predict eye movements made during image viewing without a specified task (free viewing). Use of a single image set to systematically compare free viewing to other tasks has never been performed. We investigated the effect of task differences on the ability of three models of saliency to predict the performance of humans viewing a novel database of 800 natural images. We introduced a novel task where 100 observers made explicit perceptual judgments about the most salient image region. Other groups of observers performed a free viewing task, saliency search task, or cued object search task. Behavior on the popular free viewing task was not best predicted by standard saliency models. Instead, the models most accurately predicted the explicit saliency selections and eye movements made while performing saliency judgments. Observers' fixations varied similarly across images for the saliency and free viewing tasks, suggesting that these two tasks are related. The variability of observers' eye movements was modulated by the task (lowest for the object search task and greatest for the free viewing and saliency search tasks) as well as the clutter content of the images. Eye movement variability in saliency search and free viewing might be also limited by inherent variation of what observers consider salient. Our results contribute to understanding the tasks and behavioral measures for which saliency models are best suited as predictors of human behavior, the relationship across various perceptual tasks, and the factors contributing to observer variability in fixational eye movements. PMID:24618107

  4. CFD modeling of debris melting phenomena during late phase Candu 6 severe accident

    SciTech Connect

    Nicolici, S.; Dupleac, D.; Prisecaru, I.

    2012-07-01

    The objective of this paper was to study the phase change of the debris formed on the Candu 6 calandria bottom in a postulated accident sequence. The molten pool and crust formation were studied employing the Ansys-Fluent code. The 3D model, using Large Eddy Simulation (LES), predicts the conjugate, radiative and convective heat transfer inside and from the corium pool. LES simulations require a very fine grid to capture the crust formation and the free convection flow. This fine-mesh requirement, combined with the long transient, imposed the use of a slice of the 3D calandria geometry so as not to exceed the available computing resources. The preliminary results include heat transfer coefficients, temperature profiles and heat fluxes through the calandria wall. From the safety point of view it is very important to keep the heat flux through the wall below the CHF, thus assuring the integrity of the calandria vessel. This can be achieved by proper cooling of the water in the tank that contains the vessel. The transient duration can also be estimated, which is important for developing severe accident management guidelines. The debris physical structure and material properties have large uncertainties in the temperature range of interest. Thus, further sensitivity studies should be carried out in order to better understand the influence of these parameters on this complex phenomenon. (authors)

  5. Innovative approach to modeling accident response of Gravel Gerties

    SciTech Connect

    Kramer, M.; McClure, P.; Sullivan, H.

    1997-08-01

    Recent safety analyses at nuclear explosive facilities have renewed interest in the accident phenomenology associated with explosions in nuclear explosive cells, which are commonly referred to as "Gravel Gerties." The cells are used for the assembly and disassembly of nuclear explosives and are located in the Device Assembly Facility (DAF) at the Nevada Test Site (NTS) and at the Pantex facility. The cells are designed to mitigate the release of special nuclear material to the environment in the event of a detonation of high explosive within the Gravel Gertie. Although there are some subtle differences between the cells at DAF and Pantex, their general design, geometry, and configuration are similar. The cells consist of a round room approximately 10.4 m in diameter and 5.2 m high enclosed by 0.3-m-thick concrete. Each cell has a wire-rope catenary roof overlain with gravel. The gravel is approximately 6.9 m deep at the center of the roof and decreases toward the outer edge of the cell. The cell is connected to a corridor and subsequent rooms through an interlocking blast door. In the event of an accidental explosion involving significant amounts of high explosive, the roof structure is lifted by the force of the explosion, the supporting cables break, the gravel is lifted by the blast (resulting in rapid venting of the cell), and the gravel roof collapses, filling the cell. The lifting and subsequent collapse of the gravel, which acts much like a piston, is very challenging to model.

  6. PREDICTIVE MODELS. Enhanced Oil Recovery Model

    SciTech Connect

    Ray, R.M.

    1992-02-26

    PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: (1) chemical flooding, where soap-like surfactants are injected into the reservoir to wash out the oil; (2) carbon dioxide miscible flooding, where carbon dioxide mixes with the lighter hydrocarbons making the oil easier to displace; (3) in-situ combustion, which uses the heat from burning some of the underground oil to thin the product; (4) polymer flooding, where thick, cohesive material is pumped into a reservoir to push the oil through the underground rock; and (5) steamflood, where pressurized steam is injected underground to thin the oil. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs, which have been previously waterflooded to residual oil saturation. Thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide miscible flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance predictive algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes.

  7. A method for modeling and analysis of directed weighted accident causation network (DWACN)

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Ding, Jing

    2015-11-01

    Using complex network theory to analyze accidents is effective for understanding the causes of accidents in complex systems. In this paper, a novel method is proposed to establish a directed weighted accident causation network (DWACN) for the Rail Accident Investigation Branch (RAIB) in the UK, based on complex network theory and the event chains of accidents. The DWACN is composed of 109 nodes, which denote causal factors, and 260 directed weighted edges, which represent complex interrelationships among factors. The statistical properties of directed weighted complex networks are applied to reveal the critical factors, the key event chains and the important classes in the DWACN. Analysis results demonstrate that the DWACN has the characteristics of a small-world network, with short average path length and high weighted clustering coefficient, and displays the properties of scale-free networks in that the cumulative degree distribution follows an exponential function. This modeling and analysis method can help to discover the latent rules of accidents and the features of fault propagation, and thus to reduce accidents. This paper is a further development of research on accident analysis methods using complex networks.
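    To make the network statistics mentioned above concrete, the sketch below builds a toy directed weighted causation graph with networkx and computes weighted degrees, a weighted clustering coefficient and the average shortest path length. The node names and weights are invented; the RAIB network itself is not reproduced.

```python
import networkx as nx

# Toy directed weighted accident-causation network (factor names are illustrative only).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("track defect", "derailment", 3),
    ("signal error", "collision", 5),
    ("fatigue", "signal error", 2),
    ("fatigue", "derailment", 1),
    ("derailment", "casualties", 4),
    ("collision", "casualties", 6),
])

# Weighted out-degree as a simple proxy for the importance of a causal factor.
print(sorted(G.out_degree(weight="weight"), key=lambda kv: -kv[1]))

# Small-world style statistics are often quoted on the undirected view of the network.
U = G.to_undirected()
print("weighted clustering:", nx.average_clustering(U, weight="weight"))
print("average path length:", nx.average_shortest_path_length(U))
```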

  8. Predictive Models of Liver Cancer

    EPA Science Inventory

    Predictive models of chemical-induced liver cancer face the challenge of bridging causative molecular mechanisms to adverse clinical outcomes. The latent sequence of intervening events from chemical insult to toxicity are poorly understood because they span multiple levels of bio...

  9. Highway accident severities and the mixed logit model: an exploratory empirical analysis.

    PubMed

    Milton, John C; Shankar, Venky N; Mannering, Fred L

    2008-01-01

    Many transportation agencies use accident frequencies, and statistical models of accident frequencies, as a basis for prioritizing highway safety improvements. However, the use of accident severities in safety programming has often been limited to the locational assessment of accident fatalities, with little or no emphasis placed on the full severity distribution of accidents (property damage only, possible injury, injury), which is needed to fully assess the benefits of competing safety-improvement projects. In this paper we demonstrate a modeling approach that can be used to better understand the injury-severity distributions of accidents on highway segments, and the effect that traffic, highway and weather characteristics have on these distributions. The approach we use allows for the possibility that estimated model parameters vary randomly across roadway segments to account for unobserved effects potentially relating to roadway characteristics, environmental factors, and driver behavior. Using highway-injury data from Washington State, a mixed (random parameters) logit model is estimated. Estimation findings indicate that volume-related variables such as average daily traffic per lane, average daily truck traffic, truck percentage, interchanges per mile and weather effects such as snowfall are best modeled as random parameters, while roadway characteristics such as the number of horizontal curves, number of grade breaks per mile and pavement friction are best modeled as fixed parameters. Our results show that the mixed logit model has considerable promise as a methodological tool in highway safety programming. PMID:18215557

  10. Inter-comparison of dynamic models for radionuclide transfer to marine biota in a Fukushima accident scenario.

    PubMed

    Vives I Batlle, J; Beresford, N A; Beaugelin-Seiller, K; Bezhenar, R; Brown, J; Cheng, J-J; Ćujić, M; Dragović, S; Duffa, C; Fiévet, B; Hosseini, A; Jung, K T; Kamboj, S; Keum, D-K; Kryshev, A; LePoire, D; Maderich, V; Min, B-I; Periáñez, R; Sazykina, T; Suh, K-S; Yu, C; Wang, C; Heling, R

    2016-03-01

    We report an inter-comparison of eight models designed to predict the radiological exposure of radionuclides in marine biota. The models were required to simulate dynamically the uptake and turnover of radionuclides by marine organisms. Model predictions of radionuclide uptake and turnover using kinetic calculations based on biological half-life (TB1/2) and/or more complex metabolic modelling approaches were used to predict activity concentrations and, consequently, dose rates of (90)Sr, (131)I and (137)Cs to fish, crustaceans, macroalgae and molluscs under circumstances where the water concentrations are changing with time. For comparison, the ERICA Tool, a model commonly used in environmental assessment, and which uses equilibrium concentration ratios, was also used. As input to the models we used hydrodynamic forecasts of water and sediment activity concentrations using a simulated scenario reflecting the Fukushima accident releases. Although model variability is important, the intercomparison gives logical results, in that the dynamic models predict consistently a pattern of delayed rise of activity concentration in biota and slow decline instead of the instantaneous equilibrium with the activity concentration in seawater predicted by the ERICA Tool. The differences between ERICA and the dynamic models increase the shorter the TB1/2 becomes; however, there is significant variability between models, underpinned by parameter and methodological differences between them. The need to validate the dynamic models used in this intercomparison has been highlighted, particularly in regards to optimisation of the model biokinetic parameters. PMID:26717350

  11. A Statistical Approach to Predict the Failure Enthalpy and Reliability of Irradiated PWR Fuel Rods During Reactivity-Initiated Accidents

    SciTech Connect

    Nam, Cheol; Jeong, Yong-Hwan; Jung, Youn-Ho

    2001-11-15

    During the last decade, the failure behavior of high-burnup fuel rods under a reactivity-initiated accident (RIA) condition has been a serious concern since fuel rod failures at low enthalpy have been observed. This has resulted in the reassessment of existing licensing criteria and failure-mode study. To address the issue, a statistics-based methodology is suggested to predict failure probability of irradiated fuel rods under an RIA. Based on RIA simulation results in the literature, a failure enthalpy correlation for an irradiated fuel rod is constructed as a function of oxide thickness, fuel burnup, and pulse width. Using the failure enthalpy correlation, a new concept of ''equivalent enthalpy'' is introduced to reflect the effects of the three primary factors as well as peak fuel enthalpy into a single damage parameter. Moreover, the failure distribution function with equivalent enthalpy is derived, applying a two-parameter Weibull statistical model. Finally, the sensitivity analysis is carried out to estimate the effects of burnup, corrosion, peak fuel enthalpy, pulse width, and cladding materials used.
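    As a rough illustration of the two-parameter Weibull failure model described above, the sketch below evaluates the failure probability as a function of an equivalent enthalpy. The scale and shape parameters and the enthalpy values are placeholders, not the fitted values from the paper.

```python
import numpy as np

def failure_probability(equiv_enthalpy, scale, shape):
    """Two-parameter Weibull CDF: P(failure) = 1 - exp(-(H/eta)**beta)."""
    return 1.0 - np.exp(-(equiv_enthalpy / scale) ** shape)

# Placeholder Weibull parameters (cal/g for the scale); the paper's fitted values
# are not reproduced here.
eta, beta = 150.0, 4.0
for h in (60.0, 100.0, 140.0):
    print(f"equivalent enthalpy {h:.0f} cal/g -> P(failure) = {failure_probability(h, eta, beta):.3f}")
```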

  12. Predictive Capability Maturity Model (PCMM).

    SciTech Connect

    Swiler, Laura Painton; Knupp, Patrick Michael; Urbina, Angel

    2010-10-01

    The Predictive Capability Maturity Model (PCMM) is a communication tool that must include a discussion of the supporting evidence. PCMM is a tool for managing risk in the use of modeling and simulation, and it serves to organize evidence that helps tell the modeling and simulation (M&S) story. The PCMM table describes which activities within each element are undertaken at each level of maturity. Target levels of maturity can be established based on the intended application. The assessment informs what level has been achieved compared to the desired level, helps prioritize the VU activities, and supports the allocation of resources.

  13. Cellular automata model simulating traffic car accidents in the on-ramp system

    NASA Astrophysics Data System (ADS)

    Echab, H.; Lakouari, N.; Ez-Zahraouy, H.; Benyoussef, A.

    2015-01-01

    In this paper, using the Nagel-Schreckenberg model, we study the on-ramp system under an expanded open boundary condition. The phase diagram of the two-lane on-ramp system is computed. It is found that the expanded left-boundary insertion strategy enhances the flow in the on-ramp lane. Furthermore, we have studied the probability of the occurrence of car accidents. We distinguish two types of car accidents: accidents at the on-ramp site (Prc) and rear-end accidents on the main road (Pac). It is shown that car accidents at the on-ramp site are more likely to occur when traffic is free on road A, whereas rear-end accidents begin to occur above a critical injecting rate αc1. The influence of the on-ramp length (LB) and position (xC0) on the car accident probabilities is studied. We find that a large LB or xC0 causes a significant decrease in the probability Prc, whereas only a large xC0 increases the probability Pac. The effect of stochastic randomization is also computed.
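    The paper builds on the Nagel-Schreckenberg cellular automaton. A minimal single-lane, periodic-road implementation of its four update rules (acceleration, gap-limited braking, random slowdown, movement) is sketched below; it omits the on-ramp, the open boundaries and the accident counting studied in the paper, and the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3):
    """One Nagel-Schreckenberg update on a circular single-lane road.
    pos and vel are arrays of car positions (in cyclic order) and speeds."""
    n = len(pos)
    gaps = (np.roll(pos, -1) - pos - 1) % road_len    # empty cells ahead of each car
    vel = np.minimum(vel + 1, v_max)                  # 1. acceleration
    vel = np.minimum(vel, gaps)                       # 2. braking to avoid collision
    slow = rng.random(n) < p_slow
    vel = np.maximum(vel - slow, 0)                   # 3. random slowdown
    pos = (pos + vel) % road_len                      # 4. movement
    return pos, vel

pos = np.array([0, 3, 7, 12, 20])
vel = np.zeros_like(pos)
for _ in range(10):
    pos, vel = nasch_step(pos, vel, road_len=30)
print(pos, vel)
```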

  14. Mars solar conjunction prediction modeling

    NASA Astrophysics Data System (ADS)

    Srivastava, Vineet K.; Kumar, Jai; Kulshrestha, Shivali; Kushvah, Badam Singh

    2016-01-01

    During the Mars solar conjunction, telecommunication and tracking between the spacecraft and the Earth degrades significantly. The radio signal degradation depends on the angular separation between the Sun, Earth and probe (SEP), the signal frequency band and the solar activity. All radiometric tracking data types display increased noise and signatures for smaller SEP angles. Due to scintillation, telemetry frame errors increase significantly when solar elongation becomes small enough. This degradation in telemetry data return starts at solar elongation angles of around 5° at S-band, around 2° at X-band and about 1° at Ka-band. This paper presents a mathematical model for predicting Mars superior solar conjunction for any Mars orbiting spacecraft. The described model is simulated for the Mars Orbiter Mission which experienced Mars solar conjunction during May-July 2015. Such a model may be useful to flight projects and design engineers in the planning of Mars solar conjunction operational scenarios.
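    A small sketch of the geometric quantity driving the degradation described above: the Sun-Earth-Probe (SEP) angle computed from heliocentric position vectors. The positions used are illustrative, not real ephemerides, and the paper's conjunction-prediction model is not reproduced.

```python
import numpy as np

def sep_angle_deg(r_earth_sun, r_probe_sun):
    """Sun-Earth-Probe angle (degrees) from heliocentric position vectors
    of the Earth and the probe, in any consistent units."""
    r_earth = np.asarray(r_earth_sun, dtype=float)
    r_probe = np.asarray(r_probe_sun, dtype=float)
    earth_to_sun = -r_earth
    earth_to_probe = r_probe - r_earth
    cosang = np.dot(earth_to_sun, earth_to_probe) / (
        np.linalg.norm(earth_to_sun) * np.linalg.norm(earth_to_probe))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Illustrative heliocentric positions in AU: Earth on the +x axis, Mars almost
# directly behind the Sun, giving a small SEP angle typical of conjunction.
print(sep_angle_deg([1.0, 0.0, 0.0], [-1.5, 0.1, 0.0]))
```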

  15. Computer program predicts thermal and flow transients experienced in a reactor loss- of-flow accident

    NASA Technical Reports Server (NTRS)

    Hale, C. J.

    1967-01-01

    Program analyzes the consequences of a loss-of-flow accident in the primary cooling system of a heterogeneous light-water moderated and cooled nuclear reactor. It produces a 36 x 41 (x, y) temperature matrix that includes fuel surface temperatures relative to the time at which pump power was lost.

  16. Object-Oriented Bayesian Networks (OOBN) for Aviation Accident Modeling and Technology Portfolio Impact Assessment

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Ancel, Ersin; Jones, Sharon M.

    2012-01-01

    The concern for reducing aviation safety risk is rising as the National Airspace System in the United States transforms to the Next Generation Air Transportation System (NextGen). The NASA Aviation Safety Program is committed to developing an effective aviation safety technology portfolio to meet the challenges of this transformation and to mitigate relevant safety risks. The paper focuses on the reasoning behind selecting Object-Oriented Bayesian Networks (OOBN) as the technique and commercial software for the accident modeling and portfolio assessment. To illustrate the benefits of OOBN in a large and complex aviation accident model, the in-flight Loss-of-Control Accident Framework (LOCAF), constructed as an influence diagram, is presented. An OOBN approach not only simplifies the construction and maintenance of complex causal networks for the modelers, but also offers a well-organized hierarchical network that makes it easier for decision makers to use the model when examining the effectiveness of risk mitigation strategies introduced through technology insertions.

  17. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    SciTech Connect

    Not Available

    1988-12-15

    The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  18. Modeling the early-phase redistribution of radiocesium fallouts in an evergreen coniferous forest after Chernobyl and Fukushima accidents.

    PubMed

    Calmon, P; Gonze, M-A; Mourlon, Ch

    2015-10-01

    Following the Chernobyl accident, the scientific community gained numerous data on the transfer of radiocesium in European forest ecosystems, including information regarding the short-term redistribution of atmospheric fallout onto forest canopies. In the course of international programs, the French Institute for Radiological Protection and Nuclear Safety (IRSN) developed a forest model, named TREE4 (Transfer of Radionuclides and External Exposure in FORest systems), 15 years ago. Recently published papers on a Japanese evergreen coniferous forest contaminated by Fukushima radiocesium fallout provide interesting quantitative data on radioactive mass fluxes measured within the forest in the months following the accident. The present study determined whether the approach adopted in the TREE4 model provides satisfactory results for Japanese forests or whether it requires adjustments. This study focused on the interception of airborne radiocesium by the forest canopy and the subsequent transfer to the forest floor through processes such as litterfall, throughfall, and stemflow, in the months following the accident. We demonstrated that TREE4 quite satisfactorily predicted the interception fraction (20%) and the canopy-to-soil transfer (70% of the total deposit in 5 months) in the Tochigi forest. These dynamics were similar to those observed in the Höglwald spruce forest. However, the unexpectedly high contribution of litterfall (31% in 5 months) in the Tochigi forest could not be reproduced in our simulations (2.5%). Possible reasons for this discrepancy are discussed, and the sensitivity of the results to uncertainty in the deposition conditions is analyzed. PMID:26005747

  19. Development of fission-products transport model in severe-accident scenarios for Scdap/Relap5

    NASA Astrophysics Data System (ADS)

    Honaiser, Eduardo Henrique Rangel

    Understanding and estimating the release of fission products during a severe accident became one of the priorities of the nuclear community after the Three Mile Island unit 2 (TMI-2) accident in 1979 and the Chernobyl accident in 1986. Since then, theoretical developments and experiments have shown that the primary circuit systems of light water reactors (LWRs) have the potential to attenuate the release of fission products, a fact that had previously been neglected. An advanced tool, compatible with nuclear thermal-hydraulics integral codes, is developed to predict the retention and physical evolution of fission products in the primary circuit of LWRs, without considering chemistry effects. The tool embodies state-of-the-art models for the phenomena involved and also develops new models. The capabilities acquired with the implementation of this tool in the Scdap/Relap5 code can be used to increase the accuracy of level 2 probabilistic safety assessment (PSA), enhance reactor accident management procedures and support the design of new emergency safety features.

  20. Climate Modeling and Prediction at NSIPP

    NASA Technical Reports Server (NTRS)

    Suarez, Max; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The talk will review modeling and prediction efforts undertaken as part of NASA's Seasonal to Interannual Prediction Project (NSIPP). The focus will be on atmospheric model results, including its use for experimental seasonal prediction and the diagnostic analysis of climate anomalies. The model's performance in coupled experiments with land and atmosphere models will also be discussed.

  1. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    SciTech Connect

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  2. Predictive models and computational toxicology.

    PubMed

    Knudsen, Thomas; Martin, Matthew; Chandler, Kelly; Kleinstreuer, Nicole; Judson, Richard; Sipes, Nisha

    2013-01-01

    Understanding the potential health risks posed by environmental chemicals is a significant challenge elevated by the large number of diverse chemicals with generally uncharacterized exposures, mechanisms, and toxicities. The ToxCast computational toxicology research program was launched by EPA in 2007 and is part of the federal Tox21 consortium to develop a cost-effective approach for efficiently prioritizing the toxicity testing of thousands of chemicals and the application of this information to assessing human toxicology. ToxCast addresses this problem through an integrated workflow using high-throughput screening (HTS) of chemical libraries across more than 650 in vitro assays including biochemical assays, human cells and cell lines, and alternative models such as mouse embryonic stem cells and zebrafish embryo development. The initial phase of ToxCast profiled a library of 309 environmental chemicals, mostly pesticidal actives having rich in vivo data from guideline studies that include chronic/cancer bioassays in mice and rats, multigenerational reproductive studies in rats, and prenatal developmental toxicity endpoints in rats and rabbits. The first phase of ToxCast was used to build models that aim to determine how well in vivo animal effects can be predicted solely from the in vitro data. Phase I is now complete and both the in vitro data (ToxCast) and anchoring in vivo database (ToxRefDB) have been made available to the public (http://actor.epa.gov/). As Phase II of ToxCast is now underway, the purpose of this chapter is to review progress to date with ToxCast predictive modeling, using specific examples on developmental and reproductive effects in rats and rabbits with lessons learned during Phase I. PMID:23138916

  3. Severe accident modeling of a PWR core with different cladding materials

    SciTech Connect

    Johnson, S. C.; Henry, R. E.; Paik, C. Y.

    2012-07-01

    The MAAP v.4 software has been used to model two severe accident scenarios in nuclear power reactors with three different materials as fuel cladding. The TMI-2 severe accident was modeled with Zircaloy-2 and SiC as clad material, and an SBO accident in a Zion-like, 4-loop, Westinghouse PWR was modeled with Zircaloy-2, SiC, and 304 stainless steel as clad material. TMI-2 modeling results indicate that lower peak core temperatures, less H2(g) produced, and a smaller mass of molten material would result if SiC were substituted for Zircaloy-2 as cladding. SBO modeling results indicate that the calculated time to RCS rupture would increase by approximately 20 minutes if SiC were substituted for Zircaloy-2. Additionally, when an extended SBO accident (RCS creep rupture failure disabled) was modeled, significantly lower peak core temperatures, less H2(g) produced, and a smaller mass of molten material would result from substituting SiC for Zircaloy-2 or stainless steel cladding. Because the rate of the SiC oxidation reaction with elevated-temperature H2O(g) was set to 0 for this work, these results should be considered preliminary. However, the benefits of SiC as a more accident tolerant clad material have been shown, and additional investigations of SiC as an LWR core material are warranted, specifically investigations of the oxidation kinetics of SiC in H2O(g) over the range of temperatures and pressures relevant to severe accidents in LWRs. (authors)

  4. Radiological assessment by compartment model POSEIDON-R of radioactivity released in the ocean following Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Bezhenar, Roman; Maderich, Vladimir; Heling, Rudie; Jung, Kyung Tae; Myoung, Jung-Goo

    2013-04-01

    The modified compartment model POSEIDON-R (Lepicard et al, 2004) was applied to the North-Western Pacific and adjacent seas. This is the first time that a compartment model has been used in this region, where 25 nuclear power plants (NPPs) are operated. The aim of this study is to perform a radiological assessment of the releases of radioactivity due to the Fukushima Daiichi accident. The model predicts the dispersion of radioactivity in the water column and in the sediments, the transfer of radionuclides throughout the marine food web, and the subsequent doses to the population due to the consumption of fishery products. A generic predictive dynamical food-chain model is used instead of the concentration factor (CF) approach. The radionuclide uptake model for fish has as its central feature the accumulation of radionuclides in the target tissue. A three-layer structure of the water column makes it possible to describe deep-water transport adequately. In total, 175 boxes cover the Northwestern Pacific, the East China Sea, the Yellow Sea and the East/Japan Sea. Water fluxes between boxes were calculated by averaging three-dimensional currents obtained from the hydrodynamic model ROMS over a 10-year period. Tidal mixing between boxes was parameterized. The model was validated against observations of Cs-137 in water for the period 1945-2004. The weapons-test source terms comprise a regional source term from the bomb tests at Enewetak and Bikini Atolls and global deposition from weapons tests. The correlation coefficient between predicted and observed concentrations of Cs-137 in the surface water is 0.925 with RMSE=1.43 Bq/m3. A local-scale coastal box was used, according to POSEIDON's methodology, to describe local processes of activity transport, deposition and the food web around the Fukushima Daiichi NPP. The source term to the ocean from the Fukushima accident includes a 10-day release of Cs-134 (5 PBq) and Cs-137 (4 PBq) directly into the ocean and 6 and 5 PBq of Cs-134 and

  5. Investigation of shipping accident injury severity and mortality.

    PubMed

    Weng, Jinxian; Yang, Dong

    2015-03-01

    Shipping movements are operated in a complex and high-risk environment, and fatal shipping accidents are the nightmares of seafarers. Using ten years of worldwide ship accident data, this study develops a binary logistic regression model and a zero-truncated binomial regression model to predict the probability of fatal shipping accidents and the corresponding mortalities. The model results show that both the probability of a fatal accident and the number of mortalities are greater for collision, fire/explosion, contact, grounding and sinking accidents that occur in adverse weather and darkness. Sinking has the largest effect on the increase in fatal accident probability and mortalities. The results also show that larger numbers of mortalities are associated with shipping accidents that occur far away from coastal areas, harbors or ports. In addition, cruise ships are found to have more mortalities than non-cruise ships. The results of this study are beneficial for policy-makers in proposing efficient strategies to prevent fatal shipping accidents. PMID:25617776
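    A minimal sketch of the first of the two model types mentioned above: a binary logistic regression for fatal versus non-fatal accidents, fitted with statsmodels on synthetic data. The predictors and coefficients are invented, and the zero-truncated mortality model is not shown.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic illustration only: 1 = fatal accident; the indicator predictors for
# sinking-type accidents and adverse weather are made up, not the paper's data.
n = 500
sinking = rng.integers(0, 2, n)
bad_weather = rng.integers(0, 2, n)
logit_p = -2.0 + 1.5 * sinking + 0.8 * bad_weather        # assumed "true" coefficients
fatal = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

X = sm.add_constant(np.column_stack([sinking, bad_weather]))
model = sm.Logit(fatal.astype(int), X).fit(disp=0)
print(model.params)    # fitted coefficients should be close to the assumed ones
```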

  6. Predictive Modeling of Cardiac Ischemia

    NASA Technical Reports Server (NTRS)

    Anderson, Gary T.

    1996-01-01

    The goal of the Contextual Alarms Management System (CALMS) project is to develop sophisticated models to predict the onset of clinical cardiac ischemia before it occurs. The system will continuously monitor cardiac patients and set off an alarm when they appear about to suffer an ischemic episode. The models take as inputs information from patient history and combine it with continuously updated information extracted from blood pressure, oxygen saturation and ECG lines. Expert system, statistical, neural network and rough set methodologies are then used to forecast the onset of clinical ischemia before it transpires, thus allowing early intervention aimed at preventing morbid complications from occurring. The models will differ from previous attempts by including combinations of continuous and discrete inputs. A commercial medical instrumentation and software company has invested funds in the project with a goal of commercialization of the technology. The end product will be a system that analyzes physiologic parameters and produces an alarm when myocardial ischemia is present. If proven feasible, a CALMS-based system will be added to existing heart monitoring hardware.

  7. Predictive Modeling of Tokamak Configurations*

    NASA Astrophysics Data System (ADS)

    Casper, T. A.; Lodestro, L. L.; Pearlstein, L. D.; Bulmer, R. H.; Jong, R. A.; Kaiser, T. B.; Moller, J. M.

    2001-10-01

    The Corsica code provides comprehensive toroidal plasma simulation and design capabilities with current applications [1] to tokamak, reversed field pinch (RFP) and spheromak configurations. It calculates fixed and free boundary equilibria coupled to Ohm's law, sources, transport models and MHD stability modules. We are exploring operations scenarios for both the DIII-D and KSTAR tokamaks. We will present simulations of the effects of electron cyclotron heating (ECH) and current drive (ECCD) relevant to the Quiescent Double Barrier (QDB) regime on DIII-D exploring long pulse operation issues. KSTAR simulations using ECH/ECCD in negative central shear configurations explore evolution to steady state while shape evolution studies during current ramp up using a hyper-resistivity model investigate startup scenarios and limitations. Studies of high bootstrap fraction operation stimulated by recent ECH/ECCD experiments on DIIID will also be presented. [1] Pearlstein, L.D., et al, Predictive Modeling of Axisymmetric Toroidal Configurations, 28th EPS Conference on Controlled Fusion and Plasma Physics, Madeira, Portugal, June 18-22, 2001. * Work performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  8. Using meteorological ensembles for atmospheric dispersion modelling of the Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Périllat, Raphaël; Korsakissok, Irène; Mallet, Vivien; Mathieu, Anne; Sekiyama, Thomas; Didier, Damien; Kajino, Mizuo; Igarashi, Yasuhito; Adachi, Kouji

    2016-04-01

    Dispersion models are used in response to an accidental release of radionuclides to the atmosphere, to infer mitigation actions and to complement field measurements for the assessment of short- and long-term environmental and sanitary impacts. However, the predictions of these models are subject to important uncertainties, especially due to input data such as meteorological fields or the source term. This is still the case more than four years after the Fukushima disaster (Korsakissok et al., 2012, Girard et al., 2014). In the framework of the SAKURA project, an MRI-IRSN collaboration, a meteorological ensemble of 20 members designed by MRI (Sekiyama et al. 2013) was used with IRSN's atmospheric dispersion models. Another ensemble, retrieved from ECMWF and comprising 50 members, was also used for comparison. The MRI ensemble is 3-hour assimilated, with a 3-kilometer resolution, designed to reduce the meteorological uncertainty in the Fukushima case. The ECMWF ensemble is a 24-hour forecast with a coarser grid, representative of the uncertainty of the data available in a crisis context. First, it was necessary to assess the quality of the ensembles for our purpose, to ensure that their spread was representative of the uncertainty of the meteorological fields. Using meteorological observations allowed characterizing the ensembles' spread, with tools such as Talagrand diagrams. Then, the uncertainty was propagated through atmospheric dispersion models. The underlying question is whether the output spread is larger than the input spread, that is, whether small uncertainties in meteorological fields can produce large differences in atmospheric dispersion results. Here again, the use of field observations was crucial, in order to characterize the spread of the ensemble of atmospheric dispersion simulations. In the case of the Fukushima accident, gamma dose rates, air activities and deposition data were available. Based on these data, selection criteria for the ensemble members were

  9. Modelling of conspicuity-related motorcycle accidents in Seremban and Shah Alam, Malaysia.

    PubMed

    Radin, U R; Mackay, M G; Hills, B L

    1996-05-01

    Preliminary analysis of the short-term impact of a running headlights intervention revealed that there has been a significant drop in conspicuity-related motorcycle accidents in the pilot areas, Seremban and Shah Alam, Malaysia. This paper looks in more detail at conspicuity-related accidents involving motorcycles. The aim of the analysis was to establish a statistical model describing the relationship between the frequency of conspicuity-related motorcycle accidents and a range of explanatory variables, so that new insights could be obtained into the effects of introducing a running headlight campaign and regulation. The exogenous variables in this analysis include the influence of time trends, changes in the recording and analysis system, the effect of fasting activities during Ramadhan and the "Balik Kampong" culture, a seasonal cultural-religious holiday activity unique to Malaysia. The model developed revealed that the running headlight intervention reduced conspicuity-related motorcycle accidents by about 29%. It is concluded that the intervention has been successful in reducing conspicuity-related motorcycle accidents in Malaysia. PMID:8799436

  10. Numerical weather prediction model tuning via ensemble prediction system

    NASA Astrophysics Data System (ADS)

    Jarvinen, H.; Laine, M.; Ollinaho, P.; Solonen, A.; Haario, H.

    2011-12-01

    This paper discusses a novel approach to tuning the predictive skill of numerical weather prediction (NWP) models. NWP models contain tunable parameters which appear in parameterization schemes of sub-grid scale physical processes. Currently, numerical values of these parameters are specified manually. In a recent dual manuscript (QJRMS, revised) we developed a new concept and method for on-line estimation of the NWP model parameters. The EPPES ("Ensemble prediction and parameter estimation system") method requires only minimal changes to the existing operational ensemble prediction infrastructure and seems very cost-effective because practically no new computations are introduced. The approach provides an algorithmic decision-making tool for model parameter optimization in operational NWP. In EPPES, statistical inference about the NWP model tunable parameters is made by (i) generating each member of the ensemble of predictions using different model parameter values, drawn from a proposal distribution, and (ii) feeding back the relative merits of the parameter values to the proposal distribution, based on evaluation of a suitable likelihood function against verifying observations. In the presentation, the method is first illustrated in low-order numerical tests using a stochastic version of the Lorenz-95 model, which effectively emulates the principal features of ensemble prediction systems. The EPPES method correctly detects the unknown and wrongly specified parameter values, and leads to improved forecast skill. Second, results with an atmospheric general circulation model based ensemble prediction system show that the NWP model tuning capacity of EPPES scales up to realistic models and ensemble prediction systems. Finally, a global top-end NWP model tuning exercise with preliminary results is presented.
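    The low-order test bed mentioned above is the Lorenz-95 (often written Lorenz-96) model. The sketch below integrates its deterministic tendency equation with a simple RK4 scheme, with the forcing parameter F standing in for the kind of tunable parameter EPPES would estimate; the stochastic version and the EPPES algorithm itself are not reproduced.

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """Lorenz-96 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, f):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Standard 40-variable configuration with a small perturbation to trigger chaos.
x = 8.0 * np.ones(40)
x[0] += 0.01
for _ in range(2000):
    x = rk4_step(x, 0.05, lorenz96_tendency)
print(x[:5])
```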

  11. Future synergism in diving accident management: The Singapore model.

    PubMed

    Chong, Si Jack; Liang, Weihao; Kim, Soo Jang; Kang, Wee Lee

    2010-03-01

    The popularity of diving as a leisure activity has been an increasing trend in recent years. With the rise of this sport inevitably comes increasing numbers and risk of diving-related injuries and demand for professional medical treatment of such injuries. Concurrently, with hyperbaric oxygen therapy (HBOT) being more readily available, new applications for HBOT have been proven for the treatment of various medical conditions. In Singapore, diving and hyperbaric medicine was largely a military medicine specialty and its practice confined to the Singapore Armed Forces for many years. The new Hyperbaric and Diving Medicine Centre set up in Singapore General Hospital (SGH) offers an excellent opportunity for collaboration between the Singapore Navy Medical Service (NMS) and SGH. This combines the expertise in the field of diving and hyperbaric medicine that NMS provides, with the resources and specialized services available at SGH. This collaboration was officially formalized by the recent signing of a Memorandum of Understanding between the two organisations. The partnership will allow both organisations to leverage on each other's strengths and enhance the development of research and training capabilities. This collaboration will also be an important step towards formal recognition and accreditation of diving and hyperbaric medicine as a medical subspecialty in the foreseeable future, thus helping to develop and promote diving and hyperbaric medicine in Singapore. This synergistic approach in diving accident management will also promote and establish Singapore as a leader in the field of diving and hyperbaric medicine in the region. PMID:23111838

  12. A dynamic model for evaluating radionuclide distribution in forests from nuclear accidents.

    PubMed

    Schell, W R; Linkov, I; Myttenaere, C; Morel, B

    1996-03-01

    The Chernobyl Nuclear Power Plant accident in 1986 caused radionuclide contamination in most countries in Eastern and Western Europe. A prime example is Belarus, where 23% of the total land area received chronic levels; about 1.5 x 10^6 ha of forested lands were contaminated with 40-190 kBq m-2 and 2.5 x 10^4 ha received greater than 1,480 kBq m-2 of 137Cs and other long-lived radionuclides such as 90Sr and 239,240Pu. Since the radiological dose to the forest ecosystem will tend to accumulate over long time periods (decades to centuries), we need to determine what countermeasures can be taken to limit this dose so that the affected regions can, once again, safely provide habitat and natural forest products. To address some of these problems, our initial objective is to formulate a generic model, FORESTPATH, which describes the major kinetic processes and pathways of radionuclide movement in forests and natural ecosystems and which can be used to predict future radionuclide concentrations. The model calculates the time-dependent radionuclide concentrations in different compartments of the forest ecosystem based on the information available on residence half-times in two forest types: coniferous and deciduous. The results show that the model reproduces well the radionuclide cycling pattern found in the literature for deciduous and coniferous forests. Variability analysis was used to assess the relative importance of specific parameter values in the generic model performance. The FORESTPATH model can be easily adjusted for site-specific applications. PMID:8609024
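    A minimal two-compartment sketch of the kind of kinetics used in such compartment models, with canopy-to-floor transfer derived from a residence half-time and radioactive decay of 137Cs. The residence half-time below is a placeholder, not a FORESTPATH parameter, and the full multi-compartment structure is not reproduced.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder residence half-time for canopy-to-floor transfer (years); 137Cs decay
# half-life is the physical constant (~30.17 years).
half_time_canopy_yr = 0.3
half_life_cs137_yr = 30.17

k_transfer = np.log(2) / half_time_canopy_yr
k_decay = np.log(2) / half_life_cs137_yr

def rhs(t, y):
    """Two compartments: fractional 137Cs inventory in the canopy and on the forest floor."""
    canopy, floor = y
    return [-(k_transfer + k_decay) * canopy,
            k_transfer * canopy - k_decay * floor]

sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], t_eval=np.linspace(0.0, 5.0, 6))
print(sol.y[1])   # forest-floor inventory fraction at yearly intervals
```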

  13. A dynamic model for evaluating radionuclide distribution in forests from nuclear accidents

    SciTech Connect

    Schell, W.R.; Linkov, I.; Myttenaere, C.

    1996-03-01

    The Chernobyl Nuclear Power Plant accident in 1986 caused radionuclide contamination in most countries in Eastern and Western Europe. A prime example is Belarus, where 23% of the total land area received chronic levels; about 1.5 x 10^6 ha of forested lands were contaminated with 40-190 kBq m^-2 and 2.5 x 10^4 ha received greater than 1,480 kBq m^-2 of 137Cs and other long-lived radionuclides such as 90Sr and 239,240Pu. Since the radiological dose to the forest ecosystem will tend to accumulate over long time periods (decades to centuries), we need to determine what countermeasures can be taken to limit this dose so that the affected regions can, once again, safely provide habitat and natural forest products. To address some of these problems, our initial objective is to formulate a generic model, FORESTPATH, which describes the major kinetic processes and pathways of radionuclide movement in forests and natural ecosystems and which can be used to predict future radionuclide concentrations. The model calculates the time-dependent radionuclide concentrations in different compartments of the forest ecosystem based on the information available on residence half-times in two forest types: coniferous and deciduous. The results show that the model reproduces well the radionuclide cycling pattern found in the literature for deciduous and coniferous forests. Variability analysis was used to assess the relative importance of specific parameter values in the generic model performance. The FORESTPATH model can be easily adjusted for site-specific applications. 92 refs., 5 figs., 6 tabs.

  14. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The human error probability (HEP) for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in the parameters of that model. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential

  15. A dynamic model to estimate the activity concentration and whole body dose rate of marine biota as consequences of a nuclear accident.

    PubMed

    Keum, Dong-Kwon; Jun, In; Kim, Byeong-Ho; Lim, Kwang-Muk; Choi, Yong-Ho

    2015-02-01

    This paper describes a dynamic compartment model (K-BIOTA-DYN-M) to assess the activity concentration and whole body dose rate of marine biota as a result of a nuclear accident. The model considers the transport of radioactivity between the marine biota through the food chain, and applies a first-order kinetic model for the sedimentation of radionuclides from seawater onto sediment. A set of ordinary differential equations representing the model is solved simultaneously to calculate the activity concentrations of the biota and the sediment, and subsequently the dose rates, given the seawater activity concentration. The model was applied to investigate the long-term effect of the Fukushima nuclear accident on marine biota using the (131)I, (134)Cs, and (137)Cs activity concentrations of seawater measured for up to about 2.5 years after the accident at two locations in the port of the Fukushima Daiichi Nuclear Power Station (FDNPS), which was the most highly contaminated area. The predicted results showed that the accumulated dose for the 3 months after the accident was about 4-4.5 Gy, indicating the possibility of an acute radiation effect in the early phase after the Fukushima accident; however, the total dose rate for most organisms studied was usually below the UNSCEAR (United Nations Scientific Committee on the Effects of Atomic Radiation) benchmark level for chronic exposure, except in the initial phase of the accident, suggesting a very limited radiological effect on the marine biota at the population level. The Cs sediment activity predicted by the first-order kinetic sedimentation model was in good agreement with the measured activity concentration. By varying the ecological parameter values, the present model was able to predict the widely scattered (137)Cs activity concentrations of fishes measured in the port of FDNPS. Conclusively, the present dynamic model can be usefully applied to estimate the activity concentration and whole

  16. [Guilty victims: a model to perpetuate impunity for work-related accidents].

    PubMed

    Vilela, Rodolfo Andrade Gouveia; Iguti, Aparecida Mari; Almeida, Ildeberto Muniz

    2004-01-01

    This article analyzes reports and data from the investigation of severe and fatal work-related accidents by the Regional Institute of Criminology in Piracicaba, São Paulo State, Brazil. Some 71 accident investigation reports were analyzed from 1998, 1999, and 2000. Accidents involving machinery represented 38.0% of the total, followed by high falls (15.5%), and electric shocks (11.3%). The reports conclude that 80.0% of the accidents are caused by "unsafe acts" committed by workers themselves, while the lack of safety or "unsafe conditions" account for only 15.5% of cases. Victims are blamed even in situations involving high risk in which not even minimum safety conditions are adopted, thus favoring employers' interests. Such conclusions reflect traditional reductionist explanatory models, in which accidents are viewed as simple, unicausal phenomena, generally focused on slipups and errors by the workers themselves. Despite criticism in recent decades from the technical and academic community, this concept is still hegemonic, thus jeopardizing the development of preventive policies and the improvement of work conditions. PMID:15073638

  17. Multilevel modelling for the regional effect of enforcement on road accidents.

    PubMed

    Yannis, George; Papadimitriou, Eleonora; Antoniou, Constantinos

    2007-07-01

    This paper investigates the effect of the intensification of Police enforcement on the number of road accidents at national and regional level in Greece, focusing on one of the most important road safety violations: drinking-and-driving. Multilevel negative binomial models are developed to describe the effect of the intensification of alcohol enforcement on the reduction of road accidents in different regions of Greece. Moreover, two approaches are explored as far as regional clustering is concerned: the first one concerns an ad hoc geographical clustering and the second one is based on the results of mathematical cluster analysis through demographic, transport and road safety characteristics. Results indicate that there are significant spatial dependences among road accidents and enforcement. Additionally, it is shown that these dependences are more efficiently interpreted when regions are determined on the basis of qualitative similarities than on the basis of geographical adjacency. PMID:17274938
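    As a simplified, single-level illustration of the count models used above, the sketch below fits a negative binomial regression of accident counts on an enforcement-intensity index with statsmodels, using synthetic data. The data, coefficients and dispersion are invented, and the multilevel (regional) structure of the paper's models is not reproduced.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Synthetic region-period data (illustrative only): accident counts versus an
# enforcement-intensity index, with an assumed negative relationship.
n = 120
enforcement = rng.uniform(0.0, 1.0, n)
mu = np.exp(3.0 - 1.2 * enforcement)                 # assumed mean accident count
counts = rng.negative_binomial(n=5, p=5.0 / (5.0 + mu))

X = sm.add_constant(enforcement)
nb_model = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
print(nb_model.params)   # a negative slope indicates fewer accidents with more enforcement
```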

  18. Input-output model for MACCS nuclear accident impacts estimation

    SciTech Connect

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    2015-01-27

    Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
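    The Input-Output logic behind such loss estimates can be sketched with the Leontief relation x = (I - A)^{-1} d: a drop in final demand d propagates through inter-industry linkages A into gross output losses. The coefficient matrix and demand vector below are invented, and REAcct's regional data and aggregation are not reproduced.

```python
import numpy as np

# Illustrative 3-sector technical-coefficient matrix A and baseline final demand d
# (all numbers are made up for this sketch).
A = np.array([[0.10, 0.05, 0.02],
              [0.20, 0.15, 0.10],
              [0.05, 0.10, 0.08]])
d_base = np.array([100.0, 200.0, 150.0])

# An accident scenario represented as a sector-specific drop in final demand
# in the affected region (assumed fractions, for illustration).
d_accident = d_base * np.array([0.7, 0.9, 0.8])

leontief_inverse = np.linalg.inv(np.eye(3) - A)
output_loss = leontief_inverse @ (d_base - d_accident)
print("sector output losses:", output_loss)
print("total gross output loss:", output_loss.sum())
```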

  19. A nuclear plant accident diagnosis method to support prediction of errors of commission

    SciTech Connect

    Chang, Y. H. J.; Coyne, K.; Mosleh, A.

    2006-07-01

    The identification and mitigation of operator errors of commission (EOCs) continue to be a major focus of nuclear plant human reliability research. Current Human Reliability Analysis (HRA) methods for predicting EOCs generally rely on the availability of operating procedures or extensive use of expert judgment. Consequently, an analysis for EOCs cannot easily be performed for actions that may be taken outside the scope of the operating procedures. Additionally, current HRA techniques rarely capture an operator's 'creative' problem-solving behavior. However, a nuclear plant operator knowledge base developed for use with the IDAC (Information, Decision, and Action in Crew context) cognitive model shows potential for addressing these limitations. This operator knowledge base currently includes an event-symptom diagnosis matrix for a pressurized water reactor (PWR) nuclear plant. The diagnosis matrix defines a probabilistic relationship between observed symptoms and plant events that models the operator's heuristic process for classifying a plant state. Observed symptoms are obtained from a dynamic thermal-hydraulic plant model and can be modified to account for the limitations of human perception and cognition. A fuzzy-logic inference technique is used to calculate the operator's confidence, or degree of belief, that a given plant event has occurred based on the observed symptoms. An event diagnosis can be categorized as either (a) a generalized flow imbalance of basic thermal-hydraulic properties (e.g., a mass or energy flow imbalance in the reactor coolant system), or (b) a specific event type, such as a steam generator tube rupture or a reactor trip. When an operator is presented with incomplete or contradictory information, this diagnosis approach provides a means to identify situations where an operator might be misled into performing unsafe actions based on an incorrect diagnosis. This knowledge base model could also support identification of potential EOCs when

  20. Modeling and sensitivity analysis of transport and deposition of radionuclides from the Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Hu, X.; Li, D.; Huang, H.; Shen, S.; Bou-Zeid, E.

    2014-01-01

    The atmospheric transport and ground deposition of radioactive isotopes 131I and 137Cs during and after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident (March 2011) are investigated using the Weather Research and Forecasting/Chemistry (WRF/Chem) model. The aim is to assess the skill of WRF in simulating these processes and the sensitivity of the model's performance to various parameterizations of unresolved physics. The WRF/Chem model is first upgraded by implementing a radioactive decay term into the advection-diffusion solver and adding three parameterizations for dry deposition and two parameterizations for wet deposition. Different microphysics and horizontal turbulent diffusion schemes are then tested for their ability to reproduce observed meteorological conditions. Subsequently, the influence on the simulated transport and deposition of the characteristics of the emission source, including the emission rate, the gas partitioning of 131I and the size distribution of 137Cs, is examined. The results show that the model can predict the wind fields and rainfall realistically. The ground deposition of the radionuclides can also potentially be captured well but it is very sensitive to the emission characterization. It is found that the total deposition is most influenced by the emission rate for both 131I and 137Cs; while it is less sensitive to the dry deposition parameterizations. Moreover, for 131I, the deposition is also sensitive to the microphysics schemes, the horizontal diffusion schemes, gas partitioning and wet deposition parameterizations; while for 137Cs, the deposition is very sensitive to the microphysics schemes and wet deposition parameterizations, and it is also sensitive to the horizontal diffusion schemes and the size distribution.
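    One simple way to add a radioactive decay term to an advection-diffusion solver, as described above, is operator splitting: apply exact first-order decay to the tracer field after each transport step. The sketch below is generic illustration only and is not WRF/Chem code; the field values and step length are arbitrary.

```python
import numpy as np

# Exact first-order decay applied after each transport step (operator splitting).
HALF_LIFE_I131_S = 8.02 * 24 * 3600        # ~8.02-day half-life of 131I
decay_const = np.log(2) / HALF_LIFE_I131_S

def apply_decay(concentration, dt):
    """Decay a tracer field over one model time step dt (seconds)."""
    return concentration * np.exp(-decay_const * dt)

c = np.full((4, 4), 1.0e3)                 # toy 2-D concentration field (Bq/m^3)
for _ in range(24):                        # 24 one-hour steps
    # ... advection, diffusion and deposition updates would be applied here ...
    c = apply_decay(c, 3600.0)
print(c[0, 0])                             # ~0.92e3 after one day, consistent with the half-life
```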

  1. Statistical aspects of carbon fiber risk assessment modeling. [fire accidents involving aircraft

    NASA Technical Reports Server (NTRS)

    Gross, D.; Miller, D. R.; Soland, R. M.

    1980-01-01

    The probabilistic and statistical aspects of the carbon fiber risk assessment modeling of fire accidents involving commercial aircraft are examined. Three major sources of uncertainty in the modeling effort are identified. These are: (1) imprecise knowledge in establishing the model; (2) parameter estimation; and (3) Monte Carlo sampling error. All three sources of uncertainty are treated, and statistical procedures are utilized and/or developed to control them wherever possible.
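
    Of the three sources, the Monte Carlo sampling error is the one most directly controlled by the analyst, since it shrinks as 1/sqrt(n). A generic sketch follows (the trial function is an invented placeholder, not the carbon fiber model):

        import numpy as np

        rng = np.random.default_rng(0)

        def one_trial():
            # Placeholder for one simulated fire/fiber-release outcome (invented).
            return rng.exponential(scale=2.0) > 5.0   # e.g. "loss exceeds a threshold"

        n = 100_000
        hits = np.fromiter((one_trial() for _ in range(n)), dtype=float, count=n)
        p_hat = hits.mean()
        se = hits.std(ddof=1) / np.sqrt(n)            # Monte Carlo sampling error
        print(f"estimate {p_hat:.4f} +/- {1.96 * se:.4f} (95% confidence interval)")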

  2. Accident Sequence Precursor Program Large Early Release Frequency Model Development

    SciTech Connect

    Brown, T.D.; Brownson, D.A.; Duran, F.A.; Gregory, J.J.; Rodrick, E.G.

    1999-01-04

    The objective of the ASP large early release frequency (LERF) model development work is to build a Level 2 containment response model that captures all of the events necessary to define LERF as outlined in Regulatory Guide 1.174, can be directly interfaced with the existing Level 1 models, is technically correct, can be readily modified to incorporate new information or to represent another plant, and can be executed in SAPHIRE. The ASP LERF models being developed will meet these objectives while providing the NRC with the capability to independently assess the risk impact of plant-specific changes proposed by the utilities that change the nuclear power plants' licensing basis. Together with the ASP Level 1 models, the ASP LERF models provide the NRC with the capability of performing equipment and event assessments to determine their impact on a plant's LERF for internal events during power operation. In addition, the ASP LERF models are capable of being updated to reflect changes in information regarding the system operations and phenomenological events, and of being updated to assess the potential for early fatalities for each LERF sequence. As the ASP Level 1 models evolve to include more analysis capabilities, the LERF models will also be refined to reflect the appropriate level of detail needed to demonstrate the new capabilities. An approach was formulated for the development of detailed LERF models using the NUREG-1150 APET models as a guide. The modifications to the SAPHIRE computer code have allowed the development of these detailed models and the ability to analyze these models in a reasonable time. Ten reference LERF plant models, including six PWR models and four BWR models, which cover a wide variety of containment and nuclear steam supply system designs, will be complete in 1999. These reference models will be used as the starting point for developing the LERF models for the remaining nuclear power plants.

  3. Generation IV benchmarking of TRISO fuel performance models under accident conditions. Modeling input data

    SciTech Connect

    Blaise Collin

    2014-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, hereafter named NCC (Numerical Calculation Case), is derived from "Case 5" of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. "Case 5" of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to "effects of the numerical calculation method rather than the physical model" [IAEA 2012]. The NCC is therefore intended to check whether these numerical effects persist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read this document

  4. Predictive models of radiative neutrino masses

    NASA Astrophysics Data System (ADS)

    Julio, J.

    2016-06-01

    We discuss two models of radiative neutrino mass generation. The first is a one-loop Zee model with a Z4 symmetry. The second is a two-loop neutrino mass model with singly- and doubly-charged scalars. Both models fit the neutrino oscillation data well and predict interesting rates for lepton flavor violating processes.

  5. How to Establish Clinical Prediction Models

    PubMed Central

    Bang, Heejung

    2016-01-01

    A clinical prediction model can be applied to several challenging clinical scenarios: screening high-risk individuals for asymptomatic disease, predicting future events such as disease or death, and assisting medical decision-making and health education. Despite the impact of clinical prediction models on practice, prediction modeling is a complex process requiring careful statistical analyses and sound clinical judgement. Although there is no definite consensus on the best methodology for model development and validation, a few recommendations and checklists have been proposed. In this review, we summarize five steps for developing and validating a clinical prediction model: preparation for establishing clinical prediction models; dataset selection; handling variables; model generation; and model evaluation and validation. We also review several studies that detail methods for developing clinical prediction models with comparable examples from real practice. After model development and vigorous validation in relevant settings, possibly with evaluation of utility/usability and fine-tuning, good models can be ready for use in practice. We anticipate that this framework will revitalize the use of predictive or prognostic research in endocrinology, leading to active applications in real clinical practice. PMID:26996421

  6. How to Establish Clinical Prediction Models.

    PubMed

    Lee, Yong Ho; Bang, Heejung; Kim, Dae Jung

    2016-03-01

    A clinical prediction model can be applied to several challenging clinical scenarios: screening high-risk individuals for asymptomatic disease, predicting future events such as disease or death, and assisting medical decision-making and health education. Despite the impact of clinical prediction models on practice, prediction modeling is a complex process requiring careful statistical analyses and sound clinical judgement. Although there is no definite consensus on the best methodology for model development and validation, a few recommendations and checklists have been proposed. In this review, we summarize five steps for developing and validating a clinical prediction model: preparation for establishing clinical prediction models; dataset selection; handling variables; model generation; and model evaluation and validation. We also review several studies that detail methods for developing clinical prediction models with comparable examples from real practice. After model development and vigorous validation in relevant settings, possibly with evaluation of utility/usability and fine-tuning, good models can be ready for use in practice. We anticipate that this framework will revitalize the use of predictive or prognostic research in endocrinology, leading to active applications in real clinical practice. PMID:26996421
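
    The five steps summarized in the review map naturally onto a standard supervised-learning workflow. The sketch below is a schematic illustration on synthetic data; the predictors, split and metrics are our assumptions, not prescriptions from the paper.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score, brier_score_loss

        rng = np.random.default_rng(1)
        n = 2000
        X = rng.normal(size=(n, 3))                         # e.g. age, BMI, biomarker (synthetic)
        logit = -1.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1]
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # event indicator

        # Dataset selection / handling variables: hold out part of the data for validation.
        X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

        model = LogisticRegression().fit(X_dev, y_dev)      # model generation
        p_val = model.predict_proba(X_val)[:, 1]

        # Model evaluation and validation: discrimination (AUC) and overall error (Brier score).
        print("AUC        :", round(roc_auc_score(y_val, p_val), 3))
        print("Brier score:", round(brier_score_loss(y_val, p_val), 3))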

  7. Future missions studies: Combining Schatten's solar activity prediction model with a chaotic prediction model

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

    K. Schatten (1991) recently developed a method for combining his prediction model with our chaotic model. The philosophy behind this combined model and his method of combination is explained. Because the Schatten solar prediction model (KS) uses a dynamo to mimic solar dynamics, accurate prediction is limited to long-term solar behavior (10 to 20 years). The chaotic prediction model (SA) uses the recently developed techniques of nonlinear dynamics to predict solar activity. It can be used to predict activity only up to the horizon. In theory, the chaotic prediction should be several orders of magnitude better than statistical predictions up to that horizon; beyond the horizon, chaotic predictions would theoretically be just as good as statistical predictions. Therefore, chaos theory puts a fundamental limit on predictability.

  8. Initial VHTR accident scenario classification: models and data.

    SciTech Connect

    Vilim, R. B.; Feldman, E. E.; Pointer, W. D.; Wei, T. Y. C.; Nuclear Engineering Division

    2005-09-30

    Nuclear systems codes are being prepared for use as computational tools for conducting performance/safety analyses of the Very High Temperature Reactor. The thermal-hydraulic codes are RELAP5/ATHENA for one-dimensional systems modeling and FLUENT and/or Star-CD for three-dimensional modeling. We describe a formal qualification framework, the development of Phenomena Identification and Ranking Tables (PIRTs), the initial filtering of the experiment databases, and a preliminary screening of these codes for use in the performance/safety analyses. In the second year of this project we focused on the development of PIRTs. Two events that result in maximum fuel and vessel temperatures, the Pressurized Conduction Cooldown (PCC) event and the Depressurized Conduction Cooldown (DCC) event, were selected for PIRT generation. A third event that may result in significant thermal stresses, the Load Change event, was also selected for PIRT generation. Gas reactor design experience and engineering judgment were used to identify the important phenomena in the primary system for these events. Sensitivity calculations performed with the RELAP5 code were used as an aid to rank the phenomena in order of importance with respect to the approach of plant response to safety limits. The overall code qualification methodology was illustrated by focusing on the Reactor Cavity Cooling System (RCCS). The mixed convection mode of heat transfer and pressure drop is identified as an important phenomenon for Reactor Cavity Cooling System (RCCS) operation. Scaling studies showed that the mixed convection mode is likely to occur in the RCCS air duct during normal operation and during conduction cooldown events. The RELAP5/ATHENA code was found to not adequately treat the mixed convection regime. Readying the code will require adding models for the turbulent mixed convection regime while possibly performing new experiments for the laminar mixed convection regime. Candidate correlations for the turbulent

  9. Phase-Change Modelling in Severe Nuclear Accidents

    NASA Astrophysics Data System (ADS)

    Pain, Christopher; Pavlidis, Dimitrios; Xie, Zhihua; Percival, James; Gomes, Jefferson; Matar, Omar; Moatamedi, Moji; Tehrani, Ali; Jones, Alan; Smith, Paul

    2014-11-01

    This paper describes progress on a consistent approach for multi-phase flow modelling with phase-change. Although the developed methods are general purpose, the applications presented here cover core melt phenomena at the lower vessel head. These include corium pool formation, coolability and solidification. With respect to external cooling, comparison with the LIVE experiments (from Karlsruhe) is undertaken. Preliminary re-flooding simulation results are also presented. These include water injection into porous media (debris bed) and boiling. Numerical simulations follow IRSN's PEARL experimental programme on quenching/re-flooding. The authors wish to thank Prof. Timothy Haste of IRSN. Dr. D. Pavlidis is funded by EPSRC Consortium "Computational Modelling for Advanced Nuclear Plants," Grant Number EP/I003010/1.

  10. Chernobyl and Fukushima nuclear accidents: what has changed in the use of atmospheric dispersion modeling?

    PubMed

    Benamrane, Y; Wybo, J-L; Armand, P

    2013-12-01

    The threat of a major accidental or deliberate event that would lead to hazardous materials emission in the atmosphere is a great cause of concern to societies. This is due to the potential large scale of casualties and damages that could result from the release of explosive, flammable or toxic gases from industrial plants or transport accidents, radioactive material from nuclear power plants (NPPs), and chemical, biological, radiological or nuclear (CBRN) terrorist attacks. In order to respond efficiently to such events, emergency services and authorities resort to appropriate planning and organizational patterns. This paper focuses on the use of atmospheric dispersion modeling (ADM) as a support tool for emergency planning and response, to assess the propagation of the hazardous cloud and thereby take adequate countermeasures. This paper intends to illustrate the noticeable evolution in the operational use of ADM tools over 25 y and especially in emergency situations. This study is based on data available in scientific publications and exemplified using the two most severe nuclear accidents: Chernobyl (1986) and Fukushima (2011). It appears that during the Chernobyl accident, ADM was used a few days after the beginning of the accident, mainly in a diagnostic approach trying to reconstruct what happened, whereas 25 y later, ADM was also used during the first days and weeks of the Fukushima accident to anticipate the potentially threatened areas. We argue that the recent developments in ADM tools play an increasing role in emergency and crisis management, by supporting stakeholders in anticipating, monitoring and assessing post-event damages. However, despite technological evolutions, its prognostic and diagnostic use in emergency situations still raises many issues. PMID:24077309

  11. Evaluating the Predictive Value of Growth Prediction Models

    ERIC Educational Resources Information Center

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  12. Incorporating uncertainty in predictive species distribution modelling

    PubMed Central

    Beale, Colin M.; Lennon, Jack J.

    2012-01-01

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates. PMID:22144387

  13. Development of a Gravid Uterus Model for the Study of Road Accidents Involving Pregnant Women.

    PubMed

    Auriault, F; Thollon, L; Behr, M

    2016-01-01

    Car accident simulations involving pregnant women are well documented in the literature and suggest that intra-uterine pressure could be responsible for the phenomenon of placental abruption, underlining the need for a realistic amniotic fluid model, including fluid-structure interactions (FSI). This study reports the development and validation of an amniotic fluid model using an Arbitrary Lagrangian Eulerian formulation in the LS-DYNA environment. Dedicated to the study of the mechanisms responsible for fetal injuries resulting from road accidents, the fluid model was validated using dynamic loading tests. Drop tests were performed on a deformable water-filled container at acceleration levels that would be experienced in a gravid uterus during a frontal car collision at 25 kph. During the test device braking phase, container deformation induced by inertial effects and FSI was recorded by kinematic analysis. These tests were then simulated in the LS-DYNA environment to validate a fluid model under dynamic loading, based on the container deformations. Finally, the coupling between the amniotic fluid model and an existing finite-element full-body pregnant woman model was validated in terms of pressure. To do so, results of experimental tests performed on four postmortem human surrogates (PMHS), in which a physical gravid uterus model was inserted, were used. The experimental intra-uterine pressure from these tests was compared to the intra-uterine pressure from a numerical simulation performed under the same loading conditions. Both the free-fall numerical and experimental responses appear strongly correlated. The coupling between the amniotic fluid model and the pregnant woman model provides intra-uterine pressure values correlated with the experimental test responses. The use of an Arbitrary Lagrangian Eulerian formulation allows the analysis of FSI between the amniotic fluid and the gravid uterus during a road accident involving pregnant women. PMID:26592419

  14. Posterior Predictive Bayesian Phylogenetic Model Selection

    PubMed Central

    Lewis, Paul O.; Xie, Wangang; Chen, Ming-Hui; Fan, Yu; Kuo, Lynn

    2014-01-01

    We present two distinctly different posterior predictive approaches to Bayesian phylogenetic model selection and illustrate these methods using examples from green algal protein-coding cpDNA sequences and flowering plant rDNA sequences. The Gelfand–Ghosh (GG) approach allows dissection of an overall measure of model fit into components due to posterior predictive variance (GGp) and goodness-of-fit (GGg), which distinguishes this method from the posterior predictive P-value approach. The conditional predictive ordinate (CPO) method provides a site-specific measure of model fit useful for exploratory analyses and can be combined over sites yielding the log pseudomarginal likelihood (LPML) which is useful as an overall measure of model fit. CPO provides a useful cross-validation approach that is computationally efficient, requiring only a sample from the posterior distribution (no additional simulation is required). Both GG and CPO add new perspectives to Bayesian phylogenetic model selection based on the predictive abilities of models and complement the perspective provided by the marginal likelihood (including Bayes Factor comparisons) based solely on the fit of competing models to observed data. [Bayesian; conditional predictive ordinate; CPO; L-measure; LPML; model selection; phylogenetics; posterior predictive.] PMID:24193892
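
    The CPO and LPML quantities described above can be computed directly from posterior draws via the harmonic-mean identity CPO_i = [ (1/S) * sum_s 1/p(y_i | theta_s) ]^(-1). The sketch below uses a deliberately simple normal likelihood and simulated draws rather than a phylogenetic model, purely to show the bookkeeping:

        import numpy as np

        rng = np.random.default_rng(0)
        y = rng.normal(loc=1.0, size=50)                     # observed data (placeholder)
        theta = rng.normal(loc=1.0, scale=0.2, size=4000)    # posterior draws of the mean (placeholder)

        # Pointwise likelihoods p(y_i | theta_s) for a normal model with unit standard deviation.
        lik = np.exp(-0.5 * (y[:, None] - theta[None, :]) ** 2) / np.sqrt(2 * np.pi)

        # Conditional predictive ordinate: harmonic mean of the likelihood over posterior draws.
        cpo = 1.0 / np.mean(1.0 / lik, axis=1)
        lpml = np.sum(np.log(cpo))                           # overall fit measure; higher is better
        print("LPML =", round(lpml, 2))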

  15. Modeling and Predicting Pesticide Exposures

    EPA Science Inventory

    Models provide a means for representing a real system in an understandable way. They take many forms, beginning with conceptual models that explain the way a system works, such as delineation of all the factors and parameters of how a pesticide particle moves in the air after a s...

  16. Simulation Modeling Requirements for Loss-of-Control Accident Prevention of Turboprop Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Crider, Dennis; Foster, John V.

    2012-01-01

    In-flight loss of control remains the leading contributor to aviation accident fatalities, with stall upsets being the leading causal factor. The February 12, 2009, Colgan Air, Inc., Continental Express flight 3407 accident outside Buffalo, New York, brought this issue to the forefront of public consciousness and resulted in recommendations from the National Transportation Safety Board to conduct training that incorporates stalls that are fully developed and to develop simulator standards to support such training. In 2010, Congress responded to this accident with Public Law 111-216 (Section 208), which mandates full stall training for Part 121 flight operations. Efforts are currently in progress to develop recommendations on implementation of stall training for airline pilots. The International Committee on Aviation Training in Extended Envelopes (ICATEE) is currently defining simulator fidelity standards that will be necessary for effective stall training. These recommendations will apply to all civil transport aircraft including straight-wing turboprop aircraft. Government-funded research over the previous decade provides a strong foundation for stall/post-stall simulation for swept-wing, conventional tail jets to respond to this mandate, but turboprops present additional and unique modeling challenges. First among these challenges is the effect of power, which can provide enhanced flow attachment behind the propellers. Furthermore, turboprops tend to operate for longer periods in an environment more susceptible to ice. As a result, there have been a significant number of turboprop accidents as a result of the early (lower angle of attack) stalls in icing. The vulnerability of turboprop configurations to icing has led to studies on ice accumulation and the resulting effects on flight behavior. Piloted simulations of these effects have highlighted the important training needs for recognition and mitigation of icing effects, including the reduction of stall margins.

  17. Incorporating model uncertainty into spatial predictions

    SciTech Connect

    Handcock, M.S.

    1996-12-31

    We consider a modeling approach for spatially distributed data. We are concerned with aspects of statistical inference for Gaussian random fields when the ultimate objective is to predict the value of the random field at unobserved locations. However, the exact statistical model is seldom known beforehand and is usually estimated from the very same data relative to which the predictions are made. Our objective is to assess the effect of the fact that the model is estimated, rather than known, on the prediction and the associated prediction uncertainty. We describe a method for achieving this objective. We, in essence, consider the best linear unbiased prediction procedure based on the model within a Bayesian framework. These ideas are implemented for the spring temperature over a region in the northern United States, based on the stations in the United States Historical Climatology Network reported in Karl, Williams, Quinlan & Boden.
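
    The best linear unbiased predictor referred to above has a closed form once the covariance model is treated as known. The sketch below does exactly that (simple kriging with a plug-in exponential covariance on synthetic station data) and therefore illustrates the optimistic baseline whose extra, model-estimation uncertainty the paper sets out to quantify; all numbers and the covariance parameters are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 100.0, size=(30, 2))                # station coordinates (synthetic)
        z = np.sin(X[:, 0] / 20.0) + 0.1 * rng.normal(size=30)   # observed field values
        x0 = np.array([50.0, 50.0])                              # prediction location

        def cov(h, sill=1.0, corr_range=30.0):                   # exponential covariance, assumed known
            return sill * np.exp(-h / corr_range)

        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        C = cov(D) + 1e-6 * np.eye(len(X))                       # station-station covariance
        c0 = cov(np.linalg.norm(X - x0, axis=1))                 # station-target covariance

        w = np.linalg.solve(C, c0)                               # simple-kriging weights
        pred = z.mean() + w @ (z - z.mean())
        var = cov(0.0) - c0 @ w                                  # plug-in kriging variance
        print(f"prediction {pred:.3f}, plug-in variance {var:.3f}")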

  18. COMPARING SAFE VS. AT-RISK BEHAVIORAL DATA TO PREDICT ACCIDENTS

    SciTech Connect

    Jeffrey C. Joe

    2001-11-01

    The Safety Observations Achieve Results (SOAR) program at the Idaho National Laboratory (INL) encourages employees to perform in-field observations of each other’s behaviors. One purpose for performing these observations is that it gives the observers the opportunity to correct, if needed, their co-worker’s at-risk work practices and habits (i.e., behaviors). The underlying premise of doing this is that major injuries (e.g., OSHA-recordable events) are prevented from occurring because the lower level at-risk behaviors are identified and corrected before they can propagate into culturally accepted unsafe behaviors that result in injuries or fatalities. However, unlike other observation programs, SOAR also emphasizes positive reinforcement for safe behaviors observed. The underlying premise of doing this is that positive reinforcement of safe behaviors helps establish a strong positive safety culture. Since the SOAR program collects both safe and at-risk leading indicator data, this provides a unique opportunity to assess and compare the two kinds of data in terms of their ability to predict future adverse safety events. This paper describes the results of analyses performed on SOAR data to assess their relative predictive ability. Implications are discussed.

  19. Piloted Simulation of a Model-Predictive Automated Recovery System

    NASA Technical Reports Server (NTRS)

    Liu, James (Yuan); Litt, Jonathan; Sowers, T. Shane; Owens, A. Karl; Guo, Ten-Huei

    2014-01-01

    This presentation describes a model-predictive automatic recovery system for aircraft on the verge of a loss-of-control situation. The system determines when it must intervene to prevent an imminent accident, resulting from a poor approach. It estimates the altitude loss that would result from a go-around maneuver at the current flight condition. If the loss is projected to violate a minimum altitude threshold, the maneuver is automatically triggered. The system deactivates to allow landing once several criteria are met. Piloted flight simulator evaluation showed the system to provide effective envelope protection during extremely unsafe landing attempts. The results demonstrate how flight and propulsion control can be integrated to recover control of the vehicle automatically and prevent a potential catastrophe.
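
    The trigger logic described amounts to comparing a predicted altitude loss for a go-around with a minimum-altitude threshold. The sketch below is purely schematic: the loss model, parameter names and numbers are invented placeholders, not the NASA system.

        def predicted_altitude_loss(airspeed_mps, sink_rate_mps, engine_spool_s=6.0):
            # Invented placeholder: altitude lost while the engines spool up and
            # the flight path is arrested.
            return sink_rate_mps * engine_spool_s + 0.02 * airspeed_mps ** 2

        def should_trigger_go_around(altitude_m, airspeed_mps, sink_rate_mps, floor_m=60.0):
            # Intervene only if continuing the approach is projected to violate
            # the minimum-altitude threshold.
            return altitude_m - predicted_altitude_loss(airspeed_mps, sink_rate_mps) < floor_m

        print(should_trigger_go_around(altitude_m=150.0, airspeed_mps=70.0, sink_rate_mps=8.0))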

  20. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    PubMed

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). To that end, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods of their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful to describe the road accident blackspots datasets analyzed. PMID:24878693
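
    The discrete Lomax model mentioned here can be obtained by discretising the continuous Lomax survival function S(x) = (1 + x/lambda)^(-alpha), giving P(X = k) = S(k) - S(k + 1). A hedged sketch of maximum-likelihood fitting on simulated counts follows; the data are synthetic and the parameterisation shown is one common convention, not necessarily the authors'.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        # Synthetic overdispersed "crash counts" standing in for the DGT blackspot data.
        counts = rng.poisson(lam=rng.gamma(1.2, 2.0, size=500))

        def neg_loglik(params):
            alpha, lam = np.exp(params)                  # keep both parameters positive
            surv = lambda x: (1.0 + x / lam) ** (-alpha)
            pmf = surv(counts) - surv(counts + 1)        # discretised Lomax pmf
            return -np.sum(np.log(pmf + 1e-300))

        res = minimize(neg_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
        alpha_hat, lam_hat = np.exp(res.x)
        print(f"discrete Lomax fit: alpha = {alpha_hat:.2f}, lambda = {lam_hat:.2f}")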

  1. Validation of advanced NSSS simulator model for loss-of-coolant accidents

    SciTech Connect

    Kao, S.P.; Chang, S.K.; Huang, H.C.

    1995-09-01

    The replacement of the NSSS (Nuclear Steam Supply System) model on the Millstone 2 full-scope simulator has significantly increased its fidelity to simulate adverse conditions in the RCS. The new simulator NSSS model is a real-time derivative of the Nuclear Plant Analyzer by ABB. The thermal-hydraulic model is a five-equation, non-homogeneous model for water, steam, and non-condensible gases. The neutronic model is a three-dimensional nodal diffusion model. In order to certify the new NSSS model for operator training, an extensive validation effort has been performed by benchmarking the model performance against RELAP5/MOD2. This paper presents the validation results for the cases of small-and large-break loss-of-coolant accidents (LOCA). Detailed comparisons in the phenomena of reflux-condensation, phase separation, and two-phase natural circulation are discussed.

  2. Prediction of Severe Eye Injuries in Automobile Accidents: Static and Dynamic Rupture Pressure of the Eye

    PubMed Central

    Kennedy, Eric A.; Voorhies, Katherine D.; Herring, Ian P.; Rath, Amber L.; Duma, Stefan M.

    2004-01-01

    The purpose of this paper is to determine the static and dynamic rupture pressures of 20 human and 20 porcine eyes. The static tests showed an average rupture pressure of 1.00 ± 0.18 MPa for porcine eyes, while the average rupture pressure for human eyes was 0.36 ± 0.20 MPa. For dynamic loading, the average porcine rupture pressure was 1.64 ± 0.32 MPa, and the average rupture pressure for human eyes was 0.91 ± 0.29 MPa. Significant differences are found between the average rupture pressures of all four groups of tests (p = 0.01). A risk function has been developed and predicts a 50% risk of globe rupture at 1.02 MPa, 1.66 MPa, 0.35 MPa, and 0.90 MPa internal pressure for porcine static, porcine dynamic, human static, and human dynamic loading conditions, respectively. PMID:15319124
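
    The risk function reported above is, in form, a dose-response curve relating internal pressure to rupture probability. The generic sketch below fits a logistic curve to synthetic rupture outcomes and reads off the 50%-risk pressure; it is not the authors' statistical method, and the data are invented.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        pressure = rng.uniform(0.1, 2.0, size=200)        # internal pressure [MPa] (synthetic)
        true_risk = 1.0 / (1.0 + np.exp(-6.0 * (pressure - 0.9)))
        ruptured = (rng.uniform(size=200) < true_risk).astype(int)

        clf = LogisticRegression().fit(pressure.reshape(-1, 1), ruptured)
        p50 = -clf.intercept_[0] / clf.coef_[0, 0]        # pressure at 50% rupture risk
        print(f"estimated 50% globe-rupture risk at {p50:.2f} MPa")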

  3. Predictive Modeling in Adult Education

    ERIC Educational Resources Information Center

    Lindner, Charles L.

    2011-01-01

    The current economic crisis, a growing workforce, the increasing lifespan of workers, and demanding, complex jobs have made organizations highly selective in employee recruitment and retention. It is therefore important, to the adult educator, to develop models of learning that better prepare adult learners for the workplace. The purpose of…

  4. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Testicular Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Predictive modelling of boiler fouling

    SciTech Connect

    Not Available

    1991-01-01

    The primary objective of this work is the development of a comprehensive numerical model describing the time evolution of fouling under realistic heat exchanger conditions. As fouling is a complex interaction of gas flow, mineral transport and adhesion mechanisms, understanding and subsequently improved controlling of fouling achieved via appropriate manipulation of the various coupled, nonlinear processes in a complex fluid mechanics environment will undoubtedly help reduce the substantial operating costs incurred by the utilities annually, as well as afford greater flexibility in coal selection and reduce the emission of various pollutants. In a more specialized context, the numerical model to be developed as part of this activity will be used as a tool to address the interaction of the various mechanisms controlling deposit development in specific regimes or correlative relationships. These should prove of direct use to the coal burning industry.

  16. Predictive modelling of boiler fouling

    SciTech Connect

    Not Available

    1991-01-01

    The primary objective of this work is the development of a comprehensive numerical model describing the time evolution of fouling under realistic heat exchanger conditions. As fouling is a complex interaction of gas flow, mineral transport and adhesion mechanisms, understanding and subsequently improved controlling of fouling achieved via appropriate manipulation of the various coupled, nonlinear processes in a complex fluid mechanics environment will undoubtedly help reduce the substantial operating costs incurred by the utilities annually, as well as afford greater flexibility in coal selection and reduce the emission of various pollutants. In a more specialized context, the numerical model to be developed as part of this activity will be used as a tool to address the interaction of the various mechanisms controlling deposit development in specific regimes or correlative relationships. These should prove of direct use to the coal burning industry. 11 figs.

  17. Predictive modelling of boiler fouling

    SciTech Connect

    Not Available

    1992-01-01

    The primary objective of this work is the development of a comprehensive numerical model describing the time evolution of fouling under realistic heat exchanger conditions. As fouling is a complex interaction of gas flow, mineral transport and adhesion mechanisms, understanding and subsequently improved controlling of fouling achieved via appropriate manipulation of the various coupled, nonlinear processes in a complex fluid mechanics environment will undoubtedly help reduce the substantial operating costs incurred by the utilities annually, as well as afford greater flexibility in coal selection and reduce the emission of various pollutants. In a more specialized context, the numerical model to be developed as part of this activity will be used as a tool to address the interaction of the various mechanisms controlling deposit development in specific regimes or correlative relationships. These should prove of direct use to the coal burning industry.

  18. Irma multisensor predictive signature model

    NASA Astrophysics Data System (ADS)

    Watson, John S.; Flynn, David S.; Wellfare, Michael R.; Richards, Mike; Prestwood, Lee

    1995-06-01

    The Irma synthetic signature model was one of the first high resolution synthetic infrared (IR) target and background signature models to be developed for tactical air-to-surface weapon scenarios. Originally developed in 1980 by the Armament Directorate of the Air Force Wright Laboratory (WL/MN), the Irma model was used exclusively to generate IR scenes for smart weapons research and development. In 1988, a number of significant upgrades to Irma were initiated including the addition of a laser channel. This two channel version, Irma 3.0, was released to the user community in 1990. In 1992, an improved scene generator was incorporated into the Irma model which supported correlated frame-to-frame imagery. This and other improvements were released in Irma 2.2. Recently, Irma 3.2, a passive IR/millimeter wave (MMW) code, was completed. Currently, upgrades are underway to include an active MMW channel. Designated Irma 4.0, this code will serve as a cornerstone of sensor fusion research in the laboratory from 6.1 concept development to 6.3 technology demonstration programs for precision guided munitions. Several significant milestones have been reached in this development process and are demonstrated. The Irma 4.0 software design has been developed and interim results are available. Irma is being developed to facilitate multi-sensor smart weapons research and development. It is currently in distribution to over 80 agencies within the U.S. Air Force, U.S. Army, U.S. Navy, ARPA, NASA, Department of Transportation, academia, and industry.

  19. Predicting and Modeling RNA Architecture

    PubMed Central

    Westhof, Eric; Masquida, Benoît; Jossinet, Fabrice

    2011-01-01

    SUMMARY A general approach for modeling the architecture of large and structured RNA molecules is described. The method exploits the modularity and the hierarchical folding of RNA architecture that is viewed as the assembly of preformed double-stranded helices defined by Watson-Crick base pairs and RNA modules maintained by non-Watson-Crick base pairs. Despite the extensive molecular neutrality observed in RNA structures, specificity in RNA folding is achieved through global constraints like lengths of helices, coaxiality of helical stacks, and structures adopted at the junctions of helices. The Assemble integrated suite of computer tools allows for sequence and structure analysis as well as interactive modeling by homology or ab initio assembly with possibilities for fitting within electronic density maps. The local key role of non-Watson-Crick pairs guides RNA architecture formation and offers metrics for assessing the accuracy of three-dimensional models in a more useful way than usual root mean square deviation (RMSD) values. PMID:20504963

  20. A Course in... Model Predictive Control.

    ERIC Educational Resources Information Center

    Arkun, Yaman; And Others

    1988-01-01

    Describes a graduate engineering course which specializes in model predictive control. Lists course outline and scope. Discusses some specific topics and teaching methods. Suggests final projects for the students. (MVL)

  1. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  2. Preliminary Design Report for Modeling of Hydrogen Uptake in Fuel Rod Cladding During Severe Accidents

    SciTech Connect

    Siefken, Larry James

    1999-02-01

    Preliminary designs are described for models of hydrogen and oxygen uptake in fuel rod cladding during severe accidents. Calculation of the uptake involves the modeling of seven processes: (1) diffusion of oxygen from the bulk gas into the boundary layer at the external cladding surface, (2) diffusion from the boundary layer into the oxide layer, (3) diffusion from the inner surface of the oxide layer into the metallic part of the cladding, (4) uptake of hydrogen in the event that the cladding oxide layer is dissolved in a steam-starved region, (5) embrittlement of cladding due to hydrogen uptake, (6) cracking of cladding during quenching due to its embrittlement, and (7) release of hydrogen from the cladding after cracking of the cladding. An integral diffusion method is described for calculating the diffusion processes in the cladding. Experimental results are presented that show a rapid uptake of hydrogen in the event of dissolution of the oxide layer and a rapid release of hydrogen in the event of cracking of the oxide layer. These experimental results are used as a basis for calculating the rate of hydrogen uptake and the rate of hydrogen release. The uptake of hydrogen is limited to the equilibrium solubility calculated by applying Sievert's law. The uptake of hydrogen is an exothermic reaction that accelerates the heatup of a fuel rod. An embrittlement criterion is described that accounts for hydrogen and oxygen concentration and the extent of oxidation. A design is described for implementing the models for hydrogen and oxygen uptake and cladding embrittlement into the programming framework of the SCDAP/RELAP5 code. A test matrix is described for assessing the impact of the proposed models on the calculated behavior of fuel rods in severe accident conditions. This report is a revision and reissue of the report entitled "Preliminary Design Report for Modeling of Hydrogen Uptake in Fuel Rod Cladding During Severe Accidents."
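
    The equilibrium solubility limit invoked above follows Sievert's law, whose general form is shown below; the Zircaloy-specific constants are not given in this record and would have to be taken from the code's material property library.

        C_{\mathrm{H}}^{\mathrm{eq}} \;=\; K_s(T)\,\sqrt{p_{\mathrm{H_2}}},
        \qquad K_s(T) \;=\; K_0\,\exp\!\left(-\frac{\Delta H_s}{R\,T}\right)

    Here p_H2 is the hydrogen partial pressure at the cladding surface, K_s(T) is the temperature-dependent Sievert's constant, Delta H_s is the heat of solution and R is the gas constant; the square-root pressure dependence reflects the dissociation of H2 into atomic hydrogen in the metal.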

  3. [Accidents in a population of 350 adolescents and young adults: circumstances, risk factors and prediction of recurrence].

    PubMed

    Marcelli, Daniel; Ingrand, Pierre; Delamour, Magali; Ingrand, Isabelle

    2010-06-01

    Accidents among adolescents and young adults are a public health issue, and present two main characteristics: a strong association with sporting activities, and frequent recurrence. Sports accidents are generally relatively benign, but they show a marked tendency to recur. Young people engaging in sporting activities do not generally exhibit psychological traits different from the general population. In contrast, the other types of accident, and particularly domestic and traffic accidents, appear to have specific features: they are often more serious, but above all they are associated with psychopathological features, including depression, anxiety, disorders due to life events, and thrill-seeking. These psychopathological features are strongly associated with recurrence. The authors describe a simple self-administered questionnaire (ECARR) designed to assess the risk of accident recurrence in this population. PMID:21513131

  4. Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)

    SciTech Connect

    Johnson, E.W.

    1988-10-01

    The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; Provide estimates of occurrence probabilities associated with these various accidents; Evaluate the response of the LWRHU (or its components) to the resultant accident environments; and Associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  5. Predictive modelling of boiler fouling

    SciTech Connect

    Not Available

    1992-01-01

    In this reporting period, efforts were initiated to supplement the comprehensive flow field description obtained from the RNG-Spectral Element Simulations by incorporating, in a general framework, appropriate modules to model particle and condensable species transport to the surface. Specifically, a brief survey of the literature revealed the following possible mechanisms for transporting different ash constituents from the host gas to boiler tubes as deserving prominence in building the overall comprehensive model: (1) Flame-volatilized species, chiefly sulfates, are deposited on cooled boiler tubes via the mechanism of classical vapor diffusion. This mechanism is more efficient than particulate ash deposition, and as a result there is usually an enrichment of condensable salts, chiefly sulfates, in boiler deposits. (2) Particle diffusion (Brownian motion) may account for deposition of some fine particles below 0.1 µm in diameter; in comparison with the mechanisms of vapor diffusion and particle deposition, however, the amount of material transported to the tubes via this route is probably small. (3) Eddy diffusion, thermophoretic and electrophoretic deposition mechanisms are likely to have a marked influence in transporting 0.1 to 5 µm particles from the host gas to cooled boiler tubes. (4) Inertial impaction is the dominant mechanism in transporting particles above 5 µm in diameter to water and steam tubes in pulverized coal-fired boilers, where the typical flue gas velocity is between 10 and 25 m/s. Particles above 10 µm usually have kinetic energies in excess of what can be dissipated at impact (in the absence of a molten sulfate or viscous slag deposit), resulting in their entrainment in the host gas.

  6. A model for the release, dispersion and environmental impact of a postulated reactor accident from a submerged commercial nuclear power plant

    NASA Astrophysics Data System (ADS)

    Bertch, Timothy Creston

    1998-12-01

    Nuclear power plants are inherently suitable for submerged applications and could provide power to the shore power grid or support future underwater applications. The technology exists today and the construction of a submerged commercial nuclear power plant may become desirable. A submerged reactor is safer to humans because the infinite supply of water for heat removal, particulate retention in the water column, sedimentation to the ocean floor and inherent shielding of the aquatic environment would significantly mitigate the effects of a reactor accident. A better understanding of reactor operation in this new environment is required to quantify the radioecological impact and to determine the suitability of this concept. The impact of release to the environment from a severe reactor accident is a new aspect of the field of marine radioecology. Current efforts have been centered on radioecological impacts of nuclear waste disposal, nuclear weapons testing fallout and shore nuclear plant discharges. This dissertation examines the environmental impact of a severe reactor accident in a submerged commercial nuclear power plant, modeling a postulated site on the Atlantic continental shelf adjacent to the United States. This effort models the effects of geography, decay, particle transport/dispersion, bioaccumulation and elimination with associated dose commitment. The use of a source term equivalent to the release from Chernobyl allows comparison between the impacts of that accident and the postulated submerged commercial reactor plant accident. All input parameters are evaluated using sensitivity analysis. The effect of the release on marine biota is determined. Study of the pathways to humans from gaseous radionuclides, consumption of contaminated marine biota and direct exposure as contaminated water reaches the shoreline is conducted. The model developed by this effort predicts a significant mitigation of the radioecological impact of the reactor accident release

  7. Accuracy assessment of landslide prediction models

    NASA Astrophysics Data System (ADS)

    Othman, A. N.; Mohd, W. M. N. W.; Noraini, S.

    2014-02-01

    The increasing population and expansion of settlements over hilly areas have greatly increased the impact of natural disasters such as landslides. Therefore, it is important to develop models that can accurately predict landslide hazard zones. Over the years, various techniques and models have been developed to predict landslide hazard zones. The aim of this paper is to assess the accuracy of landslide prediction models developed by the authors. The methodology involved the selection of the study area, data acquisition, data processing, model development and data analysis. The development of these models is based on nine different landslide-inducing parameters, i.e. slope, land use, lithology, soil properties, geomorphology, flow accumulation, aspect, proximity to river and proximity to road. Rank sum, rating, pairwise comparison and AHP techniques are used to determine the weights for each of the parameters used. Four (4) different models which consider different parameter combinations are developed by the authors. Results obtained are compared to landslide history; accuracies for Model 1, Model 2, Model 3 and Model 4 are 66.7%, 66.7%, 60% and 22.9%, respectively. From the results, rank sum, rating and pairwise comparison can be useful techniques to predict landslide hazard zones.
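
    Of the weighting techniques listed, the pairwise comparison (AHP) step has a standard computation: weights are taken from the principal eigenvector of the comparison matrix, with a consistency index as a sanity check. The sketch below uses three illustrative factors and invented comparison values, not those of the study.

        import numpy as np

        factors = ["slope", "lithology", "land use"]
        # A[i, j] = judged importance of factor i relative to factor j (invented values).
        A = np.array([[1.0,   3.0,   5.0],
                      [1/3.0, 1.0,   2.0],
                      [1/5.0, 1/2.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                       # principal eigenvector -> weights

        ci = (eigvals.real[k] - len(A)) / (len(A) - 1)     # consistency index
        print(dict(zip(factors, np.round(w, 3))), "CI =", round(ci, 3))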

  8. Estimation of the time-dependent radioactive source-term from the Fukushima nuclear power plant accident using atmospheric transport modelling

    NASA Astrophysics Data System (ADS)

    Schoeppner, M.; Plastino, W.; Budano, A.; De Vincenzi, M.; Ruggieri, F.

    2012-04-01

    Several nuclear reactors at the Fukushima Dai-ichi power plant were severely damaged by the Tōhoku earthquake and the subsequent tsunami in March 2011. Due to the extremely difficult on-site situation, it has not been possible to directly determine the emissions of radioactive material. However, during the following days and weeks radionuclides of 137-Caesium and 131-Iodine (amongst others) were detected at monitoring stations throughout the world. Atmospheric transport models are able to simulate the worldwide dispersion of particles according to the location, time and meteorological conditions of the release. The Lagrangian atmospheric transport model Flexpart is used by many authorities and has been proven to make valid predictions in this regard. The Flexpart software was first ported to a local cluster computer at the Grid Lab of INFN and the Department of Physics of University of Roma Tre (Rome, Italy), and subsequently also to the European Mediterranean Grid (EUMEDGRID). With this computing power available, it has been possible to simulate the transport of particles originating from the Fukushima Dai-ichi plant site. Using the time series of the sampled concentration data and the assumption that the Fukushima accident was the only source of these radionuclides, it has been possible to estimate the time-dependent source-term for fourteen days following the accident using the atmospheric transport model. A reasonable agreement has been obtained between the modelling results and the estimated radionuclide release rates from the Fukushima accident.
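
    Formally, the estimation described above can be posed as inverting a linear source-receptor relation y = A q, where y are the sampled concentrations, q the release rates per time interval and A the sensitivities computed by the transport model. A schematic sketch with non-negative least squares follows; here the matrix A is random, standing in for the Flexpart output, and all magnitudes are invented.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        n_obs, n_intervals = 120, 14                 # station samples, daily release intervals

        # Source-receptor sensitivity matrix: contribution of a unit release in each
        # interval to each observation (random placeholder for the transport-model output).
        A = rng.random((n_obs, n_intervals)) * 1e-12
        true_q = rng.gamma(2.0, 5e14, size=n_intervals)
        y = A @ true_q * (1.0 + 0.1 * rng.normal(size=n_obs))   # noisy "measurements"

        q_hat, _ = nnls(A, y)                        # non-negative release-rate estimates
        print(np.round(q_hat / 1e14, 2))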

  9. Model for halftone color prediction from microstructure

    NASA Astrophysics Data System (ADS)

    Agar, A. U.

    2000-12-01

    In this work, we take a microstructure-model-based approach to the problem of color prediction of halftones created using an inkjet printer. We assume absorption and scattering of light through the colorant layers and model the subsurface light scattering in the substrate by a Gaussian point spread function. We restrict our analysis to transparent substrates. To model the absorption and scattering of light through the colorant layers, we employ the Kubelka-Munk color mixing model. To model the scattering in the substrate and to predict the spectral distribution, we use a wavelength-dependent version of the reflection prediction model developed by Ruckdeschel and Hauser. Using spectral distributions and ink weight measurements for transparencies completely and homogeneously coated with colorants, we compute the absorption and scattering spectra of the colorants using the Kubelka-Munk theory. We train our model using measured spectral distributions and synthesized microstructure images of primary ramps printed on transparent media. For each patch in the primary ramp, we synthesize a high-resolution halftone microstructure image from the halftone bitmap assuming dot profiles with Gaussian roll-offs, from which we compute a high-resolution transmission image using the Kubelka-Munk theory and the absorption and scattering spectra of the colorants. We then convolve this transmission image with the Gaussian point spread function of the transparent substrate to predict the average spectral distribution of the halftone. We use our model to predict the spectral distribution of a secondary ramp printed on the same media.
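
    For a single free-standing colorant layer the Kubelka-Munk relations used in this kind of pipeline have a closed hyperbolic form. The sketch below evaluates them for invented absorption/scattering coefficients and thickness; it omits the substrate point-spread convolution and the wavelength dependence of the full model.

        import numpy as np

        def km_layer(K, S, X):
            # Kubelka-Munk reflectance and transmittance of a single layer with no backing.
            a = (S + K) / S
            b = np.sqrt(a ** 2 - 1.0)
            denom = a * np.sinh(b * S * X) + b * np.cosh(b * S * X)
            return np.sinh(b * S * X) / denom, b / denom

        # Illustrative absorption/scattering coefficients [1/um] and layer thickness [um].
        R, T = km_layer(K=0.08, S=0.02, X=10.0)
        print(f"R = {R:.3f}, T = {T:.3f}, absorbed = {1.0 - R - T:.3f}")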

  10. Predictive modelling of boiler fouling

    SciTech Connect

    Not Available

    1992-01-01

    As this study incorporates, in a general framework, appropriate modules to model condensable species transport to the surface along with particles, the need for a suitable solver for the reaction component of the species equations with regard to issues of stability, stiffness, economy, etc., becomes obvious. It is generally agreed in the literature that the major problem associated with the simultaneous integration of large sets of chemical kinetic rate equations is that of stiffness. Although stiffness does not have a simple definition, it is characterized by widely varying time constants. For example, in hydrogen-air combustion, the induction time is of the order of microseconds whereas the nitric oxide formation time is of the order of milliseconds. These widely different time constants present classical methods (such as the popular explicit Runge-Kutta method) with the following difficulty: to ensure stability of the numerical solution, these methods are restricted to using very short time steps that are determined by the smallest time constant. However, the time for all chemical species to reach near-equilibrium values is determined by the longest time constant. As a result, classical methods require excessive amounts of computer time to solve stiff systems of ordinary differential equations (ODEs). Several approaches for the solution of stiff ODEs have been proposed. Of all these techniques, the general purpose codes EPISODE and LSODE are regarded as the best available "packaged" codes for the solution of stiff systems of ODEs. However, although these codes may be the best available for solving an arbitrary system of ODEs, it may be possible to construct superior methods for solving a particular system of ODEs governing the behavior of a specific problem. In this view, an exponentially fitted method, CREK1D, deserves special mention and is described briefly.
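
    The step-size penalty described above is easy to reproduce on a two-time-scale test problem. The sketch below compares an explicit Runge-Kutta integrator with SciPy's LSODA (which shares its ODEPACK ancestry with the LSODE code mentioned in the record); the test system itself is an invented illustration.

        import numpy as np
        from scipy.integrate import solve_ivp

        eps = 1e-4                                   # fast time constant (cf. induction chemistry)

        def rhs(t, y):
            y1, y2 = y
            return [-(y1 - np.cos(t)) / eps, -y2]    # one stiff mode, one slow mode

        y0, t_span = [2.0, 1.0], (0.0, 10.0)

        for method in ("RK45", "LSODA"):             # explicit vs. stiff-capable solver
            sol = solve_ivp(rhs, t_span, y0, method=method, rtol=1e-6, atol=1e-9)
            print(f"{method:6s} steps taken: {sol.t.size - 1}")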

  11. Analysis 320 coal mine accidents using structural equation modeling with unsafe conditions of the rules and regulations as exogenous variables.

    PubMed

    Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun

    2016-07-01

    Mining has been historically considered as a naturally high-risk industry worldwide. Deaths caused by coal mine accidents in China exceed those from all other types of accidents combined. Statistics of 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations," with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, which mainly includes "operator error" and "venturing into dangerous places." A systems analysis approach was applied by using structural equation modeling (SEM) to examine the interactions between the contributory factors of coal mine accidents. The analysis of results leads to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (with the frequency of effect relation in descending order) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors (with the frequency in descending order) of coal mine accidents are "venturing into dangerous places," "poor workplace environment," and "operator error." PMID:27085591

  12. Predictive Validation of an Influenza Spread Model

    PubMed Central

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability, depending on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive

  13. Radiological health effects models for nuclear power plant accident consequence analysis.

    PubMed

    Evans, J S; Moeller, D W

    1989-04-01

    Improved health effects models have been developed for assessing the early effects, late somatic effects and genetic effects that might result from low-LET radiation exposures to populations following a major accident in a nuclear power plant. All the models have been developed in such a way that the dynamics of population risks can be analyzed. Estimates of life years lost and the duration of illnesses were generated and a framework recommended for summarizing health impacts. Uncertainty is addressed by providing models for upper, central and lower estimates of most effects. The models are believed to be a significant improvement over the models used in the U.S. Nuclear Regulatory Commission's Reactor Safety Study, and they can easily be modified to reflect advances in scientific understanding of the health effects of ionizing radiation. PMID:2925380

  14. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol. 2

    SciTech Connect

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input) in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The user's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.
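
    For illustration only, the sketch below shows the kind of ground-level, centreline Gaussian plume calculation that rapid dose-assessment tools of this type build upon. It is not IRDAM itself; the dispersion-coefficient power laws, wind speed, and release height are hypothetical placeholders rather than the stability-class parameterisations a real code would use.

```python
# Generic ground-level, centreline Gaussian plume sketch (not IRDAM): relative
# air concentration chi/Q at fixed downwind distances.
import math

def chi_over_q(x_m, wind_speed=3.0, release_height=30.0,
               a_y=0.08, b_y=0.9, a_z=0.06, b_z=0.85):
    sigma_y = a_y * x_m ** b_y           # horizontal dispersion (m), placeholder fit
    sigma_z = a_z * x_m ** b_z           # vertical dispersion (m), placeholder fit
    # elevated release, receptor on the plume centreline at ground level
    return (math.exp(-release_height ** 2 / (2.0 * sigma_z ** 2))
            / (math.pi * wind_speed * sigma_y * sigma_z))   # s m^-3

for x in (500, 1000, 2000, 5000, 10000, 20000):             # metres
    print(x, "m :", chi_over_q(x))
```

    Combining chi/Q with a release rate and appropriate dose conversion factors is, in broad terms, how such tools arrive at dose estimates at each downwind distance.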

  15. Prediction of PARP Inhibition with Proteochemometric Modelling and Conformal Prediction.

    PubMed

    Cortés-Ciriano, Isidro; Bender, Andreas; Malliavin, Thérèse

    2015-06-01

    Poly(ADP-ribose) polymerases (PARPs) play a key role in DNA damage repair. PARP inhibitors act as chemo- and radio-sensitizers and thus potentiate the cytotoxicity of DNA damaging agents. Although PARP inhibitors are currently investigated as chemotherapeutic agents, their cross-reactivity with other members of the PARP family remains unclear. Here, we apply Proteochemometric Modelling (PCM) to model the activity of 181 compounds on 12 human PARPs. We demonstrate that PCM (R0² test = 0.65-0.69; RMSEtest = 0.95-1.01 °C) displays higher performance on the test set (interpolation) than Family QSAR and Family QSAM (Tukey's HSD, α = 0.05), and outperforms Inductive Transfer of knowledge among targets (Tukey's HSD, α = 0.05). We benchmark the predictive signal of 8 amino acid and 11 full-protein sequence descriptors, finding that all of them (except for SOCN) perform at the same level of statistical significance (Tukey's HSD, α = 0.05). The extrapolation power of PCM to new compounds (RMSE = 1.02±0.80 °C) and targets (RMSE = 1.03±0.50 °C) is comparable to interpolation, although the extrapolation ability is not uniform across the chemical and the target space. For this reason, we also provide confidence intervals calculated with conformal prediction. In addition, we present the R package conformal, which permits the calculation of confidence intervals for regression and classification caret models. PMID:27490382
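
    The confidence intervals mentioned above come from conformal prediction. The sketch below is a generic split-conformal regression example in Python, not the authors' R package conformal; the random-forest model and the synthetic descriptor data are placeholders.

```python
# Generic split conformal regression sketch: a fitted model plus a held-out
# calibration set yields prediction intervals with finite-sample coverage
# guarantees under exchangeability.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def split_conformal(X, y, X_new, alpha=0.2, seed=0):
    X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.3,
                                                random_state=seed)
    model = RandomForestRegressor(random_state=seed).fit(X_tr, y_tr)
    # nonconformity scores on the calibration set
    scores = np.sort(np.abs(y_cal - model.predict(X_cal)))
    # ceil((n + 1)(1 - alpha))-th smallest score is the conformal quantile
    k = int(np.ceil((1 - alpha) * (len(scores) + 1)))
    q = scores[min(k, len(scores)) - 1]
    pred = model.predict(X_new)
    return pred - q, pred + q

# usage with synthetic data standing in for compound/target descriptors
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=300)
lo, hi = split_conformal(X, y, X[:5])
print(np.c_[lo, hi])
```

    Intervals built this way cover the true value with probability at least 1 - alpha, provided calibration and test points are exchangeable.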

  16. A review and test of predictive models for the bioaccumulation of radiostrontium in fish.

    PubMed

    Smith, J T; Sasina, N V; Kryshev, A I; Belova, N V; Kudelsky, A V

    2009-11-01

    Empirical relations between the (90)Sr concentration factor (CF) and the calcium concentration in freshwater aquatic systems have previously been determined in studies based on data obtained prior to the Chernobyl accident. The purpose of the present research is to review and compare these models, and to test them against a database of post-Chernobyl measurements from rivers and lakes in Ukraine, Russia, Belarus and Finland. It was found that two independently developed models, based on pre-Chernobyl empirical data, are in close agreement with each other, and with empirical data. Testing of both models against new data obtained after the Chernobyl accident confirms the models' predictive ability. An investigation of the influence of fish size on (90)Sr accumulation showed no significant relationship, though the data set was somewhat limited. PMID:19656592

  17. Solar Weather Event Modelling and Prediction

    NASA Astrophysics Data System (ADS)

    Messerotti, Mauro; Zuccarello, Francesca; Guglielmino, Salvatore L.; Bothmer, Volker; Lilensten, Jean; Noci, Giancarlo; Storini, Marisa; Lundstedt, Henrik

    2009-11-01

    Key drivers of solar weather and mid-term solar weather are reviewed by considering a selection of relevant physics- and statistics-based scientific models as well as a selection of related prediction models, in order to provide an updated operational scenario for space weather applications. The characteristics and outcomes of the considered scientific and prediction models indicate that they only partially cope with the complex nature of solar activity, owing to the lack of a detailed knowledge of the underlying physics. This is indicated by the fact that, on the one hand, scientific models based on chaos theory and non-linear dynamics better reproduce the observed features and, on the other hand, prediction models based on statistics and artificial neural networks perform better. To date, solar weather prediction success at most time and spatial scales is far from satisfactory, but the forthcoming ground- and space-based high-resolution observations, as well as the application of advanced mathematical approaches to the analysis of diachronic solar observations, can add fundamental tiles to the modelling and prediction frameworks, which is a must to provide comprehensive and homogeneous data sets.

  18. Posterior predictive checking of multiple imputation models.

    PubMed

    Nguyen, Cattram D; Lee, Katherine J; Carlin, John B

    2015-07-01

    Multiple imputation is gaining popularity as a strategy for handling missing data, but there is a scarcity of tools for checking imputation models, a critical step in model fitting. Posterior predictive checking (PPC) has been recommended as an imputation diagnostic. PPC involves simulating "replicated" data from the posterior predictive distribution of the model under scrutiny. Model fit is assessed by examining whether the analysis from the observed data appears typical of results obtained from the replicates produced by the model. A proposed diagnostic measure is the posterior predictive "p-value", an extreme value of which (i.e., a value close to 0 or 1) suggests a misfit between the model and the data. The aim of this study was to evaluate the performance of the posterior predictive p-value as an imputation diagnostic. Using simulation methods, we deliberately misspecified imputation models to determine whether posterior predictive p-values were effective in identifying these problems. When estimating the regression parameter of interest, we found that more extreme p-values were associated with poorer imputation model performance, although the results highlighted that traditional thresholds for classical p-values do not apply in this context. A shortcoming of the PPC method was its reduced ability to detect misspecified models with increasing amounts of missing data. Despite the limitations of posterior predictive p-values, they appear to have a valuable place in the imputer's toolkit. In addition to automated checking using p-values, we recommend imputers perform graphical checks and examine other summaries of the test quantity distribution. PMID:25939490
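
    As a generic illustration of the mechanics described above, not the authors' simulation study, the sketch below computes a posterior predictive p-value by comparing a tail-heaviness test quantity on observed data with the same quantity on replicated data sets. The Gaussian "model under scrutiny" and the crude bootstrap stand-in for posterior draws are deliberate simplifications.

```python
# Toy posterior predictive check: the p-value is the fraction of replicated
# data sets whose test quantity is at least as extreme as the observed one.
# A real PPC for imputation would draw replicates from the imputation
# model's posterior predictive distribution.
import numpy as np

rng = np.random.default_rng(1)
observed = rng.standard_t(df=3, size=200)   # heavier-tailed than the model assumes

def test_quantity(x):
    return np.mean(np.abs(x - np.median(x)) > 2.0)   # tail-heaviness summary

replicated = []
for _ in range(1000):
    boot = rng.choice(observed, size=observed.size, replace=True)
    replicated.append(test_quantity(rng.normal(boot.mean(), boot.std(),
                                               size=observed.size)))

ppp = np.mean(np.array(replicated) >= test_quantity(observed))
print("posterior predictive p-value:", ppp)   # values near 0 or 1 flag misfit
```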

  19. A simplified model for calculating early offsite consequences from nuclear reactor accidents

    SciTech Connect

    Madni, I.K.; Cazzoli, E.G.; Khatib-Rahbar, M.

    1988-07-01

    A personal computer-based model, SMART, has been developed that uses an integral approach for calculating early offsite consequences from nuclear reactor accidents. The solution procedure uses simplified meteorology and involves direct analytic integration of air concentration equations over time and position. This is different from the discretization approach currently used in the CRAC2 and MACCS codes. The SMART code is fast-running, thereby providing a valuable tool for sensitivity and uncertainty studies. The code was benchmarked against both MACCS version 1.4 and CRAC2. Results of benchmarking and detailed sensitivity/uncertainty analyses using SMART are presented. 34 refs., 21 figs., 24 tabs.

  20. An improved model for prediction of resuspension.

    PubMed

    Maxwell, Reed M; Anspaugh, Lynn R

    2011-12-01

    A complete, historical dataset of radionuclide resuspension factors is presented. These data span six orders of magnitude in time (ranging from 0.1 to 73,000 d), encompass more than 300 individual values, and combine observations from events on three continents. These data were then used to derive improved, empirical models that can be used to predict resuspension of trace materials after their deposition on the ground. Data-fitting techniques were used to derive models of various types and an estimate of uncertainty in model prediction. Two models were found to be suitable: a power law and the modified Anspaugh et al. model, which is a double exponential. Though statistically the power-law model provides the best metrics of fit, the modified Anspaugh model is deemed the more appropriate due to its better fit to data at early times and its ease of implementation in terms of closed analytical integrals. PMID:22048490
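
    A hedged sketch of the model-fitting step described above: both candidate functional forms, a power law and a double exponential, are fitted in log space, a natural choice for data spanning six orders of magnitude. The synthetic series below merely stands in for the historical resuspension-factor data set, and all parameter values are arbitrary.

```python
# Fit the two candidate resuspension-factor models (power law and double
# exponential) to a synthetic time series, working in log space so the
# residuals behave like multiplicative scatter.
import numpy as np
from scipy.optimize import curve_fit

def log_power_law(t, log_a, b):
    return log_a - b * np.log(t)

def log_double_exp(t, log_a1, l1, log_a2, l2):
    # log of a1*exp(-l1*t) + a2*exp(-l2*t), computed stably
    return np.logaddexp(log_a1 - l1 * t, log_a2 - l2 * t)

rng = np.random.default_rng(2)
t = np.logspace(-1, np.log10(7.3e4), 80)                  # 0.1 to 73,000 days
rf = 1e-5 * np.exp(-0.05 * t) + 1e-7 * np.exp(-1e-4 * t)  # invented "truth"
log_rf = np.log(rf) + 0.3 * rng.normal(size=t.size)       # multiplicative scatter

p_pow, _ = curve_fit(log_power_law, t, log_rf, p0=[np.log(1e-5), 1.0])
p_dbl, _ = curve_fit(log_double_exp, t, log_rf,
                     p0=[np.log(1e-5), 0.1, np.log(1e-7), 1e-3], maxfev=20000)
print("power law (log_a, b):              ", p_pow)
print("double exp (log_a1, l1, log_a2, l2):", p_dbl)
```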

  1. Predicting Naming Latencies with an Analogical Model

    ERIC Educational Resources Information Center

    Chandler, Steve

    2008-01-01

    Skousen's (1989, Analogical modeling of language, Kluwer Academic Publishers, Dordrecht) Analogical Model (AM) predicts behavior such as spelling pronunciation by comparing the characteristics of a test item (a given input word) to those of individual exemplars in a data set of previously encountered items. While AM and other exemplar-based models…

  2. Mathematical model for predicting human vertebral fracture

    NASA Technical Reports Server (NTRS)

    Benedict, J. V.

    1973-01-01

    Mathematical model has been constructed to predict dynamic response of tapered, curved beam columns in as much as human spine closely resembles this form. Model takes into consideration effects of impact force, mass distribution, and material properties. Solutions were verified by dynamic tests on curved, tapered, elastic polyethylene beam.

  3. Thermal barrier coating life prediction model development

    NASA Technical Reports Server (NTRS)

    Meier, Susan M.; Nissley, David M.; Sheffler, Keith D.; Cruse, Thomas A.

    1991-01-01

    A thermal barrier coated (TBC) turbine component design system, including an accurate TBC life prediction model, is needed to realize the full potential of available TBC engine performance and/or durability benefits. The objective of this work, which was sponsored in part by NASA, was to generate a life prediction model for electron beam - physical vapor deposited (EB-PVD) zirconia TBC. Specific results include EB-PVD zirconia mechanical and physical properties, coating adherence strength measurements, interfacial oxide growth characteristics, quantitative cyclic thermal spallation life data, and a spallation life model.

  4. Ice jam flooding: a location prediction model

    NASA Astrophysics Data System (ADS)

    Collins, H. A.

    2009-12-01

    Flooding created by ice jamming is a climatically dependent natural hazard that frequently affects cold regions with disastrous results. Basic known physical characteristics which combine in the landscape to create an ice jam flood are modeled on the Cattaraugus Creek Watershed, located in Western New York State. Terrain analysis of topographic and built-environment features is conducted using Geographic Information Systems in order to predict the location of ice jam flooding events. The purpose of this modeling is to establish a broadly applicable watershed-scale model for predicting the probable locations of ice jam flooding, informed by the locations of historic ice jam flooding events.

  5. The R-γ transition prediction model

    NASA Astrophysics Data System (ADS)

    Goldberg, Uriel C.; Batten, Paul; Peroomian, Oshin; Chakravarthy, Sukumar

    2015-01-01

    The Rt turbulence closure (Goldberg 2003) is coupled with an intermittency transport equation, γ, to enable prediction of laminar-to-turbulent flow by-pass transition. The model is not correlation-based and is completely topography-parameter-free, thus ready for use in parallelized Computational Fluid Dynamics (CFD) solvers based on unstructured book-keeping. Several examples compare the R-γ model's performance with experimental data and with predictions by the Langtry-Menter γ-Reθ transition closure (2009). Like the latter, the R-γ model is very sensitive to freestream turbulence levels, limiting its utility for engineering purposes.

  6. Thermal barrier coating life prediction model development

    NASA Technical Reports Server (NTRS)

    Hillery, R. V.; Pilsner, B. H.; Mcknight, R. L.; Cook, T. S.; Hartle, M. S.

    1988-01-01

    This report describes work performed to determine the predominant modes of degradation of a plasma sprayed thermal barrier coating system and to develop and verify life prediction models accounting for these degradation modes. The primary TBC system consisted of a low pressure plasma sprayed NiCrAlY bond coat, an air plasma sprayed ZrO2-Y2O3 top coat, and a Rene' 80 substrate. The work was divided into 3 technical tasks. The primary failure mode to be addressed was loss of the zirconia layer through spalling. Experiments showed that oxidation of the bond coat is a significant contributor to coating failure. It was evident from the test results that the species of oxide scale initially formed on the bond coat plays a role in coating degradation and failure. It was also shown that elevated temperature creep of the bond coat plays a role in coating failure. An empirical model was developed for predicting the test life of specimens with selected coating, specimen, and test condition variations. In the second task, a coating life prediction model was developed based on the data from Task 1 experiments, results from thermomechanical experiments performed as part of Task 2, and finite element analyses of the TBC system during thermal cycles. The third and final task attempted to verify the validity of the model developed in Task 2. This was done by using the model to predict the test lives of several coating variations and specimen geometries, then comparing these predicted lives to experimentally determined test lives. It was found that the model correctly predicts trends, but that additional refinement is needed to accurately predict coating life.

  7. Low-power and shutdown models for the accident sequence precursor (ASP) program

    SciTech Connect

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.

    1997-02-01

    The US Nuclear Regulatory Commission (NRC) has been using full-power, Level 1, limited-scope risk models for the Accident Sequence Precursor (ASP) program for over fifteen years. These models have evolved and matured over the years, as have probabilistic risk assessment (PRA) and computer technologies. Significant upgrading activities have been undertaken over the past three years, with involvement from the Offices of Nuclear Reactor Regulation (NRR), Analysis and Evaluation of Operational Data (AEOD), and Nuclear Regulatory Research (RES), and several national laboratories. Part of these activities was an RES-sponsored feasibility study investigating the ability to extend the ASP models to include contributors to core damage from events initiated with the reactor at low power or shutdown (LP/SD), both internal events and external events. This paper presents only the LP/SD internal event modeling efforts.

  8. Opportunities for the testing of environmental transport models using data obtained following the Chernobyl accident

    SciTech Connect

    Hoffman, F.O.; Thiessen, K.M.; Watkins, B.

    1996-01-01

    The aftermath of the Chernobyl accident has provided a unique opportunity to collect data sets specifically for the purpose of model testing, and with these data to create scenarios against which environmental transport models may be tested in a format constituting a blind test. This article serves as an introduction to three test scenarios designed for testing models at the process level: (1) surface water contamination with radionuclides initially deposited onto soils; (2) contamination of different aquatic media and biota due to fallout of radionuclides into a body of water; and (3) atmospheric resuspension of radionuclides from contaminated land surfaces. These scenarios are the first such tests to use data sets collected in the former Soviet Union. Interested modelers are invited to participate in the test exercises by making calculations for any of these test scenarios. Information on participation is included. 9 refs.

  9. Opportunities for the testing of environmental transport models using data obtained following the Chernobyl accident.

    PubMed

    Hoffman, F O; Thiessen, K M; Watkins, B

    1996-01-01

    The aftermath of the Chernobyl accident has provided a unique opportunity to collect data sets specifically for the purpose of model testing, and with these data to create scenarios against which environmental transport models may be tested in a format constituting a blind test. This article serves as an introduction to three test scenarios designed for testing models at the process level: (1) surface water contamination with radionuclides initially deposited onto soils; (2) contamination of different aquatic media and biota due to fallout of radionuclides into a body of water; and (3) atmospheric resuspension of radionuclides from contaminated land surfaces. These scenarios are the first such tests to use data sets collected in the former Soviet Union. Interested modelers are invited to participate in the test exercises by making calculations for any of these test scenarios. Information on participation is included. PMID:7499152

  10. Are animal models predictive for humans?

    PubMed Central

    2009-01-01

    It is one of the central aims of the philosophy of science to elucidate the meanings of scientific terms and also to think critically about their application. The focus of this essay is the scientific term predict and whether there is credible evidence that animal models, especially in toxicology and pathophysiology, can be used to predict human outcomes. Whether animals can be used to predict human response to drugs and other chemicals is apparently a contentious issue. However, when one empirically analyzes animal models using scientific tools, they fall far short of being able to predict human responses. This is not surprising considering what we have learned from fields such as evolutionary and developmental biology, gene regulation and expression, epigenetics, complexity theory, and comparative genomics. PMID:19146696

  11. Predictive models of implicit and explicit attitudes.

    PubMed

    Perugini, Marco

    2005-03-01

    Explicit attitudes have long been assumed to be central factors influencing behaviour. A recent stream of studies has shown that implicit attitudes, typically measured with the Implicit Association Test (IAT), can also predict a significant range of behaviours. This contribution is focused on testing different predictive models of implicit and explicit attitudes. In particular, three main models can be derived from the literature: (a) additive (the two types of attitudes explain different portion of variance in the criterion), (b) double dissociation (implicit attitudes predict spontaneous whereas explicit attitudes predict deliberative behaviour), and (c) multiplicative (implicit and explicit attitudes interact in influencing behaviour). This paper reports two studies testing these models. The first study (N = 48) is about smoking behaviour, whereas the second study (N = 109) is about preferences for snacks versus fruit. In the first study, the multiplicative model is supported, whereas the double dissociation model is supported in the second study. The results are discussed in light of the importance of focusing on different patterns of prediction when investigating the directive influence of implicit and explicit attitudes on behaviours. PMID:15901390

  12. Effects of improved modeling on best estimate BWR severe accident analysis

    SciTech Connect

    Hyman, C.R.; Ott, L.J.

    1984-01-01

    Since 1981, ORNL has completed best estimate studies analyzing several dominant BWR accident scenarios. These scenarios were identified by early Probabilistic Risk Assessment (PRA) studies and detailed ORNL analysis complements such studies. In performing these studies, ORNL has used the MARCH code extensively. ORNL investigators have identified several deficiencies in early versions of MARCH with regard to BWR modeling. Some of these deficiencies appear to have been remedied by the most recent release of the code. It is the purpose of this paper to identify several of these deficiencies. All the information presented concerns the degraded core thermal/hydraulic analysis associated with each of the ORNL studies. This includes calculations of the containment response. The period of interest is from the time of permanent core uncovery to the end of the transient. Specific objectives include the determination of the extent of core damage and timing of major events (i.e., onset of Zr/H₂O reaction, initial clad/fuel melting, loss of control blade structure, etc.). As mentioned previously the major analysis tool used thus far was derived from an early version of MARCH. BWRs have unique features which must be modeled for best estimate severe accident analysis. ORNL has developed and incorporated into its version of MARCH several improved models. These include (1) channel boxes and control blades, (2) SRV actuations, (3) vessel water level, (4) multi-node analysis of in-vessel water inventory, (5) comprehensive hydrogen and water properties package, (6) first order correction to the ideal gas law, and (7) separation of fuel and cladding. Ongoing and future modeling efforts are required. These include (1) detailed modeling for the pressure suppression pool, (2) incorporation of B₄C/steam reaction models, (3) phenomenological model of corium mass transport, and (4) advanced corium/concrete interaction modeling. 10 references, 17 figures, 1 table.

  13. Toward predictive models of mammalian cells.

    PubMed

    Ma'ayan, Avi; Blitzer, Robert D; Iyengar, Ravi

    2005-01-01

    Progress in experimental and theoretical biology is likely to provide us with the opportunity to assemble detailed predictive models of mammalian cells. Using a functional format to describe the organization of mammalian cells, we describe current approaches for developing qualitative and quantitative models using data from a variety of experimental sources. Recent developments and applications of graph theory to biological networks are reviewed. The use of these qualitative models to identify the topology of regulatory motifs and functional modules is discussed. Cellular homeostasis and plasticity are interpreted within the framework of balance between regulatory motifs and interactions between modules. From this analysis we identify the need for detailed quantitative models on the basis of the representation of the chemistry underlying the cellular process. The use of deterministic, stochastic, and hybrid models to represent cellular processes is reviewed, and an initial integrated approach for the development of large-scale predictive models of a mammalian cell is presented. PMID:15869393

  14. A High Precision Prediction Model Using Hybrid Grey Dynamic Model

    ERIC Educational Resources Information Center

    Li, Guo-Dong; Yamaguchi, Daisuke; Nagai, Masatake; Masuda, Shiro

    2008-01-01

    In this paper, we propose a new prediction analysis model which combines the first order one variable Grey differential equation Model (abbreviated as GM(1,1) model) from grey system theory and time series Autoregressive Integrated Moving Average (ARIMA) model from statistics theory. We abbreviate the combined GM(1,1) ARIMA model as ARGM(1,1)…
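
    For the grey half of the hybrid model, a GM(1,1) forecast can be sketched in a few lines, as below; the ARIMA residual-correction stage is omitted and the example series is invented for illustration.

```python
# GM(1,1) sketch: accumulate the series, estimate the grey parameters (a, b)
# by least squares on the whitened equation, then forecast and de-accumulate.
import numpy as np

def gm11_forecast(x0, horizon=3):
    """x0: 1-D array of a short, positive time series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                # accumulated series (1-AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # grey parameters
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=0.0)             # inverse accumulation
    return x0_hat[len(x0):]                           # out-of-sample forecasts

print(gm11_forecast([2.87, 3.28, 3.34, 3.73, 3.87, 3.99], horizon=3))
```

    In the hybrid scheme, an ARIMA model would then be fitted to the GM(1,1) residuals to correct the grey forecasts.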

  15. Mechanistic prediction of fission product release under normal and accident conditions: key uncertainties that need better resolution

    SciTech Connect

    Rest, J.

    1983-09-01

    A theoretical model has been used for predicting the behavior of fission gas and volatile fission products (VFPs) in UO₂-base fuels during steady-state and transient conditions. This model represents an attempt to develop an efficient predictive capability for the full range of possible reactor operating conditions. Fission products released from the fuel are assumed to reach the fuel surface by successively diffusing (via atomic and gas-bubble mobility) from the grains to grain faces and then to the grain edges, where the fission products are released through a network of interconnected tunnels of fission-gas induced and fabricated porosity. The model provides for a multi-region calculation and uses only one size class to characterize a distribution of fission gas bubbles.

  16. Mechanistic prediction of fission-product release under normal and accident conditions: key uncertainties that need better resolution. [PWR; BWR

    SciTech Connect

    Rest, J.

    1983-09-01

    A theoretical model has been used for predicting the behavior of fission gas and volatile fission products (VFPs) in UO₂-base fuels during steady-state and transient conditions. This model represents an attempt to develop an efficient predictive capability for the full range of possible reactor operating conditions. Fission products released from the fuel are assumed to reach the fuel surface by successively diffusing (via atomic and gas-bubble mobility) from the grains to grain faces and then to the grain edges, where the fission products are released through a network of interconnected tunnels of fission-gas induced and fabricated porosity. The model provides for a multi-region calculation and uses only one size class to characterize a distribution of fission gas bubbles.

  17. Predictive model for segmented poly(urea)

    NASA Astrophysics Data System (ADS)

    Gould, P. J.; Cornish, R.; Frankl, P.; Lewtas, I.

    2012-08-01

    Segmented poly(urea) has been shown to be of significant benefit in protecting vehicles from blast and impact and there have been several experimental studies to determine the mechanisms by which this protective function might occur. One suggested route is by mechanical activation of the glass transition. In order to enable design of protective structures using this material a constitutive model and equation of state are needed for numerical simulation hydrocodes. Determination of such a predictive model may also help elucidate the beneficial mechanisms that occur in polyurea during high rate loading. The tool deployed to do this has been Group Interaction Modelling (GIM) - a mean field technique that has been shown to predict the mechanical and physical properties of polymers from their structure alone. The structure of polyurea has been used to characterise the parameters in the GIM scheme without recourse to experimental data and the equation of state and constitutive model predicts response over a wide range of temperatures and strain rates. The shock Hugoniot has been predicted and validated against existing data. Mechanical response in tensile tests has also been predicted and validated.

  18. Multi-Model Ensemble Wake Vortex Prediction

    NASA Technical Reports Server (NTRS)

    Koerner, Stephan; Holzaepfel, Frank; Ahmad, Nash'at N.

    2015-01-01

    Several multi-model ensemble methods are investigated for predicting wake vortex transport and decay. This study is a joint effort between National Aeronautics and Space Administration and Deutsches Zentrum fuer Luft- und Raumfahrt to develop a multi-model ensemble capability using their wake models. An overview of different multi-model ensemble methods and their feasibility for wake applications is presented. The methods include Reliability Ensemble Averaging, Bayesian Model Averaging, and Monte Carlo Simulations. The methodologies are evaluated using data from wake vortex field experiments.
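
    As a simple illustration of skill-weighted ensemble averaging, one elementary flavour of the methods listed above and not the NASA/DLR implementation, the sketch below weights each wake model by the inverse of its mean-squared error on past cases; the forecast values and error samples are invented.

```python
# Elementary skill-weighted multi-model ensemble average: weights are inverse
# mean-squared errors of each member against past observations.
import numpy as np

def ensemble_mean(member_predictions, member_past_errors):
    """member_predictions: (n_models,) current forecasts.
    member_past_errors: (n_models, n_cases) past forecast errors."""
    mse = np.mean(np.asarray(member_past_errors) ** 2, axis=1)
    weights = (1.0 / mse) / np.sum(1.0 / mse)
    return float(np.dot(weights, member_predictions)), weights

predictions = [412.0, 395.0, 430.0]   # e.g. predicted vortex lateral position (m)
past_errors = [np.random.default_rng(i).normal(0.0, s, 50)
               for i, s in enumerate((5.0, 12.0, 8.0))]
print(ensemble_mean(predictions, past_errors))
```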

  19. Modeling & analysis of criticality-induced severe accidents during refueling for the Advanced Neutron Source Reactor

    SciTech Connect

    Georgevich, V.; Kim, S.H.; Taleyarkhan, R.P.; Jackson, S.

    1992-10-01

    This paper describes work done at the Oak Ridge National Laboratory (ORNL) for evaluating the potential and resulting consequences of a hypothetical criticality accident during refueling of the 330-MW Advanced Neutron Source (ANS) research reactor. The development of an analytical capability is described. Modeling and problem formulation were conducted using concepts of reactor neutronic theory for determining power level escalation, coupled with ORIGEN and MELCOR code simulations for radionuclide buildup and containment transport. Gaussian plume transport modeling was done for determining off-site radiological consequences. Nuances associated with modeling this blast-type scenario are described. Analysis results for ANS containment response under a variety of postulated scenarios and containment failure modes are presented. It is demonstrated that individuals at the reactor site boundary will not receive doses beyond regulatory limits for any of the containment configurations studied.

  20. Mathematical modeling of ignition of woodlands resulted from accident on the pipeline

    NASA Astrophysics Data System (ADS)

    Perminov, V. A.; Loboda, E. L.; Reyno, V. V.

    2014-11-01

    Accidents occurring at pipeline sites are accompanied by environmental damage, economic loss, and sometimes loss of life. In this paper we calculate the sizes of the possible ignition zones arising in emergency situations on pipelines located close to forest, accompanied by the appearance of fireballs. Using the method of mathematical modeling, we compute the maximum size of the zones of vegetation ignition resulting from accidental releases of flammable substances. Within the context of a general mathematical model of forest fires, the paper gives a new mathematical formulation and a method of numerical solution for the forest fire modeling problem. The boundary-value problem is solved numerically using the method of splitting according to physical processes. Dependences of the size of the ignited forest fuel area on the amount of leaked flammable substance and the moisture content of the vegetation are obtained.

  1. Predictive QSAR modeling of phosphodiesterase 4 inhibitors.

    PubMed

    Kovalishyn, Vasyl; Tanchuk, Vsevolod; Charochkina, Larisa; Semenuta, Ivan; Prokopenko, Volodymyr

    2012-02-01

    A series of diverse organic compounds, phosphodiesterase type 4 (PDE-4) inhibitors, have been modeled using a QSAR-based approach. 48 QSAR models were compared by following the same procedure with different combinations of descriptors and machine learning methods. QSAR methodologies used random forests and associative neural networks. The predictive ability of the models was tested through leave-one-out cross-validation, giving a Q² = 0.66-0.78 for regression models and total accuracies Ac=0.85-0.91 for classification models. Predictions for the external evaluation sets obtained accuracies in the range of 0.82-0.88 (for active/inactive classifications) and Q² = 0.62-0.76 for regressions. The method showed itself to be a potential tool for estimation of IC₅₀ of new drug-like candidates at early stages of drug development. PMID:22023934

  2. A Predictive Model for Root Caries Incidence.

    PubMed

    Ritter, André V; Preisser, John S; Puranik, Chaitanya P; Chung, Yunro; Bader, James D; Shugars, Daniel A; Makhija, Sonia; Vollmer, William M

    2016-01-01

    This study aimed to find the set of risk indicators best able to predict root caries (RC) incidence in caries-active adults utilizing data from the Xylitol for Adult Caries Trial (X-ACT). Five logistic regression models were compared with respect to their predictive performance for incident RC using data from placebo-control participants with exposed root surfaces at baseline and from two study centers with ancillary data collection (n = 155). Prediction performance was assessed from baseline variables and after including ancillary variables [smoking, diet, use of removable partial dentures (RPD), toothbrush use, income, education, and dental insurance]. A sensitivity analysis added treatment to the models for both the control and treatment participants (n = 301) to predict RC for the control participants. Forty-nine percent of the control participants had incident RC. The model including the number of follow-up years at risk, the number of root surfaces at risk, RC index, gender, race, age, and smoking resulted in the best prediction performance, having the highest AUC and lowest Brier score. The sensitivity analysis supported the primary analysis and gave slightly better performance summary measures. The set of risk indicators best able to predict RC incidence included an increased number of root surfaces at risk and increased RC index at baseline, followed by white race and nonsmoking, which were strong nonsignificant predictors. Gender, age, and increased number of follow-up years at risk, while included in the model, were also not statistically significant. The inclusion of health, diet, RPD use, toothbrush use, income, education, and dental insurance variables did not improve the prediction performance. PMID:27160516

  3. Development of Models for Predicting the Predominant Taste and Odor Compounds in Taihu Lake, China

    PubMed Central

    Sun, Xiaoxue; Deng, Xuwei; Niu, Yuan; Xie, Ping

    2012-01-01

    Taste and odor (T&O) problems, which have adversely affected the quality of water supplied to millions of residents, have repeatedly occurred in Taihu Lake (e.g., a serious odor accident occurred in 2007). Because these accidents are difficult for water resource managers to forecast in a timely manner, there is an urgent need to develop optimum models to predict these T&O problems. For this purpose, various biotic and abiotic environmental parameters were monitored monthly for one year at 30 sites across Taihu Lake. This is the first investigation of this huge lake to sample T&O compounds at the whole-lake level. Certain phytoplankton taxa were important variables in the models; for instance, the concentrations of the particle-bound 2-methylisoborneol (p-MIB) were correlated with the presence of Oscillatoria, whereas those of the p-β-cyclocitral and p-β-ionone were correlated with Microcystis levels. Abiotic factors such as nitrogen (TN, TDN, NO3-N, and NO2-N), pH, DO, COND, COD and Chl-a also contributed significantly to the T&O predictive models. The dissolved (d) T&O compounds were related to both the algal biomass and to certain abiotic environmental factors, whereas the particle-bound (p) T&O compounds were more strongly related to the algal presence. We also tested the validity of these models using an independent data set that was previously collected from Taihu Lake in 2008. In comparing the concentrations of the T&O compounds observed in 2008 with those concentrations predicted from our models, we found that most of the predicted data points fell within the 90% confidence intervals of the observed values. This result supported the validity of these models in the studied system. These models, based on easily collected environmental data, will be of practical value to the water resource managers of Taihu Lake for evaluating the probability of T&O accidents. PMID:23284835

  4. Influence of the meteorological input on the atmospheric transport modelling with FLEXPART of radionuclides from the Fukushima Daiichi nuclear accident.

    PubMed

    Arnold, D; Maurer, C; Wotawa, G; Draxler, R; Saito, K; Seibert, P

    2015-01-01

    In the present paper the role of precipitation as FLEXPART model input is investigated for one possible release scenario of the Fukushima Daiichi accident. Precipitation data from the European Centre for Medium-Range Weather Forecasts (ECMWF), NOAA's National Centers for Environmental Prediction (NCEP), the Japan Meteorological Agency's (JMA) mesoscale analysis and a JMA radar-rain gauge precipitation analysis product were utilized. The accident at Fukushima in March 2011 and the observations that followed enable us to assess the impact of these precipitation products, at least for this single case. As expected, the differences in the statistical scores are visible but not large. Increasing the resolution of all the ECMWF fields from 0.5° to 0.2° raises the correlation from 0.71 to 0.80 and the overall rank from 3.38 to 3.44. Substituting the JMA mesoscale precipitation analysis or the JMA radar-rain gauge precipitation data for the ECMWF precipitation, while the rest of the variables remain unmodified, yields the best results on a regional scale, especially when a new and more robust wet deposition scheme is introduced. The best results are obtained with a combination of ECMWF 0.2° data with precipitation from JMA mesoscale analyses and the modified wet deposition, giving a correlation of 0.83 and an overall rank of 3.58. NCEP-based results with the same source term are generally poorer, giving correlations around 0.66, comparatively large negative biases, and an overall rank of 3.05 that worsens when regional precipitation data are introduced. PMID:24679678

  5. Predictive coding as a model of cognition.

    PubMed

    Spratling, M W

    2016-08-01

    Previous work has shown that predictive coding can provide a detailed explanation of a very wide range of low-level perceptual processes. It is also widely believed that predictive coding can account for high-level, cognitive, abilities. This article provides support for this view by showing that predictive coding can simulate phenomena such as categorisation, the influence of abstract knowledge on perception, recall and reasoning about conceptual knowledge, context-dependent behavioural control, and naive physics. The particular implementation of predictive coding used here (PC/BC-DIM) has previously been used to simulate low-level perceptual behaviour and the neural mechanisms that underlie them. This algorithm thus provides a single framework for modelling both perceptual and cognitive brain function. PMID:27118562

  6. Modeling and Prediction of Fan Noise

    NASA Technical Reports Server (NTRS)

    Envia, Ed

    2008-01-01

    Fan noise is a significant contributor to the total noise signature of a modern high bypass ratio aircraft engine and with the advent of ultra high bypass ratio engines like the geared turbofan, it is likely to remain so in the future. As such, accurate modeling and prediction of the basic characteristics of fan noise are necessary ingredients in designing quieter aircraft engines in order to ensure compliance with ever more stringent aviation noise regulations. In this paper, results from a comprehensive study aimed at establishing the utility of current tools for modeling and predicting fan noise will be summarized. It should be emphasized that these tools exemplify present state of the practice and embody what is currently used at NASA and Industry for predicting fan noise. The ability of these tools to model and predict fan noise is assessed against a set of benchmark fan noise databases obtained for a range of representative fan cycles and operating conditions. Detailed comparisons between the predicted and measured narrowband spectral and directivity characteristics of fan noise will be presented in the full paper. General conclusions regarding the utility of current tools and recommendations for future improvements will also be given.

  7. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    The increasing ship traffic and maritime transport of dangerous substances make it more difficult to significantly reduce the environmental, economic and social risks posed by potential spills, even though the security rules are becoming more restrictive (ships with double hulls, etc.) and the surveillance systems more developed (VTS, AIS). In fact, the problems associated with spills are, and will always be, a main topic: spill events happen continuously, most of them unknown to the general public because of their small-scale impact, but some of them (in much smaller numbers) become authentic media phenomena in this information era, due to their large dimensions and their environmental and socio-economic impacts on ecosystems and local communities, and also due to some spectacular or shocking pictures generated. Hence, the adverse consequences posed by this type of accident increase the concern with avoiding such events in the future, or minimizing their impacts, using not only surveillance and monitoring tools but also an increased capacity to predict the fate and behaviour of bodies, objects, or substances in the hours following an accident; numerical models can now have a leading role in operational oceanography applied to safety and pollution response in the ocean because of their predictive potential. Search and rescue operations and risk analysis for oil, inert (ship debris, or floating containers), and HNS (hazardous and noxious substances) spills are the main areas where models can be used. Model applications have been widely used in emergency or planning issues associated with pollution risks, and in contingency and mitigation measures. Before a spill, in the planning stage, modelling simulations are used in environmental impact studies, or risk maps, using historical data, reference situations, and typical scenarios. After a spill, the use of fast and simple modelling applications allows one to understand the fate and behaviour of the spilt

  8. Observation simulation experiments with regional prediction models

    NASA Technical Reports Server (NTRS)

    Diak, George; Perkey, Donald J.; Kalb, Michael; Robertson, Franklin R.; Jedlovec, Gary

    1990-01-01

    Research efforts in FY 1990 included studies employing regional scale numerical models as aids in evaluating potential contributions of specific satellite observing systems (current and future) to numerical prediction. One study involves Observing System Simulation Experiments (OSSEs) which mimic operational initialization/forecast cycles but incorporate simulated Advanced Microwave Sounding Unit (AMSU) radiances as input data. The objective of this and related studies is to anticipate the potential value of data from these satellite systems, and develop applications of remotely sensed data for the benefit of short range forecasts. Techniques are also being used that rely on numerical model-based synthetic satellite radiances to interpret the information content of various types of remotely sensed image and sounding products. With this approach, evolution of simulated channel radiance image features can be directly interpreted in terms of the atmospheric dynamical processes depicted by a model. Progress is being made in a study using the internal consistency of a regional prediction model to simplify the assessment of forced diabatic heating and moisture initialization in reducing model spinup times. Techniques for model initialization are being examined, with focus on implications for potential applications of remote microwave observations, including AMSU and Special Sensor Microwave Imager (SSM/I), in shortening model spinup time for regional prediction.

  9. Analysis and predictive modeling of asthma phenotypes.

    PubMed

    Brasier, Allan R; Ju, Hyunsu

    2014-01-01

    Molecular classification using robust biochemical measurements provides a level of diagnostic precision that is unattainable using indirect phenotypic measurements. Multidimensional measurements of proteins, genes, or metabolites (analytes) can identify subtle differences in the pathophysiology of patients with asthma in a way that is not otherwise possible using physiological or clinical assessments. We overview a method for relating biochemical analyte measurements to generate predictive models of discrete (categorical) clinical outcomes, a process referred to as "supervised classification." We consider problems inherent in wide (small n and large p) high-dimensional data, including the curse of dimensionality, collinearity and lack of information content. We suggest methods for reducing the data to the most informative features. We describe different approaches for phenotypic modeling, using logistic regression, classification and regression trees, random forest and nonparametric regression spline modeling. We provide guidance on post hoc model evaluation and methods to evaluate model performance using ROC curves and generalized additive models. The application of validated predictive models for outcome prediction will significantly impact the clinical management of asthma. PMID:24162915
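
    A generic sketch of the supervised-classification workflow outlined above for wide (small-n, large-p) data: standardise, reduce to the most informative features, fit a classifier, and evaluate with cross-validated ROC AUC. The synthetic data below stand in for analyte measurements; no asthma data are used.

```python
# Supervised classification on wide data: feature selection is placed inside
# the pipeline so it is re-fitted within each cross-validation fold.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=80, n_features=500, n_informative=8,
                           random_state=0)          # n << p
clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=20),   # crude dimension reduction
                    LogisticRegression(max_iter=1000))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated ROC AUC: %.2f +/- %.2f" % (auc.mean(), auc.std()))
```

    Keeping the feature selection inside the pipeline avoids the optimistic bias that arises when features are chosen on the full data set before cross-validation.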

  10. Preliminary design report for modeling of hydrogen uptake in fuel rod cladding during severe accidents

    SciTech Connect

    Siefken, L.J.

    1998-08-01

    Preliminary designs are described for models of the interaction of Zircaloy and hydrogen and the consequences of this interaction on the behavior of fuel rod cladding during severe accidents. The modeling of this interaction and its consequences involves the modeling of seven processes: (1) diffusion of oxygen from the bulk gas into the boundary layer at the external cladding surface, (2) diffusion from the boundary layer into the oxide layer at the cladding external surface, (3) diffusion from the inner surface of the oxide layer into the metallic part of the cladding, (4) uptake of hydrogen in the event that the cladding oxide layer is dissolved in a steam-starved region, (5) embrittlement of cladding due to hydrogen uptake, (6) cracking of cladding during quenching due to its embrittlement and (7) release of hydrogen from the cladding after cracking of the cladding. An integral diffusion method is described for calculating the diffusion processes in the cladding. Experimental and theoretical results are presented that show the uptake of hydrogen in the event of dissolution of the oxide layer occurs rapidly and that show the release of hydrogen in the event of cracking of the cladding occurs rapidly. These experimental results are used as a basis for calculating the rate of hydrogen uptake and the rate of hydrogen release. The uptake of hydrogen is limited to the equilibrium solubility calculated by applying Sievert's law. The uptake of hydrogen is an exothermic reaction that accelerates the heatup of a fuel rod. An embrittlement criterion is described that accounts for hydrogen and oxygen concentration and the extent of oxidation. A design is described for implementing the models for Zr-H interaction into the programming framework of the SCDAP/RELAP5 code. A test matrix is described for assessing the impact of the Zr-H interaction models on the calculated behavior of fuel rods in severe accident conditions.
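
    The Sievert's-law solubility cap mentioned in the design can be written down directly; in the sketch below the pre-exponential factor and enthalpy term are hypothetical placeholders, not constants from the SCDAP/RELAP5 design.

```python
# Sievert's-law solubility cap: equilibrium hydrogen concentration grows with
# the square root of the H2 partial pressure.
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def sieverts_solubility(p_h2, temp, k0=1.0e-2, dh=-50.0e3):
    """Equilibrium H concentration (arbitrary units) at partial pressure
    p_h2 [Pa] and temperature temp [K]; k0 and dh are placeholder constants."""
    k = k0 * math.exp(-dh / (R * temp))   # temperature-dependent Sieverts constant
    return k * math.sqrt(p_h2)

# hydrogen uptake in the model is capped at this equilibrium value
print(sieverts_solubility(p_h2=2.0e5, temp=1500.0))
```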

  11. Modelling language evolution: Examples and predictions.

    PubMed

    Gong, Tao; Shuai, Lan; Zhang, Menghan

    2014-06-01

    We survey recent computer modelling research of language evolution, focusing on a rule-based model simulating the lexicon-syntax coevolution and an equation-based model quantifying the language competition dynamics. We discuss four predictions of these models: (a) correlation between domain-general abilities (e.g. sequential learning) and language-specific mechanisms (e.g. word order processing); (b) coevolution of language and relevant competences (e.g. joint attention); (c) effects of cultural transmission and social structure on linguistic understandability; and (d) commonalities between linguistic, biological, and physical phenomena. All these contribute significantly to our understanding of the evolutions of language structures, individual learning mechanisms, and relevant biological and socio-cultural factors. We conclude the survey by highlighting three future directions of modelling studies of language evolution: (a) adopting experimental approaches for model evaluation; (b) consolidating empirical foundations of models; and (c) multi-disciplinary collaboration among modelling, linguistics, and other relevant disciplines. PMID:24286718

  12. Modelling language evolution: Examples and predictions

    NASA Astrophysics Data System (ADS)

    Gong, Tao; Shuai, Lan; Zhang, Menghan

    2014-06-01

    We survey recent computer modelling research of language evolution, focusing on a rule-based model simulating the lexicon-syntax coevolution and an equation-based model quantifying the language competition dynamics. We discuss four predictions of these models: (a) correlation between domain-general abilities (e.g. sequential learning) and language-specific mechanisms (e.g. word order processing); (b) coevolution of language and relevant competences (e.g. joint attention); (c) effects of cultural transmission and social structure on linguistic understandability; and (d) commonalities between linguistic, biological, and physical phenomena. All these contribute significantly to our understanding of the evolutions of language structures, individual learning mechanisms, and relevant biological and socio-cultural factors. We conclude the survey by highlighting three future directions of modelling studies of language evolution: (a) adopting experimental approaches for model evaluation; (b) consolidating empirical foundations of models; and (c) multi-disciplinary collaboration among modelling, linguistics, and other relevant disciplines.

  13. Combining Modeling and Gaming for Predictive Analytics

    SciTech Connect

    Riensche, Roderick M.; Whitney, Paul D.

    2012-08-22

    Many of our most significant challenges involve people. While human behavior has long been studied, there are recent advances in computational modeling of human behavior. With advances in computational capabilities come increases in the volume and complexity of data that humans must understand in order to make sense of and capitalize on these modeling advances. Ultimately, models represent an encapsulation of human knowledge. One inherent challenge in modeling is efficient and accurate transfer of knowledge from humans to models, and subsequent retrieval. The simulated real-world environment of games presents one avenue for these knowledge transfers. In this paper we describe our approach of combining modeling and gaming disciplines to develop predictive capabilities, using formal models to inform game development, and using games to provide data for modeling.

  14. Regional long-term model of radioactivity dispersion and fate in the Northwestern Pacific and adjacent seas: application to the Fukushima Dai-ichi accident.

    PubMed

    Maderich, V; Bezhenar, R; Heling, R; de With, G; Jung, K T; Myoung, J G; Cho, Y-K; Qiao, F; Robertson, L

    2014-05-01

    The compartment model POSEIDON-R was modified and applied to the Northwestern Pacific and adjacent seas to simulate the transport and fate of radioactivity in the period 1945-2010, and to perform a radiological assessment on the releases of radioactivity due to the Fukushima Dai-ichi accident for the period 2011-2040. The model predicts the dispersion of radioactivity in the water column and in sediments, the transfer of radionuclides throughout the marine food web, and subsequent doses to humans due to the consumption of marine products. A generic predictive dynamic food-chain model is used instead of the biological concentration factor (BCF) approach. The radionuclide uptake model for fish has as a central feature the accumulation of radionuclides in the target tissue. The three layer structure of the water column makes it possible to describe the vertical structure of radioactivity in deep waters. In total 175 compartments cover the Northwestern Pacific, the East China and Yellow Seas and the East/Japan Sea. The model was validated from (137)Cs data for the period 1945-2010. Calculated concentrations of (137)Cs in water, bottom sediments and marine organisms in the coastal compartment, before and after the accident, are in close agreement with measurements from the Japanese agencies. The agreement for water is achieved when an additional continuous flux of 3.6 TBq y(-1) is used for underground leakage of contaminated water from the Fukushima Dai-ichi NPP, during the three years following the accident. The dynamic food web model predicts that due to the delay of the transfer throughout the food web, the concentration of (137)Cs for piscivorous fishes returns to background level only in 2016. For the year 2011, the calculated individual dose rate for Fukushima Prefecture due to consumption of fishery products is 3.6 μSv y(-1). Following the Fukushima Dai-ichi accident the collective dose due to ingestion of marine products for Japan increased in 2011 by a
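
    A minimal illustration of the compartment-model idea described above, far simpler than POSEIDON-R's 175 compartments: one coastal water box exchanging activity with bottom sediment, with radioactive decay included. All transfer rates and the initial inventory are hypothetical placeholders.

```python
# Toy two-box radionuclide model: a coastal water compartment exchanging
# 137Cs with bottom sediment, plus physical decay.
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA_CS137 = np.log(2) / (30.17 * 365.25)   # physical decay constant, 1/d

def rhs(t, y, k_ws=5e-4, k_sw=5e-5):
    water, sediment = y
    d_water = -(k_ws + LAMBDA_CS137) * water + k_sw * sediment
    d_sediment = k_ws * water - (k_sw + LAMBDA_CS137) * sediment
    return [d_water, d_sediment]

# pulse release: the inventory starts entirely in the water column
sol = solve_ivp(rhs, (0.0, 365.25 * 30), [1.0e3, 0.0])
print("water, sediment inventories after 30 years:", sol.y[:, -1])
```

    A dynamic food-web stage, as used in POSEIDON-R, would add further compartments for organisms fed by the water and sediment concentrations.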

  15. Persistence and predictability in a perfect model

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried D.; Suarez, Max J.; Schemm, Jae-Kyung

    1992-01-01

    A realistic two-level GCM is used to examine the relationship between predictability and persistence. Predictability is measured by the average divergence of ensembles of solutions starting from perturbed initial conditions, and persistence is defined in terms of the autocorrelation function based on a single long-term model integration. The average skill of the dynamical forecasts is compared with the skill of simple persistence-based statistical forecasts. For initial errors comparable in magnitude to present-day analysis errors, the statistical forecast loses all skill after about one week, reflecting the lifetime of the lowest frequency fluctuations in the model. Large ensemble mean dynamical forecasts would be expected to remain skillful for about 3 wk. The disparity between the skill of the statistical and dynamical forecasts is greater for the higher frequency modes, which have little memory beyond 1 d, yet remain predictable for about 2 wk. The results are analyzed in terms of two characteristic time scales.
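
    A minimal sketch of the persistence idea described above: for a long "model" time series (here a synthetic AR(1) process stands in for the GCM integration), the skill of a persistence forecast at lead L is simply the lag-L autocorrelation. The series length and AR coefficient are illustrative assumptions, not values from the record.

    # Persistence-forecast skill estimated from the autocorrelation function
    # of a single long synthetic time series (assumed AR(1) surrogate).
    import numpy as np

    rng = np.random.default_rng(0)
    n, phi = 20000, 0.9                     # assumed series length and AR(1) memory
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()

    def autocorr(series, lag):
        """Lag autocorrelation of a 1-D series."""
        if lag == 0:
            return 1.0
        s = series - series.mean()
        return np.dot(s[:-lag], s[lag:]) / np.dot(s, s)

    # For a persistence forecast x_hat(t+L) = x(t), the anomaly correlation
    # skill at lead L equals the autocorrelation at lag L.
    for lead in (1, 5, 10, 20):
        print(f"lead {lead:2d}: persistence skill ~ {autocorr(x, lead):.2f}")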

  16. An exponential filter model predicts lightness illusions

    PubMed Central

    Zeman, Astrid; Brooks, Kevin R.; Ghebreab, Sennay

    2015-01-01

    Lightness, or perceived reflectance of a surface, is influenced by surrounding context. This is demonstrated by the Simultaneous Contrast Illusion (SCI), where a gray patch is perceived lighter against a black background and vice versa. Conversely, assimilation is where the lightness of the target patch moves toward that of the bounding areas and can be demonstrated in White's effect. Blakeslee and McCourt (1999) introduced an oriented difference-of-Gaussian (ODOG) model that is able to account for both contrast and assimilation in a number of lightness illusions and that has been subsequently improved using localized normalization techniques. We introduce a model inspired by image statistics that is based on a family of exponential filters, with kernels spanning across multiple sizes and shapes. We include an optional second stage of normalization based on contrast gain control. Our model was tested on a well-known set of lightness illusions that have previously been used to evaluate ODOG and its variants, and model lightness values were compared with typical human data. We investigate whether predictive success depends on filters of a particular size or shape and whether pooling information across filters can improve performance. The best single filter correctly predicted the direction of lightness effects for 21 out of 27 illusions. Combining two filters together increased the best performance to 23, with asymptotic performance at 24 for an arbitrarily large combination of filter outputs. While normalization improved prediction magnitudes, it only slightly improved overall scores in direction predictions. The prediction performance of 24 out of 27 illusions equals that of the best performing ODOG variant, with greater parsimony. Our model shows that V1-style orientation-selectivity is not necessary to account for lightness illusions and that a low-level model based on image statistics is able to account for a wide range of both contrast and assimilation effects
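
    A minimal sketch of the general idea behind the model family above (multi-scale exponential filtering of an image followed by a center-minus-surround comparison), applied to a toy simultaneous-contrast stimulus. The kernel sizes, scales and test pattern are assumptions for illustration, not the authors' implementation or parameters.

    # Multi-scale exponential filtering of a toy simultaneous-contrast image.
    import numpy as np
    from scipy.signal import convolve2d

    def exp_kernel(size, scale):
        """Isotropic exponential kernel exp(-r/scale), normalized to unit sum."""
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        k = np.exp(-np.sqrt(xx**2 + yy**2) / scale)
        return k / k.sum()

    # Toy stimulus: identical gray patches on dark (left) and light (right) halves.
    img = np.zeros((100, 200))
    img[:, 100:] = 1.0
    img[40:60, 40:60] = 0.5
    img[40:60, 140:160] = 0.5

    for scale in (2, 5, 10):                # several kernel sizes, as in the model family
        blurred = convolve2d(img, exp_kernel(31, scale), mode="same", boundary="symm")
        response = img - blurred            # center-minus-surround style response
        left = response[40:60, 40:60].mean()
        right = response[40:60, 140:160].mean()
        print(f"scale {scale:2d}: patch response dark side {left:+.3f}, light side {right:+.3f}")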

  17. An exponential filter model predicts lightness illusions.

    PubMed

    Zeman, Astrid; Brooks, Kevin R; Ghebreab, Sennay

    2015-01-01

    Lightness, or perceived reflectance of a surface, is influenced by surrounding context. This is demonstrated by the Simultaneous Contrast Illusion (SCI), where a gray patch is perceived lighter against a black background and vice versa. Conversely, assimilation is where the lightness of the target patch moves toward that of the bounding areas and can be demonstrated in White's effect. Blakeslee and McCourt (1999) introduced an oriented difference-of-Gaussian (ODOG) model that is able to account for both contrast and assimilation in a number of lightness illusions and that has been subsequently improved using localized normalization techniques. We introduce a model inspired by image statistics that is based on a family of exponential filters, with kernels spanning across multiple sizes and shapes. We include an optional second stage of normalization based on contrast gain control. Our model was tested on a well-known set of lightness illusions that have previously been used to evaluate ODOG and its variants, and model lightness values were compared with typical human data. We investigate whether predictive success depends on filters of a particular size or shape and whether pooling information across filters can improve performance. The best single filter correctly predicted the direction of lightness effects for 21 out of 27 illusions. Combining two filters together increased the best performance to 23, with asymptotic performance at 24 for an arbitrarily large combination of filter outputs. While normalization improved prediction magnitudes, it only slightly improved overall scores in direction predictions. The prediction performance of 24 out of 27 illusions equals that of the best performing ODOG variant, with greater parsimony. Our model shows that V1-style orientation-selectivity is not necessary to account for lightness illusions and that a low-level model based on image statistics is able to account for a wide range of both contrast and assimilation effects

  18. Advancements in predictive plasma formation modeling

    NASA Astrophysics Data System (ADS)

    Purvis, Michael A.; Schafgans, Alexander; Brown, Daniel J. W.; Fomenkov, Igor; Rafac, Rob; Brown, Josh; Tao, Yezheng; Rokitski, Slava; Abraham, Mathew; Vargas, Mike; Rich, Spencer; Taylor, Ted; Brandt, David; Pirati, Alberto; Fisher, Aaron; Scott, Howard; Koniges, Alice; Eder, David; Wilks, Scott; Link, Anthony; Langer, Steven

    2016-03-01

    We present highlights from plasma simulations performed in collaboration with Lawrence Livermore National Labs. This modeling is performed to advance the rate of learning about optimal EUV generation for laser produced plasmas and to provide insights where experimental results are not currently available. The goal is to identify key physical processes necessary for an accurate and predictive model capable of simulating a wide range of conditions. This modeling will help to drive source performance scaling in support of the EUV Lithography roadmap. The model simulates pre-pulse laser interaction with the tin droplet and follows the droplet expansion into the main pulse target zone. Next, the interaction of the expanded droplet with the main laser pulse is simulated. We demonstrate the predictive nature of the code and provide comparison with experimental results.

  19. Time prediction model of subway transfer.

    PubMed

    Zhou, Yuyang; Yao, Lin; Gong, Yi; Chen, Yanyan

    2016-01-01

    Walking time prediction aims to deduce waiting time and travel time for passengers and to provide a quantitative basis for subway schedule management. The model is based on transfer passenger flow and the type of pedestrian facilities. Chaoyangmen station in Beijing was taken as the learning set to obtain the relationship between transfer walking speed and passenger volume. The sectional passenger volume of the different facilities was calculated according to the transfer passage classification. Model parameters were computed by curve fitting for the various pedestrian facilities. The testing set contained four transfer stations with large passenger volumes. Validation showed that the established model is effective and practical. The proposed model offers a real-time prediction method with good applicability. It can provide a transfer scheme reference for passengers and, at the same time, improve the scheduling and management of subway operations. PMID:26835224
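
    A minimal sketch of the kind of calibration described above: fit a walking-speed versus sectional-volume relation by curve fitting and use it to predict a transfer walking time. The functional form, data points and passage length are illustrative assumptions, not the calibrated Chaoyangmen parameters.

    # Curve fitting of a walking-speed vs. passenger-volume relation.
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical observations: sectional volume (persons/min/m) vs. speed (m/s).
    volume = np.array([5, 10, 20, 30, 40, 50, 60], dtype=float)
    speed = np.array([1.35, 1.30, 1.18, 1.05, 0.95, 0.88, 0.82])

    def speed_model(q, v_free, k):
        """Assumed exponential decay of walking speed with sectional volume."""
        return v_free * np.exp(-k * q)

    params, _ = curve_fit(speed_model, volume, speed, p0=(1.4, 0.01))
    v_free, k = params
    print(f"fitted free-flow speed {v_free:.2f} m/s, decay coefficient {k:.4f}")

    # Predicted walking time for a hypothetical 120 m passage at a given volume.
    length, q = 120.0, 35.0
    print(f"predicted walking time at q={q}: {length / speed_model(q, *params):.0f} s")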

  20. DKIST Polarization Modeling and Performance Predictions

    NASA Astrophysics Data System (ADS)

    Harrington, David

    2016-05-01

    Calibrating the Mueller matrices of large aperture telescopes and associated coude instrumentation requires astronomical sources and several modeling assumptions to predict the behavior of the system polarization with field of view, altitude, azimuth and wavelength. The Daniel K Inouye Solar Telescope (DKIST) polarimetric instrumentation requires very high accuracy calibration of a complex coude path with an off-axis f/2 primary mirror, time dependent optical configurations and substantial field of view. Polarization predictions across a diversity of optical configurations, tracking scenarios, slit geometries and vendor coating formulations are critical to both construction and continued operations efforts. Recent daytime sky based polarization calibrations of the 4m AEOS telescope and HiVIS spectropolarimeter on Haleakala have provided system Mueller matrices over full telescope articulation for a 15-reflection coude system. AEOS and HiVIS are a DKIST analog with a many-fold coude optical feed and similar mirror coatings creating 100% polarization cross-talk with altitude, azimuth and wavelength. Polarization modeling predictions using Zemax have successfully matched the altitude-azimuth-wavelength dependence on HiVIS with the few percent amplitude limitations of several instrument artifacts. Polarization predictions for coude beam paths depend greatly on modeling the angle-of-incidence dependences in powered optics and the mirror coating formulations. A 6 month HiVIS daytime sky calibration plan has been analyzed for accuracy under a wide range of sky conditions and data analysis algorithms. Predictions of polarimetric performance for the DKIST first-light instrumentation suite have been created under a range of configurations. These new modeling tools and polarization predictions have a substantial impact on the design, fabrication and calibration process in the presence of manufacturing issues, science use-case requirements and ultimate system calibration

  1. Predictive performance models and multiple task performance

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are shown in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of different component tasks.

  2. Predictive Modeling of the CDRA 4BMS

    NASA Technical Reports Server (NTRS)

    Coker, Robert; Knox, James

    2016-01-01

    Fully predictive models of the Four Bed Molecular Sieve of the Carbon Dioxide Removal Assembly on the International Space Station are being developed. This virtual laboratory will be used to help reduce mass, power, and volume requirements for future missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future crewed Mars missions as well as the resolution of anomalies observed in the ISS CDRA.

  3. Cognitive modeling to predict video interpretability

    NASA Astrophysics Data System (ADS)

    Young, Darrell L.; Bakir, Tariq

    2011-06-01

    A processing framework for cognitive modeling to predict video interpretability is discussed. The architecture consists of spatiotemporal video preprocessing, metric computation, metric normalization, pooling of like metric groups with masking adjustments, multinomial logistic pooling of the Minkowski-pooled groups of similar quality metrics, and estimation of a confidence interval for the final result.
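
    A minimal sketch of the pooling chain sketched above: pool normalized metric groups with a Minkowski norm and then map the pooled scores to a rating with a logistic function (the framework uses a multinomial variant and masking adjustments, which are omitted here). The metric values, weights and exponents are illustrative assumptions.

    # Minkowski pooling of metric groups followed by a logistic mapping.
    import numpy as np

    def minkowski_pool(values, p):
        values = np.asarray(values, dtype=float)
        return np.mean(np.abs(values) ** p) ** (1.0 / p)

    # Hypothetical normalized quality metrics grouped by type.
    metric_groups = {
        "blur":   [0.7, 0.8, 0.75],
        "noise":  [0.6, 0.65],
        "motion": [0.9, 0.85, 0.8],
    }
    pooled = np.array([minkowski_pool(v, p=3.0) for v in metric_groups.values()])

    weights = np.array([1.5, 1.0, 0.8])     # assumed group weights
    bias = -2.0                             # assumed intercept
    score = 1.0 / (1.0 + np.exp(-(weights @ pooled + bias)))
    print(f"predicted interpretability (0-1): {score:.2f}")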

  4. A Robustly Stabilizing Model Predictive Control Algorithm

    NASA Technical Reports Server (NTRS)

    Ackmece, A. Behcet; Carson, John M., III

    2007-01-01

    A model predictive control (MPC) algorithm that differs from prior MPC algorithms has been developed for controlling an uncertain nonlinear system. This algorithm guarantees the resolvability of an associated finite-horizon optimal-control problem in a receding-horizon implementation.

  5. A Predictive Model for MSSW Student Success

    ERIC Educational Resources Information Center

    Napier, Angela Michele

    2011-01-01

    This study tested a hypothetical model for predicting both graduate GPA and graduation of University of Louisville Kent School of Social Work Master of Science in Social Work (MSSW) students entering the program during the 2001-2005 school years. The preexisting characteristics of demographics, academic preparedness and culture shock along with…

  6. Nearshore Operational Model for Rip Current Predictions

    NASA Astrophysics Data System (ADS)

    Sembiring, L. E.; Van Dongeren, A. R.; Van Ormondt, M.; Winter, G.; Roelvink, J.

    2012-12-01

    A coastal operational model system can serve as a tool to monitor and predict coastal hazards, and to acquire up-to-date information on coastal state indicators. The objective of this research is to develop a nearshore operational model system for the Dutch coast focusing on swimmer safety. For that purpose, an operational model system has been built which can predict conditions up to 48 hours ahead. The model system consists of three nested model domains covering the North Sea, the Dutch coastline, and a local model of the area of interest. Three different process-based models are used to simulate physical processes within the system: SWAN for wave propagation, Delft3D-Flow for hydraulic flow simulation, and XBeach for the nearshore model. The SWAN model is forced by wind fields from operational HiRLAM, as well as two-dimensional wave spectral data from WaveWatch 3 Global at the ocean boundaries. The Delft3D-Flow model is forced by tidal constants for several important astronomical components at its boundaries, as well as by HiRLAM wind fields. For the local XBeach model, up-to-date bathymetry will be obtained by assimilating model computations and Argus video observations. A hindcast is carried out on the Continental Shelf Model, covering the North Sea and nearby Atlantic Ocean, for the year 2009. Model skill is represented by several statistical measures such as rms error and bias. In general, the results show that the model system exhibits good agreement with field data. For SWAN, integral significant wave heights are predicted well by the model for all wave buoys considered, with rms errors ranging from 0.16 m in May (observed mean significant wave height 1.08 m) up to 0.39 m in November (observed mean significant wave height 1.91 m). However, it is found that the wave model slightly underestimates the observations for the period of June, especially
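
    A minimal sketch of the skill measures quoted above (bias and rms error) for a hindcast-versus-buoy comparison; the arrays are placeholder values, not the 2009 hindcast data.

    # Bias and rms error of modelled vs. observed significant wave height.
    import numpy as np

    observed = np.array([1.10, 0.95, 1.30, 1.05, 0.90, 1.20])   # hypothetical buoy Hs (m)
    modelled = np.array([1.00, 1.05, 1.20, 1.10, 0.85, 1.05])   # hypothetical SWAN Hs (m)

    bias = np.mean(modelled - observed)
    rmse = np.sqrt(np.mean((modelled - observed) ** 2))
    print(f"bias = {bias:+.2f} m, rms error = {rmse:.2f} m, observed mean = {observed.mean():.2f} m")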

  7. Analyzing the causation of a railway accident based on a complex network

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accidents based on complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from an overall perspective. As a case study, the “7.23” China-Yongwen railway accident is analyzed with this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this accident. In conclusion, the constructed model provides a theoretical basis for railway accident prediction and, hence, can help reduce the occurrence of railway accidents.
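
    A minimal sketch of a causal-factor network of the general kind described above, built with networkx and ranked by standard statistical indicators. The nodes and edges are invented placeholders, not the factors from the "7.23" accident analysis.

    # Causal-factor network with degree and betweenness centrality indicators.
    import networkx as nx

    G = nx.DiGraph()
    G.add_edges_from([
        ("lightning strike", "signal failure"),
        ("inadequate signal inspection", "signal failure"),
        ("poor line-condition checking", "wrong dispatch decision"),
        ("signal failure", "wrong dispatch decision"),
        ("wrong dispatch decision", "rear-end collision"),
    ])

    degree = nx.degree_centrality(G)
    betweenness = nx.betweenness_centrality(G)
    for node in G.nodes:
        print(f"{node:30s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")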

  8. Disease Prediction Models and Operational Readiness

    PubMed Central

    Corley, Courtney D.; Pullum, Laura L.; Hartley, David M.; Benedum, Corey; Noonan, Christine; Rabinowitz, Peter M.; Lancaster, Mary J.

    2014-01-01

    The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. We define a disease event to be a biological event with focus on the One Health paradigm. These events are characterized by evidence of infection and/or disease condition. We reviewed models that attempted to predict a disease event, not merely its transmission dynamics, and we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). We searched commercial and government databases and harvested Google search results for eligible models, using terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. After removal of duplications and extraneous material, a core collection of 6,524 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. As a result, we systematically reviewed 44 papers, and the results are presented in this analysis. We identified 44 models, classified as one or more of the following: event prediction (4), spatial (26), ecological niche (28), diagnostic or clinical (6), spread or response (9), and reviews (3). The model parameters (e.g., etiology, climatic, spatial, cultural) and data sources (e.g., remote sensing, non-governmental organizations, expert opinion, epidemiological) were recorded and reviewed. A component of this review is the identification of verification and validation (V&V) methods applied to each model, if any V&V method was reported. All models were classified as either having undergone Some Verification or Validation method, or No Verification or Validation. We close by outlining an initial set of operational readiness level guidelines for disease prediction models based upon established Technology Readiness

  9. Disease prediction models and operational readiness.

    PubMed

    Corley, Courtney D; Pullum, Laura L; Hartley, David M; Benedum, Corey; Noonan, Christine; Rabinowitz, Peter M; Lancaster, Mary J

    2014-01-01

    The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. We define a disease event to be a biological event with focus on the One Health paradigm. These events are characterized by evidence of infection and/or disease condition. We reviewed models that attempted to predict a disease event, not merely its transmission dynamics, and we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). We searched commercial and government databases and harvested Google search results for eligible models, using terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. After removal of duplications and extraneous material, a core collection of 6,524 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. As a result, we systematically reviewed 44 papers, and the results are presented in this analysis. We identified 44 models, classified as one or more of the following: event prediction (4), spatial (26), ecological niche (28), diagnostic or clinical (6), spread or response (9), and reviews (3). The model parameters (e.g., etiology, climatic, spatial, cultural) and data sources (e.g., remote sensing, non-governmental organizations, expert opinion, epidemiological) were recorded and reviewed. A component of this review is the identification of verification and validation (V&V) methods applied to each model, if any V&V method was reported. All models were classified as either having undergone Some Verification or Validation method, or No Verification or Validation. We close by outlining an initial set of operational readiness level guidelines for disease prediction models based upon established Technology Readiness

  10. Can contaminant transport models predict breakthrough?

    USGS Publications Warehouse

    Peng, Wei-Shyuan; Hampton, Duane R.; Konikow, Leonard F.; Kambham, Kiran; Benegar, Jeffery J.

    2000-01-01

    A solute breakthrough curve measured during a two-well tracer test was successfully predicted in 1986 using specialized contaminant transport models. Water was injected into a confined, unconsolidated sand aquifer and pumped out 125 feet (38.3 m) away at the same steady rate. The injected water was spiked with bromide for over three days; the outflow concentration was monitored for a month. Based on previous tests, the horizontal hydraulic conductivity of the thick aquifer varied by a factor of seven among 12 layers. Assuming stratified flow with small dispersivities, two research groups accurately predicted breakthrough with three-dimensional (12-layer) models using curvilinear elements following the arc-shaped flowlines in this test. Can contaminant transport models commonly used in industry, that use rectangular blocks, also reproduce this breakthrough curve? The two-well test was simulated with four MODFLOW-based models, MT3D (FD and HMOC options), MODFLOWT, MOC3D, and MODFLOW-SURFACT. Using the same 12 layers and small dispersivity used in the successful 1986 simulations, these models fit almost as accurately as the models using curvilinear blocks. Subtle variations in the curves illustrate differences among the codes. Sensitivities of the results to number and size of grid blocks, number of layers, boundary conditions, and values of dispersivity and porosity are briefly presented. The fit between calculated and measured breakthrough curves degenerated as the number of layers and/or grid blocks decreased, reflecting a loss of model predictive power as the level of characterization lessened. Therefore, the breakthrough curve for most field sites can be predicted only qualitatively due to limited characterization of the hydrogeology and contaminant source strength.
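
    As a simple point of comparison with the numerical breakthrough simulations described above, a one-dimensional advection-dispersion breakthrough curve can be written in closed form (Ogata-Banks solution for a continuous injection). The velocity and dispersion values below are illustrative assumptions and are unrelated to the two-well test; only the 38.3 m distance is taken from the record.

    # Analytical 1-D breakthrough curve (Ogata-Banks, continuous source).
    import numpy as np
    from scipy.special import erfc

    def breakthrough(t, x, v, D, c0=1.0):
        """Relative concentration at distance x and time t for velocity v, dispersion D."""
        term1 = erfc((x - v * t) / (2.0 * np.sqrt(D * t)))
        term2 = np.exp(v * x / D) * erfc((x + v * t) / (2.0 * np.sqrt(D * t)))
        return 0.5 * c0 * (term1 + term2)

    x = 38.3    # observation distance (m), matching the well spacing above
    v = 2.0     # assumed seepage velocity (m/day)
    D = 4.0     # assumed longitudinal dispersion coefficient (m^2/day)
    for t in (5, 10, 15, 20, 25, 30):
        print(f"day {t:2d}: C/C0 = {breakthrough(t, x, v, D):.3f}")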

  11. Simulations of the transport and deposition of 137Cs over Europe after the Chernobyl NPP accident: influence of varying emission-altitude and model horizontal and vertical resolution

    NASA Astrophysics Data System (ADS)

    Evangeliou, N.; Balkanski, Y.; Cozic, A.; Møller, A. P.

    2013-03-01

    The coupled model LMDzORINCA has been used to simulate the transport, wet and dry deposition of the radioactive tracer 137Cs after accidental releases. For that reason, two horizontal resolutions were deployed and used in the model, a regular grid of 2.5°×1.25°, and the same grid stretched over Europe to reach a resolution of 0.45°×0.51°. The vertical dimension is represented with two different resolutions, 19 and 39 levels, respectively, extending up to mesopause. Four different simulations are presented in this work; the first uses the regular grid over 19 vertical levels assuming that the emissions took place at the surface (RG19L(S)), the second also uses the regular grid over 19 vertical levels but realistic source injection heights (RG19L); in the third resolution the grid is regular and the vertical resolution 39 vertical levels (RG39L) and finally, it is extended to the stretched grid with 19 vertical levels (Z19L). The best choice for the model validation was the Chernobyl accident which occurred in Ukraine (ex-USSR) on 26 May 1986. This accident has been widely studied since 1986, and a large database has been created containing measurements of atmospheric activity concentration and total cumulative deposition for 137Cs from most of the European countries. According to the results, the performance of the model to predict the transport and deposition of the radioactive tracer was efficient and accurate presenting low biases in activity concentrations and deposition inventories, despite the large uncertainties on the intensity of the source released. However, the best agreement with observations was obtained using the highest horizontal resolution of the model (Z19L run). The model managed to predict the radioactive contamination in most of the European regions (similar to Atlas), and also the arrival times of the radioactive fallout. As regards to the vertical resolution, the largest biases were obtained for the 39 layers run due to the increase of

  12. Thermal barrier coating life prediction model development

    NASA Technical Reports Server (NTRS)

    Hillery, R. V.

    1984-01-01

    In order to fully exploit thermal barrier coatings (TBCs) on turbine components and achieve the maximum performance benefit, the knowledge and understanding of TBC failure mechanisms must be increased and the means to predict coating life developed. The proposed program will determine the predominant modes of TBC system degradation and then develop and verify life prediction models accounting for those degradation modes. The successful completion of the program will have dual benefits: the ability to take advantage of the performance benefits offered by TBCs, and a sounder basis for making future improvements in coating behavior.

  13. Model predictive control of constrained LPV systems

    NASA Astrophysics Data System (ADS)

    Yu, Shuyou; Böhm, Christoph; Chen, Hong; Allgöwer, Frank

    2012-06-01

    This article considers robust model predictive control (MPC) schemes for linear parameter varying (LPV) systems in which the time-varying parameter is assumed to be measured online and exploited for feedback. A closed-loop MPC with a parameter-dependent control law is proposed first. The parameter-dependent control law reduces conservativeness of the existing results with a static control law at the cost of higher computational burden. Furthermore, an MPC scheme with prediction horizon '1' is proposed to deal with the case of asymmetric constraints. Both approaches guarantee recursive feasibility and closed-loop stability if the considered optimisation problem is feasible at the initial time instant.
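
    A minimal sketch of the receding-horizon principle underlying MPC: at every step a finite-horizon problem is solved and only the first input is applied. This is a plain, unconstrained linear-quadratic version, not the robust parameter-dependent LPV scheme of the article; the system matrices, weights and horizon are arbitrary illustrative choices.

    # Receding-horizon control of a linear system via a finite-horizon Riccati recursion.
    import numpy as np

    A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete-time double integrator (assumed)
    B = np.array([[0.005], [0.1]])
    Q = np.diag([1.0, 0.1])                  # state weight (assumed)
    R = np.array([[0.01]])                   # input weight (assumed)
    N = 20                                   # prediction horizon (assumed)

    def finite_horizon_gain(A, B, Q, R, N):
        """Backward Riccati recursion; returns the first-step feedback gain."""
        P = Q.copy()
        for _ in range(N):
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            P = Q + A.T @ P @ (A - B @ K)
        return K

    x = np.array([[1.0], [0.0]])             # initial state
    for step in range(30):
        K = finite_horizon_gain(A, B, Q, R, N)   # re-solved each step (receding horizon)
        u = -K @ x                               # only the first input is applied
        x = A @ x + B @ u
    print("state after 30 steps:", x.ravel())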

  14. Hidden Markov models for threat prediction fusion

    NASA Astrophysics Data System (ADS)

    Ross, Kenneth N.; Chaney, Ronald D.

    2000-04-01

    This work addresses the often neglected but important problem of Level 3 fusion or threat refinement. This paper describes algorithms for threat prediction and test results from a prototype threat prediction fusion engine. The threat prediction fusion engine selectively models important aspects of the battlespace state using probability-based methods and information obtained from lower level fusion engines. Our approach uses hidden Markov models of a hierarchical threat state to find the most likely Course of Action (CoA) for the opposing forces. Decision trees use features derived from the CoA probabilities and other information to estimate the level of threat presented by the opposing forces. This approach provides the user with several measures associated with the level of threat, including: the probability that the enemy is following a particular CoA, the potential threat presented by the opposing forces, and the likely time of the threat. The hierarchical approach used for modeling helps us efficiently represent the battlespace with a structure that permits scaling the models to larger scenarios without adding prohibitive computational costs or sacrificing model fidelity.
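
    A minimal sketch of using a hidden Markov model to infer the most likely hidden course of action from a sequence of observed events (Viterbi decoding). The states, observations and probabilities below are invented placeholders, not the prototype engine's models.

    # Viterbi decoding of the most likely hidden course of action.
    import numpy as np

    states = ["defend", "feint", "attack"]
    obs_symbols = ["recon sighted", "units massing", "comms spike"]
    start = np.array([0.5, 0.3, 0.2])
    trans = np.array([[0.7, 0.2, 0.1],
                      [0.2, 0.5, 0.3],
                      [0.1, 0.2, 0.7]])
    emit = np.array([[0.6, 0.3, 0.1],
                     [0.3, 0.4, 0.3],
                     [0.1, 0.3, 0.6]])

    def viterbi(obs_idx):
        """Return the most likely state path for a sequence of observation indices."""
        n, T = len(states), len(obs_idx)
        logp = np.full((T, n), -np.inf)
        back = np.zeros((T, n), dtype=int)
        logp[0] = np.log(start) + np.log(emit[:, obs_idx[0]])
        for t in range(1, T):
            for j in range(n):
                scores = logp[t - 1] + np.log(trans[:, j])
                back[t, j] = np.argmax(scores)
                logp[t, j] = scores[back[t, j]] + np.log(emit[j, obs_idx[t]])
        path = [int(np.argmax(logp[-1]))]
        for t in range(T - 1, 0, -1):
            path.append(back[t, path[-1]])
        return [states[i] for i in reversed(path)]

    observed = [0, 1, 1, 2]                  # indices into obs_symbols
    print("most likely course of action over time:", viterbi(observed))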

  15. Genetic models of homosexuality: generating testable predictions

    PubMed Central

    Gavrilets, Sergey; Rice, William R

    2006-01-01

    Homosexuality is a common occurrence in humans and other species, yet its genetic and evolutionary basis is poorly understood. Here, we formulate and study a series of simple mathematical models for the purpose of predicting empirical patterns that can be used to determine the form of selection that leads to polymorphism of genes influencing homosexuality. Specifically, we develop theory to make contrasting predictions about the genetic characteristics of genes influencing homosexuality including: (i) chromosomal location, (ii) dominance among segregating alleles and (iii) effect sizes that distinguish between the two major models for their polymorphism: the overdominance and sexual antagonism models. We conclude that the measurement of the genetic characteristics of quantitative trait loci (QTLs) found in genomic screens for genes influencing homosexuality can be highly informative in resolving the form of natural selection maintaining their polymorphism. PMID:17015344

  16. ENSO Prediction using Vector Autoregressive Models

    NASA Astrophysics Data System (ADS)

    Chapman, D. R.; Cane, M. A.; Henderson, N.; Lee, D.; Chen, C.

    2013-12-01

    A recent comparison (Barnston et al., 2012, BAMS) shows the ENSO forecasting skill of dynamical models now exceeds that of statistical models, but the best statistical models are comparable to all but the very best dynamical models. In this comparison, the leading statistical model is the one based on the Empirical Model Reduction (EMR) method. Here we report on experiments with multilevel Vector Autoregressive models using only sea surface temperatures (SSTs) as predictors. VAR(L) models generalize Linear Inverse Models (LIM), which are a VAR(1) method, as well as multilevel univariate autoregressive models. Optimal forecast skill is achieved using 12 to 14 months of prior state information (i.e., 12-14 levels), which allows SSTs alone to capture the effects of other variables such as heat content as well as seasonality. The use of multiple levels allows the model advancing one month at a time to perform at least as well for a 6 month forecast as a model constructed to explicitly forecast 6 months ahead. We infer that the multilevel model has fully captured the linear dynamics (cf. Penland and Magorian, 1993 J. Climate). Finally, while VAR(L) is equivalent to L-level EMR, we show in a 150-year cross-validated assessment that we can increase forecast skill by improving on the EMR initialization procedure. The greatest benefit of this change is in allowing the prediction to make effective use of information over many more months.
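
    A minimal sketch of fitting a VAR(L) model by ordinary least squares and making a one-step forecast. The synthetic two-variable series stands in for the SST predictor fields, and L = 3 is an arbitrary choice for illustration.

    # Least-squares fit of a VAR(L) model on synthetic data, with a one-step forecast.
    import numpy as np

    rng = np.random.default_rng(1)
    T, k, L = 600, 2, 3                      # assumed sample length, variables, lags
    A_true = [np.array([[0.5, 0.1], [0.0, 0.4]]),
              np.array([[0.2, 0.0], [0.1, 0.2]]),
              np.array([[0.1, 0.0], [0.0, 0.1]])]
    y = np.zeros((T, k))
    for t in range(L, T):
        y[t] = sum(A_true[i] @ y[t - 1 - i] for i in range(L)) + 0.1 * rng.standard_normal(k)

    # Stack lagged predictors: y_t = [y_{t-1}, ..., y_{t-L}] @ B + e_t
    X = np.hstack([y[L - 1 - i: T - 1 - i] for i in range(L)])   # shape (T-L, k*L)
    Y = y[L:]                                                    # shape (T-L, k)
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)                    # shape (k*L, k)

    # One-step forecast from the last L observed states.
    x_last = np.hstack([y[T - 1 - i] for i in range(L)])
    print("one-step forecast:", x_last @ B)
    print("true next-step mean:", sum(A_true[i] @ y[T - 1 - i] for i in range(L)))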

  17. Prediction failure of a wolf landscape model

    USGS Publications Warehouse

    Mech, L.D.

    2006-01-01

    I compared 101 wolf (Canis lupus) pack territories formed in Wisconsin during 1993-2004 to the logistic regression predictive model of Mladenoff et al. (1995, 1997, 1999). Of these, 60% were located in areas with putative habitat suitabilities of less than 50%, while much of the area with suitability greater than 50% remained unoccupied by known packs after 24 years of recolonization. This model was a poor predictor of wolf re-colonizing locations in Wisconsin, apparently because it failed to consider the adaptability of wolves. Such models should be used cautiously in wolf-management or restoration plans.

  18. STELLA Experiment: Design and Model Predictions

    SciTech Connect

    Kimura, W. D.; Babzien, M.; Ben-Zvi, I.; Campbell, L. P.; Cline, D. B.; Fiorito, R. B.; Gallardo, J. C.; Gottschalk, S. C.; He, P.; Kusche, K. P.; Liu, Y.; Pantell, R. H.; Pogorelsky, I. V.; Quimby, D. C.; Robinson, K. E.; Rule, D. W.; Sandweiss, J.; Skaritka, J.; van Steenbergen, A.; Steinhauer, L. C.; Yakimenko, V.

    1998-07-05

    The STaged ELectron Laser Acceleration (STELLA) experiment will be one of the first to examine the critical issue of staging the laser acceleration process. The BNL inverse free electron laser (IFEL) will serve as a prebuncher to generate ~1 μm long microbunches. These microbunches will be accelerated by an inverse Cerenkov acceleration (ICA) stage. A comprehensive model of the STELLA experiment is described. This model includes the IFEL prebunching, drift and focusing of the microbunches into the ICA stage, and their subsequent acceleration. The model predictions will be presented, including the results of a system error study to determine the sensitivity to uncertainties in various system parameters.

  19. A predictive model of human performance.

    NASA Technical Reports Server (NTRS)

    Walters, R. F.; Carlson, L. D.

    1971-01-01

    An attempt is made to develop a model describing the overall responses of humans to exercise and environmental stresses for the prediction of exhaustion as a function of an individual's physical characteristics. The principal components of the model are a steady-state description of circulation and a dynamic description of thermal regulation. The circulatory portion of the system accepts changes in work load and oxygen pressure, while the thermal portion is influenced by the external factors of ambient temperature, humidity and air movement, which affect skin blood flow. The operation of the model is discussed and its structural details are given.

  20. Retrospective dosimetry with the MAX/EGS4 exposure model for the radiological accident in Nesvizh-Belarus

    NASA Astrophysics Data System (ADS)

    Santos, A. M.; Kramer, R.; Brayner, C. A.; Khoury, H. J.; Vieira, J. W.

    2007-09-01

    On October 26, 1991, a fatal radiological accident occurred in a 60Co irradiation facility in the town of Nesvizh in Belarus. Following a jam in the product transport system, the operator entered the facility to clear the fault. On entering the irradiation room, the operator bypassed a number of safety features, which prevented him from perceiving that the source rack was in the irradiation position. After the accident, average whole-body absorbed doses between 8 and 16 Gy were determined by TLD measurements, isodose rate distributions, biological dosimetry and ESR measurements of clothes and teeth. In an earlier investigation, the MAX/EGS4 exposure model had been used to calculate absorbed dose distributions for the radiological accident in Yanango/Peru, which actually represented the simulation of exposure from a point source on the surface of the body. After updating the phantom as well as the Monte Carlo code, the MAX/EGS4 exposure model was used to calculate the absorbed dose distribution for the worker involved in the radiological accident in Nesvizh/Belarus. For this purpose, the arms of the MAX phantom had to be raised above the head, and a rectangular 60Co source was designed to represent the source rack used in the irradiation facility. Average organ absorbed doses, depth-absorbed doses, maximum absorbed dose and average whole-body absorbed dose have been calculated and compared with the corresponding data given in the IAEA report of the accident.

  1. Predicting freakish sea state with an operational third-generation wave model

    NASA Astrophysics Data System (ADS)

    Waseda, T.; In, K.; Kiyomatsu, K.; Tamura, H.; Miyazawa, Y.; Iyama, K.

    2014-04-01

    The understanding of freak wave generation mechanisms has advanced and the community has reached a consensus that spectral geometry plays an important role. Numerous marine accident cases were studied and revealed that the narrowing of the directional spectrum is a good indicator of dangerous sea. However, the estimation of the directional spectrum depends on the performance of the third-generation wave model. In this work, a well-studied marine accident case in Japan in 1980 (Onomichi-Maru incident) is revisited and the sea states are hindcasted using both the DIA (discrete interaction approximation) and SRIAM (Simplified Research Institute of Applied Mechanics) nonlinear source terms. The result indicates that the temporal evolution of the basic parameters (directional spreading and frequency bandwidth) agree reasonably well between the two schemes and therefore the most commonly used DIA method is qualitatively sufficient to predict freakish sea state. The analyses revealed that in the case of Onomichi-Maru, a moving gale system caused the spectrum to grow in energy with limited downshifting at the accident's site. This conclusion contradicts the marine inquiry report speculating that the two swell systems crossed at the accident's site. The unimodal wave system grew under strong influence of local wind with a peculiar energy transfer.
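
    A minimal sketch of the two sea-state indicators discussed above (frequency bandwidth and directional spreading) computed from a synthetic directional spectrum. The Pierson-Moskowitz-like shape, the cos^2s spreading function and all parameter values are assumptions for illustration, not the hindcast spectra.

    # Frequency bandwidth and directional spread of a synthetic directional spectrum.
    import numpy as np

    f = np.linspace(0.05, 0.5, 400)                # frequency (Hz)
    theta = np.linspace(-np.pi, np.pi, 361)        # direction (rad)
    fp, s = 0.1, 10.0                              # assumed peak frequency and spreading exponent

    Sf = f ** -5 * np.exp(-1.25 * (fp / f) ** 4)   # Pierson-Moskowitz-like frequency spectrum
    D = np.cos(theta / 2.0) ** (2 * s)             # cos^2s directional distribution
    df, dtheta = f[1] - f[0], theta[1] - theta[0]
    D = D / (np.sum(D) * dtheta)                   # normalize to unit integral

    # Spectral narrowness (frequency bandwidth) parameter nu.
    m0, m1, m2 = (np.sum(f ** n * Sf) * df for n in (0, 1, 2))
    nu = np.sqrt(m0 * m2 / m1 ** 2 - 1.0)

    # Directional spread as the circular standard deviation of D(theta).
    r = abs(np.sum(D * np.exp(1j * theta)) * dtheta)
    sigma_theta = np.degrees(np.sqrt(2.0 * (1.0 - r)))
    print(f"frequency bandwidth nu = {nu:.2f}, directional spread = {sigma_theta:.1f} deg")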

  2. Predicting freakish sea state with an operational third generation wave model

    NASA Astrophysics Data System (ADS)

    Waseda, T.; In, K.; Kiyomatsu, K.; Tamura, H.; Miyazawa, Y.; Iyama, K.

    2013-11-01

    The understanding of freak wave generation mechanisms has advanced and the community has reached a consensus that spectral geometry plays an important role. Numerous marine accident cases were studied and revealed that the narrowing of the directional spectrum is a good indicator of dangerous sea. However, the estimation of the directional spectrum depends on the performance of the third generation wave model. In this work, a well-studied marine accident case in Japan in 1980 (Onomichi-Maru incident) is revisited and the sea states are hindcast using both the DIA and SRIAM nonlinear source terms. The result indicates that the temporal evolution of the basic parameters (directional spreading and frequency bandwidth) agree reasonably well between the two schemes and therefore the most commonly used DIA method is qualitatively sufficient to predict freakish sea state. The analyses revealed that in the case of Onomichi-Maru, a moving gale system caused the spectrum to grow in energy with limited down-shifting at the accident site. This conclusion contradicts the marine inquiry report speculating that the two swell systems crossed at the accident site. The unimodal wave system grew under strong influence of local wind with a peculiar energy transfer.

  3. Modeling operator actions during a small break loss-of-coolant accident in a Babcock and Wilcox nuclear power plant

    SciTech Connect

    Ghan, L.S.; Ortiz, M.G.

    1991-12-31

    A small break loss-of-coolant accident (SBLOCA) in a typical Babcock and Wilcox (B&W) nuclear power plant was modeled using RELAP5/MOD3. This work was performed as part of the United States Nuclear Regulatory Commission's (USNRC) Code, Scaling, Applicability and Uncertainty (CSAU) study. The break was initiated by severing one high pressure injection (HPI) line at the cold leg. Thus, the small break was further aggravated by reduced HPI flow. Comparisons between scoping runs with minimal operator action and with full operator action clearly showed that the operator plays a key role in recovering the plant. Operator actions were modeled based on the emergency operating procedures (EOPs) and the Technical Bases Document for the EOPs. The sequence of operator actions modeled here is only one of several possibilities. Different sequences of operator actions are possible for a given accident because of the subjective decisions the operator must make when determining the status of the plant and, hence, which branch of the EOP to follow. To assess the credibility of the modeled operator actions, these actions and results of the simulated accident scenario were presented to operator examiners who are familiar with B&W nuclear power plants. They agreed that, in general, the modeled operator actions conform to the requirements set forth in the EOPs and are therefore plausible. This paper presents the method for modeling the operator actions and discusses the simulated accident scenario from the viewpoint of operator actions.

  4. Applying hierarchical loglinear models to nonfatal underground coal mine accidents for safety management.

    PubMed

    Onder, Mustafa; Onder, Seyhan; Adiguzel, Erhan

    2014-01-01

    Underground mining is considered to be one of the most dangerous industries and mining remains the most hazardous occupation. Categorical analysis of accident records may present valuable information for preventing accidents. In this study, hierarchical loglinear analysis was applied to occupational injuries that occurred in an underground coal mine. The main factors affecting the accidents were defined as occupation, area, reason, accident time and part of body affected. By considering subfactors of the main factors, multiway contingency tables were prepared and, thus, the probabilities that might affect nonfatal injuries were investigated. At the end of the study, important accident risk factors and job groups with a high probability of being exposed to those risk factors were determined. This article presents important information on decreasing the number of accidents in underground coal mines. PMID:24934420
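
    A minimal sketch of a loglinear analysis of an accident contingency table, fitted as a Poisson GLM (the standard equivalence). The categories and counts below are invented placeholders, not the mine's actual records; interaction terms can be added to build the hierarchical models compared in the paper.

    # Loglinear (Poisson GLM) analysis of a toy accident contingency table.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    data = pd.DataFrame({
        "occupation": ["driller", "driller", "hauler", "hauler"] * 2,
        "area":       ["face", "gallery"] * 4,
        "body_part":  ["hand"] * 4 + ["foot"] * 4,
        "count":      [34, 12, 20, 25, 15, 8, 18, 30],
    })

    # Independence (main-effects-only) model.
    model = smf.glm("count ~ C(occupation) + C(area) + C(body_part)",
                    data=data, family=sm.families.Poisson()).fit()
    print(model.summary().tables[1])
    print("deviance (lack of fit of the independence model):", round(model.deviance, 2))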

  5. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    SciTech Connect

    Evans, J.S. (School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other." The category, "other" cancers, is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
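
    A minimal sketch of the two dose-response forms named above: a Weibull function for early effects and a linear-quadratic excess-risk term for cancers. The parameter values (D50, shape, alpha, beta) are placeholders, not the report's recommended coefficients.

    # Weibull early-effect risk and linear-quadratic excess cancer risk vs. dose.
    import numpy as np

    def weibull_risk(dose, d50, shape):
        """Probability of an early effect at a given acute dose (Gy)."""
        return 1.0 - np.exp(-np.log(2.0) * (dose / d50) ** shape)

    def linear_quadratic_excess_risk(dose, alpha, beta):
        """Excess cancer risk at a given dose (Gy)."""
        return alpha * dose + beta * dose ** 2

    for d in (0.5, 1.0, 2.0, 4.0):
        print(f"dose {d:.1f} Gy: early-effect risk {weibull_risk(d, d50=3.0, shape=5.0):.3f}, "
              f"excess cancer risk {linear_quadratic_excess_risk(d, alpha=0.05, beta=0.01):.3f}")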

  6. Progresses in tritium accident modelling in the frame of IAEA EMRAS II

    SciTech Connect

    Galeriu, D.; Melintescu, A.

    2015-03-15

    The assessment of the environmental impact of tritium release from nuclear facilities is a topic of interest in many countries. In the IAEA's Environmental Modelling for Radiation Safety (EMRAS I) programme, progress was made on routine releases, and in the EMRAS II programme a dedicated working group (WG 7 - Tritium Accidents) focused on potential accidental releases (liquid and atmospheric pathways). The progress achieved in WG 7 was included in a comprehensive report - a technical document of the IAEA covering both liquid and atmospheric accidental release consequences. A brief description of the progress achieved in the frame of EMRAS II WG 7 is presented. Important results have been obtained concerning the washout rate, the deposition of HTO and HT on the soil, the HTO uptake by leaves and the subsequent conversion to OBT (organically bound tritium) during daylight. Further needs for process understanding and experimental efforts are emphasised.

  7. Dynamic modeling of physical phenomena for probabilistic assessment of spent fuel accidents

    SciTech Connect

    Benjamin, A.S.

    1997-11-01

    If there should be an accident involving drainage of all the water from a spent fuel pool, the fuel elements will heat up until the heat produced by radioactive decay is balanced by that removed by natural convection to air, thermal radiation, and other means. If the temperatures become high enough for the cladding or other materials to ignite due to rapid oxidation, then some of the fuel might melt, leading to an undesirable release of radioactive materials. The amount of melting is dependent upon the fuel loading configuration and its age, the oxidation and melting characteristics of the materials, and the potential effectiveness of recovery actions. The authors have developed methods for modeling the pertinent physical phenomena and integrating the results with a probabilistic treatment of the uncertainty distributions. The net result is a set of complementary cumulative distribution functions for the amount of fuel melted.

  8. Predicting Consequences of Technological Disasters from Natural Hazard Events: Challenges and Opportunities Associated with Industrial Accident Data Sources

    NASA Astrophysics Data System (ADS)

    Wood, M.

    2009-04-01

    The increased focus on the possibility of technological accidents caused by natural events (Natech) is foreseen to continue for years to come. As a result, experts in prevention, mitigation and preparation activities associated with natural events will increasingly need to borrow data and expertise traditionally associated with the technological fields to carry out their work. An important question is how useful these data are for understanding the consequences of such Natech events. Data and case studies provided on major industrial accidents tend to focus on lessons learned for re-engineering the process. While consequence data are reported at least nominally in most reports, their precision, quality and completeness are often lacking. Consequence details that are often or sometimes available but not reported include the severity and type of injuries, the distance of victims from the source, exposure measurements, the volume of the release, the population in potentially affected zones, and weather conditions. Yet these are precisely the types of data that will aid natural hazard experts in land-use planning and emergency response activities when a Natech event may be foreseen. This work discusses the results of a study of consequence data from accidents involving toxic releases reported in the EU's MARS accident database. The study analysed the precision, quality and completeness of three categories of consequence data reported: the description of health effects, consequence assessment and chemical risk assessment factors, and emergency response information. The findings from this study are reported, together with a discussion of how natural hazard experts might interact with industrial accident experts to promote more consistent and accurate reporting of the data that will be useful in consequence-based activities.

  9. Urban daytime traffic noise prediction models.

    PubMed

    da Paz, Elaine Carvalho; Zannin, Paulo Henrique Trombetta

    2010-04-01

    An evaluation was made of the acoustic environment generated by an urban highway using in situ measurements. Based on the data collected, a mathematical model was designed for the main sound levels (L_eq, L_10, L_50, and L_90) as a function of the correlation between sound levels and between the equivalent sound pressure level and traffic variables. Four valid groups of mathematical models were generated to calculate daytime sound levels, which were statistically validated. It was found that the new models can be considered as accurate as other models presented in the literature to assess and predict daytime traffic noise, and that they stand out and differ from the existing models described in the literature thanks to two characteristics, namely, their linearity and the application of class intervals. PMID:19353296
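
    A minimal sketch of regressing an equivalent sound level on traffic variables by ordinary least squares. The data points and the chosen predictors (the logarithm of flow and the heavy-vehicle share) are illustrative assumptions, not the paper's calibrated models.

    # Ordinary least-squares regression of L_eq on traffic variables.
    import numpy as np

    flow = np.array([800, 1200, 1500, 2000, 2600, 3100], dtype=float)   # veh/h (hypothetical)
    heavy = np.array([0.05, 0.08, 0.10, 0.12, 0.15, 0.18])              # heavy-vehicle fraction
    leq = np.array([66.5, 68.2, 69.0, 70.4, 71.6, 72.5])                # measured L_eq, dB(A)

    X = np.column_stack([np.ones_like(flow), 10.0 * np.log10(flow), heavy])
    coef, *_ = np.linalg.lstsq(X, leq, rcond=None)
    rmse = np.sqrt(np.mean((X @ coef - leq) ** 2))
    print("coefficients (intercept, 10*log10(flow), heavy share):", np.round(coef, 2))
    print(f"in-sample rms error: {rmse:.2f} dB(A)")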

  10. Validated predictive modelling of the environmental resistome

    PubMed Central

    Amos, Gregory CA; Gozzard, Emma; Carter, Charlotte E; Mead, Andrew; Bowes, Mike J; Hawkey, Peter M; Zhang, Lihong; Singer, Andrew C; Gaze, William H; Wellington, Elizabeth M H

    2015-01-01

    Multi-drug-resistant bacteria pose a significant threat to public health. The role of the environment in the overall rise in antibiotic-resistant infections and risk to humans is largely unknown. This study aimed to evaluate drivers of antibiotic-resistance levels across the River Thames catchment, model key biotic, spatial and chemical variables and produce predictive models for future risk assessment. Sediment samples from 13 sites across the River Thames basin were taken at four time points across 2011 and 2012. Samples were analysed for class 1 integron prevalence and enumeration of third-generation cephalosporin-resistant bacteria. Class 1 integron prevalence was validated as a molecular marker of antibiotic resistance; levels of resistance showed significant geospatial and temporal variation. The main explanatory variables of resistance levels at each sample site were the number, proximity, size and type of surrounding wastewater-treatment plants. Model 1 revealed treatment plants accounted for 49.5% of the variance in resistance levels. Other contributing factors were extent of different surrounding land cover types (for example, Neutral Grassland), temporal patterns and prior rainfall; when modelling all variables the resulting model (Model 2) could explain 82.9% of variations in resistance levels in the whole catchment. Chemical analyses correlated with key indicators of treatment plant effluent and a model (Model 3) was generated based on water quality parameters (contaminant and macro- and micro-nutrient levels). Model 2 was beta tested on independent sites and explained over 78% of the variation in integron prevalence showing a significant predictive ability. We believe all models in this study are highly useful tools for informing and prioritising mitigation strategies to reduce the environmental resistome. PMID:25679532