Science.gov

Sample records for prediction project modificado

  1. Prediction in projection.

    PubMed

    Garland, Joshua; Bradley, Elizabeth

    2015-12-01

    Prediction models that capture and use the structure of state-space dynamics can be very effective. In practice, however, one rarely has access to full information about that structure, and accurate reconstruction of the dynamics from scalar time-series data (e.g., via delay-coordinate embedding) can be a real challenge. In this paper, we show that forecast models that employ incomplete reconstructions of the dynamics (i.e., models that are not necessarily true embeddings) can produce surprisingly accurate predictions of the state of a dynamical system. In particular, we demonstrate the effectiveness of a simple near-neighbor forecast technique that works with a two-dimensional time-delay reconstruction of both low- and high-dimensional dynamical systems. Even though correctness of the topology may not be guaranteed for incomplete reconstructions like this, the dynamical structure that they do capture allows for accurate predictions, in many cases even more accurate than predictions generated using a traditional embedding. This could be very useful in the context of real-time forecasting, where the human effort required to produce a correct delay-coordinate embedding is prohibitive. PMID:26723147
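The near-neighbor scheme the abstract describes can be sketched in a few lines: build the two-dimensional delay-coordinate reconstruction, find the nearest reconstructed neighbors of the current state, and average their observed successors. The function below is an illustrative sketch of that idea, not the authors' actual implementation; the parameter defaults (tau, m, k) are arbitrary choices for the example.

```python
import numpy as np

def delay_reconstruct(x, tau=1, m=2):
    # m-dimensional delay-coordinate reconstruction of a scalar series x:
    # row i is (x[i], x[i+tau], ..., x[i+(m-1)*tau])
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def near_neighbor_forecast(x, tau=1, m=2, k=3):
    # Predict the next scalar value by averaging the observed successors
    # of the k nearest neighbors of the current point in reconstruction space.
    pts = delay_reconstruct(np.asarray(x, dtype=float), tau, m)
    query = pts[-1]
    cand = pts[:-1]                      # the last point has no successor yet
    d = np.linalg.norm(cand - query, axis=1)
    idx = np.argsort(d)[:k]
    # the successor of reconstruction point i is x[i + (m-1)*tau + 1]
    succ = np.asarray(x)[idx + (m - 1) * tau + 1]
    return succ.mean()
```

For a smooth signal such as a sampled sine wave, the two-dimensional reconstruction is faithful enough that the averaged successors land close to the true next value.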

  2. The Experimental MJO Prediction Project

    NASA Technical Reports Server (NTRS)

    Waliser, Duane; Weickmann, Klaus; Dole, Randall; Schubert, Siegfried; Alves, Oscar; Jones, Charles; Newman, Matthew; Pan, Hua-Lu; Roubicek, Andres; Saha, Suranjana; Smith, Cathy; VanDenDool, Huug; Vitart, Frederic; Wheeler, Matthew; Whitaker, Jeffrey

    2006-01-01

    Weather prediction is typically concerned with lead times of hours to days, while seasonal-to-interannual climate prediction is concerned with lead times of months to seasons. Recently, there has been growing interest in 'subseasonal' forecasts, those that have lead times on the order of weeks (e.g., Schubert et al. 2002; Waliser et al. 2003; Waliser et al. 2005). The basis for developing and exploiting subseasonal predictions largely resides with phenomena such as the Pacific North American (PNA) pattern, the North Atlantic Oscillation (NAO), the Madden-Julian Oscillation (MJO), mid-latitude blocking, and the memory associated with soil moisture, as well as modeling techniques that rely on both initial conditions and slowly varying boundary conditions (e.g., tropical Pacific SST). An outgrowth of this interest has been the development of an Experimental MJO Prediction Project (EMPP). The project provides real-time weather and climate information and predictions for a variety of applications, broadly encompassing the subseasonal weather-climate connection. The focus is on the MJO because it represents a repeatable, low-frequency phenomenon. The MJO's importance among the subseasonal phenomena is very similar to that of the El Nino-Southern Oscillation (ENSO) among the interannual phenomena. This note describes the history and objectives of EMPP, as well as its status, capabilities, and plans.

  3. Decadal climate prediction (project GCEP).

    PubMed

    Haines, Keith; Hermanson, Leon; Liu, Chunlei; Putt, Debbie; Sutton, Rowan; Iwi, Alan; Smith, Doug

    2009-03-13

    Decadal prediction uses climate models forced by changing greenhouse gases, as in Intergovernmental Panel on Climate Change (IPCC) projections, but unlike longer-range predictions they also require initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. The eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability. PMID:19087944

  4. The Predictive Validity of Projective Measures.

    ERIC Educational Resources Information Center

    Suinn, Richard M.; Oskamp, Stuart

    Written for use by clinical practitioners as well as psychological researchers, this book surveys recent literature (1950-1965) on projective test validity by reviewing and critically evaluating studies which shed light on what may reliably be predicted from projective test results. Two major instruments are covered: the Rorschach and the Thematic…

  5. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    PubMed Central

    Mansouri, Kamel; Abdelaziz, Ahmed; Rybacka, Aleksandra; Roncaglioni, Alessandra; Tropsha, Alexander; Varnek, Alexandre; Zakharov, Alexey; Worth, Andrew; Richard, Ann M.; Grulke, Christopher M.; Trisciuzzi, Daniela; Fourches, Denis; Horvath, Dragos; Benfenati, Emilio; Muratov, Eugene; Wedebye, Eva Bay; Grisoni, Francesca; Mangiatordi, Giuseppe F.; Incisivo, Giuseppina M.; Hong, Huixiao; Ng, Hui W.; Tetko, Igor V.; Balabin, Ilya; Kancherla, Jayaram; Shen, Jie; Burton, Julien; Nicklaus, Marc; Cassotti, Matteo; Nikolov, Nikolai G.; Nicolotti, Orazio; Andersson, Patrik L.; Zang, Qingda; Politi, Regina; Beger, Richard D.; Todeschini, Roberto; Huang, Ruili; Farag, Sherif; Rosenberg, Sine A.; Slavov, Svetoslav; Hu, Xin; Judson, Richard S.

    2016-01-01

    Background: Humans are exposed to thousands of man-made chemicals in the environment. Some chemicals mimic natural endocrine hormones and, thus, have the potential to be endocrine disruptors. Most of these chemicals have never been tested for their ability to interact with the estrogen receptor (ER). Risk assessors need tools to prioritize chemicals for evaluation in costly in vivo tests, for instance, within the U.S. EPA Endocrine Disruptor Screening Program. Objectives: We describe a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) and demonstrate the efficacy of using predictive computational models trained on high-throughput screening data to evaluate thousands of chemicals for ER-related activity and prioritize them for further testing. Methods: CERAPP combined multiple models developed in collaboration with 17 groups in the United States and Europe to predict ER activity of a common set of 32,464 chemical structures. Quantitative structure–activity relationship models and docking approaches were employed, mostly using a common training set of 1,677 chemical structures provided by the U.S. EPA, to build a total of 40 categorical and 8 continuous models for binding, agonist, and antagonist ER activity. All predictions were evaluated on a set of 7,522 chemicals curated from the literature. To overcome the limitations of single models, a consensus was built by weighting models on scores based on their evaluated accuracies. Results: Individual model scores ranged from 0.69 to 0.85, showing high prediction reliabilities. Out of the 32,464 chemicals, the consensus model predicted 4,001 chemicals (12.3%) as high priority actives and 6,742 potential actives (20.8%) to be considered for further testing. Conclusion: This project demonstrated the possibility to screen large libraries of chemicals using a consensus of different in silico approaches. This concept will be applied in future projects related to other
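The accuracy-weighted consensus step described above can be illustrated with a toy calculation. The weighting below (normalized accuracy scores used as voting weights over binary activity calls) is a simplified stand-in for CERAPP's actual scheme, and the accuracy values and per-chemical predictions are invented for the example:

```python
import numpy as np

# Hypothetical evaluated accuracy scores for three classifiers
# (illustrative values, not CERAPP's actual model scores)
accuracies = np.array([0.85, 0.78, 0.70])

# Binary active(1)/inactive(0) calls; rows are models, columns are chemicals
predictions = np.array([
    [1, 0, 1, 1],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
])

# Normalize accuracies into voting weights, then take a weighted vote
weights = accuracies / accuracies.sum()
consensus = weights @ predictions        # consensus score in [0, 1] per chemical
calls = (consensus >= 0.5).astype(int)   # final consensus classification
```

Weighting by evaluated accuracy lets better-performing models dominate the vote while still letting the ensemble outvote any single model's mistakes.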

  6. Projections and predictability of Arctic shipping accessibility

    NASA Astrophysics Data System (ADS)

    Melia, Nathanael; Haines, Keith; Hawkins, Ed

    2016-04-01

    The observed reduction in Arctic sea ice opens up the potential for shorter shipping routes across the Arctic Ocean, leading to potentially significant global economic savings. We demonstrate, using bias-corrected global climate models, that the projected sea ice melt through the 21st century increases opportunities for ships to sail through the Arctic between North Atlantic and East Asian ports. Transit potential for Open Water vessels doubles from early to mid-century and coincides with the opening of the trans-polar sea route. Routes remain seasonal, but they become more reliable as the overall shipping season lengthens, albeit with considerable year-to-year variability. We also demonstrate that there is potential predictability, from a few months ahead, of whether a particular season will be relatively open or closed to shipping access.

  7. The Vog Measurement and Prediction (VMAP) Project

    NASA Astrophysics Data System (ADS)

    Businger, S.; Huff, R.; Sutton, A. J.; Elias, T.; Horton, K. A.

    2011-12-01

    Emissions from Kilauea volcano pose significant environmental and health risks to Hawaii. The overarching goal of this feasibility project is to develop an accurate and timely volcanic air-pollution forecasting capacity, with a program of verification using state-of-the-art observation methods. To date VMAP has (i) created a real-time modeling and forecast capability using the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model to predict the concentration and dispersion of SO2 gas and sulfate aerosol from Kilauea volcano (Fig. 1); HYSPLIT uses the output of a high-resolution operational run of the Weather Research and Forecast (WRF) model for initial and boundary conditions; (ii) developed an operational spectrometer-based SO2 emission rate monitor for use as input to the dispersion model; (iii) cooperatively deployed an array of stationary SO2 gas and sulfate aerosol sensors to record the ground-level spatial characteristics of Kilauea's gas plume at high temporal and spatial resolution, for verification and improvement of the gas dispersion prediction; (iv) developed a series of web pages to disseminate observations and forecasts, which can be used by safety officials to protect the public and to raise public awareness of the hazards of volcanic emissions to respiratory health, agriculture, and general aviation; and (v) developed an archive of vog data to facilitate estimation of historical concentration frequency-of-exposure. VMAP provides technical support for researchers, health professionals, and our stakeholders, who have also provided constructive input in the development of our products. Preliminary results of our efforts will be presented and future work will be discussed.

  8. Earth Orientation Parameters Combination of Prediction Pilot Project

    NASA Astrophysics Data System (ADS)

    Shumate, N. A.; Luzum, B. J.; Kosek, W.

    2013-12-01

    The International Earth Rotation and Reference Systems Service (IERS) has been producing ensemble predictions of Earth Orientation Parameters (EOPs) on a daily basis as part of its Earth Orientation Parameters Combination of Prediction Pilot Project (EOPCPPP). By combining EOP predictions originating from a variety of different algorithms and initial conditions, the resulting ensemble predictions are expected to be more accurate and robust than any individual contribution. Since 2010, the Pilot Project has been collecting predictions of polar motion and UT1-UTC contributed by several organizations, and is currently combining seventeen different sets of EOP predictions on a daily basis with a simple arithmetic mean. This poster presents an analysis of the project comparing the EOPCPPP ensemble predictions and individual contributors' predictions as measured against the EOP 08 C04 series. Other informational diagnostics produced by the project to aid contributors and users will also be provided.
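The combination strategy named in the abstract, a simple arithmetic mean across contributed prediction series, is straightforward to reproduce. The numbers below are hypothetical polar-motion values invented for illustration, not real IERS contributions or EOP 08 C04 data:

```python
import numpy as np

# Hypothetical 5-day-ahead polar-motion predictions (milliarcseconds)
# from three contributors; rows are contributors, columns are lead days
contrib = np.array([
    [120.1, 121.0, 122.2, 123.0, 123.9],
    [119.8, 120.7, 121.9, 123.2, 124.5],
    [120.4, 121.3, 122.0, 122.8, 123.6],
])

# Simple arithmetic mean across contributors, per lead day
ensemble = contrib.mean(axis=0)

# Skill assessment against a (hypothetical) reference series
observed = np.array([120.0, 121.1, 122.0, 123.1, 124.2])
mae = np.abs(ensemble - observed).mean()
```

Averaging tends to cancel the independent errors of the individual contributors, which is the reason an ensemble mean is expected to be more accurate and robust than any single contribution.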

  9. Are Psychotherapeutic Changes Predictable? Comparison of a Chicago Counseling Center Project with a Penn Psychotherapy Project.

    ERIC Educational Resources Information Center

    Luborsky, Lester; And Others

    1979-01-01

    Compared studies predicting outcomes of psychotherapy. Level of prediction success in both projects was modest. Particularly for the rated benefits score, the profile of variables showed similar levels of success between the projects. Successful predictions were based on adequacy of personality functioning, match on marital status, and length of…

  10. EVA Robotic Assistant Project: Platform Attitude Prediction

    NASA Technical Reports Server (NTRS)

    Nickels, Kevin M.

    2003-01-01

    The Robotic Systems Technology Branch is currently working on the development of an EVA Robotic Assistant under the sponsorship of the Surface Systems Thrust of the NASA Cross Enterprise Technology Development Program (CETDP). This will be a mobile robot that can follow a field geologist during planetary surface exploration, carry his tools and the samples that he collects, and provide video coverage of his activity. Prior experiments have shown that for such a robot to be useful it must be able to follow the geologist at walking speed over any terrain of interest. Geologically interesting terrain tends to be rough rather than smooth. The commercial mobile robot that was recently purchased as an initial testbed for the EVA Robotic Assistant Project, an ATRV Jr., is capable of faster than walking speed outside but it has no suspension. Its wheels with inflated rubber tires are attached to axles that are connected directly to the robot body. Any angular motion of the robot produced by driving over rough terrain will directly affect the pointing of the on-board stereo cameras. The resulting image motion is expected to make tracking of the geologist more difficult. This will either require the tracker to search a larger part of the image to find the target from frame to frame or to search mechanically in pan and tilt whenever the image motion is large enough to put the target outside the image in the next frame. This project consists of the design and implementation of a Kalman filter that combines the output of the angular rate sensors and linear accelerometers on the robot to estimate the motion of the robot base. The motion of the stereo camera pair mounted on the robot that results from this motion as the robot drives over rough terrain is then straightforward to compute. The estimates may then be used, for example, to command the robot's on-board pan-tilt unit to compensate for the camera motion induced by the base movement. This has been accomplished in two ways.
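A minimal one-axis version of such a sensor-fusion filter can be sketched as follows: the gyro rate drives the state propagation, a slowly varying gyro bias is co-estimated, and the accelerometer-derived tilt angle serves as the measurement. This is a standard textbook construction under assumed noise parameters q and r, not the project's actual filter:

```python
import numpy as np

def kalman_pitch(gyro_rates, accel_angles, dt=0.02, q=1e-4, r=1e-2):
    """One-axis attitude estimate: integrate the gyro rate, co-estimate the
    gyro bias, and correct drift with accelerometer-derived tilt angles."""
    angle, bias = 0.0, 0.0                      # state: [angle, gyro bias]
    P = np.eye(2)                               # state covariance
    F = np.array([[1.0, -dt], [0.0, 1.0]])      # state transition
    Q = q * np.eye(2)                           # process noise
    H = np.array([[1.0, 0.0]])                  # we measure the angle only
    out = []
    for w, z in zip(gyro_rates, accel_angles):
        # predict: propagate the angle with the bias-corrected gyro rate
        angle += dt * (w - bias)
        x = np.array([angle, bias])
        P = F @ P @ F.T + Q
        # update: correct with the accelerometer tilt measurement z
        y = z - x[0]                            # innovation
        S = (P @ H.T)[0, 0] + r                 # innovation variance
        K = (P @ H.T)[:, 0] / S                 # Kalman gain
        x = x + K * y
        P = (np.eye(2) - np.outer(K, H[0])) @ P
        angle, bias = x
        out.append(angle)
    return np.array(out)
```

Given the fused attitude estimate, the camera motion follows from the known mounting geometry, and the pan-tilt unit can be commanded to counter it.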

  11. Water Erosion Prediction Project (WEPP) model status and updates

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This presentation will provide current information on the USDA-ARS Water Erosion Prediction Project (WEPP) model, and its implementation by the USDA-Forest Service (FS), USDA-Natural Resources Conservation Service (NRCS), and other agencies and universities. Most recently, the USDA-NRCS has begun ef...

  12. Factors That Predict Success in an Early Literacy Intervention Project.

    ERIC Educational Resources Information Center

    Leslie, Lauren; Allen, Linda

    1999-01-01

    Examines the effectiveness of an early literacy intervention project for inner-city children in grades 1-4. Discusses the factors that predicted reading growth as the number of rime patterns taught, story grammar instruction, the number of words the child read at home, and parent involvement in recreational reading. Considers results and…

  13. GEWEX Americas Prediction Project (GAPP) Science and Implementation Plan

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose of this Science and Implementation Plan is to describe GAPP science objectives and the activities required to meet these objectives, both specifically for the near-term and more generally for the longer-term. The GEWEX Americas Prediction Project (GAPP) is part of the Global Energy and Water Cycle Experiment (GEWEX) initiative that is aimed at observing, understanding and modeling the hydrological cycle and energy fluxes at various time and spatial scales. The mission of GAPP is to demonstrate skill in predicting changes in water resources over intraseasonal-to-interannual time scales, as an integral part of the climate system.

  14. Predictions of Chemical Weather in Asia: The EU Panda Project

    NASA Astrophysics Data System (ADS)

    Brasseur, G. P.; Petersen, A. K.; Wang, X.; Granier, C.; Bouarar, I.

    2014-12-01

    Air quality has become a pressing problem in Asia and specifically in China due to rapid economic development (i.e., rapidly expanding motor vehicle fleets, growing industrial and power generation activities, domestic and biomass burning). In spite of efforts to reduce chemical emissions, high levels of particulate matter and ozone are observed and lead to severe health problems with a large number of premature deaths. To support efforts to reduce air pollution, the European Union is supporting the PANDA project, whose objective is to use space and surface observations of chemical species as well as advanced meteorological and chemical models to analyze and predict air quality in China. The project involves 7 European and 7 Chinese groups. The paper will describe the objectives of the project and present its first accomplishments. The project focuses on the improvement of methods for monitoring air quality from combined space and in-situ observations, the development of a comprehensive prediction system that makes use of these observations, the elaboration of indicators for air quality in support of policies, and the development of toolboxes for the dissemination of information.

  15. Predicting future uncertainty constraints on global warming projections.

    PubMed

    Shiogama, H; Stone, D; Emori, S; Takahashi, K; Mori, S; Maeda, A; Ishizaki, Y; Allen, M R

    2016-01-11

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  16. Predicting future uncertainty constraints on global warming projections

    NASA Astrophysics Data System (ADS)

    Shiogama, H.; Stone, D.; Emori, S.; Takahashi, K.; Mori, S.; Maeda, A.; Ishizaki, Y.; Allen, M. R.

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  17. Predicting future uncertainty constraints on global warming projections.

    PubMed

    Shiogama, H; Stone, D; Emori, S; Takahashi, K; Mori, S; Maeda, A; Ishizaki, Y; Allen, M R

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change. PMID:26750491

  18. Prediction of climate variability and projection of climate change

    SciTech Connect

    Grassl, H.

    1996-12-31

    The years since 1985 have seen rapid progress in climate research. Through the implementation of a new observing system in the Tropical Pacific Ocean, combined with the development of adapted coupled ocean-atmosphere models, the Tropical Ocean-Global Atmosphere (TOGA) project of the World Climate Research Programme (WCRP) led to the breakthrough to physically-based climate predictions. For most of the tropics, and partly extending to mid-latitudes, climate anomalies can now be predicted for the next season and in some places even for the next year. On the other hand, global coupled ocean-atmosphere-land models have recently approached natural climate variability on time-scales of up to several decades to such an extent that these models, partly validated with data from the past, became useful for answering the following two questions: Has mankind already changed global climate? Is anthropogenic global climate change, in the coming century, surmounting at least all variability observed during the last 10,000 years? Both questions are answered with yes. For the first question, the observed patterns of warming and cooling with respect to geographical, seasonal and vertical dependence can only be explained by a combined action of global greenhouse gas increase, regional sulfate aerosol load and stratospheric ozone depletion. For the second, even low climate sensitivity and low economic growth will lead, if no measures are taken, to a mean global warming of 1.0 °C, thus surmounting the warmest phase of the Holocene. Implications of these findings for the implementation of the UN Framework Convention on Climate Change will also be discussed.

  19. Transistor roadmap projection using predictive full-band atomistic modeling

    SciTech Connect

    Salmani-Jelodar, M.; Klimeck, G.; Kim, S.; Ng, K.

    2014-08-25

    In this letter, a full band atomistic quantum transport tool is used to predict the performance of double gate metal-oxide-semiconductor field-effect transistors (MOSFETs) over the next 15 years for International Technology Roadmap for Semiconductors (ITRS). As MOSFET channel lengths scale below 20 nm, the number of atoms in the device cross-sections becomes finite. At this scale, quantum mechanical effects play an important role in determining the device characteristics. These quantum effects can be captured with the quantum transport tool. Critical results show the ON-current degradation as a result of geometry scaling, which is in contrast to previous ITRS compact model calculations. Geometric scaling has significant effects on the ON-current by increasing source-to-drain (S/D) tunneling and altering the electronic band structure. By shortening the device gate length from 20 nm to 5.1 nm, the ratio of S/D tunneling current to the overall subthreshold OFF-current increases from 18% to 98%. Despite this ON-current degradation by scaling, the intrinsic device speed is projected to increase at a rate of at least 8% per year as a result of the reduction of the quantum capacitance.

  20. Drought Prediction for Socio-Cultural Stability Project

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, Christa; Eylander, John B.; Koster, Randall; Narapusetty, Balachandrudu; Kumar, Sujay; Rodell, Matt; Bolten, John; Mocko, David; Walker, Gregory; Arsenault, Kristi; Rheingrover, Scott

    2014-01-01

    The primary objective of this project is to answer the question: "Can existing, linked infrastructures be used to predict the onset of drought months in advance?" Based on our work, the answer to this question is "yes" with the qualifiers that skill depends on both lead-time and location, and especially on the associated teleconnections (e.g., ENSO, Indian Ocean Dipole) active in a given region and season. As part of this work, we successfully developed a prototype drought early warning system based on existing/mature NASA Earth science components including the Goddard Earth Observing System Data Assimilation System Version 5 (GEOS-5) forecasting model, the Land Information System (LIS) land data assimilation software framework, the Catchment Land Surface Model (CLSM), remotely sensed terrestrial water storage from the Gravity Recovery and Climate Experiment (GRACE) and remotely sensed soil moisture products from the Aqua/Advanced Microwave Scanning Radiometer - EOS (AMSR-E). We focused on a single drought year - 2011 - during which major agricultural droughts occurred with devastating impacts in the Texas-Mexico region of North America (TEXMEX) and the Horn of Africa (HOA). Our results demonstrate that GEOS-5 precipitation forecasts show skill globally at 1-month lead, and can show up to 3 months skill regionally in the TEXMEX and HOA areas. Our results also demonstrate that the CLSM soil moisture percentiles are a good indicator of drought, as compared to the North American Drought Monitor over TEXMEX and a combination of Famine Early Warning Systems Network (FEWS NET) data and Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) anomalies over HOA. The data assimilation experiments produced mixed results. GRACE terrestrial water storage (TWS) assimilation was found to significantly improve soil moisture and evapotranspiration, as well as drought monitoring via soil moisture percentiles, while AMSR-E soil moisture

  1. Project Evaluation: Validation of a Scale and Analysis of Its Predictive Capacity

    ERIC Educational Resources Information Center

    Fernandes Malaquias, Rodrigo; de Oliveira Malaquias, Fernanda Francielle

    2014-01-01

    The objective of this study was to validate a scale for assessment of academic projects. As a complement, we examined its predictive ability by comparing the scores of advised/corrected projects based on the model and the final scores awarded to the work by an examining panel (approximately 10 months after the project design). Results of…

  2. Genetic programming as alternative for predicting development effort of individual software projects.

    PubMed

    Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R; Meda-Campaña, M E

    2012-01-01

    Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment.

  3. Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects

    PubMed Central

    Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.

    2012-01-01

    Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305

  4. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    ERIC Educational Resources Information Center

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  5. Reflexión bioética sobre el uso de organismos genéticamente modificados

    PubMed Central

    Yunta, Eduardo Rodríguez

    2011-01-01

    This article reflects on the commercial use of genetically modified organisms from the standpoint of the four principles of bioethics. It fundamentally questions the lack of technology transfer between the developed and developing worlds, and the fact that the current system for patenting modified living organisms promotes commercial interests without giving due importance to the sustainable development of agriculture and livestock farming in developing countries, where it is most needed. It also reflects on the importance of evaluating risks before genetically modified organisms are introduced into the market, and on the need for regulation in these countries. PMID:21927675

  6. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project

    PubMed Central

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a 2012 Gartner report, only 75%–80% of IT projects are successful. Customer satisfaction should be considered part of business strategy. The associated project parameters should be proactively managed, and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. The focus should instead be on proactive management and on shifting left in the software life cycle engineering model: identify the problem up front in the project cycle rather than waiting for lessons to be learned and taking reactive steps. This paper shows the practical applicability of predictive models and illustrates their use in a project to predict system testing defects, thus helping to reduce residual defects. PMID:26495427
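
    The "shift left" and defect-leakage ideas above can be made concrete with simple phase-containment arithmetic (all numbers below are hypothetical, not from the paper):

```python
# Illustrative (hypothetical) numbers: defects injected in each phase and
# the fraction caught by that phase's reviews/tests. Whatever escapes
# "leaks" downstream; the residue reaching system testing is what the
# paper's predictive models try to anticipate early.
phases = [
    # (phase, defects injected, phase containment effectiveness)
    ("requirements", 40, 0.60),
    ("design",       60, 0.55),
    ("coding",      120, 0.70),
]
escaped = 0
for name, injected, containment in phases:
    at_risk = escaped + injected           # carried over + newly injected
    caught = round(at_risk * containment)  # found by this phase's reviews
    escaped = at_risk - caught
    print(f"{name:12s} injected={injected:3d} caught={caught:3d} escaped={escaped:3d}")

print(f"predicted defects reaching system testing: {escaped}")
```

    Improving containment in the earliest phases reduces the final number far more than the same improvement applied late, which is the quantitative argument for shifting left.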

  9. Demonstration of the Water Erosion Prediction Project (WEPP) internet interface and services

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) model is a process-based FORTRAN computer simulation program for prediction of runoff and soil erosion by water at hillslope profile, field, and small watershed scales. To effectively run the WEPP model and interpret results additional software has been de...

  10. An Overview of the MATERHORN Fog Project: Observations and Predictability

    NASA Astrophysics Data System (ADS)

    Gultepe, I.; Fernando, H. J. S.; Pardyjak, E. R.; Hoch, S. W.; Silver, Z.; Creegan, E.; Leo, L. S.; Pu, Zhaoxia; De Wekker, S. F. J.; Hang, Chaoxun

    2016-09-01

    Temperature profiles suggested that an inversion layer contributed significantly to IF formation at Heber. Ice fog forecasts via the Weather Research and Forecasting (WRF) model indicated the limitations of IF predictability. Results suggest that IF predictions need to be improved based on ice microphysical parameterizations and ice nucleation processes.

  12. A Global Perspective: NASA's Prediction of Worldwide Energy Resources (POWER) Project

    NASA Technical Reports Server (NTRS)

    Zhang, Taiping; Stackhouse, Paul W., Jr.; Chandler, William S.; Hoell, James M.; Westberg, David; Whitlock, Charles H.

    2007-01-01

    The Prediction of the Worldwide Energy Resources (POWER) Project, initiated under the NASA Science Mission Directorate Applied Science Energy Management Program, synthesizes and analyzes data on a global scale that are invaluable to the renewable energy industries, especially to the solar and wind energy sectors. The POWER project derives its data primarily from NASA's World Climate Research Programme (WCRP)/Global Energy and Water cycle Experiment (GEWEX) Surface Radiation Budget (SRB) project (Version 2.9) and the Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS) assimilation model (Version 4). The latest development of the NASA POWER Project and its plans for the future are presented in this paper.

  13. The NASA Seasonal-to-Interannual Prediction Project (NSIPP). [Annual Report for 2000

    NASA Technical Reports Server (NTRS)

    Rienecker, Michele; Suarez, Max; Adamec, David; Koster, Randal; Schubert, Siegfried; Hansen, James; Koblinsky, Chester (Technical Monitor)

    2001-01-01

    The goal of the project is to develop an assimilation and forecast system based on a coupled atmosphere-ocean-land-surface-sea-ice model capable of using a combination of satellite and in situ data sources to improve the prediction of ENSO and other major S-I signals and their global teleconnections. The objectives of this annual report are to: (1) demonstrate the utility of satellite data, especially surface height, surface winds, air-sea fluxes, and soil moisture, in a coupled model prediction system; and (2) aid in the design of the observing system for short-term climate prediction by conducting OSSEs and predictability studies.

  14. Inroads to Predict in Vivo Toxicology—An Introduction to the eTOX Project

    PubMed Central

    Briggs, Katharine; Cases, Montserrat; Heard, David J.; Pastor, Manuel; Pognan, François; Sanz, Ferran; Schwab, Christof H.; Steger-Hartmann, Thomas; Sutter, Andreas; Watson, David K.; Wichard, Jörg D.

    2012-01-01

    There is a widespread awareness that the wealth of preclinical toxicity data that the pharmaceutical industry has generated in recent decades is not exploited as efficiently as it could be. Enhanced data availability for compound comparison (“read-across”), or for data mining to build predictive tools, should lead to a more efficient drug development process and contribute to the reduction of animal use (3Rs principle). In order to achieve these goals, a consortium approach, grouping numbers of relevant partners, is required. The eTOX (“electronic toxicity”) consortium represents such a project and is a public-private partnership within the framework of the European Innovative Medicines Initiative (IMI). The project aims at the development of in silico prediction systems for organ and in vivo toxicity. The backbone of the project will be a database consisting of preclinical toxicity data for drug compounds or candidates extracted from previously unpublished, legacy reports from thirteen European and European operation-based pharmaceutical companies. The database will be enhanced by incorporation of publically available, high quality toxicology data. Seven academic institutes and five small-to-medium size enterprises (SMEs) contribute with their expertise in data gathering, database curation, data mining, chemoinformatics and predictive systems development. The outcome of the project will be a predictive system contributing to early potential hazard identification and risk assessment during the drug development process. The concept and strategy of the eTOX project is described here, together with current achievements and future deliverables. PMID:22489185

  15. A Resource Requirements Prediction Model (RRPM-1): Guide for the Project Manager.

    ERIC Educational Resources Information Center

    Hussain, K. M.

    This report is one in a series written on the Resource Requirements Prediction Model (RRPM-1) developed by the National Center for Higher Education Management Systems (NCHEMS). This particular document is a guide to the project manager, the person responsible for marshaling and coordinating the resources necessary to implement RRPM-1. The Guide…

  16. International H2O Project (IHOP) 2002: Datasets Related to Atmospheric Moisture and Rainfall Prediction

    DOE Data Explorer

    Schanot, Allen [IHOP 2002 PI]; Friesen, Dick [IHOP 2002 PI]

    IHOP 2002 was a field experiment that took place over the Southern Great Plains of the United States from 13 May to 25 June 2002. The chief aim of IHOP_2002 was improved characterization of the four-dimensional (4-D) distribution of water vapor and its application to improving the understanding and prediction of convection. The region was an optimal location due to existing experimental and operational facilities, strong variability in moisture, and active convection [copied from http://www.eol.ucar.edu/projects/ihop/]. The project's master list of data identifies 146 publicly accessible datasets.

  17. A GLOBAL ASSESSMENT OF SOLAR ENERGY RESOURCES: NASA's Prediction of Worldwide Energy Resources (POWER) Project

    NASA Astrophysics Data System (ADS)

    Zhang, T.; Stackhouse, P. W., Jr.; Chandler, W.; Hoell, J. M.; Westberg, D.; Whitlock, C. H.

    2010-12-01

    NASA's POWER project, or the Prediction of the Worldwide Energy Resources project, synthesizes and analyzes data on a global scale. The products of the project find valuable applications in the solar and wind energy sectors of the renewable energy industries. The primary source data for the POWER project are NASA's World Climate Research Programme (WCRP)/Global Energy and Water cycle Experiment (GEWEX) Surface Radiation Budget (SRB) project (Release 3.0) and the Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS) assimilation model (V 4.0.3). Users of the POWER products access the data through NASA's Surface meteorology and Solar Energy (SSE, Version 6.0) website (http://power.larc.nasa.gov). Over 200 parameters are available to the users. The spatial resolution is currently 1 degree by 1 degree, with finer resolutions planned. The data cover July 1983 to December 2007, a time span of 24.5 years, and are provided as 3-hourly, daily, and monthly means. As of now, there have been over 18 million web hits and over 4 million data file downloads. The POWER products have been systematically validated against ground-based measurements, in particular data from the Baseline Surface Radiation Network (BSRN) archive, and also against the National Solar Radiation Data Base (NSRDB). Parameters such as minimum, maximum, and daily mean temperature and dew points, relative humidity, and surface pressure are validated against National Climatic Data Center (NCDC) data. SSE feeds data directly into Decision Support Systems, including the RETScreen International clean energy project analysis software, which is written in 36 languages and has more than 260,000 users worldwide.
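
    The validation mentioned above (against BSRN and NSRDB measurements) reduces to standard agreement statistics. A toy sketch with synthetic daily irradiance values (illustrative numbers, not POWER results):

```python
import numpy as np

# Synthetic comparison of satellite-derived daily solar irradiance with
# ground measurements; the bias and scatter below are invented.
rng = np.random.default_rng(1)
ground = rng.uniform(2.0, 8.0, 365)              # kWh/m^2/day, one year
satellite = ground + rng.normal(0.1, 0.4, 365)   # small bias + scatter

diff = satellite - ground
bias = np.mean(diff)                             # mean error
rmse = np.sqrt(np.mean(diff ** 2))               # root-mean-square error
rel_bias = 100 * bias / np.mean(ground)          # bias as % of mean
print(f"bias={bias:.2f}  RMSE={rmse:.2f}  relative bias={rel_bias:.1f}%")
```

    Published validations typically report exactly these quantities, often stratified by site, season, and sky condition.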

  18. Integrative Modeling Strategies for Predicting Drug Toxicities at the eTOX Project.

    PubMed

    Sanz, Ferran; Carrió, Pau; López, Oriol; Capoferri, Luigi; Kooi, Derk P; Vermeulen, Nico P E; Geerke, Daan P; Montanari, Floriane; Ecker, Gerhard F; Schwab, Christof H; Kleinöder, Thomas; Magdziarz, Tomasz; Pastor, Manuel

    2015-06-01

    Early prediction of safety issues in drug development is at the same time highly desirable and highly challenging. Recent advances emphasize the importance of understanding the whole chain of causal events leading to observable toxic outcomes. Here we describe an integrative modeling strategy based on these ideas that guided the design of eTOXsys, the prediction system used by the eTOX project. Essentially, eTOXsys consists of a central server that marshals requests to a collection of independent prediction models and offers a single user interface to the whole system. Each such model lives in a self-contained virtual machine that is easy to maintain and install. All models produce toxicity-relevant predictions on their own, but the results of some can be further integrated to upgrade their scale, yielding in vivo toxicity predictions. Technical aspects related to model implementation, maintenance, and documentation are also discussed here. Finally, the kinds of models currently implemented in eTOXsys are illustrated by presenting three example models making use of diverse methodology (3D-QSAR and decision trees, Molecular Dynamics simulations and Linear Interaction Energy theory, and fingerprint-based QSAR).
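
    The marshalling architecture described above can be sketched as a simple registry/dispatcher (class names, endpoints, and scoring functions below are hypothetical stand-ins, not the eTOXsys API):

```python
from typing import Callable, Dict

# One request fans out to every registered, independently maintained model,
# mirroring the central-server pattern described in the abstract.
ModelFn = Callable[[str], float]  # SMILES string in, toxicity score out

class PredictionServer:
    def __init__(self) -> None:
        self._models: Dict[str, ModelFn] = {}

    def register(self, endpoint: str, model: ModelFn) -> None:
        self._models[endpoint] = model

    def predict(self, smiles: str) -> Dict[str, float]:
        # Marshal the request to every model; collect scores per endpoint.
        return {name: fn(smiles) for name, fn in self._models.items()}

server = PredictionServer()
# Stand-in models: real ones would be QSAR- or simulation-based predictors
# running in their own virtual machines.
server.register("hepatotoxicity", lambda s: 0.1 * len(s))
server.register("cardiotoxicity", lambda s: 0.05 * s.count("C"))

scores = server.predict("CCO")  # ethanol, as a toy input
print(scores)
```

    Keeping each model behind a uniform callable interface is what allows models in separate virtual machines to be swapped or upgraded without touching the central server.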

  19. Low amplitude insult project: Structural analysis and prediction of low order reaction

    SciTech Connect

    Scammon, R.J.; Browning, R.V.; Middleditch, J.; Dienes, J.K.; Haberman, K.S.; Bennett, J.G.

    1998-12-31

    The low velocity impact sensitivity of PBX 9501 has been investigated through a series of experiments based on the Steven Test targets and a set of Shear Impact experiments. The authors describe calculations done using DYNA2D, SPRONTO and DYNA3D to support these, and other, low amplitude insult experiments. The calculations allow them to study pressure and strain rate variables, to investigate structural aspects of the experiment, and to predict velocities required for reaction. Structural analyses have played an active role in this project beginning with the original target design and continuing through analyses of the experimental results. Alternative designs and various ideas for active instrumentation were examined as part of the experiment evolution process. Predictions of reaction are used to guide these design studies, even though the authors do not yet have enough experimental data to fully calibrate any of the models.

  20. EPA Project Updates: DSSTox and ToxCast Generating New Data and Data Linkages for Use in Predictive Modeling

    EPA Science Inventory

    EPAs National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...

  1. The Climate Variability & Predictability (CVP) Program at NOAA - DYNAMO Recent Project Advancements

    NASA Astrophysics Data System (ADS)

    Lucas, S. E.; Todd, J. F.; Higgins, W.

    2013-12-01

    The Climate Variability & Predictability (CVP) Program supports research aimed at providing process-level understanding of the climate system through observation, modeling, analysis, and field studies. This vital knowledge is needed to improve climate models and predictions so that scientists can better anticipate the impacts of future climate variability and change. To achieve its mission, the CVP Program supports research carried out at NOAA and other federal laboratories, NOAA Cooperative Institutes, and academic institutions. The Program also coordinates its sponsored projects with major national and international scientific bodies including the World Climate Research Programme (WCRP), the International Geosphere-Biosphere Programme (IGBP), and the U.S. Global Change Research Program (USGCRP). The CVP program sits within the Earth System Science (ESS) Division at NOAA's Climate Program Office. Dynamics of the Madden-Julian Oscillation (DYNAMO): The Indian Ocean is one of Earth's most sensitive regions because the interactions between ocean and atmosphere there have a discernible effect on global climate patterns. The tropical weather that brews in that region can move eastward along the equator and reverberate around the globe, shaping weather and climate in far-off places. The vehicle for this variability is a phenomenon called the Madden-Julian Oscillation, or MJO. The MJO, which originates over the Indian Ocean roughly every 30 to 90 days, is known to influence the Asian and Australian monsoons. It can also enhance hurricane activity in the northeast Pacific and Gulf of Mexico, trigger torrential rainfall along the west coast of North America, and affect the onset of El Niño. CVP-funded scientists participated in the DYNAMO field campaign in 2011-12. Results from this international campaign are expected to improve researchers' insights into this influential phenomenon. A better understanding of the processes governing MJO is an essential step toward

  2. Predicting and mapping potential Whooping Crane stopover habitat to guide site selection for wind energy projects.

    PubMed

    Belaire, J Amy; Kreakie, Betty J; Keitt, Timothy; Minor, Emily

    2014-04-01

    Migratory stopover habitats are often not part of planning for conservation or new development projects. We identified potential stopover habitats within an avian migratory flyway and demonstrated how this information can guide the site-selection process for new development. We used the random forests modeling approach to map the distribution of predicted stopover habitat for the Whooping Crane (Grus americana), an endangered species whose migratory flyway overlaps with an area where wind energy development is expected to become increasingly important. We then used this information to identify areas for potential wind power development in a U.S. state within the flyway (Nebraska) that minimize conflicts between Whooping Crane stopover habitat and the development of clean, renewable energy sources. Up to 54% of our study area was predicted to be unsuitable as Whooping Crane stopover habitat and could be considered relatively low risk for conflicts between Whooping Cranes and wind energy development. We suggest that this type of analysis be incorporated into the habitat conservation planning process in areas where incidental take permits are being considered for Whooping Cranes or other species of concern. Field surveys should always be conducted prior to construction to verify model predictions and understand baseline conditions. PMID:24372936
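
    As a hedged sketch of the random-forests workflow described above (synthetic data and invented covariates, not the study's actual predictors), using scikit-learn:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic training data: stopover use as a function of two hypothetical
# covariates. In the real study the covariates and presence data differ.
rng = np.random.default_rng(42)
n = 500
dist_wetland_km = rng.uniform(0, 30, n)   # distance to nearest wetland
pct_cropland = rng.uniform(0, 100, n)     # land-cover covariate (noise here)
# Synthetic truth: stopover use is likelier near wetlands.
used = (dist_wetland_km + rng.normal(0, 3, n) < 8).astype(int)

X = np.column_stack([dist_wetland_km, pct_cropland])
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, used)

# Predicted probability of stopover use for two hypothetical candidate sites
sites = np.array([[2.0, 40.0],    # close to wetland
                  [25.0, 40.0]])  # far from wetland
probs = rf.predict_proba(sites)[:, 1]
low_risk = probs < 0.2  # candidate threshold for flagging low-risk cells
print(probs, low_risk)
```

    Mapping such predicted probabilities across a grid, then thresholding, is one way to delineate areas "predicted to be unsuitable" where siting conflicts are least likely; field surveys would still be needed to verify the predictions.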

  4. Toward a unified system for understanding, predicting and projecting regional hurricane activity

    NASA Astrophysics Data System (ADS)

    Vecchi, G. A.; Delworth, T. L.; Yang, X.; Murakami, H.; Zhang, W.; Underwood, S.; Zeng, F. J.; Jia, L.; Kapnick, S. B.; Paffendorf, K.; Krishnamurthy, L.; Wittenberg, A. T.; Msadek, R.; Villarini, G.; Chen, J. H.; Lin, S. J.; Harris, L.; Gudgel, R.; Stern, B.; Zhang, S.

    2015-12-01

    A family of high-resolution (50km and 25km atmospheric/land resolution) global coupled climate models provides a unified framework for the understanding, intraseasonal-to-decadal prediction, and decadal-to-multi-decadal projection of regional and extreme climate, including tropical cyclones. Initialized predictions of global hurricane activity show skill on regional scales comparable to the skill on basin-wide scales, suggesting that regional seasonal TC predictions may be a feasible forecast target. The 25km version of the model shows skill at seasonal predictions of the frequency of the most intense hurricanes (Cat. 3-4-5 and Cat. 4-5). It is shown that large-scale systematic errors in the mean state are a key constraint on the simulation and prediction of variations of regional climate and extremes, and methodologies for overcoming model biases are explored. Improvements in predictions of regional climate are due both to improved representation of local processes and to improvements in the large-scale climate and variability from improved process representation. These models are used to explore the response of tropical cyclones, both globally and regionally, to increasing greenhouse gases and to internal climate variations. The 25km model generally shows a more faithful representation of the impact of climate variability on hurricane activity than the 50km model. The response of the total number and the total power dissipation index of tropical cyclones to increasing greenhouse gases can differ substantially between models at the two atmospheric resolutions, 50km and 25km, with the 25km version of the model showing a larger increase in power dissipation from increasing greenhouse gases, principally because, in contrast to the 50km model, its global hurricane frequency does not decrease with increasing CO2. Some thoughts on the reasons behind those differences will be offered.
The 25km model shows an increase in the frequency of intense tropical

  5. Predicting project environmental performance under market uncertainties: case study of oil sands coke.

    PubMed

    McKellar, Jennifer M; Bergerson, Joule A; Kettunen, Janne; MacLean, Heather L

    2013-06-01

    A method combining life cycle assessment (LCA) and real options analyses is developed to predict project environmental and financial performance over time, under market uncertainties and decision-making flexibility. The method is applied to examine alternative uses for oil sands coke, a carbonaceous byproduct of processing the unconventional petroleum found in northern Alberta, Canada. Under uncertainties in natural gas price and the imposition of a carbon price, our method identifies that selling the coke to China for electricity generation by integrated gasification combined cycle is likely to be financially preferred initially, but eventually hydrogen production in Alberta is likely to be preferred. Compared to the results of a previous study that used life cycle costing to identify the financially preferred alternative, the inclusion of real options analysis adds value as it accounts for flexibility in decision-making (e.g., to delay investment), increasing the project's expected net present value by 25% and decreasing the expected life cycle greenhouse gas emissions by 11%. Different formulations of the carbon pricing policy or changes to the natural gas price forecast alter these findings. The combined LCA/real options method provides researchers and decision-makers with more comprehensive information than can be provided by either technique alone. PMID:23675646
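
    The value added by real options analysis, i.e. accounting for the flexibility to delay investment, can be seen in a two-state toy example (all figures are hypothetical, not from the study):

```python
# Toy sketch of why the option to delay adds value: compare committing
# now with waiting one period to observe the natural gas price state.
r = 0.08                      # discount rate (assumed)
capex = 100.0                 # investment cost, $M (assumed)
# Two equally likely gas-price states next year and the project payoff
# (present value of operating cash flows) in each state:
payoff = {"high": 160.0, "low": 70.0}
p = 0.5

# Invest now: locked into the expected payoff regardless of the state.
npv_now = p * payoff["high"] + p * payoff["low"] - capex

# Delay one year: invest only if the state turns out favourable.
npv_wait = (p * max(payoff["high"] - capex, 0.0)
            + p * max(payoff["low"] - capex, 0.0)) / (1 + r)

option_value = npv_wait - npv_now
print(f"NPV(now)={npv_now:.1f}  NPV(wait)={npv_wait:.1f}  option={option_value:.1f}")
```

    Coupling each decision branch to its life cycle inventory is what lets the combined LCA/real options method report expected emissions alongside expected NPV, as the abstract describes.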

  6. NERI PROJECT 99-119. TASK 2. DATA-DRIVEN PREDICTION OF PROCESS VARIABLES. FINAL REPORT

    SciTech Connect

    Upadhyaya, B.R.

    2003-04-10

    This report describes the detailed results for Task 2 of DOE-NERI project number 99-119, entitled "Automatic Development of Highly Reliable Control Architecture for Future Nuclear Power Plants". This project is a collaborative effort between Oak Ridge National Laboratory (ORNL), The University of Tennessee, Knoxville (UTK), and North Carolina State University (NCSU). UTK is the lead organization for Task 2 under contract number DE-FG03-99SF21906. Under Task 2 we completed the development of data-driven models for the characterization of sub-system dynamics for predicting state variables, control functions, and expected control actions. We have also developed the Principal Component Analysis (PCA) approach for mapping system measurements, and a nonlinear system modeling approach called the Group Method of Data Handling (GMDH) with rational functions, which includes temporal data information for transient characterization. The majority of the results are presented in detailed reports for Phases 1 through 3 of our research, which are attached to this report.
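
    As an illustration of the PCA mapping step described above (synthetic sensor data, not the project's plant measurements), correlated measurements can be projected onto principal components and reconstructed, with large residuals flagging departures from normal behaviour:

```python
import numpy as np

# Three correlated "sensors" driven by one underlying process plus noise,
# standing in for redundant plant measurements.
rng = np.random.default_rng(3)
t = np.linspace(0, 10, 200)
base = np.sin(t)
X = np.column_stack([base, 0.8 * base, 1.2 * base]) + rng.normal(0, 0.05, (200, 3))

# PCA via SVD of the mean-centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var_explained = s**2 / np.sum(s**2)

# Reconstruct from the first principal component only
X1 = (U[:, :1] * s[:1]) @ Vt[:1, :] + X.mean(axis=0)
residual = np.sqrt(np.mean((X - X1) ** 2))
print(f"first PC explains {var_explained[0]:.1%}; reconstruction RMSE={residual:.3f}")
```

    In a monitoring setting, a measurement whose reconstruction residual grows well beyond this baseline would be a candidate sensor fault or process anomaly.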

  8. Predicting fire activity in the US over the next 50 years using new IPCC climate projections

    NASA Astrophysics Data System (ADS)

    Wang, D.; Morton, D. C.; Collatz, G. J.

    2012-12-01

    Fire is an integral part of the Earth system with both direct and indirect effects on terrestrial ecosystems, the atmosphere, and human societies (Bowman et al. 2009). Climate conditions regulate fire activity in a variety of ways, e.g., influencing the conditions for ignition and fire spread, and changing vegetation growth and decay and thus the accumulation of fuels for combustion (Arora and Boer 2005). Our recent study showed that burned area (BA) in the US is strongly correlated with potential evaporation (PE), a measure of climatic dryness derived from National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) climate data (Morton et al. 2012). The correlation varies spatially and temporally. With regard to fires in peak fire seasons, the Northwestern US, Great Plains, and Alaska have the strongest BA/PE relationship. Using the recently released Global Fire Emissions Database (GFED) Version 3 (van der Werf et al. 2010), we showed increasing BA in the last decade in most NCA regions. Longer time series of Monitoring Trends in Burn Severity (MTBS) data (Eidenshink et al. 2007) showed that the increasing trends occurred in all NCA regions from 1984 to 2010. This relationship between BA and PE provides the basis for predicting future fire activity under projected climate conditions. In this study, we build spatially explicit predictors using the historic PE/BA relationship. PE from 2011 to 2060 is calculated from the Coupled Model Intercomparison Project Phase 5 (CMIP5) data, and the historic PE/BA relationship is then used to estimate BA. This study examines the spatial pattern and temporal dynamics of future US fires driven by new climate predictions for the next 50 years. Reference: Arora, V.K., & Boer, G.J. (2005). Fire as an interactive component of dynamic vegetation models. 
Journal of Geophysical Research-Biogeosciences, 110 Bowman, D.M.J.S., Balch, J.K., Artaxo, P., Bond, W.J., Carlson, J.M., Cochrane, M.A., D
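
    The predictor-building step described above, fitting the historical PE/BA relationship and then applying it to projected PE, can be sketched as follows (synthetic numbers for one hypothetical region, not the study's data):

```python
import numpy as np

# Synthetic 27-year "historical" record (cf. the 1984-2010 MTBS span):
# burned area assumed roughly linear in potential evaporation plus noise.
rng = np.random.default_rng(7)
pe_hist = rng.uniform(3.0, 9.0, 27)               # PE, mm/day
ba_hist = 50.0 * pe_hist + rng.normal(0, 40, 27)  # burned area, kha

# Fit the historical PE/BA relationship for this region
slope, intercept = np.polyfit(pe_hist, ba_hist, 1)

# Apply it to projected PE (toy values standing in for CMIP5 output)
pe_future = np.array([7.5, 8.2, 9.1])
ba_future = slope * pe_future + intercept
print(f"slope={slope:.1f}  projected BA={np.round(ba_future, 1)}")
```

    The study does this spatially explicitly, one predictor per region or grid cell, so that the strength of the PE/BA link can vary in space as the abstract notes it does.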

  9. Introduction of the NWP Model Development Project at Korea Institute of Atmospheric Prediction Systems - KIAPS

    NASA Astrophysics Data System (ADS)

    Kim, Y.

    2012-12-01

    Korea Meteorological Administration (KMA) launched a 9-year project in 2011 to develop Korea's own global NWP system, with total funding of about 100 million US dollars. To lead the effort, the Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded by KMA as a non-profit foundation. The project consists of three stages. We are in the middle of the first stage (2011-2013), which is to set up the Institute, recruit researchers, lay out plans for the research and development, design the basic structure, and explore/develop core NWP technologies. The second stage (2014-2016) aims at developing the modules for the dynamical core, physical parameterizations, and data assimilation systems as well as the system framework and couplers to connect the modules in a systematic and efficient way, eventually building a prototype NWP system. The third stage (2017-2019) is for evaluating the prototype system by selecting/improving modules, and refining/finalizing it for operational use at KMA as well as developing the necessary post-processing systems. In 2012, we are designing key modules for the dynamical core by adopting existing and/or developing new cores, and developing the barotropic model first and the baroclinic model later with code parallelization and optimization in mind. We are collecting various physical parameterization schemes, mostly developed by Korean scientists, and evaluating and improving them by using single-column and LES models, etc. We are designing control variables for variational data assimilation systems, constructing testbeds for observational data pre-processing systems, developing linear models for a barotropic system, and designing modules for cost function minimization. We are developing the module framework, which is flexible for prognostic and diagnostic variables, designing the I/O structure of the system, coupling modules for external systems, and also developing post-processing systems. At the meeting, we will present the

  10. EU framework 6 project: predictive toxicology (PredTox)--overview and outcome.

    PubMed

    Suter, Laura; Schroeder, Susanne; Meyer, Kirstin; Gautier, Jean-Charles; Amberg, Alexander; Wendt, Maria; Gmuender, Hans; Mally, Angela; Boitier, Eric; Ellinger-Ziegelbauer, Heidrun; Matheis, Katja; Pfannkuch, Friedlieb

    2011-04-15

    In this publication, we report the outcome of the integrated EU Framework 6 PROJECT: Predictive Toxicology (PredTox), including methodological aspects and overall conclusions. Specific details including data analysis and interpretation are reported in separate articles in this issue. The project, partly funded by the EU, was carried out by a consortium of 15 pharmaceutical companies, 2 SMEs, and 3 universities. The effects of 16 test compounds were characterized using conventional toxicological parameters and "omics" technologies. The three major observed toxicities, liver hypertrophy, bile duct necrosis and/or cholestasis, and kidney proximal tubular damage were analyzed in detail. The combined approach of "omics" and conventional toxicology proved a useful tool for mechanistic investigations and the identification of putative biomarkers. In our hands and in combination with histopathological assessment, target organ transcriptomics was the most prolific approach for the generation of mechanistic hypotheses. Proteomics approaches were relatively time-consuming and required careful standardization. NMR-based metabolomics detected metabolite changes accompanying histopathological findings, providing limited additional mechanistic information. Conversely, targeted metabolite profiling with LC/GC-MS was very useful for the investigation of bile duct necrosis/cholestasis. In general, both proteomics and metabolomics were supportive of other findings. Thus, the outcome of this program indicates that "omics" technologies can help toxicologists to make better informed decisions during exploratory toxicological studies. The data support that hypothesis on mode of action and discovery of putative biomarkers are tangible outcomes of integrated "omics" analysis. Qualification of biomarkers remains challenging, in particular in terms of identification, mechanistic anchoring, appropriate specificity, and sensitivity.

  11. EU Framework 6 Project: Predictive Toxicology (PredTox)-overview and outcome

    SciTech Connect

    Suter, Laura; Schroeder, Susanne; Meyer, Kirstin; Gautier, Jean-Charles; Amberg, Alexander; Wendt, Maria; Gmuender, Hans; Mally, Angela; Boitier, Eric; Ellinger-Ziegelbauer, Heidrun; Matheis, Katja; Pfannkuch, Friedlieb

    2011-04-15

    In this publication, we report the outcome of the integrated EU Framework 6 Project: Predictive Toxicology (PredTox), including methodological aspects and overall conclusions. Specific details including data analysis and interpretation are reported in separate articles in this issue. The project, partly funded by the EU, was carried out by a consortium of 15 pharmaceutical companies, 2 SMEs, and 3 universities. The effects of 16 test compounds were characterized using conventional toxicological parameters and 'omics' technologies. The three major observed toxicities, liver hypertrophy, bile duct necrosis and/or cholestasis, and kidney proximal tubular damage were analyzed in detail. The combined approach of 'omics' and conventional toxicology proved a useful tool for mechanistic investigations and the identification of putative biomarkers. In our hands and in combination with histopathological assessment, target organ transcriptomics was the most prolific approach for the generation of mechanistic hypotheses. Proteomics approaches were relatively time-consuming and required careful standardization. NMR-based metabolomics detected metabolite changes accompanying histopathological findings, providing limited additional mechanistic information. Conversely, targeted metabolite profiling with LC/GC-MS was very useful for the investigation of bile duct necrosis/cholestasis. In general, both proteomics and metabolomics were supportive of other findings. Thus, the outcome of this program indicates that 'omics' technologies can help toxicologists to make better informed decisions during exploratory toxicological studies. The data support that hypothesis on mode of action and discovery of putative biomarkers are tangible outcomes of integrated 'omics' analysis. Qualification of biomarkers remains challenging, in particular in terms of identification, mechanistic anchoring, appropriate specificity, and sensitivity.

  12. Fault kinematics and retro-deformation analysis for prediction of potential leakage pathways - joint project PROTECT

    NASA Astrophysics Data System (ADS)

    Ziesch, Jennifer; Tanner, David C.; Dance, Tess; Beilecke, Thies; Krawczyk, Charlotte M.

    2014-05-01

    Within the context of long-term CO2 storage integrity, we determine the seismic and sub-seismic characteristics of potential fluid migration pathways between reservoir and surface. As a part of the PROTECT project we focus on the sub-seismic faults of the CO2CRC Otway Project pilot site in Australia. We carried out a detailed interpretation of 3D seismic data and have built a geological 3D model of 8 km x 7 km x 4.5 km (depth). The model comprises triangulated surfaces of 8 stratigraphic horizons and 24 large-scale faults with 75 m grid size. We have confirmed that the site comprises a complex system of south-dipping normal faults and north-dipping antithetic normal faults. Good knowledge of the kinematics of the large-scale faults is essential to predict sub-seismic structures. For this reason, preconditioning analyses, such as thickness maps, fault curvature, cylindricity and connectivity studies, as well as Allan mapping, were carried out. The most important aspect is that two different types of fault kinematics were simultaneously active: dip-slip, and a combination of dip-slip with dextral strike-slip movement. Using these input parameters, stratigraphic volumes are kinematically restored along the large-scale faults, taking fault topography into account (retro-deformation). The stratigraphic volumes are analyzed at the same time with respect to sub-seismic strain variation. We thereby produce strain tensor maps to locate highly deformed or fractured zones and their orientation within the stratigraphic volumes. We will discuss the results in the framework of possible fluid/gas migration pathways and communication between storage reservoir and overburden. This will provide a tool to predict CO2 leakage and thus to adapt time-dependent monitoring strategies for subsurface storage in general. Acknowledgement: This work was sponsored in part by the Australian Commonwealth Government through the Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC).

  13. Adapting WEPP (Water Erosion Prediction Project) for Forest Watershed Erosion Modeling

    NASA Astrophysics Data System (ADS)

    Dun, S.; Wu, J. Q.; Elliot, W. J.; Robichaud, P. R.; Flanagan, D. C.

    2006-12-01

    There has been increasing public concern over forest stream pollution by excessive sedimentation resulting from human activities. Adequate and reliable erosion simulation tools are urgently needed for sound forest resources management. Computer models for predicting watershed runoff and erosion have been developed over the past decades. These models, however, are often limited in their applications due to inappropriate representation of the hydrological processes involved. The Water Erosion Prediction Project (WEPP) watershed model has proved useful in certain forest applications, such as modeling erosion from a segment of insloped or outsloped road, harvested units, and burned units. Nevertheless, when used for modeling water flow and sediment discharge from a forest watershed of complex topography and channel systems, WEPP consistently underestimates these quantities, in particular the water flow at the watershed outlet. The main purpose of this study was to improve the WEPP watershed model so that it can adequately simulate forest watershed hydrology and erosion. The specific objectives were to: (1) identify and correct WEPP algorithms and subroutines that inappropriately represent forest watershed hydrologic processes; and (2) assess the performance of the modified model by applying it to a real forested watershed in the Pacific Northwest, USA. In modifying the WEPP model, changes were primarily made to the approach and algorithms for modeling deep percolation of soil water and subsurface lateral flow. The modified codes were subsequently applied to the Hermada watershed, a small watershed located in the Boise National Forest in northern Idaho. The modeling results were compared with those obtained using the original WEPP and with the field-observed runoff and erosion data.
Conclusions of this study include: (1) compared to the original model, the modified WEPP more realistically and properly represents the hydrologic processes in a forest setting; and

  14. Aeroheating Testing and Predictions for Project Orion CEV at Turbulent Conditions

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Berger, Karen T.; Horvath, Thomas J.; Coblish, Joseph J.; Norris, Joseph D.; Lillard, Randolph P.; Kirk, Benjamin S.

    2009-01-01

    An investigation of the aeroheating environment of the Project Orion Crew Exploration Vehicle was performed in the Arnold Engineering Development Center Hypervelocity Wind Tunnel No. 9 Mach 8 and Mach 10 nozzles and in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel. Heating data were obtained using a thermocouple-instrumented approximately 0.035-scale model (0.1778-m/7-inch diameter) of the flight vehicle. Runs were performed in the Tunnel 9 Mach 10 nozzle at free stream unit Reynolds numbers of 1x10(exp 6)/ft to 20x10(exp 6)/ft, in the Tunnel 9 Mach 8 nozzle at free stream unit Reynolds numbers of 8x10(exp 6)/ft to 48x10(exp 6)/ft, and in the 20-Inch Mach 6 Air Tunnel at free stream unit Reynolds numbers of 1x10(exp 6)/ft to 7x10(exp 6)/ft. In both facilities, enthalpy levels were low and the test gas (N2 in Tunnel 9 and air in the 20-Inch Mach 6) behaved as a perfect gas. These test conditions produced laminar, transitional and turbulent data in the Tunnel 9 Mach 10 nozzle, transitional and turbulent data in the Tunnel 9 Mach 8 nozzle, and laminar and transitional data in the 20-Inch Mach 6 Air Tunnel. Laminar and turbulent predictions were generated for all wind tunnel test conditions and comparisons were performed with the experimental data to help define the accuracy of the computational methods. In general, it was found that both laminar data and predictions, and turbulent data and predictions, agreed to within the estimated 12% experimental uncertainty. Laminar heating distributions from all three data sets were shown to correlate well and demonstrated Reynolds number independence when expressed in terms of the Stanton number based on adiabatic wall-recovery enthalpy. Transition onset locations on the leeside centerline were determined from the data and correlated in terms of boundary-layer parameters. Finally, turbulent heating augmentation ratios were determined for several body-point locations and correlated in terms of the
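The Stanton number referenced to adiabatic wall-recovery enthalpy, used above to correlate the heating distributions, has a compact definition. A minimal sketch of that definition; the recovery-factor value below is an illustrative assumption, not one taken from the test report:

```python
def recovery_enthalpy(h_e, u_e, r=0.896):
    """Adiabatic wall-recovery enthalpy: h_aw = h_e + r * u_e**2 / 2.
    The recovery factor r (~Pr**(1/3) for turbulent boundary layers)
    is an illustrative value, not one from the report."""
    return h_e + r * u_e ** 2 / 2.0

def stanton_number(q_wall, rho_e, u_e, h_aw, h_w):
    """Stanton number referenced to the adiabatic wall-recovery enthalpy:
    St = q_w / (rho_e * u_e * (h_aw - h_w)), with q_w the wall heat flux,
    rho_e and u_e the boundary-layer-edge density and velocity, and
    h_aw, h_w the recovery and wall enthalpies."""
    return q_wall / (rho_e * u_e * (h_aw - h_w))
```

Normalizing this way removes the first-order Reynolds number dependence of laminar heating, which is why the three data sets collapse onto one correlation.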

  15. The CONVEX project - Using Observational Evidence and Process Understanding to Improve Predictions of Extreme Rainfall Change

    NASA Astrophysics Data System (ADS)

    Fowler, Hayley; Kendon, Elizabeth; Blenkinsop, Stephen; Chan, Steven; Ferro, Christopher; Roberts, Nigel; Stephenson, David; Jones, Richard; Sessford, Pat

    2013-04-01

    During the last decade, widespread major flood events in the UK and across the rest of Europe have focussed attention on perceived increases in rainfall intensities. Whilst Regional Climate Models (RCMs) are able to simulate the magnitude and spatial pattern of observed daily extreme rainfall events more reliably than General Circulation Models (GCMs), they still underestimate extreme rainfall in relation to observations. Particularly during the summer, a large proportion of the precipitation comes from convective storms that are typically too small to be explicitly represented by climate models. Instead, convection parameterisation schemes are necessary to represent the larger-scale effect of unresolved convective cells. Given the deficiencies in the simulation of extreme rainfall by climate models, even in the current generation of high-resolution RCMs, the CONVEX project (CONVective EXtremes) argues that an integrated approach is needed that brings together observations, basic understanding and models. This should go hand in hand with a change from a focus on traditional validation exercises (comparing modelled and observed extremes) to an understanding and quantification of the causes of model deficiencies in the simulation of extreme rainfall processes on different spatial and temporal scales. This is particularly true for localised intense summer convection. CONVEX therefore aims to contribute to the goals of enabling society to respond to global climate change and predicting the regional and local impacts of environmental change. In addition to an improved understanding of the spatial-temporal characteristics of extreme rainfall processes (principally in the UK), the project is also assessing the influence of model parameterisations and resolution on the simulation of extreme rainfall events and processes. This includes the running of new RCM simulations undertaken by the UK Meteorological Office at 50km and 12km resolutions (parameterised convection) and

  16. Projected climate change impacts and short term predictions on staple crops in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Mereu, V.; Spano, D.; Gallo, A.; Carboni, G.

    2013-12-01

    Multiple combinations of soils and climate conditions, crop management and varieties were considered for the different Agro-Ecological Zones. The climate impact was assessed using future climate predictions, statistically and/or dynamically downscaled, for specific areas. Direct and indirect effects of the different CO2 concentrations projected for the future periods were separately explored to estimate their effects on crops. Several adaptation strategies (e.g., introduction of full irrigation, shift of the ordinary sowing/planting date, changes in the ordinary fertilization management) were also evaluated with the aim of reducing the negative impact of climate change on crop production. The results of the study, analyzed at local, AEZ and country level, will be discussed.

  17. Evaluation of numerical weather predictions performed in the context of the project DAPHNE

    NASA Astrophysics Data System (ADS)

    Tegoulias, Ioannis; Pytharoulis, Ioannis; Bampzelis, Dimitris; Karacostas, Theodore

    2014-05-01

    The region of Thessaly in central Greece is one of the main areas of agricultural production in Greece. Severe weather phenomena affect the agricultural production in this region, with adverse effects for farmers and the national economy. For this reason, the project DAPHNE aims at tackling the problem of drought by means of weather modification, through the development of the necessary tools to support the application of a rainfall enhancement program. In the present study the numerical weather prediction system WRF-ARW is used in order to assess its ability to represent extreme weather phenomena in the region of Thessaly. WRF is integrated in three domains covering Europe, the Eastern Mediterranean and Central-Northern Greece (Thessaly and a large part of Macedonia) using telescoping nesting with grid spacing of 15 km, 5 km and 1.667 km, respectively. The cases examined span the transitional and warm period (April to September) of the years 2008 to 2013, including days with thunderstorm activity. Model results are evaluated against all available surface observations and radar products, taking into account the spatial characteristics and intensity of the storms. Preliminary results indicate a good level of agreement between the simulated and observed fields as far as the standard parameters (such as temperature, humidity and precipitation) are concerned. Moreover, the model generally exhibits a potential to represent the occurrence of convective activity, but not its exact spatiotemporal characteristics. Acknowledgements: This research work has been co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013)

  18. Predicting environmental mitigation requirements for hydropower projects through the integration of biophysical and socio-political geographies.

    PubMed

    DeRolph, Christopher R; Schramm, Michael P; Bevelhimer, Mark S

    2016-10-01

    Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multi-faceted explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements are functions of a range of factors, from biophysical to socio-political. Project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation. PMID:27280379

  19. Predicting environmental mitigation requirements for hydropower projects through the integration of biophysical and socio-political geographies

    DOE PAGESBeta

    Bevelhimer, Mark S.; DeRolph, Christopher R.; Schramm, Michael P.

    2016-06-06

    Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multidisciplinary explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements have been a result of a range of factors, from biological and hydrological to political and cultural. Furthermore, project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation.

  1. Predicting and Mapping Potential Whooping Crane Stopover Habitat to Guide Site Selection for Wind Energy Projects

    EPA Science Inventory

    Migration is one of the most poorly understood components of a bird’s life cycle. For that reason, migratory stopover habitats are often not part of conservation planning and may be overlooked when planning new development projects. This project highlights and addresses an overl...

  2. Demographic models and IPCC climate projections predict the decline of an emperor penguin population

    PubMed Central

    Jenouvrier, Stéphanie; Caswell, Hal; Barbraud, Christophe; Holland, Marika; Strœve, Julienne; Weimerskirch, Henri

    2009-01-01

    Studies have reported important effects of recent climate change on Antarctic species, but there has been to our knowledge no attempt to explicitly link those results to forecasted population responses to climate change. Antarctic sea ice extent (SIE) is projected to shrink as concentrations of atmospheric greenhouse gases (GHGs) increase, and emperor penguins (Aptenodytes forsteri) are extremely sensitive to these changes because they use sea ice as a breeding, foraging and molting habitat. We project emperor penguin population responses to future sea ice changes, using a stochastic population model that combines a unique long-term demographic dataset (1962–2005) from a colony in Terre Adélie, Antarctica and projections of SIE from General Circulation Models (GCM) of Earth's climate included in the most recent Intergovernmental Panel on Climate Change (IPCC) assessment report. We show that the increased frequency of warm events associated with projected decreases in SIE will reduce the population viability. The probability of quasi-extinction (a decline of 95% or more) is at least 36% by 2100. The median population size is projected to decline from ≈6,000 to ≈400 breeding pairs over this period. To avoid extinction, emperor penguins will have to adapt, migrate or change the timing of their growth stages. However, given the future projected increases in GHGs and their effect on Antarctic climate, evolution or migration seem unlikely for such long-lived species at the remote southern end of the Earth. PMID:19171908
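The quasi-extinction probability reported above (the chance of a 95% or greater decline by 2100) is the kind of quantity a stochastic population model estimates by simulation. A minimal Monte Carlo sketch; the growth-rate distribution and settings below are illustrative placeholders, not the paper's fitted demographic or sea-ice-driven parameters:

```python
import math
import random

def quasi_extinction_prob(n0=6000, years=95, threshold_frac=0.05,
                          mean_r=-0.02, sd_r=0.10, n_sims=10000, seed=42):
    """Fraction of stochastic trajectories that fall below
    threshold_frac * n0 (i.e., a 95% decline) within the horizon.
    Growth is multiplicative: n_{t+1} = n_t * exp(r_t),
    with r_t drawn from a normal distribution N(mean_r, sd_r)."""
    rng = random.Random(seed)
    threshold = threshold_frac * n0
    hits = 0
    for _ in range(n_sims):
        n = float(n0)
        for _ in range(years):
            n *= math.exp(rng.gauss(mean_r, sd_r))
            if n < threshold:
                hits += 1
                break  # trajectory counted once, on first crossing
    return hits / n_sims
```

In the study itself the yearly growth rates are driven by demographic rates coupled to GCM sea-ice projections rather than by a fixed distribution like this, but the crossing-probability bookkeeping is the same.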

  3. Demographic models and IPCC climate projections predict the decline of an emperor penguin population.

    PubMed

    Jenouvrier, Stéphanie; Caswell, Hal; Barbraud, Christophe; Holland, Marika; Stroeve, Julienne; Weimerskirch, Henri

    2009-02-10

    Studies have reported important effects of recent climate change on Antarctic species, but there has been to our knowledge no attempt to explicitly link those results to forecasted population responses to climate change. Antarctic sea ice extent (SIE) is projected to shrink as concentrations of atmospheric greenhouse gases (GHGs) increase, and emperor penguins (Aptenodytes forsteri) are extremely sensitive to these changes because they use sea ice as a breeding, foraging and molting habitat. We project emperor penguin population responses to future sea ice changes, using a stochastic population model that combines a unique long-term demographic dataset (1962-2005) from a colony in Terre Adélie, Antarctica and projections of SIE from General Circulation Models (GCM) of Earth's climate included in the most recent Intergovernmental Panel on Climate Change (IPCC) assessment report. We show that the increased frequency of warm events associated with projected decreases in SIE will reduce the population viability. The probability of quasi-extinction (a decline of 95% or more) is at least 36% by 2100. The median population size is projected to decline from approximately 6,000 to approximately 400 breeding pairs over this period. To avoid extinction, emperor penguins will have to adapt, migrate or change the timing of their growth stages. However, given the future projected increases in GHGs and their effect on Antarctic climate, evolution or migration seem unlikely for such long-lived species at the remote southern end of the Earth.

  4. Predictive In Vitro Screening of Environmental Chemicals – The ToxCast Project

    EPA Science Inventory

    ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...

  5. Performance predictions for mechanical excavators in Yucca Mountain tuffs; Yucca Mountain Site Characterization Project

    SciTech Connect

    Ozdemir, L.; Gertsch, L.; Neil, D.; Friant, J.

    1992-09-01

    The performances of several mechanical excavators are predicted for use in the tuffs at Yucca Mountain: tunnel boring machines, the Mobile Miner, a roadheader, a blind shaft borer, a vertical wheel shaft boring machine, raise drills, and V-Moles. The work summarized comprises three parts: initial prediction using existing rock physical property information; measurement of additional rock physical properties; and revision of the initial predictions using the enhanced database. The performance predictions are based on theoretical and empirical relationships between rock properties and the forces experienced by rock cutters and bits during excavation. Machine backup systems and excavation design aspects, such as curves and grades, are considered in determining excavator utilization factors. Instantaneous penetration rate, advance rate, and cutter costs are the fundamental performance indicators.

  6. Analysis and Prediction of User Editing Patterns in Ontology Development Projects

    PubMed Central

    Wang, Hao; Tudorache, Tania; Dou, Dejing; Noy, Natalya F.; Musen, Mark A.

    2014-01-01

    The development of real-world ontologies is a complex undertaking, commonly involving a group of domain experts with different expertise who work together in a collaborative setting. These ontologies are usually large scale and have complex structures. To assist in the authoring process, ontology tools are key to making the editing process as streamlined as possible. Being able to predict confidently what users are likely to do next as they edit an ontology will enable us to focus and structure the user interface accordingly and to facilitate more efficient interaction and information discovery. In this paper, we use data mining, specifically association rule mining, to investigate whether we are able to predict the next editing operation that a user will make based on the change history. We simulated and evaluated continuous prediction across time using a sliding-window model. We used association rule mining to generate patterns from the ontology change logs in the training window and tested these patterns on logs in the adjacent testing window. We also evaluated the impact of different training and testing window sizes on the prediction accuracies. Finally, we evaluated our prediction accuracies across different user groups and different ontologies. Our results indicate that we can indeed predict the next editing operation a user is likely to make. We will use the discovered editing patterns to develop a recommendation module for our editing tools, and to design user interface components that better fit with user editing behaviors. PMID:26052350
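The train-on-one-window, test-on-the-adjacent-window scheme described above can be illustrated with a toy sketch. Here the rule mining is reduced to first-order "previous op → next op" rules with a support threshold; the paper's association rules and operation vocabulary are richer than this, and the operation names are hypothetical:

```python
from collections import Counter, defaultdict

def mine_rules(ops, min_support=2):
    """Mine simple 'prev -> next' rules from an ordered change log,
    keeping for each antecedent its highest-confidence consequent."""
    pair_counts = defaultdict(Counter)
    for prev, nxt in zip(ops, ops[1:]):
        pair_counts[prev][nxt] += 1
    rules = {}
    for prev, counter in pair_counts.items():
        nxt, count = counter.most_common(1)[0]
        if count >= min_support:
            rules[prev] = nxt
    return rules

def sliding_window_eval(log, train_size, test_size):
    """Train on one window, test on the adjacent window, then slide
    forward by test_size; return prediction accuracy over all slides."""
    correct = total = 0
    for start in range(0, len(log) - train_size - test_size + 1, test_size):
        rules = mine_rules(log[start:start + train_size])
        test = log[start + train_size:start + train_size + test_size]
        for prev, nxt in zip(test, test[1:]):
            if prev in rules:       # only score ops covered by a rule
                total += 1
                correct += (rules[prev] == nxt)
    return correct / total if total else 0.0
```

On a perfectly regular log (e.g., strictly alternating operations) this evaluator reports accuracy 1.0; real change logs are far noisier, which is where window sizing matters.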

  7. Retrospective Exposure Estimation and Predicted versus Observed Serum Perfluorooctanoic Acid Concentrations for Participants in the C8 Health Project

    PubMed Central

    Vieira, Verónica M.; Ryan, P. Barry; Steenland, Kyle; Bartell, Scott M.

    2011-01-01

    Background: People living or working in eastern Ohio and western West Virginia have been exposed to perfluorooctanoic acid (PFOA) released by DuPont Washington Works facilities. Objectives: Our objective was to estimate historical PFOA exposures and serum concentrations experienced by 45,276 non-occupationally exposed participants in the C8 Health Project who consented to share their residential histories and a 2005–2006 serum PFOA measurement. Methods: We estimated annual PFOA exposure rates for each individual based on predicted calibrated water concentrations and predicted air concentrations using an environmental fate and transport model, individual residential histories, and maps of public water supply networks. We coupled individual exposure estimates with a one-compartment absorption, distribution, metabolism, and excretion (ADME) model to estimate time-dependent serum concentrations. Results: For all participants (n = 45,276), predicted and observed median serum concentrations in 2005–2006 are 14.2 and 24.3 ppb, respectively [Spearman’s rank correlation coefficient (rs) = 0.67]. For participants who provided daily public well water consumption rate and who had the same residence and workplace in one of six municipal water districts for 5 years before the serum sample (n = 1,074), predicted and observed median serum concentrations in 2005–2006 are 32.2 and 40.0 ppb, respectively (rs = 0.82). Conclusions: Serum PFOA concentrations predicted by linked exposure and ADME models correlated well with observed 2005–2006 human serum concentrations for C8 Health Project participants. These individualized retrospective exposure and serum estimates are being used in a variety of epidemiologic studies being conducted in this region. PMID:21813367
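The one-compartment ADME step used above can be sketched as a yearly serum update with first-order elimination. The half-life and volume-of-distribution values below are illustrative assumptions, not the calibrated values used in the C8 study:

```python
import math

def serum_series(yearly_intake_ug, half_life_yr=3.5, vd_l=11.9, c0=0.0):
    """One-compartment model: first-order elimination with a constant
    intake rate within each year,
        C_{t+1} = C_t * exp(-k) + I / (Vd * k) * (1 - exp(-k)),
    where k = ln(2)/half_life (1/yr), I is that year's intake (ug/yr),
    and Vd the volume of distribution (L); C is in ug/L (~ppb).
    half_life_yr and vd_l here are illustrative, not the study's values."""
    k = math.log(2.0) / half_life_yr
    decay = math.exp(-k)
    c, series = c0, []
    for intake in yearly_intake_ug:
        c = c * decay + intake / (vd_l * k) * (1.0 - decay)
        series.append(c)
    return series
```

With constant intake the series approaches the steady state I/(Vd·k); time-varying exposure histories, as in the study, simply feed a different intake into each year's update.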

  8. Derivation and Evaluation of a Risk-Scoring Tool to Predict Participant Attrition in a Lifestyle Intervention Project.

    PubMed

    Jiang, Luohua; Yang, Jing; Huang, Haixiao; Johnson, Ann; Dill, Edward J; Beals, Janette; Manson, Spero M; Roubideaux, Yvette

    2016-05-01

    Participant attrition in clinical trials and community-based interventions is a serious, common, and costly problem. In order to develop a simple predictive scoring system that can quantify the risk of participant attrition in a lifestyle intervention project, we analyzed data from the Special Diabetes Program for Indians Diabetes Prevention Program (SDPI-DP), an evidence-based lifestyle intervention to prevent diabetes in 36 American Indian and Alaska Native communities. SDPI-DP participants were randomly divided into a derivation cohort (n = 1600) and a validation cohort (n = 801). Logistic regressions were used to develop a scoring system from the derivation cohort. The discriminatory power and calibration properties of the system were assessed using the validation cohort. Seven independent factors predicted program attrition: gender, age, household income, comorbidity, chronic pain, site's user population size, and average age of site staff. Six factors predicted long-term attrition: gender, age, marital status, chronic pain, site's user population size, and average age of site staff. Each model exhibited moderate to fair discriminatory power (C statistic in the validation set: 0.70 for program attrition, and 0.66 for long-term attrition) and excellent calibration. The resulting scoring system offers a low-technology approach to identify participants at elevated risk for attrition in future similar behavioral modification intervention projects, which may inform appropriate allocation of retention resources. This approach also serves as a model for other efforts to prevent participant attrition.
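A common way to turn logistic-regression coefficients into a simple scoring system like the one described above is to scale each coefficient to integer points. The approach below is generic; the coefficient values and predictor names are hypothetical, not those fitted from SDPI-DP:

```python
import math

def risk_points(coefs, values, base=None):
    """Integer point score: each predictor contributes
    round(beta * x / base), with base defaulting to the smallest
    |beta| so the weakest predictor is worth about one point."""
    if base is None:
        base = min(abs(b) for b in coefs.values())
    return sum(round(coefs[n] * values[n] / base) for n in coefs)

def predicted_risk(intercept, coefs, values):
    """Attrition probability from the underlying logistic model."""
    z = intercept + sum(coefs[n] * values[n] for n in coefs)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only.
COEFS = {"age_under_40": 0.8, "chronic_pain": 0.5, "male": 0.3}
```

The point score preserves the ranking of the logistic model (so the C statistic is nearly unchanged) while letting site staff tally risk by hand, which is what makes it a low-technology retention-targeting tool.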

  9. Derivation and Evaluation of a Risk-Scoring Tool to Predict Participant Attrition in a Lifestyle Intervention Project.

    PubMed

    Jiang, Luohua; Yang, Jing; Huang, Haixiao; Johnson, Ann; Dill, Edward J; Beals, Janette; Manson, Spero M; Roubideaux, Yvette

    2016-05-01

    Participant attrition in clinical trials and community-based interventions is a serious, common, and costly problem. In order to develop a simple predictive scoring system that can quantify the risk of participant attrition in a lifestyle intervention project, we analyzed data from the Special Diabetes Program for Indians Diabetes Prevention Program (SDPI-DP), an evidence-based lifestyle intervention to prevent diabetes in 36 American Indian and Alaska Native communities. SDPI-DP participants were randomly divided into a derivation cohort (n = 1600) and a validation cohort (n = 801). Logistic regressions were used to develop a scoring system from the derivation cohort. The discriminatory power and calibration properties of the system were assessed using the validation cohort. Seven independent factors predicted program attrition: gender, age, household income, comorbidity, chronic pain, site's user population size, and average age of site staff. Six factors predicted long-term attrition: gender, age, marital status, chronic pain, site's user population size, and average age of site staff. Each model exhibited moderate to fair discriminatory power (C statistic in the validation set: 0.70 for program attrition, and 0.66 for long-term attrition) and excellent calibration. The resulting scoring system offers a low-technology approach to identify participants at elevated risk for attrition in future similar behavioral modification intervention projects, which may inform appropriate allocation of retention resources. This approach also serves as a model for other efforts to prevent participant attrition. PMID:26768431

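A risk-scoring system of the kind derived above can be sketched by converting logistic-regression coefficients into integer points and a predicted probability. The coefficients below are hypothetical stand-ins, not the SDPI-DP fitted values:

```python
import math

# Hypothetical coefficients (illustration only; not the SDPI-DP fitted values)
coefs = {"male": 0.45, "age_under_50": 0.60, "low_income": 0.38,
         "chronic_pain": 0.52, "large_site": 0.70}
intercept = -1.8

def points(coefs):
    """Sullivan-style scoring: scale each beta by the smallest one, round to integers."""
    base = min(abs(b) for b in coefs.values())
    return {k: round(b / base) for k, b in coefs.items()}

def attrition_prob(profile, coefs, intercept):
    """Predicted attrition probability for a set of risk factors present."""
    z = intercept + sum(coefs[k] for k in profile)
    return 1.0 / (1.0 + math.exp(-z))

print(points(coefs))
print(round(attrition_prob({"male", "chronic_pain"}, coefs, intercept), 3))  # → 0.304
```

The integer point totals give the low-technology paper-and-pencil score; the logistic formula recovers the underlying probability when needed.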
  10. The Role of Social Relationships in Predicting Loneliness: The National Social Life, Health, and Aging Project

    ERIC Educational Resources Information Center

    Shiovitz-Ezra, Sharon; Leitsch, Sara A.

    2010-01-01

    The authors explore associations between objective and subjective social network characteristics and loneliness in later life, using data from the National Social Life, Health, and Aging Project, a nationally representative sample of individuals ages 57 to 85 in the United States. Hierarchical linear regression was used to examine the associations…

  11. Plate Boundaries and Earthquake Prediction. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  12. Applications systems verification and transfer project. Volume 8: Satellite snow mapping and runoff prediction handbook

    NASA Technical Reports Server (NTRS)

    Bowley, C. J.; Barnes, J. C.; Rango, A.

    1981-01-01

    The purpose of the handbook is to update the various snowcover interpretation techniques, document the snow mapping techniques used in the various ASVT study areas, and describe the ways snowcover data have been applied to runoff prediction. Through documentation in handbook form, the methodology developed in the Snow Mapping ASVT can be applied to other areas.

  13. Performance of the operational high-resolution numerical weather predictions of the Daphne project

    NASA Astrophysics Data System (ADS)

    Tegoulias, Ioannis; Pytharoulis, Ioannis; Karacostas, Theodore; Kartsios, Stergios; Kotsopoulos, Stelios; Bampzelis, Dimitrios

    2015-04-01

    In the framework of the DAPHNE project, the Department of Meteorology and Climatology (http://meteo.geo.auth.gr) of the Aristotle University of Thessaloniki, Greece, utilizes the nonhydrostatic Weather Research and Forecasting model with the Advanced Research dynamic solver (WRF-ARW) in order to produce high-resolution weather forecasts over Thessaly in central Greece. The aim of the DAPHNE project is to tackle the problem of drought in this area by means of weather modification. Cloud seeding helps convective clouds produce rain more efficiently or reduces hailstone size in favour of raindrops. The most favourable conditions for such a weather modification programme in Thessaly occur from March to October, when convective clouds are triggered more frequently. Three model domains, using 2-way telescoping nesting, cover: i) Europe, the Mediterranean Sea and northern Africa (D01), ii) Greece (D02) and iii) the wider region of Thessaly (D03; at selected periods) at horizontal grid spacings of 15 km, 5 km and 1 km, respectively. This research work intends to describe the atmospheric model setup and analyse its performance during a selected period of the operational phase of the project. The statistical evaluation of the high-resolution operational forecasts is performed using surface observations, gridded fields and radar data. Well-established point verification methods, combined with novel object-based methods, provide an in-depth analysis of the model skill. Spatial characteristics are adequately captured, but a variable time lag between forecast and observation is noted.
Acknowledgments: This research work has been co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness

  14. Predicting stroke through genetic risk functions: The CHARGE risk score project

    PubMed Central

    Ibrahim-Verbaas, Carla A; Fornage, Myriam; Bis, Joshua C; Choi, Seung Hoan; Psaty, Bruce M; Meigs, James B; Rao, Madhu; Nalls, Mike; Fontes, Joao D; O’Donnell, Christopher J.; Kathiresan, Sekar; Ehret, Georg B.; Fox, Caroline S; Malik, Rainer; Dichgans, Martin; Schmidt, Helena; Lahti, Jari; Heckbert, Susan R; Lumley, Thomas; Rice, Kenneth; Rotter, Jerome I; Taylor, Kent D; Folsom, Aaron R; Boerwinkle, Eric; Rosamond, Wayne D; Shahar, Eyal; Gottesman, Rebecca F.; Koudstaal, Peter J; Amin, Najaf; Wieberdink, Renske G.; Dehghan, Abbas; Hofman, Albert; Uitterlinden, André G; DeStefano, Anita L.; Debette, Stephanie; Xue, Luting; Beiser, Alexa; Wolf, Philip A.; DeCarli, Charles; Ikram, M. Arfan; Seshadri, Sudha; Mosley, Thomas H; Longstreth, WT; van Duijn, Cornelia M; Launer, Lenore J

    2014-01-01

    Background and Purpose Beyond the Framingham Stroke Risk Score (FSRS), prediction of future stroke may improve with a genetic risk score (GRS) based on single-nucleotide polymorphisms (SNPs) associated with stroke and its risk factors. Methods The study includes four population-based cohorts with 2,047 first incident strokes from 22,720 initially stroke-free European-origin participants aged 55 years and older, who were followed for up to 20 years. GRSs were constructed with 324 SNPs implicated in stroke and 9 risk factors. The association of the GRS with first incident stroke was tested using Cox regression; the predictive properties of the GRS were assessed with area under the curve (AUC) statistics comparing the GRS to age, sex, and FSRS models, and with reclassification statistics. These analyses were performed per cohort and in a meta-analysis of pooled data. Replication was sought in a case-control study of ischemic stroke (IS). Results In the meta-analysis, adding the GRS to the FSRS, age, and sex model resulted in a significant improvement in discrimination (all stroke: Δjoint AUC = 0.016, p = 2.3×10⁻⁶; IS: Δjoint AUC = 0.021, p = 3.7×10⁻⁷), although the overall AUC remained low. In all studies there was a highly significant improvement in the net reclassification index (p < 10⁻⁴). Conclusions The SNPs associated with stroke and its risk factors yield only a small improvement in prediction of future stroke compared with the classical epidemiological risk factors for stroke. PMID:24436238
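The two building blocks of this analysis, a weighted genetic risk score and the C-statistic (AUC) used to compare models, can be computed directly from pairwise concordance. The data below are simulated for illustration, not CHARGE data:

```python
import random

def grs(genotypes, weights):
    """Weighted genetic risk score: risk-allele counts (0/1/2) times log-odds weights."""
    return sum(w * g for g, w in zip(genotypes, weights))

def c_statistic(scores, events):
    """Probability that a random case outscores a random control (ties count half)."""
    cases = [s for s, e in zip(scores, events) if e]
    controls = [s for s, e in zip(scores, events) if not e]
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))

print(c_statistic([0.3, 0.7, 0.65, 0.6], [False, True, False, True]))  # → 0.75

rng = random.Random(2014)
weights = [rng.uniform(0.02, 0.15) for _ in range(324)]    # 324 SNPs, as in the study
geno = [[rng.randint(0, 2) for _ in weights] for _ in range(300)]
scores = [grs(g, weights) for g in geno]
median = sorted(scores)[150]
events = [s + rng.gauss(0, 1.0) > median for s in scores]  # risk loosely tied to GRS
print(round(c_statistic(scores, events), 2))               # modest discrimination
```

Comparing the C-statistic of a baseline model with that of baseline-plus-GRS gives the Δ AUC reported in the abstract.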

  15. Zsyntax: a formal language for molecular biology with projected applications in text mining and biological prediction.

    PubMed

    Boniolo, Giovanni; D'Agostino, Marcello; Di Fiore, Pier Paolo

    2010-03-03

    We propose a formal language that allows for transposing biological information precisely and rigorously into machine-readable information. This language, which we call Zsyntax (where Z stands for the Greek word ζωή, life), is grounded on a particular type of non-classical logic, and it can be used to write algorithms and computer programs. We present it as a first step towards a comprehensive formal language for molecular biology in which any biological process can be written and analyzed as a sort of logical "deduction". Moreover, we illustrate the potential value of this language, both in the field of text mining and in that of biological prediction.
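As a toy illustration of the idea of treating a biological process as a deduction (this is not the actual Zsyntax calculus, and the molecule names are invented for the example), interaction rules can be forward-chained until no new "theorem" appears:

```python
def derive(facts, rules):
    """Naive forward chaining: apply rules until no new molecular 'theorem' appears."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Toy pathway: a ligand and receptor combine into an activated complex,
# which then yields a phosphorylated downstream target (names illustrative).
rules = [(frozenset({"EGF", "EGFR"}), "EGF:EGFR"),
         (frozenset({"EGF:EGFR"}), "P-ERK")]
print(sorted(derive({"EGF", "EGFR"}, rules)))
# → ['EGF', 'EGF:EGFR', 'EGFR', 'P-ERK']
```

Each derived fact corresponds to a conclusion that the formal language would license from the stated molecular premises.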

  16. Future global and regional climate change: From near-term prediction to long-term projections (Invited)

    NASA Astrophysics Data System (ADS)

    Knutti, R.; Collins, M.; Power, S.; Kirtman, B. P.; Christensen, J. H.; Krishna Kumar, K.

    2013-12-01

    The IPCC AR5 assessed results from a hierarchy of climate models on how climate might change in the future, from decades to millennia. The projections are based on a new generation of climate models and a new set of scenarios. They are very consistent with the projections in AR4 and confirm widespread changes in the atmosphere, ocean, sea ice and land under emission scenarios without mitigation. In the late 21st century and beyond, the warming is dominated by the total emissions of CO2, and many changes will persist for centuries even if emissions were stopped. Stabilization of global temperature at 2°C above the preindustrial value, for example, requires strong emission reductions over the 21st century. In the near term and locally, however, interannual and decadal climate variability remains a large and mostly irreducible component of the uncertainty in projections. Improving the quality of information on regional climate change and improving the ability of the scientific community to perform near-term climate predictions are key challenges for the future. The development of a consensus in the climate science community on (i) the major directions for future model development and (ii) the scope of future coordinated model experiments will help serve the needs of both future IPCC assessments and the wider research community.

  17. Prenatal maternal stress predicts childhood asthma in girls: project ice storm.

    PubMed

    Turcotte-Tremblay, Anne-Marie; Lim, Robert; Laplante, David P; Kobzik, Lester; Brunet, Alain; King, Suzanne

    2014-01-01

    Little is known about how prenatal maternal stress (PNMS) influences risks of asthma in humans. In this small study, we sought to determine whether disaster-related PNMS would predict asthma risk in children. In June 1998, we assessed severity of objective hardship and subjective distress in women pregnant during the January 1998 Quebec Ice Storm. Lifetime asthma symptoms, diagnoses, and corticosteroid utilization were assessed when the children were 12 years old (N = 68). No effects of objective hardship or timing of the exposure were found. However, we found that, in girls only, higher levels of prenatal maternal subjective distress predicted greater lifetime risk of wheezing (OR = 1.11; 90% CI = 1.01-1.23), doctor-diagnosed asthma (OR = 1.09; 90% CI = 1.00-1.19), and lifetime utilization of corticosteroids (OR = 1.12; 90% CI = 1.01-1.25). Other perinatal and current maternal life events were also associated with asthma outcomes. Findings suggest that stress during pregnancy opens a window for fetal programming of immune functioning. A sex-based approach may be useful to examine how prenatal and postnatal environments combine to program the immune system. This small study needs to be replicated with a larger, more representative sample.
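Odds ratios with 90% intervals like those reported above come from exponentiating a logistic-regression coefficient and its Wald interval. The beta and standard error below are illustrative values chosen to land near the reported wheeze estimate, not the study's fitted numbers:

```python
import math

def odds_ratio_ci(beta, se, level=0.90):
    """Odds ratio and Wald confidence interval from a logistic-regression coefficient."""
    z = {0.90: 1.645, 0.95: 1.96}[level]   # normal quantile for the chosen level
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs reproducing an interval like OR = 1.11 (90% CI 1.01-1.23)
or_, lo, hi = odds_ratio_ci(beta=0.104, se=0.062)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Because the interval excludes 1 only narrowly, small-sample replication (as the authors note) matters before drawing firm conclusions.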

  18. The ChemScreen project to design a pragmatic alternative approach to predict reproductive toxicity of chemicals.

    PubMed

    van der Burg, Bart; Wedebye, Eva Bay; Dietrich, Daniel R; Jaworska, Joanna; Mangelsdorf, Inge; Paune, Eduard; Schwarz, Michael; Piersma, Aldert H; Kroese, E Dinant

    2015-08-01

    There is a great need for rapid testing strategies for reproductive toxicity testing, avoiding animal use. The EU Framework program 7 project ChemScreen aimed to fill this gap in a pragmatic manner preferably using validated existing tools and place them in an innovative alternative testing strategy. In our approach we combined knowledge on critical processes affected by reproductive toxicants with knowledge on the mechanistic basis of such effects. We used in silico methods for prescreening chemicals for relevant toxic effects aiming at reduced testing needs. For those chemicals that need testing we have set up an in vitro screening panel that includes mechanistic high throughput methods and lower throughput assays that measure more integrative endpoints. In silico pharmacokinetic modules were developed for rapid exposure predictions via diverse exposure routes. These modules to match in vitro and in vivo exposure levels greatly improved predictivity of the in vitro tests. As a further step, we have generated examples how to predict reproductive toxicity of chemicals using available data. We have executed formal validations of panel constituents and also used more innovative manners to validate the test panel using mechanistic approaches. We are actively engaged in promoting regulatory acceptance of the tools developed as an essential step towards practical application, including case studies for read-across purposes. With this approach, a significant saving in animal use and associated costs seems very feasible.

  19. NASA's Evolutionary Xenon Thruster (NEXT) Project Qualification Propellant Throughput Milestone: Performance, Erosion, and Thruster Service Life Prediction After 450 kg

    NASA Technical Reports Server (NTRS)

    Herman, Daniel A.

    2010-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) program is tasked with significantly improving and extending the capabilities of the current state-of-the-art NSTAR thruster. The service life capability of the NEXT ion thruster is being assessed through thruster wear testing and life modeling of critical thruster components, such as the ion optics and cathodes. The NEXT Long-Duration Test (LDT) was initiated to validate and qualify the NEXT thruster propellant throughput capability. The NEXT thruster completed the primary goal of the LDT, namely, to demonstrate the project qualification throughput of 450 kg by the end of calendar year 2009. The NEXT LDT has demonstrated 28,500 hr of operation and processed 466 kg of xenon throughput--more than double the throughput demonstrated by the NSTAR flight spare. Thruster performance changes have been consistent with a priori predictions. Thruster erosion has been minimal and consistent with the thruster service life assessment, which predicts the first failure mode at greater than 750 kg throughput. The life-limiting failure mode for NEXT is predicted to be loss of structural integrity of the accelerator grid due to erosion by charge-exchange ions.

  20. Joint Applications Pilot of the National Climate Predictions and Projections Platform and the North Central Climate Science Center: Delivering climate projections on regional scales to support adaptation planning

    NASA Astrophysics Data System (ADS)

    Ray, A. J.; Ojima, D. S.; Morisette, J. T.

    2012-12-01

    The DOI North Central Climate Science Center (NC CSC) and the NOAA/NCAR National Climate Predictions and Projections (NCPP) Platform have initiated a joint pilot study to collaboratively explore the "best available climate information" to support key land management questions and how to provide this information. NCPP's mission is to support state-of-the-art approaches to develop and deliver comprehensive regional climate information and facilitate its use in decision making and adaptation planning. This presentation will describe the evolving joint pilot as a tangible, real-world demonstration of linkages between climate science, ecosystem science and resource management. Our joint pilot is developing a deliberate, ongoing interaction to prototype how NCPP will work with CSCs to develop and deliver needed climate information products, including translational information to support climate data understanding and use. This pilot also will build capacity in the North Central CSC by working with NCPP to use climate information as input to ecological modeling. We will discuss lessons to date on developing and delivering needed climate information products based on this strategic partnership. Four projects have been funded to incorporate climate information into ecological modeling, which in turn will address key DOI stakeholder priorities in the region:
    Riparian Corridors: Projecting climate change effects on cottonwood and willow seed dispersal phenology, flood timing, and seedling recruitment in western riparian forests.
    Sage Grouse & Habitats: Integrating climate and biological data into land management decision models to assess species and habitat vulnerability.
    Grasslands & Forests: Projecting future effects of land management, natural disturbance, and CO2 on woody encroachment in the Northern Great Plains.
    The value of climate information: Supporting management decisions in the Plains and Prairie Potholes LCC.
    NCCSC's role in

  1. SWAT system performance predictions. Project report. [SWAT (Short-Wavelength Adaptive Techniques)

    SciTech Connect

    Parenti, R.R.; Sasiela, R.J.

    1993-03-10

    In the next phase of Lincoln Laboratory's SWAT (Short-Wavelength Adaptive Techniques) program, the performance of a 241-actuator adaptive-optics system will be measured using a variety of synthetic-beacon geometries. As an aid in this experimental investigation, a detailed set of theoretical predictions has also been assembled. The computational tools that have been applied in this study include a numerical approach, in which Monte-Carlo ray-trace simulations of accumulated phase error are developed, and an analytical analysis of the expected system behavior. This report describes the basis of these two computational techniques and compares their estimates of overall system performance. Although their regions of applicability tend to be complementary rather than redundant, good agreement is usually obtained when both sets of results can be derived for the same engagement scenario. Keywords: adaptive optics, phase conjugation, atmospheric turbulence, synthetic beacon, laser guide star.
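The Monte-Carlo-versus-analytic pairing can be illustrated on a standard adaptive-optics identity: for Gaussian residual phase error of variance σ², the Strehl ratio is approximately exp(−σ²) (the extended Maréchal approximation). A Monte-Carlo estimate should agree with the closed form; this is a generic sketch, not the SWAT ray-trace code:

```python
import cmath, math, random

def strehl_mc(sigma, n=200_000, seed=0):
    """Monte-Carlo Strehl estimate: |<exp(i*phi)>|^2 over Gaussian phase errors."""
    rng = random.Random(seed)
    acc = sum(cmath.exp(1j * rng.gauss(0.0, sigma)) for _ in range(n))
    return abs(acc / n) ** 2

sigma = 0.5                         # rad RMS of residual wavefront error
analytic = math.exp(-sigma ** 2)    # extended Marechal approximation
print(round(strehl_mc(sigma), 3), round(analytic, 3))
```

As in the report, the two estimates agree closely where both are applicable, which is the point of maintaining complementary numerical and analytical tools.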

  2. Zsyntax: A Formal Language for Molecular Biology with Projected Applications in Text Mining and Biological Prediction

    PubMed Central

    Boniolo, Giovanni; D'Agostino, Marcello; Di Fiore, Pier Paolo

    2010-01-01

    We propose a formal language that allows for transposing biological information precisely and rigorously into machine-readable information. This language, which we call Zsyntax (where Z stands for the Greek word ζωή, life), is grounded on a particular type of non-classical logic, and it can be used to write algorithms and computer programs. We present it as a first step towards a comprehensive formal language for molecular biology in which any biological process can be written and analyzed as a sort of logical “deduction”. Moreover, we illustrate the potential value of this language, both in the field of text mining and in that of biological prediction. PMID:20209084

  3. Predicting Function of Biological Macromolecules: A Summary of LDRD Activities: Project 10746

    SciTech Connect

    FRINK, LAURA J. D.; REMPE, SUSAN L.; MEANS, SHAWN A.; STEVENS, MARK J.; CROZIER, PAUL S.; MARTIN, MARCUS G.; SEARS, MARK P.; HJALMARSON, HAROLD P.

    2002-11-01

    This LDRD project has involved the development and application of Sandia's massively parallel materials modeling software to several significant biophysical systems. The team successfully applied the molecular dynamics code LAMMPS to modeling DNA, unstructured proteins, and lipid membranes. They developed and applied a coupled transport-molecular theory code (Tramonto) to study ion channel proteins, with gramicidin A as a prototype. They used the Towhee configurational-bias Monte-Carlo code to perform rigorous tests of biological force fields, and applied the MP-Salsa reacting-diffusion code to model cellular systems. Electroporation of cell membranes has also been studied, and detailed quantum mechanical studies of ion solvation have been performed. In addition, new molecular theory algorithms have been developed (in FasTram) that may ultimately make protein solvation calculations feasible on workstations. Finally, they have begun implementation of a combined molecular theory and configurational-bias Monte-Carlo code. They note that this LDRD has provided a basis for several new internal proposals (e.g., new LDRDs) and external proposals (e.g., 4 NIH proposals and a DOE Genomes to Life proposal).

  4. Prenatal maternal stress predicts autism traits in 6½ year-old children: Project Ice Storm.

    PubMed

    Walder, Deborah J; Laplante, David P; Sousa-Pires, Alexandra; Veru, Franz; Brunet, Alain; King, Suzanne

    2014-10-30

    Research implicates prenatal maternal stress (PNMS) as a risk factor for neurodevelopmental disorders; however few studies report PNMS effects on autism risk in offspring. We examined, prospectively, the degree to which objective and subjective elements of PNMS explained variance in autism-like traits among offspring, and tested moderating effects of sex and PNMS timing in utero. Subjects were 89 (46F/43M) children who were in utero during the 1998 Quebec Ice Storm. Soon after the storm, mothers completed questionnaires on objective exposure and subjective distress, and completed the Autism Spectrum Screening Questionnaire (ASSQ) for their children at age 6½. ASSQ scores were higher among boys than girls. Greater objective and subjective PNMS predicted higher ASSQ independent of potential confounds. An objective-by-subjective interaction suggested that when subjective PNMS was high, objective PNMS had little effect; whereas when subjective PNMS was low, objective PNMS strongly affected ASSQ scores. A timing-by-objective stress interaction suggested objective stress significantly affected ASSQ in first-trimester exposed children, though less so with later exposure. The final regression explained 43% of variance in ASSQ scores; the main effect of sex and the sex-by-PNMS interactions were not significant. Findings may help elucidate neurodevelopmental origins of non-clinical autism-like traits from a dimensional perspective.
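The reported objective-by-subjective interaction means the slope for objective stress depends on the level of subjective distress. With hypothetical coefficients (not the fitted Project Ice Storm values), the pattern described in the abstract looks like this:

```python
# Hypothetical regression coefficients illustrating the reported
# objective-by-subjective PNMS interaction (not the fitted study values).
b0, b_obj, b_subj, b_int = 10.0, 1.2, 0.8, -0.9

def predicted_assq(obj, subj):
    """Predicted ASSQ score under a linear model with an interaction term."""
    return b0 + b_obj * obj + b_subj * subj + b_int * obj * subj

# Marginal effect of objective stress at a given subjective level: b_obj + b_int*subj
for subj in (0.0, 1.0):
    slope = predicted_assq(1, subj) - predicted_assq(0, subj)
    print(subj, round(slope, 2))
```

With the negative interaction coefficient, objective stress matters strongly when subjective distress is low (slope 1.2) and much less when it is high (slope 0.3), mirroring the pattern the authors report.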

  5. LLPi: Liverpool Lung Project Risk Prediction Model for Lung Cancer Incidence.

    PubMed

    Marcus, Michael W; Chen, Ying; Raji, Olaide Y; Duffy, Stephen W; Field, John K

    2015-06-01

    Identification of high-risk individuals will facilitate early diagnosis, reduce overall costs, and also improve the current poor survival from lung cancer. The Liverpool Lung Project prospective cohort of 8,760 participants ages 45 to 79 years, recruited between 1998 and 2008, was followed annually through the hospital episode statistics until January 31, 2013. Cox proportional hazards models were used to identify risk predictors of lung cancer incidence. C-statistic was used to assess the discriminatory accuracy of the models. Models were internally validated using the bootstrap method. During mean follow-up of 8.7 years, 237 participants developed lung cancer. Age [hazard ratio (HR), 1.04; 95% confidence interval (CI), 1.02-1.06], male gender (HR, 1.48; 95% CI, 1.10-1.98), smoking duration (HR, 1.04; 95% CI, 1.03-1.05), chronic obstructive pulmonary disease (HR, 2.43; 95% CI, 1.79-3.30), prior diagnosis of malignant tumor (HR, 2.84; 95% CI, 2.08-3.89), and early onset of family history of lung cancer (HR, 1.68; 95% CI, 1.04-2.72) were associated with the incidence of lung cancer. The LLPi risk model had a good calibration (goodness-of-fit χ² = 7.58, P = 0.371). The apparent C-statistic was 0.852 (95% CI, 0.831-0.873) and the optimism-corrected bootstrap resampling C-statistic was 0.849 (95% CI, 0.829-0.873). The LLPi risk model may assist in identifying individuals at high risk of developing lung cancer in population-based screening programs.
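The optimism-corrected C-statistic mentioned above follows Harrell's bootstrap recipe: refit the model on each resample, score both the resample and the original data, and subtract the average optimism from the apparent value. A minimal sketch with simulated data and a trivial univariate stand-in "model" (so the correction is near zero; a real analysis would refit the Cox model inside `fit`):

```python
import random

def auc(scores, labels):
    """Concordance (C-statistic) by pairwise comparison; ties count half."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def fit(xs, ys):
    """Stand-in for model fitting; a real analysis refits the Cox model here."""
    return lambda x: x

def optimism_corrected(xs, ys, n_boot=100, seed=5):
    rng = random.Random(seed)
    apparent = auc([fit(xs, ys)(x) for x in xs], ys)
    optimism = 0.0
    for _ in range(n_boot):
        idx = [rng.randrange(len(xs)) for _ in xs]
        bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
        model = fit(bx, by)
        optimism += auc([model(x) for x in bx], by) - auc([model(x) for x in xs], ys)
    return apparent - optimism / n_boot

rng = random.Random(1)
xs = [rng.gauss(0, 1) for _ in range(200)]
ys = [x + rng.gauss(0, 1) > 0 for x in xs]   # outcome loosely tied to the predictor
print(round(optimism_corrected(xs, ys), 2))
```

The small gap the authors report between apparent (0.852) and corrected (0.849) C-statistics indicates little overfitting in the LLPi model.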

  6. Constructing Predictive Estimates for Worker Exposure to Radioactivity During Decommissioning: Analysis of Completed Decommissioning Projects - Master Thesis

    SciTech Connect

    Dettmers, Dana Lee; Eide, Steven Arvid

    2002-10-01

    An analysis of completed decommissioning projects is used to construct predictive estimates for worker exposure to radioactivity during decommissioning activities. The preferred organizational method for the completed decommissioning project data is to divide the data by type of facility, whether decommissioning was performed on part of the facility or the complete facility, and the level of radiation within the facility prior to decommissioning (low, medium, or high). Additional data analysis shows that there is not a downward trend in worker exposure data over time. Also, the use of a standard estimate for worker exposure to radioactivity may be a best estimate for low complete storage, high partial storage, and medium reactor facilities; a conservative estimate for some low level of facility radiation facilities (reactor complete, research complete, pits/ponds, other), medium partial process facilities, and high complete research facilities; and an underestimate for the remaining facilities. Limited data are available to compare different decommissioning alternatives, so the available data are reported and no conclusions can be drawn. It is recommended that all DOE sites and the NRC use a similar method to document worker hours, worker exposure to radiation (person-rem), and standard industrial accidents, injuries, and deaths for all completed decommissioning activities.

  7. Predicting the spatial extent of injection-induced zones of enhanced permeability at the Northwest Geysers EGS Demonstration Project

    SciTech Connect

    Rutqvist, J.; Oldenburg, C.M.; Dobson, P.F.

    2010-02-01

    We present the results of coupled thermal, hydraulic, and mechanical (THM) modeling of a proposed stimulation injection associated with an Enhanced Geothermal System (EGS) demonstration project at the northwest part of The Geysers geothermal field, California. The project aims at creating an EGS by directly and systematically injecting cool water at relatively low pressure into a known High Temperature (about 280 to 350 C) Zone (HTZ) located under the conventional (240 C) steam reservoir at depths below 3 km. Accurate micro-earthquake monitoring from the start of the injection will be used as a tool for tracking the development of the EGS. We first analyzed historic injection and micro-earthquake data from an injection well (Aidlin 11), located about 3 miles to the west of the new EGS demonstration area. Thereafter, we used the same modeling approach to predict the likely extent of the zone of enhanced permeability for a proposed initial injection in two wells (Prati State 31 and Prati 32) at the new EGS demonstration area. Our modeling indicates that the proposed injection scheme will provide additional steam production in the area by creating a zone of permeability enhancement extending about 0.5 km from each injection well which will connect to the overlying conventional steam reservoir.

  8. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    NASA Astrophysics Data System (ADS)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with current MEP systems through this shared experience. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean reduced the forecast errors compared with the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods. The significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
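The spread-growth and ensemble-mean checks described above reduce to a few lines of verification code; synthetic data stand in here for the interpolated B08RDP grids:

```python
import math, random

def rmse(forecast, obs):
    """Root-mean-square error of a forecast field against observations."""
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, obs)) / len(obs))

def ensemble_mean(members):
    return [sum(v) / len(v) for v in zip(*members)]

def spread(members):
    """Average ensemble standard deviation over all verification points."""
    mean = ensemble_mean(members)
    n = len(members)
    var = [sum((m[i] - mean[i]) ** 2 for m in members) / (n - 1)
           for i in range(len(mean))]
    return math.sqrt(sum(var) / len(var))

rng = random.Random(7)
obs = [rng.gauss(0, 1) for _ in range(500)]                    # 'analysis' field
members = [[o + rng.gauss(0, 0.8) for o in obs] for _ in range(10)]
control = members[0]
print(round(spread(members), 2))
print(rmse(ensemble_mean(members), obs) < rmse(control, obs))  # → True
```

Because member errors partially cancel, the ensemble mean beats the single control forecast, which is exactly the behavior the verification confirmed for all six systems.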

  9. The PredictAD project: development of novel biomarkers and analysis software for early diagnosis of the Alzheimer's disease.

    PubMed

    Antila, Kari; Lötjönen, Jyrki; Thurfjell, Lennart; Laine, Jarmo; Massimini, Marcello; Rueckert, Daniel; Zubarev, Roman A; Orešič, Matej; van Gils, Mark; Mattila, Jussi; Hviid Simonsen, Anja; Waldemar, Gunhild; Soininen, Hilkka

    2013-04-01

    Alzheimer's disease (AD) is the most common cause of dementia, affecting 36 million people worldwide. As the demographic transition in the developed countries progresses towards an older population, the worsening ratio of workers per retirees and the growing number of patients with age-related illnesses such as AD will challenge the current healthcare systems and national economies. For these reasons AD has been identified as a health priority, and various methods for diagnosis and many candidates for therapies are under intense research. Even though there is currently no cure for AD, its effects can be managed. Today the significance of early and precise diagnosis of AD is emphasized in order to minimize its irreversible effects on the nervous system. When new drugs and therapies enter the market it is also vital to effectively identify the right candidates to benefit from these. The main objective of the PredictAD project was to find and integrate efficient biomarkers from heterogeneous patient data to enable early diagnosis and to monitor the progress of AD in a more efficient, reliable and objective manner. The project focused on discovering biomarkers from biomolecular data, electrophysiological measurements of the brain and structural, functional and molecular brain images. We also designed and built a statistical model and a framework for exploiting these biomarkers with other available patient history and background data. We were able to discover several potential novel biomarker candidates and implement the framework in software. The results are currently used in several research projects, licensed for commercial use and being tested for clinical use in several trials.

  10. The PredictAD project: development of novel biomarkers and analysis software for early diagnosis of the Alzheimer's disease

    PubMed Central

    Antila, Kari; Lötjönen, Jyrki; Thurfjell, Lennart; Laine, Jarmo; Massimini, Marcello; Rueckert, Daniel; Zubarev, Roman A.; Orešič, Matej; van Gils, Mark; Mattila, Jussi; Hviid Simonsen, Anja; Waldemar, Gunhild; Soininen, Hilkka

    2013-01-01

    Alzheimer's disease (AD) is the most common cause of dementia, affecting 36 million people worldwide. As the demographic transition in developed countries progresses towards an older population, the worsening ratio of workers to retirees and the growing number of patients with age-related illnesses such as AD will challenge current healthcare systems and national economies. For these reasons AD has been identified as a health priority, and various methods for diagnosis and many candidate therapies are under intense research. Even though there is currently no cure for AD, its effects can be managed. Today, early and precise diagnosis of AD is emphasized in order to minimize its irreversible effects on the nervous system. When new drugs and therapies enter the market, it is also vital to identify effectively the right candidates to benefit from them. The main objective of the PredictAD project was to find and integrate efficient biomarkers from heterogeneous patient data to enable early diagnosis and to monitor the progression of AD in a more efficient, reliable and objective manner. The project focused on discovering biomarkers from biomolecular data, electrophysiological measurements of the brain, and structural, functional and molecular brain images. We also designed and built a statistical model and a framework for exploiting these biomarkers together with available patient history and background data. We discovered several potential novel biomarker candidates and implemented the framework in software. The results are currently used in several research projects, licensed for commercial use, and being tested for clinical use in several trials. PMID:24427524

  11. Seismic and sub-seismic deformation prediction for the assessment of possible pathways - the joint project PROTECT

    NASA Astrophysics Data System (ADS)

    Krawczyk, C. M.; Tanner, D. C.

    2014-12-01

    In the joint project PROTECT (PRediction Of deformation To Ensure Carbon Traps), we determine the specific potential of communicating systems that occur between reservoir and surface in the framework of CO2 underground storage. The development of a new seismo-mechanical workflow permits an estimation of long-term storage integrity. The study target in the Otway Basin in south-western Victoria is Australia's first demonstration of the deep geological storage of CO2, operated by the CO2CRC. Our objective is to predict and quantify the distribution and amount of sub-seismic and seismic strain caused by fault movement in the proximity of the reservoir. For this purpose, we applied three independent approaches to fill the sub-seismic space, and validated them. Firstly, we built a geometrical kinematic 3-D depth model based on 2-D and 3-D seismic data provided by the CO2CRC Consortium. This interpretation was stabilized by additional seismic attribute processing that images small-scale lineaments at high resolution through multi-attribute displays combining curvature and coherency. Retro-deformation, i.e. kinematically restoring the strata in 3-D, was performed on the reservoir seal. The highest strain magnitudes are around 4-5%. They are observed not where the fault displacements are highest, but where the fault morphologies are most complex, i.e., where there are rapid changes in displacement along the fault plane. Benchmarking this approach with numerical forward modelling yields further constraints on stress variation. In areas where we had preliminarily predicted critical deformation, we carried out new reflection seismic measurements to calibrate our predictions. This not only yields high-resolution structural images, but also allows us to determine petrophysical parameters through the acquisition of shear-wave reflection seismic data. With this seismo-mechanical workflow we obtain a better overview of possible fluid migration pathways and communication

  12. Palomar project: predicting school renouncing dropouts, using the artificial neural networks as a support for educational policy decisions.

    PubMed

    Carbone, V; Piras, G

    1998-02-01

    The "Palomar" project confronts two problem situations that are partly independent and partly connected to the Italian schooling system: unstable participation in school such as drop out and educational guidance. Our concern is that of a set of phenomena which consists of ceasing compulsory education, repetition of a year at school, school "drop outs", irregular compulsory attendance and delays in the course of studies. The "Palomar" project is designed to offer educators and administrators who want to effectively intervene with these complex problems to furnish school guidance services as an instrument able to: 1. Predict: creating a system able to predict in advance (not in a "cause-effect" way but as an approximation): a) which students are at "risk" for school destabilization or failure; b) what are the prototypical characteristics of these students; c) which students among those studied are more likely to "destabilize" or fail in school; in which course of study does each student have the greatest chance of success; d) which, among the variables studied and appropriately weighted for each student, will predict the successful grade, analyzed for each possible course of studies. 2. Optimize: selecting and focusing on a student on the basis of the information given. It is possible: a) to point out which personal factors (relational, familial, student, disciplinary, economical) need to be reinforced in order to improve the school performances of each selected student, both to prevent or limit "dropping out" desertion or failure and to raise the performances in the chosen school course as much as possible; b) on the basis of what was mentioned above, to simulate the possible support measures to increase the efficacy of the considered intervention; c) to choose for each student the appropriate intervention strategy capable of obtaining the maximum result and the maximum efficacy in the given conditions. 3. 
Verify: when the strategy of intervention has been decided

  14. The eTOX data-sharing project to advance in silico drug-induced toxicity prediction.

    PubMed

    Cases, Montserrat; Briggs, Katharine; Steger-Hartmann, Thomas; Pognan, François; Marc, Philippe; Kleinöder, Thomas; Schwab, Christof H; Pastor, Manuel; Wichard, Jörg; Sanz, Ferran

    2014-01-01

    The high-quality in vivo preclinical safety data produced by the pharmaceutical industry during drug development, which follows numerous strict guidelines, are mostly not available in the public domain. These safety data are sometimes published as a condensed summary for the few compounds that reach the market, but the majority of studies are never made public and are often difficult to access in an automated way, sometimes even within the owning company itself. It is evident from many academic and industrial examples that useful data mining and model development require large and representative data sets and careful curation of the collected data. In 2010, under the auspices of the Innovative Medicines Initiative, the eTOX project started with the objective of extracting and sharing preclinical study data from the paper or PDF archives of the toxicology departments of the 13 participating pharmaceutical companies, and of using such data to establish a detailed, well-curated database, which could then serve as a source for read-across approaches (early assessment of the potential toxicity of a drug candidate by comparison with compounds of similar structure and/or effects) and for training predictive models. The paper describes the efforts undertaken to allow effective data sharing (intellectual property (IP) protection and the set-up of adequate controlled vocabularies) and to establish the database (currently with over 4000 studies contributed by the pharma companies, corresponding to more than 1400 compounds). In addition, the status of predictive model building and some specific features of the eTOX predictive system (eTOXsys) are presented as knowledge-based decision-support tools for the early stages of the drug development process. PMID:25405742

  15. The eTOX Data-Sharing Project to Advance in Silico Drug-Induced Toxicity Prediction

    PubMed Central

    Cases, Montserrat; Briggs, Katharine; Steger-Hartmann, Thomas; Pognan, François; Marc, Philippe; Kleinöder, Thomas; Schwab, Christof H.; Pastor, Manuel; Wichard, Jörg; Sanz, Ferran

    2014-01-01

    The high-quality in vivo preclinical safety data produced by the pharmaceutical industry during drug development, which follows numerous strict guidelines, are mostly not available in the public domain. These safety data are sometimes published as a condensed summary for the few compounds that reach the market, but the majority of studies are never made public and are often difficult to access in an automated way, sometimes even within the owning company itself. It is evident from many academic and industrial examples that useful data mining and model development require large and representative data sets and careful curation of the collected data. In 2010, under the auspices of the Innovative Medicines Initiative, the eTOX project started with the objective of extracting and sharing preclinical study data from the paper or PDF archives of the toxicology departments of the 13 participating pharmaceutical companies, and of using such data to establish a detailed, well-curated database, which could then serve as a source for read-across approaches (early assessment of the potential toxicity of a drug candidate by comparison with compounds of similar structure and/or effects) and for training predictive models. The paper describes the efforts undertaken to allow effective data sharing (intellectual property (IP) protection and the set-up of adequate controlled vocabularies) and to establish the database (currently with over 4000 studies contributed by the pharma companies, corresponding to more than 1400 compounds). In addition, the status of predictive model building and some specific features of the eTOX predictive system (eTOXsys) are presented as knowledge-based decision-support tools for the early stages of the drug development process. PMID:25405742

  16. The KIAPS global NWP model development project at the Korea Institute of Atmospheric Prediction Systems (KIAPS.org)

    NASA Astrophysics Data System (ADS)

    Kim, Young-Joon; Shin, Dong-Wook; Jin, Emilia; Oh, Tae-Jin; Song, Hyo-Jong; Song, In-Sun

    2013-04-01

    A nine-year project to develop Korea's own global Numerical Weather Prediction (NWP) system was launched in 2011 by the Korea Meteorological Administration (KMA) with total funding of about 100 million US dollars. For this task, the Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded by KMA as an independent, non-profit organization. The project consists of three main stages. The first stage (2011-2013) is to set up the Institute, recruit researchers, lay out plans for research and development, design the basic structure, and explore/develop core NWP technologies. The second stage (2014-2016) aims at developing the basic modules for the dynamical core, physical parameterizations and data assimilation systems, as well as the applied modules for the system framework and the couplers that connect the basic modules and external models, in a systematic and efficient way. The third stage (2017-2019) is for validating the prototype NWP system built in stage 2, including necessary post-processing systems, by selecting/improving modules and refining/finalizing the system for operational use at KMA. KIAPS designed key modules for the dynamical core by adopting existing cores and/or developing new ones, and developed first a barotropic model and later a baroclinic model, with code parallelization and optimization in mind. Various physical parameterization schemes, including those used operationally in NWP models as well as those developed by Korean scientists, are being evaluated and improved using single-column and LES models, explicit simulations, etc. The control variables for the variational data assimilation system and the testbeds for observational data pre-processing have been designed, the linear models for a barotropic system have been constructed, and the modules for cost-function minimization have been developed. The module framework, which is flexible for prognostic and diagnostic variables, is being developed, the I

  17. The Oxfordshire Community Stroke Project classification system predicts clinical outcomes following intravenous thrombolysis: a prospective cohort study

    PubMed Central

    Yang, Yuling; Wang, Anxin; Zhao, Xingquan; Wang, Chunxue; Liu, Liping; Zheng, Huaguang; Wang, Yongjun; Cao, Yibin; Wang, Yilong

    2016-01-01

    Background The Oxfordshire Community Stroke Project (OCSP) classification system is a simple stroke classification system that can be used to predict clinical outcomes. In this study, we compare the safety and efficacy of intravenous thrombolysis in Chinese stroke patients categorized using the OCSP classification system. Patients and methods We collected data from the Thrombolysis Implementation and Monitoring of Acute Ischemic Stroke in China registry. A total of 1,115 patients treated with intravenous thrombolysis with alteplase within 4.5 hours of stroke onset were included. Symptomatic intracranial hemorrhage (SICH), mortality, and 90-day functional outcomes were compared between the stroke patients with different stroke subtypes. Results Of the 1,115 patients included in the cohort, 197 (17.67%) were classified with total anterior circulation infarct (TACI), 700 (62.78%) with partial anterior circulation infarct, 153 (13.72%) with posterior circulation infarct, and 65 (5.83%) with lacunar infarct. After multivariable adjustment, compared to the patients with non-TACI, those with TACI had a significantly increased risk of SICH (odds ratio [OR] 8.80; 95% confidence interval [CI] 2.84–27.25, P<0.001), higher mortality (OR 5.24; 95% CI 3.19–8.62; P<0.001), and poor functional independence (OR 0.38; 95% CI 0.26–0.56; P<0.001) at 3-month follow-up. Conclusion After thrombolysis, the patients with TACI exhibited greater SICH, a higher mortality rate, and worse 3-month clinical outcomes compared with the patients with non-TACI. The OCSP classification system may help clinicians predict the safety and efficacy of thrombolysis. PMID:27418829
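
    The adjusted odds ratios above come from multivariable regression, but the crude version of such an estimate can be computed directly from a 2x2 table. A minimal sketch with hypothetical counts (not the registry's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical SICH counts for TACI vs. non-TACI patients:
print(odds_ratio_ci(12, 185, 9, 909))
```

    A CI whose lower bound stays above 1 indicates a statistically significant excess risk, which is the pattern the study reports for the TACI subtype.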

  18. Watershed-scale evaluation of the Water Erosion Prediction Project (WEPP) model in the Lake Tahoe basin

    NASA Astrophysics Data System (ADS)

    Brooks, Erin S.; Dobre, Mariana; Elliot, William J.; Wu, Joan Q.; Boll, Jan

    2016-02-01

    Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to forested watersheds, we developed and assessed new approaches for simulating streamflow and sediment transport from large watersheds using WEPP. We created specific algorithms to spatially distribute soil, climate, and management input files for all the subwatersheds within the basin. The model enhancements were tested on five geologically and climatically diverse watersheds in the Lake Tahoe basin, USA. The model was run with minimal calibration to assess WEPP's ability as a physically based model to predict streamflow and sediment delivery. The performance of the model was examined against 17 years of observed snow water equivalent depth, streamflow, and sediment load data. Only region-wide baseflow recession parameters related to the geology of the basin were calibrated with observed streamflow data. Close agreement between simulated and observed snow water equivalent, streamflow, and the distribution of fine (<20 μm) and coarse (>20 μm) sediments was achieved at each of the major watersheds located in the high-precipitation regions of the basin. Sediment load was adequately simulated in the drier watersheds; however, annual streamflow was overestimated. With the exception of the drier eastern region, the model demonstrated no loss in accuracy when applied without calibration to multiple watersheds across the Lake Tahoe basin, demonstrating the utility of the model as a management tool in gauged and ungauged basins.
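
    The abstract reports close agreement between simulated and observed streamflow without naming a metric; the Nash-Sutcliffe efficiency (NSE) is the conventional choice for such comparisons (an assumption here, not something the paper states). A minimal sketch with invented flows:

```python
def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 = perfect, 0 = no better than
    predicting the observed mean, negative = worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

# Hypothetical monthly streamflow (m^3/s), not the Tahoe record:
obs = [3.1, 5.4, 9.8, 14.2, 10.6, 6.3, 3.9, 2.8]
sim = [2.9, 5.9, 9.1, 13.5, 11.4, 6.0, 4.2, 3.1]
print(round(nse(sim, obs), 3))
```

    Because NSE normalizes by the variance of the observations, it can be compared across the five test watersheds even though their absolute flows differ.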

  19. Summary of ground motion prediction results for Nevada Test Site underground nuclear explosions related to the Yucca Mountain project

    SciTech Connect

    Walck, M.C.

    1996-10-01

    This report summarizes available data on ground motions from underground nuclear explosions recorded on and near the Nevada Test Site, with emphasis on the ground motions recorded at stations on Yucca Mountain, the site of a potential high-level radioactive waste repository. Sandia National Laboratories, through the Weapons Test Seismic Investigations project, collected and analyzed ground motion data from NTS explosions over a 14-year period, from 1977 through 1990. By combining these data with available data from earlier, larger explosions, prediction equations for several ground motion parameters have been developed for the Test Site area for underground nuclear explosion sources. Also presented are available analyses of the relationship between surface and downhole motions and spectra and relevant crustal velocity structure information for Yucca Mountain derived from the explosion data. The data and associated analyses demonstrate that ground motions at Yucca Mountain from nuclear tests have been at levels lower than would be expected from moderate to large earthquakes in the region; thus nuclear explosions, while located relatively close, would not control seismic design criteria for the potential repository.

  20. The value of selected in vitro and in silico methods to predict acute oral toxicity in a regulatory context: results from the European Project ACuteTox.

    PubMed

    Prieto, P; Kinsner-Ovaskainen, A; Stanzel, S; Albella, B; Artursson, P; Campillo, N; Cecchelli, R; Cerrato, L; Díaz, L; Di Consiglio, E; Guerra, A; Gombau, L; Herrera, G; Honegger, P; Landry, C; O'Connor, J E; Páez, J A; Quintas, G; Svensson, R; Turco, L; Zurich, M G; Zurbano, M J; Kopp-Schneider, A

    2013-06-01

    ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results gained from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach reduces the number of chemicals wrongly predicted as not classified (LD50>2000 mg/kg b.w.).
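
    A two-step tiered strategy of the kind described, a cytotoxicity-based LD50 estimate followed by a neurotoxicity-alert override, can be sketched as below. The regression coefficients and the override rule are illustrative assumptions, not the project's validated strategy:

```python
import math

def predicted_ld50(ic50):
    """Step 1: estimate LD50 from 3T3 NRU cytotoxicity (IC50) via a
    linear log-log regression; the coefficients are illustrative."""
    return 10 ** (0.435 * math.log10(ic50) + 0.625)

def clp_category(ld50, neurotox_alert=False):
    """Step 2: map the estimate onto the EU acute-oral bands (mg/kg b.w.).
    A neurotoxicity alert demotes 'not classified' one tier -- an invented
    stand-in for the abstract's heuristic override."""
    for cut, cat in [(5, "Cat 1"), (50, "Cat 2"), (300, "Cat 3"), (2000, "Cat 4")]:
        if ld50 <= cut:
            return cat
    return "Cat 4" if neurotox_alert else "not classified"

print(clp_category(predicted_ld50(500.0)))
```

    The point of the tiered design is visible in the second function: the expensive neurotoxicity step only changes the outcome for chemicals the cytotoxicity step would otherwise leave unclassified.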

  1. The sensitivity of predicted carbon sequestered by a sustainable forestry management project -- An example from the Sierra Gorda Queretana, Mexico

    SciTech Connect

    Bird, D.N.; Ruiz Corzo, M.I.

    1997-12-31

    Joint Implementation (JI) projects that capture carbon through sequestration are believed by investors to be more risky than other projects that reduce greenhouse gas emissions, because of their long lifetime, complicated nature, numerous poorly defined input parameters, and perceived high costs of monitoring. Whereas the first factors are real, sensitivity analysis can help reduce these costs by focusing one's attention on the important parameters. Secondly, sensitivity analysis can be used to improve program design. Finally, one can also create a distribution of possible outcomes for the project. The carbon flux model proposed by Schlamadinger and Marland (1) is used to calculate the amount of carbon sequestered by a forest management project. Using simple sensitivity analysis, the model has been extended to create tornado diagrams and probability distributions. Analysis of these data has led to a focus on estimates of the important variables, an understanding of the time-value of money, and the possibility of project redesign by the operating Non-Governmental Organization (NGO). The project used in this discussion is a forestry management program supervised by Grupo Ecologico Sierra Gorda, A.C. Its goal is to create a sustainable forestry practice in the Sierra Gorda Queretana, Mexico. It is a 25-year project involving replanting of 1,000 ha/year for seven years and natural reforestation of a further 1,000 ha/year of marginal farmland for seven years. These two components of the project sequester 1.1 million tonnes of carbon and bring $260 million to the region. A forestry protection program sequesters a further 0.8 million tonnes of carbon at marginal cost. The project is anomalous for a sequestration project in that it makes money, with a 20.5% rate of return.
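
    The tornado-diagram construction described above amounts to one-at-a-time sensitivity sweeps over the carbon model. A toy sketch (the model and the parameter ranges are invented, not Schlamadinger and Marland's):

```python
def sequestered_c(area_ha, growth_t_ha_yr, years, survival):
    """Toy stand-in for the carbon-flux model: tonnes C sequestered."""
    return area_ha * growth_t_ha_yr * years * survival

base = dict(area_ha=7000, growth_t_ha_yr=1.5, years=25, survival=0.8)
ranges = {                      # hypothetical low/high bounds per input
    "growth_t_ha_yr": (1.0, 2.0),
    "survival": (0.6, 0.95),
    "years": (20, 25),
}

# One-at-a-time swings, sorted largest first (tornado-diagram order):
swings = []
for name, (lo, hi) in ranges.items():
    out = [sequestered_c(**dict(base, **{name: v})) for v in (lo, hi)]
    swings.append((name, abs(out[1] - out[0])))
swings.sort(key=lambda s: s[1], reverse=True)
for name, swing in swings:
    print(name, round(swing))
```

    The sorted swings are exactly the bar lengths of a tornado diagram: monitoring effort can then concentrate on the one or two inputs at the top of the list.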

  2. Next generation paradigm for urban pluvial flood modelling, prediction, management and vulnerability reduction - Interaction between RainGain and Blue Green Dream projects

    NASA Astrophysics Data System (ADS)

    Maksimovic, C.

    2012-04-01

    The effects of climate change and increasing urbanisation call for a new paradigm for efficient planning, management and retrofitting of urban developments, to increase resilience to climate change and to maximize ecosystem services. Improved management of urban floods from all sources is required. The time scale of well-documented fluvial and coastal floods allows for timely response, but surface (pluvial) flooding caused by intense local storms has not been given appropriate attention (Pitt Review, UK). Urban surface flood prediction requires fine-scale data and model resolutions, and has to be tackled locally by combining central inputs (meteorological services) with the efforts of local entities. Although a significant breakthrough in the modelling of pluvial flooding has been made, there is a need to further enhance short-term prediction of both rainfall and surface flooding. These issues are dealt with in the EU Interreg project RainGain (RG). A breakthrough in urban flood mitigation can only be achieved by the combined effects of advanced planning, design, construction and management of urban water (blue) assets in interaction with urban vegetated (green) assets. Changes in the design and operation of blue and green assets, which currently operate as two separate systems, are urgently required. Gaps in knowledge and technology will be addressed by EIT Climate-KIC's Blue Green Dream (BGD) project. The RG and BGD projects provide synergy between the "decoupled" blue and green systems to enhance multiple benefits for urban amenity, flood management, heat island mitigation, biodiversity, and resilience to drought (and thus energy requirements), increasing the quality of urban life at lower cost. Urban pluvial flood management will address two priority areas: short-term rainfall forecast and short-term surface flood forecast. A spatial resolution of short-term rainfall forecasts below 0.5 km2 and a lead time of a few hours are needed. 
Improvements are achievable by combining data sources of raingauge networks

  3. An Analysis of Predicted vs. Monitored Space Heat Energy Use in 120 Homes : Residential Construction Demonstration Project Cycle II.

    SciTech Connect

    Douglass, John G.; Young, Marvin; Washington State Energy Office.

    1991-10-01

    The SUNDAY thermal simulation program was used to predict space heat energy consumption for 120 energy-efficient homes. The predicted data were found to explain 43.8 percent of the variation in monitored space heat consumption. Using a paired Student's t-test, no statistically significant difference could be found between mean predicted and mean monitored space heat for the entire sample of homes. The homes were grouped into seven classes, sub-sampled by total heat loss coefficient. An intermediate class (UA = 300--350 Btu/°F) was found to significantly over-predict space heat by 25 percent. The same class was over-predicted by 16 percent in the analogous Cycle 1 research, but the sample size was smaller and this was not found to be statistically significant. Several variables that were not directly included as inputs to the simulation were examined with an analysis-of-covariance model for their ability to improve the simulation's prediction of space heat. The variables having the greatest effect were conditioned floor area, heating system type, and foundation type. The model was able to increase the coefficient of determination from 0.438 to 0.670, a 54 percent increase. While the SUNDAY simulation program in aggregate is able to predict space heat consumption, it should be noted that there is a considerable amount of variation in both the monitored space heat consumption and the SUNDAY predictions. The ability of the program to accurately model an individual house will be constrained by both the quality of the input variables and the range of occupant behavior. These constraints apply to any building model.
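
    The paired Student's t-test used above compares per-home predicted and monitored consumption. A minimal sketch with hypothetical values (the study's data are not reproduced here):

```python
import math
import statistics

def paired_t(pred, obs):
    """Paired t statistic for predicted vs. monitored values:
    t = mean(d) / (stdev(d) / sqrt(n)) on per-home differences d."""
    d = [p - o for p, o in zip(pred, obs)]
    n = len(d)
    t = statistics.mean(d) / (statistics.stdev(d) / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# Hypothetical space-heat values (kWh/yr), not the study's data:
pred = [9800, 11200, 8700, 13100, 10400, 9900]
obs  = [10100, 10900, 9000, 12800, 10800, 9700]
t, df = paired_t(pred, obs)
print(round(t, 3), df)
```

    A |t| below the critical value for the given degrees of freedom is the "no statistically significant difference" outcome the study reports for the full sample.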

  5. An analysis of predicted vs monitored space heat energy use in 83 homes. Residential Construction Demonstration Project

    SciTech Connect

    Downey, P.K.

    1989-08-01

    In 1983 the Northwest Power Planning Council (NWPPC) directed the Bonneville Power Administration to create the Residential Standards Demonstration Program to demonstrate actual construction using the Model Conservation Standards (MCS) and to collect cost and thermal data in residential structures. Much information was gained from that program, and as a consequence, the MCS were reevaluated and updated. A second program, the Residential Construction Demonstration Project was created to further investigate residential energy efficiency measures for both cost and thermal performance. The Residential Construction Demonstration Project was administered by the Washington State Energy Office in conjunction with the Idaho Department of Water Resources, the Montana Department of Natural Resources and Conservation, and the Oregon Department of Energy. This analysis is based upon information collected during the first phase of the Residential Construction Demonstration Project (RCDP).

  6. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
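
    The abstract does not give the exact extreme-value formalism, but a common sketch of the idea is a Gumbel (Type I extreme value) fit to annual-maximum rain rates by the method of moments; the gauge record below is invented for illustration.

```python
import math
from statistics import mean, stdev

EULER_GAMMA = 0.5772156649015329

def gumbel_mom(annual_maxima):
    """Method-of-moments Gumbel fit: scale = s*sqrt(6)/pi,
    location = mean - gamma*scale."""
    beta = stdev(annual_maxima) * math.sqrt(6.0) / math.pi
    mu = mean(annual_maxima) - EULER_GAMMA * beta
    return mu, beta

def exceedance_prob(x, mu, beta):
    """P(annual-maximum rain rate exceeds x) under the fitted Gumbel."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical annual-maximum point rain rates (mm/h) from a gauge record.
maxima = [62.0, 75.0, 58.0, 91.0, 70.0, 66.0, 84.0, 73.0]
mu, beta = gumbel_mom(maxima)
```

The exceedance probabilities from such a fit are what a link-budget calculation would translate into attenuation statistics at a given frequency.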

  7. Development of a Stochastic Inversion Tool To Optimize Agreement Between The Observed And Predicted Seismic Response To CO2 Injection/Migration in the Weyburn-Midale Project

    SciTech Connect

    Ramirez, A L; Hao, Y; White, D; Carle, S; Dyer, K; Yang, X; Mcnab, W; Foxall, W; Johnson, J

    2009-12-02

    During Phase 1 of the Weyburn Project (2000-2004), 4D reflection seismic data were used to map CO2 migration within the Midale reservoir, while an extensive fluid sampling program documented the geochemical evolution triggered by CO2-brine-oil-mineral interactions. The aim of this task (3b.11) is to exploit these existing seismic and geochemical data sets, augmented by CO2/H2O injection and HC/H2O production data toward optimizing the reservoir model and thereby improving site characterization and dependent predictions of long-term CO2 storage in the Weyburn-Midale reservoir. Our initial project activities have concentrated on developing a stochastic inversion method that will identify reservoir models that optimize agreement between the observed and predicted seismic response. This report describes the technical approach we have followed, the data that supports it, and associated implementation activities. The report fulfills deliverable D1 in the project's statement of work. Future deliverables will describe the development of the stochastic inversion tool that uses geochemical data to optimize the reservoir model.
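
    As an illustrative sketch of the stochastic-inversion idea (not the project's actual tool), a minimal Metropolis accept/reject loop can search for a reservoir parameter that minimizes a data misfit; the one-parameter forward model below is a toy assumption.

```python
import math
import random

def metropolis(misfit, propose, x0, n_iter, temp=1.0, seed=0):
    """Minimal Metropolis sampler: always accept a downhill move, accept an
    uphill move with probability exp(-(delta misfit)/temp)."""
    rng = random.Random(seed)
    x, f = x0, misfit(x0)
    best, best_f = x, f
    for _ in range(n_iter):
        cand = propose(x, rng)
        fc = misfit(cand)
        if fc < f or rng.random() < math.exp(-(fc - f) / temp):
            x, f = cand, fc
            if f < best_f:
                best, best_f = x, f
    return best, best_f

# Toy "reservoir": find the permeability multiplier k whose predicted
# response (1.5*k, an invented forward model) matches an observed value.
observed = 3.0
misfit = lambda k: (k * 1.5 - observed) ** 2
propose = lambda k, rng: k + rng.gauss(0.0, 0.2)
best_k, best_f = metropolis(misfit, propose, 1.0, 2000, temp=0.01)
```

A real inversion would replace the lambda with a multi-phase flow/seismic forward simulation and sample many parameters, but the accept/reject logic is the same.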

  8. Can Online Discussion Participation Predict Group Project Performance? Investigating the Roles of Linguistic Features and Participation Patterns

    ERIC Educational Resources Information Center

    Yoo, Jaebong; Kim, Jihie

    2014-01-01

    Although many college courses adopt online tools such as Q&A online discussion boards, there is no easy way to measure or evaluate their effect on learning. As a part of supporting instructional assessment of online discussions, we investigate a predictive relation between characteristics of discussion contributions and student performance.…

  9. Drug biokinetic and toxicity assessments in rat and human primary hepatocytes and HepaRG cells within the EU-funded Predict-IV project.

    PubMed

    Mueller, Stefan O; Guillouzo, André; Hewitt, Philip G; Richert, Lysiane

    2015-12-25

    The overall aim of Predict-IV (EU-funded collaborative project #202222) was to develop improved testing strategies for drug safety in the late discovery phase. One major focus was the prediction of hepatotoxicity, as the liver remains one of the major organs leading to failure in drug development and drug withdrawal, and hepatotoxicity is poorly predicted by animal experiments. In this overview we describe the use and applicability of the three cell models employed, i.e., primary rat hepatocytes, primary human hepatocytes and the human HepaRG cell line, using four model compounds: chlorpromazine, ibuprofen, cyclosporine A and amiodarone. This overview describes the data generated on the mode of action of liver toxicity after long-term repeat dosing. Moreover, we quantified each parent compound and its distribution in the various in vitro compartments, which allowed us to develop biokinetic models from which we could derive real exposure concentrations in vitro. In conclusion, the complex data set enables quantitative measurements that prove the concept that human-relevant free and toxic exposure levels can be defined in vitro. Further compounds have to be analyzed in a broader concentration range to fully exploit these promising results for improved prediction of hepatotoxicity and hazard assessment for humans. PMID:25952325

  10. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    USGS Publications Warehouse

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

    The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much that uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
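
    A minimal sketch of the linear-propagation (first-order) calculation, assuming a hypothetical two-parameter model: the predictive variance is y^T (J^T J / s_obs^2 + C_p^-1)^-1 y, and the "worth" of a candidate observation is the variance reduction it buys. All sensitivities and variances below are invented.

```python
def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Cramer's rule."""
    (a11, a12), (a21, a22) = a
    det = a11 * a22 - a12 * a21
    return [(b[0] * a22 - b[1] * a12) / det, (a11 * b[1] - a21 * b[0]) / det]

def prediction_variance(jac_rows, y_sens, sigma_obs=1.0, sigma_prior=10.0):
    """Linear predictive variance y^T (J^T J/s_obs^2 + I/s_prior^2)^-1 y
    for a two-parameter model."""
    # Build the 2x2 information matrix: data term plus diagonal prior term.
    info = [[0.0, 0.0], [0.0, 0.0]]
    for row in jac_rows:
        for i in range(2):
            for j in range(2):
                info[i][j] += row[i] * row[j] / sigma_obs**2
    for i in range(2):
        info[i][i] += 1.0 / sigma_prior**2
    x = solve2(info, y_sens)
    return sum(yi * xi for yi, xi in zip(y_sens, x))

# One existing observation, sensitive mainly to parameter 1...
base_net = [[1.0, 0.2]]
y_sens = [0.5, 1.0]  # sensitivity of the prediction to the two parameters
v_base = prediction_variance(base_net, y_sens)
# ...versus the same network plus a candidate well informing parameter 2.
v_new = prediction_variance(base_net + [[0.1, 1.0]], y_sens)
# The reduction v_base - v_new is the "data worth" of the candidate well.
```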

  11. Predictions of tracer transport in interwell tracer tests at the C-Hole complex. Yucca Mountain site characterization project report milestone 4077

    SciTech Connect

    Reimus, P.W.

    1996-09-01

    This report presents predictions of tracer transport in interwell tracer tests that are to be conducted at the C-Hole complex at the Nevada Test Site on behalf of the Yucca Mountain Site Characterization Project. The predictions are used to make specific recommendations about the manner in which the tracer tests should be conducted to best satisfy the needs of the Project. The objective of the tracer tests is to study flow and species transport under saturated conditions in the fractured tuffs near Yucca Mountain, Nevada, the site of a potential high-level nuclear waste repository. The potential repository will be located in the unsaturated zone within Yucca Mountain. The saturated zone beneath and around the mountain represents the final barrier to transport to the accessible environment that radionuclides will encounter if they breach the engineered barriers within the repository and the barriers to flow and transport provided by the unsaturated zone. Background information on the C-Holes is provided in Section 1.1, and the planned tracer testing program is discussed in Section 1.2.

  12. Operational strategy for soil concentration predictions of strontium/yttrium-90 and cesium-137 in surface soil at the West Valley Demonstration Project site

    SciTech Connect

    Myers, J.A.

    1995-06-05

    The environmental health physics field faces difficulties in assessing and interpreting field measurements, determining guideline protocols, and controlling and disposing of low-level radioactive contaminated soil. Questions are raised among scientists and in public forums concerning the necessity and high costs of large-area soil remediation versus the risks of low-dose radiation health effects. As a result, accurate soil activity assessments become imperative in decontamination situations. The West Valley Demonstration Project (WVDP), a US Department of Energy facility located in West Valley, New York, is managed and operated by West Valley Nuclear Services Co., Inc. (WVNS). WVNS has identified contaminated on-site soil areas with a mixed variety of radionuclides (primarily fission products). Through the use of data obtained from a previous project performed during the summer of 1994, entitled "Field Survey Correlation and Instrumentation Response for an In Situ Soil Measurement Program" (Myers), the WVDP offers a unique research opportunity to investigate the possibility of soil concentration predictions based on exposure or count rate responses returned from a survey detector probe. In this study, correlations are developed between laboratory-measured soil beta activity and survey probe response for the purposes of determining the optimal detector for field use and using these correlations to establish predictability of soil activity levels.

  13. Final report for LDRD project "A new approach to protein function and structure prediction"

    SciTech Connect

    Phillips, C.A.

    1997-03-01

    This report describes the research performed under the Laboratory-Directed Research and Development (LDRD) grant "A new approach to protein function and structure prediction", funded FY94-96. We describe the goals of the research, and motivate and list our improvements to the state of the art in multiple sequence alignment and phylogeny (evolutionary tree) construction, but leave technical details to the six publications resulting from this work. At least three algorithms for phylogeny construction or tree consensus have been implemented and used by researchers outside of Sandia.

  14. Predicting the future distribution of Polar Bear Habitat in the polar basin from resource selection functions applied to 21st century general circulation model projections of sea ice

    USGS Publications Warehouse

    Durner, George M.; Douglas, David C.; Nielson, Ryan M.; Amstrup, Steven C.; McDonald, Trent L.

    2007-01-01

    Predictions of polar bear (Ursus maritimus) habitat distribution in the Arctic polar basin during the 21st century were developed to help understand the likely consequences of anticipated sea ice reductions on polar bear populations. We used location data from satellite-collared polar bears and environmental data (e.g., bathymetry, coastlines, and sea ice) collected between 1985–1995 to build habitat use models called Resource Selection Functions (RSF). The RSFs described habitats polar bears preferred in each of four seasons: summer (ice minimum), autumn (growth), winter (ice maximum) and spring (melt). When applied to the model source data and to independent data (1996–2006), the RSFs consistently identified habitats most frequently used by polar bears. We applied the RSFs to monthly maps of 21st century sea ice concentration predicted by 10 general circulation models (GCM) described in the Intergovernmental Panel on Climate Change Fourth Assessment Report. The 10 GCMs we used had high concordance between their simulations of 20th century summer sea ice extent and the actual ice extent derived from passive microwave satellite observations. Predictions of the amount and rate of change in polar bear habitat varied among GCMs, but all GCMs predicted net habitat losses in the polar basin during the 21st century. Projected losses in the highest-valued RSF habitat (optimal habitat) were greatest in the peripheral seas of the polar basin, especially the Chukchi Sea and Barents Sea. Losses were least in high-latitude regions where RSFs predicted an initial increase in optimal habitat followed by a modest decline. The largest seasonal reductions in habitat were predicted for spring and summer. Average area of optimal polar bear habitat during summer in the polar basin declined from an observed 1.0 million km2 in 1985–1995 (baseline) to a projected multi-model average of 0.58 million km2 in 2045–2054 (-42% change), 0.36 million km2 in 2070–2079 (-64% change), and 0
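
    Resource selection functions of the kind described are commonly exponential in the habitat covariates, w(x) = exp(beta . x); a hedged sketch with invented coefficients and covariates (not the study's fitted values):

```python
import math

def rsf_score(features, betas):
    """Exponential resource selection function w(x) = exp(beta . x);
    only ratios of scores are meaningful (relative selection)."""
    return math.exp(sum(b * f for b, f in zip(betas, features)))

# Hypothetical coefficients for [ice concentration (0-1),
# distance to ice edge (km), water depth (km)].
betas = [3.0, -0.02, -1.5]

shallow_icy = [0.9, 10.0, 0.3]  # high ice cover, near the edge, shelf water
deep_open = [0.2, 200.0, 3.0]   # low ice, far from the edge, deep basin

w1 = rsf_score(shallow_icy, betas)
w2 = rsf_score(deep_open, betas)
# w1/w2 is the relative probability that a bear selects the first habitat.
```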

  15. Performance prediction of mechanical excavators from linear cutter tests on Yucca Mountain welded tuffs; Yucca Mountain Site Characterization Project

    SciTech Connect

    Gertsch, R.; Ozdemir, L.

    1992-09-01

    The performances of mechanical excavators are predicted for excavations in welded tuff. Emphasis is given to tunnel boring machine evaluations based on linear cutting machine test data obtained on samples of Topopah Spring welded tuff. The tests involve measurement of forces as cutters are applied to the rock surface at certain spacing and penetrations. Two disc and two point-attack cutters representing currently available technology are thus evaluated. The performance predictions based on these direct experimental measurements are believed to be more accurate than any previous values for mechanical excavation of welded tuff. The calculations of performance are predicated on minimizing the amount of energy required to excavate the welded tuff. Specific energy decreases with increasing spacing and penetration, and reaches its lowest at the widest spacing and deepest penetration used in this test program. Using the force, spacing, and penetration data from this experimental program, the thrust, torque, power, and rate of penetration are calculated for several types of mechanical excavators. The results of this study show that the candidate excavators will require higher torque and power than heretofore estimated.
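
    The specific-energy criterion the study minimizes can be sketched as rolling force divided by the excavated cross-section (spacing times penetration); the cutter forces and geometries below are hypothetical, not the Topopah Spring test data.

```python
def specific_energy(rolling_force_kn, spacing_mm, penetration_mm):
    """Specific energy (MJ/m^3) of one cutter pass: work per unit cut length
    (the rolling force) divided by excavated volume per unit length
    (spacing x penetration)."""
    volume_per_m = (spacing_mm / 1000.0) * (penetration_mm / 1000.0)
    return (rolling_force_kn / volume_per_m) / 1000.0  # kJ/m^3 -> MJ/m^3

# Hypothetical linear-cutter results: narrow/shallow vs. wide/deep cuts.
se_a = specific_energy(20.0, 75.0, 6.0)    # narrow spacing, shallow penetration
se_b = specific_energy(30.0, 100.0, 10.0)  # wider spacing, deeper penetration
# se_b < se_a: specific energy falls as spacing and penetration increase,
# matching the trend reported above.
```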

  16. Prediction of Inhibitory Activity of Epidermal Growth Factor Receptor Inhibitors Using Grid Search-Projection Pursuit Regression Method

    PubMed Central

    Du, Hongying; Hu, Zhide; Bazzoli, Andrea; Zhang, Yang

    2011-01-01

    The epidermal growth factor receptor (EGFR) protein tyrosine kinase (PTK) is an important protein target for anti-tumor drug discovery. To identify potential EGFR inhibitors, we conducted a quantitative structure–activity relationship (QSAR) study on the inhibitory activity of a series of quinazoline derivatives against EGFR tyrosine kinase. Two 2D-QSAR models were developed based on the best multi-linear regression (BMLR) and grid-search assisted projection pursuit regression (GS-PPR) methods. The results demonstrate that the inhibitory activity of quinazoline derivatives is strongly correlated with their polarizability, activation energy, mass distribution, connectivity, and branching information. Although the present investigation focused on EGFR, the approach provides a general avenue in the structure-based drug development of different protein receptor inhibitors. PMID:21811593

  17. USGS "iCoast - Did the Coast Change?" Project: Crowd-Tagging Aerial Photographs to Improve Coastal Change Prediction Models

    NASA Astrophysics Data System (ADS)

    Liu, S. B.; Poore, B. S.; Plant, N. G.; Stockdon, H. F.; Morgan, K.; Snell, R.

    2014-12-01

    The U.S. Geological Survey (USGS) has been acquiring oblique aerial photographs of the coast before and after major storms since 1995 and has amassed a database of over 140,000 photographs of the Gulf, Atlantic, and Pacific coasts. USGS coastal scientists use these photographs to document and characterize coastal change caused by storms. The images can also be used to evaluate the accuracy of predictive models of coastal erosion. However, the USGS does not have the personnel to manually analyze all of the photographs taken after a storm. Also, computers cannot yet automatically identify damage and geomorphic changes to the coast from the oblique aerial photographs. There is a high public interest in accessing the limited number of pre- and post-storm photographic pairs the USGS is currently able to share. Recent federal policies that encourage open data and open innovation initiatives have resulted in many federal agencies developing new ways of using citizen science and crowdsourcing techniques to share data and collaborate with the public to accomplish large tasks. The USGS launched a crowdsourcing application in June 2014 called "iCoast - Did the Coast Change?" (http://coastal.er.usgs.gov/icoast) to allow citizens to help USGS scientists identify changes to the coast by comparing USGS aerial photographs taken before and after storms, and then selecting pre-defined tags like "dune scarp" and "sand on road." The tags are accompanied by text definitions and pictorial examples of these coastal morphology terms and serve to informally and passively educate users about coastal hazards. The iCoast application facilitates greater citizen awareness of coastal change and is an educational resource for teachers and students interested in learning about coastal vulnerability. We expect that the citizen observations from iCoast will assist with probabilistic model development to produce more accurate predictions of coastal vulnerability.

  18. Clinical and Biologic Features Predictive of Survival After Relapse of Neuroblastoma: A Report From the International Neuroblastoma Risk Group Project

    PubMed Central

    London, Wendy B.; Castel, Victoria; Monclair, Tom; Ambros, Peter F.; Pearson, Andrew D.J.; Cohn, Susan L.; Berthold, Frank; Nakagawara, Akira; Ladenstein, Ruth L.; Iehara, Tomoko; Matthay, Katherine K.

    2011-01-01

    Purpose Survival after neuroblastoma relapse is poor. Understanding the relationship between clinical and biologic features and outcome after relapse may help in selection of optimal therapy. Our aim was to determine which factors were significantly predictive of postrelapse overall survival (OS) in patients with recurrent neuroblastoma—particularly whether time from diagnosis to first relapse (TTFR) was a significant predictor of OS. Patients and Methods Patients with first relapse/progression were identified in the International Neuroblastoma Risk Group (INRG) database. Time from study enrollment until first event and OS time starting from first event were calculated. Cox regression models were used to calculate the hazard ratio of increased death risk and perform survival tree regression. TTFR was tested in a multivariable Cox model with other factors. Results In the INRG database (N = 8,800), 2,266 patients experienced first progression/relapse. Median time to relapse was 13.2 months (range, 1 day to 11.4 years). Five-year OS from time of first event was 20% (SE, ± 1%). TTFR was statistically significantly associated with OS time in a nonlinear relationship; patients with TTFR of 36 months or longer had the lowest risk of death, followed by patients who relapsed in the period of 0 to less than 6 months or 18 to 36 months. Patients who relapsed between 6 and 18 months after diagnosis had the highest risk of death. TTFR, age, International Neuroblastoma Staging System stage, and MYCN copy number status were independently predictive of postrelapse OS in multivariable analysis. Conclusion Age, stage, MYCN status, and TTFR are significant prognostic factors for postrelapse survival and may help in the design of clinical trials evaluating novel agents. PMID:21768459
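
    The report's analysis uses Cox regression; as a simpler hedged illustration of how postrelapse overall survival curves are estimated, here is a minimal Kaplan-Meier estimator on invented follow-up data (not the INRG cohort).

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : time from first relapse to death or last follow-up
    events : 1 = death observed, 0 = censored
    Returns (time, survival probability) steps at each death time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        ties = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= ties
        i += ties
    return curve

# Hypothetical postrelapse follow-up (months); 0 marks a censored patient.
times = [3, 5, 5, 8, 12, 16, 20]
events = [1, 1, 0, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
```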

  19. Lessons learned from the National Climate Predictions and Projections (NCPP) platform Workshop on Quantitative Evaluation of Downscaling 2013

    NASA Astrophysics Data System (ADS)

    Guentchev, G.

    2013-12-01

    The mission of NCPP is to accelerate the provision of climate information on regional and local scales for use in adaptation planning and decision making through collaborative participation of a community of scientists and practitioners. A major focus is the development of a capability for objective and quantitative evaluation of downscaled climate information in support of applications. NCPP recognizes the importance of focusing this evaluation effort on real-world applications and the necessity of working closely with the user community to deliver usable evaluations and guidance. In the summer of 2013, NCPP organized its first workshop on quantitative evaluation of downscaled climate datasets (http://earthsystemcog.org/projects/downscaling-2013/). Workshop participants included representatives from downscaling efforts, applications partners from the health, ecological, agriculture and water resources impacts communities, and people working on data infrastructure, metadata, and standards development. The workshop exemplifies NCPP's approach of collaborative and participatory problem-solving, in which scientists work together with practitioners to develop applications-related evaluation. The set of observed and downscaled datasets included for evaluation in the workshop was assessed using a variety of metrics to elucidate the statistical characteristics of temperature and precipitation time series. In addition, the downscaled datasets were evaluated in terms of their representation of indices relevant to the participating applications working groups, more specifically those related to human health and ecological impacts. The presentation will focus on sharing the lessons learned from the workshop.

  20. Tailoring dam structures to water quality predictions in new reservoir projects: assisting decision-making using numerical modeling.

    PubMed

    Marcé, Rafael; Moreno-Ostos, Enrique; García-Barcina, José Ma; Armengol, Joan

    2010-06-01

    Selection of the reservoir location, handling of the forest in the floodable basin, and the design of dam structures devoted to water supply (e.g. water outlets) are relevant features which strongly determine water quality and frequently demand management strategies to be adopted. Although these crucial aspects should be carefully examined during dam design before construction, the development of ad hoc limnological studies tailoring dam location and dam structures to the water quality characteristics expected in the future reservoir is currently not typical practice. In this study, we use numerical simulation to assist in the design of a new dam project in Spain with the aim of maximizing the quality of the water supplied by the future reservoir. First, we ran a well-known coupled hydrodynamic and biogeochemical dynamic numerical model (DYRESM-CAEDYM) to simulate the potential development of anoxic layers in the future reservoir, generating several scenarios corresponding to different potential hydraulic conditions and outlet configurations. Second, we built a simplified numerical model to simulate the development of the hypolimnetic oxygen content during the maturation stage after the first reservoir filling, taking into consideration the degradation of the flooded terrestrial organic matter and the adoption of different forest handling scenarios. Results are discussed in terms of reservoir design and water quality management. The combination of hypolimnetic withdrawal from two deep outlets and the removal of all the valuable terrestrial vegetal biomass before flooding resulted in the best water quality scenario.
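
    The paper's simplified oxygen model is not specified in the abstract; a minimal sketch, assuming first-order decay of flooded organic matter (OM) that consumes dissolved oxygen in a fixed stoichiometric ratio (all rate constants and initial values below are hypothetical):

```python
def hypolimnetic_oxygen(o2_init, om_init, k_om, o2_per_om, days, dt=1.0):
    """Euler integration of d(OM)/dt = -k*OM and d(O2)/dt = -a*k*OM,
    clamping O2 at zero (anoxia)."""
    o2, om = o2_init, om_init
    for _ in range(int(days / dt)):
        decay = k_om * om * dt  # organic matter mineralized this step
        om -= decay
        o2 = max(0.0, o2 - o2_per_om * decay)
    return o2, om

# Hypothetical first-filling conditions: 10 mg/L O2, 50 g/m3 flooded OM,
# 1%/day decay, 0.1 mg O2 consumed per mg OM mineralized, 100 days.
o2, om = hypolimnetic_oxygen(10.0, 50.0, 0.01, 0.1, 100.0)
```

Running such a model under different forest-removal scenarios (smaller `om_init`) is how the trade-off described above could be screened.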

  1. REDUCING UNCERTAINTIES IN MODEL PREDICTIONS VIA HISTORY MATCHING OF CO2 MIGRATION AND REACTIVE TRANSPORT MODELING OF CO2 FATE AT THE SLEIPNER PROJECT

    SciTech Connect

    Zhu, Chen

    2015-03-31

    An important question for the Carbon Capture, Storage, and Utility program is “can we adequately predict the CO2 plume migration?” For tracking CO2 plume development, the Sleipner project in the Norwegian North Sea provides more time-lapse seismic monitoring data than any other sites, but significant uncertainties still exist for some of the reservoir parameters. In Part I, we assessed model uncertainties by applying two multi-phase compositional simulators to the Sleipner Benchmark model for the uppermost layer (Layer 9) of the Utsira Sand and calibrated our model against the time-lapsed seismic monitoring data for the site from 1999 to 2010. Approximate match with the observed plume was achieved by introducing lateral permeability anisotropy, adding CH4 into the CO2 stream, and adjusting the reservoir temperatures. Model-predicted gas saturation, CO2 accumulation thickness, and CO2 solubility in brine—none were used as calibration metrics—were all comparable with the interpretations of the seismic data in the literature. In Part II & III, we evaluated the uncertainties of predicted long-term CO2 fate up to 10,000 years, due to uncertain reaction kinetics. Under four scenarios of the kinetic rate laws, the temporal and spatial evolution of CO2 partitioning into the four trapping mechanisms (hydrodynamic/structural, solubility, residual/capillary, and mineral) was simulated with ToughReact, taking into account the CO2-brine-rock reactions and the multi-phase reactive flow and mass transport. Modeling results show that different rate laws for mineral dissolution and precipitation reactions resulted in different predicted amounts of trapped CO2 by carbonate minerals, with scenarios of the conventional linear rate law for feldspar dissolution having twice as much mineral trapping (21% of the injected CO2) as scenarios with a Burch-type or Alekseyev et al.–type rate law for feldspar dissolution (11%). So far, most reactive transport modeling (RTM) studies for
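
    The contrast between rate-law scenarios can be sketched with a conventional linear TST rate law versus an illustrative nonlinear alternative; the exponent m = 2 is an arbitrary stand-in for the Burch/Alekseyev-type forms, and the feldspar-like numbers are hypothetical.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def rate_tst_linear(k, area, delta_g, temp_k):
    """Conventional linear TST rate law: r = k*A*(1 - Q/K),
    with the saturation ratio Q/K = exp(deltaG/RT)."""
    return k * area * (1.0 - math.exp(delta_g / (R * temp_k)))

def rate_nonlinear(k, area, delta_g, temp_k, m=2.0):
    """Illustrative nonlinear alternative, r = k*A*(1 - Q/K)**m; both laws
    vanish at equilibrium but differ away from it."""
    sat = math.exp(delta_g / (R * temp_k))
    return k * area * (1.0 - sat) ** m

# Hypothetical values: rate constant (mol/m2/s), surface area (m2),
# deltaG (J/mol, undersaturated), temperature (K).
r_lin = rate_tst_linear(1e-10, 100.0, -10000.0, 333.15)
r_non = rate_nonlinear(1e-10, 100.0, -10000.0, 333.15)
# Slower dissolution under the nonlinear law means less dissolved cation
# supply and hence less predicted mineral trapping, as reported above.
```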

  2. Pons to Posterior Cingulate Functional Projections Predict Affective Processing Changes in the Elderly Following Eight Weeks of Meditation Training.

    PubMed

    Shao, Robin; Keuper, Kati; Geng, Xiujuan; Lee, Tatia M C

    2016-08-01

    Evidence indicates meditation facilitates affective regulation and reduces negative affect. It also influences resting-state functional connectivity between affective networks and the posterior cingulate (PCC)/precuneus, regions critically implicated in self-referential processing. However, no longitudinal study employing an active control group has examined the effect of meditation training on affective processing, PCC/precuneus connectivity, and their association. Here, we report that eight-week meditation, but not relaxation, training 'neutralized' affective processing of positive and negative stimuli in healthy elderly participants. Additionally, meditation versus relaxation training increased the positive connectivity between the PCC/precuneus and the pons, which was largely directed from the pons to the PCC/precuneus, as revealed by dynamic causal modeling. Further, changes in connectivity between the PCC/precuneus and pons predicted changes in affective processing after meditation training. These findings indicate meditation promotes self-referential affective regulation based on an increased regulatory influence of the pons on the PCC/precuneus, a new affective-processing strategy that is employed both in the resting state and when evaluating affective stimuli. Such insights have clinical implications for interventions in elderly individuals with affective disorders. PMID:27349456

  3. Factors that predict financial sustainability of community coalitions: five years of findings from the PROSPER partnership project.

    PubMed

    Greenberg, Mark T; Feinberg, Mark E; Johnson, Lesley E; Perkins, Daniel F; Welsh, Janet A; Spoth, Richard L

    2015-01-01

    This study is a longitudinal investigation of the Promoting School-community-university Partnerships to Enhance Resilience (PROSPER) partnership model designed to evaluate the level of sustainability funding by community prevention teams, including which factors impact teams' generation of sustainable funding. Community teams were responsible for choosing, implementing with quality, and sustaining evidence-based programs (EBPs) intended to reduce substance misuse and promote positive youth and family development. Fourteen US rural communities and small towns were studied. Data were collected from PROSPER community team members (N = 164) and prevention coordinators (N = 10) over a 5-year period. Global and specific aspects of team functioning were assessed over six waves. Outcome measures were the total funds (cash and in-kind) raised to implement prevention programs. All 14 community teams were sustained for the first 5 years. However, there was substantial variability in the amount of funds raised, and these differences were predicted by earlier and concurrent team functioning and by team sustainability planning. Given the sufficient infrastructure and ongoing technical assistance provided by the PROSPER partnership model, local sustainability of EBPs is achievable.

  4. Water pollution risk simulation and prediction in the main canal of the South-to-North Water Transfer Project

    NASA Astrophysics Data System (ADS)

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Cheng, Xi

    2014-11-01

    The middle route of the South-to-North Water Transfer Project (MRP) will divert water to Beijing Tuancheng Lake from Taocha in the Danjiangkou reservoir, located in the Hubei province of China. The MRP is composed of a long canal and complex hydraulic structures and will transfer water in open channel areas to provide drinking water for Beijing, Shijiazhuang and other cities under extremely strict water quality requirements. A vehicular accident on any of the many highway bridges that cross the main canal could cause significant water pollution in the main canal. To ensure that water quality is maintained during the diversion process, the effects of pollutants on water quality due to sudden pollution accidents were simulated and analyzed in this paper. The MIKE11 HD module was used to calculate the hydraulic characteristics of the 42-km Xishi-to-Beijuma River channel of the MRP. Six types of hydraulic structures, including inverted siphons, gates, highway bridges, culverts and tunnels, were included in this model. Based on the hydrodynamic model, the MIKE11 AD module, a one-dimensional advection-dispersion model, was built for TP, NH3-N, CODMn and F. The validated results showed that the computed values agreed well with the measured values. In accordance with transportation data for the Dianbei Highway Bridge, the effects of traffic accidents on the bridge on water quality were analyzed. Based on simulated scenarios with three discharge rates (ranging from 12 m3/s to 17 m3/s, 40 m3/s, and 60 m3/s) and three pollution loading levels (5 t, 10 t and 20 t) in which trucks spill their contents (i.e., phosphate fertilizer, cyanide, oil or chromium solution) into the channel, emergency measures were proposed. Reasonable solutions to ensure water quality with regard to the various types of pollutants were proposed, including treating polluted water and maintaining material and personnel reserves.
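
    A one-dimensional advection-dispersion model of the kind MIKE11 AD implements can be sketched with an explicit upwind finite-difference scheme; the reach geometry, velocity, dispersion coefficient and spill concentration below are invented, not the paper's calibrated values.

```python
def advect_disperse(conc, u, d, dx, dt, steps):
    """Explicit upwind finite differences for the 1-D advection-dispersion
    equation dC/dt = -u dC/dx + D d2C/dx2 (zero-concentration boundaries).
    Stable for u*dt/dx + 2*D*dt/dx^2 <= 1."""
    c = list(conc)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            adv = -u * (c[i] - c[i - 1]) / dx                      # upwind
            disp = d * (c[i + 1] - 2.0 * c[i] + c[i - 1]) / dx**2  # central
            new[i] = c[i] + dt * (adv + disp)
        c = new
    return c

# Hypothetical spill: a pollutant slug released into one 100 m cell of a
# 5 km canal reach (u = 0.8 m/s, D = 5 m2/s, Courant number 0.4).
conc = [0.0] * 50
conc[5] = 200.0  # mg/L in the spill cell
out = advect_disperse(conc, u=0.8, d=5.0, dx=100.0, dt=50.0, steps=40)
# The slug advects ~16 cells downstream while spreading by dispersion.
```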

  5. Predicting future US water yield and ecosystem productivity by linking an ecohydrological model to WRF dynamically downscaled climate projections

    NASA Astrophysics Data System (ADS)

    Sun, S.; Sun, G.; Cohen, E.; McNulty, S. G.; Caldwell, P.; Duan, K.; Zhang, Y.

    2015-12-01

    Quantifying the potential impacts of climate change on water yield and ecosystem productivity (i.e., carbon balances) is essential to developing sound watershed restoration plans, and climate change adaptation and mitigation strategies. This study links an ecohydrological model (Water Supply and Stress Index, WaSSI) with WRF (Weather Research and Forecasting Model) dynamically downscaled climate projections of the HadCM3 model under the IPCC SRES A2 emission scenario. We evaluated the future (2031-2060) changes in evapotranspiration (ET), water yield (Q) and gross primary productivity (GPP) from the baseline period of 1979-2007 across the 82,773 watersheds (12-digit Hydrologic Unit Code level) in the conterminous US (CONUS), and evaluated the future annual and monthly changes of hydrology and ecosystem productivity for the 18 Water Resource Regions (WRRs) or 2-digit HUCs. Across the CONUS, the future multi-year means show increases in annual precipitation (P) of 45 mm yr-1 (6 %), a 1.8 °C increase in temperature (T), a 37 mm yr-1 (7 %) increase in ET, a 9 mm yr-1 (3 %) increase in Q, and a 106 g C m-2 yr-1 (9 %) increase in GPP. Response to climate change was highly variable across the 82,773 watersheds, but in general, the majority would see consistent increases in all variables evaluated. Over half of the 82,773 watersheds, mostly found in the northeast and the southern part of the southwest, would have an increase in annual Q (>100 mm yr-1 or 20 %). This study provides an integrated method and example for comprehensive assessment of the potential impacts of climate change on watershed water balances and ecosystem productivity at high spatial and temporal resolutions. Results will be useful for policy-makers and land managers in formulating appropriate watershed-specific strategies for sustaining water and carbon sources in the face of climate change.
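
    The reported changes are mutually consistent with a simple annual water balance, Q = P - ET with storage change neglected; the baseline P and ET magnitudes below are invented for illustration, and only the +45 and +37 mm/yr deltas come from the abstract.

```python
def water_yield(p, et):
    """Annual water yield (mm/yr) from the watershed balance Q = P - ET,
    neglecting storage change."""
    return p - et

# Hypothetical baseline values; deltas follow the reported changes.
baseline = water_yield(760.0, 530.0)              # baseline P and ET (mm/yr)
future = water_yield(760.0 + 45.0, 530.0 + 37.0)  # P +45, ET +37
change = future - baseline  # +8 mm/yr, close to the ~+9 mm/yr reported
```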

  6. Project: "Project!"

    ERIC Educational Resources Information Center

    Grayson, Katherine

    2007-01-01

    In November 2006, the editors of "Campus Technology" launched their first-ever High-Resolution Projection Study, to find out if the latest in projector technology could really make a significant difference in teaching, learning, and educational innovation on US campuses. The author and her colleagues asked campus educators, technologists, and…

  7. The PreViBOSS project: studying the short-term predictability of visibility change during the fog life cycle, from surface and satellite observation

    NASA Astrophysics Data System (ADS)

    Elias, T.; Haeffelin, M.; Ramon, D.; Gomes, L.; Brunet, F.; Vrac, M.; Yiou, P.; Hello, G.; Petithomme, H.

    2010-07-01

    Fog impairs major activities such as transport and Earth observation by critically reducing atmospheric visibility, with no continuity in time and space. Fog is also an essential factor in air quality and climate, as it modifies particle properties of the surface atmospheric layer. The complexity, diversity and fine scale of the processes involved make not only visibility diagnosis but also fog event prediction uncertain in current numerical weather prediction models. Extensive measurements of atmospheric parameters have been made at SIRTA since 1997 to document physical processes over the atmospheric column in the Paris suburban area, typical of an environment intermittently under oceanic influence and affected by urban and industrial pollution. The ParisFog field campaign hosted at SIRTA for six months in winter 2006-2007 deployed instrumentation specifically dedicated to studying physical processes in the fog life cycle: thermodynamical, radiative, dynamical and microphysical processes. Analysis of the measurements provided a preliminary climatology of episodes of reduced visibility; a chronology of processes was delivered by examining time series of measured parameters; and a closure study was performed on optical and microphysical properties of particles (aerosols to droplets) during the life cycle of a radiative fog, providing the relative contribution of several particle groups to extinction in clear-sky conditions, in haze and in fog. PreViBOSS is a 3-year project scheduled to start this year. The aim is to improve the short-term prediction of changes in atmospheric visibility at a local scale. It proposes an innovative approach: applying the Generalised Additive Model statistical method to the detailed and extended dataset acquired at SIRTA. This method offers the opportunity to explore nonlinear relationships between parameters, which are not yet integrated in current numerical models. Emphasis will be put on aerosols and their impact on the fog life
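    The Generalised Additive Model idea the project proposes can be illustrated with a minimal backfitting sketch: each predictor gets its own one-dimensional smooth function, fitted iteratively on partial residuals. The synthetic data and the crude running-mean smoother below are illustrative stand-ins, not the project's actual method or dataset:

```python
import numpy as np

def knn_smooth(x, r, k=25):
    """Crude running-mean smoother over the k nearest neighbours in x."""
    order = np.argsort(x)
    rs = r[order]
    n = len(rs)
    smoothed = np.empty(n)
    for i in range(n):
        lo = max(0, min(i - k // 2, n - k))
        smoothed[i] = rs[lo:lo + k].mean()
    out = np.empty(n)
    out[order] = smoothed          # map back to the original ordering
    return out

def fit_gam(X, y, n_iter=20, k=25):
    """Backfitting: y ~ alpha + sum_j f_j(x_j), each f_j a 1-D smooth."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((p, n))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - (f.sum(axis=0) - f[j])  # residual without f_j
            f[j] = knn_smooth(X[:, j], partial, k)
            f[j] -= f[j].mean()                           # keep smooths centered
    return alpha, f

# Synthetic nonlinear truth: y = sin(x1) + x2^2 + noise
rng = np.random.default_rng(3)
n = 400
X = np.column_stack([rng.uniform(-3, 3, n), rng.uniform(-1, 1, n)])
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)

alpha, f = fit_gam(X, y)
resid = y - alpha - f.sum(axis=0)
print(f"variance explained: {1 - resid.var() / y.var():.2f}")
```

A production GAM would use penalized spline smoothers with automatic smoothness selection rather than this fixed-window running mean, but the backfitting loop is the same.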

  8. Ground-Based Cloud and Atmospheric Boundary Layer Observations for the Project: High Definition Clouds and Precipitation for Advancing Climate Prediction, HD(CP)2

    NASA Astrophysics Data System (ADS)

    Hirsikko, A.; Ebell, K.; Ulrich, U.; Schween, J. H.; Bohn, B.; Görsdorf, U.; Leinweber, R.; Päschke, E.; Baars, H.; Seifert, P.; Klein Baltink, H.

    2014-12-01

    The German research initiative ''High Definition Clouds and Precipitation for advancing Climate Prediction, HD(CP)2'' aims for an improved representation of clouds and precipitation in climate models. Model development and its evaluation require comprehensive observational datasets. A specific work package was established to create uniform and documented observational datasets for the HD(CP)2 database. Datasets included ground-based remote-sensing (Doppler lidars, ceilometers, microwave radiometers, and cloud radars) and in-situ (meteorological and radiation sensors) measurements. Four supersites (Jülich ObservatorY for Cloud Evolution (JOYCE), Lindenberg Meteorological Observatory - Richard Assmann Observatory (RAO), and Leipzig Aerosol and Cloud Remote Observations System (LACROS) in Germany, and Cabauw experimental site for atmospheric research (Cesar) in the Netherlands) are finalizing the operational procedures to provide quality-controlled (and, where possible, calibrated) remote-sensing and in-situ observations, retrievals of atmospheric boundary layer state (e.g. winds, mixing layer height, humidity and temperature), and cloud macro- and microphysical properties with uncertainty estimations or at least quality flags. During the project, new processing and retrieval methods were developed where no commonly agreed or satisfactory methods were available. In particular, large progress was made concerning uncertainty estimation and automated quality control. Additionally, the data from JOYCE are used in radiative closure studies under cloudy conditions to evaluate retrievals of cloud properties. The current status of work progress will be presented.

  9. Climate prediction and predictability

    NASA Astrophysics Data System (ADS)

    Allen, Myles

    2010-05-01

    Climate prediction is generally accepted to be one of the grand challenges of the Geophysical Sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, the risks associated with global geo-engineering schemes. This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively but not exclusively from the experience of the climateprediction.net project, running multiple versions of climate models on personal computers.

  10. PREDICTIVE MODELS

    SciTech Connect

    Ray, R.M.

    1986-12-01

    PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: 1) chemical flooding, where soap-like surfactants are injected into the reservoir to wash out the oil; 2) carbon dioxide miscible flooding, where carbon dioxide mixes with the lighter hydrocarbons making the oil easier to displace; 3) in-situ combustion, which uses the heat from burning some of the underground oil to thin the product; 4) polymer flooding, where thick, cohesive material is pumped into a reservoir to push the oil through the underground rock; and 5) steamflood, where pressurized steam is injected underground to thin the oil. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs, which have been previously waterflooded to residual oil saturation. Thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide miscible flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance predictive algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes.

  11. Glyphosate Use Predicts ADHD Hospital Discharges in the Healthcare Cost and Utilization Project Net (HCUPnet): A Two-Way Fixed-Effects Analysis.

    PubMed

    Fluegge, Keith R; Fluegge, Kyle R

    2015-01-01

    There has been considerable international study on the etiology of rising mental disorders, such as attention-deficit hyperactivity disorder (ADHD), in human populations. As glyphosate is the most commonly used herbicide in the world, we sought to test the hypothesis that glyphosate use in agriculture may be a contributing environmental factor to the rise of ADHD in human populations. State estimates for glyphosate use and nitrogen fertilizer use were obtained from the U.S. Geological Survey (USGS). We queried the Healthcare Cost and Utilization Project net (HCUPNET) for state-level hospitalization discharge data in all patients for all-listed ADHD from 2007 to 2010. We used rural-urban continuum codes from the USDA-Economic Research Service when exploring the effect of urbanization on the relationship between herbicide use and ADHD. The least squares dummy variable (LSDV) method and the within method using two-way fixed effects were used to elucidate the relationship between glyphosate use and all-listed ADHD hospital discharges. We show that a one-kilogram increase in glyphosate use in one year significantly positively predicts state-level all-listed ADHD discharges, expressed as a percent of total mental disorders, the following year (coefficient = 5.54E-08, p<.01). A study on the effect of urbanization on the relationship between glyphosate and ADHD indicates that the relationship is marginally significantly positive after multiple comparison correction only in urban U.S. counties (p<.025). Furthermore, total glyphosate use is strongly positively associated with total farm use of nitrogen fertilizers from 1992 to 2006 (p<.001). We present evidence from the biomedical research literature of a plausible link among glyphosate, nitrogen dysbiosis and ADHD. Glyphosate use is a significant predictor of state hospitalizations for all-listed ADHD hospital discharges, with the effect concentrated in urban U.S. counties. 
This effect is seen even after controlling
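    The LSDV estimator named above can be sketched on synthetic state-year panel data: the state and year fixed effects are absorbed by explicit dummy variables, and the coefficient of interest is then estimated by ordinary least squares. All names and numbers below are invented, not the study's dataset:

```python
import numpy as np

# Two-way fixed-effects regression via LSDV on synthetic panel data.
rng = np.random.default_rng(42)
n_states, n_years = 10, 4
beta_true = 0.5

state = np.repeat(np.arange(n_states), n_years)       # panel index: state
year = np.tile(np.arange(n_years), n_states)          # panel index: year
x = rng.normal(size=state.size)                       # e.g. lagged exposure
state_fe = rng.normal(size=n_states)[state]           # state fixed effects
year_fe = rng.normal(size=n_years)[year]              # year fixed effects
y = beta_true * x + state_fe + year_fe + 0.1 * rng.normal(size=state.size)

# Design matrix: x, all state dummies (they span the intercept), and year
# dummies with one year dropped to avoid perfect collinearity.
D_state = (state[:, None] == np.arange(n_states)).astype(float)
D_year = (year[:, None] == np.arange(1, n_years)).astype(float)
X = np.column_stack([x, D_state, D_year])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated beta = {coef[0]:.3f}")              # close to the true 0.5
```

The equivalent "within" estimator demeans y and x by state and year instead of adding dummies; both yield the same slope.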

  14. Prediction of Pseudo relative velocity response spectra at Yucca Mountain for underground nuclear explosions conducted in the Pahute Mesa testing area at the Nevada Test Site; Yucca Mountain Site Characterization Project

    SciTech Connect

    Phillips, J.S.

    1991-12-01

    The Yucca Mountain Site Characterization Project (YMP), managed by the Office of Geologic Disposal of the Office of Civilian Radioactive Waste Management of the US Department of Energy, is examining the feasibility of siting a repository for commercial, high-level nuclear wastes at Yucca Mountain on and adjacent to the Nevada Test Site (NTS). This work, intended to extend our understanding of the ground motion at Yucca Mountain resulting from testing of nuclear weapons on the NTS, was funded by the Yucca Mountain project and the Military Applications Weapons Test Program. This report summarizes one aspect of the weapons test seismic investigations conducted in FY88. Pseudo relative velocity response spectra (PSRV) have been calculated for a large body of surface ground motions generated by underground nuclear explosions. These spectra have been analyzed and fit using multiple linear regression techniques to develop a credible prediction technique for surface PSRVs. In addition, a technique for estimating downhole PSRVs at specific stations is included. A data summary, data analysis, prediction development, prediction evaluation, software summary and FORTRAN listing of the prediction technique are included in this report.

  15. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 3: A stochastic rain fade control algorithm for satellite link power via nonlinear Markov filtering theory

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1991-01-01

    The dynamic and composite nature of propagation impairments that are incurred on Earth-space communications links at frequencies in and above 30/20 GHz Ka band, i.e., rain attenuation, cloud and/or clear air scintillation, etc., combined with the need to counter such degradations after the small link margins have been exceeded, necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.
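    The statistical fade estimation described above can be conveyed with a much simpler stand-in: a scalar Kalman filter tracking a random-walk fade level from noisy measurements. This linear sketch is not the ACTS model's nonlinear Markov filter, and all parameters are invented:

```python
import numpy as np

# Scalar Kalman filter tracking a slowly varying rain-fade level (dB) from
# noisy beacon measurements.
rng = np.random.default_rng(1)
n = 200
true_fade = np.cumsum(0.05 * rng.normal(size=n))      # random-walk fade, dB
meas = true_fade + 0.5 * rng.normal(size=n)           # noisy measurements, dB

q, r = 0.05**2, 0.5**2                                # process / measurement variance
xhat, p = 0.0, 1.0                                    # state estimate and its variance
est = np.empty(n)
for t in range(n):
    p += q                                            # predict: variance grows
    k = p / (p + r)                                   # Kalman gain
    xhat += k * (meas[t] - xhat)                      # update toward measurement
    p *= (1 - k)                                      # posterior variance shrinks
    est[t] = xhat

print(f"RMS raw error:      {np.sqrt(np.mean((meas - true_fade)**2)):.3f} dB")
print(f"RMS filtered error: {np.sqrt(np.mean((est - true_fade)**2)):.3f} dB")
```

The filtered estimate tracks the fade with substantially less error than the raw measurements, which is what makes prediction-based link power control feasible.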

  16. Project summaries

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis on the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station keeping requirements. An existing code, used in predicting station keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters was used as a basis for the computer simulation. Modifications were made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consists of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.

  17. Composite risk scores and composite endpoints in the risk prediction of outcomes in anticoagulated patients with atrial fibrillation. The Loire Valley Atrial Fibrillation Project.

    PubMed

    Banerjee, A; Fauchier, L; Bernard-Brunet, A; Clementy, N; Lip, G Y H

    2014-03-01

    Several validated risk stratification schemes for prediction of ischaemic stroke (IS)/thromboembolism (TE) and major bleeding are available for patients with non-valvular atrial fibrillation (NVAF). On the basis of multiple common risk factors for IS/TE and bleeding, it has been suggested that composite risk prediction scores may be more practical and user-friendly than separate scores for bleeding and IS/TE. In a long-term prospective hospital registry of anticoagulated patients with newly diagnosed AF, we compared the predictive value of existing risk prediction scores as well as composite risk scores, and also compared these risk scoring systems using composite endpoints. Endpoint 1 was the simple composite of IS and major bleeds. Endpoint 2 was based on a composite of IS plus intracerebral haemorrhage (ICH). Endpoint 3 was based on weighted coefficients for IS/TE and ICH. Endpoint 4 was a composite of stroke, cardiovascular death, TE and major bleeding. The incremental predictive value of these scores over CHADS2 (as reference) for composite endpoints was assessed using the c-statistic, net reclassification improvement (NRI) and integrated discrimination improvement (IDI). Of 8,962 eligible individuals, 3,607 (40.2%) had NVAF and were on OAC at baseline. There were no statistically significant differences between the c-statistics of the various risk scores, compared with the CHADS2 score, regardless of the endpoint. For the various risk scores and various endpoints, NRI and IDI did not show significant improvement (≥1%), compared with the CHADS2 score. In conclusion, composite risk scores did not significantly improve risk prediction of endpoints in patients with NVAF, regardless of how endpoints were defined. This would support individualised prediction of IS/TE and bleeding separately using different separate risk prediction tools, and not the use of composite scores or endpoints for everyday 'real world' clinical practice, to guide decisions on
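    The c-statistic used to compare these scores can be computed directly as the fraction of event/non-event patient pairs that the score ranks concordantly (ties counted as half). A minimal sketch with invented scores:

```python
import numpy as np

def c_statistic(score, event):
    """Concordance index: P(score_event > score_non_event) over all pairs."""
    score = np.asarray(score, dtype=float)
    event = np.asarray(event, dtype=bool)
    pos, neg = score[event], score[~event]
    diff = pos[:, None] - neg[None, :]          # every event/non-event pair
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

scores = [2, 5, 3, 6, 4, 1]                     # hypothetical risk scores
events = [0, 1, 0, 1, 0, 1]                     # 1 = outcome occurred
print(f"c-statistic = {c_statistic(scores, events):.3f}")   # → 0.667
```

For binary outcomes this equals the area under the ROC curve, which is why the abstract can compare scores on it directly.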

  18. Evaluation of Projection Methods to Predict Wetlands Area Sizes: the Wetlands Inventory of the United States (sample Date Correction, Land Classification)

    NASA Astrophysics Data System (ADS)

    Terrazas-Gonzalez, Gerardo H.

    This research concerns different methods that can be applied for projection of wetlands areas at selected times. One method, described by Frayer (1987), applies to a stratified random sampling design. The method was developed by W. E. Frayer in collaboration with D. C. Bowden and can be used in surveys where the sampling units have been measured at two different times, t1 and t2. A change matrix giving the amount of each wetland type at time t1 that is in each of the wetland types at time t2 is obtained for each sampling unit. Projections are based on a mean annual stratum matrix of changes. Methods of evaluating the reliability of FBSB projections have not been given previously. One objective of this project is to provide variance estimators for the FBSB projection method using jackknife and bootstrap techniques. Direct analytic techniques appear to require an unrealistic amount of time to develop given the complexity of the estimator. Interest in projections at an arbitrary time led to a more general description of the stratum-basis estimator to include t < t1 and t between t1 and t2. Variations in the measurement times t1 and t2 among sampling units within a stratum, and other considerations including the complexity of the stratum-basis estimator, motivated the use of simpler estimators. Two classes of methods for making projections are given. The first class of estimators are functions of summed estimated change matrices, while the second class are functions of products of normalized estimated change matrices. Estimators within each class are further differentiated by whether the change matrices are on an annual or observed time period (t2 - t1) basis. The projection procedures address two different objectives. One objective is the estimation of the total of wetlands areas at a given time. The other objective is to estimate the amount of change among wetland types between two given times. 
All the methods can be used for both objectives
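    The second class of estimators, products of normalized change matrices, amounts to a Markov-chain projection: row-normalize the observed change matrix into transition probabilities, then apply it repeatedly to the area vector. A sketch with a hypothetical 3-type change matrix (all numbers invented):

```python
import numpy as np

# change[i, j] = area of wetland type i at t1 that became type j by t2.
change = np.array([[90.0, 10.0,  0.0],
                   [ 5.0, 80.0, 15.0],
                   [ 0.0,  2.0, 48.0]])

area_t1 = change.sum(axis=1)               # total area of each type at t1
P = change / area_t1[:, None]              # row-normalized transition matrix
area_t2 = area_t1 @ P                      # recovers the column sums of `change`

# Project one additional period beyond t2 by applying P again.
area_t3 = area_t2 @ P
print("areas at t1:  ", area_t1)
print("areas at t2:  ", area_t2)
print("projected t3: ", area_t3)
```

An annual-basis variant would first take a matrix root of P corresponding to the observation interval, so that projections can land on arbitrary times t rather than multiples of (t2 - t1).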

  19. EPA's ToxCast Project: Lessons learned and future directions for use of HTS in predicting in vivo toxicology -- A Chemical Perspective

    EPA Science Inventory

    U.S. EPA’s ToxCast and the related Tox21 projects are employing high-throughput screening (HTS) technologies to profile thousands of chemicals, which in turn serve as probes of a wide diversity of targets, pathways and mechanisms related to toxicity. Initial models relating ToxCa...

  20. Predicting the effects of climate change on ecosystems and wildlife habitat in northwest Alaska: results from the WildCast project

    USGS Publications Warehouse

    DeGange, Anthony R.; Marcot, Bruce G.; Lawler, James; Jorgenson, Torre; Winfree, Robert

    2014-01-01

    We used a modeling framework and a recent ecological land classification and land cover map to predict how ecosystems and wildlife habitat in northwest Alaska might change in response to increasing temperature. Our results suggest modest increases in forest and tall shrub ecotypes in Northwest Alaska by the end of this century thereby increasing habitat for forest-dwelling and shrub-using birds and mammals. Conversely, we predict declines in several more open low shrub, tussock, and meadow ecotypes favored by many waterbird, shorebird, and small mammal species.

  1. On earthquake prediction in Japan.

    PubMed

    Uyeda, Seiya

    2013-01-01

    Japan's National Project for Earthquake Prediction has been conducted since 1965 without success. An earthquake prediction should be a short-term prediction based on observable physical phenomena or precursors. The main reason for this lack of success is the failure to capture precursors. Most of the financial resources and manpower of the National Project have been devoted to strengthening the seismograph networks, which are not generally effective for detecting precursors since many precursors are non-seismic. Precursor research has never been supported appropriately because the project has always been run by a group of seismologists who, in the present author's view, are mainly interested in securing funds for seismology - on the pretense of prediction. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this decision has been further fortified by the 2011 M9 Tohoku Mega-quake. On top of the National Project, there are other government projects, not formally but vaguely related to earthquake prediction, that consume many orders of magnitude more funds. They are also uninterested in short-term prediction. Financially, they are giants and the National Project is a dwarf. Thus, in Japan now, there is practically no support for short-term prediction research. Recently, however, substantial progress has been made in real short-term prediction by scientists of diverse disciplines. Some promising signs are also arising even from cooperation with private sectors. PMID:24213204

  3. Projected Applications of a "Weather in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi

    2010-01-01

    The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.

  4. Mortality prediction in the ICU: can we do better? Results from the Super ICU Learner Algorithm (SICULA) project, a population-based study

    PubMed Central

    Pirracchio, Romain; Petersen, Maya L.; Carone, Marco; Resche-Rigon, Matthieu; Chevret, Sylvie; van der Laan, Mark J.

    2015-01-01

    Background Improved mortality prediction for patients in intensive care units (ICU) remains an important challenge. Many severity scores have been proposed but validation studies have concluded that they are not adequately calibrated. Many flexible algorithms are available, yet none of these individually outperform all others regardless of context. In contrast, the Super Learner (SL), an ensemble machine learning technique that leverages multiple learning algorithms to obtain better prediction performance, has been shown to perform at least as well as the optimal member of its library. It might provide an ideal opportunity to construct a novel severity score with an improved performance profile. The aim of the present study was to provide a new mortality prediction algorithm for ICU patients using an implementation of the Super Learner, and to assess its performance relative to prediction based on the SAPS II, APACHE II and SOFA scores. Methods We used the Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database (v26) including all patients admitted to an ICU at Boston's Beth Israel Deaconess Medical Center from 2001 to 2008. The calibration, discrimination and risk classification of predicted hospital mortality based on SAPS II, on APACHE II, on SOFA and on our Super Learner-based proposal were evaluated. Performance measures were calculated using cross-validation to avoid making biased assessments. Our proposed score was then externally validated on a dataset of 200 randomly selected patients admitted at the ICU of Hôpital Européen Georges-Pompidou in Paris, France between September 2013 and June 2014. The primary outcome was hospital mortality. The explanatory variables were the same as those included in the SAPS II score. Results 24,508 patients were included, with median SAPS II 38 (IQR: 27–51), median SOFA 5 (IQR: 2–8). A total of 3,002/24,508 (12.2%) patients died in the hospital. The two versions of our Super Learner
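    The Super Learner's core mechanic, choosing a convex combination of library members by cross-validated risk, can be sketched in a toy form with a two-learner library and a grid search over the combining weight. A real implementation would use a richer library and a proper optimizer; everything below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
x = rng.uniform(-2, 2, n)
y = 1.5 * x + rng.normal(0, 0.5, n)        # truth is linear here

def fit_mean(xtr, ytr):                    # learner 1: global mean
    m = ytr.mean()
    return lambda xnew: np.full_like(xnew, m)

def fit_linear(xtr, ytr):                  # learner 2: least-squares line
    b, a = np.polyfit(xtr, ytr, 1)
    return lambda xnew: b * xnew + a

library = [fit_mean, fit_linear]
folds = np.arange(n) % 5                   # 5-fold CV assignment
Z = np.zeros((n, len(library)))            # cross-validated predictions
for fold in range(5):
    tr, te = folds != fold, folds == fold
    for j, fit in enumerate(library):
        Z[te, j] = fit(x[tr], y[tr])(x[te])

# Pick the convex weight minimizing cross-validated squared error.
ws = np.linspace(0, 1, 101)
risks = [np.mean((y - (w * Z[:, 0] + (1 - w) * Z[:, 1])) ** 2) for w in ws]
w_best = ws[int(np.argmin(risks))]
print(f"weight on mean learner: {w_best:.2f}")   # near 0: the linear learner dominates
```

Because the weights are chosen on held-out predictions, the ensemble performs at least as well (in CV risk) as any single library member, which is the property the abstract cites.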

  5. Use of Cumulative Degradation Factor Prediction and Life Test Result of the Thruster Gimbal Assembly Actuator for the Dawn Flight Project

    NASA Technical Reports Server (NTRS)

    Lo, C. John; Brophy, John R.; Etters, M. Andy; Ramesham, Rajeshuni; Jones, William R., Jr.; Jansen, Mark J.

    2009-01-01

    The Dawn Ion Propulsion System is the ninth project in NASA's Discovery Program. The Dawn spacecraft is being developed to enable the scientific investigation of the two heaviest main-belt asteroids, Vesta and Ceres. Dawn is the first mission to orbit two extraterrestrial bodies, and the first to orbit a main-belt asteroid. The mission is enabled by the onboard Ion Propulsion System (IPS) to provide the post-launch delta-V. The three Ion Engines of the IPS are mounted on Thruster Gimbal Assembly (TGA), with only one engine operating at a time for this 10-year mission. The three TGAs weigh 14.6 kg.

  6. Applications systems verification and transfer project. Volume 1: Operational applications of satellite snow cover observations: Executive summary. [usefulness of satellite snow-cover data for water yield prediction

    NASA Technical Reports Server (NTRS)

    Rango, A.

    1981-01-01

    Both LANDSAT and NOAA satellite data were used in improving snowmelt runoff forecasts. When the satellite snow cover data were tested in both empirical seasonal runoff estimation and short term modeling approaches, a definite potential for reducing forecast error was evident. A cost benefit analysis run in conjunction with the snow mapping indicated a $36.5 million annual benefit accruing from a one percent improvement in forecast accuracy using the snow cover data for the western United States. The annual cost of employing the system would be $505,000. The snow mapping has proven that satellite snow cover data can be used to reduce snowmelt runoff forecast error in a cost effective manner once all operational satellite data are available within 72 hours after acquisition. Executive summaries of the individual snow mapping projects are presented.
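    The reported figures imply a benefit-cost ratio of roughly 72:1, which the following trivial check confirms:

```python
# Benefit-cost arithmetic from the abstract: $36.5M annual benefit from a 1%
# forecast-accuracy improvement, against a $505,000 annual system cost.
benefit = 36_500_000   # dollars per year
cost = 505_000         # dollars per year
ratio = benefit / cost
print(f"benefit-cost ratio ~ {ratio:.0f}:1")   # ~ 72:1
```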

  7. Orthostatic Hypotension and Elevated Resting Heart Rate Predict Low-Energy Fractures in the Population: The Malmö Preventive Project

    PubMed Central

    Hamrefors, Viktor; Härstedt, Maria; Holmberg, Anna; Rogmark, Cecilia; Sutton, Richard; Melander, Olle; Fedorowski, Artur

    2016-01-01

    Background Autonomic disorders of the cardiovascular system, such as orthostatic hypotension and elevated resting heart rate, predict mortality and cardiovascular events in the population. Low-energy fractures constitute a substantial clinical problem that may represent an additional risk related to such autonomic dysfunction. Aims To test the association between orthostatic hypotension, resting heart rate and incidence of low-energy fractures in the general population. Methods and Results Using multivariable-adjusted Cox regression models we investigated the association between orthostatic blood pressure response, resting heart rate and first incident low-energy fracture in a population-based, middle-aged cohort of 33,000 individuals over 25 years of follow-up. The median follow-up time from baseline to first incident fracture among the subjects who experienced a low-energy fracture was 15.0 years. A 10 mmHg orthostatic decrease in systolic blood pressure at baseline was associated with a 5% increased risk of low-energy fractures (95% confidence interval 1.01–1.10) during follow-up, whereas the resting heart rate predicted low-energy fractures with an effect size of 8% increased risk per 10 beats-per-minute (1.05–1.12), independently of the orthostatic response. Subjects with a resting heart rate exceeding 68 beats-per-minute had an 18% (1.10–1.26) increased risk of low-energy fractures during follow-up compared with subjects with a resting heart rate below 68 beats-per-minute. When combining the orthostatic response and resting heart rate, there was a 30% risk increase (1.08–1.57) of low-energy fractures between the extremes, i.e. between subjects in the fourth compared with the first quartiles of both resting heart rate and systolic blood pressure decrease. Conclusion Orthostatic blood pressure decline and elevated resting heart rate independently predict low-energy fractures in a middle-aged population. These two measures of subclinical cardiovascular
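
    The per-unit hazard ratios reported above combine multiplicatively, since Cox models are log-linear in their covariates. A minimal sketch of that arithmetic (the function and the 20-unit increments are illustrative; only the per-unit values come from the abstract):

```python
def hazard_ratio(per_unit_hr: float, units: float) -> float:
    """Compound a per-unit hazard ratio over `units` increments;
    Cox models are log-linear, so hazard ratios multiply."""
    return per_unit_hr ** units

# From the abstract: HR 1.05 per 10 mmHg orthostatic SBP decrease,
# HR 1.08 per 10 beats-per-minute of resting heart rate.
hr_sbp = hazard_ratio(1.05, 2.0)  # a 20 mmHg orthostatic drop
hr_rhr = hazard_ratio(1.08, 2.0)  # a 20 bpm higher resting heart rate

# The abstract reports the two predictors as independent, so on the
# hazard scale they combine multiplicatively:
combined = hr_sbp * hr_rhr
print(round(hr_sbp, 4), round(hr_rhr, 4), round(combined, 4))
```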

  8. Integrative Pathway Analysis of Metabolic Signature in Bladder Cancer: A Linkage to The Cancer Genome Atlas Project and Prediction of Survival

    PubMed Central

    von Rundstedt, Friedrich-Carl; Rajapakshe, Kimal; Ma, Jing; Arnold, James M.; Gohlke, Jie; Putluri, Vasanta; Krishnapuram, Rashmi; Piyarathna, D. Badrajee; Lotan, Yair; Gödde, Daniel; Roth, Stephan; Störkel, Stephan; Levitt, Jonathan M.; Michailidis, George; Sreekumar, Arun; Lerner, Seth P.; Coarfa, Cristian; Putluri, Nagireddy

    2016-01-01

    Purpose We used targeted mass spectrometry to study the metabolic fingerprint of urothelial cancer and determine whether the biochemical pathway analysis gene signature would have a predictive value in independent cohorts of patients with bladder cancer. Materials and Methods Pathologically evaluated, bladder derived tissues, including benign adjacent tissue from 14 patients and bladder cancer from 46, were analyzed by liquid chromatography based targeted mass spectrometry. Differential metabolites associated with tumor samples in comparison to benign tissue were identified by adjusting the p values for multiple testing at a false discovery rate threshold of 15%. Enrichment of pathways and processes associated with the metabolic signature were determined using the GO (Gene Ontology) Database and MSigDB (Molecular Signature Database). Integration of metabolite alterations with transcriptome data from TCGA (The Cancer Genome Atlas) was done to identify the molecular signature of 30 metabolic genes. Available outcome data from TCGA portal were used to determine the association with survival. Results We identified 145 metabolites, of which analysis revealed 31 differential metabolites when comparing benign and tumor tissue samples. Using the KEGG (Kyoto Encyclopedia of Genes and Genomes) Database we identified a total of 174 genes that correlated with the altered metabolic pathways involved. By integrating these genes with the transcriptomic data from the corresponding TCGA data set we identified a metabolic signature consisting of 30 genes. The signature was significant in its prediction of survival in 95 patients with a low signature score vs 282 with a high signature score (p = 0.0458). Conclusions Targeted mass spectrometry of bladder cancer is highly sensitive for detecting metabolic alterations. Applying transcriptome data allows for integration into larger data sets and identification of relevant metabolic pathways in bladder cancer progression. PMID:26802582

  9. Structural-Functional Relationships of the Dynein, Spokes, and Central-Pair Projections Predicted from an Analysis of the Forces Acting within a Flagellum

    PubMed Central

    Lindemann, Charles B.

    2003-01-01

    In the axoneme of eukaryotic flagella the dynein motor proteins form crossbridges between the outer doublet microtubules. These motor proteins generate force that accumulates as linear tension, or compression, on the doublets. When tension or compression is present on a curved microtubule, a force per unit length develops in the plane of bending and is transverse to the long axis of the microtubule. This transverse force (t-force) is evaluated here using available experimental evidence from sea urchin sperm and bull sperm. At or near the switch point for beat reversal, the t-force is in the range of 0.25–1.0 nN/μm, with 0.5 nN/μm the most likely value. This is the case in both beating and arrested bull sperm and in beating sea urchin sperm. The total force that can be generated (or resisted) by all the dyneins on one micron of outer doublet is also ∼0.5 nN. The equivalence of the maximum dynein force/μm and t-force/μm at the switch point may have important consequences. Firstly, the t-force acting on the doublets near the switch point of the flagellar beat is sufficiently strong that it could terminate the action of the dyneins directly by strongly favoring the detached state and precipitating a cascade of detachment from the adjacent doublet. Secondly, after dynein release occurs, the radial spokes and central-pair apparatus are the structures that must carry the t-force. The spokes attached to the central-pair projections will bear most of the load. The central-pair projections are well-positioned for this role, and they are suitably configured to regulate the amount of axoneme distortion that occurs during switching. However, to fulfill this role without preventing flagellar bend formation, moveable attachments that behave like processive motor proteins must mediate the attachment between the spoke heads and the central-pair structure. PMID:12770914

  10. Structural-functional relationships of the dynein, spokes, and central-pair projections predicted from an analysis of the forces acting within a flagellum.

    PubMed

    Lindemann, Charles B

    2003-06-01

    In the axoneme of eukaryotic flagella the dynein motor proteins form crossbridges between the outer doublet microtubules. These motor proteins generate force that accumulates as linear tension, or compression, on the doublets. When tension or compression is present on a curved microtubule, a force per unit length develops in the plane of bending and is transverse to the long axis of the microtubule. This transverse force (t-force) is evaluated here using available experimental evidence from sea urchin sperm and bull sperm. At or near the switch point for beat reversal, the t-force is in the range of 0.25-1.0 nN/μm, with 0.5 nN/μm the most likely value. This is the case in both beating and arrested bull sperm and in beating sea urchin sperm. The total force that can be generated (or resisted) by all the dyneins on one micron of outer doublet is also approximately 0.5 nN. The equivalence of the maximum dynein force/μm and t-force/μm at the switch point may have important consequences. Firstly, the t-force acting on the doublets near the switch point of the flagellar beat is sufficiently strong that it could terminate the action of the dyneins directly by strongly favoring the detached state and precipitating a cascade of detachment from the adjacent doublet. Secondly, after dynein release occurs, the radial spokes and central-pair apparatus are the structures that must carry the t-force. The spokes attached to the central-pair projections will bear most of the load. The central-pair projections are well-positioned for this role, and they are suitably configured to regulate the amount of axoneme distortion that occurs during switching. However, to fulfill this role without preventing flagellar bend formation, moveable attachments that behave like processive motor proteins must mediate the attachment between the spoke heads and the central-pair structure. PMID:12770914
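
    The t-force in the abstract above follows from elementary mechanics: a filament carrying tension T and bent with local curvature κ = 1/R experiences a transverse force per unit length f = Tκ. A minimal sketch (the tension and radius values below are illustrative assumptions, not measurements from the paper):

```python
def transverse_force_per_length(tension_nN: float, radius_um: float) -> float:
    """Transverse (t-) force per unit length on a filament under tension:
    f = T * kappa = T / R, where R is the local radius of curvature.
    Units: nN of tension and um of radius give nN/um."""
    return tension_nN / radius_um

# Illustrative numbers: a doublet carrying 5 nN of accumulated tension,
# bent to a 10 um radius of curvature, experiences 0.5 nN/um -- the
# abstract's most likely switch-point value.
t_force = transverse_force_per_length(5.0, 10.0)
print(t_force)  # 0.5 nN/um at these values
```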

  11. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high-resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics.
Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPo

  12. Projected Applications of a ``Climate in a Box'' Computing System at the NASA Short-term Prediction Research and Transition (SPoRT) Center

    NASA Astrophysics Data System (ADS)

    Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.

    2010-12-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high-resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed
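
    The vegetation index behind the MODIS composites mentioned in the two SPoRT abstracts is a simple band ratio, and a composite typically keeps the highest value seen over a window to suppress clouds. A minimal per-pixel sketch (the reflectance values are illustrative, not MODIS data):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index for one pixel:
    (NIR - Red) / (NIR + Red); values near +1 indicate dense vegetation."""
    return (nir - red) / (nir + red)

def max_value_composite(ndvi_series):
    """Maximum-value compositing: keep the highest NDVI over the window,
    a common way to discard cloud-contaminated observations."""
    return max(ndvi_series)

# Toy reflectances for one pixel over three overpasses; the middle
# observation mimics a cloudy scene with a depressed NDVI.
observations = [ndvi(0.50, 0.08), ndvi(0.12, 0.10), ndvi(0.45, 0.09)]
print(round(max_value_composite(observations), 3))
```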

  13. Collaborative Project. Understanding the effects of tides and eddies on the ocean dynamics, sea ice cover and decadal/centennial climate prediction using the Regional Arctic Climate Model (RACM)

    SciTech Connect

    Hutchings, Jennifer; Joseph, Renu

    2013-09-14

    The goal of this project is to develop an eddy resolving ocean model (POP) with tides coupled to a sea ice model (CICE) within the Regional Arctic System Model (RASM) to investigate the importance of ocean tides and mesoscale eddies in arctic climate simulations and quantify biases associated with these processes and how their relative contribution may improve decadal to centennial arctic climate predictions. Ocean, sea ice and coupled arctic climate response to these small scale processes will be evaluated with regard to their influence on mass, momentum and property exchange between oceans, shelf-basin, ice-ocean, and ocean-atmosphere. The project will facilitate the future routine inclusion of polar tides and eddies in Earth System Models when computing power allows. As such, the proposed research addresses the science in support of the BER’s Climate and Environmental Sciences Division Long Term Measure as it will improve the ocean and sea ice model components as well as the fully coupled RASM and Community Earth System Model (CESM) and it will make them more accurate and computationally efficient.

  14. Predictive Models and Computational Embryology

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  15. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  16. Project Wild (Project Tame).

    ERIC Educational Resources Information Center

    Siegenthaler, David

    For 37 states in the United States, Project Wild has become an officially sanctioned, distributed, and funded "environmental and conservation education program." For those who are striving to implement focused, sequential learning programs, as well as those who wish to promote harmony through a non-anthropocentric world view, Project Wild may…

  17. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    EPA Science Inventory

    Humans potentially are exposed to thousands of man-made chemicals in the environment. Some chemicals mimic natural endocrine hormones and, thus, have the potential to be endocrine disruptors. Many of these chemicals never have been tested for their ability to interact with the es...

  18. Practical aspects of geological prediction

    SciTech Connect

    Mallio, W.J.; Peck, J.H.

    1981-11-01

    Nuclear waste disposal requires that geology be a predictive science. The prediction of future events rests on (1) recognizing the periodicity of geologic events; (2) defining a critical dimension of effect, such as the area of a drainage basin, the length of a fault trace, etc.; and (3) using our understanding of active processes to project the frequency and magnitude of future events in the light of geological principles. Of importance to nuclear waste disposal are longer-term processes such as continental denudation and removal of materials by glacial erosion. Constant testing of projections will allow the practical limits of predicting geological events to be defined. 11 refs.
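
    The first and third ingredients above, periodicity and projected frequency, are often combined under the first-order assumption that events follow a Poisson process with a known mean recurrence interval. A minimal sketch (the recurrence interval and the waste-isolation window are illustrative assumptions):

```python
import math

def prob_at_least_one(recurrence_interval_yr: float, window_yr: float) -> float:
    """Probability of at least one event in a window, for a Poisson
    process whose mean recurrence interval is recurrence_interval_yr:
    P = 1 - exp(-window / interval)."""
    rate = 1.0 / recurrence_interval_yr
    return 1.0 - math.exp(-rate * window_yr)

# Illustrative: an event with a 10,000-year mean recurrence, projected
# over a 1,000-year performance window.
p = prob_at_least_one(10_000.0, 1_000.0)
print(round(p, 4))
```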

  19. Comparison of initial perturbation methods for the mesoscale ensemble prediction system of the Meteorological Research Institute for the WWRP Beijing 2008 Olympics Research and Development Project (B08RDP)

    NASA Astrophysics Data System (ADS)

    Saito, Kazuo; Hara, Masahiro; Kunii, Masaru; Seko, Hiromu; Yamaguchi, Munehiko

    2011-05-01

    Different initial perturbation methods for the mesoscale ensemble prediction were compared by the Meteorological Research Institute (MRI) as a part of the intercomparison of mesoscale ensemble prediction systems (EPSs) of the World Weather Research Programme (WWRP) Beijing 2008 Olympics Research and Development Project (B08RDP). Five initial perturbation methods for mesoscale ensemble prediction were developed for B08RDP and compared at MRI: (1) a downscaling method of the Japan Meteorological Agency (JMA)'s operational one-week EPS (WEP), (2) a targeted global model singular vector (GSV) method, (3) a mesoscale model singular vector (MSV) method based on the adjoint model of the JMA non-hydrostatic model (NHM), (4) a mesoscale breeding growing mode (MBD) method based on the NHM forecast and (5) a local ensemble transform (LET) method based on the local ensemble transform Kalman filter (LETKF) using NHM. These perturbation methods were applied to the preliminary experiments of the B08RDP Tier-1 mesoscale ensemble prediction with a horizontal resolution of 15 km. To make the comparison easier, the same horizontal resolution (40 km) was employed for the three mesoscale model-based initial perturbation methods (MSV, MBD and LET). The GSV method completely outperformed the WEP method, confirming the advantage of targeting in mesoscale EPS. The GSV method generally performed well with regard to root mean square errors of the ensemble mean, large growth rates of ensemble spreads throughout the 36-h forecast period, and high detection rates and high Brier skill scores (BSSs) for weak rains. On the other hand, the mesoscale model-based initial perturbation methods showed good detection rates and BSSs for intense rains. The MSV method showed a rapid growth in the ensemble spread of precipitation up to a forecast time of 6 h, which suggests suitability of the mesoscale SV for short-range EPSs, but the initial large growth of the perturbation did not last long. The
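
    The Brier skill scores used to rank the perturbation methods above compare a probabilistic forecast's mean squared error with that of a reference forecast, typically climatology. A minimal sketch (the toy rain forecasts are illustrative, not B08RDP data):

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes, climatology):
    """BSS = 1 - BS / BS_ref, where the reference forecast always issues
    the climatological frequency. BSS > 0 means skill over climatology."""
    bs = brier_score(probs, outcomes)
    bs_ref = brier_score([climatology] * len(outcomes), outcomes)
    return 1.0 - bs / bs_ref

# Toy rain forecasts: probabilities issued, then observed occurrence.
probs = [0.9, 0.7, 0.2, 0.1]
outcomes = [1, 1, 0, 0]
print(round(brier_skill_score(probs, outcomes, climatology=0.5), 3))
```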

  20. Graphing Predictions

    ERIC Educational Resources Information Center

    Connery, Keely Flynn

    2007-01-01

    Graphing predictions is especially important in classes where relationships between variables need to be explored and derived. In this article, the author describes how his students sketch the graphs of their predictions before they begin their investigations on two laboratory activities: Distance Versus Time Cart Race Lab and Resistance; and…

  1. Predicting Hurricanes with Supercomputers

    SciTech Connect

    2010-01-01

    Hurricane Emily, formed in the Atlantic Ocean on July 10, 2005, was the strongest hurricane ever to form before August. By checking computer models against the actual path of the storm, researchers can improve hurricane prediction. In 2010, NOAA researchers were awarded 25 million processor-hours on Argonne's BlueGene/P supercomputer for the project. Read more at http://go.usa.gov/OLh

  2. Projects Work!

    ERIC Educational Resources Information Center

    Textor, Martin R.

    2005-01-01

    The great educational value of projects is emphasized by contrasting negative aspects of the life of today's children with the goals of project work. This is illustrated by a project "Shopping." It is shown what children are learning in such projects and what the advantages of project work are. Relevant topic areas, criteria for selecting a…

  3. Toward preclinical predictive drug testing for metabolism and hepatotoxicity by using in vitro models derived from human embryonic stem cells and human cell lines - a report on the Vitrocellomics EU-project.

    PubMed

    Mandenius, Carl-Fredrik; Andersson, Tommy B; Alves, Paula M; Batzl-Hartmann, Christine; Björquist, Petter; Carrondo, Manuel J T; Chesne, Christophe; Coecke, Sandra; Edsbagge, Josefina; Fredriksson, J Magnus; Gerlach, Jörg C; Heinzle, Elmar; Ingelman-Sundberg, Magnus; Johansson, Inger; Küppers-Munther, Barbara; Müller-Vieira, Ursula; Noor, Fozia; Zeilinger, Katrin

    2011-05-01

    Drug-induced liver injury is a common reason for drug attrition in late clinical phases, and even for post-launch withdrawals. As a consequence, there is a broad consensus in the pharmaceutical industry, and within regulatory authorities, that a significant improvement of the current in vitro test methodologies for accurate assessment and prediction of such adverse effects is needed. For this purpose, appropriate in vivo-like hepatic in vitro models are necessary, in addition to novel sources of human hepatocytes. In this report, we describe recent and ongoing research toward the use of human embryonic stem cell (hESC)-derived hepatic cells, in conjunction with new and improved test methods, for evaluating drug metabolism and hepatotoxicity. Recent progress on the directed differentiation of human embryonic stem cells to the functional hepatic phenotype is reported, as well as the development and adaptation of bioreactors and toxicity assay technologies for the testing of hepatic cells. The aim of achieving a testing platform for metabolism and hepatotoxicity assessment, based on hESC-derived hepatic cells, has advanced markedly in the last 2-3 years. However, great challenges still remain, before such new test systems could be routinely used by the industry. In particular, we give an overview of results from the Vitrocellomics project (EU Framework 6) and discuss these in relation to the current state-of-the-art and the remaining difficulties, with suggestions on how to proceed before such in vitro systems can be implemented in industrial discovery and development settings and in regulatory acceptance.

  4. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
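
    The widening effect reported above can be illustrated with a one-step Monte Carlo projection in which migration is either held at a deterministic value (UN-style) or sampled from a distribution. All rates and spreads below are illustrative assumptions, not values from the paper:

```python
import random

def project_population(pop0, growth_draws, migration_draws):
    """One-step stochastic projection: P1 = P0 * (1 + g) + m."""
    return [pop0 * (1.0 + g) + m for g, m in zip(growth_draws, migration_draws)]

def interval_width(draws, lo=0.025, hi=0.975):
    """Width of the central 95% prediction interval."""
    s = sorted(draws)
    return s[int(hi * len(s))] - s[int(lo * len(s))]

random.seed(0)
n = 10_000
pop0 = 100.0  # millions (illustrative)
growth = [random.gauss(0.002, 0.001) for _ in range(n)]

mig_fixed = [0.5] * n                                  # deterministic migration
mig_prob = [random.gauss(0.5, 0.3) for _ in range(n)]  # probabilistic migration

w_fixed = interval_width(project_population(pop0, growth, mig_fixed))
w_prob = interval_width(project_population(pop0, growth, mig_prob))
print(w_prob > w_fixed)  # migration uncertainty widens the interval
```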

  5. Methods of Predicting Solid Waste Characteristics.

    ERIC Educational Resources Information Center

    Boyd, Gail B.; Hawkins, Myron B.

    The project summarized by this report involved a preliminary design of a model for estimating and predicting the quantity and composition of solid waste and a determination of its feasibility. The novelty of the prediction model is that it estimates and predicts on the basis of knowledge of materials and quantities before they become a part of the…

  6. Reliability Prediction

    NASA Technical Reports Server (NTRS)

    1993-01-01

    RELAV, a NASA-developed computer program, enables Systems Control Technology, Inc. (SCT) to predict the performance of aircraft subsystems. RELAV provides a system-level evaluation of a technology. Systems, the mechanism of a landing gear for example, are first described as a set of components performing a specific function. RELAV analyzes the total system and the individual subsystem probabilities to predict success probability and reliability. This information is then translated into operational support and maintenance requirements. SCT provides research and development services in support of government contracts.
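
    The component-to-system roll-up described above is, at its core, series/parallel probability arithmetic. A minimal sketch (the block diagram and the component reliabilities are illustrative assumptions, not RELAV's actual model):

```python
from functools import reduce

def series(reliabilities):
    """Series blocks: every component must work, so reliabilities multiply."""
    return reduce(lambda acc, r: acc * r, reliabilities, 1.0)

def parallel(reliabilities):
    """Parallel (redundant) blocks: the block fails only if all fail."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), reliabilities, 1.0)

# Illustrative landing-gear-style diagram: two redundant actuators,
# in series with a position sensor and a hydraulic valve.
actuators = parallel([0.95, 0.95])        # 1 - 0.05 * 0.05 = 0.9975
system = series([actuators, 0.99, 0.98])
print(round(system, 4))
```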

  7. VIPER project

    NASA Technical Reports Server (NTRS)

    Kershaw, John

    1990-01-01

    The VIPER project has so far produced a formal specification of a 32 bit RISC microprocessor, an implementation of that chip in radiation-hard SOS technology, a partial proof of correctness of the implementation which is still being extended, and a large body of supporting software. The time has now come to consider what has been achieved and what directions should be pursued in the future. The most obvious lesson from the VIPER project was the time and effort needed to use formal methods properly. Most of the problems arose in the interfaces between different formalisms, e.g., between the (informal) English description and the HOL spec, between the block-level spec in HOL and the equivalent in ELLA needed by the low-level CAD tools. These interfaces need to be made rigorous or (better) eliminated. VIPER 1A (the latest chip) is designed to operate in pairs, to give protection against breakdowns in service as well as design faults. We have come to regard redundancy and formal design methods as complementary, the one to guard against normal component failures and the other to provide insurance against the risk of the common-cause failures which bedevil reliability predictions. Any future VIPER chips will certainly need improved performance to keep up with increasingly demanding applications. We have a prototype design (not yet specified formally) which includes 32 and 64 bit multiply, instruction pre-fetch, more efficient interface timing, and a new instruction to allow a quick response to peripheral requests. Work is under way to specify this device in MIRANDA, and then to refine the spec into a block-level design by top-down transformations. When the refinement is complete, a relatively simple proof checker should be able to demonstrate its correctness. This paper is presented in viewgraph form.

  8. Project management controls

    SciTech Connect

    Hardin, D.S.; Carnes, W.S.

    1990-12-31

    Project management controls are utilized to enhance the probability that a project will be successful. The control system used by a project manager can take many forms and can be applied at different times to varying degrees on a given project depending upon its complexity. The Consolidated Incineration Facility (CIF) is one project of many at the Savannah River Site (SRS). The United States Department of Energy Order 4700.1 is a project management system that is applied on a site-wide basis, thus including the CIF. The control system required by this order is proceduralized to ensure that it is applied in a consistent manner and will produce reliable results. These results provide the project manager with a correlation of both costs and schedule within the defined scope to adequately assess the status of the project. This is an iterative process and can be simply stated: plan, actual, variance, corrective action, prediction, and revision. This paper presents the basis for the project management controls applied at the Savannah River Site.

  9. Project management controls

    SciTech Connect

    Hardin, D.S.; Carnes, W.S.

    1990-01-01

    Project management controls are utilized to enhance the probability that a project will be successful. The control system used by a project manager can take many forms and can be applied at different times to varying degrees on a given project depending upon its complexity. The Consolidated Incineration Facility (CIF) is one project of many at the Savannah River Site (SRS). The United States Department of Energy Order 4700.1 is a project management system that is applied on a site-wide basis, thus including the CIF. The control system required by this order is proceduralized to ensure that it is applied in a consistent manner and will produce reliable results. These results provide the project manager with a correlation of both costs and schedule within the defined scope to adequately assess the status of the project. This is an iterative process and can be simply stated: plan, actual, variance, corrective action, prediction, and revision. This paper presents the basis for the project management controls applied at the Savannah River Site.

  10. Successful Predictions

    NASA Astrophysics Data System (ADS)

    Pierrehumbert, R.

    2012-12-01

    In an observational science, it is not possible to test hypotheses through controlled laboratory experiments. One can test parts of the system in the lab (as is done routinely with infrared spectroscopy of greenhouse gases), but the collective behavior cannot be tested experimentally because a star or planet cannot be brought into the lab; it must, instead, itself be the lab. In the case of anthropogenic global warming, this is all too literally true, and the experiment would be quite exciting if it weren't for the unsettling fact that we and all our descendants for the foreseeable future will have to continue making our home in the lab. There are nonetheless many routes through which the validity of a theory of the collective behavior can be determined. A convincing explanation must not be a "just-so" story, but must make additional predictions that can be verified against observations that were not originally used in formulating the theory. The field of Earth and planetary climate has racked up an impressive number of such predictions. I will also admit as "predictions" statements about things that happened in the past, provided that observations or proxies pinning down the past climate state were not available at the time the prediction was made. The basic prediction that burning of fossil fuels would lead to an increase of atmospheric CO2, and that this would in turn alter the Earth's energy balance so as to cause tropospheric warming, is one of the great successes of climate science. It began in the lineage of Fourier, Tyndall and Arrhenius, and was largely complete with the radiative-convective modeling work of Manabe in the 1960s -- all well before the expected warming had progressed far enough to be observable. Similarly, long before the increase in atmospheric CO2 could be detected, Bolin formulated a carbon cycle model and used it to predict atmospheric CO2 out to the year 2000; the actual values come in at the high end of his predicted range, for
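
    The Fourier-Tyndall-Arrhenius chain summarized above is often condensed into the simplified logarithmic forcing expression of Myhre et al. (1998), ΔF = 5.35 ln(C/C₀) W/m². A minimal sketch (the preindustrial reference C₀ = 278 ppm and the 420 ppm modern value are standard approximate figures, not numbers from this abstract):

```python
import math

def co2_forcing(c_ppm: float, c0_ppm: float = 278.0) -> float:
    """Simplified logarithmic CO2 radiative forcing (Myhre et al. 1998):
    dF = 5.35 * ln(C / C0), in W/m^2, relative to preindustrial C0."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing at roughly present-day CO2, and at a doubling of CO2:
print(round(co2_forcing(420.0), 2))      # W/m^2 relative to preindustrial
print(round(co2_forcing(2 * 278.0), 2))  # ~3.7 W/m^2 per doubling
```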

  11. ENSO predictability

    NASA Astrophysics Data System (ADS)

    Larson, Sarah Michelle

    The overarching goal of this work is to explore seasonal El Nino -- Southern Oscillation (ENSO) predictability. More specifically, this work investigates how intrinsic variability affects ENSO predictability using a state-of-the-art climate model. Topics related to the effects of systematic model errors and external forcing are not included in this study. Intrinsic variability encompasses a hierarchy of temporal and spatial scales, from high-frequency, small-scale noise-driven processes, including coupled instabilities, to low-frequency, large-scale deterministic climate modes. The former exemplifies what can be considered intrinsic "noise" in the climate system that hinders predictability by promoting rapid error growth, whereas the latter often provides the slow thermal ocean inertia that supplies the coupled ENSO system with predictability. These two ends of the spectrum essentially provide the lower and upper bounds of ENSO predictability that can be attributed to internal variability. The effects of noise-driven coupled instabilities on sea surface temperature (SST) predictability in the ENSO region are quantified by utilizing a novel coupled model methodology paired with an ensemble approach. The experimental design allows for rapid growth of intrinsic perturbations that are not prescribed. Several cases exhibit sufficiently rapid growth to produce ENSO-like final states that do not require a previous ENSO event, large-scale wind trigger, or subsurface heat content precursor. Results challenge conventional ENSO theory that considers the subsurface precursor a necessary condition for ENSO. Noise-driven SST error growth exhibits strong seasonality and dependence on the initialization month. A dynamical analysis reveals that much of the error growth behavior is linked to the seasonal strength of the Bjerknes feedback in the model, indicating that the noise-induced perturbations grow via an ENSO-like mechanism. The daily error fields reveal that persistent

  12. Prediction techniques

    NASA Astrophysics Data System (ADS)

    1981-12-01

    Prediction methods and related propagation results for the evaluation of Earth-space communication paths operating above 10 GHz are presented. Gaseous attenuation, rain, cloud, fog, sand, and dust attenuation, path diversity, signal fluctuations and low angle fading, depolarization effects, bandwidth coherence, and sky noise are considered.

  13. 1986-87 atomic mass predictions

    SciTech Connect

    Haustein, P.E.

    1987-12-10

    A project to perform a comprehensive update of the atomic mass predictions has recently been concluded and will be published shortly in Atomic Data and Nuclear Data Tables. The project evolved from an ongoing comparison between available mass predictions and reports of newly measured masses of isotopes throughout the mass surface. These comparisons have highlighted a variety of features in current mass models which are responsible for predictions that diverge from masses determined experimentally. The need for a comprehensive update of the atomic mass predictions was therefore apparent and the project was organized and began at the last mass conference (AMCO-VII). Project participants included: Pape and Anthony; Dussel, Caurier and Zuker; Moeller and Nix; Moeller, Myers, Swiatecki and Treiner; Comay, Kelson, and Zidon; Satpathy and Nayak; Tachibana, Uno, Yamada and Yamada; Spanier and Johansson; Jaenecke and Masson; and Wapstra, Audi and Hoekstra. An overview of the new atomic mass predictions may be obtained by written request.

  14. The contoured auricular projection graft for nasal tip projection.

    PubMed

    Porter, J P; Tardy, M E; Cheng, J

    1999-01-01

    In all rhinoplasty surgery, the universal need exists to increase, decrease, or preserve existing tip projection. When proper tip projection is lacking, a variety of techniques are useful for improving projection. We describe a valuable technique for tip projection, particularly useful and indicated in Asian rhinoplasty, African American rhinoplasty, and certain revision rhinoplasties. In the past 15 years, the senior author (M.E.T.) has used the contoured auricular projection graft to achieve satisfactory tip projection in selected patients with blunted tips. The aesthetic outcomes have been predictable, pleasing, and reliable over the long term. Precise pocket preparation for auricular conchal cartilage graft placement is key to the symmetry and projection of the final outcome. The results yielded a rounded nasal tip that may be more natural-appearing in Asians, African Americans, and selected revision rhinoplasty patients. The contoured auricular projection graft provides a highly useful graft for the nasal tip. PMID:10937122

  15. Evaluating the Predictive Value of Growth Prediction Models

    ERIC Educational Resources Information Center

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  16. Final Technical Report: Increasing Prediction Accuracy.

    SciTech Connect

    King, Bruce Hardison; Hansen, Clifford; Stein, Joshua

    2015-12-01

    PV performance models are used to quantify the value of PV plants in a given location. They combine the performance characteristics of the system, the measured or predicted irradiance and weather at a site, and the system configuration and design into a prediction of the amount of energy that will be produced by a PV system. These predictions must be as accurate as possible in order for finance charges to be minimized. Higher accuracy equals lower project risk. The Increasing Prediction Accuracy project at Sandia focuses on quantifying and reducing uncertainties in PV system performance models.

  17. Project Gifted.

    ERIC Educational Resources Information Center

    Cranston School Dept., RI.

    Covered in the short discussion of Project Gifted for Intermediate grade children are program description, instructional strategy, classification of question categories to cue various levels of thinking, traits common to intellectually gifted students, and procedure for selection of students participating in Project Gifted. Project Gifted is…

  18. Biorhythms and the Prediction of Suicide Behavior.

    ERIC Educational Resources Information Center

    Dezelsky, Thomas L.; Toohey, Jack V.

    1978-01-01

    Statistical analysis of the data in this research project indicates that neither the physical, emotional, nor intellectual cycles can be used to predict suicide behavior and also that biorhythms are influenced by environmental variations. (DS)

  19. A Course in... Model Predictive Control.

    ERIC Educational Resources Information Center

    Arkun, Yaman; And Others

    1988-01-01

    Describes a graduate engineering course which specializes in model predictive control. Lists course outline and scope. Discusses some specific topics and teaching methods. Suggests final projects for the students. (MVL)

  20. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  1. Discussion of the design of satellite-laser measurement stations in the eastern Mediterranean under the geological aspect. Contribution to the earthquake prediction research by the Wegener Group and to NASA's Crustal Dynamics Project

    NASA Technical Reports Server (NTRS)

    Paluska, A.; Pavoni, N.

    1983-01-01

    Research conducted for determining the location of stations for measuring crustal dynamics and predicting earthquakes is discussed. Procedural aspects, the extraregional kinematic tendencies, and regional tectonic deformation mechanisms are described.

  2. Word prediction

    SciTech Connect

    Rumelhart, D.E.; Skokowski, P.G.; Martin, B.O.

    1995-05-01

    In this project we have developed a language model based on Artificial Neural Networks (ANNs) for use in conjunction with automatic textual search or speech recognition systems. The model can be trained on large corpora of text to produce probability estimates that would improve the ability of systems to identify words in a sentence given partial contextual information. The model uses a gradient-descent learning procedure to develop a metric of similarity among terms in a corpus, based on context. Using lexical categories based on this metric, a network can then be trained to do serial word probability estimation. Such a metric can also be used to improve the performance of topic-based search by allowing retrieval of information that is related to desired topics even if no obvious set of key words unites all the retrieved items.
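    The serial word-probability estimation described above can be sketched with a toy next-word model trained by gradient descent. This is an illustrative reconstruction, not the authors' network: the corpus, the single-layer softmax architecture, and the learning rate are all assumptions made for the sketch.

```python
import numpy as np

# Toy corpus; the original model was trained on large corpora of text.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.1, (V, V))  # weights: previous word -> next-word logits


def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()


# Gradient-descent training on (previous word, next word) pairs.
pairs = [(idx[a], idx[b]) for a, b in zip(corpus, corpus[1:])]
lr = 0.5
for _ in range(500):
    for prev, nxt in pairs:
        p = softmax(W[prev])
        grad = p.copy()
        grad[nxt] -= 1.0           # d(cross-entropy)/d(logits)
        W[prev] -= lr * grad


def next_word_probs(word):
    """Probability estimate over the next word, given the previous word."""
    return dict(zip(vocab, softmax(W[idx[word]])))


probs = next_word_probs("sat")  # "sat" is always followed by "on" in the corpus
```

    Partial contextual information (here, just the previous word) is enough to sharpen the probability estimate; the model learns that "on" is the likely continuation of "sat".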

  3. Graduate Student Project: Operations Management Product Plan

    ERIC Educational Resources Information Center

    Fish, Lynn

    2007-01-01

    An operations management product project is an effective instructional technique that fills a void in current operations management literature in product planning. More than 94.1% of 286 graduates favored the project as a learning tool, and results demonstrate the significant impact the project had in predicting student performance. The author…

  4. Map projections

    USGS Publications Warehouse

    ,

    1993-01-01

    A map projection is used to portray all or part of the round Earth on a flat surface. This cannot be done without some distortion. Every projection has its own set of advantages and disadvantages. There is no "best" projection. The mapmaker must select the one best suited to the needs, reducing distortion of the most important features. Mapmakers and mathematicians have devised almost limitless ways to project the image of the globe onto paper. Scientists at the U. S. Geological Survey have designed projections for their specific needs—such as the Space Oblique Mercator, which allows mapping from satellites with little or no distortion. This document gives the key properties, characteristics, and preferred uses of many historically important projections and of those frequently used by mapmakers today.
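    The trade-offs described above can be made concrete with the classic spherical Mercator formulas. This is a generic textbook sketch, not a USGS implementation; the spherical Earth radius and kilometre units are illustrative assumptions.

```python
import math


def mercator(lat_deg, lon_deg, radius_km=6371.0):
    """Forward spherical Mercator projection.

    Shapes are preserved locally (the projection is conformal), but the
    scale grows as 1/cos(latitude), so areas are exaggerated toward the
    poles, and the poles themselves cannot be mapped (y -> infinity).
    """
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = radius_km * lam
    y = radius_km * math.log(math.tan(math.pi / 4.0 + phi / 2.0))
    return x, y


# The scale factor sec(latitude) has already doubled by 60 degrees:
scale_60 = 1.0 / math.cos(math.radians(60.0))
```

    The doubling of scale at 60 degrees latitude is one concrete instance of the distortion every projection must trade away somewhere.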

  5. Initial value predictability of intrinsic oceanic modes and implications for decadal prediction over North America

    SciTech Connect

    Branstator, Grant

    2014-12-09

    The overall aim of our project was to quantify and characterize predictability of the climate as it pertains to decadal time scale predictions. By predictability we mean the degree to which a climate forecast can be distinguished from the climate that exists at the initial forecast time, taking into consideration the growth of uncertainty that occurs as a result of the climate system being chaotic. In our project we were especially interested in predictability that arises from initializing forecasts from some specific state, though we also contrast this predictability with predictability arising from forecasting the reaction of the system to external forcing – for example, changes in greenhouse gas concentration. Also, we put special emphasis on the predictability of prominent intrinsic patterns of the system because they often dominate system behavior. Highlights from this work include:
    • Development of novel methods for estimating the predictability of climate forecast models.
    • Quantification of the initial value predictability limits of ocean heat content and the overturning circulation in the Atlantic as they are represented in various state-of-the-art climate models. These limits varied substantially from model to model but on average were about a decade, with North Atlantic heat content tending to be more predictable than North Pacific heat content.
    • Comparison of predictability resulting from knowledge of the current state of the climate system with predictability resulting from estimates of how the climate system will react to changes in greenhouse gas concentrations. It turned out that knowledge of the initial state produces a larger impact on forecasts for the first 5 to 10 years of projections.
    • Estimation of the predictability of dominant patterns of ocean variability, including well-known patterns of variability in the North Pacific and North Atlantic. For the most part these patterns were predictable for 5 to 10 years.
    • Determination of

  6. Final project report

    SciTech Connect

    Nitin S. Baliga and Leroy Hood

    2008-11-12

    The proposed overarching goal for this project was the following: data integration, simulation, and visualization will facilitate metabolic and regulatory network prediction, exploration, and formulation of hypotheses. We stated three specific aims to achieve the overarching goal of this project: (1) Integration of multiple levels of information, such as mRNA and protein levels, predicted protein-protein interactions/associations, and gene function, will enable construction of models describing environmental response and dynamic behavior. (2) Flexible tools for network inference will accelerate our understanding of biological systems. (3) Flexible exploration and queries of model hypotheses will provide focus and reveal novel dependencies. The underlying philosophy of these aims is that an iterative cycle of experiments, experimental design, and verification will lead to a comprehensive and predictive model that will shed light on systems-level mechanisms involved in responses elicited by living systems upon sensing a change in their environment. In the previous year's report we demonstrated considerable progress in the development of data standards, regulatory network inference, and data visualization and exploration. We are pleased to report that several manuscripts describing these procedures have been published in top international peer-reviewed journals, including Genome Biology, PNAS, and Cell. The abstracts of these manuscripts are given, and they summarize our accomplishments in this project.

  7. Solar prediction and intelligent machines

    NASA Technical Reports Server (NTRS)

    Johnson, Gordon G.

    1987-01-01

    The solar prediction program is aimed at reducing or eliminating the need to thoroughly understand the previously developed process while still being able to produce a prediction. Substantial progress was made in identifying the procedures to be coded, as well as in testing some of the presently coded work. Another project involves developing ideas and software that should result in a machine capable of learning as well as carrying on an intelligent conversation over a wide range of topics. The underlying idea is to use primitive ideas and construct higher-order ideas from these, which can then be easily related one to another.

  8. Project EASIER.

    ERIC Educational Resources Information Center

    Alvord, David J.; Tack, Leland R.; Dallam, Jerald W.

    1998-01-01

    Describes the development of Project EASIER, a collaborative electronic-data interchange for networking Iowa local school districts, education agencies, community colleges, universities, and the Department of Education. The primary goal of this project is to develop and implement a system for collection of student information for state and federal…

  9. A mathematical model for predicting the probability of acute mortality in a human population exposed to accidentally released airborne radionuclides. Final report for Phase I of the project: early effects of inhaled radionuclides

    SciTech Connect

    Filipy, R.E.; Borst, F.J.; Cross, F.T.; Park, J.F.; Moss, O.R.

    1980-06-01

    The report presents a mathematical model for the purpose of predicting the fraction of human population which would die within 1 year of an accidental exposure to airborne radionuclides. The model is based on data from laboratory experiments with rats, dogs and baboons, and from human epidemiological data. Doses from external, whole-body irradiation and from inhaled, alpha- and beta-emitting radionuclides are calculated for several organs. The probabilities of death from radiation pneumonitis and from bone marrow irradiation are predicted from doses accumulated within 30 days of exposure to the radioactive aerosol. The model is compared with existing similar models under hypothetical exposure conditions. Suggestions for further experiments with inhaled radionuclides are included.

  10. A Game Theoretic Approach to Cyber Attack Prediction

    SciTech Connect

    Peng Liu

    2005-11-28

    The area investigated by this project is cyber attack prediction. Current attack prediction methodologies focus on correlation-based prediction and overlook the strategic nature of cyber attack-defense scenarios. As a result, current cyber attack prediction methodologies are very limited in predicting the strategic behaviors of attackers enforcing nontrivial cyber attacks such as DDoS attacks, and may result in low accuracy in correlation-based predictions. This project develops a game theoretic framework for cyber attack prediction, in which an automatic game-theory-based attack prediction method is proposed. By quantitatively predicting the likelihood of (sequences of) attack actions, our attack prediction methodology can predict fine-grained strategic behaviors of attackers and may greatly improve the accuracy of correlation-based prediction. To the best of our knowledge, this project develops the first comprehensive framework for incentive-based modeling and inference of attack intent, objectives, and strategies, and the first method that can predict fine-grained strategic behaviors of attackers. The significance of this research and its benefit to the public can be demonstrated to a certain extent by (a) the severe threat of cyber attacks to the critical infrastructures of the nation, including many infrastructures overseen by the Department of Energy, (b) the importance of cyber security to critical infrastructure protection, and (c) the importance of cyber attack prediction to achieving cyber security.
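    As a rough illustration of the incentive-based idea (not the project's actual framework), an attacker/defender interaction can be cast as a zero-sum matrix game and the attacker's equilibrium mix, i.e., the predicted likelihoods of attack actions, recovered by fictitious play. The payoff numbers below are invented for the sketch.

```python
import numpy as np

# Hypothetical 2x2 zero-sum game: attacker payoff for each
# (attack action, defense action) pair. Rows: attacker actions
# (e.g., DDoS vs. probe); columns: defender actions.
payoff = np.array([[2.0, -1.0],
                   [-1.0, 1.0]])

# Fictitious play: each side best-responds to the opponent's
# empirical mixture of past plays; in zero-sum games the empirical
# mixtures converge to a Nash equilibrium.
rows, cols = payoff.shape
row_counts = np.zeros(rows)
col_counts = np.zeros(cols)
row_counts[0] = col_counts[0] = 1.0
for _ in range(20_000):
    col_mix = col_counts / col_counts.sum()
    row_counts[np.argmax(payoff @ col_mix)] += 1.0   # attacker maximizes
    row_mix = row_counts / row_counts.sum()
    col_counts[np.argmin(row_mix @ payoff)] += 1.0   # defender minimizes

# Predicted likelihoods of the attacker's actions at equilibrium
# (analytically 0.4 / 0.6 for this payoff matrix).
attacker_mix = row_counts / row_counts.sum()
```

    The equilibrium mix is what a correlation-only predictor cannot see: it reflects the attacker's incentives rather than past event patterns alone.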

  11. TIARA project

    NASA Astrophysics Data System (ADS)

    Malecki, P.

    2013-10-01

    The aim of the Test Infrastructure and Accelerator Research Area (TIARA) project [1] is to consolidate and support the European R&D program in the field of the physics and techniques of particle accelerators. This project, partially funded by the European Commission, groups 11 participants from 8 European countries, including Poland. Its present three-year (2011-2013) preparatory phase (PP) is briefly described in this paper. The project is divided into 9 work packages (WP). We concentrate on four of them, dedicated to governance, R&D infrastructures, joint R&D programming, and education and training, in which Polish participants are actively involved.

  12. Project Reptile!

    ERIC Educational Resources Information Center

    Diffily, Deborah

    2001-01-01

    Integrating curriculum is important in helping children make connections within and among areas. Presents a class project for kindergarten children which came out of the students' interests and desire to build a reptile exhibit. (ASK)

  13. Project Summaries

    ERIC Educational Resources Information Center

    Journal of Architectural Education, 1974

    1974-01-01

    Fellows of the Association of Collegiate Schools of Architecture Environmental Experience Stipends Program describe their project activities for 1973-74 including: an inservice course for teachers, television programs, graduate courses, high school courses, and workshops. (Author/PG)

  14. Alzheimer's Project

    MedlinePlus

    "The Memory Loss Tapes" ... "THE ALZHEIMER'S PROJECT" is a presentation of HBO Documentary Films and the National Institute on Aging at the ...

  15. Geodynamics Project

    ERIC Educational Resources Information Center

    Drake, Charles L.

    1977-01-01

    Describes activities of Geodynamics Project of the Federal Council on Science and Technology, such as the application of multichannel seismic-reflection techniques to study the nature of the deep crust and upper mantle. (MLH)

  16. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. Part 2: Theoretical development of a dynamic model and application to rain fade durations and tolerable control delays for fade countermeasures

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1987-01-01

    A dynamic rain attenuation prediction model is developed for use in obtaining the temporal characteristics, on time scales of minutes or hours, of satellite communication link availability. Analogous to the associated static rain attenuation model, which yields yearly attenuation predictions, this dynamic model is applicable at any location in the world that is characterized by the static rain attenuation statistics peculiar to the geometry of the satellite link and the rain statistics of the location. Such statistics are calculated by employing the formalism of Part I of this report. In fact, the dynamic model presented here is an extension of the static model and reduces to the static model in the appropriate limit. By assuming that rain attenuation is dynamically described by a first-order stochastic differential equation in time and that this random attenuation process is a Markov process, an expression for the associated transition probability is obtained by solving the related forward Kolmogorov equation. This transition probability is then used to obtain such temporal rain attenuation statistics as attenuation durations and allowable attenuation margins versus control system delay.
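    The first-order Markov description lends itself to a quick Monte Carlo sketch. The snippet below simulates a process in which the log of attenuation follows a first-order (Ornstein-Uhlenbeck-type) stochastic differential equation and then tallies empirical fade durations above a margin. The rate parameter, lognormal statistics, and 3 dB margin are invented for illustration; the model in the report derives such fade statistics analytically from the forward Kolmogorov equation rather than by simulation.

```python
import numpy as np

# Hypothetical parameters; the actual model ties the lognormal statistics
# (median A_m, log-spread sigma) and the rate beta to the site's static
# rain attenuation statistics.
beta = 2e-4              # 1/s, dynamic rate parameter of the process
A_m, sigma = 1.0, 1.0    # lognormal median attenuation (dB) and log-spread
dt, n = 1.0, 200_000     # 1 s time step, ~2.3 days of samples
rng = np.random.default_rng(1)

# First-order SDE for x = ln(A / A_m), integrated by Euler-Maruyama:
#   dx = -beta * x * dt + sigma * sqrt(2 * beta) * dW
x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = (x[k] - beta * x[k] * dt
                + sigma * np.sqrt(2.0 * beta * dt) * rng.standard_normal())
A = A_m * np.exp(x)      # attenuation time series (dB)

# Empirical fade durations: lengths of excursions above a 3 dB margin.
durations, run = [], 0
for exceeded in A > 3.0:
    if exceeded:
        run += 1
    elif run:
        durations.append(run * dt)
        run = 0
mean_fade_s = float(np.mean(durations)) if durations else 0.0
```

    Statistics such as `mean_fade_s` are what a fade countermeasure controller needs: they bound how long a margin will be exceeded and hence the tolerable control delay.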

  17. Fracture Toughness Prediction for MWCNT Reinforced Ceramics

    SciTech Connect

    Henager, Charles H.; Nguyen, Ba Nghiep

    2013-09-01

    This report describes the development of a micromechanics model to predict fracture toughness of multiwall carbon nanotube (MWCNT) reinforced ceramic composites to guide future experimental work for this project. The modeling work described in this report includes (i) prediction of elastic properties, (ii) development of a mechanistic damage model accounting for matrix cracking to predict the composite nonlinear stress/strain response to tensile loading to failure, and (iii) application of this damage model in a modified boundary layer (MBL) analysis using ABAQUS to predict fracture toughness and crack resistance behavior (R-curves) for ceramic materials containing MWCNTs at various volume fractions.

  18. Maximum Capital Project Management.

    ERIC Educational Resources Information Center

    Adams, Matt

    2002-01-01

    Describes the stages of capital project planning and development: (1) individual capital project submission; (2) capital project proposal assessment; (3) executive committee; and (4) capital project execution. (EV)

  19. LLAMA Project

    NASA Astrophysics Data System (ADS)

    Arnal, E. M.; Abraham, Z.; Giménez de Castro, G.; de Gouveia dal Pino, E. M.; Larrarte, J. J.; Lepine, J.; Morras, R.; Viramonte, J.

    2014-10-01

    The LLAMA project, acronym of Long Latin American Millimetre Array, is very briefly described in this paper. This project is a joint scientific and technological undertaking of Argentina and Brazil on the basis of an equal investment share, whose main goal is both to install and to operate an observing facility capable of exploring the Universe at millimetre and sub-millimetre wavelengths. This facility will be erected in the Argentine province of Salta, at a site located 4830 m above sea level.

  20. Cloudnet Project

    DOE Data Explorer

    Hogan, Robin

    2008-01-15

    Cloudnet is a research project supported by the European Commission. This project aims to use data obtained quasi-continuously for the development and implementation of cloud remote sensing synergy algorithms. The use of active instruments (lidar and radar) results in detailed vertical profiles of important cloud parameters which cannot be derived from current satellite sensing techniques. A network of three already existing cloud remote sensing stations (CRS-stations) will be operated for a two-year period, activities will be co-ordinated, data formats harmonised, and analysis of the data performed to evaluate the representation of clouds in four major European weather forecast models.

  1. Making detailed predictions makes (some) predictions worse

    NASA Astrophysics Data System (ADS)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  2. NDLGS project update

    SciTech Connect

    Lienert, Thomas J; Sutton, Jacob O; Piltch, Martin S; Lujan, Dennis J

    2011-01-14

    Recent results for laser and ESD processing for the NDLGS project will be reviewed. Conclusions are: (1) Short mix passes have a profound effect on window T; (2) Multiple drill and re-weld at a single location has been shown to be feasible and successful; (3) A Kapton beam profiling method has been successfully developed, and comparison of 100 mm and 120 mm lenses gives reasonable and consistent results; (4) Manifold pumpdown data has been presented; (5) ESO results can be accurately predicted once a repeatable efficiency has been established; and (6) The electrode-workpiece geometry may play an important role in ESO efficiency. Experiments are planned to investigate these effects.

  3. Spent Nuclear Fuel project, project management plan

    SciTech Connect

    Fuquay, B.J.

    1995-10-25

    The Hanford Spent Nuclear Fuel Project has been established to safely store spent nuclear fuel at the Hanford Site. This Project Management Plan sets forth the management basis for the Spent Nuclear Fuel Project. The plan applies to all fabrication and construction projects, operation of the Spent Nuclear Fuel Project facilities, and necessary engineering and management functions within the scope of the project.

  4. Project Boomerang

    ERIC Educational Resources Information Center

    King, Allen L.

    1975-01-01

    Describes an experimental project on boomerangs designed for an undergraduate course in classical mechanics. The students designed and made their own boomerangs, devised their own procedures, and carried out suitable measurements. Presents some of their data and a simple analysis for the two-bladed boomerang. (Author/MLH)

  5. Project SUCCEED.

    ERIC Educational Resources Information Center

    Yarger, Sam; Klingner, Janette

    This paper describes Project SUCCEED (School University Community Coalition for Excellence in Education). The coalition includes the University of Miami School of Education, the University of Miami College of Arts and Sciences, Miami-Dade County Public Schools, and the Miami Museum of Science. The goal is to provide a comprehensive approach to…

  6. Project COLD.

    ERIC Educational Resources Information Center

    Kazanjian, Wendy C.

    1982-01-01

    Describes Project COLD (Climate, Ocean, Land, Discovery), a scientific study of the Polar Regions: a collection of 35 modules used within the framework of existing subjects: oceanography, biology, geology, meteorology, geography, and social science. Includes a partial list of topics and one activity (geodesic dome) from a module. (Author/SK)

  7. Project Notes

    ERIC Educational Resources Information Center

    School Science Review, 1977

    1977-01-01

    Listed and described are student A-level biology projects in the following areas: Angiosperm studies (e.g., factors affecting growth of various plants), 7; Bacterial studies, 1; Insect studies, 2; Fish studies, 1; Mammal studies, 1; Human studies, 1; Synecology studies, 2; Environmental studies, 2; and Enzyme studies, 1. (CS)

  8. Project Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1979

    1979-01-01

    Listed are 32 biology A-level projects, categorized by organisms studied as follows: algae (1), bryophytes (1), angiosperms (14), fungi (1), flatworms (1), annelids (2), molluscs (1), crustaceans (2), insects (4), fish (2), mammals (1), humans (1); and one synecological study. (CS)

  9. Viticultural Project

    ERIC Educational Resources Information Center

    Kaminske, Volker

    2005-01-01

    The EU offers opportunities to schools - within the Comenius Programme - to develop and implement specific studies in which pedagogical, lingual and science-oriented objectives can be internationally combined if schools from at least three European countries are ready to share in a common project. Geography seems to be especially suited for such…

  10. Project Reconstruct.

    ERIC Educational Resources Information Center

    Helisek, Harriet; Pratt, Donald

    1994-01-01

    Presents a project in which students monitor their use of trash, input and analyze information via a database and computerized graphs, and "reconstruct" extinct or endangered animals from recyclable materials. The activity was done with second-grade students over a period of three to four weeks. (PR)

  11. Thanksgiving Project

    ERIC Educational Resources Information Center

    Hilden, Pauline

    1976-01-01

    A teacher describes a Thanksgiving project in which 40 educable mentally retarded students (6-13 years old) made and served their own dinner of stew, butter, bread, ice cream, and pie, and in the process learned about social studies, cooking, and proper meal behavior. (CL)

  12. Projected Identities

    ERIC Educational Resources Information Center

    Anderson, Mark Alan

    2006-01-01

    This article presents the idea behind Projected Identities, an art activity wherein students fuse art-making processes and digital image manipulations in a series of exploratory artistic self-examinations. At some point in every person's life they've been told something hard to forget. Students might, for example, translate phrases like, "Good…

  13. Project CAST.

    ERIC Educational Resources Information Center

    Charles County Board of Education, La Plata, MD. Office of Special Education.

    The document outlines procedures for implementing Project CAST (Community and School Together), a community-based career education program for secondary special education students in Charles County, Maryland. Initial sections discuss the role of a learning coordinator, (including relevant travel reimbursement and mileage forms) and an overview of…

  14. Project Narrative

    SciTech Connect

    Driscoll, Mary C.

    2012-07-12

    The Project Narrative describes how the funds from the DOE grant were used to purchase equipment for the biology, chemistry, physics and mathematics departments. The Narrative also describes how the equipment is being used. There is also a list of the positive outcomes as a result of having the equipment that was purchased with the DOE grant.

  15. Project Succeed.

    ERIC Educational Resources Information Center

    Patterson, John

    Project Succeed is a program for helping failure- and dropout-oriented pupils to improve their school achievement. Attendance and assignment completion are the key behaviors for enhancing achievement. Behavior modification and communications procedures are used to bring about the desired changes. Treatment procedures include current assessment…

  16. Project Katrina

    ERIC Educational Resources Information Center

    Aghayan, Carol; Schellhaas, Andree; Wayne, Angela; Burts, Diane C.; Buchanan, Teresa K.; Benedict, Joan

    2005-01-01

    This article describes a spontaneous project that emerged from a group of 3- and 4-year-old children in Louisiana after Hurricane Katrina. The article describes how the teachers adapted the classroom and curriculum to meet the diverse needs of children who were evacuees, as well as those children who were affected in other ways by the…

  17. Limnological Projects.

    ERIC Educational Resources Information Center

    Hambler, David J.; Dixon, Jean M.

    1982-01-01

    Describes collection of quantitative samples of microorganisms and accumulation of physical data from a pond over a year. Provides examples of how final-year degree students have used materials and data for ecological projects (involving mainly algae), including their results/conclusions. Also describes apparatus and reagents used in the student…

  18. Enrollment Projections.

    ERIC Educational Resources Information Center

    Lee, John

    2000-01-01

Higher education enrollment is going through a transition. Between 1992 and 1998, the enrollment growth rate was nearly flat, but the National Center for Education Statistics now projects that enrollment will increase by 1.4% annually during the next decade. Not every college and university will realize this growth. The traditional college…
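The projected 1.4% annual growth compounds over the decade. A minimal sketch of the arithmetic, with an illustrative baseline enrollment (not a figure from the report):

```python
def project_enrollment(current: float, annual_rate: float, years: int) -> float:
    """Compound a constant annual growth rate over `years`."""
    return current * (1.0 + annual_rate) ** years

# Illustrative only: a hypothetical 14.5 million students
# growing 1.4% per year over a decade.
print(round(project_enrollment(14.5e6, 0.014, 10)))
```

Over ten years the compounding adds roughly 15% to the baseline, which is why even a modest annual rate matters for capacity planning.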

  19. The 1986-87 atomic mass predictions

    NASA Astrophysics Data System (ADS)

    Haustein, P. E.

    1987-12-01

    A project to perform a comprehensive update of the atomic mass predictions has recently been concluded and will be published shortly in Atomic Data and Nuclear Data Tables. The project evolved from an ongoing comparison between available mass predictions and reports of newly measured masses of isotopes throughout the mass surface. These comparisons have highlighted a variety of features in current mass models which are responsible for predictions that diverge from masses determined experimentally. The need for a comprehensive update of the atomic mass predictions was therefore apparent and the project was organized and began at the last mass conference (AMCO-VII). Project participants included: Pape and Anthony; Dussel, Caurier and Zuker; Möller and Nix; Möller, Myers, Swiatecki and Treiner; Comay, Kelson, and Zidon; Satpathy and Nayak; Tachibana, Uno, Yamada and Yamada; Spanier and Johansson; Jänecke and Masson; and Wapstra, Audi and Hoekstra. An overview of the new atomic mass predictions may be obtained by written request.

  20. Role of body mass index in the prediction of all cause mortality in over 62,000 men and women. The Italian RIFLE Pooling Project. Risk Factor and Life Expectancy

    PubMed Central

    Seccareccia, F.; Lanti, M.; Menotti, A.; Scanga, M.

    1998-01-01

STUDY OBJECTIVE: To evaluate the relation of body mass index (BMI) to short-term mortality in a large Italian population sample. DESIGN: Within the Italian RIFLE pooling project, BMI was measured in 47 population samples made up of 32,741 men and 30,305 women ages 20-69 years (young 20-44, mature 45-69). Data on mortality were collected for the next six years. MAIN OUTCOME MEASURES: Age adjusted death rates in quintile classes of BMI and Cox proportional hazards models with six-year all-cause mortality as end point, BMI as covariate and age, smoking, systolic blood pressure as possible confounders were computed. Multivariate analysis was tested in all subjects and after the exclusion of smokers, early (first two years) deaths, and both categories. RESULTS: The univariate analysis failed to demonstrate in all cases a U or inverse J shaped relation. The Cox coefficients for the linear and quadratic terms of BMI proved significant for both young and mature women. The minimum of the curve was located at 27.0 (24.0, 30.0, 95% confidence limits, CL) and 31.8 (25.5, 38.2, 95% CL) units of BMI, for young and mature women respectively. Similar findings were obtained even when exclusions were performed. No relation was found for young men, while for mature adult men only the model for all subjects retained a significant curvilinear relation (minimum 29.3; 22.4, 36.2, 95% CL). CONCLUSION: These uncommonly high values of BMI carrying the minimum risk of death seem to be in contrast with weight guidelines. A confirmation of these findings in other population groups might induce the consideration of changes in the suggested healthy values of BMI.   PMID:9604037
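The minimum-risk BMI values reported above follow from the quadratic Cox model: with linear coefficient b1 and quadratic coefficient b2 on BMI, the log-hazard b1*x + b2*x^2 is minimized at x = -b1/(2*b2). A minimal sketch; the coefficients below are illustrative, not the study's:

```python
def bmi_at_minimum_risk(b1: float, b2: float) -> float:
    """Vertex of the quadratic log-hazard b1*x + b2*x**2.

    The Cox partial hazard exp(b1*x + b2*x**2) is minimized where
    the derivative b1 + 2*b2*x vanishes, i.e. at x = -b1 / (2*b2).
    A positive b2 is required for the stationary point to be a minimum.
    """
    if b2 <= 0:
        raise ValueError("quadratic term must be positive for a minimum")
    return -b1 / (2.0 * b2)

# Hypothetical coefficients chosen so the minimum falls at BMI 27.0,
# matching the value reported for young women above.
print(bmi_at_minimum_risk(-0.108, 0.002))  # 27.0
```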

  1. Is Climate Change Predictable? Really?

    SciTech Connect

    Dannevik, W P; Rotman, D A

    2005-11-14

This project is the first application of a completely different approach to climate modeling, in which new prognostic equations are used to directly compute the evolution of two-point correlations. This project addresses three questions that are critical for the credibility of the science base for climate prediction: (1) What is the variability spectrum at equilibrium? (2) What is the rate of relaxation when subjected to external perturbations? (3) Can variations due to natural processes be distinguished from those due to transient external forces? The technical approach starts with the evolution equation for the probability distribution function and arrives at a prognostic equation for ensemble-mean two-point correlations, bypassing the detailed weather calculation. This work will expand our basic understanding of the theoretical limits of climate prediction and stimulate new experiments to perform with conventional climate models. It will furnish statistical estimates that are inaccessible with conventional climate simulations and likely will raise important new questions about the very nature of climate change and about how (and whether) climate change can be predicted. Solid progress on such issues is vital to the credibility of the science base for climate change research and will support policymakers in evaluating tradeoffs among energy technology options and their attendant environmental and economic consequences.

  2. Water resources assessment and prediction in China

    NASA Astrophysics Data System (ADS)

    Wang, Guangsheng; Dai, Ning; Yang, Jianqing; Wang, Jinxing

    2016-10-01

Water resources assessment in China can be classified into three groups: (i) comprehensive water resources assessment, (ii) annual water resources assessment, and (iii) industrial project water resources assessment. Comprehensive water resources assessment is the conventional assessment, in which the frequency distribution of water resources in basins or provincial regions is analyzed. For the annual water resources assessment, water resources of the last year in basins or provincial regions are usually assessed. For the industrial project water resources assessment, the water resources situation before the construction of an industrial project has to be assessed. To address climate and environmental changes, hydrological and statistical models are widely applied in studies assessing water resources changes. Water resources prediction in China usually takes the form of monthly runoff prediction. In most low flow seasons, the flow recession curve is commonly used as the prediction method. In the humid regions, the rainfall-runoff ensemble prediction (ESP) has been widely applied for the monthly runoff prediction. The conditional probability method for the monthly runoff prediction was also applied to assess next-month runoff probability under a fixed initial condition.
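The flow recession approach mentioned above is commonly modeled as exponential decay of discharge from the last observed value. A minimal sketch, assuming a simple linear-reservoir recession; the recession constant is a catchment-specific parameter fitted to past dry-season hydrographs, not a value from the paper:

```python
import math

def recession_forecast(q0: float, k: float, days: int) -> list[float]:
    """Exponential recession curve Q(t) = Q0 * exp(-k * t).

    q0:   last observed discharge (m^3/s)
    k:    recession constant per day (assumed, catchment-specific)
    days: number of daily forecast steps ahead
    """
    return [q0 * math.exp(-k * t) for t in range(1, days + 1)]

# Illustrative: 100 m^3/s decaying with k = 0.05/day over 30 days.
forecast = recession_forecast(100.0, 0.05, 30)
print(round(forecast[0], 2), round(forecast[-1], 2))
```

In practice the forecast is re-anchored to each new observation, so only the recession constant carries over between low-flow seasons.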

  3. Cognitive Education Project. Summary Project.

    ERIC Educational Resources Information Center

    Mulcahy, Robert; And Others

    The Cognitive Education Project conducted a 3-year longitudinal evaluation of two cognitive education programs that were aimed at teaching thinking skills. The critical difference between the two experimental programs was that one, Feuerstein's Instrumental Enrichment (IE) method, was taught out of curricular content, while the other, the…

  4. Ceramic Technology Project

    SciTech Connect

    Not Available

    1992-03-01

    The Ceramic Technology Project was developed by the USDOE Office of Transportation Systems (OTS) in Conservation and Renewable Energy. This project, part of the OTS's Materials Development Program, was developed to meet the ceramic technology requirements of the OTS's automotive technology programs. Significant accomplishments in fabricating ceramic components for the USDOE and NASA advanced heat engine programs have provided evidence that the operation of ceramic parts in high-temperature engine environments is feasible. These programs have also demonstrated that additional research is needed in materials and processing development, design methodology, and data base and life prediction before industry will have a sufficient technology base from which to produce reliable cost-effective ceramic engine components commercially. A five-year project plan was developed with extensive input from private industry. In July 1990 the original plan was updated through the estimated completion of development in 1993. The objective is to develop the industrial technology base required for reliable ceramics for application in advanced automotive heat engines. The project approach includes determining the mechanisms controlling reliability, improving processes for fabricating existing ceramics, developing new materials with increased reliability, and testing these materials in simulated engine environments to confirm reliability. Although this is a generic materials project, the focus is on the structural ceramics for advanced gas turbine and diesel engines, ceramic bearings and attachments, and ceramic coatings for thermal barrier and wear applications in these engines. To facilitate the rapid transfer of this technology to US industry, the major portion of the work is being done in the ceramic industry, with technological support from government laboratories, other industrial laboratories, and universities.

  5. SIMBIOS Project

    NASA Technical Reports Server (NTRS)

    Fargion, Giulietta S.; McClain, Charles R.; Busalacchi, Antonio J. (Technical Monitor)

    2001-01-01

The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities, NASA Research Announcement (NRA) research status, satellite data processing, data product validation, and field calibration. This documentation is necessary to ensure that critical information is relayed to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational project.

  6. Project Prometheus

    NASA Technical Reports Server (NTRS)

    Johnson, Steve

    2003-01-01

Project Prometheus will enable a new paradigm in the scientific exploration of the Solar System. The proposed JIMO mission will start a new generation of missions characterized by more maneuverability, flexibility, power and lifetime. The Project Prometheus organization is established at NASA Headquarters: 1. Organization established to carry out development of JIMO, nuclear power (radioisotope), and nuclear propulsion research. 2. Completed broad technology and national capacity assessments to inform decision making on planning and technology development. 3. Awarded five NRAs for nuclear propulsion research. 4. Radioisotope power systems in development, and Plutonium-238 being purchased from Russia. 5. Formulated a science-driven near-term and long-term plan for the safe utilization of nuclear propulsion based missions. 6. Completed preliminary studies (Pre-Phase A) of JIMO and other missions. 7. Initiated JIMO Phase A studies by contractors and NASA.

  7. Hydropower Projects

    SciTech Connect

    2015-04-02

    The Water Power Program helps industry harness this renewable, emissions-free resource to generate environmentally sustainable and cost-effective electricity. Through support for public, private, and nonprofit efforts, the Water Power Program promotes the development, demonstration, and deployment of advanced hydropower devices and pumped storage hydropower applications. These technologies help capture energy stored by diversionary structures, increase the efficiency of hydroelectric generation, and use excess grid energy to replenish storage reserves for use during periods of peak electricity demand. In addition, the Water Power Program works to assess the potential extractable energy from domestic water resources to assist industry and government in planning for our nation’s energy future. From FY 2008 to FY 2014, DOE’s Water Power Program announced awards totaling approximately $62.5 million to 33 projects focused on hydropower. Table 1 provides a brief description of these projects.

  8. Project MEDSAT

    NASA Technical Reports Server (NTRS)

    1991-01-01

During the winter term of 1991, two design courses at the University of Michigan worked on a joint project, MEDSAT. The two design teams were Atmospheric, Oceanic and Space Science 605 (AOSS 605) System Design and Aerospace Engineering 483 (Aero 483) Aerospace System Design. In collaboration, they worked to produce MEDSAT, a satellite and scientific payload whose purpose was to monitor environmental conditions over Chiapas, Mexico. Information gained from the sensing, combined with regional data, would be used to determine the potential for malaria occurrence in that area. The responsibilities of AOSS 605 consisted of determining the remote sensing techniques, the data processing, and the method to translate the information into a usable output. Aero 483 developed the satellite configuration and the subsystems required for the satellite to accomplish its task. The MEDSAT project is an outgrowth of work already being accomplished by NASA's Biospheric and Disease Monitoring Program and Ames Research Center. NASA's work has been to develop remote sensing techniques to determine the abundance of disease carriers, and now this project will place the techniques aboard a satellite. MEDSAT will be unique in its use of both a Synthetic Aperture Radar and a visual/IR sensor to obtain comprehensive monitoring of the site. In order to create a highly feasible system, low cost was a high priority. To obtain this goal, a light satellite configuration launched by the Pegasus launch vehicle was used.

  9. Project MEDSAT

    NASA Astrophysics Data System (ADS)

During the winter term of 1991, two design courses at the University of Michigan worked on a joint project, MEDSAT. The two design teams were Atmospheric, Oceanic and Space Science 605 (AOSS 605) System Design and Aerospace Engineering 483 (Aero 483) Aerospace System Design. In collaboration, they worked to produce MEDSAT, a satellite and scientific payload whose purpose was to monitor environmental conditions over Chiapas, Mexico. Information gained from the sensing, combined with regional data, would be used to determine the potential for malaria occurrence in that area. The responsibilities of AOSS 605 consisted of determining the remote sensing techniques, the data processing, and the method to translate the information into a usable output. Aero 483 developed the satellite configuration and the subsystems required for the satellite to accomplish its task. The MEDSAT project is an outgrowth of work already being accomplished by NASA's Biospheric and Disease Monitoring Program and Ames Research Center. NASA's work has been to develop remote sensing techniques to determine the abundance of disease carriers, and now this project will place the techniques aboard a satellite. MEDSAT will be unique in its use of both a Synthetic Aperture Radar and a visual/IR sensor to obtain comprehensive monitoring of the site. In order to create a highly feasible system, low cost was a high priority. To obtain this goal, a light satellite configuration launched by the Pegasus launch vehicle was used.

  10. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections.

  11. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  12. Prediction of Gas Lubricated Foil Journal Bearing Performance

    NASA Technical Reports Server (NTRS)

    Carpino, Marc; Talmage, Gita

    2003-01-01

    This report summarizes the progress in the first eight months of the project. The objectives of this research project are to theoretically predict the steady operating conditions and the rotor dynamic coefficients of gas foil journal bearings. The project is currently on or ahead of schedule with the development of a finite element code that predicts steady bearing performance characteristics such as film thickness, pressure, load, and drag. Graphical results for a typical bearing are presented in the report. Project plans for the next year are discussed.

  13. Predicting evolutionary dynamics

    NASA Astrophysics Data System (ADS)

    Balazsi, Gabor

    We developed an ordinary differential equation-based model to predict the evolutionary dynamics of yeast cells carrying a synthetic gene circuit. The predicted aspects included the speed at which the ancestral genotype disappears from the population; as well as the types of mutant alleles that establish in each environmental condition. We validated these predictions by experimental evolution. The agreement between our predictions and experimental findings suggests that cellular and population fitness landscapes can be useful to predict short-term evolution.
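The disappearance of the ancestral genotype under selection can be sketched with a minimal two-genotype ODE (the standard replicator/logistic selection equation), integrated here with forward Euler. This is a generic illustration under assumed parameters, not the study's actual gene-circuit model:

```python
def mutant_frequency(s: float, p0: float, t_end: float, dt: float = 0.01) -> float:
    """Forward-Euler integration of dp/dt = s * p * (1 - p),
    the replicator equation for a mutant allele with selective
    advantage s invading an ancestral population of frequency 1 - p.
    """
    p = p0
    for _ in range(int(t_end / dt)):
        p += dt * s * p * (1.0 - p)
    return p

# Illustrative: a mutant with a 10% fitness advantage starting at
# 1% frequency. The ancestral genotype (frequency 1 - p) declines
# toward extinction as p approaches 1.
print(round(mutant_frequency(0.10, 0.01, 200.0), 3))
```

The time at which 1 - p drops below a detection threshold gives a simple prediction for when the ancestor disappears from the population, the first quantity the abstract says was validated by experimental evolution.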

  14. Study Predicts Dramatic Shifts in Enrollments.

    ERIC Educational Resources Information Center

    Evangelauf, Jean

    1991-01-01

    A new study detailing demographic shifts in the college-age population predicts growth in minority high school graduates and shrinkage or maintenance of White graduation rates. The report is the first to provide state-by-state figures on actual and projected graduates from 1986 through 1995 by racial and ethnic group. (MSE)

  15. Project Exodus

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Project Exodus is an in-depth study to identify and address the basic problems of a manned mission to Mars. The most important problems concern propulsion, life support, structure, trajectory, and finance. Exodus will employ a passenger ship, cargo ship, and landing craft for the journey to Mars. These three major components of the mission design are discussed separately. Within each component the design characteristics of structures, trajectory, and propulsion are addressed. The design characteristics of life support are mentioned only in those sections requiring it.

  16. SIMBIOS Project

    NASA Technical Reports Server (NTRS)

    Fargion, Giulietta S.; McClain, Charles R.

    2002-01-01

The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities, NASA Research Announcement (NRA) research status, satellite data processing, data product validation, and field calibration. This documentation is necessary to ensure that critical information is relayed to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational project. The SIMBIOS Science Team Principal Investigators' (PIs) original contributions to this report are in chapters four and above. The purpose of these contributions is to describe the current research status of the SIMBIOS-NRA-96 funded research. The contributions are published as submitted, with the exception of minor edits to correct obvious grammatical or clerical errors.

  17. Project Exodus

    NASA Technical Reports Server (NTRS)

    Bryant, Rodney (Compiler); Dillon, Jennifer (Compiler); Grewe, George (Compiler); Mcmorrow, Jim (Compiler); Melton, Craig (Compiler); Rainey, Gerald (Compiler); Rinko, John (Compiler); Singh, David (Compiler); Yen, Tzu-Liang (Compiler)

    1990-01-01

    A design for a manned Mars mission, PROJECT EXODUS is presented. PROJECT EXODUS incorporates the design of a hypersonic waverider, cargo ship and NIMF (nuclear rocket using indigenous Martian fuel) shuttle lander to safely carry out a three to five month mission on the surface of Mars. The cargo ship transports return fuel, return engine, surface life support, NIMF shuttle, and the Mars base to low Mars orbit (LMO). The cargo ship is powered by a nuclear electric propulsion (NEP) system which allows the cargo ship to execute a spiral trajectory to Mars. The waverider transports ten astronauts to Mars and back. It is launched from the Space Station with propulsion provided by a chemical engine and a delta velocity of 9 km/sec. The waverider performs an aero-gravity assist maneuver through the atmosphere of Venus to obtain a deflection angle and increase in delta velocity. Once the waverider and cargo ship have docked the astronauts will detach the landing cargo capsules and nuclear electric power plant and remotely pilot them to the surface. They will then descend to the surface aboard the NIMF shuttle. A dome base will be quickly constructed on the surface and the astronauts will conduct an exploratory mission for three to five months. They will return to Earth and dock with the Space Station using the waverider.

  18. Project Explorer

    NASA Technical Reports Server (NTRS)

    Dannenberg, K. K.; Henderson, A.; Lee, J.; Smith, G.; Stluka, E.

    1984-01-01

    PROJECT EXPLORER is a program that will fly student-developed experiments onboard the Space Shuttle in NASA's Get-Away Special (GAS) containers. The program is co-sponsored by the Alabama Space and Rocket Center, the Alabama-Mississippi Section of the American Institute of Aeronautics and Astronautics, Alabama A&M University and requires extensive support by the University of Alabama in Huntsville. A unique feature of this project will demonstrate transmissions to ground stations on amateur radio frequencies in English language. Experiments Nos. 1, 2, and 3 use the microgravity of space flight to study the solidification of lead-antimony and aluminum-copper alloys, the growth of potassium-tetracyanoplatinate hydrate crystals in an aqueous solution, and the germination of radish seeds. Flight results will be compared with Earth-based data. Experiment No. 4 features radio transmission and will also provide timing for the start of all other experiments. A microprocessor will obtain real-time data from all experiments as well as temperature and pressure measurements taken inside the canister. These data will be transmitted on previously announced amateur radio frequencies after they have been converted into the English language by a digitalker for general reception.

19. Portunus Project

    SciTech Connect

    Loyal, Rebecca E.

    2015-07-14

The objective of the Portunus Project is to create large, automated offshore ports that will increase the pace and scale of international trade. Additionally, these ports would increase the number of U.S. domestic trade vessels needed, as the imported goods would need to be transported from these offshore platforms to land-based ports such as Boston, Los Angeles, and Newark. Currently, domestic trade in the United States can only be conducted by vessels that abide by the Merchant Marine Act of 1920 – also referred to as the Jones Act. The Jones Act stipulates that vessels involved in domestic trade must be U.S. owned, U.S. built, and manned by a crew made up of U.S. citizens. The Portunus Project would increase the number of Jones Act vessels needed, which raises an interesting economic concern. Are Jones Act ships more expensive to operate than foreign vessels? Would it be more economically efficient to modify the Jones Act and allow vessels manned by foreign crews to engage in U.S. domestic trade? While opposition to altering the Jones Act is strong, it is important to consider the possibility that ship-owners who employ foreign crews will lobby for the chance to enter a growing domestic trade market. Their success would mean potential job loss for thousands of Americans currently employed in maritime trade.

  20. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  1. SISCAL project

    NASA Astrophysics Data System (ADS)

    Santer, Richard P.; Fell, Frank

    2003-05-01

The first "ocean colour" sensor, Coastal Zone Color Scanner (CZCS), was launched in 1978. Oceanographers learnt a lot from CZCS but it remained a purely scientific sensor. In recent years, a new generation of satellite-borne earth observation (EO) instruments has been brought into space. These instruments combine high spectral and spatial resolution with revisiting rates of the order of one per day. More instruments with further increased spatial, spectral and temporal resolution will be available within the next years. In the meantime, evaluation procedures taking advantage of the capabilities of the new instruments were derived, allowing the retrieval of ecologically important parameters with higher accuracy than before. Space agencies are now able to collect and to process satellite data in real time and to disseminate them via the Internet. It is therefore now possible to envisage using EO operationally. In principle, a significant demand for EO data products on terrestrial or marine ecosystems exists both with public authorities (environmental protection, emergency management, natural resources management, national parks, regional planning, etc) and private companies (tourist industry, insurance companies, water suppliers, etc). However, for a number of reasons, many data products that can be derived from the new instruments and methods have not yet left the scientific community towards public or private end users. It is the intention of the proposed SISCAL (Satellite-based Information System on Coastal Areas and Lakes) project to contribute to the closure of the existing gap between space agencies and research institutions on one side and end users on the other side. To do so, we intend to create a data processor that automatically derives and subsequently delivers over the Internet, in Near-Real-Time (NRT), a number of data products tailored to individual end user needs. The data products will be generated using a Geographical Information System (GIS

  2. SISCAL project

    NASA Astrophysics Data System (ADS)

    Santer, Richard P.; Fell, Frank

    2003-05-01

The first "ocean colour" sensor, Coastal Zone Color Scanner (CZCS), was launched in 1978. Oceanographers learnt a lot from CZCS but it remained a purely scientific sensor. In recent years, a new generation of satellite-borne earth observation (EO) instruments has been brought into space. These instruments combine high spectral and spatial resolution with revisiting rates of the order of one per day. More instruments with further increased spatial, spectral and temporal resolution will be available within the next years. In the meantime, evaluation procedures taking advantage of the capabilities of the new instruments were derived, allowing the retrieval of ecologically important parameters with higher accuracy than before. Space agencies are now able to collect and to process satellite data in real time and to disseminate them via the Internet. It is therefore now possible to envisage using EO operationally. In principle, a significant demand for EO data products on terrestrial or marine ecosystems exists both with public authorities (environmental protection, emergency management, natural resources management, national parks, regional planning, etc) and private companies (tourist industry, insurance companies, water suppliers, etc). However, for a number of reasons, many data products that can be derived from the new instruments and methods have not yet left the scientific community towards public or private end users. It is the intention of the proposed SISCAL (Satellite-based Information System on Coastal Areas and Lakes) project to contribute to the closure of the existing gap between space agencies and research institutions on one side and end users on the other side. To do so, we intend to create a data processor that automatically derives and subsequently delivers over the Internet, in Near-Real-Time (NRT), a number of data products tailored to individual end user needs. The data products will be generated using a Geographical Information System (GIS

  3. Predictability and prediction skill of the boreal summer intraseasonal oscillation in the Intraseasonal Variability Hindcast Experiment

    NASA Astrophysics Data System (ADS)

    Lee, Sun-Seon; Wang, Bin; Waliser, Duane E.; Neena, Joseph Mani; Lee, June-Yi

    2015-10-01

Boreal summer intraseasonal oscillation (BSISO) is one of the dominant modes of intraseasonal variability of the tropical climate system, with fundamental impacts on regional summer monsoons, tropical storms, and extra-tropical climate variations. Due to its distinctive characteristics, a specific metric for characterizing observed BSISO evolution and assessing numerical models' simulations has previously been proposed (Lee et al. in Clim Dyn 40:493-509, 2013). However, current dynamical models' prediction skill and predictability have not been investigated in a multi-model framework. Using six coupled models in the Intraseasonal Variability Hindcast Experiment project, the predictability estimates and prediction skill of BSISO are examined. The BSISO predictability is estimated as the forecast lead day at which the mean forecast error becomes as large as the mean signal under the perfect model assumption. Applying the signal-to-error ratio method and using an ensemble-mean approach, we found that the multi-model mean BSISO predictability estimate and prediction skill with strong initial amplitude (about 10 % higher than the mean initial amplitude) are about 45 and 22 days, respectively, which are comparable with the corresponding counterparts for the Madden-Julian Oscillation during boreal winter (Neena et al. in J Clim 27:4531-4543, 2014a). The significantly lower BSISO prediction skill compared with its predictability indicates considerable room for improvement in dynamical BSISO prediction. The estimated predictability limit is independent of the initial amplitude, but the models' prediction skill for strong initial amplitudes is about 6 days higher than the corresponding skill with weak initial conditions (about 15 % less than the mean initial amplitude), suggesting the importance of using accurate initial conditions. The BSISO predictability and prediction skill are phase- and season-dependent, but the degree of dependency varies with the models. It is important to
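The signal-to-error ratio estimate described in this record can be illustrated with a small sketch. The synthetic hindcast data, the error-growth curve, and every parameter value below are illustrative assumptions, not the IVHE protocol: the predictability limit is simply the first lead day at which the mean squared forecast error reaches the mean squared signal.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_leads = 200, 60

# Synthetic stand-in for hindcast data: a unit-variance "signal" (the BSISO
# index) and forecasts whose error grows and saturates with lead time.
signal = rng.standard_normal((n_cases, n_leads))
error_growth = 1.5 * (1.0 - np.exp(-np.arange(n_leads) / 20.0))
forecast = signal + error_growth * rng.standard_normal((n_cases, n_leads))

# Mean squared signal and mean squared forecast error per lead day.
signal_var = np.mean(signal**2, axis=0)
error_var = np.mean((forecast - signal)**2, axis=0)

# Predictability limit: first lead day where the error reaches the signal.
crossed = error_var >= signal_var
limit = int(np.argmax(crossed)) if crossed.any() else n_leads
print("estimated predictability limit (days):", limit)
```

With real hindcasts the same comparison would be made per BSISO phase and season, which is where the phase and season dependence noted above enters.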

  4. Regulatory focus in predictions about others.

    PubMed

    Woltin, Karl-Andrew; Yzerbyt, Vincent

    2015-03-01

    Based on social projection research, four studies investigated whether people rely on their own regulatory focus when making predictions about others. Chronic (Study 1) and induced (Study 2) regulatory focus shaped estimations of others' strategic promotion or prevention inclinations and choices between enriched (fitting promotion) and impoverished options (fitting prevention). Providing indirect process evidence via boundary conditions, participants only relied on their induced regulatory focus in predictions of others' inclinations to seek romantic alternatives to the extent that this did not run counter to stereotypic gender beliefs (Study 3). In addition, participants only relied on their induced regulatory focus in preference predictions concerning promotion and prevention products when they lacked idiosyncratic target knowledge (Study 4). These effects were not mediated by mood, judgment-certainty, perceived task-enjoyment, or task-difficulty. Implications of these findings for social projection research as well as possible interpersonal consequences are delineated.

  5. Prospective evaluation of a Bayesian model to predict organizational change.

    PubMed

    Molfenter, Todd; Gustafson, Dave; Kilo, Chuck; Bhattacharya, Abhik; Olsson, Jesper

    2005-01-01

    This research examines a subjective Bayesian model's ability to predict organizational change outcomes and sustainability of those outcomes for project teams participating in a multi-organizational improvement collaborative. PMID:16093893

  6. Predicting the Orbits of Satellites with a TI-85 Calculator.

    ERIC Educational Resources Information Center

    Papay, Kate; And Others

    1996-01-01

    Describes a project that predicts the orbits of satellites using a TI-85 calculator. Enables students to achieve a richer understanding of longitude, latitude, time zones, orbital mechanics of satellites, and the terms associated with satellite tracking. (JRH)
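A calculator exercise of this kind might start from Kepler's third law for a circular orbit. The sketch below (in Python rather than TI-85 code, and not taken from the project itself) computes the orbital period from altitude.

```python
import math

# Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3 / GM),
# where a is the orbital radius (Earth radius + altitude).
GM_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6           # mean Earth radius, m

def orbital_period_minutes(altitude_km: float) -> float:
    a = R_EARTH + altitude_km * 1e3
    return 2 * math.pi * math.sqrt(a**3 / GM_EARTH) / 60.0

# A satellite at ~400 km altitude completes an orbit in roughly 92-93 minutes.
print(round(orbital_period_minutes(400.0), 1))
```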

  7. Battery Life Predictive Model

    2009-12-31

    The Software consists of a model used to predict battery capacity fade and resistance growth for arbitrary cycling and temperature profiles. It allows the user to extrapolate from experimental data to predict actual life cycle.
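The record does not describe the model itself, but empirical battery-life models often combine square-root-of-time capacity fade with an Arrhenius temperature dependence. The sketch below is purely illustrative; the functional form and every parameter value are assumptions, not the software's actual model.

```python
import math

def capacity_fraction(days: float, temp_c: float,
                      a: float = 0.002, ea_over_r: float = 6000.0) -> float:
    """Remaining capacity fraction under an assumed sqrt-time fade law
    with Arrhenius temperature acceleration (illustrative only)."""
    t_ref, t_k = 298.15, temp_c + 273.15   # reference and actual temperature, K
    arrhenius = math.exp(ea_over_r * (1.0 / t_ref - 1.0 / t_k))
    return max(0.0, 1.0 - a * arrhenius * math.sqrt(days))

# Fade accelerates at higher storage temperature.
print(capacity_fraction(365.0, 25.0), capacity_fraction(365.0, 45.0))
```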

  8. Apollo Project

    NASA Technical Reports Server (NTRS)

    1966-01-01

    From Spaceflight Revolution: 'Top NASA officials listen to a LOPO briefing at Langley in December 1966. Sitting to the far right with his hand on his chin is Floyd Thompson. To the left sits Dr. George Mueller, NASA associate administrator for Manned Space Flight. On the wall is a diagram of the sites selected for the 'concentrated mission.' 'The most fundamental issue in the pre-mission planning for Lunar Orbiter was how the moon was to be photographed. Would the photography be 'concentrated' on a predetermined single target, or would it be 'distributed' over several selected targets across the moon's surface? On the answer to this basic question depended the successful integration of the entire mission plan for Lunar Orbiter.' The Lunar Orbiter Project made systematic photographic maps of the lunar landing sites. Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 337.

  9. Project Grandmaster

    SciTech Connect

    None, None

    2013-09-16

The purpose of the Project Grandmaster Application is to allow individuals to opt in and give the application access to data sources about their activities on social media sites. The application will cross-reference these data sources to build up a picture of each individual's activities, whether discussed at present or in the past, and place this picture in reference to the group of all participants. The goal is to allow individuals to place themselves in the collective and to understand how their behavior patterns fit with the group, and potentially to find changes to make, such as activities they weren't already aware of or different groups of interest they might want to follow.

  10. Project Grandmaster

    2013-09-16

The purpose of the Project Grandmaster Application is to allow individuals to opt in and give the application access to data sources about their activities on social media sites. The application will cross-reference these data sources to build up a picture of each individual's activities, whether discussed at present or in the past, and place this picture in reference to the group of all participants. The goal is to allow individuals to place themselves in the collective and to understand how their behavior patterns fit with the group, and potentially to find changes to make, such as activities they weren't already aware of or different groups of interest they might want to follow.

  11. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

Construction of Model 1 used in the LOLA simulator. This was a twenty-foot sphere which simulated for the astronauts what the surface of the moon would look like from 200 miles up. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White wrote: 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  12. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White further described LOLA in his paper 'Discussion of Three Typical Langley Research Center Simulation Programs,' 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  13. Apollo Project

    NASA Technical Reports Server (NTRS)

    1965-01-01

Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White described the simulator as follows: 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  14. Apollo Project

    NASA Technical Reports Server (NTRS)

    1963-01-01

Track, Model 2 and Model 1, the 20-foot sphere. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) From Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966. 'The model system is designed so that a television camera is mounted on a camera boom on each transport cart and each cart system is shared by two models. The cart's travel along the tracks represents longitudinal motion along the plane of a nominal orbit, vertical travel of the camera boom represents latitude on out-of-plane travel, and horizontal travel of the camera boom represents altitude changes.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379.

  15. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White further described LOLA in his paper 'Discussion of Three Typical Langley Research Center Simulation Programs,' 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  16. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White further described LOLA in his paper 'Discussion of Three Typical Langley Research Center Simulation Programs,' 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  17. Decadal Climate Prediction: Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Hurrell, J. W.

    2010-12-01

The scientific understanding of climate change is now sufficiently clear to show that climate change from global warming is already upon us, and the rate of change as projected exceeds anything seen in nature in the past 10,000 years. Uncertainties remain, however, especially regarding how climate will change at regional and local scales where the signal of natural variability is large. Decision makers in diverse arenas, from water managers in the U.S. Southwest to public health experts in Asia, need to know if the climate events they are seeing are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible anthropogenic climate change. The climate science community will not be able to answer these questions and reduce the uncertainties in near-term climate projections without moving toward high resolution climate system predictions, with a blurring of the distinction between shorter-term predictions and longer-term climate projections. The key is the realization that climate system predictions of natural and forced change, regardless of timescale, will require initialization of coupled general circulation models with the best estimates of the current observed state of the atmosphere, oceans, cryosphere, and land surface, a state influenced both by the current phases of modes of natural variability and by the accumulated impacts to date of anthropogenic radiative forcing. Formidable challenges exist: for instance, what is the best method of initialization given imperfect observations and systematic errors in models, what effect does initialization have on climate predictions, what predictions should be attempted and how would they be verified? Accurate initial conditions for the global oceans are especially important and could conceivably be provided by ARGO floats and existing ocean data assimilation exercises. However, performing hindcasts prior to the ARGO float era of near-global upper

  18. Decadal Climate Prediction: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Hurrell, J. W.

    2011-12-01

The scientific understanding of climate change is sufficiently clear to show that climate change from global warming is already upon us, and the rate of change as projected exceeds anything seen in nature in the past 10,000 years. Uncertainties remain, however, especially regarding how climate will change at regional and local scales where the signal of natural variability is large. Decision makers in diverse arenas, from water managers in the U.S. Southwest to public health experts in Asia, need to know if the climate events they are seeing are the product of natural variability, and hence can be expected to reverse at some point, or are the result of potentially irreversible anthropogenic climate change. The climate science community will not be able to answer these questions and reduce the uncertainties in near-term climate projections without moving toward high resolution climate system predictions, with a blurring of the distinction between shorter-term predictions and longer-term climate projections. The key is the realization that climate system predictions of natural and forced change, regardless of timescale, will require initialization of coupled general circulation models with the best estimates of the current observed state of the atmosphere, oceans, cryosphere, and land surface, a state influenced both by the current phases of modes of natural variability and by the accumulated impacts to date of anthropogenic radiative forcing. Formidable challenges exist: for instance, what is the best method of initialization given imperfect observations and systematic errors in models, what effect does initialization have on climate predictions, what predictions should be attempted and how would they be verified? Accurate initial conditions for the global oceans are especially important and could conceivably be provided by ARGO floats and existing ocean data assimilation exercises. However, performing hindcasts prior to the ARGO float era of near-global upper ocean

  19. Prediction in Multiple Regression.

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2000-01-01

    Presents the concept of prediction via multiple regression (MR) and discusses the assumptions underlying multiple regression analyses. Also discusses shrinkage, cross-validation, and double cross-validation of prediction equations and describes how to calculate confidence intervals around individual predictions. (SLD)
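The ideas in this record (prediction, shrinkage, and intervals around individual predictions) can be sketched numerically. The simulated data and the fixed t-value of 1.98 (appropriate for roughly 96 error degrees of freedom) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3
X = rng.standard_normal((n, p))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.standard_normal(n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
df = n - p - 1
s2 = resid @ resid / df                      # residual variance estimate

# Shrinkage: adjusted R^2 discounts R^2 for the number of predictors.
r2 = 1.0 - (resid @ resid) / np.sum((y - y.mean())**2)
r2_adj = 1.0 - (1.0 - r2) * (n - 1) / df

# ~95% prediction interval for a single new observation x0.
x0 = np.array([1.0, 0.2, -0.3, 0.1])         # first entry is the intercept term
pred = x0 @ beta
se = np.sqrt(s2 * (1.0 + x0 @ np.linalg.solve(A.T @ A, x0)))
lo, hi = pred - 1.98 * se, pred + 1.98 * se
print(round(r2_adj, 3), round(lo, 2), round(hi, 2))
```

Cross-validation (fitting on one sample and scoring on another) estimates the same shrinkage empirically rather than by formula.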

  20. Nonlinear Combustion Instability Prediction

    NASA Technical Reports Server (NTRS)

    Flandro, Gary

    2010-01-01

    The liquid rocket engine stability prediction software (LCI) predicts combustion stability of systems using LOX-LH2 propellants. Both longitudinal and transverse mode stability characteristics are calculated. This software has the unique feature of being able to predict system limit amplitude.

  1. Prediction in Multilevel Models

    ERIC Educational Resources Information Center

    Afshartous, David; de Leeuw, Jan

    2005-01-01

    Multilevel modeling is an increasingly popular technique for analyzing hierarchical data. This article addresses the problem of predicting a future observable y[subscript *j] in the jth group of a hierarchical data set. Three prediction rules are considered and several analytical results on the relative performance of these prediction rules are…

  2. Drought Predictability and Prediction in a Changing Climate: Assessing Current Predictive Knowledge and Capabilities, User Requirements and Research Priorities

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried

    2011-01-01

Drought is fundamentally the result of an extended period of reduced precipitation, lasting anywhere from a few weeks to decades or even longer. As such, addressing drought predictability and prediction in a changing climate requires foremost that we make progress on the ability to predict precipitation anomalies on subseasonal and longer time scales. From the perspective of the users of drought forecasts and information, however, drought is most directly viewed through its impacts (e.g., on soil moisture, streamflow, crop yields). As such, the question of the predictability of drought must extend to those quantities as well. In order to make progress on these issues, the WCRP drought information group (DIG), with the support of WCRP, the Catalan Institute of Climate Sciences, the La Caixa Foundation, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, and the National Science Foundation, has organized a workshop to focus on:
1. User requirements for drought prediction information on sub-seasonal to centennial time scales;
2. Current understanding of the mechanisms and predictability of drought on sub-seasonal to centennial time scales;
3. Current drought prediction/projection capabilities on sub-seasonal to centennial time scales;
4. Advancing regional drought prediction capabilities for variables and scales most relevant to user needs on sub-seasonal to centennial time scales.
This introductory talk provides an overview of these goals and outlines the occurrence and mechanisms of drought world-wide.

  3. Current affairs in earthquake prediction in Japan

    NASA Astrophysics Data System (ADS)

    Uyeda, Seiya

    2015-12-01

As of mid-2014, the main organizations of the earthquake (EQ hereafter) prediction program, including the Seismological Society of Japan (SSJ) and the MEXT Headquarters for EQ Research Promotion, hold the official position that they neither can nor want to make any short-term prediction. This is an extraordinary stance for responsible authorities when the nation, after the devastating 2011 M9 Tohoku EQ, most urgently needs whatever information may exist on forthcoming EQs. Japan's national project for EQ prediction started in 1965, but it has had no success. The main reason is its failure to capture precursors. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this stance has been further fortified by the 2011 M9 Tohoku mega-quake. This paper tries to explain how this situation came about and suggests that it may in fact be a legitimate one which should have come a long time ago. Actually, substantial positive changes are taking place now. Some promising signs are arising even from cooperation of researchers with private sectors, and there is a move to establish an "EQ Prediction Society of Japan". From now on, maintaining high scientific standards in EQ prediction will be of crucial importance.

  4. Ace Project as a Project Management Tool

    ERIC Educational Resources Information Center

    Cline, Melinda; Guynes, Carl S.; Simard, Karine

    2010-01-01

    The primary challenge of project management is to achieve the project goals and objectives while adhering to project constraints--usually scope, quality, time and budget. The secondary challenge is to optimize the allocation and integration of resources necessary to meet pre-defined objectives. Project management software provides an active…

  5. Radar-aeolian roughness project

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald; Dobrovolskis, A.; Gaddis, L.; Iversen, J. D.; Lancaster, N.; Leach, Rodman N.; Rasnussen, K.; Saunders, S.; Vanzyl, J.; Wall, S.

    1991-01-01

    The objective is to establish an empirical relationship between measurements of radar, aeolian, and surface roughness on a variety of natural surfaces and to understand the underlying physical causes. This relationship will form the basis for developing a predictive equation to derive aeolian roughness from radar backscatter. Results are given from investigations carried out in 1989 on the principal elements of the project, with separate sections on field studies, radar data analysis, laboratory simulations, and development of theory for planetary applications.

  6. Signal Prediction With Input Identification

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Chen, Ya-Chin

    1999-01-01

    A novel coding technique is presented for signal prediction with applications including speech coding, system identification, and estimation of input excitation. The approach is based on the blind equalization method for speech signal processing in conjunction with the geometric subspace projection theory to formulate the basic prediction equation. The speech-coding problem is often divided into two parts, a linear prediction model and excitation input. The parameter coefficients of the linear predictor and the input excitation are solved simultaneously and recursively by a conventional recursive least-squares algorithm. The excitation input is computed by coding all possible outcomes into a binary codebook. The coefficients of the linear predictor and excitation, and the index of the codebook can then be used to represent the signal. In addition, a variable-frame concept is proposed to block the same excitation signal in sequence in order to reduce the storage size and increase the transmission rate. The results of this work can be easily extended to the problem of disturbance identification. The basic principles are outlined in this report and differences from other existing methods are discussed. Simulations are included to demonstrate the proposed method.
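The recursive least-squares solution for the linear-predictor coefficients described above can be sketched as follows. The predictor order, forgetting factor, and test signal are illustrative choices, and the codebook/excitation coding stage is omitted.

```python
import numpy as np

def rls_predictor(signal, order=4, lam=0.99, delta=100.0):
    """One-step-ahead linear prediction with recursive least squares."""
    w = np.zeros(order)              # predictor coefficients
    P = np.eye(order) * delta        # inverse-correlation estimate
    preds = np.zeros_like(signal)
    for t in range(order, len(signal)):
        x = signal[t - order:t][::-1]        # most recent samples first
        preds[t] = w @ x                     # predict the next sample
        e = signal[t] - preds[t]             # prediction error
        k = P @ x / (lam + x @ P @ x)        # gain vector
        w = w + k * e                        # coefficient update
        P = (P - np.outer(k, x @ P)) / lam   # covariance update
    return preds, w

# A pure sinusoid is perfectly linearly predictable, so the late-sample
# prediction error shrinks toward zero once the coefficients converge.
t = np.arange(500)
s = np.sin(0.1 * t)
preds, w = rls_predictor(s)
print("late-sample error:", float(np.abs(s[400:] - preds[400:]).max()))
```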

  7. Predicting Predictable about Natural Catastrophic Extremes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2015-04-01

    By definition, an extreme event is rare one in a series of kindred phenomena. Usually (e.g. in Geophysics), it implies investigating a small sample of case-histories with a help of delicate statistical methods and data of different quality, collected in various conditions. Many extreme events are clustered (far from independent) and follow fractal or some other "strange" distribution (far from uniform). Evidently, such an "unusual" situation complicates search and definition of reliable precursory behaviors to be used for forecast/prediction purposes. Making forecast/prediction claims reliable and quantitatively probabilistic in the frames of the most popular objectivists' viewpoint on probability requires a long series of "yes/no" forecast/prediction outcomes, which cannot be obtained without an extended rigorous test of the candidate method. The set of errors ("success/failure" scores and space-time measure of alarms) and other information obtained in such a control test supplies us with data necessary to judge the candidate's potential as a forecast/prediction tool and, eventually, to find its improvements. This is to be done first in comparison against random guessing, which results confidence (measured in terms of statistical significance). Note that an application of the forecast/prediction tools could be very different in cases of different natural hazards, costs and benefits that determine risks, and, therefore, requires determination of different optimal strategies minimizing reliable estimates of realistic levels of accepted losses. In their turn case specific costs and benefits may suggest a modification of the forecast/prediction tools for a more adequate "optimal" application. 
Fortunately, the situation is not hopeless, thanks to the state-of-the-art understanding of the complexity and non-linear dynamics of the Earth as a physical system and to pattern recognition approaches applied to the available geophysical evidence, specifically, when intending to predict
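
    The comparison against random guessing mentioned above can be made concrete: if alarms cover a fraction p of the considered space-time volume, random guessing hits each target event independently with probability p, so the significance of the "success/failure" score follows a binomial tail. A minimal sketch with hypothetical numbers, not taken from the abstract:

```python
from math import comb

def random_guess_p_value(n_events, n_hits, alarm_fraction):
    """Probability that random guessing, with alarms covering the same
    space-time fraction, scores at least n_hits out of n_events."""
    p = alarm_fraction
    return sum(comb(n_events, k) * p ** k * (1 - p) ** (n_events - k)
               for k in range(n_hits, n_events + 1))

# Hypothetical control-test outcome: 8 of 10 target events predicted,
# with alarms occupying 20% of the space-time volume considered.
pv = random_guess_p_value(10, 8, 0.20)
print(f"p-value versus random guessing: {pv:.1e}")
```

    A small p-value is the statistical-significance evidence that the candidate method outperforms random guessing; the space-time measure of alarms enters directly through the guessing probability p.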

  8. CUBES Project Support

    NASA Technical Reports Server (NTRS)

    Jenkins, Kenneth T., Jr.

    2012-01-01

    CUBES stands for Creating Understanding and Broadening Education through Satellites. The goal of the project is to allow high school students to build a small satellite, or CubeSat. Merritt Island High School (MIHS) was selected to partner with NASA and California Polytechnic State University (Cal-Poly) to build a CubeSat. The objective of the mission is to collect flight data to better characterize maximum predicted environments inside the CubeSat launcher, the Poly-Picosatellite Orbital Deployer (P-POD), while attached to the launch vehicle. The MIHS CubeSat team will apply to the NASA CubeSat Launch Initiative, which provides opportunities for small satellite development teams to secure launch slots on upcoming expendable launch vehicle missions. The MIHS team is working to achieve a test launch, or proof-of-concept flight, aboard a suborbital launch vehicle in early 2013.

  9. Predicting Precipitation in Darwin: An Experiment with Markov Chains

    ERIC Educational Resources Information Center

    Boncek, John; Harden, Sig

    2009-01-01

    As teachers of first-year college mathematics and science students, the authors are constantly on the lookout for simple classroom exercises that improve their students' analytical and computational skills. In this article, the authors outline a project entitled "Predicting Precipitation in Darwin." In this project, students: (1) analyze and…
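
    The kind of Markov-chain exercise the project describes can be sketched with a two-state (wet/dry) chain. The transition probabilities below are hypothetical placeholders, not Darwin's observed frequencies:

```python
import random

# Hypothetical wet/dry transition probabilities (illustrative only):
P = {"wet": {"wet": 0.7, "dry": 0.3},
     "dry": {"wet": 0.3, "dry": 0.7}}

def simulate(days, start="dry", seed=0):
    """Simulate a daily wet/dry sequence from the Markov chain."""
    rng = random.Random(seed)
    state, path = start, []
    for _ in range(days):
        state = "wet" if rng.random() < P[state]["wet"] else "dry"
        path.append(state)
    return path

# The stationary wet-day fraction solves pi = pi P; for this chain it is
# 0.3 / (0.3 + 0.3) = 0.5, which a long simulation should approach.
path = simulate(100_000)
wet_frac = path.count("wet") / len(path)
print(f"simulated wet-day fraction: {wet_frac:.3f}")
```

    Students can estimate the transition probabilities from a rainfall record and then compare the chain's stationary distribution with the observed fraction of wet days, exactly the analyze-then-predict loop the exercise aims at.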

  10. EDSP Prioritization: Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) (SOT)

    EPA Science Inventory

    Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been te...

  11. Projection and prediction: Climate sensitivity on the rise

    NASA Astrophysics Data System (ADS)

    Armour, Kyle C.

    2016-10-01

    Recent observations of Earth's energy budget indicate low climate sensitivity. Research now shows that these estimates should be revised upward, resolving an apparent mismatch with climate models and implying a warmer future.

  12. Uncertainty in QSAR predictions.

    PubMed

    Sahlin, Ullrika

    2013-03-01

    It is relevant to consider uncertainty in individual predictions when quantitative structure-activity (or property) relationships (QSARs) are used to support decisions of high societal concern. Successful communication of uncertainty in the integration of QSARs in chemical safety assessment under the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system can be facilitated by a common understanding of how to define, characterise, assess and evaluate uncertainty in QSAR predictions. A QSAR prediction is, compared to experimental estimates, subject to added uncertainty that comes from the use of a model instead of empirically-based estimates. A framework is provided to aid the distinction between different types of uncertainty in a QSAR prediction: quantitative, i.e. for regressions related to the error in a prediction and characterised by a predictive distribution; and qualitative, by expressing our confidence in the model for predicting a particular compound based on a quantitative measure of predictive reliability. It is possible to assess a quantitative (i.e. probabilistic) predictive distribution, given the supervised learning algorithm, the underlying QSAR data, a probability model for uncertainty and a statistical principle for inference. The integration of QSARs into risk assessment may be facilitated by the inclusion of the assessment of predictive error and predictive reliability into the "unambiguous algorithm", as outlined in the second OECD principle.
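
    The quantitative (probabilistic) predictive distribution discussed above can be illustrated with the classical predictive standard error of a one-descriptor least-squares model. The data, and the widening of the distribution away from the training domain (lower predictive reliability), are illustrative, not from any QSAR dataset:

```python
from math import sqrt

def fit_with_predictive_error(xs, ys):
    """Least-squares line plus the classical predictive standard error
    for a new query point (normal, constant-variance error assumption)."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    a = ybar - b * xbar
    s2 = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)

    def predict(x0):
        # Predictive std widens as x0 leaves the training domain.
        se = sqrt(s2 * (1 + 1 / n + (x0 - xbar) ** 2 / sxx))
        return a + b * x0, se
    return predict

# Hypothetical descriptor/activity pairs (illustrative only):
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.1, 1.1, 1.9, 3.2, 3.9, 5.1]
predict = fit_with_predictive_error(xs, ys)
m_in, s_in = predict(2.5)     # query inside the training domain
m_out, s_out = predict(12.0)  # far outside: lower predictive reliability
print(f"x=2.5: {m_in:.2f} +/- {s_in:.2f}; x=12: {m_out:.2f} +/- {s_out:.2f}")
```

    The growing standard error for the extrapolated query is a simple quantitative analogue of the applicability-domain idea: confidence in a QSAR prediction should drop for compounds unlike those used to train the model.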

  13. Collaborative Physical Chemistry Projects Involving Computational Chemistry

    NASA Astrophysics Data System (ADS)

    Whisnant, David M.; Howe, Jerry J.; Lever, Lisa S.

    2000-02-01

    The physical chemistry classes from three colleges have collaborated on two computational chemistry projects using Quantum CAChe 3.0 and Gaussian 94W running on Pentium II PCs. Online communication by email and the World Wide Web was an important part of the collaboration. In the first project, students used molecular modeling to predict benzene derivatives that might be possible hair dyes. They used PM3 and ZINDO calculations to predict the electronic spectra of the molecules and tested the predicted spectra by comparing some with experimental measurements. They also did literature searches for real hair dyes and possible health effects. In the final phase of the project they proposed a synthetic pathway for one compound. In the second project the students were asked to predict which isomer of a small carbon cluster (C3, C4, or C5) was responsible for a series of IR lines observed in the spectrum of a carbon star. After preliminary PM3 calculations, they used ab initio calculations at the HF/6-31G(d) and MP2/6-31G(d) level to model the molecules and predict their vibrational frequencies and rotational constants. A comparison of the predictions with the experimental spectra suggested that the linear isomer of the C5 molecule was responsible for the lines.

  14. Project Longshot

    NASA Technical Reports Server (NTRS)

    West, J. Curtis; Chamberlain, Sally A.; Stevens, Robert; Pagan, Neftali

    1989-01-01

    Project Longshot is an unmanned probe to our nearest star system, Alpha Centauri, 4.3 light years away. The Centauri system is a trinary system consisting of two central stars (A and B) orbiting a barycenter, and a third (Proxima Centauri) orbiting the two. The system is at a declination of -67 degrees. The goal is to reach the Centauri system in 50 years. This time span was chosen because any shorter time would be impossible because of the relativistic velocities involved, and any greater time would be impossible because of the difficulty of creating a spacecraft with such a long lifetime. Therefore, the following mission profile is proposed: (1) spacecraft is assembled in Earth orbit; (2) spacecraft escapes Earth and Sun in the ecliptic with a single impulse maneuver; (3) spacecraft changes declination to point toward the Centauri system; (4) spacecraft accelerates to 0.1c; (5) spacecraft coasts at 0.1c for 41 years; (6) spacecraft decelerates upon reaching the Centauri system; and (7) spacecraft orbits the Centauri system, conducts investigations, and relays data to Earth. The total time to reach the Centauri system, taking into consideration acceleration and deceleration, will be approximately 50 years.
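
    A back-of-the-envelope check of the numbers above (a non-relativistic sketch under a constant-acceleration assumption not stated in the abstract): coasting at 0.1c for 41 years covers 4.1 of the 4.3 light years, and crossing the remaining 0.2 ly at an average of 0.05c adds about 4 years, leaving a few years of margin inside the quoted 50-year total:

```python
# Mission-profile arithmetic from the abstract (non-relativistic sketch;
# constant acceleration to cruise speed and a mirror-image deceleration).
cruise_speed = 0.1      # fraction of the speed of light
distance_ly = 4.3       # light years to the Centauri system
coast_years = 41.0      # coast phase quoted in the abstract

coast_distance = cruise_speed * coast_years           # ly covered while coasting
accel_decel_distance = distance_ly - coast_distance   # ly left for accel + decel
# Average speed over a constant-acceleration ramp from 0 to 0.1c is 0.05c:
accel_decel_years = accel_decel_distance / (cruise_speed / 2)
total_years = coast_years + accel_decel_years
print(f"coast covers {coast_distance:.1f} ly; "
      f"accel+decel add {accel_decel_years:.1f} yr; total {total_years:.1f} yr")
```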

  15. Project LASER

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA formally launched Project LASER (Learning About Science, Engineering and Research) in March 1990, a program designed to help teachers improve science and mathematics education and to provide 'hands on' experiences. It featured the first LASER Mobile Teacher Resource Center (MTRC), is designed to reach educators all over the nation. NASA hopes to operate several MTRCs with funds provided by private industry. The mobile unit is a 22-ton tractor-trailer stocked with NASA educational publications and outfitted with six work stations. Each work station, which can accommodate two teachers at a time, has a computer providing access to NASA Spacelink. Each also has video recorders and photocopy/photographic equipment for the teacher's use. MTRC is only one of the five major elements within LASER. The others are: a Space Technology Course, to promote integration of space science studies with traditional courses; the Volunteer Databank, in which NASA employees are encouraged to volunteer as tutors, instructors, etc; Mobile Discovery Laboratories that will carry simple laboratory equipment and computers to provide hands-on activities for students and demonstrations of classroom activities for teachers; and the Public Library Science Program which will present library based science and math programs.

  16. Characteristics of predictable MJOs

    NASA Astrophysics Data System (ADS)

    Kim, H. M.; Kim, D.; Vitart, F.; Toma, V. E.; Kug, J. S.; Webster, P. J.

    2015-12-01

    The Madden-Julian Oscillation (MJO) has been considered a major potential source of global climate predictability on subseasonal time scales. Current operational forecasting systems are able to predict the MJO up to 3-4 weeks ahead with the ensemble mean, while this skill is still below theoretical estimates of the predictability (6-7 weeks). It is well accepted that MJO prediction skill in operational systems is distinctly better when the MJO is well organized, with strong amplitude at the beginning of the forecast, than when the forecast starts with weak or nonexistent MJOs. However, while initially strong MJOs are predicted with better skill than weak/non-MJOs, the initial amplitude-skill relationship is not linear: not every initially strong MJO has high skill, and not every weak/non-MJO has low skill. What physical mechanism drives some MJOs to be more predictable than others? Using dynamical MJO reforecasts, this study will show that the skill of highly predictable MJOs depends on the background environment that influences MJO evolution. The favorable conditions for highly predictable MJOs (starting with moderate amplitude) support stronger ocean-atmosphere coupling and supply sufficient heat and lower-tropospheric moisture for better MJO propagation, particularly for MJOs crossing the Maritime Continent. Understanding the characteristics of predictable MJOs may provide insight into the overall predictability of the MJO and thus advance 3-4 week prediction towards its theoretical limit.

  17. Mexican Coronagraph "Mextli" Project

    NASA Astrophysics Data System (ADS)

    Muñoz Martínez, Guadalupe; Jacinto, Juan Soto; Vargas Cardenas, Bernardo; Aguirre Marquez, Hector; Schwenn, Rainer

    Space weather forecasts require a variety of data and information in order to produce reliable predictions of important events affecting the Earth and its surrounding environment. Among the solar phenomena that most affect interplanetary conditions are coronal mass ejections. These events transport significant amounts of material and magnetic field into the interplanetary medium, capable of interacting with the magnetosphere in different ways. The only source of clear evidence of the early development of coronal mass ejections is, for now, white-light images provided by ground-based and space coronagraphs. From these images the main kinematical parameters, such as speed and acceleration as projected on the plane of the sky, are obtained. Basic information such as the speed of the ejecta along the line of sight and the nature of the material carried requires spectrographic observations of the phenomena. LASCO C1 on board the SOHO space mission provided valuable information in this field, but propagation speeds greater than 10 km/s could not be detected from its images, and it has not been in operation since 1998. The Argentinean ground-based coronagraph MICA has a design similar to C1 but uses a narrow-band filter mechanism instead of the Fabry-Perot interferometer of C1. The purpose of the Mextli project is to build a coronagraph with spectroscopic capabilities aimed at observing the inner solar corona between 2.5 and 15 solar radii in the emission of the Fe XIV line at 530 nm. Its main objective would be the early detection of dynamical events and their kinematical characterization. To achieve this objective, the coronagraph will be provided with a high-speed CCD camera and an electronic Fabry-Perot interferometer. The instrument will be constructed in Mexico in the framework of a collaboration between UNAM, INAOE and IPN, under the technical supervision of the MPS in Germany and the MICA team from Argentina.

  18. Geothermal Reservoir Technology Research Program: Abstracts of selected research projects

    SciTech Connect

    Reed, M.J.

    1993-03-01

    Research projects are described in the following areas: geothermal exploration, mapping reservoir properties and reservoir monitoring, and well testing, simulation, and predicting reservoir performance. The objectives, technical approach, and project status of each project are presented. The background, research results, and future plans for each project are discussed. The names, addresses, and telephone and telefax numbers are given for the DOE program manager and the principal investigators. (MHR)

  19. RESOLVE Project

    NASA Technical Reports Server (NTRS)

    Parker, Ray; Coan, Mary; Cryderman, Kate; Captain, Janine

    2013-01-01

    The RESOLVE project is a lunar prospecting mission whose primary goal is to characterize water and other volatiles in lunar regolith. The Lunar Advanced Volatiles Analysis (LAVA) subsystem is comprised of a fluid subsystem that transports flow to the gas chromatograph - mass spectrometer (GC-MS) instruments that characterize volatiles and the Water Droplet Demonstration (WDD) that will capture and display water condensation in the gas stream. The LAVA Engineering Test Unit (ETU) is undergoing risk reduction testing this summer and fall within a vacuum chamber to understand and characterize component and integrated system performance. Testing of line heaters, printed circuit heaters, pressure transducers, temperature sensors, regulators, and valves in atmospheric and vacuum environments was done. Test procedures were developed to guide experimental tests and test reports to analyze and draw conclusions from the data. In addition, knowledge and experience was gained with preparing a vacuum chamber with fluid and electrical connections. Further testing will include integrated testing of the fluid subsystem with the gas supply system, near-infrared spectrometer, WDD, Sample Delivery System, and GC-MS in the vacuum chamber. This testing will provide hands-on exposure to a flight forward spaceflight subsystem, the processes associated with testing equipment in a vacuum chamber, and experience working in a laboratory setting. Examples of specific analysis conducted include: pneumatic analysis to calculate the WDD's efficiency at extracting water vapor from the gas stream to form condensation; thermal analysis of the conduction and radiation along a line connecting two thermal masses; and proportional-integral-derivative (PID) heater control analysis. Since LAVA is a scientific subsystem, the near-infrared spectrometer and GC-MS instruments will be tested during the ETU testing phase.

  20. Predictive systems ecology.

    PubMed

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly, usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised, and we summarize a range of terrestrial and marine examples. Significant challenges remain, but we suggest that ecology would benefit as a scientific discipline, and would increase its impact in society, if it were to embrace the need to become more predictive.

  1. Predicting Unsteady Aeroelastic Behavior

    NASA Technical Reports Server (NTRS)

    Strganac, Thomas W.; Mook, Dean T.

    1990-01-01

    New method for predicting subsonic flutter, static deflections, and aeroelastic divergence developed. Unsteady aerodynamic loads determined by unsteady-vortex-lattice method. Accounts for aspect ratio and angle of attack. Equations for motion of wing and flow field solved iteratively and simultaneously. Used to predict transient responses to initial disturbances, and to predict steady-state static and oscillatory responses. Potential application for research in such unsteady structural/flow interactions as those in windmills, turbines, and compressors.

  2. Predictability of Conversation Partners

    NASA Astrophysics Data System (ADS)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one's conversation partners is defined as the degree to which one's next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, some predictability also exists in the data beyond the contribution of this long-tailed nature. In addition, an individual's predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community (in the sense of an abundance of surrounding triangles) tend to have low predictability, and those bridging different communities tend to have high predictability.
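
    The mutual-information measure of next-partner predictability can be sketched as follows. The two partner sequences below are synthetic stand-ins for the sensor data, chosen to show the two extremes:

```python
import random
from collections import Counter
from math import log2

def mutual_information(seq):
    """I(current partner; next partner) in bits for a symbol sequence."""
    pairs = list(zip(seq, seq[1:]))
    n = len(pairs)
    p_xy = Counter(pairs)
    p_x = Counter(x for x, _ in pairs)
    p_y = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((p_x[x] / n) * (p_y[y] / n)))
               for (x, y), c in p_xy.items())

# Synthetic partner sequences (stand-ins, not the study's sensor data):
alternating = ["A", "B"] * 500                        # next partner is certain
rng = random.Random(0)
memoryless = [rng.choice("AB") for _ in range(1000)]  # no sequential structure

mi_alt = mutual_information(alternating)   # ~1 bit of uncertainty removed
mi_rand = mutual_information(memoryless)   # ~0 bits: nothing to exploit
print(f"alternating: {mi_alt:.3f} bits, memoryless: {mi_rand:.3f} bits")
```

    Real conversation sequences fall between these extremes; the study's 28.4% figure corresponds to the fraction of next-partner entropy removed by conditioning on the current partner.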

  3. The Materials Genome Project

    NASA Astrophysics Data System (ADS)

    Aourag, H.

    2008-09-01

    In the past, the search for new and improved materials was characterized mostly by the use of empirical, trial-and-error methods. This picture of materials science has been changing as the knowledge and understanding of fundamental processes governing a material's properties and performance (namely, composition, structure, history, and environment) have increased. In a number of cases, it is now possible to predict a material's properties before it has even been manufactured, thus greatly reducing the time spent on testing and development. The objective of modern materials science is to tailor a material (starting with its chemical composition, constituent phases, and microstructure) in order to obtain a desired set of properties suitable for a given application. In the short term, the traditional "empirical" methods for developing new materials will be complemented to a greater degree by theoretical predictions. In some areas, computer simulation is already used by industry to weed out costly or improbable synthesis routes. Can novel materials with optimized properties be designed by computers? Advances in modelling methods at the atomic level, coupled with rapid increases in computer capabilities over the last decade, have led scientists to answer this question with a resounding "yes". The ability to design new materials from quantum mechanical principles with computers is currently one of the fastest growing and most exciting areas of theoretical research in the world. The methods allow scientists to evaluate and prescreen new materials "in silico", rather than through time-consuming experimentation. The Materials Genome Project pursues the theory of large-scale modeling as well as powerful methods to construct new materials with optimized properties. 
Indeed, it is the intimate synergy between our ability to predict accurately from quantum theory how atoms can be assembled to form new materials and our capacity to synthesize novel materials atom

  4. Solar Cycle Predictions

    NASA Technical Reports Server (NTRS)

    Pesnell, William Dean

    2012-01-01

    Solar cycle predictions are needed to plan long-term space missions, just as weather predictions are needed to plan the launch. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are among the most important. Launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory as the reduced propellant load is consumed more rapidly. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Solar cycle predictions also anticipate the shortwave emissions that cause degradation of solar panels. Testing solar dynamo theories by quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. A summary and analysis of 75 predictions of the amplitude of the upcoming Solar Cycle 24 is presented. The current state of solar cycle predictions and some anticipation of how those predictions could be made more accurate in the future are discussed.

  5. Predicting cancer outcome

    SciTech Connect

    Gardner, S N; Fernandes, M

    2005-03-24

    We read with interest the paper by Michiels et al on the prediction of cancer with microarrays and the commentary by Ioannidis listing the potential as well as the limitations of this approach (February 5, p 488 and 454). Cancer is a disease characterized by complex, heterogeneous mechanisms and studies to define factors that can direct new drug discovery and use should be encouraged. However, this is easier said than done. Casti teaches that a better understanding does not necessarily extrapolate to better prediction, and that useful prediction is possible without complete understanding (1). To attempt both, explanation and prediction, in a single nonmathematical construct, is a tall order (Figure 1).

  6. Projection in surrogate decisions about life-sustaining medical treatments.

    PubMed

    Fagerlin, A; Ditto, P H; Danks, J H; Houts, R M; Smucker, W D

    2001-05-01

    To honor the wishes of an incapacitated patient, surrogate decision makers must predict the treatment decisions patients would make for themselves if able. Social psychological research, however, suggests that surrogates' own treatment preferences may influence their predictions of others' preferences. In 2 studies (1 involving 60 college student surrogates and a parent, the other involving 361 elderly outpatients and their chosen surrogate decision maker), surrogates predicted whether a close other would want life-sustaining treatment in hypothetical end-of-life scenarios and stated their own treatment preferences in the same scenarios. Surrogate predictions more closely resembled surrogates' own treatment wishes than they did the wishes of the individual they were trying to predict. Although the majority of prediction errors reflected inaccurate use of surrogates' own treatment preferences, projection was also found to result in accurate prediction more often than counterprojective predictions. The rationality and accuracy of projection in surrogate decision making is discussed. PMID:11403214

  7. Crop status evaluations and yield predictions

    NASA Technical Reports Server (NTRS)

    Haun, J. R.

    1976-01-01

    One phase of the large area crop inventory project is presented. Wheat yield models based on the input of environmental variables potentially obtainable through the use of space remote sensing were developed and demonstrated. By the use of a unique method for visually quantifying daily plant development and subsequent multifactor computer analyses, it was possible to develop practical models for predicting crop development and yield. Development of the wheat yield prediction models was based on the finding that morphological changes in plants can be detected and quantified on a daily basis, and that this change during a portion of the season is proportional to yield.

  8. RESOLVE Project

    NASA Technical Reports Server (NTRS)

    Parker, Ray O.

    2012-01-01

    The RESOLVE project is a lunar prospecting mission whose primary goal is to characterize water and other volatiles in lunar regolith. The Lunar Advanced Volatiles Analysis (LAVA) subsystem is comprised of a fluid subsystem that transports flow to the gas chromatograph-mass spectrometer (GC-MS) instruments that characterize volatiles and the Water Droplet Demonstration (WDD) that will capture and display water condensation in the gas stream. The LAVA Engineering Test Unit (ETU) is undergoing risk reduction testing this summer and fall within a vacuum chamber to understand and characterize component and integrated system performance. Ray will be assisting with component testing of line heaters, printed circuit heaters, pressure transducers, temperature sensors, regulators, and valves in atmospheric and vacuum environments. He will be developing procedures to guide these tests and test reports to analyze and draw conclusions from the data. In addition, he will gain experience with preparing a vacuum chamber with fluid and electrical connections. Further testing will include integrated testing of the fluid subsystem with the gas supply system, near-infrared spectrometer, WDD, Sample Delivery System, and GC-MS in the vacuum chamber. This testing will provide hands-on exposure to a flight forward spaceflight subsystem, the processes associated with testing equipment in a vacuum chamber, and experience working in a laboratory setting. Examples of specific analysis Ray will conduct include: pneumatic analysis to calculate the WDD's efficiency at extracting water vapor from the gas stream to form condensation; thermal analysis of the conduction and radiation along a line connecting two thermal masses; and proportional-integral-derivative (PID) heater control analysis. In this Research and Technology environment, Ray will be asked to problem solve real-time as issues arise. 
Since LAVA is a scientific subsystem, Ray will be utilizing his chemical engineering background to

  9. Project summary

    NASA Technical Reports Server (NTRS)

    1991-01-01

    California Polytechnic State University's design project for the 1990-91 school year was the design of a close air support aircraft. There were eight design groups that participated and were given requests for proposals. These proposals contained mission specifications, particular performance and payload requirements, as well as the main design drivers. The mission specifications called for a single pilot weighing 225 lb with equipment. The design mission profile consisted of the following: (1) warm-up, taxi, take off, and accelerate to cruise speed; (2) dash at sea level at 500 knots to a point 250 nmi from take off; (3) combat phase, requiring two combat passes at 450 knots that each consist of a 360 deg turn and an energy increase of 4000 ft. - at each pass, half of air-to-surface ordnance is released; (4) dash at sea level at 500 knots 250 nmi back to base; and (5) land with 20 min of reserve fuel. The request for proposal also specified the following performance requirements with 50 percent internal fuel and standard stores: (1) the aircraft must be able to accelerate from Mach 0.3 to 0.5 at sea level in less than 20 sec; (2) required turn rates are 4.5 sustained g at 450 knots at sea level; (3) the aircraft must have a reattack time of 25 sec or less (reattack time was defined as the time between the first and second weapon drops); (4) the aircraft is allowed a maximum take off and landing ground roll of 2000 ft. The payload requirements were 20 Mk 82 general-purpose free-fall bombs and racks; 1 GAU-8A 30-mm cannon with 1350 rounds; and 2 AIM-9L Sidewinder missiles and racks. The main design drivers expressed in the request for proposal were that the aircraft should be survivable and maintainable. It must be able to operate in remote areas with little or no maintenance. Simplicity was considered the most important factor in achieving the former goal. In addition, the aircraft must be low cost both in acquisition and operation. The summaries of the aircraft

  10. Climate predictability in the second year.

    PubMed

    Hermanson, Leon; Sutton, Rowan T

    2009-03-13

    In this paper, the predictability of climate arising from ocean heat content (OHC) anomalies is investigated in the HadCM3 coupled atmosphere-ocean model. An ensemble of simulations of the twentieth century are used to provide initial conditions for a case study. The case study consists of two ensembles started from initial conditions with large differences in regional OHC in the North Atlantic, the Southern Ocean and parts of the West Pacific. Surface temperatures and precipitation are on average not predictable beyond seasonal time scales, but for certain initial conditions there may be longer predictability. It is shown that, for the case study examined here, some aspects of tropical precipitation, European surface temperatures and North Atlantic sea-level pressure are potentially predictable 2 years ahead. Predictability also exists in the other case studies, but the climate variables and regions, which are potentially predictable, differ. This work was done as part of the Grid for Coupled Ensemble Prediction (GCEP) eScience project. PMID:19087941

  11. The Hairy Head Project.

    ERIC Educational Resources Information Center

    Gallick, Barbara

    A class of 3- to 6-year-old children in a Midwestern child care center chose to study hair and hairstyling salons as a group project. This article discusses how the project evolved, describes the three phases of the project, and provides the teacher's reflections on the project. Photos taken during the project are included. (Author)

  12. Integrated Project Management System description. [UMTRAP Project

    SciTech Connect

    Not Available

    1987-03-01

    The Uranium Mill Tailings Remedial Action (UMTRA) Project is a Department of Energy (DOE) designated Major System Acquisition (MSA). To execute and manage the Project mission successfully and to comply with the MSA requirements, the UMTRA Project Office (the "Project Office") has implemented and operates an Integrated Project Management System (IPMS). The Project Office is assisted by the Technical Assistance Contractor's (TAC) Project Integration and Control (PIC) Group in system operation. Each participant, in turn, provides critical input to system operation and reporting requirements. The IPMS provides a uniform structured approach for integrating the work of Project participants. It serves as a tool for planning and control, workload management, performance measurement, and specialized reporting within a standardized format. This system description presents the guidance for its operation. Appendices 1 and 2 contain definitions of commonly used terms and abbreviations and acronyms, respectively. 17 figs., 5 tabs.

  13. Improved nonlinear prediction method

    NASA Astrophysics Data System (ADS)

    Adenan, Nur Hamiza; Md Noorani, Mohd Salmi

    2014-06-01

    The analysis and prediction of time series data have been widely addressed by researchers. Many techniques have been developed for application in areas such as weather forecasting, financial markets and hydrological phenomena involving data that are contaminated by noise. Accordingly, various techniques to improve the analysis and prediction of time series data have been introduced. Given the importance of analysis and of prediction accuracy, a study was undertaken to test the effectiveness of an improved nonlinear prediction method for data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Phase space reconstruction is then performed on the one-dimensional composite data to reconstruct a number of space dimensions. Finally, the local linear approximation method is employed to make a prediction based on the phase space. This improved method was tested with logistic map data series containing 0%, 5%, 10%, 20% and 30% noise. The results show that, using the improved method, the predictions were in close agreement with the observed values. The correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, an improvement for analyzing and predicting noisy time series data, without involving any noise reduction method, was introduced.
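
    The three steps of the improved method (difference, embed, locally predict) can be sketched as follows. This is an illustrative reading of the abstract, not the authors' implementation; the function names, the two-dimensional default embedding, and the zeroth-order (nearest-neighbour) local approximation are assumptions.

```python
# Hedged sketch of the improved nonlinear prediction method: (1) build a
# composite series of successive differences, (2) reconstruct a low-dimensional
# phase space by time-delay embedding, and (3) forecast the next difference
# from the successor of the nearest neighbouring state.

def successive_differences(series):
    """Composite series d[t] = x[t+1] - x[t]."""
    return [b - a for a, b in zip(series, series[1:])]

def delay_embed(series, dim=2, tau=1):
    """Delay vectors (x[t], x[t-tau], ..., x[t-(dim-1)*tau])."""
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

def predict_next(series, dim=2, tau=1):
    """Nearest-neighbour local prediction of the next value of the series."""
    diffs = successive_differences(series)
    points = delay_embed(diffs, dim, tau)
    offset = (dim - 1) * tau            # diffs index of points[0]
    query = points[-1]
    # nearest earlier state in the reconstructed phase space (query excluded)
    best = min(range(len(points) - 1),
               key=lambda i: sum((a - b) ** 2
                                 for a, b in zip(points[i], query)))
    next_diff = diffs[offset + best + 1]  # successor of the nearest neighbour
    return series[-1] + next_diff
```

    On a noise-free logistic-map series the one-step forecast tracks the true iterate closely; on noisy data, the differencing step is what the abstract credits for the improvement.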

  14. Stable predictive control horizons

    NASA Astrophysics Data System (ADS)

    Estrada, Raúl; Favela, Antonio; Raimondi, Angelo; Nevado, Antonio; Requena, Ricardo; Beltrán-Carbajal, Francisco

    2012-04-01

    The stability theory of predictive and adaptive predictive control for linear, stable processes is based on the hypothesis of a physically realisable driving desired trajectory (DDT). The formal theoretical verification of this hypothesis is trivial for processes with a stable inverse, but not for processes with an unstable inverse. The extended strategy of predictive control was developed to overcome this stability problem methodologically, and it has delivered excellent performance and stability in industrial applications given a suitable choice of the prediction horizon. From a theoretical point of view, the existence of a prediction horizon capable of ensuring stability for processes with an unstable inverse was proven in the literature. However, no analytical solution has been found for determining the prediction horizon values which guarantee stability, in spite of the theoretical and practical interest of this matter. This article presents a new method able to determine the set of prediction horizon values which ensure stability under the extended predictive control strategy formulation, together with a particular performance criterion for the design of the DDT generically used in many industrial applications. The practical application of this method is illustrated by means of simulation examples.

  15. Managing Projects for Change: Contextualised Project Management

    ERIC Educational Resources Information Center

    Tynan, Belinda; Adlington, Rachael; Stewart, Cherry; Vale, Deborah; Sims, Rod; Shanahan, Peter

    2010-01-01

    This paper will detail three projects which focussed on enhancing online learning at a large Australian distance education University within a School of Business, School of Health and School of Education. Each project had special funding and took quite distinctive project management approaches, which reflect the desire to embed innovation and…

  16. The Prediction of Spatial Aftershock Probabilities (PRESAP)

    NASA Astrophysics Data System (ADS)

    McCloskey, J.

    2003-12-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the great number of local, telemetered seismic networks, the rapid acquisition of data from satellites coupled with the speed of modern telecommunications and data transfer all mean that it may be possible that these new techniques could be applied in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real-time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days after the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real-time, as data of better quality become available over the following day to tens of days. Specifically, the project aim is to assess the

  17. Evaluating prediction uncertainty

    SciTech Connect

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
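
    A minimal version of the replicated Latin hypercube idea can be sketched as below: the stratification of the input of interest is held fixed across replicates, so averaging within each stratum estimates the conditional expectation, and a variance ratio serves as the importance indicator. The estimator, sample sizes, and function names are illustrative assumptions, not the report's actual algorithm.

```python
import random

def replicated_lhs_ratio(model, d, which, n=50, reps=30, seed=0):
    """Estimate Var(E[Y | X_which]) / Var(Y) with replicated Latin hypercube
    sampling over [0,1)^d: column `which` keeps identity stratum order in
    every replicate, while the other columns are re-stratified at random."""
    rng = random.Random(seed)
    per_stratum = [[] for _ in range(n)]     # Y values grouped by stratum
    all_y = []
    for _ in range(reps):
        cols = []
        for k in range(d):
            perm = list(range(n))
            if k != which:
                rng.shuffle(perm)            # fresh strata for other inputs
            cols.append([(p + rng.random()) / n for p in perm])
        for row in range(n):
            y = model([cols[k][row] for k in range(d)])
            per_stratum[row].append(y)       # row == stratum of X_which
            all_y.append(y)
    mean = sum(all_y) / len(all_y)
    total_var = sum((y - mean) ** 2 for y in all_y) / len(all_y)
    stratum_means = [sum(v) / len(v) for v in per_stratum]
    between_var = sum((m - mean) ** 2 for m in stratum_means) / n
    return between_var / total_var
```

    For an additive model such as Y = X0 + 0.1*X1, the ratio for X0 comes out near one and the ratio for X1 near zero, flagging X0 as the dominant cause of prediction uncertainty without assuming linearity anywhere in the estimator.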

  18. Predictive Modeling of Cardiac Ischemia

    NASA Technical Reports Server (NTRS)

    Anderson, Gary T.

    1996-01-01

    The goal of the Contextual Alarms Management System (CALMS) project is to develop sophisticated models to predict the onset of clinical cardiac ischemia before it occurs. The system will continuously monitor cardiac patients and set off an alarm when they appear about to suffer an ischemic episode. The models take as inputs information from patient history and combine it with continuously updated information extracted from blood pressure, oxygen saturation and ECG lines. Expert system, statistical, neural network and rough set methodologies are then used to forecast the onset of clinical ischemia before it transpires, thus allowing early intervention aimed at preventing morbid complications from occurring. The models will differ from previous attempts by including combinations of continuous and discrete inputs. A commercial medical instrumentation and software company has invested funds in the project with a goal of commercialization of the technology. The end product will be a system that analyzes physiologic parameters and produces an alarm when myocardial ischemia is present. If proven feasible, a CALMS-based system will be added to existing heart monitoring hardware.

  19. Predicting Turning Points for Toy Cars

    NASA Astrophysics Data System (ADS)

    Ramsdell, M. W.; Wick, D. P.

    2005-10-01

    A simple experiment can be performed to estimate a toy car's effective coefficient of friction without the use of sophisticated equipment. The results can be used to predict the car's turning points when traveling on an arbitrarily shaped track containing multiple hills and valleys in a 2-D vertical plane. This activity is based on a team-oriented modeling-based project conducted at Clarkson University, which was published in the American Journal of Physics.

  20. Regional Climate Predictability in the Extratropics

    SciTech Connect

    Robertson, A. W.; Ghil, M.

    2001-08-09

    The goal of this project was to develop a dynamical framework for extratropical climate predictability on decade-to-century timescales and subcontinental spatial scales, based on the intraseasonal dynamics of the midlatitude atmosphere and their interaction with the ocean's longer timescales. A two-pronged approach was taken, based on (a) idealized, quasi-geostrophic, coupled models of the midlatitude ocean-atmosphere system, and (b) analysis of GCM results.

  1. Analysis of Variables: Predicting Sophomore Persistence Using Logistic Regression Analysis at the University of South Florida

    ERIC Educational Resources Information Center

    Miller, Thomas E.; Herreid, Charlene H.

    2009-01-01

    This is the fifth in a series of articles describing an attrition prediction and intervention project at the University of South Florida (USF) in Tampa. The project was originally presented in the 83(2) issue (Miller 2007). The statistical model for predicting attrition was described in the 83(3) issue (Miller and Herreid 2008). The methods and…
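
    The statistical core of such an attrition model is ordinary logistic regression on student variables. The sketch below is a generic gradient-descent version with an invented example feature (GPA); it is not USF's actual model, variables, or software.

```python
import math

def train_logistic(rows, labels, lr=0.1, epochs=500):
    """Fit logistic regression by stochastic gradient descent on log-loss.
    rows: list of feature vectors; labels: 0 (left) or 1 (persisted)."""
    w = [0.0] * (len(rows[0]) + 1)           # bias followed by weights
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted persistence probability
            g = p - y                        # gradient of log-loss w.r.t. z
            w[0] -= lr * g
            for i, xi in enumerate(x):
                w[i + 1] -= lr * g * xi
    return w

def predict_prob(w, x):
    """Probability of persistence for feature vector x."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1.0 / (1.0 + math.exp(-z))
```

    In practice the predicted probability would rank sophomores for intervention; students with low predicted persistence are the ones contacted.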

  2. De Novo Protein Structure Prediction

    NASA Astrophysics Data System (ADS)

    Hung, Ling-Hong; Ngan, Shing-Chung; Samudrala, Ram

    An unparalleled amount of sequence data is being made available from large-scale genome sequencing efforts. The data provide a shortcut to the determination of the function of a gene of interest, as long as there is an existing sequenced gene with similar sequence and of known function. This has spurred structural genomic initiatives with the goal of determining as many protein folds as possible (Brenner and Levitt, 2000; Burley, 2000; Brenner, 2001; Heinemann et al., 2001). The purpose of this is twofold: First, the structure of a gene product can often lead to direct inference of its function. Second, since the function of a protein is dependent on its structure, direct comparison of the structures of gene products can be more sensitive than the comparison of sequences of genes for detecting homology. Presently, structural determination by crystallography and NMR techniques is still slow and expensive in terms of manpower and resources, despite attempts to automate the processes. Computer structure prediction algorithms, while not providing the accuracy of the traditional techniques, are extremely quick and inexpensive and can provide useful low-resolution data for structure comparisons (Bonneau and Baker, 2001). Given the immense number of structures which the structural genomic projects are attempting to solve, there would be a considerable gain even if the computer structure prediction approach were applicable to a subset of proteins.

  3. Elementary School Projects.

    ERIC Educational Resources Information Center

    Learning By Design, 2001

    2001-01-01

    Highlights elementary school construction projects that have won the Learning By Design Awards for 2001. Projects covered involve new school construction; and renovation, additions, and restoration. (GR)

  4. Prediction of bull fertility.

    PubMed

    Utt, Matthew D

    2016-06-01

    Prediction of male fertility is an often sought-after endeavor for many species of domestic animals. This review primarily focuses on providing examples of dependent and independent variables to stimulate thought about the approach and methodology of identifying the most appropriate of those variables to predict bull (bovine) fertility. Although the list of variables will continue to grow with advancements in science, the principles behind making predictions will likely not change significantly. The basic principle of prediction requires identifying a dependent variable that is an estimate of fertility and an independent variable or variables that may be useful in predicting that fertility estimate. Fertility estimates vary in which parts of the process leading to conception they reflect, in the amount of variation that influences the estimate, and in the uncertainty thereof. The list of potential independent variables can be divided into measures of sperm competence based on performance in bioassays and direct measurements of sperm attributes. A good prediction will use a sample population of bulls that is representative of the population to which an inference will be made. Both dependent and independent variables should have a dynamic range in their values. Careful selection of independent variables includes reasonable measurement repeatability and minimal correlation among variables. Proper estimation, and an appreciation of the degree of uncertainty, of dependent and independent variables are crucial for using predictions to make decisions regarding bull fertility. PMID:26791329
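
    The "minimal correlation among variables" criterion can be made concrete with a simple screen: compute pairwise Pearson correlations among candidate measures and keep only variables that are not nearly redundant with ones already kept. The threshold and the variable names below are invented for illustration.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va > 0 and vb > 0 else 0.0

def select_predictors(candidates, max_r=0.8):
    """Greedy screen: keep a candidate only if |r| with every already-kept
    variable stays below max_r (an illustrative threshold)."""
    kept = []
    for name, values in candidates:
        if all(abs(pearson(values, v)) < max_r for _, v in kept):
            kept.append((name, values))
    return [name for name, _ in kept]
```

    A repeatability check (e.g. correlating repeated measurements of the same variable) would complement this screen before any variable enters the prediction model.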

  5. Coating Life Prediction

    NASA Technical Reports Server (NTRS)

    Nesbitt, J. A.; Gedwill, M. A.

    1984-01-01

    Hot-section gas-turbine components typically require some form of coating for oxidation and corrosion protection. Efficient use of coatings requires reliable and accurate predictions of the protective life of the coating. Currently, engine inspections and component replacements are often made on a conservative basis. As a result, there is a constant need to improve and develop the life-prediction capability of metallic coatings for use in various service environments. The present work is aimed at developing an improved methodology for predicting metallic coating lives in oxidizing and corrosive environments.

  6. Wind power prediction models

    NASA Technical Reports Server (NTRS)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
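
    One standard way to realize such a stochastic simulation model (samples that reproduce both a target speed distribution and hour-to-hour correlation) is to drive an AR(1) Gaussian process through a Weibull quantile transform. The sketch below is generic: the Weibull parameters and the correlation coefficient are illustrative, not fitted Goldstone values.

```python
import math
import random

def simulate_wind(n, shape=2.0, scale=6.0, rho=0.8, seed=0):
    """Hourly wind-speed samples (m/s): an AR(1) Gaussian process mapped
    through the Weibull inverse CDF, so the marginal distribution is
    Weibull(shape, scale) while successive samples stay correlated."""
    rng = random.Random(seed)
    z = rng.gauss(0, 1)                      # stationary start
    speeds = []
    for _ in range(n):
        z = rho * z + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)
        u = 0.5 * (1 + math.erf(z / math.sqrt(2)))   # Gaussian CDF -> uniform
        u = min(max(u, 1e-12), 1 - 1e-12)
        speeds.append(scale * (-math.log(1 - u)) ** (1 / shape))  # Weibull quantile
    return speeds
```

    Setting rho to zero recovers the interim model's uncorrelated hourly samples; available power would then be estimated from the cube of the sampled speeds.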

  7. Prediction of Microporosity in Shrouded Impeller Castings

    SciTech Connect

    Viswanathan, S.; Nelson, C. D.

    1998-09-01

    The purpose of this Cooperative Research and Development Agreement (CRADA) between the Oak Ridge National Laboratory (ORNL) and Morris Bean and Company was to link computer models of heat and fluid flow with previously developed quality criteria for the prediction of microporosity in a Al-4.5% Cu alloy shrouded impeller casting. The results may be used to analyze the casting process design for the commercial production of 206.0 alloy shrouded impeller castings. Test impeller castings were poured in the laboratory for the purpose of obtaining thermal data and porosity distributions. Also, a simulation of the test impeller casting was conducted and the results validated with porosity measurements on the test castings. A comparison of the predicted and measured microporosity distributions indicated an excellent correlation between experiments and prediction. The results of the experimental and modeling studies undertaken in this project indicate that the quality criteria developed for the prediction of microporosity in Al-4.5% Cu alloy castings can accurately predict regions of elevated microporosity even in complex castings such as the shrouded impeller casting. Accordingly, it should be possible to use quality criteria for porosity prediction in conjunction with computer models of heat and fluid flow to optimize the casting process for the production of shrouded impeller castings. Since high levels of microporosity may be expected to result in poor fatigue properties, casting designs that are optimized for low levels of microporosity should exhibit superior fatigue life.

  8. The feedback-related negativity signals salience prediction errors, not reward prediction errors.

    PubMed

    Talmi, Deborah; Atkinson, Ryan; El-Deredy, Wael

    2013-05-01

    Modulations of the feedback-related negativity (FRN) event-related potential (ERP) have been suggested as a potential biomarker in psychopathology. A dominant theory about this signal contends that it reflects the operation of the neural system underlying reinforcement learning in humans. The theory suggests that this frontocentral negative deflection in the ERP 230-270 ms after the delivery of a probabilistic reward expresses a prediction error signal derived from midbrain dopaminergic projections to the anterior cingulate cortex. We tested this theory by investigating whether FRN will also be observed for an inherently aversive outcome: physical pain. In another session, the outcome was monetary reward instead of pain. As predicted, unexpected reward omissions (a negative reward prediction error) yielded a more negative deflection relative to unexpected reward delivery. Surprisingly, unexpected pain omission (a positive reward prediction error) also yielded a negative deflection relative to unexpected pain delivery. Our data challenge the theory by showing that the FRN expresses aversive prediction errors with the same sign as reward prediction errors. Both FRNs were spatiotemporally and functionally equivalent. We suggest that FRN expresses salience prediction errors rather than reward prediction errors. PMID:23658166

  10. Consistent probabilistic outputs for protein function prediction

    PubMed Central

    Obozinski, Guillaume; Lanckriet, Gert; Grant, Charles; Jordan, Michael I; Noble, William Stafford

    2008-01-01

    In predicting hierarchical protein function annotations, such as terms in the Gene Ontology (GO), the simplest approach makes predictions for each term independently. However, this approach has the unfortunate consequence that the predictor may assign to a single protein a set of terms that are inconsistent with one another; for example, the predictor may assign a specific GO term to a given protein ('purine nucleotide binding') but not assign the parent term ('nucleotide binding'). Such predictions are difficult to interpret. In this work, we focus on methods for calibrating and combining independent predictions to obtain a set of probabilistic predictions that are consistent with the topology of the ontology. We call this procedure 'reconciliation'. We begin with a baseline method for predicting GO terms from a collection of data types using an ensemble of discriminative classifiers. We apply the method to a previously described benchmark data set, and we demonstrate that the resulting predictions are frequently inconsistent with the topology of the GO. We then consider 11 distinct reconciliation methods: three heuristic methods; four variants of a Bayesian network; an extension of logistic regression to the structured case; and three novel projection methods - isotonic regression and two variants of a Kullback-Leibler projection method. We evaluate each method in three different modes - per term, per protein and joint - corresponding to three types of prediction tasks. Although the principal goal of reconciliation is interpretability, it is important to assess whether interpretability comes at a cost in terms of precision and recall. Indeed, we find that many apparently reasonable reconciliation methods yield reconciled probabilities with significantly lower precision than the original, unreconciled estimates. On the other hand, we find that isotonic regression usually performs better than the underlying, unreconciled method, and almost never performs worse
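
    The simplest of the heuristic reconciliation schemes can be sketched as follows: propagate each term's probability up to its ancestors so that a parent is always at least as probable as its children. This is an illustrative heuristic in the spirit of the paper's baselines, not its isotonic-regression or Kullback-Leibler projection methods.

```python
def reconcile_up(probs, parent_of):
    """probs: term -> predicted probability; parent_of: term -> list of
    parent terms in the ontology DAG. Returns probabilities made consistent
    (parent >= child) by raising ancestors to the max of their descendants."""
    out = dict(probs)
    changed = True
    while changed:                       # iterate to a fixpoint on the DAG
        changed = False
        for child, parents in parent_of.items():
            for parent in parents:
                if out[parent] < out[child]:
                    out[parent] = out[child]
                    changed = True
    return out
```

    With the GO example above, a 0.8 prediction for 'purine nucleotide binding' lifts its parent 'nucleotide binding' to at least 0.8 as well, so the inconsistency described in the abstract cannot arise.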

  11. Predicting Population Curves.

    ERIC Educational Resources Information Center

    Bunton, Matt

    2003-01-01

    Uses graphs to involve students in inquiry-based population investigations on the Wisconsin gray wolf. Requires students to predict future changes in the wolf population, carrying capacity, and deer population. (YDS)

  12. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  13. The EMCC / DARPA Massively Parallel Electromagnetic Scattering Project

    NASA Technical Reports Server (NTRS)

    Woo, Alex C.; Hill, Kueichien C.

    1996-01-01

    The Electromagnetic Code Consortium (EMCC) was sponsored by the Advanced Research Projects Agency (ARPA) to demonstrate the effectiveness of massively parallel computing in large-scale radar signature predictions. The EMCC/ARPA project consisted of three parts.

  14. On the prediction of GLE events

    NASA Astrophysics Data System (ADS)

    Nunez, Marlon; Reyes, Pedro

    2016-04-01

    A model for predicting the occurrence of GLE events is presented. The model uses the UMASEP scheme, based on the lag-correlation between the time derivatives of soft X-ray flux (SXR) and near-earth proton fluxes (Núñez, 2011, 2015). We extended this approach with the correlation between SXR and ground-level neutron measurements. The model was calibrated with X-ray, proton and neutron data obtained during the period 1989-2015 from the GOES/HEPAD instrument, and neutron data from the Neutron Monitor Database (NMDB). During this period, 32 GLE events were detected by neutron monitor stations. We consider a GLE prediction successful when it is triggered before the first GLE alert is issued by any neutron station of the NMDB network. For the most recent 16 years (2000-2015), the model issued successful predictions for 53.8% of events (7 of 13 GLE events), with a false alarm ratio (FAR) of 36.4% (4/11) and an average warning time (AWT) of 10 min. For the earlier part of the evaluation period (1989-1999), the model issued successful predictions for 31.6% of events (6 of 19 GLE events), with a FAR of 33.3% (3/9) and an AWT of 17 min. A preliminary conclusion is that the model is not able to predict the promptest events, but only the more gradual ones. The final goal of this project, which is now halfway through its planned two-year duration, is the prediction of >500 MeV events. This project has received funding from the European Union's Horizon 2020 research and innovation programme under agreement No 637324.
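
    The lag-correlation idea behind the UMASEP scheme can be sketched as below: correlate the time derivative of the soft X-ray flux with a particle time series over candidate lags and look for a strong peak. The Pearson estimator, window handling, and any trigger threshold are illustrative assumptions, not the operational model.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va > 0 and vb > 0 else 0.0

def best_lag(sxr, particles, max_lag=10):
    """Correlate d(SXR)/dt against a particle (proton or neutron) series at
    each candidate lag; returns (lag, correlation) for the strongest match.
    A high peak at a physically plausible lag is the kind of evidence that
    would contribute to issuing a warning."""
    dsxr = [b - a for a, b in zip(sxr, sxr[1:])]
    best = (0, -1.0)
    for lag in range(max_lag + 1):
        n = min(len(dsxr), len(particles) - lag)
        if n < 2:
            break
        r = pearson(dsxr[:n], particles[lag:lag + n])
        if r > best[1]:
            best = (lag, r)
    return best
```

    In a real-time setting this scan would run over a sliding window of recent GOES X-ray and particle data rather than a whole archive.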

  15. Operational Dust Prediction

    NASA Technical Reports Server (NTRS)

    Benedetti, Angela; Baldasano, Jose M.; Basart, Sara; Benincasa, Francesco; Boucher, Olivier; Brooks, Malcolm E.; Chen, Jen-Ping; Colarco, Peter R.; Gong, Sunlin; Huneeus, Nicolas; Jones, Luke; Lu, Sarah; Menut, Laurent; Morcrette, Jean-Jacques; Mulcahy, Jane; Nickovic, Slobodan; Garcia-Pando, Carlos P.; Reid, Jeffrey S.; Sekiyama, Thomas T.; Tanaka, Taichu Y.; Terradellas, Enric; Westphal, Douglas L.; Zhang, Xiao-Ye; Zhou, Chun-Hong

    2014-01-01

    Over the last few years, numerical prediction of dust aerosol concentration has become prominent at several research and operational weather centres due to growing interest from diverse stakeholders, such as solar energy plant managers, health professionals, aviation and military authorities and policymakers. Dust prediction in numerical weather prediction-type models faces a number of challenges owing to the complexity of the system. At the centre of the problem is the vast range of scales required to fully account for all of the physical processes related to dust. Another limiting factor is the paucity of suitable dust observations available for model, evaluation and assimilation. This chapter discusses in detail numerical prediction of dust with examples from systems that are currently providing dust forecasts in near real-time or are part of international efforts to establish daily provision of dust forecasts based on multi-model ensembles. The various models are introduced and described along with an overview on the importance of dust prediction activities and a historical perspective. Assimilation and evaluation aspects in dust prediction are also discussed.

  16. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs Sampler MCMC approach BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
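
    Plain probabilistic matrix factorization, the base that BHPMF extends, can be sketched as below: fit low-rank factors to the observed entries of the sparse trait matrix and use them to impute the gaps. The SGD fitting, hyperparameters, and Gaussian initialization are illustrative, and the hierarchical Bayesian machinery (taxonomic hierarchy, Gibbs sampling, per-prediction uncertainty) is omitted.

```python
import random

def pmf_fill(matrix, rank=2, lr=0.05, reg=0.02, epochs=2000, seed=0):
    """Gap-fill a sparse matrix (None marks a missing trait value) with a
    low-rank factorization U V^T fitted by SGD on the observed entries."""
    rng = random.Random(seed)
    n, m = len(matrix), len(matrix[0])
    U = [[rng.gauss(0, 0.1) for _ in range(rank)] for _ in range(n)]
    V = [[rng.gauss(0, 0.1) for _ in range(rank)] for _ in range(m)]
    obs = [(i, j, matrix[i][j]) for i in range(n) for j in range(m)
           if matrix[i][j] is not None]
    for _ in range(epochs):
        for i, j, x in obs:
            err = x - sum(U[i][k] * V[j][k] for k in range(rank))
            for k in range(rank):
                u, v = U[i][k], V[j][k]
                U[i][k] += lr * (err * v - reg * u)   # regularized SGD step
                V[j][k] += lr * (err * u - reg * v)
    return [[matrix[i][j] if matrix[i][j] is not None
             else sum(U[i][k] * V[j][k] for k in range(rank))
             for j in range(m)] for i in range(n)]
```

    BHPMF layers a taxonomic hierarchy of such factorizations and samples the factors with MCMC, which is what yields the uncertainty estimates the abstract emphasizes.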

  17. Sensor image prediction techniques

    NASA Astrophysics Data System (ADS)

    Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.

    1981-02-01

    The preparation of prediction imagery is a complex, costly, and time-consuming process. Image prediction systems that produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks performed during navigation are identical to those performed for the targeting mission function. In addition, the results of the analysis of navigator performance when using a particular sensor can be extended to the analysis of mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and were then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.

  18. ASRM radiation and flowfield prediction status. [Advanced Solid Rocket Motor plume radiation prediction

    NASA Technical Reports Server (NTRS)

    Reardon, J. E.; Everson, J.; Smith, S. D.; Sulyma, P. R.

    1991-01-01

    Existing and proposed methods for the prediction of plume radiation are discussed in terms of their application to the NASA Advanced Solid Rocket Motor (ASRM) and Space Shuttle Main Engine (SSME) projects. Extrapolations of the Solid Rocket Motor (SRM) are discussed with respect to preliminary predictions of the primary and secondary radiation environments. The methodology for radiation and initial plume property predictions are set forth, including a new code for scattering media and independent secondary source models based on flight data. The Monte Carlo code employs a reverse-evaluation approach which traces rays back to their point of absorption in the plume. The SRM sea-level plume model is modified to account for the increased radiation in the ASRM plume due to the ASRM's propellant chemistry. The ASRM cycle-1 environment predictions are shown to identify a potential reason for the shutdown spike identified with pre-SRM staging.

  19. The 100 People Project

    ERIC Educational Resources Information Center

    McLeod, Keri

    2007-01-01

    This article describes the 100 People Project and how the author integrates the project in her class. The 100 People Project is a nonprofit organization based in New York City. The organization poses the question: If there were only 100 people in the world, what would the world look like? Through the project, students were taught about ethics in…

  20. Earth System Science Project

    ERIC Educational Resources Information Center

    Rutherford, Sandra; Coffman, Margaret

    2004-01-01

    For several decades, science teachers have used bottles for classroom projects designed to teach students about biology. Bottle projects do not have to just focus on biology, however. These projects can also be used to engage students in Earth science topics. This article describes the Earth System Science Project, which was adapted and developed…

  1. Determinants of project success

    NASA Technical Reports Server (NTRS)

    Murphy, D. C.; Baker, B. N.; Fisher, D.

    1974-01-01

    The interactions of numerous project characteristics, with particular reference to project performance, were studied. Determinants of success are identified along with the accompanying implications for client organization, parent organization, project organization, and future research. Variables are selected which are found to have the greatest impact on project outcome, and the methodology and analytic techniques to be employed in identification of those variables are discussed.

  2. Project Lodestar Special Report.

    ERIC Educational Resources Information Center

    Brown, Peggy, Ed.

    1981-01-01

    The Association of American Colleges' (AAC) Project Lodestar is addressed in an article and descriptions of the pilot phase of the project at 13 institutions. In "Project Lodestar: Realistically Assessing the Future," Peggy Brown provides an overview of the project, which is designed to help colleges and universities in assessment of institutional…

  3. Assembling the Project Team.

    ERIC Educational Resources Information Center

    Mills, Donald B.

    2003-01-01

    Although the approval of a project's design and budget typically rests with the campus governing board, a project team determines the configuration, the cost, and the utility of the completed project. Because of the importance of these decisions, colleges and universities must select project team members carefully. (Author)

  4. Korea's School Grounds Projects

    ERIC Educational Resources Information Center

    Park, Joohun

    2003-01-01

    This article describes two projects which Korea has undertaken to improve its school grounds: (1) the Green School Project; and (2) the School Forest Pilot Project. The Korean Ministry of Education and Human Resources Development (MOE&HRD) recently launched the Green School Project centred on existing urban schools with poor outdoor environments.…

  5. Project Follow Through.

    ERIC Educational Resources Information Center

    Illinois State Office of the Superintendent of Public Instruction, Springfield. Dept. for Exceptional Children.

    The four Follow Through projects in Illinois are described and evaluated. These projects involve approximately 1,450 children in K-3 in Mounds, East Saint Louis, Waukegan, and Chicago. The Chicago project is subdivided into three individual projects and is trying three experimental programs. Emphasis is given to the nature of the environmental…

  6. eProject Builder

    SciTech Connect

    2014-06-01

    eProject Builder enables Energy Services Companies (ESCOs) and their contracting agencies to: 1. upload and track project-level information 2. generate basic project reports required by local, state, and/or federal agencies 3. benchmark new Energy Savings Performance Contract (ESPC) projects against historical data

  7. Improving Software Engineering on NASA Projects

    NASA Technical Reports Server (NTRS)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    Software Engineering Initiative: Reduces risk of software failure - Increases mission safety. More predictable software cost estimates and delivery schedules. Smarter buyer of contracted-out software. More defects found and removed earlier. Reduces duplication of efforts between projects. Increases ability to meet the challenges of evolving software technology.

  8. Crystal gazing v. computer system technology projections.

    NASA Astrophysics Data System (ADS)

    Wells, Donald C.

    The following sections are included: * INTRODUCTION * PREDICTIONS FOR THE EARLY NINETIES * EVOLUTION OF COMPUTER CAPACITIES * UNIX IS COMING! * GRAPHICS TECHNOLOGY * MASSIVE ARCHIVAL STORAGE * ADA AND PROJECT MANAGEMENT * ARTIFICIAL INTELLIGENCE TECHNOLOGY * FILLING IN THE DETAILS * UNIX DESIDERATA * ICON-DRIVEN COMMAND LANGUAGES? * AI AGAIN * DISCUSSION * REFERENCES * BIBLIOGRAPHY—FOR FURTHER READING

  9. Advanced Ground Systems Maintenance Prognostics Project

    NASA Technical Reports Server (NTRS)

    Harp, Janicce Leshay

    2014-01-01

    The project implements prognostics capabilities to predict when a component, system or subsystem will no longer meet desired functional or performance criteria, called the "end of life." The capability also provides an assessment of the "remaining useful life" of a hardware component.

  10. Project management skills.

    PubMed

    Perce, K H

    1998-08-01

    1. Project management skills are important to develop because occupational and environmental health nurses are increasingly asked to implement and manage health related projects and programs. 2. Project management is the process of planning and managing project tasks and resources, and communicating the progress and results. This requires the coordination of time, tasks, equipment, people, and budget. 3. Three main critical skill areas are needed to be an effective project manager: behavioral skills such as negotiation, conflict resolution, and interpersonal problem solving; use of project management tools to manage project tasks and resources; and effective communication skills. PMID:9748920

  11. Guidelines for Project Management

    NASA Technical Reports Server (NTRS)

    Ben-Arieh, David

    2001-01-01

    Project management is an important part of the professional activities at Kennedy Space Center (KSC). Project management is the means by which many of the operations at KSC take shape. Moreover, projects at KSC are implemented in a variety of ways in different organizations. The official guidelines for project management are provided by NASA headquarters and are quite general. The project reported herein deals with developing practical and detailed project management guidelines in support of the project managers. This report summarizes the current project management effort in the Process Management Division and presents a new modeling approach of project management developed by the author. The report also presents the Project Management Guidelines developed during the summer.

  12. Predictive reward signal of dopamine neurons.

    PubMed

    Schultz, W

    1998-07-01

    amygdala, which process specific reward information but do not emit a global reward prediction error signal. A cooperation between the different reward signals may assure the use of specific rewards for selectively reinforcing behaviors. Among the other projection systems, noradrenaline neurons predominantly serve attentional mechanisms and nucleus basalis neurons code rewards heterogeneously. Cerebellar climbing fibers signal errors in motor performance or errors in the prediction of aversive events to cerebellar Purkinje cells. Most deficits following dopamine-depleting lesions are not easily explained by a defective reward signal but may reflect the absence of a general enabling function of tonic levels of extracellular dopamine. Thus dopamine systems may have two functions, the phasic transmission of reward information and the tonic enabling of postsynaptic neurons. PMID:9658025

  13. Dalhousie Orimulsion FGD project

    SciTech Connect

    1995-09-01

    NB Power implemented an "off oil" program following the oil crises of the '70s and '80s. A component of this program was the investigation and implementation of Orimulsion as an alternative to Bunker C. In the mid-1980s the concept of burning Orinoco, a heavy bitumen, was investigated at a 100 MW plant capable of burning pitch. The predicted burning temperature of Orinoco is 350°F. Lagoven, the division of Petroleos de Venezuela SA which handled the Orinoco fuel, subsequently developed the emulsified "Orimulsion" form. An agreement between Lagoven and NB Power resulted in the 100 MW Dalhousie No. 1 Generating Station being used as a commercial demonstration facility. The demonstration program ran from 1988 to 1990. Fuel handling, combustion, and operational aspects were established. In 1990, the Dalhousie No. 2 boiler conversion project was established, which also included the addition of a wet limestone scrubber to the facility.

  14. The lightcraft project

    NASA Technical Reports Server (NTRS)

    Messitt, Don G.; Myrabo, Leik N.

    1991-01-01

    Rensselaer Polytechnic Institute has been developing a transatmospheric 'Lightcraft' technology which uses beamed laser energy to propel advanced shuttle craft to orbit. In the past several years, Rensselaer students have analyzed the unique combined-cycle Lightcraft engine, designed a small unmanned Lightcraft Technology Demonstrator, and conceptualized larger manned Lightcraft - to name just a few of the interrelated design projects. The 1990-91 class carried out preliminary and detailed design efforts for a one-person 'Mercury' Lightcraft, using computer-aided design and finite-element structural modeling techniques. In addition, they began construction of a 2.6 m-diameter, full-scale engineering prototype mockup. The mockup will be equipped with three robotic legs that 'kneel' for passenger entry and exit. More importantly, the articulated tripod gear is crucial for accurately pointing at and tracking the laser relay mirrors, a maneuver that must be performed just prior to liftoff. Also accomplished were further design improvements on a 6-inch-diameter Lightcraft model (for testing in RPI's hypersonic tunnel), and new laser propulsion experiments. The resultant experimental data will be used to calibrate Computational Fluid Dynamic (CFD) codes and analytical laser propulsion models that can simulate vehicle/engine flight conditions along a transatmospheric boost trajectory. These efforts will enable the prediction of distributed aerodynamic and thruster loads over the entire full-scale spacecraft.

  15. Template Matching Approach to Signal Prediction

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; Kulikov, Igor

    2010-01-01

    A new approach to signal prediction and prognostic assessment of spacecraft health resolves an inherent difficulty in fusing sensor data with simulated data. This technique builds upon previous work that demonstrated the importance of physics-based transient models to accurate prediction of signal dynamics and system performance. While models can greatly improve predictive accuracy, they are difficult to apply in general because of variations in model type, accuracy, or intended purpose. However, virtually any flight project will have at least some modeling capability at its disposal, whether a full-blown simulation, partial physics models, dynamic look-up tables, a brassboard analogue system, or simple hand-driven calculation by a team of experts. Many models can be used to develop a predict, or an estimate of the next day's or next cycle's behavior, which is typically used for planning purposes. The fidelity of a predict varies from one project to another, depending on the complexity of the simulation (i.e., linearized or full differential equations) and the level of detail in anticipated system operation, but typically any predict cannot be adapted to changing conditions or adjusted spacecraft command execution. Applying a predict blindly, without adapting the predict to current conditions, produces mixed results at best, primarily due to mismatches between assumed execution of spacecraft activities and actual times of execution. This results in the predict becoming useless during periods of complicated behavior, exactly when the predict would be most valuable. Each spacecraft operation tends to show up as a transient in the data, and if the transients are misaligned, using the predict can actually harm forecasting performance. To address this problem, the approach here expresses the predict in terms of a baseline function superposed with one or more transient functions. 
These transients serve as signal templates, which can be relocated in time and space against

  16. Deadbeat Predictive Controllers

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Phan, Minh

    1997-01-01

    Several new computational algorithms are presented to compute the deadbeat predictive control law. The first algorithm makes use of a multi-step-ahead output prediction to compute the control law without explicitly calculating the controllability matrix. The system identification must be performed first and then the predictive control law is designed. The second algorithm uses the input and output data directly to compute the feedback law. It combines the system identification and the predictive control law into one formulation. The third algorithm uses an observable-canonical form realization to design the predictive controller. The relationship between all three algorithms is established through the use of the state-space representation. All algorithms are applicable to multi-input, multi-output systems with disturbance inputs. In addition to the feedback terms, feed forward terms may also be added for disturbance inputs if they are measurable. Although the feedforward terms do not influence the stability of the closed-loop feedback law, they enhance the performance of the controlled system.

  17. [Predicting suicide or predicting the unpredictable in an uncertain world: Reinforcement Learning Model-Based analysis].

    PubMed

    Desseilles, Martin

    2012-01-01

    In general, it appears that the suicidal act is highly unpredictable with the current scientific means available. In this article, the author submits the hypothesis that predicting suicide is complex because it amounts to predicting a choice, in itself unpredictable. The article proposes a reinforcement learning model-based analysis. In this model, we integrate, on the one hand, four ascending modulatory neurotransmitter systems (acetylcholine, noradrenalin, serotonin, and dopamine) with their respective regions of projection and afference, and on the other hand, various observations from brain imaging identified to date in the suicidal process.

  18. An Interactive Flow Model for Projecting School Enrolments.

    ERIC Educational Resources Information Center

    Gould, Edward

    1993-01-01

    Presents a mathematical model for projecting class cohort sizes for districts and schools via demographic data. The model allows entry of estimates for local preschool data to predict first-grade entrants, which are progressed through the school system. Average retention rates, school apportionments and national demographic projections can be…

  19. The Yangtze-Project

    NASA Astrophysics Data System (ADS)

    Subklew, Günter; Ulrich, Julia; Fürst, Leander; Höltkemeier, Agnes

    2010-05-01

    then encounter considerably improved climatic conditions with higher temperatures during their physiologically active season in the summer months. This reversal of the flood pulse in the course of the year will exert an enormous influence on the fauna and flora and the associated processes. Other parameters resulting from the management of the reservoir are the sediment deposits and their varying extents in the different zones of the WFZ. For example, the different degrees of compaction of the sediment of the river bank will largely determine the exchange of oxygen, nutrients and metabolites between the plants and the water body and thus the major ecosystem functions. The locally different thicknesses of the sediment body will be decisive for the emergence of plant shoots through the sediment. In areas of high flow rates, in contrast, habitats will be established that are strongly characterized by the dynamics of the pebbles and boulders. The Three Gorges Project will thus bring about a significant change in habitat conditions for vegetation in the WFZ whose consequences cannot yet be predicted with any certainty. This also concerns the potential and long-term impacts of changed vegetation on the local population, who exploit the plant resources, and also on tourism and on the hydroregime and the sedimentation regime in the reservoir. Landslides and rock falls are the major geological events in the Three Gorges region. The mud and debris avalanches formed during such landslips represent a danger both for areas of settlement and also for land used industrially and agriculturally, as well as for infrastructure facilities, and may also considerably obstruct navigation. Furthermore, the analogous mass movements are one of the reasons for the silting up of the Yangtze and many of its tributaries. The region of the Three Gorges contains rapidly growing urban centres that will receive further impulses for growth from the dam project. 
The fact that the Chongqing conurbation

  20. Dialogues on prediction errors.

    PubMed

    Niv, Yael; Schoenbaum, Geoffrey

    2008-07-01

    The recognition that computational ideas from reinforcement learning are relevant to the study of neural circuits has taken the cognitive neuroscience community by storm. A central tenet of these models is that discrepancies between actual and expected outcomes can be used for learning. Neural correlates of such prediction-error signals have been observed now in midbrain dopaminergic neurons, striatum, amygdala and even prefrontal cortex, and models incorporating prediction errors have been invoked to explain complex phenomena such as the transition from goal-directed to habitual behavior. Yet, like any revolution, the fast-paced progress has left an uneven understanding in its wake. Here, we provide answers to ten simple questions about prediction errors, with the aim of exposing both the strengths and the limitations of this active area of neuroscience research.
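    The prediction-error idea this abstract surveys can be sketched in its simplest temporal-difference form (a minimal illustration, with an invented learning rate; not the authors' model): the error is the discrepancy between the received and expected outcome, and it drives the update of the expectation.

    ```python
    # Minimal temporal-difference sketch of a prediction-error update
    # (illustrative parameters, not from the reviewed models).
    def td_update(value, reward, alpha=0.1):
        prediction_error = reward - value   # delta = r - V
        return value + alpha * prediction_error

    v = 0.0
    for _ in range(100):
        v = td_update(v, reward=1.0)        # repeated rewarded trials
    print(round(v, 2))  # expectation converges toward the reward, 1.0
    ```

    As the expectation approaches the actual outcome, the prediction error shrinks toward zero and learning stops, which is the core tenet the abstract describes.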

  1. Predicting Emergency Department Visits

    PubMed Central

    Poole, Sarah; Grannis, Shaun; Shah, Nigam H.

    2016-01-01

    High utilizers of emergency departments account for a disproportionate number of visits, often for nonemergency conditions. This study aims to identify these high users prospectively. Routinely recorded registration data from the Indiana Public Health Emergency Surveillance System was used to predict whether patients would revisit the Emergency Department within one month, three months, and six months of an index visit. Separate models were trained for each outcome period, and several predictive models were tested. Random Forest models had good performance and calibration for all outcome periods, with area under the receiver operating characteristic curve of at least 0.96. This high performance was found to be due to non-linear interactions among variables in the data. The ability to predict repeat emergency visits may provide an opportunity to establish, prioritize, and target interventions to ensure that patients have access to the care they require outside an emergency department setting. PMID:27570684
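    A minimal sketch of the study's modeling approach, using synthetic registration-style features (the feature names, data, and thresholds here are invented for illustration; only the Random Forest model and ROC AUC evaluation come from the abstract):

    ```python
    # Hedged sketch: train a random-forest classifier to flag patients likely
    # to revisit the ED within an outcome window, then measure discrimination
    # with ROC AUC. All data below is synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    X = np.column_stack([
        rng.integers(18, 90, n),   # hypothetical: age at index visit
        rng.poisson(1.5, n),       # hypothetical: prior ED visit count
        rng.integers(0, 2, n),     # hypothetical: weekend arrival flag
    ])
    # Outcome depends non-linearly on prior visits; the abstract attributes
    # the high AUC to such non-linear interactions among variables.
    p = 1.0 / (1.0 + np.exp(-(0.9 * X[:, 1] - 2.0)))
    y = rng.random(n) < p

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"AUC = {auc:.2f}")
    ```

    In practice, separate models would be trained for each outcome period (one, three, and six months), as the abstract describes.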

  2. Pilot workload prediction

    NASA Technical Reports Server (NTRS)

    Pepitone, David D.; Shively, Robert J.; Bortolussi, Michael R.

    1988-01-01

    A predictive model of pilot workload is developed using a time-based algorithm, workload values from previous research, and experimental data obtained by a group of experienced pilots on a Singer-Link Gat-1 instrument trainer with three degrees of motion (roll, pitch, and yaw). Each pilot performed three experimental flights presented in a counterbalanced order; each flight consisted of short, medium, or long cruise and initial approach segments. Results strongly suggest that pilots were more sensitive to the rate at which work was done than to the total amount of work accomplished. The result of predictions obtained with the model showed that the time-weighted average of the component workload ratings was able to predict the obtained workload ratings accurately.
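    The time-weighted average the abstract describes can be sketched as follows (segment names, durations, and ratings are hypothetical; only the weighting scheme reflects the abstract):

    ```python
    # Hedged sketch of a time-weighted workload predict: each flight segment
    # contributes its component workload rating weighted by the fraction of
    # total flight time it occupies, so the rate of work, not the total
    # amount, dominates the prediction.
    def predicted_workload(segments):
        """segments: list of (duration_minutes, component_rating) pairs."""
        total_time = sum(d for d, _ in segments)
        return sum(d * r for d, r in segments) / total_time

    # Hypothetical flight: a long low-workload cruise and a short
    # high-workload approach.
    flight = [(40, 2.0), (10, 6.0)]
    print(round(predicted_workload(flight), 2))  # prints 2.8
    ```

    Note that shortening the cruise while keeping the approach fixed raises the prediction, consistent with the finding that pilots were more sensitive to work rate than to total work.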

  3. Is Suicide Predictable?

    PubMed Central

    Seghatoleslam, T; Habi, H; Rashid, R Abdul; Mosavi, N; Asmaee, S; Naseri, A

    2012-01-01

    Background: The current study aimed to test the hypothesis: Is suicide predictable? And try to classify the predictive factors in multiple suicide attempts. Methods: A cross-sectional study was administered to 223 multiple attempters, women who came to a medical poison centre after a suicide attempt. The participants were young, poor, and single. A logistic regression analysis was used to classify the predictive factors of suicide. Results: Women who had multiple suicide attempts exhibited a significant tendency to attempt suicide again. They had a history for more than two years of multiple suicide attempts, from three to as many as 18 times, plus mental illnesses such as depression and substance abuse. They also had a positive history of mental illnesses. Conclusion: Results indicate that contributing factors for another suicide attempt include previous suicide attempts, mental illness (depression), or a positive history of mental illnesses in the family affecting them at a young age, and substance abuse. PMID:23113176

  4. Predicting the Sunspot Cycle

    NASA Technical Reports Server (NTRS)

    Hathaway, David H.

    2009-01-01

    The 11-year sunspot cycle was discovered by an amateur astronomer in 1844. Visual and photographic observations of sunspots have been made by both amateurs and professionals over the last 400 years. These observations provide key statistical information about the sunspot cycle that allows for predictions of future activity. However, sunspots and the sunspot cycle are magnetic in nature. For the last 100 years these magnetic measurements have been acquired and used exclusively by professional astronomers to gain new information about the nature of the solar activity cycle. Recently, magnetic dynamo models have evolved to the stage where they can assimilate past data and provide predictions. With the advent of the Internet and open data policies, amateurs now have equal access to the same data used by professionals and equal opportunities to contribute (but, alas, without pay). This talk will describe some of the more useful prediction techniques and reveal what they say about the intensity of the upcoming sunspot cycle.

  5. Detecting and predicting changes.

    PubMed

    Brown, Scott D; Steyvers, Mark

    2009-02-01

    When required to predict sequential events, such as random coin tosses or basketball free throws, people reliably use inappropriate strategies, such as inferring temporal structure when none is present. We investigate the ability of observers to predict sequential events in dynamically changing environments, where there is an opportunity to detect true temporal structure. In two experiments we demonstrate that participants often make correct statistical decisions when asked to infer the hidden state of the data generating process. However, when asked to make predictions about future outcomes, accuracy decreased even though normatively correct responses in the two tasks were identical. A particle filter model accounts for all data, describing performance in terms of a plausible psychological process. By varying the number of particles, and the prior belief about the probability of a change occurring in the data generating process, we were able to model most of the observed individual differences.
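    A minimal particle-filter sketch for the change-detection setting described above (all parameters, the coin-flip framing, and the change probability are illustrative assumptions, not the authors' fitted model): each particle carries a belief about the current bias of a process that occasionally switches, and resampling keeps particles consistent with the observed sequence.

    ```python
    # Hedged particle-filter sketch (hypothetical parameters): estimate the
    # current bias of a Bernoulli process whose hidden state can change.
    import random

    random.seed(1)

    def particle_filter(observations, n_particles=500, p_change=0.1):
        particles = [random.random() for _ in range(n_particles)]  # biases
        for obs in observations:
            # With probability p_change, a particle's regime resets to a
            # fresh bias, modeling a possible change in the data generator.
            particles = [random.random() if random.random() < p_change else p
                         for p in particles]
            # Weight each particle by the likelihood of the observation,
            # then resample in proportion to the weights.
            weights = [p if obs == 1 else 1.0 - p for p in particles]
            particles = random.choices(particles, weights=weights,
                                       k=n_particles)
        return sum(particles) / n_particles  # posterior mean of current bias

    # A run of successes after a run of failures pulls the estimate upward.
    est = particle_filter([0, 0, 0, 0, 1, 1, 1, 1, 1, 1])
    print(round(est, 2))
    ```

    The abstract's individual differences can be modeled in this framework by varying `n_particles` and the prior change probability `p_change`.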

  6. Predictive aging of polymers

    NASA Technical Reports Server (NTRS)

    Cuddihy, Edward F. (Inventor); Willis, Paul B. (Inventor)

    1989-01-01

    A method of predicting aging of polymers operates by heating a polymer in the outdoors to an elevated temperature until a change of property is induced. The test is conducted at a plurality of temperatures to establish a linear Arrhenius plot which is extrapolated to predict the induction period for failure of the polymer at ambient temperature. An Outdoor Photo Thermal Aging Reactor (OPTAR) is also described including a heatable platen for receiving a sheet of polymer, means to heat the platen, and switching means such as a photoelectric switch for turning off the heater during dark periods.
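    The Arrhenius extrapolation the patent describes can be sketched numerically (the temperatures and induction times below are invented for illustration; only the fit-and-extrapolate procedure reflects the abstract):

    ```python
    # Hedged sketch: fit ln(induction time) against 1/T for several elevated
    # test temperatures, then extrapolate the linear Arrhenius plot to an
    # ambient temperature. All data points are synthetic.
    import math

    def arrhenius_predict(data, t_ambient_k):
        """data: list of (temperature_K, induction_time_hours) pairs."""
        xs = [1.0 / t for t, _ in data]
        ys = [math.log(h) for _, h in data]
        n = len(data)
        mx, my = sum(xs) / n, sum(ys) / n
        # Ordinary least-squares slope and intercept of ln(t) vs 1/T
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        intercept = my - slope * mx
        return math.exp(intercept + slope / t_ambient_k)

    # Synthetic failure times at 373 K, 393 K, and 413 K, extrapolated to
    # a 298 K ambient temperature:
    measurements = [(373.0, 1000.0), (393.0, 400.0), (413.0, 170.0)]
    life = arrhenius_predict(measurements, 298.0)
    print(f"predicted induction period ~ {life:.0f} hours")
    ```

    Because the induction time grows exponentially as temperature falls, the predicted ambient-temperature life is far longer than any of the elevated-temperature test durations, which is what makes the accelerated test useful.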

  7. Predictive aging of polymers

    NASA Technical Reports Server (NTRS)

    Cuddihy, Edward F. (Inventor); Willis, Paul B. (Inventor)

    1990-01-01

    A method of predicting aging of polymers operates by heating a polymer in the outdoors to an elevated temperature until a change of property is induced. The test is conducted at a plurality of temperatures to establish a linear Arrhenius plot which is extrapolated to predict the induction period for failure of the polymer at ambient temperature. An Outdoor Photo Thermal Aging Reactor (OPTAR) is also described including a heatable platen for receiving a sheet of polymer, means to heat the platen and switching means such as a photoelectric switch for turning off the heater during dark periods.

  8. Limits and Uses of Dynamical Predictions of Meteorological Drought

    NASA Astrophysics Data System (ADS)

    Lyon, B.

    2012-12-01

    The overall technical capabilities now exist to make real-time, seasonal drought forecasts on a near-global scale, but how skillful are such predictions? In this talk the skill of seasonal drought indicator predictions based on a combination of real-time observations and dynamical model seasonal forecasts is first evaluated over the US and Mexico. The relative contributions of predictive skill from sea surface temperatures and initialized land surface and atmospheric conditions are discussed relative to baseline predictability resulting from the inherent persistence of the indicators. Web-based tools which display such predictions are then briefly described. Finally, the challenges in using such predictions in decision-making settings are described. In many applications, more detailed or tailored information is desired. Examples of the latter are based on IRI-related projects on fire early warning in Kalimantan, food security outlooks in East Africa and research towards drought early warning in the agriculture sector in the Philippines and Sri Lanka.

  9. Multiple regression analyses in the prediction of aerospace instrument costs

    NASA Astrophysics Data System (ADS)

    Tran, Linh

    The aerospace industry has been investing for decades in ways to improve its efficiency in estimating the project life cycle cost (LCC). One of the major focuses in the LCC is the cost prediction of aerospace instruments done during the early conceptual design phase of the project. The accuracy of early cost predictions affects the project scheduling and funding, and it is often the major cause for project cost overruns. The prediction of instruments' cost is based on the statistical analysis of these independent variables: Mass (kg), Power (watts), Instrument Type, Technology Readiness Level (TRL), Destination: earth orbiting or planetary, Data rates (kbps), Number of bands, Number of channels, Design life (months), and Development duration (months). This author is proposing a cost prediction approach for aerospace instruments based on these statistical analyses: Clustering Analysis, Principal Components Analysis (PCA), Bootstrap, and multiple regressions (both linear and non-linear). In the proposed approach, the Cost Estimating Relationship (CER) will be developed for the dependent variable Instrument Cost by using a combination of multiple independent variables. "The Full Model" will be developed and executed to estimate the full set of nine variables. The SAS program, Excel, Automatic Cost Estimating Integrate Tool (ACEIT) and Minitab are the tools to aid the analysis. Through the analysis, the cost drivers will be identified, which will help develop an ultimate cost estimating software tool for instrument cost prediction and optimization of future missions.
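    Fitting a CER by multiple regression can be sketched as follows (the data, coefficients, and log-linear functional form below are invented for illustration; the abstract names the candidate variables but not a specific model form):

    ```python
    # Hedged sketch: fit a log-linear Cost Estimating Relationship (CER)
    # by ordinary least squares on synthetic instrument data. Variable
    # names follow the abstract (mass, power, design life); the "true"
    # coefficients generating the data are hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 60
    mass = rng.uniform(5, 200, n)          # kg
    power = rng.uniform(10, 500, n)        # watts
    design_life = rng.uniform(12, 120, n)  # months
    # Hypothetical generating relationship in log-cost, plus noise
    log_cost = (1.0 + 0.8 * np.log(mass) + 0.3 * np.log(power)
                + 0.002 * design_life + rng.normal(0, 0.1, n))

    # CER: log(cost) = b0 + b1*log(mass) + b2*log(power) + b3*design_life
    X = np.column_stack([np.ones(n), np.log(mass), np.log(power),
                         design_life])
    coef, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
    b0, b1, b2, b3 = coef
    print(f"mass elasticity ~ {b1:.2f}, power elasticity ~ {b2:.2f}")
    ```

    In the proposed approach, steps like clustering and PCA would precede a fit of this kind to group similar instruments and reduce collinearity among the nine candidate variables, and bootstrap resampling would quantify the uncertainty of the fitted coefficients.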

  10. PREDICTIVE MODELS. Enhanced Oil Recovery Model

    SciTech Connect

    Ray, R.M.

    1992-02-26

    PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: (1) chemical flooding, where soap-like surfactants are injected into the reservoir to wash out the oil; (2) carbon dioxide miscible flooding, where carbon dioxide mixes with the lighter hydrocarbons making the oil easier to displace; (3) in-situ combustion, which uses the heat from burning some of the underground oil to thin the product; (4) polymer flooding, where thick, cohesive material is pumped into a reservoir to push the oil through the underground rock; and (5) steamflood, where pressurized steam is injected underground to thin the oil. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs, which have been previously waterflooded to residual oil saturation. Thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide miscible flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance predictive algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes.

  11. Progress on the DPASS project

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Bogatu, I. N.; Svidzinski, V. A.

    2015-11-01

    A novel project to develop a Disruption Prediction And Simulation Suite (DPASS) of comprehensive computational tools to predict, model, and analyze disruption events in tokamaks has recently been started at FAR-TECH Inc. DPASS will eventually address the following aspects of the disruption problem: MHD, plasma edge dynamics, plasma-wall interaction, and the generation and losses of runaway electrons. DPASS uses the 3-D Disruption Simulation Code (DSC-3D) as a core tool and will have a modular structure. DSC is a one-fluid, non-linear, time-dependent 3D MHD code to simulate the dynamics of tokamak plasma surrounded by a pure vacuum B-field in the real geometry of a conducting tokamak vessel. DSC utilizes an adaptive meshless technique with adaptation to the moving plasma boundary, with accurate magnetic flux conservation and resolution of the plasma surface current. DSC also has an option to neglect the plasma inertia to eliminate the fast magnetosonic scale. This option can be turned on/off as needed. During Phase I of the project, two modules will be developed: the computational module for modeling the massive gas injection and main plasma response; and the module for nanoparticle plasma jet injection as an innovative disruption mitigation scheme. We will report on the progress of this development. Work is supported by the US DOE SBIR grant # DE-SC0013727.

  12. The CrossGrid project

    NASA Astrophysics Data System (ADS)

    Kunze, M.; CrossGrid Collaboration

    2003-04-01

    There are many large-scale problems that require new approaches to computing, such as earth observation, environmental management, biomedicine, industrial and scientific modeling. The CrossGrid project addresses realistic problems in medicine, environmental protection, flood prediction, and physics analysis and is oriented towards specific end-users: Medical doctors, who could obtain new tools to help them to obtain correct diagnoses and to guide them during operations; industries, that could be advised on the best timing for some critical operations involving risk of pollution; flood crisis teams, that could predict the risk of a flood on the basis of historical records and actual hydrological and meteorological data; physicists, who could optimize the analysis of massive volumes of data distributed across countries and continents. Corresponding applications will be based on Grid technology and could be complex and difficult to use: the CrossGrid project aims at developing several tools that will make the Grid more friendly for average users. Portals for specific applications will be designed, that should allow for easy connection to the Grid, create a customized work environment, and provide users with all necessary information to get their job done.

  13. Predicting service life margins

    NASA Technical Reports Server (NTRS)

    Egan, G. F.

    1971-01-01

    Margins are developed for equipment susceptible to malfunction due to excessive time or operation cycles, and for identifying limited life equipment so monitoring and replacing is accomplished before hardware failure. Method applies to hardware where design service is established and where reasonable expected usage prediction is made.

  14. Brightness predictions for comets

    NASA Astrophysics Data System (ADS)

    Green, Daniel W. E.; Marsden, Brian G.; Morris, Charles S.

    2001-02-01

    Daniel W E Green, Brian G Marsden and Charles S Morris write with the aim of illuminating the issue of cometary light curves and brightness predictions, following the publication in this journal last October of the letter by John McFarland (2000).

  15. Predicting Intrinsic Motivation

    ERIC Educational Resources Information Center

    Martens, Rob; Kirschner, Paul A.

    2004-01-01

    Intrinsic motivation can be predicted from participants' perceptions of the social environment and the task environment (Ryan & Deci, 2000) in terms of control, relatedness, and competence. To determine the degree of independence of these factors, 251 students in higher vocational education (physiotherapy and hotel management) indicated the extent to…

  16. Predicted airframe noise levels

    NASA Technical Reports Server (NTRS)

    Raney, J. P.

    1980-01-01

    Calculated values of airframe noise levels corresponding to FAA noise certification conditions for six aircraft are presented. The aircraft are: DC-9-30; Boeing 727-200; A300-B2 Airbus; Lockheed L-1011; DC-10-10; and Boeing 747-200B. The prediction methodology employed is described and discussed.

  17. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  18. Predicting asthma outcomes.

    PubMed

    Sears, Malcolm R

    2015-10-01

    This review addresses predictors of remission or persistence of wheezing and asthma from early childhood through adulthood. Early childhood wheezing is common, but predicting who will remit or have persistent childhood asthma remains difficult. By adding parental history of asthma and selected infant biomarkers to the history of recurrent wheezing, the Asthma Predictive Index and its subsequent modifications provide better predictions of persistence than simply the observation of recurrent wheeze. Sensitization, especially to multiple allergens, increases the likelihood of development of classic childhood asthma. Remission is more likely in male subjects and those with milder disease (less frequent and less severe symptoms), less atopic sensitization, a lesser degree of airway hyperresponsiveness, and no concomitant allergic disease. Conversely, persistence is linked strongly to allergic sensitization, greater frequency and severity of symptoms, abnormal lung function, and a greater degree of airway hyperresponsiveness. A genetic risk score might predict persistence more accurately than family history. Remission of established adult asthma is substantially less common than remission during childhood and adolescence. Loss of lung function can begin early in life and tracks through childhood and adolescence. Despite therapy which controls symptoms and exacerbations, the outcomes of asthma appear largely resistant to pharmacologic therapy.

  19. Predicting Classroom Success.

    ERIC Educational Resources Information Center

    Kessler, Ronald P.

    A study was conducted at Rancho Santiago College (RSC) to identify personal and academic factors that are predictive of students' success in their courses. The study examined the following possible predictors of success: language and math test scores; background characteristics; length of time out of high school; high school background; college…

  20. Predicting rainfall beyond tomorrow

    Technology Transfer Automated Retrieval System (TEKTRAN)

    NOAA’s Climate Prediction Center issues climate precipitation forecasts that offer potential support for water resource managers and farmers and ranchers in New Mexico, but the forecasts are frequently misunderstood and not widely used in practical decision making. The objectives of this newsletter ...

  1. Prediction method abstracts

    SciTech Connect

    1994-12-31

    This conference was held December 4-8, 1994, in Asilomar, California. The purpose of this meeting was to provide a forum for the exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence-to-fold assignment; and ab initio folding.

  2. Prediction of preterm birth.

    PubMed

    Borg, F; Gravino, G; Schembri-Wismayer, P; Calleja-Agius, J

    2013-06-01

    Preterm delivery is birth occurring before 37 completed weeks of gestation. Preterm birth is the primary cause of morbidity and mortality in children especially if this occurs before 34 weeks of gestation. If preterm birth could be predicted and treated accordingly, this would greatly reduce mortality, morbidity and associated costs. There have been many attempts to develop an accurate and efficient method to predict preterm premature rupture of membranes (PPROM) and preterm labor that leads to spontaneous preterm birth (SPB). However, the initial signs and symptoms are most often mild and may even occur in normal pregnancies, making early detection rather difficult. The aim of this paper is to provide an overview of the current methods employed in predicting preterm birth occurring due to preterm labor and PPROM. Among these methods are risk scoring systems, cervical/vaginal screening for fetal fibronectin, cervical assessment by ultrasonography, uterine activity monitoring, biomarkers such as endocrine factors, cytokines and enzymes, fetal DNA and genetic polymorphism. SPB is multifactorial, and so it is highly unlikely that a single test can accurately predict SPB. A combination of biological markers is also reviewed in the estimation of the risk of preterm delivery.

  3. Can You Predict?

    ERIC Educational Resources Information Center

    Brown, William R.

    1977-01-01

    Describes a variation of "the suffocating candle" activity used to develop the process of predicting based on reliable data. Instead of using jars of varying sizes under which the burning time of candles is measured, the same jar is used while the candle is elevated on varying numbers of blocks. (CS)

  4. Predictability of critical transitions

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaozhu; Kuehn, Christian; Hallerberg, Sarah

    2015-11-01

    Critical transitions in multistable systems have been discussed as models for a variety of phenomena ranging from the extinctions of species to socioeconomic changes and climate transitions between ice ages and warm ages. From bifurcation theory we can expect certain critical transitions to be preceded by a decreased recovery from external perturbations. The consequences of this critical slowing down have been observed as an increase in variance and autocorrelation prior to the transition. However, especially in the presence of noise, it is not clear whether these changes in observation variables are statistically relevant such that they could be used as indicators for critical transitions. In this contribution we investigate the predictability of critical transitions in conceptual models. We study the quadratic integrate-and-fire model and the van der Pol model under the influence of external noise. We focus especially on the statistical analysis of the success of predictions and the overall predictability of the system. The performance of different indicator variables turns out to be dependent on the specific model under study and the conditions of accessing it. Furthermore, we study the influence of the magnitude of transitions on the predictive performance.
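
    The variance and lag-1 autocorrelation indicators discussed above can be computed over a sliding window. The sketch below uses a toy AR(1)-like process whose recovery rate decays to zero (a stand-in for the noisy conceptual models in the abstract, not the authors' actual code) to show both indicators rising as the transition approaches:

```python
import numpy as np

def early_warning_indicators(x, window):
    """Sliding-window variance and lag-1 autocorrelation of a time series.

    Rising trends in both are the classic 'critical slowing down' signals.
    """
    n = len(x) - window + 1
    var = np.empty(n)
    ac1 = np.empty(n)
    for i in range(n):
        w = x[i:i + window]
        var[i] = np.var(w)
        w0 = w - w.mean()
        ac1[i] = np.dot(w0[:-1], w0[1:]) / np.dot(w0, w0)
    return var, ac1

# Toy process approaching a transition: the recovery rate lam decays to zero,
# so perturbations are damped more and more slowly.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
lam = 1.0 - t
x = np.zeros_like(t)
for k in range(1, len(t)):
    x[k] = x[k - 1] - 0.05 * lam[k] * x[k - 1] + 0.1 * rng.standard_normal()

var, ac1 = early_warning_indicators(x, 200)
# Both indicators should trend upward toward the end of the series.
```

The window length trades noise in the estimates against time resolution; the abstract's point is precisely that, in the presence of noise, whether such trends are statistically meaningful depends on the model and how it is observed.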

  5. Predicting Future Citation Behavior.

    ERIC Educational Resources Information Center

    Burrell, Quentin L.

    2003-01-01

    Develops the theory for a stochastic model for the citation process in the presence of obsolescence to predict the future citation pattern of individual papers in a collection. Shows that the expected number of future citations is a linear function of the current number, interpreted as an example of a success-breeds-success phenomenon. (Author/LRW)

  6. Predicting Systemic Confidence

    ERIC Educational Resources Information Center

    Falke, Stephanie Inez

    2009-01-01

    Using a mixed method approach, this study explored which educational factors predicted systemic confidence in master's level marital and family therapy (MFT) students, and whether or not the impact of these factors was influenced by student beliefs and their perception of their supervisor's beliefs about the value of systemic practice. One hundred…

  7. Prediction of Reading Success.

    ERIC Educational Resources Information Center

    Hirst, Wilma E.

    The findings of a 3-year longitudinal research study to investigate predictive instruments for beginning reading achievement are reported. The original sample consisted of 300 kindergarten children from three socioeconomic attendance areas in the Cheyenne, Wyoming, public schools. For the final evaluation at the end of grade 2, only those pupils…

  8. Predicting visibility of aircraft.

    PubMed

    Watson, Andrew; Ramirez, Cesar V; Salud, Ellen

    2009-05-20

    Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration.

  9. PERVAPORATION PERFORMANCE PREDICTION SOFTWARE

    EPA Science Inventory

    The Pervaporation Performance Prediction Software and Database (PPPS&D) computer software program is currently being developed within the USEPA, NRMRL. The purpose of the PPPS&D program is to educate and assist potential users in identifying opportunities for using pervaporati...

  10. Predicting Visibility of Aircraft

    PubMed Central

    Watson, Andrew; Ramirez, Cesar V.; Salud, Ellen

    2009-01-01

    Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration. PMID:19462007

  11. Predicting Reasoning from Memory

    ERIC Educational Resources Information Center

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  12. Inflation of Conditional Predictions

    ERIC Educational Resources Information Center

    Koriat, Asher; Fiedler, Klaus; Bjork, Robert A.

    2006-01-01

    The authors report 7 experiments indicating that conditional predictions--the assessed probability that a certain outcome will occur given a certain condition--tend to be markedly inflated. The results suggest that this inflation derives in part from backward activation in which the target outcome highlights aspects of the condition that are…

  13. Optimising Impact in Astronomy for Development Projects

    NASA Astrophysics Data System (ADS)

    Grant, Eli

    2015-08-01

    Positive outcomes in the fields of science education and international development are notoriously difficult to achieve. Among the challenges facing projects that use astronomy to improve education and socio-economic development is how to optimise project design in order to achieve the greatest possible benefits. Over the past century, medical scientists, along with statisticians and economists, have developed an increasingly sophisticated and scientific approach to designing, testing, and improving social intervention and public health education strategies. This talk offers a brief review of the history and current state of 'intervention science'. A similar framework is then proposed for astronomy outreach and education projects, with applied examples given of how existing evidence can be used to inform project design, predict and estimate cost-effectiveness, minimise the risk of unintended negative consequences, and increase the likelihood of target outcomes being achieved.

  14. Human genetics: international projects and personalized medicine.

    PubMed

    Apellaniz-Ruiz, Maria; Gallego, Cristina; Ruiz-Pinto, Sara; Carracedo, Angel; Rodríguez-Antona, Cristina

    2016-03-01

    In this article, we present the progress driven by the recent technological advances and new revolutionary massive sequencing technologies in the field of human genetics. We discuss this knowledge in relation with drug response prediction, from the germline genetic variation compiled in the 1000 Genomes Project or in the Genotype-Tissue Expression project, to the phenome-genome archives, the international cancer projects, such as The Cancer Genome Atlas or the International Cancer Genome Consortium, and the epigenetic variation and its influence in gene expression, including the regulation of drug metabolism. This review is based on the lectures presented by the speakers of the Symposium "Human Genetics: International Projects & New Technologies" from the VII Conference of the Spanish Pharmacogenetics and Pharmacogenomics Society, held on the 20th and 21st of April 2015.

  15. Elective Program Projects

    ERIC Educational Resources Information Center

    Estrada, Christelle

    1976-01-01

    Outlined is an interdisciplinary program in Ecology and Oceanography for grades six through eight. Numerous student projects are suggested in the outline and the course requirements and the project system are explained. (MA)

  16. Number projection method

    SciTech Connect

    Kaneko, K.

    1987-02-01

    A relationship between the number projection and the shell model methods is investigated in the case of a single-j shell. We can find a one-to-one correspondence between the number projected and the shell model states.

  17. Venezuela's Bolivarian Schools Project.

    ERIC Educational Resources Information Center

    Diaz, Maria Magnolia Santamaria

    2002-01-01

    Discusses efforts by the Venezuelan government to improve the nation's school infrastructure through the Bolivarian Schools Project administered by the Ministry of Education, Culture and Sport. The project set educational principles which are guiding current school building efforts. (EV)

  18. The Alzheimer's Project

    MedlinePlus

    The Alzheimer's Project: A 4-Part Documentary Series Starting May ... (MedlinePlus magazine, Spring 2009 issue)

  19. Radiation Effects: Core Project

    NASA Technical Reports Server (NTRS)

    Dicello, John F.

    1999-01-01

    methods and predictions which are being used to assess the levels of risks to be encountered and to evaluate appropriate strategies for countermeasures. Although the work in this project is primarily directed toward problems associated with space travel, the problem of protracted exposures to low-levels of radiation is one of national interest in our energy and defense programs, and the results may suggest new paradigms for addressing such risks.

  20. Uranium Pyrophoricity Phenomena and Prediction

    SciTech Connect

    DUNCAN, D.R.

    2000-04-20

    We have compiled a topical reference on the phenomena, experiences, experiments, and prediction of uranium pyrophoricity for the Hanford Spent Nuclear Fuel Project (SNFP) with specific applications to SNFP process and situations. The purpose of the compilation is to create a reference to integrate and preserve this knowledge. Decades ago, uranium and zirconium fires were commonplace at Atomic Energy Commission facilities, and good documentation of experiences is surprisingly sparse. Today, these phenomena are important to site remediation and analysis of packaging, transportation, and processing of unirradiated metal scrap and spent nuclear fuel. Our document, bearing the same title as this paper, will soon be available in the Hanford document system [Plys, et al., 2000]. This paper explains general content of our topical reference and provides examples useful throughout the DOE complex. Moreover, the methods described here can be applied to analysis of potentially pyrophoric plutonium, metal, or metal hydride compounds provided that kinetic data are available. A key feature of this paper is a set of straightforward equations and values that are immediately applicable to safety analysis.

  1. GHPsRUS Project

    DOE Data Explorer

    Battocletti, Liz

    2013-07-09

    The GHPsRUS Project's full name is "Measuring the Costs and Benefits of Nationwide Geothermal Heat Pump Deployment." The dataset contains employment and installation price data collected by four economic surveys: (1) the GHPsRUS Project Manufacturer & OEM Survey, (2) the GHPsRUS Project Geothermal Loop Survey, (3) the GHPsRUS Project Mechanical Equipment Installation Survey, and (4) the GHPsRUS Geothermal Heat Pump Industry Survey.

  2. Predicting Major Solar Eruptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-05-01

    Coronal mass ejections (CMEs) and solar flares are two examples of major explosions from the surface of the Sun, but they're not the same thing, and they don't have to happen at the same time. A recent study examines whether we can predict which solar flares will be closely followed by larger-scale CMEs.

    [Image: a solar flare from May 2013, as captured by NASA's Solar Dynamics Observatory. NASA/SDO]

    Flares as a Precursor? A solar flare is a localized burst of energy and X-rays, whereas a CME is an enormous cloud of magnetic flux and plasma released from the Sun. We know that some magnetic activity on the surface of the Sun triggers both a flare and a CME, whereas other activity triggers only a confined flare with no CME. But what makes the difference? Understanding this can help us learn about the underlying physical drivers of flares and CMEs. It also might help us to better predict when a CME, which can pose a risk to astronauts, disrupt radio transmissions, and cause damage to satellites, might occur. In a recent study, Monica Bobra and Stathis Ilonidis (Stanford University) attempt to improve our ability to make these predictions by using a machine-learning algorithm.

    Classification by Computer. [Figure: using a combination of 6 or more features results in much better predictive success (measured by the True Skill Statistic; a higher positive value means a better prediction) for whether a flare will be accompanied by a CME. Bobra & Ilonidis 2016] Bobra and Ilonidis used magnetic-field data from an instrument on the Solar Dynamics Observatory to build a catalog of solar flares, 56 of which were accompanied by a CME and 364 of which were not. The catalog includes information about 18 different features associated with the photospheric magnetic field of each flaring active region (for example, the mean gradient of the horizontal magnetic field). The authors apply a machine-learning algorithm known as a binary classifier to this catalog. This algorithm tries to predict, given a set of features
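
    The True Skill Statistic used to score the classifier above is a standard verification metric for binary forecasts: the hit rate minus the false-alarm rate. A minimal sketch, using a made-up confusion matrix (the 56/364 class split matches the catalog sizes in the abstract, but the individual counts here are hypothetical, not the study's results):

```python
def true_skill_statistic(tp, fn, fp, tn):
    """TSS = hit rate - false-alarm rate; +1 is perfect, 0 is no skill, -1 is
    perfectly wrong. Unlike accuracy, it is insensitive to class imbalance."""
    return tp / (tp + fn) - fp / (fp + tn)

# Hypothetical outcomes for a flare-will-produce-a-CME classifier:
# 56 flares with CMEs (45 caught, 11 missed), 364 without (40 false alarms).
tss = true_skill_statistic(tp=45, fn=11, fp=40, tn=324)
```

Insensitivity to class imbalance is why TSS is a common choice in flare and CME forecasting, where positive events are rare.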

  3. System Alternatives Project

    ERIC Educational Resources Information Center

    Petrait, James A.

    1977-01-01

    The Systems Alternatives Project is an attempt to develop open classroom alternatives within a modular scheduling system. Biology students are given both action and test objectives that emphasize individualization. Structure of the project is detailed and an attempt to analyze the project evaluation data statistically is included. (MA)

  4. The Sidewalk Project

    ERIC Educational Resources Information Center

    Church, William

    2005-01-01

    In this article, the author features "the sidewalk project" in Littleton High School. The sidewalk project is a collaboration of more than 40 high school physics students, 10 local mentors, and a few regional and national organizations who worked together to invent a way to heat a sidewalk with an alternative energy source. The project, which…

  5. The Proposal Project

    ERIC Educational Resources Information Center

    Pierce, Elizabeth

    2007-01-01

    The proposal project stretches over a significant portion of the semester-long sophomore course Professional Communication (ENG 250) at Monroe Community College. While developing their proposal project, students need to use time management skills to successfully complete a quality project on time. In addition, excellent oral and written…

  6. Project ASTRO: A Partnership.

    ERIC Educational Resources Information Center

    Rothenberger, Lisa

    2001-01-01

    Describes a project that enriches astronomy lessons with hands-on activities facilitated by an astronomer. The project links professional and amateur astronomers with middle-level classroom teachers and informal educators. Families and community organizations are also involved in the project. Provides information on how to join the ASTRO network.…

  7. Kansas Advanced Semiconductor Project

    SciTech Connect

    Baringer, P.; Bean, A.; Bolton, T.; Horton-Smith, G.; Maravin, Y.; Ratra, B.; Stanton, N.; von Toerne, E.; Wilson, G.

    2007-09-21

    KASP (Kansas Advanced Semiconductor Project) completed the new Layer 0 upgrade for D0, assumed key electronics projects for the US CMS project, finished important new physics measurements with the D0 experiment at Fermilab, made substantial contributions to detector studies for the proposed e+e- international linear collider (ILC), and advanced key initiatives in non-accelerator-based neutrino physics.

  8. Of Principals and Projects.

    ERIC Educational Resources Information Center

    Wyant, Spencer H.; And Others

    Principals play an important role in the success of externally funded change projects in their schools. Interviews exploring the participation of principals in such projects in 14 Oregon elementary and secondary schools provided 11 case studies illustrating helpful and unhelpful behaviors. The projects were found to have life cycles of their own,…

  9. THE ATLANTA SUPERSITE PROJECT

    EPA Science Inventory

    The Atlanta Supersites project is the first of two Supersites projects to be established during Phase I of EPA's Supersites Program; Phase II is being established through a Request for Assistance. The other initial project is in Fresno, California. The Supersites Program is par...

  10. Visible Human Project

    MedlinePlus

    The Visible Human Project® is an outgrowth of the NLM's 1986 Long- ... The long-term goal of the Visible Human Project® is to produce a system of knowledge structures ...

  11. Projection: A Bibliography.

    ERIC Educational Resources Information Center

    Pedrini, D. T.; Pedrini, Bonnie C.

    Sigmund Freud and his associates did much clinical work with the dynamic of projection, especially with regard to paranoid symptoms and syndromes. Much experimental work has also been done with projection. Sears evaluated the results of some of those studies. Murstein and Pryer sub-classified projection and reviewed typical studies. The…

  12. The Llama Project.

    ERIC Educational Resources Information Center

    Ganzel, Candy; Stuglik, Jan

    2003-01-01

    At a suburban Indiana elementary school, the Project Approach serves as the basis of the curriculum in all Kindergarten classrooms. The four classes of 5- and 6-year-old children at this school chose to study llamas. This article discusses how the project evolved, describes the three phases of the project, and provides teachers' reflections on the…

  13. Little River Project.

    ERIC Educational Resources Information Center

    Naisbitt, Ian

    1995-01-01

    Describes the adoption of an old riverside landfill by an elementary school as a Habitat 2000 community project. Contains a "how-to" checklist for such a project, information on building school-community community partnerships, and promotional ideas for stewardship projects. (LZ)

  14. Humane Education Projects Handbook.

    ERIC Educational Resources Information Center

    Junior League of Ogden, UT.

    This handbook was developed to promote interest in humane education and to encourage the adoption of humane education projects. Although specifically designed to assist Junior Leagues in developing such projects, the content should prove valuable to animal welfare organizations, zoos, aquariums, nature centers, and other project-oriented groups…

  15. The Eggen Card Project

    NASA Astrophysics Data System (ADS)

    Silvis, G.

    2014-06-01

    (Abstract only) Olin Eggen, noted astronomer (1919-1998), left to us all his raw observation records recorded on 3x5 cards. This project is to make all this data available as an online resource. History and progress of the project will be presented. Project details available at: https://sites.google.com/site/eggencards/home.

  16. Tips for Project Management.

    ERIC Educational Resources Information Center

    Thornley, James

    1996-01-01

    Presents ideas regarding instructional design project management for performance technologists that are likely to increase client satisfaction. Topics include a project kickoff summary; maintaining communication, including submitting reports, talking to the client, addressing problems quickly, and follow-up; and closing the project effectively.…

  17. Data driven propulsion system weight prediction model

    NASA Technical Reports Server (NTRS)

    Gerth, Richard J.

    1994-01-01

    The objective of the research was to develop a method to predict the weight of paper engines, i.e., engines that are in the early stages of development. The impetus for the project was the Single Stage To Orbit (SSTO) project, where engineers need to evaluate alternative engine designs. Since the SSTO is a performance-driven project, the performance models for alternative designs were well understood. The next tradeoff is weight. Since it is known that engine weight varies with thrust level, a model is required that can discriminate between engines that produce the same thrust. Above all, the model had to be rooted in data, with assumptions that could be justified based on the data. The general approach was to collect data on as many existing engines as possible and build a statistical model of engine weight as a function of various component performance parameters. This was considered a reasonable level at which to begin the project because the data would be readily available, and it would be at the level of most paper engines, prior to detailed component design.
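
    A minimal sketch of the kind of statistical model described, fitting weight as a log-linear function of component performance parameters; the engine numbers and choice of predictors below are hypothetical illustrations, not the project's actual database or regression:

```python
import numpy as np

# Hypothetical engine data: thrust (kN), chamber pressure (bar), weight (kg).
thrust = np.array([ 490., 1030., 1840., 2090., 3560.])
pc     = np.array([  70.,  100.,  200.,  110.,  230.])
weight = np.array([ 450.,  950., 1440., 1760., 3180.])

# Log-linear fit: log W = b0 + b1*log T + b2*log Pc, via least squares.
# Working in log space makes the model multiplicative, which suits
# quantities that scale over orders of magnitude.
X = np.column_stack([np.ones_like(thrust), np.log(thrust), np.log(pc)])
coef, *_ = np.linalg.lstsq(X, np.log(weight), rcond=None)

def predict_weight(t, p):
    """Predicted weight (kg) for thrust t (kN) and chamber pressure p (bar)."""
    return float(np.exp(coef[0] + coef[1] * np.log(t) + coef[2] * np.log(p)))
```

With real data, the extra regressors (pump power, expansion ratio, cycle type, etc.) are what let the model discriminate between engines of equal thrust, which is exactly the gap the abstract identifies.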

  18. Pioneering Heat Pump Project

    SciTech Connect

    Aschliman, Dave; Lubbehusen, Mike

    2015-06-30

    This project was initiated at a time when ground-coupled heat pump systems in this region were limited in size and quantity, and economic pressures from natural gas and electric utility costs had many organizations considering ground-coupled heat pumps. The research has added to the understanding of how ground temperatures fluctuate seasonally and how this affects the performance and operation of the heat pumps. This was done by using a series of temperature sensors buried within the middle of one of the vertical bore fields, with sensors located at various depths below grade. Trending of the data showed that there is a lag in ground temperature with respect to air temperature in the shoulder months; however, as the full cooling and heating seasons arrive, heat rejection to and heat extraction from the ground have a significant effect on ground temperatures. Additionally, it is now better understood that while a large community geothermal bore field serving multiple buildings provides a convenient central plant, it introduces the complexity of not being able to easily model and predict how each building will contribute to the loads in real time. Additional controllers and programming were added to provide more insight into this real-time load profile and to allow intelligent shedding of load via a dry cooler during cool nights in lieu of rejecting heat to the ground loop; this serves as a means to 'condition' the ground loop and mitigate the thermal creep of the field that is typically observed. Finally, it has been observed that, compared to traditional heating and cooling equipment, there is still a cost premium for ground-source heat pumps, driven mostly by the cost of vertical bore holes. Horizontal loop systems are less costly to install but do not perform as well in this climate zone in heating mode.

  19. Underestimation of Project Costs

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Large projects almost always exceed their budgets. Estimating cost is difficult and estimated costs are usually too low. Three different reasons are suggested: bad luck, overoptimism, and deliberate underestimation. Project management can usually point to project difficulty and complexity, technical uncertainty, stakeholder conflicts, scope changes, unforeseen events, and other not really unpredictable bad luck. Project planning is usually over-optimistic, so the likelihood and impact of bad luck is systematically underestimated. Project plans reflect optimism and hope for success in a supposedly unique new effort rather than rational expectations based on historical data. Past project problems are claimed to be irrelevant because "This time it's different." Some bad luck is inevitable and reasonable optimism is understandable, but deliberate deception must be condemned. In a competitive environment, project planners and advocates often deliberately underestimate costs to help gain project approval and funding. Project benefits, cost savings, and probability of success are exaggerated and key risks ignored. Project advocates have incentives to distort information and conceal difficulties from project approvers. One naively suggested cure is more openness, honesty, and group adherence to shared overall goals. A more realistic alternative is threatening overrun projects with cancellation. Neither approach seems to solve the problem. A better method to avoid the delusions of over-optimism and the deceptions of biased advocacy is to base the project cost estimate on the actual costs of a large group of similar projects. Overoptimism and deception can continue beyond the planning phase and into project execution. Hard milestones based on verified tests and demonstrations can provide a reality check.
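    The suggested remedy of basing an estimate on the actual costs of a large group of similar projects (reference-class forecasting) can be sketched as an empirical-quantile uplift. The cost-growth ratios and the chosen quantile below are illustrative, not data from the paper:

```python
import statistics

# Hypothetical cost-growth ratios (actual cost / estimated cost) observed in
# a reference class of completed projects.
cost_growth = [1.10, 1.45, 1.25, 2.10, 1.05, 1.60, 1.30, 1.80, 1.15, 1.40]

def reference_class_estimate(bottom_up_estimate, ratios, quantile=0.8):
    """Uplift a bottom-up estimate by the empirical quantile of past overruns,
    so that the budget is sufficient for e.g. 80% of comparable projects."""
    uplift = statistics.quantiles(ratios, n=100)[int(quantile * 100) - 1]
    return bottom_up_estimate * uplift

est = reference_class_estimate(100.0, cost_growth)  # budget for a $100M plan
```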

  20. Projecting future sea level

    USGS Publications Warehouse

    Cayan, Daniel R.; Bromirski, Peter; Hayhoe, Katharine; Tyree, Mary; Dettinger, Mike; Flick, Reinhard

    2006-01-01

    California’s coastal observations and global model projections indicate that California’s open coast and estuaries will experience increasing sea levels over the next century. Sea level rise has affected much of the coast of California, including the Southern California coast, the Central California open coast, and the San Francisco Bay and upper estuary. These trends, quantified from a small set of California tide gages, have ranged from 10–20 centimeters (cm) (3.9–7.9 inches) per century, quite similar to that estimated for global mean sea level. So far, there is little evidence that the rate of rise has accelerated, and the rate of rise at California tide gages has actually flattened since 1980, but projections suggest substantial sea level rise may occur over the next century. Climate change simulations project a substantial rate of global sea level rise over the next century due to thermal expansion as the oceans warm and runoff from melting land-based snow and ice accelerates. Sea level rise projected from the models increases with the amount of warming. Relative to sea levels in 2000, by the 2070–2099 period, sea level rise projections range from 11–54 cm (4.3–21 in) for simulations following the lower (B1) greenhouse gas (GHG) emissions scenario, from 14–61 cm (5.5–24 in) for the middle-upper (A2) emission scenario, and from 17–72 cm (6.7–28 in) for the highest (A1fi) scenario. In addition to relatively steady secular trends, sea levels along the California coast undergo shorter period variability above or below predicted tide levels and changes associated with long-term trends. These variations are caused by weather events and by seasonal to decadal climate fluctuations over the Pacific Ocean that in turn affect the Pacific coast. Highest coastal sea levels have occurred when winter storms and Pacific climate disturbances, such as El Niño, have coincided with high astronomical tides. This study considers a range of projected future

  1. Predictive spark timing method

    SciTech Connect

    Tang, D.L.; Chang, M.F.; Sultan, M.C.

    1990-01-09

    This patent describes a method of determining spark time in a spark timing system of an internal combustion engine having a plurality of cylinders and a spark period for each cylinder in which a spark occurs. It comprises: generating at least one crankshaft position reference pulse for each spark firing event, the reference pulse nearest the next spark being set to occur within a same cylinder event as the next spark; measuring at least two reference periods between recent reference pulses; calculating the spark timing synchronously with crankshaft position by performing the calculation upon receipt of the reference pulse nearest the next spark; predicting the engine speed for the next spark period from at least two reference periods including the most recent reference period; and based on the predicted speed, calculating a spark time measured from the reference pulse nearest the next spark.
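    The prediction step can be illustrated with the simplest possible scheme: linearly extrapolating the next reference period from the two most recent ones. The function names and the fractional-advance mapping are hypothetical simplifications, not the patent's actual formulation:

```python
def predict_next_period(t_prev, t_curr):
    """Linearly extrapolate the next reference period (ms) from the two
    most recent measured periods (a stand-in for the prediction step)."""
    return t_curr + (t_curr - t_prev)

def spark_delay(period_predicted, spark_advance_fraction):
    """Time from the reference pulse to the spark, expressed as a fraction
    of the predicted period (illustrative; the real mapping depends on the
    desired spark advance angle and sensor geometry)."""
    return period_predicted * spark_advance_fraction

# Example: engine decelerating, periods growing from 10.0 ms to 10.5 ms.
pred = predict_next_period(10.0, 10.5)   # extrapolated next period
delay = spark_delay(pred, 0.25)          # spark time after reference pulse
```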

  2. Predicting catastrophic shifts.

    PubMed

    Weissmann, Haim; Shnerb, Nadav M

    2016-05-21

    Catastrophic shifts are known to pose a serious threat to ecology, and a reliable set of early warning indicators is desperately needed. However, the tools suggested so far have two problems. First, they cannot discriminate between a smooth transition and an imminent irreversible shift. Second, they aim at predicting the tipping point where a state loses its stability, but in noisy spatial systems the actual transition occurs when an alternative state invades. Here we suggest a cluster tracking technique that solves both problems, distinguishing between smooth and catastrophic transitions and identifying an imminent shift in both cases. Our method may allow for the prediction, and thus hopefully the prevention, of such transitions, avoiding their destructive outcomes. PMID:26970446

  3. Prediction of Antibody Epitopes.

    PubMed

    Nielsen, Morten; Marcatili, Paolo

    2015-01-01

    Antibodies recognize their cognate antigens in a precise and effective way. In order to do so, they target regions of the antigenic molecules that have specific features such as large exposed areas, presence of charged or polar atoms, specific secondary structure elements, and lack of similarity to self-proteins. Given the sequence or the structure of a protein of interest, several methods exploit such features to predict the residues that are more likely to be recognized by an immunoglobulin. Here, we present two methods (BepiPred and DiscoTope) to predict linear and discontinuous antibody epitopes from the sequence and/or the three-dimensional structure of a target protein. PMID:26424260

  4. Predictive Temperature Equations for Three Sites at the Grand Canyon

    NASA Astrophysics Data System (ADS)

    McLaughlin, Katrina Marie Neitzel

    Climate data collected at a number of automated weather stations were used to create a series of predictive equations spanning from December 2009 to May 2010 in order to better predict the temperatures along hiking trails within the Grand Canyon. The central focus of this project is how atmospheric variables interact and can be combined to predict the weather in the Grand Canyon at the Indian Gardens, Phantom Ranch, and Bright Angel sites. Through the use of statistical analysis software and data regression, predictive equations were determined. The predictive equations are simple or multivariable best fits that reflect the curvilinear nature of the data. With data analysis software, curves resulting from the predictive equations were plotted along with the observed data. Each equation's reduced chi-squared was determined to aid the visual examination of the predictive equations' ability to reproduce the observed data. From this information an equation or pair of equations was determined to be the best of the predictive equations. Although a best predictive equation for each month and season was determined for each site, future work may refine the equations to produce a more accurate predictive equation.
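    The reduced chi-squared check described above can be sketched as follows. The temperatures, the assumed measurement uncertainty, and the quadratic form are illustrative stand-ins for the thesis's actual fits:

```python
import numpy as np

# Hypothetical hourly temperatures (deg C) at one canyon site.
hours = np.array([0, 3, 6, 9, 12, 15, 18, 21], dtype=float)
observed = np.array([8.0, 6.5, 7.0, 14.0, 20.0, 22.5, 18.0, 11.0])
sigma = 1.5  # assumed per-point measurement uncertainty (deg C)

# Quadratic best fit, standing in for the 'simple or multivariable best fits'.
coeffs = np.polyfit(hours, observed, deg=2)
predicted = np.polyval(coeffs, hours)

# Reduced chi-squared: chi2 per degree of freedom
# (n data points minus p fitted parameters).
dof = len(hours) - (2 + 1)
chi2_red = float(np.sum(((observed - predicted) / sigma) ** 2) / dof)
```

    A reduced chi-squared near 1 indicates the equation reproduces the observations to within the assumed uncertainty; values much larger flag a poor fit.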

  5. Coating life prediction

    NASA Technical Reports Server (NTRS)

    Nesbitt, James A.; Gedwill, Michael A.

    1985-01-01

    The investigation combines both experimental studies and numerical modeling to predict coating life in an oxidizing environment. The experimental work provides both input to and verification of two numerical models. The coatings being examined are an aluminide coating on Udimet 700 (U-700), a low-pressure plasma spray (LPPS) Ni-18Co-17Cr-24Al-0.2Y overlay coating also on U-700, and bulk deposits of the LPPS NiCoCrAlY coating.

  6. Predicting appointment breaking.

    PubMed

    Bean, A G; Talaga, J

    1995-01-01

    The goal of physician referral services is to schedule appointments, but if too many patients fail to show up, the value of the service will be compromised. The authors found that appointment breaking can be predicted by the number of days to the scheduled appointment, the doctor's specialty, and the patient's age and gender. They also offer specific suggestions for modifying the marketing mix to reduce the incidence of no-shows. PMID:10142384

  7. Predicting Individual Fuel Economy

    SciTech Connect

    Lin, Zhenhong; Greene, David L

    2011-01-01

    To make informed decisions about travel and vehicle purchase, consumers need unbiased and accurate information about the fuel economy they will actually obtain. In the past, the EPA fuel economy estimates based on its 1984 rules have been widely criticized for overestimating on-road fuel economy. In 2008, EPA adopted a new estimation rule. This study compares the usefulness of the EPA's 1984 and 2008 estimates based on their prediction bias and accuracy and attempts to improve the prediction of on-road fuel economies based on consumer and vehicle attributes. We examine the usefulness of the EPA fuel economy estimates using a large sample of self-reported on-road fuel economy data and develop an Individualized Model for more accurately predicting an individual driver's on-road fuel economy based on easily determined vehicle and driver attributes. Accuracy rather than bias appears to have limited the usefulness of the EPA 1984 estimates in predicting on-road MPG. The EPA 2008 estimates appear to be equally inaccurate and substantially more biased relative to the self-reported data. Furthermore, the 2008 estimates exhibit an underestimation bias that increases with increasing fuel economy, suggesting that the new numbers will tend to underestimate the real-world benefits of fuel economy and emissions standards. By including several simple driver and vehicle attributes, the Individualized Model reduces the unexplained variance by over 55% and the standard error by 33% based on an independent test sample. The additional explanatory variables can be easily provided by the individuals.
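    The distinction the study draws between bias and accuracy can be made concrete with paired label and self-reported values. The MPG numbers below are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical paired data: EPA label MPG vs. self-reported on-road MPG.
label_mpg = np.array([22.0, 25.0, 30.0, 35.0, 40.0, 45.0])
onroad_mpg = np.array([23.0, 26.5, 31.0, 37.5, 43.0, 49.0])

err = label_mpg - onroad_mpg
bias = float(err.mean())                  # negative => labels underestimate
rmse = float(np.sqrt((err ** 2).mean()))  # accuracy: overall spread of errors
```

    An estimate can be unbiased yet inaccurate (bias near zero, large RMSE) or, as the abstract argues for the 2008 numbers, both biased and inaccurate.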

  8. Predictive Game Theory

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy, one must use a loss function external to the game's players. The theory provides a quantification of any strategy's rationality, and proves that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  9. Multivariate respiratory motion prediction

    NASA Astrophysics Data System (ADS)

    Dürichen, R.; Wissel, T.; Ernst, F.; Schlaefer, A.; Schweikard, A.

    2014-10-01

    In extracranial robotic radiotherapy, tumour motion is compensated by tracking external and internal surrogates. To compensate system specific time delays, time series prediction of the external optical surrogates is used. We investigate whether the prediction accuracy can be increased by expanding the current clinical setup by an accelerometer, a strain belt and a flow sensor. Four previously published prediction algorithms are adapted to multivariate inputs—normalized least mean squares (nLMS), wavelet-based least mean squares (wLMS), support vector regression (SVR) and relevance vector machines (RVM)—and evaluated for three different prediction horizons. The measurement involves 18 subjects and consists of two phases, focusing on long term trends (M1) and breathing artefacts (M2). To select the most relevant and least redundant sensors, a sequential forward selection (SFS) method is proposed. Using a multivariate setting, the results show that the clinically used nLMS algorithm is susceptible to large outliers. In the case of irregular breathing (M2), the mean root mean square error (RMSE) of a univariate nLMS algorithm is 0.66 mm and can be decreased to 0.46 mm by a multivariate RVM model (best algorithm on average). To investigate the full potential of this approach, the optimal sensor combination was also estimated on the complete test set. The results indicate that a further decrease in RMSE is possible for RVM (to 0.42 mm). This motivates further research about sensor selection methods. Besides the optical surrogates, the sensors most frequently selected by the algorithms are the accelerometer and the strain belt. These sensors could be easily integrated in the current clinical setup and would allow a more precise motion compensation.
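    A minimal sketch of the normalized least mean squares (nLMS) predictor that serves as the clinical baseline above. The filter order, step size, and sine-wave surrogate are illustrative assumptions; for clarity the weights here update as soon as each true sample arrives, whereas a strictly causal system would freeze the weights for the length of the prediction horizon:

```python
import numpy as np

def nlms_predict(signal, order=4, horizon=5, mu=0.5, eps=1e-6):
    """Predict signal[t] 'horizon' samples ahead from the 'order' samples
    known at time t - horizon, adapting weights by normalized LMS."""
    w = np.zeros(order)
    preds = np.full(len(signal), np.nan)
    for t in range(order + horizon, len(signal)):
        x = signal[t - horizon - order:t - horizon][::-1]  # newest first
        preds[t] = w @ x                 # horizon-ahead prediction
        e = signal[t] - preds[t]         # error once signal[t] arrives
        w += mu * e * x / (eps + x @ x)  # normalized gradient step
    return preds

time = np.arange(400) * 0.1
sig = np.sin(time)                        # idealized breathing surrogate
pred = nlms_predict(sig)
valid = ~np.isnan(pred)
rmse = float(np.sqrt(np.mean((pred[valid] - sig[valid]) ** 2)))
```

    A multivariate version, as evaluated in the paper, stacks samples from several sensors (e.g. accelerometer, strain belt) into the input vector x.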

  10. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature describing the features of the various models in a systematic manner.

  11. Modeling Success in FLOSS Project Groups

    SciTech Connect

    Beaver, Justin M; Cui, Xiaohui; ST Charles, Jesse Lee; Potok, Thomas E

    2009-01-01

    A significant challenge in software engineering is accurately modeling projects in order to correctly forecast success or failure. The primary difficulty is that software development efforts are complex in terms of both the technical and social aspects of the engineering environment. This is compounded by the lack of real data that captures both the measures of success in performing a process, and the measures that reflect a group's social dynamics. This research focuses on the development of a model for predicting software project success that leverages the wealth of available open source project data in order to accurately model the behavior of those software engineering groups. Our model accounts for both the technical elements of software engineering as well as the social elements that drive the decisions of individual developers. We use agent-based simulations to represent the complexity of the group interactions, and base the behavior of the agents on the real software engineering data acquired. For four of the five project success measures, our results indicate that the developed model represents the underlying data well and provides accurate predictions of open source project success indicators.

  12. Asian summer monsoon rainfall predictability: a predictable mode analysis

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Lee, June-Yi; Xiang, Baoqiang

    2015-01-01

    To what extent the Asian summer monsoon (ASM) rainfall is predictable has been an important but long-standing issue in climate science. Here we introduce a predictable mode analysis (PMA) method to estimate predictability of the ASM rainfall. The PMA is an integral approach combining empirical analysis, physical interpretation and retrospective prediction. The empirical analysis detects most important modes of variability; the interpretation establishes the physical basis of prediction of the modes; and the retrospective predictions with dynamical models and physics-based empirical (P-E) model are used to identify the "predictable" modes. Potential predictability can then be estimated by the fractional variance accounted for by the "predictable" modes. For the ASM rainfall during June-July-August, we identify four major modes of variability in the domain (20°S-40°N, 40°E-160°E) during 1979-2010: (1) El Niño-La Niña developing mode in central Pacific, (2) Indo-western Pacific monsoon-ocean coupled mode sustained by a positive thermodynamic feedback with the aid of background mean circulation, (3) Indian Ocean dipole mode, and (4) a warming trend mode. We show that these modes can be predicted reasonably well by a set of P-E prediction models as well as coupled models' multi-model ensemble. The P-E and dynamical models have comparable skills and complementary strengths in predicting ASM rainfall. Thus, the four modes may be regarded as "predictable" modes, and about half of the ASM rainfall variability may be predictable. This work not only provides a useful approach for assessing seasonal predictability but also provides P-E prediction tools and a spatial-pattern-bias correction method to improve dynamical predictions. The proposed PMA method can be applied to a broad range of climate predictability and prediction problems.
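    The variance-fraction idea at the heart of the PMA can be sketched with an empirical orthogonal function (EOF) decomposition of a toy space-time field: decompose the anomalies into modes and ask what fraction of total variance the leading "predictable" modes account for. The synthetic field below is illustrative, not monsoon data:

```python
import numpy as np

rng = np.random.default_rng(0)
ntime, nspace = 32, 100

# One coherent mode (a sinusoidal time series times a fixed spatial pattern)
# buried in spatially white noise.
mode1 = np.outer(np.sin(np.linspace(0, 4 * np.pi, ntime)),
                 rng.standard_normal(nspace))
noise = 0.3 * rng.standard_normal((ntime, nspace))
field = mode1 + noise

anom = field - field.mean(axis=0)        # remove the time mean at each point
_, s, _ = np.linalg.svd(anom, full_matrices=False)
var_frac = (s ** 2) / np.sum(s ** 2)     # variance explained per mode
leading = float(var_frac[0])             # fraction captured by mode 1
```

    In the paper's setting, the analogue of `leading` summed over the four predictable modes yields the estimate that about half of the ASM rainfall variability may be predictable.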

  13. Managing Projects with KPRO

    NASA Technical Reports Server (NTRS)

    Braden, Barry M.

    2004-01-01

    How does a Project Management Office provide: Consistent, familiar, easily used scheduling tools to Project Managers and project team members? Provide a complete list of organization resources available for use on the project? Facilitate resource tracking and visibility? Provide the myriad reports that the organization requires? Facilitate consistent budget planning and cost performance information? Provide all of this to the entire organization? Provide for the unique requirements of the organization? And get people to use it? Answer: Implementation of the Kennedy Space Center Projects and Resources Online (KPRO), a modified COTS solution.

  14. Seasonal Prediction with the GEOS GCM

    NASA Technical Reports Server (NTRS)

    Suarez, Max; Schubert, S.; Chang, Y.

    1999-01-01

    A number of ensembles of seasonal forecasts have recently been completed as part of NASA's Seasonal to Interannual Prediction Project (NSIPP). The focus is on the extratropical response of the atmosphere to observed SST anomalies during boreal winter. Each prediction consists of nine forecasts starting from slightly different initial conditions. Forecasts are done for every winter from 1981 to 1995 using Version 2 of the GEOS GCM. Comparisons with six long-term integrations (1978-1995) using the same model are used to separate the contributions of initial and boundary conditions to forecast skill. The forecasts also allow us to isolate the SST-forced response (the signal) from the atmosphere's natural variability (the noise).

  15. Predicting Human Cooperation.

    PubMed

    Nay, John J; Vorobeychik, Yevgeniy

    2016-01-01

    The Prisoner's Dilemma has been a subject of extensive research due to its importance in understanding the ever-present tension between individual self-interest and social benefit. A strictly dominant strategy in a Prisoner's Dilemma (defection), when played by both players, is mutually harmful. Repetition of the Prisoner's Dilemma can give rise to cooperation as an equilibrium, but defection is as well, and this ambiguity is difficult to resolve. The numerous behavioral experiments investigating the Prisoner's Dilemma highlight that players often cooperate, but the level of cooperation varies significantly with the specifics of the experimental predicament. We present the first computational model of human behavior in repeated Prisoner's Dilemma games that unifies the diversity of experimental observations in a systematic and quantitatively reliable manner. Our model relies on data we integrated from many experiments, comprising 168,386 individual decisions. The model is composed of two pieces: the first predicts the first-period action using solely the structural game parameters, while the second predicts dynamic actions using both game parameters and history of play. Our model is successful not merely at fitting the data, but in predicting behavior at multiple scales in experimental designs not used for calibration, using only information about the game structure. We demonstrate the power of our approach through a simulation analysis revealing how to best promote human cooperation. PMID:27171417

  17. Prediction of psychoacoustic parameters

    NASA Astrophysics Data System (ADS)

    Genuit, Klaus; Fiebig, Andre

    2005-09-01

    Noise is defined as an audible sound that either disturbs silence or, as an intentional sound, leads to annoyance when listened to. Thus, the assessment of noise clearly cannot be reduced to determining simple objective parameters like the A-weighted SPL. The question of whether a sound is judged to be noise can only be answered after the transformation from the sound event into a hearing event has been accomplished. The evaluation of noise depends on the physical characteristics of the sound event, on the psychoacoustic features of the human ear, and on the psychological state of the listener. The subjectively perceived noise quality depends not only on the A-weighted sound-pressure level, but also on other psychoacoustic parameters such as loudness, roughness, sharpness, etc. The known methods for predicting the spatial distribution of the A-weighted SPL as a function of propagation are not suitable for predicting psychoacoustic parameters in an adequate way. In particular, the roughness provoked by modulation or the sharpness generated by an accumulation of high-frequency sound energy cannot readily be predicted as a function of distance.

  18. Eclipse prediction in Mesopotamia.

    NASA Astrophysics Data System (ADS)

    Steele, J. M.

    2000-02-01

    Among the many celestial phenomena observed in ancient Mesopotamia, eclipses, particularly eclipses of the Moon, were considered to be among the astrologically most significant events. In Babylon, by at least the middle of the seventh century BC, and probably as early as the middle of the eighth century BC, astronomical observations were being systematically conducted and recorded in a group of texts which we have come to call Astronomical Diaries. These Diaries contain many observations and predictions of eclipses. The predictions generally include the expected time of the eclipse, apparently calculated quite precisely. By the last three centuries BC, the Babylonian astronomers had developed highly advanced mathematical theories of the Moon and planets. This paper outlines the various methods which appear to have been formulated by the Mesopotamian astronomers to predict eclipses of the Sun and the Moon. It also considers the question of which of these methods were actually used in compiling the Astronomical Diaries, and speculates why these particular methods were used.

  19. Predictive Modeling of the CDRA 4BMS

    NASA Technical Reports Server (NTRS)

    Coker, Robert F.; Knox, James C.

    2016-01-01

    As part of NASA's Advanced Exploration Systems (AES) program and the Life Support Systems Project (LSSP), fully predictive models of the Four Bed Molecular Sieve (4BMS) of the Carbon Dioxide Removal Assembly (CDRA) on the International Space Station (ISS) are being developed. This virtual laboratory will be used to help reduce mass, power, and volume requirements for future missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future crewed Mars missions as well as the resolution of anomalies observed in the ISS CDRA.

  20. Phonematic recognition by linear prediction: Experiment

    NASA Astrophysics Data System (ADS)

    Miclet, L.; Grenier, Y.; Leroux, J.

    The recognition of speech signals analyzed by linear prediction is introduced. The principle of the channel adapted vocoder (CAV) is outlined. The learning of each channel model and adaptation to the speaker are discussed. A method stemming from the canonical analysis of correlations is given. This allows, starting with the CAV of one speaker, the calculation of that of another. The projection function is learned from a series of key words pronounced by both speakers. The reconstruction of phonemes can be explained by recognition factors arising from the vocoder. Automata associated with the channels are used for local smoothing and series of segments are treated in order to produce a phonemic lattice.
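    The linear-prediction front end underlying such a vocoder can be sketched by fitting order-p LPC coefficients to a signal frame. The classical route is autocorrelation plus Levinson-Durbin recursion; plain least squares, used here to keep the idea visible, and the toy two-tone frame are illustrative simplifications:

```python
import numpy as np

def lpc(frame, p=8):
    """Least-squares linear prediction: each sample is modeled as a linear
    combination of the p previous samples, s[n] ~ sum_k a_k * s[n-k]."""
    rows = [frame[n - p:n][::-1] for n in range(p, len(frame))]
    X = np.array(rows)
    y = frame[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

n = np.arange(256)
frame = np.sin(0.3 * n) + 0.5 * np.sin(0.7 * n)   # toy voiced-like frame
a = lpc(frame)

# Prediction residual: for a sum of pure sinusoids it is essentially zero.
X = np.array([frame[i - 8:i][::-1] for i in range(8, len(frame))])
resid = frame[8:] - X @ a
```

    In a channel vocoder the coefficients (or channel energies derived from them) form the feature vector that the recognition stage operates on.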

  1. Dopamine reward prediction error coding.

    PubMed

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards-an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
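    The prediction-error computation described above is often illustrated with a textbook Rescorla-Wagner-style update; this sketch is a didactic model of the quantity dopamine neurons are reported to signal, not of the neural implementation, and the reward sequence and learning rate are illustrative:

```python
def learn(rewards, alpha=0.2):
    """Track a reward prediction v; each trial's prediction error is
    delta = received reward - predicted reward, and v moves a fraction
    alpha of the way toward the outcome."""
    v = 0.0
    deltas = []
    for r in rewards:
        delta = r - v        # positive if reward is better than predicted
        v += alpha * delta   # prediction update
        deltas.append(delta)
    return v, deltas

# A repeatedly delivered reward: the error is large at first and decays
# toward zero as the reward becomes fully predicted.
v, deltas = learn([1.0] * 20)
```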

  3. DOE-EPSCOR SPONSORED PROJECT FINAL REPORT

    SciTech Connect

    Zhu, Jianting

    2010-03-11

    Concern over the quality of environmental management and restoration has motivated the model development for predicting water and solute transport in the vadose zone. Soil hydraulic properties are required inputs to subsurface models of water flow and contaminant transport in the vadose zone. Computer models are now routinely used in research and management to predict the movement of water and solutes into and through the vadose zone of soils. Such models can be used successfully only if reliable estimates of the soil hydraulic parameters are available. The hydraulic parameters considered in this project consist of the saturated hydraulic conductivity and four parameters of the water retention curves. Quantifying hydraulic parameters for heterogeneous soils is both difficult and time consuming. The overall objective of this project was to better quantify soil hydraulic parameters, which are critical in predicting water flows and contaminant transport in the vadose zone, through a comprehensive and quantitative study to predict heterogeneous soil hydraulic properties and the associated uncertainties. Systematic and quantitative consideration of the parametric heterogeneity and uncertainty can properly address and further reduce predictive uncertainty for contamination characterization and environmental restoration at DOE-managed sites. We conducted a comprehensive study to assess soil hydraulic parameter heterogeneity and uncertainty. We have addressed a number of important issues related to the soil hydraulic property characterizations. The main focus centered on new methods to characterize the anisotropy of unsaturated hydraulic properties typical of layered soil formations, an uncertainty updating method, and artificial neural network based pedotransfer functions to predict hydraulic parameters from easily available data. The work also involved upscaling of hydraulic properties applicable to large scale flow and contaminant transport modeling in the vadose zone and

  4. Evaluating the 1995 BLS Projections.

    ERIC Educational Resources Information Center

    Rosenthal, Neal H.; Fullerton, Howard N., Jr.; Andreassen, Arthur; Veneri, Carolyn M.

    1997-01-01

    Includes "Introduction" (Neal H. Rosenthal); "Labor Force Projections" (Howard N. Fullerton, Jr.); "Industry Employment Projections" (Arthur Andreassen); and "Occupational Employment Projections" (Carolyn M. Veneri). (JOW)

  5. Analytical predictions of RTG power degradation. [Radioisotope Thermoelectric Generator

    NASA Technical Reports Server (NTRS)

    Noon, E. L.; Raag, V.

    1979-01-01

    The DEGRA computer code that is based on a mathematical model which predicts performance and time-temperature dependent degradation of a radioisotope thermoelectric generator is discussed. The computer code has been used to predict performance and generator degradation for the selenide Ground Demonstration Unit (GDS-1) and the generator used in the Galileo Project. Results of parametric studies of load voltage vs generator output are examined as well as the I-V curve and the resulting predicted power vs voltage. The paper also discusses the increased capability features contained in DEGRA2 and future plans for expanding the computer code performance.

  6. Coastal Ohio Wind Project

    SciTech Connect

    Gorsevski, Peter; Afjeh, Abdollah; Jamali, Mohsin; Bingman, Verner

    2014-04-04

    The Coastal Ohio Wind Project addresses problems that impede the deployment of wind turbines in the coastal and offshore regions of northern Ohio. The project evaluated different wind turbine designs and the potential impact of offshore turbines on migratory and resident birds through multidisciplinary research involving wildlife biology, electrical and mechanical engineering, and geospatial science. First, the project conducted cost and performance studies of two- and three-blade wind turbines using a turbine design suited for the Great Lakes. The numerical studies comprised an analysis and evaluation of the annual energy production of two- and three-blade wind turbines to determine the levelized cost of energy. This task also involved wind tunnel studies of model wind turbines to quantify the wake flow field of upwind and downwind turbine-tower arrangements. The experimental work included a study of a scaled model of an offshore wind turbine platform in a water tunnel. The levelized cost of energy work consisted of the development and application of a cost model to predict the cost of energy produced by a wind turbine system placed offshore. The analysis found that a floating two-blade wind turbine is the most cost-effective alternative for the Great Lakes. The load-effects studies showed that the two-blade wind turbine model experiences less torque under all IEC Standard design load cases considered. Other load effects did not show this trend: depending on the design load case, the two-blade wind turbine showed higher or lower load effects. The experimental studies of the wake were conducted using smoke flow visualization and hot-wire anemometry. Flow visualization showed that in the downwind turbine configuration the wake flow was insensitive to the presence of the blade and was very similar to that of the tower alone. On the other hand, in the upwind turbine configuration, increasing the rotor blade angle of attack
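    The levelized cost of energy used above to compare turbine designs is commonly computed by annualizing capital cost with a capital recovery factor and dividing by annual energy production. A simplified sketch; the discount rate, lifetime, and cost figures are illustrative assumptions, not project numbers:

```python
def crf(rate, years):
    """Capital recovery factor: converts an up-front cost to an annual payment."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex, annual_om, aep_mwh, rate=0.08, life=20):
    """Simplified levelized cost of energy in $/MWh (no taxes, no degradation)."""
    return (capex * crf(rate, life) + annual_om) / aep_mwh

# Hypothetical offshore turbine: $4M capital, $100k/yr O&M, 10,000 MWh/yr
cost = lcoe(capex=4_000_000, annual_om=100_000, aep_mwh=10_000)
```

Two competing designs would be ranked by evaluating this expression with each design's capital cost, operating cost, and predicted annual energy production.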

  7. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploring noise-reduction concepts and understanding experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering the method's or code's utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interacting with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources, including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and a computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  8. Enhancing seasonal climate prediction capacity for the Pacific countries

    NASA Astrophysics Data System (ADS)

    Kuleshov, Y.; Jones, D.; Hendon, H.; Charles, A.; Cottrill, A.; Lim, E.-P.; Langford, S.; de Wit, R.; Shelton, K.

    2012-04-01

    Seasonal and inter-annual climate variability is a major factor in determining the vulnerability of many Pacific Island Countries to climate change, and there is a need to improve weekly- to seasonal-range climate prediction capabilities beyond what is currently available from statistical models. Here we describe the seasonal climate prediction project under the Australian Government's Pacific Adaptation Strategy Assistance Program (PASAP), a comprehensive effort to strengthen climate prediction capacities in the National Meteorological Services of 14 Pacific Island Countries and East Timor. The intent is particularly to reduce the vulnerability of current services to a changing climate and to improve the overall level of information available to assist with managing climate variability. Statistical models cannot account for aspects of climate variability and change that are not represented in the historical record. In contrast, dynamical physics-based models implicitly include the effects of a changing climate, whatever its character or cause, and can predict outcomes not seen previously. The transition from a statistical to a dynamical prediction system provides more valuable and applicable climate information to a wide range of climate-sensitive sectors throughout the countries of the Pacific region. In this project, we have developed seasonal climate outlooks based upon the dynamical POAMA (Predictive Ocean-Atmosphere Model for Australia) seasonal forecast system. At present, the meteorological services of the Pacific Island Countries largely employ statistical models for seasonal outlooks. The PASAP project has enhanced the seasonal prediction capabilities of the Pacific Island Countries, providing National Meteorological Services with an additional tool to analyse meteorological variables such as sea surface temperature, air temperature, pressure, and rainfall using POAMA outputs and to prepare more accurate seasonal climate outlooks.

  9. The FLARECAST Project and What Lies Beyond

    NASA Astrophysics Data System (ADS)

    Georgoulis, Manolis K.; Flarecast Team

    2016-04-01

    Solar eruptions have three distinct manifestations: flares, coronal mass ejections (CMEs), and solar energetic particle (SEP) events. All of these affect heliospheric space weather at different spatial and temporal scales, and should therefore ideally be predicted in order to shield humanity and its assets in space and, in some cases, on Earth's surface. The European Commission (EC) has endorsed this need, calling for and funding projects targeted at forecasting aspects of the near-Earth space environment. The Flare Likelihood And Region Eruption foreCASTing (FLARECAST) project is one of them, with the objective of developing a definitive, openly accessible solar-flare prediction facility. We will focus on the main attributes of this facility, namely its ability to expand by reconciling new flare predictors, and its setup, which is intended to couple tactical understanding of the flare phenomenon with a consolidated view of how this understanding can be turned into a deliverable with practical, operational face value. A third component of the FLARECAST project, its exploratory part, aims to bridge flare prediction with the prediction of CMEs and, hopefully, SEP events, touching the other two areas of space-weather forecasting. Fragmented but very significant work exists in these areas that prompts one to envision a future, EC-funded unified prediction platform that could address all forecasting needs of Sun-generated space weather. Research partially funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No. 640216.

  10. Geostatistical enhancement of european hydrological predictions

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP is the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures, our experiment focuses on the prediction of flow-duration curves (FDCs) along the stream network, which have attracted increasing scientific attention in recent decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450,000 km2) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting FDCs over the entire river network of interest using, alternatively, observed data and E-HYPE simulations, and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a
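    An empirical FDC of the kind regionalized above is simply the sorted flow record plotted against exceedance probability. A minimal sketch using the Weibull plotting position (the specific plotting-position formula is an assumption; the abstract does not state which one was used):

```python
def flow_duration_curve(daily_flows):
    """Return (exceedance_probability, flow) pairs, highest flow first.
    Uses the Weibull plotting position p_i = i / (n + 1)."""
    flows = sorted(daily_flows, reverse=True)
    n = len(flows)
    return [(i / (n + 1), q) for i, q in enumerate(flows, start=1)]

# Toy five-day record (placeholder values, units e.g. m^3/s)
fdc = flow_duration_curve([12.0, 3.5, 7.1, 0.9, 5.0])
```

Top-kriging would then interpolate such curves (or their quantiles) along the river network from gauged to ungauged sites, weighting by the nested catchment areas.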

  11. Selection of sequence variants to improve dairy cattle genomic predictions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genomic prediction reliabilities improved when adding selected sequence variants from run 5 of the 1,000 bull genomes project. High density (HD) imputed genotypes for 26,970 progeny tested Holstein bulls were combined with sequence variants for 444 Holstein animals. The first test included 481,904 c...

  12. Seasonal Atmospheric and Oceanic Predictions

    NASA Technical Reports Server (NTRS)

    Roads, John; Rienecker, Michele (Technical Monitor)

    2003-01-01

    Several projects associated with dynamical, statistical, single column, and ocean models are presented. The projects include: 1) Regional Climate Modeling; 2) Statistical Downscaling; 3) Evaluation of SCM and NSIPP AGCM Results at the ARM Program Sites; and 4) Ocean Forecasts.

  13. Environmental Science: 49 Science Fair Projects. Science Fair Projects Series.

    ERIC Educational Resources Information Center

    Bonnet, Robert L.; Keen, G. Daniel

    This book contains 49 science fair projects designed for 6th to 9th grade students. Projects are organized by the topics of soil, ecology (projects in habitat and life cycles), pests and controls (projects in weeds and insects), recycling (projects in resources and conservation), waste products (projects in decomposition), microscopic organisms,…

  14. Project Surveillance and Maintenance Plan. [UMTRA Project

    SciTech Connect

    Not Available

    1985-09-01

    The Project Surveillance and Maintenance Plan (PSMP) describes the procedures that will be used by the US Department of Energy (DOE), or another agency designated by the President, to verify that inactive uranium tailings disposal facilities remain in compliance with licensing requirements and US Environmental Protection Agency (EPA) standards for remedial actions. The PSMP will be used as a guide for the development of individual Site Surveillance and Maintenance Plans (part of a license application) for each of the UMTRA Project sites. The PSMP is not intended to provide minimum requirements but rather to provide guidance in the selection of surveillance measures. For example, the plan acknowledges that ground-water monitoring may or may not be required and provides guidance for making this decision. The Site Surveillance and Maintenance Plans (SSMPs) will form the basis for the licensing of the long-term surveillance and maintenance of each UMTRA Project site by the NRC. Therefore, the PSMP is a key milestone in the licensing process for all UMTRA Project sites. The Project Licensing Plan (DOE, 1984a) describes the licensing process. 11 refs., 22 figs., 8 tabs.

  15. On identified predictive control

    NASA Technical Reports Server (NTRS)

    Bialasiewicz, Jan T.

    1993-01-01

    Self-tuning control algorithms are potential successors to the manually tuned PID controllers traditionally used in process control applications. A very attractive design method for self-tuning controllers, developed over recent years, is long-range predictive control (LRPC). The success of LRPC is due to its effectiveness with plants of unknown order and dead-time, which may be simultaneously nonminimum-phase and unstable or have multiple lightly damped poles (as in the case of flexible structures or flexible robot arms). LRPC is a receding-horizon strategy and can, in general terms, be summarized as follows. Using an assumed long-range (or multi-step) cost function, the optimal control law is found in terms of the unknown parameters of the predictor model of the process, the current input-output sequence, and the future reference signal sequence. The common approach is to assume that the input-output process model is known or separately identified and then to find the parameters of the predictor model. Once these are known, the optimal control law determines the control signal at the current time t, which is applied at the process input, and the whole procedure is repeated at the next time instant. Most of the recent research in this field is centered around the LRPC formulation developed by Clarke et al., known as generalized predictive control (GPC). GPC uses an ARIMAX/CARIMA model of the process in its input-output formulation. In this paper, the GPC formulation is used, but the process predictor model is derived from the state-space formulation of the ARIMAX model and is directly identified over the receding horizon, i.e., using the current input-output sequence. The underlying technique in the design of the identified predictive control (IPC) algorithm is the identification algorithm for observer/Kalman filter Markov parameters developed by Juang et al. at NASA Langley Research Center and successfully applied to the identification of flexible structures.
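    The receding-horizon idea summarized above can be shown in its simplest form: a scalar linear plant, a one-step quadratic cost with a control penalty, and a loop that applies only the first optimized move before re-optimizing. The one-step horizon is a deliberate simplification of GPC's multi-step cost, and all plant numbers are illustrative:

```python
def predictive_step(y, r, a, b, lam):
    """One receding-horizon move for the plant y[k+1] = a*y[k] + b*u[k].
    Minimizing (a*y + b*u - r)**2 + lam*u**2 over u gives this closed form."""
    return b * (r - a * y) / (b * b + lam)

# Closed-loop simulation toward setpoint r (illustrative parameters)
a, b, lam, r = 0.9, 0.5, 0.01, 1.0
y = 0.0
for _ in range(50):
    u = predictive_step(y, r, a, b, lam)
    y = a * y + b * u   # apply only the first move, then re-optimize
```

In GPC proper, the horizon spans several future steps and the predictor parameters are identified online rather than assumed known, but the apply-first-move-and-repeat structure is the same.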

  16. Stress Prediction System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA wanted to know how astronauts' bodies would react under various gravitational pulls and space suit weights. Under contract to NASA, the University of Michigan's Center for Ergonomics developed a model capable of predicting what type of stress and what degree of load a body could stand. The algorithm generated was commercialized with the ISTU (Isometric Strength Testing Unit) Functional Capacity Evaluation System, which simulates tasks such as lifting a heavy box or pushing a cart and evaluates the exertion expended. It also identifies the muscle group that limits the subject's performance. It is an effective tool of personnel evaluation, selection and job redesign.

  17. Timebias corrections to predictions

    NASA Technical Reports Server (NTRS)

    Wood, Roger; Gibbs, Philip

    1993-01-01

    The importance to a satellite laser ranging (SLR) observer of accurate knowledge of the time bias corrections to predicted orbits, especially for low satellites, is highlighted. Sources of time bias values and the optimum strategy for extrapolation are discussed from the viewpoint of the observer wishing to maximize the chances of getting returns from the next pass. What is said may be seen as an advertisement encouraging wider and speedier use of existing data centers for the mutually beneficial exchange of time bias data.

  18. Coal extraction - environmental prediction

    SciTech Connect

    C. Blaine Cecil; Susan J. Tewalt

    2002-08-01

    To predict and help minimize the impact of coal extraction in the Appalachian region, the U.S. Geological Survey (USGS) is addressing selected mine-drainage issues through the following four interrelated studies: spatial variability of deleterious materials in coal and coal-bearing strata; kinetics of pyrite oxidation; improved spatial geologic models of the potential for drainage from abandoned coal mines; and methodologies for the remediation of waters discharged from coal mines. As these goals are achieved, the recovery of coal resources will be enhanced. 2 figs.

  19. Predicting Ground Illuminance

    NASA Astrophysics Data System (ADS)

    Lesniak, Michael V.

    2014-01-01

    Our Sun outputs 3.85 × 10²⁶ W of radiation, of which ≈37% is in the visible band. It is directly responsible for nearly all natural illuminance experienced on Earth's surface, either in the form of direct/refracted sunlight or in reflected light bouncing off the surfaces and/or atmospheres of our Moon and the visible planets. Ground illuminance, defined as the amount of visible light intercepting a unit area of surface (from all incident angles), varies over 7 orders of magnitude from day to night. It is highly dependent on well-modeled factors such as the relative positions of the Sun, Earth, and Moon. It is also dependent on less predictable factors such as local atmospheric conditions and weather. Several models have been proposed to predict ground illuminance, including Brown (1952) and Shapiro (1982, 1987). The Brown model is a set of empirical data collected from observation points around the world that has been reduced to a smooth fit of illuminance against a single variable, solar altitude. It provides limited applicability to the Moon and to cloudy conditions via multiplicative reduction factors. The Shapiro model is a theoretical model that treats the atmosphere as a three-layer system of light reflectance and transmittance. It has different sets of reflectance and transmittance coefficients for various cloud types. Ground illuminance data from an observing run at the White Sands missile range were obtained from the United Kingdom Meteorology Office. Based on available weather reports, five days of clear-sky observations were selected. These data are compared to the predictions of the two models. We find that neither model provides an accurate treatment during twilight conditions, when the Sun is at or a few degrees below the horizon. When the Sun is above the horizon, the Shapiro model straddles the observed data, ranging between 90% and 120% of the recorded illuminance. During the same times, the Brown model is between 70% and 90% of the

  20. Predicting Ground Illuminance

    NASA Astrophysics Data System (ADS)

    Lesniak, Michael V.; Tregoning, Brett D.; Hitchens, Alexandra E.

    2015-01-01

    Our Sun outputs 3.85 × 10²⁶ W of radiation, of which roughly 37% is in the visible band. It is directly responsible for nearly all natural illuminance experienced on Earth's surface, either in the form of direct/refracted sunlight or in reflected light bouncing off the surfaces and/or atmospheres of our Moon and the visible planets. Ground illuminance, defined as the amount of visible light intercepting a unit area of surface (from all incident angles), varies over 7 orders of magnitude from day to night. It is highly dependent on well-modeled factors such as the relative positions of the Sun, Earth, and Moon. It is also dependent on less predictable factors such as local atmospheric conditions and weather. Several models have been proposed to predict ground illuminance, including Brown (1952) and Shapiro (1982, 1987). The Brown model is a set of empirical data collected from observation points around the world that has been reduced to a smooth fit of illuminance against a single variable, solar altitude. It provides limited applicability to the Moon and to cloudy conditions via multiplicative reduction factors. The Shapiro model is a theoretical model that treats the atmosphere as a three-layer system of light reflectance and transmittance. It has different sets of reflectance and transmittance coefficients for various cloud types. In this paper we compare the models' predictions to ground illuminance data from an observing run at the White Sands missile range (data were obtained from the United Kingdom's Meteorology Office). Continuous illuminance readings were recorded under various cloud conditions, during both daytime and nighttime hours. We find that under clear skies, the Shapiro model tends to better fit the observations during daytime hours, with typical discrepancies under 10%. Under cloudy skies, both models tend to predict ground illuminance poorly. However, the Shapiro model, with typical average daytime discrepancies of 25% or less in many cases
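    Both models ultimately map solar altitude (plus cloud state) to an illuminance value. As a toy stand-in for such a clear-sky mapping, the sketch below uses a simple airmass-attenuation form; the functional form and its constants are illustrative assumptions and are not the Brown or Shapiro formulations:

```python
import math

def clear_sky_illuminance(solar_alt_deg, e_ext=133_800.0, tau=0.3):
    """Toy clear-sky ground illuminance (lux) versus solar altitude.
    e_ext approximates extraterrestrial illuminance; tau is an assumed
    broadband optical depth. Illustrative only."""
    if solar_alt_deg <= 0:
        return 0.0                      # Sun at or below the horizon
    s = math.sin(math.radians(solar_alt_deg))
    return e_ext * s * math.exp(-tau / s)   # cosine factor times attenuation
```

A model comparison like the one in this paper then reduces to evaluating such a function at the observation times and reporting the ratio of predicted to recorded illuminance.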

  1. Consciousness -- A Verifiable Prediction

    NASA Astrophysics Data System (ADS)

    Panchapakesan, N.

    2014-07-01

    Consciousness may or may not be completely within the realm of science. We have argued elsewhere that there is a high probability that it is not within the purview of science, just like humanities and arts are outside science. Even social sciences do not come under science when human interactions are involved. Here, we suggest a possible experiment to decide whether it is part of science. We suggest that a scientific signal may be available to investigate the prediction in the form of an electromagnetic brainwave background radiation.

  2. Freeze Prediction Model

    NASA Technical Reports Server (NTRS)

    Morrow, C. T. (Principal Investigator)

    1981-01-01

    Measurements of wind speed, net irradiation, and air, soil, and dew point temperatures in an orchard at the Rock Springs Agricultural Research Center, as well as topographical and climatological data and a description of the major apple growing regions of Pennsylvania, were supplied to the University of Florida for use in running the P-model freeze prediction program. Results show that the P-model appears to have considerable applicability to conditions in Pennsylvania. Even though modifications may have to be made for use in the fruit growing regions, the model in its present form offers advantages for fruit growers.

  3. Regional Earth System Prediction for Policy Decision-Making

    NASA Astrophysics Data System (ADS)

    Murtugudde, R. G.; Cbfs Team

    2010-12-01

    While the IPCC will continue to lead Earth System projections for global issues such as greenhouse gas levels and global temperature increase, high-resolution regional Earth System predictions will be crucial for producing effective decision-making tools for day-to-day, sustainable Earth System management and adaptive management of resources. Regional Earth System predictions and projections, at resolutions on the order of a few meters and timescales from days to decades, must be validated and must provide uncertainties and skill scores to be usable. While the task is daunting, it would be criminally negligent of the global human community not to embark on it immediately. The observational needs of the integrated natural-human regional Earth System are distinct from the global needs, even though there are many overlaps. Process understanding of the Earth System at the micro-scale can be translated into predictive understanding and skillful predictions for sustainable management and adaptation by merging these observations with Earth System models, going from global-scale predictions and projections to regional environmental manifestations and a mechanistic depiction of human interactions with the Earth System and exploitation of its resources. Regional Earth System monitoring and predictions will thus continuously take the pulse of the planet to prescribe appropriate actions for participatory decision-making, for sustainable and adaptive management of the Earth System, and to avoid catastrophic domains of potential outcomes. An example of a regional Earth System prediction system over the Chesapeake Bay, with detailed interactions with users, is discussed. Routine atmospheric and hydrodynamic forecasts are used to produce linked prediction products for water quality, hypoxia, sea nettles, harmful algal blooms, striped bass, pathogens, etc.

  4. The CHPRC Columbia River Protection Project Quality Assurance Project Plan

    SciTech Connect

    Fix, N. J.

    2008-11-30

    Pacific Northwest National Laboratory researchers are working on the CHPRC Columbia River Protection Project (hereafter referred to as the Columbia River Project). This is a follow-on project, funded by CH2M Hill Plateau Remediation Company, LLC (CHPRC), to the Fluor Hanford, Inc. Columbia River Protection Project. The work scope consists of a number of CHPRC funded, related projects that are managed under a master project (project number 55109). All contract releases associated with the Fluor Hanford Columbia River Project (Fluor Hanford, Inc. Contract 27647) and the CHPRC Columbia River Project (Contract 36402) will be collected under this master project. Each project within the master project is authorized by a CHPRC contract release that contains the project-specific statement of work. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Columbia River Project staff.

  5. Defensive externality and blame projection following failure.

    PubMed

    Hochreich, D J

    1975-09-01

    This study focuses upon the relationship between internal-external control and defensive blame projection. Trust was used as a moderator variable for making differential predictions concerning the behavior of two subgroups of externals: defensive externals, whose externality is presumed to reflect primarily a verbal technique of defense, and congruent externals, whose externality reflects a more genuine belief that most outcomes are determined by forces beyond their personal control. As predicted, defensive externals showed a stronger tendency than did congruent externals and internals to resort to blame projection following failure at an achievement task. There were no group differences in attribution following task success. Defensive externals were found to be more responsive to negative feedback than were congruent externals.

  6. River Protection Project (RPP) Project Management Plan

    SciTech Connect

    SEEMAN, S.E.

    2000-04-01

    The U.S. Department of Energy (DOE), in accordance with the Strom Thurmond National Defense Authorization Act for Fiscal Year 1999, established the Office of River Protection (ORP) to successfully execute and manage the River Protection Project (RPP), formerly known as the Tank Waste Remediation System (TWRS). The mission of the RPP is to store, retrieve, treat, and dispose of the highly radioactive Hanford tank waste in an environmentally sound, safe, and cost-effective manner. The team shown in Figure 1-1 is accomplishing the project. The ORP is providing the management and integration of the project; the Tank Farm Contractor (TFC) is responsible for providing tank waste storage, retrieval, and disposal; and the Privatization Contractor (PC) is responsible for providing tank waste treatment.

  7. TULSA UNIVERSITY PARAFFIN DEPOSITION PROJECTS

    SciTech Connect

    Michael Volk; Cem Sarica

    2003-10-01

    As oil and gas production moves to deeper and colder water, subsea multiphase production systems become critical for economic feasibility. It will also become increasingly imperative to adequately identify the conditions for paraffin precipitation and predict paraffin deposition rates to optimize the design and operation of these multiphase production systems. Although several oil companies have paraffin deposition predictive capabilities for single-phase oil flow, these predictive capabilities are not suitable for the multiphase flow conditions encountered in most flowlines and wellbores. For deepwater applications in the Gulf of Mexico, it is likely that multiphase production streams consisting of crude oil, produced water and gas will be transported in a single multiphase pipeline to minimize capital cost and complexity at the mudline. Existing single-phase (crude oil) paraffin deposition predictive tools are clearly inadequate to accurately design these pipelines because they do not account for the second and third phases, namely, produced water and gas. The objective of this program is to utilize the current test facilities at The University of Tulsa, as well as member company expertise, to accomplish the following: enhance our understanding of paraffin deposition in single and two-phase (gas-oil) flows; conduct focused experiments to better understand various aspects of deposition physics; and, utilize knowledge gained from experimental modeling studies to enhance the computer programs developed in the previous JIP for predicting paraffin deposition in single and two-phase flow environments. These refined computer models will then be tested against field data from member company pipelines. The following deliverables are scheduled during the first three projects of the program: (1) Single-Phase Studies, with three different black oils, which will yield an enhanced computer code for predicting paraffin deposition in deepwater and surface pipelines. (2) Two

  8. Prediction uncertainty of environmental change effects on temperate European biodiversity.

    PubMed

    Dormann, Carsten F; Schweiger, Oliver; Arens, P; Augenstein, I; Aviron, St; Bailey, Debra; Baudry, J; Billeter, R; Bugter, R; Bukácek, R; Burel, F; Cerny, M; Cock, Raphaël De; De Blust, Geert; DeFilippi, R; Diekötter, Tim; Dirksen, J; Durka, W; Edwards, P J; Frenzel, M; Hamersky, R; Hendrickx, Frederik; Herzog, F; Klotz, St; Koolstra, B; Lausch, A; Le Coeur, D; Liira, J; Maelfait, J P; Opdam, P; Roubalova, M; Schermann-Legionnet, Agnes; Schermann, N; Schmidt, T; Smulders, M J M; Speelmans, M; Simova, P; Verboom, J; van Wingerden, Walter; Zobel, M

    2008-03-01

    Observed patterns of species richness at the landscape scale (gamma diversity) cannot always be attributed to a specific set of explanatory variables; rather, different alternative explanatory statistical models of similar quality may exist. Therefore, predictions of the effects of environmental change (such as in climate or land cover) on biodiversity may differ considerably, depending on the chosen set of explanatory variables. Here we use multimodel prediction to evaluate the effects of climate, land-use intensity, and landscape structure on species richness in each of seven groups of organisms (plants, birds, spiders, wild bees, ground beetles, true bugs, and hoverflies) in temperate Europe. We contrast this approach with traditional best-model predictions, which we show, using cross-validation, to have inferior prediction accuracy. Multimodel inference changed the importance of some environmental variables in comparison with the best model and accordingly gave deviating predictions for environmental change effects. Overall, prediction uncertainty for the multimodel approach was only slightly higher than that of the best model, and absolute changes in predicted species richness were also comparable. Richness predictions varied generally more for the impact of climate change than for land-use change at the coarse scale of our study. Overall, our study indicates that the uncertainty introduced into environmental change predictions through uncertainty in model selection affects species richness projections both qualitatively and quantitatively. PMID:18070098
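    Multimodel prediction of the kind contrasted with best-model prediction here is often implemented by weighting each candidate model by its relative support, e.g. via Akaike weights, and averaging the predictions. A minimal sketch; the AIC scores and richness predictions below are hypothetical placeholders, not values from this study:

```python
import math

def akaike_weights(aic_scores):
    """Akaike weights: each model's relative likelihood exp(-delta_AIC/2),
    normalized so the weights sum to 1."""
    best = min(aic_scores)
    rel = [math.exp(-0.5 * (a - best)) for a in aic_scores]
    total = sum(rel)
    return [r / total for r in rel]

# Average three (hypothetical) species-richness predictions instead of
# committing to the single best model
weights = akaike_weights([210.3, 212.1, 215.8])
preds = [34.0, 31.5, 29.0]
multimodel_pred = sum(w * p for w, p in zip(weights, preds))
```

The averaged prediction always lies within the range spanned by the individual models, while the spread of the weights conveys how much model-selection uncertainty contributes to the final estimate.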

  9. Project GlobWave

    NASA Astrophysics Data System (ADS)

    Busswell, Geoff; Ash, Ellis; Piolle, Jean-Francois; Poulter, David J. S.; Snaith, Helen; Collard, Fabrice; Sheera, Harjit; Pinnock, Simon

    2010-12-01

    The ESA GlobWave project is a three year initiative, funded by ESA and CNES, to service the needs of satellite wave product users across the globe. Led by Logica UK, with support from CLS, IFREMER, SatOC and NOCS, the project will provide free access to satellite wave data and products in a common format, both historical and in near real time, from various European and American SAR and altimeter missions. Building on the successes of similar projects for Sea Surface Temperature and ocean colour, the project aims to stimulate increased use and analysis of satellite wave products. In addition to common-format satellite data the project will provide comparisons with in situ measurements, interactive data analysis tools and a pilot spatial wave forecast verification scheme for operational forecast production centres. The project will begin operations in January 2010, with direction from regular structured user consultation.

  10. Battleground Energy Recovery Project

    SciTech Connect

    Bullock, Daniel

    2011-12-31

    In October 2009, the project partners began a 36-month effort to develop an innovative, commercial-scale demonstration project incorporating state-of-the-art waste heat recovery technology at Clean Harbors, Inc., a large hazardous waste incinerator site located in Deer Park, Texas. With financial support provided by the U.S. Department of Energy, the Battleground Energy Recovery Project was launched to advance waste heat recovery solutions into the hazardous waste incineration market, an area that has seen little adoption of heat recovery in the United States. The goal of the project was to accelerate the use of energy-efficient, waste heat recovery technology as an alternative means to produce steam for industrial processes. The project had three main engineering and business objectives: Prove Feasibility of Waste Heat Recovery Technology at a Hazardous Waste Incinerator Complex; Provide Low-cost Steam to a Major Polypropylene Plant Using Waste Heat; and Create a Showcase Waste Heat Recovery Demonstration Project.

  11. Structuring small projects

    SciTech Connect

    Pistole, C.O.

    1995-11-01

    One of the most difficult hurdles facing small project developers is obtaining financing. Many major banks and institutional investors are unwilling to become involved in projects valued at less than $25 million. To gain the interest of small project investors, developers will want to present a well-considered plan and an attractive rate of return. Waste-to-energy projects are one type that can offer diversified revenue sources that assure maximum profitability. The Ripe Touch Greenhouse project, a $14.5 million waste tire-to-energy facility in Colorado, provides a case study of how combining the strengths of the project partners can help gain community and regulatory acceptance and maximize profit opportunities.

  12. Affine projective Osserman structures

    NASA Astrophysics Data System (ADS)

    Gilkey, P.; Nikčević, S.

    2013-08-01

    By considering the projectivized spectrum of the Jacobi operator, we introduce the concept of projective Osserman manifold in both the affine and in the pseudo-Riemannian settings. If M is an affine projective Osserman manifold, then the deformed Riemannian extension metric on the cotangent bundle is both spacelike and timelike projective Osserman. Since any rank-1 symmetric space is affine projective Osserman, this provides additional information concerning the cotangent bundle of a rank-1 Riemannian symmetric space with the deformed Riemannian extension metric. We construct other examples of affine projective Osserman manifolds where the Ricci tensor is not symmetric and thus the connection in question is not the Levi-Civita connection of any metric. If the dimension is odd, we use methods of algebraic topology to show the Jacobi operator of an affine projective Osserman manifold has only one non-zero eigenvalue and that eigenvalue is real.

  13. Pine Hollow Watershed Project : FY 2000 Projects.

    SciTech Connect

    Sherman County Soil and Water Conservation District

    2001-06-01

    The Pine Hollow Project (1999-010-00) is an on-going watershed restoration effort administered by Sherman County Soil and Water Conservation District and spearheaded by Pine Hollow/Jackknife Watershed Council. The headwaters are located near Shaniko in Wasco County, and the mouth is in Sherman County on the John Day River. Pine Hollow provides more than 20 miles of potential summer steelhead spawning and rearing habitat. The watershed is 92,000 acres. Land use is mostly range, with some dryland grain. There are no water rights on Pine Hollow. Due to shallow soils, the watershed is prone to rapid runoff events which scour out the streambed and the riparian vegetation. This project seeks to improve the quality of upland, riparian and in-stream habitat by restoring the natural hydrologic function of the entire watershed. Project implementation to date has consisted of construction of water/sediment control basins, gradient terraces on croplands, pasture cross-fences, upland water sources, and grass seeding on degraded sites, many of which were crop fields in the early part of the century. The project is expected to continue through about 2007. From March 2000 to June 2001, the Pine Hollow Project built 6 sediment basins, 1 cross-fence, 2 spring developments, 1 well development, 1 solar pump, 50 acres of native range seeding and 1 livestock waterline. FY2000 projects were funded by BPA, Oregon Watershed Enhancement Board, US Fish and Wildlife Service and landowners. In-kind services were provided by Sherman County Soil and Water Conservation District, USDA Natural Resources Conservation Service, USDI Bureau of Land Management, Oregon Department of Fish and Wildlife, Pine Hollow/Jackknife Watershed Council, landowners and Wasco County Soil and Water Conservation District.

  14. Compressor map prediction tool

    NASA Astrophysics Data System (ADS)

    Ravi, Arjun; Sznajder, Lukasz; Bennett, Ian

    2015-08-01

    Shell Global Solutions uses an in-house developed system for remote condition monitoring of centrifugal compressors. It requires field process data collected during operation to calculate and assess the machine's performance. Performance is assessed by comparing live results of polytropic head and efficiency against the design compressor curves provided by the manufacturer. Typically, these design curves are given for specific suction conditions; the further the conditions on site deviate from those prescribed at design, the less accurate the health assessment of the compressor becomes. To address this problem, a compressor map prediction tool is proposed. The original performance curves of polytropic head against volumetric flow for varying rotational speeds are used as an input to define a range of Mach numbers within which the non-dimensional invariant performance curve of head and volume flow coefficient is generated. The new performance curves of polytropic head vs. flow for the desired set of inlet conditions are then back-calculated using the invariant non-dimensional curve. Within the range of Mach numbers calculated from design data, the proposed methodology can predict polytropic head curves at a new set of inlet conditions to within an estimated 3% accuracy. The presented methodology does not require knowledge of detailed impeller geometry such as throat areas, blade number, blade angles, or thicknesses, nor other aspects of the aerodynamic design (diffusion levels, flow angles, etc.); the only required mechanical design feature is the first impeller tip diameter. The described method makes centrifugal compressor surveillance activities more accurate, enabling precise isolation of problems affecting the machine's performance.
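    The rescaling idea behind such a tool can be sketched as follows: convert a design speedline to non-dimensional flow and head coefficients, then back-calculate a dimensional curve at new conditions within the design Mach-number range. All numbers, gas properties, and the tip diameter below are made up for illustration; the paper's actual procedure may differ in detail.

```python
import numpy as np

# Illustrative non-dimensional rescaling of a compressor speedline
# (hypothetical values; D_TIP and the design curve are assumptions).
D_TIP = 0.4                       # first-impeller tip diameter, m
GAMMA, R, Z = 1.30, 488.0, 0.95   # assumed gas properties at inlet

def sound_speed(T_K):
    return np.sqrt(GAMMA * Z * R * T_K)

def to_nondimensional(q_m3s, head_jkg, rpm):
    """Speedline -> (flow coefficient, head coefficient, tip speed)."""
    u = np.pi * D_TIP * rpm / 60.0           # tip speed, m/s
    phi = q_m3s / (u * D_TIP ** 2)           # flow coefficient
    psi = head_jkg / u ** 2                  # polytropic head coefficient
    return phi, psi, u

def from_nondimensional(phi, psi, rpm):
    """Invariant curve -> dimensional speedline at a new speed."""
    u = np.pi * D_TIP * rpm / 60.0
    return phi * u * D_TIP ** 2, psi * u ** 2  # (m3/s, J/kg)

# Design speedline at 9000 rpm (made-up points).
q = np.array([2.0, 2.5, 3.0, 3.5])           # m3/s
hp = np.array([62e3, 60e3, 56e3, 49e3])      # J/kg

phi, psi, u_design = to_nondimensional(q, hp, 9000.0)
mach_design = u_design / sound_speed(300.0)

# Back-calculate the curve for new inlet conditions / speed; the similarity
# argument only holds while the machine Mach number stays near design.
q_new, hp_new = from_nondimensional(phi, psi, 8500.0)
mach_new = (np.pi * D_TIP * 8500.0 / 60.0) / sound_speed(315.0)
print(f"design Mu = {mach_design:.2f}, new Mu = {mach_new:.2f}")
print(np.round(q_new, 2), np.round(hp_new / 1e3, 1))
```

    Note how only the tip diameter enters the non-dimensionalization, which matches the abstract's claim that no detailed impeller geometry is needed.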

  15. Advanced hydrologic prediction system

    NASA Astrophysics Data System (ADS)

    Connelly, Brian A.; Braatz, Dean T.; Halquist, John B.; Deweese, Michael M.; Larson, Lee; Ingram, John J.

    1999-08-01

    As our Nation's population and infrastructure grow, natural disasters are becoming a greater threat to our society's stability. In an average year, inland flooding claims 133 lives, and the resulting property losses exceed $4.0 billion. Last year, 1997, these losses totaled $8.7 billion. Because of this growing threat, the National Weather Service (NWS) has requested funding within its 2000 budget to begin national implementation of the Advanced Hydrologic Prediction System (AHPS). With this system in place, the NWS will be able to utilize precipitation and climate predictions to provide extended probabilistic river forecasts for risk-based decisions. In addition to flood and drought mitigation benefits, extended river forecasts will benefit water resource managers in decision making regarding water supply, agriculture, navigation, hydropower, and ecosystems. It is estimated that AHPS, if implemented nationwide, would save lives and provide $677 million per year in economic benefits. AHPS is currently used on the Des Moines River basin in Iowa and will be implemented soon on the Minnesota River basin in Minnesota. Experience gained from user interaction is leading to refined and enhanced product formats and displays. This discussion will elaborate on the technical requirements associated with AHPS implementation, its enhanced products and informational displays, and further refinements based on customer feedback.
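    Extended probabilistic river forecasts of the kind AHPS provides are typically summarized as exceedance probabilities over an ensemble of forecast traces. A minimal sketch, with hypothetical ensemble values, flood stage, and risk threshold:

```python
import numpy as np

rng = np.random.default_rng(1)

# 50 hypothetical ensemble traces of peak river stage (feet) at one forecast
# point, standing in for the traces an ensemble streamflow prediction yields.
peak_stage = rng.gamma(shape=9.0, scale=2.0, size=50)

def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble members whose peak exceeds the threshold."""
    return np.mean(ensemble > threshold)

FLOOD_STAGE = 21.0    # assumed flood stage for this gauge
p_flood = exceedance_probability(peak_stage, FLOOD_STAGE)
print(f"P(peak stage > {FLOOD_STAGE} ft) = {p_flood:.0%}")

# A risk-based decision rule: act when the chance of flooding passes a
# user-chosen tolerance.
RISK_TOLERANCE = 0.25
print("issue flood outlook" if p_flood >= RISK_TOLERANCE else "no action")
```

    The same ensemble supports any quantity of interest (volume, timing, low flow), which is what makes the probabilistic product useful across flood, drought, and water-supply decisions.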

  16. A linear regression model for predicting PNW estuarine temperatures in a changing climate

    EPA Science Inventory

    Pacific Northwest coastal regions, estuaries, and associated ecosystems are vulnerable to the potential effects of climate change, especially to changes in nearshore water temperature. While predictive climate models simulate future air temperatures, no such projections exist for...

  17. Germination prediction from soil moisture and temperature data across the Great Basin

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Preventing cheatgrass (Bromus tectorum) dominance associated with frequent wildfires depends, in part, on successful establishment of desirable species sown in fire rehabilitation and fuel control projects. We tested the effects of fire, herbicide applications, and mechanical treatments on predicte...

  18. Predicted phototoxicities of carbon nano-material by quantum mechanical calculations.

    EPA Science Inventory

    The basis of this research is obtaining the best quantum mechanical structure of carbon nanomaterials and is fundamental in determining their other properties. Therefore, their predictive phototoxicity is directly related to the materials’ structure. The results of this project w...

  19. Operational waste volume projection

    SciTech Connect

    Koreski, G.M.; Strode, J.N.

    1995-06-01

    Waste receipts to the double-shell tank system are analyzed and wastes through the year 2015 are projected based on generation trends of the past 12 months. A computer simulation of site operations is performed, which results in projections of tank fill schedules, tank transfers, evaporator operations, tank retrieval, and aging waste tank usage. This projection incorporates current budget planning and the clean-up schedule of the Tri-Party Agreement. Assumptions are current as of June 1995.
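    A projection of this kind can be sketched as a simple fill-schedule simulation: extrapolate the trailing waste-generation trend and compute when each tank's remaining capacity is exhausted. Tank names, capacities, and the rate below are hypothetical, not actual Hanford values.

```python
import datetime as dt

# Toy projection of double-shell tank fill dates from a trailing generation
# trend (all capacities, rates, and dates are made up for illustration).
GENERATION_RATE = 1200.0          # gal/day, from the prior 12-month trend
tanks = [("AW-101", 450_000.0),   # (tank name, remaining capacity in gal)
         ("AW-102", 980_000.0),
         ("AN-103", 1_160_000.0)]

def project_fill_schedule(start, rate, tanks):
    """Fill tanks in sequence at a constant rate; return (tank, full-date)."""
    schedule, day = [], start
    for name, free_gal in tanks:
        day += dt.timedelta(days=free_gal / rate)
        schedule.append((name, day.date()))
    return schedule

schedule = project_fill_schedule(dt.datetime(1995, 7, 1),
                                 GENERATION_RATE, tanks)
for name, full_on in schedule:
    print(f"{name} projected full on {full_on}")
```

    The real simulation layers transfers, evaporator campaigns, and retrieval onto this bookkeeping, but the core is the same capacity-versus-rate arithmetic.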

  20. Operational waste volume projection

    SciTech Connect

    Koreski, G.M.

    1996-09-20

    Waste receipts to the double-shell tank system are analyzed and wastes through the year 2015 are projected based on generation trends of the past 12 months. A computer simulation of site operations is performed, which results in projections of tank fill schedules, tank transfers, evaporator operations, tank retrieval, and aging waste tank usage. This projection incorporates current budget planning and the clean-up schedule of the Tri-Party Agreement. Assumptions were current as of June 1996.

  1. Computer Assets Recovery Project

    NASA Technical Reports Server (NTRS)

    CortesPena, Aida Yoguely

    2010-01-01

    This document reports on the project that was performed during the internship of the author. The project involved locating and recovering machines in various locations that Boeing has no need for, which therefore must be transferred to another user or to a non-profit organization. Other projects the author performed included an inventory of toner and printers, loading new computers, and connecting them to the network.

  2. Operational Waste Volume Projection

    SciTech Connect

    STRODE, J.N.

    1999-08-24

    Waste receipts to the double-shell tank system are analyzed and wastes through the year 2018 are projected based on assumptions as of July 1999. A computer simulation of site operations is performed, which results in projections of tank fill schedules, tank transfers, evaporator operations, tank retrieval, and aging waste tank usage. This projection incorporates current budget planning and the clean-up schedule of the Tri-Party Agreement.

  3. Spartan Project Overview

    NASA Technical Reports Server (NTRS)

    Carson, Donald E.

    1999-01-01

    The Spartan Project is the result of Office of Space Science requirements for a transition capability between sounding rockets and orbital missions. The project started early in the Shuttle program and drew from suborbital program designs, GAS programs, and existing Marshall Space Flight Center bridge and attach mechanisms. Features include reusable Shuttle-based carriers. Spartan is an in-house project drawing support from a mix of support service contractors and matrixed discipline support from Goddard Space Flight Center organizations.

  4. Microwave solidification project overview

    SciTech Connect

    Sprenger, G.

    1993-01-01

    The Rocky Flats Plant Microwave Solidification Project has application potential to the Mixed Waste Treatment Project and the Mixed Waste Integrated Program. The technical areas being addressed include (1) waste destruction and stabilization; (2) final waste form; and (3) front-end waste handling and feed preparation. This document covers the need for such a program; technology description; significance; regulatory requirements; and accomplishments to date. A list of significant reports published under this project is included.

  5. KSC History Project

    NASA Technical Reports Server (NTRS)

    Snaples, Lee

    2001-01-01

    The project is a joint endeavor between Dr. Henry Dethloff and myself and is producing a number of products related to KSC history. This report is a summary of those projects. First, there is an overview monograph covering KSC history. Second, there is a chapter outline for an eventual book-length history. Third, there is monograph on safety at KSC. Finally, there is a web page and database dedicated to the KSC oral history project.

  6. Collaborative Research: Separating Forced and Unforced Decadal Predictability in Models and Observations

    SciTech Connect

    Tippett, Michael K.

    2014-04-09

    This report is a progress report of the accomplishments of the research grant “Collaborative Research: Separating Forced and Unforced Decadal Predictability in Models and Observations” during the period 1 May 2011 - 31 August 2013. This project is a collaborative one between Columbia University and George Mason University. George Mason University will submit a final technical report at the conclusion of their no-cost extension. The purpose of the proposed research is to identify unforced predictable components on decadal time scales, distinguish these components from forced predictable components, and to assess the reliability of model predictions of these components. Components of unforced decadal predictability will be isolated by maximizing the Average Predictability Time (APT) in long, multimodel control runs from state-of-the-art climate models. Components with decadal predictability have large APT, so maximizing APT ensures that components with decadal predictability will be detected. Optimal fingerprinting techniques, as used in detection and attribution analysis, will be used to separate variations due to natural and anthropogenic forcing from those due to unforced decadal predictability. This methodology will be applied to the decadal hindcasts generated by the CMIP5 project to assess the reliability of model projections. The question of whether anthropogenic forcing changes decadal predictability, or gives rise to new forms of decadal predictability, also will be investigated.
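    The claim that components with decadal predictability have large Average Predictability Time (APT) can be illustrated in the univariate case, where APT reduces to roughly twice the sum of squared lagged autocorrelations (a simplified, univariate stand-in for the multivariate APT optimization the grant describes):

```python
import numpy as np

rng = np.random.default_rng(3)

def ar1(phi, n):
    """Simulate a first-order autoregressive process."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def apt(x, max_lag=50):
    """Univariate APT estimate: ~2 * sum of squared autocorrelations."""
    x = x - x.mean()
    var = x @ x / len(x)
    rho = np.array([(x[:-k] @ x[k:]) / (len(x) * var)
                    for k in range(1, max_lag)])
    return 2.0 * np.sum(rho ** 2)

slow = ar1(0.9, 5000)   # slowly varying, decadal-like component
fast = ar1(0.2, 5000)   # weather-like noise component
print(f"APT slow: {apt(slow):.1f}, APT fast: {apt(fast):.1f}")
```

    Maximizing APT over linear combinations of model fields, as the project proposes, picks out exactly the slowly varying components this toy example rewards.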

  7. Change in avian abundance predicted from regional forest inventory data

    USGS Publications Warehouse

    Twedt, Daniel J.; Tirpak, John M.; Jones-Farrand, D. Todd; Thompson, Frank R.; Uihlein, William B.; Fitzgerald, Jane A.

    2010-01-01

    An inability to predict population response to future habitat projections is a shortcoming in bird conservation planning. We sought to predict avian response to projections of future forest conditions that were developed from nationwide forest surveys within the Forest Inventory and Analysis (FIA) program. To accomplish this, we evaluated the historical relationship between silvicolous bird populations and FIA-derived forest conditions within 25 ecoregions that comprise the southeastern United States. We aggregated forest area by forest ownership, forest type, and tree size-class categories in county-based ecoregions for 5 time periods spanning 1963-2008. We assessed the relationship of forest data with contemporaneous indices of abundance for 24 silvicolous bird species that were obtained from Breeding Bird Surveys. Relationships between bird abundance and forest inventory data for 18 species were deemed sufficient as predictive models. We used these empirically derived relationships between regional forest conditions and bird populations to predict relative changes in abundance of these species within ecoregions that are anticipated to coincide with projected changes in forest variables through 2040. Predicted abundances of these 18 species are expected to remain relatively stable in over a quarter (27%) of the ecoregions. However, change in forest area and redistribution of forest types will likely result in changed abundance of some species within many ecosystems. For example, abundances of 11 species, including pine warbler (Dendroica pinus), brown-headed nuthatch (Sitta pusilla), and chuck-will's-widow (Caprimulgus carolinensis), are projected to increase within more ecoregions than ecoregions where they will decrease. For 6 other species, such as blue-winged warbler (Vermivora pinus), Carolina wren (Thryothorus ludovicianus), and indigo bunting (Passerina cyanea), we projected abundances will decrease within more ecoregions than ecoregions where they will
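    The empirical approach described here (regress abundance indices on forest-inventory variables across historical periods, then apply the fitted relationship to projected forest conditions) can be sketched as follows. All values are hypothetical, not FIA or BBS data.

```python
import numpy as np

# Hypothetical ecoregion records: forest area (thousands of ha) by type across
# five inventory periods, with a Breeding Bird Survey abundance index.
pine_area = np.array([120., 135., 150., 160., 170.])
hardwood  = np.array([300., 290., 275., 260., 250.])
abundance = np.array([4.1, 4.6, 5.2, 5.5, 5.9])    # e.g. a pine-associated species

# Fit abundance ~ forest composition by ordinary least squares.
A = np.column_stack([np.ones(5), pine_area, hardwood])
beta, *_ = np.linalg.lstsq(A, abundance, rcond=None)

# Apply the fitted relationship to a projected 2040 forest condition.
projected_2040 = np.array([1.0, 185., 235.])       # [intercept, pine, hardwood]
pred_2040 = projected_2040 @ beta
rel_change = (pred_2040 - abundance[-1]) / abundance[-1]
print(f"predicted 2040 index: {pred_2040:.2f} ({rel_change:+.0%} vs latest)")
```

    Repeating this per species and per ecoregion, then tallying where predicted abundance rises or falls, reproduces the kind of increase/decrease bookkeeping the abstract reports.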

  8. Advancing Drought Understanding, Monitoring and Prediction

    NASA Technical Reports Server (NTRS)

    Mariotti, Annarita; Schubert, Siegfried D.; Mo, Kingtse; Peters-Lidard, Christa; Wood, Andy; Pulwarty, Roger; Huang, Jin; Barrie, Dan

    2013-01-01

    , focused and coordinated research efforts are needed, drawing from excellence across the broad drought research community. To meet this challenge, National Oceanic and Atmospheric Administration (NOAA)'s Drought Task Force was established in October 2011 with the ambitious goal of achieving significant new advances in the ability to understand, monitor, and predict drought over North America. The Task Force (duration of October 2011-September 2014) is an initiative of NOAA's Climate Program Office Modeling, Analysis, Predictions, and Projections (MAPP) program in partnership with NIDIS. It brings together over 30 leading MAPP-funded drought scientists from multiple academic and federal institutions [involves scientists from NOAA's research laboratories and centers, the National Aeronautics and Space Administration (NASA), U.S. Department of Agriculture, National Center for Atmospheric Research (NCAR), and many universities] in a concerted research effort that builds on individual MAPP research projects. These projects span the wide spectrum of drought research needed to make fundamental advances, from those aimed at the basic understanding of drought mechanisms to those aimed at testing new drought monitoring and prediction tools for operational and service purposes (as part of NCEP's Climate Test Bed). The Drought Task Force provides focus and coordination to MAPP drought research activities and also facilitates synergies with other national and international drought research efforts, including those by the GDIS.

  9. Bayesian probabilistic population projections for all countries

    PubMed Central

    Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.

    2012-01-01

    Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249
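    The cohort component projection model at the core of the method can be sketched deterministically; in the Bayesian version, the fertility and survival inputs below would be drawn from posterior samples to yield a distribution of trajectories. All rates here are illustrative (one sex, no migration, no open-ended age group).

```python
import numpy as np

# Minimal deterministic cohort-component projection with 5-year age groups
# and 5-year time steps (all rates are made up, not UN data).
ages      = ["0-4", "5-9", "10-14", "15-19", "20-24"]
pop       = np.array([100., 95., 90., 85., 80.])    # thousands
survival  = np.array([0.995, 0.998, 0.997, 0.996])  # group i -> group i+1
fertility = np.array([0.0, 0.0, 0.02, 0.30, 0.25])  # births per capita per step

def project_one_step(pop):
    nxt = np.empty_like(pop)
    nxt[1:] = pop[:-1] * survival       # age each cohort forward
    births = np.sum(pop * fertility)    # births from age-specific rates
    nxt[0] = births * 0.97              # survive newborns into ages 0-4 (assumed)
    return nxt

pop_2030 = project_one_step(project_one_step(pop))  # two 5-year steps
print(np.round(pop_2030, 1))
```

    Running this step over thousands of sampled fertility and mortality trajectories, rather than one deterministic set, is what turns the classical machinery into a probabilistic projection.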

  10. Ceramic Technology for Advanced Heat Engines Project

    SciTech Connect

    Not Available

    1990-08-01

    The Ceramic Technology For Advanced Heat Engines Project was developed by the Department of Energy's Office of Transportation Systems (OTS) in Conservation and Renewable Energy. This project, part of the OTS's Advanced Materials Development Program, was developed to meet the ceramic technology requirements of the OTS's automotive technology programs. Significant accomplishments in fabricating ceramic components for the Department of Energy (DOE), National Aeronautics and Space Administration (NASA), and Department of Defense (DOD) advanced heat engine programs have provided evidence that the operation of ceramic parts in high-temperature engine environments is feasible. However, these programs have also demonstrated that additional research is needed in materials and processing development, design methodology, and data base and life prediction before industry will have a sufficient technology base from which to produce reliable cost-effective ceramic engine components commercially. An assessment of needs was completed, and a five year project plan was developed with extensive input from private industry. The objective of the project is to develop the industrial technology base required for reliable ceramics for application in advanced automotive heat engines. The project approach includes determining the mechanisms controlling reliability, improving processes for fabricating existing ceramics, developing new materials with increased reliability, and testing these materials in simulated engine environments to confirm reliability. Although this is a generic materials project, the focus is on structural ceramics for advanced gas turbine and diesel engines, ceramic bearings and attachments, and ceramic coatings for thermal barrier and wear applications in these engines.

  11. Other School Projects.

    ERIC Educational Resources Information Center

    Learning By Design, 2001

    2001-01-01

    Highlights selected construction projects for learning centers, early childhood and development schools, and special purpose educational facilities that have won the Learning By Design Awards for 2001. (GR)

  12. Manpower and project planning

    NASA Technical Reports Server (NTRS)

    Johnson, David W.

    1991-01-01

    The purpose was to study how manpower and projects are planned at the Facilities Engineering Division (FENGD) within the Systems Engineering and Operations Directorate of the LaRC and to make recommendations for improving the effectiveness and productivity of the tools that are used. The existing manpower and project planning processes (including the management plan for the FENGD, existing manpower planning reports, project reporting to LaRC and NASA Headquarters, employee time reporting, financial reporting, and coordination/tracking reports for procurement) were discussed with several people, and project planning software was evaluated.

  13. FLEXI Project Management Survey

    NASA Astrophysics Data System (ADS)

    Rohunen, Anna; Krzanik, Lech; Kuvaja, Pasi; Similä, Jouni; Rodriguez, Pilar; Hyysalo, Jarkko; Linna, Tommi

    FLEXI Project Management Survey (FLEXI PMS) has been established to gain detailed knowledge on how the software industry - in particular successful companies - manages agile software development. FLEXI PMS investigates the actual agile values, principles, practices and contexts. The survey is supported by a careful literature review and analysis of existing studies. Special attention is attached to large, multi-site, multi-company and distributed projects - the target area of FLEXI project. The survey is intended to provide solid data for further knowledge acquisition and project/company positioning with regard to feasible agile management practices.

  14. KSC History Project

    NASA Technical Reports Server (NTRS)

    Dethloff, Henry C.

    2001-01-01

    The KSC History Project focuses on archival research and oral history interviews on the history of Kennedy Space Center (KSC). Related projects include the preparation of a precis and chapter outline for a proposed book-length narrative history, a bibliography of key primary and secondary resources, a brief monograph overview of the history of KSC, and a monograph on the history of safety at the Center. Finally, there is work on the development of a web page and a personal history data base associated with the oral history project. The KSC History Project has been a joint endeavor between Henry C. Dethloff and Dr. Noble Lee Snaples, Jr.

  15. Project Worm Bin.

    ERIC Educational Resources Information Center

    McGuire, Daniel C.

    1987-01-01

    Describes a project centering around earthworm activity in a compost bin. Includes suggestions for exercises involving biological and conservation concepts, gardening skills, and dramatical presentations. (ML)

  16. Project Risk Management

    NASA Technical Reports Server (NTRS)

    Jr., R. F. Miles

    1995-01-01

    Project risk management is primarily concerned with performance, reliability, cost, and schedule. Environmental risk management is primarily concerned with human health and ecological hazards and likelihoods. This paper discusses project risk management and compares it to environmental risk management, both with respect to goals and implementation. The approach of the Jet Propulsion Laboratory to risk management is presented as an example of a project risk management approach that is an extension to NASA NHB 7120.5: Management of Major System Programs and Projects.

  17. Prediction and predictability of North American seasonal climate variability

    NASA Astrophysics Data System (ADS)

    Infanti, Johnna M.

    Climate prediction on short time-scales such as months to seasons is of broad and current interest in the scientific research community. Monthly and seasonal climate prediction of variables such as precipitation, temperature, and sea surface temperature (SST) has implications for users in the agricultural and water management domains, among others. It is thus important to further understand the complexities of prediction of these variables using the most recent practices in climate prediction. The overarching goal of this dissertation is to determine the important contributions to seasonal prediction skill, predictability, and variability over North America using current climate prediction models and approaches. This dissertation aims to study a variety of approaches to seasonal climate prediction of variables over North America, including both climate prediction systems and methods of analysis. We utilize the North American Multi-Model Ensemble (NMME) System for Intra-Seasonal to Inter-Annual Prediction (ISI) to study seasonal climate prediction skill over North America and, in particular, for southeast US precipitation. We find that NMME results are often equal to or better than individual model results in terms of skill, as expected, making it a reasonable choice for southeast US seasonal climate predictions. However, climate models, including those involved in NMME, typically overestimate eastern Pacific warming during central Pacific El Nino events, which can affect regions that are influenced by teleconnections, such as the southeast US. Community Climate System Model version 4.0 (CCSM4) hindcasts and forecasts are included in NMME, and we perform a series of experiments that examine contributions to skill from certain drivers of North American climate prediction. The drivers we focus on are sea surface temperatures (SSTs) and their accuracy, land and atmosphere initialization, and ocean-atmosphere coupling. We compare measures of prediction skill of
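    The finding that the multi-model ensemble is often at least as skillful as its individual members can be illustrated with synthetic hindcasts, where averaging models cancels uncorrelated errors. Anomaly correlation is used as the skill measure; the data are simulated, not NMME output.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic hindcasts: 3 "models" forecasting 40 seasonal precipitation
# anomalies that share a common predictable signal plus model-specific noise.
n_years = 40
signal = rng.normal(size=n_years)
obs = signal + 0.6 * rng.normal(size=n_years)
models = np.stack([signal + 0.9 * rng.normal(size=n_years) for _ in range(3)])

def anomaly_correlation(fcst, obs):
    f, o = fcst - fcst.mean(), obs - obs.mean()
    return (f @ o) / np.sqrt((f @ f) * (o @ o))

single = [anomaly_correlation(m, obs) for m in models]
multi = anomaly_correlation(models.mean(axis=0), obs)
print("individual skill:", np.round(single, 2))
print("multi-model mean skill:", round(multi, 2))
```

    Averaging three models reduces the uncorrelated noise variance by roughly a factor of three while leaving the common signal intact, which is the statistical rationale behind NMME.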

  19. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community, so that we can predict the applicability of those technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify trends in scientific expectations and algorithmic requirements, and the capability of high-performance computers to satisfy this anticipated need.

  20. Basalt Waste Isolation Project Reclamation Support Project:

    SciTech Connect

    Brandt, C.A.; Rickard, W.H. Jr.; Cadoret, N.A.

    1992-06-01

    The Basalt Waste Isolation Project (BWIP) Reclamation Support Project began in the spring of 1988 by categorizing sites distributed during operations of the BWIP into those requiring revegetation and those to be abandoned or transferred to other programs. The Pacific Northwest Laboratory's role in this project was to develop plans for reestablishing native vegetation on the first category of sites, to monitor the implementation of these plans, to evaluate the effectiveness of these efforts, and to identify remediation methods where necessary. The Reclamation Support Project focused on three major areas: geologic and hydrologic boreholes, the Exploratory Shaft Facility (ESF), and the Near-Surface Test Facility (NSTF). A number of BWIP reclamation sites seeded between 1989 and 1990 were found to be far below reclamation objectives. These sites were remediated in 1991 using various seedbed treatments designed to rectify problems with water-holding capacity, herbicide activity, surficial crust formation, and nutrient imbalances. Remediation was conducted during November and early December 1991. Sites were examined on a monthly basis thereafter to evaluate plant growth responses to these treatments. At all remediation sites, early plant growth far exceeded any previously obtained using other methods and seedbed treatments. Seeded plants did best where amendments consisted of soil-plus-compost or fertilizer-only. Vegetation growth on Gable Mountain was less than that found on other areas nearby, but this difference is attributed primarily to the site's altitude and north-facing orientation.

  1. The International Negotiation Seminars Project. Project ICONS.

    ERIC Educational Resources Information Center

    Wilkenfeld, Jonathan; Kaufman, Joyce; Starkey, Brigid

    This report of a study at the University of Maryland describes an international, interactive, and interdisciplinary project for first- and second-year students, which combines a large lecture format with small-group, seminar-type sessions organized around a computer-assisted simulation model, the International Communication and Negotiation…

  2. River Protection Project (RPP) Project Management Plan

    SciTech Connect

    NAVARRO, J.E.

    2001-03-07

    The Office of River Protection (ORP) Project Management Plan (PMP) for the River Protection Project (RPP) describes the process for developing and operating a Waste Treatment Complex (WTC) to clean up Hanford Site tank waste. The Plan describes the scope of the project, the institutional setting within which the project must be completed, and the management processes and structure planned for implementation. The Plan is written from the perspective of the ORP as the taxpayers' representative. The Hanford Site, in southeastern Washington State, has one of the largest concentrations of radioactive waste in the world, as a result of producing plutonium for national defense for more than 40 years. Approximately 53 million gallons of waste stored in 177 aging underground tanks represent major environmental, social, and political challenges for the U.S. Department of Energy (DOE). These challenges require numerous interfaces with state and federal environmental officials, Tribal Nations, stakeholders, Congress, and the US Department of Energy-Headquarters (DOE-HQ). The cleanup of the Site's tank waste is a national issue with the potential for environmental and economic impacts to the region and the nation.

  3. Predicting mud toxicity

    SciTech Connect

    Bleler, R.

    1991-10-01

    Acute toxicity of drilling muds is measured in the U.S. by the mysid shrimp test. Drilling muds that fail the test cannot be discharged into the Gulf of Mexico, and such muds and their cuttings must be brought onshore for disposal. Discharge of water-based muds that pass the test is permitted in most instances. Because of the economic implications associated with hauling cuttings and fluids, a model that predicts test results on the basis of mud composition is clearly desirable. This paper focuses on the modeling of mysid shrimp test data. European laboratories use different test species and procedures. It seems plausible to expect, however, that the line of reasoning used here could apply to the modeling of aquatic data on other test species once a sufficient quantity of such data becomes available.
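The kind of composition-based model the author calls for can be sketched as a simple regression: fit measured log-LC50 values against additive concentrations, then flag muds predicted to fall below a discharge threshold. The component names, coefficients, synthetic data, and the 30,000-ppm cutoff used here are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical training set: 40 muds, 3 additive concentrations each
# (e.g. lubricant, thinner, barite), with log10(LC50) responses
# generated from an assumed linear relation plus measurement noise.
n = 40
composition = rng.uniform(0, 5, size=(n, 3))
true_coeffs = np.array([-0.4, -0.1, 0.05])
log_lc50 = 4.8 + composition @ true_coeffs + rng.normal(0, 0.05, n)

# Fit log10(LC50) ~ composition by ordinary least squares.
X = np.column_stack([composition, np.ones(n)])
fit, *_ = np.linalg.lstsq(X, log_lc50, rcond=None)

# Predict a new mud and compare against an illustrative pass/fail
# threshold on LC50 (ppm).
new_mud = np.array([3.0, 1.0, 2.0])
predicted_lc50 = 10 ** (np.append(new_mud, 1.0) @ fit)
passes = predicted_lc50 >= 30_000
print(predicted_lc50, passes)
```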

  4. Requirements for Predictive Analytics

    SciTech Connect

    Troy Hiltbrand

    2012-03-01

    It is important to have a clear understanding of how traditional Business Intelligence (BI) and analytics are different and how they fit together in optimizing organizational decision making. With traditional BI, activities are focused primarily on providing context to enhance a known set of information through aggregation, data cleansing, and delivery mechanisms. As these organizations mature their BI ecosystems, they achieve a clearer picture of the key performance indicators signaling the relative health of their operations. Organizations that embark on activities surrounding predictive analytics and data mining go beyond simply presenting the data in a manner that will allow decision makers to have a complete context around the information. These organizations generate models based on known information and then apply other organizational data against these models to reveal unknown information.
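The distinction drawn here, generating a model from known information and then applying other organizational data against it to reveal unknown information, can be sketched in a few lines. The customer attributes and the least-squares model below are hypothetical stand-ins for whatever data and modeling technique an organization actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known information: two attributes per historical record
# (e.g. tenure, monthly usage) and an observed outcome (e.g. spend).
X_known = rng.uniform(0, 10, size=(50, 2))
y_known = 3.0 * X_known[:, 0] + 1.5 * X_known[:, 1] + rng.normal(0, 0.1, 50)

# "Generate a model based on known information": ordinary least squares.
X_design = np.column_stack([X_known, np.ones(len(X_known))])
coeffs, *_ = np.linalg.lstsq(X_design, y_known, rcond=None)

# "Apply other organizational data against the model" to estimate
# the unknown outcome for new records.
X_new = np.array([[4.0, 2.0], [9.0, 7.0]])
predictions = np.column_stack([X_new, np.ones(len(X_new))]) @ coeffs
print(predictions)
```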

  5. Model predicts global warming

    NASA Astrophysics Data System (ADS)

    Wainger, Lisa A.

    Global greenhouse warming will be clearly identifiable by the 1990s, according to eight scientists who have been studying climate changes using computer models. Researchers at NASA's Goddard Space Flight Center, Goddard Institute for Space Studies, New York, and the Massachusetts Institute of Technology, Cambridge, say that by the 2010s, most of the globe will be experiencing “substantial” warming. The level of warming will depend on amounts of trace gases, or greenhouse gases, in the atmosphere. Predictions for the next 70 years are based on computer simulations of Earth's climate. In three runs of the model, James Hansen and his colleagues looked at the effects of changing amounts of atmospheric gases with time.

  6. ECLSS predictive monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, Richard J.; Chien, Steve A.

    1991-01-01

    On Space Station Freedom (SSF), design iterations have made clear the need to keep the sensor complement small. Along with the unprecedented duration of the mission, it is imperative that decisions regarding placement of sensors be carefully examined and justified during the design phase. In the ECLSS Predictive Monitoring task, we are developing AI-based software to enable design engineers to evaluate alternate sensor configurations. Based on techniques from model-based reasoning and information theory, the software tool makes explicit the quantitative tradeoffs among competing sensor placements and helps designers explore and justify placement decisions. This work is being applied to the Environmental Control and Life Support System (ECLSS) testbed at MSFC to assist design personnel in placing sensors for test purposes, to evaluate baseline configurations, and ultimately to select advanced life support system technologies for an evolutionary SSF.
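One information-theoretic criterion of the kind described, choosing the sensor whose reading is expected to remove the most uncertainty about which fault occurred, can be sketched as follows. The fault modes, sensor names, and readings are invented for illustration and are not drawn from the ECLSS testbed.

```python
import math

# Hypothetical fault hypotheses and candidate sensors. Each sensor is
# modeled by the discrete reading it would produce under each fault,
# which partitions the hypothesis set.
faults = ["pump_fail", "valve_stuck", "filter_clog", "leak"]
sensors = {
    "flow_A":  {"pump_fail": 0, "valve_stuck": 0, "filter_clog": 1, "leak": 1},
    "press_B": {"pump_fail": 0, "valve_stuck": 1, "filter_clog": 0, "leak": 1},
    "temp_C":  {"pump_fail": 0, "valve_stuck": 0, "filter_clog": 0, "leak": 1},
}

def expected_entropy(sensor, hypotheses):
    """Expected posterior entropy (bits) over equally likely hypotheses
    after observing this sensor's reading."""
    groups = {}
    for h in hypotheses:
        groups.setdefault(sensor[h], []).append(h)
    n = len(hypotheses)
    return sum(len(g) / n * math.log2(len(g)) for g in groups.values())

# Greedy choice: the sensor whose reading removes the most uncertainty
# (lowest expected remaining entropy).
best = min(sensors, key=lambda s: expected_entropy(sensors[s], faults))
print(best)
```

A full placement tool would repeat this greedy step, conditioning on sensors already chosen, and trade the information gained against sensor cost and accessibility.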

  7. Motor degradation prediction methods

    SciTech Connect

    Arnold, J.R.; Kelly, J.F.; Delzingaro, M.J.

    1996-12-01

    Motor Operated Valve (MOV) squirrel cage AC motor rotors are susceptible to degradation under certain conditions. Premature failure can result from high humidity/temperature environments, high running load conditions, extended periods at locked rotor conditions (i.e. > 15 seconds), or exceeding the motor's duty cycle by frequent starts or multiple valve stroking. Exposure to high heat and moisture due to packing leaks, pressure seal ring leakage or other causes can significantly accelerate the degradation. ComEd and Liberty Technologies have worked together to provide and validate a non-intrusive method using motor power diagnostics to evaluate MOV rotor condition and predict failure. These techniques have provided a quick, low radiation dose method to evaluate inaccessible motors, identify degradation and allow scheduled replacement of motors prior to catastrophic failures.

  8. Predicting the earth's future

    NASA Technical Reports Server (NTRS)

    Dutton, J. A.

    1986-01-01

    The development of earth system models that will simulate the past and present and provide predictions of future conditions is essential now that human activities have the potential to induce changes in the planetary environment. Critical aspects of global change include its pervasiveness and ubiquity, its distribution in several distinct time-scale bands, and the interactions between the atmosphere, ocean, land surface, and the terrestrial and marine biospheres. A model of the earth system on the scale of decades to centuries, developed by the Earth System Science Committee (NASA) with the strategy of dividing by time scale rather than discipline, is presented and the requirements for observations to support the implementation of the model are reviewed.

  9. A Bayesian ensemble approach for epidemiological projections.

    PubMed

    Lindström, Tom; Tildesley, Michael; Webb, Colleen

    2015-04-01

    Mathematical models are powerful tools for epidemiology and can be used to compare control actions. However, different models and model parameterizations may provide different predictions of outcomes. In other fields of research, ensemble modeling has been used to combine multiple projections. We explore the possibility of applying such methods to epidemiology by adapting Bayesian techniques developed for climate forecasting. We exemplify the implementation with single-model ensembles based on different parameterizations of the Warwick model run for the 2001 United Kingdom foot and mouth disease outbreak and compare the efficacy of different control actions. This allows us to investigate the effect that discrepancy among projections based on different modeling assumptions has on the ensemble prediction. A sensitivity analysis showed that the choice of prior can have a pronounced effect on the posterior estimates of quantities of interest, in particular for ensembles with large discrepancy among projections. However, by using a hierarchical extension of the method we show that prior sensitivity can be circumvented. We further extend the method to include a priori beliefs about different modeling assumptions and demonstrate that the effect of this can have different consequences depending on the discrepancy among projections. We propose that the method is a promising analytical tool for ensemble modeling of disease outbreaks.
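A minimal sketch of the core idea, weighting each model's projection by its posterior probability given observed data (Bayesian model averaging), is shown below. The projections, Gaussian likelihood, and noise level are illustrative assumptions; the paper's hierarchical treatment of priors is not reproduced.

```python
import numpy as np

# Illustrative Bayesian model averaging over three hypothetical outbreak
# projections, weighted by Gaussian likelihood against observed counts.
observed = np.array([10.0, 22.0, 41.0, 80.0])
projections = np.array([
    [12.0, 20.0, 45.0, 78.0],   # model A
    [ 8.0, 30.0, 60.0, 120.0],  # model B
    [11.0, 21.0, 40.0, 82.0],   # model C
])

sigma = 5.0                      # assumed observation noise (cases)
log_lik = -0.5 * ((projections - observed) ** 2).sum(axis=1) / sigma**2
prior = np.full(len(projections), 1 / len(projections))  # uniform prior

# Posterior weights, normalized in log space for numerical stability.
log_post = np.log(prior) + log_lik
weights = np.exp(log_post - log_post.max())
weights /= weights.sum()

ensemble = weights @ projections  # weighted ensemble projection
print(weights, ensemble)
```

A large discrepancy among projections shows up here directly: a projection far from the data receives a vanishing weight, which is why the choice of prior and noise model matters so much in the real method.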

  10. Understanding and Predicting Decadal Coastal Evolution

    NASA Astrophysics Data System (ADS)

    Nicholls, Robert J.

    2016-04-01

    Coastal management requires understanding and prognosis of decadal coastal evolution. This evolution is sensitive to climate change among other drivers. The iCOASST project has developed new and improved methods to understand and predict such changes, with coastal erosion and flood risk management as the application. The project is based on an integrated framework that links several components to develop a system-level understanding of this change as follows: (1) new methods for system-level analysis and mapping of coast, estuary and inner shelf landform behaviour; (2) well validated 'bottom-up' hydrodynamic and sediment transport shelf models that can be applied at shelf scales to investigate inner shelf-coastal interactions; and (3) compositions of existing or new 'reduced complexity models' of selected coastal landforms and processes. The ability to link models and the availability of the data are also fundamental. The ultimate goal is multiple simulations of coastal evolution to explore uncertainties in future decadal-scale coastal response, including the effects of climate change and management choices. This paper reviews the achievements of this project, the lessons learnt, and the next research steps.

  11. Milford Visual Communications Project.

    ERIC Educational Resources Information Center

    Milford Exempted Village Schools, OH.

    This study discusses a visual communications project designed to develop activities to promote visual literacy at the elementary and secondary school levels. The project has four phases: (1) perception of basic forms in the environment, what these forms represent, and how they inter-relate; (2) discovery and communication of more complex…

  12. Avoiding Project Creep.

    ERIC Educational Resources Information Center

    Kennerknecht, Norbert J.; Scarnati, James T.

    1998-01-01

    Discusses how to keep school district capital-improvement projects within budget. Examines areas where runaway costs creep into a project and ways of cutting or lessening these costs, such as using standard agreements, controlling architect's expense reimbursements, developing a quality-control process, and reducing document duplication. (GR)

  13. Plant Biology Science Projects.

    ERIC Educational Resources Information Center

    Hershey, David R.

    This book contains science projects about seed plants that deal with plant physiology, plant ecology, and plant agriculture. Each of the projects includes a step-by-step experiment followed by suggestions for further investigations. Chapters include: (1) "Bean Seed Imbibition"; (2) "Germination Percentages of Different Types of Seeds"; (3)…

  14. Fundred Dollar Bill Project

    ERIC Educational Resources Information Center

    Rubin, Mary

    2009-01-01

    This article describes the Fundred Dollar Bill Project which is an innovative artwork made of millions of drawings. This creative collective action is intended to support Operation Paydirt, an extraordinary art/science project uniting three million children with educators, scientists, healthcare professionals, designers, urban planners, engineers,…

  15. Science Explorers Translation Project.

    ERIC Educational Resources Information Center

    Jacobs, Dolores

    This paper describes a pilot project of Los Alamos National Laboratory (New Mexico) to translate a science education curriculum for junior and senior high school students into Navajo. The project consisted of translating a video, a teacher's guide, and an interactive multimedia product on the 1993 hantavirus outbreak in the Four Corners area…

  16. The Titration Project.

    ERIC Educational Resources Information Center

    Kilner, Cary

    1988-01-01

    Discusses the development of concentration and organizational skills, patience, self-discipline, attention to detail, and appreciation for error analysis through an expanded titration project. Describes the various activities in the extended project and the materials and instructional support needed. Stresses the advantage to students in their…

  17. The Home Microbiome Project

    ScienceCinema

    Gilbert, Jack

    2016-07-12

    The Home Microbiome Project is an initiative aimed at uncovering the dynamic co-associations between people's bacteria and the bacteria found in their homes. The hope is that the data and project will show that routine monitoring of the microbial diversity of your body and of the environment in which you live is possible.

  18. The Moon Project

    ERIC Educational Resources Information Center

    Trundle, Kathy Cabe; Willmore, Sandra; Smith, Walter S.

    2006-01-01

    What Australia, Alaska, Qatar, Indiana, and Ohio have in common is the authentic writing More Observations Of Nature (MOON) project. In this unique project, teachers from these disparate geographic locations teamed up to instruct children in grades four through eight via the internet on a nearly universally challenging subject for teachers in the…

  19. The Hospital Project

    ERIC Educational Resources Information Center

    Sánchez, Xiomara

    2007-01-01

    This article describes an investigation of medical facilities undertaken by 3-, 4-, and 5-year-old children in a dual-language prekindergarten program in Chicago, Illinois. The article describes the phases of the project and documents the children's involvement in the project through anecdotal reports, photographs, and samples of the children's…

  20. The Mars Millennium Project.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The countdown to a new century provides a unique opportunity to engage America's youth in charting a course for the future. The Mars Millennium Project challenges students across the nation to design a community yet to be imagined for the planet Mars. This interdisciplinary learning project aims to encourage K-12 students in classrooms and youth…