Science.gov

Sample records for prediction project modificado

  1. Prediction in projection.

    PubMed

    Garland, Joshua; Bradley, Elizabeth

    2015-12-01

    Prediction models that capture and use the structure of state-space dynamics can be very effective. In practice, however, one rarely has access to full information about that structure, and accurate reconstruction of the dynamics from scalar time-series data—e.g., via delay-coordinate embedding—can be a real challenge. In this paper, we show that forecast models that employ incomplete reconstructions of the dynamics—i.e., models that are not necessarily true embeddings—can produce surprisingly accurate predictions of the state of a dynamical system. In particular, we demonstrate the effectiveness of a simple near-neighbor forecast technique that works with a two-dimensional time-delay reconstruction of both low- and high-dimensional dynamical systems. Even though correctness of the topology may not be guaranteed for incomplete reconstructions like this, the dynamical structure that they do capture allows for accurate predictions—in many cases, even more accurate than predictions generated using a traditional embedding. This could be very useful in the context of real-time forecasting, where the human effort required to produce a correct delay-coordinate embedding is prohibitive. PMID:26723147

  2. Prediction in projection

    NASA Astrophysics Data System (ADS)

    Garland, Joshua; Bradley, Elizabeth

    2015-12-01

    Prediction models that capture and use the structure of state-space dynamics can be very effective. In practice, however, one rarely has access to full information about that structure, and accurate reconstruction of the dynamics from scalar time-series data—e.g., via delay-coordinate embedding—can be a real challenge. In this paper, we show that forecast models that employ incomplete reconstructions of the dynamics—i.e., models that are not necessarily true embeddings—can produce surprisingly accurate predictions of the state of a dynamical system. In particular, we demonstrate the effectiveness of a simple near-neighbor forecast technique that works with a two-dimensional time-delay reconstruction of both low- and high-dimensional dynamical systems. Even though correctness of the topology may not be guaranteed for incomplete reconstructions like this, the dynamical structure that they do capture allows for accurate predictions—in many cases, even more accurate than predictions generated using a traditional embedding. This could be very useful in the context of real-time forecasting, where the human effort required to produce a correct delay-coordinate embedding is prohibitive.
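
The near-neighbor forecast strategy described above can be sketched in a few lines: embed the scalar series in two dimensions with a time delay, find the nearest reconstructed state to the current one, and use that neighbor's successor as the prediction. The sketch below is a minimal illustration of the idea (not the authors' code); the delay `tau` and the sine test signal are arbitrary choices.

```python
import numpy as np

def near_neighbor_forecast(x, tau=1, steps=10):
    """Forecast `steps` future values of a scalar series x using a
    2-D delay reconstruction (x[t], x[t-tau]) and the near-neighbor
    rule: the successor of the current point's nearest neighbor in
    the reconstructed space is the prediction."""
    x = list(x)
    for _ in range(steps):
        # delay vectors for every point that has a known successor
        pts = np.array([[x[i], x[i - tau]] for i in range(tau, len(x) - 1)])
        query = np.array([x[-1], x[-1 - tau]])
        j = np.argmin(np.linalg.norm(pts - query, axis=1))
        # pts[j] corresponds to index i = tau + j; its successor is the forecast
        x.append(x[tau + j + 1])
    return x[-steps:]

# demo on a noiseless sine: forecasts should continue the oscillation
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t)
pred = near_neighbor_forecast(series[:1900], tau=8, steps=5)
```

Because the sine is (nearly) periodic in the sample index, the nearest reconstructed neighbor sits almost exactly one period back, so the forecast closely tracks the true continuation.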

  3. The Experimental MJO Prediction Project

    NASA Technical Reports Server (NTRS)

    Waliser, Duane; Weickmann, Klaus; Dole, Randall; Schubert, Siegfried; Alves, Oscar; Jones, Charles; Newman, Matthew; Pan, Hua-Lu; Roubicek, Andres; Saha, Suranjana; Smith, Cathy; VanDenDool, Huug; Vitart, Frederic; Wheeler, Matthew; Whitaker, Jeffrey

    2006-01-01

    Weather prediction is typically concerned with lead times of hours to days, while seasonal-to-interannual climate prediction is concerned with lead times of months to seasons. Recently, there has been growing interest in 'subseasonal' forecasts: those with lead times on the order of weeks (e.g., Schubert et al. 2002; Waliser et al. 2003; Waliser et al. 2005). The basis for developing and exploiting subseasonal predictions largely resides with phenomena such as the Pacific North American (PNA) pattern, the North Atlantic Oscillation (NAO), the Madden-Julian Oscillation (MJO), mid-latitude blocking, and the memory associated with soil moisture, as well as modeling techniques that rely on both initial conditions and slowly varying boundary conditions (e.g., tropical Pacific SST). An outgrowth of this interest has been the development of an Experimental MJO Prediction Project (EMPP). The project provides real-time weather and climate information and predictions for a variety of applications, broadly encompassing the subseasonal weather-climate connection. The focus is on the MJO because it represents a repeatable, low-frequency phenomenon; the MJO's importance among subseasonal phenomena is very similar to that of the El Nino-Southern Oscillation (ENSO) among interannual phenomena. This note describes the history and objectives of EMPP, its status, capabilities, and plans.

  4. Decadal climate prediction (project GCEP).

    PubMed

    Haines, Keith; Hermanson, Leon; Liu, Chunlei; Putt, Debbie; Sutton, Rowan; Iwi, Alan; Smith, Doug

    2009-03-13

    Decadal prediction uses climate models forced by changing greenhouse gases, as in the Intergovernmental Panel on Climate Change projections, but unlike longer-range predictions it also requires initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial-condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial-condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability. PMID:19087944

  5. The Predictive Validity of Projective Measures.

    ERIC Educational Resources Information Center

    Suinn, Richard M.; Oskamp, Stuart

    Written for use by clinical practitioners as well as psychological researchers, this book surveys recent literature (1950-1965) on projective test validity by reviewing and critically evaluating studies which shed light on what may reliably be predicted from projective test results. Two major instruments are covered: the Rorschach and the Thematic…

  6. The NIEHS Predictive-Toxicology Evaluation Project.

    PubMed

    Bristol, D W; Wachsman, J T; Greenwell, A

    1996-10-01

    The Predictive-Toxicology Evaluation (PTE) project conducts collaborative experiments that subject the performance of predictive-toxicology (PT) methods to rigorous, objective evaluation in a uniquely informative manner. Sponsored by the National Institute of Environmental Health Sciences, it takes advantage of the ongoing testing conducted by the U.S. National Toxicology Program (NTP) to estimate the true error of models that have been applied to make prospective predictions on previously untested, noncongeneric-chemical substances. The PTE project first identifies a group of standardized NTP chemical bioassays that are either scheduled to be conducted or ongoing but not yet complete. The project then announces and advertises the evaluation experiment, disseminates information about the chemical bioassays, and encourages researchers from a wide variety of disciplines to publish their predictions in peer-reviewed journals, using whatever approaches and methods they feel are best. A collection of such papers is published in this Environmental Health Perspectives Supplement, providing readers the opportunity to compare and contrast PT approaches and models, within the context of their prospective application to an actual-use situation. This introduction to the collection of papers on predictive toxicology summarizes the predictions made and the final results obtained for the 44 chemical carcinogenesis bioassays of the first PTE experiment (PTE-1) and presents information that identifies the 30 chemical carcinogenesis bioassays of PTE-2, along with a table of prediction sets that have been published to date. It also provides background about the origin and goals of the PTE project, outlines the special challenge associated with estimating the true error of models that aspire to predict open-system behavior, and summarizes what has been learned to date. PMID:8933048

  7. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    PubMed Central

    Mansouri, Kamel; Abdelaziz, Ahmed; Rybacka, Aleksandra; Roncaglioni, Alessandra; Tropsha, Alexander; Varnek, Alexandre; Zakharov, Alexey; Worth, Andrew; Richard, Ann M.; Grulke, Christopher M.; Trisciuzzi, Daniela; Fourches, Denis; Horvath, Dragos; Benfenati, Emilio; Muratov, Eugene; Wedebye, Eva Bay; Grisoni, Francesca; Mangiatordi, Giuseppe F.; Incisivo, Giuseppina M.; Hong, Huixiao; Ng, Hui W.; Tetko, Igor V.; Balabin, Ilya; Kancherla, Jayaram; Shen, Jie; Burton, Julien; Nicklaus, Marc; Cassotti, Matteo; Nikolov, Nikolai G.; Nicolotti, Orazio; Andersson, Patrik L.; Zang, Qingda; Politi, Regina; Beger, Richard D.; Todeschini, Roberto; Huang, Ruili; Farag, Sherif; Rosenberg, Sine A.; Slavov, Svetoslav; Hu, Xin; Judson, Richard S.

    2016-01-01

    Background: Humans are exposed to thousands of man-made chemicals in the environment. Some chemicals mimic natural endocrine hormones and, thus, have the potential to be endocrine disruptors. Most of these chemicals have never been tested for their ability to interact with the estrogen receptor (ER). Risk assessors need tools to prioritize chemicals for evaluation in costly in vivo tests, for instance, within the U.S. EPA Endocrine Disruptor Screening Program. Objectives: We describe a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) and demonstrate the efficacy of using predictive computational models trained on high-throughput screening data to evaluate thousands of chemicals for ER-related activity and prioritize them for further testing. Methods: CERAPP combined multiple models developed in collaboration with 17 groups in the United States and Europe to predict ER activity of a common set of 32,464 chemical structures. Quantitative structure–activity relationship models and docking approaches were employed, mostly using a common training set of 1,677 chemical structures provided by the U.S. EPA, to build a total of 40 categorical and 8 continuous models for binding, agonist, and antagonist ER activity. All predictions were evaluated on a set of 7,522 chemicals curated from the literature. To overcome the limitations of single models, a consensus was built by weighting models on scores based on their evaluated accuracies. Results: Individual model scores ranged from 0.69 to 0.85, showing high prediction reliabilities. Out of the 32,464 chemicals, the consensus model predicted 4,001 chemicals (12.3%) as high priority actives and 6,742 potential actives (20.8%) to be considered for further testing. Conclusion: This project demonstrated the possibility to screen large libraries of chemicals using a consensus of different in silico approaches. This concept will be applied in future projects related to other
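
CERAPP's accuracy-weighted consensus idea can be illustrated with a small sketch: each model's categorical vote is weighted by its accuracy on the evaluation set, and a chemical is flagged when the weighted vote crosses a threshold. The scheme below is a simplified stand-in for the project's actual scoring procedure, and the model accuracies and predictions are made-up toy values.

```python
import numpy as np

def consensus(predictions, accuracies, threshold=0.5):
    """Accuracy-weighted consensus over categorical model predictions.

    predictions: (n_models, n_chemicals) array of 0/1 votes (1 = active)
    accuracies:  per-model evaluation accuracies, used as vote weights
    Returns the weighted score per chemical and the 0/1 consensus label.
    """
    w = np.asarray(accuracies, dtype=float)
    w = w / w.sum()                       # normalize weights to sum to 1
    score = w @ np.asarray(predictions)   # weighted fraction of "active" votes
    return score, (score >= threshold).astype(int)

# three toy models voting on three chemicals
preds = np.array([[1, 0, 1],
                  [1, 1, 0],
                  [0, 1, 1]])
acc = [0.85, 0.70, 0.69]
score, label = consensus(preds, acc)
```

With these toy numbers, every chemical gets two of three votes, so each weighted score lands above 0.5 and all three are labeled active; disagreement among more models would spread the scores out.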

  8. Projections and predictability of Arctic shipping accessibility

    NASA Astrophysics Data System (ADS)

    Melia, Nathanael; Haines, Keith; Hawkins, Ed

    2016-04-01

    The observed reduction in Arctic sea ice opens up the potential for shorter shipping routes across the Arctic Ocean, leading to potentially significant global economic savings. We demonstrate, using bias-corrected global climate models, that the projected sea ice melt through the 21st century increases opportunities for ships to sail through the Arctic between North Atlantic and East Asian ports. Transit potential for Open Water vessels doubles from early to mid-century and coincides with the opening of the trans-polar sea route. Although seasonal, routes become more reliable with an overall increased shipping season length, but with considerable variability from year-to-year. We also demonstrate that there is potential predictability for whether a particular season will be relatively open or closed to shipping access from a few months ahead.

  9. The Vog Measurement and Prediction (VMAP) Project

    NASA Astrophysics Data System (ADS)

    Businger, S.; Huff, R.; Sutton, A. J.; Elias, T.; Horton, K. A.

    2011-12-01

    Emissions from Kilauea volcano pose significant environmental and health risks to Hawaii. The overarching goal of this feasibility project is to develop an accurate and timely volcanic air-pollution forecasting capacity with a program of verification using state-of-the-art observation methods. To date VMAP has (i) created a real-time modeling and forecast capability using the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model to predict the concentration and dispersion of SO2 gas and sulfate aerosol from Kilauea volcano (Fig. 1); HYSPLIT uses the output of a high-resolution operational run of the Weather Research and Forecast (WRF) model for initial and boundary conditions; (ii) developed an operational spectrometer-based SO2 emission rate monitor for use as input to the dispersion model; (iii) cooperatively deployed an array of stationary SO2 gas and sulfate aerosol sensors to record the ground-level spatial characteristics of Kilauea's gas plume in high temporal and spatial resolution for verification and improvement of the gas dispersion prediction; (iv) developed a series of web pages to disseminate observations and forecasts, which can be used by safety officials to protect the public and to raise public awareness of the hazards of volcanic emissions to respiratory health, agriculture, and general aviation; and (v) developed an archive of vog data to facilitate estimation of historical concentration frequency-of-exposure. VMAP provides technical support for researchers, health professionals, and our stakeholders, who have also provided constructive input in the development of our products. Preliminary results of our efforts will be presented and future work will be discussed.

  10. Geospatial application of the Water Erosion Prediction Project (WEPP) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) model is a process-based technology for prediction of soil erosion by water at hillslope profile, field, and small watershed scales. In particular, WEPP utilizes observed or generated daily climate inputs to drive the surface hydrology processes (infiltrat...

  11. EVA Robotic Assistant Project: Platform Attitude Prediction

    NASA Technical Reports Server (NTRS)

    Nickels, Kevin M.

    2003-01-01

    The Robotic Systems Technology Branch is currently working on the development of an EVA Robotic Assistant under the sponsorship of the Surface Systems Thrust of the NASA Cross Enterprise Technology Development Program (CETDP). This will be a mobile robot that can follow a field geologist during planetary surface exploration, carry his tools and the samples that he collects, and provide video coverage of his activity. Prior experiments have shown that for such a robot to be useful it must be able to follow the geologist at walking speed over any terrain of interest. Geologically interesting terrain tends to be rough rather than smooth. The commercial mobile robot that was recently purchased as an initial testbed for the EVA Robotic Assistant Project, an ATRV Jr., is capable of faster than walking speed outside, but it has no suspension. Its wheels with inflated rubber tires are attached to axles that are connected directly to the robot body. Any angular motion of the robot produced by driving over rough terrain will directly affect the pointing of the on-board stereo cameras. The resulting image motion is expected to make tracking of the geologist more difficult. This will either require the tracker to search a larger part of the image to find the target from frame to frame, or to search mechanically in pan and tilt whenever the image motion is large enough to put the target outside the image in the next frame. This project consists of the design and implementation of a Kalman filter that combines the output of the angular rate sensors and linear accelerometers on the robot to estimate the motion of the robot base. The motion of the stereo camera pair mounted on the robot that results from this motion as the robot drives over rough terrain is then straightforward to compute. The estimates may then be used, for example, to command the robot's on-board pan-tilt unit to compensate for the camera motion induced by the base movement. This has been accomplished in two ways…
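
The sensor-fusion step described above can be illustrated with a minimal one-dimensional Kalman filter for a single attitude angle: the gyro rate drives the state prediction and an accelerometer-derived angle supplies the correction. This is a hypothetical sketch, not the project's filter; the noise variances `q` and `r` and the test signal are arbitrary, illustrative values.

```python
import numpy as np

def kalman_pitch(gyro_rates, accel_angles, dt=0.01, q=1e-4, r=1e-2):
    """1-D Kalman filter for pitch angle.

    gyro_rates:   angular rates (rad/s) -- integrated in the predict step
    accel_angles: accelerometer-derived angles (rad) -- the measurements
    q, r:         process and measurement noise variances (illustrative)
    """
    theta, p = 0.0, 1.0              # state estimate and its variance
    out = []
    for w, z in zip(gyro_rates, accel_angles):
        theta += w * dt              # predict: integrate the gyro rate
        p += q
        k = p / (p + r)              # Kalman gain
        theta += k * (z - theta)     # correct with the accelerometer angle
        p *= (1 - k)
        out.append(theta)
    return np.array(out)

# constant true pitch of 0.1 rad, zero rotation, noisy accelerometer
rng = np.random.default_rng(0)
n = 500
est = kalman_pitch(np.zeros(n), 0.1 + 0.05 * rng.standard_normal(n))
```

The estimate settles near the true 0.1 rad with far less scatter than the raw accelerometer angle, which is exactly the benefit the abstract describes for stabilizing the camera pointing.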

  12. Geospatial application of the Water Erosion Prediction Project (WEPP) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    At the hillslope profile and/or field scale, a simple Windows graphical user interface (GUI) is available to easily specify the slope, soil, and management inputs for application of the USDA Water Erosion Prediction Project (WEPP) model. Likewise, basic small watershed configurations of a few hillsl...

  13. Water Erosion Prediction Project (WEPP) model status and updates

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This presentation will provide current information on the USDA-ARS Water Erosion Prediction Project (WEPP) model, and its implementation by the USDA-Forest Service (FS), USDA-Natural Resources Conservation Service (NRCS), and other agencies and universities. Most recently, the USDA-NRCS has begun ef...

  14. Noise prediction and control of Pudong International Airport expansion project.

    PubMed

    Lei, Bin; Yang, Xin; Yang, Jianguo

    2009-04-01

    The Environmental Impact Assessment (EIA) process for the third-runway building project of Pudong International Airport is briefly introduced in the paper. The basic principle, features, and operation steps of the newly imported FAA Integrated Noise Model (INM) are discussed for evaluating aircraft noise impacts. The prediction of aircraft noise and countermeasures for noise mitigation are developed, which include reasonable runway location, optimized land use, selection of low-noise aircraft, a Fly Quiet program, relocation of sensitive receptors, and noise insulation of sensitive buildings. Finally, the expansion project is justified and its feasibility is confirmed. PMID:18373206

  15. GEWEX America Prediction Project (GAPP) Science and Implementation Plan

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose of this Science and Implementation Plan is to describe GAPP science objectives and the activities required to meet these objectives, both specifically for the near-term and more generally for the longer-term. The GEWEX Americas Prediction Project (GAPP) is part of the Global Energy and Water Cycle Experiment (GEWEX) initiative that is aimed at observing, understanding and modeling the hydrological cycle and energy fluxes at various time and spatial scales. The mission of GAPP is to demonstrate skill in predicting changes in water resources over intraseasonal-to-interannual time scales, as an integral part of the climate system.

  16. Predictions of Chemical Weather in Asia: The EU Panda Project

    NASA Astrophysics Data System (ADS)

    Brasseur, G. P.; Petersen, A. K.; Wang, X.; Granier, C.; Bouarar, I.

    2014-12-01

    Air quality has become a pressing problem in Asia and specifically in China due to rapid economic development (i.e., rapidly expanding motor vehicle fleets, growing industrial and power generation activities, domestic and biomass burning). In spite of efforts to reduce chemical emissions, high levels of particulate matter and ozone are observed and lead to severe health problems, with a large number of premature deaths. To support efforts to reduce air pollution, the European Union is supporting the PANDA project, whose objective is to use space and surface observations of chemical species as well as advanced meteorological and chemical models to analyze and predict air quality in China. The project involves 7 European and 7 Chinese groups. The paper describes the objectives of the project and presents some first accomplishments. The project focuses on the improvement of methods for monitoring air quality from combined space and in-situ observations, the development of a comprehensive prediction system that makes use of these observations, the elaboration of indicators for air quality in support of policies, and the development of toolboxes for the dissemination of information.

  17. Space debris orbit prediction requirements in the CLEANSPACE project

    NASA Astrophysics Data System (ADS)

    Wnuk, Edwin; Jacquelard, Christophe; Esmiller, Bruno; Speiser, Jochen

    2012-07-01

    The overall CLEANSPACE objective is to define a global architecture (including surveillance, identification, and tracking) for an innovative ground-based laser solution that can remove hazardous medium-sized debris around selected space assets. The CLEANSPACE project is realized by a European consortium in the frame of the European Commission Seventh Framework Programme (FP7), Space theme. The first part of the paper presents general information about CLEANSPACE, including the main drivers and requirements. The orbit prediction requirements that have to be met in the project are discussed in the second part. Proposed systems for removing space debris objects with a sequence of laser operations, like the CLEANSPACE system, need very precise predictions of future debris orbital positions, at a level even better than 1 meter. Orbit determination, tracking (radar, optical, and laser), and orbit prediction have to be performed with much better accuracy than has been achieved so far. The applied prediction tools have to take into account all perturbation factors that influence an object's orbit, mainly geopotential effects with arbitrary degree and order spherical harmonic coefficients, luni-solar attractions, solar radiation pressure, and atmospheric drag. In our paper we discuss the influence of all important perturbation factors on space debris orbital motion, taking into account different contemporary force models, in particular geopotential models, atmospheric models, and Sun and Moon ephemerides.
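
The role of the force model can be illustrated with a toy propagator that adds the dominant J2 oblateness term (the leading geopotential effect mentioned above) to point-mass gravity. This is a hedged sketch for intuition only; an operational tool of the kind the abstract describes would need full spherical-harmonic gravity, luni-solar attraction, radiation pressure, drag, and a high-order integrator. Units are km and s; the constants are standard Earth values.

```python
import numpy as np

MU = 398600.4418      # Earth gravitational parameter, km^3/s^2
RE = 6378.137         # Earth equatorial radius, km
J2 = 1.08262668e-3    # dominant oblateness coefficient

def accel(r):
    """Two-body acceleration plus the J2 perturbation."""
    x, y, z = r
    rn = np.linalg.norm(r)
    a = -MU * r / rn**3                         # point-mass term
    f = 1.5 * J2 * MU * RE**2 / rn**5
    a += f * np.array([x * (5 * z**2 / rn**2 - 1),
                       y * (5 * z**2 / rn**2 - 1),
                       z * (5 * z**2 / rn**2 - 3)])
    return a

def propagate(r, v, dt=1.0, steps=5400):
    """Leapfrog (kick-drift-kick) integration of position and velocity."""
    r, v = np.array(r, float), np.array(v, float)
    for _ in range(steps):
        v += 0.5 * dt * accel(r)
        r += dt * v
        v += 0.5 * dt * accel(r)
    return r, v

# circular 700-km equatorial orbit, propagated for roughly one revolution
r0 = np.array([RE + 700.0, 0.0, 0.0])
v0 = np.array([0.0, np.sqrt(MU / r0[0]), 0.0])
r1, v1 = propagate(r0, v0, dt=1.0, steps=5930)
```

Even this single extra term shifts the trajectory by kilometers over one orbit relative to pure two-body motion, which is why meter-level prediction demands the much richer force models the paper discusses.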

  18. Predicting future uncertainty constraints on global warming projections

    PubMed Central

    Shiogama, H.; Stone, D.; Emori, S.; Takahashi, K.; Mori, S.; Maeda, A.; Ishizaki, Y.; Allen, M. R.

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by “current knowledge” of the uncertainty in ΔT, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the uncertainty in ΔT for the 2040s by 2029, and more than 60% of the uncertainty in ΔT for the 2090s by 2049. Under the highest forcing scenario of the RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change. PMID:26750491

  19. Predicting future uncertainty constraints on global warming projections

    NASA Astrophysics Data System (ADS)

    Shiogama, H.; Stone, D.; Emori, S.; Takahashi, K.; Mori, S.; Maeda, A.; Ishizaki, Y.; Allen, M. R.

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by “current knowledge” of the uncertainty in ΔT, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the uncertainty in ΔT for the 2040s by 2029, and more than 60% of the uncertainty in ΔT for the 2090s by 2049. Under the highest forcing scenario of the RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  20. Predicting future uncertainty constraints on global warming projections.

    PubMed

    Shiogama, H; Stone, D; Emori, S; Takahashi, K; Mori, S; Maeda, A; Ishizaki, Y; Allen, M R

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the uncertainty in ΔT, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the uncertainty in ΔT for the 2040s by 2029, and more than 60% of the uncertainty in ΔT for the 2090s by 2049. Under the highest forcing scenario of the RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change. PMID:26750491

  1. Transistor roadmap projection using predictive full-band atomistic modeling

    SciTech Connect

    Salmani-Jelodar, M.; Klimeck, G.; Kim, S.; Ng, K.

    2014-08-25

    In this letter, a full band atomistic quantum transport tool is used to predict the performance of double gate metal-oxide-semiconductor field-effect transistors (MOSFETs) over the next 15 years for International Technology Roadmap for Semiconductors (ITRS). As MOSFET channel lengths scale below 20 nm, the number of atoms in the device cross-sections becomes finite. At this scale, quantum mechanical effects play an important role in determining the device characteristics. These quantum effects can be captured with the quantum transport tool. Critical results show the ON-current degradation as a result of geometry scaling, which is in contrast to previous ITRS compact model calculations. Geometric scaling has significant effects on the ON-current by increasing source-to-drain (S/D) tunneling and altering the electronic band structure. By shortening the device gate length from 20 nm to 5.1 nm, the ratio of S/D tunneling current to the overall subthreshold OFF-current increases from 18% to 98%. Despite this ON-current degradation by scaling, the intrinsic device speed is projected to increase at a rate of at least 8% per year as a result of the reduction of the quantum capacitance.

  2. Transistor roadmap projection using predictive full-band atomistic modeling

    NASA Astrophysics Data System (ADS)

    Salmani-Jelodar, M.; Kim, S.; Ng, K.; Klimeck, G.

    2014-08-01

    In this letter, a full band atomistic quantum transport tool is used to predict the performance of double gate metal-oxide-semiconductor field-effect transistors (MOSFETs) over the next 15 years for International Technology Roadmap for Semiconductors (ITRS). As MOSFET channel lengths scale below 20 nm, the number of atoms in the device cross-sections becomes finite. At this scale, quantum mechanical effects play an important role in determining the device characteristics. These quantum effects can be captured with the quantum transport tool. Critical results show the ON-current degradation as a result of geometry scaling, which is in contrast to previous ITRS compact model calculations. Geometric scaling has significant effects on the ON-current by increasing source-to-drain (S/D) tunneling and altering the electronic band structure. By shortening the device gate length from 20 nm to 5.1 nm, the ratio of S/D tunneling current to the overall subthreshold OFF-current increases from 18% to 98%. Despite this ON-current degradation by scaling, the intrinsic device speed is projected to increase at a rate of at least 8% per year as a result of the reduction of the quantum capacitance.

  3. Drought Prediction for Socio-Cultural Stability Project

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, Christa; Eylander, John B.; Koster, Randall; Narapusetty, Balachandrudu; Kumar, Sujay; Rodell, Matt; Bolten, John; Mocko, David; Walker, Gregory; Arsenault, Kristi; Rheingrover, Scott

    2014-01-01

    The primary objective of this project is to answer the question: "Can existing, linked infrastructures be used to predict the onset of drought months in advance?" Based on our work, the answer to this question is "yes," with the qualifiers that skill depends on both lead time and location, and especially on the associated teleconnections (e.g., ENSO, Indian Ocean Dipole) active in a given region and season. As part of this work, we successfully developed a prototype drought early warning system based on existing/mature NASA Earth science components including the Goddard Earth Observing System Data Assimilation System Version 5 (GEOS-5) forecasting model, the Land Information System (LIS) land data assimilation software framework, the Catchment Land Surface Model (CLSM), remotely sensed terrestrial water storage from the Gravity Recovery and Climate Experiment (GRACE), and remotely sensed soil moisture products from the Aqua/Advanced Microwave Scanning Radiometer - EOS (AMSR-E). We focused on a single drought year - 2011 - during which major agricultural droughts occurred with devastating impacts in the Texas-Mexico region of North America (TEXMEX) and the Horn of Africa (HOA). Our results demonstrate that GEOS-5 precipitation forecasts show skill globally at 1-month lead, and can show up to 3 months of skill regionally in the TEXMEX and HOA areas. Our results also demonstrate that the CLSM soil moisture percentiles are a good indicator of drought, as compared to the North American Drought Monitor over TEXMEX and a combination of Famine Early Warning Systems Network (FEWS NET) data and Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) anomalies over HOA. The data assimilation experiments produced mixed results. GRACE terrestrial water storage (TWS) assimilation was found to significantly improve soil moisture and evapotranspiration, as well as drought monitoring via soil moisture percentiles, while AMSR-E soil moisture
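
The soil-moisture-percentile indicator mentioned above amounts to ranking the current value against a multi-year climatology for the same location. The sketch below is a hypothetical illustration of that computation, not the project's code; the climatology values and the 20th-percentile drought threshold are made-up for the example.

```python
import numpy as np

def soil_moisture_percentile(current, climatology):
    """Percentile rank of today's soil moisture within the climatology
    for the same grid cell; low percentiles flag drought conditions."""
    climatology = np.sort(np.asarray(climatology, dtype=float))
    rank = np.searchsorted(climatology, current, side="right")
    return 100.0 * rank / len(climatology)

# 30 years of (made-up) volumetric soil moisture values for one cell
clim = np.linspace(0.10, 0.40, 30)
p = soil_moisture_percentile(0.13, clim)   # today's value ranks near the bottom
drought = p <= 20.0                         # illustrative drought threshold
```

With this toy climatology, a current value of 0.13 falls at the 10th percentile and is flagged as drought; in practice the threshold would follow an established convention such as the Drought Monitor's percentile categories.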

  4. Project Evaluation: Validation of a Scale and Analysis of Its Predictive Capacity

    ERIC Educational Resources Information Center

    Fernandes Malaquias, Rodrigo; de Oliveira Malaquias, Fernanda Francielle

    2014-01-01

    The objective of this study was to validate a scale for assessment of academic projects. As a complement, we examined its predictive ability by comparing the scores of advised/corrected projects based on the model and the final scores awarded to the work by an examining panel (approximately 10 months after the project design). Results of…

  5. Improving frost-simulation subroutines of the Water Erosion Prediction Project (WEPP) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Erosion models play an important role in assessing the influence of human activities on the environment. For cold areas, adequate frost simulation is crucial for predicting surface runoff and water erosion. The Water Erosion Prediction Project (WEPP) model, physically-based erosion-prediction softwa...

  6. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    ERIC Educational Resources Information Center

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  7. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project

    PubMed Central

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a 2012 Gartner report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered a part of business strategy. The associated project parameters should be proactively managed, and the project outcome needs to be predicted by the technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. The focus should instead be on proactively managing and shifting left in the software life cycle engineering model: identify problems upfront in the project cycle rather than waiting for lessons learned and taking reactive steps. This paper shows the practical applicability of predictive models and illustrates their use in a project to predict system-testing defects, thus helping to reduce residual defects. PMID:26495427
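As an illustration of the kind of predictive model the paper describes, a minimal sketch might regress system-testing defects on defects found earlier in the life cycle (e.g., in design and code reviews). The data and the single-predictor form here are hypothetical, not the paper's actual model:

```python
def fit_linear(xs, ys):
    """Ordinary least squares fit of y ~ a + b*x for a single predictor."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical history: defects found in design/code reviews vs. defects
# later found in system testing, one pair per past project.
review_defects = [12, 20, 8, 30, 16]
system_defects = [30, 52, 22, 75, 41]

a, b = fit_linear(review_defects, system_defects)
predicted = a + b * 25   # forecast for a new project with 25 review defects
```

Even this toy version captures the "shift left" point: early-phase counts, available long before system testing, carry predictive signal about residual defects.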

  9. Bioethical reflection on the use of genetically modified organisms

    PubMed Central

    Yunta, Eduardo Rodríguez

    2011-01-01

    This article reflects, from the perspective of the four principles of bioethics, on the commercial use of genetically modified organisms. It fundamentally questions the lack of technology transfer between the developed and the developing world, and the fact that the current system for patenting modified living organisms promotes commercial interests without giving due importance to the sustainable development of agriculture and livestock farming in the developing countries, where it is most needed. It also reflects on the importance of assessing risks before genetically modified organisms are introduced into the market, and on the need for regulation in each country. PMID:21927675

  10. Demonstration of the Water Erosion Prediction Project (WEPP) internet interface and services

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) model is a process-based FORTRAN computer simulation program for prediction of runoff and soil erosion by water at hillslope profile, field, and small watershed scales. To effectively run the WEPP model and interpret results additional software has been de...

  11. Water Erosion Prediction Project (WEPP) –Development History, Model Capabilities and Future Enhancements

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) was initiated in August 1985 to develop new generation water erosion prediction technology for use by federal agencies involved in soil and water conservation and environmental planning and assessment. Developed by USDA-ARS as a replacement for empirically...

  12. A Global Perspective: NASA's Prediction of Worldwide Energy Resources (POWER) Project

    NASA Technical Reports Server (NTRS)

    Zhang, Taiping; Stackhouse, Paul W., Jr.; Chandler, William S.; Hoell, James M.; Westberg, David; Whitlock, Charles H.

    2007-01-01

    The Prediction of the Worldwide Energy Resources (POWER) Project, initiated under the NASA Science Mission Directorate Applied Science Energy Management Program, synthesizes and analyzes data on a global scale that are invaluable to the renewable energy industries, especially to the solar and wind energy sectors. The POWER project derives its data primarily from NASA's World Climate Research Programme (WCRP)/Global Energy and Water cycle Experiment (GEWEX) Surface Radiation Budget (SRB) project (Version 2.9) and the Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS) assimilation model (Version 4). The latest development of the NASA POWER Project and its plans for the future are presented in this paper.

  13. The NASA Seasonal-to-Interannual Prediction Project (NSIPP). [Annual Report for 2000

    NASA Technical Reports Server (NTRS)

    Rienecker, Michele; Suarez, Max; Adamec, David; Koster, Randal; Schubert, Siegfried; Hansen, James; Koblinsky, Chester (Technical Monitor)

    2001-01-01

    The goal of the project is to develop an assimilation and forecast system based on a coupled atmosphere-ocean-land-surface-sea-ice model capable of using a combination of satellite and in situ data sources to improve the prediction of ENSO and other major S-I signals and their global teleconnections. The objectives of this annual report are to: (1) demonstrate the utility of satellite data, especially surface height, surface winds, air-sea fluxes and soil moisture, in a coupled model prediction system; and (2) aid in the design of the observing system for short-term climate prediction by conducting OSSEs and predictability studies.

  14. Inroads to Predict in Vivo Toxicology—An Introduction to the eTOX Project

    PubMed Central

    Briggs, Katharine; Cases, Montserrat; Heard, David J.; Pastor, Manuel; Pognan, François; Sanz, Ferran; Schwab, Christof H.; Steger-Hartmann, Thomas; Sutter, Andreas; Watson, David K.; Wichard, Jörg D.

    2012-01-01

    There is a widespread awareness that the wealth of preclinical toxicity data that the pharmaceutical industry has generated in recent decades is not exploited as efficiently as it could be. Enhanced data availability for compound comparison (“read-across”), or for data mining to build predictive tools, should lead to a more efficient drug development process and contribute to the reduction of animal use (3Rs principle). In order to achieve these goals, a consortium approach, grouping numbers of relevant partners, is required. The eTOX (“electronic toxicity”) consortium represents such a project and is a public-private partnership within the framework of the European Innovative Medicines Initiative (IMI). The project aims at the development of in silico prediction systems for organ and in vivo toxicity. The backbone of the project will be a database consisting of preclinical toxicity data for drug compounds or candidates extracted from previously unpublished, legacy reports from thirteen European and European operation-based pharmaceutical companies. The database will be enhanced by incorporation of publically available, high quality toxicology data. Seven academic institutes and five small-to-medium size enterprises (SMEs) contribute with their expertise in data gathering, database curation, data mining, chemoinformatics and predictive systems development. The outcome of the project will be a predictive system contributing to early potential hazard identification and risk assessment during the drug development process. The concept and strategy of the eTOX project is described here, together with current achievements and future deliverables. PMID:22489185

  15. Adapting the Water Erosion Prediction Project (WEPP) Model for Forest Applications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There has been an increasing public concern over forest stream pollution by excessive sedimentation due to natural or human disturbances. Adequate erosion simulation tools are needed for sound management of forest resources. The Water Erosion Prediction Project (WEPP) watershed model has proved usef...

  16. A project to predict meteor showers from all potential parent comets

    NASA Astrophysics Data System (ADS)

    Hajdukova, M.; Neslusan, L.; Tomko, D.; Jakubik, M.; Kanuchova, Z.

    2015-10-01

    In this project, new meteor showers associated with known periodic comets have been predicted, new parent bodies associated with known meteor showers have been suggested, and new relationships among the meteor showers that belong to the same complex have been found. Here, we present an overview of our results from the modelling of diverse meteor-shower complexes [1].

  17. Application of the Water Erosion Prediction Project (WEPP) Model for Soil Erosion Estimation and Conservation Planning

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) model is a process-based, continuous- simulation, distributed parameter erosion simulation model for application to field-scale hillslope profiles and small watersheds. Developed over the past 25 years by the United States Department of Agriculture, it con...

  18. Predictive Models for Success in Occupational Education. Occupational Research Project Final Report.

    ERIC Educational Resources Information Center

    Lynch, Mary V.

    This project, undertaken during the fall of 1971 at Wayne Community College, developed a comprehensive guidance program aimed at predicting a student's chances of success in community college and occupational programs. The subjects used were those seniors from the five high schools who were interested in one of the vocational…

  19. Implementation of Channel-Routing Routines in the Water Erosion Prediction Project (WEPP) Model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) model is a process-based, continuous-simulation, watershed hydrology and erosion model. It is an important tool for water erosion simulation owing to its unique functionality in representing diverse landuse and management conditions. Its applicability is l...

  20. WATER EROSION PREDICTION PROJECT (WEPP) TECHNOLOGY FOR ASSESSMENT OF RUNOFF, SOIL LOSS AND SEDIMENT YIELD POTENTIAL

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) model is a process-based, distributed parameter, continuous simulation computer program for estimation of runoff, soil loss and sediment yield from fields and small watersheds. In addition to having large databases for application to a multitude of U.S. s...

  1. International H2O Project (IHOP) 2002: Datasets Related to Atmospheric Moisture and Rainfall Prediction

    DOE Data Explorer

    Schanot, Allen [IHOP 2002 PI]; Friesen, Dick [IHOP 2002 PI]

    IHOP 2002 was a field experiment that took place over the Southern Great Plains of the United States from 13 May to 25 June 2002. The chief aim of IHOP_2002 was improved characterization of the four-dimensional (4-D) distribution of water vapor and its application to improving the understanding and prediction of convection. The region was an optimal location due to existing experimental and operational facilities, strong variability in moisture, and active convection [copied from http://www.eol.ucar.edu/projects/ihop/]. The project's master list of data identifies 146 publicly accessible datasets.

  2. Analytical Tools to Predict Distribution Outage Restoration Load. Final Project Report.

    SciTech Connect

    Law, John

    1994-11-14

    The main activity of this project has been twofold: (1) development of a computer model to predict CLPU (Cold Load Pickup) and (2) development of a field measurement and analysis method to obtain the input parameters of the CLPU model. The field measurement and analysis method is called the Step-Voltage-Test (STEPV). The Kootenai Electric Cooperative Appleway 51 feeder in Coeur d'Alene was selected for analysis in this project, and STEPV tests were performed in the winters of 1992 and 1993. The STEPV data were analyzed (method and results presented within this report) to obtain the Appleway 51 feeder parameters for prediction by the CLPU model. Only one CLPU record was obtained, in the winter of 1994. Unfortunately, the actual CLPU was not dramatic (short outage and moderate temperature) and did not display cyclic restoration current. A predicted Appleway 51 feeder CLPU was generated using the parameters obtained via the STEPV measurement/analysis/algorithm method at the same ambient temperature and outage duration as the measured actual CLPU. The predicted CLPU corresponds reasonably well with the single actual CLPU record obtained in the winter of 1994 on the Appleway 51 feeder.
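A common, generic form for a CLPU restoration-current curve is an exponential decay from an undiversified peak back to the normal diversified load; the overshoot and time constant are the kind of parameters a step-voltage test would supply. This is a hedged sketch of that generic form, not the report's actual model:

```python
import math

def clpu_current(t_min, i_normal, overshoot, tau_min):
    """Feeder current t_min minutes after restoration: the cold-load
    peak i_normal*(1 + overshoot) decays exponentially (time constant
    tau_min) back to the diversified normal load i_normal. The
    parameter values used below are illustrative placeholders."""
    return i_normal * (1.0 + overshoot * math.exp(-t_min / tau_min))
```

For example, with a 150% overshoot and a 30-minute time constant, the predicted restoration current starts at 2.5 times normal load and is essentially back to normal after a few hours.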

  3. A GLOBAL ASSESSMENT OF SOLAR ENERGY RESOURCES: NASA's Prediction of Worldwide Energy Resources (POWER) Project

    NASA Astrophysics Data System (ADS)

    Zhang, T.; Stackhouse, P. W., Jr.; Chandler, W.; Hoell, J. M.; Westberg, D.; Whitlock, C. H.

    2010-12-01

    NASA's POWER project, or the Prediction of the Worldwide Energy Resources project, synthesizes and analyzes data on a global scale. The products of the project find valuable applications in the solar and wind energy sectors of the renewable energy industries. The primary source data for the POWER project are NASA's World Climate Research Programme (WCRP)/Global Energy and Water cycle Experiment (GEWEX) Surface Radiation Budget (SRB) project (Release 3.0) and the Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS) assimilation model (V 4.0.3). Users of the POWER products access the data through NASA's Surface meteorology and Solar Energy (SSE, Version 6.0) website (http://power.larc.nasa.gov). Over 200 parameters are available to users. The spatial resolution is currently 1 degree by 1 degree, with finer resolutions planned. The data cover July 1983 to December 2007, a time span of 24.5 years, and are provided as 3-hourly, daily and monthly means. To date, there have been over 18 million web hits and over 4 million data file downloads. The POWER products have been systematically validated against ground-based measurements, in particular data from the Baseline Surface Radiation Network (BSRN) archive, and also against the National Solar Radiation Data Base (NSRDB). Parameters such as minimum, maximum and daily mean temperature and dew point, relative humidity and surface pressure are validated against National Climatic Data Center (NCDC) data. SSE feeds data directly into Decision Support Systems, including the RETScreen International clean energy project analysis software, which is available in 36 languages and has more than 260,000 users worldwide.

  4. Sub-seismic Deformation Prediction of Potential Pathways and Seismic Validation - The Joint Project PROTECT

    NASA Astrophysics Data System (ADS)

    Krawczyk, C. M.; Kolditz, O.

    2013-12-01

    The joint project PROTECT (PRediction Of deformation To Ensure Carbon Traps) aims to determine the existence and characteristics of sub-seismic structures that can potentially link deep reservoirs with the surface in the framework of CO2 underground storage. The research provides a new approach to assessing the long-term integrity of storage reservoirs. The objective is to predict and quantify the distribution and the amount of sub-seismic and seismic strain caused by fault movement in the proximity of a CO2 storage reservoir. The study is developing tools and workflows that will be tested at the CO2CRC Otway Project Site in the Otway Basin in south-western Victoria, Australia. For this purpose, we are building a geometrical kinematic 3-D model based on 2-D and 3-D seismic data provided by the Australian project partner, the CO2CRC Consortium. By retro-deforming the modeled subsurface faults in the inspected subsurface volume, we can determine the accumulated sub-seismic deformation and thus the strain variation around the faults. Depending on lithology, the calculated strain magnitude and its orientation can be used as an indicator of fracture density. Furthermore, from the complete 3-D strain tensor we can predict the orientation of fractures at sub-seismic scale. In areas where we have preliminarily predicted critical deformation, we will acquire new near-surface, high-resolution P- and S-wave 2-D seismic data in November of this year in order to verify and calibrate our model results. Here, novel, parameter-based model building will especially benefit from extracting velocities and elastic parameters from VSP and other seismic data. Our goal is to obtain a better overview of possible fluid migration pathways and communication between the reservoir and overburden. Thereby, we will provide a tool for prediction and adapted time-dependent monitoring strategies for subsurface storage in general, including scientific visualization capabilities.

  5. EPA Project Updates: DSSTox and ToxCast Generating New Data and Data Linkages for Use in Predictive Modeling

    EPA Science Inventory

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...

  6. The Climate Variability & Predictability (CVP) Program at NOAA - DYNAMO Recent Project Advancements

    NASA Astrophysics Data System (ADS)

    Lucas, S. E.; Todd, J. F.; Higgins, W.

    2013-12-01

    The Climate Variability & Predictability (CVP) Program supports research aimed at providing process-level understanding of the climate system through observation, modeling, analysis, and field studies. This vital knowledge is needed to improve climate models and predictions so that scientists can better anticipate the impacts of future climate variability and change. To achieve its mission, the CVP Program supports research carried out at NOAA and other federal laboratories, NOAA Cooperative Institutes, and academic institutions. The Program also coordinates its sponsored projects with major national and international scientific bodies including the World Climate Research Programme (WCRP), the International Geosphere-Biosphere Programme (IGBP), and the U.S. Global Change Research Program (USGCRP). The CVP program sits within the Earth System Science (ESS) Division at NOAA's Climate Program Office. Dynamics of the Madden-Julian Oscillation (DYNAMO): The Indian Ocean is one of Earth's most sensitive regions because the interactions between ocean and atmosphere there have a discernable effect on global climate patterns. The tropical weather that brews in that region can move eastward along the equator and reverberate around the globe, shaping weather and climate in far-off places. The vehicle for this variability is a phenomenon called the Madden-Julian Oscillation, or MJO. The MJO, which originates over the Indian Ocean roughly every 30 to 90 days, is known to influence the Asian and Australian monsoons. It can also enhance hurricane activity in the northeast Pacific and Gulf of Mexico, trigger torrential rainfall along the west coast of North America, and affect the onset of El Niño. CVP-funded scientists participated in the DYNAMO field campaign in 2011-12. Results from this international campaign are expected to improve researchers' insights into this influential phenomenon. A better understanding of the processes governing the MJO is an essential step toward

  7. NASA's Seasonal-to-Interannual Prediction Project: In Partnership With the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Researchers with NASA's Seasonal-to-Interannual Prediction Project (NSIPP) refer to different types of memory when running models on NCCS computers: the computer memory required for their models and the memory of the atmosphere or the ocean. Because of the atmosphere's chaotic nature, its memory is short. For weather predictions, the initial information taken from atmospheric observations has a limited useful life. Currently, there is no way to take observations, initialize an atmosphere model, integrate ahead in time, and make an accurate weather forecast beyond about 2 weeks. After that, the system becomes chaotic. What conditions could be used to make predictions beyond 2 weeks? If not conditions in the atmosphere, then the memory must be found somewhere else. That place is in the oceans. Although most changes in the atmosphere vary on a short timescale, the weather being a prime example, some important large atmospheric climate variations occur over much longer timescales (months, years, or decades). NSIPP is interested specifically in those phenomena that occur over timescales of several months to a few years, and the El Niño Southern Oscillation (ENSO) is the most significant of these.

  8. Predicting and mapping potential Whooping Crane stopover habitat to guide site selection for wind energy projects.

    PubMed

    Belaire, J Amy; Kreakie, Betty J; Keitt, Timothy; Minor, Emily

    2014-04-01

    Migratory stopover habitats are often not part of planning for conservation or new development projects. We identified potential stopover habitats within an avian migratory flyway and demonstrated how this information can guide the site-selection process for new development. We used the random forests modeling approach to map the distribution of predicted stopover habitat for the Whooping Crane (Grus americana), an endangered species whose migratory flyway overlaps with an area where wind energy development is expected to become increasingly important. We then used this information to identify areas for potential wind power development in a U.S. state within the flyway (Nebraska) that minimize conflicts between Whooping Crane stopover habitat and the development of clean, renewable energy sources. Up to 54% of our study area was predicted to be unsuitable as Whooping Crane stopover habitat and could be considered relatively low risk for conflicts between Whooping Cranes and wind energy development. We suggest that this type of analysis be incorporated into the habitat conservation planning process in areas where incidental take permits are being considered for Whooping Cranes or other species of concern. Field surveys should always be conducted prior to construction to verify model predictions and understand baseline conditions. PMID:24372936
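The random forests approach used in the study can be illustrated, in miniature, by bagging simple decision stumps over bootstrap resamples and taking a majority vote. The habitat features and training labels below are hypothetical, and a real application would use a full random forest implementation with field data; this is only a sketch of the technique:

```python
import random

def train_stump(data):
    """Fit the best single-feature threshold split (a 'decision stump')
    by minimising the misclassification count on the given sample."""
    best = None
    n_features = len(data[0][0])
    for f in range(n_features):
        for x, _ in data:
            thr = x[f]
            for sign in (1, -1):
                errs = sum(1 for xi, yi in data
                           if (1 if sign * (xi[f] - thr) > 0 else 0) != yi)
                if best is None or errs < best[0]:
                    best = (errs, f, thr, sign)
    _, f, thr, sign = best
    return lambda x: 1 if sign * (x[f] - thr) > 0 else 0

def random_forest(data, n_trees=25, seed=1):
    """Bag stumps over bootstrap resamples; predict by majority vote
    (a bare-bones stand-in for a full random forest)."""
    rng = random.Random(seed)
    stumps = [train_stump([rng.choice(data) for _ in data])
              for _ in range(n_trees)]
    return lambda x: int(sum(s(x) for s in stumps) > n_trees / 2)

# Hypothetical sites: (distance to wetland km, distance to human
# disturbance km) -> 1 if known stopover habitat, else 0
sites = [((0.5, 3.0), 1), ((0.8, 2.5), 1), ((1.0, 4.0), 1),
         ((4.5, 0.5), 0), ((5.0, 1.0), 0), ((6.0, 0.2), 0)]
predict = random_forest(sites)
```

Mapping such predictions over a grid of candidate cells is what yields the "predicted unsuitable" areas that can be flagged as lower-risk for wind development.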

  9. Toward a unified system for understanding, predicting and projecting regional hurricane activity

    NASA Astrophysics Data System (ADS)

    Vecchi, G. A.; Delworth, T. L.; Yang, X.; Murakami, H.; Zhang, W.; Underwood, S.; Zeng, F. J.; Jia, L.; Kapnick, S. B.; Paffendorf, K.; Krishnamurthy, L.; Wittenberg, A. T.; Msadek, R.; Villarini, G.; Chen, J. H.; Lin, S. J.; Harris, L.; Gudgel, R.; Stern, B.; Zhang, S.

    2015-12-01

    A family of high-resolution (50km and 25km atmospheric/land resolution) global coupled climate models provide a unified framework towards the understanding, intraseasonal-to-decadal prediction and decadal to multi-decadal projection of regional and extreme climate, including tropical cyclones. Initialized predictions of global hurricane activity show skill on regional scales, comparable to the skill on basin-wide scales, suggesting that regional seasonal TC predictions may be a feasible forecast target. The 25km version of the model shows skill at seasonal predictions of the frequency of the most intense hurricanes (Cat. 3-4-5 and Cat. 4-5). It is shown that large-scale systematic errors in the mean-state are a key constraint on the simulation and prediction of variations of regional climate and extremes, and methodologies for overcoming model biases are explored. Improvements in predictions of regional climate are due both to improved representation of local processes, and to improvements in the large-scale climate and variability from improved process representation. These models are used to explore the response of tropical cyclones, both globally and regionally, to increasing greenhouse gases and to internal climate variations. The 25km model generally shows a more faithful representation of the impact of climate variability on hurricane activity than the 50km model. The response of the total number and the total power dissipation index of tropical cyclones to increasing greenhouse gases can differ substantially between models of the two atmospheric resolutions, 50km and 25km - with the 25km version of the model showing a larger increase in power dissipation from increasing greenhouse gases, principally because - in contrast to that of the 50km model - its global hurricane frequency does not decrease with increasing CO2. Some thoughts on the reasons behind those differences will be offered. 
The 25km model shows an increase in the frequency of intense tropical

  10. Abrupt CO2 experiments as tools for predicting and understanding CMIP5 representative concentration pathway projections

    NASA Astrophysics Data System (ADS)

    Good, Peter; Gregory, Jonathan M.; Lowe, Jason A.; Andrews, Timothy

    2013-02-01

    A fast simple climate modelling approach is developed for predicting and helping to understand general circulation model (GCM) simulations. We show that the simple model reproduces the GCM results accurately, for global mean surface air temperature change and global-mean heat uptake projections from 9 GCMs in the fifth coupled model inter-comparison project (CMIP5). This implies that understanding gained from idealised CO2 step experiments is applicable to policy-relevant scenario projections. Our approach is conceptually simple. It works by using the climate response to a CO2 step change taken directly from a GCM experiment. With radiative forcing from non-CO2 constituents obtained by adapting the Forster and Taylor method, we use our method to estimate results for CMIP5 representative concentration pathway (RCP) experiments for cases not run by the GCMs. We estimate differences between pairs of RCPs rather than RCP anomalies relative to the pre-industrial state. This gives better results because it makes greater use of available GCM projections. The GCMs exhibit differences in radiative forcing, which we incorporate in the simple model. We analyse the thus-completed ensemble of RCP projections. The ensemble mean changes between 1986-2005 and 2080-2099 for global temperature (heat uptake) are, for RCP8.5: 3.8 K (2.3 × 10^24 J); for RCP6.0: 2.3 K (1.6 × 10^24 J); for RCP4.5: 2.0 K (1.6 × 10^24 J); for RCP2.6: 1.1 K (1.3 × 10^24 J). The relative spread (standard deviation/ensemble mean) for these scenarios is around 0.2 and 0.15 for temperature and heat uptake respectively. We quantify the relative effect of mitigation action, through reduced emissions, via the time-dependent ratios (change in RCPx)/(change in RCP8.5), using changes with respect to pre-industrial conditions. We find that the effects of mitigation on global-mean temperature change and heat uptake are very similar across these different GCMs.
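The core idea of the step-response approach can be sketched directly: treat a scenario's forcing series as a sequence of yearly steps and superpose the model's response to a unit step. The two-exponential step response below, with its amplitudes and timescales, is a made-up placeholder, not a fit to any CMIP5 GCM:

```python
import math

def step_response(t_years, amps=(0.6, 0.4), taus=(4.0, 250.0)):
    """Normalised warming t_years after a unit forcing step, as the sum
    of two exponential modes (fast mixed layer, slow deep ocean).
    Amplitudes/timescales are illustrative only."""
    return sum(a * (1.0 - math.exp(-t_years / tau))
               for a, tau in zip(amps, taus))

def emulate(forcing):
    """Linear step-response emulator: decompose the forcing series into
    yearly increments and superpose their step responses."""
    temps = []
    for t in range(len(forcing)):
        prev = 0.0
        warming = 0.0
        for s in range(t + 1):
            d_forcing = forcing[s] - prev   # forcing increment at year s
            prev = forcing[s]
            warming += d_forcing * step_response(t - s)
        temps.append(warming)
    return temps
```

Because the emulator is linear in the forcing, scenario differences (e.g., between pairs of RCPs) superpose exactly, which is what makes the "differences between pairs of RCPs" estimation in the paper tractable.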

  11. Predicting project environmental performance under market uncertainties: case study of oil sands coke.

    PubMed

    McKellar, Jennifer M; Bergerson, Joule A; Kettunen, Janne; MacLean, Heather L

    2013-06-01

    A method combining life cycle assessment (LCA) and real options analyses is developed to predict project environmental and financial performance over time, under market uncertainties and decision-making flexibility. The method is applied to examine alternative uses for oil sands coke, a carbonaceous byproduct of processing the unconventional petroleum found in northern Alberta, Canada. Under uncertainties in natural gas price and the imposition of a carbon price, our method identifies that selling the coke to China for electricity generation by integrated gasification combined cycle is likely to be financially preferred initially, but eventually hydrogen production in Alberta is likely to be preferred. Compared to the results of a previous study that used life cycle costing to identify the financially preferred alternative, the inclusion of real options analysis adds value as it accounts for flexibility in decision-making (e.g., to delay investment), increasing the project's expected net present value by 25% and decreasing the expected life cycle greenhouse gas emissions by 11%. Different formulations of the carbon pricing policy or changes to the natural gas price forecast alter these findings. The combined LCA/real options method provides researchers and decision-makers with more comprehensive information than can be provided by either technique alone. PMID:23675646
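The value that real options analysis adds (flexibility to delay investment) can be shown with a toy one-period example: waiting lets the decision-maker invest only in the favourable state. The numbers and the single-period structure are illustrative only and far simpler than the paper's combined LCA/real options method:

```python
def deferral_option_value(npv_now, npv_up, npv_down, p_up, discount=0.95):
    """Value of investing now vs. holding the option to wait one period.
    Waiting lets the firm invest only if the favourable ('up') state
    occurs; otherwise it walks away (payoff floored at zero)."""
    invest_now = npv_now
    wait = discount * (p_up * max(npv_up, 0.0)
                       + (1 - p_up) * max(npv_down, 0.0))
    return max(invest_now, wait)
```

With a project worth 10 today but 40 or -20 next period (equal odds), waiting is worth 0.95 × 0.5 × 40 = 19, so the deferral option raises expected value well above investing immediately, the same qualitative effect as the 25% NPV increase reported above.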

  12. NERI PROJECT 99-119. TASK 2. DATA-DRIVEN PREDICTION OF PROCESS VARIABLES. FINAL REPORT

    SciTech Connect

    Upadhyaya, B.R.

    2003-04-10

    This report describes the detailed results for Task 2 of DOE-NERI project number 99-119, entitled "Automatic Development of Highly Reliable Control Architecture for Future Nuclear Power Plants". This project is a collaborative effort between the Oak Ridge National Laboratory (ORNL), the University of Tennessee, Knoxville (UTK) and the North Carolina State University (NCSU). UTK is the lead organization for Task 2 under contract number DE-FG03-99SF21906. Under Task 2 we completed the development of data-driven models for the characterization of sub-system dynamics for predicting state variables, control functions, and expected control actions. We also developed the "Principal Component Analysis (PCA)" approach for mapping system measurements, and a nonlinear system modeling approach called the "Group Method of Data Handling (GMDH)" with rational functions, which includes temporal data information for transient characterization. The majority of the results are presented in detailed reports for Phases 1 through 3 of our research, which are attached to this report.
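The PCA step described above, mapping system measurements onto their leading directions of variation, can be sketched with power iteration on the sample covariance matrix. The data are hypothetical plant measurements and the code is an illustrative sketch, not the project's implementation:

```python
import random

def principal_component(data, iters=200, seed=0):
    """Leading principal component of mean-centred data, found by power
    iteration on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centred = [[row[j] - means[j] for j in range(d)] for row in data]
    # sample covariance matrix
    cov = [[sum(centred[i][a] * centred[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    rng = random.Random(seed)
    v = [rng.random() for _ in range(d)]
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]     # renormalise each iteration
    return v

# Hypothetical measurements: the second channel tracks ~2x the first,
# so the leading component should point roughly along (1, 2)
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1), (5.0, 9.9)]
pc = principal_component(data)
```

Projecting each measurement vector onto `pc` then gives a one-dimensional summary of the dominant sub-system behaviour, the kind of mapping PCA provides here.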

  13. Geothermal Project Den Haag - 3-D models for temperature prediction and reservoir characterization

    NASA Astrophysics Data System (ADS)

    Mottaghy, D.; Pechnig, R.; Willemsen, G.; Simmelink, H. J.; Vandeweijer, V.

    2009-04-01

    In the framework of the "Den Haag Zuidwest" geothermal district heating system, a deep geothermal installation is projected. The target horizon of the planned doublet is the "Delft sandstone", which was extensively explored for oil and gas reservoirs in the last century. In the target area, this Upper Jurassic sandstone layer is found at a depth of about 2300 m, with an average thickness of about 50 m. The study presented here focuses on the prediction of reservoir temperatures and production behavior, which is crucial for planning a deep geothermal installation. In the first phase, the main objective was to find out whether the 3-dimensional structure of anticlines and synclines significantly influences the temperature field, which could cause formation temperatures to deviate from temperatures extrapolated from oil and gas exploration wells. To this end, a regional model was set up as a basis for steady-state numerical simulations. Since representative input parameters are decisive for reliable model results, all available information was compiled: a) the subsurface geometry, depth, and thickness of the stratigraphic layers known from seismic data sets, b) borehole geophysical data, and c) geological and petrographical information from exploration wells. In addition, 50 cuttings samples were taken from two selected key wells to provide direct information on the thermal properties of the underlying strata. Thermal conductivity and rock matrix density were measured in the laboratory. These data were combined with a petrophysical log analysis (gamma ray, sonic, density, and resistivity), which resulted in continuous profiles of porosity, effective thermal conductivity, and radiogenic heat production. These profiles allowed us to assess in detail the variability of the petrophysical properties with depth and to check for lateral changes between the wells. All these data entered the numerical simulations, which were performed by a 3-D
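The role of thermal conductivity and radiogenic heat production in temperature prediction can be illustrated with the standard 1-D steady-state conductive geotherm for a uniform layer. The parameter values below are generic illustrative numbers, not the Den Haag model inputs:

```python
import numpy as np

# 1-D steady-state conductive geotherm: T(z) from surface heat flow q0,
# thermal conductivity k, and radiogenic heat production A.
z = np.linspace(0.0, 2300.0, 24)   # depth (m), down to the target horizon
q0 = 0.065                         # surface heat flow (W/m^2), assumed
k = 2.5                            # thermal conductivity (W/m/K), assumed
A = 1.0e-6                         # heat production (W/m^3), assumed
T0 = 10.0                          # surface temperature (deg C)

# For a uniform layer: T(z) = T0 + (q0/k)*z - (A/(2k))*z**2
T = T0 + (q0 / k) * z - (A / (2.0 * k)) * z**2
T_target = T[-1]                   # predicted temperature at ~2300 m depth
```

A full 3-D simulation such as the one described above replaces these uniform-layer assumptions with depth- and laterally-varying properties from the log analysis, which is exactly why structural effects on the temperature field can emerge.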

  14. Predicting fire activity in the US over the next 50 years using new IPCC climate projections

    NASA Astrophysics Data System (ADS)

    Wang, D.; Morton, D. C.; Collatz, G. J.

    2012-12-01

    Fire is an integral part of the Earth system, with both direct and indirect effects on terrestrial ecosystems, the atmosphere, and human societies (Bowman et al. 2009). Climate conditions regulate fire activity in a variety of ways, e.g., by influencing the conditions for ignition and fire spread, and by changing vegetation growth and decay and thus the accumulation of fuels for combustion (Arora and Boer 2005). Our recent study showed that burned area (BA) in the US is strongly correlated with potential evaporation (PE), a measure of climatic dryness derived from National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) climate data (Morton et al. 2012). The correlation varies spatially and temporally. During peak fire seasons, the Northwestern US, Great Plains, and Alaska have the strongest BA/PE relationship. Using the recently released Global Fire Emissions Database (GFED) Version 3 (van der Werf et al. 2010), we showed increasing BA in the last decade in most NCA regions. The longer time series of Monitoring Trends in Burn Severity (MTBS) data (Eidenshink et al. 2007) showed increasing trends in all NCA regions from 1984 to 2010. This relationship between BA and PE provides the basis for predicting future fire activity under projected climate conditions. In this study, we build spatially explicit predictors using the historical PE/BA relationship. PE from 2011 to 2060 is calculated from Coupled Model Intercomparison Project Phase 5 (CMIP5) data, and the historical PE/BA relationship is then used to estimate BA. This study examines the spatial pattern and temporal dynamics of future US fires driven by new climate predictions for the next 50 years. References: Arora, V.K., & Boer, G.J. (2005). Fire as an interactive component of dynamic vegetation models. Journal of Geophysical Research-Biogeosciences, 110. Bowman, D.M.J.S., Balch, J.K., Artaxo, P., Bond, W.J., Carlson, J.M., Cochrane, M.A., D
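The core of the approach above — fit a historical PE/BA relationship, then drive it with projected PE — can be sketched in a few lines. The data are synthetic, and the log-linear functional form is one plausible choice, not necessarily the study's:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical historical record for one region: annual potential
# evaporation (PE, mm) and burned area (BA, kha), with BA rising with PE
pe_hist = rng.uniform(600, 1100, size=30)
ba_hist = np.exp(0.004 * pe_hist + rng.normal(0, 0.2, size=30))

# Fit a log-linear PE/BA relationship: log(BA) = a*PE + b
a, b = np.polyfit(pe_hist, np.log(ba_hist), 1)

# Apply the fitted relationship to projected PE (e.g., from CMIP5 output)
pe_future = np.array([950.0, 1000.0, 1050.0])
ba_pred = np.exp(a * pe_future + b)
# Drier projected climates (higher PE) imply larger predicted burned area
```

In the study this regression is built spatially explicitly (per grid cell or region), so the sign and strength of `a` vary across the US rather than being a single coefficient.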

  15. Fault kinematics and retro-deformation analysis for prediction of potential leakage pathways - joint project PROTECT

    NASA Astrophysics Data System (ADS)

    Ziesch, Jennifer; Tanner, David C.; Dance, Tess; Beilecke, Thies; Krawczyk, Charlotte M.

    2014-05-01

    Within the context of long-term CO2 storage integrity, we determine the seismic and sub-seismic characteristics of potential fluid migration pathways between reservoir and surface. As a part of the PROTECT project we focus on the sub-seismic faults of the CO2CRC Otway Project pilot site in Australia. We carried out a detailed interpretation of 3D seismic data and built a geological 3D model of 8 km x 7 km x 4.5 km (depth). The model comprises triangulated surfaces of 8 stratigraphic horizons and 24 large-scale faults with 75 m grid size. We have confirmed that the site comprises a complex system of south-dipping normal faults and north-dipping antithetic normal faults. Good knowledge of the kinematics of the large-scale faults is essential to predict sub-seismic structures. For this reason, preconditioning analyses such as thickness maps, fault curvature, cylindricity and connectivity studies, as well as Allan mapping, were carried out. The most important finding is that two different types of fault kinematics were simultaneously active: dip-slip, and a combination of dip-slip with dextral strike-slip movement. Using these input parameters, stratigraphic volumes are kinematically restored along the large-scale faults, taking fault topography into account (retro-deformation). At the same time, the stratigraphic volumes are analyzed with respect to sub-seismic strain variation; from this we produce strain tensor maps to locate highly deformed or fractured zones and their orientation within the stratigraphic volumes. We will discuss the results in the framework of possible fluid/gas migration pathways and communication between the storage reservoir and the overburden. This provides a tool to predict CO2 leakage and thus to adapt time-dependent monitoring strategies for subsurface storage in general. Acknowledgement: This work was sponsored in part by the Australian Commonwealth Government through the Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC).
PROTECT

  16. Prediction of sub-seismic faults and fractures to ensure carbon traps - joint project PROTECT

    NASA Astrophysics Data System (ADS)

    Ziesch, Jennifer; Tanner, David C.; Beilecke, Thies; Krawczyk, Charlotte M.

    2015-04-01

    Deformation in the form of fractures and faults affects many reservoirs and their overburden. In a 3-D seismic data set we can identify faults on the large scale, while in well data we observe small-scale fractures. A large number of faults at the intermediate (sub-seismic) scale also play a very important role, but these are not detectable with conventional geophysical methods. Therefore, within the context of long-term CO2 storage integrity, we use the retro-deformation approach to determine the characteristics of potential fluid migration pathways between reservoir and surface. This allows us to produce strain maps with which to analyse fault behaviour at the sub-seismic scale. As part of the PROTECT (prediction of deformation to ensure carbon traps) project we focus on the sub-seismic faults of the CO2CRC Otway Project site in Australia. We interpreted a geological 3-D model of 8 km x 7 km x 4.5 km that comprises 8 stratigraphic horizons and 24 large-scale faults. This confirmed that the site contains a complex system of south-dipping normal faults and north-dipping antithetic normal faults. The most important finding is that two different types of fault kinematics were simultaneously active: dip-slip, and a combination of dip-slip with dextral strike-slip movement. After retro-deformation of the 3-D model, we calculated strain tensor maps to locate highly deformed or fractured zones and their orientation within the stratigraphic volume. The e1-strain magnitude shows a heterogeneous distribution: the south of the study area is at least twice as fractured at the sub-seismic scale. Four major faults act as "controlling faults", with smaller faults in between. The overburden is tilted northwards after retro-deformation; thus, we believe that the area was affected by an even larger normal fault outside the study area. In summary, this study reveals that good knowledge of the kinematics of the large-scale faults is essential to predict sub-seismic structures

  17. EU Framework 6 Project: Predictive Toxicology (PredTox)-overview and outcome

    SciTech Connect

    Suter, Laura; Schroeder, Susanne; Meyer, Kirstin; Gautier, Jean-Charles; Amberg, Alexander; Wendt, Maria; Gmuender, Hans; Mally, Angela; Boitier, Eric; Ellinger-Ziegelbauer, Heidrun; Matheis, Katja; Pfannkuch, Friedlieb

    2011-04-15

    In this publication, we report the outcome of the integrated EU Framework 6 Project: Predictive Toxicology (PredTox), including methodological aspects and overall conclusions. Specific details including data analysis and interpretation are reported in separate articles in this issue. The project, partly funded by the EU, was carried out by a consortium of 15 pharmaceutical companies, 2 SMEs, and 3 universities. The effects of 16 test compounds were characterized using conventional toxicological parameters and 'omics' technologies. The three major observed toxicities, liver hypertrophy, bile duct necrosis and/or cholestasis, and kidney proximal tubular damage were analyzed in detail. The combined approach of 'omics' and conventional toxicology proved a useful tool for mechanistic investigations and the identification of putative biomarkers. In our hands and in combination with histopathological assessment, target organ transcriptomics was the most prolific approach for the generation of mechanistic hypotheses. Proteomics approaches were relatively time-consuming and required careful standardization. NMR-based metabolomics detected metabolite changes accompanying histopathological findings, providing limited additional mechanistic information. Conversely, targeted metabolite profiling with LC/GC-MS was very useful for the investigation of bile duct necrosis/cholestasis. In general, both proteomics and metabolomics were supportive of other findings. Thus, the outcome of this program indicates that 'omics' technologies can help toxicologists to make better informed decisions during exploratory toxicological studies. The data support that hypothesis on mode of action and discovery of putative biomarkers are tangible outcomes of integrated 'omics' analysis. Qualification of biomarkers remains challenging, in particular in terms of identification, mechanistic anchoring, appropriate specificity, and sensitivity.

  18. EU framework 6 project: predictive toxicology (PredTox)--overview and outcome.

    PubMed

    Suter, Laura; Schroeder, Susanne; Meyer, Kirstin; Gautier, Jean-Charles; Amberg, Alexander; Wendt, Maria; Gmuender, Hans; Mally, Angela; Boitier, Eric; Ellinger-Ziegelbauer, Heidrun; Matheis, Katja; Pfannkuch, Friedlieb

    2011-04-15

    In this publication, we report the outcome of the integrated EU Framework 6 Project: Predictive Toxicology (PredTox), including methodological aspects and overall conclusions. Specific details including data analysis and interpretation are reported in separate articles in this issue. The project, partly funded by the EU, was carried out by a consortium of 15 pharmaceutical companies, 2 SMEs, and 3 universities. The effects of 16 test compounds were characterized using conventional toxicological parameters and "omics" technologies. The three major observed toxicities, liver hypertrophy, bile duct necrosis and/or cholestasis, and kidney proximal tubular damage were analyzed in detail. The combined approach of "omics" and conventional toxicology proved a useful tool for mechanistic investigations and the identification of putative biomarkers. In our hands and in combination with histopathological assessment, target organ transcriptomics was the most prolific approach for the generation of mechanistic hypotheses. Proteomics approaches were relatively time-consuming and required careful standardization. NMR-based metabolomics detected metabolite changes accompanying histopathological findings, providing limited additional mechanistic information. Conversely, targeted metabolite profiling with LC/GC-MS was very useful for the investigation of bile duct necrosis/cholestasis. In general, both proteomics and metabolomics were supportive of other findings. Thus, the outcome of this program indicates that "omics" technologies can help toxicologists to make better informed decisions during exploratory toxicological studies. The data support that hypothesis on mode of action and discovery of putative biomarkers are tangible outcomes of integrated "omics" analysis. Qualification of biomarkers remains challenging, in particular in terms of identification, mechanistic anchoring, appropriate specificity, and sensitivity. PMID:20955723

  19. Introduction of the NWP Model Development Project at Korea Institute of Atmospheric Prediction Systems - KIAPS

    NASA Astrophysics Data System (ADS)

    Kim, Y.

    2012-12-01

    Korea Meteorological Administration (KMA) launched a 9-year project in 2011 to develop Korea's own global NWP system, with total funding of about 100 million US dollars. To lead the effort, the Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded by KMA as a non-profit foundation. The project consists of three stages. We are in the middle of the first stage (2011-2013), which is to set up the Institute, recruit researchers, lay out plans for research and development, design the basic structure, and explore/develop core NWP technologies. The second stage (2014-2016) aims at developing the modules for the dynamical core, physical parameterizations, and data assimilation systems, as well as the system framework and couplers to connect the modules in a systematic and efficient way, eventually building a prototype NWP system. The third stage (2017-2019) is for evaluating the prototype system by selecting/improving modules and refining/finalizing it for operational use at KMA, as well as developing the necessary post-processing systems. In 2012, we are designing key modules for the dynamical core by adopting existing cores and/or developing new ones, developing the barotropic model first and the baroclinic model later, with code parallelization and optimization in mind. We are collecting various physical parameterization schemes, mostly developed by Korean scientists, and evaluating and improving them using single-column and LES models. We are designing control variables for variational data assimilation systems, constructing testbeds for observational data pre-processing systems, developing linear models for a barotropic system, and designing modules for cost-function minimization. We are developing the module framework, which is flexible with respect to prognostic and diagnostic variables, designing the I/O structure of the system, coupling modules for external systems, and also developing post-processing systems. At the meeting, we will present the

  20. Aeroheating Testing and Predictions for Project Orion CEV at Turbulent Conditions

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Berger, Karen T.; Horvath, Thomas J.; Coblish, Joseph J.; Norris, Joseph D.; Lillard, Randolph P.; Kirk, Benjamin S.

    2009-01-01

    An investigation of the aeroheating environment of the Project Orion Crew Exploration Vehicle was performed in the Arnold Engineering Development Center Hypervelocity Wind Tunnel No. 9 Mach 8 and Mach 10 nozzles and in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel. Heating data were obtained using a thermocouple-instrumented approximately 0.035-scale model (0.1778-m/7-inch diameter) of the flight vehicle. Runs were performed in the Tunnel 9 Mach 10 nozzle at free-stream unit Reynolds numbers of 1×10^6/ft to 20×10^6/ft, in the Tunnel 9 Mach 8 nozzle at free-stream unit Reynolds numbers of 8×10^6/ft to 48×10^6/ft, and in the 20-Inch Mach 6 Air Tunnel at free-stream unit Reynolds numbers of 1×10^6/ft to 7×10^6/ft. In both facilities, enthalpy levels were low and the test gas (N2 in Tunnel 9 and air in the 20-Inch Mach 6) behaved as a perfect gas. These test conditions produced laminar, transitional, and turbulent data in the Tunnel 9 Mach 10 nozzle, transitional and turbulent data in the Tunnel 9 Mach 8 nozzle, and laminar and transitional data in the 20-Inch Mach 6 Air Tunnel. Laminar and turbulent predictions were generated for all wind tunnel test conditions, and comparisons were performed with the experimental data to help define the accuracy of the computational method. In general, it was found that both laminar and turbulent data and predictions agreed to within the estimated 12% experimental uncertainty. Laminar heating distributions from all three data sets were shown to correlate well and demonstrated Reynolds number independence when expressed in terms of the Stanton number based on adiabatic wall-recovery enthalpy. Transition onset locations on the leeside centerline were determined from the data and correlated in terms of boundary-layer parameters. Finally, turbulent heating augmentation ratios were determined for several body-point locations and correlated in terms of the
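The Stanton number based on adiabatic wall-recovery enthalpy, used above to correlate the heating distributions, has the standard definition sketched below. The numerical values are purely illustrative, and the laminar recovery factor r ≈ 0.85 is a textbook assumption, not a value from the test report:

```python
def recovery_enthalpy(h_e, u_e, r=0.85):
    """Adiabatic wall (recovery) enthalpy: h_aw = h_e + r*u_e**2/2,
    with a laminar recovery factor r ~ sqrt(Pr) ~ 0.85 (assumed)."""
    return h_e + r * u_e**2 / 2.0

def stanton_number(q_wall, rho_e, u_e, h_aw, h_w):
    """Stanton number based on recovery enthalpy:
    St = q_w / (rho_e * u_e * (h_aw - h_w))."""
    return q_wall / (rho_e * u_e * (h_aw - h_w))

# Illustrative edge conditions (not the tunnel data):
h_aw = recovery_enthalpy(2.0e5, 3000.0)          # J/kg
st = stanton_number(1.0e5, 0.01, 3000.0, h_aw, 5.0e5)
```

Normalizing heat flux this way removes the direct dependence on wall temperature and edge conditions, which is why laminar distributions from different tunnels and Reynolds numbers collapse onto one curve.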

  1. The land use plan and water quality prediction for the Saemangeum reclamation project.

    PubMed

    Hwang, D H; Choi, J Y; Yi, S M; Han, D H; Jang, S H

    2009-01-01

    With the final closure of the world's longest sea dike (33 km), the use of the Saemangeum reclaimed land has become an issue in Korea. The Korean government has proclaimed that the Saemangeum Reclamation Project will be handled in an environmentally friendly manner, but its effect on the water quality of the reservoirs has always been controversial. This study was conducted to estimate the water quality of the Saemangeum reservoir using WASP5, according to the new land use plan adopted in 2007. Predictions of water quality show that the Dongjin reservoir would meet the standards for COD, T-P, and Chl-a if the wastewater from the Dongjin region were properly managed. However, T-P and Chl-a in the Mangyeong reservoir would exceed the standards even without releasing the treated wastewater into the reservoir. With further reductions of 20% in T-P and Chl-a at the mouth of the Mangyeong river, the water quality standards in the reservoir were achieved. This means that additional schemes, beyond the water quality management programs established in the Government Master Plan in 2001, should be considered. Although the Saemangeum reservoir would manage to achieve the standards, it will enter a eutrophic state due to the high concentration of nutrients. PMID:19381006

  2. The CONVEX project - Using Observational Evidence and Process Understanding to Improve Predictions of Extreme Rainfall Change

    NASA Astrophysics Data System (ADS)

    Fowler, Hayley; Kendon, Elizabeth; Blenkinsop, Stephen; Chan, Steven; Ferro, Christopher; Roberts, Nigel; Stephenson, David; Jones, Richard; Sessford, Pat

    2013-04-01

    During the last decade, widespread major flood events in the UK and across the rest of Europe have focussed attention on perceived increases in rainfall intensities. Whilst Regional Climate Models (RCMs) are able to simulate the magnitude and spatial pattern of observed daily extreme rainfall events more reliably than General Circulation Models (GCMs), they still underestimate extreme rainfall relative to observations. Particularly during summer, a large proportion of precipitation comes from convective storms that are typically too small to be explicitly represented by climate models. Instead, convection parameterisation schemes are necessary to represent the larger-scale effect of unresolved convective cells. Given the deficiencies in the simulation of extreme rainfall by climate models, even in the current generation of high-resolution RCMs, the CONVEX project (CONVective EXtremes) argues that an integrated approach is needed that brings together observations, basic understanding, and models. This should go hand in hand with a shift from a focus on traditional validation exercises (comparing modelled and observed extremes) to an understanding and quantification of the causes of model deficiencies in the simulation of extreme rainfall processes on different spatial and temporal scales. This is particularly true for localised intense summer convection. CONVEX therefore aims to contribute to the goals of enabling society to respond to global climate change and of predicting the regional and local impacts of environmental change. In addition to an improved understanding of the spatial-temporal characteristics of extreme rainfall processes (principally in the UK), the project is also assessing the influence of model parameterisations and resolution on the simulation of extreme rainfall events and processes. This includes the running of new RCM simulations undertaken by the UK Meteorological Office at 50 km and 12 km resolutions (parameterised convection) and

  3. A Comprehensive Framework for Quantitative Evaluation of Downscaled Climate Predictions and Projections

    NASA Astrophysics Data System (ADS)

    Barsugli, J. J.; Guentchev, G.

    2012-12-01

    The variety of methods used for downscaling climate predictions and projections is large and growing larger. Comparative studies of downscaling techniques to date are often initiated in relation to specific projects, are focused on limited sets of downscaling techniques, and hence do not allow for easy comparison of outcomes. In addition, existing information about the quality of downscaled datasets is not available in digital form. There is a strong need for systematic evaluation of downscaling methods using standard protocols which will allow for a fair comparison of their advantages and disadvantages with respect to specific user needs. The National Climate Predictions and Projections platform, with the contributions of NCPP's Climate Science Advisory Team, is developing community-based standards and a prototype framework for the quantitative evaluation of downscaling techniques and datasets. Certain principles guide the development of this framework. We want the evaluation procedures to be reproducible and transparent, simple to understand, and straightforward to implement. To this end we propose a set of open standards that will include the use of specific data sets, time periods of analysis, evaluation protocols, evaluation tests and metrics. Secondly, we want the framework to be flexible and extensible to downscaling techniques which may be developed in the future, to high-resolution global models, and to evaluations that are meaningful for additional applications and sectors. Collaboration among practitioners who will be using the downscaled data and climate scientists who develop downscaling methods will therefore be essential to the development of this framework. The proposed framework consists of three analysis protocols, along with two tiers of specific metrics and indices that are to be calculated. 
The protocols describe the following types of evaluation that can be performed: 1) comparison to observations, 2) comparison to a "perfect model" simulation
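A minimal sketch of the first protocol (comparison to observations) using common evaluation metrics follows. The data are synthetic, and bias, RMSE, and an upper-quantile check are generic choices for illustration, not the platform's specified metric tiers:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic "observed" daily precipitation (mm) and a hypothetical
# downscaled series with a deliberate wet bias
obs = rng.gamma(2.0, 3.0, size=365)
downscaled = obs + rng.normal(0.5, 1.0, size=365)

bias = np.mean(downscaled - obs)                     # mean error
rmse = np.sqrt(np.mean((downscaled - obs) ** 2))     # overall error magnitude
# Quantile-based skill: compare upper-tail behaviour, not just the mean
q95_obs = np.quantile(obs, 0.95)
q95_ds = np.quantile(downscaled, 0.95)
```

Fixing the datasets, periods, and metrics in an open standard, as proposed, is what makes numbers like these comparable across downscaling methods rather than specific to one project's setup.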

  4. Projected climate change impacts and short term predictions on staple crops in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Mereu, V.; Spano, D.; Gallo, A.; Carboni, G.

    2013-12-01

    Multiple combinations of soils and climate conditions, crop management, and varieties were considered for the different Agro-Ecological Zones. The climate impact was assessed using future climate predictions, statistically and/or dynamically downscaled, for specific areas. The direct and indirect effects of the CO2 concentrations projected for future periods were explored separately to estimate their effects on crops. Several adaptation strategies (e.g., introduction of full irrigation, shifting the ordinary sowing/planting date, changes in ordinary fertilization management) were also evaluated, with the aim of reducing the negative impact of climate change on crop production. The results of the study, analyzed at local, AEZ, and country level, will be discussed.

  5. Evaluation of numerical weather predictions performed in the context of the project DAPHNE

    NASA Astrophysics Data System (ADS)

    Tegoulias, Ioannis; Pytharoulis, Ioannis; Bampzelis, Dimitris; Karacostas, Theodore

    2014-05-01

    The region of Thessaly in central Greece is one of the main areas of agricultural production in Greece. Severe weather phenomena affect the agricultural production in this region, with adverse effects for farmers and the national economy. For this reason, the project DAPHNE aims at tackling the problem of drought by means of weather modification, through the development of the necessary tools to support the application of a rainfall enhancement program. In the present study, the numerical weather prediction system WRF-ARW is used in order to assess its ability to represent extreme weather phenomena in the region of Thessaly. WRF is integrated over three domains covering Europe, the Eastern Mediterranean, and Central-Northern Greece (Thessaly and a large part of Macedonia) using telescoping nesting with grid spacings of 15 km, 5 km, and 1.667 km, respectively. The cases examined span the transitional and warm periods (April to September) of the years 2008 to 2013, including days with thunderstorm activity. Model results are evaluated against all available surface observations and radar products, taking into account the spatial characteristics and intensity of the storms. Preliminary results indicate a good level of agreement between the simulated and observed fields for standard parameters (such as temperature, humidity, and precipitation). Moreover, the model generally exhibits the potential to represent the occurrence of convective activity, but not its exact spatiotemporal characteristics. Acknowledgements: This research work has been co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013)

  6. Track infrared point targets based on projection coefficient templates and non-linear correlation combined with Kalman prediction

    NASA Astrophysics Data System (ADS)

    Liu, Ruiming; Li, Xuelong; Han, Lei; Meng, Jiao

    2013-03-01

    Tracking IR point targets has long been a challenging task. We propose a tracking framework based on template matching combined with Kalman prediction. First, a novel template matching method for detecting infrared point targets is presented. Unlike classic template matching, the projection coefficients obtained from principal component analysis are used as templates, and the non-linear correlation coefficient is used to measure the matching degree. Because the non-linear correlation can capture higher-order statistics, detection performance is improved greatly. Second, a framework for tracking point targets, based on the proposed detection method and Kalman prediction, is developed. Kalman prediction reduces the search region for the detection method and, in turn, the detection method provides a more precise measurement for the Kalman prediction; each improves the other. Experimental results show that this framework is capable of tracking infrared point targets.
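The predict-then-detect loop described above can be sketched with a constant-velocity Kalman filter whose prediction centers the template-matching search window, and whose update consumes the matched position. The noise covariances and initial state below are illustrative assumptions, and the detection step itself is represented by a hypothetical measured position:

```python
import numpy as np

# Constant-velocity Kalman filter for a point target in image coordinates.
# State: [x, y, vx, vy]; measurement: detected position [x, y].
dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)   # process noise (assumed)
R = 1.0 * np.eye(2)    # measurement noise (assumed)

x = np.array([10.0, 10.0, 1.0, 0.5])   # initial state (assumed)
P = np.eye(4)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Prediction centers the template-matching search window; the matched
# position then feeds back as the Kalman measurement.
x_pred, P_pred = predict(x, P)
window_center = (H @ x_pred).astype(int)   # search here, not the whole frame
x, P = update(x_pred, P_pred, np.array([11.2, 10.4]))  # matched position
```

Restricting the correlation search to a small window around `window_center` is what makes the mutual reinforcement work: fewer candidate positions means fewer false matches, and the resulting measurement tightens the next prediction.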

  7. Predicting environmental mitigation requirements for hydropower projects through the integration of biophysical and socio-political geographies.

    PubMed

    DeRolph, Christopher R; Schramm, Michael P; Bevelhimer, Mark S

    2016-10-01

    Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we examine the feasibility of using a suite of multi-faceted explanatory variables within a spatially explicit modeling framework to fit predictive models of future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprising mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements are functions of a range of factors, from biophysical to socio-political. Project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation. PMID:27280379

  8. Predicting environmental mitigation requirements for hydropower projects through the integration of biophysical and socio-political geographies

    DOE PAGES

    Bevelhimer, Mark S.; DeRolph, Christopher R.; Schramm, Michael P.

    2016-06-06

    Uncertainty about environmental mitigation needs at existing and proposed hydropower projects makes it difficult for stakeholders to minimize environmental impacts. Hydropower developers and operators desire tools to better anticipate mitigation requirements, while natural resource managers and regulators need tools to evaluate different mitigation scenarios and order effective mitigation. Here we sought to examine the feasibility of using a suite of multidisciplinary explanatory variables within a spatially explicit modeling framework to fit predictive models for future environmental mitigation requirements at hydropower projects across the conterminous U.S. Using a database comprised of mitigation requirements from more than 300 hydropower project licenses, we were able to successfully fit models for nearly 50 types of environmental mitigation and to apply the predictive models to a set of more than 500 non-powered dams identified as having hydropower potential. The results demonstrate that mitigation requirements have been a result of a range of factors, from biological and hydrological to political and cultural. Furthermore, project developers can use these models to inform cost projections and design considerations, while regulators can use the models to more quickly identify likely environmental issues and potential solutions, hopefully resulting in more timely and more effective decisions on environmental mitigation.

  9. The Proteome Folding Project: Proteome-scale prediction of structure and function

    PubMed Central

    Drew, Kevin; Winters, Patrick; Butterfoss, Glenn L.; Berstis, Viktors; Uplinger, Keith; Armstrong, Jonathan; Riffle, Michael; Schweighofer, Erik; Bovermann, Bill; Goodlett, David R.; Davis, Trisha N.; Shasha, Dennis; Malmström, Lars; Bonneau, Richard

    2011-01-01

    The incompleteness of proteome structure and function annotation is a critical problem for biologists and, in particular, severely limits interpretation of high-throughput and next-generation experiments. We have developed a proteome annotation pipeline based on structure prediction, where function and structure annotations are generated using an integration of sequence comparison, fold recognition, and grid-computing-enabled de novo structure prediction. We predict protein domain boundaries and three-dimensional (3D) structures for protein domains from 94 genomes (including human, Arabidopsis, rice, mouse, fly, yeast, Escherichia coli, and worm). De novo structure predictions were distributed on a grid of more than 1.5 million CPUs worldwide (World Community Grid). We generated significant numbers of new confident fold annotations (9% of domains that are otherwise unannotated in these genomes). We demonstrate that predicted structures can be combined with annotations from the Gene Ontology database to predict new and more specific molecular functions. PMID:21824995

  10. Predicting and Mapping Potential Whooping Crane Stopover Habitat to Guide Site Selection for Wind Energy Projects

    EPA Science Inventory

    Migration is one of the most poorly understood components of a bird’s life cycle. For that reason, migratory stopover habitats are often not part of conservation planning and may be overlooked when planning new development projects. This project highlights and addresses an overl...

  11. Performance predictions for mechanical excavators in Yucca Mountain tuffs; Yucca Mountain Site Characterization Project

    SciTech Connect

    Ozdemir, L.; Gertsch, L.; Neil, D.; Friant, J.

    1992-09-01

    The performances of several mechanical excavators are predicted for use in the tuffs at Yucca Mountain: tunnel boring machines, the Mobile Miner, a roadheader, a blind shaft borer, a vertical wheel shaft boring machine, raise drills, and V-Moles. The work summarized comprises three parts: initial predictions using existing rock physical property information; measurement of additional rock physical properties; and revision of the initial predictions using the enhanced database. The performance predictions are based on theoretical and empirical relationships between rock properties and the forces experienced by rock cutters and bits during excavation. Machine backup systems and excavation design aspects, such as curves and grades, are considered in determining excavator utilization factors. Instantaneous penetration rate, advance rate, and cutter costs are the fundamental performance indicators.

  12. Predictive In Vitro Screening of Environmental Chemicals – The ToxCast Project

    EPA Science Inventory

    ToxCast, the United States Environmental Protection Agency’s chemical prioritization research program, is developing methods for utilizing computational chemistry and bioactivity profiling to predict potential for toxicity and prioritize limited testing resources (www.epa.gov/toc...

  13. Analysis and Prediction of User Editing Patterns in Ontology Development Projects

    PubMed Central

    Wang, Hao; Tudorache, Tania; Dou, Dejing; Noy, Natalya F.; Musen, Mark A.

    2014-01-01

    The development of real-world ontologies is a complex undertaking, commonly involving a group of domain experts with different expertise who work together in a collaborative setting. These ontologies are usually large scale and have complex structures. To assist in the authoring process, ontology tools are key to making the editing process as streamlined as possible. Being able to predict confidently what users are likely to do next as they edit an ontology will enable us to focus and structure the user interface accordingly and to facilitate more efficient interaction and information discovery. In this paper, we use data mining, specifically association rule mining, to investigate whether we are able to predict the next editing operation that a user will make based on the change history. We simulated and evaluated continuous prediction across time using a sliding window model. We used association rule mining to generate patterns from the ontology change logs in the training window and tested these patterns on logs in the adjacent testing window. We also evaluated the impact of different training and testing window sizes on the prediction accuracies. Finally, we evaluated our prediction accuracies across different user groups and different ontologies. Our results indicate that we can indeed predict the next editing operation a user is likely to make. We will use the discovered editing patterns to develop a recommendation module for our editing tools, and to design user interface components that better fit user editing behaviors. PMID:26052350
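The windowed train-then-test scheme described in this abstract can be sketched in a few lines of Python. The sketch below simplifies the mining step to first-order (previous operation -> most likely next operation) rules rather than full association rule mining, and the operation names in the sample log are invented for illustration:

```python
from collections import Counter, defaultdict

def mine_rules(ops, min_support=2):
    """Mine first-order rules (previous op -> most likely next op)
    from a training window, keeping rules seen >= min_support times."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(ops, ops[1:]):
        counts[prev][nxt] += 1
    return {prev: c.most_common(1)[0][0]
            for prev, c in counts.items()
            if c.most_common(1)[0][1] >= min_support}

def sliding_window_accuracy(log, train_size, test_size):
    """Train on one window, test on the adjacent window, then slide."""
    hits = total = 0
    for start in range(0, len(log) - train_size - test_size + 1, test_size):
        train = log[start:start + train_size]
        test = log[start + train_size:start + train_size + test_size]
        rules = mine_rules(train)
        for prev, actual in zip(test, test[1:]):
            if prev in rules:
                total += 1
                hits += (rules[prev] == actual)
    return hits / total if total else 0.0

# Invented editing log: a user alternating between two operations.
log = ["AddClass", "AddLabel"] * 6
acc = sliding_window_accuracy(log, train_size=6, test_size=3)  # -> 1.0
```

On this perfectly regular log the mined rules predict every held-out operation; real change logs would of course yield lower accuracies.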

  14. Retrospective Exposure Estimation and Predicted versus Observed Serum Perfluorooctanoic Acid Concentrations for Participants in the C8 Health Project

    PubMed Central

    Vieira, Verónica M.; Ryan, P. Barry; Steenland, Kyle; Bartell, Scott M.

    2011-01-01

    Background: People living or working in eastern Ohio and western West Virginia have been exposed to perfluorooctanoic acid (PFOA) released by DuPont Washington Works facilities. Objectives: Our objective was to estimate historical PFOA exposures and serum concentrations experienced by 45,276 non-occupationally exposed participants in the C8 Health Project who consented to share their residential histories and a 2005–2006 serum PFOA measurement. Methods: We estimated annual PFOA exposure rates for each individual based on predicted calibrated water concentrations and predicted air concentrations using an environmental fate and transport model, individual residential histories, and maps of public water supply networks. We coupled individual exposure estimates with a one-compartment absorption, distribution, metabolism, and excretion (ADME) model to estimate time-dependent serum concentrations. Results: For all participants (n = 45,276), predicted and observed median serum concentrations in 2005–2006 are 14.2 and 24.3 ppb, respectively [Spearman’s rank correlation coefficient (rs) = 0.67]. For participants who provided daily public well water consumption rate and who had the same residence and workplace in one of six municipal water districts for 5 years before the serum sample (n = 1,074), predicted and observed median serum concentrations in 2005–2006 are 32.2 and 40.0 ppb, respectively (rs = 0.82). Conclusions: Serum PFOA concentrations predicted by linked exposure and ADME models correlated well with observed 2005–2006 human serum concentrations for C8 Health Project participants. These individualized retrospective exposure and serum estimates are being used in a variety of epidemiologic studies being conducted in this region. PMID:21813367
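The linked exposure-and-ADME approach can be illustrated with a minimal one-compartment model in Python: each year's absorbed dose raises the serum concentration by dose/Vd, and first-order elimination then decays it. All parameter values and doses below are illustrative placeholders, not the study's calibrated values:

```python
import math

def serum_concentration(yearly_doses_ug, half_life_yr=3.5,
                        vd_l_per_kg=0.17, body_weight_kg=70.0):
    """One-compartment ADME sketch: yearly absorbed PFOA doses (ug/yr)
    -> serum concentration (ug/L, i.e. ppb) at the end of each year.
    Parameter values are illustrative assumptions, not study values."""
    k = math.log(2) / half_life_yr       # first-order elimination rate (1/yr)
    vd = vd_l_per_kg * body_weight_kg    # volume of distribution (L)
    conc, history = 0.0, []
    for dose in yearly_doses_ug:
        conc += dose / vd                # annual dose absorbed into the compartment
        conc *= math.exp(-k)             # first-order elimination over one year
        history.append(conc)
    return history

profile = serum_concentration([500.0] * 20)   # constant exposure for 20 years
```

Under constant exposure the modeled concentration rises toward a steady state; when exposure stops, it decays at the elimination rate, which is the behavior such models exploit for retrospective estimation.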

  15. Model Predictions via History Matching of CO2 Plume Migration at the Sleipner Project, Norwegian North Sea

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Lu, P.; Zhu, C.; Zhang, Z.; Agarwal, R.

    2013-12-01

    The Sleipner Project in Norway is the world's first industrial-scale geological carbon storage project. Starting in 1996, CO2 separated from natural gas has been injected into the Utsira Sand at a rate of approximately 1 million metric tons of CO2 per year. To date, more than 15 Mt of CO2 has been injected. Seismic surveys of the site were conducted prior to injection in 1994, and then repeated in 1999, 2001, 2004, 2006, and 2008. These surveys have produced high-fidelity 4D seismic data that delineate the CO2 plume migration history. The Sleipner Project therefore provides a somewhat unique opportunity to simulate the dynamics of CO2 plume migration in a real geological system, which in turn gives insights into the uncertainties in CO2 plume migration prediction. The results will complement the numerous 'concept models' proliferated in the literature, which use idealized geometry and homogeneous geological properties. We have developed a multi-phase reactive flow reservoir model of Layer 9 of the Utsira Sand for the Sleipner project using the Computer Modeling Group's reservoir simulator GEM and the DOE TOUGH2 simulator. We calibrated the model in both simulators against the time-lapsed seismic monitoring data. Our simulation results match the extent of CO2 plume migration from 1999 to 2008. The successful match with historic plume development was aided by introducing permeability anisotropy and a second feeder to Layer 9. Predicted gas saturation, thickness of the CO2 accumulation, and CO2 solubility in brine are also comparable with interpretations of the seismic data in the literature. The model in both simulators calculated that ~9% of the total injected CO2 is dissolved in brine, which is comparable to estimates (5-10%) based on seismic data interpretation. Our reservoir model was based on Statoil's geological model of the Utsira formation and grid mesh. Our simulation results illustrate that the actual behaviors of the injected CO2 plume conform

  16. Reducing Uncertainties in Model Predictions via History Matching of CO2 Plume Migration at the Sleipner Project, Norwegian North Sea

    NASA Astrophysics Data System (ADS)

    Lu, P.; Zhu, C.; Aagaard, P.

    2012-12-01

    The Sleipner Project in Norway is the world's first industrial-scale geological carbon sequestration project. Starting in 1996, CO2 separated from natural gas has been injected into the Utsira Sand at a rate of approximately 1 million metric tons of CO2 per year. To date, more than 15 Mt of CO2 has been injected. Seismic surveys of the site were conducted prior to injection in 1994, and then repeated in 1999, 2001, 2004, 2006, and 2008; 2010 survey data have not yet been released by Statoil. These surveys have produced high-fidelity 4D seismic data that delineate the CO2 plume migration history. The Sleipner Project therefore provides a somewhat unique opportunity to simulate the dynamics of CO2 plume migration in a real geological system, which in turn gives insights into the uncertainties in CO2 plume migration prediction. The results will complement the numerous "concept models" proliferated in the literature, which use idealized geometry and homogeneous geological properties. We have developed a multi-phase reactive flow reservoir model of Layer 9 of the Utsira Sand for the Sleipner project using the Computer Modeling Group's reservoir simulator GEM®. We calibrated the model against the time-lapsed seismic monitoring data. Our simulation results match the extent of CO2 plume migration from 1999 to 2008. The successful match with historic plume development was aided by introducing permeability anisotropy and a second feeder to Layer 9. Predicted gas saturation, thickness of the CO2 accumulation, and CO2 solubility in brine are also comparable with interpretations of the seismic data in the literature. The model calculated that ~9% of the total injected CO2 is dissolved in brine, which is comparable to estimates (5-10%) based on seismic data interpretation. Our reservoir model was based on Statoil's geological model of the Utsira formation and grid mesh. Our simulation results illustrate that the actual behaviors of the injected CO2 plume conform to the

  17. Derivation and Evaluation of a Risk-Scoring Tool to Predict Participant Attrition in a Lifestyle Intervention Project.

    PubMed

    Jiang, Luohua; Yang, Jing; Huang, Haixiao; Johnson, Ann; Dill, Edward J; Beals, Janette; Manson, Spero M; Roubideaux, Yvette

    2016-05-01

    Participant attrition in clinical trials and community-based interventions is a serious, common, and costly problem. In order to develop a simple predictive scoring system that can quantify the risk of participant attrition in a lifestyle intervention project, we analyzed data from the Special Diabetes Program for Indians Diabetes Prevention Program (SDPI-DP), an evidence-based lifestyle intervention to prevent diabetes in 36 American Indian and Alaska Native communities. SDPI-DP participants were randomly divided into a derivation cohort (n = 1600) and a validation cohort (n = 801). Logistic regressions were used to develop a scoring system from the derivation cohort. The discriminatory power and calibration properties of the system were assessed using the validation cohort. Seven independent factors predicted program attrition: gender, age, household income, comorbidity, chronic pain, site's user population size, and average age of site staff. Six factors predicted long-term attrition: gender, age, marital status, chronic pain, site's user population size, and average age of site staff. Each model exhibited moderate to fair discriminatory power (C statistic in the validation set: 0.70 for program attrition, and 0.66 for long-term attrition) and excellent calibration. The resulting scoring system offers a low-technology approach to identify participants at elevated risk for attrition in future similar behavioral modification intervention projects, which may inform appropriate allocation of retention resources. This approach also serves as a model for other efforts to prevent participant attrition. PMID:26768431
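A risk-scoring tool of this kind is typically built by converting fitted logistic regression coefficients into integer points. The sketch below illustrates the mechanics with invented coefficients and predictor names; it is not the SDPI-DP model:

```python
import math

# Invented illustration: log-odds coefficients for binary risk factors
# (these are NOT the SDPI-DP model's actual predictors or values).
COEFS = {"male": 0.40, "age_under_40": 0.55, "low_income": 0.35,
         "chronic_pain": 0.50, "large_site": 0.30}
INTERCEPT = -1.8

def to_points(coefs):
    """Scale each coefficient by the smallest one and round to integers,
    the usual way a logistic model is turned into a simple score."""
    base = min(abs(c) for c in coefs.values())
    return {name: round(c / base) for name, c in coefs.items()}

def attrition_probability(risk_factors, coefs=COEFS, intercept=INTERCEPT):
    """Predicted attrition probability from the underlying logistic model."""
    logit = intercept + sum(coefs[f] for f in risk_factors)
    return 1.0 / (1.0 + math.exp(-logit))

points = to_points(COEFS)       # e.g. maps "age_under_40" to 2 points
p_low = attrition_probability([])
p_high = attrition_probability(["male", "age_under_40", "chronic_pain"])
```

Summing a participant's points gives a low-technology proxy for the logistic prediction, which is what makes such scoring systems usable without software at enrollment.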

  18. Plate Boundaries and Earthquake Prediction. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  19. The Role of Social Relationships in Predicting Loneliness: The National Social Life, Health, and Aging Project

    ERIC Educational Resources Information Center

    Shiovitz-Ezra, Sharon; Leitsch, Sara A.

    2010-01-01

    The authors explore associations between objective and subjective social network characteristics and loneliness in later life, using data from the National Social Life, Health, and Aging Project, a nationally representative sample of individuals ages 57 to 85 in the United States. Hierarchical linear regression was used to examine the associations…

  20. Short communication: Projecting milk yield using best prediction and the MilkBot lactation model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The accuracy and precision of three lactation models was estimated by summarizing means and variability in projection error for next-test milk and actual 305-d milk yield (M305) for 50-day intervals in a large DHIA data set. Lactations were grouped by breed (Holstein, Jersey and crossbred) and parit...

  1. Adapting WEPP (Water Erosion Prediction Project) for forest watershed erosion modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There has been an increasing public concern over forest stream pollution by excessive sedimentation resulting from human activities. Adequate and reliable erosion simulation tools are urgently needed for sound forest resources management. Computer models for predicting watershed runoff and erosion h...

  2. Applications systems verification and transfer project. Volume 8: Satellite snow mapping and runoff prediction handbook

    NASA Technical Reports Server (NTRS)

    Bowley, C. J.; Barnes, J. C.; Rango, A.

    1981-01-01

    The purpose of the handbook is to update the various snowcover interpretation techniques, document the snow mapping techniques used in the various ASVT study areas, and describe the ways snowcover data have been applied to runoff prediction. Through documentation in handbook form, the methodology developed in the Snow Mapping ASVT can be applied to other areas.

  3. Development of new geoinformation methods for modelling and prediction of sea level change over different timescales - overview of the project

    NASA Astrophysics Data System (ADS)

    Niedzielski, T.; Włosińska, M.; Miziński, B.; Hewelt, M.; Migoń, P.; Kosek, W.; Priede, I. G.

    2012-04-01

    The poster aims to provide a broad scientific audience with a general overview of a project on sea level change modelling and prediction that has just commenced at the University of Wrocław, Poland. The initiative under which the project operates, the Homing Plus programme, is organised by the Foundation for Polish Science and financially supported by the European Union through the European Regional Development Fund and the Innovative Economy Programme. There are two key research objectives of the project that complement each other. First, emphasis is put on modern satellite altimetric gridded time series from the Archiving, Validation and Interpretation of Satellite Oceanographic data (AVISO) repository. Daily sea level anomaly maps, available in near-real time courtesy of AVISO, are being downloaded daily to our local server in Wrocław, Poland. These data will be processed within a general framework of modelling and prediction of sea level change in the short, medium and long term. Secondly, sea level change over geological time is scrutinised in order to cover very long time scales that go far beyond the history of altimetric and tide-gauge measurements. The aforementioned approaches comprise a few tasks that aim to solve the following detailed problems. Within the first, our objective is to seek spatio-temporal dependencies in the gridded sea level anomaly time series. Subsequently, predictions that make use of such cross-correlations shall be derived, and a near-real time service for automatic updates with validation will be implemented. Concurrently (i.e. apart from spatio-temporal dependencies and their use in the process of forecasting variable sea level topography), threshold models shall be utilised for predicting the El Niño/Southern Oscillation (ENSO) signal that is normally present in sea level anomaly time series of the equatorial Pacific. Within the second approach, however, entirely different methods are proposed. Links between
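As a toy illustration of exploiting temporal structure in gridded sea level anomaly series, one can fit an AR(1) coefficient independently in every grid cell of a (time, lat, lon) stack and iterate it forward. This is a deliberately simple stand-in under assumed array shapes, not the project's actual methodology:

```python
import numpy as np

def ar1_grid_forecast(anomaly_stack, steps=7):
    """Fit a per-cell AR(1) coefficient to a (time, lat, lon) stack of
    anomaly maps by least squares and iterate it forward `steps` days."""
    x = np.asarray(anomaly_stack, dtype=float)
    num = (x[1:] * x[:-1]).sum(axis=0)       # lag-1 cross products per cell
    den = (x[:-1] ** 2).sum(axis=0) + 1e-12  # lag-0 energy (regularised)
    phi = num / den                          # per-cell AR(1) coefficient
    preds, last = [], x[-1]
    for _ in range(steps):
        last = phi * last
        preds.append(last)
    return np.stack(preds)

# Synthetic stack: every cell decays geometrically, so phi fits to ~0.5.
stack = (0.5 ** np.arange(5))[:, None, None] * np.ones((5, 3, 4))
forecast = ar1_grid_forecast(stack, steps=2)
```

Spatio-temporal models of the kind the project proposes would additionally couple neighbouring cells; the per-cell fit here only captures each cell's own persistence.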

  4. Predictive medicine: outcomes, challenges and opportunities in the Synergy-COPD project

    PubMed Central

    2014-01-01

    Background Chronic Obstructive Pulmonary Disease (COPD) is a major challenge for healthcare. Heterogeneities in clinical manifestations and in disease progression are relevant traits in COPD, with impact on patient management and prognosis. It is hypothesized that COPD heterogeneity results from the interplay of mechanisms governing three conceptually different phenomena: 1) pulmonary disease, 2) systemic effects of COPD and 3) co-morbidity clustering. Objectives To assess the potential of systems medicine to better understand non-pulmonary determinants of COPD heterogeneity. To transfer acquired knowledge to healthcare, enhancing subject-specific health risk assessment and stratification to improve management of chronic patients. Method Underlying mechanisms of skeletal muscle dysfunction and of co-morbidity clustering in COPD patients were explored with strategies combining deterministic modelling and network medicine analyses using the Biobridge dataset. An independent data-driven analysis of co-morbidity clustering, examining associated genes and pathways, was performed (ICD9-CM data from Medicare, 13 million people). A targeted network analysis of the two studies (skeletal muscle dysfunction and co-morbidity clustering) explored the pathways shared between them. Results (1) Evidence of abnormal regulation of pivotal skeletal muscle biological pathways and increased risk for co-morbidity clustering was observed in COPD; (2) shared abnormal pathway regulation between skeletal muscle dysfunction and co-morbidity clustering; and, (3) technological achievements of the project were: (i) the COPD Knowledge Base; (ii) novel modelling approaches; (iii) a Simulation Environment; and, (iv) three layers of Clinical Decision Support Systems. Conclusions The project demonstrated the high potential of a systems medicine approach to address COPD heterogeneity. Limiting factors for the project development were identified. They were relevant to shape strategies fostering 4P Medicine for

  5. Performance of the operational high-resolution numerical weather predictions of the Daphne project

    NASA Astrophysics Data System (ADS)

    Tegoulias, Ioannis; Pytharoulis, Ioannis; Karacostas, Theodore; Kartsios, Stergios; Kotsopoulos, Stelios; Bampzelis, Dimitrios

    2015-04-01

    In the framework of the DAPHNE project, the Department of Meteorology and Climatology (http://meteo.geo.auth.gr) of the Aristotle University of Thessaloniki, Greece, utilizes the nonhydrostatic Weather Research and Forecasting model with the Advanced Research dynamic solver (WRF-ARW) in order to produce high-resolution weather forecasts over Thessaly in central Greece. The aim of the DAPHNE project is to tackle the problem of drought in this area by means of weather modification. Cloud seeding helps convective clouds produce rain more efficiently or reduces hailstone size in favour of raindrops. The most favourable conditions for such a weather modification program in Thessaly occur in the period from March to October, when convective clouds are triggered more frequently. Three model domains, using 2-way telescoping nesting, cover: i) Europe, the Mediterranean Sea and northern Africa (D01), ii) Greece (D02) and iii) the wider region of Thessaly (D03; at selected periods) at horizontal grid-spacings of 15 km, 5 km and 1 km, respectively. This research work intends to describe the atmospheric model setup and analyse its performance during a selected period of the operational phase of the project. The statistical evaluation of the high-resolution operational forecasts is performed using surface observations, gridded fields and radar data. Well-established point verification methods, combined with novel object-based methods, provide an in-depth analysis of the model skill. Spatial characteristics are adequately captured, but a variable time lag between forecast and observation is noted.
Acknowledgments: This research work has been co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness

  6. The ChemScreen project to design a pragmatic alternative approach to predict reproductive toxicity of chemicals.

    PubMed

    van der Burg, Bart; Wedebye, Eva Bay; Dietrich, Daniel R; Jaworska, Joanna; Mangelsdorf, Inge; Paune, Eduard; Schwarz, Michael; Piersma, Aldert H; Kroese, E Dinant

    2015-08-01

    There is a great need for rapid reproductive toxicity testing strategies that avoid animal use. The EU Framework Programme 7 project ChemScreen aimed to fill this gap in a pragmatic manner, preferably using validated existing tools and placing them in an innovative alternative testing strategy. In our approach we combined knowledge on critical processes affected by reproductive toxicants with knowledge on the mechanistic basis of such effects. We used in silico methods for prescreening chemicals for relevant toxic effects, aiming at reduced testing needs. For those chemicals that need testing, we have set up an in vitro screening panel that includes mechanistic high-throughput methods and lower-throughput assays that measure more integrative endpoints. In silico pharmacokinetic modules were developed for rapid exposure predictions via diverse exposure routes. These modules, by matching in vitro and in vivo exposure levels, greatly improved the predictivity of the in vitro tests. As a further step, we have generated examples of how to predict reproductive toxicity of chemicals using available data. We have executed formal validations of panel constituents and have also validated the test panel in more innovative ways using mechanistic approaches. We are actively engaged in promoting regulatory acceptance of the tools developed as an essential step towards practical application, including case studies for read-across purposes. With this approach, a significant saving in animal use and associated costs seems very feasible. PMID:25656794

  7. NASA's Evolutionary Xenon Thruster (NEXT) Project Qualification Propellant Throughput Milestone: Performance, Erosion, and Thruster Service Life Prediction After 450 kg

    NASA Technical Reports Server (NTRS)

    Herman, Daniel A.

    2010-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) program is tasked with significantly improving and extending the capabilities of the current state-of-the-art NSTAR thruster. The service life capability of the NEXT ion thruster is being assessed through thruster wear testing and life modeling of critical thruster components, such as the ion optics and cathodes. The NEXT Long-Duration Test (LDT) was initiated to validate and qualify the NEXT thruster propellant throughput capability. The NEXT thruster completed the primary goal of the LDT, namely, to demonstrate the project qualification throughput of 450 kg by the end of calendar year 2009. The NEXT LDT has demonstrated 28,500 hr of operation and processed 466 kg of xenon throughput, more than double the throughput demonstrated by the NSTAR flight spare. Thruster performance changes have been consistent with a priori predictions. Thruster erosion has been minimal and consistent with the thruster service life assessment, which predicts the first failure mode at greater than 750 kg throughput. The life-limiting failure mode for NEXT is predicted to be loss of structural integrity of the accelerator grid due to erosion by charge-exchange ions.

  8. Zsyntax: A Formal Language for Molecular Biology with Projected Applications in Text Mining and Biological Prediction

    PubMed Central

    Boniolo, Giovanni; D'Agostino, Marcello; Di Fiore, Pier Paolo

    2010-01-01

    We propose a formal language that allows for transposing biological information precisely and rigorously into machine-readable information. This language, which we call Zsyntax (where Z stands for the Greek word ζωή, life), is grounded on a particular type of non-classical logic, and it can be used to write algorithms and computer programs. We present it as a first step towards a comprehensive formal language for molecular biology in which any biological process can be written and analyzed as a sort of logical “deduction”. Moreover, we illustrate the potential value of this language, both in the field of text mining and in that of biological prediction. PMID:20209084

  9. SWAT system performance predictions. Project report. [SWAT (Short-Wavelength Adaptive Techniques)

    SciTech Connect

    Parenti, R.R.; Sasiela, R.J.

    1993-03-10

    In the next phase of Lincoln Laboratory's SWAT (Short-Wavelength Adaptive Techniques) program, the performance of a 241-actuator adaptive-optics system will be measured using a variety of synthetic-beacon geometries. As an aid in this experimental investigation, a detailed set of theoretical predictions has also been assembled. The computational tools that have been applied in this study include a numerical approach, in which Monte-Carlo ray-trace simulations of accumulated phase error are developed, and an analytical analysis of the expected system behavior. This report describes the basis of these two computational techniques and compares their estimates of overall system performance. Although their regions of applicability tend to be complementary rather than redundant, good agreement is usually obtained when both sets of results can be derived for the same engagement scenario. Keywords: adaptive optics, phase conjugation, atmospheric turbulence, synthetic beacon, laser guide star.

  10. Development of Procedures for Assessing the Impact of Vocational Education Research and Development on Vocational Education (Project IMPACT). Volume 8--A Field Study of Predicting Impact of Research and Development Projects in Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Malhorta, Man Mohanlal

    As part of Project IMPACT's effort to identify and develop procedures for complying with the impact requirements of Public Law 94-482, a field study was conducted to identify and validate variables and their order of importance in predicting and evaluating impact of research and development (R&D) projects in vocational and technical education.…

  11. Joint Applications Pilot of the National Climate Predictions and Projections Platform and the North Central Climate Science Center: Delivering climate projections on regional scales to support adaptation planning

    NASA Astrophysics Data System (ADS)

    Ray, A. J.; Ojima, D. S.; Morisette, J. T.

    2012-12-01

    The DOI North Central Climate Science Center (NC CSC) and the NOAA/NCAR National Climate Predictions and Projections (NCPP) Platform have initiated a joint pilot study to collaboratively explore the "best available climate information" to support key land management questions and how to provide this information. NCPP's mission is to support state-of-the-art approaches to develop and deliver comprehensive regional climate information and facilitate its use in decision making and adaptation planning. This presentation will describe the evolving joint pilot as a tangible, real-world demonstration of linkages between climate science, ecosystem science and resource management. Our joint pilot is developing a deliberate, ongoing interaction to prototype how NCPP will work with CSCs to develop and deliver needed climate information products, including translational information to support climate data understanding and use. This pilot will also build capacity in the North Central CSC by working with NCPP to use climate information as input to ecological modeling. We will discuss lessons to date on developing and delivering needed climate information products based on this strategic partnership. Four projects have been funded to collaborate to incorporate climate information as part of an ecological modeling project, which in turn will address key DOI stakeholder priorities in the region. Riparian Corridors: projecting climate change effects on cottonwood and willow seed dispersal phenology, flood timing, and seedling recruitment in western riparian forests. Sage Grouse & Habitats: integrating climate and biological data into land management decision models to assess species and habitat vulnerability. Grasslands & Forests: projecting future effects of land management, natural disturbance, and CO2 on woody encroachment in the Northern Great Plains. The value of climate information: supporting management decisions in the Plains and Prairie Potholes LCC.
NCCSC's role in

  12. Prenatal maternal stress predicts autism traits in 6½ year-old children: Project Ice Storm.

    PubMed

    Walder, Deborah J; Laplante, David P; Sousa-Pires, Alexandra; Veru, Franz; Brunet, Alain; King, Suzanne

    2014-10-30

    Research implicates prenatal maternal stress (PNMS) as a risk factor for neurodevelopmental disorders; however, few studies report PNMS effects on autism risk in offspring. We examined, prospectively, the degree to which objective and subjective elements of PNMS explained variance in autism-like traits among offspring, and tested moderating effects of sex and PNMS timing in utero. Subjects were 89 (46F/43M) children who were in utero during the 1998 Quebec Ice Storm. Soon after the storm, mothers completed questionnaires on objective exposure and subjective distress; when their children were age 6½, they completed the Autism Spectrum Screening Questionnaire (ASSQ). ASSQ scores were higher among boys than girls. Greater objective and subjective PNMS predicted higher ASSQ scores independent of potential confounds. An objective-by-subjective interaction suggested that when subjective PNMS was high, objective PNMS had little effect, whereas when subjective PNMS was low, objective PNMS strongly affected ASSQ scores. A timing-by-objective stress interaction suggested that objective stress significantly affected ASSQ scores in first-trimester-exposed children, though less so with later exposure. The final regression explained 43% of the variance in ASSQ scores; the main effect of sex and the sex-by-PNMS interactions were not significant. Findings may help elucidate neurodevelopmental origins of non-clinical autism-like traits from a dimensional perspective. PMID:24907222
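The moderation analysis described here (an objective-by-subjective interaction probed via simple slopes) can be sketched with ordinary least squares on simulated data. The coefficients below are invented to reproduce the reported pattern, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 89                                # same sample size as the study
obj = rng.normal(size=n)              # objective hardship (standardized)
subj = rng.normal(size=n)             # subjective distress (standardized)

# Simulated outcome reproducing the reported pattern: objective stress
# matters mostly when subjective distress is low (negative interaction).
assq = (2.0 + 1.0 * obj + 0.8 * subj - 0.9 * obj * subj
        + rng.normal(scale=0.5, size=n))

# OLS fit of ASSQ ~ intercept + obj + subj + obj:subj
X = np.column_stack([np.ones(n), obj, subj, obj * subj])
beta, *_ = np.linalg.lstsq(X, assq, rcond=None)

# Simple slopes of objective stress at -1 SD and +1 SD of distress:
slope_low_distress = beta[1] + beta[3] * (-1.0)
slope_high_distress = beta[1] + beta[3] * (+1.0)
```

A negative fitted interaction term makes the simple slope of objective stress steeper at low distress than at high distress, which is exactly the crossover the abstract reports.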

  13. Predicting the spatial extent of injection-induced zones of enhanced permeability at the Northwest Geysers EGS Demonstration Project

    SciTech Connect

    Rutqvist, J.; Oldenburg, C.M.; Dobson, P.F.

    2010-02-01

    We present the results of coupled thermal, hydraulic, and mechanical (THM) modeling of a proposed stimulation injection associated with an Enhanced Geothermal System (EGS) demonstration project at the northwest part of The Geysers geothermal field, California. The project aims at creating an EGS by directly and systematically injecting cool water at relatively low pressure into a known High Temperature (about 280 to 350 C) Zone (HTZ) located under the conventional (240 C) steam reservoir at depths below 3 km. Accurate micro-earthquake monitoring from the start of the injection will be used as a tool for tracking the development of the EGS. We first analyzed historic injection and micro-earthquake data from an injection well (Aidlin 11), located about 3 miles to the west of the new EGS demonstration area. Thereafter, we used the same modeling approach to predict the likely extent of the zone of enhanced permeability for a proposed initial injection in two wells (Prati State 31 and Prati 32) at the new EGS demonstration area. Our modeling indicates that the proposed injection scheme will provide additional steam production in the area by creating a zone of permeability enhancement extending about 0.5 km from each injection well which will connect to the overlying conventional steam reservoir.

  14. Constructing Predictive Estimates for Worker Exposure to Radioactivity During Decommissioning: Analysis of Completed Decommissioning Projects - Master Thesis

    SciTech Connect

    Dettmers, Dana Lee; Eide, Steven Arvid

    2002-10-01

    An analysis of completed decommissioning projects is used to construct predictive estimates for worker exposure to radioactivity during decommissioning activities. The preferred organizational method for the completed decommissioning project data is to divide the data by type of facility, whether decommissioning was performed on part of the facility or the complete facility, and the level of radiation within the facility prior to decommissioning (low, medium, or high). Additional data analysis shows that there is no downward trend in worker exposure data over time. Also, the use of a standard estimate for worker exposure to radioactivity may be a best estimate for low complete storage, high partial storage, and medium reactor facilities; a conservative estimate for some facilities with a low level of radiation (reactor complete, research complete, pits/ponds, other), medium partial process facilities, and high complete research facilities; and an underestimate for the remaining facilities. Limited data are available to compare different decommissioning alternatives, so the available data are reported and no conclusions can be drawn. It is recommended that all DOE sites and the NRC use a similar method to document worker hours, worker exposure to radiation (person-rem), and standard industrial accidents, injuries, and deaths for all completed decommissioning activities.

  15. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    NASA Astrophysics Data System (ADS)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with current MEP systems through these shared experiments. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved on the forecast errors of the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
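    The core verification result, that the ensemble mean has smaller error than a typical individual member, can be illustrated with a minimal sketch on synthetic data. The member count, field size, and error magnitudes here are arbitrary assumptions, not B08RDP values:

```python
import numpy as np

rng = np.random.default_rng(1)
truth = rng.normal(0, 1, 500)                   # stand-in "analysis" field
members = truth + rng.normal(0, 1, (10, 500))   # 10 perturbed member forecasts

def rmse(forecast, obs):
    """Root-mean-square error of a forecast field against observations."""
    return np.sqrt(np.mean((forecast - obs) ** 2))

ens_mean = members.mean(axis=0)
member_rmses = [rmse(m, truth) for m in members]

# Averaging independent member errors reduces RMSE roughly by 1/sqrt(n_members)
print(rmse(ens_mean, truth), np.mean(member_rmses))
```

With ten members carrying independent unit-variance errors, the ensemble-mean RMSE falls to roughly a third of a single member's RMSE, which is the behavior the verification against analysis fields confirmed.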

  16. Impact of Different Topographic Corrections on Prediction Accuracy of Foliage Projective Cover (fpc) in a Topographically Complex Terrain

    NASA Astrophysics Data System (ADS)

    Ediriweera, S.; Pathirana, S.; Danaher, T.; Nichols, D.; Moffiet, T.

    2012-07-01

    Quantitative retrieval of land surface biological parameters (e.g. foliage projective cover [FPC] and Leaf Area Index) is crucial for forest management, ecosystem modelling, and global change monitoring applications. Currently, remote sensing is a widely adopted method for rapid estimation of surface biological parameters at a landscape scale. Topographic correction is a necessary pre-processing step in remote sensing applications for topographically complex terrain. Selection of a suitable topographic correction method for remotely sensed spectral information is still an unresolved problem. The purpose of this study is to assess the impact of topographic corrections on the prediction of FPC in hilly terrain using an established regression model. Five established topographic corrections [C, Minnaert, SCS, SCS+C and the processing scheme for standardised surface reflectance (PSSSR)] were evaluated on Landsat TM5 imagery acquired under low and high sun angles in closed-canopied subtropical rainforest and eucalyptus-dominated open-canopied forest in north-eastern Australia. The effectiveness of the methods at normalizing topographic influence and preserving biophysical spectral information, and their effect on internal data variability, were assessed by statistical analysis and by comparison with field-collected FPC data. The results of the statistical analyses show that SCS+C and PSSSR perform significantly better than the other corrections, producing less overcorrection on faintly illuminated slopes. However, the best relationship between FPC and Landsat spectral responses was obtained with the PSSSR, which produced the least residual error. The SCS correction method performed poorly in correcting the topographic effect when predicting FPC in topographically complex terrain.
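    As an illustration of this family of methods, the widely used C correction rescales each band by (cos θ_sz + c)/(cos i + c), where θ_sz is the sun zenith angle, i the local solar incidence angle, and c is estimated from a band-versus-illumination regression. The sketch below is a generic implementation on synthetic data, not the processing chain used in the study:

```python
import numpy as np

def c_correction(band, cos_i, sun_zenith_deg):
    """Apply the C topographic correction to one reflectance band.

    band: observed reflectance values; cos_i: cosine of the local solar
    incidence angle (computed from slope, aspect and sun position).
    Assumes a positive band-vs-illumination regression slope.
    """
    cos_sz = np.cos(np.radians(sun_zenith_deg))
    # c is the intercept/slope ratio of the band ~ cos(i) regression
    slope, intercept = np.polyfit(cos_i.ravel(), band.ravel(), 1)
    c = intercept / slope
    return band * (cos_sz + c) / (cos_i + c)

# Synthetic band whose brightness is driven by illumination geometry
rng = np.random.default_rng(2)
cos_i = rng.uniform(0.2, 1.0, 1000)
band = 0.3 * cos_i + 0.05 + rng.normal(0, 0.01, 1000)

corrected = c_correction(band, cos_i, sun_zenith_deg=40.0)
```

After correction, the regression slope of reflectance against cos i is close to zero, i.e. the illumination-driven brightness gradient has been removed while the mean signal is preserved.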

  17. Palomar project: predicting school renouncing dropouts, using the artificial neural networks as a support for educational policy decisions.

    PubMed

    Carbone, V; Piras, G

    1998-02-01

    The "Palomar" project confronts two problem situations, partly independent and partly connected, within the Italian schooling system: unstable participation in school (such as dropping out) and educational guidance. Our concern is with a set of phenomena that includes ceasing compulsory education, repetition of a school year, school "drop-out", irregular compulsory attendance and delays in the course of studies. The "Palomar" project is designed to offer educators and administrators who want to intervene effectively in these complex problems, and to furnish school guidance services, an instrument able to: 1. Predict: creating a system able to predict in advance (not in a "cause-effect" way but as an approximation): a) which students are at "risk" for school destabilization or failure; b) what the prototypical characteristics of these students are; c) which students among those studied are more likely to "destabilize" or fail in school, and in which course of study each student has the greatest chance of success; d) which of the variables studied, appropriately weighted for each student, will predict the successful grade, analyzed for each possible course of studies. 2. Optimize: selecting and focusing on a student on the basis of the information given, it is possible: a) to point out which personal factors (relational, familial, student, disciplinary, economic) need to be reinforced in order to improve the school performance of each selected student, both to prevent or limit dropping out or failure and to raise performance in the chosen school course as much as possible; b) on the basis of the above, to simulate possible support measures to increase the efficacy of the considered intervention; c) to choose for each student the intervention strategy capable of obtaining the maximum result and the maximum efficacy in the given conditions. 3. 
Verify: when the strategy of intervention has been decided

  18. The eTOX Data-Sharing Project to Advance in Silico Drug-Induced Toxicity Prediction

    PubMed Central

    Cases, Montserrat; Briggs, Katharine; Steger-Hartmann, Thomas; Pognan, François; Marc, Philippe; Kleinöder, Thomas; Schwab, Christof H.; Pastor, Manuel; Wichard, Jörg; Sanz, Ferran

    2014-01-01

    The high-quality in vivo preclinical safety data produced by the pharmaceutical industry during drug development, which follows numerous strict guidelines, are mostly not available in the public domain. These safety data are sometimes published as a condensed summary for the few compounds that reach the market, but the majority of studies are never made public and are often difficult to access in an automated way, sometimes even within the owning company itself. It is evident from many academic and industrial examples that useful data mining and model development require large and representative data sets and careful curation of the collected data. In 2010, under the auspices of the Innovative Medicines Initiative, the eTOX project started with the objective of extracting and sharing preclinical study data from the paper or PDF archives of the toxicology departments of the 13 participating pharmaceutical companies, and of using such data to establish a detailed, well-curated database, which could then serve as a source for read-across approaches (early assessment of the potential toxicity of a drug candidate by comparison with compounds of similar structure and/or effects) and for training predictive models. The paper describes the efforts undertaken to allow effective data sharing (intellectual property (IP) protection and the set-up of adequate controlled vocabularies) and to establish the database (currently with over 4000 studies contributed by the pharma companies, corresponding to more than 1400 compounds). In addition, the status of predictive model building and some specific features of the eTOX predictive system (eTOXsys) are presented as knowledge-based decision-support tools for the early stages of the drug development process. PMID:25405742

  19. The eTOX data-sharing project to advance in silico drug-induced toxicity prediction.

    PubMed

    Cases, Montserrat; Briggs, Katharine; Steger-Hartmann, Thomas; Pognan, François; Marc, Philippe; Kleinöder, Thomas; Schwab, Christof H; Pastor, Manuel; Wichard, Jörg; Sanz, Ferran

    2014-01-01

    The high-quality in vivo preclinical safety data produced by the pharmaceutical industry during drug development, which follows numerous strict guidelines, are mostly not available in the public domain. These safety data are sometimes published as a condensed summary for the few compounds that reach the market, but the majority of studies are never made public and are often difficult to access in an automated way, sometimes even within the owning company itself. It is evident from many academic and industrial examples that useful data mining and model development require large and representative data sets and careful curation of the collected data. In 2010, under the auspices of the Innovative Medicines Initiative, the eTOX project started with the objective of extracting and sharing preclinical study data from the paper or PDF archives of the toxicology departments of the 13 participating pharmaceutical companies, and of using such data to establish a detailed, well-curated database, which could then serve as a source for read-across approaches (early assessment of the potential toxicity of a drug candidate by comparison with compounds of similar structure and/or effects) and for training predictive models. The paper describes the efforts undertaken to allow effective data sharing (intellectual property (IP) protection and the set-up of adequate controlled vocabularies) and to establish the database (currently with over 4000 studies contributed by the pharma companies, corresponding to more than 1400 compounds). In addition, the status of predictive model building and some specific features of the eTOX predictive system (eTOXsys) are presented as knowledge-based decision-support tools for the early stages of the drug development process. PMID:25405742
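    Read-across, as defined above, ranks candidate analogues by structural similarity to the query compound. A minimal sketch using Tanimoto (Jaccard) similarity over feature-set fingerprints follows; the compounds, feature indices, and toxicity labels are invented for illustration, and a real system would use a cheminformatics toolkit and a curated database like eTOX's:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity between two fingerprint feature sets."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

# Hypothetical fingerprints: sets of structural-feature indices
query = {1, 4, 7, 9, 12}
database = {
    "cmpd_A": {1, 4, 7, 9, 15},   # hypothetical analogue with known study data
    "cmpd_B": {2, 3, 5, 8, 20},   # structurally unrelated compound
}

# Rank database compounds by similarity to the query candidate
ranked = sorted(database, key=lambda k: tanimoto(query, database[k]), reverse=True)
print(ranked[0])  # -> cmpd_A: its prior in vivo findings inform the read-across
```

The nearest analogue's documented in vivo findings then serve as the early, pre-model estimate of the candidate's potential toxicity.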

  20. The KIAPS global NWP model development project at the Korea Institute of Atmospheric Prediction Systems (KIAPS.org)

    NASA Astrophysics Data System (ADS)

    Kim, Young-Joon; Shin, Dong-Wook; Jin, Emilia; Oh, Tae-Jin; Song, Hyo-Jong; Song, In-Sun

    2013-04-01

    A nine-year project to develop Korea's own global Numerical Weather Prediction (NWP) system was launched in 2011 by the Korea Meteorological Administration (KMA) with total funding of about 100 million US dollars. For this task, the Korea Institute of Atmospheric Prediction Systems (KIAPS) was founded by KMA as an independent, non-profit organization. The project consists of three main stages. The first stage (2011-2013) is to set up the Institute, recruit researchers, lay out plans for the research and development, design the basic structure, and explore/develop core NWP technologies. The second stage (2014-2016) aims at developing the basic modules for the dynamical core, physical parameterizations and data assimilation systems, as well as the applied modules for the system framework and the couplers that connect the basic modules and external models, respectively, in a systematic and efficient way. The third stage (2017-2019) is for validating the prototype NWP system built in stage 2, including necessary post-processing systems, by selecting/improving modules and refining/finalizing the system for operational use at KMA. KIAPS designed key modules for the dynamical core by adopting existing cores and/or developing new ones, and developed first a barotropic model and later a baroclinic model with code parallelization and optimization in mind. Various physical parameterization schemes, including those used operationally in NWP models as well as those developed by Korean scientists, are being evaluated and improved by using single-column models, LES models, explicit simulations, etc. The control variables for variational data assimilation systems and the testbeds for observational data pre-processing systems have been designed, the linear models for a barotropic system have been constructed, and the modules for cost function minimization have been developed. The module framework, which is flexible for prognostic and diagnostic variables, is being developed, the I

  1. The Oxfordshire Community Stroke Project classification system predicts clinical outcomes following intravenous thrombolysis: a prospective cohort study

    PubMed Central

    Yang, Yuling; Wang, Anxin; Zhao, Xingquan; Wang, Chunxue; Liu, Liping; Zheng, Huaguang; Wang, Yongjun; Cao, Yibin; Wang, Yilong

    2016-01-01

    Background The Oxfordshire Community Stroke Project (OCSP) classification system is a simple stroke classification system that can be used to predict clinical outcomes. In this study, we compare the safety and efficacy of intravenous thrombolysis in Chinese stroke patients categorized using the OCSP classification system. Patients and methods We collected data from the Thrombolysis Implementation and Monitoring of Acute Ischemic Stroke in China registry. A total of 1,115 patients treated with intravenous thrombolysis with alteplase within 4.5 hours of stroke onset were included. Symptomatic intracranial hemorrhage (SICH), mortality, and 90-day functional outcomes were compared between the stroke patients with different stroke subtypes. Results Of the 1,115 patients included in the cohort, 197 (17.67%) were classified with total anterior circulation infarct (TACI), 700 (62.78%) with partial anterior circulation infarct, 153 (13.72%) with posterior circulation infarct, and 65 (5.83%) with lacunar infarct. After multivariable adjustment, compared to the patients with non-TACI, those with TACI had a significantly increased risk of SICH (odds ratio [OR] 8.80; 95% confidence interval [CI] 2.84–27.25, P<0.001), higher mortality (OR 5.24; 95% CI 3.19–8.62; P<0.001), and poor functional independence (OR 0.38; 95% CI 0.26–0.56; P<0.001) at 3-month follow-up. Conclusion After thrombolysis, the patients with TACI exhibited greater SICH, a higher mortality rate, and worse 3-month clinical outcomes compared with the patients with non-TACI. The OCSP classification system may help clinicians predict the safety and efficacy of thrombolysis. PMID:27418829

  2. Watershed-scale evaluation of the Water Erosion Prediction Project (WEPP) model in the Lake Tahoe basin

    NASA Astrophysics Data System (ADS)

    Brooks, Erin S.; Dobre, Mariana; Elliot, William J.; Wu, Joan Q.; Boll, Jan

    2016-02-01

    Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to forested watersheds, we developed and assessed new approaches for simulating streamflow and sediment transport from large watersheds using WEPP. We created specific algorithms to spatially distribute soil, climate, and management input files for all the subwatersheds within the basin. The model enhancements were tested on five geologically and climatically diverse watersheds in the Lake Tahoe basin, USA. The model was run with minimal calibration to assess WEPP's ability as a physically-based model to predict streamflow and sediment delivery. The performance of the model was examined against 17 years of observed snow water equivalent depth, streamflow, and sediment load data. Only region-wide baseflow recession parameters related to the geology of the basin were calibrated with observed streamflow data. Close agreement between simulated and observed snow water equivalent, streamflow, and the distribution of fine (<20 μm) and coarse (>20 μm) sediments was achieved at each of the major watersheds located in the high-precipitation regions of the basin. Sediment load was adequately simulated in the drier watersheds; however, annual streamflow was overestimated. With the exception of the drier eastern region, the model demonstrated no loss in accuracy when applied without calibration to multiple watersheds across the Lake Tahoe basin, demonstrating the utility of the model as a management tool in gauged and ungauged basins.

  3. Summary of ground motion prediction results for Nevada Test Site underground nuclear explosions related to the Yucca Mountain project

    SciTech Connect

    Walck, M.C.

    1996-10-01

    This report summarizes available data on ground motions from underground nuclear explosions recorded on and near the Nevada Test Site, with emphasis on the ground motions recorded at stations on Yucca Mountain, the site of a potential high-level radioactive waste repository. Sandia National Laboratories, through the Weapons Test Seismic Investigations project, collected and analyzed ground motion data from NTS explosions over a 14-year period, from 1977 through 1990. By combining these data with available data from earlier, larger explosions, prediction equations for several ground motion parameters have been developed for the Test Site area for underground nuclear explosion sources. Also presented are available analyses of the relationship between surface and downhole motions and spectra and relevant crustal velocity structure information for Yucca Mountain derived from the explosion data. The data and associated analyses demonstrate that ground motions at Yucca Mountain from nuclear tests have been at levels lower than would be expected from moderate to large earthquakes in the region; thus nuclear explosions, while located relatively close, would not control seismic design criteria for the potential repository.

  4. Prediction of radiographic progression in synovitis-positive joints on maximum intensity projection of magnetic resonance imaging in rheumatoid arthritis.

    PubMed

    Akai, Takanori; Taniguchi, Daigo; Oda, Ryo; Asada, Maki; Toyama, Shogo; Tokunaga, Daisaku; Seno, Takahiro; Kawahito, Yutaka; Fujii, Yosuke; Ito, Hirotoshi; Fujiwara, Hiroyoshi; Kubo, Toshikazu

    2016-04-01

    Contrast-enhanced magnetic resonance imaging with maximum intensity projection (MRI-MIP) is an easy, useful imaging method to evaluate synovitis in rheumatoid hands. However, the prognosis of synovitis-positive joints on MRI-MIP has not been clarified. The aim of this study was to evaluate the relationship between synovitis visualized by MRI-MIP and joint destruction on X-rays in rheumatoid hands. The wrists, metacarpophalangeal (MP) joints, and proximal interphalangeal (PIP) joints of both hands (500 joints in total) were evaluated in 25 rheumatoid arthritis (RA) patients. Synovitis was scored from grade 0 to 2 on the MRI-MIP images. The Sharp/van der Heijde score and Larsen grade were used for radiographic evaluation. The relationships between the MIP score and the progression of radiographic scores and between the MIP score and bone marrow edema on MRI were analyzed using the trend test. As the MIP score increased, the Sharp/van der Heijde score and Larsen grade progressed severely. The rate of bone marrow edema-positive joints also increased with higher MIP scores. MRI-MIP imaging of RA hands is a clinically useful method that allows semi-quantitative evaluation of synovitis with ease and can be used to predict joint destruction. PMID:26861034

  5. WEPPCAT: An Online tool for assessing and managing the potential impacts of climate change on sediment loading to streams using the Water Erosion Prediction Project (WEPP) Model

    EPA Science Inventory

    WEPPCAT is an on-line tool that provides a flexible capability for creating user-determined climate change scenarios for assessing the potential impacts of climate change on sediment loading to streams using the USDA’s Water Erosion Prediction Project (WEPP) Model. In combination...

  6. The value of selected in vitro and in silico methods to predict acute oral toxicity in a regulatory context: results from the European Project ACuteTox.

    PubMed

    Prieto, P; Kinsner-Ovaskainen, A; Stanzel, S; Albella, B; Artursson, P; Campillo, N; Cecchelli, R; Cerrato, L; Díaz, L; Di Consiglio, E; Guerra, A; Gombau, L; Herrera, G; Honegger, P; Landry, C; O'Connor, J E; Páez, J A; Quintas, G; Svensson, R; Turco, L; Zurich, M G; Zurbano, M J; Kopp-Schneider, A

    2013-06-01

    ACuteTox is a project within the 6th European Framework Programme which had as one of its goals to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as the best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results gained from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach makes it possible to reduce the number of chemicals wrongly predicted as not classified (LD50>2000 mg/kg b.w.). PMID:22922246
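    A two-step tiered strategy of the kind described, with a cytotoxicity readout first and a neurotoxicity alert as an override, can be sketched as follows. The threshold value, readout names, and decision rules are illustrative assumptions, not the project's validated strategy:

```python
def tiered_classification(nru_ic50, neurotox_alert, threshold=100.0):
    """Illustrative two-step tiered testing strategy.

    Step 1 uses a cytotoxicity readout (a hypothetical 3T3 NRU IC50 in
    ug/mL, low values meaning high cytotoxicity); step 2 escalates the
    call when a neurotoxicity alert is present.
    """
    # Step 1: strong cytotoxicity -> provisionally 'classified'
    provisional = "classified" if nru_ic50 < threshold else "not classified"
    # Step 2: a neurotoxicity alert overrides a 'not classified' call,
    # reducing false 'not classified' predictions for neurotoxic chemicals
    if provisional == "not classified" and neurotox_alert:
        return "classified"
    return provisional

print(tiered_classification(50.0, False))   # classified (step 1)
print(tiered_classification(500.0, True))   # classified (step 2 override)
print(tiered_classification(500.0, False))  # not classified
```

The second tier only ever moves chemicals toward "classified", which matches the stated goal of reducing the number wrongly predicted as not classified.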

  7. The problem of what to expect when you are expecting regional change -- Different evaluation strategies of regional prediction and projection performance in the NCPP framework

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.

    2012-12-01

    Ensembles of climate model experiments, together with means and trends in instrumental records, generally form the basis for the evaluation of predictions and projections of regional climate change. Most evaluations are drawn from mean climatological changes and trends, and some describe how changes in modes of variability are simulated. But how good are these regional and/or mode changes from models? It is clear that at the regional scale internal variability also needs to be taken into account. This is the case for identifying the forced changes in the real world, but it is also critical when evaluating a model-based prediction or projection. The key question is what part of the observed variability and change we can, and should, expect models to reproduce. This problem is not trivial and requires a host of conditional considerations covering different time scales. Next to the instrumental reference observations, even paleoclimatic information from well-dated and verified reconstructions can be of use for important elements of this evaluation, including a more complete representation of the full range of variability as well as potential information on systematic structural responses in the climate system to radiative perturbations. This presentation provides an overview of how the National Climate Predictions and Projections platform is currently developing a catalog of strategies to evaluate performance in regional climate outlooks across seasonal to decadal and centennial time scales, and how new research can enrich and extend the tools for a scientifically sound evaluation of what to expect when one is expecting regional climate change.

  8. An Analysis of Predicted vs. Monitored Space Heat Energy Use in 120 Homes : Residential Construction Demonstration Project Cycle II.

    SciTech Connect

    Douglass, John G.; Young, Marvin; Washington State Energy Office.

    1991-10-01

    The SUNDAY thermal simulation program was used to predict space heat energy consumption for 120 energy-efficient homes. The predicted data were found to explain 43.8 percent of the variation in monitored space heat consumption. Using a paired Student's t-test, no statistically significant difference could be found between mean predicted space heat and monitored space heat for the entire sample of homes. The homes were grouped into seven classes, sub-sampled by total heat loss coefficient. An intermediate class (UA = 300-350 Btu/°F) was found to significantly over-predict space heat by 25 percent. The same class was over-predicted by 16 percent in the analogous Cycle 1 research, but the sample size was smaller and this was not found to be statistically significant. Several variables that were not directly included as inputs to the simulation were examined with an analysis of covariance model for their ability to improve the simulation's prediction of space heat. The variables having the greatest effect were conditioned floor area, heating system type, and foundation type. The model was able to increase the coefficient of determination from 0.438 to 0.670, a 54 percent increase. While the SUNDAY simulation program is able to predict space heat consumption in aggregate, it should be noted that there is a considerable amount of variation in both the monitored space heat consumption and the SUNDAY predictions. The ability of the program to accurately model an individual house will be constrained by both the quality of the input variables and the range of occupant behavior. These constraints apply to any building model.
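    The reported gain in the coefficient of determination from adding covariates to the simulation's prediction can be reproduced in miniature on synthetic data. The covariate, effect sizes, and noise level below are invented for illustration (the study used an analysis-of-covariance model with several covariates):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120  # matching the number of homes; the data themselves are synthetic

predicted = rng.normal(15000, 3000, n)   # hypothetical simulated space heat (kWh)
floor_area = rng.normal(180, 40, n)      # hypothetical covariate (m^2)
monitored = 0.8 * predicted + 30.0 * floor_area + rng.normal(0, 2000, n)

def r_squared(X, y):
    """Coefficient of determination of an ordinary least-squares fit."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared(predicted, monitored)  # simulation output alone
r2_ext = r_squared(np.column_stack([predicted, floor_area]), monitored)
print(round(r2_base, 2), round(r2_ext, 2))
```

Adding a covariate that carries real signal (here, floor area) raises R², mirroring the study's improvement from 0.438 to 0.670.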

  9. An Analysis of Predicted vs. Monitored Space Heat Energy Use in 120 Homes :Residential Construction Demonstration Project Cycle II.

    SciTech Connect

    Douglass, John G.; Young, Marvin; Washington State Energy Office.

    1991-10-01

    The SUNDAY thermal simulation program was used to predict space heat energy consumption for 120 energy-efficient homes. The predicted data were found to explain 43.8 percent of the variation in monitored space heat consumption. Using a paired Student's t-test, no statistically significant difference could be found between mean predicted space heat and monitored space heat for the entire sample of homes. The homes were grouped into seven classes, sub-sampled by total heat loss coefficient. An intermediate class (UA = 300-350 Btu/°F) was found to significantly over-predict space heat by 25 percent. The same class was over-predicted by 16 percent in the analogous Cycle 1 research, but the sample size was smaller and this was not found to be statistically significant. Several variables that were not directly included as inputs to the simulation were examined with an analysis of covariance model for their ability to improve the simulation's prediction of space heat. The variables having the greatest effect were conditioned floor area, heating system type, and foundation type. The model was able to increase the coefficient of determination from 0.438 to 0.670, a 54 percent increase. While the SUNDAY simulation program is able to predict space heat consumption in aggregate, it should be noted that there is a considerable amount of variation in both the monitored space heat consumption and the SUNDAY predictions. The ability of the program to accurately model an individual house will be constrained by both the quality of the input variables and the range of occupant behavior. These constraints apply to any building model.

  10. Next generation paradigm for urban pluvial flood modelling, prediction, management and vulnerability reduction - Interaction between RainGain and Blue Green Dream projects

    NASA Astrophysics Data System (ADS)

    Maksimovic, C.

    2012-04-01

    The effects of climate change and increasing urbanisation call for a new paradigm for efficient planning, management and retrofitting of urban developments to increase resilience to climate change and to maximize ecosystem services. Improved management of urban floods from all sources is required. The time scale of well-documented fluvial and coastal floods allows for a timely response, but surface (pluvial) flooding caused by intense local storms has not been given appropriate attention (Pitt Review, UK). Urban surface flood prediction requires fine-scale data and model resolutions, and has to be tackled locally by combining central inputs (meteorological services) with the efforts of local entities. Although a significant breakthrough in the modelling of pluvial flooding has been made, there is a need to further enhance short-term prediction of both rainfall and surface flooding. These issues are dealt with in the EU Interreg project RainGain (RG). A breakthrough in urban flood mitigation can only be achieved by the combined effects of advanced planning, design, construction and management of urban water (blue) assets in interaction with urban vegetated (green) assets. Changes in the design and operation of blue and green assets, which currently operate as two separate systems, are urgently required. Gaps in knowledge and technology will be addressed by the EIT Climate-KIC Blue Green Dream (BGD) project. The RG and BGD projects provide synergy between the "decoupled" blue and green systems to enhance multiple benefits for urban amenity, flood management, the urban heat island, biodiversity, and resilience to drought and thus energy requirements, thereby increasing the quality of urban life at lower cost. Urban pluvial flood management will address two priority areas: short-term rainfall forecasting and short-term surface flood forecasting. A spatial resolution of short-term rainfall forecasts below 0.5 km2 and lead times of a few hours are needed. 
Improvements are achievable by combining data sources of raingauge networks

  11. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
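The extreme-value machinery this abstract describes can be illustrated with a minimal sketch: fitting a Gumbel (extreme value type I) distribution to annual-maximum rain rates by the method of moments and querying the yearly exceedance probability. All data and the threshold below are invented; the real model works from National Weather Service records and feeds a full attenuation model.

```python
import math

def fit_gumbel(annual_maxima):
    """Fit a Gumbel (extreme value type I) distribution by the method of
    moments: scale from the sample standard deviation, location from the mean."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi          # scale parameter
    mu = mean - 0.5772156649 * beta                # location (Euler-Mascheroni constant)
    return mu, beta

def exceedance_prob(rate, mu, beta):
    """P(annual maximum rain rate > rate) under the fitted Gumbel."""
    return 1.0 - math.exp(-math.exp(-(rate - mu) / beta))

# Hypothetical annual-maximum rain rates (mm/h) for one site.
maxima = [62.0, 80.0, 71.0, 95.0, 58.0, 88.0, 77.0, 69.0, 102.0, 84.0]
mu, beta = fit_gumbel(maxima)
p = exceedance_prob(120.0, mu, beta)   # chance 120 mm/h is exceeded in a year
```

The exceedance probability at a given rain rate would then be mapped through an attenuation model to a link-availability estimate.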

  12. Development of a Stochastic Inversion Tool To Optimize Agreement Between The Observed And Predicted Seismic Response To CO2 Injection/Migration in the Weyburn-Midale Project

    SciTech Connect

    Ramirez, A L; Hao, Y; White, D; Carle, S; Dyer, K; Yang, X; Mcnab, W; Foxall, W; Johnson, J

    2009-12-02

    During Phase 1 of the Weyburn Project (2000-2004), 4D reflection seismic data were used to map CO2 migration within the Midale reservoir, while an extensive fluid sampling program documented the geochemical evolution triggered by CO2-brine-oil-mineral interactions. The aim of this task (3b.11) is to exploit these existing seismic and geochemical data sets, augmented by CO2/H2O injection and HC/H2O production data, toward optimizing the reservoir model and thereby improving site characterization and dependent predictions of long-term CO2 storage in the Weyburn-Midale reservoir. Our initial project activities have concentrated on developing a stochastic inversion method that will identify reservoir models that optimize agreement between the observed and predicted seismic response. This report describes the technical approach we have followed, the data that support it, and associated implementation activities. The report fulfills deliverable D1 in the project's statement of work. Future deliverables will describe the development of the stochastic inversion tool that uses geochemical data to optimize the reservoir model.
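Stochastic inversion of this kind is often Metropolis-style: propose a perturbed reservoir model, accept it with a probability that decays with its misfit to the observed response. The toy sketch below is illustrative only (all names and data invented, not the project's tool), standing in a 1-D "porosity profile" for the reservoir model and an identity map for the seismic forward model.

```python
import math
import random

def metropolis_search(misfit, propose, model0, n_iter=2000, temperature=1e-4, seed=0):
    """Minimal Metropolis sampler for stochastic inversion: accept a proposed
    model with probability exp(-(new_misfit - old_misfit) / temperature), so
    models that better match the observed response are favoured."""
    rng = random.Random(seed)
    model, cost = model0, misfit(model0)
    best, best_cost = model, cost
    for _ in range(n_iter):
        cand = propose(model, rng)
        c = misfit(cand)
        if c < cost or rng.random() < math.exp(-(c - cost) / temperature):
            model, cost = cand, c
            if c < best_cost:
                best, best_cost = cand, c
    return best, best_cost

# Toy stand-in for the seismic response: recover a porosity profile whose
# "predicted response" (here, identity) should match the observations.
observed = [0.30, 0.25, 0.20, 0.28]
misfit = lambda m: sum((p - o) ** 2 for p, o in zip(m, observed))
propose = lambda m, rng: [p + rng.gauss(0.0, 0.01) for p in m]
best, best_cost = metropolis_search(misfit, propose, [0.25] * 4)
```

A real implementation would replace the identity forward model with a seismic simulator and the Gaussian proposal with geostatistically constrained perturbations.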

  13. Project TALENT Five-Year Follow-Up Studies, Predicting Development of Young Adults. Interim Report 5.

    ERIC Educational Resources Information Center

    Cooley, William W.; Lohnes, Paul R.

    The primary purpose of this monograph is to describe the relationship between adolescent personality and the educational and vocational development of young adults, criteria for the latter being developed from the Project TALENT follow-up studies. This relationship seeking is set in a context of career development theory and a concern for guidance…

  14. The Statistical Predictability of the Academic Performance of Registered Nursing Students at Macomb. Project No. 0141-77.

    ERIC Educational Resources Information Center

    Stankovich, Mary Jo

    A study was conducted at Macomb County Community College to determine whether there was a significant relationship between grades earned in individual nursing courses and the scores earned on corresponding subsets of the state board exam for nursing graduates and also whether a nursing student's success could be predicted from admissions…

  15. Can Online Discussion Participation Predict Group Project Performance? Investigating the Roles of Linguistic Features and Participation Patterns

    ERIC Educational Resources Information Center

    Yoo, Jaebong; Kim, Jihie

    2014-01-01

    Although many college courses adopt online tools such as Q&A online discussion boards, there is no easy way to measure or evaluate their effect on learning. As a part of supporting instructional assessment of online discussions, we investigate a predictive relation between characteristics of discussion contributions and student performance.…

  16. Testing projected wild bee distributions in agricultural habitats: predictive power depends on species traits and habitat type.

    PubMed

    Marshall, Leon; Carvalheiro, Luísa G; Aguirre-Gutiérrez, Jesús; Bos, Merijn; de Groot, G Arjen; Kleijn, David; Potts, Simon G; Reemer, Menno; Roberts, Stuart; Scheper, Jeroen; Biesmeijer, Jacobus C

    2015-10-01

    Species distribution models (SDM) are increasingly used to understand the factors that regulate variation in biodiversity patterns and to help plan conservation strategies. However, these models are rarely validated with independently collected data and it is unclear whether SDM performance is maintained across distinct habitats and for species with different functional traits. Highly mobile species, such as bees, can be particularly challenging to model. Here, we use independent sets of occurrence data collected systematically in several agricultural habitats to test how the predictive performance of SDMs for wild bee species depends on species traits, habitat type, and sampling technique. We used a species distribution modeling approach parametrized for the Netherlands, with presence records from 1990 to 2010 for 193 Dutch wild bees. For each species, we built a Maxent model based on 13 climate and landscape variables. We tested the predictive performance of the SDMs with independent datasets collected from orchards and arable fields across the Netherlands from 2010 to 2013, using transect surveys or pan traps. Model predictive performance depended on species traits and habitat type. Occurrence of bee species specialized in habitat and diet was better predicted than that of generalist bees. Predictions of habitat suitability were also more precise for habitats that are temporally more stable (orchards) than for habitats that suffer regular alterations (arable), particularly for small, solitary bees. As a conservation tool, SDMs are better suited to modeling rarer, specialist species than more generalist ones, and will work best in long-term stable habitats. The variability of complex, short-term habitats is difficult to capture in such models, and historical land-use data generally have low thematic resolution. To improve SDMs' usefulness, models require explanatory variables and collection data that include detailed landscape characteristics, for example, variability of crops and
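Predictive performance of an SDM against independent presence/absence data is commonly summarized by the area under the ROC curve (AUC), which the abstract's evaluation implies. A stdlib-only sketch, with invented suitability scores (1.0 = perfect discrimination, 0.5 = no better than chance):

```python
def auc(scores_presence, scores_absence):
    """Area under the ROC curve via the Mann-Whitney interpretation:
    the probability that a randomly chosen presence site scores higher
    than a randomly chosen absence/background site (ties count half)."""
    wins = 0.0
    for p in scores_presence:
        for a in scores_absence:
            if p > a:
                wins += 1.0
            elif p == a:
                wins += 0.5
    return wins / (len(scores_presence) * len(scores_absence))

# Hypothetical habitat-suitability scores predicted by an SDM at
# independently surveyed sites where a bee species was / was not found.
presence = [0.81, 0.66, 0.74, 0.90, 0.55]
absence = [0.32, 0.48, 0.61, 0.20, 0.40]
score = auc(presence, absence)
```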

  17. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    USGS Publications Warehouse

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

    The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization as well; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
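The linear (first-order) data-worth idea can be sketched for a single parameter: each candidate observation contributes information that shrinks the posterior parameter variance, and hence the prediction uncertainty. Real tools (e.g., PEST's predictive-uncertainty utilities) work with full covariance matrices; all numbers below are hypothetical.

```python
def posterior_variance(prior_var, sensitivities, obs_var):
    """First-order Bayesian update for one parameter: an observation with
    sensitivity j contributes j**2 / obs_var of information (inverse variance)."""
    info = 1.0 / prior_var + sum(j * j / obs_var for j in sensitivities)
    return 1.0 / info

def prediction_uncertainty(prior_var, sensitivities, obs_var, pred_sens):
    """Standard deviation of a prediction with sensitivity pred_sens to the
    parameter, after calibration to the given observations."""
    return abs(pred_sens) * posterior_variance(prior_var, sensitivities, obs_var) ** 0.5

# Hypothetical case: prior variance of log-K is 1.0; the existing network has
# two head observations; a candidate well would add a third, more sensitive one.
base = prediction_uncertainty(1.0, [0.5, 0.3], 0.01, pred_sens=2.0)
with_new = prediction_uncertainty(1.0, [0.5, 0.3, 0.9], 0.01, pred_sens=2.0)
# with_new < base: the candidate observation reduces prediction uncertainty
```

Ranking candidate observations by how much each shrinks the prediction uncertainty is exactly the network-design guidance the abstract describes, in scalar miniature.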

  18. Final report for LDRD project "A new approach to protein function and structure prediction"

    SciTech Connect

    Phillips, C.A.

    1997-03-01

    This report describes the research performed under the Laboratory-Directed Research and Development (LDRD) grant "A new approach to protein function and structure prediction", funded FY94-6. We describe the goals of the research and motivate and list our improvements to the state of the art in multiple sequence alignment and phylogeny (evolutionary tree) construction, but leave technical details to the six publications resulting from this work. At least three algorithms for phylogeny construction or tree consensus have been implemented and used by researchers outside of Sandia.

  19. Operational strategy for soil concentration predictions of strontium/yttrium-90 and cesium-137 in surface soil at the West Valley Demonstration Project site

    SciTech Connect

    Myers, J.A.

    1995-06-05

    The environmental health physics field faces difficulties in interpreting field measurements, determining guideline protocols, and controlling and disposing of low-level radioactive contaminated soil. Questions are raised among scientists and in public forums concerning the necessity and high cost of large-area soil remediation versus the risks of low-dose radiation health effects. As a result, accurate soil activity assessments become imperative in decontamination situations. The West Valley Demonstration Project (WVDP), a US Department of Energy facility located in West Valley, New York, is managed and operated by West Valley Nuclear Services Co., Inc. (WVNS). WVNS has identified contaminated on-site soil areas with a mixed variety of radionuclides (primarily fission products). Through the use of data obtained from a previous project performed during the summer of 1994, entitled "Field Survey Correlation and Instrumentation Response for an In Situ Soil Measurement Program" (Myers), the WVDP offers a unique research opportunity to investigate the possibility of soil concentration predictions based on exposure or count rate responses returned from a survey detector probe. In this study, correlations are developed between laboratory-measured soil beta activity and survey probe response for the purposes of determining the optimal detector for field use and using these correlations to establish predictability of soil activity levels.

  20. Performance prediction of mechanical excavators from linear cutter tests on Yucca Mountain welded tuffs; Yucca Mountain Site Characterization Project

    SciTech Connect

    Gertsch, R.; Ozdemir, L.

    1992-09-01

    The performances of mechanical excavators are predicted for excavations in welded tuff. Emphasis is given to tunnel boring machine evaluations based on linear cutting machine test data obtained on samples of Topopah Spring welded tuff. The tests involve measurement of forces as cutters are applied to the rock surface at certain spacing and penetrations. Two disc and two point-attack cutters representing currently available technology are thus evaluated. The performance predictions based on these direct experimental measurements are believed to be more accurate than any previous values for mechanical excavation of welded tuff. The calculations of performance are predicated on minimizing the amount of energy required to excavate the welded tuff. Specific energy decreases with increasing spacing and penetration, and reaches its lowest at the widest spacing and deepest penetration used in this test program. Using the force, spacing, and penetration data from this experimental program, the thrust, torque, power, and rate of penetration are calculated for several types of mechanical excavators. The results of this study show that the candidate excavators will require higher torque and power than heretofore estimated.
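The specific-energy criterion used for these predictions is direct to compute: SE is the cutting work per unit volume of rock removed, and the best cut geometry minimizes it. The values below are illustrative only, not the Topopah Spring test data.

```python
def specific_energy(rolling_force_kN, spacing_mm, penetration_mm):
    """Specific energy (MJ/m^3) of a linear cut. Work per metre of cutter
    travel = rolling force (kN -> kJ/m); rock volume removed per metre
    = spacing * penetration (mm^2 -> 1e-6 m^3/m)."""
    return 1000.0 * rolling_force_kN / (spacing_mm * penetration_mm)

# Hypothetical linear-cutter results: (rolling force kN, spacing mm, penetration mm)
cuts = [(8.0, 50.0, 2.5), (9.5, 75.0, 5.0), (12.0, 100.0, 7.5)]
best = min(cuts, key=lambda c: specific_energy(*c))
# widest spacing / deepest penetration wins here, as the abstract reports
```

Machine thrust, torque, and penetration rate then follow from the forces measured at the selected spacing and penetration.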

  1. Predicting the future distribution of Polar Bear Habitat in the polar basin from resource selection functions applied to 21st century general circulation model projections of sea ice

    USGS Publications Warehouse

    Durner, George M.; Douglas, David C.; Nielson, Ryan M.; Amstrup, Steven C.; McDonald, Trent L.

    2007-01-01

    Predictions of polar bear (Ursus maritimus) habitat distribution in the Arctic polar basin during the 21st century were developed to help understand the likely consequences of anticipated sea ice reductions on polar bear populations. We used location data from satellite-collared polar bears and environmental data (e.g., bathymetry, coastlines, and sea ice) collected between 1985 and 1995 to build habitat use models called Resource Selection Functions (RSF). The RSFs described habitats polar bears preferred in each of four seasons: summer (ice minimum), autumn (growth), winter (ice maximum) and spring (melt). When applied to the model source data and to independent data (1996–2006), the RSFs consistently identified habitats most frequently used by polar bears. We applied the RSFs to monthly maps of 21st century sea ice concentration predicted by 10 general circulation models (GCM) described in the Intergovernmental Panel on Climate Change Fourth Assessment Report. The 10 GCMs we used had high concordance between their simulations of 20th century summer sea ice extent and the actual ice extent derived from passive microwave satellite observations. Predictions of the amount and rate of change in polar bear habitat varied among GCMs, but all GCMs predicted net habitat losses in the polar basin during the 21st century. Projected losses in the highest-valued RSF habitat (optimal habitat) were greatest in the peripheral seas of the polar basin, especially the Chukchi Sea and Barents Sea. Losses were least in high-latitude regions where RSFs predicted an initial increase in optimal habitat followed by a modest decline. The largest seasonal reductions in habitat were predicted for spring and summer. Average area of optimal polar bear habitat during summer in the polar basin declined from an observed 1.0 million km2 in 1985–1995 (baseline) to a projected multi-model average of 0.58 million km2 in 2045–2054 (-42% change), 0.36 million km2 in 2070–2079 (-64% change), and 0
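Resource selection functions of this kind are most often exponential, w(x) = exp(β·x), giving a relative (not absolute) probability that a habitat unit is used. A sketch with made-up coefficients and covariates, not the study's fitted seasonal models:

```python
import math

def rsf(betas, covariates):
    """Exponential resource selection function w(x) = exp(beta . x):
    a relative probability of use for a habitat unit with these covariates."""
    return math.exp(sum(b * x for b, x in zip(betas, covariates)))

# Made-up coefficients for (sea-ice concentration, distance to ice edge km,
# ocean depth km): ice concentration attracts, the other two penalize.
betas = [2.0, -0.01, -0.5]
near_ice = rsf(betas, [0.9, 10.0, 0.3])     # high ice cover, shallow shelf
open_water = rsf(betas, [0.1, 300.0, 3.0])  # low ice cover, deep basin
# near_ice > open_water: consolidated shelf ice is the preferred habitat
```

Applying such a fitted w(x) to each cell of a GCM-projected monthly ice map is what yields the future habitat-area figures the abstract reports.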

  2. Evaluation and development of hydrological parameterisations for the atmosphere, ocean and land surface coupled model developed by the UK Environmental Prediction (UKEP) Prototype project

    NASA Astrophysics Data System (ADS)

    Martinez-de la Torre, Alberto; Blyth, Eleanor; Ashton, Heather; Lewis, Huw

    2016-04-01

    The UKEP project brings together atmosphere, ocean and land surface models and scientists to build a coupled prediction system for the UK at 1.5 km scale. JULES (Joint UK Land Environment Simulator) is the land surface model that generates runoff and simulates soil hydrology within the coupled prediction system. Here we present an evaluation of JULES performance at reproducing river flow for 13 selected catchments in Great Britain, using daily river flow observations at the catchment outlets. The evaluation is based on the Nash-Sutcliffe metric and bias. Results suggest that the inclusion of a new linear topographic-slope dependency in the S0 parameter of the PDM (Probability Distributed Model, the scheme that generates saturation-excess runoff at the land surface when the soil water storage reaches S0) improves results for all catchments, constraining surface runoff production in flatter catchments during rainy episodes. The new hydrological configuration developed offline with the JULES model has been implemented in the coupled prediction system for an intense winter storm case study. We found significant changes in accumulated runoff and total column soil moisture, and results consistent with the offline experiments, with an increase in surface runoff on the high slopes of Scotland.
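The Nash-Sutcliffe efficiency and bias used in this evaluation are straightforward to compute from paired daily flows; a minimal sketch with invented data:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than simply predicting the observed mean flow."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

def bias(observed, simulated):
    """Mean error of simulated flows (same units as the data)."""
    return sum(s - o for o, s in zip(observed, simulated)) / len(observed)

# Hypothetical daily river flows (m3/s) at a catchment outlet.
obs = [12.0, 15.0, 30.0, 22.0, 18.0]
sim = [11.0, 16.0, 27.0, 23.0, 17.0]
nse = nash_sutcliffe(obs, sim)
```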

  3. Prediction of Inhibitory Activity of Epidermal Growth Factor Receptor Inhibitors Using Grid Search-Projection Pursuit Regression Method

    PubMed Central

    Du, Hongying; Hu, Zhide; Bazzoli, Andrea; Zhang, Yang

    2011-01-01

    The epidermal growth factor receptor (EGFR) protein tyrosine kinase (PTK) is an important protein target for anti-tumor drug discovery. To identify potential EGFR inhibitors, we conducted a quantitative structure–activity relationship (QSAR) study on the inhibitory activity of a series of quinazoline derivatives against EGFR tyrosine kinase. Two 2D-QSAR models were developed based on the best multi-linear regression (BMLR) and grid-search assisted projection pursuit regression (GS-PPR) methods. The results demonstrate that the inhibitory activity of quinazoline derivatives is strongly correlated with their polarizability, activation energy, mass distribution, connectivity, and branching information. Although the present investigation focused on EGFR, the approach provides a general avenue in the structure-based drug development of different protein receptor inhibitors. PMID:21811593

  4. USGS "iCoast - Did the Coast Change?" Project: Crowd-Tagging Aerial Photographs to Improve Coastal Change Prediction Models

    NASA Astrophysics Data System (ADS)

    Liu, S. B.; Poore, B. S.; Plant, N. G.; Stockdon, H. F.; Morgan, K.; Snell, R.

    2014-12-01

    The U.S. Geological Survey (USGS) has been acquiring oblique aerial photographs of the coast before and after major storms since 1995 and has amassed a database of over 140,000 photographs of the Gulf, Atlantic, and Pacific coasts. USGS coastal scientists use these photographs to document and characterize coastal change caused by storms. The images can also be used to evaluate the accuracy of predictive models of coastal erosion. However, the USGS does not have the personnel to manually analyze all of the photographs taken after a storm. Also, computers cannot yet automatically identify damages and geomorphic changes to the coast from the oblique aerial photographs. There is a high public interest in accessing the limited number of pre- and post-storm photographic pairs the USGS is currently able to share. Recent federal policies that encourage open data and open innovation initiatives have resulted in many federal agencies developing new ways of using citizen science and crowdsourcing techniques to share data and collaborate with the public to accomplish large tasks. The USGS launched a crowdsourcing application in June 2014 called "iCoast - Did the Coast Change?" (http://coastal.er.usgs.gov/icoast) to allow citizens to help USGS scientists identify changes to the coast by comparing USGS aerial photographs taken before and after storms, and then selecting pre-defined tags like "dune scarp" and "sand on road." The tags are accompanied by text definitions and pictorial examples of these coastal morphology terms and serve to informally and passively educate users about coastal hazards. The iCoast application facilitates greater citizen awareness of coastal change and is an educational resource for teachers and students interested in learning about coastal vulnerability. We expect that the citizen observations from iCoast will assist with probabilistic model development to produce more accurate predictions of coastal vulnerability.

  5. Clinical and Biologic Features Predictive of Survival After Relapse of Neuroblastoma: A Report From the International Neuroblastoma Risk Group Project

    PubMed Central

    London, Wendy B.; Castel, Victoria; Monclair, Tom; Ambros, Peter F.; Pearson, Andrew D.J.; Cohn, Susan L.; Berthold, Frank; Nakagawara, Akira; Ladenstein, Ruth L.; Iehara, Tomoko; Matthay, Katherine K.

    2011-01-01

    Purpose Survival after neuroblastoma relapse is poor. Understanding the relationship between clinical and biologic features and outcome after relapse may help in selection of optimal therapy. Our aim was to determine which factors were significantly predictive of postrelapse overall survival (OS) in patients with recurrent neuroblastoma—particularly whether time from diagnosis to first relapse (TTFR) was a significant predictor of OS. Patients and Methods Patients with first relapse/progression were identified in the International Neuroblastoma Risk Group (INRG) database. Time from study enrollment until first event and OS time starting from first event were calculated. Cox regression models were used to calculate the hazard ratio of increased death risk and perform survival tree regression. TTFR was tested in a multivariable Cox model with other factors. Results In the INRG database (N = 8,800), 2,266 patients experienced first progression/relapse. Median time to relapse was 13.2 months (range, 1 day to 11.4 years). Five-year OS from time of first event was 20% (SE, ± 1%). TTFR was statistically significantly associated with OS time in a nonlinear relationship; patients with TTFR of 36 months or longer had the lowest risk of death, followed by patients who relapsed in the period of 0 to less than 6 months or 18 to 36 months. Patients who relapsed between 6 and 18 months after diagnosis had the highest risk of death. TTFR, age, International Neuroblastoma Staging System stage, and MYCN copy number status were independently predictive of postrelapse OS in multivariable analysis. Conclusion Age, stage, MYCN status, and TTFR are significant prognostic factors for postrelapse survival and may help in the design of clinical trials evaluating novel agents. PMID:21768459
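Five-year OS figures of this kind are conventionally Kaplan-Meier estimates from the time of first event. A minimal sketch (invented data, times in months, censoring coded as 0), not the study's actual analysis:

```python
def kaplan_meier(times, events, horizon):
    """Kaplan-Meier survival estimate at `horizon`.
    times:  time from first relapse to death or last follow-up
    events: 1 if death was observed at that time, 0 if censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    for i in order:
        if times[i] > horizon:
            break
        if events[i] == 1:
            surv *= (at_risk - 1) / at_risk   # step down at each death
        at_risk -= 1                          # deaths and censorings leave the risk set
    return surv

# Hypothetical months from first relapse to death (1) or last follow-up (0).
times = [6, 10, 14, 20, 30, 42, 60]
events = [1, 1, 0, 1, 1, 0, 0]
os_5yr = kaplan_meier(times, events, horizon=60)
```

The Cox models the abstract describes go a step further, relating the hazard underlying this curve to covariates such as TTFR, age, stage, and MYCN status.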

  6. Tailoring dam structures to water quality predictions in new reservoir projects: assisting decision-making using numerical modeling.

    PubMed

    Marcé, Rafael; Moreno-Ostos, Enrique; García-Barcina, José Ma; Armengol, Joan

    2010-06-01

    Selection of the reservoir location, handling of the forest in the floodable basin, and the design of dam structures devoted to water supply (e.g. water outlets) are relevant features that strongly determine water quality and frequently demand management strategies to be adopted. Although these crucial aspects should be carefully examined during dam design before construction, the development of ad hoc limnological studies tailoring dam location and dam structures to the water quality characteristics expected in the future reservoir is currently not typical practice. In this study, we use numerical simulation to assist in the design of a new dam project in Spain with the aim of maximizing the quality of the water supplied by the future reservoir. First, we ran a well-known coupled hydrodynamic and biogeochemical numerical model (DYRESM-CAEDYM) to simulate the potential development of anoxic layers in the future reservoir, generating several scenarios corresponding to different potential hydraulic conditions and outlet configurations. Second, we built a simplified numerical model to simulate the development of the hypolimnetic oxygen content during the maturation stage after the first reservoir filling, taking into consideration the degradation of the flooded terrestrial organic matter and the adoption of different forest handling scenarios. Results are discussed in terms of reservoir design and water quality management. The combination of hypolimnetic withdrawal from two deep outlets and the removal of all valuable terrestrial vegetal biomass before flooding resulted in the best water quality scenario. PMID:20199843

  7. Lessons learned from the National Climate Predictions and Projections (NCPP) platform Workshop on Quantitative Evaluation of Downscaling 2013

    NASA Astrophysics Data System (ADS)

    Guentchev, G.

    2013-12-01

    The mission of NCPP is to accelerate the provision of climate information at regional and local scales for use in adaptation planning and decision making through the collaborative participation of a community of scientists and practitioners. A major focus is the development of a capability for objective and quantitative evaluation of downscaled climate information in support of applications. NCPP recognizes the importance of focusing this evaluation effort on real-world applications and the necessity of working closely with the user community to deliver usable evaluations and guidance. This summer NCPP organized its first workshop on quantitative evaluation of downscaled climate datasets (http://earthsystemcog.org/projects/downscaling-2013/). Workshop participants included representatives from downscaling efforts, applications partners from the health, ecological, agriculture and water resources impacts communities, and people working on data infrastructure, metadata, and standards development. The workshop exemplifies NCPP's approach of collaborative and participatory problem-solving, in which scientists work together with practitioners to develop applications-related evaluation. The observed and downscaled datasets included for evaluation in the workshop were assessed using a variety of metrics to elucidate the statistical characteristics of temperature and precipitation time series. In addition, the downscaled datasets were evaluated in terms of their representation of indices relevant to the participating applications working groups, more specifically those related to human health and ecological impacts. The presentation will focus on sharing the lessons learned from the workshop.

  8. Factors That Predict Financial Sustainability of Community Coalitions: Five Years of Findings from the PROSPER Partnership Project

    PubMed Central

    Greenberg, Mark T.; Feinberg, Mark E.; Johnson, Lesley E.; Perkins, Daniel F.; Welsh, Janet A.; Spoth, Richard L.

    2014-01-01

    This study is a longitudinal investigation of the PROSPER partnership model designed to evaluate the level of sustainability funding by community prevention teams, including which factors impact teams’ generation of sustainable funding. Community teams were responsible for choosing, implementing with quality, and sustaining evidence-based programs (EBPs) intended to reduce substance misuse and promote positive youth and family development. Fourteen US rural communities and small towns were studied. Data were collected from PROSPER community team members (N=164) and Prevention Coordinators (N=10), over a 5-year period. Global and specific aspects of team functioning were assessed over 6 waves. Outcome measures were the total funds (cash and in-kind) raised to implement prevention programs. All 14 community teams were sustained for the first five years. However, there was substantial variability in the amount of funds raised and these differences were predicted by earlier and concurrent team functioning and by team sustainability planning. Given the sufficient infrastructure and ongoing technical assistance provided by the PROSPER partnership model, local sustainability of EBPs is achievable. PMID:24706195

  9. Projecting Risk: The Importance of the HCR-20 Risk Management Scale in Predicting Outcomes with Forensic Patients.

    PubMed

    Vitacco, Michael J; Tabernik, Holly E; Zavodny, Denis; Bailey, Karen; Waggoner, Christina

    2016-03-01

    The present study evaluates data from 116 forensic inpatients who underwent violent risk assessments, which included the Historical, Clinical, Risk-20 (HCR-20), from 2006 to 2013 as part of an opportunity to be conditionally discharged from state forensic facilities. Of the 116 inpatients, 58 were never released, 39 were released and returned to a hospital, and 19 were released and never returned. Results from analyses of variance and multinomial logistic regression found the risk management (R) scale of the HCR-20 successfully predicted group membership in that higher scores were associated with a greater likelihood of not being released from a forensic facility or returning to a forensic facility after release. The results of this study indicate that clinicians should consider community-based risk variables when evaluating forensic patients for potential return to the community. This research demonstrates that clinicians failing to fully consider dynamic risk factors associated with community integration jeopardize the quality and thoroughness of their violence risk assessment with regards to readiness for release. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27009396

  10. REDUCING UNCERTAINTIES IN MODEL PREDICTIONS VIA HISTORY MATCHING OF CO2 MIGRATION AND REACTIVE TRANSPORT MODELING OF CO2 FATE AT THE SLEIPNER PROJECT

    SciTech Connect

    Zhu, Chen

    2015-03-31

    An important question for the Carbon Capture, Storage, and Utility program is “can we adequately predict the CO2 plume migration?” For tracking CO2 plume development, the Sleipner project in the Norwegian North Sea provides more time-lapse seismic monitoring data than any other site, but significant uncertainties still exist for some of the reservoir parameters. In Part I, we assessed model uncertainties by applying two multi-phase compositional simulators to the Sleipner Benchmark model for the uppermost layer (Layer 9) of the Utsira Sand and calibrated our model against the time-lapse seismic monitoring data for the site from 1999 to 2010. An approximate match with the observed plume was achieved by introducing lateral permeability anisotropy, adding CH4 into the CO2 stream, and adjusting the reservoir temperatures. Model-predicted gas saturation, CO2 accumulation thickness, and CO2 solubility in brine—none of which were used as calibration metrics—were all comparable with the interpretations of the seismic data in the literature. In Parts II and III, we evaluated the uncertainties of predicted long-term CO2 fate up to 10,000 years due to uncertain reaction kinetics. Under four scenarios of kinetic rate laws, the temporal and spatial evolution of CO2 partitioning into the four trapping mechanisms (hydrodynamic/structural, solubility, residual/capillary, and mineral) was simulated with ToughReact, taking into account the CO2-brine-rock reactions and multi-phase reactive flow and mass transport. Modeling results show that different rate laws for mineral dissolution and precipitation reactions resulted in different predicted amounts of CO2 trapped by carbonate minerals, with scenarios using the conventional linear rate law for feldspar dissolution having twice as much mineral trapping (21% of the injected CO2) as scenarios with a Burch-type or Alekseyev et al.–type rate law for feldspar dissolution (11%). So far, most reactive transport modeling (RTM) studies for
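The contrast between the conventional linear rate law and a Burch-type law lies in the affinity term f multiplying r = k · A · f, where f depends on the Gibbs energy of reaction ΔG = RT ln(Q/K). The sketch below shows only the qualitative shapes; the shape parameters are made up for illustration, not the calibrated values from this study.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def affinity_linear(Q, K):
    """Conventional TST-linear affinity term: f = 1 - Q/K."""
    return 1.0 - Q / K

def affinity_sigmoidal(Q, K, T, n=1e-4, m=3.0):
    """A Burch-type nonlinear affinity term: the rate stays very low until
    the solution is far from equilibrium. n and m are made-up shape values."""
    g = abs(R * T * math.log(Q / K)) / (R * T)   # |dG| / RT
    return 1.0 - math.exp(-n * g ** m)

def dissolution_rate(k, A, affinity):
    """r = k * A * f: mol/s for rate constant k (mol/m2/s) and
    reactive surface area A (m2)."""
    return k * A * affinity

# Near equilibrium (Q/K = 0.9) the linear law still gives ~10% of the
# far-from-equilibrium rate, while the Burch-type law all but shuts off,
# which is why the two predict different amounts of mineral trapping.
lin = affinity_linear(0.9, 1.0)
sig = affinity_sigmoidal(0.9, 1.0, 310.0)
```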

  11. Water pollution risk simulation and prediction in the main canal of the South-to-North Water Transfer Project

    NASA Astrophysics Data System (ADS)

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Cheng, Xi

    2014-11-01

    The middle route of the South-to-North Water Transfer Project (MRP) will divert water from Taocha in the Danjiangkou reservoir, located in the Hubei province of China, to Tuancheng Lake in Beijing. The MRP is composed of a long canal and complex hydraulic structures and will transfer water in open channel areas to provide drinking water for Beijing, Shijiazhuang and other cities under extremely strict water quality requirements. Vehicular accidents on the many highway bridges that cross the main canal could cause significant water pollution in the main canal. To ensure that water quality is maintained during the diversion process, the effects of pollutants on water quality due to sudden pollution accidents were simulated and analyzed in this paper. The MIKE11 HD module was used to calculate the hydraulic characteristics of the 42-km Xishi-to-Beijuma River channel of the MRP. Six types of hydraulic structures, including inverted siphons, gates, highway bridges, culverts and tunnels, were included in this model. Based on the hydrodynamic model, the MIKE11 AD module, a one-dimensional advection-dispersion model, was built for TP, NH3-N, CODMn and F. The validated results showed that the computed values agreed well with the measured values. In accordance with transportation data across the Dianbei Highway Bridge, the effects of traffic accidents on the bridge on water quality were analyzed. Based on simulated scenarios with three discharge rates (12-17 m3/s, 40 m3/s, and 60 m3/s) and three pollution loading levels (5 t, 10 t and 20 t) in which trucks spill their contents (i.e., phosphate fertilizer, cyanide, oil or chromium solution) into the channel, emergency measures were proposed. Reasonable solutions to ensure water quality for the various types of pollutants were proposed, including treating the polluted water and maintaining material and personnel reserves.
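An advection-dispersion module of this kind solves dC/dt + u dC/dx = D d2C/dx2 along the channel. The explicit upwind finite-difference sketch below is not MIKE11 itself, and all numbers are hypothetical; it only shows how a spill pulse advects downstream and spreads.

```python
def advect_disperse(conc, u, D, dx, dt, steps):
    """Explicit upwind finite-difference solution of the 1-D
    advection-dispersion equation dC/dt + u dC/dx = D d2C/dx2.
    Stable for u*dt/dx <= 1 and 2*D*dt/dx**2 <= 1; boundaries held fixed."""
    c = list(conc)
    n = len(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            adv = -u * (c[i] - c[i - 1]) / dx                   # upwind (u > 0)
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
            new[i] = c[i] + dt * (adv + disp)
        c = new
    return c

# Hypothetical spill idealized as a concentration pulse in one 100 m cell
# of the canal, with flow velocity u = 0.5 m/s and dispersion D = 5 m2/s.
c0 = [0.0] * 50
c0[5] = 10.0                    # mg/L in the spill cell
c = advect_disperse(c0, u=0.5, D=5.0, dx=100.0, dt=60.0, steps=100)
# after 100 minutes the peak has moved downstream and flattened
```

The travel time of the peak to the next control gate is what drives the emergency-measure lead times the abstract discusses.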

  12. Predicting future US water yield and ecosystem productivity by linking an ecohydrological model to WRF dynamically downscaled climate projections

    NASA Astrophysics Data System (ADS)

    Sun, S.; Sun, G.; Cohen, E.; McNulty, S. G.; Caldwell, P.; Duan, K.; Zhang, Y.

    2015-12-01

Quantifying the potential impacts of climate change on water yield and ecosystem productivity (i.e., carbon balances) is essential to developing sound watershed restoration plans, and climate change adaptation and mitigation strategies. This study links an ecohydrological model (Water Supply and Stress Index, WaSSI) with WRF (Weather Research and Forecasting Model) dynamically downscaled climate projections of the HadCM3 model under the IPCC SRES A2 emission scenario. We evaluated the future (2031-2060) changes in evapotranspiration (ET), water yield (Q) and gross primary productivity (GPP) from the baseline period of 1979-2007 across the 82,773 watersheds (12-digit Hydrologic Unit Code level) in the conterminous US (CONUS), and evaluated the future annual and monthly changes of hydrology and ecosystem productivity for the 18 Water Resource Regions (WRRs), or 2-digit HUCs. Across the CONUS, the future multi-year means show a 45 mm yr-1 (6%) increase in annual precipitation (P), a 1.8 °C increase in temperature (T), a 37 mm yr-1 (7%) increase in ET, a 9 mm yr-1 (3%) increase in Q, and a 106 g C m-2 yr-1 (9%) increase in GPP. Response to climate change was highly variable across the 82,773 watersheds, but in general the majority would see consistent increases in all variables evaluated. Over half of the 82,773 watersheds, mostly in the northeast and the southern part of the southwest, would have an increase in annual Q (>100 mm yr-1 or 20%). This study provides an integrated method and example for comprehensive assessment of the potential impacts of climate change on watershed water balances and ecosystem productivity at high spatial and temporal resolutions. Results will be useful for policy-makers and land managers in formulating appropriate watershed-specific strategies for sustaining water and carbon sources in the face of climate change.
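The reported deltas are mutually consistent under the simple annual balance Q = P - ET; a quick check (the watershed baselines below are hypothetical, only the +45 and +37 mm yr-1 shifts come from the abstract):

```python
import numpy as np

P_base = np.array([900.0, 1100.0])   # hypothetical baseline precipitation, mm/yr
ET_base = np.array([600.0, 700.0])   # hypothetical baseline ET, mm/yr

P_fut = P_base + 45.0                # CONUS-mean changes reported in the study
ET_fut = ET_base + 37.0

# Annual water balance Q = P - ET, so the yield change is the difference of deltas.
dQ = (P_fut - ET_fut) - (P_base - ET_base)
# dQ = 45 - 37 = 8 mm/yr everywhere, the order of the reported ~9 mm/yr increase
```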

  13. Project: "Project!"

    ERIC Educational Resources Information Center

    Grayson, Katherine

    2007-01-01

    In November 2006, the editors of "Campus Technology" launched their first-ever High-Resolution Projection Study, to find out if the latest in projector technology could really make a significant difference in teaching, learning, and educational innovation on US campuses. The author and her colleagues asked campus educators, technologists, and…

  14. Prediction and Detection of Land Surface Deformation Associated with CO2 Injection at the FutureGen 2.0 Carbon Capture and Storage Project Site

    NASA Astrophysics Data System (ADS)

    Strickland, C. E.; Spane, F.; Bonneville, A.; Murray, C. J.; Nguyen, B. N.; Vermeul, V. R.; Gilmore, T. J.

    2014-12-01

The FutureGen 2.0 Project will inject 22 MMT of supercritical CO2 into the Mt Simon sandstone reservoir utilizing four deep-injection wells and a comprehensive monitoring program, which includes surface deformation monitoring. Analytical and numerical modeling analyses were both performed to predict potential vertical elevation changes based on simulated pressure changes and geomechanical properties for the targeted injection zone. Pressure changes due to continuous CO2 injection of 1.1 MMT of CO2/year over 20 years were obtained using the STOMP-CO2 numerical simulator. Injection zone elastic properties were calculated primarily from wireline geomechanical survey results obtained from the initial FutureGen 2.0 stratigraphic borehole. The continuous wireline geomechanical log elastic property results were used to estimate model layer thickness and rock compressibility for the various injection zone/model layers and were compared with hydrologic characterization results. Compressibility estimates obtained from hydrologically based in-situ tests, together with limited laboratory core samples, provided similarly low compressibility results (1.05×10-7 to 4.92×10-7 psi-1) to those derived from the geomechanical wireline surveys. The predicted surface deformation was then estimated using two parallel modeling approaches. First, an analytical Biot-based poro-elastic model was used to calculate an equivalent vertical displacement at land surface from the expected pore pressure increase and predicted injection zone model layer rock compressibility. The second method utilized a fully 3-D geomechanical modeling analysis to calculate the expected deformation at the surface using a STOMP-CO2/ABAQUS® sequentially coupled simulator. The predicted surface uplift after 20 years of continuous injection for both methods indicated a maximum deformation of approximately 20-25 mm, with most of the deformation occurring during the first 2 years. Surface deformation
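The analytical estimate amounts to summing, over model layers, thickness times vertical compressibility times pore-pressure rise. A back-of-the-envelope sketch with hypothetical layer values (only the ~10-7 psi-1 compressibility magnitude echoes the abstract):

```python
# Illustrative 1-D uplift estimate: each layer contributes
# thickness [m] x compressibility [1/psi] x pressure rise [psi].
# Layer thicknesses and pressure rises are invented for the example.
PA_PER_PSI = 6894.76

layers = [  # (thickness [m], compressibility [1/psi], pressure rise [Pa])
    (100.0, 2.0e-7, 3.0e6),
    (150.0, 4.0e-7, 2.0e6),
]
uplift_m = sum(h * c * (dp / PA_PER_PSI) for h, c, dp in layers)
print(round(uplift_m * 1000, 1), "mm")  # ~26 mm, same order as the 20-25 mm prediction
```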

  15. Climate prediction and predictability

    NASA Astrophysics Data System (ADS)

    Allen, Myles

    2010-05-01

Climate prediction is generally accepted to be one of the grand challenges of the Geophysical Sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, the risks associated with global geo-engineering schemes? This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively but not exclusively from the experience of the climateprediction.net project, running multiple versions of climate models on personal computers.

  16. Ground-Based Cloud and Atmospheric Boundary Layer Observations for the Project: High Definition Clouds and Precipitation for Advancing Climate Prediction, HD(CP)2

    NASA Astrophysics Data System (ADS)

    Hirsikko, A.; Ebell, K.; Ulrich, U.; Schween, J. H.; Bohn, B.; Görsdorf, U.; Leinweber, R.; Päschke, E.; Baars, H.; Seifert, P.; Klein Baltink, H.

    2014-12-01

The German research initiative "High Definition Clouds and Precipitation for advancing Climate Prediction, HD(CP)2" aims for an improved representation of clouds and precipitation in climate models. Model development and its evaluation require comprehensive observational datasets. A specific work package was established to create uniform and documented observational datasets for the HD(CP)2 database. Datasets included ground-based remote-sensing (Doppler lidars, ceilometers, microwave radiometers, and cloud radars) and in-situ (meteorological and radiation sensors) measurements. Four supersites (Jülich ObservatorY for Cloud Evolution (JOYCE), Lindenberg Meteorological Observatory - Richard Assmann Observatory (RAO), and Leipzig Aerosol and Cloud Remote Observations System (LACROS) in Germany, and the Cabauw experimental site for atmospheric research (Cesar) in the Netherlands) are finalizing operational procedures to provide quality-controlled (and, where possible, calibrated) remote-sensing and in-situ observations, retrievals of atmospheric boundary layer state (e.g. winds, mixing layer height, humidity and temperature), and cloud macro- and microphysical properties with uncertainty estimates or at least quality flags. During the project, new processing and retrieval methods were developed where no commonly agreed or satisfactory methods were available. In particular, substantial progress was made on uncertainty estimation and automated quality control. Additionally, the data from JOYCE are used in radiative closure studies under cloudy conditions to evaluate retrievals of cloud properties. The current status of work progress will be presented.

  17. Glyphosate Use Predicts ADHD Hospital Discharges in the Healthcare Cost and Utilization Project Net (HCUPnet): A Two-Way Fixed-Effects Analysis

    PubMed Central

    Fluegge, Keith R.; Fluegge, Kyle R.

    2015-01-01

There has been considerable international study on the etiology of rising mental disorders, such as attention-deficit hyperactivity disorder (ADHD), in human populations. As glyphosate is the most commonly used herbicide in the world, we sought to test the hypothesis that glyphosate use in agriculture may be a contributing environmental factor to the rise of ADHD in human populations. State estimates for glyphosate use and nitrogen fertilizer use were obtained from the U.S. Geological Survey (USGS). We queried the Healthcare Cost and Utilization Project net (HCUPNET) for state-level hospitalization discharge data in all patients for all-listed ADHD from 2007 to 2010. We used rural-urban continuum codes from the USDA Economic Research Service when exploring the effect of urbanization on the relationship between herbicide use and ADHD. The least squares dummy variable (LSDV) method and the within method using two-way fixed effects were used to elucidate the relationship between glyphosate use and all-listed ADHD hospital discharges. We show that a one-kilogram increase in glyphosate use in one year significantly and positively predicts state-level all-listed ADHD discharges, expressed as a percent of total mental disorders, the following year (coefficient = 5.54E-08, p<.01). A study of the effect of urbanization on the relationship between glyphosate and ADHD indicates that the relationship is marginally significantly positive after multiple-comparison correction only in urban U.S. counties (p<.025). Furthermore, total glyphosate use is strongly positively associated with total farm use of nitrogen fertilizers from 1992 to 2006 (p<.001). We present evidence from the biomedical research literature of a plausible link among glyphosate, nitrogen dysbiosis and ADHD. Glyphosate use is a significant predictor of state hospitalizations for all-listed ADHD hospital discharges, with the effect concentrated in urban U.S. counties.
This effect is seen even after controlling
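A two-way fixed-effects (LSDV) regression of the kind named above amounts to regressing the outcome on the predictor plus full sets of state and year dummies. A synthetic-data sketch (all coefficients, sample sizes and noise levels are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_year = 10, 4
state = np.repeat(np.arange(n_state), n_year)   # panel index: state
year = np.tile(np.arange(n_year), n_state)      # panel index: year
x = rng.normal(size=state.size)                 # e.g. lagged herbicide use
# Outcome with a true slope of 0.5 plus state and year effects.
y = 0.5 * x + 0.1 * state + 0.2 * year + rng.normal(scale=0.01, size=state.size)

# LSDV: dummies for all but one state and one year avoid perfect collinearity
# with the intercept.
D_state = (state[:, None] == np.arange(1, n_state)).astype(float)
D_year = (year[:, None] == np.arange(1, n_year)).astype(float)
X = np.column_stack([np.ones_like(x), x, D_state, D_year])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
# beta[1] recovers the within-estimate of the slope (~0.5 here)
```

The "within" method the abstract also mentions demeans y and x by state and by year instead of adding dummies; both give the same slope estimate.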

  18. Glyphosate Use Predicts ADHD Hospital Discharges in the Healthcare Cost and Utilization Project Net (HCUPnet): A Two-Way Fixed-Effects Analysis.

    PubMed

    Fluegge, Keith R; Fluegge, Kyle R

    2015-01-01

There has been considerable international study on the etiology of rising mental disorders, such as attention-deficit hyperactivity disorder (ADHD), in human populations. As glyphosate is the most commonly used herbicide in the world, we sought to test the hypothesis that glyphosate use in agriculture may be a contributing environmental factor to the rise of ADHD in human populations. State estimates for glyphosate use and nitrogen fertilizer use were obtained from the U.S. Geological Survey (USGS). We queried the Healthcare Cost and Utilization Project net (HCUPNET) for state-level hospitalization discharge data in all patients for all-listed ADHD from 2007 to 2010. We used rural-urban continuum codes from the USDA Economic Research Service when exploring the effect of urbanization on the relationship between herbicide use and ADHD. The least squares dummy variable (LSDV) method and the within method using two-way fixed effects were used to elucidate the relationship between glyphosate use and all-listed ADHD hospital discharges. We show that a one-kilogram increase in glyphosate use in one year significantly and positively predicts state-level all-listed ADHD discharges, expressed as a percent of total mental disorders, the following year (coefficient = 5.54E-08, p<.01). A study of the effect of urbanization on the relationship between glyphosate and ADHD indicates that the relationship is marginally significantly positive after multiple-comparison correction only in urban U.S. counties (p<.025). Furthermore, total glyphosate use is strongly positively associated with total farm use of nitrogen fertilizers from 1992 to 2006 (p<.001). We present evidence from the biomedical research literature of a plausible link among glyphosate, nitrogen dysbiosis and ADHD. Glyphosate use is a significant predictor of state hospitalizations for all-listed ADHD hospital discharges, with the effect concentrated in urban U.S. counties.
This effect is seen even after controlling

  19. Prediction of Pseudo relative velocity response spectra at Yucca Mountain for underground nuclear explosions conducted in the Pahute Mesa testing area at the Nevada testing site; Yucca Mountain Site Characterization Project

    SciTech Connect

    Phillips, J.S.

    1991-12-01

    The Yucca Mountain Site Characterization Project (YMP), managed by the Office of Geologic Disposal of the Office of Civilian Radioactive Waste Management of the US Department of Energy, is examining the feasibility of siting a repository for commercial, high-level nuclear wastes at Yucca Mountain on and adjacent to the Nevada Test Site (NTS). This work, intended to extend our understanding of the ground motion at Yucca Mountain resulting from testing of nuclear weapons on the NTS, was funded by the Yucca Mountain project and the Military Applications Weapons Test Program. This report summarizes one aspect of the weapons test seismic investigations conducted in FY88. Pseudo relative velocity response spectra (PSRV) have been calculated for a large body of surface ground motions generated by underground nuclear explosions. These spectra have been analyzed and fit using multiple linear regression techniques to develop a credible prediction technique for surface PSRVs. In addition, a technique for estimating downhole PSRVs at specific stations is included. A data summary, data analysis, prediction development, prediction evaluation, software summary and FORTRAN listing of the prediction technique are included in this report.

  20. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 3: A stochastic rain fade control algorithm for satellite link power via nonlinear Markov filtering theory

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1991-01-01

    The dynamic and composite nature of propagation impairments that are incurred on Earth-space communications links at frequencies in and above 30/20 GHz Ka band, i.e., rain attenuation, cloud and/or clear air scintillation, etc., combined with the need to counter such degradations after the small link margins have been exceeded, necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.
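The linear special case of the Markov filtering the abstract invokes can be sketched as a scalar Kalman filter tracking a random-walk fade level from noisy link measurements (synthetic data and noise levels, purely illustrative of the estimate-and-predict idea, not the ACTS algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
true_fade = np.cumsum(rng.normal(scale=0.05, size=200))   # random-walk fade, dB
obs = true_fade + rng.normal(scale=0.5, size=200)         # noisy link samples

q, r = 0.05**2, 0.5**2      # process / measurement noise variances
x, p = 0.0, 1.0             # state estimate and its variance
est = []
for z in obs:
    p += q                  # predict: variance grows by process noise
    k = p / (p + r)         # Kalman gain
    x += k * (z - x)        # update toward the new measurement
    p *= (1 - k)
    est.append(x)
est = np.array(est)
# The filtered track has much lower error than the raw measurements.
```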

  1. Composite risk scores and composite endpoints in the risk prediction of outcomes in anticoagulated patients with atrial fibrillation. The Loire Valley Atrial Fibrillation Project.

    PubMed

    Banerjee, A; Fauchier, L; Bernard-Brunet, A; Clementy, N; Lip, G Y H

    2014-03-01

Several validated risk stratification schemes for prediction of ischaemic stroke (IS)/thromboembolism (TE) and major bleeding are available for patients with non-valvular atrial fibrillation (NVAF). On the basis of multiple common risk factors for IS/TE and bleeding, it has been suggested that composite risk prediction scores may be more practical and user-friendly than separate scores for bleeding and IS/TE. In a long-term prospective hospital registry of anticoagulated patients with newly diagnosed AF, we compared the predictive value of existing risk prediction scores as well as composite risk scores, and also compared these risk scoring systems using composite endpoints. Endpoint 1 was the simple composite of IS and major bleeds. Endpoint 2 was based on a composite of IS plus intracerebral haemorrhage (ICH). Endpoint 3 was based on weighted coefficients for IS/TE and ICH. Endpoint 4 was a composite of stroke, cardiovascular death, TE and major bleeding. The incremental predictive value of these scores over CHADS2 (as reference) for composite endpoints was assessed using the c-statistic, net reclassification improvement (NRI) and integrated discrimination improvement (IDI). Of 8,962 eligible individuals, 3,607 (40.2%) had NVAF and were on OAC at baseline. There were no statistically significant differences between the c-statistics of the various risk scores, compared with the CHADS2 score, regardless of the endpoint. For the various risk scores and various endpoints, NRI and IDI did not show significant improvement (≥1%), compared with the CHADS2 score. In conclusion, composite risk scores did not significantly improve risk prediction of endpoints in patients with NVAF, regardless of how endpoints were defined. This would support individualised prediction of IS/TE and bleeding separately using separate risk prediction tools, and not the use of composite scores or endpoints for everyday 'real world' clinical practice, to guide decisions on
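The c-statistic used above is the probability that a randomly chosen patient who had the event received a higher risk score than a randomly chosen patient who did not, with ties counted as one half. A minimal sketch (the scores and outcomes below are hypothetical, not registry data):

```python
import numpy as np

def c_statistic(score, event):
    """Concordance (c-statistic): P(score of an event case > score of a
    non-event case), counting ties as 1/2. Equivalent to ROC AUC."""
    s1 = score[event == 1]               # scores of cases with the event
    s0 = score[event == 0]               # scores of cases without the event
    diff = s1[:, None] - s0[None, :]     # all case/non-case pairs
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

# Hypothetical CHADS2-like integer scores and observed outcomes.
scores = np.array([0, 1, 1, 2, 2, 3, 4, 5])
events = np.array([0, 0, 0, 0, 1, 0, 1, 1])
print(c_statistic(scores, events))  # → 0.9
```

Comparing two scores then reduces to comparing their c-statistics on the same cohort, which is the test the abstract reports as non-significant.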

  2. Project summaries

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis on the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station keeping requirements. An existing code, used in predicting station keeping requirements for oil drilling platforms operating in North Shore (Alaska) waters was used as a basis for the computer simulation. Modifications were made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consists of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.

  3. Evaluation of Projection Methods to Predict Wetlands Area Sizes: the Wetlands Inventory of the United States (sample Date Correction, Land Classification)

    NASA Astrophysics Data System (ADS)

    Terrazas-Gonzalez, Gerardo H.

This research concerns different methods that can be applied for projection of wetlands areas at selected times. One method, described by Frayer (1987) for a stratified random sampling design, was developed by W. E. Frayer in collaboration with D. C. Bowden and can be used in surveys where the sampling units have been measured at two different times, t1 and t2. A change matrix giving the amount of each wetland type at time t1 that is in each of the wetland types at time t2 is obtained for each sampling unit. Projections are based on a mean annual stratum matrix of changes. Methods of evaluating the reliability of Frayer-Bowden (FB) projections have not been given previously. One objective of this project is to provide variance estimators for the FB projection method using jackknife and bootstrap techniques. Direct analytic techniques appear to require an unrealistic amount of time to develop, given the complexity of the estimator. Interest in projections at an arbitrary time led to a more general description of the stratum-basis estimator to include t < t1 and t between t1 and t2. Variation in the measurement times t1 and t2 among sampling units within a stratum, and other considerations including the complexity of the stratum-basis estimator, motivated the use of simpler estimators. Two classes of methods for making projections are given. The first class of estimators consists of functions of summed estimated change matrices, while the second class consists of functions of products of normalized estimated change matrices. Estimators within each class are further differentiated by whether the change matrices are on an annual or observed-time-period (t2 - t1) basis. The projection procedures address two different objectives. One is estimation of the total wetlands area at a given time. The other is estimation of the amount of change among wetland types between two given times.
All the methods can be used for both objectives
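The second class of estimators can be sketched as a Markov-style projection: a change matrix is row-normalized into transition proportions, and matrix powers carry an area vector forward. All areas below are hypothetical, with two wetland types for brevity:

```python
import numpy as np

# Hypothetical change matrix between times t1 and t2:
# row i, column j = area of type i at t1 that is type j at t2.
change = np.array([[90.0, 10.0],    # type A: stays A / becomes B
                   [ 5.0, 95.0]])   # type B: becomes A / stays B
P = change / change.sum(axis=1, keepdims=True)   # normalized transition matrix

area_t2 = np.array([150.0, 150.0])                # areas at time t2
area_t3 = area_t2 @ P                             # one period past t2
area_t4 = area_t2 @ np.linalg.matrix_power(P, 2)  # two periods past t2
# Total area is conserved; only its split among types changes.
```

A jackknife variance for such a projection would repeat this calculation leaving out one sampling unit's change matrix at a time, as the project proposes.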

  4. Predicting the effects of climate change on ecosystems and wildlife habitat in northwest Alaska: results from the WildCast project

    USGS Publications Warehouse

    DeGange, Anthony R.; Marcot, Bruce G.; Lawler, James; Jorgenson, Torre; Winfree, Robert

    2014-01-01

We used a modeling framework and a recent ecological land classification and land cover map to predict how ecosystems and wildlife habitat in northwest Alaska might change in response to increasing temperature. Our results suggest modest increases in forest and tall shrub ecotypes in northwest Alaska by the end of this century, thereby increasing habitat for forest-dwelling and shrub-using birds and mammals. Conversely, we predict declines in several more open low shrub, tussock, and meadow ecotypes favored by many waterbird, shorebird, and small mammal species.

  5. EPA's ToxCast Project: Lessons learned and future directions for use of HTS in predicting in vivo toxicology -- A Chemical Perspective

    EPA Science Inventory

    U.S. EPA’s ToxCast and the related Tox21 projects are employing high-throughput screening (HTS) technologies to profile thousands of chemicals, which in turn serve as probes of a wide diversity of targets, pathways and mechanisms related to toxicity. Initial models relating ToxCa...

  6. DEVELOPMENT AND EVALUATION OF TIME- AND HEALTH-RELEVANT MONITORING METHODS FOR PM 2.5 EPISODE PREDICTION (SALT LAKE CITY EMPACT PROJECT)

    EPA Science Inventory

    Scope: Primary project goals are: (a) evaluate usefulness of a newly-developed, real-time, continuous monitor for total (nonvolatile plus semivolatile) PM2.5 mass, and particularly time- and health-relevance of this method as compared to other measurements of PM paramete...

  7. PROJECT CS-1082, INFORMATION AND TECHNOLOGY TOOLS FOR ASSESSMENT AND PREDICTION OF THE POTENTIAL EFFECTS OF MILITARY NOISE ON MARINE MAMMALS

    EPA Science Inventory

    CS-1082 was a FY98 New Start. Our broad objective is to transition information about effects of DoD sound types on marine mammal auditory anatomy and acoustic ecology to predictive models and mitigation tools. Currently, the DoD lacks scientifically defensible information concern...

  8. Predicting Breed Composition Using Breed Frequencies of 50,000 Markers from the U.S. Meat Animal Research Center 2,000 Bull Project

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Our objective was to evaluate whether breed composition of crossbred cattle could be predicted using reference breed frequencies of SNP markers on the BovineSNP50 array. Semen DNA samples of over 2,000 bulls from 16 common commercial beef breeds were genotyped using the array and used to estimate cu...

  9. Projected Applications of a "Weather in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi

    2010-01-01

    The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.

  10. On Earthquake Prediction in Japan

    PubMed Central

    UYEDA, Seiya

    2013-01-01

Japan’s National Project for Earthquake Prediction has been conducted since 1965 without success. An earthquake prediction should be a short-term prediction based on observable physical phenomena or precursors. The main reason for the lack of success is the failure to capture precursors. Most of the financial resources and manpower of the National Project have been devoted to strengthening the seismograph networks, which are not generally effective for detecting precursors since many precursors are non-seismic. Precursor research has never been supported appropriately because the project has always been run by a group of seismologists who, in the present author’s view, are mainly interested in securing funds for seismology on the pretense of prediction. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this decision was further fortified by the 2011 M9 Tohoku mega-quake. On top of the National Project, there are other government projects, not formally but vaguely related to earthquake prediction, that consume many orders of magnitude more funds. They are also uninterested in short-term prediction. Financially, they are giants and the National Project is a dwarf. Thus, in Japan now, there is practically no support for short-term prediction research. Recently, however, substantial progress has been made in real short-term prediction by scientists of diverse disciplines. Some promising signs are also arising even from cooperation with the private sector. PMID:24213204

  11. On earthquake prediction in Japan.

    PubMed

    Uyeda, Seiya

    2013-01-01

Japan's National Project for Earthquake Prediction has been conducted since 1965 without success. An earthquake prediction should be a short-term prediction based on observable physical phenomena or precursors. The main reason for the lack of success is the failure to capture precursors. Most of the financial resources and manpower of the National Project have been devoted to strengthening the seismograph networks, which are not generally effective for detecting precursors since many precursors are non-seismic. Precursor research has never been supported appropriately because the project has always been run by a group of seismologists who, in the present author's view, are mainly interested in securing funds for seismology on the pretense of prediction. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this decision was further fortified by the 2011 M9 Tohoku mega-quake. On top of the National Project, there are other government projects, not formally but vaguely related to earthquake prediction, that consume many orders of magnitude more funds. They are also uninterested in short-term prediction. Financially, they are giants and the National Project is a dwarf. Thus, in Japan now, there is practically no support for short-term prediction research. Recently, however, substantial progress has been made in real short-term prediction by scientists of diverse disciplines. Some promising signs are also arising even from cooperation with the private sector. PMID:24213204

  12. Use of Cumulative Degradation Factor Prediction and Life Test Result of the Thruster Gimbal Assembly Actuator for the Dawn Flight Project

    NASA Technical Reports Server (NTRS)

    Lo, C. John; Brophy, John R.; Etters, M. Andy; Ramesham, Rajeshuni; Jones, William R., Jr.; Jansen, Mark J.

    2009-01-01

The Dawn Ion Propulsion System is the ninth project in NASA's Discovery Program. The Dawn spacecraft is being developed to enable the scientific investigation of the two heaviest main-belt asteroids, Vesta and Ceres. Dawn is the first mission to orbit two extraterrestrial bodies, and the first to orbit a main-belt asteroid. The mission is enabled by the onboard Ion Propulsion System (IPS), which provides the post-launch delta-V. The three ion engines of the IPS are mounted on Thruster Gimbal Assemblies (TGAs), with only one engine operating at a time for this 10-year mission. The three TGAs weigh 14.6 kg.

  13. Predicting the likelihood of future sexual recidivism: pilot study findings from a California sex offender risk project and cross-validation of the Static-99.

    PubMed

    Sreenivasan, Shoba; Garrick, Thomas; Norris, Randall; Cusworth-Walker, Sarah; Weinberger, Linda E; Essres, Garrett; Turner, Susan; Fain, Terry

    2007-01-01

Pilot findings on 137 California sex offenders followed up over 10 years after release from custody (excluding cases in which legal jurisdiction expired) are presented. The sexual recidivism rate, very likely inflated by sample selection, was 31 percent at five years and 40 percent at 10 years. Cumulatively, markers of sexual deviance (multiple victim types) and criminality (prior parole violations and prison terms) yielded better prediction of sexual recidivism (receiver operating characteristic [ROC] = .71, r = .46) than either marker singly (multiple victim types: ROC = .60, r = .31; prior parole violations and prison terms: ROC = .66, r = .37). Long-term Static-99 statistical predictive accuracy for sexual recidivism was lower in our sample (ROC = .62, r = .24) than the values presented in the developmental norms. Sexual recidivism rates were higher in our study for Static-99 scores of 2 and 3 than in the developmental sample, and lower for scores of 4 and 6. Given failures to replicate developmental norms, the Static-99 method of ranking sexual recidivism risk warrants caution when applied to individual offenders. PMID:18086738

  14. Orthostatic Hypotension and Elevated Resting Heart Rate Predict Low-Energy Fractures in the Population: The Malmö Preventive Project

    PubMed Central

    Hamrefors, Viktor; Härstedt, Maria; Holmberg, Anna; Rogmark, Cecilia; Sutton, Richard; Melander, Olle; Fedorowski, Artur

    2016-01-01

Background Autonomic disorders of the cardiovascular system, such as orthostatic hypotension and elevated resting heart rate, predict mortality and cardiovascular events in the population. Low-energy fractures constitute a substantial clinical problem that may represent an additional risk related to such autonomic dysfunction. Aims To test the association between orthostatic hypotension, resting heart rate and incidence of low-energy fractures in the general population. Methods and Results Using multivariable-adjusted Cox regression models, we investigated the association between orthostatic blood pressure response, resting heart rate and first incident low-energy fracture in a population-based, middle-aged cohort of 33,000 individuals over 25 years of follow-up. The median follow-up time from baseline to first incident fracture among the subjects who experienced a low-energy fracture was 15.0 years. A 10 mmHg orthostatic decrease in systolic blood pressure at baseline was associated with a 5% increased risk of low-energy fractures (95% confidence interval 1.01–1.10) during follow-up, whereas resting heart rate predicted low-energy fractures with an effect size of 8% increased risk per 10 beats-per-minute (1.05–1.12), independently of the orthostatic response. Subjects with a resting heart rate exceeding 68 beats-per-minute had an 18% (1.10–1.26) increased risk of low-energy fractures during follow-up compared with subjects with a resting heart rate below 68 beats-per-minute. When combining the orthostatic response and resting heart rate, there was a 30% risk increase (1.08–1.57) of low-energy fractures between the extremes, i.e. between subjects in the fourth compared with the first quartiles of both resting heart rate and systolic blood pressure decrease. Conclusion Orthostatic blood pressure decline and elevated resting heart rate independently predict low-energy fractures in a middle-aged population. These two measures of subclinical cardiovascular
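The per-10-unit effect sizes quoted in such Cox models scale multiplicatively on the log-hazard. A small arithmetic sketch, using the abstract's 5%-per-10-mmHg and 8%-per-10-bpm figures (the exposure values are illustrative):

```python
import math

# Convert the quoted per-10-unit risk increases into log-hazard coefficients.
beta_sbp_drop = math.log(1.05) / 10   # 5% higher risk per 10 mmHg orthostatic drop
beta_hr = math.log(1.08) / 10         # 8% higher risk per 10 beats-per-minute

# Hazard ratio for a hypothetical 20 mmHg drop: the per-10 effect squared.
hr_20mmHg = math.exp(beta_sbp_drop * 20)   # = 1.05**2 ≈ 1.10
# Joint effect of a 20 mmHg drop plus 20 bpm higher resting heart rate,
# assuming independence as the model does:
hr_joint = math.exp(beta_sbp_drop * 20 + beta_hr * 20)
```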

  15. Applications systems verification and transfer project. Volume 1: Operational applications of satellite snow cover observations: Executive summary. [usefulness of satellite snow-cover data for water yield prediction]

    NASA Technical Reports Server (NTRS)

    Rango, A.

    1981-01-01

    Both LANDSAT and NOAA satellite data were used in improving snowmelt runoff forecasts. When the satellite snow cover data were tested in both empirical seasonal runoff estimation and short term modeling approaches, a definite potential for reducing forecast error was evident. A cost benefit analysis run in conjunction with the snow mapping indicated a $36.5 million annual benefit accruing from a one percent improvement in forecast accuracy using the snow cover data for the western United States. The annual cost of employing the system would be $505,000. The snow mapping has proven that satellite snow cover data can be used to reduce snowmelt runoff forecast error in a cost effective manner once all operational satellite data are available within 72 hours after acquisition. Executive summaries of the individual snow mapping projects are presented.
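    The cost-benefit figures quoted above imply a large benefit-to-cost ratio; a quick check of the arithmetic (using only the numbers reported in the abstract):

    ```python
    # Figures as reported in the abstract (annual, western United States).
    annual_benefit = 36_500_000  # $36.5M per 1% improvement in forecast accuracy
    annual_cost = 505_000        # cost of operating the snow-mapping system

    benefit_cost_ratio = annual_benefit / annual_cost
    net_benefit = annual_benefit - annual_cost

    print(f"benefit/cost ratio ≈ {benefit_cost_ratio:.1f}")  # ≈ 72.3
    print(f"net annual benefit = ${net_benefit:,}")
    ```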

  16. Integrative Pathway Analysis of Metabolic Signature in Bladder Cancer: A Linkage to The Cancer Genome Atlas Project and Prediction of Survival

    PubMed Central

    von Rundstedt, Friedrich-Carl; Rajapakshe, Kimal; Ma, Jing; Arnold, James M.; Gohlke, Jie; Putluri, Vasanta; Krishnapuram, Rashmi; Piyarathna, D. Badrajee; Lotan, Yair; Gödde, Daniel; Roth, Stephan; Störkel, Stephan; Levitt, Jonathan M.; Michailidis, George; Sreekumar, Arun; Lerner, Seth P.; Coarfa, Cristian; Putluri, Nagireddy

    2016-01-01

    Purpose We used targeted mass spectrometry to study the metabolic fingerprint of urothelial cancer and determine whether the biochemical pathway analysis gene signature would have a predictive value in independent cohorts of patients with bladder cancer. Materials and Methods Pathologically evaluated, bladder derived tissues, including benign adjacent tissue from 14 patients and bladder cancer from 46, were analyzed by liquid chromatography based targeted mass spectrometry. Differential metabolites associated with tumor samples in comparison to benign tissue were identified by adjusting the p values for multiple testing at a false discovery rate threshold of 15%. Enrichment of pathways and processes associated with the metabolic signature were determined using the GO (Gene Ontology) Database and MSigDB (Molecular Signature Database). Integration of metabolite alterations with transcriptome data from TCGA (The Cancer Genome Atlas) was done to identify the molecular signature of 30 metabolic genes. Available outcome data from TCGA portal were used to determine the association with survival. Results We identified 145 metabolites, of which analysis revealed 31 differential metabolites when comparing benign and tumor tissue samples. Using the KEGG (Kyoto Encyclopedia of Genes and Genomes) Database we identified a total of 174 genes that correlated with the altered metabolic pathways involved. By integrating these genes with the transcriptomic data from the corresponding TCGA data set we identified a metabolic signature consisting of 30 genes. The signature was significant in its prediction of survival in 95 patients with a low signature score vs 282 with a high signature score (p = 0.0458). Conclusions Targeted mass spectrometry of bladder cancer is highly sensitive for detecting metabolic alterations. Applying transcriptome data allows for integration into larger data sets and identification of relevant metabolic pathways in bladder cancer progression. PMID:26802582
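    The differential metabolites above were selected by adjusting p values for multiple testing at a 15% false discovery rate. A minimal sketch of the standard Benjamini–Hochberg procedure often used for such adjustment (the p values below are illustrative, not the study's data):

    ```python
    def benjamini_hochberg(pvals, alpha=0.15):
        """Return indices of hypotheses rejected at FDR level alpha (BH step-up)."""
        m = len(pvals)
        # Sort p-values ascending, remembering original positions.
        order = sorted(range(m), key=lambda i: pvals[i])
        k_max = 0
        for rank, i in enumerate(order, start=1):
            # Largest rank whose p-value clears its BH threshold.
            if pvals[i] <= rank * alpha / m:
                k_max = rank
        return sorted(order[:k_max])

    # Illustrative p-values for six hypothetical metabolites:
    p = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
    print(benjamini_hochberg(p, alpha=0.15))  # first four are significant
    ```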

  17. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics.
Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPo

  18. Projected Applications of a ``Climate in a Box'' Computing System at the NASA Short-term Prediction Research and Transition (SPoRT) Center

    NASA Astrophysics Data System (ADS)

    Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.

    2010-12-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed

  19. Collaborative Project. Understanding the effects of tides and eddies on the ocean dynamics, sea ice cover and decadal/centennial climate prediction using the Regional Arctic Climate Model (RACM)

    SciTech Connect

    Hutchings, Jennifer; Joseph, Renu

    2013-09-14

    The goal of this project is to develop an eddy resolving ocean model (POP) with tides coupled to a sea ice model (CICE) within the Regional Arctic System Model (RASM) to investigate the importance of ocean tides and mesoscale eddies in arctic climate simulations, quantify biases associated with these processes, and assess how their inclusion may improve decadal to centennial arctic climate predictions. Ocean, sea ice and coupled arctic climate response to these small scale processes will be evaluated with regard to their influence on mass, momentum and property exchange between oceans, shelf-basin, ice-ocean, and ocean-atmosphere. The project will facilitate the future routine inclusion of polar tides and eddies in Earth System Models when computing power allows. As such, the proposed research addresses the science in support of BER’s Climate and Environmental Sciences Division Long Term Measure: it will improve the ocean and sea ice model components as well as the fully coupled RASM and Community Earth System Model (CESM), making them more accurate and computationally efficient.

  20. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  1. Predictive Models and Computational Embryology

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  2. Mortality Factors in Geriatric Blunt Trauma Patients: Creation of a Highly Predictive Statistical Model for Mortality Using 50,765 Consecutive Elderly Trauma Admissions from the National Sample Project

    PubMed Central

    HRANJEC, TJASA; SAWYER, ROBERT G.; YOUNG, JEFFREY S.; SWENSON, BRIAN R.; CALLAND, JAMES F.

    2013-01-01

    Elderly patients are at high risk for mortality after injury. We hypothesized that trauma benchmarking efforts would benefit from development of a geriatric-specific model for risk-adjusted analyses of trauma center outcomes. A total of 57,973 records of elderly patients (age older than 65 years), which met our selection criteria, were submitted to the National Trauma Database and included within the National Sample Project between 2003 and 2006. These cases were used to construct a multivariable logistic regression model, which was compared with the American College of Surgeons Committee on Trauma’s Trauma Quality Improvement Project’s (TQIP) existing model. Additional spline regression analyses were performed to further objectively quantify the physiologic differences between geriatric patients and their younger counterparts. The geriatric-specific and TQIP mortality models shared several covariates: age, Injury Severity Score, motor component of the Glasgow Coma Scale, and systolic blood pressure. Our model additionally used temperature and the presence of mechanical ventilation. Our geriatric-specific regression model generated a superior c-statistic as compared with the TQIP approximation (0.85 vs 0.77; P = 0.048). Spline analyses demonstrated that elderly patients appear to be less likely to tolerate relative hypotension with higher observed mortality at initial systolic blood pressures of 90 to 130 mmHg. Although the TQIP model includes a single age component, these data suggest that each variable needs to be adjusted for age to more accurately predict mortality in the elderly. Clearly, a separate geriatric model for predicting outcomes is not only warranted, but necessary. PMID:23265126
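    The c-statistic used to compare the two models above is the probability that a randomly chosen patient who died received a higher predicted risk than one who survived. A minimal pairwise-concordance sketch (risk scores below are illustrative, not the study's data):

    ```python
    from itertools import product

    def c_statistic(scores_died, scores_survived):
        """Concordance index: P(score of a death case > score of a survivor),
        with ties counted as half-concordant."""
        pairs = list(product(scores_died, scores_survived))
        concordant = sum(1 for d, s in pairs if d > s)
        ties = sum(1 for d, s in pairs if d == s)
        return (concordant + 0.5 * ties) / len(pairs)

    # Illustrative predicted mortality risks:
    died = [0.9, 0.7, 0.6]
    survived = [0.5, 0.4, 0.7]
    print(round(c_statistic(died, survived), 3))  # 0.833
    ```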

  3. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    EPA Science Inventory

    Humans potentially are exposed to thousands of man-made chemicals in the environment. Some chemicals mimic natural endocrine hormones and, thus, have the potential to be endocrine disruptors. Many of these chemicals never have been tested for their ability to interact with the es...

  4. National Climate Predictions and Projections (NCPP)

    NASA Astrophysics Data System (ADS)

    Anderson, D. E.; DeLuca, C.

    2011-12-01

    NCPP supports state-of-the-art approaches to develop and deliver comprehensive regional climate information and facilitate its use in decision making and adaptation planning. NCPP is a community enterprise where climate information users, infrastructure developers, and scientists come together in a collaborative problem-solving environment. We will describe the evolving infrastructure and tools under development through open-source, open-access collaboration across US government agencies, universities and international interactions.

  5. Project Wild (Project Tame).

    ERIC Educational Resources Information Center

    Siegenthaler, David

    For 37 states in the United States, Project Wild has become an officially sanctioned, distributed and funded "environmental and conservation education program." For those who are striving to implement focused, sequential, learning programs, as well as those who wish to promote harmony through a non-anthropocentric world view, Project Wild may…

  6. Graphing Predictions

    ERIC Educational Resources Information Center

    Connery, Keely Flynn

    2007-01-01

    Graphing predictions is especially important in classes where relationships between variables need to be explored and derived. In this article, the author describes how his students sketch the graphs of their predictions before they begin their investigations on two laboratory activities: Distance Versus Time Cart Race Lab and Resistance; and…

  7. Predictive Evaluation

    ERIC Educational Resources Information Center

    Scriven, Michael

    2007-01-01

    Noting that there has been extensive discussion of the relation of evaluation to: (1) research; (2) explanations (a.k.a. theory-driven, logic model, or realistic evaluation); and (3) recommendations, the author introduces: (4) prediction. He advocates that unlike the first three concepts, prediction is necessarily part of most kinds of evaluation,…

  8. Practical aspects of geological prediction

    SciTech Connect

    Mallio, W.J.; Peck, J.H.

    1981-11-01

    Nuclear waste disposal requires that geology be a predictive science. The prediction of future events rests on (1) recognizing the periodicity of geologic events; (2) defining a critical dimension of effect, such as the area of a drainage basin, the length of a fault trace, etc.; and (3) using our understanding of active processes to project the frequency and magnitude of future events in the light of geological principles. Of importance to nuclear waste disposal are longer term processes such as continental denudation and removal of materials by glacial erosion. Constant testing of projections will allow the practical limits of predicting geological events to be defined. 11 refs.
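    Step (3) above, projecting the frequency of future events from their observed recurrence, is often sketched as a Poisson exceedance probability. A minimal example with hypothetical numbers (not from the abstract):

    ```python
    import math

    def exceedance_probability(mean_recurrence_years: float, window_years: float) -> float:
        """P(at least one event within the window), assuming events occur as a
        Poisson process with the given mean recurrence interval."""
        return 1.0 - math.exp(-window_years / mean_recurrence_years)

    # Hypothetical: an event type recurring on average every 10,000 years,
    # assessed over a 1,000-year waste-isolation window.
    p = exceedance_probability(10_000, 1_000)
    print(round(p, 4))  # ≈ 0.0952
    ```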

  9. Comparison of initial perturbation methods for the mesoscale ensemble prediction system of the Meteorological Research Institute for the WWRP Beijing 2008 Olympics Research and Development Project (B08RDP)

    NASA Astrophysics Data System (ADS)

    Saito, Kazuo; Hara, Masahiro; Kunii, Masaru; Seko, Hiromu; Yamaguchi, Munehiko

    2011-05-01

    Different initial perturbation methods for the mesoscale ensemble prediction were compared by the Meteorological Research Institute (MRI) as a part of the intercomparison of mesoscale ensemble prediction systems (EPSs) of the World Weather Research Programme (WWRP) Beijing 2008 Olympics Research and Development Project (B08RDP). Five initial perturbation methods for mesoscale ensemble prediction were developed for B08RDP and compared at MRI: (1) a downscaling method of the Japan Meteorological Agency (JMA)'s operational one-week EPS (WEP), (2) a targeted global model singular vector (GSV) method, (3) a mesoscale model singular vector (MSV) method based on the adjoint model of the JMA non-hydrostatic model (NHM), (4) a mesoscale breeding growing mode (MBD) method based on the NHM forecast and (5) a local ensemble transform (LET) method based on the local ensemble transform Kalman filter (LETKF) using NHM. These perturbation methods were applied to the preliminary experiments of the B08RDP Tier-1 mesoscale ensemble prediction with a horizontal resolution of 15 km. To make the comparison easier, the same horizontal resolution (40 km) was employed for the three mesoscale model-based initial perturbation methods (MSV, MBD and LET). The GSV method completely outperformed the WEP method, confirming the advantage of targeting in mesoscale EPS. The GSV method generally performed well with regard to root mean square errors of the ensemble mean, large growth rates of ensemble spreads throughout the 36-h forecast period, and high detection rates and high Brier skill scores (BSSs) for weak rains. On the other hand, the mesoscale model-based initial perturbation methods showed good detection rates and BSSs for intense rains. The MSV method showed a rapid growth in the ensemble spread of precipitation up to a forecast time of 6 h, which suggests suitability of the mesoscale SV for short-range EPSs, but the initial large growth of the perturbation did not last long. The
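    The Brier score and Brier skill score (BSS) used to rank the perturbation methods above are standard probabilistic-forecast verification measures. A minimal sketch with illustrative rain forecasts (not B08RDP data):

    ```python
    def brier_score(forecast_probs, outcomes):
        """Mean squared error of probability forecasts against binary outcomes."""
        n = len(forecast_probs)
        return sum((f - o) ** 2 for f, o in zip(forecast_probs, outcomes)) / n

    def brier_skill_score(forecast_probs, outcomes):
        """BSS relative to a climatology forecast (the observed base rate);
        positive values indicate skill over climatology."""
        base_rate = sum(outcomes) / len(outcomes)
        bs = brier_score(forecast_probs, outcomes)
        bs_ref = brier_score([base_rate] * len(outcomes), outcomes)
        return 1.0 - bs / bs_ref

    # Illustrative rain-probability forecasts and observed occurrences:
    probs = [0.9, 0.8, 0.2, 0.1, 0.7]
    obs = [1, 1, 0, 0, 1]
    print(round(brier_score(probs, obs), 3))       # 0.038
    print(round(brier_skill_score(probs, obs), 3)) # 0.842
    ```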

  10. Predicting Hurricanes with Supercomputers

    SciTech Connect

    2010-01-01

    Hurricane Emily, formed in the Atlantic Ocean on July 10, 2005, was the strongest hurricane ever to form before August. By checking computer models against the actual path of the storm, researchers can improve hurricane prediction. In 2010, NOAA researchers were awarded 25 million processor-hours on Argonne's BlueGene/P supercomputer for the project. Read more at http://go.usa.gov/OLh

  11. Earthquake prediction

    SciTech Connect

    Ma, Z.; Fu, Z.; Zhang, Y.; Wang, C.; Zhang, G.; Liu, D.

    1989-01-01

    Mainland China is situated at the eastern edge of the Eurasian seismic system and is the largest intra-continental region of shallow strong earthquakes in the world. Based on nine earthquakes with magnitudes ranging between 7.0 and 7.9, the book provides observational data and discusses successes and failures of earthquake prediction. Derived from individual earthquakes, observations of various phenomena and seismic activities occurring before and after earthquakes, led to the establishment of some general characteristics valid for earthquake prediction.

  12. Super high-resolution mesoscale weather prediction

    NASA Astrophysics Data System (ADS)

    Saito, K.; Tsuyuki, T.; Seko, H.; Kimura, F.; Tokioka, T.; Kuroda, T.; Duc, L.; Ito, K.; Oizumi, T.; Chen, G.; Ito, J.; the Spire Field 3 Mesoscale Nwp Group

    2013-08-01

    A five-year research project of high performance regional numerical weather prediction is underway as one of the five research fields of the Strategic Programs for Innovative Research (SPIRE). The ultimate goal of the project is to demonstrate the feasibility of precise prediction of severe weather phenomena using the K-computer. The project's three sub-themes are described, along with achievements to date and planned developments in the near future.

  13. Successful Predictions

    NASA Astrophysics Data System (ADS)

    Pierrehumbert, R.

    2012-12-01

    In an observational science, it is not possible to test hypotheses through controlled laboratory experiments. One can test parts of the system in the lab (as is done routinely with infrared spectroscopy of greenhouse gases), but the collective behavior cannot be tested experimentally because a star or planet cannot be brought into the lab; it must, instead, itself be the lab. In the case of anthropogenic global warming, this is all too literally true, and the experiment would be quite exciting if it weren't for the unsettling fact that we and all our descendants for the foreseeable future will have to continue making our home in the lab. There are nonetheless many routes through which the validity of a theory of the collective behavior can be determined. A convincing explanation must not be a "just-so" story, but must make additional predictions that can be verified against observations that were not originally used in formulating the theory. The field of Earth and planetary climate has racked up an impressive number of such predictions. I will also admit as "predictions" statements about things that happened in the past, provided that observations or proxies pinning down the past climate state were not available at the time the prediction was made. The basic prediction that burning of fossil fuels would lead to an increase of atmospheric CO2, and that this would in turn alter the Earth's energy balance so as to cause tropospheric warming, is one of the great successes of climate science. It began in the lineage of Fourier, Tyndall and Arrhenius, and was largely complete with the radiative-convective modeling work of Manabe in the 1960's -- all well before the expected warming had progressed far enough to be observable. Similarly, long before the increase in atmospheric CO2 could be detected, Bolin formulated a carbon cycle model and used it to predict atmospheric CO2 out to the year 2000; the actual values come in at the high end of his predicted range, for

  14. Methods of Predicting Solid Waste Characteristics.

    ERIC Educational Resources Information Center

    Boyd, Gail B.; Hawkins, Myron B.

    The project summarized by this report involved a preliminary design of a model for estimating and predicting the quantity and composition of solid waste and a determination of its feasibility. The novelty of the prediction model is that it estimates and predicts on the basis of knowledge of materials and quantities before they become a part of the…

  15. Climate Modeling and Prediction at NSIPP

    NASA Technical Reports Server (NTRS)

    Suarez, Max; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The talk will review modeling and prediction efforts undertaken as part of NASA's Seasonal to Interannual Prediction Project (NSIPP). The focus will be on atmospheric model results, including its use for experimental seasonal prediction and the diagnostic analysis of climate anomalies. The model's performance in coupled experiments with land and atmosphere models will also be discussed.

  16. ENSO predictability

    NASA Astrophysics Data System (ADS)

    Larson, Sarah Michelle

    The overarching goal of this work is to explore seasonal El Nino -- Southern Oscillation (ENSO) predictability. More specifically, this work investigates how intrinsic variability affects ENSO predictability using a state-of-the-art climate model. Topics related to the effects of systematic model errors and external forcing are not included in this study. Intrinsic variability encompasses a hierarchy of temporal and spatial scales, from high frequency small-scale noise-driven processes including coupled instabilities to low frequency large-scale deterministic climate modes. The former exemplifies what can be considered intrinsic "noise" in the climate system that hinders predictability by promoting rapid error growth whereas the latter often provides the slow thermal ocean inertia that supplies the coupled ENSO system with predictability. These two ends of the spectrum essentially provide the lower and upper bounds of ENSO predictability that can be attributed to internal variability. The effects of noise-driven coupled instabilities on sea surface temperature (SST) predictability in the ENSO region is quantified by utilizing a novel coupled model methodology paired with an ensemble approach. The experimental design allows for rapid growth of intrinsic perturbations that are not prescribed. Several cases exhibit sufficiently rapid growth to produce ENSO-like final states that do not require a previous ENSO event, large-scale wind trigger, or subsurface heat content precursor. Results challenge conventional ENSO theory that considers the subsurface precursor as a necessary condition for ENSO. Noise-driven SST error growth exhibits strong seasonality and dependence on the initialization month. A dynamical analysis reveals that much of the error growth behavior is linked to the seasonal strength of the Bjerknes feedback in the model, indicating that the noise-induced perturbations grow via an ENSO-like mechanism. The daily error fields reveal that persistent

  17. Dropout Prediction.

    ERIC Educational Resources Information Center

    Curtis, Jonathan; And Others

    Secondary school students who drop out of school are put at great social and economic disadvantage. If potential dropouts can be identified early, prevention may be possible. To construct a prediction model which, through readily available school information, will aid in the identification of students likely to drop out, schools in the Austin,…

  18. Projects Work!

    ERIC Educational Resources Information Center

    Textor, Martin R.

    2005-01-01

    The great educational value of projects is emphasized by contrasting negative aspects of the life of today's children with the goals of project work. This is illustrated by a project "Shopping." It is shown what children are learning in such projects and what the advantages of project work are. Relevant topic areas, criteria for selecting a…

  19. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571

  20. Probabilistic population projections with migration uncertainty.

    PubMed

    Azose, Jonathan J; Ševčíková, Hana; Raftery, Adrian E

    2016-06-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
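    The widening of prediction intervals described above follows from treating migration as an additional independent uncertainty source. A toy Monte Carlo sketch of that effect (a one-step Gaussian model of my own construction, not the UN or authors' methodology):

    ```python
    import random

    random.seed(0)

    def project_population(n_sims, base_pop, growth_sd, migration_sd):
        """One-step stochastic projection: natural growth and net migration
        drawn as independent Gaussian rates. Purely illustrative."""
        results = []
        for _ in range(n_sims):
            growth = random.gauss(0.01, growth_sd)       # natural growth rate
            migration = random.gauss(0.0, migration_sd)  # net migration rate
            results.append(base_pop * (1 + growth + migration))
        return sorted(results)

    def interval_width(samples, lo=0.025, hi=0.975):
        """Width of the central 95% prediction interval."""
        n = len(samples)
        return samples[int(hi * n)] - samples[int(lo * n)]

    deterministic = project_population(10_000, 100.0, 0.005, 0.0)    # migration fixed
    probabilistic = project_population(10_000, 100.0, 0.005, 0.005)  # migration uncertain

    # Adding an independent uncertainty source widens the interval:
    print(interval_width(probabilistic) > interval_width(deterministic))  # True
    ```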

  1. 1986-87 atomic mass predictions

    SciTech Connect

    Haustein, P.E.

    1987-12-10

    A project to perform a comprehensive update of the atomic mass predictions has recently been concluded and will be published shortly in Atomic Data and Nuclear Data Tables. The project evolved from an ongoing comparison between available mass predictions and reports of newly measured masses of isotopes throughout the mass surface. These comparisons have highlighted a variety of features in current mass models which are responsible for predictions that diverge from masses determined experimentally. The need for a comprehensive update of the atomic mass predictions was therefore apparent and the project was organized and began at the last mass conference (AMCO-VII). Project participants included: Pape and Anthony; Dussel, Caurier and Zuker; Moeller and Nix; Moeller, Myers, Swiatecki and Treiner; Comay, Kelson, and Zidon; Satpathy and Nayak; Tachibana, Uno, Yamada and Yamada; Spanier and Johansson; Jaenecke and Masson; and Wapstra, Audi and Hoekstra. An overview of the new atomic mass predictions may be obtained by written request.

  2. VIPER project

    NASA Technical Reports Server (NTRS)

    Kershaw, John

    1990-01-01

    The VIPER project has so far produced a formal specification of a 32 bit RISC microprocessor, an implementation of that chip in radiation-hard SOS technology, a partial proof of correctness of the implementation which is still being extended, and a large body of supporting software. The time has now come to consider what has been achieved and what directions should be pursued in the future. The most obvious lesson from the VIPER project was the time and effort needed to use formal methods properly. Most of the problems arose in the interfaces between different formalisms, e.g., between the (informal) English description and the HOL spec, between the block-level spec in HOL and the equivalent in ELLA needed by the low-level CAD tools. These interfaces need to be made rigorous or (better) eliminated. VIPER 1A (the latest chip) is designed to operate in pairs, to give protection against breakdowns in service as well as design faults. We have come to regard redundancy and formal design methods as complementary, the one to guard against normal component failures and the other to provide insurance against the risk of the common-cause failures which bedevil reliability predictions. Any future VIPER chips will certainly need improved performance to keep up with increasingly demanding applications. We have a prototype design (not yet specified formally) which includes 32 and 64 bit multiply, instruction pre-fetch, more efficient interface timing, and a new instruction to allow a quick response to peripheral requests. Work is under way to specify this device in MIRANDA, and then to refine the spec into a block-level design by top-down transformations. When the refinement is complete, a relatively simple proof checker should be able to demonstrate its correctness. This paper is presented in viewgraph form.

  3. Evaluating the Predictive Value of Growth Prediction Models

    ERIC Educational Resources Information Center

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  4. Stock Market Project.

    ERIC Educational Resources Information Center

    Distel, Brenda D.

    This project is designed to teach students the process of buying stocks and to tracking their investments over the course of a semester. The goals of the course are to teach students about the relationships between conditions in the economy and the stock market; to predict the effect of an economic event on a specific stock or industry; to relate…

  5. Shop Projects.

    ERIC Educational Resources Information Center

    Patton, Bob

    Vocational agriculture teachers in Oklahoma prepared the shop project drawings which comprise the document. Seventy-one projects, with lists of required materials, diagrams, and measurements, are included. Construction projects fall into six categories (number of projects in parentheses): Trailers (5), racks (3), livestock production projects…

  6. Final Technical Report: Increasing Prediction Accuracy.

    SciTech Connect

    King, Bruce Hardison; Hansen, Clifford; Stein, Joshua

    2015-12-01

    PV performance models are used to quantify the value of PV plants in a given location. They combine the performance characteristics of the system, the measured or predicted irradiance and weather at a site, and the system configuration and design into a prediction of the amount of energy that will be produced by a PV system. These predictions must be as accurate as possible in order for finance charges to be minimized. Higher accuracy equals lower project risk. The Increasing Prediction Accuracy project at Sandia focuses on quantifying and reducing uncertainties in PV system performance models.

  7. Word prediction

    SciTech Connect

    Rumelhart, D.E.; Skokowski, P.G.; Martin, B.O.

    1995-05-01

    In this project we have developed a language model based on Artificial Neural Networks (ANNs) for use in conjunction with automatic textual search or speech recognition systems. The model can be trained on large corpora of text to produce probability estimates that would improve the ability of systems to identify words in a sentence given partial contextual information. The model uses a gradient-descent learning procedure to develop a metric of similarity among terms in a corpus, based on context. Using lexical categories based on this metric, a network can then be trained to do serial word probability estimation. Such a metric can also be used to improve the performance of topic-based search by allowing retrieval of information that is related to desired topics even if no obvious set of key words unites all the retrieved items.

  8. The contoured auricular projection graft for nasal tip projection.

    PubMed

    Porter, J P; Tardy, M E; Cheng, J

    1999-01-01

    In all rhinoplasty surgery, the universal need exists to increase, decrease, or preserve existing tip projection. When proper tip projection is lacking, a variety of techniques are useful for improving projection. We describe a valuable technique for tip projection, particularly useful and indicated in Asian rhinoplasty, African American rhinoplasty, and certain revision rhinoplasties. In the past 15 years, the senior author (M.E.T.) has used the contoured auricular projection graft to achieve satisfactory tip projection in selected patients with blunted tips. The aesthetic outcomes have been predictable, pleasing, and reliable for the long term. Precision pocket preparation for auricular conchal cartilage graft placement is key to the symmetry and projection of the final outcome. The results yielded a rounded nasal tip that may be more natural-appearing in Asians, African Americans, and selected revision rhinoplasty patients. The contoured auricular projection graft provides a highly useful graft for the nasal tip. PMID:10937122

  9. Discussion of the design of satellite-laser measurement stations in the eastern Mediterranean under the geological aspect. Contribution to the earthquake prediction research by the Wegener Group and to NASA's Crustal Dynamics Project

    NASA Technical Reports Server (NTRS)

    Paluska, A.; Pavoni, N.

    1983-01-01

    Research conducted for determining the location of stations for measuring crustal dynamics and predicting earthquakes is discussed. Procedural aspects, the extraregional kinematic tendencies, and regional tectonic deformation mechanisms are described.

  10. A Course in... Model Predictive Control.

    ERIC Educational Resources Information Center

    Arkun, Yaman; And Others

    1988-01-01

    Describes a graduate engineering course which specializes in model predictive control. Lists course outline and scope. Discusses some specific topics and teaching methods. Suggests final projects for the students. (MVL)

  11. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  12. Initial Value Predictability of Intrinsic Oceanic Modes and Implications for Decadal Prediction over North America

    SciTech Connect

    Branstator, Grant

    2014-12-09

    The overall aim of our project was to quantify and characterize predictability of the climate as it pertains to decadal time scale predictions. By predictability we mean the degree to which a climate forecast can be distinguished from the climate that exists at the initial forecast time, taking into consideration the growth of uncertainty that occurs as a result of the climate system being chaotic. In our project we were especially interested in predictability that arises from initializing forecasts from some specific state, though we also contrast this predictability with predictability arising from forecasting the reaction of the system to external forcing, for example changes in greenhouse gas concentration. Also, we put special emphasis on the predictability of prominent intrinsic patterns of the system because they often dominate system behavior. Highlights from this work include: • Development of novel methods for estimating the predictability of climate forecast models. • Quantification of the initial value predictability limits of ocean heat content and the overturning circulation in the Atlantic as they are represented in various state-of-the-art climate models. These limits varied substantially from model to model but on average were about a decade, with North Atlantic heat content tending to be more predictable than North Pacific heat content. • Comparison of predictability resulting from knowledge of the current state of the climate system with predictability resulting from estimates of how the climate system will react to changes in greenhouse gas concentrations. It turned out that knowledge of the initial state produces a larger impact on forecasts for the first 5 to 10 years of projections. • Estimation of the predictability of dominant patterns of ocean variability, including well-known patterns of variability in the North Pacific and North Atlantic. For the most part these patterns were predictable for 5 to 10 years. • Determination of

  13. Solar prediction and intelligent machines

    NASA Technical Reports Server (NTRS)

    Johnson, Gordon G.

    1987-01-01

    The solar prediction program is aimed at reducing or eliminating the need to thoroughly understand the previously developed process while still being able to produce a prediction. Substantial progress was made in identifying the procedures to be coded as well as in testing some of the presently coded work. Another project involves work on developing ideas and software that should result in a machine capable of learning as well as carrying on an intelligent conversation over a wide range of topics. The underlying idea is to use primitive ideas and construct higher-order ideas from these, which can then be easily related one to another.

  14. Graduate Student Project: Operations Management Product Plan

    ERIC Educational Resources Information Center

    Fish, Lynn

    2007-01-01

    An operations management product project is an effective instructional technique that fills a void in current operations management literature in product planning. More than 94.1% of 286 graduates favored the project as a learning tool, and results demonstrate the significant impact the project had in predicting student performance. The author…

  15. A mathematical model for predicting the probability of acute mortality in a human population exposed to accidentally released airborne radionuclides. Final report for Phase I of the project: early effects of inhaled radionuclides

    SciTech Connect

    Filipy, R.E.; Borst, F.J.; Cross, F.T.; Park, J.F.; Moss, O.R.

    1980-06-01

    The report presents a mathematical model for the purpose of predicting the fraction of human population which would die within 1 year of an accidental exposure to airborne radionuclides. The model is based on data from laboratory experiments with rats, dogs and baboons, and from human epidemiological data. Doses from external, whole-body irradiation and from inhaled, alpha- and beta-emitting radionuclides are calculated for several organs. The probabilities of death from radiation pneumonitis and from bone marrow irradiation are predicted from doses accumulated within 30 days of exposure to the radioactive aerosol. The model is compared with existing similar models under hypothetical exposure conditions. Suggestions for further experiments with inhaled radionuclides are included.

  16. Final project report

    SciTech Connect

    Nitin S. Baliga and Leroy Hood

    2008-11-12

    The proposed overarching goal for this project was the following: data integration, simulation, and visualization will facilitate metabolic and regulatory network prediction, exploration, and formulation of hypotheses. We stated three specific aims to achieve the overarching goal of this project: (1) Integration of multiple levels of information such as mRNA and protein levels, predicted protein-protein interactions/associations, and gene function will enable construction of models describing environmental response and dynamic behavior. (2) Flexible tools for network inference will accelerate our understanding of biological systems. (3) Flexible exploration and queries of model hypotheses will provide focus and reveal novel dependencies. The underlying philosophy of these aims is that an iterative cycle of experiments, experimental design, and verification will lead to a comprehensive and predictive model that will shed light on systems-level mechanisms involved in responses elicited by living systems upon sensing a change in their environment. In the previous year's report we demonstrated considerable progress in development of data standards, regulatory network inference, and data visualization and exploration. We are pleased to report that several manuscripts describing these procedures have been published in top international peer-reviewed journals, including Genome Biology, PNAS, and Cell. The abstracts of these manuscripts are given, and they summarize our accomplishments in this project.

  17. A Game Theoretic Approach to Cyber Attack Prediction

    SciTech Connect

    Peng Liu

    2005-11-28

    The area investigated by this project is cyber attack prediction. With a focus on correlation-based prediction, current attack prediction methodologies overlook the strategic nature of cyber attack-defense scenarios. As a result, current cyber attack prediction methodologies are very limited in predicting strategic behaviors of attackers in enforcing nontrivial cyber attacks such as DDoS attacks, and may result in low accuracy in correlation-based predictions. This project develops a game theoretic framework for cyber attack prediction, where an automatic game-theory-based attack prediction method is proposed. Being able to quantitatively predict the likelihood of (sequences of) attack actions, our attack prediction methodology can predict fine-grained strategic behaviors of attackers and may greatly improve the accuracy of correlation-based prediction. To the best of our knowledge, this project develops the first comprehensive framework for incentive-based modeling and inference of attack intent, objectives, and strategies, and the first method that can predict fine-grained strategic behaviors of attackers. The significance of this research and its benefit to the public are demonstrated to a certain extent by (a) the severe threat of cyber attacks to the critical infrastructures of the nation, including many infrastructures overseen by the Department of Energy, (b) the importance of cyber security to critical infrastructure protection, and (c) the importance of cyber attack prediction to achieving cyber security.

  18. Project Success.

    ERIC Educational Resources Information Center

    Meredith, Larry D.

    Project Success consists of after-school, weekend, and summer educational programs geared toward minority and disadvantaged students to increase their numbers seeking postsecondary education from the Meadville, Pennsylvania area. The project is funded primarily through the Edinboro University of Pennsylvania, whose administration is committed to…

  19. Project SEED.

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1986

    1986-01-01

    Reports on Project SEED (Summer Educational Experience for the Disadvantaged) a project in which high school students from low-income families work in summer jobs in a variety of academic, industrial, and government research labs. The program introduces the students to career possibilities in chemistry and to the advantages of higher education.…

  20. Project EASIER.

    ERIC Educational Resources Information Center

    Alvord, David J.; Tack, Leland R.; Dallam, Jerald W.

    1998-01-01

    Describes the development of Project EASIER, a collaborative electronic-data interchange for networking Iowa local school districts, education agencies, community colleges, universities, and the Department of Education. The primary goal of this project is to develop and implement a system for collection of student information for state and federal…

  1. Project FAST.

    ERIC Educational Resources Information Center

    Essexville-Hampton Public Schools, MI.

    Described are components of Project FAST (Functional Analysis Systems Training), a nationally validated project to provide more effective educational and support services to learning disordered children and their regular elementary classroom teachers. The program is seen to be based on a series of modules of delivery systems ranging from mainstream…

  2. Multimodel Decadal Predictability of the Subpolar Gyre

    NASA Astrophysics Data System (ADS)

    Wouters, B.; Hazeleger, W.; van Oldenborgh, G. J.; Drijfhout, S.

    2012-04-01

    Multimodel decadal predictions made within the THOR project are presented. The THOR project focuses on the AMOC. The ocean analyses show that the AMOC may have increased slightly up to the 1990s, after which a reduction took place associated with a reduction of Labrador Sea Water formation. However, the AMOC is not directly observed, hence the focus will shift to observed ocean phenomena. These include the Atlantic Multidecadal Variability, the interhemispheric dipole, Labrador Sea Water formation, and the Great Salinity Anomaly. It is shown that the interhemispheric dipole and the Atlantic Multidecadal Variability are predictable up to 9 years ahead. The upper ocean heat content is even more predictable. It appears to be hard to predict Labrador Sea Water formation and the propagation of salinity anomalies in the subpolar gyre. Finally, the predictability partly originates from the external forcing by changing greenhouse gas concentrations and aerosols.

  3. Hydrologic Ensemble Prediction: Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Schaake, J.; Bradley, A.

    2005-12-01

    Ensemble forecast techniques are beginning to be used for hydrological prediction by operational hydrological services throughout the world. These techniques are attractive because they allow the effects of a wide range of sources of uncertainty on hydrological forecasts to be accounted for. Not only does ensemble prediction in hydrology offer a general approach to probabilistic prediction, it offers a significant new approach to improving hydrological forecast accuracy as well. But there are many scientific challenges that must be overcome to provide users with high-quality hydrologic ensemble forecasts. A new international project, the Hydrologic Ensemble Prediction Experiment (HEPEX), was started last year to organize the scientific community to meet these challenges. Its main objective is to bring the international hydrological community together with the meteorological community to demonstrate how to produce reliable hydrological ensemble forecasts for decisions, for the benefit of public health and safety, the economy, and the environment. Topics that will be addressed by the HEPEX scientific community include techniques for using weather and climate information in hydrologic prediction systems, new methods in hydrologic prediction, data assimilation issues in hydrology and hydrometeorology, verification and correction of ensemble weather and hydrologic forecasts, and better quantification of uncertainty in hydrological prediction. As a pathway for addressing these topics, HEPEX will set up demonstration test bed projects and compile data sets for the intercomparison of coupled systems for atmospheric and hydrologic forecasting, and their assessment for meeting end users' needs for decision-making. Test bed projects have been proposed in North and South America, Europe, and Asia, and have a focus ranging from short-range flood forecasting to seasonal predictions for water supply.
For example, within the United States, ongoing activities in seasonal prediction as part of the GEWEX

  4. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. Part 2: Theoretical development of a dynamic model and application to rain fade durations and tolerable control delays for fade countermeasures

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1987-01-01

    A dynamic rain attenuation prediction model is developed for use in obtaining the temporal characteristics, on time scales of minutes or hours, of satellite communication link availability. Analogous to the associated static rain attenuation model, which yields yearly attenuation predictions, this dynamic model is applicable at any location in the world that is characterized by the static rain attenuation statistics peculiar to the geometry of the satellite link and the rain statistics of the location. Such statistics are calculated by employing the formalism of Part I of this report. In fact, the dynamic model presented here is an extension of the static model and reduces to the static model in the appropriate limit. By assuming that rain attenuation is dynamically described by a first-order stochastic differential equation in time and that this random attenuation process is a Markov process, an expression for the associated transition probability is obtained by solving the related forward Kolmogorov equation. This transition probability is then used to obtain such temporal rain attenuation statistics as attenuation durations and allowable attenuation margins versus control system delay.
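
The structure of such a model can be written in generic form (a sketch only; the specific drift and diffusion terms chosen in the report are not reproduced here). If the attenuation A(t) obeys a first-order stochastic differential equation and is Markov, its transition probability satisfies the forward Kolmogorov (Fokker-Planck) equation:

```latex
% Generic first-order stochastic model for rain attenuation A(t),
% with W_t a Wiener process (mu and sigma are placeholder drift and
% diffusion functions, not the report's fitted forms):
dA = \mu(A)\,dt + \sigma(A)\,dW_t
% The transition probability p(A, t \mid A_0, t_0) then satisfies
% the forward Kolmogorov equation:
\frac{\partial p}{\partial t}
  = -\frac{\partial}{\partial A}\bigl[\mu(A)\,p\bigr]
  + \frac{1}{2}\,\frac{\partial^2}{\partial A^2}\bigl[\sigma^2(A)\,p\bigr]
```

Fade-duration statistics and allowable margins versus control delay then follow by integrating the transition probability over attenuation thresholds and time.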

  5. Fracture Toughness Prediction for MWCNT Reinforced Ceramics

    SciTech Connect

    Henager, Charles H.; Nguyen, Ba Nghiep

    2013-09-01

    This report describes the development of a micromechanics model to predict fracture toughness of multiwall carbon nanotube (MWCNT) reinforced ceramic composites to guide future experimental work for this project. The modeling work described in this report includes (i) prediction of elastic properties, (ii) development of a mechanistic damage model accounting for matrix cracking to predict the composite nonlinear stress/strain response to tensile loading to failure, and (iii) application of this damage model in a modified boundary layer (MBL) analysis using ABAQUS to predict fracture toughness and crack resistance behavior (R-curves) for ceramic materials containing MWCNTs at various volume fractions.

  6. Practical lessons from protein structure prediction

    PubMed Central

    Ginalski, Krzysztof; Grishin, Nick V.; Godzik, Adam; Rychlewski, Leszek

    2005-01-01

    Despite recent efforts to develop automated protein structure determination protocols, structural genomics projects are slow in generating fold assignments for complete proteomes, and spatial structures remain unknown for many protein families. Alternative cheap and fast methods to assign folds using prediction algorithms continue to provide valuable structural information for many proteins. The development of high-quality prediction methods has been boosted in recent years by objective community-wide assessment experiments. This paper gives an overview of the currently available practical approaches to protein structure prediction capable of generating accurate fold assignments. Recent advances in assessment of prediction quality are also discussed. PMID:15805122

  7. Predicting fish population response to instream flows

    SciTech Connect

    Studley, T.K.; Baldridge, J.E.; Railsback, S.F.

    1996-10-01

    A cooperative research program initiated by Pacific Gas and Electric is described. The goals of the project are to determine whether trout populations respond to changes in base streamflows in a predictable manner, and to evaluate and improve the methods used to predict rainbow and brown trout population responses under altered flow regimes. Predictive methods based on computer models of the Physical Habitat Simulation System are described, and predictions generated for four diversions and creeks are tabulated. Baseline data indicate that instream flow assessments can be improved by using guild criteria in streams with competing species and by including additional limiting factors (low recruitment, high winter flow, and high stream temperatures) in the analyses.

  8. Making detailed predictions makes (some) predictions worse

    NASA Astrophysics Data System (ADS)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  9. Geodynamics Project

    ERIC Educational Resources Information Center

    Drake, Charles L.

    1977-01-01

    Describes activities of Geodynamics Project of the Federal Council on Science and Technology, such as the application of multichannel seismic-reflection techniques to study the nature of the deep crust and upper mantle. (MLH)

  10. Project Soar.

    ERIC Educational Resources Information Center

    Austin, Marion

    1982-01-01

    Project Soar, a Saturday enrichment program for gifted students (6-14 years old), allows students to work intensively in a single area of interest. Examples are cited of students' work in crewel embroidery, creative writing, and biochemistry. (CL)

  11. Project Reptile!

    ERIC Educational Resources Information Center

    Diffily, Deborah

    2001-01-01

    Integrating curriculum is important in helping children make connections within and among areas. Presents a class project for kindergarten children which came out of the students' interests and desire to build a reptile exhibit. (ASK)

  12. Predicting the course of disease.

    PubMed

    Krakauer, H; Jacoby, I

    1993-01-01

    The ability to predict the course of disease and the effect of interventions is critical to effective medical practice and health care management. In this analysis, we sought to test whether available clinical data and analytic methodologies can be used to accurately predict the time course of the probability of death after hospital admission and the probability of readmission following discharge for patients with acute myocardial infarction or pulmonary disease. We grouped patients by selected physiologic characteristics and made time-to-event predictions using multiple regression models. These predictions were compared with observed probabilities calculated using the actuarial or life-table method. Predictions made with the Bailey-Makeham model generally replicated observed experience. They accurately accounted for substantial differences in the patterns of death and readmission and accurately delineated the effects of therapies, after adjustment for patient risk. These results were validated by analyses of resampled populations that differed in case mix from the source population. We believe that using such models to project the course of disease and the effects of treatment on that course in defined classes of patients should facilitate the development of practice guidelines for patient care and the management of health care resources. PMID:8314601
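
The observed probabilities against which the model's predictions were compared come from the actuarial (life-table) method, which can be sketched as follows. The interval width, horizon, and toy cohort below are illustrative assumptions; the Bailey-Makeham regression model itself is not reproduced.

```python
# Actuarial (life-table) survival estimate: follow-up time is divided into
# fixed intervals, and subjects censored during an interval count as half
# at risk (the standard actuarial correction). Data here are illustrative,
# not the study cohort.

def life_table(times, events, width, horizon):
    """times: follow-up durations; events: 1 = death, 0 = censored.
    Returns (interval end, cumulative survival) pairs."""
    out, s, t = [], 1.0, 0.0
    while t < horizon:
        n = sum(1 for ti in times if ti >= t)               # entering interval
        d = sum(1 for ti, ei in zip(times, events)
                if t <= ti < t + width and ei == 1)         # deaths in interval
        c = sum(1 for ti, ei in zip(times, events)
                if t <= ti < t + width and ei == 0)         # censored in interval
        n_eff = n - c / 2.0                                 # actuarial correction
        s *= 1.0 - (d / n_eff if n_eff > 0 else 0.0)
        out.append((t + width, s))
        t += width
    return out

# Toy cohort: death at day 10, censoring at 40, death at 70, censoring at 100.
curve = life_table([10, 40, 70, 100], [1, 0, 1, 0], width=30, horizon=120)
print(curve[-1])   # (120, 0.375)
```

A parametric time-to-event model such as Bailey-Makeham is then judged by how closely its predicted survival curve tracks life-table curves like this one, computed per risk class.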

  13. Downstream prediction using a nonlinear prediction method

    NASA Astrophysics Data System (ADS)

    Adenan, N. H.; Noorani, M. S. M.

    2013-11-01

    The estimation of river flow is significantly related to the impact of urban hydrology, as this could provide information to solve important problems, such as flooding downstream. The nonlinear prediction method has been employed for analysis of four years of daily river flow data for the Langat River at Kajang, Malaysia, which is located in a downstream area. The nonlinear prediction method involves two steps; namely, the reconstruction of phase space and prediction. The reconstruction of phase space involves reconstruction from a single variable to the m-dimensional phase space in which the dimension m is based on optimal values from two methods: the correlation dimension method (Model I) and false nearest neighbour(s) (Model II). The selection of an appropriate method for selecting a combination of preliminary parameters, such as m, is important to provide an accurate prediction. From our investigation, we gather that via manipulation of the appropriate parameters for the reconstruction of the phase space, Model II provides better prediction results. In particular, we have used Model II together with the local linear prediction method to achieve the prediction results for the downstream area with a high correlation coefficient. In summary, the results show that Langat River in Kajang is chaotic, and, therefore, predictable using the nonlinear prediction method. Thus, the analysis and prediction of river flow in this area can provide river flow information to the proper authorities for the construction of flood control, particularly for the downstream area.
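
The two steps above, reconstruction of the phase space followed by local prediction from near neighbours, can be sketched as below. The embedding dimension m, delay tau, neighbour count, and the sine test series are illustrative assumptions, not the Langat River data or the parameters selected by the correlation dimension or false-nearest-neighbour methods.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Reconstruct m-dimensional delay vectors from a scalar series x."""
    n = len(x) - (m - 1) * tau
    return np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(n)])

def nn_predict(x, m=3, tau=1, k=4):
    """Predict the next value of x by averaging the successors of the k
    delay vectors closest to the most recent one (zeroth-order local scheme)."""
    emb = delay_embed(x, m, tau)
    query, history = emb[-1], emb[:-1]          # every history vector has a successor
    d = np.linalg.norm(history - query, axis=1)
    nearest = np.argsort(d)[:k]
    successors = x[nearest + (m - 1) * tau + 1]  # value following each neighbour
    return successors.mean()

# Usage: on a noiseless periodic series the method recovers the next value closely.
t = np.arange(200)
x = np.sin(0.2 * t)
print(abs(nn_predict(x) - np.sin(0.2 * 200)))   # small prediction error
```

The local linear variant used in the paper fits a linear map from the neighbours to their successors instead of taking a plain average, which typically sharpens the forecast for chaotic series.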

  14. Projection displays

    NASA Astrophysics Data System (ADS)

    Chiu, George L.; Yang, Kei H.

    1998-08-01

    Projection display in today's market is dominated by cathode ray tubes (CRTs). Further progress in this mature CRT projector technology will be slow and evolutionary. Liquid crystal based projection displays have gained rapid acceptance in the business market. New technologies are being developed on several fronts: (1) active matrices built from polysilicon or single crystal silicon; (2) electro-optic materials using ferroelectric liquid crystals, polymer dispersed liquid crystals, or other liquid crystal modes; (3) micromechanical transducers such as digital micromirror devices and grating light valves; (4) high resolution displays to SXGA and beyond; and (5) high brightness. This article reviews projection displays from a transducer technology perspective, along with a discussion of markets and trends.

  15. The 1986-87 atomic mass predictions

    NASA Astrophysics Data System (ADS)

    Haustein, P. E.

    1987-12-01

    A project to perform a comprehensive update of the atomic mass predictions has recently been concluded and will be published shortly in Atomic Data and Nuclear Data Tables. The project evolved from an ongoing comparison between available mass predictions and reports of newly measured masses of isotopes throughout the mass surface. These comparisons have highlighted a variety of features in current mass models which are responsible for predictions that diverge from masses determined experimentally. The need for a comprehensive update of the atomic mass predictions was therefore apparent and the project was organized and began at the last mass conference (AMCO-VII). Project participants included: Pape and Anthony; Dussel, Caurier and Zuker; Möller and Nix; Möller, Myers, Swiatecki and Treiner; Comay, Kelson, and Zidon; Satpathy and Nayak; Tachibana, Uno, Yamada and Yamada; Spanier and Johansson; Jänecke and Masson; and Wapstra, Audi and Hoekstra. An overview of the new atomic mass predictions may be obtained by written request.

  16. Optimal Prediction of Clocks from Finite Data

    NASA Technical Reports Server (NTRS)

    Greenhall, Charles A.

    2005-01-01

    This talk is about optimal linear prediction of processes with stationary dth increments, which serve as a class of models for random clock disturbances. The predictor is obtained by orthogonal projection on the affine space of estimators whose errors are invariant to additive polynomials of degree < d. The projection conditions give a system of linear equations that can be solved straightforwardly for the regression coefficients. If the data are equally spaced, then the predictor can be obtained by an extension of Levinson's algorithm.
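
For equally spaced data, the classic (unextended) Levinson-Durbin recursion solves the Toeplitz normal equations that arise from the projection conditions. The sketch below shows only this standard stationary case, not the paper's extension to dth-increment clock models; the AR(1) test signal and its coefficient are illustrative assumptions.

```python
import numpy as np

def levinson(r, order):
    """Levinson-Durbin recursion: solve the Toeplitz normal equations
    for the prediction coefficients a, given autocovariances r[0..order]."""
    a = np.zeros(order)
    err = r[0]                               # zeroth-order prediction error
    for k in range(order):
        acc = r[k + 1] - a[:k] @ r[1:k + 1][::-1]
        ref = acc / err                      # reflection coefficient
        a[:k] = a[:k] - ref * a[:k][::-1]    # update lower-order coefficients
        a[k] = ref
        err *= (1.0 - ref ** 2)              # updated prediction error
    return a

# Usage on a synthetic AR(1) process x_t = 0.8 x_{t-1} + noise:
rng = np.random.default_rng(1)
x = np.zeros(5000)
for i in range(1, len(x)):
    x[i] = 0.8 * x[i - 1] + rng.normal()
r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(3)])
a = levinson(r, 2)                           # recovers roughly [0.8, 0.0]
pred = a @ x[-1:-3:-1]                       # one-step predictor from last two samples
```

The recursion costs O(order^2) rather than the O(order^3) of a general linear solve, which is why Levinson-type algorithms are attractive for long prediction filters.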

  17. Cloudnet Project

    DOE Data Explorer

    Hogan, Robin

    2008-01-15

    Cloudnet is a research project supported by the European Commission. This project aims to use data obtained quasi-continuously for the development and implementation of cloud remote sensing synergy algorithms. The use of active instruments (lidar and radar) results in detailed vertical profiles of important cloud parameters which cannot be derived from current satellite sensing techniques. A network of three already existing cloud remote sensing stations (CRS stations) will be operated for a two-year period, activities will be co-ordinated, data formats harmonised, and analysis of the data performed to evaluate the representation of clouds in four major European weather forecast models.

  18. LLAMA Project

    NASA Astrophysics Data System (ADS)

    Arnal, E. M.; Abraham, Z.; Giménez de Castro, G.; de Gouveia dal Pino, E. M.; Larrarte, J. J.; Lepine, J.; Morras, R.; Viramonte, J.

    2014-10-01

    The project LLAMA, acronym of Long Latin American Millimetre Array, is very briefly described in this paper. This project is a joint scientific and technological undertaking of Argentina and Brazil on the basis of an equal investment share, whose main goal is to install and operate an observing facility capable of exploring the Universe at millimetre and sub-millimetre wavelengths. This facility will be erected in the Argentinean province of Salta, at a site located 4830 m above sea level.

  19. Maximum Capital Project Management.

    ERIC Educational Resources Information Center

    Adams, Matt

    2002-01-01

    Describes the stages of capital project planning and development: (1) individual capital project submission; (2) capital project proposal assessment; (3) executive committee; and (4) capital project execution. (EV)

  20. NDLGS project update

    SciTech Connect

    Lienert, Thomas J; Sutton, Jacob O; Piltch, Martin S; Lujan, Dennis J

    2011-01-14

    Recent results for laser and ESD processing for the NDLGS project will be reviewed. Conclusions are: (1) Short mix passes have a profound effect on window T; (2) Multiple drill and re-weld at a single location has been shown to be feasible and successful; (3) A Kapton beam profiling method has been successfully developed, and comparison of 100 mm and 120 mm lenses gives reasonable and consistent results; (4) Manifold pumpdown data has been presented; (5) ESO results can be accurately predicted once a repeatable efficiency has been established; and (6) The electrode-workpiece geometry may play an important role in ESO efficiency. Experiments are planned to investigate these effects.

  1. Predicting evolutionary dynamics

    NASA Astrophysics Data System (ADS)

    Balazsi, Gabor

    We developed an ordinary differential equation-based model to predict the evolutionary dynamics of yeast cells carrying a synthetic gene circuit. The predicted aspects included the speed at which the ancestral genotype disappears from the population, as well as the types of mutant alleles that become established in each environmental condition. We validated these predictions by experimental evolution. The agreement between our predictions and experimental findings suggests that cellular and population fitness landscapes can be useful for predicting short-term evolution.
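
    The kind of prediction described above can be sketched, in highly simplified form, with a one-equation replicator ODE for a beneficial mutant sweeping through a population. This is an illustrative stand-in, not the authors' fitted model; the selection coefficient and frequencies are hypothetical:

```python
# Illustrative sketch only (not the authors' ODE system): logistic replicator
# dynamics dx/dt = s*x*(1-x) for the frequency x of a beneficial mutant; the
# selection coefficient s and frequencies below are hypothetical values.
def ancestor_loss_time(s=0.1, x0=0.01, threshold=0.99, dt=0.01, t_max=1e4):
    """Forward-Euler integration returning the time at which the mutant
    frequency first exceeds `threshold`, i.e. the ancestral genotype has
    effectively disappeared from the population."""
    x, t = x0, 0.0
    while x < threshold and t < t_max:
        x += dt * s * x * (1.0 - x)
        t += dt
    return t

# Closed form for comparison: t* = (1/s) * ln(th/(1-th) * (1-x0)/x0),
# about 91.9 time units for the defaults above.
```

    The closed-form crossing time makes the qualitative point of the abstract concrete: given a fitness landscape (here just s), the disappearance time of the ancestor is predictable.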

  2. Project CLASS.

    ERIC Educational Resources Information Center

    McBain, Susan L.; And Others

    Project CLASS (Competency-Based Live-Ability Skills) uses a series of 60 modules to teach life survival skills to adults with low-level reading ability--especially Adult Basic Education/English as a Second Language students. Two versions of the modules have been developed: one for use with teacher-directed instruction and another for independent…

  3. Limnological Projects.

    ERIC Educational Resources Information Center

    Hambler, David J.; Dixon, Jean M.

    1982-01-01

    Describes collection of quantitative samples of microorganisms and accumulation of physical data from a pond over a year. Provides examples of how final-year degree students have used materials and data for ecological projects (involving mainly algae), including their results/conclusions. Also describes apparatus and reagents used in the student…

  4. Project Schoolflight

    ERIC Educational Resources Information Center

    Owen, Ben

    1975-01-01

    Describes "Project School Flight," an idea originated by the Experimental Aircraft Association to provide the opportunity for young people to construct a light aircraft in the schools as part of a normal class. Address of the Experimental Aircraft Association included for interested persons. (BR)

  5. Project Boomerang

    ERIC Educational Resources Information Center

    King, Allen L.

    1975-01-01

    Describes an experimental project on boomerangs designed for an undergraduate course in classical mechanics. The students designed and made their own boomerangs, devised their own procedures, and carried out suitable measurements. Presents some of their data and a simple analysis for the two-bladed boomerang. (Author/MLH)

  6. Project COLD.

    ERIC Educational Resources Information Center

    Kazanjian, Wendy C.

    1982-01-01

    Describes Project COLD (Climate, Ocean, Land, Discovery), a scientific study of the Polar Regions comprising a collection of 35 modules used within the framework of existing subjects: oceanography, biology, geology, meteorology, geography, social science. Includes a partial list of topics and one activity (geodesic dome) from a module. (Author/SK)

  7. Project Documerica

    ERIC Educational Resources Information Center

    Journal of College Science Teaching, 1972

    1972-01-01

    The Environmental Protection Agency has started a project to actually picture the environmental movement in the United States. This is an attempt to make the public aware of the air pollution in their area or state and to acquaint them with the effects of air cleaning efforts. (PS)

  8. Tedese Project

    NASA Astrophysics Data System (ADS)

    Buforn, E.; Davila, J. Martin; Bock, G.; Pazos, A.; Udias, A.; Hanka, W.

    The TEDESE (Terremotos y Deformacion Cortical en el Sur de España) project is a joint project of the Universidad Complutense de Madrid (UCM) and the Real Instituto y Observatorio de la Armada de San Fernando, Cadiz (ROA), supported by the Spanish Ministerio de Ciencia y Tecnologia with the participation of the GeoForschungsZentrum, Potsdam (GFZ). The aim is to study the characteristics of the occurrence and mechanism of earthquakes, together with measurements of crustal structure and deformation, in order to obtain an integrated evaluation of seismic risk in southern Spain. As part of this project, a temporary network of 10 broad-band seismological stations, complementing those already existing in the zone, has been installed in southern Spain and northern Africa for one year beginning in October 2001. The objectives of the project are the detailed study of the focal mechanisms of earthquakes in this area, of structure in the crust and upper mantle, and of seismic anisotropy in the crust and mantle as an indicator of tectonic deformation processes, as well as measurements of crustal deformation using permanent GPS and SLR stations and temporary GPS surveys. From these studies, seismotectonic models and maps will be elaborated and the seismic risk in the zone will be evaluated.

  9. Project Reconstruct.

    ERIC Educational Resources Information Center

    Helisek, Harriet; Pratt, Donald

    1994-01-01

    Presents a project in which students monitor their use of trash, input and analyze information via a database and computerized graphs, and "reconstruct" extinct or endangered animals from recyclable materials. The activity was done with second-grade students over a period of three to four weeks. (PR)

  10. Projected Identities

    ERIC Educational Resources Information Center

    Anderson, Mark Alan

    2006-01-01

    This article presents the idea behind Projected Identities, an art activity wherein students fuse art-making processes and digital image manipulations in a series of exploratory artistic self-examinations. At some point in every person's life they've been told something hard to forget. Students might, for example, translate phrases like, "Good…

  11. Thanksgiving Project

    ERIC Educational Resources Information Center

    Hilden, Pauline

    1976-01-01

    A teacher describes a Thanksgiving project in which 40 educable mentally retarded students (6-13 years old) made and served their own dinner of stew, butter, bread, ice cream, and pie, and in the process learned about social studies, cooking, and proper meal behavior. (CL)

  12. PROJECT RESPECT

    EPA Science Inventory

    Project RESPECT was a national study evaluating the efficacy of HIV prevention counseling in changing high risk sexual behaviors and preventing new sexually transmitted diseases (STDs) and HIV. The trial enrolled men and women who came for diagnosis and treatment of an STD to one...

  13. Project Katrina

    ERIC Educational Resources Information Center

    Aghayan, Carol; Schellhaas, Andree; Wayne, Angela; Burts, Diane C.; Buchanan, Teresa K.; Benedict, Joan

    2005-01-01

    This article describes a spontaneous project that emerged from a group of 3- and 4-year-old children in Louisiana after Hurricane Katrina. The article describes how the teachers adapted the classroom and curriculum to meet the diverse needs of children who were evacuees, as well as those children who were affected in other ways by the…

  14. Project Notes

    ERIC Educational Resources Information Center

    School Science Review, 1977

    1977-01-01

    Listed and described are student A-level biology projects in the following areas: Angiosperm studies (e.g., factors affecting growth of various plants), 7; Bacterial studies, 1; Insect studies, 2; Fish studies, 1; Mammal studies, 1; Human studies, 1; Synecology studies, 2; Environmental studies, 2; and Enzyme studies, 1. (CS)

  15. Project Narrative

    SciTech Connect

    Driscoll, Mary C.

    2012-07-12

    The Project Narrative describes how the funds from the DOE grant were used to purchase equipment for the biology, chemistry, physics and mathematics departments. The Narrative also describes how the equipment is being used. There is also a list of the positive outcomes as a result of having the equipment that was purchased with the DOE grant.

  16. Project Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1979

    1979-01-01

    Listed are 32 biology A-level projects, categorized by organisms studied as follows: algae (1), bryophytes (1), angiosperms (14), fungi (1), flatworms (1), annelids (2), molluscs (1), crustaceans (2), insects (4), fish (2), mammals (1), humans (1); and one synecological study. (CS)

  17. Project Paiute

    ERIC Educational Resources Information Center

    Dearmin, Evalyn Titus

    1977-01-01

    Working with the Humboldt County School District, the Fort McDermitt Indian Education Committee, and four Paiute Teacher aides, the University of Nevada developed a three-component project: a bilingual/bicultural reading text for K-4 Paiutes; an in-service training program in Native American education; and a pilot bilingual curriculum. (JC)

  18. Project Succeed.

    ERIC Educational Resources Information Center

    Patterson, John

    Project Succeed is a program for helping failure- and dropout-oriented pupils to improve their school achievement. Attendance and assignment completion are the key behaviors for enhancing achievement. Behavior modification and communications procedures are used to bring about the desired changes. Treatment procedures include current assessment…

  19. Project SUCCEED.

    ERIC Educational Resources Information Center

    Yarger, Sam; Klingner, Janette

    This paper describes Project SUCCEED (School University Community Coalition for Excellence in Education). The coalition includes the University of Miami School of Education, the University of Miami College of Arts and Sciences, Miami-Dade County Public Schools, and the Miami Museum of Science. The goal is to provide a comprehensive approach to…

  20. Project CAST.

    ERIC Educational Resources Information Center

    Charles County Board of Education, La Plata, MD. Office of Special Education.

    The document outlines procedures for implementing Project CAST (Community and School Together), a community-based career education program for secondary special education students in Charles County, Maryland. Initial sections discuss the role of a learning coordinator, (including relevant travel reimbursement and mileage forms) and an overview of…

  1. Spent Nuclear Fuel project, project management plan

    SciTech Connect

    Fuquay, B.J.

    1995-10-25

    The Hanford Spent Nuclear Fuel Project has been established to safely store spent nuclear fuel at the Hanford Site. This Project Management Plan sets forth the management basis for the Spent Nuclear Fuel Project. The plan applies to all fabrication and construction projects, operation of the Spent Nuclear Fuel Project facilities, and necessary engineering and management functions within the scope of the project.

  2. Cognitive Education Project. Summary Project.

    ERIC Educational Resources Information Center

    Mulcahy, Robert; And Others

    The Cognitive Education Project conducted a 3-year longitudinal evaluation of two cognitive education programs that were aimed at teaching thinking skills. The critical difference between the two experimental programs was that one, Feuerstein's Instrumental Enrichment (IE) method, was taught out of curricular content, while the other, the…

  3. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions. PMID:27286683

  4. Project Prometheus

    NASA Technical Reports Server (NTRS)

    Johnson, Steve

    2003-01-01

    Project Prometheus will enable a new paradigm in the scientific exploration of the Solar System. The proposed JIMO mission will start a new generation of missions characterized by more maneuverability, flexibility, power and lifetime. Project Prometheus organization is established at NASA Headquarters: 1. Organization established to carry out development of JIMO, nuclear power (radioisotope), and nuclear propulsion research. 2. Completed broad technology and national capacity assessments to inform decision making on planning and technology development. 3. Awarded five NRAs for nuclear propulsion research. 4. Radioisotope power systems in development, and Plutonium-238 being purchased from Russia. 5. Formulated science-driven near-term and long-term plan for the safe utilization of nuclear propulsion based missions. 6. Completed preliminary studies (Pre-Phase A) of JIMO and other missions. 7. Initiated JIMO Phase A studies by Contractors and NASA.

  5. SIMBIOS Project

    NASA Technical Reports Server (NTRS)

    Fargion, Giulietta S.; McClain, Charles R.; Busalacchi, Antonio J. (Technical Monitor)

    2001-01-01

    The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities, NASA Research Announcement (NRA) research status, satellite data processing, data product validation, and field calibration. This documentation is necessary to ensure that critical information is relayed to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational project.

  6. Hydropower Projects

    SciTech Connect

    2015-04-02

    The Water Power Program helps industry harness this renewable, emissions-free resource to generate environmentally sustainable and cost-effective electricity. Through support for public, private, and nonprofit efforts, the Water Power Program promotes the development, demonstration, and deployment of advanced hydropower devices and pumped storage hydropower applications. These technologies help capture energy stored by diversionary structures, increase the efficiency of hydroelectric generation, and use excess grid energy to replenish storage reserves for use during periods of peak electricity demand. In addition, the Water Power Program works to assess the potential extractable energy from domestic water resources to assist industry and government in planning for our nation’s energy future. From FY 2008 to FY 2014, DOE’s Water Power Program announced awards totaling approximately $62.5 million to 33 projects focused on hydropower. Table 1 provides a brief description of these projects.

  7. Project MEDSAT

    NASA Astrophysics Data System (ADS)

    During the winter term of 1991, two design courses at the University of Michigan worked on a joint project, MEDSAT. The two design teams were Atmospheric, Oceanic, and Space Sciences 605 (AOSS 605) and Aerospace Engineering 483 (Aero 483) Aerospace System Design. In collaboration, they worked to produce MEDSAT, a satellite and scientific payload whose purpose was to monitor environmental conditions over Chiapas, Mexico. Information gained from the sensing, combined with regional data, would be used to determine the potential for malaria occurrence in that area. The responsibilities of AOSS 605 consisted of determining the remote sensing techniques, the data processing, and the method to translate the information into a usable output. Aero 483 developed the satellite configuration and the subsystems required for the satellite to accomplish its task. The MEDSAT project is an outgrowth of work already being accomplished by NASA's Biospheric and Disease Monitoring Program and Ames Research Center. NASA's work has been to develop remote sensing techniques to determine the abundance of disease carriers, and this project will now place those techniques aboard a satellite. MEDSAT will be unique in its use of both a Synthetic Aperture Radar and a visual/IR sensor to obtain comprehensive monitoring of the site. In order to create a highly feasible system, low cost was a high priority. To achieve this goal, a light satellite configuration launched by the Pegasus launch vehicle was used.

  8. Project MEDSAT

    NASA Technical Reports Server (NTRS)

    1991-01-01

    During the winter term of 1991, two design courses at the University of Michigan worked on a joint project, MEDSAT. The two design teams were Atmospheric, Oceanic, and Space Sciences 605 (AOSS 605) and Aerospace Engineering 483 (Aero 483) Aerospace System Design. In collaboration, they worked to produce MEDSAT, a satellite and scientific payload whose purpose was to monitor environmental conditions over Chiapas, Mexico. Information gained from the sensing, combined with regional data, would be used to determine the potential for malaria occurrence in that area. The responsibilities of AOSS 605 consisted of determining the remote sensing techniques, the data processing, and the method to translate the information into a usable output. Aero 483 developed the satellite configuration and the subsystems required for the satellite to accomplish its task. The MEDSAT project is an outgrowth of work already being accomplished by NASA's Biospheric and Disease Monitoring Program and Ames Research Center. NASA's work has been to develop remote sensing techniques to determine the abundance of disease carriers, and this project will now place those techniques aboard a satellite. MEDSAT will be unique in its use of both a Synthetic Aperture Radar and a visual/IR sensor to obtain comprehensive monitoring of the site. In order to create a highly feasible system, low cost was a high priority. To achieve this goal, a light satellite configuration launched by the Pegasus launch vehicle was used.

  9. Burnet Project

    PubMed Central

    Masellis, A.; Atiyeh, B.

    2009-01-01

    Summary The BurNet project, a pilot project of the Eumedis initiative, has become a reality. The Eumedis (EUro MEDiterranean Information Society) initiative is part of the MEDA programme of the EU to develop the Information Society in the Mediterranean area. In the health care sector, the objective of Eumedis is the deployment of network-based solutions to interconnect, using user-friendly and affordable solutions, the actors at all levels of the health care system of the Euro-Mediterranean region. The BurNet project interconnects 17 Burn Centres (BCs) in the Mediterranean area through an information network, both to standardize courses of action in the field of prevention, treatment, and functional and psychological rehabilitation of burn patients, and to coordinate interactions between BCs and emergency rooms in peripheral hospitals, using training/information activities and telemedicine to optimize the first aid provided to burn patients before referral to a BC. Shared procedure protocols for prevention and for the care and rehabilitation of patients, both at individual and mass level, will help to create an international specialized database and a Web-based teleconsultation system. PMID:21991176

  10. Ceramic Technology Project

    SciTech Connect

    Not Available

    1992-03-01

    The Ceramic Technology Project was developed by the USDOE Office of Transportation Systems (OTS) in Conservation and Renewable Energy. This project, part of the OTS's Materials Development Program, was developed to meet the ceramic technology requirements of the OTS's automotive technology programs. Significant accomplishments in fabricating ceramic components for the USDOE and NASA advanced heat engine programs have provided evidence that the operation of ceramic parts in high-temperature engine environments is feasible. These programs have also demonstrated that additional research is needed in materials and processing development, design methodology, and database and life prediction before industry will have a sufficient technology base from which to produce reliable, cost-effective ceramic engine components commercially. A five-year project plan was developed with extensive input from private industry. In July 1990 the original plan was updated through the estimated completion of development in 1993. The objective is to develop the industrial technology base required for reliable ceramics for application in advanced automotive heat engines. The project approach includes determining the mechanisms controlling reliability, improving processes for fabricating existing ceramics, developing new materials with increased reliability, and testing these materials in simulated engine environments to confirm reliability. Although this is a generic materials project, the focus is on the structural ceramics for advanced gas turbine and diesel engines, ceramic bearings and attachments, and ceramic coatings for thermal barrier and wear applications in these engines. To facilitate the rapid transfer of this technology to US industry, the major portion of the work is being done in the ceramic industry, with technological support from government laboratories, other industrial laboratories, and universities.

  11. Prediction of Gas Lubricated Foil Journal Bearing Performance

    NASA Technical Reports Server (NTRS)

    Carpino, Marc; Talmage, Gita

    2003-01-01

    This report summarizes the progress in the first eight months of the project. The objectives of this research project are to theoretically predict the steady operating conditions and the rotor dynamic coefficients of gas foil journal bearings. The project is currently on or ahead of schedule with the development of a finite element code that predicts steady bearing performance characteristics such as film thickness, pressure, load, and drag. Graphical results for a typical bearing are presented in the report. Project plans for the next year are discussed.

  12. Preschool Prediction and Prevention of Learning Disabilities.

    ERIC Educational Resources Information Center

    BEERY, KEITH E.

    The objectives of this initial report of a four-year project were (1) to demonstrate a method for the prediction and prevention of learning disabilities, and (2) to foster understanding of child development among teachers, parents, and physicians. Subjects were the 3 1/2 to 5 1/2 year old children of an entire school district. Researchers were…

  13. ANN prediction of the Dst index.

    NASA Astrophysics Data System (ADS)

    Pallocchia, G.; Amata, E.; Consolini, G.; Marcucci, M. F.; Bertello, I.

    We describe an artificial neural network algorithm for the prediction of the Dst index, developed in the framework of the Pilot Project on Space Weather Applications of the European Space Agency. We then discuss the need to develop a similar algorithm based on IMF data only and report on preliminary work and tests for such an algorithm.

  14. Study Predicts Dramatic Shifts in Enrollments.

    ERIC Educational Resources Information Center

    Evangelauf, Jean

    1991-01-01

    A new study detailing demographic shifts in the college-age population predicts growth in minority high school graduates and shrinkage or maintenance of White graduation rates. The report is the first to provide state-by-state figures on actual and projected graduates from 1986 through 1995 by racial and ethnic group. (MSE)

  15. Predicting Achievement and Motivation.

    ERIC Educational Resources Information Center

    Uguroglu, Margaret; Walberg, Herbert J.

    1986-01-01

    Motivation and nine other factors were measured for 970 students in grades five through eight in a study of factors predicting achievement and predicting motivation. Results are discussed. (Author/MT)

  16. Battery Life Predictive Model

    Energy Science and Technology Software Center (ESTSC)

    2009-12-31

    The Software consists of a model used to predict battery capacity fade and resistance growth for arbitrary cycling and temperature profiles. It allows the user to extrapolate from experimental data to predict actual cycle life.

  17. Predictability and prediction skill of the boreal summer intraseasonal oscillation in the Intraseasonal Variability Hindcast Experiment

    NASA Astrophysics Data System (ADS)

    Lee, Sun-Seon; Wang, Bin; Waliser, Duane E.; Neena, Joseph Mani; Lee, June-Yi

    2015-10-01

    Boreal summer intraseasonal oscillation (BSISO) is one of the dominant modes of intraseasonal variability of the tropical climate system, with fundamental impacts on regional summer monsoons, tropical storms, and extra-tropical climate variations. Due to its distinctive characteristics, a specific metric for characterizing observed BSISO evolution and assessing numerical models' simulations has previously been proposed (Lee et al. in Clim Dyn 40:493-509, 2013). However, current dynamical models' prediction skill and predictability have not been investigated in a multi-model framework. Using six coupled models in the Intraseasonal Variability Hindcast Experiment project, the predictability estimates and prediction skill of BSISO are examined. The BSISO predictability is estimated as the forecast lead day at which the mean forecast error becomes as large as the mean signal under the perfect model assumption. Applying the signal-to-error ratio method and using an ensemble-mean approach, we found that the multi-model mean BSISO predictability estimate and prediction skill with strong initial amplitude (about 10% higher than the mean initial amplitude) are about 45 and 22 days, respectively, which are comparable with the corresponding counterparts for the Madden-Julian Oscillation during boreal winter (Neena et al. in J Clim 27:4531-4543, 2014a). The significantly lower BSISO prediction skill compared with its predictability indicates considerable room for improvement of dynamical BSISO prediction. The estimated predictability limit is independent of the initial amplitude, but the models' prediction skill for strong initial amplitudes is 6 days higher than the corresponding skill with weak initial conditions (about 15% less than the mean initial amplitude), suggesting the importance of using accurate initial conditions. The BSISO predictability and prediction skill are phase- and season-dependent, but the degree of dependency varies with the models. It is important to
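
    The signal-to-error criterion described above, in which the predictability limit is the first lead day at which the mean forecast error grows as large as the mean signal, reduces to a simple threshold crossing. A minimal sketch with a hypothetical error curve:

```python
import numpy as np

def predictability_limit(mean_error_by_lead, mean_signal):
    """Return the first 1-based forecast lead day at which the mean forecast
    error is at least as large as the mean signal, or None if never crossed.
    `mean_error_by_lead[i]` is the mean error at lead day i+1."""
    err = np.asarray(mean_error_by_lead, dtype=float)
    crossed = np.nonzero(err >= mean_signal)[0]
    return int(crossed[0]) + 1 if crossed.size else None
```

    With an error curve growing linearly from 0 to twice the signal over 60 lead days, the crossing lands near day 31, i.e. halfway, which is the expected behaviour of the criterion.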

  18. Predicting Scholars' Scientific Impact

    PubMed Central

    Mazloumian, Amin

    2012-01-01

    We tested the underlying assumption that citation counts are reliable predictors of future success, analyzing complete citation data on the careers of scientists. Our results show that i) among all citation indicators, the annual citations at the time of prediction is the best predictor of future citations, ii) future citations of a scientist's published papers can be predicted accurately for a 1-year prediction, but iii) future citations of future work are hardly predictable. PMID:23185311

  19. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  20. Nonlinear Combustion Instability Prediction

    NASA Technical Reports Server (NTRS)

    Flandro, Gary

    2010-01-01

    The liquid rocket engine stability prediction software (LCI) predicts combustion stability of systems using LOX-LH2 propellants. Both longitudinal and transverse mode stability characteristics are calculated. This software has the unique feature of being able to predict system limit amplitude.

  1. Prediction in Multiple Regression.

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2000-01-01

    Presents the concept of prediction via multiple regression (MR) and discusses the assumptions underlying multiple regression analyses. Also discusses shrinkage, cross-validation, and double cross-validation of prediction equations and describes how to calculate confidence intervals around individual predictions. (SLD)
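
    A confidence interval around an individual prediction, as discussed in the article, follows the standard OLS formula: the half-width is t * s * sqrt(1 + x0' (X'X)^-1 x0), where the "+1" term widens the interval to cover a single new observation rather than the mean response. A minimal sketch of the generic formula (not the article's worked example):

```python
import numpy as np
from scipy import stats

def prediction_interval(x, y, x_new, alpha=0.05):
    """Interval expected to contain an *individual* new observation y_new
    under an OLS regression model (wider than the CI for the mean response)."""
    X = np.column_stack([np.ones(len(x)), x])       # design matrix with intercept
    x0 = np.r_[1.0, np.atleast_1d(x_new)]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    n, p = X.shape
    resid = y - X @ beta
    s2 = resid @ resid / (n - p)                    # residual variance estimate
    leverage = x0 @ np.linalg.inv(X.T @ X) @ x0
    se = np.sqrt(s2 * (1.0 + leverage))             # "+1" accounts for new-obs noise
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, n - p)
    y_hat = float(x0 @ beta)
    return y_hat - t_crit * se, y_hat + t_crit * se
```

    Passing a two-dimensional `x` gives the multiple-regression case with no other changes; shrinkage and cross-validation, as the article notes, address how well `beta` itself transfers to new samples.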

  2. Testing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Luen, Brad; Stark, Philip B.

    2008-01-01

    Statistical tests of earthquake predictions require a null hypothesis to model occasional chance successes. To define and quantify 'chance success' is knotty. Some null hypotheses ascribe chance to the Earth: Seismicity is modeled as random. The null distribution of the number of successful predictions - or any other test statistic - is taken to be its distribution when the fixed set of predictions is applied to random seismicity. Such tests tacitly assume that the predictions do not depend on the observed seismicity. Conditioning on the predictions in this way sets a low hurdle for statistical significance. Consider this scheme: When an earthquake of magnitude 5.5 or greater occurs anywhere in the world, predict that an earthquake at least as large will occur within 21 days and within an epicentral distance of 50 km. We apply this rule to the Harvard centroid-moment-tensor (CMT) catalog for 2000-2004 to generate a set of predictions. The null hypothesis is that earthquake times are exchangeable conditional on their magnitudes and locations and on the predictions - a common "nonparametric" assumption in the literature. We generate random seismicity by permuting the times of events in the CMT catalog. We consider an event successfully predicted only if (i) it is predicted and (ii) there is no larger event within 50 km in the previous 21 days. The P-value for the observed success rate is <0.001: The method successfully predicts about 5% of earthquakes, far better than 'chance' because the predictor exploits the clustering of earthquakes - occasional foreshocks - which the null hypothesis lacks. Rather than condition on the predictions and use a stochastic model for seismicity, it is preferable to treat the observed seismicity as fixed, and to compare the success rate of the predictions to the success rate of simple-minded predictions like those just described. If the proffered predictions do no better than a simple scheme, they have little value.
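
    The permutation scheme described above can be sketched on a toy one-dimensional catalog: event times are shuffled while locations stay attached to the events, and the observed success count is compared with its null distribution. The catalog and the simplified alarm rule below are illustrative, not the Harvard CMT analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

def n_successes(times, locs, window=21.0, radius=50.0):
    """Count events preceded, within `window` days and `radius` km, by an
    earlier event; a simplified 1-D stand-in for the alarm rule above."""
    order = np.argsort(times)
    t, x = times[order], locs[order]
    hits = 0
    for i in range(1, len(t)):
        near = (t[i] - t[:i] <= window) & (np.abs(x[i] - x[:i]) <= radius)
        hits += bool(near.any())
    return hits

def permutation_p_value(times, locs, n_perm=999):
    """Null hypothesis: event times are exchangeable given locations.
    Shuffle times, keep each event's location, and compare success counts."""
    observed = n_successes(times, locs)
    null = [n_successes(rng.permutation(times), locs) for _ in range(n_perm)]
    p = (1 + sum(n >= observed for n in null)) / (n_perm + 1)
    return observed, p
```

    On a clustered toy catalog (pairs of nearby events) the observed success count far exceeds that of every permuted catalog, reproducing in miniature the paper's point: clustering, which the null hypothesis lacks, is what makes such alarm rules look skilful.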

  3. Project Exodus

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Project Exodus is an in-depth study to identify and address the basic problems of a manned mission to Mars. The most important problems concern propulsion, life support, structure, trajectory, and finance. Exodus will employ a passenger ship, cargo ship, and landing craft for the journey to Mars. These three major components of the mission design are discussed separately. Within each component the design characteristics of structures, trajectory, and propulsion are addressed. The design characteristics of life support are mentioned only in those sections requiring it.

  4. Audiovisual biofeedback improves motion prediction accuracy

    PubMed Central

    Pollock, Sean; Lee, Danny; Keall, Paul; Kim, Taeho

    2013-01-01

    Purpose: The accuracy of motion prediction, utilized to overcome the system latency of motion management radiotherapy systems, is hampered by irregularities present in the patients’ respiratory pattern. Audiovisual (AV) biofeedback has been shown to reduce respiratory irregularities. The aim of this study was to test the hypothesis that AV biofeedback improves the accuracy of motion prediction. Methods: An AV biofeedback system combined with real-time respiratory data acquisition and MR images were implemented in this project. One-dimensional respiratory data from (1) the abdominal wall (30 Hz) and (2) the thoracic diaphragm (5 Hz) were obtained from 15 healthy human subjects across 30 studies. The subjects were required to breathe with and without the guidance of AV biofeedback during each study. The obtained respiratory signals were then implemented in a kernel density estimation prediction algorithm. For each of the 30 studies, five different prediction times ranging from 50 to 1400 ms were tested (150 predictions performed). Prediction error was quantified as the root mean square error (RMSE); the RMSE was calculated from the difference between the real and predicted respiratory data. The statistical significance of the prediction results was determined by the Student's t-test. Results: Prediction accuracy was considerably improved by the implementation of AV biofeedback. Of the 150 respiratory predictions performed, prediction accuracy was improved 69% (103/150) of the time for abdominal wall data, and 78% (117/150) of the time for diaphragm data. The average reduction in RMSE due to AV biofeedback over unguided respiration was 26% (p < 0.001) and 29% (p < 0.001) for abdominal wall and diaphragm respiratory motion, respectively. Conclusions: This study was the first to demonstrate that the reduction of respiratory irregularities due to the implementation of AV biofeedback improves prediction accuracy. This would result in increased efficiency of motion
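
The error metric used above is straightforward to state in code. A minimal sketch follows; the hold-last-value baseline is purely illustrative and is not the study's kernel density estimation predictor.

```python
import math

# Minimal sketch of the RMSE prediction-error metric described above.
# The hold-last-value baseline below is illustrative only; the study used
# a kernel density estimation predictor, which is not reproduced here.

def rmse(actual, predicted):
    """Root mean square error between two equal-length traces."""
    assert len(actual) == len(predicted) and actual
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

def pct_reduction(rmse_unguided, rmse_guided):
    """Percentage reduction in RMSE attributable to guidance."""
    return 100.0 * (rmse_unguided - rmse_guided) / rmse_unguided

def naive_predict(signal, lag_samples):
    """Hold-last-value baseline: predict x[t + lag] = x[t].
    Returns (predictions, targets) aligned sample by sample."""
    return signal[:-lag_samples], signal[lag_samples:]
```

The reported 26% and 29% figures correspond to `pct_reduction` evaluated on RMSE values from sessions without and with AV biofeedback.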

  5. Drought Predictability and Prediction in a Changing Climate: Assessing Current Predictive Knowledge and Capabilities, User Requirements and Research Priorities

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried

    2011-01-01

    Drought is fundamentally the result of an extended period of reduced precipitation, lasting anywhere from a few weeks to decades or even longer. As such, addressing drought predictability and prediction in a changing climate requires foremost that we make progress on the ability to predict precipitation anomalies on subseasonal and longer time scales. From the perspective of the users of drought forecasts and information, however, drought is most directly viewed through its impacts (e.g., on soil moisture, streamflow, crop yields). As such, the question of the predictability of drought must extend to those quantities as well. In order to make progress on these issues, the WCRP drought information group (DIG), with the support of WCRP, the Catalan Institute of Climate Sciences, the La Caixa Foundation, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, and the National Science Foundation, has organized a workshop to focus on: (1) user requirements for drought prediction information on sub-seasonal to centennial time scales; (2) current understanding of the mechanisms and predictability of drought on those time scales; (3) current drought prediction/projection capabilities on those time scales; and (4) advancing regional drought prediction capabilities for the variables and scales most relevant to user needs. This introductory talk provides an overview of these goals, and outlines the occurrence and mechanisms of drought world-wide.

  6. Predicting the Orbits of Satellites with a TI-85 Calculator.

    ERIC Educational Resources Information Center

    Papay, Kate; And Others

    1996-01-01

    Describes a project that predicts the orbits of satellites using a TI-85 calculator. Enables students to achieve a richer understanding of longitude, latitude, time zones, orbital mechanics of satellites, and the terms associated with satellite tracking. (JRH)

  7. Predicting Predictable about Natural Catastrophic Extremes

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2015-04-01

    By definition, an extreme event is a rare one in a series of kindred phenomena. Usually (e.g., in geophysics), this implies investigating a small sample of case histories with the help of delicate statistical methods and data of varying quality, collected under various conditions. Many extreme events are clustered (far from independent) and follow fractal or other "strange" distributions (far from uniform). Evidently, such an "unusual" situation complicates the search for, and definition of, reliable precursory behaviors to be used for forecast/prediction purposes. Making forecast/prediction claims reliable and quantitatively probabilistic, in the frame of the most popular objectivist viewpoint on probability, requires a long series of "yes/no" forecast/prediction outcomes, which cannot be obtained without an extended rigorous test of the candidate method. The set of errors ("success/failure" scores and the space-time measure of alarms) and other information obtained in such a control test supplies the data necessary to judge the candidate's potential as a forecast/prediction tool and, eventually, to find improvements. This is to be done first in comparison against random guessing, which yields confidence (measured in terms of statistical significance). Note that the application of forecast/prediction tools can differ greatly across natural hazards, and across the costs and benefits that determine risks; it therefore requires different optimal strategies minimizing reliable estimates of realistic levels of accepted losses. In turn, case-specific costs and benefits may suggest modifying the forecast/prediction tools for a more adequate "optimal" application. 
Fortunately, the situation is not hopeless, thanks to the state-of-the-art understanding of the complexity and non-linear dynamics of the Earth as a physical system, and to pattern recognition approaches applied to the available geophysical evidence, specifically, when intending to predict
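
The comparison against random guessing that the abstract calls for is often cast as a binomial tail probability: if alarms cover a fraction of the monitored space-time volume, a random guesser succeeds on each target event independently with that probability. A minimal sketch, with illustrative variable names rather than the author's notation:

```python
from math import comb

# Sketch of a significance test against random guessing: if alarms cover a
# fraction `alarm_fraction` of the monitored space-time volume, a random
# guesser succeeds on each of `n_events` target events independently with
# that probability.  Names here are illustrative, not the author's notation.

def pvalue_vs_random(n_events, n_hits, alarm_fraction):
    """Upper tail P(X >= n_hits) for X ~ Binomial(n_events, alarm_fraction)."""
    tau = alarm_fraction
    return sum(comb(n_events, k) * tau ** k * (1.0 - tau) ** (n_events - k)
               for k in range(n_hits, n_events + 1))
```

A small p-value here only establishes superiority over blind guessing; as the abstract notes, the choice of an optimal alarm strategy must then weigh hazard-specific costs and benefits.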

  8. Solar Cycle Prediction

    NASA Technical Reports Server (NTRS)

    Pesnell, William Dean

    2011-01-01

    Solar cycle predictions are needed to plan long-term space missions, just as weather predictions are needed to plan your next vacation. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are among the most important. Launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Testing solar dynamo theories through quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. I will describe the current state of solar cycle predictions and anticipate how those predictions could be made more accurate in the future.

  9. SIMBIOS Project

    NASA Technical Reports Server (NTRS)

    Fargion, Giulietta S.; McClain, Charles R.

    2002-01-01

    The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities, NASA Research Announcement (NRA) research status, satellite data processing, data product validation, and field calibration. This documentation is necessary to ensure that critical information is related to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational project. The SIMBIOS Science Team Principal Investigators' (PIs) original contributions to this report are in chapters four and above. The purpose of these contributions is to describe the current research status of the SIMBIOS-NRA-96 funded research. The contributions are published as submitted, with the exception of minor edits to correct obvious grammatical or clerical errors.

  10. Portunus Project

    SciTech Connect

    Loyal, Rebecca E.

    2015-07-14

    The objective of the Portunus Project is to create large, automated offshore ports that will increase the pace and scale of international trade. Additionally, these ports would increase the number of U.S. domestic trade vessels needed, as the imported goods would need to be transported from these offshore platforms to land-based ports such as Boston, Los Angeles, and Newark. Currently, domestic trade in the United States can only be conducted by vessels that abide by the Merchant Marine Act of 1920, also referred to as the Jones Act. The Jones Act stipulates that vessels involved in domestic trade must be U.S. owned, U.S. built, and manned by a crew made up of U.S. citizens. The Portunus Project would increase the number of Jones Act vessels needed, which raises an interesting economic concern: Are Jones Act ships more expensive to operate than foreign vessels? Would it be more economically efficient to modify the Jones Act and allow vessels manned by foreign crews to engage in U.S. domestic trade? While opposition to altering the Jones Act is strong, it is important to consider the possibility that ship-owners who employ foreign crews will lobby for the chance to enter a growing domestic trade market. Their success would mean potential job loss for thousands of Americans currently employed in maritime trade.

  11. Project Explorer

    NASA Technical Reports Server (NTRS)

    Dannenberg, K. K.; Henderson, A.; Lee, J.; Smith, G.; Stluka, E.

    1984-01-01

    PROJECT EXPLORER is a program that will fly student-developed experiments onboard the Space Shuttle in NASA's Get-Away Special (GAS) containers. The program is co-sponsored by the Alabama Space and Rocket Center, the Alabama-Mississippi Section of the American Institute of Aeronautics and Astronautics, and Alabama A&M University, and requires extensive support from the University of Alabama in Huntsville. A unique feature of this project is the demonstration of transmissions to ground stations on amateur radio frequencies in spoken English. Experiments Nos. 1, 2, and 3 use the microgravity of space flight to study the solidification of lead-antimony and aluminum-copper alloys, the growth of potassium-tetracyanoplatinate hydrate crystals in an aqueous solution, and the germination of radish seeds. Flight results will be compared with Earth-based data. Experiment No. 4 features the radio transmission and will also provide timing for the start of all other experiments. A microprocessor will obtain real-time data from all experiments as well as temperature and pressure measurements taken inside the canister. These data will be transmitted on previously announced amateur radio frequencies after they have been converted into spoken English by a digitalker for general reception.

  12. Project Exodus

    NASA Technical Reports Server (NTRS)

    Bryant, Rodney (Compiler); Dillon, Jennifer (Compiler); Grewe, George (Compiler); Mcmorrow, Jim (Compiler); Melton, Craig (Compiler); Rainey, Gerald (Compiler); Rinko, John (Compiler); Singh, David (Compiler); Yen, Tzu-Liang (Compiler)

    1990-01-01

    A design for a manned Mars mission, PROJECT EXODUS is presented. PROJECT EXODUS incorporates the design of a hypersonic waverider, cargo ship and NIMF (nuclear rocket using indigenous Martian fuel) shuttle lander to safely carry out a three to five month mission on the surface of Mars. The cargo ship transports return fuel, return engine, surface life support, NIMF shuttle, and the Mars base to low Mars orbit (LMO). The cargo ship is powered by a nuclear electric propulsion (NEP) system which allows the cargo ship to execute a spiral trajectory to Mars. The waverider transports ten astronauts to Mars and back. It is launched from the Space Station with propulsion provided by a chemical engine and a delta velocity of 9 km/sec. The waverider performs an aero-gravity assist maneuver through the atmosphere of Venus to obtain a deflection angle and increase in delta velocity. Once the waverider and cargo ship have docked the astronauts will detach the landing cargo capsules and nuclear electric power plant and remotely pilot them to the surface. They will then descend to the surface aboard the NIMF shuttle. A dome base will be quickly constructed on the surface and the astronauts will conduct an exploratory mission for three to five months. They will return to Earth and dock with the Space Station using the waverider.

  13. Current affairs in earthquake prediction in Japan

    NASA Astrophysics Data System (ADS)

    Uyeda, Seiya

    2015-12-01

    As of mid-2014, the main organizations of the earthquake (EQ hereafter) prediction program, including the Seismological Society of Japan (SSJ) and the MEXT Headquarters for EQ Research Promotion, hold the official position that they neither can nor want to make any short-term prediction. It is an extraordinary stance for responsible authorities when the nation, after the devastating 2011 M9 Tohoku EQ, most urgently needs whatever information may exist on forthcoming EQs. Japan's national project for EQ prediction started in 1965, but it has had no success, mainly because of its failure to capture precursors. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this stance was further fortified by the 2011 M9 Tohoku mega-quake. This paper tries to explain how this situation came about and suggests that it may in fact be a legitimate one which should have come a long time ago. Actually, substantial positive changes are taking place now. Some promising signs are arising even from the cooperation of researchers with the private sector, and there is a move to establish an "EQ Prediction Society of Japan". From now on, maintaining high scientific standards in EQ prediction will be of crucial importance.

  14. SISCAL project

    NASA Astrophysics Data System (ADS)

    Santer, Richard P.; Fell, Frank

    2003-05-01

    The first "ocean colour" sensor, the Coastal Zone Color Scanner (CZCS), was launched in 1978. Oceanographers learnt a lot from CZCS, but it remained a purely scientific sensor. In recent years, a new generation of satellite-borne earth observation (EO) instruments has been brought into space. These instruments combine high spectral and spatial resolution with revisit rates of the order of one per day. More instruments with further increased spatial, spectral and temporal resolution will become available within the next few years. In the meantime, evaluation procedures taking advantage of the capabilities of the new instruments have been derived, allowing the retrieval of ecologically important parameters with higher accuracy than before. Space agencies are now able to collect and process satellite data in real time and to disseminate them via the Internet. It is therefore now possible to envisage using EO operationally. In principle, a significant demand for EO data products on terrestrial or marine ecosystems exists among both public authorities (environmental protection, emergency management, natural resources management, national parks, regional planning, etc.) and private companies (the tourist industry, insurance companies, water suppliers, etc.). However, for a number of reasons, many data products that can be derived from the new instruments and methods have not yet left the scientific community for public or private end users. It is the intention of the proposed SISCAL (Satellite-based Information System on Coastal Areas and Lakes) project to contribute to closing the existing gap between space agencies and research institutions on one side and end users on the other. To do so, we intend to create a data processor that automatically derives, and subsequently delivers over the Internet in Near-Real-Time (NRT), a number of data products tailored to individual end user needs. 
The data products will be generated using a Geographical Information System (GIS


  16. Signal Prediction With Input Identification

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Chen, Ya-Chin

    1999-01-01

    A novel coding technique is presented for signal prediction, with applications including speech coding, system identification, and estimation of input excitation. The approach is based on the blind equalization method for speech signal processing in conjunction with the geometric subspace projection theory to formulate the basic prediction equation. The speech-coding problem is often divided into two parts: a linear prediction model and an excitation input. The parameter coefficients of the linear predictor and the input excitation are solved simultaneously and recursively by a conventional recursive least-squares algorithm. The excitation input is computed by coding all possible outcomes into a binary codebook. The coefficients of the linear predictor and excitation, and the index of the codebook, can then be used to represent the signal. In addition, a variable-frame concept is proposed to block the same excitation signal in sequence in order to reduce the storage size and increase the transmission rate. The results of this work can be easily extended to the problem of disturbance identification. The basic principles are outlined in this report and differences from other existing methods are discussed. Simulations are included to demonstrate the proposed method.
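
A conventional recursive least-squares (RLS) linear predictor of the kind mentioned above can be sketched as follows. This is a generic textbook RLS one-step-ahead predictor, not the authors' full coder (no codebook search or blind equalization); `order`, `lam`, and `delta` are illustrative defaults.

```python
# Generic textbook sketch of a recursive least-squares (RLS) one-step-ahead
# linear predictor.  This is NOT the authors' full coder (no codebook search
# or blind equalization); `order`, `lam` and `delta` are illustrative defaults.

def rls_predict(signal, order=4, lam=0.99, delta=100.0):
    """Return one-step-ahead predictions for signal[order:]."""
    n = order
    w = [0.0] * n                                  # predictor coefficients
    P = [[delta if i == j else 0.0 for j in range(n)] for i in range(n)]
    preds = []
    for t in range(n, len(signal)):
        x = signal[t - n:t][::-1]                  # most recent sample first
        y_hat = sum(wi * xi for wi, xi in zip(w, x))
        preds.append(y_hat)
        e = signal[t] - y_hat                      # a-priori prediction error
        Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
        denom = lam + sum(x[i] * Px[i] for i in range(n))
        k = [p_i / denom for p_i in Px]            # gain vector
        w = [wi + ki * e for wi, ki in zip(w, k)]  # coefficient update
        P = [[(P[i][j] - k[i] * Px[j]) / lam for j in range(n)]
             for i in range(n)]                    # inverse-correlation update
    return preds
```

On a clean periodic test signal the coefficients converge quickly and the a-priori prediction error falls toward zero, which is the property the coder exploits: only the coefficients and a compact excitation index need to be transmitted.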

  17. PREDICTIVE MODELS. Enhanced Oil Recovery Model

    SciTech Connect

    Ray, R.M.

    1992-02-26

    PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: (1) chemical flooding; (2) carbon dioxide miscible flooding; (3) in-situ combustion; (4) polymer flooding; and (5) steamflooding. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs that have been previously waterflooded to residual oil saturation; thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide Miscible Flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating-gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance prediction algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer flooding relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes. The IBM PC/AT version includes a plotting capability to produce a graphical picture of the predictive model results.

  18. EDSP Prioritization: Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) (SOT)

    EPA Science Inventory

    Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been te...

  19. Project Grandmaster

    Energy Science and Technology Software Center (ESTSC)

    2013-09-16

    The purpose of the Project Grandmaster Application is to allow individuals to opt in and give the application access to data sources about their activities on social media sites. The application will cross-reference these data sources to build up a picture of the activities each individual discusses, either at present or in the past, and place this picture in reference to groups of all participants. The goal is to allow individuals to place themselves in the collective and to understand how their behavior patterns fit with the group, and potentially to find changes to make, such as activities they weren’t already aware of or different groups of interest they might want to follow.

  20. Apollo Project

    NASA Technical Reports Server (NTRS)

    1966-01-01

    From Spaceflight Revolution: 'Top NASA officials listen to a LOPO briefing at Langley in December 1966. Sitting to the far right with his hand on his chin is Floyd Thompson. To the left sits Dr. George Mueller, NASA associate administrator for Manned Space Flight. On the wall is a diagram of the sites selected for the 'concentrated mission.' 'The most fundamental issue in the pre-mission planning for Lunar Orbiter was how the moon was to be photographed. Would the photography be 'concentrated' on a predetermined single target, or would it be 'distributed' over several selected targets across the moon's surface? On the answer to this basic question depended the successful integration of the entire mission plan for Lunar Orbiter.' The Lunar Orbiter Project made systematic photographic maps of the lunar landing sites. Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 337.

  1. Project Grandmaster

    SciTech Connect

    None, None

    2013-09-16

    The purpose of the Project Grandmaster Application is to allow individuals to opt in and give the application access to data sources about their activities on social media sites. The application will cross-reference these data sources to build up a picture of the activities each individual discusses, either at present or in the past, and place this picture in reference to groups of all participants. The goal is to allow individuals to place themselves in the collective and to understand how their behavior patterns fit with the group, and potentially to find changes to make, such as activities they weren’t already aware of or different groups of interest they might want to follow.

  2. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million dollars. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White further described LOLA in his paper 'Discussion of Three Typical Langley Research Center Simulation Programs,' 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; From Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  3. Apollo Project

    NASA Technical Reports Server (NTRS)

    1965-01-01

    Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million dollars. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White described the simulator as follows: 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  4. Apollo Project

    NASA Technical Reports Server (NTRS)

    1963-01-01

    Track, Model 2 and Model 1, the 20-foot sphere. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million dollars. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) From Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966. 'The model system is designed so that a television camera is mounted on a camera boom on each transport cart and each cart system is shared by two models. The cart's travel along the tracks represents longitudinal motion along the plane of a nominal orbit, vertical travel of the camera boom represents latitude on out-of-plane travel, and horizontal travel of the camera boom represents altitude changes.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379.

  5. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million dollars. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White further described LOLA in his paper 'Discussion of Three Typical Langley Research Center Simulation Programs,' 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2,3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  6. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Construction of Model 1 used in the LOLA simulator. This was a twenty-foot sphere which simulated for the astronauts what the surface of the moon would look like from 200 miles up. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White wrote: 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2, 3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution: NASA Langley Research Center From Sputnik to Apollo, (Washington: NASA, 1995), p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  7. Apollo Project

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Artists used paintbrushes and airbrushes to recreate the lunar surface on each of the four models comprising the LOLA simulator. Project LOLA or Lunar Orbit and Landing Approach was a simulator built at Langley to study problems related to landing on the lunar surface. It was a complex project that cost nearly $2 million. James Hansen wrote: 'This simulator was designed to provide a pilot with a detailed visual encounter with the lunar surface; the machine consisted primarily of a cockpit, a closed-circuit TV system, and four large murals or scale models representing portions of the lunar surface as seen from various altitudes. The pilot in the cockpit moved along a track past these murals which would accustom him to the visual cues for controlling a spacecraft in the vicinity of the moon. Unfortunately, such a simulation--although great fun and quite aesthetic--was not helpful because flight in lunar orbit posed no special problems other than the rendezvous with the LEM, which the device did not simulate. Not long after the end of Apollo, the expensive machine was dismantled.' (p. 379) Ellis J. White further described LOLA in his paper 'Discussion of Three Typical Langley Research Center Simulation Programs': 'Model 1 is a 20-foot-diameter sphere mounted on a rotating base and is scaled 1 in. = 9 miles. Models 2, 3, and 4 are approximately 15x40 feet scaled sections of model 1. Model 4 is a scaled-up section of the Crater Alphonsus and the scale is 1 in. = 200 feet. All models are in full relief except the sphere.' Published in James R. Hansen, Spaceflight Revolution, NASA SP-4308, p. 379; Ellis J. White, 'Discussion of Three Typical Langley Research Center Simulation Programs,' Paper presented at the Eastern Simulation Council (EAI's Princeton Computation Center), Princeton, NJ, October 20, 1966.

  8. Predictive systems ecology

    PubMed Central

    Evans, Matthew R.; Bithell, Mike; Cornell, Stephen J.; Dall, Sasha R. X.; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J.; Lewis, Simon L.; Mace, Georgina M.; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K. J.; Petchey, Owen; Smith, Matthew; Travis, Justin M. J.; Benton, Tim G.

    2013-01-01

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly, usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised and we summarize a range of terrestrial and marine examples. Significant challenges remain, but we suggest that ecology would benefit as a scientific discipline, and increase its impact in society, if it were to embrace the need to become more predictive. PMID:24089332

  9. Pyroshock prediction procedures

    NASA Astrophysics Data System (ADS)

    Piersol, Allan G.

    2002-05-01

    Given sufficient effort, pyroshock loads can be predicted by direct analytical procedures using hydrocodes that model the details of the pyrotechnic explosion and its interaction with adjacent structures, including nonlinear effects. However, it is more common to predict pyroshock environments using empirical procedures based upon extensive studies of past pyroshock data. Various empirical pyroshock prediction procedures are discussed, including those developed by the Jet Propulsion Laboratory, Lockheed-Martin, and Boeing.

  10. Predictability of Conversation Partners

    NASA Astrophysics Data System (ADS)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, some predictability exists in the data apart from the contribution of these long-tailed distributions. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.
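The mutual-information measure used above to quantify partner predictability can be sketched as follows. This is a minimal illustration on a synthetic partner sequence, not the authors' pipeline; the sequence and function name are invented for the example:

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """I(X; Y) in bits between current partner X and next partner Y."""
    n = len(pairs)
    joint = Counter(pairs)              # joint counts of (current, next)
    px = Counter(x for x, _ in pairs)   # marginal counts of current partner
    py = Counter(y for _, y in pairs)   # marginal counts of next partner
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# Synthetic sequence of conversation partners: knowing the current partner
# reduces uncertainty about the next one, so the mutual information is positive.
seq = ["A", "B", "A", "B", "A", "C", "A", "B", "A", "B", "A", "C"]
mi = mutual_information(list(zip(seq, seq[1:])))
```

For an i.i.d. partner sequence the mutual information would be zero; the gap between the entropy of the next partner and this quantity is the residual uncertainty after conditioning on the current partner.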

  11. Solar Cycle Predictions

    NASA Technical Reports Server (NTRS)

    Pesnell, William Dean

    2012-01-01

    Solar cycle predictions are needed to plan long-term space missions, just like weather predictions are needed to plan the launch. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are one of the most important. Launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory as you consume the reduced propellant load more rapidly. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Solar cycle predictions also anticipate the shortwave emissions that cause degradation of solar panels. Testing solar dynamo theories by quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. A summary and analysis of 75 predictions of the amplitude of the upcoming Solar Cycle 24 is presented. The current state of solar cycle predictions, and some ideas on how those predictions could be made more accurate in the future, will be discussed.

  12. Predicting cancer outcome

    SciTech Connect

    Gardner, S N; Fernandes, M

    2005-03-24

    We read with interest the paper by Michiels et al. on the prediction of cancer with microarrays and the commentary by Ioannidis listing the potential as well as the limitations of this approach (February 5, p 488 and 454). Cancer is a disease characterized by complex, heterogeneous mechanisms, and studies to define factors that can direct new drug discovery and use should be encouraged. However, this is easier said than done. Casti teaches that a better understanding does not necessarily extrapolate to better prediction, and that useful prediction is possible without complete understanding (1). To attempt both, explanation and prediction, in a single nonmathematical construct, is a tall order (Figure 1).

  13. Predicting Precipitation in Darwin: An Experiment with Markov Chains

    ERIC Educational Resources Information Center

    Boncek, John; Harden, Sig

    2009-01-01

    As teachers of first-year college mathematics and science students, the authors are constantly on the lookout for simple classroom exercises that improve their students' analytical and computational skills. In this article, the authors outline a project entitled "Predicting Precipitation in Darwin." In this project, students: (1) analyze and…
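A wet/dry precipitation model of the kind this classroom project describes can be sketched with a two-state Markov chain. The transition probabilities below are invented for illustration, not Darwin data:

```python
import numpy as np

# Hypothetical transition matrix: state 0 = dry, state 1 = wet.
# Row i gives the probability of tomorrow's state given today's state i.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def stationary(P):
    """Long-run fraction of time in each state: the left eigenvector
    of P associated with eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

pi = stationary(P)  # long-run [P(dry), P(wet)]
```

For this matrix the chain spends 80% of days dry and 20% wet in the long run, and an n-day-ahead forecast from a known starting state is simply the corresponding row of np.linalg.matrix_power(P, n).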

  14. Radar-aeolian roughness project

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald; Dobrovolskis, A.; Gaddis, L.; Iversen, J. D.; Lancaster, N.; Leach, Rodman N.; Rasnussen, K.; Saunders, S.; Vanzyl, J.; Wall, S.

    1991-01-01

    The objective is to establish an empirical relationship between measurements of radar, aeolian, and surface roughness on a variety of natural surfaces and to understand the underlying physical causes. This relationship will form the basis for developing a predictive equation to derive aeolian roughness from radar backscatter. Results are given from investigations carried out in 1989 on the principal elements of the project, with separate sections on field studies, radar data analysis, laboratory simulations, and development of theory for planetary applications.

  15. CUBES Project Support

    NASA Technical Reports Server (NTRS)

    Jenkins, Kenneth T., Jr.

    2012-01-01

    CUBES stands for Creating Understanding and Broadening Education through Satellites. The goal of the project is to allow high school students to build a small satellite, or CubeSat. Merritt Island High School (MIHS) was selected to partner with NASA and California Polytechnic State University (Cal-Poly) to build a CubeSat. The objective of the mission is to collect flight data to better characterize maximum predicted environments inside the CubeSat launcher, the Poly-Picosatellite Orbital Deployer (P-POD), while attached to the launch vehicle. The MIHS CubeSat team will apply to the NASA CubeSat Launch Initiative, which provides opportunities for small satellite development teams to secure launch slots on upcoming expendable launch vehicle missions. The MIHS team is working to achieve a test launch, or proof-of-concept flight, aboard a suborbital launch vehicle in early 2013.

  16. Ace Project as a Project Management Tool

    ERIC Educational Resources Information Center

    Cline, Melinda; Guynes, Carl S.; Simard, Karine

    2010-01-01

    The primary challenge of project management is to achieve the project goals and objectives while adhering to project constraints--usually scope, quality, time and budget. The secondary challenge is to optimize the allocation and integration of resources necessary to meet pre-defined objectives. Project management software provides an active…

  17. Collaborative Physical Chemistry Projects Involving Computational Chemistry

    NASA Astrophysics Data System (ADS)

    Whisnant, David M.; Howe, Jerry J.; Lever, Lisa S.

    2000-02-01

    The physical chemistry classes from three colleges have collaborated on two computational chemistry projects using Quantum CAChe 3.0 and Gaussian 94W running on Pentium II PCs. Online communication by email and the World Wide Web was an important part of the collaboration. In the first project, students used molecular modeling to predict benzene derivatives that might be possible hair dyes. They used PM3 and ZINDO calculations to predict the electronic spectra of the molecules and tested the predicted spectra by comparing some with experimental measurements. They also did literature searches for real hair dyes and possible health effects. In the final phase of the project they proposed a synthetic pathway for one compound. In the second project the students were asked to predict which isomer of a small carbon cluster (C3, C4, or C5) was responsible for a series of IR lines observed in the spectrum of a carbon star. After preliminary PM3 calculations, they used ab initio calculations at the HF/6-31G(d) and MP2/6-31G(d) level to model the molecules and predict their vibrational frequencies and rotational constants. A comparison of the predictions with the experimental spectra suggested that the linear isomer of the C5 molecule was responsible for the lines.

  18. Predictability of large interannual Arctic sea-ice anomalies

    NASA Astrophysics Data System (ADS)

    Tietsche, Steffen; Notz, Dirk; Jungclaus, Johann H.; Marotzke, Jochem

    2013-11-01

    In projections of twenty-first century climate, Arctic sea ice declines and at the same time exhibits strong interannual anomalies. Here, we investigate the potential to predict these strong sea-ice anomalies under a perfect-model assumption, using the Max-Planck-Institute Earth System Model in the same setup as in the Coupled Model Intercomparison Project Phase 5 (CMIP5). We study two cases of strong negative sea-ice anomalies: a 5-year-long anomaly for present-day conditions, and a 10-year-long anomaly for conditions projected for the middle of the twenty-first century. We treat these anomalies in the CMIP5 projections as the truth, and use exactly the same model configuration for predictions of this synthetic truth. We start ensemble predictions at different times during the anomalies, considering lagged-perfect and sea-ice-assimilated initial conditions. We find that the onset and amplitude of the interannual anomalies are not predictable. However, the further deepening of the anomaly can be predicted for typically 1 year lead time if predictions start after the onset but before the maximal amplitude of the anomaly. The magnitude of an extremely low summer sea-ice minimum is hard to predict: the skill of the prediction ensemble is not better than a damped-persistence forecast for lead times of more than a few months, and is not better than a climatology forecast for lead times of two or more years. Predictions of the present-day anomaly are more skillful than predictions of the mid-century anomaly. Predictions using sea-ice-assimilated initial conditions are competitive with those using lagged-perfect initial conditions for lead times of a year or less, but yield degraded skill for longer lead times. The results presented here suggest that there is limited prospect of predicting the large interannual sea-ice anomalies expected to occur throughout the twenty-first century.

  19. Project Longshot

    NASA Technical Reports Server (NTRS)

    West, J. Curtis; Chamberlain, Sally A.; Stevens, Robert; Pagan, Neftali

    1989-01-01

    Project Longshot is an unmanned probe to our nearest star system, Alpha Centauri, 4.3 light years away. The Centauri system is a trinary system consisting of two central stars (A and B) orbiting a barycenter, and a third (Proxima Centauri) orbiting the two. The system is at a declination of -67 degrees. The goal is to reach the Centauri system in 50 years. This time span was chosen because any shorter time would be impossible because of the relativistic velocities involved, and any greater time would be impossible because of the difficulty of creating a spacecraft with such a long lifetime. Therefore, the following mission profile is proposed: (1) spacecraft is assembled in Earth orbit; (2) spacecraft escapes Earth and Sun in the ecliptic with a single impulse maneuver; (3) spacecraft changes declination to point toward Centauri system; (4) spacecraft accelerates to 0.1c; (5) spacecraft coasts at 0.1c for 41 years; (6) spacecraft decelerates upon reaching Centauri system; and (7) spacecraft orbits Centauri system, conducts investigations, and relays data to Earth. The total time to reach the Centauri system, taking into consideration acceleration and deceleration, will be approximately 50 years.

  20. Project LASER

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA formally launched Project LASER (Learning About Science, Engineering and Research) in March 1990, a program designed to help teachers improve science and mathematics education and to provide 'hands on' experiences. It featured the first LASER Mobile Teacher Resource Center (MTRC), which is designed to reach educators all over the nation. NASA hopes to operate several MTRCs with funds provided by private industry. The mobile unit is a 22-ton tractor-trailer stocked with NASA educational publications and outfitted with six work stations. Each work station, which can accommodate two teachers at a time, has a computer providing access to NASA Spacelink. Each also has video recorders and photocopy/photographic equipment for the teacher's use. MTRC is only one of the five major elements within LASER. The others are: a Space Technology Course, to promote integration of space science studies with traditional courses; the Volunteer Databank, in which NASA employees are encouraged to volunteer as tutors, instructors, etc.; Mobile Discovery Laboratories that will carry simple laboratory equipment and computers to provide hands-on activities for students and demonstrations of classroom activities for teachers; and the Public Library Science Program, which will present library-based science and math programs.

  1. Predicting discovery rates of genomic features.

    PubMed

    Gravel, Simon

    2014-06-01

    Successful sequencing experiments require judicious sample selection. However, this selection must often be performed on the basis of limited preliminary data. Predicting the statistical properties of the final sample based on preliminary data can be challenging, because numerous uncertain model assumptions may be involved. Here, we ask whether we can predict "omics" variation across many samples by sequencing only a fraction of them. In the infinite-genome limit, we find that a pilot study sequencing 5% of a population is sufficient to predict the number of genetic variants in the entire population within 6% of the correct value, using an estimator agnostic to demography, selection, or population structure. To reach similar accuracy in a finite genome with millions of polymorphisms, the pilot study would require ∼15% of the population. We present computationally efficient jackknife and linear programming methods that exhibit substantially less bias than the state of the art when applied to simulated data and subsampled 1000 Genomes Project data. Extrapolating based on the National Heart, Lung, and Blood Institute Exome Sequencing Project data, we predict that 7.2% of sites in the capture region would be variable in a sample of 50,000 African Americans and 8.8% in a European sample of equal size. Finally, we show how the linear programming method can also predict discovery rates of various genomic features, such as the number of transcription factor binding sites across different cell types. PMID:24637199
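The jackknife idea behind such discovery-rate extrapolations can be illustrated with the classic first-order jackknife richness estimator. This is a standard textbook estimator, not the authors' linear programming or higher-order jackknife method, and the counts are made up:

```python
def jackknife_richness(counts):
    """First-order jackknife estimate of the total number of distinct
    variants: S_obs + f1 * (n - 1) / n, where S_obs is the number of
    variants observed, f1 the number seen exactly once (singletons),
    and n the total number of observations."""
    n = sum(counts)
    s_obs = len(counts)
    f1 = sum(1 for c in counts if c == 1)
    return s_obs + f1 * (n - 1) / n

# Made-up variant counts: five variants observed, two of them singletons.
estimate = jackknife_richness([1, 1, 2, 3, 5])
```

The estimator adds a correction proportional to the singleton count, reflecting the intuition that many rare variants in the sample imply many unseen variants in the population.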

  2. Project Information Packages Kit.

    ERIC Educational Resources Information Center

    RMC Research Corp., Mountain View, CA.

    Presented are an overview booklet, a project selection guide, and six Project Information Packages (PIPs) for six exemplary projects serving underachieving students in grades k through 9. The overview booklet outlines the PIP projects and includes a chart of major project features. A project selection guide reviews the PIP history, PIP contents,…

  3. Climate predictability in the second year.

    PubMed

    Hermanson, Leon; Sutton, Rowan T

    2009-03-13

    In this paper, the predictability of climate arising from ocean heat content (OHC) anomalies is investigated in the HadCM3 coupled atmosphere-ocean model. An ensemble of simulations of the twentieth century is used to provide initial conditions for a case study. The case study consists of two ensembles started from initial conditions with large differences in regional OHC in the North Atlantic, the Southern Ocean and parts of the West Pacific. Surface temperatures and precipitation are on average not predictable beyond seasonal time scales, but for certain initial conditions there may be longer predictability. It is shown that, for the case study examined here, some aspects of tropical precipitation, European surface temperatures and North Atlantic sea-level pressure are potentially predictable 2 years ahead. Predictability also exists in the other case studies, but the climate variables and regions which are potentially predictable differ. This work was done as part of the Grid for Coupled Ensemble Prediction (GCEP) eScience project. PMID:19087941

  4. Researches on predictions of Earth orientation parameters

    NASA Astrophysics Data System (ADS)

    Xu, Xueqing

    2012-08-01

    Earth orientation parameters (EOP) are essential for transformation between the celestial and terrestrial coordinate systems, which has important applications in the Earth sciences, astronomy and navigation systems. In this report, we first describe the principles and analyze the characteristics of several commonly used EOP prediction methods. Based on this discussion, we found that it is essential to select an appropriate method and length of base prediction sequence for each prediction span; e.g., the autoregressive (AR) model has higher accuracy in short-term forecasting, while the artificial neural network (ANN) model has an advantage in long-term forecasting. Secondly, we employ for the first time a combination of the AR model and Kalman filter (AR+Kalman) in short-term EOP prediction. Compared with the single AR model, the combination of the AR model and Kalman filter shows a significant improvement in short-term EOP prediction. Finally, we present recent work from the period of our participation in the Earth Orientation Parameters Combination of Prediction Pilot Project (EOPC PPP). The EOPC PPP was initiated by the International Earth Rotation and Reference Systems Service (IERS) and the Jet Propulsion Laboratory (JPL) in the summer of 2010, with the goal of developing a strategy for combining predictions.
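The AR component of such a scheme can be sketched as follows: a generic least-squares AR(p) fit and one-step predictor on a synthetic series. This is not the authors' implementation, and the Kalman-filter combination step is omitted:

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of AR(p) coefficients a with
    x[t] ~ a[0]*x[t-1] + ... + a[p-1]*x[t-p]."""
    X = np.column_stack([x[p - 1 - k:len(x) - 1 - k] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

def predict_next(x, a):
    """One-step-ahead prediction from the last p values."""
    p = len(a)
    return float(np.dot(a, x[-1:-p - 1:-1]))

# Synthetic noise-free AR(2) series with known coefficients 0.6 and 0.3:
# the fit should recover them and the predictor should continue the series.
x = [1.0, -0.5]
for _ in range(60):
    x.append(0.6 * x[-1] + 0.3 * x[-2])
x = np.array(x)
a = fit_ar(x, 2)
pred = predict_next(x, a)
```

In an AR+Kalman combination, a fit like this supplies the state-transition model and the filter then updates the prediction as each newly observed EOP value arrives.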

  5. Improved nonlinear prediction method

    NASA Astrophysics Data System (ADS)

    Adenan, Nur Hamiza; Md Noorani, Mohd Salmi

    2014-06-01

    The analysis and prediction of time series data have been widely addressed by researchers. Many techniques have been developed for application in various areas, such as weather forecasting, financial markets and hydrological phenomena, involving data that are contaminated by noise. Given the importance of the analysis and the accuracy of the prediction result, a study was undertaken to test the effectiveness of an improved nonlinear prediction method for data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Then, phase space reconstruction is performed on the composite (one-dimensional) data to reconstruct a number of space dimensions. Finally, the local linear approximation method is employed to make a prediction based on the phase space. This improved method was tested with logistic map time series containing 0%, 5%, 10%, 20% and 30% noise. The results show that by using the improved method, the predictions are in close agreement with the observed values. The correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, the improved method can predict noisy time series data without involving any separate noise reduction step.
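The reconstruction-plus-local-prediction pipeline described above can be sketched in simplified form. This sketch substitutes a plain nearest-neighbor predictor for the local linear approximation and a synthetic periodic signal for the logistic data, so it illustrates the reconstruction idea rather than the paper's exact method:

```python
import numpy as np

def delay_embed(x, dim=2, tau=1):
    """Time-delay reconstruction: row i is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[k * tau:k * tau + n] for k in range(dim)])

def nn_predict(x, dim=2, tau=1):
    """Predict the next scalar value: find the past state nearest to the
    current state in the reconstructed space and return its successor."""
    x = np.asarray(x, dtype=float)
    pts = delay_embed(x, dim, tau)
    query = pts[-1]                           # current reconstructed state
    dists = np.linalg.norm(pts[:-1] - query, axis=1)
    j = int(np.argmin(dists))                 # nearest past analogue
    return x[j + (dim - 1) * tau + 1]         # its observed successor

# Periodic test signal: the predictor should continue the cycle.
t = np.arange(200)
x = np.sin(2 * np.pi * t / 25)
pred = nn_predict(x, dim=2, tau=3)
```

The local linear variant fits a small regression over the k nearest reconstructed states instead of copying a single neighbor's successor, which smooths the prediction when the data are noisy.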

  6. Detecting and Predicting Changes

    ERIC Educational Resources Information Center

    Brown, Scott D.; Steyvers, Mark

    2009-01-01

    When required to predict sequential events, such as random coin tosses or basketball free throws, people reliably use inappropriate strategies, such as inferring temporal structure when none is present. We investigate the ability of observers to predict sequential events in dynamically changing environments, where there is an opportunity to detect…

  7. Stable predictive control horizons

    NASA Astrophysics Data System (ADS)

    Estrada, Raúl; Favela, Antonio; Raimondi, Angelo; Nevado, Antonio; Requena, Ricardo; Beltrán-Carbajal, Francisco

    2012-04-01

    The stability theory of predictive and adaptive predictive control for processes of linear and stable nature is based on the hypothesis of a physically realisable driving desired trajectory (DDT). The formal theoretical verification of this hypothesis is trivial for processes with a stable inverse, but it is not for processes with an unstable inverse. The extended strategy of predictive control was developed with the purpose of overcoming methodologically this stability problem and it has delivered excellent performance and stability in its industrial applications given a suitable choice of the prediction horizon. From a theoretical point of view, the existence of a prediction horizon capable of ensuring stability for processes with an unstable inverse was proven in the literature. However, no analytical solution has been found for the determination of the prediction horizon values which guarantee stability, in spite of the theoretical and practical interest of this matter. This article presents a new method able to determine the set of prediction horizon values which ensure stability under the extended predictive control strategy formulation and a particular performance criterion for the design of the DDT generically used in many industrial applications. The practical application of this method is illustrated by means of simulation examples.

  8. Propeller noise prediction

    NASA Technical Reports Server (NTRS)

    Zorumski, W. E.

    1983-01-01

    Analytic propeller noise prediction involves a sequence of computations culminating in the application of acoustic equations. The prediction sequence currently used by NASA in its ANOPP (aircraft noise prediction) program is described. The elements of the sequence are called program modules. The first group of modules analyzes the propeller geometry, the aerodynamics, including both potential and boundary layer flow, the propeller performance, and the surface loading distribution. This group of modules is based entirely on aerodynamic strip theory. The next group of modules deals with the actual noise prediction, based on data from the first group. Deterministic predictions of periodic thickness and loading noise are made using Farassat's time-domain methods. Broadband noise is predicted by the semi-empirical Schlinker-Amiet method. Near-field predictions of fuselage surface pressures include the effects of boundary layer refraction and (for a cylinder) scattering. Far-field predictions include atmospheric and ground effects. Experimental data from subsonic and transonic propellers are compared with predictions, and NASA's future directions in propeller noise technology development are indicated.

  9. Predicting the MJO

    NASA Astrophysics Data System (ADS)

    Hendon, H.

    2003-04-01

    Extended range prediction of the Madden Julian Oscillation (MJO) and seasonal prediction of MJO activity are reviewed. Skillful prediction of individual MJO events offers the possibility of forecasting increased risk of cyclone development throughout the global tropics, altered risk of extreme rainfall events in both tropics and extratropics, and displacement of storm tracks with 3-4 week lead times. The level of MJO activity within a season, which affects the mean intensity of the Australian summer monsoon and possibly the evolution of ENSO, may be governed by variations of sea surface temperature that are predictable with lead times of a few seasons. The limit of predictability for individual MJO events is unknown. Empirical-statistical schemes are skillful out to about 3 weeks and have better skill than dynamical forecast models at lead times longer than about 5 days. The dynamical forecast models typically suffer from a poor representation (or complete lack) of the MJO and large initial error. They are better used to ascertain the global impacts of the lack of the MJO rather than for determination of the limit of predictability. Dynamical extended range prediction within a GCM that has a good representation of the MJO indicates potential skill comparable to the empirical schemes. Examples of operational extended range prediction with POAMA, the new coupled seasonal forecast model at the Bureau of Meteorology that also reasonably simulates the MJO, will be presented.

  10. Microcirculation and the Physiome Projects

    PubMed Central

    Bassingthwaighte, James B.

    2010-01-01

    The Physiome projects comprise a loosely knit worldwide effort to define the Physiome through databases and theoretical models, with the goal of better understanding the integrative functions of cells, organs, and organisms. The projects involve developing and archiving models, providing centralized databases, and linking experimental information and models from many laboratories into self-consistent frameworks. Increasingly accurate and complete models that embody quantitative biological hypotheses, adhere to high standards, and are publicly available and reproducible, together with refined and curated data, will enable biological scientists to advance integrative, analytical, and predictive approaches to the study of medicine and physiology. This review discusses the rationale and history of the Physiome projects, the role of theoretical models in the development of the Physiome, and the current status of efforts in this area addressing the microcirculation. PMID:19051119

  11. Crop status evaluations and yield predictions

    NASA Technical Reports Server (NTRS)

    Haun, J. R.

    1976-01-01

    One phase of the large area crop inventory project is presented. Wheat yield models based on the input of environmental variables potentially obtainable through the use of space remote sensing were developed and demonstrated. By the use of a unique method for visually quantifying daily plant development and subsequent multifactor computer analyses, it was possible to develop practical models for predicting crop development and yield. Development of the wheat yield prediction models was based on the discovery that morphological changes in plants can be detected and quantified on a daily basis, and that this change during a portion of the season is proportional to yield.

  12. Evaluating prediction uncertainty

    SciTech Connect

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
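    The variance-ratio indicator described above can be sketched on a toy model; the Latin hypercube sampler, the three-input linear test function, and the binning estimator of Var(E[Y|X])/Var(Y) below are illustrative assumptions rather than McKay's exact replicated-sampling construction.

```python
import random

random.seed(0)

def latin_hypercube(n, k):
    """n samples in k dimensions; each variable is stratified into n bins
    and the bins are randomly permuted per variable."""
    cols = []
    for _ in range(k):
        perm = list(range(n))
        random.shuffle(perm)
        cols.append([(p + random.random()) / n for p in perm])
    return list(zip(*cols))

def variance_ratio(xs, ys, bins=10):
    """Estimate Var(E[Y|X]) / Var(Y) by binning X: an importance indicator
    that does not assume a linear input-output relation."""
    mean_y = sum(ys) / len(ys)
    var_y = sum((y - mean_y) ** 2 for y in ys) / len(ys)
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(min(int(x * bins), bins - 1), []).append(y)
    var_cond = sum(len(g) * (sum(g) / len(g) - mean_y) ** 2
                   for g in groups.values()) / len(ys)
    return var_cond / var_y

# Hypothetical model: the output is dominated by the first input.
samples = latin_hypercube(2000, 3)
ys = [10.0 * x1 + x2 + 0.1 * x3 for x1, x2, x3 in samples]
r1 = variance_ratio([s[0] for s in samples], ys)
r3 = variance_ratio([s[2] for s in samples], ys)
```

    Here the ratio for the first input is near 1 and for the third input near 0, identifying the first input as the dominant cause of prediction uncertainty.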

  13. Error mode prediction.

    PubMed

    Hollnagel, E; Kaarstad, M; Lee, H C

    1999-11-01

    The study of accidents ('human errors') has been dominated by efforts to develop 'error' taxonomies and 'error' models that enable the retrospective identification of likely causes. In the field of Human Reliability Analysis (HRA) there is, however, a significant practical need for methods that can predict the occurrence of erroneous actions--qualitatively and quantitatively. The present experiment tested an approach for qualitative performance prediction based on the Cognitive Reliability and Error Analysis Method (CREAM). Predictions of possible erroneous actions were made for operators using different types of alarm systems. The data were collected as part of a large-scale experiment using professional nuclear power plant operators in a full scope simulator. The analysis showed that the predictions were correct in more than 70% of the cases, and also that the coverage of the predictions depended critically on the comprehensiveness of the preceding task analysis. PMID:10582035

  14. Mars solar conjunction prediction modeling

    NASA Astrophysics Data System (ADS)

    Srivastava, Vineet K.; Kumar, Jai; Kulshrestha, Shivali; Kushvah, Badam Singh

    2016-01-01

    During the Mars solar conjunction, telecommunication and tracking between the spacecraft and the Earth degrade significantly. The radio signal degradation depends on the Sun-Earth-probe (SEP) angle, the signal frequency band and the solar activity. All radiometric tracking data types display increased noise and signatures at smaller SEP angles. Due to scintillation, telemetry frame errors increase significantly when the solar elongation becomes small enough. This degradation in telemetry data return starts at solar elongation angles of around 5° at S-band, around 2° at X-band and about 1° at Ka-band. This paper presents a mathematical model for predicting Mars superior solar conjunction for any Mars-orbiting spacecraft. The described model is simulated for the Mars Orbiter Mission, which experienced Mars solar conjunction during May-July 2015. Such a model may be useful to flight projects and design engineers in planning Mars solar conjunction operational scenarios.
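    The geometry underlying such a prediction can be sketched by computing the Sun-Earth-probe (SEP) angle from position vectors and comparing it against the band-dependent onset elongations quoted above; the position vectors below are hypothetical, and a real model would use ephemerides.

```python
import math

def sep_angle_deg(sun, earth, probe):
    """Sun-Earth-probe angle: the angle at Earth between the directions to
    the Sun and to the spacecraft (positions in any consistent frame/unit)."""
    es = [s - e for s, e in zip(sun, earth)]
    ep = [p - e for p, e in zip(probe, earth)]
    dot = sum(a * b for a, b in zip(es, ep))
    norm = math.sqrt(sum(a * a for a in es)) * math.sqrt(sum(a * a for a in ep))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Onset elongations for telemetry degradation quoted in the abstract (degrees).
THRESHOLD_DEG = {"S": 5.0, "X": 2.0, "Ka": 1.0}

def degraded(sep_deg, band):
    return sep_deg < THRESHOLD_DEG[band]

# Hypothetical heliocentric geometry (AU): Mars nearly behind the Sun.
sun, earth, mars = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (-1.5, 0.05, 0.0)
sep = sep_angle_deg(sun, earth, mars)
```

    For this geometry the SEP angle is a little over 1°, so S- and X-band telemetry would be flagged as degraded while Ka-band sits just outside its threshold.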

  15. RESOLVE Project

    NASA Technical Reports Server (NTRS)

    Parker, Ray; Coan, Mary; Cryderman, Kate; Captain, Janine

    2013-01-01

    The RESOLVE project is a lunar prospecting mission whose primary goal is to characterize water and other volatiles in lunar regolith. The Lunar Advanced Volatiles Analysis (LAVA) subsystem comprises a fluid subsystem that transports flow to the gas chromatograph-mass spectrometer (GC-MS) instruments that characterize volatiles and the Water Droplet Demonstration (WDD) that will capture and display water condensation in the gas stream. The LAVA Engineering Test Unit (ETU) is undergoing risk reduction testing this summer and fall within a vacuum chamber to understand and characterize component and integrated system performance. Testing of line heaters, printed circuit heaters, pressure transducers, temperature sensors, regulators, and valves in atmospheric and vacuum environments was performed. Test procedures were developed to guide the experimental tests, and test reports were written to analyze and draw conclusions from the data. In addition, knowledge and experience were gained in preparing a vacuum chamber with fluid and electrical connections. Further testing will include integrated testing of the fluid subsystem with the gas supply system, near-infrared spectrometer, WDD, Sample Delivery System, and GC-MS in the vacuum chamber. This testing will provide hands-on exposure to a flight-forward spaceflight subsystem, the processes associated with testing equipment in a vacuum chamber, and experience working in a laboratory setting. Examples of specific analyses conducted include: pneumatic analysis to calculate the WDD's efficiency at extracting water vapor from the gas stream to form condensation; thermal analysis of the conduction and radiation along a line connecting two thermal masses; and proportional-integral-derivative (PID) heater control analysis. Since LAVA is a scientific subsystem, the near-infrared spectrometer and GC-MS instruments will be tested during the ETU testing phase.
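    The PID heater-control analysis mentioned above can be illustrated with a minimal discrete-time control loop; the first-order thermal plant, gains, setpoint, and ambient temperature are hypothetical stand-ins, not LAVA ETU values.

```python
def simulate_pid(setpoint, kp, ki, kd, steps=1500, dt=0.1):
    """Discrete PID loop driving a first-order thermal plant (hypothetical
    gains and plant constants, for illustration only)."""
    temp, integral = 20.0, 0.0          # start at 20 C ambient
    prev_err = setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        power = max(0.0, kp * err + ki * integral + kd * deriv)  # heater only heats
        # Plant: heating from applied power, first-order loss to ambient.
        temp += dt * (0.5 * power - 0.1 * (temp - 20.0))
        prev_err = err
    return temp

final_temp = simulate_pid(setpoint=80.0, kp=1.0, ki=0.2, kd=0.05)
```

    The loop settles near the 80 °C setpoint; the max(0, ...) clamp reflects that a heater can add heat but not remove it.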

  16. Predictable Books: Captivating Young Readers.

    ERIC Educational Resources Information Center

    Luckner, John

    1990-01-01

    Because prediction plays such a vital role in reading comprehension, predictable books are essential in the teaching of beginning readers. Prediction involves a three-step cycle: sampling, predicting, and confirming. Steps in using predictable books with hearing-impaired students are outlined, and a list of predictable and repetitive books is…

  17. The Materials Genome Project

    NASA Astrophysics Data System (ADS)

    Aourag, H.

    2008-09-01

    In the past, the search for new and improved materials was characterized mostly by the use of empirical, trial-and-error methods. This picture of materials science has been changing as the knowledge and understanding of fundamental processes governing a material's properties and performance (namely, composition, structure, history, and environment) have increased. In a number of cases, it is now possible to predict a material's properties before it has even been manufactured, thus greatly reducing the time spent on testing and development. The objective of modern materials science is to tailor a material (starting with its chemical composition, constituent phases, and microstructure) in order to obtain a desired set of properties suitable for a given application. In the short term, the traditional "empirical" methods for developing new materials will be complemented to a greater degree by theoretical predictions. In some areas, computer simulation is already used by industry to weed out costly or improbable synthesis routes. Can novel materials with optimized properties be designed by computers? Advances in modelling methods at the atomic level, coupled with rapid increases in computer capabilities over the last decade, have led scientists to answer this question with a resounding "yes". The ability to design new materials from quantum mechanical principles with computers is currently one of the fastest growing and most exciting areas of theoretical research in the world. The methods allow scientists to evaluate and prescreen new materials "in silico", rather than through time-consuming experimentation. The Materials Genome Project pursues the theory of large-scale modeling as well as powerful methods to construct new materials with optimized properties.
Indeed, it is the intimate synergy between our ability to predict accurately from quantum theory how atoms can be assembled to form new materials and our capacity to synthesize novel materials atom

  18. Prediction of bull fertility.

    PubMed

    Utt, Matthew D

    2016-06-01

    Prediction of male fertility is an often sought-after endeavor for many species of domestic animals. This review will primarily focus on providing some examples of dependent and independent variables to stimulate thought about the approach and methodology of identifying the most appropriate of those variables to predict bull (bovine) fertility. Although the list of variables will continue to grow with advancements in science, the principles behind making predictions will likely not change significantly. The basic principle of prediction requires identifying a dependent variable that is an estimate of fertility and an independent variable or variables that may be useful in predicting the fertility estimate. Fertility estimates vary in which parts of the process leading to conception they reflect, in the amount of variation that influences the estimate, and in the uncertainty thereof. The list of potential independent variables can be divided into measures of sperm competence based on performance in bioassays and direct measurements of sperm attributes. A good prediction will use a sample population of bulls that is representative of the population to which an inference will be made. Both dependent and independent variables should have a dynamic range in their values. Careful selection of independent variables includes reasonable measurement repeatability and minimal correlation among variables. Proper estimation and an appreciation of the degree of uncertainty of dependent and independent variables are crucial for using predictions to make decisions regarding bull fertility. PMID:26791329

  19. Prediction of alumina penetration

    SciTech Connect

    Mandell, D A

    1993-02-01

    The MESA hydrocode was used to predict two-dimensional tests of L/D 10 and L/D 15 tungsten rods impacting AD 90 alumina with a steel backing. The residual penetration into the steel is the measured quantity in these experiments, conducted at the Southwest Research Institute (SwRI). The interface velocity as a function of time between an alumina target and a lithium fluoride window, impacted by an alumina disk at velocities between 544 m/s and 2329 m/s, was also predicted. These one-dimensional flyer plate experiments were conducted at Sandia National Laboratories using Coors AD 995 alumina. The material strength and fracture models are important in the prediction of ceramic experiments. The models used in these predictions are discussed. The penetrations in the two-dimensional tests were predicted to within 11.4 percent or better. In five of the six experiments, the predicted penetration depth was deeper than the measured value. This trend is expected since the calculation is based on ideal conditions. The results show that good agreement between the 1-D flyer plate data and the MESA predictions exists at the lower impact velocities, but the maximum velocity is overpredicted as the flyer plate velocity increases. At a flyer plate velocity of 2329 m/s, the code overpredicted the data by 12.3 percent.

  20. Projection in surrogate decisions about life-sustaining medical treatments.

    PubMed

    Fagerlin, A; Ditto, P H; Danks, J H; Houts, R M; Smucker, W D

    2001-05-01

    To honor the wishes of an incapacitated patient, surrogate decision makers must predict the treatment decisions patients would make for themselves if able. Social psychological research, however, suggests that surrogates' own treatment preferences may influence their predictions of others' preferences. In 2 studies (1 involving 60 college student surrogates and a parent, the other involving 361 elderly outpatients and their chosen surrogate decision maker), surrogates predicted whether a close other would want life-sustaining treatment in hypothetical end-of-life scenarios and stated their own treatment preferences in the same scenarios. Surrogate predictions more closely resembled surrogates' own treatment wishes than they did the wishes of the individual they were trying to predict. Although the majority of prediction errors reflected inaccurate use of surrogates' own treatment preferences, projection was also found to result in accurate prediction more often than counterprojective predictions. The rationality and accuracy of projection in surrogate decision making is discussed. PMID:11403214

  1. Project summary

    NASA Technical Reports Server (NTRS)

    1991-01-01

    California Polytechnic State University's design project for the 1990-91 school year was the design of a close air support aircraft. There were eight design groups that participated and were given requests for proposals. These proposals contained mission specifications, particular performance and payload requirements, as well as the main design drivers. The mission specifications called for a single pilot weighing 225 lb with equipment. The design mission profile consisted of the following: (1) warm-up, taxi, take off, and accelerate to cruise speed; (2) dash at sea level at 500 knots to a point 250 nmi from take off; (3) combat phase, requiring two combat passes at 450 knots that each consist of a 360 deg turn and an energy increase of 4000 ft. - at each pass, half of air-to-surface ordnance is released; (4) dash at sea level at 500 knots 250 nmi back to base; and (5) land with 20 min of reserve fuel. The request for proposal also specified the following performance requirements with 50 percent internal fuel and standard stores: (1) the aircraft must be able to accelerate from Mach 0.3 to 0.5 at sea level in less than 20 sec; (2) required turn rates are 4.5 sustained g at 450 knots at sea level; (3) the aircraft must have a reattack time of 25 sec or less (reattack time was defined as the time between the first and second weapon drops); (4) the aircraft is allowed a maximum take off and landing ground roll of 2000 ft. The payload requirements were 20 Mk 82 general-purpose free-fall bombs and racks; 1 GAU-8A 30-mm cannon with 1350 rounds; and 2 AIM-9L Sidewinder missiles and racks. The main design drivers expressed in the request for proposal were that the aircraft should be survivable and maintainable. It must be able to operate in remote areas with little or no maintenance. Simplicity was considered the most important factor in achieving the former goal. In addition, the aircraft must be low cost both in acquisition and operation. The summaries of the aircraft

  2. RESOLVE Project

    NASA Technical Reports Server (NTRS)

    Parker, Ray O.

    2012-01-01

    The RESOLVE project is a lunar prospecting mission whose primary goal is to characterize water and other volatiles in lunar regolith. The Lunar Advanced Volatiles Analysis (LAVA) subsystem is comprised of a fluid subsystem that transports flow to the gas chromatograph-mass spectrometer (GC-MS) instruments that characterize volatiles and the Water Droplet Demonstration (WDD) that will capture and display water condensation in the gas stream. The LAVA Engineering Test Unit (ETU) is undergoing risk reduction testing this summer and fall within a vacuum chamber to understand and characterize component and integrated system performance. Ray will be assisting with component testing of line heaters, printed circuit heaters, pressure transducers, temperature sensors, regulators, and valves in atmospheric and vacuum environments. He will be developing procedures to guide these tests and test reports to analyze and draw conclusions from the data. In addition, he will gain experience with preparing a vacuum chamber with fluid and electrical connections. Further testing will include integrated testing of the fluid subsystem with the gas supply system, near-infrared spectrometer, WDD, Sample Delivery System, and GC-MS in the vacuum chamber. This testing will provide hands-on exposure to a flight forward spaceflight subsystem, the processes associated with testing equipment in a vacuum chamber, and experience working in a laboratory setting. Examples of specific analysis Ray will conduct include: pneumatic analysis to calculate the WDD's efficiency at extracting water vapor from the gas stream to form condensation; thermal analysis of the conduction and radiation along a line connecting two thermal masses; and proportional-integral-derivative (PID) heater control analysis. In this Research and Technology environment, Ray will be asked to problem solve real-time as issues arise. 
Since LAVA is a scientific subsystem, Ray will be utilizing his chemical engineering background to

  3. Predictive Modeling of Cardiac Ischemia

    NASA Technical Reports Server (NTRS)

    Anderson, Gary T.

    1996-01-01

    The goal of the Contextual Alarms Management System (CALMS) project is to develop sophisticated models to predict the onset of clinical cardiac ischemia before it occurs. The system will continuously monitor cardiac patients and set off an alarm when they appear about to suffer an ischemic episode. The models take as inputs information from patient history and combine it with continuously updated information extracted from blood pressure, oxygen saturation and ECG lines. Expert system, statistical, neural network and rough set methodologies are then used to forecast the onset of clinical ischemia before it transpires, thus allowing early intervention aimed at preventing morbid complications from occurring. The models will differ from previous attempts by including combinations of continuous and discrete inputs. A commercial medical instrumentation and software company has invested funds in the project with a goal of commercialization of the technology. The end product will be a system that analyzes physiologic parameters and produces an alarm when myocardial ischemia is present. If proven feasible, a CALMS-based system will be added to existing heart monitoring hardware.

  4. De Novo Protein Structure Prediction

    NASA Astrophysics Data System (ADS)

    Hung, Ling-Hong; Ngan, Shing-Chung; Samudrala, Ram

    An unparalleled amount of sequence data is being made available from large-scale genome sequencing efforts. The data provide a shortcut to the determination of the function of a gene of interest, as long as there is an existing sequenced gene with similar sequence and of known function. This has spurred structural genomic initiatives with the goal of determining as many protein folds as possible (Brenner and Levitt, 2000; Burley, 2000; Brenner, 2001; Heinemann et al., 2001). The purpose of this is twofold: First, the structure of a gene product can often lead to direct inference of its function. Second, since the function of a protein is dependent on its structure, direct comparison of the structures of gene products can be more sensitive than the comparison of sequences of genes for detecting homology. Presently, structural determination by crystallography and NMR techniques is still slow and expensive in terms of manpower and resources, despite attempts to automate the processes. Computer structure prediction algorithms, while not providing the accuracy of the traditional techniques, are extremely quick and inexpensive and can provide useful low-resolution data for structure comparisons (Bonneau and Baker, 2001). Given the immense number of structures which the structural genomic projects are attempting to solve, there would be a considerable gain even if the computer structure prediction approach were applicable to a subset of proteins.

  5. Surprise beyond prediction error

    PubMed Central

    Chumbley, Justin R; Burke, Christopher J; Stephan, Klaas E; Friston, Karl J; Tobler, Philippe N; Fehr, Ernst

    2014-01-01

    Surprise drives learning. Various neural “prediction error” signals are believed to underpin surprise-based reinforcement learning. Here, we report a surprise signal that reflects reinforcement learning but is neither un/signed reward prediction error (RPE) nor un/signed state prediction error (SPE). To exclude these alternatives, we measured surprise responses in the absence of RPE and accounted for a host of potential SPE confounds. This new surprise signal was evident in ventral striatum, primary sensory cortex, frontal poles, and amygdala. We interpret these findings via a normative model of surprise. PMID:24700400

  6. Rocket Noise Prediction Program

    NASA Technical Reports Server (NTRS)

    Margasahayam, Ravi; Caimi, Raoul

    1999-01-01

    A comprehensive, automated, and user-friendly software program was developed to predict the noise and ignition over-pressure environment generated during the launch of a rocket. The software allows for interactive modification of various parameters affecting the generated noise environment. Predictions can be made for different launch scenarios and a variety of vehicle and launch mount configurations. Moreover, predictions can be made for both near-field and far-field locations on the ground and any position on the vehicle. Multiple engine and fuel combinations can be addressed, and duct geometry can be incorporated efficiently. Applications in structural design are addressed.

  7. Wind power prediction models

    NASA Technical Reports Server (NTRS)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
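    A simple version of the interim model's uncorrelated hourly sampling can be sketched by drawing speeds from a Weibull distribution and passing them through a turbine power curve; the shape and scale parameters and the power-curve numbers below are hypothetical, not fitted to the Goldstone records.

```python
import math
import random

random.seed(1)

def weibull_speed(shape=2.0, scale=7.0):
    """Inverse-CDF draw from a Weibull distribution (shape=2 is the
    Rayleigh case often assumed for wind-speed statistics)."""
    u = random.random()
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

def turbine_power(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_p=100.0):
    """Idealized power curve (kW): cubic rise from cut-in to rated speed."""
    if v < cut_in or v > cut_out:
        return 0.0
    if v >= rated_v:
        return rated_p
    return rated_p * ((v - cut_in) / (rated_v - cut_in)) ** 3

# Uncorrelated hourly samples, as in the interim model described above.
speeds = [weibull_speed() for _ in range(10000)]
mean_speed = sum(speeds) / len(speeds)
mean_power = sum(turbine_power(v) for v in speeds) / len(speeds)
```

    A stochastic model of the kind mentioned last would additionally correlate successive hourly draws, for example with an autoregressive filter, rather than sampling independently.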

  8. Coating Life Prediction

    NASA Technical Reports Server (NTRS)

    Nesbitt, J. A.; Gedwill, M. A.

    1984-01-01

    Hot-section gas-turbine components typically require some form of coating for oxidation and corrosion protection. Efficient use of coatings requires reliable and accurate predictions of the protective life of the coating. Currently, engine inspections and component replacements are often made on a conservative basis. As a result, there is a constant need to improve and develop the life-prediction capability of metallic coatings for use in various service environments. The present work is aimed at developing an improved methodology for predicting metallic coating lives in an oxidizing environment and in a corrosive environment.

  9. Geothermal Reservoir Technology Research Program: Abstracts of selected research projects

    SciTech Connect

    Reed, M.J.

    1993-03-01

    Research projects are described in the following areas: geothermal exploration, mapping reservoir properties and reservoir monitoring, and well testing, simulation, and predicting reservoir performance. The objectives, technical approach, and project status of each project are presented. The background, research results, and future plans for each project are discussed. The names, addresses, and telephone and telefax numbers are given for the DOE program manager and the principal investigators. (MHR)

  10. Predicting Aircraft Noise Levels

    NASA Technical Reports Server (NTRS)

    Clark, B. J.

    1983-01-01

    Computer program developed for predicting aircraft noise levels either in flight or in ground tests. Noise sources include fan inlet and exhaust, jet, flap (for powered lift), core (combustor), turbine, and airframe. Program written in FORTRAN IV.
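    While the program's internals are not described here, combining incoherent component source levels into a total is conventionally done on an energy basis; the component levels below are hypothetical.

```python
import math

def combine_spl(levels_db):
    """Energy sum of incoherent sources: L_total = 10*log10(sum 10^(Li/10))."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in levels_db))

# Hypothetical component levels (dB) at one observer position.
components = {"fan inlet/exhaust": 92.0, "jet": 95.0, "core": 88.0,
              "turbine": 85.0, "airframe": 80.0}
total = combine_spl(list(components.values()))
```

    Two equal 90 dB sources combine to about 93 dB, and the total is always at least as loud as the loudest single component.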

  11. Predicting Population Curves.

    ERIC Educational Resources Information Center

    Bunton, Matt

    2003-01-01

    Uses graphs to involve students in inquiry-based population investigations on the Wisconsin gray wolf. Requires students to predict future changes in the wolf population, carrying capacity, and deer population. (YDS)

  12. Chapter VII. Predicting Fertility

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Section 2. Visual and Microscopic Approaches for Differentiating Unfertilized Germinal Discs and Early Dead Embryos from Pre-Incubated Blastoderms. Section 3. Predicting the Duration of Fertility by Counting Sperm in the Outer Perivitelline Layer of Laid Eggs...

  13. Membrane Protein Prediction Methods

    PubMed Central

    Punta, Marco; Forrest, Lucy R.; Bigelow, Henry; Kernytsky, Andrew; Liu, Jinfeng; Rost, Burkhard

    2007-01-01

    We survey computational approaches that tackle membrane protein structure and function prediction. While describing the main ideas that have led to the development of the most relevant and novel methods, we also discuss pitfalls, provide practical hints and highlight the challenges that remain. The methods covered include: sequence alignment, motif search, functional residue identification, transmembrane segment and protein topology predictions, homology and ab initio modeling. Overall, predictions of functional and structural features of membrane proteins are improving, although progress is hampered by the limited amount of high-resolution experimental information available. While predictions of transmembrane segments and protein topology rank among the most accurate methods in computational biology, more attention and effort will be required in the future to ameliorate database search, homology and ab initio modeling. PMID:17367718

  14. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  15. Prediction of airframe noise

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.; Fratello, D. J.; Hayden, R. E.; Kadman, Y.; Africk, S.

    1975-01-01

    Methods of predicting airframe noise generated by aircraft in flight under nonpowered conditions are discussed. Approaches to prediction relying on flyover data and component theoretical analyses are developed. A nondimensional airframe noise spectrum of various aircraft is presented. The spectrum was obtained by smoothing all the measured spectra to remove any peculiarities due to airframe protrusions, normalizing each spectrum by its overall sound pressure level and a characteristic frequency, and averaging the spectra together. A chart of airframe noise sources is included.

  16. The Arctic Predictability and Prediction on Seasonal-to-Interannual TimEscales (APPOSITE) data set version 1

    NASA Astrophysics Data System (ADS)

    Day, Jonathan J.; Tietsche, Steffen; Collins, Mat; Goessling, Helge F.; Guemas, Virginie; Guillory, Anabelle; Hurlin, William J.; Ishii, Masayoshi; Keeley, Sarah P. E.; Matei, Daniela; Msadek, Rym; Sigmond, Michael; Tatebe, Hiroaki; Hawkins, Ed

    2016-06-01

    Recent decades have seen significant developments in climate prediction capabilities at seasonal-to-interannual timescales. However, until recently the potential of such systems to predict Arctic climate had rarely been assessed. This paper describes a multi-model predictability experiment which was run as part of the Arctic Predictability and Prediction On Seasonal to Interannual Timescales (APPOSITE) project. The main goal of APPOSITE was to quantify the timescales on which Arctic climate is predictable. In order to achieve this, a coordinated set of idealised initial-value predictability experiments, with seven general circulation models, was conducted. This was the first model intercomparison project designed to quantify the predictability of Arctic climate on seasonal to interannual timescales. Here we present a description of the archived data set (which is available at the British Atmospheric Data Centre), an assessment of Arctic sea ice extent and volume predictability estimates in these models, and an investigation of the extent to which predictability depends on the initial state. The inclusion of additional models expands the range of sea ice volume and extent predictability estimates, demonstrating that there is model diversity in the potential to make seasonal-to-interannual timescale predictions. We also investigate whether sea ice forecasts started from extremely high and low sea ice initial states exhibit higher levels of potential predictability than forecasts started from close to the models' mean state, and find that the result depends on the metric. Although designed to address Arctic predictability, we describe the archived data here so that others can use this data set to assess the predictability of other regions and modes of climate variability on these timescales, such as the El Niño-Southern Oscillation.

  17. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state of the art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
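    The gap-filling core that BHPMF extends, plain probabilistic matrix factorization, can be sketched as gradient descent on the observed entries of a sparse matrix; the tiny trait matrix, rank, and learning rate below are hypothetical, and BHPMF's taxonomic hierarchy and uncertainty machinery are omitted.

```python
import random

random.seed(0)

def pmf_fill(matrix, rank=1, lr=0.01, reg=0.02, epochs=500):
    """SGD matrix factorization on observed entries only (None = missing).
    This is plain PMF; BHPMF additionally exploits the taxonomic hierarchy."""
    n, m = len(matrix), len(matrix[0])
    U = [[random.gauss(0.0, 0.1) for _ in range(rank)] for _ in range(n)]
    V = [[random.gauss(0.0, 0.1) for _ in range(rank)] for _ in range(m)]
    obs = [(i, j, matrix[i][j]) for i in range(n) for j in range(m)
           if matrix[i][j] is not None]
    for _ in range(epochs):
        for i, j, x in obs:
            err = x - sum(U[i][k] * V[j][k] for k in range(rank))
            for k in range(rank):
                u, v = U[i][k], V[j][k]
                U[i][k] += lr * (err * v - reg * u)
                V[j][k] += lr * (err * u - reg * v)
    return [[sum(U[i][k] * V[j][k] for k in range(rank)) for j in range(m)]
            for i in range(n)]

# Hypothetical sparse species-by-trait matrix with rank-1 structure.
traits = [[1.0, 2.0, None],
          [2.0, None, 6.0],
          [3.0, 6.0, 9.0]]
filled = pmf_fill(traits)
```

    Because the observed entries follow a rank-1 pattern, the two missing cells are imputed near 3 and 4, consistent with the rows and columns around them.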

  18. Sensor image prediction techniques

    NASA Astrophysics Data System (ADS)

    Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.

    1981-02-01

    The preparation of prediction imagery is a complex, costly, and time-consuming process. Image prediction systems which produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks he performs during navigation are identical to those performed for the targeting mission function. In addition, the results of the analysis of his performance when using a particular sensor can be extended to the analysis of his mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.

  19. Operational Dust Prediction

    NASA Technical Reports Server (NTRS)

    Benedetti, Angela; Baldasano, Jose M.; Basart, Sara; Benincasa, Francesco; Boucher, Olivier; Brooks, Malcolm E.; Chen, Jen-Ping; Colarco, Peter R.; Gong, Sunlin; Huneeus, Nicolas; Jones, Luke; Lu, Sarah; Menut, Laurent; Morcrette, Jean-Jacques; Mulcahy, Jane; Nickovic, Slobodan; Garcia-Pando, Carlos P.; Reid, Jeffrey S.; Sekiyama, Thomas T.; Tanaka, Taichu Y.; Terradellas, Enric; Westphal, Douglas L.; Zhang, Xiao-Ye; Zhou, Chun-Hong

    2014-01-01

    Over the last few years, numerical prediction of dust aerosol concentration has become prominent at several research and operational weather centres due to growing interest from diverse stakeholders, such as solar energy plant managers, health professionals, aviation and military authorities and policymakers. Dust prediction in numerical weather prediction-type models faces a number of challenges owing to the complexity of the system. At the centre of the problem is the vast range of scales required to fully account for all of the physical processes related to dust. Another limiting factor is the paucity of suitable dust observations available for model evaluation and assimilation. This chapter discusses in detail numerical prediction of dust with examples from systems that are currently providing dust forecasts in near real-time or are part of international efforts to establish daily provision of dust forecasts based on multi-model ensembles. The various models are introduced and described along with an overview of the importance of dust prediction activities and a historical perspective. Assimilation and evaluation aspects in dust prediction are also discussed.

  20. Prediction of Microporosity in Shrouded Impeller Castings

    SciTech Connect

    Viswanathan, S. Nelson, C.D.

    1998-09-01

    The purpose of this Cooperative Research and Development Agreement (CRADA) between the Oak Ridge National Laboratory (ORNL) and Morris Bean and Company was to link computer models of heat and fluid flow with previously developed quality criteria for the prediction of microporosity in a Al-4.5% Cu alloy shrouded impeller casting. The results may be used to analyze the casting process design for the commercial production of 206.0 alloy shrouded impeller castings. Test impeller castings were poured in the laboratory for the purpose of obtaining thermal data and porosity distributions. Also, a simulation of the test impeller casting was conducted and the results validated with porosity measurements on the test castings. A comparison of the predicted and measured microporosity distributions indicated an excellent correlation between experiments and prediction. The results of the experimental and modeling studies undertaken in this project indicate that the quality criteria developed for the prediction of microporosity in Al-4.5% Cu alloy castings can accurately predict regions of elevated microporosity even in complex castings such as the shrouded impeller casting. Accordingly, it should be possible to use quality criteria for porosity prediction in conjunction with computer models of heat and fluid flow to optimize the casting process for the production of shrouded impeller castings. Since high levels of microporosity may be expected to result in poor fatigue properties, casting designs that are optimized for low levels of microporosity should exhibit superior fatigue life.

  1. Integrated Project Management System description. [UMTRAP Project

    SciTech Connect

    Not Available

    1987-03-01

    The Uranium Mill Tailings Remedial Action (UMTRA) Project is a Department of Energy (DOE) designated Major System Acquisition (MSA). To execute and manage the Project mission successfully and to comply with the MSA requirements, the UMTRA Project Office ("Project Office") has implemented and operates an Integrated Project Management System (IPMS). The Project Office is assisted by the Technical Assistance Contractor's (TAC) Project Integration and Control (PIC) Group in system operation. Each participant, in turn, provides critical input to system operation and reporting requirements. The IPMS provides a uniform structured approach for integrating the work of Project participants. It serves as a tool for planning and control, workload management, performance measurement, and specialized reporting within a standardized format. This system description presents the guidance for its operation. Appendices 1 and 2 contain definitions of commonly used terms and abbreviations and acronyms, respectively. 17 figs., 5 tabs.

  2. The Hairy Head Project.

    ERIC Educational Resources Information Center

    Gallick, Barbara

    A class of 3- to 6-year-old children in a Midwestern child care center chose to study hair and hairstyling salons as a group project. This article discusses how the project evolved, describes the three phases of the project, and provides the teacher's reflections on the project. Photos taken during the project are included. (Author)

  3. Analysis of Variables: Predicting Sophomore Persistence Using Logistic Regression Analysis at the University of South Florida

    ERIC Educational Resources Information Center

    Miller, Thomas E.; Herreid, Charlene H.

    2009-01-01

    This is the fifth in a series of articles describing an attrition prediction and intervention project at the University of South Florida (USF) in Tampa. The project was originally presented in the 83(2) issue (Miller 2007). The statistical model for predicting attrition was described in the 83(3) issue (Miller and Herreid 2008). The methods and…

  4. Fine-Tuning Dropout Prediction through Discriminant Analysis: The Ethnic Factor.

    ERIC Educational Resources Information Center

    Wilkinson, L. David; Frazer, Linda H.

    In the 1988-89 school year, the Austin (Texas) Independent School District's Office of Research and Evaluation undertook a new dropout research project. Part of this initiative, termed Project GRAD, attempted to develop a statistical equation by which one could predict which students were likely to drop out. If reliable predictive information…

  5. Decadal Prediction Experiments using EC-EARTH

    NASA Astrophysics Data System (ADS)

    Wouters, Bert; Hazeleger, Wilco; van Oldenborgh, Geert Jan

    2010-05-01

    We present the first results of decadal prediction experiments with EC-EARTH 2.1 in the framework of the EU-THOR project. This model consists of the ECMWF IFSc31 model at T159/L62 resolution, the NEMO2 ocean model at 1° resolution and the LIM2 sea-ice model. The purpose is to predict decadal variability in the Atlantic Ocean and the resulting predictability in the weather at these time scales. As expected, the model shows a bias in the first years due to the initialization shock from the full initial state (NEMOVAR), but stabilizes afterwards. The bias and first estimates of the skill are shown for a partial CMIP5 ensemble, covering ocean, surface and atmospheric components of the coupled model.

  6. Ozone dosimetry predictions for humans and rats

    SciTech Connect

    Overton, J.H.; Graham, R.C.; McCurdy, T.R.; Richmond, H.M.

    1990-11-01

    The report summarizes ozone (O3) dosimetry model predictions for rats and humans under several different scenarios based on the most recent empirical data and theoretical considerations in the field of O3 dosimetry. The report was prepared at the request of the Office of Air Quality Planning and Standards (OAQPS) as an input to be considered by scientists participating in a chronic lung injury risk assessment project for O3. As indicated in the report a number of judgments and assumptions had to be made to obtain the dosimetry predictions. In addition to presenting the simulation results, the O3 dosimetry model used to make the predictions is discussed and the choice or method of selecting important physiological parameters explained. This includes anatomical dimensions, choices of rat and human ventilatory parameters, and the method of estimating human and rat upper respiratory tract uptake. Finally, a comparison of simulation results to recent experimental dosimetry results is discussed.

  7. Project CREST, Gainesville, Florida. An Exemplary Project.

    ERIC Educational Resources Information Center

    DeJong, William; Stewart, Carolyn

    This manual describes Project CREST (Clinical Regional Support Teams), a community project established in Gainesville, Florida, to supplement State probation services by providing professional counseling to delinquent youth. The project uses a dual treatment approach in which, on the one hand, probation officers impose restrictions, while on the…

  8. Managing Projects for Change: Contextualised Project Management

    ERIC Educational Resources Information Center

    Tynan, Belinda; Adlington, Rachael; Stewart, Cherry; Vale, Deborah; Sims, Rod; Shanahan, Peter

    2010-01-01

    This paper will detail three projects which focussed on enhancing online learning at a large Australian distance education University within a School of Business, School of Health and School of Education. Each project had special funding and took quite distinctive project management approaches, which reflect the desire to embed innovation and…

  9. Project Panama: An International Service Project

    ERIC Educational Resources Information Center

    Aydlett, Lydia; Randolph, Mickey; Wells, Gayle

    2010-01-01

    Participation in service learning projects is a growing phenomenon at universities and colleges. Research indicates service projects are beneficial for college students and adults. There is little data investigating developmental differences in how younger versus older participants perceive the service learning process. In this project, older…

  10. Cytomics in predictive medicine

    NASA Astrophysics Data System (ADS)

    Tarnok, Attila; Valet, Guenther K.

    2004-07-01

    Predictive Medicine aims at the detection of changes in a patient's disease state prior to the manifestation of deterioration or improvement of the current status. Patient-specific, disease-course predictions with >95% or >99% accuracy during therapy would be highly valuable for everyday medicine. If these predictors were available, disease aggravation or progression, frequently accompanied by irreversible tissue damage or therapeutic side effects, could then potentially be avoided by early preventive therapy. The molecular analysis of heterogeneous cellular systems (Cytomics) by cytometry in conjunction with pattern-oriented bioinformatic analysis of the multiparametric cytometric and other data provides a promising approach to individualized or personalized medical treatment or disease management. Predictive medicine is best implemented by cell-oriented measurements e.g. by flow or image cytometry. Cell-oriented gene or protein arrays as well as bead arrays for the capture of solute molecules from serum, plasma, urine or liquor are equally of high value. Clinical applications of predictive medicine by Cytomics will include multi organ failure in sepsis or non infectious posttraumatic shock in intensive care, or the pretherapeutic identification of high risk patients in cancer cytostatic therapy. Early individualized therapy may provide better survival chances for individual patients with concomitant cost containment. Predictive medicine guided early reduction or stop of therapy may lower or abrogate potential therapeutic side effects. Further important aspects of predictive medicine concern the preoperative identification of patients with a tendency for postoperative complications or coronary artery disease patients with an increased tendency for restenosis. As a consequence, better patient care and new forms of inductive scientific hypothesis development based on the interpretation of predictive data patterns are within reach.

  11. On the prediction of GLE events

    NASA Astrophysics Data System (ADS)

    Nunez, Marlon; Reyes, Pedro

    2016-04-01

    A model for predicting the occurrence of GLE events is presented. This model uses the UMASEP scheme based on the lag-correlation between the time derivatives of soft X-ray flux (SXR) and near-earth proton fluxes (Núñez, 2011, 2015). We extended this approach with the correlation between SXR and ground-level neutron measurements. This model was calibrated with X-ray, proton and neutron data obtained during the period 1989-2015 from the GOES/HEPAD instrument, and neutron data from the Neutron Monitor Data Base (NMDB). During this period, 32 GLE events were detected by neutron monitor stations. We consider that a GLE prediction is successful when it is triggered before the first GLE alert is issued by any neutron station of the NMDB network. For the most recent 16 years (2000-2015), the model was able to issue successful predictions for 53.8% of the GLE events (7 of 13), obtaining a false alarm ratio (FAR) of 36.4% (4/11), and an average warning time (AWT) of 10 min. For the first years of the evaluation period (1989-1999), the model was able to issue successful predictions for 31.6% of the GLE events (6 of 19), obtaining a FAR of 33.3% (3/9), and an AWT of 17 min. A preliminary conclusion is that the model is not able to predict the promptest events but the more gradual ones. The final goal of this project, which is now halfway through its planned two-year duration, is the prediction of >500 MeV events. This project has received funding from the European Union's Horizon 2020 research and innovation programme under agreement No 637324.
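The lag-correlation idea behind the UMASEP scheme can be illustrated with a toy example: differentiate the soft X-ray series, then slide the proton (or neutron) series against it and keep the lag with the highest Pearson correlation. The function names and the synthetic pulse below are illustrative, not the operational algorithm.

```python
def pearson(x, y):
    """Plain Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    den = sx * sy
    return cov / den if den else 0.0

def best_lag(sxr, protons, max_lag):
    """Correlate the time derivative of the SXR flux with the proton flux
    at each candidate lag; return (best_lag, correlation_at_best_lag)."""
    dsxr = [b - a for a, b in zip(sxr, sxr[1:])]
    def corr_at(L):
        n = min(len(dsxr), len(protons) - L)
        return pearson(dsxr[:n], protons[L:L + n])
    L = max(range(max_lag + 1), key=corr_at)
    return L, corr_at(L)
```

With a flare-like pulse in the X-ray derivative and the same pulse delayed by three samples in the proton channel, the search recovers lag 3 with correlation 1; in the real scheme a strong correlation at a physically plausible lag is what triggers the warning.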

  12. Aircraft noise prediction

    NASA Astrophysics Data System (ADS)

    Filippone, Antonio

    2014-07-01

    This contribution addresses the state-of-the-art in the field of aircraft noise prediction, simulation and minimisation. The point of view taken in this context is that of comprehensive models that couple the various aircraft systems with the acoustic sources, the propagation and the flight trajectories. After an exhaustive review of the present predictive technologies in the relevant fields (airframe, propulsion, propagation, aircraft operations, trajectory optimisation), the paper addresses items for further research and development. Examples are shown for several airplanes, including the Airbus A319-100 (CFM engines), the Bombardier Dash8-Q400 (PW150 engines, Dowty R408 propellers) and the Boeing B737-800 (CFM engines). Predictions are done with the flight mechanics code FLIGHT. The transfer function between flight mechanics and the noise prediction is discussed in some detail, along with the numerical procedures for validation and verification. Some code-to-code comparisons are shown. It is contended that the field of aircraft noise prediction has not yet reached a sufficient level of maturity. In particular, some parametric effects cannot be investigated, issues of accuracy are not currently addressed, and validation standards are still lacking.

  13. Deadbeat Predictive Controllers

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Phan, Minh

    1997-01-01

    Several new computational algorithms are presented to compute the deadbeat predictive control law. The first algorithm makes use of a multi-step-ahead output prediction to compute the control law without explicitly calculating the controllability matrix. The system identification must be performed first and then the predictive control law is designed. The second algorithm uses the input and output data directly to compute the feedback law. It combines the system identification and the predictive control law into one formulation. The third algorithm uses an observable-canonical form realization to design the predictive controller. The relationship between all three algorithms is established through the use of the state-space representation. All algorithms are applicable to multi-input, multi-output systems with disturbance inputs. In addition to the feedback terms, feedforward terms may also be added for disturbance inputs if they are measurable. Although the feedforward terms do not influence the stability of the closed-loop feedback law, they enhance the performance of the controlled system.
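The deadbeat idea itself, stripped down to a scalar single-input system rather than the paper's multi-input data-driven formulations, is to choose the feedback gain that places the closed-loop pole at zero, so the state is driven to rest in one step. A minimal sketch with assumed names:

```python
def deadbeat_gain(a, b):
    """For x[k+1] = a*x[k] + b*u[k], the feedback u[k] = -K*x[k] gives
    closed-loop dynamics x[k+1] = (a - b*K)*x[k]; deadbeat control places
    that pole at zero, i.e. K = a/b."""
    return a / b

def simulate(a, b, K, x0, steps):
    """Roll the closed-loop system forward from x0 for a few steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append((a - b * K) * xs[-1])
    return xs
```

For an n-th order system the same pole-placement-at-zero idea drives the state to rest in at most n steps; the paper's contribution is computing that law directly from identified or raw input-output data.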

  14. Elementary School Projects.

    ERIC Educational Resources Information Center

    Learning By Design, 2001

    2001-01-01

    Highlights elementary school construction projects that have won the Learning By Design Awards for 2001. Projects covered involve new school construction; and renovation, additions, and restoration. (GR)

  15. Scorecard on weather predictions

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    No matter that several northern and eastern states were pelted by snow and sleet early in March, as far as long-term weather forecasters are concerned, winter ended on February 28. Now is the time to review their winter seasonal forecasts to determine how accurate the predictions issued at the start of winter were. The National Weather Service (NWS) predicted on November 27, 1981, that the winter season would bring colder-than-normal temperatures to the eastern half of the United States, while temperatures were expected to be higher than normal in the westernmost section (see Figure 1). The NWS made no prediction for the middle of the country, labeling the area ‘indeterminate,’ or having the same chance of experiencing above-normal temperatures as below-normal temperatures, explained Donald L. Gilman, chief of the NWS long-range forecasting group.

  16. Predicting the Sunspot Cycle

    NASA Technical Reports Server (NTRS)

    Hathaway, David H.

    2009-01-01

    The 11-year sunspot cycle was discovered by an amateur astronomer in 1844. Visual and photographic observations of sunspots have been made by both amateurs and professionals over the last 400 years. These observations provide key statistical information about the sunspot cycle that allows for predictions of future activity. However, sunspots and the sunspot cycle are magnetic in nature. For the last 100 years these magnetic measurements have been acquired and used exclusively by professional astronomers to gain new information about the nature of the solar activity cycle. Recently, magnetic dynamo models have evolved to the stage where they can assimilate past data and provide predictions. With the advent of the Internet and open data policies, amateurs now have equal access to the same data used by professionals and equal opportunities to contribute (but, alas, without pay). This talk will describe some of the more useful prediction techniques and reveal what they say about the intensity of the upcoming sunspot cycle.

  17. PREDICT : A CASE STUDY.

    SciTech Connect

    Kerscher, W. J. III; Booker, J. M.; Meyer, Mary A.

    2001-01-01

    Delphi Automotive Systems and the Los Alamos National Laboratory worked together to develop PREDICT, a new methodology to characterize the reliability of a new product during its development program. Rather than conducting testing after hardware has been built, and developing statistical confidence bands around the results, this updating approach starts with an early reliability estimate characterized by large uncertainty, and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework. A considerable amount of knowledge is available at the beginning of a program in the form of expert judgment which helps to provide the initial estimate. This estimate is then continually updated as substantial and varied information becomes available during the course of the development program. This paper presents a case study of the application of PREDICT, with the objective of further describing the methodology. PREDICT has been honored with an R&D 100 Award presented by R&D Magazine.

  18. Predicting Emergency Department Visits

    PubMed Central

    Poole, Sarah; Grannis, Shaun; Shah, Nigam H.

    2016-01-01

    High utilizers of emergency departments account for a disproportionate number of visits, often for nonemergency conditions. This study aims to identify these high users prospectively. Routinely recorded registration data from the Indiana Public Health Emergency Surveillance System was used to predict whether patients would revisit the Emergency Department within one month, three months, and six months of an index visit. Separate models were trained for each outcome period, and several predictive models were tested. Random Forest models had good performance and calibration for all outcome periods, with area under the receiver operating characteristic curve of at least 0.96. This high performance was found to be due to non-linear interactions among variables in the data. The ability to predict repeat emergency visits may provide an opportunity to establish, prioritize, and target interventions to ensure that patients have access to the care they require outside an emergency department setting. PMID:27570684

  19. Questioning the Faith - Models and Prediction in Stream Restoration (Invited)

    NASA Astrophysics Data System (ADS)

    Wilcock, P.

    2013-12-01

    River management and restoration demand prediction at and beyond our present ability. Management questions, framed appropriately, can motivate fundamental advances in science, although the connection between research and application is not always easy, useful, or robust. Why is that? This presentation considers the connection between models and management, a connection that requires critical and creative thought on both sides. Essential challenges for managers include clearly defining project objectives and accommodating uncertainty in any model prediction. Essential challenges for the research community include matching the appropriate model to project duration, space, funding, information, and social constraints and clearly presenting answers that are actually useful to managers. Better models do not lead to better management decisions or better designs if the predictions are not relevant to and accepted by managers. In fact, any prediction may be irrelevant if the need for prediction is not recognized. The predictive target must be developed in an active dialog between managers and modelers. This relationship, like any other, can take time to develop. For example, large segments of stream restoration practice have remained resistant to models and prediction because the foundational tenet - that channels built to a certain template will be able to transport the supplied sediment with the available flow - has no essential physical connection between cause and effect. Stream restoration practice can be steered in a predictive direction in which project objectives are defined as predictable attributes and testable hypotheses. If stream restoration design is defined in terms of the desired performance of the channel (static or dynamic, sediment surplus or deficit), then channel properties that provide these attributes can be predicted and a basis exists for testing approximations, models, and predictions.

  20. Template Matching Approach to Signal Prediction

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; Kulikov, Igor

    2010-01-01

    A new approach to signal prediction and prognostic assessment of spacecraft health resolves an inherent difficulty in fusing sensor data with simulated data. This technique builds upon previous work that demonstrated the importance of physics-based transient models to accurate prediction of signal dynamics and system performance. While models can greatly improve predictive accuracy, they are difficult to apply in general because of variations in model type, accuracy, or intended purpose. However, virtually any flight project will have at least some modeling capability at its disposal, whether a full-blown simulation, partial physics models, dynamic look-up tables, a brassboard analogue system, or simple hand-driven calculation by a team of experts. Many models can be used to develop a predict, or an estimate of the next day's or next cycle's behavior, which is typically used for planning purposes. The fidelity of a predict varies from one project to another, depending on the complexity of the simulation (i.e. linearized or full differential equations) and the level of detail in anticipated system operation, but typically any predict cannot be adapted to changing conditions or adjusted spacecraft command execution. Applying a predict blindly, without adapting the predict to current conditions, produces mixed results at best, primarily due to mismatches between assumed execution of spacecraft activities and actual times of execution. This results in the predict becoming useless during periods of complicated behavior, exactly when the predict would be most valuable. Each spacecraft operation tends to show up as a transient in the data, and if the transients are misaligned, using the predict can actually harm forecasting performance. To address this problem, the approach here expresses the predict in terms of a baseline function superposed with one or more transient functions. These transients serve as signal templates, which can be relocated in time and space against
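A minimal sketch of the template idea, under simplifying assumptions (a known baseline, a single transient, least-squares alignment over integer shifts); names like `best_shift` are illustrative, not from the source:

```python
def best_shift(residual, template):
    """Slide a transient template across the baseline-subtracted signal and
    return the offset with the smallest sum-of-squares mismatch."""
    w = len(template)
    def sse(s):
        return sum((residual[s + i] - template[i]) ** 2 for i in range(w))
    return min(range(len(residual) - w + 1), key=sse)

def apply_template(baseline, template, shift):
    """Superpose the re-timed transient on the baseline predict."""
    out = list(baseline)
    for i, v in enumerate(template):
        out[shift + i] += v
    return out
```

Because the transient is matched to where the operation actually occurred rather than where it was scheduled, the realigned predict stays usable exactly during the complicated periods where a blindly applied predict fails.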

  1. Is genetic evolution predictable?

    PubMed

    Stern, David L; Orgogozo, Virginie

    2009-02-01

    Ever since the integration of Mendelian genetics into evolutionary biology in the early 20th century, evolutionary geneticists have for the most part treated genes and mutations as generic entities. However, recent observations indicate that all genes are not equal in the eyes of evolution. Evolutionarily relevant mutations tend to accumulate in hotspot genes and at specific positions within genes. Genetic evolution is constrained by gene function, the structure of genetic networks, and population biology. The genetic basis of evolution may be predictable to some extent, and further understanding of this predictability requires incorporation of the specific functions and characteristics of genes into evolutionary theory. PMID:19197055

  2. Predictive aging of polymers

    NASA Technical Reports Server (NTRS)

    Cuddihy, Edward F. (Inventor); Willis, Paul B. (Inventor)

    1990-01-01

    A method of predicting aging of polymers operates by heating a polymer in the outdoors to an elevated temperature until a change of property is induced. The test is conducted at a plurality of temperatures to establish a linear Arrhenius plot which is extrapolated to predict the induction period for failure of the polymer at ambient temperature. An Outdoor Photo Thermal Aging Reactor (OPTAR) is also described including a heatable platen for receiving a sheet of polymer, means to heat the platen, and switching means such as a photoelectric switch for turning off the heater during dark periods.
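The extrapolation step can be sketched as an ordinary least-squares fit of ln(induction time) against inverse absolute temperature, evaluated at ambient. This illustrates the Arrhenius procedure the abstract describes, not the patented apparatus; the function name and the synthetic data below are assumptions.

```python
import math

def arrhenius_life(temps_K, induction_times, ambient_K):
    """Fit ln(t) = c + m/T by least squares over the elevated-temperature
    trials, then extrapolate the induction period down to ambient_K."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(t) for t in induction_times]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    c = my - m * mx
    return math.exp(c + m / ambient_K)
```

The fitted slope m corresponds to Ea/R in the Arrhenius law t = A·exp(Ea/(R·T)), which is why a straight line on the ln(t) vs. 1/T plot licenses the extrapolation to ambient temperature.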

  3. Predictive aging of polymers

    NASA Technical Reports Server (NTRS)

    Cuddihy, Edward F. (Inventor); Willis, Paul B. (Inventor)

    1989-01-01

    A method of predicting aging of polymers operates by heating a polymer in the outdoors to an elevated temperature until a change of property is induced. The test is conducted at a plurality of temperatures to establish a linear Arrhenius plot which is extrapolated to predict the induction period for failure of the polymer at ambient temperature. An Outdoor Photo Thermal Aging Reactor (OPTAR) is also described including a heatable platen for receiving a sheet of polymer, means to heat the platen, and switching means such as a photoelectric switch for turning off the heater during dark periods.

  4. Socio-Economic Status and Occupational Status Projections of Southern Youth, By Race and Sex.

    ERIC Educational Resources Information Center

    Lever, Michael F.; Kuvlesky, William P.

    The purpose of this study was to examine selected occupational status projections and the relationship between these projections and socioeconomic status (SES). Occupational status projections referred to predictive statements about the future lifetime job of the respondents. The occupational status projections included in the analysis were: (1)…

  5. The EMCC / DARPA Massively Parallel Electromagnetic Scattering Project

    NASA Technical Reports Server (NTRS)

    Woo, Alex C.; Hill, Kueichien C.

    1996-01-01

    The Electromagnetic Code Consortium (EMCC) was sponsored by the Advanced Research Projects Agency (ARPA) to demonstrate the effectiveness of massively parallel computing in large scale radar signature predictions. The EMCC/ARPA project consisted of three parts.

  6. Projecting future drug expenditures--1994.

    PubMed

    Santell, J P

    1994-01-15

    The use of information on inflation, generic competition, market introduction of new drug entities, institution-specific drug-use patterns, and federal legislation to project drug expenditures is discussed. Inflation of pharmaceutical prices has been decreasing over the past few years. Increases in the producer price index for drugs and pharmaceuticals diminished from 6.9% in 1991 to 4.3% in the first half of 1993; the specter of government regulation may be one reason. Pharmacy group purchasing organizations (GPOs) predicted that in 1994 expenditures would increase an average of 2.1% for contracted drug items and 8.3% for noncontracted items. Expenditures for biotechnology drugs in January through July 1993 increased 16% over the same period in 1992; such agents are now hospital pharmacies' third most costly drug category, at 10% of total expenditures. Future price competition by generic drug products can be predicted from information on patent or market-exclusivity expiration. To predict the market release of new drug products, new-drug applications filed with FDA can be monitored. The most important component in projecting drug expenditures is a specific institution's pattern of use of high-cost drugs. Mechanisms that can be used to monitor changes in therapeutic strategies and drug-use protocols include drug cost indexes, assessment of drug-use patterns by outside companies, and computerized models for specific high-cost drugs. Drug expenditures can be affected by legislative changes such as the Medicaid rebate provisions of the Omnibus Budget Reconciliation Act of 1990 and the Medicare outpatient drug benefit in the proposed American Health Security Act. The accuracy of projections of drug expenditures can be improved by examining inflation, generic competition, the introduction of new drug entities, institution-specific drug-use patterns, and legislative issues. Pharmacy managers need better methods for estimating institution-specific use of high-cost drugs.
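The GPO figures quoted above lend themselves to a back-of-envelope projection: inflate contracted and noncontracted spend separately and sum. The function and the dollar split below are illustrative; a real projection would also fold in the other factors the article lists (new entities, generic competition, institution-specific use).

```python
def project_budget(contracted, noncontracted,
                   contracted_rate=0.021, noncontracted_rate=0.083):
    """Next-year drug-budget projection using the GPO-predicted 1994
    increases: 2.1% for contracted items, 8.3% for noncontracted items."""
    return (contracted * (1 + contracted_rate)
            + noncontracted * (1 + noncontracted_rate))
```

For an institution spending $1.0M on contracted and $0.5M on noncontracted items, the blended projection is $1.5625M, i.e. an effective increase of about 4.2% rather than either headline rate alone.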

  7. Numerical tokamak turbulence project (OFES grand challenge)

    SciTech Connect

    Beer, M; Cohen, B I; Crotinger, J; Dawson, J; Decyk, V; Dimits, A M; Dorland, W D; Hammett, G W; Kerbel, G D; Leboeuf, J N; Lee, W W; Lin, Z; Nevins, W M; Reynders, J; Shumaker, D E; Smith, S; Sydora, R; Waltz, R E; Williams, T

    1999-08-27

    The primary research objective of the Numerical Tokamak Turbulence Project (NTTP) is to develop a predictive ability in modeling turbulent transport due to drift-type instabilities in the core of tokamak fusion experiments, through the use of three-dimensional kinetic and fluid simulations and the derivation of reduced models.

  8. Advanced Ground Systems Maintenance Prognostics Project

    NASA Technical Reports Server (NTRS)

    Harp, Janicce Leshay

    2014-01-01

    The project implements prognostics capabilities to predict when a component, system or subsystem will no longer meet desired functional or performance criteria, called the "end of life." The capability also provides an assessment of the "remaining useful life" of a hardware component.

  9. Improving Software Engineering on NASA Projects

    NASA Technical Reports Server (NTRS)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    The Software Engineering Initiative reduces the risk of software failure and thereby increases mission safety. It yields more predictable software cost estimates and delivery schedules, makes NASA a smarter buyer of contracted-out software, finds and removes more defects earlier, reduces duplication of effort between projects, and increases the ability to meet the challenges of evolving software technology.

  10. Predicting elections: child's play!

    PubMed

    Antonakis, John; Dalgas, Olaf

    2009-02-27

    In two experiments, children and adults rated pairs of faces from election races. Naïve adults judged a pair on competence; after playing a game, children chose who they would prefer to be captain of their boat. Children's (as well as adults') preferences accurately predicted actual election outcomes. PMID:19251621

  11. Predicting Intrinsic Motivation

    ERIC Educational Resources Information Center

    Martens, Rob; Kirschner, Paul A.

    2004-01-01

    Intrinsic motivation can be predicted from participants' perceptions of the social environment and the task environment (Ryan & Deci, 2000) in terms of control, relatedness and competence. To determine the degree of independence of these factors, 251 students in higher vocational education (physiotherapy and hotel management) indicated the extent to…

  12. Predicting Reasoning from Memory

    ERIC Educational Resources Information Center

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  13. Inflation of Conditional Predictions

    ERIC Educational Resources Information Center

    Koriat, Asher; Fiedler, Klaus; Bjork, Robert A.

    2006-01-01

    The authors report 7 experiments indicating that conditional predictions--the assessed probability that a certain outcome will occur given a certain condition--tend to be markedly inflated. The results suggest that this inflation derives in part from backward activation in which the target outcome highlights aspects of the condition that are…

  14. Predicting service life margins

    NASA Technical Reports Server (NTRS)

    Egan, G. F.

    1971-01-01

    Margins are developed for equipment susceptible to malfunction due to excessive time or operating cycles, and for identifying limited-life equipment so that monitoring and replacement are accomplished before hardware failure. The method applies to hardware whose design service life is established and for which a reasonable prediction of expected usage can be made.

  15. Can You Predict?

    ERIC Educational Resources Information Center

    Brown, William R.

    1977-01-01

    Describes a variation of "the suffocating candle" activity used to develop the process of predicting based on reliable data. Instead of using jars of varying sizes under which the burning time of candles is measured, the same jar is used while the candle is elevated on varying numbers of blocks. (CS)

  16. Predictability of critical transitions.

    PubMed

    Zhang, Xiaozhu; Kuehn, Christian; Hallerberg, Sarah

    2015-11-01

    Critical transitions in multistable systems have been discussed as models for a variety of phenomena ranging from the extinctions of species to socioeconomic changes and climate transitions between ice ages and warm ages. From bifurcation theory we can expect certain critical transitions to be preceded by a decreased recovery from external perturbations. The consequences of this critical slowing down have been observed as an increase in variance and autocorrelation prior to the transition. However, especially in the presence of noise, it is not clear whether these changes in observation variables are statistically relevant such that they could be used as indicators for critical transitions. In this contribution we investigate the predictability of critical transitions in conceptual models. We study the quadratic integrate-and-fire model and the van der Pol model under the influence of external noise. We focus especially on the statistical analysis of the success of predictions and the overall predictability of the system. The performance of different indicator variables turns out to be dependent on the specific model under study and the conditions of accessing it. Furthermore, we study the influence of the magnitude of transitions on the predictive performance. PMID:26651760
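
    The critical-slowing-down indicators mentioned here, rising variance and rising autocorrelation, are typically computed over a sliding window of the observed time series. A generic sketch of that computation (illustrative, not the paper's specific analysis):

```python
import numpy as np

def sliding_indicators(x, window):
    """Return rolling variance and lag-1 autocorrelation, the two classic
    early-warning indicators of critical slowing down."""
    variances, autocorrs = [], []
    for i in range(len(x) - window + 1):
        w = x[i:i + window]
        variances.append(np.var(w))
        autocorrs.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(variances), np.array(autocorrs)

# Synthetic example: noise amplitude grows, so rolling variance rises.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 0.1, 100), rng.normal(0, 1.0, 100)])
v, a = sliding_indicators(x, 50)
```

    As the abstract notes, the open question is whether such increases are statistically distinguishable from noise early enough to serve as usable predictors.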

  17. Earthquake Prediction is Coming

    ERIC Educational Resources Information Center

    MOSAIC, 1977

    1977-01-01

    Describes (1) several methods used in earthquake research, including P:S ratio velocity studies, dilatancy models; and (2) techniques for gathering base-line data for prediction using seismographs, tiltmeters, laser beams, magnetic field changes, folklore, animal behavior. The mysterious Palmdale (California) bulge is discussed. (CS)

  18. Prediction method abstracts

    SciTech Connect

    1994-12-31

    This conference was held December 4--8, 1994, in Asilomar, California. The purpose of the meeting was to provide a forum for exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling, sequence-to-fold assignment, and ab initio folding.

  19. Predictive models in urology.

    PubMed

    Cestari, Andrea

    2013-01-01

    Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in predictive modeling reflects advances on several fronts: the availability of health information from increasingly complex databases and electronic health records; a better understanding of causal or statistical predictors of health, disease processes, and multifactorial models of ill health; and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. It is still too early to know how this so-called 'machine intelligence' will evolve, and therefore how today's relatively sophisticated predictive models will respond to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also in clinical practice, with the introduction of several nomograms dealing with the main fields of onco-urology. PMID:23423686

  20. Predictability of critical transitions

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaozhu; Kuehn, Christian; Hallerberg, Sarah

    2015-11-01

    Critical transitions in multistable systems have been discussed as models for a variety of phenomena ranging from the extinctions of species to socioeconomic changes and climate transitions between ice ages and warm ages. From bifurcation theory we can expect certain critical transitions to be preceded by a decreased recovery from external perturbations. The consequences of this critical slowing down have been observed as an increase in variance and autocorrelation prior to the transition. However, especially in the presence of noise, it is not clear whether these changes in observation variables are statistically relevant such that they could be used as indicators for critical transitions. In this contribution we investigate the predictability of critical transitions in conceptual models. We study the quadratic integrate-and-fire model and the van der Pol model under the influence of external noise. We focus especially on the statistical analysis of the success of predictions and the overall predictability of the system. The performance of different indicator variables turns out to be dependent on the specific model under study and the conditions of accessing it. Furthermore, we study the influence of the magnitude of transitions on the predictive performance.

  1. Predicting visibility of aircraft.

    PubMed

    Watson, Andrew; Ramirez, Cesar V; Salud, Ellen

    2009-01-01

    Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration. PMID:19462007
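
    The final step, turning a contrast threshold into a visibility range, can be illustrated without the SSO itself: apparent contrast decays roughly exponentially with distance through the atmosphere (Koschmieder-style attenuation), so the visibility range is the distance at which it falls below the observer's detection threshold. A hedged sketch of that idea only; this is not the SSO metric:

```python
import math

def visibility_range(inherent_contrast, threshold_contrast, extinction_coeff):
    """Distance at which apparent contrast C0 * exp(-sigma * d) drops to the
    detection threshold, using Koschmieder-style exponential attenuation.
    Illustrates only the range-prediction step, not the SSO pattern metric."""
    if inherent_contrast <= threshold_contrast:
        return 0.0  # never visible: already below threshold at zero range
    return math.log(inherent_contrast / threshold_contrast) / extinction_coeff

# e.g. inherent contrast 0.5, threshold 0.05, extinction 0.2 per km
d = visibility_range(0.5, 0.05, 0.2)  # ~11.5 km
```

    In the SSO framework the threshold itself also depends on the aircraft's angular size and shape, which is what the measured luminance contrast thresholds capture.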

  2. Brightness predictions for comets

    NASA Astrophysics Data System (ADS)

    Green, Daniel W. E.; Marsden, Brian G.; Morris, Charles S.

    2001-02-01

    Daniel W E Green, Brian G Marsden and Charles S Morris write with the aim of illuminating the issue of cometary light curves and brightness predictions, following the publication in this journal last October of the letter by John McFarland (2000).

  3. Predicted airframe noise levels

    NASA Astrophysics Data System (ADS)

    Raney, J. P.

    1980-09-01

    Calculated values of airframe noise levels corresponding to FAA noise certification conditions for six aircraft are presented. The aircraft are: DC-9-30; Boeing 727-200; A300-B2 Airbus; Lockheed L-1011; DC-10-10; and Boeing 747-200B. The prediction methodology employed is described and discussed.

  4. Predicting Systemic Confidence

    ERIC Educational Resources Information Center

    Falke, Stephanie Inez

    2009-01-01

    Using a mixed method approach, this study explored which educational factors predicted systemic confidence in master's level marital and family therapy (MFT) students, and whether or not the impact of these factors was influenced by student beliefs and their perception of their supervisor's beliefs about the value of systemic practice. One hundred…

  5. Prediction and Guidance.

    ERIC Educational Resources Information Center

    Shimberg, Benjamin

    Problems in the application and misapplication of test scores are discussed. Tests have been used to achieve optimum use of resources rather than optimum development of the individual. Or, they have been used to predict a child's achievement rather than to identify his learning difficulties. This latter use would indicate when and where…

  6. Predicting intrinsic brain activity.

    PubMed

    Craddock, R Cameron; Milham, Michael P; LaConte, Stephen M

    2013-11-15

    Multivariate supervised learning methods exhibit a remarkable ability to decode externally driven sensory, behavioral, and cognitive states from functional neuroimaging data. Although they are typically applied to task-based analyses, supervised learning methods are equally applicable to intrinsic effective and functional connectivity analyses. The obtained models of connectivity incorporate the multivariate interactions between all brain regions simultaneously, which results in a more accurate representation of the connectome than those available with standard bivariate methods. Additionally, the models can be applied to decode or predict the time series of intrinsic brain activity of a region from an independent dataset. The obtained prediction accuracy provides a measure of the integration between a brain region and other regions in its network, as well as a method for evaluating acquisition and preprocessing pipelines for resting state fMRI data. This article describes a method for learning multivariate models of connectivity. The method is applied in the non-parametric prediction accuracy, influence, and reproducibility-resampling (NPAIRS) framework, to study the regional variation of prediction accuracy and reproducibility (Strother et al., 2002). The resulting spatial distribution of these metrics is consistent with the functional hierarchy proposed by Mesulam (1998). Additionally, we illustrate the utility of the multivariate regression connectivity modeling method for optimizing experimental parameters and assessing the quality of functional neuroimaging data. PMID:23707580
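
    The core operation described, predicting one region's time series from all other regions simultaneously, can be sketched as a regularized multivariate linear regression. This is a generic stand-in for the paper's method; the data, the ridge penalty, and the correlation-based accuracy score are illustrative assumptions:

```python
import numpy as np

def predict_region(ts, target, alpha=1.0):
    """Ridge regression predicting one region's time series from all other
    regions' time series. ts: (time, regions) array."""
    X = np.delete(ts, target, axis=1)
    y = ts[:, target]
    # closed-form ridge solution: w = (X'X + alpha*I)^-1 X'y
    w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
    return X @ w

# Prediction accuracy as the correlation between observed and predicted.
rng = np.random.default_rng(1)
shared = rng.normal(size=(200, 1))
ts = shared + 0.1 * rng.normal(size=(200, 5))  # 5 strongly coupled regions
r = np.corrcoef(ts[:, 0], predict_region(ts, 0))[0, 1]
```

    A high correlation, as in this strongly coupled toy network, corresponds to the abstract's notion of a region being well integrated with the rest of its network.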

  7. Predicting Visibility of Aircraft

    PubMed Central

    Watson, Andrew; Ramirez, Cesar V.; Salud, Ellen

    2009-01-01

    Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration. PMID:19462007

  8. PERVAPORATION PERFORMANCE PREDICTION SOFTWARE

    EPA Science Inventory

    The Pervaporation Performance Prediction Software and Database (PPPS&D) computer software program is currently being developed within the USEPA, NRMRL. The purpose of the PPPS&D program is to educate and assist potential users in identifying opportunities for using pervaporati...

  9. Predicting rainfall beyond tomorrow

    Technology Transfer Automated Retrieval System (TEKTRAN)

    NOAA’s Climate Prediction Center issues climate precipitation forecasts that offer potential support for water resource managers and farmers and ranchers in New Mexico, but the forecasts are frequently misunderstood and not widely used in practical decision making. The objectives of this newsletter ...

  10. COMPLEXITY and the QGCW Project

    NASA Astrophysics Data System (ADS)

    Zichichi, Antonino

    2014-06-01

    The following sections are included: * Seven definitions of Complexity * Complexity exists at all scales * AFB phenomena from Beethoven to the Superworld * UEEC events, from Galilei up to SM&B * The two asymptotic limits: History and Science * The basic points on the correlation between Complexity and Predictions * The lesson needed for the future * From Planck to Complexity * Consequences for LHC: the QGCW project * Conclusions * The Platonic Grand Unification * The Platonic Supersymmetry * Examples of UEEC events in the construction of the SM&B * Open Problems in Subnuclear Physics * The ten challenges of Subnuclear Physics * References

  11. Projecting future drug expenditures--1997.

    PubMed

    Mehl, B; Santell, J P

    1997-01-15

    Use of the producer price index; data from independent sources, drug industry analysts, group purchasing organizations (GPOs), and health maintenance organizations (HMOs); pharmacoeconomics; and legal developments to project drug expenditures and prepare pharmacy budgets for 1997 is discussed. The producer price index indicates that prices for drugs and pharmaceuticals increased 2.2% during January to May 1996; the increase for prescription preparations was 3.4%. Medi-Span reports an average increase for all drug products of 1.2% for the first six months of 1996. IMS America data show the price of all drugs increasing 1.8% between the second quarters of 1995 and 1996. Drug industry analysts project the overall price increase in the next 12 months at 2.5-5.0%. GPOs predict an average increase over the next 12 months of 2.2% for contracted drugs and 4.3% for non-contracted drugs. HMO pharmacy directors predict pharmacy expenditures will increase by 4.5% per member in 1997. Caution must be applied in using pharmacoeconomics to project drug costs and their impact on health care expenditures. Today's budget must account for the greater integration of drug expenditures into the institution's objectives, possible reductions in other service costs, capitation, competition, shifting of control of the drug budget to specific patient care centers, relocation of services to the ambulatory care setting, and outsourcing. Legal actions in 1996 that may affect price increases and drug budgets included a class-action lawsuit by community and chain pharmacies alleging price discrimination by manufacturers and wholesalers. Prices of pharmaceutical products are fairly stable and may remain so in 1997, but projections of future drug expenditures must account for the continuing reshaping of the health care landscape. PMID:9117803

  12. eProject Builder

    Energy Science and Technology Software Center (ESTSC)

    2014-06-01

    eProject Builder enables Energy Services Companies (ESCOs) and their contracting agencies to (1) upload and track project-level information; (2) generate basic project reports required by local, state, and/or federal agencies; and (3) benchmark new Energy Savings Performance Contract (ESPC) projects against historical data.

  13. The 100 People Project

    ERIC Educational Resources Information Center

    McLeod, Keri

    2007-01-01

    This article describes the 100 People Project and how the author integrates the project in her class. The 100 People Project is a nonprofit organization based in New York City. The organization poses the question: If there were only 100 people in the world, what would the world look like? Through the project, students were taught about ethics in…

  14. Determinants of project success

    NASA Technical Reports Server (NTRS)

    Murphy, D. C.; Baker, B. N.; Fisher, D.

    1974-01-01

    The interactions of numerous project characteristics, with particular reference to project performance, were studied. Determinants of success are identified along with the accompanying implications for client organization, parent organization, project organization, and future research. Variables are selected which are found to have the greatest impact on project outcome, and the methodology and analytic techniques to be employed in identification of those variables are discussed.

  15. Korea's School Grounds Projects

    ERIC Educational Resources Information Center

    Park, Joohun

    2003-01-01

    This article describes two projects which Korea has undertaken to improve its school grounds: (1) the Green School Project; and (2) the School Forest Pilot Project. The Korean Ministry of Education and Human Resources Development (MOE&HRD) recently launched the Green School Project centred on existing urban schools with poor outdoor environments.…

  16. NCMS ESS 2000 Project

    NASA Technical Reports Server (NTRS)

    Gibbel, Mark; Bellamy, Marvin; DeSantis, Charlie; Hess, John; Pattok, Tracy; Quintero, Andrew; Silver, R.

    1996-01-01

    ESS 2000 has the vision of enhancing the knowledge necessary to implement cost-effective, leading-edge ESS technologies and procedures in order to increase U.S. electronics industry competitiveness. This paper defines ESS and discusses the factors driving the project, its objectives, its participants, the three phases of the project, the technologies involved, and project deliverables.

  17. Earth System Science Project

    ERIC Educational Resources Information Center

    Rutherford, Sandra; Coffman, Margaret

    2004-01-01

    For several decades, science teachers have used bottles for classroom projects designed to teach students about biology. Bottle projects do not have to just focus on biology, however. These projects can also be used to engage students in Earth science topics. This article describes the Earth System Science Project, which was adapted and developed…

  18. Project Lodestar Special Report.

    ERIC Educational Resources Information Center

    Brown, Peggy, Ed.

    1981-01-01

    The Association of American Colleges' (AAC) Project Lodestar is addressed in an article and descriptions of the pilot phase of the project at 13 institutions. In "Project Lodestar: Realistically Assessing the Future," Peggy Brown provides an overview of the project, which is designed to help colleges and universities in assessment of institutional…

  19. Project Follow Through.

    ERIC Educational Resources Information Center

    Illinois State Office of the Superintendent of Public Instruction, Springfield. Dept. for Exceptional Children.

    The four Follow Through projects in Illinois are described and evaluated. These projects involve approximately 1,450 children in K-3 in Mounds, East Saint Louis, Waukegan, and Chicago. The Chicago project is subdivided into three individual projects and is trying three experimental programs. Emphasis is given to the nature of the environmental…

  20. The lightcraft project

    NASA Technical Reports Server (NTRS)

    Messitt, Don G.; Myrabo, Leik N.

    1991-01-01

    Rensselaer Polytechnic Institute has been developing a transatmospheric 'Lightcraft' technology which uses beamed laser energy to propel advanced shuttle craft to orbit. In the past several years, Rensselaer students have analyzed the unique combined-cycle Lightcraft engine, designed a small unmanned Lightcraft Technology Demonstrator, and conceptualized larger manned Lightcraft - to name just a few of the interrelated design projects. The 1990-91 class carried out preliminary and detailed design efforts for a one-person 'Mercury' Lightcraft, using computer-aided design and finite-element structural modeling techniques. In addition, they began construction of a 2.6 m-diameter, full-scale engineering prototype mockup. The mockup will be equipped with three robotic legs that 'kneel' for passenger entry and exit. More importantly, the articulated tripod gear is crucial for accurately pointing at, and tracking the laser relay mirrors, a maneuver that must be performed just prior to liftoff. Also accomplished were further design improvements on a 6-inch-diameter Lightcraft model (for testing in RPI's hypersonic tunnel), and new laser propulsion experiments. The resultant experimental data will be used to calibrate Computational Fluid Dynamic (CFD) codes and analytical laser propulsion models that can simulate vehicle/engine flight conditions along a transatmospheric boost trajectory. These efforts will enable the prediction of distributed aerodynamic and thruster loads over the entire full-scale spacecraft.

  1. PREDICTIVE MODELS. Enhanced Oil Recovery Model

    SciTech Connect

    Ray, R.M.

    1992-02-26

    PREDICTIVE MODELS is a collection of five models - CFPM, CO2PM, ICPM, PFPM, and SFPM - used in the 1982-1984 National Petroleum Council study of enhanced oil recovery (EOR) potential. Each pertains to a specific EOR process designed to squeeze additional oil from aging or spent oil fields. The processes are: 1 chemical flooding, where soap-like surfactants are injected into the reservoir to wash out the oil; 2 carbon dioxide miscible flooding, where carbon dioxide mixes with the lighter hydrocarbons making the oil easier to displace; 3 in-situ combustion, which uses the heat from burning some of the underground oil to thin the product; 4 polymer flooding, where thick, cohesive material is pumped into a reservoir to push the oil through the underground rock; and 5 steamflood, where pressurized steam is injected underground to thin the oil. CFPM, the Chemical Flood Predictive Model, models micellar (surfactant)-polymer floods in reservoirs, which have been previously waterflooded to residual oil saturation. Thus, only true tertiary floods are considered. An option allows a rough estimate of oil recovery by caustic or caustic-polymer processes. CO2PM, the Carbon Dioxide miscible flooding Predictive Model, is applicable to both secondary (mobile oil) and tertiary (residual oil) floods, and to either continuous CO2 injection or water-alternating gas processes. ICPM, the In-situ Combustion Predictive Model, computes the recovery and profitability of an in-situ combustion project from generalized performance predictive algorithms. PFPM, the Polymer Flood Predictive Model, is switch-selectable for either polymer or waterflooding, and an option allows the calculation of the incremental oil recovery and economics of polymer relative to waterflooding. SFPM, the Steamflood Predictive Model, is applicable to the steam drive process, but not to cyclic steam injection (steam soak) processes.

  2. Guidelines for Project Management

    NASA Technical Reports Server (NTRS)

    Ben-Arieh, David

    2001-01-01

    Project management is an important part of the professional activities at Kennedy Space Center (KSC). Project management is the means by which many of the operations at KSC take shape. Moreover, projects at KSC are implemented in a variety of ways in different organizations. The official guidelines for project management are provided by NASA headquarters and are quite general. The project reported herein deals with developing practical and detailed project management guidelines in support of the project managers. This report summarizes the current project management effort in the Process Management Division and presents a new modeling approach of project management developed by the author. The report also presents the Project Management Guidelines developed during the summer.

  3. Predicting Major Solar Eruptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-05-01

    Coronal mass ejections (CMEs) and solar flares are two examples of major explosions from the surface of the Sun, but they're not the same thing, and they don't have to happen at the same time. A recent study examines whether we can predict which solar flares will be closely followed by larger-scale CMEs. A solar flare is a localized burst of energy and X-rays, whereas a CME is an enormous cloud of magnetic flux and plasma released from the Sun. We know that some magnetic activity on the surface of the Sun triggers both a flare and a CME, whereas other activity triggers only a confined flare with no CME. But what makes the difference? Understanding this can help us learn about the underlying physical drivers of flares and CMEs. It also might help us to better predict when a CME, which can pose a risk to astronauts, disrupt radio transmissions, and cause damage to satellites, might occur. In a recent study, Monica Bobra and Stathis Ilonidis (Stanford University) attempt to improve our ability to make these predictions by using a machine-learning algorithm. Bobra and Ilonidis used magnetic-field data from an instrument on the Solar Dynamics Observatory to build a catalog of solar flares, 56 of which were accompanied by a CME and 364 of which were not. The catalog includes information about 18 different features associated with the photospheric magnetic field of each flaring active region (for example, the mean gradient of the horizontal magnetic field). The authors apply a machine-learning algorithm known as a binary classifier to this catalog, which tries to predict, given a set of features, whether a flare will be accompanied by a CME. Using a combination of 6 or more features results in much better predictive success, as measured by the True Skill Statistic (higher positive values indicate better predictions).
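
    The evaluation metric named here, the True Skill Statistic, is standard in flare and CME forecasting and is easy to compute from a binary classifier's confusion matrix. A minimal sketch of the metric only (the authors' classifier itself is not reproduced; the example labels are illustrative):

```python
import numpy as np

def true_skill_statistic(y_true, y_pred):
    """TSS = hit rate - false alarm rate; ranges from -1 to +1 and is
    insensitive to the class imbalance between CME and non-CME flares."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    return tp / (tp + fn) - fp / (fp + tn)

# Perfect prediction gives TSS = 1; random guessing gives TSS near 0.
y = np.array([1, 1, 0, 0, 0, 1])
tss_perfect = true_skill_statistic(y, y)
```

    TSS is preferred over raw accuracy in this setting because CME-producing flares are a small minority class, and accuracy alone can be inflated by always predicting "no CME."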

  4. Multiple regression analyses in the prediction of aerospace instrument costs

    NASA Astrophysics Data System (ADS)

    Tran, Linh

    The aerospace industry has been investing for decades in ways to improve its efficiency in estimating the project life cycle cost (LCC). One of the major focuses in the LCC is the cost prediction of aerospace instruments done during the early conceptual design phase of the project. The accuracy of early cost predictions affects the project scheduling and funding, and it is often the major cause of project cost overruns. The prediction of instruments' cost is based on the statistical analysis of these independent variables: Mass (kg), Power (watts), Instrument Type, Technology Readiness Level (TRL), Destination: earth orbiting or planetary, Data rates (kbps), Number of bands, Number of channels, Design life (months), and Development duration (months). The author proposes a cost prediction approach for aerospace instruments based on these statistical analyses: Clustering Analysis, Principal Component Analysis (PCA), Bootstrap, and multiple regressions (both linear and non-linear). In the proposed approach, the Cost Estimating Relationship (CER) will be developed for the dependent variable Instrument Cost by using a combination of multiple independent variables. "The Full Model" will be developed and executed to estimate the full set of nine variables. SAS, Excel, the Automated Cost Estimating Integrated Tools (ACEIT), and Minitab are the tools used to aid the analysis. Through the analysis, the cost drivers will be identified, which will help develop an ultimate cost estimating software tool for instrument cost prediction and optimization of future missions.
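
    A Cost Estimating Relationship of the kind described is commonly fit as a log-linear (power-law) multiple regression. A minimal sketch of that standard form; the choice of mass and power as drivers and the synthetic coefficients are illustrative, not results from this study:

```python
import numpy as np

def fit_cer(mass, power, cost):
    """Fit log(cost) = b0 + b1*log(mass) + b2*log(power) by least squares,
    the common power-law Cost Estimating Relationship (CER) form."""
    X = np.column_stack([np.ones(len(mass)), np.log(mass), np.log(power)])
    beta, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
    return beta

def predict_cost(beta, mass, power):
    """Apply the fitted CER to a new instrument's mass and power."""
    return np.exp(beta[0] + beta[1] * np.log(mass) + beta[2] * np.log(power))

# Synthetic instruments generated by cost = 2 * mass^0.8 * power^0.5,
# so the fit should recover those coefficients exactly.
mass = np.array([10.0, 20.0, 50.0, 100.0])
power = np.array([5.0, 15.0, 40.0, 80.0])
cost = 2.0 * mass**0.8 * power**0.5
beta = fit_cer(mass, power, cost)
```

    The full approach in the abstract layers clustering, PCA, and bootstrap resampling on top of this regression core to select variables and quantify uncertainty.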

  5. Uranium Pyrophoricity Phenomena and Prediction

    SciTech Connect

    DUNCAN, D.R.

    2000-04-20

    We have compiled a topical reference on the phenomena, experiences, experiments, and prediction of uranium pyrophoricity for the Hanford Spent Nuclear Fuel Project (SNFP) with specific applications to SNFP process and situations. The purpose of the compilation is to create a reference to integrate and preserve this knowledge. Decades ago, uranium and zirconium fires were commonplace at Atomic Energy Commission facilities, and good documentation of experiences is surprisingly sparse. Today, these phenomena are important to site remediation and analysis of packaging, transportation, and processing of unirradiated metal scrap and spent nuclear fuel. Our document, bearing the same title as this paper, will soon be available in the Hanford document system [Plys, et al., 2000]. This paper explains general content of our topical reference and provides examples useful throughout the DOE complex. Moreover, the methods described here can be applied to analysis of potentially pyrophoric plutonium, metal, or metal hydride compounds provided that kinetic data are available. A key feature of this paper is a set of straightforward equations and values that are immediately applicable to safety analysis.

  6. How predictable are water resources?

    NASA Astrophysics Data System (ADS)

    Mason, P.

    2010-10-01

    Peter Mason, technical director of international dams and hydropower at MWH, explains how some water resources might be more predictable than generally supposed. Some years ago the writer examined the levels of Lake Victoria in east Africa as part of a major refurbishment project. This revealed a clear cyclic behavior in lake level and hence in discharges from the lake down the Nile system and up into Egypt. A recent study by the writer demonstrated that 20-year mean flows in the Kafue River in Zambia corresponded well to reconstructed rainfall records based on regional tree ring records. The Rio Parana has a catchment area of 3,100,000 km² and a mean stream flow of 21,300 m³/s. In the wider context an improved understanding of apparent periodicities in the natural record would seem to offer at least one planning scenario to be considered in terms of investment and even for the long term planning of aid and famine relief.

  7. High speed transition prediction

    NASA Technical Reports Server (NTRS)

    Gasperas, Gediminis

    1993-01-01

    The main objective of this work period was to develop, maintain and exercise state-of-the-art methods for transition prediction in supersonic flow fields. Basic state and stability codes, acquired during the last work period, were exercised and applied to calculate the properties of various flowfields. The development of a code for the prediction of transition location using a currently novel method (the PSE or Parabolized Stability Equation method), initiated during the last work period and continued during the present work period, was cancelled at mid-year for budgetary reasons. Other activities during this period included the presentation of a paper at the APS meeting in Tallahassee, Florida entitled 'Stability of Two-Dimensional Compressible Boundary Layers', as well as the initiation of a paper co-authored with H. Reed of the Arizona State University entitled 'Stability of Boundary Layers'.

  8. Fan noise prediction assessment

    NASA Technical Reports Server (NTRS)

    Bent, Paul H.

    1995-01-01

    This report is an evaluation of two techniques for predicting the fan noise radiation from engine nacelles. The first is a relatively computationally intensive finite-element technique. The code is named ARC, an abbreviation of Acoustic Radiation Code, and was developed by Eversman. This is actually a suite of software that first generates a grid around the nacelle, then solves for the potential flowfield, and finally solves the acoustic radiation problem. The second approach is an analytical technique requiring minimal computational effort. This is termed the cutoff-ratio technique and was developed by Rice. Details of the duct geometry, such as the hub-to-tip ratio and the Mach number of the flow in the duct, as well as the modal content of the duct noise, are required for proper prediction.

  9. Predicting catastrophic shifts.

    PubMed

    Weissmann, Haim; Shnerb, Nadav M

    2016-05-21

    Catastrophic shifts are known to pose a serious threat to ecology, and a reliable set of early-warning indicators is desperately needed. However, the tools suggested so far have two problems. First, they cannot discriminate between a smooth transition and an imminent irreversible shift. Second, they aim at predicting the tipping point where a state loses its stability, but in noisy spatial systems the actual transition occurs when an alternative state invades. Here we suggest a cluster-tracking technique that solves both problems, distinguishing between smooth and catastrophic transitions and identifying an imminent shift in both cases. Our method may allow for the prediction, and thus hopefully the prevention, of such transitions, avoiding their destructive outcomes. PMID:26970446

  10. Predictive spark timing method

    SciTech Connect

    Tang, D.L.; Chang, M.F.; Sultan, M.C.

    1990-01-09

    This patent describes a method of determining spark time in a spark timing system of an internal combustion engine having a plurality of cylinders and a spark period for each cylinder in which a spark occurs. It comprises: generating at least one crankshaft position reference pulse for each spark firing event, the reference pulse nearest the next spark being set to occur within the same cylinder event as the next spark; measuring at least two reference periods between recent reference pulses; calculating the spark timing synchronously with crankshaft position by performing the calculation upon receipt of the reference pulse nearest the next spark; predicting the engine speed for the next spark period from at least two reference periods, including the most recent reference period; and, based on the predicted speed, calculating a spark time measured from the reference pulse nearest the next spark.
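
    The claim's prediction step, extrapolating the next reference period from the two most recent ones and converting a desired spark advance into a time delay after the reference pulse, can be sketched as follows. This is an illustrative reading of the claim, not the patented implementation; the function names, the linear extrapolation, and the assumed 90° crankshaft spacing between reference pulses are all hypothetical.

```python
def predict_next_period(periods):
    """Linearly extrapolate the next reference period from the two most
    recent measured periods (a growing period means the engine is slowing)."""
    t_prev, t_last = periods[-2], periods[-1]
    return 2 * t_last - t_prev

def spark_delay(period_s, advance_deg, deg_per_reference=90.0):
    """Convert a desired spark advance in crank degrees into a time delay
    after the reference pulse (deg_per_reference is an assumed pulse spacing)."""
    seconds_per_degree = period_s / deg_per_reference
    return advance_deg * seconds_per_degree

# Engine decelerating: the measured reference periods are growing
periods = [0.010, 0.011]                 # seconds between reference pulses
t_next = predict_next_period(periods)    # extrapolated next period
delay = spark_delay(t_next, advance_deg=30.0)
```

    Predicting the period, rather than reusing the last measured one, keeps the spark angle accurate while the engine accelerates or decelerates, which is the point of the claimed method.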

  11. Prediction of Antibody Epitopes.

    PubMed

    Nielsen, Morten; Marcatili, Paolo

    2015-01-01

    Antibodies recognize their cognate antigens in a precise and effective way. In order to do so, they target regions of the antigenic molecules that have specific features such as large exposed areas, presence of charged or polar atoms, specific secondary structure elements, and lack of similarity to self-proteins. Given the sequence or the structure of a protein of interest, several methods exploit such features to predict the residues that are more likely to be recognized by an immunoglobulin. Here, we present two methods (BepiPred and DiscoTope) to predict linear and discontinuous antibody epitopes from the sequence and/or the three-dimensional structure of a target protein. PMID:26424260

  12. Airframe noise prediction evaluation

    NASA Technical Reports Server (NTRS)

    Yamamoto, Kingo J.; Donelson, Michael J.; Huang, Shumei C.; Joshi, Mahendra C.

    1995-01-01

    The objective of this study is to evaluate the accuracy and adequacy of current airframe noise prediction methods using available airframe noise measurements from tests of a narrow-body transport (DC-9) and a wide-body transport (DC-10), in addition to scale-model test data. General features of the airframe noise from these aircraft and models are outlined. The results of the assessment of two airframe noise prediction methods, Fink's and Munson's, against flight test data of these aircraft and scale-model wind tunnel test data are presented. These methods were extensively evaluated against measured data from several configurations, including clean, slat-deployed, landing-gear-deployed, flap-deployed, and landing configurations of both the DC-9 and DC-10. They were also assessed against a limited number of scale-model configurations. The evaluation was conducted in terms of overall sound pressure level (OASPL), tone-corrected perceived noise level (PNLT), and one-third-octave-band sound pressure level (SPL).

  13. Multivariate respiratory motion prediction

    NASA Astrophysics Data System (ADS)

    Dürichen, R.; Wissel, T.; Ernst, F.; Schlaefer, A.; Schweikard, A.

    2014-10-01

    In extracranial robotic radiotherapy, tumour motion is compensated by tracking external and internal surrogates. To compensate system specific time delays, time series prediction of the external optical surrogates is used. We investigate whether the prediction accuracy can be increased by expanding the current clinical setup by an accelerometer, a strain belt and a flow sensor. Four previously published prediction algorithms are adapted to multivariate inputs—normalized least mean squares (nLMS), wavelet-based least mean squares (wLMS), support vector regression (SVR) and relevance vector machines (RVM)—and evaluated for three different prediction horizons. The measurement involves 18 subjects and consists of two phases, focusing on long term trends (M1) and breathing artefacts (M2). To select the most relevant and least redundant sensors, a sequential forward selection (SFS) method is proposed. Using a multivariate setting, the results show that the clinically used nLMS algorithm is susceptible to large outliers. In the case of irregular breathing (M2), the mean root mean square error (RMSE) of a univariate nLMS algorithm is 0.66 mm and can be decreased to 0.46 mm by a multivariate RVM model (best algorithm on average). To investigate the full potential of this approach, the optimal sensor combination was also estimated on the complete test set. The results indicate that a further decrease in RMSE is possible for RVM (to 0.42 mm). This motivates further research about sensor selection methods. Besides the optical surrogates, the sensors most frequently selected by the algorithms are the accelerometer and the strain belt. These sensors could be easily integrated in the current clinical setup and would allow a more precise motion compensation.
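
    The clinically used nLMS predictor mentioned above is simple enough to sketch in full. The following is a minimal univariate illustration, not the authors' implementation; the function name, filter length, step size, and the synthetic breathing-like trace are all assumptions:

```python
import numpy as np

def nlms_predict(signal, n_taps=8, mu=0.5, eps=1e-6):
    """One-step-ahead prediction with a normalized least-mean-squares
    (nLMS) adaptive filter: predict signal[k] from the n_taps samples
    before it, updating the weights from each new prediction error."""
    w = np.zeros(n_taps)
    preds = np.zeros(len(signal))
    for k in range(n_taps, len(signal)):
        x = signal[k - n_taps:k]            # most recent history window
        preds[k] = w @ x                    # prediction before seeing signal[k]
        e = signal[k] - preds[k]            # prediction error
        w += mu * e * x / (x @ x + eps)     # normalized gradient step
    return preds

# Synthetic quasi-periodic trace standing in for a breathing signal
t = np.arange(2000) * 0.01
sig = np.sin(2 * np.pi * 0.25 * t)
p = nlms_predict(sig)
rmse = np.sqrt(np.mean((sig[100:] - p[100:]) ** 2))
```

    On regular signals such a filter converges quickly, consistent with the abstract's observation that nLMS performs well until irregular breathing introduces large outliers.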

  14. Predicting appointment breaking.

    PubMed

    Bean, A G; Talaga, J

    1995-01-01

    The goal of physician referral services is to schedule appointments, but if too many patients fail to show up, the value of the service will be compromised. The authors found that appointment breaking can be predicted by the number of days to the scheduled appointment, the doctor's specialty, and the patient's age and gender. They also offer specific suggestions for modifying the marketing mix to reduce the incidence of no-shows. PMID:10142384
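
    A predictor of the kind described, relating no-show probability to appointment lead time and patient attributes, is commonly fit with logistic regression. The sketch below is purely illustrative (the paper does not specify its model); the single lead-time feature, the synthetic data, and all coefficients are invented:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression by plain gradient descent: models the no-show
    probability as sigmoid(w.x + b)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

# Synthetic records: no-show odds rise with days until the appointment
rng = np.random.default_rng(3)
days = rng.integers(1, 60, 500)
X = np.column_stack([(days - 30) / 30.0])          # standardized lead time
p_true = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - 1.0)))
y = (rng.random(500) < p_true).astype(float)
w, b = fit_logistic(X, y)                          # recovers a positive slope
```

    The fitted positive weight on lead time reproduces the paper's headline finding: the further out the appointment, the more likely the no-show.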

  15. Predicting Individual Fuel Economy

    SciTech Connect

    Lin, Zhenhong; Greene, David L

    2011-01-01

    To make informed decisions about travel and vehicle purchase, consumers need unbiased and accurate information of the fuel economy they will actually obtain. In the past, the EPA fuel economy estimates based on its 1984 rules have been widely criticized for overestimating on-road fuel economy. In 2008, EPA adopted a new estimation rule. This study compares the usefulness of the EPA's 1984 and 2008 estimates based on their prediction bias and accuracy and attempts to improve the prediction of on-road fuel economies based on consumer and vehicle attributes. We examine the usefulness of the EPA fuel economy estimates using a large sample of self-reported on-road fuel economy data and develop an Individualized Model for more accurately predicting an individual driver's on-road fuel economy based on easily determined vehicle and driver attributes. Accuracy rather than bias appears to have limited the usefulness of the EPA 1984 estimates in predicting on-road MPG. The EPA 2008 estimates appear to be equally inaccurate and substantially more biased relative to the self-reported data. Furthermore, the 2008 estimates exhibit an underestimation bias that increases with increasing fuel economy, suggesting that the new numbers will tend to underestimate the real-world benefits of fuel economy and emissions standards. By including several simple driver and vehicle attributes, the Individualized Model reduces the unexplained variance by over 55% and the standard error by 33% based on an independent test sample. The additional explanatory variables can be easily provided by the individuals.

  16. Predictive Game Theory

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game: there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy, one must use a loss function external to the game's players. This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  17. Coating life prediction

    NASA Technical Reports Server (NTRS)

    Nesbitt, James A.; Gedwill, Michael A.

    1985-01-01

    The investigation combines experimental studies and numerical modeling to predict coating life in an oxidizing environment. The experimental work provides both input to and verification of two numerical models. The coatings being examined are an aluminide coating on Udimet 700 (U-700), a low-pressure plasma spray (LPPS) Ni-18Co-17Cr-24Al-0.2Y overlay coating also on U-700, and bulk deposits of the LPPS NiCoCrAlY coating.

  18. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature describing the features of the various models in a systematic manner.

  19. Asian summer monsoon rainfall predictability: a predictable mode analysis

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Lee, June-Yi; Xiang, Baoqiang

    2015-01-01

    To what extent the Asian summer monsoon (ASM) rainfall is predictable has been an important but long-standing issue in climate science. Here we introduce a predictable mode analysis (PMA) method to estimate predictability of the ASM rainfall. The PMA is an integral approach combining empirical analysis, physical interpretation and retrospective prediction. The empirical analysis detects the most important modes of variability; the interpretation establishes the physical basis of prediction of the modes; and the retrospective predictions with dynamical models and a physics-based empirical (P-E) model are used to identify the "predictable" modes. Potential predictability can then be estimated by the fractional variance accounted for by the "predictable" modes. For the ASM rainfall during June-July-August, we identify four major modes of variability in the domain (20°S-40°N, 40°E-160°E) during 1979-2010: (1) an El Niño-La Niña developing mode in the central Pacific, (2) an Indo-western Pacific monsoon-ocean coupled mode sustained by a positive thermodynamic feedback with the aid of the background mean circulation, (3) an Indian Ocean dipole mode, and (4) a warming trend mode. We show that these modes can be predicted reasonably well by a set of P-E prediction models as well as the coupled models' multi-model ensemble. The P-E and dynamical models have comparable skills and complementary strengths in predicting ASM rainfall. Thus, the four modes may be regarded as "predictable" modes, and about half of the ASM rainfall variability may be predictable. This work not only provides a useful approach for assessing seasonal predictability but also provides P-E prediction tools and a spatial-pattern-bias correction method to improve dynamical predictions. The proposed PMA method can be applied to a broad range of climate predictability and prediction problems.
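
    The mode-detection step of a PMA rests on standard empirical orthogonal function (EOF) analysis of the anomaly field. The following is a minimal sketch of that step only, not the authors' code; the function name, the synthetic data, and the use of plain SVD are assumptions for illustration:

```python
import numpy as np

def leading_modes(field, n_modes=4):
    """Empirical orthogonal functions (EOFs) of a (time x space) anomaly
    matrix via SVD, with the fraction of variance each mode explains."""
    anom = field - field.mean(axis=0)           # remove the time mean
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    pcs = u[:, :n_modes] * s[:n_modes]          # principal-component time series
    patterns = vt[:n_modes]                     # spatial patterns
    return pcs, patterns, var_frac[:n_modes]

# Synthetic "rainfall anomalies" dominated by one large-scale mode
rng = np.random.default_rng(2)
t = np.arange(32)                                # 32 seasons
pattern = np.sin(np.linspace(0, np.pi, 60))      # one spatial structure
field = np.outer(np.sin(0.5 * t), pattern) + 0.05 * rng.normal(size=(32, 60))
pcs, patterns, var_frac = leading_modes(field)   # first mode dominates
```

    In a PMA, modes like these are kept only if they also pass the physical-interpretation and retrospective-prediction filters; the variance fractions of the surviving modes then bound the potential predictability.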

  20. Bacterial start site prediction.

    PubMed

    Hannenhalli, S S; Hayes, W S; Hatzigeorgiou, A G; Fickett, J W

    1999-09-01

    With the growing number of completely sequenced bacterial genomes, accurate gene prediction in bacterial genomes remains an important problem. Although the existing tools predict genes in bacterial genomes with high overall accuracy, their ability to pinpoint the translation start site remains unsatisfactory. In this paper, we present a novel approach to bacterial start site prediction that takes into account multiple features of a potential start site, viz., ribosome binding site (RBS) binding energy, distance of the RBS from the start codon, distance from the beginning of the maximal ORF to the start codon, the start codon itself, and the coding/non-coding potential around the start site. Mixed integer programming was used to optimize the discriminatory system. The accuracy of this approach is up to 90%, compared to 70% using the most common tools in fully automated mode (that is, without expert human post-processing of results). The approach is evaluated using Bacillus subtilis, Escherichia coli and Pyrococcus furiosus. These three genomes cover a broad spectrum of bacterial genomes, since B. subtilis is a Gram-positive bacterium, E. coli is a Gram-negative bacterium and P. furiosus is an archaebacterium. A significant problem is generating a set of 'true' start sites for algorithm training, in the absence of experimental work. We found that sequence conservation between P. furiosus and the related Pyrococcus horikoshii clearly delimited the gene start in many cases, providing a sufficient training set. PMID:10446249

  1. Predicting Human Cooperation

    PubMed Central

    Nay, John J.; Vorobeychik, Yevgeniy

    2016-01-01

    The Prisoner’s Dilemma has been a subject of extensive research due to its importance in understanding the ever-present tension between individual self-interest and social benefit. A strictly dominant strategy in a Prisoner’s Dilemma (defection), when played by both players, is mutually harmful. Repetition of the Prisoner’s Dilemma can give rise to cooperation as an equilibrium, but defection is as well, and this ambiguity is difficult to resolve. The numerous behavioral experiments investigating the Prisoner’s Dilemma highlight that players often cooperate, but the level of cooperation varies significantly with the specifics of the experimental predicament. We present the first computational model of human behavior in repeated Prisoner’s Dilemma games that unifies the diversity of experimental observations in a systematic and quantitatively reliable manner. Our model relies on data we integrated from many experiments, comprising 168,386 individual decisions. The model is composed of two pieces: the first predicts the first-period action using solely the structural game parameters, while the second predicts dynamic actions using both game parameters and history of play. Our model is successful not merely at fitting the data, but in predicting behavior at multiple scales in experimental designs not used for calibration, using only information about the game structure. We demonstrate the power of our approach through a simulation analysis revealing how to best promote human cooperation. PMID:27171417

  2. Prediction of psychoacoustic parameters

    NASA Astrophysics Data System (ADS)

    Genuit, Klaus; Fiebig, Andre

    2005-09-01

    Noise is defined as an audible sound that disturbs the silence, or an intentional sound whose perception leads to annoyance. Thus, the assessment of noise cannot be reduced to simply determining objective parameters like the A-weighted SPL. The question of whether a sound is judged as noise can only be answered after the transformation from the sound event into a hearing event has been accomplished. The evaluation of noise depends on the physical characteristics of the sound event, on the psychoacoustical features of the human ear, and on the psychological state of the listener. The subjectively perceived noise quality depends not only on the A-weighted sound pressure level, but also on other psychoacoustic parameters such as loudness, roughness, sharpness, etc. The known methods for predicting the spatial A-weighted SPL distribution as a function of propagation are not suitable for predicting psychoacoustic parameters in an adequate way. In particular, the roughness provoked by modulation, or the sharpness generated by an accumulation of high-frequency sound energy, cannot readily be predicted as a function of distance.

  3. Predicting Human Cooperation.

    PubMed

    Nay, John J; Vorobeychik, Yevgeniy

    2016-01-01

    The Prisoner's Dilemma has been a subject of extensive research due to its importance in understanding the ever-present tension between individual self-interest and social benefit. A strictly dominant strategy in a Prisoner's Dilemma (defection), when played by both players, is mutually harmful. Repetition of the Prisoner's Dilemma can give rise to cooperation as an equilibrium, but defection is as well, and this ambiguity is difficult to resolve. The numerous behavioral experiments investigating the Prisoner's Dilemma highlight that players often cooperate, but the level of cooperation varies significantly with the specifics of the experimental predicament. We present the first computational model of human behavior in repeated Prisoner's Dilemma games that unifies the diversity of experimental observations in a systematic and quantitatively reliable manner. Our model relies on data we integrated from many experiments, comprising 168,386 individual decisions. The model is composed of two pieces: the first predicts the first-period action using solely the structural game parameters, while the second predicts dynamic actions using both game parameters and history of play. Our model is successful not merely at fitting the data, but in predicting behavior at multiple scales in experimental designs not used for calibration, using only information about the game structure. We demonstrate the power of our approach through a simulation analysis revealing how to best promote human cooperation. PMID:27171417

  4. PREDICT User's Manual

    SciTech Connect

    YOUNG, LARRY W.; STURGIS, BEVERLY R.

    2002-07-01

    Sandia National Laboratories has developed a Near Real Time Range Safety Analysis Tool named PREDICT that is based upon a probabilistic range safety analysis process. Probabilistic calculations of risk may be used in place of the total containment of potentially hazardous debris during a missile launch operation. Impact probabilities are computed based upon probabilistic density functions, Monte Carlo trajectories of dispersion events, and missile failure scenarios. Impact probabilities are then coupled with current demographics (land populations, commercial and military ship traffic, and aircraft traffic) to produce expected casualty predictions for a particular launch window. Historically, these calculations required days of computer time to finalize. Sandia has developed a process that utilizes the IBM SP machines at the Maui High Performance Computing Center and at the Arctic Region Supercomputing Center to reduce the computation time from days to as little as an hour or two. This analysis tool then allows the Missile Flight Safety Officer to make launch decisions based on the latest information (winds, ship, and aircraft movements) utilizing an intelligent risk management approach. This report provides a user's manual for PREDICT version 3.3.
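
    The core probabilistic calculation, coupling Monte Carlo impact dispersions with population data to obtain an expected-casualty figure, can be illustrated in miniature. This is a toy sketch, not PREDICT's method; the Gaussian dispersion, the rectangular populated region, and every number below are invented:

```python
import numpy as np

def expected_casualties(n_trials, sigma_km, region, density, lethal_area_km2, seed=0):
    """Monte Carlo expected-casualty estimate: sample debris impact points
    from a Gaussian dispersion, estimate the probability of hitting a
    populated rectangle, and scale by population density and lethal area."""
    rng = np.random.default_rng(seed)
    pts = rng.normal(0.0, sigma_km, size=(n_trials, 2))   # impact points (km)
    (x0, x1), (y0, y1) = region
    inside = ((pts[:, 0] >= x0) & (pts[:, 0] <= x1) &
              (pts[:, 1] >= y0) & (pts[:, 1] <= y1))
    return inside.mean() * density * lethal_area_km2

# Populated strip 10-20 km downrange, 5 km dispersion, 100 people/km²
ec = expected_casualties(100_000, sigma_km=5.0, region=((10, 20), (-5, 5)),
                         density=100.0, lethal_area_km2=1e-4)
```

    A launch decision rule then compares ec against a casualty-expectation threshold; the real tool layers failure scenarios, dispersed trajectories, and live ship and aircraft demographics on top of this skeleton.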

  5. Eclipse prediction in Mesopotamia.

    NASA Astrophysics Data System (ADS)

    Steele, J. M.

    2000-02-01

    Among the many celestial phenomena observed in ancient Mesopotamia, eclipses, particularly eclipses of the Moon, were considered to be among the astrologically most significant events. In Babylon, by at least the middle of the seventh century BC, and probably as early as the middle of the eighth century BC, astronomical observations were being systematically conducted and recorded in a group of texts which we have come to call Astronomical Diaries. These Diaries contain many observations and predictions of eclipses. The predictions generally include the expected time of the eclipse, apparently calculated quite precisely. By the last three centuries BC, the Babylonian astronomers had developed highly advanced mathematical theories of the Moon and planets. This paper outlines the various methods which appear to have been formulated by the Mesopotamian astronomers to predict eclipses of the Sun and the Moon. It also considers the question of which of these methods were actually used in compiling the Astronomical Diaries, and speculates why these particular methods were used.

  6. Towards Predicting Solar Flares

    NASA Astrophysics Data System (ADS)

    Winter, Lisa; Balasubramaniam, Karatholuvu S.

    2015-04-01

    We present a statistical study of solar X-ray flares observed using GOES X-ray observations of the ~50,000 flares that occurred from 1986 to mid-2014. Observed X-ray parameters are computed for each of the flares, including the 24-hour non-flare X-ray background in the 1-8 Å band and the maximum ratio of the short band (0.5-4 Å) to the long band (1-8 Å) during flares. These parameters, which are linked to the amount of active coronal heating and the maximum flare temperature, reveal a separation between the X-, M-, C-, and B-class flares. The separation was quantified and verified through machine-learning algorithms (k-nearest neighbor; nearest centroid). Using the solar flare parameters learned from solar cycles 22-23, we apply the models to predict flare categories of solar cycle 24. Skill scores are then used to assess the success of our models, yielding correct predictions for ~80% of M-, C-, and B-class flares and 100% correct predictions for X-class flares. We present details of the analysis along with potential uses of our model in flare forecasting.
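
    A k-nearest-neighbor classifier of the kind mentioned above fits in a few lines. The two features and the clustered toy data below are invented stand-ins for the background-flux and band-ratio parameters, not the study's data:

```python
import numpy as np

def knn_classify(train_X, train_y, x, k=5):
    """Classify feature vector x by majority vote among its k nearest
    training points (Euclidean distance)."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

# Toy stand-ins for two flare features: log background flux and the
# short/long band ratio; the clusters and numbers are invented
rng = np.random.default_rng(0)
bg = np.r_[rng.normal(-7.0, 0.2, 100), rng.normal(-5.0, 0.2, 100)]
ratio = np.r_[rng.normal(0.1, 0.02, 100), rng.normal(0.5, 0.05, 100)]
X = np.column_stack([bg, ratio])
y = np.array(["B"] * 100 + ["X"] * 100)
pred = knn_classify(X, y, np.array([-5.1, 0.45]))   # lands in the "X" cluster
```

    In practice the features should be rescaled before computing distances so that neither dominates; here the toy clusters are separated enough that plain Euclidean distance suffices.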

  7. [Predictive models for ART].

    PubMed

    Arvis, P; Guivarc'h-Levêque, A; Varlan, E; Colella, C; Lehert, P

    2013-02-01

    A predictive model is a mathematical expression estimating the probability of pregnancy by combining predictive variables, or indicators. Its development requires three successive phases: formulation of the model, its validation--internal, then external--and the impact study. Its performance is assessed by its discrimination and its calibration. Numerous models have been proposed for spontaneous pregnancies, IUI and IVF, but with rather poor results; their external validation was seldom carried out and was mainly inconclusive. The impact study--ascertaining whether use of a model improves medical practice--was only exceptionally performed. The ideal ART predictive model is a center-specific model that helps physicians choose between abstention, IUI and IVF by providing a reliable cumulative pregnancy rate for each option. Such a tool would make it possible to rationalize practice by avoiding premature, late, or hopeless treatments. The model would also allow performance comparisons between ART centers based on objective criteria. Today the best solution is to adjust the existing models to one's own practice, by considering models validated with variables describing the treated population, whilst adjusting the calculation to the center's performance. PMID:23182786

  8. NSF Funded Projects: Perspectives of Project Leaders.

    ERIC Educational Resources Information Center

    Custer, Rodney L.; Loepp, Franzie; Martin, G. Eugene

    2000-01-01

    A survey of 23 principal investigators of National Science Foundation-funded projects identified the chief facilitating factor to be direct contact with program officers. They had difficulties conceptualizing and envisioning their projects, finding time and confidence for writing proposals, and dealing with the complexity of the guidelines and the…

  9. Canadian Urban Dynamics Project. Project Canada West.

    ERIC Educational Resources Information Center

    Western Curriculum Project on Canada Studies, Edmonton (Alberta).

    This is a progress report of a curriculum development project aimed at involving students in community and regional development by creating an awareness of urban problems and instilling a sense of positive self worth and capability which will stimulate active community participation. Initial planning of the project is reported in ED 055 017. The…

  10. Dopamine reward prediction error coding

    PubMed Central

    Schultz, Wolfram

    2016-01-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards—an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware. PMID:27069377
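
    The quantity described, received minus predicted reward, is exactly the error term of Rescorla-Wagner / temporal-difference learning. A minimal single-cue sketch (illustrative only; the learning rate and trial count are arbitrary):

```python
def td_update(value, reward, alpha=0.1):
    """One Rescorla-Wagner / TD(0) step for a single predictive cue:
    the prediction error is received minus predicted reward."""
    delta = reward - value          # >0: better than predicted; <0: worse
    return value + alpha * delta, delta

# Repeated cue-reward pairings drive the predicted value toward 1.0 ...
v = 0.0
for _ in range(200):
    v, _ = td_update(v, reward=1.0)
# ... after which a fully predicted reward yields near-zero error,
_, d_predicted = td_update(v, reward=1.0)
# while omitting the reward yields a negative prediction error.
_, d_omitted = td_update(v, reward=0.0)
```

    This mirrors the three dopamine response patterns in the abstract: activation for positive errors, baseline activity for fully predicted rewards, and depression for negative errors.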

  11. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for the exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources, including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  12. Predictive Temperature Equations for Three Sites at the Grand Canyon

    NASA Astrophysics Data System (ADS)

    McLaughlin, Katrina Marie Neitzel

    Climate data collected at a number of automated weather stations were used to create a series of predictive equations spanning December 2009 to May 2010 in order to better predict the temperatures along hiking trails within the Grand Canyon. The central focus of this project is how atmospheric variables interact and can be combined to predict the weather in the Grand Canyon at the Indian Gardens, Phantom Ranch, and Bright Angel sites. Through the use of statistical analysis software and data regression, predictive equations were determined. The predictive equations are simple or multivariable best fits that reflect the curvilinear nature of the data. With data analysis software, curves resulting from the predictive equations were plotted along with the observed data. Each equation's reduced χ² was determined to aid the visual examination of the predictive equations' ability to reproduce the observed data. From this information, an equation or pair of equations was determined to be the best of the predictive equations. Although a best predictive equation for each month and season was determined for each site, future work may refine the equations to produce a more accurate predictive equation.
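
    The fit-then-check workflow described, a curvilinear least-squares fit scored by reduced χ², can be sketched as follows. The quadratic form, the synthetic "temperature" data, and the measurement error are invented for illustration; they are not the project's data or equations:

```python
import numpy as np

def fit_and_reduced_chi2(x, y, sigma, degree):
    """Polynomial least-squares fit plus the reduced chi-squared statistic,
    chi2 divided by (N - number of fitted parameters)."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    chi2 = np.sum((resid / sigma) ** 2)
    dof = len(x) - (degree + 1)
    return coeffs, chi2 / dof

# Synthetic diurnal temperatures with a known quadratic shape
rng = np.random.default_rng(1)
x = np.linspace(0, 24, 50)                       # hour of day
y = -0.2 * (x - 14) ** 2 + 30 + rng.normal(0, 0.5, x.size)
coeffs, red_chi2 = fit_and_reduced_chi2(x, y, sigma=0.5, degree=2)
```

    A reduced χ² near 1 indicates the fit is consistent with the stated measurement error; values far above 1 flag an underfit, while values well below 1 suggest the error was overestimated.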

  13. Projecting future drug expenditures--1996.

    PubMed

    Santell, J P

    1996-01-15

    The use of information on inflation, pharmacoeconomics, generic competition, new drug entities, site-specific drug-use patterns, legislation, and the changing health care environment in the projection of drug expenditures is discussed. Drug price inflation has declined from 6.9% in 1991 to 2.1% for part of 1995. Much of the decline is attributable to deep discounts given by manufacturers to managed care institutions. Some marketing specialists are predicting that drug manufacturers will begin to scale back discounts. Pharmaceutical industry analysts project that overall price increase for pharmaceuticals in the next 12-24 months will average 2.8% (range, 0-6%). Pharmacists need to be able to understand and critically evaluate pharmacoeconomic research, particularly studies conducted by the pharmaceutical industry. Savings due to increases in generic product selection may be offset to some degree by extensions of patent expiration dates under the General Agreement on Tariffs and Trade (GATT). Drug budget projections should include a complete review of new drugs and biotechnology agents pending FDA approval, drugs pending approval for new indications, and common unlabeled uses of expensive existing agents. Various methods are available for tracking drug-use patterns in specific practice settings. When resources are limited, pharmacy managers may elect to target only high-cost drugs; a proactive approach, such as projecting costs and developing guidelines for costly agents before their market release and before consideration by the pharmacy and therapeutics committee, is advantageous. Relevant legislative activities in 1995 included reform proposals for Medicare, Medicaid, and FDA; the Federal Acquisition Streamlining Act; and GATT. Disease management and other approaches to pharmacy benefits have increased opportunities for cooperative arrangements between drug companies and health care providers that may have major effects on drug marketing and pricing. Combining

  14. The CrossGrid project

    NASA Astrophysics Data System (ADS)

    Kunze, M.; CrossGrid Collaboration

    2003-04-01

    There are many large-scale problems that require new approaches to computing, such as earth observation, environmental management, biomedicine, and industrial and scientific modeling. The CrossGrid project addresses realistic problems in medicine, environmental protection, flood prediction, and physics analysis, and is oriented towards specific end-users: medical doctors, who could obtain new tools to help them reach correct diagnoses and to guide them during operations; industries, which could be advised on the best timing for critical operations involving a risk of pollution; flood crisis teams, which could predict the risk of a flood on the basis of historical records and current hydrological and meteorological data; and physicists, who could optimize the analysis of massive volumes of data distributed across countries and continents. The corresponding applications will be based on Grid technology and could be complex and difficult to use: the CrossGrid project therefore aims at developing several tools that will make the Grid more user-friendly for average users. Portals for specific applications will be designed that allow for easy connection to the Grid, create a customized work environment, and provide users with all the information necessary to get their job done.

  15. Progress on the DPASS project

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Bogatu, I. N.; Svidzinski, V. A.

    2015-11-01

    A novel project to develop the Disruption Prediction And Simulation Suite (DPASS), a set of comprehensive computational tools to predict, model, and analyze disruption events in tokamaks, has recently been started at FAR-TECH Inc. DPASS will eventually address the following aspects of the disruption problem: MHD, plasma edge dynamics, plasma-wall interaction, and the generation and loss of runaway electrons. DPASS uses the 3-D Disruption Simulation Code (DSC-3D) as a core tool and will have a modular structure. DSC is a one-fluid, nonlinear, time-dependent 3-D MHD code that simulates the dynamics of a tokamak plasma surrounded by a pure vacuum B-field in the real geometry of a conducting tokamak vessel. DSC utilizes an adaptive meshless technique, with adaptation to the moving plasma boundary, accurate magnetic flux conservation, and resolution of the plasma surface current. DSC also has an option to neglect the plasma inertia in order to eliminate the fast magnetosonic time scale; this option can be turned on and off as needed. During Phase I of the project, two modules will be developed: a computational module for modeling massive gas injection and the main plasma response, and a module for nanoparticle plasma-jet injection as an innovative disruption mitigation scheme. We report on the progress of this development. Work is supported by US DOE SBIR grant # DE-SC0013727.

  16. API Requirements for Dynamic Graph Prediction

    SciTech Connect

    Gallagher, B; Eliassi-Rad, T

    2006-10-13

    Given a large-scale, time-evolving, multi-modal and multi-relational complex network (a.k.a. a large-scale dynamic semantic graph), we want to implement algorithms that discover patterns of activities on the graph and learn predictive models of those discovered patterns. This document outlines the application programming interface (API) requirements for fast prototyping of feature extraction, learning, and prediction algorithms on large dynamic semantic graphs. Since our algorithms must operate on large-scale dynamic semantic graphs, we have chosen to use the graph API developed in the CASC Complex Networks Project. This API is supported on the back end by a semantic graph database (developed by Scott Kohn and his team). The advantages of using this API are that (i) we have full control of its development and (ii) the current API meets almost all of the requirements outlined in this document.

  17. Extreme events: dynamics, statistics and prediction

    NASA Astrophysics Data System (ADS)

    Ghil, M.

    2011-12-01

    In this talk, I will review work on extreme events, their causes and consequences, by a group of European and American researchers involved in a three-year project on these topics. The review covers theoretical aspects of time series analysis and of extreme value theory, as well as of the deterministic modeling of extreme events, via continuous and discrete dynamic models. The applications include climatic, seismic and socio-economic events, along with their prediction. Two important results refer to (i) the complementarity of spectral analysis of a time series in terms of the continuous and the discrete part of its power spectrum; and (ii) the need for coupled modeling of natural and socio-economic systems. Both these results have implications for the study and prediction of natural hazards and their human impacts.

  18. Extreme events: dynamics, statistics and prediction

    NASA Astrophysics Data System (ADS)

    Ghil, M.; Yiou, P.; Hallegatte, S.; Malamud, B. D.; Naveau, P.; Soloviev, A.; Friederichs, P.; Keilis-Borok, V.; Kondrashov, D.; Kossobokov, V.; Mestre, O.; Nicolis, C.; Rust, H. W.; Shebalin, P.; Vrac, M.; Witt, A.; Zaliapin, I.

    2011-05-01

    We review work on extreme events, their causes and consequences, by a group of European and American researchers involved in a three-year project on these topics. The review covers theoretical aspects of time series analysis and of extreme value theory, as well as of the deterministic modeling of extreme events, via continuous and discrete dynamic models. The applications include climatic, seismic and socio-economic events, along with their prediction. Two important results refer to (i) the complementarity of spectral analysis of a time series in terms of the continuous and the discrete part of its power spectrum; and (ii) the need for coupled modeling of natural and socio-economic systems. Both these results have implications for the study and prediction of natural hazards and their human impacts.

  19. Spacecraft Magnetic Cleanliness Prediction and Control

    NASA Astrophysics Data System (ADS)

    Weikert, S.; Mehlem, K.; Wiegand, A.

    2012-05-01

    The paper describes a sophisticated and realistic method for controlling and predicting the magnetic cleanliness of spacecraft, covering all phases of a project up to the final system test. From the initial version of the so-called magnetic moment allocation list, the necessary boom length can be determined. The list is then continuously updated with real unit test results, with the goal of ensuring that the magnetic cleanliness budget is not exceeded at a given probability level. A complete example is described. Synthetic spacecraft modeling, which predicts the final magnetic state of the spacecraft only quite late in a project, is also described. Finally, the most important cleanliness verification, the spacecraft system test, is described briefly with an example. The emphasis of the paper is on the magnetic dipole moment allocation method.

  20. Genome Majority Vote Improves Gene Predictions

    PubMed Central

    Wall, Michael E.; Raghavan, Sindhu; Cohn, Judith D.; Dunbar, John

    2011-01-01

    Recent studies have noted extensive inconsistencies in gene start sites among orthologous genes in related microbial genomes. Here we provide the first documented evidence that imposing gene start consistency improves the accuracy of gene start-site prediction. We applied an algorithm using a genome majority vote (GMV) scheme to increase the consistency of gene starts among orthologs. We used a set of validated Escherichia coli genes as a standard to quantify accuracy. Results showed that the GMV algorithm can correct hundreds of gene prediction errors in sets of five or ten genomes while introducing few errors. Using a conservative calculation, we project that GMV would resolve many inconsistencies and errors in publicly available microbial gene maps. Our simple and logical solution provides a notable advance toward accurate gene maps. PMID:22131910
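The majority-vote idea described above can be sketched in a few lines. The ortholog groups and start offsets below are hypothetical, and ties are broken by first occurrence rather than by the published algorithm's rules:

```python
from collections import Counter

def majority_vote_start(candidate_starts):
    """Pick the gene start offset chosen by the most genomes.

    candidate_starts: per-genome start offsets (relative to an alignment
    anchor) for one ortholog group.  Ties keep the first-seen offset, a
    simplification of the published GMV scheme.
    """
    offset, _ = Counter(candidate_starts).most_common(1)[0]
    return offset

def correct_gene_starts(ortholog_groups):
    """Replace each group's predicted starts with the group majority."""
    return {gene: majority_vote_start(starts)
            for gene, starts in ortholog_groups.items()}

# Hypothetical ortholog groups across five genomes, offsets in codons
groups = {
    "thrA": [0, 0, 0, 12, 0],   # one genome mispredicts by 12 codons
    "lacZ": [-9, 0, 0, 0, 0],   # one genome starts 9 codons early
}
corrected = correct_gene_starts(groups)
```

Both mispredicted starts are pulled back to the consensus offset of 0, mirroring how the GMV scheme imposes start-site consistency across related genomes.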

  1. Data driven propulsion system weight prediction model

    NASA Technical Reports Server (NTRS)

    Gerth, Richard J.

    1994-01-01

    The objective of the research was to develop a method to predict the weight of paper engines, i.e., engines that are in the early stages of development. The impetus for the project was the Single Stage To Orbit (SSTO) project, where engineers need to evaluate alternative engine designs. Since the SSTO is a performance-driven project, the performance models for alternative designs were well understood. The next tradeoff is weight. Since it is known that engine weight varies with thrust level, a model is required that can discriminate between engines that produce the same thrust. Above all, the model had to be rooted in data, with assumptions that could be justified based on the data. The general approach was to collect data on as many existing engines as possible and build a statistical model of engine weight as a function of various component performance parameters. This was considered a reasonable level at which to begin the project because the data would be readily available, and it would be at the level of most paper engines, prior to detailed component design.

  2. Radiation Effects: Core Project

    NASA Technical Reports Server (NTRS)

    Dicello, John F.

    1999-01-01

    methods and predictions which are being used to assess the levels of risks to be encountered and to evaluate appropriate strategies for countermeasures. Although the work in this project is primarily directed toward problems associated with space travel, the problem of protracted exposures to low-levels of radiation is one of national interest in our energy and defense programs, and the results may suggest new paradigms for addressing such risks.

  3. Predictive Modeling of the CDRA 4BMS

    NASA Technical Reports Server (NTRS)

    Coker, Robert F.; Knox, James C.

    2016-01-01

    As part of NASA's Advanced Exploration Systems (AES) program and the Life Support Systems Project (LSSP), fully predictive models of the Four Bed Molecular Sieve (4BMS) of the Carbon Dioxide Removal Assembly (CDRA) on the International Space Station (ISS) are being developed. This virtual laboratory will be used to help reduce mass, power, and volume requirements for future missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future crewed Mars missions as well as the resolution of anomalies observed in the ISS CDRA.

  4. Phonematic recognition by linear prediction: Experiment

    NASA Astrophysics Data System (ADS)

    Miclet, L.; Grenier, Y.; Leroux, J.

    The recognition of speech signals analyzed by linear prediction is introduced. The principle of the channel-adapted vocoder (CAV) is outlined. The learning of each channel model and adaptation to the speaker are discussed. A method stemming from the canonical analysis of correlations is given, which allows the CAV of one speaker to be derived from that of another. The projection function is learned from a series of key words pronounced by both speakers. The reconstruction of phonemes can be explained by recognition factors arising from the vocoder. Automata associated with the channels are used for local smoothing, and series of segments are processed to produce a phonemic lattice.

  5. A T-EOF Based Prediction Method.

    NASA Astrophysics Data System (ADS)

    Lee, Yung-An

    2002-01-01

    A new statistical time series prediction method based on temporal empirical orthogonal function (T-EOF) is introduced in this study. This method first applies singular spectrum analysis (SSA) to extract dominant T-EOFs from historical data. Then, the most recent data are projected onto an optimal subset of the T-EOFs to estimate the corresponding temporal principal components (T-PCs). Finally, a forecast is constructed from these T-EOFs and T-PCs. Results from forecast experiments on the El Niño sea surface temperature (SST) indices from 1993 to 2000 showed that this method consistently yielded better correlation skill than autoregressive models for a lead time longer than 6 months. Furthermore, the correlation skills of this method in predicting Niño-3 index remained above 0.5 for a lead time up to 36 months during this period. However, this method still encountered the `spring barrier' problem. Because the 1990s exhibited relatively weak spring barrier, these results indicate that the T-EOF based prediction method has certain extended forecasting capability in the period when the spring barrier is weak. They also suggest that the potential predictability of ENSO in a certain period may be longer than previously thought.
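The projection step described above, fitting the most recent data to a truncated T-EOF basis and extrapolating, can be illustrated with a minimal SSA sketch. The function name, window choices, and toy sinusoid are ours, not from the paper:

```python
import numpy as np

def teof_forecast(x, window, n_eofs, horizon):
    """Forecast `horizon` steps ahead by projecting the latest data
    onto the leading temporal EOFs of the series' trajectory matrix.

    A minimal sketch of the T-EOF idea: SSA extracts the EOFs, the most
    recent (window - horizon) points are fit to the truncated basis by
    least squares, and the basis is extrapolated forward.
    """
    n = len(x)
    # Trajectory (Hankel) matrix: rows are length-`window` segments
    X = np.array([x[i:i + window] for i in range(n - window + 1)])
    # Right singular vectors = temporal EOFs of length `window`
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    eofs = vt[:n_eofs].T                      # shape (window, n_eofs)
    known = window - horizon
    # Fit the latest observed segment to the truncated basis
    coeffs, *_ = np.linalg.lstsq(eofs[:known], x[-known:], rcond=None)
    # The unobserved tail of the reconstructed segment is the forecast
    return eofs[known:] @ coeffs

# Toy check on a pure sinusoid of period 20 (rank-2 dynamics)
t = np.arange(200)
series = np.sin(2 * np.pi * t / 20)
pred = teof_forecast(series, window=40, n_eofs=2, horizon=5)
truth = np.sin(2 * np.pi * np.arange(200, 205) / 20)
```

For this noiseless sinusoid two T-EOFs capture the dynamics exactly, so the five-step forecast matches the continuation of the series; real SST indices would of course need more EOFs and degrade with lead time.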

  6. QoS Predictability of Internet Services

    NASA Astrophysics Data System (ADS)

    Bilski, Tomasz

    The paper presents problems of QoS (Quality of Service) predictability of network services (mainly in a WAN environment). In the first part we present general remarks on the QoS predictability problem, mentioning some research projects and available resources. The main part of the paper deals with QoS predictability from both long-term and short-term viewpoints. We try to answer the question: is it possible to predict the network QoS/performance level using statistical data from the past? The term quality of service has many meanings, ranging from the user's qualitative perception of the service to a set of quantitative connection parameters (RTT (Round Trip Time), throughput, packet loss rate) necessary to achieve a particular service quality. In this paper we mostly use the second meaning of the term, based on RFC 2386 [1]. The statistical data on Internet performance analyzed here are taken from the IEPM (Internet End-to-end Performance Measurement) database.
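As a minimal illustration of predicting performance from past statistics, the exponentially weighted moving average that TCP uses for its smoothed RTT (RFC 6298, gain 1/8) can serve as a one-step RTT predictor; the sample history below is hypothetical:

```python
def ewma_rtt_predictor(samples, alpha=0.125):
    """Predict the next RTT as an exponentially weighted moving average
    of past samples (the SRTT smoothing of RFC 6298, alpha = 1/8)."""
    srtt = samples[0]
    for rtt in samples[1:]:
        srtt = (1 - alpha) * srtt + alpha * rtt
    return srtt

# Hypothetical RTT history in milliseconds
history = [100, 110, 105, 120, 115]
prediction = ewma_rtt_predictor(history)
```

The prediction tracks the recent level of the series while damping single outliers, which is the short-term flavor of predictability discussed in the paper; long-term prediction needs trend and seasonality on top of such smoothing.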

  7. Geostatistical enhancement of european hydrological predictions

    NASA Astrophysics Data System (ADS)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP is the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures, we focus in our experiment on the prediction of flow-duration curves (FDCs) along the stream network, which has attracted increasing scientific attention in recent decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from the Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a
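Top-kriging itself operates on nested catchment areas, but its core weight system is that of ordinary kriging; a minimal point-support sketch (1-D site coordinates and a linear variogram chosen purely for illustration) looks like this:

```python
import numpy as np

def ordinary_kriging(sites, values, target, variogram):
    """Ordinary-kriging estimate at `target` from point `sites`.

    Solves the standard system [[Gamma, 1], [1^T, 0]] [w; mu] = [g0; 1];
    Top-kriging generalizes the (co)variances to catchment-area supports,
    but the weight system below is the core either way.
    """
    n = len(sites)
    A = np.ones((n + 1, n + 1))
    A[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            A[i, j] = variogram(abs(sites[i] - sites[j]))
    b = np.ones(n + 1)
    b[:n] = [variogram(abs(s - target)) for s in sites]
    w = np.linalg.solve(A, b)[:n]       # kriging weights (sum to 1)
    return w @ values

# Hypothetical 1-D stations with a linear variogram
gamma = lambda h: h                     # semivariance grows with distance
est = ordinary_kriging(np.array([0.0, 1.0, 4.0]),
                       np.array([10.0, 12.0, 20.0]),
                       target=2.0, variogram=gamma)
```

With a linear variogram in one dimension the estimate reduces to linear interpolation between the bracketing stations, a handy sanity check on the weight system.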

  8. Analytical predictions of RTG power degradation. [Radioisotope Thermoelectric Generator

    NASA Technical Reports Server (NTRS)

    Noon, E. L.; Raag, V.

    1979-01-01

    The DEGRA computer code that is based on a mathematical model which predicts performance and time-temperature dependent degradation of a radioisotope thermoelectric generator is discussed. The computer code has been used to predict performance and generator degradation for the selenide Ground Demonstration Unit (GDS-1) and the generator used in the Galileo Project. Results of parametric studies of load voltage vs generator output are examined as well as the I-V curve and the resulting predicted power vs voltage. The paper also discusses the increased capability features contained in DEGRA2 and future plans for expanding the computer code performance.

  9. On identified predictive control

    NASA Technical Reports Server (NTRS)

    Bialasiewicz, Jan T.

    1993-01-01

    Self-tuning control algorithms are potential successors to the manually tuned PID controllers traditionally used in process control applications. A very attractive design method for self-tuning controllers, which has been developed over recent years, is long-range predictive control (LRPC). The success of LRPC is due to its effectiveness with plants of unknown order and dead-time, which may be simultaneously nonminimum phase and unstable or have multiple lightly damped poles (as in the case of flexible structures or flexible robot arms). LRPC is a receding horizon strategy and can, in general terms, be summarized as follows. Using an assumed long-range (or multi-step) cost function, the optimal control law is found in terms of the unknown parameters of the predictor model of the process, the current input-output sequence, and the future reference signal sequence. The common approach is to assume that the input-output process model is known or separately identified, and then to find the parameters of the predictor model. Once these are known, the optimal control law determines the control signal at the current time t, which is applied at the process input, and the whole procedure is repeated at the next time instant. Most of the recent research in this field is centered around the LRPC formulation developed by Clarke et al., known as generalized predictive control (GPC). GPC uses an ARIMAX/CARIMA model of the process in its input-output formulation. In this paper, the GPC formulation is used, but the process predictor model is derived from the state-space formulation of the ARIMAX model and is directly identified over the receding horizon, i.e., using the current input-output sequence. The underlying technique in the design of the identified predictive control (IPC) algorithm is the identification algorithm for observer/Kalman filter Markov parameters developed by Juang et al. at NASA Langley Research Center and successfully applied to the identification of flexible structures.
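The receding-horizon strategy summarized above can be sketched for the simplest case of a known first-order plant. The plant parameters, horizon, and weighting below are illustrative only; a full GPC would use an identified CARIMA model and penalize control increments rather than control amplitudes:

```python
import numpy as np

def receding_horizon_step(a, b, y0, ref, horizon, lam):
    """One receding-horizon move for the known plant y[t+1] = a*y[t] + b*u[t].

    Builds the multi-step predictor y = f*y0 + G u over `horizon` steps,
    minimizes ||y - ref||^2 + lam*||u||^2, and (per the receding-horizon
    strategy) applies only the first control move.
    """
    f = np.array([a ** (k + 1) for k in range(horizon)])  # free response
    G = np.zeros((horizon, horizon))                       # forced response
    for i in range(horizon):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    # Regularized least squares: (G'G + lam I) u = G'(ref - f*y0)
    u = np.linalg.solve(G.T @ G + lam * np.eye(horizon),
                        G.T @ (ref - f * y0))
    return u[0]

# Hypothetical stable plant driven to a setpoint of 1.0
a, b = 0.9, 0.5
y = 0.0
for _ in range(30):
    u0 = receding_horizon_step(a, b, y, np.ones(4), horizon=4, lam=0.01)
    y = a * y + b * u0   # apply first move, then recompute next step
```

Recomputing the whole optimization each step and keeping only the first move is what makes the horizon "recede"; the small `lam` leaves a slight steady-state offset that a Δu-penalized GPC formulation would remove.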

  10. Optimising Impact in Astronomy for Development Projects

    NASA Astrophysics Data System (ADS)

    Grant, Eli

    2015-08-01

    Positive outcomes in the fields of science education and international development are notoriously difficult to achieve. Among the challenges facing projects that use astronomy to improve education and socio-economic development is how to optimise project design in order to achieve the greatest possible benefits. Over the past century, medical scientists, along with statisticians and economists, have developed an increasingly sophisticated and scientific approach to designing, testing and improving social intervention and public health education strategies. This talk offers a brief review of the history and current state of `intervention science'. A similar framework is then proposed for astronomy outreach and education projects, with applied examples given of how existing evidence can be used to inform project design, predict and estimate cost-effectiveness, minimise the risk of unintended negative consequences and increase the likelihood of target outcomes being achieved.

  11. Human genetics: international projects and personalized medicine.

    PubMed

    Apellaniz-Ruiz, Maria; Gallego, Cristina; Ruiz-Pinto, Sara; Carracedo, Angel; Rodríguez-Antona, Cristina

    2016-03-01

    In this article, we present the progress driven by the recent technological advances and new revolutionary massive sequencing technologies in the field of human genetics. We discuss this knowledge in relation with drug response prediction, from the germline genetic variation compiled in the 1000 Genomes Project or in the Genotype-Tissue Expression project, to the phenome-genome archives, the international cancer projects, such as The Cancer Genome Atlas or the International Cancer Genome Consortium, and the epigenetic variation and its influence in gene expression, including the regulation of drug metabolism. This review is based on the lectures presented by the speakers of the Symposium "Human Genetics: International Projects & New Technologies" from the VII Conference of the Spanish Pharmacogenetics and Pharmacogenomics Society, held on the 20th and 21st of April 2015. PMID:26581075

  12. Fluctuations, Intermittency and Predictivity

    NASA Astrophysics Data System (ADS)

    Charbonneau, Paul

    This chapter considers the various mechanisms capable of producing amplitude and duration variations in the various dynamo models introduced in Chap. 3 (10.1007/978-3-642-32093-4_3). After a survey of observed and inferred fluctuation patterns of the solar cycle, the effects on the basic cycle of stochastic forcing, dynamical nonlinearities and time delay are considered in turn. The occurrence of intermittency in a subset of these models is then investigated, with an eye on explaining Grand Minima observed in the solar activity record. The chapter closes with a brief discussion of solar cycle prediction schemes based on dynamo models.

  13. Coal extraction - environmental prediction

    SciTech Connect

    C. Blaine Cecil; Susan J. Tewalt

    2002-08-01

    To predict and help minimize the impact of coal extraction in the Appalachian region, the U.S. Geological Survey (USGS) is addressing selected mine-drainage issues through the following four interrelated studies: spatial variability of deleterious materials in coal and coal-bearing strata; kinetics of pyrite oxidation; improved spatial geologic models of the potential for drainage from abandoned coal mines; and methodologies for the remediation of waters discharged from coal mines. As these goals are achieved, the recovery of coal resources will be enhanced. 2 figs.

  14. Age and Stress Prediction

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Genoa is a software product that predicts progressive aging and failure in a variety of materials. It is the result of a SBIR contract between the Glenn Research Center and Alpha Star Corporation. Genoa allows designers to determine if the materials they plan on applying to a structure are up to the task or if alternate materials should be considered. Genoa's two feature applications are its progressive failure simulations and its test verification. It allows for a reduction in inspection frequency, rapid design solutions, and manufacturing with low cost materials. It will benefit the aerospace, airline, and automotive industries, with future applications for other uses.

  15. Predicting Ground Illuminance

    NASA Astrophysics Data System (ADS)

    Lesniak, Michael V.; Tregoning, Brett D.; Hitchens, Alexandra E.

    2015-01-01

    Our Sun outputs 3.85 × 10²⁶ W of radiation, of which roughly 37% is in the visible band. It is directly responsible for nearly all natural illuminance experienced on Earth's surface, either in the form of direct/refracted sunlight or in reflected light bouncing off the surfaces and/or atmospheres of our Moon and the visible planets. Ground illuminance, defined as the amount of visible light intercepting a unit area of surface (from all incident angles), varies over 7 orders of magnitude from day to night. It is highly dependent on well-modeled factors such as the relative positions of the Sun, Earth, and Moon. It is also dependent on less predictable factors such as local atmospheric conditions and weather. Several models have been proposed to predict ground illuminance, including Brown (1952) and Shapiro (1982, 1987). The Brown model is a set of empirical data collected from observation points around the world that has been reduced to a smooth fit of illuminance against a single variable, solar altitude. It provides limited applicability to the Moon and to cloudy conditions via multiplicative reduction factors. The Shapiro model is a theoretical model that treats the atmosphere as a three-layer system of light reflectance and transmittance. It has different sets of reflectance and transmittance coefficients for various cloud types. In this paper we compare the models' predictions to ground illuminance data from an observing run at the White Sands missile range (the data were obtained from the United Kingdom's Meteorology Office). Continuous illuminance readings were recorded under various cloud conditions, during both daytime and nighttime hours. We find that under clear skies, the Shapiro model tends to better fit the observations during daytime hours, with typical discrepancies under 10%. Under cloudy skies, both models tend to predict ground illuminance poorly. However, the Shapiro model, with typical average daytime discrepancies of 25% or less in many cases

  16. Predicting Ground Illuminance

    NASA Astrophysics Data System (ADS)

    Lesniak, Michael V.

    2014-01-01

    Our Sun outputs 3.85 × 10²⁶ W of radiation, of which ≈37% is in the visible band. It is directly responsible for nearly all natural illuminance experienced on Earth's surface, either in the form of direct/refracted sunlight or in reflected light bouncing off the surfaces and/or atmospheres of our Moon and the visible planets. Ground illuminance, defined as the amount of visible light intercepting a unit area of surface (from all incident angles), varies over 7 orders of magnitude from day to night. It is highly dependent on well-modeled factors such as the relative positions of the Sun, Earth, and Moon. It is also dependent on less predictable factors such as local atmospheric conditions and weather. Several models have been proposed to predict ground illuminance, including Brown (1952) and Shapiro (1982, 1987). The Brown model is a set of empirical data collected from observation points around the world that has been reduced to a smooth fit of illuminance against a single variable, solar altitude. It provides limited applicability to the Moon and for cloudy conditions via multiplicative reduction factors. The Shapiro model is a theoretical model that treats the atmosphere as a three-layer system of light reflectance and transmittance. It has different sets of reflectance and transmittance coefficients for various cloud types. Ground illuminance data from an observing run at the White Sands missile range were obtained from the United Kingdom Meteorology Office. Based on available weather reports, five days of clear sky observations were selected. These data are compared to the predictions of the two models. We find that neither of the models provides an accurate treatment during twilight conditions when the Sun is at or a few degrees below the horizon. When the Sun is above the horizon, the Shapiro model straddles the observed data, ranging between 90% and 120% of the recorded illuminance. During the same times, the Brown model is between 70% and 90% of the

  17. Timebias corrections to predictions

    NASA Technical Reports Server (NTRS)

    Wood, Roger; Gibbs, Philip

    1993-01-01

    The importance to a satellite laser ranging (SLR) observer of an accurate knowledge of the time bias corrections to predicted orbits, especially for low satellites, is highlighted. Sources of time bias values and the optimum strategy for extrapolation are discussed from the viewpoint of an observer wishing to maximize the chances of getting returns from the next pass. What is said may be read as an appeal for wider and speedier use of existing data centers for the mutually beneficial exchange of time bias data.

  18. Stress Prediction System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA wanted to know how astronauts' bodies would react under various gravitational pulls and space suit weights. Under contract to NASA, the University of Michigan's Center for Ergonomics developed a model capable of predicting what type of stress and what degree of load a body could stand. The algorithm generated was commercialized with the ISTU (Isometric Strength Testing Unit) Functional Capacity Evaluation System, which simulates tasks such as lifting a heavy box or pushing a cart and evaluates the exertion expended. It also identifies the muscle group that limits the subject's performance. It is an effective tool of personnel evaluation, selection and job redesign.

  19. The Alzheimer's Project

    MedlinePlus

    The Alzheimer's Project: A 4-Part Documentary Series Starting May 10. (MedlinePlus magazine, Spring 2009 issue.)

  20. Elective Program Projects

    ERIC Educational Resources Information Center

    Estrada, Christelle

    1976-01-01

    Outlined is an interdisciplinary program in Ecology and Oceanography for grades six through eight. Numerous student projects are suggested in the outline and the course requirements and the project system are explained. (MA)

  1. Venezuela's Bolivarian Schools Project.

    ERIC Educational Resources Information Center

    Diaz, Maria Magnolia Santamaria

    2002-01-01

    Discusses efforts by the Venezuelan government to improve the nation's school infrastructure through the Bolivarian Schools Project administered by the Ministry of Education, Culture and Sport. The project set educational principles which are guiding current school building efforts. (EV)

  2. The Alzheimer's Project

    MedlinePlus

    The Alzheimer's Project: A 4-Part Documentary Series Starting May 10. (MedlinePlus magazine, Spring 2009 issue.)

  3. CALLA ENERGY BIOMASS COFIRING PROJECT

    SciTech Connect

    Unknown

    2003-07-01

    The Calla Energy Biomass Project, to be located in Estill County, Kentucky, is to be conducted in two phases. The objective of Phase I is to evaluate the technical and economic feasibility of cofiring biomass-based gasification fuel-gas in a power generation boiler. Waste coal fines are to be evaluated as the cofired fuel. The project is based on the use of commercially available technology for feeding and gas cleanup that would be suitable for deployment in municipal, large industrial, and utility applications. A further Phase I objective is to define a combustion system for the biomass gasification-based fuel-gas capable of stable, low-NOx combustion over the full range of gaseous fuel mixtures, with low carbon monoxide emissions and turndown capabilities suitable for large-scale power generation applications. The objective of Phase II is to design, install, and demonstrate the combined gasification and combustion system in a large-scale, long-term cofiring operation to promote acceptance and utilization of indirect biomass cofiring technology for large-scale power generation applications. GTI received supplemental authorization A002 from DOE for additional work to be performed under Phase I that further extends the performance period until the end of February 2003. The additional scope of work is for GTI to develop the gasification characteristics of selected feedstock for the project. To conduct this work, GTI assembled an existing "mini-bench" unit to perform the gasification tests. The results of the tests will be used to confirm or, if necessary, update the process design completed in Phase I, Task 1. During this performance period, work efforts focused on conducting tests of biomass feedstock samples on the 2-inch mini-bench gasifier. The gasification tests were completed. The GTI U-GAS model was used to check some of the early test results against the model predictions. Additional modeling will be completed to further verify the model predictions against actual results.

  4. Enhancing seasonal climate prediction capacity for the Pacific countries

    NASA Astrophysics Data System (ADS)

    Kuleshov, Y.; Jones, D.; Hendon, H.; Charles, A.; Cottrill, A.; Lim, E.-P.; Langford, S.; de Wit, R.; Shelton, K.

    2012-04-01

    Seasonal and inter-annual climate variability is a major factor in determining the vulnerability of many Pacific Island Countries to climate change, and there is a need to improve weekly to seasonal range climate prediction capabilities beyond what is currently available from statistical models. We describe a comprehensive project, conducted under the Australian Government's Pacific Adaptation Strategy Assistance Program (PASAP), to strengthen climate prediction capacity in the National Meteorological Services of 14 Pacific Island Countries and East Timor. The intent is particularly to reduce the vulnerability of current services to a changing climate and to improve the overall level of information available to assist with managing climate variability. Statistical models cannot account for aspects of climate variability and change that are not represented in the historical record. In contrast, dynamical physics-based models implicitly include the effects of a changing climate, whatever its character or cause, and can predict outcomes not seen previously. The transition from a statistical to a dynamical prediction system provides more valuable and applicable climate information to a wide range of climate-sensitive sectors throughout the countries of the Pacific region. In this project, we have developed seasonal climate outlooks based on the dynamical POAMA (Predictive Ocean-Atmosphere Model for Australia) seasonal forecast system. At present, meteorological services of the Pacific Island Countries largely employ statistical models for seasonal outlooks. Outcomes of the PASAP project have enhanced the seasonal prediction capabilities of the Pacific Island Countries, providing National Meteorological Services with an additional tool to analyse meteorological variables such as sea surface temperature, air temperature, pressure, and rainfall using POAMA outputs and to prepare more accurate seasonal climate outlooks.

  5. GHPsRUS Project

    DOE Data Explorer

    Battocletti, Liz

    2013-07-09

    The GHPsRUS Project's full name is "Measuring the Costs and Benefits of Nationwide Geothermal Heat Pump Deployment." The dataset contains employment and installation price data collected by four economic surveys: (1) GHPsRUS Project Manufacturer & OEM Survey, (2) GHPsRUS Project Geothermal Loop Survey, (3) GHPsRUS Project Mechanical Equipment Installation Survey, and (4) GHPsRUS Geothermal Heat Pump Industry Survey.

  6. Predictability of Rogue Events

    NASA Astrophysics Data System (ADS)

    Birkholz, Simon; Brée, Carsten; Demircan, Ayhan; Steinmeyer, Günter

    2015-05-01

    Using experimental data from three different rogue wave supporting systems, the determinism and predictability of the underlying dynamics are evaluated with methods of nonlinear time series analysis. Our analysis includes original records from the Draupner platform in the North Sea as well as time series from two optical systems. One of the latter was measured in the infrared tail of optical fiber supercontinua, the other in the fluence profiles of multifilaments. All three data sets exhibit extreme-value statistics and exceed the significant wave height in the respective system by a factor larger than 2. Nonlinear time series analysis indicates a different degree of determinism in the systems. The optical fiber scenario is found to be driven by quantum noise, whereas rogue waves emerge as a consequence of turbulence in the others. With the large number of rogue events observed in the multifilament system, we can systematically explore the predictability of such events in a turbulent system. We observe that rogue events do not necessarily appear without warning, but are often preceded by a short phase of relative order. This surprising finding sheds new light on the fascinating phenomenon of rogue waves.
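
    The delay-reconstruction machinery underlying such nonlinear time series analysis can be sketched in a few lines. This is an illustrative sketch only, assuming a two-dimensional time-delay reconstruction and a nearest-neighbor ("Lorenz analogue") forecast; the delay `tau` and the toy sine series are hypothetical stand-ins for real measurement records.

```python
import numpy as np

def delay_reconstruct(x, tau):
    """Two-dimensional time-delay reconstruction: rows are [x(t), x(t - tau)]."""
    return np.column_stack([x[tau:], x[:-tau]])

def near_neighbor_forecast(x, tau):
    """Forecast the next value by finding the nearest past reconstructed state
    and reading off the observation that followed it."""
    pts = delay_reconstruct(x, tau)
    current = pts[-1]
    # Search all earlier states (exclude the last, which has no known successor).
    dists = np.linalg.norm(pts[:-1] - current, axis=1)
    nn = np.argmin(dists)
    # pts[nn] corresponds to time tau + nn, so its successor is x[tau + nn + 1].
    return x[tau + nn + 1]

# Toy example: a slightly noisy sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.01 * np.random.default_rng(0).normal(size=t.size)
forecast = near_neighbor_forecast(x, tau=10)
```

    For a deterministic signal the forecast tracks the true continuation closely; for noise-driven records (like the fiber supercontinuum case) it does not, which is one way determinism can be probed.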

  7. Compressor map prediction tool

    NASA Astrophysics Data System (ADS)

    Ravi, Arjun; Sznajder, Lukasz; Bennett, Ian

    2015-08-01

    Shell Global Solutions uses an in-house developed system for remote condition monitoring of centrifugal compressors. It requires field process data collected during operation to calculate and assess the machine's performance. Performance is assessed by comparing live results of polytropic head and efficiency against the design compressor curves provided by the manufacturer. Typically, these design curves are given for specific suction conditions; the further the on-site conditions deviate from those prescribed at design, the less accurate the health assessment of the compressor becomes. To address this problem, a compressor map prediction tool is proposed. The original performance curves of polytropic head against volumetric flow for varying rotational speeds are used as an input to define a range of Mach numbers within which the non-dimensional invariant performance curve of head and volume flow coefficient is generated. The new performance curves of polytropic head vs. flow for a desired set of inlet conditions are then back-calculated using the invariant non-dimensional curve. Within the range of Mach numbers calculated from design data, the proposed methodology can predict polytropic head curves at a new set of inlet conditions within an estimated 3% accuracy. The presented methodology does not require knowledge of detailed impeller geometry (throat areas, blade number, blade angles, thicknesses) or other aspects of the aerodynamic design (diffusion levels, flow angles, etc.). The only required mechanical design feature is the first impeller tip diameter. The described method makes centrifugal compressor surveillance activities more accurate, enabling precise isolation of problems affecting the machine's performance.
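
    The rescaling step can be sketched as follows, assuming the usual definitions of the flow coefficient phi = Q/(u*D^2) and head coefficient psi = Hp/u^2; all numerical values (tip diameter, speeds, curve points) are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical design data: one speed line of polytropic head [J/kg]
# vs volumetric flow [m^3/s] for a centrifugal compressor stage.
D = 0.4                                     # first impeller tip diameter [m]
N_design = 11000.0                          # design speed [rpm]
flow = np.array([2.0, 2.5, 3.0, 3.5])       # volumetric flow [m^3/s]
head = np.array([38e3, 36e3, 32e3, 25e3])   # polytropic head [J/kg]

u = np.pi * D * N_design / 60.0             # impeller tip speed [m/s]

# Non-dimensional (invariant) curve.
phi = flow / (u * D**2)                     # flow coefficient
psi = head / u**2                           # head coefficient

def rescale_curve(N_new):
    """Back-calculate the head/flow curve at a new speed from the invariant
    curve. In practice the rescaled operating point must also keep the machine
    Mach number u / sqrt(k*Z*R*T1) within the range covered by design data."""
    u_new = np.pi * D * N_new / 60.0
    return phi * u_new * D**2, psi * u_new**2

q_new, h_new = rescale_curve(9000.0)
```

    The fan-law scalings (flow proportional to speed, head to speed squared) fall out of holding phi and psi fixed, which is why only the tip diameter is needed.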

  8. Predicting Alloreactivity in Transplantation

    PubMed Central

    Geneugelijk, Kirsten; Thus, Kirsten Anne; Spierings, Eric

    2014-01-01

    Human Leukocyte Antigen (HLA) mismatching leads to severe complications after solid-organ transplantation and hematopoietic stem-cell transplantation. The alloreactive responses underlying the posttransplantation complications include both direct recognition of allogeneic HLA by HLA-specific alloantibodies and T cells and indirect T-cell recognition. However, the immunogenicity of HLA mismatches is highly variable; some HLA mismatches lead to severe clinical B-cell- and T-cell-mediated alloreactivity, whereas others are well tolerated. Defining the permissibility of HLA mismatches prior to transplantation allows selection of donor-recipient combinations with a reduced chance of developing deleterious host-versus-graft responses after solid-organ transplantation and graft-versus-host responses after hematopoietic stem-cell transplantation. Therefore, several methods have been developed to predict permissible HLA-mismatch combinations. In this review we give a comprehensive overview of current knowledge regarding HLA-directed alloreactivity and of the in vitro and in silico tools developed to predict direct and indirect alloreactivity. PMID:24868561

  9. The Proposal Project

    ERIC Educational Resources Information Center

    Pierce, Elizabeth

    2007-01-01

    The proposal project stretches over a significant portion of the semester-long sophomore course Professional Communication (ENG 250) at Monroe Community College. While developing their proposal project, students need to use time management skills to successfully complete a quality project on time. In addition, excellent oral and written…

  10. Kansas Advanced Semiconductor Project

    SciTech Connect

    Baringer, P.; Bean, A.; Bolton, T.; Horton-Smith, G.; Maravin, Y.; Ratra, B.; Stanton, N.; von Toerne, E.; Wilson, G.

    2007-09-21

    KASP (Kansas Advanced Semiconductor Project) completed the new Layer 0 upgrade for D0, assumed key electronics projects for the US CMS project, finished important new physics measurements with the D0 experiment at Fermilab, made substantial contributions to detector studies for the proposed e+e- international linear collider (ILC), and advanced key initiatives in non-accelerator-based neutrino physics.

  11. Library Digitisation Project Management.

    ERIC Educational Resources Information Center

    Middleton, Michael

    Supervision of library digitization is the focus of this paper. First outlined are the definition, formalization, implementation, and completion phases of project management. Descriptions of management decisions involved in digitization projects follow on matters such as: collection analysis, resourcing, project personnel, production, access and…

  12. The Sidewalk Project

    ERIC Educational Resources Information Center

    Church, William

    2005-01-01

    In this article, the author features "the sidewalk project" in Littleton High School. The sidewalk project is a collaboration of more than 40 high school physics students, 10 local mentors, and a few regional and national organizations who worked together to invent a way to heat a sidewalk with an alternative energy source. The project, which…

  13. Project ASTRO: A Partnership.

    ERIC Educational Resources Information Center

    Rothenberger, Lisa

    2001-01-01

    Describes a project that enriches astronomy lessons with hands-on activities facilitated by an astronomer. The project links professional and amateur astronomers with middle-level classroom teachers and informal educators. Families and community organizations are also involved in the project. Provides information on how to join the ASTRO network.…

  14. Ideas for Science Projects.

    ERIC Educational Resources Information Center

    Showalter, Victor; Slesnick, Irwin

    This booklet was written for students as a source of ideas for research-type science projects. Part One shows how three high school students developed individual projects as a result of asking questions about the same natural phenomena. Part Two contains project suggestions and sample questions designed to stimulate student thinking along…

  15. Toll Gate Metrication Project

    ERIC Educational Resources Information Center

    Izzi, John

    1974-01-01

    The project director of the Toll Gate Metrication Project describes the project as the first structured United States public school educational experiment in implementing change toward the adoption of the International System of Units. He believes the change will simplify, rather than complicate, the educational task. (AG)

  16. Visible Human Project

    MedlinePlus

    The Visible Human Project® is an outgrowth of the NLM's 1986 Long- ... The long-term goal of the Visible Human Project® is to produce a system of knowledge structures ...

  17. THE ATLANTA SUPERSITE PROJECT

    EPA Science Inventory

    The Atlanta Supersites project is the first of two Supersites projects to be established during Phase I of EPA's Supersites Program; Phase II is being established through a Request for Assistance. The other initial project is in Fresno, California. The Supersites Program is par...

  18. Of Principals and Projects.

    ERIC Educational Resources Information Center

    Wyant, Spencer H.; And Others

    Principals play an important role in the success of externally funded change projects in their schools. Interviews exploring the participation of principals in such projects in 14 Oregon elementary and secondary schools provided 11 case studies illustrating helpful and unhelpful behaviors. The projects were found to have life cycles of their own,…

  19. Humane Education Projects Handbook.

    ERIC Educational Resources Information Center

    Junior League of Ogden, UT.

    This handbook was developed to promote interest in humane education and to encourage the adoption of humane education projects. Although specifically designed to assist Junior Leagues in developing such projects, the content should prove valuable to animal welfare organizations, zoos, aquariums, nature centers, and other project-oriented groups…

  20. The Eggen Card Project

    NASA Astrophysics Data System (ADS)

    Silvis, G.

    2014-06-01

    (Abstract only) Olin Eggen, noted astronomer (1919-1998), left to us all his raw observation records recorded on 3x5 cards. This project is to make all this data available as an online resource. History and progress of the project will be presented. Project details available at: https://sites.google.com/site/eggencards/home.

  1. Projection: A Bibliography.

    ERIC Educational Resources Information Center

    Pedrini, D. T.; Pedrini, Bonnie C.

    Sigmund Freud and his associates did much clinical work with the dynamic of projection, especially with regard to paranoid symptoms and syndromes. Much experimental work has also been done with projection. Sears evaluated the results of some of those studies. Murstein and Pryer sub-classified projection and reviewed typical studies. The…

  2. The Illinois Rivers Project.

    ERIC Educational Resources Information Center

    Williams, Robert A.; And Others

    The Illinois Rivers Project was developed as an integrated, multidimensional science/technology/society pilot project designed to introduce water quality dimensions into Illinois high schools. The project involved high school science, social science, and English teachers in an integrated study of their local river and community. Science students…

  3. Predicting hand function after hemidisconnection.

    PubMed

    Küpper, Hanna; Kudernatsch, Manfred; Pieper, Tom; Groeschel, Samuel; Tournier, Jacques-Donald; Raffelt, David; Winkler, Peter; Holthausen, Hans; Staudt, Martin

    2016-09-01

    Hemidisconnections (i.e. hemispherectomies or hemispherotomies) invariably lead to contralateral hemiparesis. Many patients with a pre-existing hemiparesis, however, experience no deterioration in motor functions, and some can still grasp with their paretic hand after hemidisconnection. The scope of our study was to predict this phenomenon. Hypothesizing that preserved contralateral grasping ability after hemidisconnection can only occur in patients controlling their paretic hands via ipsilateral corticospinal projections already in the preoperative situation, we analysed the asymmetries of the brainstem (by manual magnetic resonance imaging volumetry) and of the structural connectivity of the corticospinal tracts within the brainstem (by magnetic resonance imaging diffusion tractography), assuming that marked hypoplasia or Wallerian degeneration on the lesioned side in patients who can grasp with their paretic hands indicate ipsilateral control. One hundred and two patients who underwent hemidisconnections between 0.8 and 36 years of age were included. Before the operation, contralateral hand function was normal in 3/102 patients, 47/102 patients showed hemiparetic grasping ability and 52/102 patients could not grasp with their paretic hands. After hemidisconnection, 20/102 patients showed a preserved grasping ability, and 5/102 patients began to grasp with their paretic hands only after the operation. All these 25 patients suffered from pre- or perinatal brain lesions. Thirty of 102 patients lost their grasping ability. This group included all seven patients with a post-neonatally acquired or progressive brain lesion who could grasp before the operation, and also all three patients with a preoperatively normal hand function. The remaining 52/102 patients were unable to grasp pre- and postoperatively. 
On magnetic resonance imaging, the patients with preserved grasping showed significantly more asymmetric brainstem volumes than the patients who lost their grasping

  4. Projecting future sea level

    USGS Publications Warehouse

    Cayan, Daniel R.; Bromirski, Peter; Hayhoe, Katharine; Tyree, Mary; Dettinger, Mike; Flick, Reinhard

    2006-01-01

    California’s coastal observations and global model projections indicate that California’s open coast and estuaries will experience increasing sea levels over the next century. Sea level rise has affected much of the coast of California, including the Southern California coast, the Central California open coast, and the San Francisco Bay and upper estuary. These trends, quantified from a small set of California tide gages, have ranged from 10–20 centimeters (cm) (3.9–7.9 inches) per century, quite similar to that estimated for global mean sea level. So far, there is little evidence that the rate of rise has accelerated, and the rate of rise at California tide gages has actually flattened since 1980, but projections suggest substantial sea level rise may occur over the next century. Climate change simulations project a substantial rate of global sea level rise over the next century due to thermal expansion as the oceans warm and runoff from melting land-based snow and ice accelerates. Sea level rise projected from the models increases with the amount of warming. Relative to sea levels in 2000, by the 2070–2099 period, sea level rise projections range from 11–54 cm (4.3–21 in) for simulations following the lower (B1) greenhouse gas (GHG) emissions scenario, from 14–61 cm (5.5–24 in) for the middle-upper (A2) emission scenario, and from 17–72 cm (6.7–28 in) for the highest (A1fi) scenario. In addition to relatively steady secular trends, sea levels along the California coast undergo shorter period variability above or below predicted tide levels and changes associated with long-term trends. These variations are caused by weather events and by seasonal to decadal climate fluctuations over the Pacific Ocean that in turn affect the Pacific coast. Highest coastal sea levels have occurred when winter storms and Pacific climate disturbances, such as El Niño, have coincided with high astronomical tides. This study considers a range of projected future

  5. Prediction and predictability of North American seasonal climate variability

    NASA Astrophysics Data System (ADS)

    Infanti, Johnna M.

    Climate prediction on short time-scales such as months to seasons is of broad and current interest in the scientific research community. Monthly and seasonal climate prediction of variables such as precipitation, temperature, and sea surface temperature (SST) has implications for users in the agricultural and water management domains, among others. It is thus important to further understand the complexities of predicting these variables using the most recent practices in climate prediction. The overarching goal of this dissertation is to determine the important contributions to seasonal prediction skill, predictability, and variability over North America using current climate prediction models and approaches. This dissertation studies a variety of approaches to seasonal climate prediction over North America, including both climate prediction systems and methods of analysis. We utilize the North American Multi-Model Ensemble (NMME) System for Intra-Seasonal to Inter-Annual (ISI) Prediction to study seasonal climate prediction skill over North America and in particular for southeast US precipitation. We find that NMME results are often equal to or better than individual model results in terms of skill, as expected, making the ensemble a reasonable choice for southeast US seasonal climate predictions. However, climate models, including those involved in NMME, typically overestimate eastern Pacific warming during central Pacific El Nino events, which can affect regions influenced by teleconnections, such as the southeast US. Community Climate System Model version 4.0 (CCSM4) hindcasts and forecasts are included in NMME, and we perform a series of experiments that examine contributions to skill from certain drivers of North American climate prediction. The drivers we focus on are sea surface temperatures (SSTs) and their accuracy, land and atmosphere initialization, and ocean-atmosphere coupling. We compare measures of prediction skill of
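
    A common skill measure for comparisons of this kind is the anomaly correlation coefficient. The sketch below is illustrative only (the dissertation's actual diagnostics are not specified here), paired with the simplest equal-weight multi-model combination:

```python
import numpy as np

def anomaly_correlation(forecasts, observations, climatology):
    """Anomaly correlation coefficient: correlation of forecast and observed
    anomalies relative to a climatological mean. 1.0 is a perfect anomaly
    pattern; 0 indicates no skill over climatology."""
    f = forecasts - climatology
    o = observations - climatology
    return np.sum(f * o) / np.sqrt(np.sum(f**2) * np.sum(o**2))

def multimodel_mean(*model_forecasts):
    """Equal-weight multi-model ensemble mean, the simplest NMME-style
    combination of individual model forecasts."""
    return np.mean(model_forecasts, axis=0)
```

    Combining models with equal weights is only one choice; skill-based or Bayesian weighting schemes are also used in practice.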

  6. Underestimation of Project Costs

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Large projects almost always exceed their budgets. Estimating cost is difficult and estimated costs are usually too low. Three different reasons are suggested: bad luck, overoptimism, and deliberate underestimation. Project management can usually point to project difficulty and complexity, technical uncertainty, stakeholder conflicts, scope changes, unforeseen events, and other not really unpredictable bad luck. Project planning is usually over-optimistic, so the likelihood and impact of bad luck is systematically underestimated. Project plans reflect optimism and hope for success in a supposedly unique new effort rather than rational expectations based on historical data. Past project problems are claimed to be irrelevant because "This time it's different." Some bad luck is inevitable and reasonable optimism is understandable, but deliberate deception must be condemned. In a competitive environment, project planners and advocates often deliberately underestimate costs to help gain project approval and funding. Project benefits, cost savings, and probability of success are exaggerated and key risks ignored. Project advocates have incentives to distort information and conceal difficulties from project approvers. One naively suggested cure is more openness, honesty, and group adherence to shared overall goals. A more realistic alternative is threatening overrun projects with cancellation. Neither approach seems to solve the problem. A better method to avoid the delusions of over-optimism and the deceptions of biased advocacy is to base the project cost estimate on the actual costs of a large group of similar projects. Over optimism and deception can continue beyond the planning phase and into project execution. Hard milestones based on verified tests and demonstrations can provide a reality check.
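
    The suggested remedy of basing estimates on the actual costs of similar past projects (reference-class forecasting) can be illustrated with a short sketch; the overrun ratios and the 80th-percentile confidence choice here are hypothetical.

```python
import numpy as np

# Hypothetical overrun ratios (actual cost / estimated cost) from a
# reference class of completed, similar projects.
overruns = np.array([1.10, 1.25, 1.40, 1.05, 1.80, 1.30, 1.55, 1.20])

def reference_class_estimate(base_estimate, overruns, percentile=80):
    """Uplift a bottom-up estimate by the overrun ratio that the chosen
    fraction of past projects stayed within, instead of trusting the
    (usually optimistic) bottom-up figure alone."""
    uplift = np.percentile(overruns, percentile)
    return base_estimate * uplift

budget = reference_class_estimate(10e6, overruns)  # for a $10M bottom-up estimate
```

    The percentile expresses risk appetite: a higher percentile buys more confidence that the budget will not be exceeded, at the cost of a larger reserve.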

  7. Selection of sequence variants to improve dairy cattle genomic predictions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genomic prediction reliabilities improved when adding selected sequence variants from run 5 of the 1,000 bull genomes project. High density (HD) imputed genotypes for 26,970 progeny tested Holstein bulls were combined with sequence variants for 444 Holstein animals. The first test included 481,904 c...

  8. Budget Preparation and Inflation Prediction. AIR Forum Paper 1978.

    ERIC Educational Resources Information Center

    Garcia, Juan G.; And Others

    Price indices related to specific expenditure categories are necessary for realistic budget projections in higher education. Given the erratic inflationary behavior of the past seven years, and the uniqueness of higher education expenditures, realistic inflation prediction requires a balanced combination of analytic forecasting and intuitive…

  9. DOE-EPSCOR SPONSORED PROJECT FINAL REPORT

    SciTech Connect

    Zhu, Jianting

    2010-03-11

    Concern over the quality of environmental management and restoration has motivated the development of models for predicting water and solute transport in the vadose zone. Soil hydraulic properties are required inputs to subsurface models of water flow and contaminant transport in the vadose zone. Computer models are now routinely used in research and management to predict the movement of water and solutes into and through the vadose zone of soils. Such models can be used successfully only if reliable estimates of the soil hydraulic parameters are available. The hydraulic parameters considered in this project consist of the saturated hydraulic conductivity and four parameters of the water retention curve. Quantifying hydraulic parameters for heterogeneous soils is both difficult and time consuming. The overall objective of this project was to better quantify soil hydraulic parameters, which are critical in predicting water flow and contaminant transport in the vadose zone, through a comprehensive and quantitative study of heterogeneous soil hydraulic properties and the associated uncertainties. Systematic and quantitative consideration of parametric heterogeneity and uncertainty can properly address and further reduce predictive uncertainty for contamination characterization and environmental restoration at DOE-managed sites. We conducted a comprehensive study to assess soil hydraulic parameter heterogeneity and uncertainty, addressing a number of important issues related to soil hydraulic property characterization. The main focus was on new methods to characterize the anisotropy of unsaturated hydraulic properties typical of layered soil formations, an uncertainty updating method, and artificial neural network based pedo-transfer functions to predict hydraulic parameters from easily available data. The work also involved upscaling of hydraulic properties applicable to large scale flow and contaminant transport modeling in the vadose zone and
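
    A retention curve with exactly four parameters suggests the widely used van Genuchten model (residual and saturated water contents plus two shape parameters); whether that is the specific parameterization used in this project is an assumption, and the sketch below is purely illustrative.

```python
import numpy as np

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content from the van Genuchten retention model.
    h: suction head (positive, units consistent with 1/alpha);
    theta_r, theta_s: residual and saturated water contents;
    alpha, n: shape parameters, with m = 1 - 1/n (Mualem constraint)."""
    m = 1.0 - 1.0 / n
    Se = (1.0 + (alpha * np.asarray(h)) ** n) ** (-m)  # effective saturation
    return theta_r + (theta_s - theta_r) * Se
```

    At zero suction the soil is saturated (theta = theta_s), and water content falls monotonically as suction increases; fitting these four parameters per soil layer is what makes characterizing heterogeneous profiles so data-hungry.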

  10. Seasonal Atmospheric and Oceanic Predictions

    NASA Technical Reports Server (NTRS)

    Roads, John; Rienecker, Michele (Technical Monitor)

    2003-01-01

    Several projects associated with dynamical, statistical, single column, and ocean models are presented. The projects include: 1) Regional Climate Modeling; 2) Statistical Downscaling; 3) Evaluation of SCM and NSIPP AGCM Results at the ARM Program Sites; and 4) Ocean Forecasts.

  11. Coastal Ohio Wind Project

    SciTech Connect

    Gorsevski, Peter; Afjeh, Abdollah; Jamali, Mohsin; Bingman, Verner

    2014-04-04

    The Coastal Ohio Wind Project intends to address problems that impede deployment of wind turbines in the coastal and offshore regions of Northern Ohio. The project evaluates different wind turbine designs and the potential impact of offshore turbines on migratory and resident birds by developing multidisciplinary research involving wildlife biology, electrical and mechanical engineering, and geospatial science. First, the project conducted cost and performance studies of two- and three-blade wind turbines using a turbine design suited for the Great Lakes. The numerical studies comprised an analysis and evaluation of the annual energy production of two- and three-blade wind turbines to determine the levelized cost of energy. This task also involved wind tunnel studies of model wind turbines to quantify the wake flow field of upwind and downwind wind turbine-tower arrangements. The experimental work included a study of a scaled model of an offshore wind turbine platform in a water tunnel. The levelized cost of energy work consisted of the development and application of a cost model to predict the cost of energy produced by a wind turbine system placed offshore. The analysis found that a floating two-blade wind turbine presents the most cost-effective alternative for the Great Lakes. The load effects studies showed that the two-blade wind turbine model experiences less torque under all IEC Standard design load cases considered. Other load effects did not show this trend; depending on the design load case, the two-bladed wind turbine showed higher or lower load effects. The experimental studies of the wake were conducted using smoke flow visualization and hot wire anemometry. Flow visualization studies showed that in the downwind turbine configuration the wake flow was insensitive to the presence of the blade and was very similar to that of the tower alone. On the other hand, in the upwind turbine configuration, increasing the rotor blade angle of attack

  12. Prediction uncertainty of environmental change effects on temperate European biodiversity.

    PubMed

    Dormann, Carsten F; Schweiger, Oliver; Arens, P; Augenstein, I; Aviron, St; Bailey, Debra; Baudry, J; Billeter, R; Bugter, R; Bukácek, R; Burel, F; Cerny, M; Cock, Raphaël De; De Blust, Geert; DeFilippi, R; Diekötter, Tim; Dirksen, J; Durka, W; Edwards, P J; Frenzel, M; Hamersky, R; Hendrickx, Frederik; Herzog, F; Klotz, St; Koolstra, B; Lausch, A; Le Coeur, D; Liira, J; Maelfait, J P; Opdam, P; Roubalova, M; Schermann-Legionnet, Agnes; Schermann, N; Schmidt, T; Smulders, M J M; Speelmans, M; Simova, P; Verboom, J; van Wingerden, Walter; Zobel, M

    2008-03-01

    Observed patterns of species richness at landscape scale (gamma diversity) cannot always be attributed to a specific set of explanatory variables; rather, different alternative explanatory statistical models of similar quality may exist. Predictions of the effects of environmental change (such as in climate or land cover) on biodiversity may therefore differ considerably, depending on the chosen set of explanatory variables. Here we use multimodel prediction to evaluate effects of climate, land-use intensity and landscape structure on species richness in each of seven groups of organisms (plants, birds, spiders, wild bees, ground beetles, true bugs and hoverflies) in temperate Europe. We contrast this approach with traditional best-model predictions, which we show, using cross-validation, to have inferior prediction accuracy. Multimodel inference changed the importance of some environmental variables in comparison with the best model, and accordingly gave deviating predictions for environmental change effects. Overall, prediction uncertainty for the multimodel approach was only slightly higher than that of the best model, and absolute changes in predicted species richness were also comparable. Richness predictions varied generally more for the impact of climate change than for land-use change at the coarse scale of our study. Overall, our study indicates that the uncertainty introduced to environmental change predictions through uncertainty in model selection both qualitatively and quantitatively affects species richness projections. PMID:18070098
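
    Multimodel inference of the kind contrasted here with best-model prediction is commonly implemented with Akaike weights; the sketch below shows that standard recipe as an illustration (the study's own weighting scheme may differ).

```python
import numpy as np

def aic_weights(aic):
    """Akaike weights: relative support for each candidate model, computed
    from AIC differences to the best (lowest-AIC) model."""
    d = np.asarray(aic, dtype=float) - np.min(aic)
    w = np.exp(-0.5 * d)
    return w / w.sum()

def multimodel_prediction(predictions, aic):
    """Model-averaged prediction: weighted mean across candidate models.
    predictions: one row of predicted values per model."""
    return aic_weights(aic) @ np.asarray(predictions, dtype=float)
```

    Averaging over models hedges against picking the wrong "best" model, which is exactly the source of the selection uncertainty the abstract describes.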

  13. Modeling Success in FLOSS Project Groups

    SciTech Connect

    Beaver, Justin M; Cui, Xiaohui; ST Charles, Jesse Lee; Potok, Thomas E

    2009-01-01

    A significant challenge in software engineering is accurately modeling projects in order to correctly forecast success or failure. The primary difficulty is that software development efforts are complex in terms of both the technical and social aspects of the engineering environment. This is compounded by the lack of real data that captures both the measures of success in performing a process and the measures that reflect a group's social dynamics. This research focuses on the development of a model for predicting software project success that leverages the wealth of available open source project data in order to accurately model the behavior of those software engineering groups. Our model accounts for both the technical elements of software engineering and the social elements that drive the decisions of individual developers. We use agent-based simulations to represent the complexity of the group interactions, and base the behavior of the agents on the real software engineering data acquired. For four of the five project success measures, our results indicate that the developed model represents the underlying data well and provides accurate predictions of open source project success indicators.

  14. ON PREDICTION AND MODEL VALIDATION

    SciTech Connect

    M. MCKAY; R. BECKMAN; K. CAMPBELL

    2001-02-01

    Quantification of prediction uncertainty is an important consideration when using mathematical models of physical systems. This paper proposes a way to incorporate ''validation data'' in a methodology for quantifying uncertainty of the mathematical predictions. The report outlines a theoretical framework.

  15. The FLARECAST Project and What Lies Beyond

    NASA Astrophysics Data System (ADS)

    Georgoulis, Manolis K.; Flarecast Team

    2016-04-01

    Solar eruptions comprise three distinct manifestations, namely flares, coronal mass ejections (CMEs), and solar energetic particle (SEP) events. All of these eruptive manifestations affect heliospheric space weather at different spatial and temporal scales, and should therefore ideally be predicted to shield humanity and its assets in space and, in some cases, on Earth's surface. The European Commission (EC) has endorsed this need, calling for and funding projects targeted at forecasting aspects of the near-Earth space environment. The Flare Likelihood And Region Eruption foreCASTing (FLARECAST) project is one of them, with the objective of developing a definitive, openly accessible solar-flare prediction facility. We will focus on the main attributes of this facility, namely its ability to expand by incorporating new flare predictors, and its setup, which is intended to couple tactical understanding of the flare phenomenon with a consolidated view of how this understanding can be turned into a deliverable with practical, operational value. A third component of the FLARECAST project, its exploratory part, aims to bridge flare prediction with the prediction of CMEs and, hopefully, SEP events, touching the other two areas of space-weather forecasting. Fragmented but very significant work exists in these areas, prompting one to envision a future EC-funded unified prediction platform that could address all forecasting needs of Sun-generated space weather. Research partially funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No. 640216.

  16. Predictive assessment of reading.

    PubMed

    Wood, Frank B; Hill, Deborah F; Meyer, Marianne S; Flowers, D Lynn

    2005-12-01

    Study 1 retrospectively analyzed neuropsychological and psychoeducational tests given to N=220 first graders, with follow-up assessments in third and eighth grade. Four predictor constructs were derived: (1) Phonemic Awareness, (2) Picture Vocabulary, (3) Rapid Naming, and (4) Single Word Reading. Together, these accounted for 88%, 76%, 69%, and 69% of the variance, respectively, in first, third, and eighth grade Woodcock-Johnson Broad Reading and eighth grade Gates-MacGinitie. When Single Word Reading was excluded from the predictors, the remaining predictors still accounted for 71%, 65%, 61%, and 65% of variance in the respective outcomes. Secondary analyses of risk of low outcome showed sensitivities/specificities of 93.0/91.0 and 86.4/84.9, respectively, for predicting which students would be in the bottom 15% and 30% of actual first grade WJBR. Sensitivities/specificities were 84.8/83.3 and 80.2/81.3, respectively, for predicting the bottom 15% and 30% of actual third grade WJBR outcomes; eighth grade outcomes had sensitivities/specificities of 80.0/80.0 and 85.7/83.1, respectively, for the bottom 15% and 30% of actual eighth grade WJBR scores. Study 2 cross-validated the concurrent predictive validities in an N=500 geographically diverse sample of late kindergartners through third graders, whose ethnic and racial composition closely approximated the national early elementary school population. New tests of the same four predictor domains were used, together taking only 15 minutes to administer by teachers; the new Woodcock-Johnson III Broad Reading standard score was the concurrent criterion, whose testers were blind to the predictor results. This cross-validation showed 86% of the variance accounted for, using the same regression weights as used in Study 1. With these weights, sensitivity/specificity values for the 15% and 30% thresholds were, respectively, 91.3/88.0 and 94.1/89.1. These validities and accuracies are stronger than others reported for
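
    The sensitivity/specificity pairs reported above follow the standard definitions for a dichotomized screening decision. The sketch below shows the computation on toy data (the lists are made up for illustration, not the study's):

    ```python
    def sensitivity_specificity(predicted_at_risk, actually_low):
        """Sensitivity = TP/(TP+FN): fraction of truly low-outcome students
        flagged as at risk. Specificity = TN/(TN+FP): fraction of students
        with adequate outcomes correctly not flagged."""
        pairs = list(zip(predicted_at_risk, actually_low))
        tp = sum(p and a for p, a in pairs)
        fn = sum((not p) and a for p, a in pairs)
        tn = sum((not p) and (not a) for p, a in pairs)
        fp = sum(p and (not a) for p, a in pairs)
        return tp / (tp + fn), tn / (tn + fp)

    # Toy example: two of four flagged students truly fall in the low-outcome group
    sens, spec = sensitivity_specificity(
        [True, True, False, False],   # screening decision per student
        [True, False, True, False],   # actual bottom-15%/30% membership
    )
    # -> sens == 0.5, spec == 0.5
    ```

    In the study's design, `predicted_at_risk` would come from thresholding the weighted predictor composite, and `actually_low` from the student's actual WJBR percentile.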

  17. Managing Projects with KPRO

    NASA Technical Reports Server (NTRS)

    Braden, Barry M.

    2004-01-01

    How does a Project Management Office provide consistent, familiar, easily used scheduling tools to project managers and project team members? Provide a complete list of the organizational resources available for use on a project? Facilitate resource tracking and visibility? Produce the myriad reports that the organization requires? Facilitate consistent budget planning and cost-performance information? Provide all of this to the entire organization, accommodate the organization's unique requirements, and get people to actually use it? Answer: implementation of the Kennedy Space Center Projects and Resources Online (KPRO) system, a modified COTS solution.

  18. ALS Project Management Manual

    SciTech Connect

    Krupnick, Jim; Harkins, Joe

    2000-05-01

    This manual has been prepared to help establish a consistent baseline of management practices across all ALS projects. It describes the initial process of planning a project, with a specific focus on the production of a formal project plan. We feel that the primary weakness in ALS project management efforts to date stems from a failure to appreciate the importance of "up-front" project planning. In this document, we present a guide (with examples) to preparing the documents necessary to properly plan, monitor, and control a project's activities. While following the manual will certainly not guarantee good project management, failure to address the issues we raise will dramatically reduce the chance of success. Here we define success as meeting the technical goals on schedule and within the prescribed budget.

  19. Advancing Drought Understanding, Monitoring and Prediction

    NASA Technical Reports Server (NTRS)

    Mariotti, Annarita; Schubert, Siegfried D.; Mo, Kingtse; Peters-Lidard, Christa; Wood, Andy; Pulwarty, Roger; Huang, Jin; Barrie, Dan

    2013-01-01

    Focused and coordinated research efforts are needed, drawing on excellence from across the broad drought research community. To meet this challenge, the National Oceanic and Atmospheric Administration (NOAA)'s Drought Task Force was established in October 2011 with the ambitious goal of achieving significant new advances in the ability to understand, monitor, and predict drought over North America. The Task Force (October 2011-September 2014) is an initiative of NOAA's Climate Program Office Modeling, Analysis, Predictions, and Projections (MAPP) program in partnership with NIDIS. It brings together over 30 leading MAPP-funded drought scientists from multiple academic and federal institutions (including NOAA's research laboratories and centers, the National Aeronautics and Space Administration (NASA), the U.S. Department of Agriculture, the National Center for Atmospheric Research (NCAR), and many universities) in a concerted research effort that builds on individual MAPP research projects. These projects span the wide spectrum of drought research needed to make fundamental advances, from those aimed at the basic understanding of drought mechanisms to those aimed at testing new drought monitoring and prediction tools for operational and service purposes (as part of NCEP's Climate Test Bed). The Drought Task Force provides focus and coordination to MAPP drought research activities and also facilitates synergies with other national and international drought research efforts, including those of the GDIS.

  20. Motor degradation prediction methods

    SciTech Connect

    Arnold, J.R.; Kelly, J.F.; Delzingaro, M.J.

    1996-12-01

    Motor Operated Valve (MOV) squirrel-cage AC motor rotors are susceptible to degradation under certain conditions. Premature failure can result from high humidity/temperature environments, high running-load conditions, extended periods at locked-rotor conditions (i.e., > 15 seconds), or exceeding the motor's duty cycle through frequent starts or multiple valve stroking. Exposure to high heat and moisture due to packing leaks, pressure seal ring leakage, or other causes can significantly accelerate the degradation. ComEd and Liberty Technologies have worked together to provide and validate a non-intrusive method that uses motor power diagnostics to evaluate MOV rotor condition and predict failure. These techniques provide a quick, low-radiation-dose method to evaluate inaccessible motors, identify degradation, and allow scheduled replacement of motors prior to catastrophic failures.