Reducing the Risk of Human Space Missions with INTEGRITY
NASA Technical Reports Server (NTRS)
Jones, Harry W.; Dillon-Merrill, Robin L.; Tri, Terry O.; Henninger, Donald L.
2003-01-01
The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology that decomposes the system into subsystems and components and quantifies the failure risk as a function of the design elements and their corresponding probabilities of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.
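The PRA logic sketched in this abstract reduces, at its simplest, to combining component failure probabilities through the system's fault logic. The snippet below is a minimal illustration of that arithmetic only, not the INTEGRITY analysis; the subsystem names and probabilities are invented for the example.

```python
# Minimal sketch of the core PRA arithmetic: combine component failure
# probabilities through simple system logic (series and redundant-parallel
# subsystems). Component names and numbers are illustrative, not mission data.

def series_failure(p_failures):
    """System fails if any element in the series fails."""
    p_survive = 1.0
    for p in p_failures:
        p_survive *= (1.0 - p)
    return 1.0 - p_survive

def parallel_failure(p_failures):
    """Redundant block fails only if every element fails."""
    p_fail = 1.0
    for p in p_failures:
        p_fail *= p
    return p_fail

# Illustrative subsystem decomposition (per-mission failure probabilities).
life_support = parallel_failure([0.02, 0.02])   # two redundant strings
power        = series_failure([0.01, 0.005])    # arrays plus distribution
thermal      = 0.008

mission_loss = series_failure([life_support, power, thermal])
print(f"Estimated mission loss probability: {mission_loss:.4f}")
```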
Future-Orientated Approaches to Curriculum Development: Fictive Scripting
ERIC Educational Resources Information Center
Garraway, James
2017-01-01
Though the future cannot be accurately predicted, it is possible to envisage a number of probable developments which can promote thinking about the future and so promote a more informed stance about what should or should not be done. Studies in technology and society have claimed that the use of a type of forecasting using plausible but imaginary…
Handbook for Conducting Future Studies in Education.
ERIC Educational Resources Information Center
Phi Delta Kappa, Bloomington, IN.
This handbook is designed to aid school administrators, policy-makers, and teachers in bringing a "futures orientation" to their schools. The first part of the book describes a "futuring process" developed as a tool for examining alternative future probabilities. It consists of a series of diverging and converging techniques that alternately…
History and future of remote sensing technology and education
NASA Technical Reports Server (NTRS)
Colwell, R. N.
1980-01-01
A historical overview of the discovery and development of photography, related sciences, and remote sensing technology is presented. The role of education to date in the development of remote sensing is discussed. The probable future and potential of remote sensing and training is described.
NASA Astrophysics Data System (ADS)
Taner, M. U.; Ray, P.; Brown, C.
2016-12-01
Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing strategies that are flexible or adaptive holds intuitive appeal, developing well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each optimal under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and most likely conditional on the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.
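The selection phase described above can be illustrated with a small sketch: each candidate pathway is assumed to be optimal under one set of scenario probabilities, and the pathways are then re-scored under all probability sets to find the least sensitive one. The performance matrix and probability vectors below are hypothetical placeholders, not Niger Basin results.

```python
import numpy as np

# Sketch of the post-optimization selection step: score every candidate
# pathway under every scenario-probability assumption and keep the pathway
# whose expected performance is least sensitive to those assumptions.

# performance[i, j] = value of pathway i under climate scenario j (made up)
performance = np.array([[100.0, 60.0, 30.0],
                        [ 85.0, 70.0, 55.0],
                        [ 70.0, 68.0, 66.0]])

# Alternative probability assignments over the three climate scenarios
prob_sets = np.array([[0.5, 0.3, 0.2],
                      [0.2, 0.5, 0.3],
                      [0.1, 0.3, 0.6]])

expected = performance @ prob_sets.T                    # pathways x probability sets
spread = expected.max(axis=1) - expected.min(axis=1)    # sensitivity to the assumption
least_sensitive = int(np.argmin(spread))

print("Expected values:\n", expected)
print("Least probability-sensitive pathway:", least_sensitive)
```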
Interfutures: Facing the Future, Mastering the Probable and Managing the Unpredictable.
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France).
This report discusses the findings of the three year Interfutures Project which studied the future development of advanced industrial societies and the relations between these countries and the developing countries. The major emphasis of the project was to analyze economic problems. However, political and social elements were also studied. The…
Trempala, J; Malmberg, L E
1998-05-01
The purpose of this study was to describe the effect of a set of individual resources and cultural factors on adolescents' probability estimations of the occurrence of positive future events in three life domains: education, occupation, and family. The hypothesis was that the effects of culture and individual resources are interwoven in the formation process of future orientation. The sample consisted of 352 17-year-old Polish and Finnish girls and boys from vocational and upper secondary schools. The 78-item questionnaire developed by the authors was used to measure different aspects of future orientation (probability, valence, and extension of future events in three life domains) and individual resources (self-esteem, control beliefs, and social knowledge about normativity and the generation gap). Data analysis showed that culture separately affected individual resources and adolescents' expectations. However, the results broadly confirmed the thesis that culture has a limited effect on adolescents' expectations of the occurrence of future events. Moreover, these data suggested that the influence of sociocultural differences on adolescents' probability estimations is indirect. In the context of the presented data, the authors discuss their model of future orientation.
20 Years of Research into Violence and Trauma: Past and Future Developments
ERIC Educational Resources Information Center
Kamphuis, Jan H.; Emmelkamp, Paul M. G.
2005-01-01
This reflection on major developments in the past, present, and future of the wider field of violence and trauma is a personal (and probably biased) sampling of what the authors hold to be important. The authors reviewed advances for victims and perpetrators of violence separately. For victims, the authors note that empirical research has…
Analysis of blocking probability for OFDM-based variable bandwidth optical network
NASA Astrophysics Data System (ADS)
Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi
2011-12-01
Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Because of its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems and enables elastic bandwidth transmission, making it a likely direction for future optical networking. In OFDM-based optical networks, analysis of the blocking probability is very important for network assessment. Current research for WDM networks is largely based on a fixed bandwidth; to accommodate future traffic and the fast-changing development of optical networks, our study addresses variable-bandwidth OFDM-based optical networks. Building on existing theory and algorithms, we apply mathematical analysis and theoretical derivation to study the blocking probability of variable-bandwidth optical networks and then build a model for the blocking probability.
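For context, the classical blocking-probability result that fixed-bandwidth analyses build on is the Erlang B formula; the sketch below computes it with the standard recursion. It is a baseline illustration only and is not the variable-bandwidth OFDM model developed in the paper; the traffic load and channel count are arbitrary.

```python
# Classical fixed-bandwidth baseline: Erlang B blocking probability for
# Poisson arrivals offered to a fixed number of channels, via the standard
# stable recursion B(m) = A*B(m-1) / (m + A*B(m-1)).

def erlang_b(offered_load_erlangs, channels):
    b = 1.0
    for m in range(1, channels + 1):
        b = (offered_load_erlangs * b) / (m + offered_load_erlangs * b)
    return b

# Illustrative values: 40 Erlangs offered to 50 channels.
print(f"Blocking probability: {erlang_b(40.0, 50):.4f}")
```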
Global mega forces: Implications for the future of natural resources
George H. Kubik
2012-01-01
The purpose of this paper is to provide an overview of leading global mega forces and their importance to the future of natural resource decisionmaking, policy development, and operation. Global mega forces are defined as a combination of major trends, preferences, and probabilities that come together to produce the potential for future high-impact outcomes. These...
An approach to evaluating reactive airborne wind shear systems
NASA Technical Reports Server (NTRS)
Gibson, Joseph P., Jr.
1992-01-01
An approach to evaluating reactive airborne windshear detection systems was developed to support a deployment study for future FAA ground-based windshear detection systems. The deployment study methodology assesses potential future safety enhancements beyond planned capabilities. The reactive airborne systems will be an integral part of planned windshear safety enhancements. The approach to evaluating reactive airborne systems involves separate analyses for both landing and take-off scenarios. The analysis estimates the probability of effective warning, considering several factors including NASA energy height loss characteristics, reactive alert timing, and a probability distribution for microburst strength.
Domestic and world trends (1980 - 2000) affecting the future of aviation
NASA Technical Reports Server (NTRS)
Friedman, N.; Overholt, W.; Thomas, J.; Wiener, A. J.
1975-01-01
Variables affecting aviation in the United States during the last fifth of the twentieth century are studied. Estimates of relevant future developments are presented and their probable impacts on the aviation industry in this country are identified. A series of key trends relating to economic, social, political, technological, ecological and environmental developments is identified and discussed in relation to their possible effects on aviation. From this analysis, a series of scenarios is developed representing an array of possibilities ranging from severe economic depression and high international tension on the one hand, to a world of detente which enjoys an unprecedented economic growth rate and relaxation of tensions on the other. A surprise-free scenario is presented which represents the best judgment of the manner in which events will most probably develop and the effect on the aviation industry such developments will likely produce.
Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F
2010-12-01
We assessed the probability of three major natural hazards--windthrow, drought, and forest fire--for Central and South-Eastern European forests; these hazards are major threats to the provision of forest goods and ecosystem services. In addition, we analyzed spatial distribution and implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of pET by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of probability maps. Regions with high probabilities of natural hazard are identified and management strategies to minimize the probability of natural hazards are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework of adaptive risk management.
The ESA activities on future launchers
NASA Technical Reports Server (NTRS)
Pfeffer, H.
1984-01-01
A future launcher development scenario depends on many assumptions, such as the impetus provided by the probability of future missions, and the political willingness of member states to undertake future developments. Because of the long timescale implied by a coherent launcher development, a step-wise approach within an overall future launcher development plan appears essential. The definition of development steps allows the launcher developments to be adapted to the driving external forces, so that no possible opportunity for Europe in the space launch business is missed because of improper planning or the absence of a long-term goal. The launcher scenario, to be presented in 1985, forms part of Europe's overall STS plan for the future. This overall STS plan is one product of the complete STS LTPP, a first draft of which should exist by 1985, and which will be updated regularly to take into account the changing political and economic perspectives.
Assessment of potential future hydrogen markets in the U.S.
NASA Technical Reports Server (NTRS)
Kashani, A. K.
1980-01-01
Potential future hydrogen markets in the United States are assessed. Future hydrogen markets for various use sectors are projected, the probable range of hydrogen production costs from various alternatives is estimated, stimuli and barriers to the development of hydrogen markets are discussed, an overview of the status of technologies for the production and utilization of hydrogen is presented, and, finally, societal aspects of hydrogen production and utilization are discussed.
McGowan, Conor P.; Allan, Nathan; Servoss, Jeff; Hedwall, Shaula J.; Wooldridge, Brian
2017-01-01
Assessment of a species' status is a key part of management decision making for endangered and threatened species under the U.S. Endangered Species Act. Predicting the future state of the species is an essential part of species status assessment, and projection models can play an important role in developing predictions. We built a stochastic simulation model that incorporated parametric and environmental uncertainty to predict the probable future status of the Sonoran desert tortoise in the southwestern United States and North Central Mexico. The Sonoran desert tortoise was a candidate species for listing under the Endangered Species Act, and decision makers wanted to use model predictions in their decision making process. The model accounted for future habitat loss and possible effects of climate change-induced droughts to predict future population growth rates, abundances, and quasi-extinction probabilities. Our model predicts that the population will likely decline over the next few decades, but there is a very low probability of quasi-extinction within 75 years. Increases in drought frequency and intensity may increase extinction risk for the species. Our model helped decision makers predict and characterize uncertainty about the future status of the species in their listing decision. We incorporated complex ecological processes (e.g., climate change effects on tortoises) in transparent and explicit ways tailored to support decision making processes related to endangered species.
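A stochastic projection of the kind described can be sketched very compactly: draw a mean growth rate to represent parametric uncertainty, add year-to-year environmental noise, and count the replicates that fall below a quasi-extinction threshold. Every parameter value below is illustrative and is not taken from the tortoise model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch of a stochastic population projection with parametric uncertainty in
# the mean growth rate plus environmental (year-to-year) noise; quasi-extinction
# is scored when abundance drops below a threshold. All values are illustrative.

def quasi_extinction_probability(n0=2000, years=75, reps=10_000,
                                 mean_r=-0.01, r_se=0.01, env_sd=0.10,
                                 threshold=50):
    extinct = 0
    for _ in range(reps):
        r_mean = rng.normal(mean_r, r_se)            # parametric uncertainty
        n = float(n0)
        for _ in range(years):
            n *= np.exp(rng.normal(r_mean, env_sd))  # environmental stochasticity
            if n < threshold:
                extinct += 1
                break
    return extinct / reps

print("P(quasi-extinction within 75 yr):", quasi_extinction_probability())
```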
Training in Small Business Retailing: Testing Human Capital Theory.
ERIC Educational Resources Information Center
Barcala, Marta Fernandez; Perez, Maria Jose Sanzo; Gutierrez, Juan Antonio Trespalacios
1999-01-01
Looks at four models of training demand: (1) probability of attending training in the near future; (2) probability of having attended training in the past; (3) probability of being willing to follow multimedia and correspondence courses; and (4) probability of repeating the experience of attending another training course in the near future.…
Projecting Future Sea Level Rise for Water Resources Planning in California
NASA Astrophysics Data System (ADS)
Anderson, J.; Kao, K.; Chung, F.
2008-12-01
Sea level rise is one of the major concerns for the management of California's water resources. Higher water levels and salinity intrusion into the Sacramento-San Joaquin Delta could affect water supplies, water quality, levee stability, and aquatic and terrestrial flora and fauna species and their habitat. Over the 20th century, sea levels near San Francisco Bay increased by over 0.6 ft. Some tidal gauge and satellite data indicate that rates of sea level rise are accelerating. Sea levels are expected to continue to rise due to increasing air temperatures causing thermal expansion of the ocean and melting of land-based ice such as ice on Greenland and in southeastern Alaska. For water planners, two related questions are raised on the uncertainty of future sea levels. First, what is the expected sea level at a specific point in time in the future, e.g., what is the expected sea level in 2050? Second, what is the expected point in time in the future when sea levels will exceed a certain height, e.g., what is the expected range of time when the sea level rises by one foot? To address these two types of questions, two factors are considered: (1) long term sea level rise trend, and (2) local extreme sea level fluctuations. A two-step approach will be used to develop sea level rise projection guidelines for decision making that takes both of these factors into account. The first step is developing global sea level rise probability distributions for the long term trends. The second step will extend the approach to take into account the effects of local astronomical tides, changes in atmospheric pressure, wind stress, floods, and the El Niño/Southern Oscillation. In this paper, the development of the first step approach is presented. To project the long term sea level rise trend, one option is to extend the current rate of sea level rise into the future. However, since recent data indicate rates of sea level rise are accelerating, methods for estimating sea level rise that account for this acceleration are needed. One such method is an empirical relationship between air temperatures and global sea levels. The air temperature-sea level rise relationship was applied to the 12 climate change projections selected by the California Climate Action Team to estimate future sea levels. The 95% confidence level developed from the historical data was extrapolated to estimate the uncertainties in the future projections. To create sea level rise trend probability distributions, a lognormal probability distribution and a generalized extreme value probability distribution are used. Parameter estimations for these distributions are subjective and inevitably involve uncertainties, which will be improved as more research is conducted in this area.
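The first-step trend distributions mentioned above can be illustrated with a short sketch that answers the planners' first question as an exceedance probability under either a lognormal or a GEV fit. The distribution parameters below are placeholders, not the study's fitted values.

```python
from scipy import stats

# Sketch of the two candidate trend distributions for sea level rise by a
# fixed horizon year, used to read off an exceedance probability.
# Parameters are illustrative, not fitted values from the study.

rise_lognorm = stats.lognorm(s=0.5, scale=0.8)          # feet of rise by 2050
rise_gev = stats.genextreme(c=-0.1, loc=0.7, scale=0.3)  # alternative GEV fit

threshold_ft = 1.0
print("P(rise > 1 ft by 2050), lognormal:", 1 - rise_lognorm.cdf(threshold_ft))
print("P(rise > 1 ft by 2050), GEV:      ", 1 - rise_gev.cdf(threshold_ft))
```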
A 30-year history of earthquake crisis communication in California and lessons for the future
NASA Astrophysics Data System (ADS)
Jones, L.
2015-12-01
The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985 about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the next 30 years, publication of aftershock advisories has become routine and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience from the first public advisory (to the 1988 Lake Elsman earthquake) that was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so the statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response, and compare this with social science research about successful crisis communication, to create recommendations for future advisories.
Redundant actuator development study. [flight control systems for supersonic transport aircraft
NASA Technical Reports Server (NTRS)
Ryder, D. R.
1973-01-01
Current and past supersonic transport configurations are reviewed to assess redundancy requirements for future airplane control systems. Secondary actuators used in stability augmentation systems will probably be the most critical actuator application and require the highest level of redundancy. Two methods of actuator redundancy mechanization have been recommended for further study. Math models of the recommended systems have been developed for use in future computer simulations. A long range plan has been formulated for actuator hardware development and testing in conjunction with the NASA Flight Simulator for Advanced Aircraft.
The Everett-Wheeler interpretation and the open future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudbery, Anthony
2011-03-28
I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.
Speirs, Calandra; Huang, Vivian; Konnert, Candace
2017-09-01
Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.
Projected status of the Pacific walrus (Odobenus rosmarus divergens) in the twenty-first century
Jay, Chadwick V.; Marcot, Bruce G.; Douglas, David C.
2011-01-01
Extensive and rapid losses of sea ice in the Arctic have raised conservation concerns for the Pacific walrus (Odobenus rosmarus divergens), a large pinniped inhabiting arctic and subarctic continental shelf waters of the Chukchi and Bering seas. We developed a Bayesian network model to integrate potential effects of changing environmental conditions and anthropogenic stressors on the future status of the Pacific walrus population at four periods through the twenty-first century. The model framework allowed for inclusion of various sources and levels of knowledge, and representation of structural and parameter uncertainties. Walrus outcome probabilities through the century reflected a clear trend of worsening conditions for the subspecies. From the current observation period to the end of century, the greatest change in walrus outcome probabilities was a progressive decrease in the outcome state of robust and a concomitant increase in the outcome state of vulnerable. The probabilities of rare and extirpated states each progressively increased but remained <10% through the end of the century. The summed probabilities of vulnerable, rare, and extirpated (P(v,r,e)) increased from a current level of 10% in 2004 to 22% by 2050 and 40% by 2095. The degree of uncertainty in walrus outcomes increased monotonically over future periods. In the model, sea ice habitat (particularly for summer/fall) and harvest levels had the greatest influence on future population outcomes. Other potential stressors had much smaller influences on walrus outcomes, mostly because of uncertainty in their future states and our current poor understanding of their mechanistic influence on walrus abundance.
Overview of Fiber-Optical Sensors
NASA Technical Reports Server (NTRS)
Depaula, Ramon P.; Moore, Emery L.
1987-01-01
Design, development, and sensitivity of sensors using fiber optics reviewed. State-of-the-art and probable future developments of sensors using fiber optics described in report including references to work in field. Serves to update previously published surveys. Systems incorporating fiber-optic sensors used in medical diagnosis, navigation, robotics, sonar, power industry, and industrial controls.
Stimulus probability effects in absolute identification.
Kent, Christopher; Lamberts, Koen
2016-05-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
The potential impact of MMICs on future satellite communications
NASA Technical Reports Server (NTRS)
Dunn, Vernon E.
1988-01-01
This is the Final Report representing the results of a 17-month study on the future trends and requirements of Monolithic Microwave Integrated Circuits (MMIC) for space communication applications. Specifically this report identifies potential space communication applications of MMICs, assesses the impact of MMIC on the classes of systems that were identified, determines the present status and probable 10-year growth in capability of required MMIC and competing technologies, identifies the applications most likely to benefit from further MMIC development and presents recommendations for NASA development activities to address the needs of these applications.
The potential impact of MMICs on future satellite communications: Executive summary
NASA Technical Reports Server (NTRS)
Dunn, Vernon E.
1988-01-01
This Executive Summary presents the results of a 17-month study on the future trends and requirements for Monolithic Microwave Integrated Circuits (MMIC) for space communication applications. Specifically, this report identifies potential space communication applications of MMICs, assesses the impact of MMIC on the classes of systems that were identified, determines the present status and probable 10-year growth in capability of required MMIC and competing technologies, identifies the applications most likely to benefit from further MMIC development, and presents recommendations for NASA development activities to address the needs of these applications.
Making Computers Smarter: A Look At the Controversial Field of Artificial Intelligence.
ERIC Educational Resources Information Center
Green, John O.
1984-01-01
Defines artificial intelligence (AI) and discusses its history; the current state of the art, research, experimentation, and practical applications; and probable future developments. Key dates in the history of AI and eight references are provided. (MBR)
NASA Astrophysics Data System (ADS)
Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.
2017-06-01
In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
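Steps (i) and (ii) can be illustrated with a small sketch: build a Gaussian kernel density map from each vent-location data set and combine the maps as a weighted linear sum. The coordinates, sample sizes, and weights below are synthetic stand-ins, not the Somma-Vesuvio data or elicited weights.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Sketch of a vent-opening probability map: Gaussian kernel density maps from
# separate vent-location data sets, combined as a weighted linear sum and
# normalized to a probability map. All data and weights are synthetic.

vents_old = rng.normal([0.0, 0.0], 1.0, size=(30, 2)).T      # older vents (km)
vents_recent = rng.normal([0.5, 0.2], 0.5, size=(15, 2)).T   # more recent vents (km)

kde_old = gaussian_kde(vents_old)
kde_recent = gaussian_kde(vents_recent)

# Grid over the caldera area (arbitrary km coordinates)
x, y = np.mgrid[-3:3:100j, -3:3:100j]
grid = np.vstack([x.ravel(), y.ravel()])

weights = {"old": 0.4, "recent": 0.6}   # e.g., from expert judgment
density = weights["old"] * kde_old(grid) + weights["recent"] * kde_recent(grid)
density = (density / density.sum()).reshape(x.shape)   # normalize to probabilities

print("Peak cell probability:", density.max())
```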
Financial issues for commercial space ventures: Paying for the dreams
NASA Technical Reports Server (NTRS)
Egan, J. J.
1984-01-01
Various financial issues involved in commercial space enterprise are discussed. Particular emphasis is placed on the materials processing area: the current state of business plan and financial developments, what is needed for enhanced probability of success of future materials development efforts in attracting financial backing, and finally, the risks involved in this entire business area.
NASA Technical Reports Server (NTRS)
Arbuckle, P. Douglas; Abbott, Kathy H.; Abbott, Terence S.; Schutte, Paul C.
1998-01-01
The evolution of commercial transport flight deck configurations over the past 20-30 years and expected future developments are described. Key factors in the aviation environment are identified that the authors expect will significantly affect flight deck designers. One of these is the requirement for commercial aviation accident rate reduction, which is probably required if global commercial aviation is to grow as projected. Other factors include the growing incrementalism in flight deck implementation, definition of future airspace operations, and expectations of a future pilot corps that will have grown up with computers. Future flight deck developments are extrapolated from observable factors in the aviation environment, recent research results in the area of pilot-centered flight deck systems, and by considering expected advances in technology that are being driven by other than aviation requirements. The authors hypothesize that revolutionary flight deck configuration changes will be possible with development of human-centered flight deck design methodologies that take full advantage of commercial and/or entertainment-driven technologies.
Bush, Peter W.; Johnston, Richard H.
1988-01-01
A considerable area of the Floridan aquifer system remains where large ground-water supplies may be developed. This area is largely inland from the coasts and characterized by high transmissivity and minimal development prior to the early 1980's. The major constraint on future development probably is degradation of water quality rather than water-quantity limitations.
Improving default risk prediction using Bayesian model uncertainty techniques.
Kazemi, Reza; Mosleh, Ali
2012-11-01
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
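One simple way to picture the idea of weighting rating agencies by their historical accuracy is a Bayesian model-averaging sketch: each agency's predicted default rate is weighted by the binomial likelihood of the defaults it actually observed in the past. The rates and counts below are invented, and the sketch omits the transition-matrix and uncertainty machinery of the full methodology.

```python
from math import comb

# Sketch of Bayesian model averaging over rating-agency estimates: weight each
# agency's predicted default rate by how well its past predictions explain the
# defaults actually observed (binomial likelihood). All numbers are invented.

agencies = {
    "A": {"predicted_rate": 0.020, "hist_n": 500, "hist_defaults": 12},
    "B": {"predicted_rate": 0.035, "hist_n": 500, "hist_defaults": 11},
}

def binom_lik(p, n, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

weights = {name: binom_lik(a["predicted_rate"], a["hist_n"], a["hist_defaults"])
           for name, a in agencies.items()}
total = sum(weights.values())
weights = {name: w / total for name, w in weights.items()}

blended_rate = sum(weights[n] * agencies[n]["predicted_rate"] for n in agencies)
print("Posterior model weights:", weights)
print("Blended default probability:", round(blended_rate, 4))
```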
Establishing endangered species recovery criteria using predictive simulation modeling
McGowan, Conor P.; Catlin, Daniel H.; Shaffer, Terry L.; Gratto-Trevor, Cheri L.; Aron, Carol
2014-01-01
Listing a species under the Endangered Species Act (ESA) and developing a recovery plan requires U.S. Fish and Wildlife Service to establish specific and measurable criteria for delisting. Generally, species are listed because they face (or are perceived to face) elevated risk of extinction due to issues such as habitat loss, invasive species, or other factors. Recovery plans identify recovery criteria that reduce extinction risk to an acceptable level. It logically follows that the recovery criteria, the defined conditions for removing a species from ESA protections, need to be closely related to extinction risk. Extinction probability is a population parameter estimated with a model that uses current demographic information to project the population into the future over a number of replicates, calculating the proportion of replicated populations that go extinct. We simulated extinction probabilities of piping plovers in the Great Plains and estimated the relationship between extinction probability and various demographic parameters. We tested the fit of regression models linking initial abundance, productivity, or population growth rate to extinction risk, and then, using the regression parameter estimates, determined the conditions required to reduce extinction probability to some pre-defined acceptable threshold. Binomial regression models with mean population growth rate and the natural log of initial abundance were the best predictors of extinction probability 50 years into the future. For example, based on our regression models, an initial abundance of approximately 2400 females with an expected mean population growth rate of 1.0 will limit extinction risk for piping plovers in the Great Plains to less than 0.048. Our method provides a straightforward way of developing specific and measurable recovery criteria linked directly to the core issue of extinction risk. Published by Elsevier Ltd.
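The inversion step, from a fitted binomial regression to a measurable recovery criterion, can be sketched directly: with logit-scale coefficients for mean growth rate and log initial abundance, solve for the abundance that holds extinction probability at a target. The coefficients below are hypothetical, chosen only so the output lands near the abundance quoted in the abstract; they are not the published estimates.

```python
import numpy as np

# Sketch of inverting a binomial (logistic) regression of 50-year extinction
# probability on mean growth rate and log initial abundance, to find the
# abundance meeting a target risk. Coefficients are hypothetical.

b0, b_growth, b_logN = 7.2, -4.0, -0.8   # logit-scale coefficients (illustrative)

def extinction_prob(growth_rate, n0):
    logit = b0 + b_growth * growth_rate + b_logN * np.log(n0)
    return 1.0 / (1.0 + np.exp(-logit))

def abundance_for_risk(target_risk, growth_rate):
    # invert the logistic: logit(target) = b0 + b_growth*lambda + b_logN*ln(N)
    logit_target = np.log(target_risk / (1.0 - target_risk))
    return np.exp((logit_target - b0 - b_growth * growth_rate) / b_logN)

required_n = abundance_for_risk(target_risk=0.048, growth_rate=1.0)
print("Required initial abundance:", round(float(required_n)))
```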
The Impact of Recreational Facilities on National Park Landscapes.
ERIC Educational Resources Information Center
Fitzsimmons, Allan K.
1979-01-01
Discusses a study to examine developed acreage in 14 national parks. Total park acreage is compared to service center and campground acreage and total mileage of primary and secondary roads. The most probable future for national park landscapes is maintenance of the status quo. (Author/KC)
The Five-Year Outlook on Science and Technology: 1982.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC. Committee on Science and Public Policy.
Presented are reports on trends and probable future developments in eight selected areas of basic science and engineering. These reports are: "The Genetic Program of Complex Organisms" (Maxine F. Singer); "The Molecular and Genetic Technology of Plants" (Joseph E. Varner); "Cell Receptors for Hormones and…
Present and Probable CATV/Broadband-Communication Technology.
ERIC Educational Resources Information Center
Ward, John E.
The study reports on technical and cost factors affecting future growth of Cable TV (CATV) systems and the development of the "wired nation." Comparisons are made between alternatives for distributing CATV signals and alternative prototypes for subscriber home terminals. Multi-cable, augmented-channel (with converter), and switched CATV…
Bayesian probability of success for clinical trials using historical data
Ibrahim, Joseph G.; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F.; Heyse, Joseph F.
2015-01-01
Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein’s work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials. PMID:25339499
Bayesian probability of success for clinical trials using historical data.
Ibrahim, Joseph G; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F; Heyse, Joseph F
2015-01-30
Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein's work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials. Copyright © 2014 John Wiley & Sons, Ltd.
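The core probability-of-success calculation that this methodology generalizes is an average of the planned trial's power over a distribution of the treatment effect. The sketch below does this by Monte Carlo for a two-arm trial with a normal approximation; the posterior parameters, sample size, and alpha level are illustrative, and the covariate and historical-data machinery of the paper is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Sketch of a basic probability-of-success (average power) calculation:
# average the planned trial's power over a posterior distribution of the
# standardized treatment effect. All parameters are illustrative.

post_mean, post_sd = 0.30, 0.12   # posterior of the effect from current data
n_per_arm = 200
alpha = 0.025                     # one-sided significance level

def power(delta, n, alpha):
    se = np.sqrt(2.0 / n)                         # SE of the effect estimate
    z_crit = stats.norm.ppf(1 - alpha)
    return 1 - stats.norm.cdf(z_crit - delta / se)

deltas = rng.normal(post_mean, post_sd, size=100_000)   # posterior draws
pos = power(deltas, n_per_arm, alpha).mean()
print(f"Probability of success: {pos:.3f}")
```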
Lee, Sunghee; Liu, Mingnan; Hu, Mengyao
2017-06-01
Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying "I don't know" item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research.
Lee, Sunghee; Liu, Mingnan; Hu, Mengyao
2017-01-01
Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals’ time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying “I don’t know” item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research. PMID:28781381
Tang, Zhongwen
2015-01-01
An analytical way to compute the predictive probability of success (PPOS) together with a credible interval at interim analysis (IA) is developed for large clinical trials with time-to-event endpoints. The method takes account of the fixed data up to the IA, the amount of uncertainty in future data, and uncertainty about parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding the optimal combination of analysis time and futility cutoff based on PPOS criteria.
Characterizing user requirements for future land observing satellites
NASA Technical Reports Server (NTRS)
Barker, J. L.; Cressy, P. J.; Schnetzler, C. C.; Salomonson, V. V.
1981-01-01
An objective procedure was developed for identifying probable sensor and mission characteristics for an operational satellite land observing system. Requirements were systematically compiled, quantified, and scored by type of use from surveys of federal, state, local, and private communities. Incremental percent increases in expected value of data were estimated for critical system improvements. Comparisons with costs permitted selection of a probable sensor system, from a set of 11 options, with the following characteristics: 30 meter spatial resolution in 5 bands and 15 meters in 1 band, spectral bands nominally at Thematic Mapper (TM) bands 1 through 6 positions, and 2-day data turnaround for receipt of imagery. Improvements are suggested for both the form of questions and the procedures for analysis of future surveys in order to provide a more quantitatively precise definition of sensor and mission requirements.
Steve Ostro and the Near-Earth Asteroid Impact Hazard
NASA Astrophysics Data System (ADS)
Chapman, Clark R.
2009-09-01
The late Steve Ostro, whose scientific interests in Near-Earth Asteroids (NEAs) primarily related to his planetary radar research in the 1980s, soon became an expert on the impact hazard. He quickly realized that radar provided perspectives on close-approaching NEAs that were both very precise as well as complementary to traditional astrometry, enabling good predictions of future orbits and collision probabilities extending for centuries into the future. He also was among the few astronomers who considered the profound issues raised by this newly recognized hazard and by early suggestions of how to mitigate the hazard. With Carl Sagan, Ostro articulated the "deflection dilemma" and other potential low-probability but real dangers of mitigation technologies that might be more serious than the low-probability impact hazard itself. Yet Ostro maintained a deep interest in developing responsible mitigation technologies, in educating the public about the nature of the impact hazard, and in learning more about the population of threatening bodies, especially using the revealing techniques of delay-doppler radar mapping of NEAs and their satellites.
Charts designate probable future oceanographic research fields
NASA Technical Reports Server (NTRS)
1968-01-01
Charts outline the questions and problems of oceanographic research in the future. NASA uses the charts to estimate the probable requirements for instrumentation carried by satellites engaged in cooperative programs with other agencies concerned with identification, analysis, and solution of many of these problems.
Pathways to the Future: Linking Environmental Scanning to Strategic Management.
ERIC Educational Resources Information Center
Mecca, Thomas V.; Morrison, James L.
ED QUEST (Quick Environmental Scanning Techniques) is a strategic planning process designed to identify emerging trends, issues, and events which portend threats or opportunities for colleges and universities, analyze their probable impact on the institution, and facilitate the development of appropriate institutional strategies. A workshop was…
"If It Is Dreamable It Is Doable": The Role of Desired Job Flexibility in Imagining the Future
ERIC Educational Resources Information Center
Guglielmi, Dina; Chiesa, Rita; Mazzetti, Greta
2016-01-01
Purpose: The purpose of this paper is to compare how the dimension of attitudes toward the future that consists in the perception of a dynamic future may be affected by desirable goals (desired job flexibility) and probable events (probable job flexibility) in a group of permanent vs temporary employees. Moreover, the aim is to explore the gender differences…
Bean, Nigel G.; Ruberu, Ravi P.
2017-01-01
Background: The external validity, or generalizability, of trials and guidelines has been considered poor in the context of multiple morbidity. How multiple morbidity might affect the magnitude of benefit of a given treatment, and thereby external validity, has had little study. Objective: To provide a method of decision analysis to quantify the effects of age and comorbidity on the probability of deriving a given magnitude of treatment benefit. Design: We developed a method to calculate probabilistically the effect of all of a patient's comorbidities on their underlying utility, or well-being, at a future time point. From this, we derived a distribution of possible magnitudes of treatment benefit at that future time point. We then expressed this distribution as the probability of deriving at least a given magnitude of treatment benefit. To demonstrate the applicability of this method of decision analysis, we applied it to the treatment of hypercholesterolaemia in a geriatric population of 50 individuals. We highlighted the results of four of these individuals. Results: This method of analysis provided individualized quantifications of the effect of age and comorbidity on the probability of treatment benefit. The average probability of deriving a benefit, of at least 50% of the magnitude of benefit available to an individual without comorbidity, was only 0.8%. Conclusion: The effects of age and comorbidity on the probability of deriving significant treatment benefits can be quantified for any individual. Even without consideration of other factors affecting external validity, these effects may be sufficient to guide decision-making. PMID:29090189
Real-Time Safety Monitoring and Prediction for the National Airspace System
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil
2016-01-01
As new operational paradigms and additional aircraft are being introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have both an overview of the current safety of the airspace at different levels of granularity and an understanding of how the state of safety will evolve into the future given the anticipated flight plans, weather forecasts, predicted health of assets in the airspace, and so on. To this end, we have developed a Real-Time Safety Monitoring (RTSM) framework that first estimates the state of the NAS using dynamic models. Then, given the state estimate and a probability distribution of future inputs to the NAS, the framework predicts the evolution of the NAS, i.e., the future state, and analyzes these future states to predict the occurrence of unsafe events. The entire probability distribution of airspace safety metrics is computed, not just point estimates, without significant assumptions regarding the distribution type or parameters. We demonstrate our overall approach by predicting the occurrence of some unsafe events and show how these predictions evolve in time as flight operations progress.
An information diffusion technique to assess integrated hazard risks.
Huang, Chongfu; Huang, Yundong
2018-02-01
An integrated risk is a scene in the future associated with some adverse incident caused by multiple hazards. An integrated probability risk is the expected value of disaster. Because assessing an integrated probability risk from a small sample is difficult, weighting methods and copulas are typically employed to sidestep this obstacle. To resolve the problem, in this paper, we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface. Then, an integrated risk can be directly assessed by using a small sample. A case of an integrated risk caused by flood and earthquake is given to show how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
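A minimal sketch of normal information diffusion, the one-dimensional building block of the technique, is shown below: each observation spreads a Gaussian weight over a grid of monitoring points, and the pooled weights give a small-sample probability estimate. The sample values, grid, and diffusion coefficient are illustrative, and the joint-distribution and vulnerability-surface steps are not reproduced.

```python
import numpy as np

# Sketch of normal information diffusion for a small sample: each observation
# spreads a Gaussian weight over a discrete monitoring grid, each observation's
# total weight is normalized to 1, and the pooled weights form an estimated
# probability distribution. Values and the diffusion coefficient h are illustrative.

sample = np.array([1.2, 1.9, 2.4, 3.1, 4.0])      # e.g., observed annual losses
grid = np.linspace(0.0, 6.0, 13)                  # monitoring points
h = 0.8                                           # diffusion coefficient

diffusion = np.exp(-(sample[:, None] - grid[None, :])**2 / (2 * h**2))
diffusion /= diffusion.sum(axis=1, keepdims=True)  # each observation sums to 1

probabilities = diffusion.sum(axis=0) / len(sample)
print(dict(zip(np.round(grid, 1), np.round(probabilities, 3))))
```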
Miladinovic, Branko; Kumar, Ambuj; Mhaskar, Rahul; Djulbegovic, Benjamin
2014-10-21
To understand how often 'breakthroughs,' that is, treatments that significantly improve health outcomes, can be developed. We applied weighted adaptive kernel density estimation to construct the probability density function for observed treatment effects from five publicly funded cohorts and one privately funded group. 820 trials involving 1064 comparisons and enrolling 331,004 patients were conducted by five publicly funded cooperative groups. 40 cancer trials involving 50 comparisons and enrolling a total of 19,889 patients were conducted by GlaxoSmithKline. We calculated that the probability of detecting a treatment with large effects is 10% (5-25%), and that the probability of detecting a treatment with very large effects is 2% (0.3-10%). Researchers themselves judged that they discovered a new, breakthrough intervention in 16% of trials. We propose these figures as the benchmarks against which future development of 'breakthrough' treatments should be measured. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
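The density-estimation step can be pictured with a short sketch: fit a kernel density to a set of observed treatment effects (log hazard ratios here) and read off the tail mass beyond "large" and "very large" thresholds. The simulated effects and cutoffs are invented, and a plain Gaussian KDE stands in for the weighted adaptive estimator used in the study.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# Sketch of estimating the density of observed treatment effects and reading
# off tail probabilities for "large" effects. Simulated data; a plain Gaussian
# KDE stands in for the weighted adaptive estimator.

log_hr = rng.normal(loc=-0.05, scale=0.20, size=300)    # stand-in observed effects
kde = gaussian_kde(log_hr)

large, very_large = np.log(0.7), np.log(0.5)            # HR < 0.7, HR < 0.5
p_large = kde.integrate_box_1d(-np.inf, large)
p_very_large = kde.integrate_box_1d(-np.inf, very_large)
print(f"P(large effect): {p_large:.3f}   P(very large effect): {p_very_large:.3f}")
```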
Estimation of State Transition Probabilities: A Neural Network Model
NASA Astrophysics Data System (ADS)
Saito, Hiroshi; Takiyama, Ken; Okada, Masato
2015-12-01
Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
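An update rule in the spirit of the algorithm described, estimating transition probabilities from a state prediction error, can be sketched as a delta rule: nudge the current row of the estimated transition matrix toward a one-hot vector for the observed next state. The transition matrix and learning rate below are illustrative, and this is a generic prediction-error learner rather than the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of a delta-rule estimator of state transition probabilities: the row
# for the current state is moved toward a one-hot vector for the observed next
# state by a prediction error scaled by a learning rate. Values are illustrative.

n_states = 3
true_T = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.3, 0.3, 0.4]])
est_T = np.full((n_states, n_states), 1.0 / n_states)
alpha = 0.05   # learning rate

state = 0
for _ in range(20_000):
    next_state = rng.choice(n_states, p=true_T[state])
    observed = np.eye(n_states)[next_state]
    est_T[state] += alpha * (observed - est_T[state])   # prediction-error update
    state = next_state

print(np.round(est_T, 2))
```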
Probability of success for phase III after exploratory biomarker analysis in phase II.
Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver
2017-05-01
The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows to investigate selection probabilities and allows to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend to perform similar simulations in practice to get the necessary information about the risk and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future
NASA Technical Reports Server (NTRS)
Cates, Grant R.
2014-01-01
The Space Shuttle was launched 135 times and nearly half of those launches required 2 or more launch attempts. The Space Shuttle launch countdown historical data of 250 launch attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle launch countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions of future launch vehicles such as NASA's Space Shuttle derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars since the launch opportunities are relatively short in duration and one must wait for 2 years before a subsequent attempt can begin.
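A minimal sketch of the kind of empirical calculation described, using hypothetical attempt counts rather than the actual Shuttle record:

```python
# Hypothetical sample: number of countdown attempts needed for each launch
attempts_per_launch = [1, 1, 2, 1, 3, 1, 2, 1, 1, 4]

# Empirical probability of launching on any given attempt
total_attempts = sum(attempts_per_launch)
p_launch_per_attempt = len(attempts_per_launch) / total_attempts
print(f"empirical P(launch | attempt) = {p_launch_per_attempt:.2f}")

# Cumulative probability of having launched within k attempts of the first countdown
max_k = max(attempts_per_launch)
for k in range(1, max_k + 1):
    cumulative = sum(a <= k for a in attempts_per_launch) / len(attempts_per_launch)
    print(f"P(launched within {k} attempts) = {cumulative:.2f}")
```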
Probability Forecasting Using Monte Carlo Simulation
NASA Astrophysics Data System (ADS)
Duncan, M.; Frisbee, J.; Wysack, J.
2014-09-01
Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus, new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potential high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as the time to the event is reduced. Collision probability forecasting is a predictive process in which the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach in typical event scenarios. We explore the utility of a probability-based track scenario simulation that models expected tracking data frequency as the tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
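A stripped-down Monte Carlo collision probability calculation, in the spirit of the simulation described above but reduced to a two-dimensional encounter-plane example with assumed covariance and hard-body radius:

```python
import numpy as np

# Illustrative sketch: sample relative miss positions from the combined position
# covariance and count samples falling inside the combined hard-body radius.
# Values are invented, not an operational configuration.

rng = np.random.default_rng(42)

mean_miss = np.array([120.0, 80.0])          # predicted miss vector in the encounter plane [m]
cov = np.array([[200.0**2, 0.0],
                [0.0, 150.0**2]])            # combined position covariance [m^2]
hard_body_radius = 20.0                      # combined object radius [m]

samples = rng.multivariate_normal(mean_miss, cov, size=1_000_000)
miss_distance = np.linalg.norm(samples, axis=1)
p_collision = np.mean(miss_distance < hard_body_radius)
print(f"estimated collision probability ~ {p_collision:.2e}")
```

Repeating this calculation with covariances representative of successively shorter times to the event would trace out the kind of probability forecast the abstract describes.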
Potential Future Igneous Activity at Yucca Mountain, Nevada
NASA Astrophysics Data System (ADS)
Cline, M.; Perry, F. V.; Valentine, G. A.; Smistad, E.
2005-12-01
Location, timing, and volumes of post-Miocene volcanic activity, along with expert judgement, provide the basis for assessing the probability of future volcanism intersecting a proposed repository for nuclear waste at Yucca Mountain, Nevada. Analog studies of eruptive centers in the region that may represent the style and extent of possible future igneous activity at Yucca Mountain have aided in defining the consequence scenarios for intrusion into and eruption through a proposed repository. Modeling of magmatic processes related to magma/proposed repository interactions has been used to assess the potential consequences of a future igneous event through a proposed repository at Yucca Mountain. Results of work to date indicate future igneous activity in the Yucca Mountain region has a very low probability of intersecting the proposed repository. The probability of a future event intersecting a proposed repository at Yucca Mountain is approximately 1.7 × 10⁻⁸ per year. Since completion of the Probabilistic Volcanic Hazard Assessment (PVHA) in 1996, anomalies representing potential buried volcanic centers have been identified from aeromagnetic surveys. A re-assessment of the hazard is currently underway to evaluate the probability of intersection in light of new information and to estimate the probability of one or more volcanic conduits located along a dike that intersects the proposed repository. U.S. Nuclear Regulatory Commission regulations for siting and licensing a proposed repository require that the consequences of a disruptive event (igneous event) with annual probability greater than 1 × 10⁻⁸ be evaluated. Two consequence scenarios are considered: (1) an igneous intrusion-groundwater transport case and (2) a volcanic eruptive case. These scenarios equate to a dike or dike swarm intersecting repository drifts containing waste packages, formation of a conduit leading to a volcanic eruption through the repository that carries the contents of the waste packages into the atmosphere, deposition of a tephra sheet, and redistribution of the contaminated ash. In both cases radioactive material is released to the accessible environment either through groundwater transport or through atmospheric dispersal and deposition. Six Quaternary volcanic centers exist within 20 km of Yucca Mountain. Lathrop Wells cone (LWC), the youngest (approximately 75,000 yrs), is a well-preserved cinder cone with associated flows and tephra sheet that provides an excellent analogue for consequence studies related to future volcanism. Cone, lavas, hydrovolcanic ash, and ash-fall tephra have been examined to estimate eruptive volume and eruption type. LWC ejecta volumes suggest basaltic volcanism may be waning in the Yucca Mountain region. The eruptive products indicate a sequence of initial fissure fountaining, early Strombolian ash and lapilli deposition forming the scoria cone, a brief hydrovolcanic pulse (possibly limited to the NW sector), and a violent Strombolian phase. Mathematical models have been developed to represent magmatic processes and their consequences on proposed repository performance. These models address dike propagation, magma interaction and flow into drifts, eruption through the proposed repository, and post intrusion/eruption effects. These models continue to be refined to reduce the uncertainty associated with the consequences of a possible future igneous event.
Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick
2012-01-01
Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4 × 10⁻⁴ for the exponential distribution and 2.3 × 10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5 × 10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6 × 10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving more confidence in the estimate, and we use those data to calculate the annual probability of a large eruption at 1.4 × 10⁻⁵.
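For the exponential model, the one-year eruption probability follows directly from the fitted rate; the sketch below assumes an illustrative rate rather than the paper's fitted parameters, and notes why mixed-exponential models can behave differently.

```python
import numpy as np

# Hedged sketch: annual eruption probability from an exponential model of
# repose intervals. The rate below is an assumed illustration, not the paper's fit.

annual_rate = 1.4e-4                          # mean eruptions per year (illustrative)

# Probability of at least one eruption in the next year under the exponential model
p_next_year = 1.0 - np.exp(-annual_rate * 1.0)
print(f"P(eruption within 1 yr) ~ {p_next_year:.2e}")

# The exponential hazard is memoryless, so the time elapsed since the last
# eruption does not change this one-year probability; a mixture of exponentials
# (as used in the study) is not memoryless, which is why its conditional
# probability can fall long after the most recent event.
```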
Mediators of the Availability Heuristic in Probability Estimates of Future Events.
ERIC Educational Resources Information Center
Levi, Ariel S.; Pryor, John B.
Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…
NASA Astrophysics Data System (ADS)
Kaneko, Yoshihiro; Wallace, Laura M.; Hamling, Ian J.; Gerstenberger, Matthew C.
2018-05-01
Slow slip events (SSEs) have been documented in subduction zones worldwide, yet their implications for future earthquake occurrence are not well understood. Here we develop a relatively simple, simulation-based method for estimating the probability of megathrust earthquakes following tectonic events that induce any transient stress perturbations. This method has been applied to the locked Hikurangi megathrust (New Zealand), surrounded on all sides by the 2016 Kaikoura earthquake and SSEs. Our models indicate that the probability of a M ≥ 7.8 earthquake over the year following the Kaikoura earthquake increases by a factor of 1.3-18 relative to the pre-Kaikoura probability, and the absolute probability is in the range of 0.6-7%. We find that probabilities of a large earthquake are mainly controlled by the ratio of the total stressing rate induced by all nearby tectonic sources to the mean stress drop of earthquakes. Our method can be applied to evaluate the potential for triggering a megathrust earthquake following SSEs in other subduction zones.
An operational system of fire danger rating over Mediterranean Europe
NASA Astrophysics Data System (ADS)
Pinto, Miguel M.; DaCamara, Carlos C.; Trigo, Isabel F.; Trigo, Ricardo M.
2017-04-01
A methodology is presented to assess fire danger based on the probability of exceedance of prescribed thresholds of daily released energy. The procedure is developed and tested over Mediterranean Europe, defined by the latitude circles of 35 and 45°N and the meridians of 10°W and 27.5°E, for the period 2010-2016. The procedure involves estimating so-called static and daily probabilities of exceedance. For a given point, the static probability is estimated by the ratio of the number of daily fire occurrences releasing energy above a given threshold to the total number of occurrences inside a cell centred at the point. The daily probability of exceedance, which takes meteorological factors into account by means of the Canadian Fire Weather Index (FWI), is in turn estimated from a Generalized Pareto distribution with the static probability and FWI as covariates of the scale parameter. The rationale of the procedure is that small fires, assessed by the static probability, have a weak dependence on weather, whereas larger fires strongly depend on concurrent meteorological conditions. It is shown that observed frequencies of exceedance over the study area for the period 2010-2016 match the values estimated from the developed models for static and daily probabilities of exceedance. Some (small) variability is however found between different years, suggesting that refinements can be made in future work by using a larger sample to further increase the robustness of the method. The developed methodology has the advantage of evaluating fire danger with the same criteria over the whole study area, making it a good basis for harmonizing fire danger forecasts and forest management studies. Research was performed within the framework of the EUMETSAT Satellite Application Facility for Land Surface Analysis (LSA SAF). Some of the methods developed and results obtained underpin a platform supported by The Navigator Company that currently provides fire danger information for Portugal to a wide range of users.
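A hedged sketch of the daily exceedance probability under a Generalized Pareto tail whose scale parameter depends on FWI and the static probability; the log-linear link and all coefficients below are assumptions for illustration, not the fitted model.

```python
import numpy as np
from scipy import stats

def daily_probability(fwi, static_prob, threshold,
                      beta0=1.0, beta1=0.05, beta2=2.0, shape=0.2):
    """Daily P(released-energy excess > threshold) from a Generalized Pareto tail.

    The scale parameter is modelled as a log-linear function of FWI and the
    static probability (assumed functional form and coefficients).
    """
    scale = np.exp(beta0 + beta1 * fwi + beta2 * static_prob)
    return stats.genpareto.sf(threshold, c=shape, scale=scale)

# Example: high fire-weather day (FWI = 35) at a point with static probability 0.10
print(daily_probability(fwi=35.0, static_prob=0.10, threshold=50.0))
```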
Interaction in Asynchronous Web-Based Learning Environments
ERIC Educational Resources Information Center
Woo, Younghee; Reeves, Thomas C.
2008-01-01
Because of the perceived advantages and the promotion of Web-based learning environments (WBLEs) by commercial interests as well as educational technologists, knowing how to develop and implement WBLEs will probably not be a choice, but a necessity for most educators and trainers in the future. However, many instructors still don't understand the…
Risk of Skin Cancer from Space Radiation. Chapter 11
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Kim, Myung-Hee Y.; George, Kerry A.; Wu, Hong-Lu
2003-01-01
We review the methods for estimating the probability of increased incidence of skin cancers from space radiation exposure and describe some of the individual factors that may contribute to risk projection models, including skin pigment and synergistic effects of combined ionizing and UV exposure. The steep dose gradients from trapped electrons, protons, and heavy-ion radiation during EVA, together with limitations in EVA dosimetry, are important factors for projecting the skin cancer risk of astronauts. We estimate that the probability of increased skin cancer risk varies more than 10-fold among individual astronauts and that the risk of skin cancer could exceed 1% for future lunar base operations for astronauts with light skin color and hair. Limitations of physical dosimetry in estimating the distribution of dose at the skin suggest that new biodosimetry methods should be developed for responding to accidental overexposure of the skin during future space missions.
Future WGCEP Models and the Need for Earthquake Simulators
NASA Astrophysics Data System (ADS)
Field, E. H.
2008-12-01
The 2008 Working Group on California Earthquake Probabilities (WGCEP) recently released the Uniform California Earthquake Rupture Forecast version 2 (UCERF 2), developed jointly by the USGS, CGS, and SCEC with significant support from the California Earthquake Authority. Although this model embodies several significant improvements over previous WGCEPs, the following are some of the significant shortcomings that we hope to resolve in a future UCERF3: 1) assumptions of fault segmentation and the lack of fault-to-fault ruptures; 2) the lack of an internally consistent methodology for computing time-dependent, elastic-rebound-motivated renewal probabilities; 3) the lack of earthquake clustering/triggering effects; and 4) unwarranted model complexity. It is believed by some that physics-based earthquake simulators will be key to resolving these issues, either as exploratory tools to help guide the present statistical approaches, or as a means to forecast earthquakes directly (although significant challenges remain with respect to the latter).
Mourão-Miranda, Janaina; Oliveira, Leticia; Ladouceur, Cecile D; Marquand, Andre; Brammer, Michael; Birmaher, Boris; Axelson, David; Phillips, Mary L
2012-01-01
There are no known biological measures that accurately predict future development of psychiatric disorders in individual at-risk adolescents. We investigated whether machine learning and fMRI could help to: 1. differentiate healthy adolescents genetically at-risk for bipolar disorder and other Axis I psychiatric disorders from healthy adolescents at low risk of developing these disorders; 2. identify those healthy genetically at-risk adolescents who were most likely to develop future Axis I disorders. 16 healthy offspring genetically at risk for bipolar disorder and other Axis I disorders by virtue of having a parent with bipolar disorder and 16 healthy, age- and gender-matched low-risk offspring of healthy parents with no history of psychiatric disorders (12-17 year-olds) performed two emotional face gender-labeling tasks (happy/neutral; fearful/neutral) during fMRI. We used Gaussian Process Classifiers (GPC), a machine learning approach that assigns a predictive probability of group membership to an individual person, to differentiate groups and to identify those at-risk adolescents most likely to develop future Axis I disorders. Using GPC, activity to neutral faces presented during the happy experiment accurately and significantly differentiated groups, achieving 75% accuracy (sensitivity = 75%, specificity = 75%). Furthermore, predictive probabilities were significantly higher for those at-risk adolescents who subsequently developed an Axis I disorder than for those at-risk adolescents remaining healthy at follow-up. We show that a combination of two promising techniques, machine learning and neuroimaging, not only discriminates healthy low-risk from healthy adolescents genetically at-risk for Axis I disorders, but may ultimately help to predict which at-risk adolescents subsequently develop these disorders.
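The classification step can be sketched with scikit-learn's GaussianProcessClassifier on synthetic features; this is only meant to show how per-individual predictive probabilities of group membership are obtained, not to reproduce the fMRI analysis.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Synthetic stand-in for activation features of 16 at-risk and 16 low-risk adolescents
rng = np.random.default_rng(0)
X_at_risk = rng.normal(0.5, 1.0, size=(16, 10))
X_low_risk = rng.normal(-0.5, 1.0, size=(16, 10))
X = np.vstack([X_at_risk, X_low_risk])
y = np.array([1] * 16 + [0] * 16)            # 1 = at-risk, 0 = low-risk

# Gaussian Process Classifier with an RBF kernel; predict_proba gives each
# individual's predictive probability of belonging to the at-risk class
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0)).fit(X, y)
pred_prob = gpc.predict_proba(X)[:, 1]
print(np.round(pred_prob[:5], 2))
```

In the study's design, these per-individual probabilities are what would be compared between at-risk adolescents who later did and did not develop a disorder (with proper cross-validation, which is omitted here for brevity).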
ERIC Educational Resources Information Center
Jayawardena, Lal
This presentation reviews the key dimensions of the environment problem and estimates the probable costs of arresting future environmental damage by expenditures to be undertaken in support of sustainable development during the decade of the 90s. It deals with the problem of pursuing a minimum "socially necessary" growth rate in the world economy…
[The future of inflammatory bowel disease from the perspective of Digestive Disease Week 2012].
Gomollón, Fernando
2012-09-01
The new information presented at Digestive Disease Week has allowed us to speculate on the future of inflammatory bowel disease. Manipulation of diet and the microbiome will probably play an increasingly important role in the treatment of this disease and, in the long term, in its prevention. Biological agents will probably be used earlier and more widely; new information on levels of biological agents, mucosal healing and new comparative studies will also allow these agents to be used in a more precise and personalized way. In addition to infliximab, adalimumab, natalizumab and certolizumab, other biological agents will be employed; among the first of these to be used will be ustekinumab, golimumab and vedolizumab. In the near future, biological agents will be used as frequently in ulcerative colitis as in Crohn's disease. New healthcare models will be developed that will progressively include greater participation by patients and nurses. New diagnostic and prognostic prediction models will allow decisions to be more individualized. Copyright © 2012 Elsevier España, S.L. All rights reserved.
Ruckert, Kelsey L; Oddo, Perry C; Keller, Klaus
2017-01-01
Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.
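The effect of sea-level-rise uncertainty on a return level can be illustrated with a simple Monte Carlo sketch; the surge and SLR distributions below are assumed for demonstration and are not the study's calibrated models.

```python
import numpy as np

# Illustrative sketch: combine a storm-surge distribution with either a single
# mean SLR value or a full SLR distribution, then read off the 1% annual
# exceedance (100-yr) flood height. All distributions and numbers are assumptions.

rng = np.random.default_rng(7)
n = 1_000_000

surge = rng.gumbel(loc=1.0, scale=0.25, size=n)      # storm surge [m] (assumed)
slr_mean = 0.8                                       # mean SLR projection [m] (assumed)
slr_draws = rng.normal(0.8, 0.3, size=n)             # SLR with uncertainty [m] (assumed)

level_no_unc = np.quantile(surge + slr_mean, 0.99)   # 100-yr flood height, mean SLR only
level_unc = np.quantile(surge + slr_draws, 0.99)     # 100-yr flood height, SLR uncertainty
print(f"without SLR uncertainty: {level_no_unc:.2f} m, with: {level_unc:.2f} m")
```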
Bittencourt, Flora; Alves, Jackeline S; Gaiotto, Fernanda A
2015-12-01
We developed microsatellite markers for Carpotroche brasiliensis (Flacourtiaceae), a dioecious tree that is used as a food resource by midsize animals of the Brazilian fauna. We designed 30 primer pairs using next-generation sequencing and classified 25 pairs as polymorphic. Observed heterozygosity ranged from 0.5 to 1.0, and expected heterozygosity ranged from 0.418 to 0.907. The combined probability of exclusion was greater than 0.999 and the combined probability of identity was less than 0.001, indicating that these microsatellites are appropriate for investigations of genetic structure, individual identification, and paternity testing. The developed molecular tools may contribute to future studies of population genetics, answering ecological and evolutionary questions regarding efficient conservation strategies for C. brasiliensis.
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
French, Jeff
2009-11-01
Social marketing is a highly systematic approach to health improvement that sets out unambiguous success criteria focused on behaviour change. This paper reviews the key concepts and principles of social marketing and its recent rapid development across government in England in the public health field. This paper outlines the role of the National Social Marketing Centre and concludes with a discussion of the probable future impact of social marketing on public health practice. The paper argues that there is a close ideological match between social marketing and liberal democratic imperatives. Social marketing's focus on outcomes and return on investment, and its emphasis on developing interventions that can respond to diverse needs, make it probable that social marketing will increasingly be required by governments as a standard part of public health programmes.
Renal angina: concept and development of pretest probability assessment in acute kidney injury.
Chawla, Lakhmir S; Goldstein, Stuart L; Kellum, John A; Ronco, Claudio
2015-02-27
The context of a diagnostic test is a critical component for the interpretation of its result. This context defines the pretest probability of the diagnosis and forms the basis for the interpretation and value of adding the diagnostic test. In the field of acute kidney injury, a multitude of early diagnostic biomarkers have been developed, but utilization in the appropriate context is less well understood and has not been codified until recently. In order to better operationalize the context and pretest probability assessment for acute kidney injury diagnosis, the renal angina concept was proposed in 2010 for use in both children and adults. Renal angina has been assessed in approximately 1,000 subjects. However, renal angina as a concept is still unfamiliar to most clinicians and the rationale for introducing the term is not obvious. We therefore review the concept and development of renal angina, and the currently available data validating it. We discuss the various arguments for and against this construct. Future research testing the performance of renal angina with acute kidney injury biomarkers is warranted.
Expressed Likelihood as Motivator: Creating Value through Engaging What’s Real
Higgins, E. Tory; Franks, Becca; Pavarini, Dana; Sehnert, Steen; Manley, Katie
2012-01-01
Our research tested two predictions regarding how likelihood can have motivational effects as a function of how a probability is expressed. We predicted that describing the probability of a future event that could be either A or B using the language of high likelihood (“80% A”) rather than low likelihood (“20% B”), i.e., high rather than low expressed likelihood, would make a present activity more real and engaging, as long as the future event had properties relevant to the present activity. We also predicted that strengthening engagement from the high (vs. low) expressed likelihood of a future event would intensify the value of present positive and negative objects (in opposite directions). Both predictions were supported. There was also evidence that this intensification effect from expressed likelihood was independent of the actual probability or valence of the future event. What mattered was whether high versus low likelihood language was used to describe the future event. PMID:23940411
Hardmeyer, Kent; Spencer, Michael A
2007-04-01
This article provides an overview of the use of risk-based analysis (RBA) in flood damage assessment, and it illustrates the use of Geographic Information Systems (GIS) in identifying flood-prone areas, which can aid in flood-mitigation planning assistance. We use RBA to calculate expected annual flood damages in an urban watershed in the state of Rhode Island, USA. The method accounts for the uncertainty in the three primary relationships used in computing flood damage: (1) the probability that a given flood will produce a given amount of floodwater, (2) the probability that a given amount of floodwater will reach a certain stage or height, and (3) the probability that a certain stage of floodwater will produce a given amount of damage. A greater than 50% increase in expected annual flood damage is estimated for the future if previous development patterns continue and flood-mitigation measures are not taken. GIS is then used to create a map that shows where and how often floods might occur in the future, which can help (1) identify priority areas for flood-mitigation planning assistance and (2) disseminate information to public officials and other decision-makers.
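The expected-annual-damage integration at the core of this kind of risk-based analysis can be sketched as follows, using a hypothetical damage-exceedance curve rather than the Rhode Island data.

```python
import numpy as np

# Illustrative sketch: integrate damage over the annual exceedance probability
# curve to obtain expected annual damage. Probabilities and damages are hypothetical.

exceedance_prob = np.array([0.5, 0.2, 0.1, 0.04, 0.02, 0.01, 0.002])  # per year
damage = np.array([0.0, 0.1, 0.4, 1.2, 2.5, 4.0, 9.0])                 # $ millions

# Trapezoidal integration of damage with respect to exceedance probability
p = exceedance_prob[::-1]          # increasing probabilities
d = damage[::-1]
expected_annual_damage = float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))
print(f"expected annual damage ~ ${expected_annual_damage:.2f}M")
```

A full risk-based analysis would additionally propagate uncertainty in each of the three relationships (flow frequency, stage-flow, and stage-damage) before integrating.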
NASA Technical Reports Server (NTRS)
Beckenbach, E. S. (Editor)
1984-01-01
It is more important than ever that engineers have an understanding of the future needs of clinical and research medicine, and that physicians know something about probable future developments in instrumentation capabilities. Only by maintaining such a dialog can the most effective application of technological advances to medicine be achieved. This workshop attempted to provide this kind of information transfer in the limited field of diagnostic imaging. Biomedical research at the Jet Propulsion Laboratory is discussed, taking into account imaging results from space exploration missions, as well as biomedical research tasks based on these technologies. Attention is also given to current and future indications for magnetic resonance in medicine, high speed quantitative digital microscopy, computer processing of radiographic images, computed tomography and its modern applications, positron emission tomography, and developments related to medical ultrasound.
NASA Astrophysics Data System (ADS)
Watanabe, S.; Utsumi, N.; Take, M.; Iida, A.
2016-12-01
This study aims to develop a new approach to assessing the impact of climate change on small oceanic islands in the Pacific. In the new approach, changes in the probabilities of various situations are projected, taking into account the spread of projections derived from ensemble simulations, instead of projecting only the most probable situation. We utilized the database for Policy Decision making for Future climate change (d4PDF), a long-term, high-resolution climate ensemble database containing the results of 100 ensemble simulations. A new methodology, Multi Threshold Ensemble Assessment (MTEA), was developed using the d4PDF in order to assess the impact of climate change. We focused on the impact of climate change on tourism because it plays an important role in the economy of the Pacific Islands. The Yaeyama Region, one of the tourist destinations in Okinawa, Japan, was selected as the case study site. Two kinds of impact were assessed: the change in probability of extreme climate phenomena and tourist satisfaction associated with weather. The d4PDF ensemble experiments and a questionnaire survey conducted by a local government were used for the assessment. The results indicated that the strength of extreme events would increase, whereas their probability of occurrence would decrease. This change should result in an increase in the number of clear days, which could contribute to improving tourist satisfaction.
NASA Astrophysics Data System (ADS)
Butler, G. V.
1981-04-01
Early space station designs are considered, taking into account Herman Oberth's first space station, the London Daily Mail Study, the first major space station design developed during the moon mission, and the Manned Orbiting Laboratory Program of the DOD. Attention is given to Skylab, new space station studies, the Shuttle and Spacelab, communication satellites, solar power satellites, a 30 meter diameter radiometer for geological measurements and agricultural assessments, the mining of the moon, and questions of international cooperation. It is thought to be very probable that there will be very large space stations at some time in the future. However, for the more immediate future a step-by-step development is envisaged, starting with Spacelab stations of 3-4 men.
European Science Notes Information Bulletin Reports on Current European and Middle Eastern Science.
1993-01-01
[Abstract garbled in extraction; recoverable fragments refer to developing and calibrating future operational observing systems, the FRONTS-87 experiment and mesoscale studies, an intense measuring network probably to the northwest of Scotland, and work by J. Testud and colleagues.]
Light Pollution in Natural Science Textbooks in Spanish Secondary Education
ERIC Educational Resources Information Center
Contel, Teresa Muñoz; Ferrandis, Ignacio García; Ferrandis, Xavier García
2016-01-01
Light pollution has emerged with the industrial development in recent decades. It is becoming a significant environmental issue for cities today and it will probably become more important in the near future. However, very little research has been carried out on this issue in the field of science teaching, despite there being a general agreement…
Scenario analysis of the future of medicines.
Leufkens, H.; Haaijer-Ruskamp, F.; Bakker, A.; Dukes, G.
1994-01-01
Planning future policy for medicines poses difficult problems. The main players in the drug business have their own views as to how the world around them functions and how the future of medicines should be shaped. In this paper we show how a scenario analysis can provide a powerful teaching device to readjust people's preconceptions. Scenarios are plausible, not probable or preferable, portraits of alternative futures. A series of four alternative scenarios were constructed: "sobriety in sufficiency," "risk avoidance," "technology on demand," and "free market unfettered." Each scenario was drawn as a narrative, documented quantitatively wherever possible, that described the world as it might be if particular trends were to dominate development. The medical community and health policy makers may use scenarios to take a long term view in order to be adequately prepared for the future. PMID:7987110
NASA Astrophysics Data System (ADS)
Zarola, Amit; Sil, Arjun
2018-04-01
This study presents forecasts of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10²⁰ ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10²⁰ ergs). The second is the probability that the seismic energy expected to be released per year (a × 10²⁰ ergs/year) exceeds a certain level of seismic energy per year (A × 10²⁰ ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), the 13 April 2016 Myanmar earthquake (M 6.9) and the 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area, showing that the proposed techniques and models yield good forecasting accuracy.
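A hedged sketch of the general workflow (fit candidate renewal distributions, compare log-likelihoods, compute a conditional exceedance probability) on synthetic inter-event times; the log-logistic case is omitted and all data and parameters below are invented.

```python
import numpy as np
from scipy import stats

# Synthetic inter-event times [yr]; not the catalog used in the study
rng = np.random.default_rng(3)
intervals = rng.weibull(1.5, size=60) * 12.0

# Fit three candidate renewal distributions and compare log-likelihoods
models = {"gamma": stats.gamma, "lognormal": stats.lognorm, "weibull": stats.weibull_min}
fits = {}
for name, dist in models.items():
    params = dist.fit(intervals, floc=0)
    lnL = float(np.sum(dist.logpdf(intervals, *params)))
    fits[name] = (dist, params, lnL)
    print(f"{name:9s} lnL = {lnL:.1f}")

# Conditional probability P(T <= elapsed + t | T > elapsed) under the best model
dist, params, _ = max(fits.values(), key=lambda f: f[2])
elapsed, horizon = 8.0, 5.0
p_cond = (dist.cdf(elapsed + horizon, *params) - dist.cdf(elapsed, *params)) / dist.sf(elapsed, *params)
print(f"P(event within {horizon} yr | {elapsed} yr elapsed) = {p_cond:.2f}")
```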
An introduction of a new stochastic tropical cyclone model for Japan area
NASA Astrophysics Data System (ADS)
Suzuki, K.; Nakano, S.; Ueno, G.; Mori, N.; Nakajo, S.
2015-12-01
Extreme events such as tropical cyclones (TCs), downpours, and floods have huge influences on human life in the past, present, and future. In particular, changes in their risks to human life under the future climate are a concern of governments and researchers. Our aim is to estimate the probabilities of frequencies of TCs that could strike Japan under the future climate as calculated by GCMs. Carrying out this task requires a suitable rare-event sampling method to find TCs that land on big cities in Japan, as well as a sufficient number of reproduced TCs for calculating their probabilities. The model for TC reproduction is designed with three parts following the TC lifecycle: formation, maturity and decay. However, we do not treat the maturity part with physical equations, because the maturity process is too complicated to express as a stochastic model; a TC intensity model takes the place of this physical part. Several stochastic TC models have been developed for different purposes and problems. Our model is developed for the establishment of a rare-event sampling method. Here, the behaviors of TC tracks from several stochastic TC models are compared using Best Track data provided by the Japan Meteorological Agency and MRI-AGCM data for the present climate.
Future southcentral US wildfire probability due to climate change
Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.
2018-01-01
Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the illumination of climate changes where the fire probability response (+, −) may deviate (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.
Assessing the present and future probability of Hurricane Harvey's rainfall
NASA Astrophysics Data System (ADS)
Emanuel, Kerry
2017-11-01
We estimate, for current and future climates, the annual probability of areally averaged hurricane rain of Hurricane Harvey's magnitude by downscaling large numbers of tropical cyclones from three climate reanalyses and six climate models. For the state of Texas, we estimate that the annual probability of 500 mm of area-integrated rainfall was about 1% in the period 1981-2000 and will increase to 18% over the period 2081-2100 under Intergovernmental Panel on Climate Change (IPCC) AR5 representative concentration pathway 8.5. If the frequency of such events increases linearly between these two periods, then in 2017 the annual probability would be 6%, a sixfold increase since the late 20th century.
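The quoted 2017 figure follows from simple linear interpolation between the two periods; the short check below assumes the interpolation runs between the period midpoints, which is an assumption about how "linearly between these two periods" is applied.

```python
# Quick check of the linear-interpolation statement above (illustrative):
# annual probability rises from ~1% (1981-2000) to ~18% (2081-2100);
# interpolating linearly between the period midpoints to 2017 gives ~6%.

p0, year0 = 0.01, 1990.5      # midpoint of 1981-2000
p1, year1 = 0.18, 2090.5      # midpoint of 2081-2100
p_2017 = p0 + (p1 - p0) * (2017 - year0) / (year1 - year0)
print(f"interpolated 2017 annual probability ~ {p_2017:.1%}")   # about 5.5%, i.e. ~6%
```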
Predictive probability methods for interim monitoring in clinical trials with longitudinal outcomes.
Zhou, Ming; Tang, Qi; Lang, Lixin; Xing, Jun; Tatsuoka, Kay
2018-04-17
In clinical research and development, interim monitoring is critical for better decision-making and minimizing the risk of exposing patients to possible ineffective therapies. For interim futility or efficacy monitoring, predictive probability methods are widely adopted in practice. Those methods have been well studied for univariate variables. However, for longitudinal studies, predictive probability methods using univariate information from only completers may not be most efficient, and data from on-going subjects can be utilized to improve efficiency. On the other hand, leveraging information from on-going subjects could allow an interim analysis to be potentially conducted once a sufficient number of subjects reach an earlier time point. For longitudinal outcomes, we derive closed-form formulas for predictive probabilities, including Bayesian predictive probability, predictive power, and conditional power and also give closed-form solutions for predictive probability of success in a future trial and the predictive probability of success of the best dose. When predictive probabilities are used for interim monitoring, we study their distributions and discuss their analytical cutoff values or stopping boundaries that have desired operating characteristics. We show that predictive probabilities utilizing all longitudinal information are more efficient for interim monitoring than that using information from completers only. To illustrate their practical application for longitudinal data, we analyze 2 real data examples from clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
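For intuition, a Bayesian predictive probability of success can be sketched for the simpler binary-endpoint case by simulating the remaining data from the posterior; this is not the paper's closed-form longitudinal formulation, and all numbers, priors, and the final decision rule below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def predictive_probability(successes, n_interim, n_total, p_control=0.30,
                           alpha_prior=1.0, beta_prior=1.0, n_sims=50_000, seed=0):
    """P(final analysis succeeds | interim data), via posterior simulation."""
    rng = np.random.default_rng(seed)
    post_a = alpha_prior + successes                     # Beta posterior for response rate
    post_b = beta_prior + n_interim - successes
    n_remaining = n_total - n_interim

    p_draws = rng.beta(post_a, post_b, n_sims)           # draw plausible response rates
    future_successes = rng.binomial(n_remaining, p_draws) # predict the remaining data
    total = successes + future_successes

    # Final decision rule: one-sided exact binomial test against p_control at 2.5%
    p_values = stats.binom.sf(total - 1, n_total, p_control)
    return float(np.mean(p_values < 0.025))

# Interim look: 18 responders out of 40, planned total of 100 subjects
print(predictive_probability(successes=18, n_interim=40, n_total=100))
```

The longitudinal extension described in the abstract would replace the Beta-binomial model with a model for partially observed repeated measurements, so that on-going subjects also contribute to the prediction.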
Ultrasonic Phased Array Simulations of Welded Components at NASA
NASA Technical Reports Server (NTRS)
Roth, D. J.; Tokars, R. P.; Martin, R. E.; Rauser, R. W.; Aldrin, J. C.
2009-01-01
Comprehensive and accurate inspections of welded components have become of increasing importance as NASA develops new hardware such as Ares rocket segments for future exploration missions. Simulation and modeling will play an increasing role in the future for nondestructive evaluation in order to better understand the physics of the inspection process, to prove or disprove the feasibility for an inspection method or inspection scenario, for inspection optimization, for better understanding of experimental results, and for assessment of probability of detection. This study presents simulation and experimental results for an ultrasonic phased array inspection of a critical welded structure important for NASA future exploration vehicles. Keywords: nondestructive evaluation, computational simulation, ultrasonics, weld, modeling, phased array
Space Radiation Risk Assessment for Future Lunar Missions
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Ponomarev, Artem; Atwell, Bill; Cucinotta, Francis A.
2007-01-01
For lunar exploration mission design, radiation risk assessments require an understanding of future space radiation environments in support of resource management decisions, operational planning, and go/no-go decisions. The future GCR flux was estimated as a function of the interplanetary deceleration potential, which was coupled with the estimated neutron monitor rate from the Climax monitor using a statistical model. A probability distribution function for solar particle event (SPE) occurrence was formed from proton fluence measurements of SPEs that occurred during the past 5 solar cycles (19-23). Large proton SPEs identified from impulsive nitrate enhancements in polar ice, for which the fluences are greater than 2 × 10⁹ protons/cm² for energies greater than 30 MeV, were also combined to extend the probability calculation to high levels of proton fluence. The probability with which any given proton fluence level of an SPE will be exceeded during a space mission of defined duration was then calculated. Analytic energy spectra of SPEs at different ranks of the integral fluences were constructed over broad energy ranges extending out to GeV energies, and representative exposure levels were analyzed at those fluences. For the development of an integrated strategy for radiation protection on lunar exploration missions, effective doses at various points inside a spacecraft were calculated with detailed geometry models representing proposed transfer vehicle and habitat concepts. Preliminary radiation risk assessments from SPEs and GCR were compared for various configuration concepts of radiation shelters in exploratory-class spacecraft.
Mavromoustakos, Elena; Clark, Gavin I; Rock, Adam J
2016-01-01
Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events was predicted by time perspective. Sixty flying phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, Probability Scale (measuring perceived probability of flying-negative events, general-negative and general positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying phobic group estimated the probability of flying negative and general negative events occurring as significantly higher than non-flying phobics. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined.
NASA Astrophysics Data System (ADS)
Lee, J. Y.; Chae, B. S.; Wi, S.; Kim, T. W.
2017-12-01
Various climate change scenarios project that rainfall in South Korea will increase by 3-10% in the future. This increased rainfall will also have a significant effect on future flood frequency. This study analyzed the probability of future floods to investigate the stability of existing and newly installed hydraulic structures and the possibility of increasing flood damage in mid-sized watersheds in South Korea. To achieve this goal, we first clarified the relationship between flood quantiles obtained from flood-frequency analysis (FFA) and design rainfall-runoff analysis (DRRA) in gauged watersheds. Then, after synthetically generating regional natural flow data according to RCP climate change scenarios, we developed mathematical formulas to estimate future flood quantiles based on the regression between DRRA and FFA incorporated with regional natural flows in ungauged watersheds. Finally, we developed a flood risk map to investigate the change of flood risk in terms of the return period for the past, present, and future. The results indicate that future flood quantiles and risks will increase in accordance with the RCP climate change scenarios. Because regional flood risk was found to increase in the future compared with the present, comprehensive flood control will be needed to cope with future extreme floods.
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.
2011-01-01
A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
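The underlying computation, integrating a bivariate Gaussian location error over a circle centered on an arbitrary point, can be approximated by Monte Carlo as in the sketch below; the covariance, point of interest, and radius are illustrative assumptions, not operational values.

```python
import numpy as np

# Illustrative sketch: probability that the true stroke location fell within
# radius r of an arbitrary point of interest, given a bivariate Gaussian
# location error about the reported stroke position.

rng = np.random.default_rng(11)

stroke_mean = np.array([0.0, 0.0])           # most likely stroke location [km]
cov = np.array([[0.5**2, 0.1],
                [0.1, 0.3**2]])              # location error covariance [km^2]
point_of_interest = np.array([0.4, 0.2])     # need not lie inside the error ellipse
radius = 0.5                                 # radius of concern [km]

samples = rng.multivariate_normal(stroke_mean, cov, size=1_000_000)
dist = np.linalg.norm(samples - point_of_interest, axis=1)
print(f"P(stroke within {radius} km of the point) ~ {np.mean(dist < radius):.3f}")
```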
Probability-based hazard avoidance guidance for planetary landing
NASA Astrophysics Data System (ADS)
Yuan, Xu; Yu, Zhengshi; Cui, Pingyuan; Xu, Rui; Zhu, Shengying; Cao, Menglong; Luan, Enjie
2018-03-01
Future landing and sample return missions on planets and small bodies will seek landing sites with high scientific value, which may be located in hazardous terrains. Autonomous landing in such hazardous terrains and highly uncertain planetary environments is particularly challenging. Onboard hazard avoidance ability is indispensable, and the algorithms must be robust to uncertainties. In this paper, a novel probability-based hazard avoidance guidance method is developed for landing in hazardous terrains on planets or small bodies. By regarding the lander state as probabilistic, the proposed guidance algorithm exploits information on the uncertainty of lander position and calculates the probability of collision with each hazard. The collision probability serves as an accurate safety index, which quantifies the impact of uncertainties on the lander safety. Based on the collision probability evaluation, the state uncertainty of the lander is explicitly taken into account in the derivation of the hazard avoidance guidance law, which contributes to enhancing the robustness to the uncertain dynamics of planetary landing. The proposed probability-based method derives fully analytic expressions and does not require off-line trajectory generation. Therefore, it is appropriate for real-time implementation. The performance of the probability-based guidance law is investigated via a set of simulations, and the effectiveness and robustness under uncertainties are demonstrated.
II. MORE THAN JUST CONVENIENT: THE SCIENTIFIC MERITS OF HOMOGENEOUS CONVENIENCE SAMPLES.
Jager, Justin; Putnick, Diane L; Bornstein, Marc H
2017-06-01
Despite their disadvantaged generalizability relative to probability samples, nonprobability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examine developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability relative to conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. © 2017 The Society for Research in Child Development, Inc.
Estimation of probability of failure for damage-tolerant aerospace structures
NASA Astrophysics Data System (ADS)
Halbert, Keith
The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This dissertation describes and develops new PDTA methodologies that directly address the deficiencies of the currently used tools. The new methods are implemented as a free, publicly licensed and open source R software package that can be downloaded from the Comprehensive R Archive Network. The tools consist of two main components. First, an explicit (and expensive) Monte Carlo approach is presented which simulates the life of an aircraft structural component flight-by-flight. This straightforward MC routine can be used to provide defensible estimates of the failure probabilities for future flights and repair probabilities for future inspections under a variety of failure and maintenance scenarios. This routine is intended to provide baseline estimates against which to compare the results of other, more efficient approaches. Second, an original approach is described which models the fatigue process and future scheduled inspections as a hidden Markov model. This model is solved using a particle-based approximation and the sequential importance sampling algorithm, which provides an efficient solution to the PDTA problem. Sequential importance sampling is an extension of importance sampling to a Markov process, allowing for efficient Bayesian updating of model parameters. This model updating capability, the benefit of which is demonstrated, is lacking in other PDTA approaches. The results of this approach are shown to agree with the results of the explicit Monte Carlo routine for a number of PDTA problems. 
Extensions to the typical PDTA problem, which cannot be solved using currently available tools, are presented and solved in this work. These extensions include incorporating observed evidence (such as non-destructive inspection results), more realistic treatment of possible future repairs, and the modeling of failure involving more than one crack (the so-called continuing damage problem). The described hidden Markov model / sequential importance sampling approach to PDTA has the potential to improve aerospace structural safety and reduce maintenance costs by providing a more accurate assessment of the risk of failure and the likelihood of repairs throughout the life of an aircraft.
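To make the flight-by-flight Monte Carlo idea concrete, here is a minimal Python sketch. The crack-growth law, initial crack-size distribution, inspection interval, critical crack length, and probability-of-detection curve are all invented placeholders, not the models or values used in the dissertation or its R package.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative placeholder values (not the dissertation's models or parameters)
N_SIM = 2_000          # simulated component lifetimes
N_FLIGHTS = 5_000      # flights per lifetime
INSPECT_EVERY = 1_000  # scheduled inspection interval (flights)
A_CRIT = 25.0          # critical crack length (mm); exceeding it counts as failure

def pod(a_mm):
    """Assumed probability-of-detection curve versus crack length (mm)."""
    return 1.0 / (1.0 + (5.0 / max(a_mm, 1e-6)) ** 4)

fail_by_flight = np.zeros(N_FLIGHTS + 1)
repairs = 0

for _ in range(N_SIM):
    a = rng.lognormal(mean=np.log(0.5), sigma=0.5)       # initial crack length (mm)
    for flight in range(1, N_FLIGHTS + 1):
        a *= np.exp(rng.normal(0.001, 0.0003))           # stochastic per-flight crack growth
        if a >= A_CRIT:
            fail_by_flight[flight] += 1                  # structural failure on this flight
            break
        if flight % INSPECT_EVERY == 0 and rng.random() < pod(a):
            repairs += 1                                 # crack found: repair resets the state
            a = rng.lognormal(mean=np.log(0.5), sigma=0.5)

cum_fail = np.cumsum(fail_by_flight) / N_SIM
print(f"P(failure within {N_FLIGHTS} flights) ~ {cum_fail[-1]:.3f}")
print(f"Expected repairs per lifetime        ~ {repairs / N_SIM:.2f}")
```

Such a brute-force simulation is slow but transparent, which is exactly why the dissertation uses the explicit Monte Carlo routine as a baseline against which the hidden Markov model / sequential importance sampling results are checked.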
A species sensitivity distribution (SSD) is a probability model of the variation of species sensitivities to a stressor, in particular chemical exposure. The SSD approach has been used as a decision support tool in environmental protection and management since the 1980s, and the ...
ERIC Educational Resources Information Center
Burley, Stephanie
2012-01-01
History curriculum reform proposals and debates are a persistent feature of the contemporary educational landscape in England and, very probably, a "sign of the times" that can reveal a great deal about contemporary predicaments and concerns. History curriculum controversy is also a global phenomenon and one that can fruitfully--and,…
New educational tools to encourage high-school students' activity in STEM
NASA Astrophysics Data System (ADS)
Mayorova, Vera; Grishko, Dmitriy; Leonov, Victor
2018-01-01
Many students have to choose their future profession during their last years of high school and therefore to choose a university where they will get a proper education. That choice may define their professional life for many years ahead, or probably for the rest of their lives. Bauman Moscow State Technical University conducts various events to introduce future professions to high-school students. Such activity helps them pick a specialization in line with their interests and motivates them to study key scientific subjects. The paper focuses on newly developed educational tools to encourage high-school students' interest in STEM disciplines. These tools include laboratory courses developed in the fields of physics, information technology and mathematics. More than 2000 high-school students have already participated in these experimental courses. These activities are aimed at improving the quality of STEM learning, which will result in better training of future engineers.
Technology opportunities in a restructured electric industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gehl, S.
1995-12-31
This paper describes the Strategic Research & Development (SR&D) program of the Electric Power Research Institute (EPRI). The intent of the program is to anticipate and shape the scientific and technological future of the electricity enterprise. SR&D serves those industry R&D needs that are more exploratory, precompetitive, and longer-term. To this end, SR&D seeks to anticipate technological change and, where possible, shape that change to the advantage of the electric utility enterprise and its customers. SR&D's response to this challenge is a research and development program that addresses the most probable future of the industry, but at the same time is robust against alternative futures. The EPRI SR&D program is organized into several vectors, each with a mission that relates directly to one or more EPRI industry goals, which are summarized in the paper. 1 fig., 2 tabs.
Cant, Michael A; Llop, Justine B; Field, Jeremy
2006-06-01
Recent theory suggests that much of the wide variation in individual behavior that exists within cooperative animal societies can be explained by variation in the future direct component of fitness, or the probability of inheritance. Here we develop two models to explore the effect of variation in future fitness on social aggression. The models predict that rates of aggression will be highest toward the front of the queue to inherit and will be higher in larger, more productive groups. A third prediction is that, in seasonal animals, aggression will increase as the time available to inherit the breeding position runs out. We tested these predictions using a model social species, the paper wasp Polistes dominulus. We found that rates of both aggressive "displays" (aimed at individuals of lower rank) and aggressive "tests" (aimed at individuals of higher rank) decreased down the hierarchy, as predicted by our models. The only other significant factor affecting aggression rates was date, with more aggression observed later in the season, also as predicted. Variation in future fitness due to inheritance rank is the hidden factor accounting for much of the variation in aggressiveness among apparently equivalent individuals in this species.
Modulation and multiplexing in ultra-broadband photonic internet: Part II
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2011-06-01
This paper reviews our current understanding of the ultimately broadband photonic Internet. A simple calculation is presented that estimates the throughput of the core photonic network branches. Optoelectronic components, circuits, systems and signals, together with analogous electronic entities and common software layers, are the building blocks of the contemporary Internet. The share of photonics in the development of the physical layer of the future Internet will probably increase. Photonics now enables better usage of the available bandwidth (increased spectral efficiency, measured in bit/s/Hz), higher transmission rates (from Gbps, via Tbps, up to probably Pbps), longer transmission distances without signal regeneration (in distortion-compensated active optical cables), better energy/power efficiency measured in W/Gbps, etc. Photonics may lead, in the future, to fully transparent optical networks and thus to an essential increase in bandwidth and network reliability. It is expected that photonics (with biochemistry, electronics and mechatronics) may build the psychological and physiological interface between humans and the future global network. The following optical signal multiplexing methods, which are possible without O/E/O conversion, were considered: TDM-OTDM, FDM-CO-OFDM, OCDM-OCDMA, WDM-DWDM.
Ultra-broadband photonic internet
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2011-06-01
This paper reviews our current understanding of the ultimately broadband photonic Internet. A simple calculation is presented that estimates the throughput of the core photonic network branches. Optoelectronic components, circuits, systems and signals, together with analogous electronic entities and common software layers, are the building blocks of the contemporary Internet. The share of photonics in the development of the physical layer of the future Internet will probably increase. Photonics now enables better usage of the available bandwidth (increased spectral efficiency, measured in bit/s/Hz), higher transmission rates (from Gbps, via Tbps, up to probably Pbps), longer transmission distances without signal regeneration (in distortion-compensated active optical cables), better energy/power efficiency measured in W/Gbps, etc. Photonics may lead, in the future, to fully transparent optical networks and thus to an essential increase in bandwidth and network reliability. It is expected that photonics (with biochemistry, electronics and mechatronics) may build the psychological and physiological interface between humans and the future global network. The following optical signal multiplexing methods, which are possible without O/E/O conversion, were considered: TDM-OTDM, FDM-CO-OFDM, OCDM-OCDMA, WDM-DWDM.
Modulation and multiplexing in ultra-broadband photonic internet: Part I
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2011-06-01
This paper reviews our current understanding of the ultimately broadband photonic Internet. A simple calculation is presented that estimates the throughput of the core photonic network branches. Optoelectronic components, circuits, systems and signals, together with analogous electronic entities and common software layers, are the building blocks of the contemporary Internet. The share of photonics in the development of the physical layer of the future Internet will probably increase. Photonics now enables better usage of the available bandwidth (increased spectral efficiency, measured in bit/s/Hz), higher transmission rates (from Gbps, via Tbps, up to probably Pbps), longer transmission distances without signal regeneration (in distortion-compensated active optical cables), better energy/power efficiency measured in W/Gbps, etc. Photonics may lead, in the future, to fully transparent optical networks and thus to an essential increase in bandwidth and network reliability. It is expected that photonics (with biochemistry, electronics and mechatronics) may build the psychological and physiological interface between humans and the future global network. The following optical signal multiplexing methods, which are possible without O/E/O conversion, were considered: TDM-OTDM, FDM-CO-OFDM, OCDM-OCDMA, WDM-DWDM.
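The simple throughput calculation mentioned in these abstracts can be sketched as the product of usable optical bandwidth, spectral efficiency and the number of fibres per branch. The figures below are rough assumptions for illustration, not numbers taken from the papers.

```python
# Illustrative core-branch throughput estimate (all values are assumptions)
c_band_hz = 4.4e12             # usable C-band optical bandwidth (~4.4 THz)
spectral_eff_bps_per_hz = 4.0  # assumed spectral efficiency (bit/s/Hz)
fibres_per_branch = 10         # assumed fibre pairs in one core network branch

per_fibre_bps = c_band_hz * spectral_eff_bps_per_hz
branch_bps = per_fibre_bps * fibres_per_branch

print(f"Per-fibre capacity: {per_fibre_bps / 1e12:.1f} Tbit/s")
print(f"Branch throughput : {branch_bps / 1e15:.3f} Pbit/s")
```

Even with these conservative assumptions the branch capacity lands in the high-Tbit/s to Pbit/s range quoted in the abstract.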
Assessing the present and future probability of Hurricane Harvey's rainfall.
Emanuel, Kerry
2017-11-28
We estimate, for current and future climates, the annual probability of areally averaged hurricane rain of Hurricane Harvey's magnitude by downscaling large numbers of tropical cyclones from three climate reanalyses and six climate models. For the state of Texas, we estimate that the annual probability of 500 mm of area-integrated rainfall was about 1% in the period 1981-2000 and will increase to 18% over the period 2081-2100 under Intergovernmental Panel on Climate Change (IPCC) AR5 representative concentration pathway 8.5. If the frequency of such events increases linearly between these two periods, then in 2017 the annual probability would be 6%, a sixfold increase since the late 20th century. Copyright © 2017 the Author(s). Published by PNAS.
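The quoted sixfold increase follows from a straight-line interpolation of the annual probability between the two study periods. A minimal sketch of that arithmetic, assuming the period mid-points as interpolation anchors:

```python
# Linear interpolation of the annual probability of Harvey-scale Texas rainfall
p_1981_2000 = 0.01        # annual probability, 1981-2000 (from the study)
p_2081_2100 = 0.18        # annual probability, 2081-2100 under RCP8.5 (from the study)

t0, t1 = 1990.5, 2090.5   # mid-points of the two 20-year periods (assumed anchors)
year = 2017

p_year = p_1981_2000 + (p_2081_2100 - p_1981_2000) * (year - t0) / (t1 - t0)
print(f"Estimated annual probability in {year}: {p_year:.3f}")  # ~0.055, roughly 6x the 1% baseline
```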
NASA Astrophysics Data System (ADS)
Lindsay, Jan; Marzocchi, Warner; Jolly, Gill; Constantinescu, Robert; Selva, Jacopo; Sandri, Laura
2010-03-01
The Auckland Volcanic Field (AVF) is a young basaltic field that lies beneath the urban area of Auckland, New Zealand’s largest city. Over the past 250,000 years the AVF has produced at least 49 basaltic centers; the last eruption was only 600 years ago. In recognition of the high risk associated with a possible future eruption in Auckland, the New Zealand government ran Exercise Ruaumoko in March 2008, a test of New Zealand’s nation-wide preparedness for responding to a major disaster resulting from a volcanic eruption in Auckland City. The exercise scenario was developed in secret, and covered the period of precursory activity up until the eruption. During Exercise Ruaumoko we adapted a recently developed statistical code for eruption forecasting, namely BET_EF (Bayesian Event Tree for Eruption Forecasting), to independently track the unrest evolution and to forecast the most likely onset time, location and style of the initial phase of the simulated eruption. The code was set up before the start of the exercise by entering reliable information on the past history of the AVF as well as the monitoring signals expected in the event of magmatic unrest and an impending eruption. The average probabilities calculated by BET_EF during Exercise Ruaumoko corresponded well to the probabilities subjectively (and independently) estimated by the advising scientists (differences of a few percentage points), and provided a sound forecast of the timing (before the event, the eruption probability reached 90%) and location of the eruption. This application of BET_EF to a volcanic field that has experienced no historical activity and for which otherwise limited prior information is available shows its versatility and potential usefulness as a tool to aid decision-making for a wide range of volcano types. Our near real-time application of BET_EF during Exercise Ruaumoko highlighted its potential to clarify and possibly optimize decision-making procedures in a future AVF eruption crisis, and as a rational starting point for discussions in a scientific advisory group. It also stimulated valuable scientific discussion around how a future AVF eruption might progress, and highlighted areas of future volcanological research that would reduce epistemic uncertainties through the development of better input models.
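As a much-simplified, hypothetical illustration of the event-tree idea (not the actual BET_EF code, its node structure, or the values used during Exercise Ruaumoko), the short-term eruption probability can be expressed as a product of conditional node probabilities, each treated as an uncertain Beta-distributed quantity:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical Beta distributions for the first three event-tree nodes
# (unrest, magmatic origin given unrest, eruption given magmatic unrest).
p_unrest   = rng.beta(8, 2, n)   # strong monitoring signals, so unrest very likely
p_magmatic = rng.beta(6, 2, n)   # unrest judged likely magmatic
p_eruption = rng.beta(4, 2, n)   # eruption given magmatic unrest

p_total = p_unrest * p_magmatic * p_eruption  # eruption probability in the forecast window

lo, med, hi = np.percentile(p_total, [10, 50, 90])
print(f"Eruption probability: median {med:.2f} (10th-90th percentile {lo:.2f}-{hi:.2f})")
```

Propagating the node uncertainties rather than multiplying point estimates is what lets an event-tree tool report a probability together with its spread.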
Landmark, Tormod; Dale, Ola; Romundstad, Pål; Woodhouse, Astrid; Kaasa, Stein; Borchgrevink, Petter C
2018-05-13
Epidemiological studies of chronic pain frequently report high prevalence estimates. However, there is little information about the development and natural course of chronic pain. We followed a random sample of participants from a population-based study (HUNT 3) with annual measures over four years. Among those without chronic pain at baseline, the probability of developing moderate to severe chronic pain (cumulative incidence) during the first year was 5%, a pain status that was maintained among 38% at the second follow-up. The probability of developing chronic pain diminished substantially for those who maintained a status of no chronic pain over several years. Subjects with moderate to severe chronic pain at baseline had an 8% probability of recovery into no chronic pain, a status that was maintained for 52% on the second follow-up. The probability of recovery diminished substantially as a status of chronic pain was prolonged for several years. Pain severity, widespread pain, pain catastrophizing, depression and sleep were significant predictors of future moderate to severe chronic pain, both among subjects with and without chronic pain at baseline. These findings suggest that the prognosis is fairly good after a new onset of chronic pain. When the pain has lasted for several years, the prognosis becomes poor. The same social and psychological factors predict new onset and the prognosis of chronic pain. This article is protected by copyright. All rights reserved.
Future career plans of a cohort of senior doctors working in the National Health Service.
Taylor, Kathryn; Lambert, Trevor; Goldacre, Michael
2008-04-01
To report on the future career plans of senior doctors working in the NHS. Postal questionnaires. All doctors who qualified in 1977 from all UK medical schools. Future plans and whether participants had any unmet needs for advice on how to put their future plans into effect. 25% definitely intended to continue with their current employment on the same basis until they retired; 75% hoped for change. A reduction in working hours was the most commonly desired change; but a substantial percentage also wanted changes in job content. 50% of respondents intended definitely (17%) or probably (33%) to work in the NHS to their normal retirement age; and 37% definitely (20%) or probably (17%) intended to retire early. 48% had made plans, in addition to the standard pension, to facilitate early retirement. The main factors given for considering early retirement were family reasons and wanting more time for leisure, a desire to maintain good health, excessive pressure of work, and disillusionment with NHS changes. A reduction in workload would be the greatest inducement to stay. 31% of respondents reported that they had unmet needs for advice about their future plans. Of these, about half were needs for advice about planning for retirement. Many senior NHS doctors would like to reduce their working hours. Less than a quarter definitely intend to work in the NHS to normal retirement age. Even for senior doctors, advice on career development is needed.
Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.
2010-01-01
Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will cause species ranges to expand or contract, or species to be extirpated. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.
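A minimal sketch of the classification-tree approach, using synthetic data and invented predictors (water temperature and forest cover) rather than the Muskegon data set; it only illustrates how such a model returns a presence probability under a warmer scenario:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)

# Synthetic training data: July water temperature (C) and catchment forest cover (%)
n = 2000
temp = rng.uniform(12, 26, n)
forest = rng.uniform(0, 100, n)
# Assumed "coldwater species present" rule with 10% label noise
present = ((temp < 18) & (forest > 40)) ^ (rng.random(n) < 0.1)

model = DecisionTreeClassifier(max_depth=4, min_samples_leaf=50, random_state=0)
model.fit(np.column_stack([temp, forest]), present)

# Presence probability for one reach, now versus a +3 C air-temperature scenario
reach_now  = [[16.0, 70.0]]
reach_2100 = [[16.0 + 2.4, 70.0]]  # assume ~0.8 C water warming per 1 C of air warming
p_now  = model.predict_proba(reach_now)[0, 1]
p_2100 = model.predict_proba(reach_2100)[0, 1]
print(f"P(presence) now: {p_now:.2f}, under +3 C scenario: {p_2100:.2f}")
```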
Dohm, J.M.; Ferris, J.C.; Barlow, N.G.; Baker, V.R.; Mahaney, W.C.; Anderson, R.C.; Hare, T.M.
2004-01-01
The northwestern slope valleys region is a prime candidate site for future science-driven Mars exploration because it records Noachian to Amazonian Tharsis development in a region that encapsulates (1) a diverse and temporally extensive stratigraphic record, (2) at least three distinct paleohydrologic regimes, (3) gargantuan structurally controlled flood valleys that generally correspond with gravity and magnetic anomalies, possibly marking ancient magnetized rock materials exposed by fluvial activity, (4) water enrichment, as indicated by Mars Odyssey and impact crater analyses, (5) long-lived magma and ground water/ice interactions that could be favorable for the development and sustenance of life, and (6) potential paleosol development. This region has high probability to yield significant geologic, climatic, and exobiologic information that could revolutionize our understanding of Mars. © 2003 Elsevier Ltd. All rights reserved.
Code of Federal Regulations, 2014 CFR
2014-07-01
... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...
Code of Federal Regulations, 2013 CFR
2013-07-01
... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...
A Probabilistic, Facility-Centric Approach to Lightning Strike Location
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.
2012-01-01
A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
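The core calculation (integrating a bivariate Gaussian location error over a circle that need not be centered on the ellipse) can be approximated by Monte Carlo. The ellipse parameters, facility offset, and radius below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed error ellipse for the reported stroke location (km): semi-major/minor
# standard deviations and orientation; the facility sits at an offset from it.
sigma_major, sigma_minor = 0.60, 0.25
theta = np.deg2rad(30.0)                # orientation of the major axis
facility_offset = np.array([0.8, 0.3])  # facility position relative to reported location (km)
radius = 1.0                            # radius of concern around the facility (km)

# Covariance matrix of the strike-location error
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
cov = R @ np.diag([sigma_major**2, sigma_minor**2]) @ R.T

# Monte Carlo: sample possible true strike locations and count those within the circle
samples = rng.multivariate_normal(mean=np.zeros(2), cov=cov, size=1_000_000)
inside = np.linalg.norm(samples - facility_offset, axis=1) <= radius
print(f"P(strike within {radius:.1f} km of facility) ~ {inside.mean():.4f}")
```

In practice the published approach evaluates this integral numerically rather than by sampling, but the quantity being computed is the same.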
On the Question of the Current State of Autonomy of Higher Educational Institutions in Russia
ERIC Educational Resources Information Center
Gushchin, V. V.; Gureev, V. A.
2011-01-01
There is probably no one who doubts that education is a top-priority interest of the state. The future of the Russian Federation and its economic and legal stability depend on the level of development of the system of education. The right to an education is one of the most fundamental and essential constitutional rights of the citizens of the…
Guyette, Richard; Stambaugh, Michael C; Dey, Daniel; Muzika, Rose Marie
2017-01-01
The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect the combustion dynamics, irrespective of whether those are associated with carbon fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. Three classifications of combustion dynamics in ecosystem fire probability include: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature.
Guyette, Richard; Stambaugh, Michael C.; Dey, Daniel
2017-01-01
The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect the combustion dynamics, irrespective of whether those are associated with carbon fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. Three classifications of combustion dynamics in ecosystem fire probability include: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature. PMID:28704457
The Science-Policy Link: Stakeholder Reactions to the Uncertainties of Future Sea Level Rise
NASA Astrophysics Data System (ADS)
Plag, H.; Bye, B.
2011-12-01
Policy makers and stakeholders in the coastal zone are equally challenged by the risk of an anticipated rise of coastal Local Sea Level (LSL) as a consequence of future global warming. Many low-lying and often densely populated coastal areas are under risk of increased inundation. More than 40% of the global population is living in or near the coastal zone and this fraction is steadily increasing. A rise in LSL will increase the vulnerability of coastal infrastructure and population dramatically, with potentially devastating consequences for the global economy, society, and environment. Policy makers are faced with a trade-off between imposing today the often very high costs of coastal protection and adaptation upon national economies and leaving the costs of potential major disasters to future generations. They are in need of actionable information that provides guidance for the development of coastal zones resilient to future sea level changes. Part of this actionable information comes from risk and vulnerability assessments, which require information on future LSL changes as input. In most cases, a deterministic approach has been applied based on predictions of the plausible range of future LSL trajectories as input. However, there is little consensus in the scientific community on how these trajectories should be determined, and what the boundaries of the plausible range are. Over the last few years, many publications in Science, Nature and other peer-reviewed scientific journals have revealed a broad range of possible futures and significant epistemic uncertainties and gaps concerning LSL changes. Based on the somewhat diffuse science input, policy and decision makers have made rather different choices for mitigation and adaptation in cases such as Venice, The Netherlands, New York City, and the San Francisco Bay area. Replacing the deterministic, prediction-based approach with a statistical one that fully accounts for the uncertainties and epistemic gaps would provide a different kind of science input to policy makers and stakeholders. Like in many other insurance problems (for example, earthquakes), where deterministic predictions are not possible and decisions have to be made on the basis of statistics and probabilities, the statistical approach to coastal resilience would require stakeholders to make decisions on the basis of probabilities instead of predictions. The science input for informed decisions on adaptation would consist of general probabilities of decadal to century scale sea level changes derived from paleo records, including the probabilities for large and rapid rises. Similar to other problems where the appearance of a hazard is associated with a high risk (like a fire in a house), this approach would also require a monitoring and warning system (a "smoke detector") capable of detecting any onset of a rapid sea level rise.
A pilot study of naturally occurring high-probability request sequences in hostage negotiations.
Hughes, James
2009-01-01
In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research.
A PILOT STUDY OF NATURALLY OCCURRING HIGH-PROBABILITY REQUEST SEQUENCES IN HOSTAGE NEGOTIATIONS
Hughes, James
2009-01-01
In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research. PMID:19949541
Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Nolen, Matthew S.; Wagner, Brian K.
2017-01-01
Crayfish are ecologically important in freshwater systems worldwide and are imperiled in North America and globally. We sought to examine landscape- to local-scale environmental variables related to occupancy and detection probability of a suite of stream-dwelling crayfish species. We used a quantitative kickseine method to sample crayfish presence at 102 perennial stream sites with eight surveys per site. We modeled occupancy (psi) and detection probability (P) in relation to local- and landscape-scale environmental covariates. We developed a set of a priori candidate models for each species and ranked models using (Q)AICc. Detection probabilities and occupancy estimates differed among crayfish species with Orconectes eupunctus, O. marchandi, and Cambarus hubbsi being relatively rare (psi < 0.20) with moderate (0.46–0.60) to high (0.81) detection probability and O. punctimanus and O. ozarkae being relatively common (psi > 0.60) with high detection probability (0.81). Detection probability was often related to local habitat variables such as current velocity, depth, or substrate size. Important environmental variables for crayfish occupancy were species dependent but were mainly landscape variables such as stream order, geology, slope, topography, and land use. Landscape variables strongly influenced crayfish occupancy and should be considered in future studies and conservation plans.
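The role of repeated surveys is easy to see from the detection probabilities reported above: the chance of detecting a species at least once at an occupied site is 1 - (1 - p)^k for k surveys with per-survey detection probability p. A short sketch using detection probabilities in the reported range:

```python
# Probability of detecting a species at least once at an occupied site,
# given a per-survey detection probability p and k surveys per site.
# The p values mirror the ranges reported in the abstract; the pairing is illustrative.
def p_detect_at_least_once(p, k):
    return 1.0 - (1.0 - p) ** k

for label, p in [("rare, moderate detectability", 0.46),
                 ("rare, higher detectability", 0.60),
                 ("common, high detectability", 0.81)]:
    for k in (2, 4, 8):
        print(f"{label:30s} p={p:.2f} k={k}: "
              f"P(detected | occupied) = {p_detect_at_least_once(p, k):.3f}")
```

With eight surveys even the moderately detectable species are almost certain to be found where they occur, which is why occupancy estimates from such designs are credible for rare taxa.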
Coe, J.A.; Michael, J.A.; Crovelli, R.A.; Savage, W.Z.; Laprade, W.T.; Nashem, W.D.
2004-01-01
Ninety years of historical landslide records were used as input to the Poisson and binomial probability models. Results from these models show that, for precipitation-triggered landslides, approximately 9 percent of the area of Seattle has annual exceedance probabilities of 1 percent or greater. Application of the Poisson model for estimating the future occurrence of individual landslides results in a worst-case scenario map, with a maximum annual exceedance probability of 25 percent on a hillslope near Duwamish Head in West Seattle. Application of the binomial model for estimating the future occurrence of a year with one or more landslides results in a map with a maximum annual exceedance probability of 17 percent (also near Duwamish Head). Slope and geology both play a role in localizing the occurrence of landslides in Seattle. A positive correlation exists between slope and mean exceedance probability, with probability tending to increase as slope increases. Sixty-four percent of all historical landslide locations are within 150 m (500 ft, horizontal distance) of the Esperance Sand/Lawton Clay contact, but within this zone, no positive or negative correlation exists between exceedance probability and distance to the contact.
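A minimal sketch of the two probability models: the Poisson model converts a mean annual rate into the probability of one or more landslides next year, while the binomial model uses the fraction of record years with at least one landslide. The counts below are invented, chosen only so the outputs land near the maxima reported above:

```python
import math

# Annual exceedance probability from a historical landslide record (illustrative counts)
years_of_record = 90
events_at_cell = 26     # assumed number of recorded landslides affecting one hillslope cell

lam = events_at_cell / years_of_record            # mean annual rate (Poisson model)
p_poisson = 1.0 - math.exp(-lam)                  # P(one or more landslides next year)

years_with_events = 14                            # assumed years with >= 1 landslide at the cell
p_binomial = years_with_events / years_of_record  # P(a year has one or more landslides)

print(f"Poisson annual exceedance probability : {p_poisson:.2f}")
print(f"Binomial annual exceedance probability: {p_binomial:.2f}")
```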
Converting MEMS technology into profits
NASA Astrophysics Data System (ADS)
Bryzek, Janusz
1998-08-01
This paper discusses issues related to transitioning a company from the advanced technology development phase (with a particular focus on MEMS) to a profitable business, with emphasis on start-up companies. It includes several case studies from (primarily) NovaSensor MEMS development history. These case studies illustrate strategic problems with which advanced MEMS technology developers have to be concerned. Conclusions from these case studies could be used as checkpoints for future MEMS developers to increase the probability of profitable operations. The objective of this paper is to share the author's experience from multiple MEMS start-ups to accelerate development of the MEMS market by focusing state-of-the-art technologists on marketing issues.
Methods, apparatus and system for notification of predictable memory failure
Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.
2017-01-03
A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
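A minimal sketch of the claimed steps, with an assumed logistic failure model and an assumed workload-dependent threshold (the patent does not specify these functional forms; the coefficients and inputs are invented):

```python
import math

def memory_failure_probability(corrected_errors_per_day: float, temperature_c: float) -> float:
    """Assumed logistic model mapping monitored conditions to a failure probability."""
    score = 0.08 * corrected_errors_per_day + 0.05 * (temperature_c - 60.0) - 3.0
    return 1.0 / (1.0 + math.exp(-score))

def failure_probability_threshold(job_hours_remaining: float) -> float:
    """Assumed threshold: tolerate less risk the longer the remaining workload."""
    return max(0.01, 0.2 / (1.0 + job_hours_remaining / 24.0))

def check_and_notify(corrected_errors_per_day, temperature_c, job_hours_remaining):
    p = memory_failure_probability(corrected_errors_per_day, temperature_c)
    threshold = failure_probability_threshold(job_hours_remaining)
    if p > threshold:
        # The "signal" of the method; here just a message a runtime could act on
        print(f"PREDICTED MEMORY FAILURE: p={p:.3f} > threshold={threshold:.3f}")
    else:
        print(f"OK: p={p:.3f} <= threshold={threshold:.3f}")

check_and_notify(corrected_errors_per_day=55, temperature_c=72, job_hours_remaining=48)
```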
The present state and future directions of PDF methods
NASA Technical Reports Server (NTRS)
Pope, S. B.
1992-01-01
The objectives of the workshop are presented in viewgraph format, as is this entire article. The objectives are to discuss the present status and the future direction of various levels of engineering turbulence modeling related to Computational Fluid Dynamics (CFD) computations for propulsion; to assure that combustion is an essential part of propulsion; and to discuss Probability Density Function (PDF) methods for turbulent combustion. Essential to the integration of turbulent combustion models is the development of turbulence models, chemical kinetics, and numerical methods. Some turbulent combustion models typically used in industry are the k-epsilon turbulence model, equilibrium/mixing-limited combustion, and finite volume codes.
Balotari-Chiebao, Fabio; Villers, Alexandre; Ijäs, Asko; Ovaskainen, Otso; Repka, Sari; Laaksonen, Toni
2016-11-01
The presence of poorly sited wind farms raises concerns for wildlife, including birds of prey. Therefore, there is a need to extend the knowledge of the potential human-wildlife conflicts associated with wind energy. Here, we report on the movements and habitat use of post-fledging satellite-tagged white-tailed eagles in Finland, where wind-energy development is expected to increase in the near future. In particular, we examine the probability of a fledgling approaching a hypothetical turbine that is placed at different distances from the nest. We found that this probability is high at short distances but considerably decreases with increasing distances to the nest. A utilisation-availability analysis showed that the coast was the preferred habitat. We argue that avoiding construction between active nests and the shoreline, as well as adopting the current 2-km buffer zone for turbine deployment, can avoid or minimise potential impacts on post-fledging white-tailed eagles.
Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis
NASA Astrophysics Data System (ADS)
Verendel, Vilhelm; Häggström, Olle
2017-01-01
The Great Filter interpretation of Fermi's great silence asserts that Npq is not a very large number, where N is the number of potentially life-supporting planets in the observable universe, p is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and q is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that N is huge, which implies that pq is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards p not being small and therefore a very small q, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.
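A rough Monte Carlo illustration of the Hanson/Bostrom-style argument, assuming log-uniform priors on p and q and approximating the "great silence" by the cutoff p*q <= 1e-22; the prior ranges, the cutoff, and the p >= 1e-3 "life discovered" condition are all illustrative assumptions, not the paper's actual model or priors:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Assumed log-uniform priors over many orders of magnitude for p and q
log_p = rng.uniform(-30, 0, n)
log_q = rng.uniform(-30, 0, n)

# "Great silence" constraint: with N ~ 1e22 life-supporting planets, require N*p*q << 1,
# approximated here as p*q <= 1e-22 (an illustrative cutoff).
silent = (log_p + log_q) <= -22.0

# Additional evidence that "p is not small" (e.g., independent life is discovered): p >= 1e-3
p_not_small = log_p >= -3.0

q_silent = log_q[silent]
q_silent_and_life = log_q[silent & p_not_small]

print(f"median log10(q) given silence only        : {np.median(q_silent):6.1f}")
print(f"median log10(q) given silence and p>=1e-3 : {np.median(q_silent_and_life):6.1f}")
```

Under these assumptions, conditioning on p not being small pushes the posterior for q several orders of magnitude lower, which is the sense in which such a discovery would be "bad news"; the paper's point is that this conclusion depends critically on the prior chosen.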
Design knowledge capture for the space station
NASA Technical Reports Server (NTRS)
Crouse, K. R.; Wechsler, D. B.
1987-01-01
The benefits of design knowledge availability are identifiable and pervasive. The implementation of design knowledge capture and storage using current technology increases the probability for success, while providing for a degree of access compatibility with future applications. The space station design definition should be expanded to include design knowledge. Design knowledge should be captured. A critical timing relationship exists between the space station development program, and the implementation of this project.
The abolition of war as a goal of environmental policy.
Snyder, Brian F; Ruyle, Leslie E
2017-12-15
Since the 1950s, select military and political leaders have had the capacity to kill all or nearly all human life on Earth. The number of people entrusted with this power grows each year through proliferation and the rise of new political leaders. If humans continue to maintain and develop nuclear weapons, it is highly probable that a nuclear exchange will occur again at some point in the future. This nuclear exchange may or may not annihilate the human species, but it will cause catastrophic effects on the biosphere. The international community has attempted to resolve this existential problem via treaties that control and potentially eliminate nuclear weapons, however, these treaties target only nuclear weapons, leaving the use of war as a normalized means for settling conflict. As long as war exists as a probable future, nations will be under pressure to develop more powerful weapons. Thus, we argue that the elimination of nuclear weapons alone is not a stable, long-term strategy. A far more secure strategy would be the elimination of war as a means of settling international disputes. Therefore, those concerned about environmental sustainability or the survival of the biosphere should work to abolish war. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mani, Amir; Tsai, Frank T. -C.; Kao, Shih-Chieh
Our study introduces a mixed integer linear fractional programming (MILFP) method to optimize conjunctive use of future surface water and groundwater resources under projected climate change scenarios. The conjunctive management model maximizes the ratio of groundwater usage to reservoir water usage. Future inflows to the reservoirs were estimated from the future runoffs projected through hydroclimate modeling considering the Variable Infiltration Capacity model, and 11 sets of downscaled Coupled Model Intercomparison Project phase 5 global climate model projections. Bayesian model averaging was adopted to quantify uncertainty in future runoff projections and reservoir inflow projections due to uncertain future climate projections. Optimized conjunctive management solutions were investigated for a water supply network in northern Louisiana which includes the Sparta aquifer. Runoff projections under climate change scenarios indicate that runoff will likely decrease in winter and increase in other seasons. Ultimately, results from the developed conjunctive management model with MILFP indicate that the future reservoir water, even at 2.5% low inflow cumulative probability level, could counterbalance groundwater pumping reduction to satisfy demands while improving the Sparta aquifer through conditional groundwater head constraint.
Mani, Amir; Tsai, Frank T. -C.; Kao, Shih-Chieh; ...
2016-06-16
Our study introduces a mixed integer linear fractional programming (MILFP) method to optimize conjunctive use of future surface water and groundwater resources under projected climate change scenarios. The conjunctive management model maximizes the ratio of groundwater usage to reservoir water usage. Future inflows to the reservoirs were estimated from the future runoffs projected through hydroclimate modeling considering the Variable Infiltration Capacity model, and 11 sets of downscaled Coupled Model Intercomparison Project phase 5 global climate model projections. Bayesian model averaging was adopted to quantify uncertainty in future runoff projections and reservoir inflow projections due to uncertain future climate projections. Optimized conjunctive management solutions were investigated for a water supply network in northern Louisiana which includes the Sparta aquifer. Runoff projections under climate change scenarios indicate that runoff will likely decrease in winter and increase in other seasons. Ultimately, results from the developed conjunctive management model with MILFP indicate that the future reservoir water, even at 2.5% low inflow cumulative probability level, could counterbalance groundwater pumping reduction to satisfy demands while improving the Sparta aquifer through conditional groundwater head constraint.
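For the continuous relaxation, a linear-fractional objective such as groundwater use divided by reservoir use can be converted to an ordinary LP with the classical Charnes-Cooper transformation. The toy two-source problem below is invented (it is not the Sparta-aquifer model) and omits the integer variables of the full MILFP:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical single-period problem: choose groundwater g and reservoir r withdrawals
# to maximize g / r subject to demand and capacity limits (all values invented).
g_cap, r_cap, demand = 60.0, 80.0, 100.0

# Original constraints in x = (g, r): g <= g_cap, r <= r_cap, g + r >= demand, x >= 0
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, -1.0]])
b = np.array([g_cap, r_cap, -demand])
c = np.array([1.0, 0.0])  # numerator:   groundwater use
d = np.array([0.0, 1.0])  # denominator: reservoir use (assumed > 0 on the feasible set)

# Charnes-Cooper transformation: variables (y, t) with y = t*x and t = 1/(d @ x)
#   maximize c@y  s.t.  A@y - b*t <= 0,  d@y = 1,  y >= 0, t >= 0
A_ub = np.hstack([A, -b.reshape(-1, 1)])
res = linprog(c=np.append(-c, 0.0),            # linprog minimizes, so negate the objective
              A_ub=A_ub, b_ub=np.zeros(len(b)),
              A_eq=[np.append(d, 0.0)], b_eq=[1.0],
              bounds=[(0, None)] * 3)

y, t = res.x[:2], res.x[2]
g, r = y / t                                    # recover the original variables
print(f"Optimal withdrawals: groundwater {g:.1f}, reservoir {r:.1f}, ratio {g / r:.2f}")
```

The full MILFP additionally carries binary decisions (e.g., which sources operate), which is what makes specialized MILFP solution methods necessary.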
Teamwork Training Needs Analysis for Long-Duration Exploration Missions
NASA Technical Reports Server (NTRS)
Smith-Jentsch, Kimberly A.; Sierra, Mary Jane
2016-01-01
The success of future long-duration exploration missions (LDEMs) will be determined largely by the extent to which mission-critical personnel possess and effectively exercise essential teamwork competencies throughout the entire mission lifecycle (e.g., Galarza & Holland, 1999; Hysong, Galarza, & Holland, 2007; Noe, Dachner, Saxton, & Keeton, 2011). To ensure that such personnel develop and exercise these necessary teamwork competencies prior to and over the full course of future LDEMs, it is essential that a teamwork training curriculum be developed and put into place at NASA that is both 1) comprehensive, in that it targets all teamwork competencies critical for mission success and 2) structured around empirically-based best practices for enhancing teamwork training effectiveness. In response to this demand, the current teamwork-oriented training needs analysis (TNA) was initiated to 1) identify the teamwork training needs (i.e., essential teamwork-related competencies) of future LDEM crews, 2) identify critical gaps within NASA’s current and future teamwork training curriculum (i.e., gaps in the competencies targeted and in the training practices utilized) that threaten to impact the success of future LDEMs, and to 3) identify a broad set of practical nonprescriptive recommendations for enhancing the effectiveness of NASA’s teamwork training curriculum in order to increase the probability of future LDEM success.
Recent Advances in Model-Assisted Probability of Detection
NASA Technical Reports Server (NTRS)
Thompson, R. Bruce; Brasche, Lisa J.; Lindgren, Eric; Swindell, Paul; Winfree, William P.
2009-01-01
The increased role played by probability of detection (POD) in structural integrity programs, combined with the significant time and cost associated with the purely empirical determination of POD, provides motivation for alternate means to estimate this important metric of NDE techniques. One approach to make the process of POD estimation more efficient is to complement limited empirical experiments with information from physics-based models of the inspection process or controlled laboratory experiments. The Model-Assisted Probability of Detection (MAPOD) Working Group was formed by the Air Force Research Laboratory, the FAA Technical Center, and NASA to explore these possibilities. Since the 2004 inception of the MAPOD Working Group, 11 meetings have been held in conjunction with major NDE conferences. This paper will review the accomplishments of this group, which includes over 90 members from around the world. Included will be a discussion of strategies developed to combine physics-based and empirical understanding, draft protocols that have been developed to guide application of the strategies, and demonstrations that have been or are being carried out in a number of countries. The talk will conclude with a discussion of future directions, which will include documentation of benefits via case studies, development of formal protocols for engineering practice, as well as a number of specific technical issues.
Long-term volcanic hazard assessment on El Hierro (Canary Islands)
NASA Astrophysics Data System (ADS)
Becerril, L.; Bartolini, S.; Sobradelo, R.; Martí, J.; Morales, J. M.; Galindo, I.
2014-07-01
Long-term hazard assessment, one of the bastions of risk-mitigation programs, is required for land-use planning and for developing emergency plans. To ensure quality and representative results, long-term volcanic hazard assessment requires several sequential steps to be completed, which include the compilation of geological and volcanological information, the characterisation of past eruptions, spatial and temporal probabilistic studies, and the simulation of different eruptive scenarios. Despite being a densely populated active volcanic region that receives millions of visitors per year, no systematic hazard assessment has ever been conducted on the Canary Islands. In this paper we focus our attention on El Hierro, the youngest of the Canary Islands and the most recently affected by an eruption. We analyse the past eruptive activity to determine the spatial and temporal probability, and likely style of a future eruption on the island, i.e. the where, when and how. By studying the past eruptive behaviour of the island and assuming that future eruptive patterns will be similar, we aim to identify the most likely volcanic scenarios and corresponding hazards, which include lava flows, pyroclastic fallout and pyroclastic density currents (PDCs). Finally, we estimate their probability of occurrence. The end result, through the combination of the most probable scenarios (lava flows, pyroclastic density currents and ashfall), is the first qualitative integrated volcanic hazard map of the island.
Dudley, Robert W.; Hodgkins, Glenn A.; Dickinson, Jesse
2017-01-01
We present a logistic regression approach for forecasting the probability of future groundwater levels declining or maintaining below specific groundwater-level thresholds. We tested our approach on 102 groundwater wells in different climatic regions and aquifers of the United States that are part of the U.S. Geological Survey Groundwater Climate Response Network. We evaluated the importance of current groundwater levels, precipitation, streamflow, seasonal variability, Palmer Drought Severity Index, and atmosphere/ocean indices for developing the logistic regression equations. Several diagnostics of model fit were used to evaluate the regression equations, including testing of autocorrelation of residuals, goodness-of-fit metrics, and bootstrap validation testing. The probabilistic predictions were most successful at wells with high persistence (low month-to-month variability) in their groundwater records and at wells where the groundwater level remained below the defined low threshold for sustained periods (generally three months or longer). The model fit was weakest at wells with strong seasonal variability in levels and with shorter duration low-threshold events. We identified challenges in deriving probabilistic-forecasting models and possible approaches for addressing those challenges.
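A minimal sketch of the logistic-regression idea on synthetic data, with invented predictors (current level percentile and a precipitation anomaly) standing in for the covariates evaluated in the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)

# Synthetic monthly record: current groundwater-level percentile and 3-month precipitation anomaly
n = 600
level_pct = rng.uniform(0, 100, n)    # current level percentile (high = wet)
precip_anom = rng.normal(0, 1, n)     # standardized 3-month precipitation anomaly

# Synthetic "truth": next month below the low threshold is more likely when the current
# level is low and recent precipitation is below normal (persistence-driven behavior).
logit = 2.5 - 0.09 * level_pct - 0.8 * precip_anom
below_next_month = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([level_pct, precip_anom])
model = LogisticRegression().fit(X, below_next_month)

# Forecast for a site currently at the 15th percentile after a dry quarter
p = model.predict_proba([[15.0, -1.0]])[0, 1]
print(f"P(level below low threshold next month) = {p:.2f}")
```

The synthetic data are built with strong month-to-month persistence, mirroring the finding that the method works best at wells with high persistence.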
Shen, Fuhai; Yuan, Juxiang; Sun, Zhiqian; Hua, Zhengbing; Qin, Tianbang; Yao, Sanqiao; Fan, Xueyun; Chen, Weihong; Liu, Hongbo; Chen, Jie
2013-01-01
Prior to 1970, coal mining technology and prevention measures in China were poor. Mechanized coal mining equipment and advanced protection measures were continuously installed in the mines after 1970. All these improvements may have resulted in a change in the incidence of coal workers' pneumoconiosis (CWP). Therefore, it is important to identify the characteristics of CWP today and trends for the incidence of CWP in the future. A total of 17,023 coal workers from the Kailuan Colliery Group were studied. A life-table method was used to calculate the cumulative incidence rate of CWP and predict the number of new CWP patients in the future. The probability of developing CWP was estimated by a multilayer perceptron artificial neural network for each coal worker without CWP. The results showed that the cumulative incidence rates of CWP for tunneling, mining, combining, and helping workers were 31.8%, 27.5%, 24.2%, and 2.6%, respectively, during the same observation period of 40 years. It was estimated that there would be 844 new CWP cases among 16,185 coal workers without CWP within their life expectancy. There would be 273.1, 273.1, 227.6, and 69.9 new CWP patients in the next <10, 10-, 20-, and 30- years respectively in the study cohort within their life expectancy. It was identified that coal workers whose risk probabilities were over 0.2 were at high risk for CWP, and whose risk probabilities were under 0.1 were at low risk. The present and future incidence trends of CWP remain high among coal workers. We suggest that coal workers at high risk of CWP undergo a physical examination for pneumoconiosis every year, and the coal workers at low risk of CWP be examined every 5 years.
Assessing risk based on uncertain avalanche activity patterns
NASA Astrophysics Data System (ADS)
Zeidler, Antonia; Fromm, Reinhard
2015-04-01
Avalanches may affect critical infrastructure and may cause great economic losses. The planning horizon of infrastructure, e.g. hydropower generation facilities, reaches well into the future. Based on the results of previous studies on the effect of changing meteorological parameters (precipitation, temperature) on avalanche activity, we assume that the risk pattern will change in the future. Decision makers need to understand what the future might bring in order to formulate their mitigation strategies. Therefore, we explore commercial risk software to calculate risk for the coming years, which might help in decision processes. The software @risk is known to many larger companies, and we therefore explore its capability to include avalanche risk simulations in order to guarantee comparability of different risks. In a first step, we develop a model for a hydropower generation facility that reflects the problem of changing avalanche activity patterns in the future by selecting relevant input parameters and assigning likely probability distributions. The uncertain input variables include the probability of avalanches affecting an object, the vulnerability of an object, the expected cost of repairing the object, and the expected cost due to interruption. The crux is to find the distribution that best represents the input variables under changing meteorological conditions. Our focus is on including the uncertain probability of avalanches based on the analysis of past avalanche data and expert knowledge. In order to explore different likely outcomes, we base the analysis on three different climate scenarios (likely, worst case, baseline). For some variables it is possible to fit a distribution to historical data, whereas in cases where the past dataset is insufficient or unavailable the software allows selection from over 30 different distribution types. The Monte Carlo simulation draws from the probability distributions of the uncertain variables, combining sampled input values to simulate the range of possible outcomes. In our case the output is the expected risk (Euro/year) for each object considered (e.g. a water intake) and for the entire hydropower generation system. The output is again a distribution that is interpreted by the decision makers, as the final strategy depends on the needs and requirements of the end-user, which may be driven by personal preferences. In this presentation, we will show how we used the uncertain information on future avalanche activity in commercial risk software, thereby bringing the knowledge of natural hazard experts to decision makers.
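A minimal Monte Carlo sketch of the per-object risk calculation described above; the distributions and cost figures are invented stand-ins for the @risk inputs:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000  # Monte Carlo iterations

# Hypothetical inputs for one object (e.g., a water intake); all distributions are assumptions
p_hit = rng.beta(2, 48, n)                         # annual probability an avalanche reaches the object
vulnerability = rng.triangular(0.1, 0.4, 0.9, n)   # fraction of value damaged if hit
repair_cost = rng.lognormal(np.log(2e5), 0.5, n)   # Euro
interruption_cost = rng.lognormal(np.log(1e5), 0.8, n)  # Euro, production loss during downtime

annual_risk = p_hit * vulnerability * (repair_cost + interruption_cost)  # Euro/year

mean = annual_risk.mean()
p5, p95 = np.percentile(annual_risk, [5, 95])
print(f"Expected annual risk: {mean:,.0f} Euro/year (5th-95th percentile {p5:,.0f} to {p95:,.0f})")
```

Summing such per-object distributions over all objects gives the system-level risk distribution that the decision makers interpret.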
Convergence of the Transition Probability Matrix in CLV-Markov Models
NASA Astrophysics Data System (ADS)
Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.
2018-04-01
A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its long-run behavior, which is derived from a property of the n-step transition probability matrix: the convergence of the n-step transition matrix as n tends to infinity. Mathematically, this convergence means finding the limit of the transition matrix raised to the power n as n tends to infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find this convergence is the limiting-distribution approach. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept of linear algebra, namely diagonalization of the matrix. This method has a higher level of complexity because the matrix must be diagonalized, but it has the advantage of yielding a general form for the nth power of the transition probability matrix, which is useful for examining the transition matrix before it becomes stationary. Example cases are taken from a CLV model based on an MCM, called the CLV-Markov model. Several transition probability matrices from this model are analyzed to find their convergent forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
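A minimal sketch of the diagonalization route with an invented 3-state transition matrix (not the CLV matrices from the paper): writing P = V D V^{-1} gives P^n = V D^n V^{-1}, and as n grows the rows converge to the stationary distribution.

```python
import numpy as np

# An invented 3-state transition matrix (rows sum to 1), standing in for a CLV-Markov example
P = np.array([[0.70, 0.20, 0.10],
              [0.30, 0.50, 0.20],
              [0.10, 0.30, 0.60]])

# Diagonalize: P = V diag(w) V^{-1}, hence P^n = V diag(w**n) V^{-1}
w, V = np.linalg.eig(P)
V_inv = np.linalg.inv(V)

def P_power(n):
    return (V @ np.diag(w ** n) @ V_inv).real

for n in (1, 5, 20, 100):
    print(f"n = {n}\n{np.round(P_power(n), 4)}\n")

# The rows converge to the stationary distribution pi, which satisfies pi P = pi
w_left, U = np.linalg.eig(P.T)
pi = U[:, np.argmin(np.abs(w_left - 1.0))].real
pi /= pi.sum()
print("Stationary distribution:", np.round(pi, 4))
```

The eigenvalue equal to 1 carries the stationary behavior; the remaining eigenvalues, all smaller than 1 in magnitude, control how quickly the pre-stationary transition matrix decays toward it.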
A Mobile Decision Aid for Determining Detection Probabilities for Acoustic Targets
2002-08-01
… propagation mobile application. The Personal Computer Memory Card International Association, an organization of some 500 companies that has developed a … SENSOR: Human … possible outputs, it was felt that for a mobile application the interface and number of output parameters should be kept simple … the value could be computed on the server and transmitted back to the mobile application for display. FUTURE CAPABILITIES: 2-D/3-D displays of the full ABFA …
NASA Technical Reports Server (NTRS)
Whyte, A. A.
1978-01-01
A survey of the internal and external reporting and recordkeeping procedures of these programs was conducted and the major problems associated with them are outlined. The impact of probable future requirements on existing information systems is evaluated. This report also presents the benefits of combining the safety and health information systems into one computerized system and recommendations for the development and scope of that system.
Cost/benefit analysis of advanced materials technologies for future aircraft turbine engines
NASA Technical Reports Server (NTRS)
Bisset, J. W.
1976-01-01
The cost/benefits of advanced commercial gas turbine materials are described. Development costs, estimated payoffs and probabilities of success are discussed. The materials technologies investigated are: (1) single crystal turbine blades, (2) high strength hot isostatic pressed turbine disk, (3) advanced oxide dispersion strengthened burner liner, (4) bore entry cooled hot isostatic pressed turbine disk, (5) turbine blade tip - outer airseal system, and (6) advanced turbine blade alloys.
The Scent of the Future: Manned Space Travel and the Soviet Union.
1981-06-01
… and Economic Applications … Greenhouses, Boosters, and Space Planes: Soviet Space-Related Research and Development … R.U.R. Revisited: Manned Versus … a greenhouse that was part of their 12-square-meter closed environment. The successful conclusion of this test demonstrated the feasibility of a manned … will probably be timed to coincide with the XXVI Party Congress, which convenes in February 1981.
The Effects of the Previous Outcome on Probabilistic Choice in Rats
Marshall, Andrew T.; Kirkpatrick, Kimberly
2014-01-01
This study examined the effects of previous outcomes on subsequent choices in a probabilistic-choice task. Twenty-four rats were trained to choose between a certain outcome (1 or 3 pellets) versus an uncertain outcome (3 or 9 pellets), delivered with a probability of .1, .33, .67, and .9 in different phases. Uncertain outcome choices increased with the probability of uncertain food. Additionally, uncertain choices increased with the probability of uncertain food following both certain-choice outcomes and unrewarded uncertain choices. However, following uncertain-choice food outcomes, there was a tendency to choose the uncertain outcome in all cases, indicating that the rats continued to “gamble” after successful uncertain choices, regardless of the overall probability or magnitude of food. A subsequent manipulation, in which the probability of uncertain food varied within each session as a function of the previous uncertain outcome, examined how the previous outcome and probability of uncertain food affected choice in a dynamic environment. Uncertain-choice behavior increased with the probability of uncertain food. The rats exhibited increased sensitivity to probability changes and a greater degree of win–stay/lose–shift behavior than in the static phase. Simulations of two sequential choice models were performed to explore the possible mechanisms of reward value computations. The simulation results supported an exponentially decaying value function that updated as a function of trial (rather than time). These results emphasize the importance of analyzing global and local factors in choice behavior and suggest avenues for the future development of sequential-choice models. PMID:23205915
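A minimal sketch of the kind of trial-based, exponentially decaying value update that the sequential-choice simulations compare; the learning rate, reward magnitudes, and the epsilon-greedy choice rule here are illustrative assumptions, not the authors' fitted model.

```python
import random

def simulate_session(n_trials=360, alpha=0.1, p_uncertain=0.33,
                     certain_reward=3, uncertain_reward=9, epsilon=0.1):
    """Delta-rule value update per trial: V <- V + alpha * (outcome - V)."""
    v = {"certain": 0.0, "uncertain": 0.0}
    n_uncertain_choices = 0
    for _ in range(n_trials):
        if random.random() < epsilon:                      # occasional exploration
            choice = random.choice(["certain", "uncertain"])
        else:                                              # otherwise pick the higher-valued option
            choice = max(v, key=v.get)
        if choice == "uncertain":
            outcome = uncertain_reward if random.random() < p_uncertain else 0
            n_uncertain_choices += 1
        else:
            outcome = certain_reward
        v[choice] += alpha * (outcome - v[choice])         # exponentially decaying memory of outcomes
    return n_uncertain_choices / n_trials, v

print(simulate_session(p_uncertain=0.1))   # low probability of uncertain food
print(simulate_session(p_uncertain=0.9))   # high probability of uncertain food
```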
Maximizing the Detection Probability of Kilonovae Associated with Gravitational Wave Observations
NASA Astrophysics Data System (ADS)
Chan, Man Leong; Hu, Yi-Ming; Messenger, Chris; Hendry, Martin; Heng, Ik Siong
2017-01-01
Estimates of the source sky location for gravitational wave signals are likely to span areas of up to hundreds of square degrees or more, making it very challenging for most telescopes to search for counterpart signals in the electromagnetic spectrum. To boost the chance of successfully observing such counterparts, we have developed an algorithm that optimizes the number of observing fields and their corresponding time allocations by maximizing the detection probability. As a proof-of-concept demonstration, we optimize follow-up observations targeting kilonovae using telescopes including the CTIO-Dark Energy Camera, Subaru-HyperSuprimeCam, Pan-STARRS, and the Palomar Transient Factory. We consider three simulated gravitational wave events with 90% credible error regions spanning areas from ∼30 deg² to ∼300 deg². Assuming a source at 200 Mpc, we demonstrate that to obtain a maximum detection probability, there is an optimized number of fields for any particular event that a telescope should observe. To inform future telescope design studies, we present the maximum detection probability and corresponding number of observing fields for a combination of limiting magnitudes and fields of view over a range of parameters. We show that for large gravitational wave error regions, telescope sensitivity rather than field of view is the dominating factor in maximizing the detection probability.
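A simplified sketch of the optimization idea: split a fixed amount of observing time evenly among the k most probable sky-map tiles and pick the k that maximizes the detection probability. The tile probabilities and the exposure-time efficiency curve below are invented placeholders, not the paper's telescope models.

```python
import numpy as np

def best_field_count(tile_probs, total_time, detect_efficiency):
    """Return the number of fields k (and resulting detection probability) that
    maximizes the sum of the top-k tile probabilities times the per-field
    detection efficiency achievable with an exposure of total_time / k."""
    probs = np.sort(np.asarray(tile_probs))[::-1]
    best = (1, 0.0)
    for k in range(1, len(probs) + 1):
        p = probs[:k].sum() * detect_efficiency(total_time / k)
        if p > best[1]:
            best = (k, p)
    return best

rng = np.random.default_rng(0)
tiles = rng.dirichlet(np.ones(50)) * 0.9              # a 90% credible region split into 50 tiles
efficiency = lambda t: 1.0 - np.exp(-t / 600.0)       # placeholder depth-vs-exposure curve (t in s)
print(best_field_count(tiles, total_time=3600.0, detect_efficiency=efficiency))
```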
Rationale for continuing R&D in indirect coal liquefaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.; Tomlinson, G.
1995-12-31
The objective of this analysis is to use the world energy demand/supply model developed at MITRE to examine future liquid fuels supply scenarios both for the world and for the United States. This analysis has determined the probable extent of future oil resource shortages and the likely time frame in which the shortages will occur. The role that coal liquefaction could play in helping to alleviate this liquid fuels shortfall is also examined. The importance of continuing R&D to improve process performance and reduce the costs of coal-derived transportation fuel is quantified in terms of reducing the time when coal liquids will become competitive with petroleum.
Spatial and temporal variability in rates of landsliding in seismically active mountain ranges
NASA Astrophysics Data System (ADS)
Parker, R.; Petley, D.; Rosser, N.; Densmore, A.; Gunasekera, R.; Brain, M.
2012-04-01
Where earthquake and precipitation driven disasters occur in steep, mountainous regions, landslides often account for a large proportion of the associated damage and losses. This research addresses spatial and temporal variability in rates of landslide occurrence in seismically active mountain ranges as a step towards developing better regional scale prediction of losses in such events. In the first part of this paper we attempt to explain reductively the variability in spatial rates of landslide occurrence, using data from five major earthquakes. This is achieved by fitting a regression-based conditional probability model to spatial probabilities of landslide occurrence, using as predictor variables proxies for spatial patterns of seismic ground motion and modelled hillslope stability. A combined model for all earthquakes performs well in hindcasting spatial probabilities of landslide occurrence as a function of readily-attainable spatial variables. We present validation of the model and demonstrate the extent to which it may be applied globally to derive landslide probabilities for future earthquakes. In part two we examine the temporal behaviour of rates of landslide occurrence. This is achieved through numerical modelling to simulate the behaviour of a hypothetical landscape. The model landscape is composed of hillslopes that continually weaken, fail and reset in response to temporally-discrete forcing events that represent earthquakes. Hillslopes with different geometries require different amounts of weakening to fail, such that they fail and reset at different temporal rates. Our results suggest that probabilities of landslide occurrence are not temporally constant, but rather vary with time, irrespective of changes in forcing event magnitudes or environmental conditions. Various parameters influencing the magnitude and temporal patterns of this variability are identified, highlighting areas where future research is needed. This model has important implications for landslide hazard and risk analysis in mountain areas as existing techniques usually assume that susceptibility to failure does not change with time.
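A generic sketch of a regression-based conditional probability model of this kind, fitting landslide presence/absence per grid cell to a ground-motion proxy and a slope proxy. The predictor variables, values, and use of scikit-learn's logistic regression are illustrative assumptions, not the study's actual model form or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training table: one row per grid cell from mapped earthquakes.
# Columns: peak-ground-acceleration proxy, hillslope gradient (degrees).
X = np.array([[0.8, 35.0], [0.6, 28.0], [0.2, 12.0], [0.9, 40.0],
              [0.3, 20.0], [0.7, 33.0], [0.1,  8.0], [0.5, 25.0]])
y = np.array([1, 1, 0, 1, 0, 1, 0, 0])        # 1 = landsliding observed in the cell

model = LogisticRegression().fit(X, y)

# Hindcast / forecast: spatial probability of landslide occurrence in new cells.
new_cells = np.array([[0.75, 30.0], [0.15, 10.0]])
print(model.predict_proba(new_cells)[:, 1])
```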
A robust method to forecast volcanic ash clouds
Denlinger, Roger P.; Pavolonis, Mike; Sieglaff, Justin
2012-01-01
Ash clouds emanating from volcanic eruption columns often form trails of ash extending thousands of kilometers through the Earth's atmosphere, disrupting air traffic and posing a significant hazard to air travel. To mitigate such hazards, the community charged with reducing flight risk must accurately assess risk of ash ingestion for any flight path and provide robust forecasts of volcanic ash dispersal. In response to this need, a number of different transport models have been developed for this purpose and applied to recent eruptions, providing a means to assess uncertainty in forecasts. Here we provide a framework for optimal forecasts and their uncertainties given any model and any observational data. This involves random sampling of the probability distributions of input (source) parameters to a transport model and iteratively running the model with different inputs, each time assessing the predictions that the model makes about ash dispersal by direct comparison with satellite data. The results of these comparisons are embodied in a likelihood function whose maximum corresponds to the minimum misfit between model output and observations. Bayes theorem is then used to determine a normalized posterior probability distribution and from that a forecast of future uncertainty in ash dispersal. The nature of ash clouds in heterogeneous wind fields creates a strong maximum likelihood estimate in which most of the probability is localized to narrow ranges of model source parameters. This property is used here to accelerate probability assessment, producing a method to rapidly generate a prediction of future ash concentrations and their distribution based upon assimilation of satellite data as well as model and data uncertainties. Applying this method to the recent eruption of Eyjafjallajökull in Iceland, we show that the 3 and 6 h forecasts of ash cloud location probability encompassed the location of observed satellite-determined ash cloud loads, providing an efficient means to assess all of the hazards associated with these ash clouds.
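A schematic of the Bayesian step described above: each transport-model run, drawn from the source-parameter distributions, is weighted by a Gaussian likelihood of its misfit to the satellite-retrieved ash loads, and the normalized weights form the posterior used to forecast dispersal. The observational error model and array shapes are assumptions for illustration.

```python
import numpy as np

def posterior_weights(simulated_loads, observed_load, sigma_obs, prior=None):
    """Normalized posterior weights for an ensemble of transport-model runs.

    simulated_loads: (n_runs, n_pixels) ash column loads from sampled source parameters
    observed_load:   (n_pixels,) satellite-retrieved ash column loads
    sigma_obs:       assumed combined observation/model standard deviation
    """
    n_runs = simulated_loads.shape[0]
    prior = np.full(n_runs, 1.0 / n_runs) if prior is None else np.asarray(prior)
    # Gaussian likelihood of each run, evaluated in log space for numerical stability.
    misfit = 0.5 * np.sum((simulated_loads - observed_load) ** 2, axis=1) / sigma_obs ** 2
    log_post = np.log(prior) - misfit
    log_post -= log_post.max()
    w = np.exp(log_post)
    return w / w.sum()

# A probabilistic forecast of future ash concentration is then the
# posterior-weighted combination of each run's forecast field.
```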
Elwood L. Shafer; George H. Moeller; Russell E. Getty
1974-01-01
As an aid to policy- and decision-making about future environmental problems, a panel of experts was asked to predict the probabilities of future events associated with natural-resource management, wildland-recreation management, environmental pollution, population-workforce-leisure, and urban environments. Though some of the predictions projected to the year 2050 may...
NASA Astrophysics Data System (ADS)
Amine, Lagheryeb; Zouhair, Benkhaldoun; Jonathan, Makela; Mohamed, Kaab; Aziza, Bounhir; Brian, Hardin; Dan, Fisher; Tmuthy, Duly
2016-04-01
We analyze the seasonal variation of equatorial plasma bubble (EPB) occurrence using 630.0 nm airglow images collected by the PICASSO imager deployed at the Oukkaimden observatory in Morocco. Data have been taken from November 2013 to December 2015. We show the monthly average occurrence of EPBs. A maximum probability for bubble development is seen in the data in January and between late February and early March. We also observe periods of maximum appearance in which the plasma bubbles are seen on 3-5 successive nights, and we discuss their connection with solar activity during storm time. Future analysis will compare the probability of bubble occurrence at our site with data from other observation sites.
Psychophysics of the probability weighting function
NASA Astrophysics Data System (ADS)
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), with w(0) = 0 and w(1) = 1.
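A small sketch of the Prelec weighting function as reconstructed above; the value of α is an arbitrary illustrative choice.

```python
import numpy as np

def prelec_weight(p, alpha=0.65):
    """Prelec (1998) probability weighting function w(p) = exp(-(-ln p)**alpha),
    with the conventions w(0) = 0 and w(1) = 1."""
    p = np.asarray(p, dtype=float)
    safe_p = np.clip(p, 1e-300, 1.0)
    return np.where(p > 0.0, np.exp(-(-np.log(safe_p)) ** alpha), 0.0)

# Small probabilities are over-weighted, large probabilities under-weighted.
print(prelec_weight([0.01, 0.1, 0.5, 0.9, 0.99]))
```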
Schlaepfer, Daniel R.; Taylor, Kyle A.; Pennington, Victoria E.; Nelson, Kellen N.; Martin, Trace E.; Rottler, Caitlin M.; Lauenroth, William K.; Bradford, John B.
2015-01-01
Many semi-arid plant communities in western North America are dominated by big sagebrush. These ecosystems are being reduced in extent and quality due to economic development, invasive species, and climate change. These pervasive modifications have generated concern about the long-term viability of sagebrush habitat and sagebrush-obligate wildlife species (notably greater sage-grouse), highlighting the need for better understanding of the future big sagebrush distribution, particularly at the species' range margins. These leading and trailing edges of potential climate-driven sagebrush distribution shifts are likely to be areas most sensitive to climate change. We used a process-based regeneration model for big sagebrush, which simulates potential germination and seedling survival in response to climatic and edaphic conditions, and tested expectations about current and future regeneration responses at trailing and leading edges that were previously identified using traditional species distribution models. Our results confirmed expectations of increased probability of regeneration at the leading edge and decreased probability of regeneration at the trailing edge below current levels. Our simulations indicated that soil water dynamics at the leading edge became more similar to the typical seasonal ecohydrological conditions observed within the current range of big sagebrush ecosystems. At the trailing edge, an increased winter and spring dryness represented a departure from conditions typically supportive of big sagebrush. Our results highlighted that minimum and maximum daily temperatures as well as soil water recharge and summer dry periods are important constraints for big sagebrush regeneration. Overall, our results confirmed previous predictions, i.e., we see consistent changes in areas identified as trailing and leading edges; however, we also identified potential local refugia within the trailing edge, mostly at sites at higher elevation. Decreasing regeneration probability at the trailing edge underscores the potential futility of efforts to preserve and/or restore big sagebrush in these areas. Conversely, increasing regeneration probability at the leading edge suggests a growing potential for conflicts in management goals between maintaining existing grasslands by preventing sagebrush expansion versus accepting a shift in plant community composition to sagebrush dominance.
Constructing alternative futures
David N. Wear; Robert Huggett; John G. Greis
2013-01-01
The desired product of the Southern Forest Futures Project is a mechanism that will help southerners think about and prepare for future changes in their forests and the benefits they provide. Because any single projection of the world's (or a region's) biological, physical, and social systems has a high probability of being incorrect, the Futures Project instead...
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2016-04-01
Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
A satellite data terminal for land mobile use
NASA Technical Reports Server (NTRS)
Sutherland, Colin A.
1990-01-01
Telesat Mobile Incorporated (TMI) has recently introduced the Mobile Data Service (MDS) into Canada. This paper outlines the system design and some key aspects of the detailed design of the Mobile Earth Terminal (MET) developed by Canadian Aeronautics Limited (CAL) for use with the MDS. The technical requirements for the MET are outlined and the equipment architecture is described. The major design considerations for each functional module are then addressed. Environmental conditions unique to the land mobile service are highlighted, along with the measures taken to ensure satisfactory operation and survival of the MET. Finally, the probable direction of future developments is indicated.
Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds
Conway, C.J.; Gibbs, J.P.
2011-01-01
Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.
Seismic hazard assessment over time: Modelling earthquakes in Taiwan
NASA Astrophysics Data System (ADS)
Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting
2017-04-01
To assess the seismic hazard with temporal change in Taiwan, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters from the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in Southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published model. It thus offers decision-makers and public officials an adequate basis for rapid evaluations of and response to future emergency scenarios such as victim relocation and sheltering.
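A hedged sketch of the long-term, time-dependent part of such a model: the conditional probability that a fault with a BPT (inverse Gaussian) recurrence distribution ruptures within a forecast window, given the time elapsed since its last rupture. The recurrence interval, aperiodicity, and elapsed time below are invented rather than TEM parameters, and the short-term Coulomb-stress adjustment is not included.

```python
import numpy as np
from scipy.stats import norm

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence interval mu and aperiodicity alpha."""
    lam = mu / alpha ** 2
    t = np.asarray(t, dtype=float)
    a = np.sqrt(lam / t)
    return norm.cdf(a * (t / mu - 1.0)) + np.exp(2.0 * lam / mu) * norm.cdf(-a * (t / mu + 1.0))

def conditional_rupture_prob(elapsed, window, mu, alpha):
    """P(rupture in the next `window` years | no rupture for `elapsed` years)."""
    num = bpt_cdf(elapsed + window, mu, alpha) - bpt_cdf(elapsed, mu, alpha)
    den = 1.0 - bpt_cdf(elapsed, mu, alpha)
    return num / den

# Illustrative fault, not a TEM source: mean recurrence 250 yr, aperiodicity 0.5,
# 180 yr elapsed since the last rupture, 50-yr forecast window.
print(conditional_rupture_prob(elapsed=180.0, window=50.0, mu=250.0, alpha=0.5))
```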
Impact risk assessment and planetary defense mission planning for asteroid 2015 PDC
NASA Astrophysics Data System (ADS)
Vardaxis, George; Sherman, Peter; Wie, Bong
2016-05-01
In this paper, an integrated utilization of analytic keyhole theory, B-plane mapping, and planetary encounter geometry, augmented by direct numerical simulation, is shown to be useful in determining the impact risk of an asteroid with the Earth on a given encounter, as well as on potential future encounters via keyhole passages. The accurate estimation of the impact probability of hazardous asteroids is extremely important for planetary defense mission planning. Asteroids in Earth resonant orbits are particularly troublesome because of the continuous threat they pose in the future. Based on the trajectories of the asteroid and the Earth, feasible mission trajectories can be found to mitigate the impact threat of hazardous asteroids. In order to try to ensure mission success, trajectories are judged based on initial and final mission design parameters that would make the mission easier to complete. Given the potential of a short-warning time scenario, a disruption mission considered in this paper occurs approximately one year prior to the anticipated impact date. Expanding upon the established theory, a computational method is developed to estimate the impact probability of the hazardous asteroid, in order to assess the likelihood of an event, and then investigate the fragmentation of the asteroid due to a disruption mission and analyze its effects on the current and future encounters of the fragments with Earth. A fictional asteroid, designated as 2015 PDC and created as an example asteroid risk exercise for the 2015 Planetary Defence Conference, is used as a reference target asteroid to demonstrate the effectiveness and applicability of computational tools being developed for impact risk assessment and planetary defense mission planning for a hazardous asteroid or comet.
Developing our leaders in the future.
Hackett, M; Spurgeon, P
1998-01-01
The role of the chief executive in a transformed organisation is an extremely challenging one. The development of vision, building a commitment to it and communicating it constantly are key skills for a chief executive. However, the need to build and empower the stakeholders within and outside the organisation to support the changes required to deliver the vision requires leaders who can connect with a wide range of people and build alliances and partnerships to secure organisational success. A passion for understanding human intervention and behaviour is needed to encourage, cajole and drive teams and individuals to own and commit to change and a new direction. This requires leaders who have imagination and creativity--who seek connections and thread them together to create order out of incoherence. These skills are not taught in schools or textbooks, but are probably innate. They are what separate leaders from the rest. These skills need to be developed. A movement towards encouraging experimentation, career transfers and more individuality is needed if capable leaders of the future are to appear.
Comonotonic bounds on the survival probabilities in the Lee-Carter model for mortality projection
NASA Astrophysics Data System (ADS)
Denuit, Michel; Dhaene, Jan
2007-06-01
In the Lee-Carter framework, future survival probabilities are random variables with an intricate distribution function. In large homogeneous portfolios of life annuities, value-at-risk or conditional tail expectation of the total yearly payout of the company are approximately equal to the corresponding quantities involving random survival probabilities. This paper aims to derive some bounds in the increasing convex (or stop-loss) sense on these random survival probabilities. These bounds are obtained with the help of comonotonic upper and lower bounds on sums of correlated random variables.
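For context, a small simulation illustrating why the future survival probabilities in a Lee-Carter setting are random variables (ln m(x,t) = a_x + b_x k_t, with the period index k_t a random walk with drift); the comonotonic bounds of the paper are bounds on functionals of exactly such quantities. All parameter values below are illustrative, not fitted to any mortality data set.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_survival_probs(a_x, b_x, k_last, drift, sigma, horizon, n_sims=10_000):
    """Simulate random future one-year survival probabilities for a single age x
    under a Lee-Carter model with k_t following a random walk with drift."""
    shocks = rng.normal(drift, sigma, size=(n_sims, horizon))
    k_paths = k_last + np.cumsum(shocks, axis=1)     # future period index paths
    m = np.exp(a_x + b_x * k_paths)                  # central death rates, (n_sims, horizon)
    return np.exp(-m)                                # survival probabilities are random variables

p = simulate_survival_probs(a_x=-4.5, b_x=0.08, k_last=-10.0,
                            drift=-0.4, sigma=0.6, horizon=20)
print(p.mean(axis=0)[:5])                            # mean survival probabilities, first 5 years
print(np.quantile(p[:, -1], [0.05, 0.95]))           # spread at the 20-year horizon
```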
GREMEX- GODDARD RESEARCH AND ENGINEERING MANAGEMENT EXERCISE SIMULATION SYSTEM
NASA Technical Reports Server (NTRS)
Vaccaro, M. J.
1994-01-01
GREMEX is a man-machine management simulation game of a research and development project. It can be used to depict a project from just after the development of the project plan through the final construction phase. The GREMEX computer programs are basically a program evaluation and review technique (PERT) reporting system. In the usual PERT program, the operator inputs each month the amount of work performed on each activity and the computer does the bookkeeping to determine the expected completion date of the project. GREMEX automatically assumes that all activities due to be worked in the current month will be worked. GREMEX predicts new durations (and costs) each month based on management actions taken by the players and the contractor's abilities. Each activity is assigned the usual cost and duration estimates but must also be assigned three parameters that relate to the probability that the time estimate is correct, the probability that the cost estimate is correct, and the probability of technical success. Management actions usually can be expected to change these probabilities. For example, use of overtime or double shifts in research and development work will decrease duration and increase cost by known proportions and will also decrease the probability of technical success due to an increase in the likelihood of accidents or mistakes. This re-estimation of future events and assignment of probability factors gives life to the model. GREMEX is not a production job for project management. GREMEX is a game that can be used to train management personnel in the administration of research and development type projects. GREMEX poses no 'best way' to manage a project. The emphasis of GREMEX is to expose participants to many of the factors involved in decision making when managing a project in a government research and development environment. A management team can win the game by surpassing cost, schedule, and technical performance goals established when the simulation began. The serious management experimenter can use GREMEX to explore the results of management methods they could not risk in real life. GREMEX can operate with any research and development type project with up to 15 subcontractors and produces reports simulating monthly or quarterly updates of the project PERT network. Included with the program is a data deck for simulation of a fictitious spacecraft project. Instructions for substituting other projects are also included. GREMEX is written in FORTRAN IV for execution in the batch mode and has been implemented on an IBM 360 with a central memory requirement of approximately 350K (decimal) 8-bit bytes. The GREMEX system was developed in 1973.
Evans, Jeffrey S; Kiesecker, Joseph M
2014-01-01
Global demand for energy has increased by more than 50 percent in the last half-century, and a similar increase is projected by 2030. This demand will increasingly be met with alternative and unconventional energy sources. Development of these resources causes disturbances that strongly impact terrestrial and freshwater ecosystems. The Marcellus Shale gas play covers more than 160,934 km(2) in an area that provides drinking water for over 22 million people in several of the largest metropolitan areas in the United States (e.g. New York City, Washington DC, Philadelphia & Pittsburgh). Here we created probability surfaces representing development potential of wind and shale gas for portions of six states in the Central Appalachians. We used these predictions and published projections to model future energy build-out scenarios to quantify future potential impacts on surface drinking water. Our analysis predicts up to 106,004 new wells and 10,798 new wind turbines resulting up to 535,023 ha of impervious surface (3% of the study area) and upwards of 447,134 ha of impacted forest (2% of the study area). In light of this new energy future, mitigating the impacts of energy development will be one of the major challenges in the coming decades.
ERIC Educational Resources Information Center
Idaho State Library, Boise.
In 1998, Idahoans gathered in a series of six Regional Futures Conferences to identify what they thought was probable during the next ten years, what was possible for libraries to do and be, and what a preferred future of Idaho libraries might be. Participants from the regional conferences then convened to refine and focus descriptions of the…
The Future Outlook for School Facilities Planning and Design.
ERIC Educational Resources Information Center
Brubaker, C. William
School design is influenced by four major factors: the education program, the community site, education technology, and building technology. Schools of the future are discussed in relation to the factors affecting school design. It is probable that future schools will be involved in a broader spectrum of programs and will serve a more diverse…
NASA Astrophysics Data System (ADS)
Falatkova, Kristyna; Schöner, Wolfgang; Häusler, Hermann; Reisenhofer, Stefan; Neureiter, Anton; Sobr, Miroslav; Jansky, Bohumir
2017-04-01
Mountain glacier retreat has a well-known impact on the life of the local population - besides anxiety over water supply for agriculture, industry, or households, it has a direct influence on the occurrence of glacier hazards. The paper focuses specifically on lake outburst hazard and aims to describe the previous and future development of the Adygine glacier complex and identify its relationship to the hazard. The observed glacier is situated in the Northern Tien Shan, with an area of 4 km2, northern exposition, and an elevation range of 3,500-4,200 m a.s.l. The glacier ranks among small-sized glaciers, so we expect it to respond faster to changes in the climate than larger ones. Below the glacier there is a three-level cascade of proglacial lakes at different stages of development. The site has been observed sporadically since the 1960s; closer study has been carried out since 2007. Past development of the glacier-lake complex is analyzed by a combination of satellite imagery interpretation and on-site measurements (geodetic and bathymetric surveys). A glacier mass balance model is used to simulate future development of the glacier under climate scenarios. We use the simulated future glacier extent and the glacier base topography provided by a GPR survey to assess the potential for future lake formation. This enables us to assess the outburst hazard for the three selected lakes, with an outlook on possible or probable hazard changes linked to further development of the complex under climate change scenarios. Considering the proximity of the capital Bishkek, spreading settlements, and increased demand for tourism-related infrastructure within the main valley, it is highly important to identify the present and possible future hazards that could affect this region.
Land use planning and wildfire: development policies influence future probability of housing loss
Syphard, Alexandra D.; Massada, Avi Bar; Butsic, Van; Keeley, Jon E.
2013-01-01
Increasing numbers of homes are being destroyed by wildfire in the wildland-urban interface. With projections of climate change and housing growth potentially exacerbating the threat of wildfire to homes and property, effective fire-risk reduction alternatives are needed as part of a comprehensive fire management plan. Land use planning represents a shift in traditional thinking from trying to eliminate wildfires, or even increasing resilience to them, toward avoiding exposure to them through the informed placement of new residential structures. For land use planning to be effective, it needs to be based on solid understanding of where and how to locate and arrange new homes. We simulated three scenarios of future residential development and projected landscape-level wildfire risk to residential structures in a rapidly urbanizing, fire-prone region in southern California. We based all future development on an econometric subdivision model, but we varied the emphasis of subdivision decision-making based on three broad and common growth types: infill, expansion, and leapfrog. Simulation results showed that decision-making based on these growth types, when applied locally for subdivision of individual parcels, produced substantial landscape-level differences in pattern, location, and extent of development. These differences in development, in turn, affected the area and proportion of structures at risk from burning in wildfires. Scenarios with lower housing density and larger numbers of small, isolated clusters of development, i.e., resulting from leapfrog development, were generally predicted to have the highest predicted fire risk to the largest proportion of structures in the study area, and infill development was predicted to have the lowest risk. These results suggest that land use planning should be considered an important component to fire risk management and that consistently applied policies based on residential pattern may provide substantial benefits for future risk reduction.
NASA Technical Reports Server (NTRS)
Silva, P. M.; Silva, I. M.
1974-01-01
Various methods presently used for the dissemination of time at several levels of precision are described along with future projects in the field. Different aspects of time coordination are reviewed and a list of future laboratories participating in a National Time Scale will be presented. A Brazilian Atomic Time Scale will be obtained from as many of these laboratories as possible. The problem of intercomparison between the Brazilian National Time Scale and the International one will be presented and probable solutions will be discussed. Needs related to the TV Line-10 method will be explained and comments will be made on the legal aspects of time dissemination throughout the country.
The past and future of food stocks
NASA Astrophysics Data System (ADS)
Laio, Francesco; Ridolfi, Luca; D'Odorico, Paolo
2016-03-01
Human societies rely on food reserves and the importation of agricultural goods as means to cope with crop failures and associated food shortage. While food trade has been the subject of intensive investigations in recent years, food reserves remain poorly quantified. It is unclear how food stocks are changing and whether they are declining. In this study we use food stock records for 92 products to reconstruct 50 years of aggregated food reserves, expressed in caloric equivalent (kcal), at the regional and global scales. A detailed statistical analysis demonstrates that the overall regional and global per-capita food stocks are stationary, challenging a widespread impression that food reserves are shrinking. We develop a statistically-sound stochastic representation of stock dynamics and take the stock-halving probability as a measure of the natural variability of the process. We find that there is a 20% probability that the global per-capita stocks will be halved by 2050. There are, however, some strong regional differences: Western Europe and the region encompassing North Africa and the Middle East have smaller halving probabilities and smaller per-capita stocks, while North America and Oceania have greater halving probabilities and greater per-capita stocks than the global average. Africa exhibits low per-capita stocks and relatively high probability of stock halving by 2050, which reflects a state of higher food insecurity in this continent.
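A toy version of the halving-probability calculation, treating the (stationary) per-capita stock as a driftless random walk in log space and counting the Monte Carlo paths that fall to half of today's level at some point within the horizon; the volatility value is an assumption for illustration, not the paper's fitted stochastic model.

```python
import numpy as np

rng = np.random.default_rng(1)

def halving_probability(sigma, years, n_sims=100_000):
    """P(per-capita stock drops to <= 1/2 of its current level within `years`),
    for a driftless log-random-walk with annual log-volatility `sigma`."""
    steps = rng.normal(0.0, sigma, size=(n_sims, years))
    log_paths = np.cumsum(steps, axis=1)                 # log(stock / current stock)
    return np.mean(log_paths.min(axis=1) <= np.log(0.5))

# Illustrative volatility; horizon 2015 -> 2050.
print(halving_probability(sigma=0.10, years=35))
```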
NASA Astrophysics Data System (ADS)
Keyser, A.; Westerling, A. L.; Jones, G.; Peery, M. Z.
2017-12-01
Sierra Nevada forests have experienced an increase in very large fires with significant areas of high burn severity, such as the Rim (2013) and King (2014) fires, that have impacted habitat of endangered species such as the California spotted owl. In order to support land manager forest management planning and risk assessment activities, we used historical wildfire records from the Monitoring Trends in Burn Severity project and gridded hydroclimate and land surface characteristics data to develop statistical models to simulate the frequency, location, and extent of high severity burned area in Sierra Nevada forest wildfires as functions of climate and land surface characteristics. We define high severity here as BA90 area: the area comprising patches with ninety percent or more basal area killed within a larger fire. We developed a system of statistical models to characterize the probability of large fire occurrence, the probability of significant BA90 area present given a large fire, and the total extent of BA90 area in a fire on a 1/16 degree lat/lon grid over the Sierra Nevada. Repeated draws from binomial and generalized pareto distributions using these probabilities generated a library of simulated histories of high severity fire for a range of near-future (50 yr) climate and fuels management scenarios. Fuels management scenarios were provided by USFS Region 5. Simulated BA90 area was then downscaled to 30 m resolution using a statistical model we developed using Random Forest techniques to estimate the probability of adjacent 30 m pixels burning with ninety percent basal kill as a function of fire size and vegetation and topographic features. The result is a library of simulated high resolution maps of BA90 burned areas for a range of climate and fuels management scenarios with which we estimated conditional probabilities of owl nesting sites being impacted by high severity wildfire.
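A compact sketch of the simulation scheme described above: binomial draws for large-fire occurrence and for the presence of significant BA90 area, and generalized Pareto draws for the BA90 extent when present. The probabilities and distribution parameters below are spatially uniform placeholders; in the study they vary by grid cell with climate and fuels scenario.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_gpd(shape, scale, size):
    """Generalized Pareto draws via inverse-transform sampling."""
    u = rng.random(size)
    if shape == 0.0:
        return -scale * np.log(1.0 - u)
    return scale / shape * ((1.0 - u) ** (-shape) - 1.0)

def simulate_ba90_history(n_years, n_cells, p_large_fire, p_ba90_given_fire,
                          gpd_shape, gpd_scale):
    """One simulated history of high-severity (BA90) burned area on a grid."""
    history = np.zeros((n_years, n_cells))
    for year in range(n_years):
        fires = rng.random(n_cells) < p_large_fire                # large-fire occurrence
        has_ba90 = fires & (rng.random(n_cells) < p_ba90_given_fire)
        history[year, has_ba90] = sample_gpd(gpd_shape, gpd_scale, has_ba90.sum())
    return history

hist = simulate_ba90_history(n_years=50, n_cells=500, p_large_fire=0.02,
                             p_ba90_given_fire=0.6, gpd_shape=0.3, gpd_scale=100.0)
print(hist.sum(), "simulated BA90 area (placeholder units) over 50 years")
```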
2014-11-13
Cm) in a given set C ⊂ IR^m. (5.7) Motivation for generalized regression comes from applications in which Y has the cost/loss orientation that we have... distribution. The corresponding probability measure on IR^m is then induced by the multivariate distribution function F_{V1,...,Vm}(v1, ..., vm) = prob{(V1... could be generated by future observations of some variables V1, ..., Vm, as above, in which case Ω would be a subset of IR^m with elements ω = (v1
Rural-urban migration in a developing country: Botswana, Africa.
Tarver, J D; Miller, H M
1987-01-01
Trends in internal migration in Botswana are analyzed, with a focus on rural-urban migration. Data are from the 1981 census and from a survey carried out in 1979. The authors note that even though the predominance of subsistence agriculture acts as a deterrent to rural-urban migration, it is probable that the total and percentage of people living in urban areas will increase. However, the magnitude and pattern of future migration will fluctuate over time as social and economic conditions change.
Focus in High School Mathematics: Statistics and Probability
ERIC Educational Resources Information Center
National Council of Teachers of Mathematics, 2009
2009-01-01
Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…
Stochastic Modeling of Past Volcanic Crises
NASA Astrophysics Data System (ADS)
Woo, Gordon
2018-01-01
The statistical foundation of disaster risk analysis is past experience. From a scientific perspective, history is just one realization of what might have happened, given the randomness and chaotic dynamics of Nature. Stochastic analysis of the past is an exploratory exercise in counterfactual history, considering alternative possible scenarios. In particular, the dynamic perturbations that might have transitioned a volcano from an unrest to an eruptive state need to be considered. The stochastic modeling of past volcanic crises leads to estimates of eruption probability that can illuminate historical volcanic crisis decisions. It can also inform future economic risk management decisions in regions where there has been some volcanic unrest, but no actual eruption for at least hundreds of years. Furthermore, the availability of a library of past eruption probabilities would provide benchmark support for estimates of eruption probability in future volcanic crises.
Becher, M A; Grimm, V; Knapp, J; Horn, J; Twiston-Davies, G; Osborne, J L
2016-11-24
Social bees are central place foragers collecting floral resources from the surrounding landscape, but little is known about the probability of a scouting bee finding a particular flower patch. We therefore developed a software tool, BEESCOUT, to theoretically examine how bees might explore a landscape and distribute their scouting activities over time and space. An image file can be imported, which is interpreted by the model as a "forage map" with certain colours representing certain crops or habitat types as specified by the user. BEESCOUT calculates the size and location of these potential food sources in that landscape relative to a bee colony. An individual-based model then determines the detection probabilities of the food patches by bees, based on parameter values gathered from the flight patterns of radar-tracked honeybees and bumblebees. Various "search modes" describe hypothetical search strategies for the long-range exploration of scouting bees. The resulting detection probabilities of forage patches can be used as input for the recently developed honeybee model BEEHAVE, to explore realistic scenarios of colony growth and death in response to different stressors. In example simulations, we find that detection probabilities for food sources close to the colony fit empirical data reasonably well. However, for food sources further away no empirical data are available to validate model output. The simulated detection probabilities depend largely on the bees' search mode, and whether they exchange information about food source locations. Nevertheless, we show that landscape structure and connectivity of food sources can have a strong impact on the results. We believe that BEESCOUT is a valuable tool to better understand how landscape configurations and searching behaviour of bees affect detection probabilities of food sources. It can also guide the collection of relevant data and the design of experiments to close knowledge gaps, and provides a useful extension to the BEEHAVE honeybee model, enabling future users to explore how landscape structure and food availability affect the foraging decisions and patch visitation rates of the bees and, in consequence, to predict colony development and survival.
NASA Astrophysics Data System (ADS)
Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine
2016-04-01
Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) a qualitative approach (i.e. inventory-based methods and knowledge-driven methods) and (ii) a quantitative approach (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land-use and climate change. Conversely, the major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness, and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters, and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event. The dispersion of the distribution gives the uncertainty of the result. Finally, a map is created displaying a probability of occurrence for each computing cell of the studied area. In order to take land-use change into account, a complementary module integrating the effects of vegetation on soil properties has recently been developed. In recent years, the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesian islands; (ii) at local scale (i.e. 1:10,000) for two complex mountainous areas; (iii) at the site-specific scale (1:2,000) for one landslide. For each study the 3D geotechnical model has been adapted. The different studies have made it possible: (i) to discuss the different factors included in the model, especially the initial 3D geotechnical models; (ii) to specify the location of probable failures under different hydrological scenarios; (iii) to test the effects of climate change and land use on slopes for two cases. In that way, future changes in temperature, precipitation and vegetation cover can be analyzed, making it possible to address the impacts of global change on landslides. Finally, the results show that it is possible to obtain reliable information about future slope failures at different scales of work for different scenarios with an integrated approach.
The final information about landslide susceptibility (i.e. probability of failure) can be integrated into landslide hazard assessment and could be an essential information source for future land planning. As carried out in the ANR project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes a first step in the risk assessment chain for different climate and economic development scenarios, to evaluate the resilience of mountainous areas.
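For illustration, a Monte Carlo estimate of the probability of failure (safety factor below 1) in the spirit described above, but using a simple infinite-slope stability kernel rather than ALICE's Morgenstern-Price method; all soil and slope parameters are invented values, not from any of the case studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def failure_probability(slope_deg, depth, n_sims=100_000,
                        cohesion=(5e3, 2e3),       # mean, sd in Pa (assumed)
                        friction_deg=(30.0, 4.0),  # mean, sd in degrees (assumed)
                        sat_ratio=0.5,             # water table as fraction of depth
                        gamma=19e3, gamma_w=9.81e3):
    """Monte Carlo probability of failure (FS < 1) for an infinite-slope model."""
    beta = np.radians(slope_deg)
    c = rng.normal(*cohesion, n_sims).clip(min=0.0)
    phi = np.radians(rng.normal(*friction_deg, n_sims).clip(min=1.0))
    u = gamma_w * sat_ratio * depth * np.cos(beta) ** 2        # pore pressure
    tau = gamma * depth * np.sin(beta) * np.cos(beta)          # driving shear stress
    sigma_n = gamma * depth * np.cos(beta) ** 2                # normal stress
    fs = (c + (sigma_n - u) * np.tan(phi)) / tau               # factor of safety
    return np.mean(fs < 1.0)

print(failure_probability(slope_deg=35.0, depth=2.0))
```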
NASA Technical Reports Server (NTRS)
Naumann, R. J.; Oran, W. A.; Whymark, R. R.; Rey, C.
1981-01-01
The single axis acoustic levitator that was flown on SPAR VI malfunctioned. The results of a series of tests, analyses, and investigation of hypotheses that were undertaken to determine the probable cause of failure are presented, together with recommendations for future flights of the apparatus. The most probable causes of the SPAR VI failure were lower than expected sound intensity due to mechanical degradation of the sound source, and an unexpected external force that caused the experiment sample to move radially and eventually be lost from the acoustic energy well.
Integrating Future Information through Scenarios. AIR 1985 Annual Forum Paper.
ERIC Educational Resources Information Center
Zentner, Rene D.
The way that higher education planners can take into account changes in the post-industrial society is discussed. The scenario method is proposed as a method of integrating futures information. The planner can be provided with several probable futures, each of which can be incorporated in a scenario. An effective scenario provides the planner…
Neurons That Update Representations of the Future.
Seriès, Peggy
2018-06-11
A recent article shows that the brain automatically estimates the probabilities of possible future actions before it has even received all the information necessary to decide what to do next. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
[Analysis and design structure of an aging society].
Fujimasa, Iwao
2012-01-01
Observing present Japanese society, we find deep gaps between the present system and its probable future. One of the gaps may be due to the misconception that the future societal make-up is not definite. The aim of the current study was to investigate future societal structure and to develop methods of adding a time dimension to societal-structure policy; this approach is named "a theory of structuralism economics". We developed three societal-structure projection engines and applied a system dynamics language to estimate the future total population of Japan. The total population of Japan reached a maximum in 2005, and depopulation began thereafter. The populations in the younger working-age group (25 to 54 years old) and the elderly working-age group (55 to 84 years old) became almost equal in 2010. Because the economic growth rate depends on an increase in the working population, the growth rate of national income in Japan exceeded 10% per year between 1950 and 1970, when the working-age population was increasing by more than 2.5% annually. After 2005, however, depopulation began in Japan. In the future, national income will decrease in proportion to the working-age population, but per-capita national income will remain almost unchanged. We propose a new strategy for the future structure of society: the working age should be extended by 10 years, so that the labor force will exceed 60% of the population and thereafter become stable.
Foxon, Timothy J
2010-07-28
This paper addresses the probable levels of investment needed in new technologies for energy conversion and storage that are essential to address climate change, drawing on past evidence on the rate of cost improvements in energy technologies. A range of energy materials and technologies with lower carbon emissions over their life cycle are being developed, including fuel cells (FCs), hydrogen storage, batteries, supercapacitors, solar energy and nuclear power, and it is probable that most, if not all, of these technologies will be needed to mitigate climate change. High rates of innovation and deployment will be needed to meet targets such as the UK's goal of reducing its greenhouse gas emissions by 80 per cent by 2050, which will require significant levels of investment. Learning curves observed for reductions in unit costs of energy technologies, such as photovoltaics and FCs, can provide evidence on the probable future levels of investment needed. The paper concludes by making recommendations for policy measures to promote such investment from both the public and private sectors.
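A one-factor experience (learning) curve of the kind such cost-improvement evidence is based on: unit cost falls by a fixed fraction for each doubling of cumulative deployment. The figures below are placeholders for illustration, not numbers from the paper.

```python
from math import log2

def learning_curve_cost(c0, cum0, cum, learning_rate):
    """Experience curve C = C0 * (X / X0) ** (-b), with b = -log2(1 - learning_rate),
    i.e. each doubling of cumulative deployment X cuts unit cost by `learning_rate`."""
    b = -log2(1.0 - learning_rate)
    return c0 * (cum / cum0) ** (-b)

# Illustrative: a unit cost of 4 (arbitrary currency/W) at 1 GW cumulative capacity,
# a 20% learning rate, evaluated at 100 GW of cumulative deployment.
print(learning_curve_cost(c0=4.0, cum0=1.0, cum=100.0, learning_rate=0.20))
```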
Kwasniok, Frank
2013-11-01
A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
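The simplest instance of the approach described: fit, by maximum likelihood, a parametric density whose parameters depend on time, then extrapolate the fitted time dependence to forecast the future density. Here the model is a Gaussian with a linearly drifting mean applied to synthetic data; the published method allows richer nonstationary models and a full account of parameter uncertainty.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_nonstationary_gaussian(t, x):
    """Maximum-likelihood fit of x_t ~ N(a + b*t, sigma^2)."""
    def neg_log_lik(theta):
        a, b, log_sigma = theta
        return -np.sum(norm.logpdf(x, loc=a + b * t, scale=np.exp(log_sigma)))
    res = minimize(neg_log_lik, x0=[x.mean(), 0.0, np.log(x.std())])
    return res.x

# Synthetic declining series, then a density forecast extrapolated to t = 60.
rng = np.random.default_rng(3)
t = np.arange(50.0)
x = 10.0 - 0.05 * t + rng.normal(0.0, 0.5, 50)
a, b, log_sigma = fit_nonstationary_gaussian(t, x)
print("forecast mean at t=60:", a + b * 60.0, "sd:", np.exp(log_sigma))
```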
NASA Technical Reports Server (NTRS)
Smith, O. E.; Adelfang, S. I.
1998-01-01
The wind profile with all of its variations with respect to altitude has been, is now, and will continue to be important for aerospace vehicle design and operations. Wind profile databases and models are used for the vehicle ascent flight design for structural wind loading, flight control systems, performance analysis, and launch operations. This report presents the evolution of wind statistics and wind models from the empirical scalar wind profile model established for the Saturn Program through the development of the vector wind profile model used for the Space Shuttle design to the variations of this wind modeling concept for the X-33 program. Because wind is a vector quantity, the vector wind models use the rigorous mathematical probability properties of the multivariate normal probability distribution. When the vehicle ascent steering commands (ascent guidance) are wind biased to the wind profile measured on the day-of-launch, ascent structural wind loads are reduced and launch probability is increased. This wind load alleviation technique is recommended in the initial phase of vehicle development. The vehicle must fly through the largest load allowable versus altitude to achieve its mission. The Gumbel extreme value probability distribution is used to obtain the probability of exceeding (or not exceeding) the load allowable. The time conditional probability function is derived from the Gumbel bivariate extreme value distribution. This time conditional function is used for calculation of wind loads persistence increments using 3.5-hour Jimsphere wind pairs. These increments are used to protect the commit-to-launch decision. Other topics presented include the Shuttle load response to smoothed wind profiles, a new gust model, and advancements in wind profile measuring systems. From the lessons learned and knowledge gained from past vehicle programs, the development of future launch vehicles can be accelerated. However, new vehicle programs by their very nature will require specialized support for new databases and analyses for wind, atmospheric parameters (pressure, temperature, and density versus altitude), and weather. It is for this reason that project managers are encouraged to collaborate with natural environment specialists early in the conceptual design phase. Such action will give the lead time necessary to meet the natural environment design and operational requirements, and thus, reduce development costs.
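A small sketch of the Gumbel (extreme value type I) step mentioned above: the probability that the largest load response in an ascent does not exceed the load allowable. The location and scale parameters and the allowable value are assumed for illustration, not taken from the report.

```python
import numpy as np

def gumbel_nonexceedance(x, mu, beta):
    """Gumbel probability that the extreme load response does not exceed x:
    F(x) = exp(-exp(-(x - mu)/beta))."""
    return np.exp(-np.exp(-(x - mu) / beta))

# Probability that an allowable of 1.3 (normalized load units) is not exceeded,
# for an assumed peak-load distribution with location 1.0 and scale 0.08.
print(gumbel_nonexceedance(1.3, mu=1.0, beta=0.08))
```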
Shen, Fuhai; Yuan, Juxiang; Sun, Zhiqian; Hua, Zhengbing; Qin, Tianbang; Yao, Sanqiao; Fan, Xueyun; Chen, Weihong; Liu, Hongbo; Chen, Jie
2013-01-01
Background Prior to 1970, coal mining technology and prevention measures in China were poor. Mechanized coal mining equipment and advanced protection measures were continuously installed in the mines after 1970. All these improvements may have resulted in a change in the incidence of coal workers’ pneumoconiosis (CWP). Therefore, it is important to identify the characteristics of CWP today and trends for the incidence of CWP in the future. Methodology/Principal Findings A total of 17,023 coal workers from the Kailuan Colliery Group were studied. A life-table method was used to calculate the cumulative incidence rate of CWP and predict the number of new CWP patients in the future. The probability of developing CWP was estimated by a multilayer perceptron artificial neural network for each coal worker without CWP. The results showed that the cumulative incidence rates of CWP for tunneling, mining, combining, and helping workers were 31.8%, 27.5%, 24.2%, and 2.6%, respectively, during the same observation period of 40 years. It was estimated that there would be 844 new CWP cases among 16,185 coal workers without CWP within their life expectancy. There would be 273.1, 273.1, 227.6, and 69.9 new CWP patients in the next <10, 10-, 20-, and 30- years respectively in the study cohort within their life expectancy. It was identified that coal workers whose risk probabilities were over 0.2 were at high risk for CWP, and whose risk probabilities were under 0.1 were at low risk. Conclusion/Significance The present and future incidence trends of CWP remain high among coal workers. We suggest that coal workers at high risk of CWP undergo a physical examination for pneumoconiosis every year, and the coal workers at low risk of CWP be examined every 5 years. PMID:24376519
Probabilistic rainfall warning system with an interactive user interface
NASA Astrophysics Data System (ADS)
Koistinen, Jarmo; Hohti, Harri; Kauhanen, Janne; Kilpinen, Juha; Kurki, Vesa; Lauri, Tuomo; Nurmi, Pertti; Rossi, Pekka; Jokelainen, Miikka; Heinonen, Mari; Fred, Tommi; Moisseev, Dmitri; Mäkelä, Antti
2013-04-01
A real-time 24/7 automatic alert system is in operational use at the Finnish Meteorological Institute (FMI). It consists of gridded forecasts of the exceedance probabilities of rainfall class thresholds in the continuous lead time range of 1 hour to 5 days. Nowcasting up to six hours applies ensemble member extrapolations of weather radar measurements. With 2.8 GHz processors using 8 threads, it takes about 20 seconds to generate 51 radar-based ensemble members on a grid of 760 x 1226 points. Nowcasting also exploits lightning density and satellite-based pseudo rainfall estimates. The latter utilize the convective rain rate (CRR) estimate from Meteosat Second Generation. The extrapolation technique applies atmospheric motion vectors (AMV) originally developed for upper wind estimation with satellite images. Exceedance probabilities of four rainfall accumulation categories are computed for the future 1 h and 6 h periods, and they are updated every 15 minutes. For longer forecasts, exceedance probabilities are calculated for future 6 and 24 h periods during the next 4 days. From approximately 1 hour to 2 days, the Poor Man's Ensemble Prediction System (PEPS) is used, applying, e.g., the high-resolution short-range Numerical Weather Prediction models HIRLAM and AROME. The longest forecasts apply EPS data from the European Centre for Medium-Range Weather Forecasts (ECMWF). The blending of the ensemble sets from the various forecast sources is performed by mixing accumulations with equal exceedance probabilities. The blending system contains a real-time adaptive estimator of the predictability of radar-based extrapolations. The uncompressed output data are written to file for each member, with a total size of 10 GB. Ensemble data from other sources (satellite, lightning, NWP) are converted to the same geometry as the radar data and blended as explained above. A verification system utilizing telemetering rain gauges has been established. Alert dissemination, e.g., for citizens and professional end users, uses SMS messages and, in the near future, smartphone maps. The present interactive user interface allows free selection of alert sites and two warning thresholds (any rain, heavy rain) at any location in Finland. The pilot service was tested by 1000-3000 users during the summers of 2010 and 2012. As an example of dedicated end-user services, gridded exceedance scenarios (of probabilities 5%, 50%, and 90%) of hourly rainfall accumulations for the next 3 hours have been utilized as online input data for the influent model at the Greater Helsinki Wastewater Treatment Plant.
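The sketch below illustrates one way gridded exceedance probabilities can be formed from an ensemble of extrapolation members: the per-pixel fraction of members whose accumulation exceeds each category threshold. The reduced grid size, synthetic rainfall field, and threshold values are illustrative assumptions, not the FMI system's configuration.

```python
import numpy as np

# Sketch: exceedance probabilities of rainfall accumulation thresholds from an
# ensemble of extrapolation members. Grid, rainfall field, and thresholds are
# synthetic; the operational grid is 760 x 1226 points with 51 members.
n_members, ny, nx = 51, 120, 160
rng = np.random.default_rng(0)
accum_1h = rng.gamma(shape=0.5, scale=2.0, size=(n_members, ny, nx))  # synthetic 1 h accumulations (mm)

thresholds_mm = [0.3, 1.0, 5.0, 10.0]          # four accumulation categories (assumed values)
exceed_prob = {t: (accum_1h > t).mean(axis=0)  # per-pixel fraction of members above threshold
               for t in thresholds_mm}
print({t: round(float(p.mean()), 3) for t, p in exceed_prob.items()})
```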
Lennernäs, Hans; Abrahamsson, Bertil
2005-03-01
Bioavailability (BA) and bioequivalence (BE) play a central role in pharmaceutical product development, and BE studies are presently being conducted for New Drug Applications (NDAs) of new compounds, in supplementary NDAs for new medical indications and product line extensions, in Abbreviated New Drug Applications (ANDAs) of generic products and in applications for scale-up and post-approval changes. The Biopharmaceutics Classification System (BCS) has been developed to provide a scientific approach for classifying drug compounds based on solubility as related to dose and intestinal permeability in combination with the dissolution properties of the oral immediate-release (IR) dosage form. The aim of the BCS is to provide a regulatory tool for replacing certain BE studies by accurate in-vitro dissolution tests. The aim of this review is to present the status of the BCS and discuss its future application in pharmaceutical product development. The future application of the BCS will most likely become increasingly important as the present framework gains increased recognition, which will probably be the case if the BCS borders for certain class II and III drugs are extended. The future revision of the BCS guidelines by the regulatory agencies in communication with academic and industrial scientists is exciting and will hopefully result in an increased applicability in drug development. Finally, we emphasize the great use of the BCS as a simple tool in early drug development to determine the rate-limiting step in the oral absorption process, which has facilitated communication between the different experts involved in the overall drug development process. This increased awareness of a proper biopharmaceutical characterization of new drugs may in the future result in drug molecules with sufficiently high permeability, solubility, and dissolution rate, which will automatically increase the importance of the BCS as a regulatory tool over time.
Gariepy, Aileen M; Creinin, Mitchell D; Smith, Kenneth J; Xu, Xiao
2014-08-01
To compare the expected probability of pregnancy after hysteroscopic versus laparoscopic sterilization based on available data using decision analysis. We developed an evidence-based Markov model to estimate the probability of pregnancy over 10 years after three different female sterilization procedures: hysteroscopic, laparoscopic silicone rubber band application and laparoscopic bipolar coagulation. Parameter estimates for procedure success, probability of completing follow-up testing and risk of pregnancy after different sterilization procedures were obtained from published sources. In the base case analysis at all points in time after the sterilization procedure, the initial and cumulative risk of pregnancy after sterilization is higher in women opting for hysteroscopic than either laparoscopic band or bipolar sterilization. The expected pregnancy rates per 1000 women at 1 year are 57, 7 and 3 for hysteroscopic sterilization, laparoscopic silicone rubber band application and laparoscopic bipolar coagulation, respectively. At 10 years, the cumulative pregnancy rates per 1000 women are 96, 24 and 30, respectively. Sensitivity analyses suggest that the three procedures would have an equivalent pregnancy risk of approximately 80 per 1000 women at 10 years if the probability of successful laparoscopic (band or bipolar) sterilization drops below 90% and successful coil placement on first hysteroscopic attempt increases to 98% or if the probability of undergoing a hysterosalpingogram increases to 100%. Based on available data, the expected population risk of pregnancy is higher after hysteroscopic than laparoscopic sterilization. Consistent with existing contraceptive classification, future characterization of hysteroscopic sterilization should distinguish "perfect" and "typical" use failure rates. Pregnancy probability at 1 year and over 10 years is expected to be higher in women having hysteroscopic as compared to laparoscopic sterilization. Copyright © 2014 Elsevier Inc. All rights reserved.
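A minimal sketch of the accumulation step a decision-analytic Markov model performs: cumulative pregnancy probability over 10 years from year-by-year pregnancy probabilities for each sterilization method. The yearly hazard values below are placeholders, not the published parameter estimates from this study.

```python
import numpy as np

# Sketch of a simple two-state (not-pregnant / pregnant) accumulation over 10 years.
# The annual pregnancy probabilities are illustrative placeholders only.
annual_preg_prob = {
    "hysteroscopic":        [0.057] + [0.005] * 9,   # assumed year-by-year hazards
    "laparoscopic_band":    [0.007] + [0.002] * 9,
    "laparoscopic_bipolar": [0.003] + [0.003] * 9,
}

for method, hazards in annual_preg_prob.items():
    cum_fail = 1.0 - np.prod([1.0 - p for p in hazards])   # cumulative pregnancy probability
    print(f"{method:22s} 10-year pregnancies per 1000 women ~ {1000 * cum_fail:.0f}")
```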
NASA Technical Reports Server (NTRS)
Hudson, Nicolas; Lin, Ying; Barengoltz, Jack
2010-01-01
A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario where multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps, and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, as well as making the analysis tractable by breaking the process down into small analyzable steps.
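The sketch below illustrates the Markov-chain bookkeeping described above: expected VEM counts on each component are updated at every sampling step from per-component release probabilities and a component-to-component transport matrix. The component names and all numerical values are hypothetical.

```python
import numpy as np

# Sketch: propagate the expected number of viable Earth microorganisms (VEMs)
# across hardware components over discrete sampling steps. All values are made up.
components = ["bit", "arm", "sample_tube", "sample"]
v = np.array([100.0, 10.0, 0.0, 0.0])          # expected VEMs per component at start

release = np.array([0.02, 0.01, 0.005, 0.0])   # P(a VEM on the component is released) per step

# T[i, j]: probability that a released VEM from component i ends up on component j
T = np.array([
    [0.0, 0.5, 0.3, 0.2],
    [0.0, 0.0, 0.6, 0.4],
    [0.0, 0.0, 0.0, 1.0],
    [0.0, 0.0, 0.0, 0.0],
])

n_steps = 5
for _ in range(n_steps):
    moved = (release * v) @ T        # expected VEMs transported this step
    v = v - release * v + moved      # remove released VEMs, add arrivals
print(dict(zip(components, np.round(v, 3))))
```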
Only the carrot, not the stick: incorporating trust into the enforcement of regulation.
Mendoza, Juan P; Wielhouwer, Jacco L
2015-01-01
New enforcement strategies allow agents to gain the regulator's trust and consequently face a lower audit probability. Prior research suggests that, in order to prevent lower compliance, a reduction in the audit probability (the "carrot") must be compensated with the introduction of a higher penalty for non-compliance (the "stick"). However, such carrot-and-stick strategies reflect neither the concept of trust nor the strategies observed in practice. In response to this, we define trust-based regulation as a strategy that incorporates rules that allow trust to develop, and using a generic (non-cooperative) game of tax compliance, we examine whether trust-based regulation is feasible (i.e., whether, in equilibrium, a reduction in the audit probability, without ever increasing the penalty for non-compliance, does not lead to reduced compliance). The model shows that trust-based regulation is feasible when the agent sufficiently values the future. In line with the concept of trust, this strategy is feasible when the regulator is uncertain about the agent's intentions. Moreover, the model shows that (i) introducing higher penalties makes trust-based regulation less feasible, and (ii) combining trust and forgiveness can lead to a lower audit probability for both trusted and distrusted agents. Policy recommendations often point toward increasing deterrence. This model shows that the opposite can be optimal.
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.
Potential future land use threats to California's protected areas
Wilson, Tamara Sue; Sleeter, Benjamin Michael; Davis, Adam Wilkinson
2015-01-01
Increasing pressures from land use coupled with future changes in climate will present unique challenges for California’s protected areas. We assessed the potential for future land use conversion on land surrounding existing protected areas in California’s twelve ecoregions, utilizing annual, spatially explicit (250 m) scenario projections of land use for 2006–2100 based on the Intergovernmental Panel on Climate Change Special Report on Emission Scenarios to examine future changes in development, agriculture, and logging. We calculated a conversion threat index (CTI) for each unprotected pixel, combining land use conversion potential with proximity to protected area boundaries, in order to identify ecoregions and protected areas at greatest potential risk of proximal land conversion. Our results indicate that California’s Coast Range ecoregion had the highest CTI with competition for extractive logging placing the greatest demand on land in close proximity to existing protected areas. For more permanent land use conversions into agriculture and developed uses, our CTI results indicate that protected areas in the Central California Valley and Oak Woodlands are most vulnerable. Overall, the Eastern Cascades, Central California Valley, and Oak Woodlands ecoregions had the lowest areal percent of protected lands and highest conversion threat values. With limited resources and time, rapid, landscape-level analysis of potential land use threats can help quickly identify areas with higher conversion probability of future land use and potential changes to both habitat and potential ecosystem reserves. Given the broad range of future uncertainties, LULC projections are a useful tool allowing land managers to visualize alternative landscape futures, improve planning, and optimize management practices.
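For illustration only, the sketch below forms a conversion threat index by down-weighting per-pixel conversion potential with distance to the nearest protected-area boundary; the combination rule and the exponential decay length are assumptions for this sketch, not the published CTI formulation.

```python
import numpy as np

# Sketch of a conversion threat index (CTI): land-use conversion potential
# down-weighted by distance to the nearest protected-area boundary.
# The decay rule and all values are assumed, not taken from the study.
rng = np.random.default_rng(1)
conversion_potential = rng.uniform(0, 1, size=(200, 200))   # per-pixel conversion probability
distance_km = rng.uniform(0, 50, size=(200, 200))           # distance to nearest boundary

decay_km = 10.0                                              # assumed decay length
cti = conversion_potential * np.exp(-distance_km / decay_km)
print("mean CTI:", round(float(cti.mean()), 4))
```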
ERIC Educational Resources Information Center
Harris, Adam J. L.; Corner, Adam
2011-01-01
Verbal probability expressions are frequently used to communicate risk and uncertainty. The Intergovernmental Panel on Climate Change (IPCC), for example, uses them to convey risks associated with climate change. Given the potential for human action to mitigate future environmental risks, it is important to understand how people respond to these…
A new algorithm for finding survival coefficients employed in reliability equations
NASA Technical Reports Server (NTRS)
Bouricius, W. G.; Flehinger, B. J.
1973-01-01
Product reliabilities are predicted from past failure rates and a reasonable estimate of future failure rates. An algorithm is used to calculate the probability that the product will function correctly. The algorithm sums the probability of each survival pattern times the number of permutations for that pattern, over all possible ways in which the product can survive.
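A minimal sketch of the pattern-summation idea, assuming a product that survives when at least k of n identical redundant units survive; the values of n, k, and the per-unit survival probability are illustrative.

```python
from math import comb

# Sketch: sum the probability of each survival pattern times the number of ways
# that pattern can occur, over all patterns in which the product survives.
def survival_probability(n: int, k: int, p_unit: float) -> float:
    # at-least-k-of-n survival with identical, independent units (assumed)
    return sum(comb(n, m) * p_unit**m * (1 - p_unit)**(n - m) for m in range(k, n + 1))

print(survival_probability(n=6, k=4, p_unit=0.95))   # e.g. 4-of-6 redundancy
```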
Code of Federal Regulations, 2012 CFR
2012-07-01
... Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY... effort to assess the probability of future behavior which could have an effect adverse to the national... the past but necessarily anticipating the future. Rarely is proof of trustworthiness and reliability...
Global forecasts of urban expansion to 2030 and direct impacts on biodiversity and carbon pools.
Seto, Karen C; Güneralp, Burak; Hutyra, Lucy R
2012-10-02
Urban land-cover change threatens biodiversity and affects ecosystem productivity through loss of habitat, biomass, and carbon storage. However, despite projections that world urban populations will increase to nearly 5 billion by 2030, little is known about future locations, magnitudes, and rates of urban expansion. Here we develop spatially explicit probabilistic forecasts of global urban land-cover change and explore the direct impacts on biodiversity hotspots and tropical carbon biomass. If current trends in population density continue and all areas with high probabilities of urban expansion undergo change, then by 2030, urban land cover will increase by 1.2 million km(2), nearly tripling the global urban land area circa 2000. This increase would result in considerable loss of habitats in key biodiversity hotspots, with the highest rates of forecasted urban growth to take place in regions that were relatively undisturbed by urban development in 2000: the Eastern Afromontane, the Guinean Forests of West Africa, and the Western Ghats and Sri Lanka hotspots. Within the pan-tropics, loss in vegetation biomass from areas with high probability of urban expansion is estimated to be 1.38 PgC (0.05 PgC yr(-1)), equal to ∼5% of emissions from tropical deforestation and land-use change. Although urbanization is often considered a local issue, the aggregate global impacts of projected urban expansion will require significant policy changes to affect future growth trajectories to minimize global biodiversity and vegetation carbon losses.
NASA Astrophysics Data System (ADS)
Keiler, M.
2003-04-01
Reports on catastrophes with high damage caused by natural hazards seem to have increased in number recently. A new trend in dealing with these natural processes leads to the integration of risk into natural hazards evaluations and approaches of integral risk management. The risk resulting from natural hazards can be derived from the combination of parameters of physical processes (intensity and recurrence probability) and damage potential (probability of presence and expected damage value). Natural hazard research focuses mainly on the examination, modelling and estimation of individual geomorphological processes as well as on future developments caused by climate change. Even though damage potential has been taken into account more frequently, quantifying statements are still missing. Due to the changes of the socio-economic structures in mountain regions (urban sprawl, population growth, increased mobility and tourism) these studies are mandatory. This study presents a conceptual method that records the damage potential (probability of physical presence, evaluation of buildings) and shows the development of the damage potential resulting from avalanches since 1950. The study area is the community of Galtür, Austria. 36 percent of the existing buildings are found in officially declared avalanche hazard zones. The majority of these buildings are either agricultural or accommodation facilities. Additionally, the effects of physical planning and/or technical measures on the spatial development of the potential damage are illustrated. The results serve to improve risk determination and point out an unnoticed increase of damage potential and risk in apparently safe settlement areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merchant, Bion J
2015-12-22
NetMOD is a tool to model the performance of global ground-based explosion monitoring systems. Version 2.0 of the software supports the simulation of seismic, hydroacoustic, and infrasonic detection capability. The tool provides a user interface to execute simulations based upon a hypothetical definition of the monitoring system configuration, geophysical properties of the Earth, and detection analysis criteria. NetMOD will be distributed with a project file defining the basic performance characteristics of the International Monitoring System (IMS), a network of sensors operated by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Network modeling is needed to be able to assess and explain the potential effect of changes to the IMS, to prioritize station deployment and repair, and to assess the overall CTBTO monitoring capability currently and in the future. Currently the CTBTO uses version 1.0 of NetMOD, provided to them in early 2014. NetMOD will provide a modern tool that will cover all the simulations currently available and allow for the development of additional simulation capabilities of the IMS in the future. NetMOD simulates the performance of monitoring networks by estimating the relative amplitudes of the signal and noise measured at each of the stations within the network based upon known geophysical principles. From these signal and noise estimates, a probability of detection may be determined for each of the stations. The detection probabilities at each of the stations may then be combined to produce an estimate of the detection probability for the entire monitoring network.
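The sketch below shows one common way per-station detection probabilities can be combined into a network detection probability, assuming independent stations and a requirement that at least k stations detect the event; the station probabilities and the value of k are illustrative, not NetMOD's actual detection criteria.

```python
import numpy as np

# Sketch: combine per-station detection probabilities into a network detection
# probability, assuming independence and an at-least-k-stations rule (assumed).
def network_detection_prob(p_stations, k_required):
    # Poisson-binomial distribution built station by station:
    # dp[m] = P(exactly m stations detect the event)
    dp = np.zeros(len(p_stations) + 1)
    dp[0] = 1.0
    for p in p_stations:
        dp[1:] = dp[1:] * (1 - p) + dp[:-1] * p
        dp[0] *= (1 - p)
    return dp[k_required:].sum()

p_stations = [0.9, 0.7, 0.6, 0.4, 0.3]   # hypothetical per-station probabilities
print(network_detection_prob(p_stations, k_required=3))
```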
Blank, Robert D
2011-01-01
The 2010 Position Development Conference addressed four questions related to the impact of previous fractures on 10-year fracture risk as calculated by FRAX(®). To address these questions, PubMed was searched on the keywords "fracture, epidemiology, osteoporosis." Titles of retrieved articles were reviewed for an indication that risk for future fracture was discussed. Abstracts of these articles were reviewed for an indication that one or more of the questions listed above was discussed. For those that did, the articles were reviewed in greater detail to extract the findings and to find additional past work and citing works that also bore on the questions. The official positions and the supporting literature review are presented here. FRAX(®) underestimates fracture probability in persons with a history of multiple fractures (good, A, W). FRAX(®) may underestimate fracture probability in individuals with prevalent severe vertebral fractures (good, A, W). While there is evidence that hip, vertebral, and humeral fractures appear to confer greater risk of subsequent fracture than fractures at other sites, quantification of this incremental risk in FRAX(®) is not possible (fair, B, W). FRAX(®) may underestimate fracture probability in individuals with a parental history of non-hip fragility fracture (fair, B, W). Limitations of the methodology include performance by a single reviewer, preliminary review of the literature being confined to titles, and secondary review being limited to abstracts. Limitations of the evidence base include publication bias, overrepresentation of persons of European descent in the published studies, and technical differences in the methods used to identify prevalent and incident fractures. Emerging topics for future research include fracture epidemiology in non-European populations and men, the impact of fractures in family members other than parents, and the genetic contribution to fracture risk. Copyright © 2011 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
Zimmerman, Tammy M.
2006-01-01
The Lake Erie shoreline in Pennsylvania spans nearly 40 miles and is a valuable recreational resource for Erie County. Nearly 7 miles of the Lake Erie shoreline lies within Presque Isle State Park in Erie, Pa. Concentrations of Escherichia coli (E. coli) bacteria at permitted Presque Isle beaches occasionally exceed the single-sample bathing-water standard, resulting in unsafe swimming conditions and closure of the beaches. E. coli concentrations and other water-quality and environmental data collected at Presque Isle Beach 2 during the 2004 and 2005 recreational seasons were used to develop models using tobit regression analyses to predict E. coli concentrations. All variables statistically related to E. coli concentrations were included in the initial regression analyses, and after several iterations, only those explanatory variables that made the models significantly better at predicting E. coli concentrations were included in the final models. Regression models were developed using data from 2004, 2005, and the combined 2-year dataset. Variables in the 2004 model and the combined 2004-2005 model were log10 turbidity, rain weight, wave height (calculated), and wind direction. Variables in the 2005 model were log10 turbidity and wind direction. Explanatory variables not included in the final models were water temperature, streamflow, wind speed, and current speed; model results indicated these variables did not meet significance criteria at the 95-percent confidence level (probabilities were greater than 0.05). The predicted E. coli concentrations produced by the models were used to develop probabilities that concentrations would exceed the single-sample bathing-water standard for E. coli of 235 colonies per 100 milliliters. Analysis of the exceedence probabilities helped determine a threshold probability for each model, chosen such that the correct number of exceedences and nonexceedences was maximized and the number of false positives and false negatives was minimized. Future samples with computed exceedence probabilities higher than the selected threshold probability, as determined by the model, will likely exceed the E. coli standard and a beach advisory or closing may need to be issued; computed exceedence probabilities lower than the threshold probability will likely indicate the standard will not be exceeded. Additional data collected each year can be used to test and possibly improve the model. This study will aid beach managers in more rapidly determining when waters are not safe for recreational use and, subsequently, when to issue beach advisories or closings.
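As a simplified illustration (the report fits tobit regression models), the sketch below converts a predicted log10 E. coli concentration with a normal residual into an exceedance probability for the 235 colonies per 100 milliliters standard and compares it with a threshold probability. All coefficients, the residual standard error, the predictor values, and the threshold probability are placeholders, not the published model.

```python
import numpy as np
from scipy.stats import norm

# Sketch: predicted log10 E. coli concentration -> exceedance probability for the
# 235 colonies/100 mL standard -> advisory decision. All values are assumed.
coef = {"intercept": 0.8, "log10_turbidity": 0.9, "rain_weight": 0.4,
        "wave_height": 0.3, "wind_onshore": 0.5}
resid_se = 0.55                      # assumed residual standard error (log10 units)
threshold_prob = 0.30                # assumed decision-threshold probability

x = {"log10_turbidity": np.log10(12.0), "rain_weight": 0.6,
     "wave_height": 0.4, "wind_onshore": 1.0}
pred_log10 = coef["intercept"] + sum(coef[k] * v for k, v in x.items())
p_exceed = norm.sf(np.log10(235.0), loc=pred_log10, scale=resid_se)
print(f"P(exceed 235/100 mL) = {p_exceed:.2f}",
      "-> advisory" if p_exceed > threshold_prob else "-> no advisory")
```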
Stochastic Modelling of Past Volcanic Crises
NASA Astrophysics Data System (ADS)
Woo, Gordon
2017-04-01
It is customary to have continuous monitoring of volcanoes showing signs of unrest that might lead to an eruption threatening local populations. Despite scientific progress in estimating the probability of an eruption occurring, the concept of continuously tracking eruption probability remains a future aspiration for volcano risk analysts. During some recent major volcanic crises, attempts have been made to estimate the eruption probability in real time to support government decision-making. These include the possibility of an eruption of Katla linked with the eruption of Eyjafjallajökull in 2010, and the Santorini crisis of 2011-2012. However, once a crisis fades, interest in analyzing the probability that there might have been an eruption tends to wane. There is an inherent outcome bias well known to psychologists: if disaster was avoided, there is perceived to be little purpose in exploring scenarios where a disaster might have happened. Yet the better that previous periods of unrest are understood and modelled, the better that the risk associated with future periods of unrest will be quantified. Scenarios are counterfactual histories of the future. The task of quantifying the probability of an eruption for a past period of unrest should not be merely a statistical calculation, but should serve to elucidate and refine geophysical models of the eruptive processes. This is achieved by using a Bayesian Belief Network approach, in which monitoring observations are used to draw inferences on the underlying causal factors. Specifically, risk analysts are interested in identifying what dynamical perturbations might have tipped an unrest period in history over towards an eruption, and assessing what was the likelihood of such perturbations. Furthermore, in what ways might a historical volcano crisis have turned for the worse? Such important counterfactual questions are addressed in this paper.
The role of magical thinking in forecasting the future.
Stavrova, Olga; Meckel, Andrea
2017-02-01
This article explores the role of magical thinking in the subjective probabilities of future chance events. In five experiments, we show that individuals tend to predict a more lucky future (reflected in probability judgements of lucky and unfortunate chance events) for someone who happened to purchase a product associated with a highly moral person than for someone who unknowingly purchased a product associated with a highly immoral person. In the former case, positive events were considered more likely than negative events, whereas in the latter case, the difference in the likelihood judgement of positive and negative events disappeared or even reversed. Our results indicate that this effect is unlikely to be driven by participants' immanent justice beliefs, the availability heuristic, or experimenter demand. Finally, we show that individuals rely more heavily on magical thinking when their need for control is threatened, thus suggesting that lack of control represents a factor in driving magical thinking in making predictions about the future. © 2016 The British Psychological Society.
Probability of criminal acts of violence: a test of jury predictive accuracy.
Reidy, Thomas J; Sorensen, Jon R; Cunningham, Mark D
2013-01-01
The ability of capital juries to accurately predict future prison violence at the sentencing phase of aggravated murder trials was examined through retrospective review of the disciplinary records of 115 male inmates sentenced to either life (n = 65) or death (n = 50) in Oregon from 1985 through 2008, with a mean post-conviction time at risk of 15.3 years. Violent prison behavior was completely unrelated to predictions made by capital jurors, with bidirectional accuracy simply reflecting the base rate of assaultive misconduct in the group. Rejection of the special issue predicting future violence enjoyed 90% accuracy. Conversely, predictions that future violence was probable had 90% error rates. More than 90% of the assaultive rule violations committed by these offenders resulted in no harm or only minor injuries. Copyright © 2013 John Wiley & Sons, Ltd.
Visualizing uncertainty about the future.
Spiegelhalter, David; Pearson, Mike; Short, Ian
2011-09-09
We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge.
Thirty years of Batten disease research: present status and future goals.
Rider, J A; Rider, D L
1999-04-01
From a meager beginning in 1968, when Batten disease or neuronal ceroid lipofuscinosis was practically unheard of, tremendous advances have been made. It is now recognized worldwide as the most common neurodegenerative disease in children and young adults. It is recognized as a genetic disease. The infantile form has been localized to chromosome 1p32 and the juvenile form, to 16p12.1; the gene for the late infantile form is on chromosome 11p15, and for a variant form of the late infantile, the gene lies on chromosome 15q21-23. Finally, the molecular basis of the late infantile form is probably a pepstatin-insensitive lysosomal peptidase. The future is to identify carriers, prevent the disease, and develop treatment by gene and enzyme replacement. Copyright 1999 Academic Press.
Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-09-20
These are slides from a seminar given to the University of Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation – making full use of today’s computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.
NASA Astrophysics Data System (ADS)
Rouhani, Hassan; Leconte, Robert
2018-06-01
Climate change will affect precipitation and flood regimes. It is anticipated that the Probable Maximum Precipitation (PMP) and Probable Maximum Flood (PMF) will be modified in a changing climate. This paper aims to quantify and analyze climate change influences on PMP and PMF in three watersheds with different climatic conditions across the province of Québec, Canada. Output data from the Canadian Regional Climate Model (CRCM) was used to estimate PMP and Probable Maximum Snow Accumulation (PMSA) in future climate projections, which was then used to force the SWAT hydrological model to estimate PMF. PMP and PMF values were estimated for two time horizons each spanning 30 years: 1961-1990 (recent past) and 2041-2070 (future). PMP and PMF were separately analyzed for two seasons: summer-fall and spring. Results show that PMF in the watershed located in southern Québec would remain unchanged in the future horizon, but the trend for the watersheds located in the northeastern and northern areas of the province is an increase of up to 11%.
Risk assessment for tephra dispersal and sedimentation: the example of four Icelandic volcanoes
NASA Astrophysics Data System (ADS)
Biass, Sebastien; Scaini, Chiara; Bonadonna, Costanza; Smith, Kate; Folch, Arnau; Höskuldsson, Armann; Galderisi, Adriana
2014-05-01
In order to assist the elaboration of proactive measures for the management of future Icelandic volcanic eruptions, we developed a new approach to assess the impact associated with tephra dispersal and sedimentation at various scales and for multiple sources. Target volcanoes are Hekla, Katla, Eyjafjallajökull and Askja, selected for their high probabilities of eruption and/or their high potential impact. We combined stratigraphic studies, probabilistic strategies and numerical modelling to develop comprehensive eruption scenarios and compile hazard maps for local ground deposition and regional atmospheric concentration using both TEPHRA2 and FALL3D models. New algorithms for the identification of comprehensive probability density functions of eruptive source parameters were developed for both short and long-lasting activity scenarios. A vulnerability assessment of socioeconomic and territorial aspects was also performed at both national and continental scales. The identification of relevant vulnerability indicators allowed for the identification of the most critical areas and territorial nodes. At a national scale, the vulnerability of economic activities and the accessibility to critical infrastructures was assessed. At a continental scale, we assessed the vulnerability of the main airline routes and airports. Resulting impact and risk were finally assessed by combining hazard and vulnerability analysis.
Modeling climate change impacts on water trading.
Luo, Bin; Maqsood, Imran; Gong, Yazhen
2010-04-01
This paper presents a new method of evaluating the impacts of climate change on the long-term performance of water trading programs, through designing an indicator to measure the mean of periodic water volume that can be released by trading through a water-use system. The indicator is computed with a stochastic optimization model which can reflect the random uncertainty of water availability. The developed method was demonstrated in the Swift Current Creek watershed of Prairie Canada under two future scenarios simulated by a Canadian Regional Climate Model, in which total water availabilities under future scenarios were estimated using a monthly water balance model. Frequency analysis was performed to obtain the best probability distributions for both observed and simulated water quantity data. Results from the case study indicate that the performance of a trading system is highly scenario-dependent in future climate, with trading effectiveness highly optimistic or undesirable under different future scenarios. Trading effectiveness also largely depends on trading costs, with high costs resulting in failure of the trading program. (c) 2010 Elsevier B.V. All rights reserved.
Potential economic benefits of adapting agricultural production systems to future climate change
Fagre, Daniel B.; Pederson, Gregory; Bengtson, Lindsey E.; Prato, Tony; Qui, Zeyuan; Williams, Jimmie R.
2010-01-01
Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960–2005) and future climate period (2006–2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with locals producers; (2) simulating crop yields for two soils, crop prices, crop enterprises costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha averaged over all crop enterprises decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to future climate change is advantageous (i.e., NFI with adaptation is superior to NFI without adaptation based on SERF), in six of the nine cases in which adaptation is advantageous, NFI with adaptation in the future climate period is inferior to NFI in the historical climate period. Therefore, adaptation of APSs to future climate change in Flathead Valley is insufficient to offset the adverse impacts on NFI of such change.
Drought forecasting in Luanhe River basin involving climatic indices
NASA Astrophysics Data System (ADS)
Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.
2017-11-01
Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model that forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of a multivariate normal distribution so that two large-scale climatic indices can be incorporated at the same time, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI are based on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation, in general, was more likely to be considered normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. Then, the controlling climatic indices of every gauge are selected by the Pearson correlation test, and the multivariate normality of the SPI and corresponding climatic indices for the current month together with the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models which forecast transition probabilities from a current to a future SPI class. The results show that the three proposed models outperform the two traditional models and that incorporating large-scale climatic indices can improve forecasting accuracy.
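A minimal sketch of the conditional-distribution step described above: treating the future SPI, current SPI, and one climatic index as jointly normal, condition on the observed values and integrate the resulting normal density over SPI class boundaries. The mean vector, covariance matrix, observed values, and class boundaries are illustrative placeholders.

```python
import numpy as np
from scipy.stats import norm

# Sketch: transition probabilities to future SPI classes from the conditional
# distribution of a multivariate normal. All numerical values are placeholders.
mu = np.array([0.0, 0.0, 0.0])                 # [future SPI, current SPI, climate index]
Sigma = np.array([[1.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 1.0]])

obs = np.array([-1.2, 0.8])                    # observed current SPI and climate index
S11, S12, S22 = Sigma[0, 0], Sigma[0, 1:], Sigma[1:, 1:]
w = np.linalg.solve(S22, S12)                  # regression weights S22^{-1} S21
cond_mean = mu[0] + w @ (obs - mu[1:])         # E[future SPI | observations]
cond_sd = np.sqrt(S11 - S12 @ w)               # conditional standard deviation

classes = {"severe drought": (-np.inf, -1.5), "moderate drought": (-1.5, -0.5),
           "near normal": (-0.5, 0.5), "wet": (0.5, np.inf)}   # assumed class bounds
probs = {name: norm.cdf(b, cond_mean, cond_sd) - norm.cdf(a, cond_mean, cond_sd)
         for name, (a, b) in classes.items()}
print({k: round(float(v), 3) for k, v in probs.items()})
```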
Venable, J M; Ma, Q L; Ginter, P M; Duncan, W J
1993-01-01
Scenario analysis is a strategic planning technique used to describe and evaluate an organization's external environment. A methodology for conducting scenario analysis using the Jefferson County Department of Health and the national, State, and county issues confronting it is outlined. Key health care and organizational issues were identified using published sources, focus groups, questionnaires, and personal interviews. The most important of these issues were selected by asking health department managers to evaluate the issues according to their probability of occurrence and likely impact on the health department. The high-probability, high-impact issues formed the basis for developing scenario logics that constitute the story line holding the scenario together. The results were a set of plausible scenarios that aided in strategic planning, encouraged strategic thinking among managers, eliminated or reduced surprise about environmental changes, and improved managerial discussion and communication. PMID:8265754
Toward an Objectivistic Theory of Probability
1956-01-01
Recent research on the high-probability instructional sequence: A brief review.
Lipschultz, Joshua; Wilder, David A
2017-04-01
The high-probability (high-p) instructional sequence consists of the delivery of a series of high-probability instructions immediately before delivery of a low-probability or target instruction. It is commonly used to increase compliance in a variety of populations. Recent research has described variations of the high-p instructional sequence and examined the conditions under which the sequence is most effective. This manuscript reviews the most recent research on the sequence and identifies directions for future research. Recommendations for practitioners regarding the use of the high-p instructional sequence are also provided. © 2017 Society for the Experimental Analysis of Behavior.
2011 Souris River flood—Will it happen again?
Nustad, Rochelle A.; Kolars, Kelsey A.; Vecchia, Aldo V.; Ryberg, Karen R.
2016-09-29
The Souris River Basin is a 61,000 square kilometer basin in the provinces of Saskatchewan and Manitoba and the state of North Dakota. Record setting rains in May and June of 2011 led to record flooding with peak annual streamflow values (762 cubic meters per second [m3/s]) more than twice that of any previously recorded peak streamflow and more than five times the estimated 100 year postregulation streamflow (142 m3/s) at the U.S. Geological Survey (USGS) streamflow-gaging station above Minot, North Dakota. Upstream from Minot, N. Dak., the Souris River is regulated by three reservoirs in Saskatchewan (Rafferty, Boundary, and Alameda) and Lake Darling in North Dakota. During the 2011 flood, the city of Minot, N. Dak., experienced devastating damages with more than 4,000 homes flooded and 11,000 evacuated. As a result, the Souris River Basin Task Force recommended the U.S. Geological Survey (in cooperation with the North Dakota State Water Commission) develop a model for estimating the probabilities of future flooding and drought. The model that was developed took on four parts: (1) looking at past climate, (2) predicting future climate, (3) developing a streamflow model in response to certain climatic variables, and (4) combining future climate estimates with the streamflow model to predict future streamflow events. By taking into consideration historical climate record and trends in basin response to various climatic conditions, it was determined flood risk will remain high in the Souris River Basin until the wet climate state ends.
Domestic and world trends affecting the future of aviation (1980 - 2000), appendix C
NASA Technical Reports Server (NTRS)
1976-01-01
The results are presented of a study of variables affecting aviation in the United States during the last fifth of the twentieth century. A series of key trends relating to economic, social, political, technological, ecological, and environmental developments are identified and discussed with relation to their possible effects on aviation. From this analysis a series of scenarios is developed representing an array of possibilities ranging from severe economic depression and high international tension on the one hand to a world of detente which enjoys an unprecedented economic growth rate and relaxation of tensions on the other. A scenario is presented which represents the manner in which events will most probably develop and their effect on the aviation industry.
Artificial neural networks in gynaecological diseases: current and potential future applications.
Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios
2010-10-01
Current (and probably future) practice of medicine is mostly associated with prediction and accurate diagnosis. Especially in clinical practice, there is an increasing interest in constructing and using valid models of diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems being used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability and difference from other statistical techniques lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, which is the most commonly used method for developing predictive models for outcomes resulting from partitioning in medicine. In combination with the other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs, then, using what we consider the best available evidence through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through the analysis of their current application, we explore their dynamics for future use.
Probability judgments under ambiguity and conflict
Smithson, Michael
2015-01-01
Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081
Shirley, Matthew H.; Dorazio, Robert M.; Abassery, Ekramy; Elhady, Amr A.; Mekki, Mohammed S.; Asran, Hosni H.
2012-01-01
As part of the development of a management program for Nile crocodiles in Lake Nasser, Egypt, we used a dependent double-observer sampling protocol with multiple observers to compute estimates of population size. To analyze the data, we developed a hierarchical model that allowed us to assess variation in detection probabilities among observers and survey dates, as well as account for variation in crocodile abundance among sites and habitats. We conducted surveys from July 2008 to June 2009 in 15 areas of Lake Nasser that were representative of 3 main habitat categories. During these surveys, we sampled 1,086 km of lake shore wherein we detected 386 crocodiles. Analysis of the data revealed significant variability in both inter- and intra-observer detection probabilities. Our raw encounter rate was 0.355 crocodiles/km. When we accounted for observer effects and habitat, we estimated a surface population abundance of 2,581 (2,239-2,987, 95% credible intervals) crocodiles in Lake Nasser. Our results underscore the importance of well-trained, experienced monitoring personnel in order to decrease heterogeneity in intra-observer detection probability and to better detect changes in the population based on survey indices. This study will assist the Egyptian government in establishing a monitoring program as an integral part of future crocodile harvest activities in Lake Nasser.
Predicting potentially toxigenic Pseudo-nitzschia blooms in the Chesapeake Bay
Anderson, C.R.; Sapiano, M.R.P.; Prasad, M.B.K.; Long, W.; Tango, P.J.; Brown, C.W.; Murtugudde, R.
2010-01-01
Harmful algal blooms are now recognized as a significant threat to the Chesapeake Bay as they can severely compromise the economic viability of important recreational and commercial fisheries in the largest estuary of the United States. This study describes the development of empirical models for the potentially domoic acid-producing Pseudo-nitzschia species complex present in the Bay, developed from a 22-year time series of cell abundance and concurrent measurements of hydrographic and chemical properties. Using a logistic Generalized Linear Model (GLM) approach, model parameters and performance were compared over a range of Pseudo-nitzschia bloom thresholds relevant to toxin production by different species. Small-threshold blooms (≥10 cells mL-1) are explained by time of year, location, and variability in surface values of phosphate, temperature, nitrate plus nitrite, and freshwater discharge. Medium- (100 cells mL-1) to large-threshold (1000 cells mL-1) blooms are further explained by salinity, silicic acid, dissolved organic carbon, and light attenuation (Secchi) depth. These predictors are similar to other models for Pseudo-nitzschia blooms on the west coast, suggesting commonalities across ecosystems. Hindcasts of bloom probabilities at a 19% bloom prediction point yield a Heidke Skill Score of ~53%, a Probability of Detection of ~75%, a False Alarm Ratio of ~52%, and a Probability of False Detection of ~9%. The implication of possible future changes in Baywide nutrient stoichiometry on Pseudo-nitzschia blooms is discussed. © 2010 Elsevier B.V.
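As a rough illustration of the logistic GLM approach described in this abstract, the sketch below (Python, statsmodels) fits a bloom/no-bloom model against a handful of surface predictors; the file name, column names, and the threshold coding are assumptions for illustration, not the authors' actual data layout.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical monitoring table: one row per station visit, with cell counts
# and concurrent surface measurements (column names are assumed).
df = pd.read_csv("bay_monitoring.csv")
df["bloom"] = (df["cells_per_ml"] >= 10).astype(int)  # small-threshold bloom (>=10 cells/mL)

# Logistic GLM of bloom occurrence on the kinds of predictors listed in the abstract.
fit = smf.glm("bloom ~ C(month) + phosphate + temperature + nitrate_nitrite + discharge",
              data=df, family=sm.families.Binomial()).fit()
print(fit.summary())

# Hindcast bloom probabilities and apply a 19% prediction point.
df["p_bloom"] = fit.predict(df)
df["bloom_predicted"] = (df["p_bloom"] >= 0.19).astype(int)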
Interrelated structure of high altitude atmospheric profiles
NASA Technical Reports Server (NTRS)
Engler, N. A.; Goldschmidt, M. A.
1972-01-01
A preliminary development of a mathematical model to compute probabilities of thermodynamic profiles is presented. The model assumes an exponential expression for pressure and utilizes the hydrostatic law and equation of state in the determination of density and temperature. It is shown that each thermodynamic variable can be factored into the product of steady state and perturbation functions. The steady state functions have profiles similar to those of the 1962 standard atmosphere while the perturbation functions oscillate about 1. Limitations of the model and recommendations for future work are presented.
Cost/benefit analysis of advanced materials technologies for future aircraft turbine engines
NASA Technical Reports Server (NTRS)
Stephens, G. E.
1980-01-01
The materials technologies studied included thermal barrier coatings for turbine airfoils, turbine disks, cases, turbine vanes and engine and nacelle composite materials. The cost/benefit of each technology was determined in terms of Relative Value defined as change in return on investment times probability of success divided by development cost. A recommended final ranking of technologies was based primarily on consideration of Relative Values with secondary consideration given to changes in other economic parameters. Technologies showing the most promising cost/benefits were thermal barrier coatings and high-temperature nacelle/engine system composites.
Complex Faulting Across the Los Angeles Portion of the Pacific-North American Plate Boundary
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Granat, Robert; Glasscae, Maggi; Lyzenga, Greg; Grant Ludwig, Lisa; Rundle, John
2011-01-01
We propose to observe seismically and tectonically active regions in northern and southern California using UAVSAR to support EarthScope activities. We will test the earthquake forecasting methodology developed by Rundle through NASA's QuakeSim project by observing regions indicated as having high probability for earthquakes in the near future (5-10 years). The UAVSAR flights will serve as a baseline for pre-earthquake activity. Should an earthquake occur during the course of this project, we will also be able to observe postseismic motions associated with the earthquakes.
Zhang, Yequn; Djordjevic, Ivan B; Gao, Xin
2012-08-01
Inspired by recent demonstrations of orbital angular momentum-(OAM)-based single-photon communications, we propose two quantum-channel models: (i) the multidimensional quantum-key distribution model and (ii) the quantum teleportation model. Both models employ operator-sum representation for Kraus operators derived from OAM eigenket transition probabilities. These models are highly important for future development of quantum-error correction schemes to extend the transmission distance and improve data rates of OAM quantum communications. By using these models, we calculate corresponding quantum-channel capacities in the presence of atmospheric turbulence.
A prospective approach to coastal geography from satellite. [technological forecasting
NASA Technical Reports Server (NTRS)
Munday, J. C., Jr.
1981-01-01
A forecasting protocol termed the "prospective approach" was used to examine probable futures relative to coastal applications of satellite data. Significant variables include the energy situation, the national economy, national Earth satellite programs, and coastal zone research, commercial activity, and regulatory activity. Alternative scenarios for the period until 1986 are presented. Possible responses by state/local remote sensing centers include operational applications for users, input to geo-base information systems (GIS), development of decision-making algorithms using GIS data, and long term research programs for coastal management using merged satellite and traditional data.
System Risk Balancing Profiles: Software Component
NASA Technical Reports Server (NTRS)
Kelly, John C.; Sigal, Burton C.; Gindorf, Tom
2000-01-01
The Software QA / V&V guide will be reviewed and updated based on feedback from NASA organizations and others with a vested interest in this area. Hardware, EEE Parts, Reliability, and Systems Safety are a sample of the future guides that will be developed. Cost Estimates, Lessons Learned, Probability of Failure and PACTS (Prevention, Avoidance, Control or Test) are needed to provide a more complete risk management strategy. This approach to risk management is designed to help balance the resources and program content for risk reduction for NASA's changing environment.
Stenehjem, Jo S; Friesen, Melissa C; Eggen, Tone; Kjærheim, Kristina; Bråtveit, Magne; Grimsrud, Tom K
2016-01-01
The objective of this study was to examine self-reported frequency of occupational exposure reported by 28,000 Norwegian offshore oil workers in a 1998 survey. Predictors of self-reported exposure frequency were identified to aid future refinements of an expert-based job-exposure-time matrix (JEM). We focus here on reported frequencies for skin contact with oil and diesel, exposure to oil vapor from shaker, to exhaust fumes, vapor from mixing chemicals used for drilling, natural gas, chemicals used for water injection and processing, and to solvent vapor. Exposure frequency was reported by participants as the exposed proportion of the work shift, defined by six categories, in their current or last position offshore (between 1965 and 1999). Binary Poisson regression models with robust variance were used to examine the probabilities of reporting frequent exposure (≥¼ vs. <¼ of work shift) according to main activity, time period, supervisory position, type of company, type of installation, work schedule, and education. Holding a non-supervisory position, working shifts, being employed in the early period of the offshore industry, and having only compulsory education increased the probability of reporting frequent exposure. The identified predictors and group-level patterns may aid future refinement of the JEM previously developed for the present cohort. PMID:25671393
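A minimal sketch of the modelling step described above, assuming a survey table with one row per worker and hypothetical column names; the binary outcome (exposed ≥¼ of the work shift) is fit with a Poisson GLM and robust standard errors so that exponentiated coefficients can be read as probability (prevalence) ratios.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical columns: frequent (1 if exposed >= 1/4 of shift, else 0) plus the predictors below.
df = pd.read_csv("offshore_survey_1998.csv")

fit = smf.glm("frequent ~ C(activity) + C(period) + supervisor + C(company_type) "
              "+ C(installation_type) + shift_work + C(education)",
              data=df, family=sm.families.Poisson()).fit(cov_type="HC1")  # robust variance

print(np.exp(fit.params))  # probability ratios of reporting frequent exposure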
Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.
Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak
2018-02-01
The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
Space Link Extension (SLE) Emulation for High-Throughput Network Communication
NASA Technical Reports Server (NTRS)
Murawski, Robert W.; Tchorowski, Nicole; Golden, Bert
2014-01-01
As the data rate requirements for space communications increases, significant stress is placed not only on the wireless satellite communication links, but also on the ground networks which forward data from end-users to remote ground stations. These wide area network (WAN) connections add delay and jitter to the end-to-end satellite communication link, effects which can have significant impacts on the wireless communication link. It is imperative that any ground communication protocol can react to these effects such that the ground network does not become a bottleneck in the communication path to the satellite. In this paper, we present our SCENIC Emulation Lab testbed which was developed to test the CCSDS SLE protocol implementations proposed for use on future NASA communication networks. Our results show that in the presence of realistic levels of network delay, high-throughput SLE communication links can experience significant data rate throttling. Based on our observations, we present some insight into why this data throttling happens, and trace the probable issue back to non-optimal blocking communication which is sup-ported by the CCSDS SLE API recommended practices. These issues were presented as well to the SLE implementation developers which, based on our reports, developed a new release for SLE which we show fixes the SLE blocking issue and greatly improves the protocol throughput. In this paper, we also discuss future developments for our end-to-end emulation lab and how these improvements can be used to develop and test future space communication technologies.
A Deterministic Approach to Active Debris Removal Target Selection
NASA Astrophysics Data System (ADS)
Lidtke, A.; Lewis, H.; Armellin, R.
2014-09-01
Many decisions, with widespread economic, political and legal consequences, are being considered based on space debris simulations that show that Active Debris Removal (ADR) may be necessary as the concerns about the sustainability of spaceflight are increasing. The debris environment predictions are based on low-accuracy ephemerides and propagators. This raises doubts about the accuracy of those prognoses themselves but also about the potential ADR target-lists that are produced. Target selection is considered highly important as removal of many objects will increase the overall mission cost. Selecting the most-likely candidates as soon as possible would be desirable as it would enable accurate mission design and allow thorough evaluation of in-orbit validations, which are likely to occur in the near future, before any large investments are made and implementations realized. One of the primary factors that should be used in ADR target selection is the accumulated collision probability of every object. A conjunction detection algorithm, based on the smart sieve method, has been developed. Another algorithm is then applied to the found conjunctions to compute the maximum and true probabilities of collisions taking place. The entire framework has been verified against the Conjunction Analysis Tools in AGI's Systems Toolkit and relative probability error smaller than 1.5% has been achieved in the final maximum collision probability. Two target-lists are produced based on the ranking of the objects according to the probability they will take part in any collision over the simulated time window. These probabilities are computed using the maximum probability approach, which is time-invariant, and estimates of the true collision probability that were computed with covariance information. The top-priority targets are compared, and the impacts of the data accuracy and its decay are highlighted. General conclusions regarding the importance of Space Surveillance and Tracking for the purpose of ADR are also drawn and a deterministic method for ADR target selection, which could reduce the number of ADR missions to be performed, is proposed.
A Transient Initialization Routine of the Community Ice Sheet Model for the Greenland Ice Sheet
NASA Astrophysics Data System (ADS)
van der Laan, Larissa; van den Broeke, Michiel; Noël, Brice; van de Wal, Roderik
2017-04-01
The Community Ice Sheet Model (CISM) is to be applied in future simulations of the Greenland Ice Sheet under a range of climate change scenarios, determining the sensitivity of the ice sheet to individual climatic forcings. In order to achieve reliable results regarding ice sheet stability and assess the probability of future occurrence of tipping points, a realistic initial ice sheet geometry is essential. The current work describes and evaluates the development of a transient initialization routine, using NGRIP 18O isotope data to create a temperature anomaly field. Based on the latter, surface mass balance components runoff and precipitation are perturbed for the past 125k years. The precipitation and runoff fields originate from a downscaled 1 km resolution version of the regional climate model RACMO2.3 for the period 1961-1990. The result of the initialization routine is a present-day ice sheet with a transient memory of the last glacial-interglacial cycle, which will serve as the future runs' initial condition.
Oil and gas development footprint in the Piceance Basin, western Colorado
Martinez, Cericia D.; Preston, Todd M.
2018-01-01
Understanding long-term implications of energy development on ecosystem function requires establishing regional datasets to quantify past development and determine relationships to predict future development. The Piceance Basin in western Colorado has a history of energy production, and development is expected to continue into the foreseeable future due to abundant natural gas resources. To facilitate analyses of regional energy development we digitized all well pads in the Colorado portion of the basin, determined the previous land cover of areas converted to well pads over three time periods (2002–2006, 2007–2011, and 2012–2016), and explored the relationship between number of wells per pad and pad area to model future development. We also calculated the area of pads constructed prior to 2002. Over 21 million m2 has been converted to well pads with approximately 13 million m2 converted since 2002. The largest land conversion since 2002 occurred in shrub/scrub (7.9 million m2), evergreen (2.1 million m2), and deciduous (1.3 million m2) forest environments based on National Land Cover Database classifications. Operational practices have transitioned from single well pads to multi-well pads, increasing the average number of wells per pad from 2.5 prior to 2002, to 9.1 between 2012 and 2016. During the same time period the pad area per well has increased from 2030 m2 to 3504 m2. Kernel density estimation was used to model the relationship between the number of wells per pad and pad area, with these curves exhibiting a lognormal distribution. Therefore, either kernel density estimation or lognormal probability distributions may potentially be used to model land use requirements for future development. Digitized well pad locations in the Piceance Basin contribute to a growing body of spatial data on energy infrastructure and, coupled with study results, will facilitate future regional and national studies assessing the spatial and temporal effects of energy development on ecosystem function.
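As a sketch of the modelling idea in the abstract's closing sentences, the snippet below fits a lognormal distribution to (hypothetical) pad-area-per-well values and draws from it to project the footprint of future pads; the file name and the number of future wells are placeholders.

import numpy as np
from scipy import stats

# Hypothetical input: pad area per well (m^2) for pads digitized in 2012-2016.
area_per_well = np.loadtxt("pad_area_per_well_2012_2016.txt")

# Lognormal fit with the location fixed at zero, since areas are strictly positive.
shape, loc, scale = stats.lognorm.fit(area_per_well, floc=0)

# Project land demand for, say, 5,000 hypothetical future wells.
rng = np.random.default_rng(7)
simulated = stats.lognorm.rvs(shape, loc=loc, scale=scale, size=5000, random_state=rng)
print("projected new pad area: %.1f km^2" % (simulated.sum() / 1e6))

# A Gaussian kernel density estimate is the nonparametric alternative mentioned in the text.
kde = stats.gaussian_kde(area_per_well)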
Adult Services in the Third Millennium.
ERIC Educational Resources Information Center
Monroe, Margaret E.
1979-01-01
Presents a four-step model for "planning" or "forecasting" future of adult services in public libraries: (1) identification of forces at work; (2) analysis of probable impacts of one force upon another; (3) identification of preferred (and rejected) elements of future with forces that control elements; and (4) strategies to be…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... for ABC is the projected yield stream with a 70 percent probability of rebuilding success. The Council... to have an 81 percent chance of rebuilding in 10 years, greater than the 70 percent probability... AM applications. Should this ACT be used in the future to trigger AMs, then it may be expected to...
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Vaughan, William W.
1998-01-01
A summary is presented of basic lightning characteristics/criteria for current and future NASA aerospace vehicles. The paper estimates the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reenter/land, and return-to-launch site. A literature search was conducted for previous work concerning occurrence and measurement of peak lightning currents, modeling, and estimating probabilities of launch vehicles/objects being struck by lightning. This paper presents these results.
Pre-seismic anomalies from optical satellite observations: a review
NASA Astrophysics Data System (ADS)
Jiao, Zhong-Hu; Zhao, Jing; Shan, Xinjian
2018-04-01
Detecting various anomalies using optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activities because of its recognition of thermal-radiation-related phenomena in seismic preparation phases. Data from satellite observations serve as a powerful tool in monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many new different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology in this decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.
NASA Astrophysics Data System (ADS)
Pfeiffer, Franz
2018-01-01
X-ray ptychographic microscopy combines the advantages of raster scanning X-ray microscopy with the more recently developed techniques of coherent diffraction imaging. It is limited neither by the fabricational challenges associated with X-ray optics nor by the requirements of isolated specimen preparation, and offers in principle wavelength-limited resolution, as well as stable access and solution to the phase problem. In this Review, we discuss the basic principles of X-ray ptychography and summarize the main milestones in the evolution of X-ray ptychographic microscopy and tomography over the past ten years, since its first demonstration with X-rays. We also highlight the potential for applications in the life and materials sciences, and discuss the latest advanced concepts and probable future developments.
Collision frequency of artificial satellites - The creation of a debris belt
NASA Technical Reports Server (NTRS)
Kessler, D. J.; Cour-Palais, B. G.
1978-01-01
The probability of satellite collisions increases with the number of satellites. In the present paper, possible time scales for the growth of a debris belt from collision fragments are determined, and possible consequences of continued unrestrained launch activities are examined. Use is made of techniques formerly developed for studying the evolution (growth) of the asteroid belt. A model describing the flux from the known earth-orbiting satellites is developed, and the results from this model are extrapolated in time to predict the collision frequency between satellites. Hypervelocity impact phenomena are then examined to predict the debris flux resulting from collisions. The results are applied to design requirements for three types of future space missions.
NASA Astrophysics Data System (ADS)
Moretti, Michael
1992-05-01
I am very optimistic about the business potential of medical lasers in general. On the other hand, it's very easy to point out and criticize some severe failures that there have been. So I think that we will be able to do better in the future in terms of predicting where the lowest risk business opportunities are and where we should invest our time, energy, and money for business development. And I think that if I can read my audience correctly, business development is probably important here. Some of you may have a good, strong business in the medical area but I think you would all welcome new opportunities. That is really the focus of our talk today: new opportunities.
[Molecular techniques in mycology].
Rodríguez-Tudela, Juan Luis; Cuesta, Isabel; Gómez-López, Alicia; Alastruey-Izquierdo, Ana; Bernal-Martínez, Leticia; Cuenca-Estrella, Manuel
2008-11-01
An increasing number of molecular techniques for the diagnosis of fungal infections have been developed in the last few years, due to the growing prevalence of mycoses and the length of time required for diagnosis when classical microbiological methods are used. These methods are designed to resolve the following aspects of mycological diagnosis: a) Identification of fungi to species level by means of sequencing relevant taxonomic targets; b) early clinical diagnosis of invasive fungal infections; c) detection of molecular mechanisms of resistance to antifungal agents; and d) molecular typing of fungi. Currently, these methods are restricted to highly developed laboratories. However, some of these techniques will probably be available in daily clinical practice in the near future.
Predicting the spatial extent of liquefaction from geospatial and earthquake specific parameters
Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.
2014-01-01
The spatially extensive damage from the 2010-2011 Christchurch, New Zealand earthquake events is a reminder of the need for liquefaction hazard maps for anticipating damage from future earthquakes. Liquefaction hazard mapping has traditionally relied on detailed geologic mapping and expensive site studies. These traditional techniques are difficult to apply globally for rapid response or loss estimation. We have developed a logistic regression model to predict the probability of liquefaction occurrence in coastal sedimentary areas as a function of simple and globally available geospatial features (e.g., derived from digital elevation models) and standard earthquake-specific intensity data (e.g., peak ground acceleration). Some of the geospatial explanatory variables that we consider are taken from the hydrology community, which has a long tradition of using remotely sensed data as proxies for subsurface parameters. As a result of using high resolution, remotely-sensed, and spatially continuous data as a proxy for important subsurface parameters such as soil density and soil saturation, and by using a probabilistic modeling framework, our liquefaction model inherently includes the natural spatial variability of liquefaction occurrence and provides an estimate of spatial extent of liquefaction for a given earthquake. To provide a quantitative check on how the predicted probabilities relate to spatial extent of liquefaction, we report the frequency of observed liquefaction features within a range of predicted probabilities. The percentage of liquefaction is the areal extent of observed liquefaction within a given probability contour. The regional model and the results show that there is a strong relationship between the predicted probability and the observed percentage of liquefaction. Visual inspection of the probability contours for each event also indicates that the pattern of liquefaction is well represented by the model.
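A bare-bones version of such a liquefaction-occurrence model, assuming a table of mapped liquefaction observations joined to geospatial proxies and shaking intensity (the column names below are stand-ins, not the authors' variables):

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical columns: liquefied (0/1), pga_g (peak ground acceleration, g),
# cti (compound topographic index), vs30 (m/s), dist_water_km.
df = pd.read_csv("liquefaction_inventory.csv")

fit = smf.glm("liquefied ~ np.log(pga_g) + cti + np.log(vs30) + dist_water_km",
              data=df, family=sm.families.Binomial()).fit()

# Predicted probability per observation, then the observed percentage of liquefaction
# within predicted-probability bins as a check on spatial extent.
df["p_liq"] = fit.predict(df)
df["p_bin"] = pd.cut(df["p_liq"], np.linspace(0, 1, 11))
print(df.groupby("p_bin", observed=True)["liquefied"].mean())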
NASA Technical Reports Server (NTRS)
1990-01-01
The current state is reviewed of the study of chemical evolution and planetary biology and the probable future is discussed of the field, at least for the near term. To this end, the report lists the goals and objectives of future research and makes detailed, comprehensive recommendations for accomplishing them, emphasizing those issues that were inadequately discussed in earlier Space Studies Board reports.
Transition probabilities of health states for workers in Malaysia using a Markov chain model
NASA Astrophysics Data System (ADS)
Samsuddin, Shamshimah; Ismail, Noriszura
2017-04-01
The aim of our study is to estimate the transition probabilities of health states for workers in Malaysia who contribute to the Employment Injury Scheme under the Social Security Organization Malaysia using the Markov chain model. Our study uses four states of health (active, temporary disability, permanent disability and death) based on the data collected from the longitudinal studies of workers in Malaysia for 5 years. The transition probabilities vary by health state, age and gender. The results show that male employees are more likely to have higher transition probabilities to any health state than female employees. The transition probabilities can be used to predict the future health of workers as a function of current age, gender and health state.
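To make the estimation step concrete, here is a minimal sketch (made-up state coding and file name) of the maximum-likelihood transition matrix computed from pooled year-to-year state pairs, and its use to project a currently active worker's state distribution.

import numpy as np

# States: 0 = active, 1 = temporary disability, 2 = permanent disability, 3 = death.
# Hypothetical input: one (state_now, state_next_year) pair per worker-year.
pairs = np.loadtxt("worker_state_pairs.txt", dtype=int)

counts = np.zeros((4, 4))
for s_from, s_to in pairs:
    counts[s_from, s_to] += 1

# Row-normalised counts are the MLE of the one-step transition probabilities.
row_totals = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_totals, out=np.zeros_like(counts), where=row_totals > 0)
P[3, 3] = 1.0  # death is absorbing

# Projected state probabilities in 5 years for a worker who is active today.
start = np.array([1.0, 0.0, 0.0, 0.0])
print(start @ np.linalg.matrix_power(P, 5))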
Modeling nonbreeding distributions of shorebirds and waterfowl in response to climate change
Reese, Gordon; Skagen, Susan K.
2017-01-01
To identify areas on the landscape that may contribute to a robust network of conservation areas, we modeled the probabilities of occurrence of several en route migratory shorebirds and wintering waterfowl in the southern Great Plains of North America, including responses to changing climate. We predominantly used data from the eBird citizen-science project to model probabilities of occurrence relative to land-use patterns, spatial distribution of wetlands, and climate. We projected models to potential future climate conditions using five representative general circulation models of the Coupled Model Intercomparison Project 5 (CMIP5). We used Random Forests to model probabilities of occurrence and compared the time periods 1981–2010 (hindcast) and 2041–2070 (forecast) in “model space.” Projected changes in shorebird probabilities of occurrence varied with species-specific general distribution pattern, migration distance, and spatial extent. Species using the western and northern portion of the study area exhibited the greatest likelihoods of decline, whereas species with more easterly occurrences, mostly long-distance migrants, had the greatest projected increases in probability of occurrence. At an ecoregional extent, differences in probabilities of shorebird occurrence ranged from −0.015 to 0.045 when averaged across climate models, with the largest increases occurring early in migration. Spatial shifts are predicted for several shorebird species. Probabilities of occurrence of wintering Mallards and Northern Pintail are predicted to increase by 0.046 and 0.061, respectively, with northward shifts projected for both species. When incorporated into partner land management decision tools, results at ecoregional extents can be used to identify wetland complexes with the greatest potential to support birds in the nonbreeding season under a wide range of future climate scenarios.
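A compact sketch of the Random Forest step described above, with assumed file and column names standing in for the eBird-derived training data and for the hindcast/forecast climate covariates.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical columns: detected (0/1) plus land-use, wetland, and climate covariates.
train = pd.read_csv("ebird_checklists.csv")
covars = ["cropland_pct", "wetland_density", "ppt_mm", "tmax_c", "tmin_c"]

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(train[covars], train["detected"])

# Same sites with 1981-2010 versus 2041-2070 climate from one CMIP5 model.
hindcast = pd.read_csv("sites_1981_2010.csv")
forecast = pd.read_csv("sites_2041_2070.csv")

delta = (rf.predict_proba(forecast[covars])[:, 1]
         - rf.predict_proba(hindcast[covars])[:, 1])
print("mean change in probability of occurrence:", delta.mean())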
Probability of Future Observations Exceeding One-Sided, Normal, Upper Tolerance Limits
Edwards, Timothy S.
2014-10-29
Normal tolerance limits are frequently used in dynamic environments specifications of aerospace systems as a method to account for aleatory variability in the environments. Upper tolerance limits, when used in this way, are computed from records of the environment and used to enforce conservatism in the specification by describing upper extreme values the environment may take in the future. Components and systems are designed to withstand these extreme loads to ensure they do not fail under normal use conditions. The degree of conservatism in the upper tolerance limits is controlled by specifying the coverage and confidence level (usually written in "coverage/confidence" form). Moreover, in high-consequence systems it is common to specify tolerance limits at 95% or 99% coverage and confidence at the 50% or 90% level. Despite the ubiquity of upper tolerance limits in the aerospace community, analysts and decision-makers frequently misinterpret their meaning. The misinterpretation extends into the standards that govern much of the acceptance and qualification of commercial and government aerospace systems. As a result, the risk of a future observation of the environment exceeding the upper tolerance limit is sometimes significantly underestimated by decision makers. This note explains the meaning of upper tolerance limits and a related measure, the upper prediction limit. The objective of this work is to clarify the probability of exceeding these limits in flight so that decision-makers can better understand the risk associated with exceeding design and test levels during flight and balance the cost of design and development with that of mission failure.
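The point of the note can be reproduced numerically. The sketch below (an illustration, not the author's code) computes the one-sided normal tolerance factor k from the noncentral t distribution and then estimates by Monte Carlo the chance that one future observation exceeds a 95/50 upper tolerance limit built from n = 10 records; the result comes out noticeably above the 5% a naive reading of "95% coverage" might suggest.

import numpy as np
from scipy import stats

n, coverage, confidence = 10, 0.95, 0.50  # a 95/50 one-sided upper tolerance limit
# One-sided normal tolerance factor via the noncentral t distribution.
k = stats.nct.ppf(confidence, df=n - 1, nc=stats.norm.ppf(coverage) * np.sqrt(n)) / np.sqrt(n)

rng = np.random.default_rng(1)
trials, exceed = 200_000, 0
for _ in range(trials):
    sample = rng.standard_normal(n)                 # n measured environment records
    utl = sample.mean() + k * sample.std(ddof=1)    # upper tolerance limit from the sample
    exceed += rng.standard_normal() > utl           # one future (flight) observation
print("P(future observation exceeds the UTL) ~", exceed / trials)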
Brown, Michelle L.; Donovan, Therese; Schwenk, W. Scott; Theobald, David M.
2014-01-01
Forest loss and fragmentation are among the largest threats to forest-dwelling wildlife species today, and projected increases in human population growth are expected to increase these threats in the next century. We combined spatially-explicit growth models with wildlife distribution models to predict the effects of human development on 5 forest-dependent bird species in Vermont, New Hampshire, and Massachusetts, USA. We used single-species occupancy models to derive the probability of occupancy for each species across the study area in the years 2000 and 2050. Over half a million new housing units were predicted to be added to the landscape. The maximum change in housing density was nearly 30 houses per hectare; however, 30% of the towns in the study area were projected to add less than 1 housing unit per hectare. In the face of predicted human growth, the overall occupancy of each species decreased by as much as 38% (ranging from 19% to 38% declines in the worst-case scenario) in the year 2050. These declines were greater outside of protected areas than within protected lands. Ninety-seven percent of towns experienced some decline in species occupancy within their borders, highlighting the value of spatially-explicit models. The mean decrease in occupancy probability within towns ranged from 3% for hairy woodpecker to 8% for ovenbird and hermit thrush. Reductions in occupancy probability occurred on the perimeters of cities and towns where exurban development is predicted to increase in the study area. This spatial approach to wildlife planning provides data to evaluate trade-offs between development scenarios and forest-dependent wildlife species.
Land Use Planning and Wildfire: Development Policies Influence Future Probability of Housing Loss
Syphard, Alexandra D.; Bar Massada, Avi; Butsic, Van; Keeley, Jon E.
2013-01-01
Increasing numbers of homes are being destroyed by wildfire in the wildland-urban interface. With projections of climate change and housing growth potentially exacerbating the threat of wildfire to homes and property, effective fire-risk reduction alternatives are needed as part of a comprehensive fire management plan. Land use planning represents a shift in traditional thinking from trying to eliminate wildfires, or even increasing resilience to them, toward avoiding exposure to them through the informed placement of new residential structures. For land use planning to be effective, it needs to be based on solid understanding of where and how to locate and arrange new homes. We simulated three scenarios of future residential development and projected landscape-level wildfire risk to residential structures in a rapidly urbanizing, fire-prone region in southern California. We based all future development on an econometric subdivision model, but we varied the emphasis of subdivision decision-making based on three broad and common growth types: infill, expansion, and leapfrog. Simulation results showed that decision-making based on these growth types, when applied locally for subdivision of individual parcels, produced substantial landscape-level differences in pattern, location, and extent of development. These differences in development, in turn, affected the area and proportion of structures at risk from burning in wildfires. Scenarios with lower housing density and larger numbers of small, isolated clusters of development, i.e., resulting from leapfrog development, were generally predicted to have the highest predicted fire risk to the largest proportion of structures in the study area, and infill development was predicted to have the lowest risk. These results suggest that land use planning should be considered an important component to fire risk management and that consistently applied policies based on residential pattern may provide substantial benefits for future risk reduction. PMID:23977120
Buotte, Polly C; Hicke, Jeffrey A; Preisler, Haiganoush K; Abatzoglou, John T; Raffa, Kenneth F; Logan, Jesse A
2016-12-01
Extensive mortality of whitebark pine, beginning in the early to mid-2000s, occurred in the Greater Yellowstone Ecosystem (GYE) of the western USA, primarily from mountain pine beetle but also from other threats such as white pine blister rust. The climatic drivers of this recent mortality and the potential for future whitebark pine mortality from mountain pine beetle are not well understood, yet are important considerations in whether to list whitebark pine as a threatened or endangered species. We sought to increase the understanding of climate influences on mountain pine beetle outbreaks in whitebark pine forests, which are less well understood than in lodgepole pine, by quantifying climate-beetle relationships, analyzing climate influences during the recent outbreak, and estimating the suitability of future climate for beetle outbreaks. We developed a statistical model of the probability of whitebark pine mortality in the GYE that included temperature effects on beetle development and survival, precipitation effects on host tree condition, beetle population size, and stand characteristics. Estimated probability of whitebark pine mortality increased with higher winter minimum temperature, indicating greater beetle winter survival; higher fall temperature, indicating synchronous beetle emergence; lower two-year summer precipitation, indicating increased potential for host tree stress; increasing beetle populations; stand age; and increasing percent composition of whitebark pine within a stand. The recent outbreak occurred during a period of higher-than-normal regional winter temperatures, suitable fall temperatures, and low summer precipitation. In contrast to lodgepole pine systems, area with mortality was linked to precipitation variability even at high beetle populations. Projections from climate models indicate future climate conditions will likely provide favorable conditions for beetle outbreaks within nearly all current whitebark pine habitat in the GYE by the middle of this century. Therefore, when surviving and regenerating trees reach ages suitable for beetle attack, there is strong potential for continued whitebark pine mortality due to mountain pine beetle. © 2016 by the Ecological Society of America.
A changing climate: impacts on human exposures to O3 using ...
Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposures due to these impacts was developed by linking climate, air quality, land-use, and human exposure models. This methodology was then applied to characterize changes in predicted human exposures to O3 under multiple future scenarios. Regional climate projections for the U.S. were developed by downscaling global circulation model (GCM) scenarios for three of the Intergovernmental Panel on Climate Change’s (IPCC’s) Representative Concentration Pathways (RCPs) using the Weather Research and Forecasting (WRF) model. The regional climate results were in turn used to generate air quality (concentration) projections using the Community Multiscale Air Quality (CMAQ) model. For each of the climate change scenarios, future U.S. census-tract level population distributions from the Integrated Climate and Land Use Scenarios (ICLUS) model for four future scenarios based on the IPCC’s Special Report on Emissions Scenarios (SRES) storylines were used. These climate, air quality, and population projections were used as inputs to EPA’s Air Pollutants Exposure (APEX) model for 12 U.S. cities. Probability density functions show changes in the population distribution of 8 h maximum daily O3 exposures.
van der Put, Claudia E; Stams, Geert Jan J M
2013-12-01
In the juvenile justice system, much attention is paid to estimating the risk for recidivism among juvenile offenders. However, it is also important to estimate the risk for problematic child-rearing situations (care needs) in juvenile offenders, because these problems are not always related to recidivism. In the present study, an actuarial care needs assessment tool for juvenile offenders, the Youth Offender Care Needs Assessment Tool (YO-CNAT), was developed to predict the probability of (a) a future supervision order imposed by the child welfare agency, (b) a future entitlement to care indicated by the youth care agency, and (c) future incidents involving child abuse, domestic violence, and/or sexual norm trespassing behavior at the juvenile's address. The YO-CNAT has been developed for use by the police and is based solely on information available in police registration systems. It is designed to assist a police officer without clinical expertise in making a quick assessment of the risk for problematic child-rearing situations. The YO-CNAT was developed on a sample of 1,955 juvenile offenders and was validated on another sample of 2,045 juvenile offenders. The predictive validity (area under the receiver-operating-characteristic curve) scores ranged between .70 (for predicting future entitlement to care) and .75 (for predicting future worrisome incidents at the juvenile's address); therefore, the predictive accuracy of the test scores of the YO-CNAT was sufficient to justify its use as a screening instrument for the police in deciding to refer a juvenile offender to the youth care agency for further assessment into care needs.
Only the Carrot, Not the Stick: Incorporating Trust into the Enforcement of Regulation
Mendoza, Juan P.; Wielhouwer, Jacco L.
2015-01-01
New enforcement strategies allow agents to gain the regulator’s trust and consequently face a lower audit probability. Prior research suggests that, in order to prevent lower compliance, a reduction in the audit probability (the “carrot”) must be compensated with the introduction of a higher penalty for non-compliance (the “stick”). However, such carrot-and-stick strategies reflect neither the concept of trust nor the strategies observed in practice. In response to this, we define trust-based regulation as a strategy that incorporates rules that allow trust to develop, and using a generic (non-cooperative) game of tax compliance, we examine whether trust-based regulation is feasible (i.e., whether, in equilibrium, a reduction in the audit probability, without ever increasing the penalty for non-compliance, does not lead to reduced compliance). The model shows that trust-based regulation is feasible when the agent sufficiently values the future. In line with the concept of trust, this strategy is feasible when the regulator is uncertain about the agent’s intentions. Moreover, the model shows that (i) introducing higher penalties makes trust-based regulation less feasible, and (ii) combining trust and forgiveness can lead to a lower audit probability for both trusted and distrusted agents. Policy recommendations often point toward increasing deterrence. This model shows that the opposite can be optimal. PMID:25705898
Assessing risk factors in the organic control system: evidence from inspection data in Italy.
Zanoli, Raffaele; Gambelli, Danilo; Solfanelli, Francesco
2014-12-01
Certification is an essential feature in organic farming, and it is based on inspections to verify compliance with respect to European Council Regulation-EC Reg. No 834/2007. A risk-based approach to noncompliance that alerts the control bodies when planning inspections would contribute to a more efficient and cost-effective certification system. An analysis of factors that can affect the probability of noncompliance in organic farming has thus been developed. This article examines the application of zero-inflated count data models to farm-level panel data from inspection results and sanctions obtained from the Ethical and Environmental Certification Institute, one of the main control bodies in Italy. We tested many a priori hypotheses related to the risk of noncompliance. We find evidence of an important role for past noncompliant behavior in predicting future noncompliance, while farm size and the occurrence of livestock also have roles in an increased probability of noncompliance. We conclude the article by proposing that an efficient risk-based inspection system should be designed, weighting up the known probability of occurrence of a given noncompliance according to the severity of its impact. © 2014 Society for Risk Analysis.
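A minimal sketch of the zero-inflated count model described above, with invented column names for a farm-by-year inspection panel (past noncompliance, farm size, livestock) rather than the certification body's actual variables.

import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# Hypothetical columns: n_noncompliances (count per farm-year), past_noncompliance (0/1),
# farm_size_ha, has_livestock (0/1).
df = pd.read_csv("inspection_panel.csv")

X = sm.add_constant(df[["past_noncompliance", "farm_size_ha", "has_livestock"]])
X_infl = sm.add_constant(df[["farm_size_ha"]])  # covariates for the excess-zeros part

fit = ZeroInflatedPoisson(df["n_noncompliances"], X, exog_infl=X_infl, inflation="logit").fit()
print(fit.summary())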
[OMICS AND BIG DATA, MAJOR ADVANCES TOWARDS PERSONALIZED MEDICINE OF THE FUTURE?].
Scheen, A J
2015-01-01
The increasing interest in personalized medicine evolves together with two major technological advances. First, new-generation, rapid, and less expensive DNA sequencing methods, combined with remarkable progress in molecular biology, have led to the post-genomic era (transcriptomics, proteomics, metabolomics). Second, the refinement of computing tools (IT) allows the immediate analysis of a huge amount of data (especially those resulting from the omics approaches) and thus creates a new universe for medical research, that of analysis by computerized modelling. This article for scientific communication and popularization briefly describes the main advances in these two fields of interest. These technological advances are combined with those occurring in communication, which makes possible the development of artificial intelligence. These major advances will most probably represent the grounds of the future personalized medicine.
Using the FORE-SCE model to project land-cover change in the southeastern United States
Sohl, Terry; Sayler, Kristi L.
2008-01-01
A wide variety of ecological applications require spatially explicit current and projected land-use and land-cover data. The southeastern United States has experienced massive land-use change since European settlement and continues to experience extremely high rates of forest cutting, significant urban development, and changes in agricultural land use. Forest-cover patterns and structure are projected to change dramatically in the southeastern United States in the next 50 years due to population growth and demand for wood products [Wear, D.N., Greis, J.G. (Eds.), 2002. Southern Forest Resource Assessment. General Technical Report SRS-53. U.S. Department of Agriculture, Forest Service, Southern Research Station, Asheville, NC, 635 pp]. Along with our climate partners, we are examining the potential effects of southeastern U.S. land-cover change on regional climate. The U.S. Geological Survey (USGS) Land Cover Trends project is analyzing contemporary (1973-2000) land-cover change in the conterminous United States, providing ecoregion-by-ecoregion estimates of the rates of change, descriptive transition matrices, and changes in landscape metrics. The FORecasting SCEnarios of future land-cover (FORE-SCE) model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land-cover change through 2050 for the southeastern United States. Prescriptions for future proportions of land cover for this application were provided by ecoregion-based extrapolations of historical change. Logistic regression was used to develop relationships between suspected drivers of land-cover change and land cover, resulting in the development of probability-of-occurrence surfaces for each unique land-cover type. Forest stand age was initially established with Forest Inventory and Analysis (FIA) data and tracked through model iterations. The spatial allocation procedure placed patches of new land cover on the landscape until the scenario prescriptions were met, using measured Land Cover Trends data to guide patch characteristics and the probability surfaces to guide placement. The approach provides an efficient method for extrapolating historical land-cover trends and is amenable to the incorporation of more detailed and focused studies for the establishment of scenario prescriptions.
Changes in the probability of co-occurring extreme climate events
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.
2017-12-01
Extreme climate events such as floods, droughts, heatwaves, and severe storms exert acute stresses on natural and human systems. When multiple extreme events co-occur, either in space or time, the impacts can be substantially compounded. A diverse set of human interests - including supply chains, agricultural commodities markets, reinsurance, and deployment of humanitarian aid - has historically relied on the rarity of extreme events to provide a geographic hedge against the compounded impacts of co-occurring extremes. However, changes in the frequency of extreme events in recent decades imply that the probability of co-occurring extremes is also changing, and is likely to continue to change in the future in response to additional global warming. This presentation will review the evidence for historical changes in extreme climate events and the response of extreme events to continued global warming, and will provide some perspective on methods for quantifying changes in the probability of co-occurring extremes in the past and future.
Robust Bayesian Experimental Design for Conceptual Model Discrimination
NASA Astrophysics Data System (ADS)
Pham, H. V.; Tsai, F. T. C.
2015-12-01
A robust Bayesian optimal experimental design under uncertainty is presented to provide firm information for model discrimination, given the least number of pumping wells and observation wells. Firm information is the maximum information about a system that can be guaranteed from an experimental design. The design is based on the Box-Hill expected entropy decrease (EED) before and after the experiment and on the Bayesian model averaging (BMA) framework. A max-min program is introduced to choose the robust design that maximizes the minimal Box-Hill EED, subject to the constraint that the highest expected posterior model probability satisfies a desired threshold. The EED is calculated by Gauss-Hermite quadrature. The BMA method is used to predict future observations and to quantify future observation uncertainty arising from conceptual and parametric uncertainties in calculating EED. A Monte Carlo approach is adopted to quantify the uncertainty in the posterior model probabilities. The optimal experimental design is tested on a synthetic 5-layer anisotropic confined aquifer. Nine conceptual groundwater models are constructed due to uncertain geological architecture and boundary condition. High-performance computing is used to enumerate all possible design solutions in order to identify the most plausible groundwater model. Results highlight the impacts of scedasticity in future observation data as well as uncertainty sources on potential pumping and observation locations.
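To illustrate the entropy-based criterion only (not the authors' aquifer setup), the sketch below Monte Carlo-estimates a Box-Hill-style expected entropy decrease for one candidate observation, given prior model probabilities and each model's Gaussian predictive distribution at that location; all numbers are invented.

import numpy as np
from scipy import stats

prior = np.array([0.4, 0.35, 0.25])   # prior probabilities of three rival conceptual models
mu    = np.array([12.0, 14.5, 13.0])  # each model's predictive mean at the candidate well
sigma = np.array([1.0, 1.2, 0.8])     # predictive std devs (parameter + measurement uncertainty)

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(0)
draws, post_entropy = 20_000, 0.0
for _ in range(draws):
    m = rng.choice(len(prior), p=prior)      # model assumed to generate the data
    y = rng.normal(mu[m], sigma[m])          # simulated future observation
    post = prior * stats.norm.pdf(y, mu, sigma)
    post /= post.sum()                       # Bayesian model averaging update
    post_entropy += entropy(post)

eed = entropy(prior) - post_entropy / draws  # expected entropy decrease for this design
print("expected entropy decrease:", eed)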
NASA Astrophysics Data System (ADS)
Kalantari, Z.
2015-12-01
In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. This study was built on a conceptual framework that considers the SedInConnect model, topography, land use, soil data, other PCDs, and climate change in an integrated way to pave the way for more integrated policy making. The aim of the present study was to develop methodology for predicting the spatial probability of flooding on a general large scale. This framework can provide an effective tool to inform a broad range of watershed planning activities within a region. Regional planners, decision-makers, etc. can utilize this tool to identify the most vulnerable points in a watershed and along roads to plan for interventions and actions to alter impacts of high flows and other extreme weather events on road construction. The application of the model over a large scale can give a realistic spatial characterization of sediment connectivity for the optimal management of debris flow to road structures. The ability of the model to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
Imagining flood futures: risk assessment and management in practice.
Lane, Stuart N; Landström, Catharina; Whatmore, Sarah J
2011-05-13
The mantra that policy and management should be 'evidence-based' is well established. Less so are the implications that follow from 'evidence' being predictions of the future (forecasts, scenarios, horizons) even though such futures define the actions taken today to make the future sustainable. Here, we consider the tension between 'evidence', reliable because it is observed, and predictions of the future, unobservable in conventional terms. For flood risk management in England and Wales, we show that futures are actively constituted, and so imagined, through 'suites of practices' entwining policy, management and scientific analysis. Management has to constrain analysis because of the many ways in which flood futures can be constructed, but also because of commitment to an accounting calculus, which requires risk to be expressed in monetary terms. It is grounded in numerical simulation, undertaken by scientific consultants who follow policy/management guidelines that define the futures to be considered. Historical evidence is needed to deal with process and parameter uncertainties and the futures imagined are tied to pasts experienced. Reliance on past events is a challenge for prediction, given changing probability (e.g. climate change) and consequence (e.g. development on floodplains). So, risk management allows some elements of risk analysis to become unstable (notably in relation to climate change) but forces others to remain stable (e.g. invoking regulation to prevent inappropriate floodplain development). We conclude that the assumed separation of risk assessment and management is false because the risk calculation has to be defined by management. Making this process accountable requires openness about the procedures that make flood risk analysis more (or less) reliable to those we entrust to produce and act upon them such that, unlike the 'pseudosciences', they can be put to the test of public interrogation by those who have to live with their consequences. © 2011 Royal Society
Eaton, Mitchell J.; Hughes, Phillip T.; Hines, James E.; Nichols, James D.
2014-01-01
Metapopulation ecology is a field that is richer in theory than in empirical results. Many existing empirical studies use an incidence function approach based on spatial patterns and key assumptions about extinction and colonization rates. Here we recast these assumptions as hypotheses to be tested using 18 years of historic detection survey data combined with four years of data from a new monitoring program for the Lower Keys marsh rabbit. We developed a new model to estimate probabilities of local extinction and colonization in the presence of nondetection, while accounting for estimated occupancy levels of neighboring patches. We used model selection to identify important drivers of population turnover and estimate the effective neighborhood size for this system. Several key relationships related to patch size and isolation that are often assumed in metapopulation models were supported: patch size was negatively related to the probability of extinction and positively related to colonization, and estimated occupancy of neighboring patches was positively related to colonization and negatively related to extinction probabilities. This latter relationship suggested the existence of rescue effects. In our study system, we inferred that coastal patches experienced higher probabilities of extinction and colonization than interior patches. Interior patches exhibited higher occupancy probabilities and may serve as refugia, permitting colonization of coastal patches following disturbances such as hurricanes and storm surges. Our modeling approach should be useful for incorporating neighbor occupancy into future metapopulation analyses and in dealing with other historic occupancy surveys that may not include the recommended levels of sampling replication.
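The patch-occupancy dynamics described above can be illustrated with a minimal forward simulation in which extinction and colonization probabilities depend on patch size and the occupancy of neighboring patches. This is only a sketch: the functional forms and parameter values below are hypothetical placeholders, not the authors' estimates, and nondetection is ignored.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical patch areas (ha) and initial occupancy states
area = np.array([2.0, 5.0, 1.0, 8.0, 3.0])
occupied = np.array([1, 1, 0, 1, 0], dtype=bool)

def step(occupied, area, years=1):
    """One year of extinction/colonization, with rates modulated by
    patch area and the average occupancy of the other patches."""
    for _ in range(years):
        neigh = (occupied.sum() - occupied) / (len(occupied) - 1)  # neighbor occupancy fraction
        p_ext = 0.4 * np.exp(-0.3 * area) * (1 - 0.5 * neigh)      # larger, well-connected patches persist
        p_col = (1 - np.exp(-0.2 * area)) * (0.2 + 0.6 * neigh)    # colonization aided by occupied neighbors
        u = rng.random(len(occupied))
        occupied = np.where(occupied, u > p_ext, u < p_col)
    return occupied

print(step(occupied, area, years=18))
```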
Moxie matters: associations of future orientation with active life expectancy.
Laditka, Sarah B; Laditka, James N
2017-10-01
Being oriented toward the future has been associated with better future health. We studied associations of future orientation with life expectancy and the percentage of life with disability. We used the Panel Study of Income Dynamics (n = 5249). Participants' average age in 1968 was 33.0. Six questions repeatedly measured future orientation, 1968-1976. Seven waves (1999-2011, 33,331 person-years) measured disability in activities of daily living for the same individuals, whose average age in 1999 was 64.0. We estimated monthly probabilities of disability and death with multinomial logistic Markov models adjusted for age, sex, race/ethnicity, childhood health, and education. Using the probabilities, we created large populations with microsimulation, measuring disability in each month for each individual, age 55 through death. Life expectancy from age 55 for white men with high future orientation was age 77.6 (95% confidence interval 75.5-79.0), 6.9% (4.9-7.2) of those years with disability; results with low future orientation were 73.6 (72.2-75.4) and 9.6% (7.7-10.7). Comparable results for African American men were 74.8 (72.9-75.3), 8.1 (5.6-9.3), 71.0 (69.6-72.8), and 11.3 (9.1-11.7). For women, there were no significant differences associated with levels of future orientation for life expectancy. For white women with high future orientation 9.1% of remaining life from age 55 was disabled (6.3-9.9), compared to 12.4% (10.2-13.2) with low future orientation. Disability results for African American women were similar but statistically significant only at age 80 and over. High future orientation during early to middle adult ages may be associated with better health in older age.
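The microsimulation step can be sketched as a monthly three-state Markov chain (nondisabled, disabled, dead) run for a large cohort from age 55, from which life expectancy and the share of remaining life with disability are tallied. The transition probabilities below are purely illustrative placeholders; the study's model makes them depend on age, sex, race/ethnicity, childhood health, education, and future orientation.

```python
import numpy as np

rng = np.random.default_rng(0)

# States: 0 = nondisabled, 1 = disabled (ADL), 2 = dead.
# Illustrative monthly transition matrix (rows sum to 1).
P = np.array([[0.9960, 0.0030, 0.0010],
              [0.0200, 0.9700, 0.0100],
              [0.0000, 0.0000, 1.0000]])

n = 100_000                      # simulated individuals starting at age 55
state = np.zeros(n, dtype=int)
months_alive = np.zeros(n)
months_disabled = np.zeros(n)

for _ in range(12 * 60):         # follow for up to 60 years
    alive = state != 2
    months_alive[alive] += 1
    months_disabled[state == 1] += 1
    u = rng.random(n)
    cum = P[state].cumsum(axis=1)
    state = (u[:, None] > cum).sum(axis=1)   # sample next state per individual

le = 55 + months_alive.mean() / 12
pct_disabled = 100 * months_disabled.sum() / months_alive.sum()
print(f"life expectancy ~ {le:.1f} y, {pct_disabled:.1f}% of remaining life with disability")
```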
Battlefield Air Interdiction: Airpower for the Future
1980-01-01
recommendations for the effective use of airpower for this purpose are made. A future war will probably be against the Soviet Union or one of its... Emphasis will be placed upon the Soviet forces, since it is likely that any future belligerence will be against the Soviet Union or one of its... Offensive operations (see figure 3) stress rapid, continuous movement. Objectives are established which demand high rates of advance. A regiment, for...
Developing a Scenario for widespread use: Best practices, lessons learned
Perry, S.; Jones, L.; Cox, D.
2011-01-01
The ShakeOut Scenario is probably the most widely known and used earthquake scenario created to date. Much of the credit for its widespread dissemination and application lies with scenario development criteria that focused on the needs and involvement of end users and with a suite of products that tailored communication of the results to varied end users, who ranged from emergency managers to the general public, from corporations to grassroots organizations. Products were most effective when they were highly visual, when they emphasized the findings of social scientists, and when they communicated the experience of living through the earthquake. This paper summarizes the development criteria and the products that made the ShakeOut Scenario so widely known and used, and it provides some suggestions for future improvements. © 2011, Earthquake Engineering Research Institute.
Björkman, Mari; Rantala, Juha; Nees, Matthias; Kallioniemi, Olli
2010-10-01
Alterations in epigenetic processes probably underlie most human malignancies. Novel genome-wide techniques, such as chromatin immunoprecipitation and high-throughput sequencing, have become state-of-the-art methods for mapping the epigenomic landscape of development and disease, including cancer. Despite these advances, the functional significance of epigenetic enzymes in the progression of cancers such as prostate cancer remains incompletely understood. A comprehensive mapping and functional understanding of the cancer epigenome will hopefully help to facilitate development of novel cancer therapy targets and improve future diagnostics. The authors have developed a novel cell microarray-based high-content siRNA screening technique suitable to address the putative functional role and impact of all known putative and novel epigenetic enzymes in cancer, including prostate cancer.
Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast
Geist, E.L.; Parsons, T.
2009-01-01
Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
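Under the Poisson assumption described above, per-source recurrence rates aggregate by simple addition, and the probability of at least one tsunami-generating event during an exposure time T follows directly. A minimal sketch with hypothetical rates (not values from the study):

```python
import numpy as np

# Hypothetical mean recurrence rates (events per year) for individual
# tsunami sources affecting a coastal site; values are illustrative only.
rates = np.array([1.0e-3,   # distant plate-boundary earthquake source A
                  4.0e-4,   # distant plate-boundary earthquake source B
                  1.0e-4])  # local submarine landslide source

T = 50.0                                   # exposure time in years
lam = rates.sum() * T                      # expected number of events in T
p_at_least_one = 1.0 - np.exp(-lam)        # Poisson probability of >= 1 event

print(f"aggregate rate = {rates.sum():.2e}/yr, P(>=1 event in {T:.0f} yr) = {p_at_least_one:.3f}")
```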
Using scenarios to assess possible future impacts of invasive species in the Laurentian Great Lakes
Lauber, T. Bruce; Stedman, Richard C.; Connelly, Nancy A; Rudstam, Lars G.; Ready, Richard C; Poe, Gregory L; Bunnell, David B.; Hook, Tomas O.; Koops, Marten A.; Ludsin, Stuart A.; Rutherford, Edward S; Wittmann, Marion E.
2016-01-01
The expected impacts of invasive species are key considerations in selecting policy responses to potential invasions. But predicting the impacts of invasive species is daunting, particularly in large systems threatened by multiple invasive species, such as North America’s Laurentian Great Lakes. We developed and evaluated a scenario-building process that relied on an expert panel to assess possible future impacts of aquatic invasive species on recreational fishing in the Great Lakes. To maximize its usefulness to policy makers, this process was designed to be implemented relatively rapidly and consider a range of species. The expert panel developed plausible, internally-consistent invasion scenarios for 5 aquatic invasive species, along with subjective probabilities of those scenarios. We describe these scenarios and evaluate this approach for assessing future invasive species impacts. The panel held diverse opinions about the likelihood of the scenarios, and only one scenario with impacts on sportfish species was considered likely by most of the experts. These outcomes are consistent with the literature on scenario building, which advocates for developing a range of plausible scenarios in decision making because the uncertainty of future conditions makes the likelihood of any particular scenario low. We believe that this scenario-building approach could contribute to policy decisions about whether and how to address the possible impacts of invasive species. In this case, scenarios could allow policy makers to narrow the range of possible impacts on Great Lakes fisheries they consider and help set a research agenda for further refining invasive species predictions.
Jahanishakib, Fatemeh; Mirkarimi, Seyed Hamed; Salmanmahiny, Abdolrassoul; Poodat, Fatemeh
2018-05-08
Efficient land use management requires awareness of past changes, present actions, and plans for future developments. Part of these requirements is achieved using scenarios that describe a future situation and the course of changes. This research aims to link scenario results with spatially explicit and quantitative forecasting of land use development. To develop land use scenarios, SMIC PROB-EXPERT and MORPHOL methods were used. It revealed eight scenarios as the most probable. To apply the scenarios, we considered population growth rate and used a cellular automata-Markov chain (CA-MC) model to implement the quantified changes described by each scenario. For each scenario, a set of landscape metrics was used to assess the ecological integrity of land use classes in terms of fragmentation and structural connectivity. The approach enabled us to develop spatial scenarios of land use change and detect their differences for choosing the most integrated landscape pattern in terms of landscape metrics. Finally, the comparison between paired forecasted scenarios based on landscape metrics indicates that scenarios 1-1, 2-2, 3-2, and 4-1 have a more suitable integrity. The proposed methodology for developing spatial scenarios helps executive managers to create scenarios with many repetitions and customize spatial patterns in real world applications and policies.
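The Markov-chain component of a CA-MC projection can be sketched without the cellular-automata allocation step: class areas are propagated forward with a transition matrix estimated from two land use maps. The classes and matrix below are a hypothetical example, not the study's calibrated transitions.

```python
import numpy as np

classes = ["urban", "agriculture", "forest", "water"]

# Hypothetical per-decade transition probabilities between land use classes
# (rows = from, columns = to, each row sums to 1).
P = np.array([[0.97, 0.02, 0.01, 0.00],
              [0.10, 0.85, 0.05, 0.00],
              [0.03, 0.07, 0.90, 0.00],
              [0.00, 0.00, 0.00, 1.00]])

area = np.array([120.0, 800.0, 450.0, 30.0])   # current area per class (km^2)

for decade in range(1, 4):                      # project three decades ahead
    area = area @ P
    print(f"+{10 * decade} yr:", dict(zip(classes, np.round(area, 1))))
```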
A temporal forecast of radiation environments for future space exploration missions.
Kim, Myung-Hee Y; Cucinotta, Francis A; Wilson, John W
2007-06-01
The understanding of future space radiation environments is an important goal for space mission operations, design, and risk assessment. We have developed a solar cycle statistical model in which sunspot number is coupled to space-related quantities, such as the galactic cosmic radiation (GCR) deceleration potential (phi) and the mean occurrence frequency of solar particle events (SPEs). Future GCR fluxes were derived from a predictive model, in which the temporal dependence represented by phi was derived from GCR flux and ground-based Climax neutron monitor rate measurements over the last four decades. These results showed that the point dose equivalent inside a typical spacecraft in interplanetary space was influenced by solar modulation by up to a factor of three. It also has been shown that a strong relationship exists between large SPE occurrences and phi. For future space exploration missions, cumulative probabilities of SPEs at various integral fluence levels during short-period missions were defined using a database of proton fluences of past SPEs. Analytic energy spectra of SPEs at different ranks of the integral fluences for energies greater than 30 MeV were constructed over broad energy ranges extending out to GeV for the analysis of representative exposure levels at those fluences. Results will guide the design of protection systems for astronauts during future space exploration missions.
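The cumulative probability of encountering at least one SPE above a given fluence during a short mission can be sketched by treating SPE occurrence as a Poisson process whose rate is the historical frequency of events exceeding that fluence. The thresholds and rates below are placeholders, not values from the cited SPE database.

```python
import numpy as np

# Hypothetical annual rates of SPEs whose >30 MeV proton fluence exceeds
# each threshold (events/yr); a real analysis would estimate these from
# the historical SPE fluence database.
thresholds = np.array([1e7, 1e8, 1e9, 1e10])   # protons/cm^2
annual_rate = np.array([6.0, 2.0, 0.5, 0.05])

mission_days = 90.0
T = mission_days / 365.25
p_exceed = 1.0 - np.exp(-annual_rate * T)       # P(>=1 SPE above threshold during mission)

for f, p in zip(thresholds, p_exceed):
    print(f"fluence > {f:.0e} p/cm^2: P = {p:.3f} over {mission_days:.0f} days")
```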
NASA Technical Reports Server (NTRS)
1976-01-01
Trends in civil and military aviation in the period 1980-2000 are examined in terms of the role that NASA should play in aeronautical research and development during this period. Factors considered include the pattern of industry and government relationships, the character of the aircraft to be developed, and the technology advances that will be required as well as demographic, economic, and social factors. Trends are expressed in terms of the most probable developments in civil air transportation and air defense and several characteristically different directions for future development are defined. The longer term opportunities created by developments in air transporation extending into the next century are also examined. Within this framework, a preferred NASA role and a preferred set of objectives are formulated for the research and technology which should be undertaken by NASA during the period 1976-1985.
Martin, Petra; Leighl, Natasha B
2017-06-01
This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice.
Lightning Strike Peak Current Probabilities as Related to Space Shuttle Operations
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Vaughan, William W.
2000-01-01
A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates on the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return-to-launch site. A literature search was conducted for previous work concerning occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.
Why and How. The Future of the Central Questions of Consciousness
Havlík, Marek; Kozáková, Eva; Horáček, Jiří
2017-01-01
In this review, we deal with the two central questions of consciousness, how and why, and we outline their possible future development. The question of how refers to the empirical endeavor to reveal the neural correlates and mechanisms that form consciousness. On the other hand, the question of why generally refers to the “hard problem” of consciousness, which claims that empirical science will always fail to provide a satisfactory answer to the question of why there is conscious experience at all. Unfortunately, the hard problem of consciousness will probably never completely disappear because it will always have its most committed supporters. However, there is a good chance that its weight and importance will be greatly reduced by empirically tackling consciousness in the near future. We expect that future empirical work on consciousness will be based on a unifying brain theory and will answer the question of what the function of conscious experience is, which will in turn replace the implications of the hard problem. The candidate for such a unifying brain theory is predictive coding, which will have to explain both perceptual consciousness and conscious mind-wandering in order to become a truly unifying theory of brain functioning. PMID:29075226
NASA Astrophysics Data System (ADS)
McCloskey, John
2008-03-01
The Sumatra-Andaman earthquake of 26 December 2004 (Boxing Day 2004) and its tsunami will endure in our memories as one of the worst natural disasters of our time. For geophysicists, the scale of the devastation and the likelihood of another equally destructive earthquake set out a series of challenges of how we might use science not only to understand the earthquake and its aftermath but also to help in planning for future earthquakes in the region. In this article a brief account of these efforts is presented. Earthquake prediction is probably impossible, but earth scientists are now able to identify particularly dangerous places for future events by developing an understanding of the physics of stress interaction. Having identified such a dangerous area, a series of numerical Monte Carlo simulations is described which allow us to get an idea of what the most likely consequences of a future earthquake are by modelling the tsunami generated by lots of possible, individually unpredictable, future events. As this article was being written, another earthquake occurred in the region, which had many expected characteristics but was enigmatic in other ways. This has spawned a series of further theories which will contribute to our understanding of this extremely complex problem.
Suzuki, Teppei; Tani, Yuji; Ogasawara, Katsuhiko
2016-07-25
Consistent with the "attention, interest, desire, memory, action" (AIDMA) model of consumer behavior, patients collect information about available medical institutions using the Internet to select information for their particular needs. Studies of consumer behavior may be found in areas other than medical institution websites. Such research uses Web access logs for visitor search behavior. At this time, research applying the patient searching behavior model to medical institution website visitors is lacking. We have developed a hospital website search behavior model using a Bayesian approach to clarify the behavior of medical institution website visitors and determine the probability of their visits, classified by search keyword. We used the website data access log of a clinic of internal medicine and gastroenterology in the Sapporo suburbs, collecting data from January 1 through June 31, 2011. The contents of the 6 website pages included the following: home, news, content introduction for medical examinations, mammography screening, holiday person-on-duty information, and other. The search keywords we identified as best expressing website visitor needs were listed as the top 4 headings from the access log: clinic name, clinic name + regional name, clinic name + medical examination, and mammography screening. Using the search keywords as the explanatory variable, we built a binomial probit model that allows inspection of the contents of each outcome variable. Using this model, we determined a beta value and generated a posterior distribution. We performed the simulation using Markov Chain Monte Carlo methods with a noninformative prior distribution for this model and determined the visit probability classified by keyword for each category. In the case of the keyword "clinic name," the visit probability to the website, repeated visit to the website, and contents page for medical examination was positive. In the case of the keyword "clinic name and regional name," the probability for a repeated visit to the website and the mammography screening page was negative. In the case of the keyword "clinic name + medical examination," the visit probability to the website was positive, and the visit probability to the information page was negative. When visitors referred to the keywords "mammography screening," the visit probability to the mammography screening page was positive (95% highest posterior density interval = 3.38-26.66). Further analysis for not only the clinic website but also various other medical institution websites is necessary to build a general inspection model for medical institution websites; we want to consider this in future research. Additionally, we hope to use the results obtained in this study as a prior distribution for future work to conduct higher-precision analysis.
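A minimal version of the Bayesian probit estimation described above can be sketched with a random-walk Metropolis sampler and a flat (noninformative) prior. The keyword indicator and outcome below are simulated stand-ins for the clinic's access-log variables, not the study's data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated stand-in data: intercept plus a binary search-keyword indicator;
# outcome = whether the visitor reached the medical-examination page.
n = 300
X = np.column_stack([np.ones(n), rng.integers(0, 2, n)])
y = (X @ np.array([-0.6, 1.1]) + rng.standard_normal(n) > 0).astype(int)

def log_posterior(beta):
    p = np.clip(norm.cdf(X @ beta), 1e-12, 1 - 1e-12)   # probit link
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))  # flat prior

beta, draws = np.zeros(2), []
for _ in range(20_000):                       # random-walk Metropolis
    prop = beta + 0.15 * rng.standard_normal(2)
    if np.log(rng.random()) < log_posterior(prop) - log_posterior(beta):
        beta = prop
    draws.append(beta.copy())

post = np.array(draws[5_000:])                # discard burn-in
lo, hi = np.percentile(post[:, 1], [2.5, 97.5])
print(f"keyword effect: mean {post[:, 1].mean():.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```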
Predicting potentially toxigenic Pseudo-nitzschia blooms in the Chesapeake Bay
NASA Astrophysics Data System (ADS)
Anderson, Clarissa R.; Sapiano, Mathew R. P.; Prasad, M. Bala Krishna; Long, Wen; Tango, Peter J.; Brown, Christopher W.; Murtugudde, Raghu
2010-11-01
Harmful algal blooms are now recognized as a significant threat to the Chesapeake Bay as they can severely compromise the economic viability of important recreational and commercial fisheries in the largest estuary of the United States. This study describes the development of empirical models for the potentially domoic acid-producing Pseudo-nitzschia species complex present in the Bay, developed from a 22-year time series of cell abundance and concurrent measurements of hydrographic and chemical properties. Using a logistic Generalized Linear Model (GLM) approach, model parameters and performance were compared over a range of Pseudo-nitzschia bloom thresholds relevant to toxin production by different species. Small-threshold blooms (≥10 cells mL⁻¹) are explained by time of year, location, and variability in surface values of phosphate, temperature, nitrate plus nitrite, and freshwater discharge. Medium- (100 cells mL⁻¹) to large-threshold (1000 cells mL⁻¹) blooms are further explained by salinity, silicic acid, dissolved organic carbon, and light attenuation (Secchi) depth. These predictors are similar to other models for Pseudo-nitzschia blooms on the west coast, suggesting commonalities across ecosystems. Hindcasts of bloom probabilities at a 19% bloom prediction point yield a Heidke Skill Score of ~53%, a Probability of Detection of ~75%, a False Alarm Ratio of ~52%, and a Probability of False Detection of ~9%. The implication of possible future changes in Baywide nutrient stoichiometry on Pseudo-nitzschia blooms is discussed.
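The verification statistics quoted above follow from a 2 x 2 contingency table of hindcast versus observed blooms. A short sketch of the standard definitions (hits a, false alarms b, misses c, correct negatives d), applied to hypothetical counts chosen to land in the same ballpark as the reported scores:

```python
def forecast_skill(a, b, c, d):
    """a = hits, b = false alarms, c = misses, d = correct negatives."""
    pod = a / (a + c)                      # Probability of Detection
    far = b / (a + b)                      # False Alarm Ratio
    pofd = b / (b + d)                     # Probability of False Detection
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, pofd, hss

# Hypothetical counts for bloom/no-bloom hindcasts at a fixed probability cutoff
pod, far, pofd, hss = forecast_skill(a=45, b=48, c=15, d=500)
print(f"POD={pod:.2f}  FAR={far:.2f}  POFD={pofd:.2f}  HSS={hss:.2f}")
```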
Treatment and prophylaxis of melioidosis
Dance, David
2014-01-01
Melioidosis, infection with Burkholderia pseudomallei, is being recognised with increasing frequency and is probably more common than currently appreciated. Treatment recommendations are based on a series of clinical trials conducted in Thailand over the past 25 years. Treatment is usually divided into two phases: in the first, or acute phase, parenteral drugs are given for ≥10 days with the aim of preventing death from overwhelming sepsis; in the second, or eradication phase, oral drugs are given, usually to complete a total of 20 weeks, with the aim of preventing relapse. Specific treatment for individual patients needs to be tailored according to clinical manifestations and response, and there remain many unanswered questions. Some patients with very mild infections can probably be cured by oral agents alone. Ceftazidime is the mainstay of acute-phase treatment, with carbapenems reserved for severe infections or treatment failures and amoxicillin/clavulanic acid (co-amoxiclav) as second-line therapy. Trimethoprim/sulfamethoxazole (co-trimoxazole) is preferred for the eradication phase, with the alternative of co-amoxiclav. In addition, the best available supportive care is needed, along with drainage of abscesses whenever possible. Treatment for melioidosis is unaffordable for many in endemic areas of the developing world, but the relative costs have reduced over the past decade. Unfortunately there is no likelihood of any new or cheaper options becoming available in the immediate future. Recommendations for prophylaxis following exposure to B. pseudomallei have been made, but the evidence suggests that they would probably only delay rather than prevent the development of infection. PMID:24613038
Teenage smoking, attempts to quit, and school performance.
Hu, T W; Lin, Z; Keeler, T E
1998-01-01
OBJECTIVES: This study examined the relationship between school performance, smoking, and quitting attempts among teenagers. METHODS: A logistic regression model was used to predict the probability of being a current smoker or a former smoker. Data were derived from the 1990 California Youth Tobacco Survey. RESULTS: Students' school performance was a key factor in predicting smoking and quitting attempts when other sociodemographic and family income factors were controlled. CONCLUSIONS: Developing academic or remedial classes designed to improve students' school performance may lead to a reduction in smoking rates among teenagers while simultaneously providing a human capital investment in their futures. PMID:9618625
NASA Technical Reports Server (NTRS)
Miller, Robert A.; Kuczmarski, Maria A.
2015-01-01
Thermodynamic and computational fluid dynamics modeling has been conducted to examine the feasibility of adapting the NASA-Glenn erosion burner rigs for use in studies of corrosion of environmental barrier coatings by the deposition of molten CMAS. The effect of burner temperature, Mach number, particle preheat, duct heating, particle size, and particle phase (crystalline vs. glass) were analyzed. Detailed strategies for achieving complete melting of CMAS particles were developed, thereby greatly improving the probability of future successful experimental outcomes.
Oudgenoeg-Paz, Ora; Boom, Jan; Volman, M Chiel J M; Leseman, Paul P M
2016-06-01
Within a perception-action framework, exploration is seen as a driving force in young children's development. Through exploration, children become skilled in perceiving the affordances in their environment and acting on them. Using a perception-action framework, the current study examined the development of children's exploration of the spatial-relational properties of objects such as the possibility of containing or stacking. A total of 61 children, belonging to two age cohorts, were followed from 9 to 24 months and from 20 to 36 months of age, respectively. Exploration of a standard set of objects was observed in five home visits in each cohort conducted every 4 months. A cohort-sequential augmented growth model for categorical data, incorporating assumptions of item response theory, was constructed that fitted the data well, showing that the development of exploration of spatial-relational object properties follows an overlapping waves pattern. This is in line with Siegler's model (Emerging Minds, 1996), which suggested that skill development can be seen as ebbing and flowing of alternative (simple and advanced) behaviors. Although the probability of observing the more complex forms of exploration increased with age, the simpler forms did not disappear altogether but only became less probable. Findings support a perception-action view on development. Individual differences in observed exploration and their relations with other variables, as well as future directions for research, are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
Development of Scoring Functions for Antibody Sequence Assessment and Optimization
Seeliger, Daniel
2013-01-01
Antibody development is still associated with substantial risks and difficulties as single mutations can radically change molecule properties like thermodynamic stability, solubility or viscosity. Since antibody generation methodologies cannot select and optimize for molecule properties which are important for biotechnological applications, careful sequence analysis and optimization is necessary to develop antibodies that fulfil the ambitious requirements of future drugs. While efforts to grasp the physical principles of undesired molecule properties from the very bottom are becoming increasingly powerful, the wealth of publicly available antibody sequences provides an alternative way to develop early assessment strategies for antibodies using a statistical approach, which is the objective of this paper. Here, publicly available sequences were used to develop heuristic potentials for the framework regions of heavy and light chains of antibodies of human and murine origin. The potentials take into account position-dependent probabilities of individual amino acids but also conditional probabilities, which are indispensable for sequence assessment and optimization. It is shown that the potentials derived from human sequences clearly distinguish between human sequences and sequences from mice and, hence, can be used as a measure of humanness which compares a given sequence with the phenotypic pool of human sequences instead of comparing sequence identities to germline genes. Following this line, it is demonstrated that, using the developed potentials, humanization of an antibody can be described as a simple mathematical optimization problem and that the in silico-generated framework variants closely resemble native sequences in terms of predicted immunogenicity. PMID:24204701
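The statistical scoring idea can be sketched as a position-specific log-probability: each framework position contributes log P(residue | position), and humanization becomes a search for substitutions that raise the total score. For brevity the sketch below uses only the position-dependent terms (not the conditional ones), and the tiny frequency table is hypothetical, far smaller than a real framework alignment.

```python
import math

# Hypothetical position-specific amino-acid frequencies for a 4-residue
# stretch of a framework region, estimated from an alignment of human sequences.
freqs = [
    {"E": 0.70, "Q": 0.25, "D": 0.05},
    {"V": 0.80, "I": 0.15, "L": 0.05},
    {"Q": 0.60, "K": 0.30, "E": 0.10},
    {"L": 0.90, "V": 0.08, "M": 0.02},
]

def score(seq, pseudo=1e-3):
    """Sum of log position-specific probabilities (higher = more human-like)."""
    return sum(math.log(freqs[i].get(aa, pseudo)) for i, aa in enumerate(seq))

print("human-like  EVQL:", round(score("EVQL"), 2))
print("murine-like DIKM:", round(score("DIKM"), 2))
```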
Exploration Health Risks: Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley
2006-01-01
Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest of the human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short and long duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of conservative and nonconservative assumptions on the probability results. We discuss the methods necessary to assess mission risks once exploration mission scenarios are characterized. Preliminary efforts have produced results that are commensurate with earlier qualitative estimates of risk probabilities in this and other operational contexts, indicating that our approach may be usefully applied in support of the development of human health and performance standards for long-duration space exploration missions. This approach will also enable mission-specific probabilistic risk assessments for space exploration missions.
NASA Astrophysics Data System (ADS)
Coleman, N.; Abramson, L.
2004-05-01
Yucca Mt. (YM) is a potential repository site for high-level radioactive waste and spent fuel. One issue is the potential for future igneous activity to intersect the repository. If the event probability is <1E-8/yr, it need not be considered in licensing. Plio-Quaternary volcanos and older basalts occur near YM. Connor et al (JGR, 2000) estimate a probability of 1E-8/yr to 1E-7/yr for a basaltic dike to intersect the potential repository. Based on aeromagnetic data, Hill and Stamatakos (CNWRA, 2002) propose that additional volcanos may lie buried in nearby basins. They suggest if these volcanos are part of temporal-clustered volcanic activity, the probability of an intrusion may be as high as 1E-6/yr. We examine whether recurrence probabilities >2E-7/yr are realistic given that no dikes have been found in or above the 1.3E7 yr-old potential repository block. For 2E-7/yr (or 1E-6/yr), the expected number of penetrating dikes is 2.6 (respectively, 13), and the probability of at least one penetration is 0.93 (0.999). These results are not consistent with the exploration evidence. YM is one of the most intensively studied places on Earth. Over 20 yrs of studies have included surface and subsurface mapping, geophysical surveys, construction of 10+ km of tunnels in the mountain, drilling of many boreholes, and construction of many pits (DOE, Site Recommendation, 2002). It seems unlikely that multiple dikes could exist within the proposed repository footprint and escape detection. A dike complex dated 11.7 Ma (Smith et al, UNLV, 1997) or 10 Ma (Carr and Parrish, 1985) does exist NW of YM and west of the main Solitario Canyon Fault. These basalts intruded the Tiva Canyon Tuff (12.7 Ma) in an epoch of caldera-forming pyroclastic eruptions that ended millions of yrs ago. We would conclude that basaltic volcanism related to Miocene silicic volcanism may also have ended. Given the nondetection of dikes in the potential repository, we can use a Poisson model to estimate an upper-bound probability of 2E-7/yr (95% conf. level) for an igneous intrusion over the next 1E4 yrs. If we assume one undiscovered dike exists, the upper-bound probability would rise to 4E-7/yr. Higher probabilities may be possible if conditions that fostered Plio-Quaternary volcanism became enhanced over time. To the contrary, basalts of the past 11 Ma in Crater Flat have erupted in four episodes that together show a declining trend in erupted magma volume (DOE, TBD13, 2003). Smith et al (GSA Today, 2002) suggest there may be a common magma source for volcanism in Crater Flat and the Lunar Crater volcanic field, and that recurrence rates for YM could be underestimated. Their interpretation is highly speculative given the 130-km (80-mi) distance between these zones. A claim that crustal extension at YM is anomalously large, possibly favoring renewed volcanism (Wernicke et al, Science, 1999), was contradicted by later work (Savage et al, JGR, 2000). Spatial-temporal models that predict future intrusion probabilities of >2E-7/yr may be overly conservative and unrealistic. Along with currently planned site characterization activities, realistic models could be developed by considering the non-detection of basaltic dikes in the potential repository footprint. 
(The views expressed are the authors' and do not reflect any final judgment or determination by the Advisory Committee on Nuclear Waste or the Nuclear Regulatory Commission regarding the matters addressed or the acceptability of a license application for a geologic repository at Yucca Mt.)
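The Poisson bounds quoted in the abstract above can be reproduced in a few lines: given roughly T = 1.3E7 yr of record and zero (or one assumed) penetrating dikes detected, the 95% one-sided upper bound on the recurrence rate follows from the chi-square relation for Poisson counts. This sketch simply reworks the numbers stated above.

```python
import numpy as np
from scipy.stats import chi2

T = 1.3e7                                   # age of the repository block (yr)

# Expected dikes and P(>=1 penetration) if the true rate were 2E-7 or 1E-6 per yr
for rate in (2e-7, 1e-6):
    lam = rate * T
    print(f"rate {rate:.0e}/yr: expected dikes = {lam:.1f}, P(>=1) = {1 - np.exp(-lam):.3f}")

# 95% one-sided upper bound on the rate given n detected dikes:
# lambda_up = chi2.ppf(0.95, 2*(n+1)) / (2*T)
for n in (0, 1):
    lam_up = chi2.ppf(0.95, 2 * (n + 1)) / 2.0
    print(f"{n} dikes found: upper-bound rate = {lam_up / T:.1e}/yr")
```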
Temperature and tree growth [editorial
Michael G. Ryan
2010-01-01
Tree growth helps US forests take up 12% of the fossil fuels emitted in the USA (Woodbury et al. 2007), so predicting tree growth for future climates matters. Predicting future climates themselves is uncertain, but climate scientists probably have the most confidence in predictions for temperature. Temperatures are projected to rise by 0.2 °C in the next two decades,...
Why do we find ourselves around a yellow star instead of a red star?
NASA Astrophysics Data System (ADS)
Haqq-Misra, Jacob; Kopparapu, Ravi Kumar; Wolf, Eric T.
2018-01-01
M-dwarf stars are more abundant than G-dwarf stars, so our position as observers on a planet orbiting a G-dwarf raises questions about the suitability of other stellar types for supporting life. If we consider ourselves as typical, in the anthropic sense that our environment is probably a typical one for conscious observers, then we are led to the conclusion that planets orbiting in the habitable zone of G-dwarf stars should be the best place for conscious life to develop. But such a conclusion neglects the possibility that K-dwarfs or M-dwarfs could provide more numerous sites for life to develop, both now and in the future. In this paper we analyse this problem through Bayesian inference to demonstrate that our occurrence around a G-dwarf might be a slight statistical anomaly, but only the sort of chance event that we expect to occur regularly. Even if M-dwarfs provide more numerous habitable planets today and in the future, we still expect mid G- to early K-dwarfs stars to be the most likely place for observers like ourselves. This suggests that observers with similar cognitive capabilities as us are most likely to be found at the present time and place, rather than in the future or around much smaller stars.
Predicting the Uncertain Future of Aptamer-Based Diagnostics and Therapeutics.
Bruno, John G
2015-04-16
Despite the great promise of nucleic acid aptamers in the areas of diagnostics and therapeutics for their facile in vitro development, lack of immunogenicity and other desirable properties, few truly successful aptamer-based products exist in the clinical or other markets. Core reasons for these commercial deficiencies probably stem from industrial commitment to antibodies including a huge financial investment in humanized monoclonal antibodies and a general ignorance about aptamers and their performance among the research and development community. Given the early failures of some strong commercial efforts to gain government approval and bring aptamer-based products to market, it may seem that aptamers are doomed to take a backseat to antibodies forever. However, the key advantages of aptamers over antibodies coupled with niche market needs that only aptamers can fill and more recent published data still point to a bright commercial future for aptamers in areas such as infectious disease and cancer diagnostics and therapeutics. As more researchers and entrepreneurs become familiar with aptamers, it seems inevitable that aptamers will at least be considered for expanded roles in diagnostics and therapeutics. This review also examines new aptamer modifications and attempts to predict new aptamer applications that could revolutionize biomedical technology in the future and lead to marketed products.
NASA Astrophysics Data System (ADS)
Malek, Žiga; Boerboom, Luc; Glade, Thomas
2015-11-01
This study focuses on future forest cover change in the Buzau Subcarpathians, a landslide-prone region in Romania. Past and current trends suggest that the area might expect a future increase in deforestation. We developed spatially explicit scenarios until 2040 to analyze the spatial pattern of future forest cover change and potential changes to landslide risk. First, we generated transition probability maps using the weights of evidence method, followed by a cellular automata allocation model. We performed expert interviews to develop two future forest management scenarios. The Alternative scenario (ALT) was defined by 67% more deforestation than the Business as Usual scenario (BAU). We integrated the simulated scenarios with a landslide susceptibility map. In both scenarios, most of the deforestation was projected in areas where landslides are less likely to occur. Still, 483 (ALT) and 276 (BAU) ha of deforestation were projected in areas with a high landslide occurrence likelihood. Thus, deforestation could lead to a local-scale increase in landslide risk, in particular near or adjacent to forestry roads. The parallel process of nearly 10% forest expansion until 2040 was projected to occur mostly in areas with high landslide susceptibility. On a regional scale, forest expansion could thus result in improved slope stability. We modeled two additional scenarios with an implemented landslide risk policy, excluding high-risk zones. The reduction of deforestation in high-risk areas was achieved without a drastic decrease in the accessibility of the areas. Together with forest expansion, it could therefore be used as a risk reduction strategy.
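The weights-of-evidence step used to build the transition probability maps can be sketched from cell counts: for a binary evidence layer, the positive and negative weights compare the conditional probability of the evidence given past deforestation with that given no deforestation. The layer and counts below are hypothetical.

```python
import math

# Hypothetical cell counts from overlaying past deforestation with one
# binary evidence layer (say, within 500 m of a forestry road).
n_event_evid = 1_200       # deforested cells with evidence present
n_event_noevid = 800       # deforested cells with evidence absent
n_noevent_evid = 9_000     # stable cells with evidence present
n_noevent_noevid = 89_000  # stable cells with evidence absent

# W+ = ln[P(evidence | deforestation) / P(evidence | no deforestation)]
w_plus = math.log((n_event_evid / (n_event_evid + n_event_noevid)) /
                  (n_noevent_evid / (n_noevent_evid + n_noevent_noevid)))
# W- = same ratio for absence of the evidence
w_minus = math.log((n_event_noevid / (n_event_evid + n_event_noevid)) /
                   (n_noevent_noevid / (n_noevent_evid + n_noevent_noevid)))

print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, contrast = {w_plus - w_minus:.2f}")
```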
Regenerative medicine in kidney disease: where we stand and where to go.
Borges, Fernanda T; Schor, Nestor
2017-07-22
The kidney is a complex organ with more than 20 types of specialized cells that play an important role in maintaining the body's homeostasis. The epithelial tubular cell is formed during embryonic development and has little proliferative capacity under physiological conditions, but after acute injury the kidney does have regenerative capacity. However, after repetitive or severe lesions, it may undergo a maladaptation process that predisposes it to chronic kidney injury. Regenerative medicine includes various repair and regeneration techniques, and these have gained increasing attention in the scientific literature. In the future, not only will these techniques contribute to the repair and regeneration of the human kidney, but probably also to the construction of an entire organ. New mechanisms studied for kidney regeneration and repair include circulating stem cells as mesenchymal stromal/stem cells and their paracrine mechanisms of action; renal progenitor stem cells; the leading role of tubular epithelial cells in the tubular repair process; the study of zebrafish larvae to understand the process of nephron development, kidney scaffold and its repopulation; and, finally, the development of organoids. This review elucidates where we are in terms of current scientific knowledge regarding these mechanisms and the promises of future scientific perspectives.
Extra-terrestrial life in the European Space Agency's Cosmic Vision plan and beyond.
Fridlund, Malcolm
2011-02-13
Our exciting time allows us to contemplate the moment in the not-too-distant future when we can detect the presence of life on worlds orbiting stars other than our Sun. It will not be easy and will require the development and use of the very latest technologies. It also very probably demands deployment in space of relevant instrumentation in order to carry out these investigations. The European Space Agency has been involved in the studies and development of the required technologies for more than a decade and is currently formulating a roadmap for how to achieve the ultimate detection of signs of life as we know it on terrestrial exoplanets. The major elements of the roadmap consist of the following. First, the search for and detection of terrestrial exoplanets. Here, some progress has been made recently and is reported in this paper. Second, the more and more detailed study of the physical characteristics of such exoplanets. Finally, the search for biomarkers--indicators of biological activity--that can be observed at interstellar distances. The last is probably one of the most difficult problems ever contemplated by observational astronomy.
Sato, Tatsuhiko; Masunaga, Shin-Ichiro; Kumada, Hiroaki; Hamada, Nobuyuki
2018-01-17
We here propose a new model for estimating the biological effectiveness of boron neutron capture therapy (BNCT) considering intra- and intercellular heterogeneity in the 10B distribution. The new model was developed from our previously established stochastic microdosimetric kinetic model that determines the surviving fraction of cells irradiated with any radiation. In the model, the probability density of the absorbed doses at microscopic scales is the fundamental physical index for characterizing the radiation fields. A new computational method was established to determine the probability density for application to BNCT using the Particle and Heavy Ion Transport code System, PHITS. The parameters used in the model were determined from the measured surviving fraction of tumor cells administered with two kinds of 10B compounds. The model quantitatively highlighted the indispensable need to consider the synergetic effect and the dose dependence of the biological effectiveness in the estimate of the therapeutic effect of BNCT. The model can predict the biological effectiveness of newly developed 10B compounds based on their intra- and intercellular distributions, and thus, it can play important roles not only in treatment planning but also in drug discovery research for future BNCT.
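As a much-simplified illustration of how a microdosimetric quantity feeds a cell-survival prediction, a deterministic microdosimetric-kinetic-style relation modifies the linear-quadratic alpha term with the dose-mean specific energy per event. This sketch is not the stochastic model proposed in the paper, and all parameter values are placeholders.

```python
import numpy as np

# Linear-quadratic parameters for a reference radiation (placeholders)
alpha0 = 0.2   # Gy^-1
beta = 0.05    # Gy^-2

# Dose-mean specific energy per event in the sub-cellular domain (Gy);
# in BNCT this depends on the intra- and intercellular 10B distribution.
z1D = 1.5

doses = np.array([1.0, 2.0, 4.0, 6.0])                 # absorbed dose (Gy)
alpha = alpha0 + beta * z1D                             # MKM-style modified alpha
surviving_fraction = np.exp(-(alpha * doses + beta * doses**2))

for d, s in zip(doses, surviving_fraction):
    print(f"D = {d:.0f} Gy: S = {s:.3f}")
```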
Regional Permafrost Probability Modelling in the northwestern Cordillera, 59°N - 61°N, Canada
NASA Astrophysics Data System (ADS)
Bonnaventure, P. P.; Lewkowicz, A. G.
2010-12-01
High resolution (30 x 30 m) permafrost probability models were created for eight mountainous areas in the Yukon and northernmost British Columbia. Empirical-statistical modelling based on the Basal Temperature of Snow (BTS) method was used to develop spatial relationships. Model inputs include equivalent elevation (a variable that incorporates non-uniform temperature change with elevation), potential incoming solar radiation and slope. Probability relationships between predicted BTS and permafrost presence were developed for each area using late-summer physical observations in pits, or by using year-round ground temperature measurements. A high-resolution spatial model for the region has now been generated based on seven of the area models. Each was applied to the entire region, and their predictions were then blended based on a distance decay function from the model source area. The regional model is challenging to validate independently because there are few boreholes in the region. However, a comparison of results to a recently established inventory of rock glaciers for the Yukon suggests its validity because predicted permafrost probabilities were 0.8 or greater for almost 90% of these landforms. Furthermore, the regional model results have a similar spatial pattern to those modelled independently in the eighth area, although predicted probabilities using the regional model are generally higher. The regional model predicts that permafrost underlies about half of the non-glaciated terrain in the region, with probabilities increasing regionally from south to north and from east to west. Elevation is significant, but not always linked in a straightforward fashion because of weak or inverted trends in permafrost probability below treeline. Above treeline, however, permafrost probabilities increase and approach 1.0 in very high elevation areas throughout the study region. The regional model shows many similarities to previous Canadian permafrost maps (Heginbottom and Radburn, 1992; Heginbottom et al., 1995) but is several orders of magnitude more detailed. It also exhibits some significant differences, including the presence of an area of valley-floor continuous permafrost around Beaver Creek near the Alaskan border in the west, as well as higher probabilities of permafrost in the central parts of the region near the boundaries of the sporadic and extensive discontinuous zones. In addition, parts of the northernmost portion of the region would be classified as sporadic discontinuous permafrost because of inversions in the terrestrial surface lapse rate which cause permafrost probabilities to decrease with elevation through the forest. These model predictions are expected to be of direct use for infrastructure planning and northern development and can serve as a benchmark for future studies of permafrost distribution in the Yukon.
References:
Heginbottom JR, Dubreuil MA and Haker PT. 1995. Canada Permafrost (1:7,500,000 scale). In The National Atlas of Canada, 5th Edition, sheet MCR 4177. Ottawa: Natural Resources Canada.
Heginbottom JA and Radburn LK. 1992. Permafrost and ground ice conditions of northwestern Canada. Geological Survey of Canada, Map 1691A, scale 1:1,000,000. Digitized by S. Smith, Geological Survey of Canada.
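The blending of the seven area models into a regional prediction can be sketched as a distance-weighted average of per-area probabilities, with weights decaying away from each model's source area. The exponential decay form and length scale below are assumptions, since the abstract does not specify the function used, and the probabilities and distances are hypothetical.

```python
import numpy as np

# Permafrost probabilities predicted for one grid cell by each area model,
# and the cell's distance (km) to each model's source area (hypothetical values).
p_area = np.array([0.85, 0.70, 0.60, 0.90, 0.55, 0.75, 0.65])
dist_km = np.array([15.0, 60.0, 120.0, 40.0, 200.0, 90.0, 300.0])

def blend(p, d, length_scale=80.0):
    """Distance-decay weighted average (exponential decay assumed)."""
    w = np.exp(-d / length_scale)
    return np.sum(w * p) / np.sum(w)

print(f"blended permafrost probability = {blend(p_area, dist_km):.2f}")
```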
Natural and technologic hazardous material releases during and after natural disasters: a review.
Young, Stacy; Balluz, Lina; Malilay, Josephine
2004-04-25
Natural disasters may be powerful and prominent mechanisms of direct and indirect hazardous material (hazmat) releases. Hazardous materials that are released as the result of a technologic malfunction precipitated by a natural event are referred to as natural-technologic or na-tech events. Na-tech events pose unique environmental and human hazards. Disaster-associated hazardous material releases are of concern, given increases in population density and accelerating industrial development in areas subject to natural disasters. These trends increase the probability of catastrophic future disasters and the potential for mass human exposure to hazardous materials released during disasters. This systematic review summarizes direct and indirect disaster-associated releases, as well as environmental contamination and adverse human health effects that have resulted from natural disaster-related hazmat incidents. Thorough examination of historic disaster-related hazmat releases can be used to identify future threats and improve mitigation and prevention efforts.
Predicting hospital visits from geo-tagged Internet search logs.
Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H
2016-01-01
The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user's future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources-a crucial prerequisite for securing healthcare access for everyone in the days to come.
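A stripped-down version of this kind of model is a regularized logistic regression on bag-of-words search terms plus a distance feature, predicting whether a facility visit follows. The toy queries, distances, and labels below are invented, and a real system would need de-identified data and far richer features.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy geo-tagged queries, distance to nearest facility (km), and visit labels
queries = ["urgent care near me", "chest pain what to do", "pizza delivery",
           "er wait times downtown", "weather tomorrow", "walk in clinic hours"]
dist_km = np.array([1.2, 3.5, 0.8, 2.0, 5.0, 1.0])
visited = np.array([1, 1, 0, 1, 0, 1])

vec = CountVectorizer()
X = hstack([vec.fit_transform(queries), csr_matrix(dist_km.reshape(-1, 1))])

clf = LogisticRegression(max_iter=1000).fit(X, visited)

new_query = ["knee injury urgent care"]
x_new = hstack([vec.transform(new_query), csr_matrix(np.array([[2.5]]))])
print("P(visit) =", clf.predict_proba(x_new)[0, 1].round(2))
```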
Epstein, Leonard H; Jankowiak, Noelle; Lin, Henry; Paluch, Rocco; Koffarnus, Mikhail N; Bickel, Warren K
2014-09-01
Low income is related to food insecurity, and research has suggested that a scarcity of resources associated with low income can shift attention to the present, thereby discounting the future. We tested whether attending to the present and discounting the future may moderate the influence of income on food insecurity. Delay discounting and measures of future time perspective (Zimbardo Time Perspective Inventory, Consideration of Future Consequences Scale, time period of financial planning, and subjective probability of living to age 75 y) were studied as moderators of the relation between income and food insecurity in a diverse sample of 975 adults, 31.8% of whom experienced some degree of food insecurity. Income, financial planning, subjective probability of living to age 75 y, and delay discounting predicted food insecurity as well as individuals who were high in food insecurity. Three-way interactions showed that delay discounting interacted with financial planning and income to predict food insecurity (P = 0.003). At lower levels of income, food insecurity was lowest for subjects who had good financial planning skills and did not discount the future, whereas having good financial skills and discounting the future had minimal influence on food insecurity. The same 3-way interaction was observed when high food insecurity was predicted (P = 0.008). Because of the role of scarce resources on narrowing attention and reducing prospective thinking, research should address whether modifying future orientation may reduce food insecurity even in the face of diminishing financial resources. © 2014 American Society for Nutrition.
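The moderation analysis described above amounts to a regression with a three-way interaction among income, delay discounting, and financial planning. A minimal sketch with the statsmodels formula interface on simulated data; the variable names and generating process are hypothetical, not the study's measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 975
df = pd.DataFrame({
    "income": rng.normal(0, 1, n),       # standardized income
    "discount": rng.normal(0, 1, n),     # delay discounting (higher = steeper)
    "planning": rng.normal(0, 1, n),     # financial planning horizon
})
# Simulated outcome: low income, steep discounting and poor planning raise insecurity risk
lin = -0.5 - 0.8 * df.income + 0.4 * df.discount - 0.3 * df.planning \
      + 0.3 * df.income * df.discount * df.planning
df["insecure"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

model = smf.logit("insecure ~ income * discount * planning", data=df).fit(disp=0)
print(model.params.filter(like=":"))     # interaction coefficients, incl. the 3-way term
```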
Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties
NASA Astrophysics Data System (ADS)
Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris
2014-08-01
We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
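As a minimal illustration of the risk metric described above, the sketch below computes the probability of exceeding a planned Level of Service from an ensemble of synthetic futures. The random supply series, demand growth rate, and Level of Service value are invented stand-ins for the weather-generator output and the London supply-system model.

```python
# Sketch of the risk metric only; hypothetical synthetic series replace the UKCP09-based
# weather generator and a one-line supply/demand balance replaces the system model.
import numpy as np

rng = np.random.default_rng(0)
n_series, n_years = 1000, 25
planned_freq = 0.05                                  # planned Level of Service: shortages in <= 5% of years

# Hypothetical annual supply (climate-driven) and demand (with 1% per annum growth).
supply = rng.normal(loc=2100, scale=180, size=(n_series, n_years))
demand = 1900 * (1.01 ** np.arange(n_years))

shortage_years = supply < demand                     # shortage years in each synthetic future
freq = shortage_years.mean(axis=1)                   # frequency of shortages per synthetic series

# Risk metric: probability that the planned Level of Service is exceeded.
risk = (freq > planned_freq).mean()
print(f"P(frequency of shortages exceeds planned LoS) = {risk:.2f}")
```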
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R; Gobbato, Maurizio; Conte, Joel
2009-01-01
The extensive use of lightweight advanced composite materials in unmanned aerial vehicles (UAVs) drastically increases the sensitivity to both fatigue- and impact-induced damage of their critical structural components (e.g., wings and tail stabilizers) during service life. The spar-to-skin adhesive joints are considered one of the most fatigue sensitive subcomponents of a lightweight UAV composite wing with damage progressively evolving from the wing root. This paper presents a comprehensive probabilistic methodology for predicting the remaining service life of adhesively-bonded joints in laminated composite structural components of UAVs. Non-destructive evaluation techniques and Bayesian inference are used to (i) assess the current state of damage of the system and (ii) update the probability distribution of the damage extent at various locations. A probabilistic model for future loads and a mechanics-based damage model are then used to stochastically propagate damage through the joint. Combined local (e.g., exceedance of a critical damage size) and global (e.g., flutter instability) failure criteria are finally used to compute the probability of component failure at future times. The applicability and the partial validation of the proposed methodology are then briefly discussed by analyzing the debonding propagation, along a pre-defined adhesive interface, in a simply supported laminated composite beam with solid rectangular cross section, subjected to a concentrated load applied at mid-span. A specially developed Euler-Bernoulli beam finite element with interlaminar slip along the damageable interface is used in combination with a cohesive zone model to study the fatigue-induced degradation in the adhesive material. The preliminary numerical results presented are promising for the future validation of the methodology.
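A drastically reduced sketch of the prognosis chain the abstract outlines (posterior damage state from inspection, stochastic damage propagation, probability of exceeding a failure criterion over future flight hours). The growth law and every parameter value are invented; the paper's cohesive-zone finite element model is replaced by a one-line growth rate.

```python
# Toy prognosis sketch; all distributions, the growth law, and the failure criterion are
# illustrative stand-ins for the paper's Bayesian-updated damage state and FE-based models.
import numpy as np

rng = np.random.default_rng(8)
n = 20_000
a_crit = 60.0                                    # critical debond length (mm), local failure criterion

a = rng.lognormal(np.log(8.0), 0.3, n)           # current debond length after inspection (mm)
C = rng.lognormal(np.log(4e-3), 0.5, n)          # growth-rate coefficient (mm per flight hour)
m = 1.3                                          # growth exponent

dt, horizon = 50, 5000                           # time step and prediction horizon (flight hours)
for t in range(dt, horizon + 1, dt):
    a = a + C * a**m * dt                        # stochastic damage propagation
    if t % 1000 == 0:
        print(f"t = {t:5d} flight hours   P(failure) = {np.mean(a >= a_crit):.3f}")
```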
Hansen, Maj; Hyland, Philip; Armour, Cherie
2016-05-01
Recently studies have indicated the existence of both posttraumatic stress disorder (PTSD) and acute stress disorder (ASD) subtypes but no studies have investigated their mutual association. Although ASD may not be a precursor of PTSD per se, there are potential benefits associated with early identification of victims at risk of developing PTSD subtypes. The present study investigates ASD and PTSD subtypes using latent class analysis (LCA) following bank robbery (N=371). Moreover, we assessed if highly symptomatic ASD and selected risk factors increased the probability of highly symptomatic PTSD. The results of LCA revealed a three class solution for ASD and a two class solution for PTSD. Negative cognitions about self (OR=1.08), neuroticism (OR=1.09) and membership of the 'High symptomatic ASD' class (OR=20.41) significantly increased the probability of 'symptomatic PTSD' class membership. Future studies are needed to investigate the existence of ASD and PTSD subtypes and their mutual relationship. Copyright © 2016 Elsevier Ltd. All rights reserved.
Dosimetry in nuclear medicine therapy: radiobiology application and results.
Strigari, L; Benassi, M; Chiesa, C; Cremonesi, M; Bodei, L; D'Andrea, M
2011-04-01
The linear quadratic model (LQM) has largely been used to assess the radiobiological damage to tissue by external beam fractionated radiotherapy and more recently has been extended to encompass a general continuous time varying dose rate protocol such as targeted radionuclide therapy (TRT). In this review, we provide the basic aspects of radiobiology, from a theoretical point of view, starting from the "four Rs" of radiobiology and introducing the biologically effective doses, which may be used to quantify the impact of a treatment on both tumors and normal tissues. We also present the main parameters required in the LQM, and illustrate the main models of tumor control probability and normal tissue complication probability and summarize the main dose-effect responses, reported in literature, which demonstrate the tentative link between targeted radiotherapy doses and those used in conventional radiotherapy. A better understanding of the radiobiology and mechanisms of action of TRT could contribute to describe the clinical data and guide the development of future compounds and the designing of prospective clinical trials.
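For reference, the standard LQ-model quantities the review builds on can be written as below. The notation, and the specific repair-kinetics extension used for targeted radionuclide therapy, may differ from the paper's, so treat this as a generic summary rather than the authors' exact formulation.

```latex
% Surviving fraction and biologically effective dose for n fractions of size d:
\[
  \mathrm{SF} = \exp\!\left[-n\left(\alpha d + \beta d^{2}\right)\right],
  \qquad
  \mathrm{BED} = nd\left(1 + \frac{d}{\alpha/\beta}\right).
\]
% A commonly quoted extension for a permanently decaying source (initial dose rate R_0,
% effective decay constant \lambda, mono-exponential repair rate \mu), often used for
% targeted radionuclide therapy:
\[
  \mathrm{BED} \approx D\left(1 + \frac{R_0}{(\mu + \lambda)\,(\alpha/\beta)}\right),
  \qquad
  D = \frac{R_0}{\lambda}.
\]
```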
Flies, worms and the Free Radical Theory of ageing.
Clancy, David; Birdsall, John
2013-01-01
Drosophila and Caenorhabditis elegans have provided the largest body of evidence addressing the Free Radical Theory of ageing, however the evidence has not been unequivocally supportive. Oxidative damage to DNA is probably not a major contributor, damage to lipids is assuming greater importance and damage to proteins probably the source of pathology. On balance the evidence does not support a primary role of oxidative damage in ageing in C. elegans, perhaps because of its particular energy metabolic and stress resistance profile. Evidence is more numerous, varied and consistent and hence more compelling for Drosophila, although not conclusive. However there is good evidence for a role of oxidative damage in later life pathology. Future work should: 1/ make more use of protein oxidative damage measurements; 2/ use inducible transgenic systems or pharmacotherapy to ensure genetic equivalence of controls and avoid confounding effects during development; 3/ to try to delay ageing, target interventions which reduce and/or repair protein oxidative damage. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.
McCarthy, Matthew W.; Petraitiene, Ruta; Walsh, Thomas J.
2017-01-01
Early diagnosis and prompt initiation of appropriate antimicrobial therapy are crucial steps in the management of patients with invasive fungal infections. However, the diagnosis of invasive mycoses remains a major challenge in clinical practice, because presenting symptoms may be subtle and non-invasive diagnostic assays often lack sensitivity and specificity. Diagnosis is often expressed on a scale of probability (proven, probable and possible) based on a constellation of imaging findings, microbiological tools and histopathology, as there is no stand-alone assay for diagnosis. Recent data suggest that the carbohydrate biomarker (1→3)-β-d-glucan may be useful in both the diagnosis and therapeutic monitoring of invasive fungal infections due to some yeasts, molds, and dimorphic fungi. In this paper, we review recent advances in the use of (1→3)-β-d-glucan to monitor clinical response to antifungal therapy and explore how this assay may be used in the future. PMID:28538702
Testing option pricing with the Edgeworth expansion
NASA Astrophysics Data System (ADS)
Balieiro Filho, Ruy Gabriel; Rosenfeld, Rogerio
2004-12-01
There is a well-developed framework, the Black-Scholes theory, for the pricing of contracts based on the future prices of certain assets, called options. This theory assumes that the probability distribution of the returns of the underlying asset is a Gaussian distribution. However, it is observed in the market that this hypothesis is flawed, leading to the introduction of a fudge factor, the so-called volatility smile. Therefore, it would be interesting to explore extensions of the Black-Scholes theory to non-Gaussian distributions. In this paper, we provide an explicit formula for the price of an option when the distribution of the returns of the underlying asset is parametrized by an Edgeworth expansion, which allows for the introduction of higher independent moments of the probability distribution, namely skewness and kurtosis. We test our formula with options in the Brazilian and American markets, showing that the volatility smile can be reduced. We also check whether our approach leads to more efficient hedging strategies of these instruments.
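The closed-form expression is in the paper; as an illustration of the idea only, the sketch below prices a European call by numerically integrating the payoff against a truncated Edgeworth density (Gaussian core plus skewness and excess-kurtosis corrections), with the drift renormalised so the discounted forward is preserved. Parameter values are invented, and setting both correction terms to zero recovers the Black-Scholes price numerically.

```python
# Illustrative only: numerical Edgeworth-corrected call price, not the paper's closed form.
import numpy as np

def edgeworth_pdf(x, gamma1, gamma2):
    """Standard normal density with third/fourth-order Edgeworth correction terms."""
    phi = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
    he3 = x**3 - 3 * x
    he4 = x**4 - 6 * x**2 + 3
    return phi * (1 + gamma1 / 6 * he3 + gamma2 / 24 * he4)

def call_price(S0, K, r, sigma, T, gamma1=0.0, gamma2=0.0):
    x = np.linspace(-10, 10, 20001)
    dx = x[1] - x[0]
    f = np.clip(edgeworth_pdf(x, gamma1, gamma2), 0.0, None)
    f /= np.sum(f) * dx                                  # renormalise the truncated density
    growth = np.exp(sigma * np.sqrt(T) * x)
    growth /= np.sum(growth * f) * dx                    # enforce E[S_T] = S0 * exp(rT)
    ST = S0 * np.exp(r * T) * growth
    return np.exp(-r * T) * np.sum(np.maximum(ST - K, 0.0) * f) * dx

print(call_price(100, 100, 0.05, 0.2, 0.5))                              # Gaussian returns (~Black-Scholes)
print(call_price(100, 100, 0.05, 0.2, 0.5, gamma1=-0.5, gamma2=1.0))     # skewed, fat-tailed returns
```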
NASA Astrophysics Data System (ADS)
Arena, Dylan A.; Schwartz, Daniel L.
2014-08-01
Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.
A Bayesian approach to microwave precipitation profile retrieval
NASA Technical Reports Server (NTRS)
Evans, K. Franklin; Turk, Joseph; Wong, Takmeng; Stephens, Graeme L.
1995-01-01
A multichannel passive microwave precipitation retrieval algorithm is developed. Bayes theorem is used to combine statistical information from numerical cloud models with forward radiative transfer modeling. A multivariate lognormal prior probability distribution contains the covariance information about hydrometeor distribution that resolves the nonuniqueness inherent in the inversion process. Hydrometeor profiles are retrieved by maximizing the posterior probability density for each vector of observations. The hydrometeor profile retrieval method is tested with data from the Advanced Microwave Precipitation Radiometer (10, 19, 37, and 85 GHz) of convection over ocean and land in Florida. The CP-2 multiparameter radar data are used to verify the retrieved profiles. The results show that the method can retrieve approximate hydrometeor profiles, with larger errors over land than water. There is considerably greater accuracy in the retrieval of integrated hydrometeor contents than of profiles. Many of the retrieval errors are traced to problems with the cloud model microphysical information, and future improvements to the algorithm are suggested.
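A toy version of the retrieval idea, for illustration only: a random linear operator stands in for the radiative transfer forward model, a multivariate lognormal prior carries the hydrometeor covariance, and the profile is retrieved by maximising the posterior density. None of the numbers correspond to the AMPR channels or cloud-model statistics used in the paper.

```python
# Toy Bayesian MAP retrieval; linear "forward model" and all statistics are hypothetical.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

n_layers, n_channels = 4, 4                           # hydrometeor layers, radiometer channels
A = rng.uniform(0.5, 2.0, (n_channels, n_layers))     # stand-in linear forward operator
mu_log = np.log(np.full(n_layers, 0.3))               # prior mean of log contents
cov_log = 0.4 * (0.6 * np.eye(n_layers) + 0.4)        # prior covariance of log contents
prec_log = np.linalg.inv(cov_log)
sigma_tb = 2.0                                        # brightness-temperature error (K)

truth = np.exp(rng.multivariate_normal(mu_log, cov_log))
tb_obs = A @ truth + rng.normal(0, sigma_tb, n_channels)

def neg_log_posterior(log_q):
    resid = tb_obs - A @ np.exp(log_q)
    prior = 0.5 * (log_q - mu_log) @ prec_log @ (log_q - mu_log)
    likelihood = 0.5 * np.sum(resid**2) / sigma_tb**2
    return prior + likelihood

res = minimize(neg_log_posterior, mu_log, method="BFGS")
print("true profile:     ", np.round(truth, 3))
print("retrieved profile:", np.round(np.exp(res.x), 3))
```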
More than Just Convenient: The Scientific Merits of Homogeneous Convenience Samples
Jager, Justin; Putnick, Diane L.; Bornstein, Marc H.
2017-01-01
Despite their disadvantaged generalizability relative to probability samples, non-probability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examine developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability relative to conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. PMID:28475254
Forecasting relative impacts of land use on anadromous fish habitat to guide conservation planning.
Lohse, Kathleen A; Newburn, David A; Opperman, Jeff J; Merenlender, Adina M
2008-03-01
Land use change can adversely affect water quality and freshwater ecosystems, yet our ability to predict how systems will respond to different land uses, particularly rural-residential development, is limited by data availability and our understanding of biophysical thresholds. In this study, we use spatially explicit parcel-level data to examine the influence of land use (including urban, rural-residential, and vineyard) on salmon spawning substrate quality in tributaries of the Russian River in California. We develop a land use change model to forecast the probability of losses in high-quality spawning habitat and recommend priority areas for incentive-based land conservation efforts. Ordinal logistic regression results indicate that all three land use types were negatively associated with spawning substrate quality, with urban development having the largest marginal impact. For two reasons, however, forecasted rural-residential and vineyard development have much larger influences on decreasing spawning substrate quality relative to urban development. First, the land use change model estimates 10 times greater land use conversion to both rural-residential and vineyard compared to urban. Second, forecasted urban development is concentrated in the most developed watersheds, which already have poor spawning substrate quality, such that the marginal response to future urban development is less significant. To meet the goals of protecting salmonid spawning habitat and optimizing investments in salmon recovery, we suggest investing in watersheds where future rural-residential development and vineyards threaten high-quality fish habitat, rather than the most developed watersheds, where land values are higher.
Neural response to reward anticipation under risk is nonlinear in probabilities.
Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F
2009-02-18
A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing, and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.
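For concreteness, one commonly used one-parameter form of the probability weighting function (Tversky and Kahneman's) is sketched below; the study may estimate a different parameterisation, and the value of gamma here is only the often-cited estimate for gains.

```python
# Prospect-theory probability weighting; gamma < 1 gives the inverse-S shape that
# overweights small and underweights large probabilities.
import numpy as np

def w(p, gamma=0.61):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in [0.01, 0.10, 0.50, 0.90, 0.99]:
    print(f"p = {p:.2f}  ->  w(p) = {w(p):.3f}")
```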
Rodent models in Down syndrome research: impact and future opportunities.
Herault, Yann; Delabar, Jean M; Fisher, Elizabeth M C; Tybulewicz, Victor L J; Yu, Eugene; Brault, Veronique
2017-10-01
Down syndrome is caused by trisomy of chromosome 21. To date, a multiplicity of mouse models with Down-syndrome-related features has been developed to understand this complex human chromosomal disorder. These mouse models have been important for determining genotype-phenotype relationships and identification of dosage-sensitive genes involved in the pathophysiology of the condition, and in exploring the impact of the additional chromosome on the whole genome. Mouse models of Down syndrome have also been used to test therapeutic strategies. Here, we provide an overview of research in the last 15 years dedicated to the development and application of rodent models for Down syndrome. We also speculate on possible and probable future directions of research in this fast-moving field. As our understanding of the syndrome improves and genome engineering technologies evolve, it is necessary to coordinate efforts to make all Down syndrome models available to the community, to test therapeutics in models that replicate the whole trisomy and design new animal models to promote further discovery of potential therapeutic targets. © 2017. Published by The Company of Biologists Ltd.
Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.
2003-01-01
Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
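A minimal forward simulation of the kind of first-order Markov colony-site dynamics the paper models, with invented extinction, colonisation, and detection probabilities. It only illustrates why imperfect detection biases apparent occupancy, which is what the robust-design estimators are meant to correct; it is not the estimation machinery itself.

```python
# Hypothetical Markov colony-site dynamics with imperfect detection.
import numpy as np

rng = np.random.default_rng(2)
n_sites, n_years = 200, 15
eps, gam, p = 0.2, 0.1, 0.7                   # extinction, colonisation, detection probabilities

occupied = np.zeros((n_sites, n_years), dtype=bool)
occupied[:, 0] = rng.random(n_sites) < 0.5
for t in range(1, n_years):
    stay = occupied[:, t - 1] & (rng.random(n_sites) >= eps)
    colonise = ~occupied[:, t - 1] & (rng.random(n_sites) < gam)
    occupied[:, t] = stay | colonise

detected = occupied & (rng.random(occupied.shape) < p)
print("true occupancy by year:    ", occupied.mean(axis=0).round(2))
print("apparent occupancy by year:", detected.mean(axis=0).round(2))
```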
Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria
2012-01-01
This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60 year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
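A stripped-down sketch of the probabilistic combination step, with hypothetical strain values and distribution parameters: each monitored rib location receives a fracture probability by comparing its peak strain with an age-adjusted ultimate-strain distribution, and the per-location probabilities are combined, assuming independence, into a distribution over the number of fractured ribs.

```python
# Hypothetical ultimate-strain distribution and FE peak strains; not the paper's data.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(3)

ult = lognorm(s=0.25, scale=0.018)                    # age-adjusted ultimate strain, median ~1.8%
peak_strain = rng.uniform(0.005, 0.02, size=24)       # peak strain at 24 rib locations from an FE run

p_frac = ult.cdf(peak_strain)                         # P(ultimate strain < peak strain) per location

# Poisson-binomial distribution of the fracture count, built by convolution (independence assumed).
dist = np.array([1.0])
for p in p_frac:
    dist = np.convolve(dist, [1 - p, p])

print("expected number of fractured ribs:", np.round(p_frac.sum(), 2))
print("P(3 or more fractures):", np.round(dist[3:].sum(), 3))
```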
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodman, B.W.; Begley, J.A.; Brown, S.D.
1995-12-01
The analysis of the issue of upper bundle axial ODSCC as it applies to steam generator tube structural integrity in Unit 1 at the Palo Verde Nuclear Generating Station is presented in this study. Based on past inspection results for Units 2 and 3 at Palo Verde, the detection of secondary side stress corrosion cracks in the upper bundle region of Unit 1 may occur at some future date. The following discussion provides a description and analysis of the probability of axial ODSCC in Unit 1 leading to the exceedance of Regulatory Guide 1.121 structural limits. The probabilities of structural limit exceedance are estimated as a function of run time using a conservative approach. The chosen approach models the historical development of cracks, crack growth, detection of cracks and subsequent removal from service and the initiation and growth of new cracks during a given cycle of operation. Past performance of all Palo Verde Units as well as the historical performance of other steam generators was considered in the development of cracking statistics for application to Unit 1. Data in the literature and Unit 2 pulled tube examination results were used to construct probability of detection curves for the detection of axial IGSCC/IGA using an MRPC (multi-frequency rotating pancake coil) eddy current probe. Crack growth rates were estimated from Unit 2 eddy current inspection data combined with pulled tube examination results and data in the literature. A Monte-Carlo probabilistic model is developed to provide an overall assessment of the risk of Regulatory Guide exceedance during plant operation.
NASA Astrophysics Data System (ADS)
Vico, Giulia; Porporato, Amilcare
2013-04-01
Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to further increase in the near future, in the face of the projected alterations of rainfall patterns and increase in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability, while maintaining sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability are obtained analytically, as a function of irrigation strategy, climate, soil and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, the knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios. Hence, the proposed probabilistic framework provides a quantitative tool to assess the impact of irrigation strategy and water allocation on the risk of not meeting a certain target yield, thus guiding the optimal allocation of water resources for human and environmental needs.
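The paper derives the yield probability density function analytically; purely to illustrate how a risk index follows from such a pdf, the sketch below uses hypothetical beta-distributed yields under two notional strategies and evaluates the long-term risk of falling short of a target yield.

```python
# Hypothetical yield pdfs; only the risk-index calculation mirrors the abstract's idea.
from scipy.stats import beta

y_max = 10.0                      # attainable yield (t/ha), hypothetical
target = 7.0                      # target yield (t/ha)

# Hypothetical yield pdfs under two irrigation strategies (more water -> pdf shifted right).
rainfed = beta(a=2.5, b=2.0, scale=y_max)
deficit_irrigated = beta(a=5.0, b=1.8, scale=y_max)

for name, pdf in [("rainfed", rainfed), ("deficit irrigation", deficit_irrigated)]:
    risk = pdf.cdf(target)        # long-term risk index: P(yield < target)
    print(f"{name:20s} P(yield < {target} t/ha) = {risk:.2f}")
```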
Controlling the Growth of Future LEO Debris Populations with Active Debris Removal
NASA Technical Reports Server (NTRS)
Liou, J.-C.; Johnson, N. L.; Hill, N. M.
2008-01-01
Active debris removal (ADR) was suggested as a potential means to remediate the low Earth orbit (LEO) debris environment as early as the 1980s. The reasons ADR has not become practical are due to its technical difficulties and the high cost associated with the approach. However, as the LEO debris populations continue to increase, ADR may be the only option to preserve the near-Earth environment for future generations. An initial study was completed in 2007 to demonstrate that a simple ADR target selection criterion could be developed to reduce the future debris population growth. The present paper summarizes a comprehensive study based on more realistic simulation scenarios, including fragments generated from the 2007 Fengyun-1C event, mitigation measures, and other target selection options. The simulations were based on the NASA long-term orbital debris projection model, LEGEND. A scenario, where at the end of mission lifetimes, spacecraft and upper stages were moved to 25-year decay orbits, was adopted as the baseline environment for comparison. Different annual removal rates and different ADR target selection criteria were tested, and the resulting 200-year future environment projections were compared with the baseline scenario. Results of this parametric study indicate that (1) an effective removal strategy can be developed based on the mass and collision probability of each object as the selection criterion, and (2) the LEO environment can be stabilized in the next 200 years with an ADR removal rate of five objects per year.
Participatory design of probability-based decision support tools for in-hospital nurses.
Jeffery, Alvin D; Novak, Laurie L; Kennedy, Betsy; Dietrich, Mary S; Mion, Lorraine C
2017-11-01
To describe nurses' preferences for the design of a probability-based clinical decision support (PB-CDS) tool for in-hospital clinical deterioration. A convenience sample of bedside nurses, charge nurses, and rapid response nurses (n = 20) from adult and pediatric hospitals completed participatory design sessions with researchers in a simulation laboratory to elicit preferred design considerations for a PB-CDS tool. Following theme-based content analysis, we shared findings with user interface designers and created a low-fidelity prototype. Three major themes and several considerations for design elements of a PB-CDS tool surfaced from end users. Themes focused on "painting a picture" of the patient condition over time, promoting empowerment, and aligning probability information with what a nurse already believes about the patient. The most notable design element consideration included visualizing a temporal trend of the predicted probability of the outcome along with user-selected overlapping depictions of vital signs, laboratory values, and outcome-related treatments and interventions. Participants expressed that the prototype adequately operationalized requests from the design sessions. Participatory design served as a valuable method in taking the first step toward developing PB-CDS tools for nurses. This information about preferred design elements of tools that support, rather than interrupt, nurses' cognitive workflows can benefit future studies in this field as well as nurses' practice. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the United States.
NASA Technical Reports Server (NTRS)
Chase, Thomas D.; Splawn, Keith; Christiansen, Eric L.
2007-01-01
The NASA Extravehicular Mobility Unit (EMU) micrometeoroid and orbital debris protection ability has recently been assessed against an updated, higher threat space environment model. The new environment was analyzed in conjunction with a revised EMU solid model using a NASA computer code. Results showed that the EMU exceeds the required mathematical Probability of having No Penetrations (PNP) of any suit pressure bladder over the remaining life of the program (2,700 projected hours of 2 person spacewalks). The success probability was calculated to be 0.94, versus a requirement of >0.91, for the current spacesuit's outer protective garment. In parallel to the probability assessment, potential improvements to the current spacesuit's outer protective garment were built and impact tested. A NASA light gas gun was used to launch projectiles at test items, at speeds of approximately 7 km per second. Test results showed that substantial garment improvements could be made, with mild material enhancements and moderate assembly development. The spacesuit's PNP would improve marginally with the tested enhancements, if they were available for immediate incorporation. This paper discusses the results of the model assessment process and test program. These findings add confidence to the continued use of the existing NASA EMU during International Space Station (ISS) assembly and Shuttle Operations. They provide a viable avenue for improved hypervelocity impact protection for the EMU, or for future space suits.
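The abstract quotes PNP values without the underlying relation; MMOD risk assessments conventionally treat penetrating impacts as a Poisson process, so PNP = exp(-N), with N the expected number of penetrations over the exposure. The snippet below simply applies that convention to the quoted figures; the flux and area arguments are placeholders, not inputs of the NASA assessment code.

```python
# Conventional Poisson relation for MMOD risk, applied to the quoted PNP values (illustrative).
import math

def pnp(penetrating_flux_per_m2_hr, exposed_area_m2, exposure_hours):
    """Probability of no penetrations = exp(-expected number of penetrating impacts)."""
    n_expected = penetrating_flux_per_m2_hr * exposed_area_m2 * exposure_hours
    return math.exp(-n_expected)

# Expected penetrations implied by the reported probabilities over the program exposure:
for p in (0.94, 0.91):
    print(f"PNP = {p:.2f}  ->  implied expected penetrations N = {-math.log(p):.3f}")
```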
Tectonics and hydrocarbon potential of the Barents Megatrough
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baturin, D.; Vinogradov, A.; Yunov, A.
1991-08-01
Interpretation of geophysical data shows that the geological structure of the Eastern Barents Shelf, named Barents Megatrough (BM), extends sublongitudinally almost from the Baltic shield to the Franz Josef Land archipelago. The earth's crust within the axial part of the BM is attenuated up to 28-30 km, whereas in adjacent areas its thickness exceeds 35 km. The depression is filled with more than 15 km of Upper Paleozoic, Mesozoic, and Cenozoic sediments overlying a folded basement of probable Caledonian age. Paleozoic sediments, with the exception of the Upper Permian, are composed mainly of carbonates and evaporites. Mesozoic-Cenozoic sediments are mostly terrigenous. The major force in the development of the BM was extensional tectonics. Three rifting phases are recognizable: Late Devonian-Early Carboniferous, Early Triassic, and Jurassic-Early Cretaceous. The principal features of the geologic structure and evolution of the BM during the late Paleozoic-Mesozoic correlate well with those of the Sverdrup basin, Canadian Arctic. A significant quantity of Late Jurassic-Early Cretaceous basaltic dikes and sills was intruded within the Triassic sequence during the third rifting phase. This was probably the main reason for trap disruption and hydrocarbon loss from Triassic structures. Lower Jurassic and Lower Cretaceous reservoir sandstones are most probably the main future objects for oil and gas discoveries within the BM. Upper Jurassic black shales are probably the main source rocks of the BM basin, as well as excellent structural traps for hydrocarbon fluids from the underlying sediments.
Reichle, Joe; Drager, Kathryn; Caron, Jessica; Parker-McGowan, Quannah
2016-11-01
This article examines the growth of aided augmentative and alternative communication (AAC) in providing support to children and youth with significant communication needs. Addressing current trends and offering a discussion of needs and probable future advances is framed around five guiding principles initially introduced by Williams, Krezman, and McNaughton. These include: (1) communication is a basic right and the use of AAC, especially at a young age, can help individuals realize their communicative potential; (2) AAC, like traditional communication, requires it to be fluid with the ability to adapt to different environments and needs; (3) AAC must be individualized and appropriate for each user; (4) AAC must support full participation in society across all ages and interests; and (5) individuals who use AAC have the right to be involved in all aspects of research, development, and intervention. In each of these areas current advances, needs, and future predictions are offered and discussed in terms of researchers' and practitioners' efforts to a continued upward trajectory of research and translational service delivery. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
Improving Scotland's health: time for a fresh approach?
Stone, D H
2012-05-01
Scotland's health remains the worst in the UK. There are several probable reasons for this. Of those that are amenable to change, health improvement policy has been excessively preoccupied with targeting individuals perceived to be 'at risk' rather than adopting a whole population perspective. Environmental as opposed to behavioural approaches to health improvement have been relatively neglected. To meet the challenge of Scotland's poor health more effectively in the future, new strategic thinking is necessary. Three initial steps are required: recognize that current approaches are inadequate and that fresh ideas are needed; identify the principles that should underlie future strategy development; translate these principles into achievable operational objectives. Five principles of a revitalized strategy to improve the health of Scotland in the future are proposed. These are start early and sustain effort; create a healthy and safe environment; reduce geographical as well as social inequalities in health; adopt an evidence-based approach to public health interventions; use epidemiology to assess need, plan interventions and monitor progress. These principles may then be translated into achievable operational policy and practice objectives.
Weiser, Armin A; Gross, Stefan; Schielke, Anika; Wigger, Jan-Frederik; Ernert, Andrea; Adolphs, Julian; Fetsch, Alexandra; Müller-Graf, Christine; Käsbohrer, Annemarie; Mosbach-Schulz, Olaf; Appel, Bernd; Greiner, Matthias
2013-03-01
The Shiga toxin-producing Escherichia coli O104:H4 outbreak in Germany in 2011 required the development of appropriate tools in real-time for tracing suspicious foods along the supply chain, namely salad ingredients, sprouts, and seeds. Food commodities consumed at locations identified as the most probable site of infection (outbreak clusters) were traced back in order to identify connections between different disease clusters via the supply chain of the foods. A newly developed relational database with integrated consistency and plausibility checks was used to collate these data for further analysis. Connections between suppliers, distributors, and producers were visualized in network graphs and geographic projections. Finally, this trace-back and trace-forward analysis led to the identification of sprouts produced by a horticultural farm in Lower Saxony as the vehicle for the pathogen, and a specific lot of fenugreek seeds imported from Egypt as the most likely source of contamination. Network graphs have proven to be a powerful tool for summarizing and communicating complex trade relationships to various stakeholders. The present article gives a detailed description of the newly developed tracing tools and recommendations for necessary requirements and improvements for future foodborne outbreak investigations.
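To illustrate the trace-back logic the network graphs support, the sketch below builds a directed delivery graph from made-up trade records and asks which upstream suppliers can reach every outbreak cluster; the node names and structure are invented and unrelated to the actual investigation data.

```python
# Illustrative supply-chain trace-back on invented delivery records.
import networkx as nx

deliveries = [
    ("seed importer", "sprout farm A"), ("seed importer", "sprout farm B"),
    ("sprout farm A", "distributor 1"), ("sprout farm B", "distributor 2"),
    ("distributor 1", "restaurant X"),  ("distributor 2", "canteen Y"),
    ("other grower", "distributor 3"),  ("distributor 3", "restaurant X"),
]
clusters = ["restaurant X", "canteen Y"]          # locations identified as probable sites of infection

G = nx.DiGraph(deliveries)

# Trace back: which upstream suppliers are connected to every outbreak cluster?
common_sources = set.intersection(*(nx.ancestors(G, c) for c in clusters))
print("suppliers connected to all clusters:", sorted(common_sources))
```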
NASA Astrophysics Data System (ADS)
Cardona, O. D.
2013-05-01
Recently, earthquakes have struck cities in both developing and developed countries, revealing significant knowledge gaps and the need to improve the quality of input data and of the assumptions of the risk models. The quake and tsunami in Japan (2011) and the disasters due to earthquakes in Haiti (2010), Chile (2010), New Zealand (2011) and Spain (2011), only to mention some unexpected impacts in different regions, have left several concerns regarding hazard assessment as well as regarding the uncertainties associated with estimating future losses. Understanding probable losses and reconstruction costs due to earthquakes creates powerful incentives for countries to develop planning options and tools to cope with sovereign risk, including allocating the sustained budgetary resources necessary to reduce those potential damages and safeguard development. Therefore robust risk models are needed to assess the future economic impacts, the country's fiscal responsibilities and the contingent liabilities for governments and to formulate, justify and implement risk reduction measures and optimal financial strategies of risk retention and transfer. Special attention should be paid to the understanding of risk metrics such as the Loss Exceedance Curve (empiric and analytical) and the Expected Annual Loss in the context of conjoint and cascading hazards.
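As a small worked example of the two risk metrics mentioned, using an entirely made-up event set: the loss exceedance curve gives the annual rate of exceeding each loss level, and the expected annual loss is the sum of event loss times annual rate, i.e. the area under that curve.

```python
# Invented earthquake event set (loss in million USD, annual occurrence rate); illustrative only.
import numpy as np

losses = np.array([10, 50, 200, 800, 3000], dtype=float)
rates = np.array([0.20, 0.05, 0.01, 0.002, 0.0004])

order = np.argsort(losses)[::-1]
exceed_rate = np.cumsum(rates[order])[np.argsort(order)]   # annual rate of exceeding each event's loss

eal = np.sum(losses * rates)                               # expected annual loss
print("loss exceedance curve:", dict(zip(losses, exceed_rate.round(4))))
print(f"expected annual loss = {eal:.1f} MUSD/yr")

# Probable maximum loss at a 500-year return period: smallest loss with exceedance rate <= 1/500.
pml_500 = losses[exceed_rate <= 1 / 500].min()
print(f"PML(500 yr) = {pml_500:.0f} MUSD")
```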
A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation
NASA Astrophysics Data System (ADS)
Byun, K.; Hamlet, A. F.
2017-12-01
There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (using the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the estimated unique GEV parameters for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional MC and non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
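A reduced sketch of the super-ensemble idea with invented GEV parameters: each year of a 50-year design life gets its own (drifting) distribution, many lifespan realisations are drawn, and the resulting lifetime maxima are compared with a stationary baseline. The drift rates, units, and shape parameter are illustrative only.

```python
# Non-stationary Monte Carlo super ensemble with made-up, year-varying GEV parameters.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
n_real, lifespan = 10_000, 50
years = np.arange(lifespan)

# Hypothetical non-stationary parameters: location and scale grow through the design life.
loc = 100 + 0.6 * years           # e.g. mm/day
scale = 20 + 0.1 * years
shape = -0.1                      # scipy's c (c = -xi in the common GEV convention)

annual_max = genextreme.rvs(shape, loc=loc, scale=scale, size=(n_real, lifespan), random_state=rng)
lifetime_max = annual_max.max(axis=1)

stationary = genextreme.rvs(shape, loc=loc[0], scale=scale[0],
                            size=(n_real, lifespan), random_state=rng).max(axis=1)

for q in (0.5, 0.9, 0.99):
    print(f"q = {q:.2f}   stationary {np.quantile(stationary, q):7.1f}"
          f"   non-stationary {np.quantile(lifetime_max, q):7.1f}")
```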
1998-01-01
(79) Waste, by definition, has no benefit. It should be viewed as one aspect of the beneficial practice that gave rise to it. Furthermore, radioactive waste management should be placed in the context of the management of society's waste in general. (80) A major issue in evaluating the acceptability of a disposal system for long-lived solid radioactive waste is that doses or risks may arise from exposures in the distant future. There is uncertainty surrounding any estimate of these doses or risks due to lack of knowledge about future conditions. Such exposures are treated as potential exposures as their magnitude depends on future processes and conditions that have probabilities associated with them. (81) Nevertheless, the Commission recognises a basic principle that individuals and populations in the future should be afforded at least the same level of protection from the action of disposing of radioactive waste today as is the current generation. This implies use of the current quantitative dose and risk criteria derived from considering associated health detriment. Therefore, protection of future generations should be achieved by applying these dose or risk criteria to the estimated future doses or risks in appropriately defined critical groups. These estimates should not be regarded as measures of health detriment beyond times of around several hundreds of years into the future. In the case of these longer time periods, they represent indicators of the protection afforded by the disposal system. (82) Constrained optimisation is the central approach to evaluating the radiological acceptability of a waste disposal system; dose or risk constraints are used rather than dose or risk limits. By this transition from limitation to optimisation, the needs of practical application of the radiological protection system to the disposal of long-lived solid waste are met: determination of acceptability now for exposures that may occur in the distant future. Optimisation should be applied in an iterative manner during the disposal system development process and should particularly cover both site selection and repository design. (83) Two broad categories of exposure situations should be considered: natural processes and human intrusion. The latter only refers to intrusion that is inadvertent. The radiological implications of deliberate intrusion into a repository are the responsibility of the intruder. Assessed doses or risks arising from natural processes should be compared with a dose constraint of 0.3 mSv per year or its risk equivalent of around 10⁻⁵ per year. With regard to human intrusion, the consequences from one or more plausible stylized scenarios should be considered in order to evaluate the resilience of the repository to such events. (84) The Commission considers that in circumstances where human intrusion could lead to doses to those living around the site sufficiently high that intervention on current criteria would almost always be justified, reasonable efforts should be made at the repository development stage to reduce the probability of human intrusion or to limit its consequences. In this respect, the Commission has previously advised that an existing annual dose of around 10 mSv per year may be used as a generic reference level below which intervention is not likely to be justifiable. Conversely, an existing annual dose of around 100 mSv per year may be used as a generic reference level above which intervention should be considered almost always justifiable.
Similar considerations apply in situations where the thresholds for deterministic effects in relevant organs are exceeded. (85) Compliance with the constraints can be assessed by utilising either an aggregated risk-oriented approach, with a risk constraint, or a disaggregated dose/probability approach, with a dose constraint, or a combination of both. A similar level of protection can be achieved by any of these approaches; however, more information may
Wisk, Lauren E; Gangnon, Ronald; Vanness, David J; Galbraith, Alison A; Mullahy, John; Witt, Whitney P
2014-01-01
Objective To develop and validate a theoretically based and empirically driven objective measure of financial burden for U.S. families with children. Data Sources The measure was developed using 149,021 families with children from the National Health Interview Survey, and it was validated using 18,488 families with children from the Medical Expenditure Panel Survey. Study Design We estimated the marginal probability of unmet health care need due to cost using a bivariate tensor product spline for family income and out-of-pocket health care costs (OOPC; e.g., deductibles, copayments), while adjusting for confounders. Recursive partitioning was performed on these probabilities, as a function of income and OOPC, to establish thresholds demarcating levels of predicted risk. Principal Findings We successfully generated a novel measure of financial burden with four categories that were associated with unmet need (vs. low burden: midlow OR: 1.93, 95 percent CI: 1.78–2.09; midhigh OR: 2.78, 95 percent CI: 2.49–3.10; high OR: 4.38, 95 percent CI: 3.99–4.80). The novel burden measure demonstrated significantly better model fit and less underestimation of financial burden compared to an existing measure (OOPC/income ≥10 percent). Conclusion The newly developed measure of financial burden establishes thresholds based on different combinations of family income and OOPC that can be applied in future studies of health care utilization and expenditures and in policy development and evaluation. PMID:25328073
McDonald, Heather; Charles, Cathy; Gafni, Amiram
2011-01-01
Abstract Context Promoting patient participation in treatment decision making is of increasing interest to researchers, clinicians and policy makers. Decision aids (DAs) are advocated as one way to help achieve this goal. Despite their proliferation, there has been little agreement on criteria or standards for evaluating these tools. To fill this gap, an international collaboration of researchers and others interested in the development, content and quality of DAs have worked over the past several years to develop a checklist and, based on this checklist, an instrument for determining whether any given DA meets a defined set of quality criteria. Objective/Methods In this paper, we offer a framework for assessing the conceptual clarity and evidence base used to support the development of quality criteria/standards for evaluating DAs. We then apply this framework to assess the conceptual clarity and evidence base underlying the International Patient Decision Aids Standards (IPDAS) checklist criteria for one of the checklist domains: how best to present in DAs probability information to patients on treatment benefits and risks. Conclusion We found that some of the central concepts underlying the presenting probabilities domain were not defined. We also found gaps in the empirical evidence and theoretical support for this domain and criteria within this domain. Finally, we offer suggestions for steps that should be undertaken for further development and refinement of quality standards for DAs in the future. PMID:22050440
Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis
2016-03-01
A comparative decision-making process is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, the uncertainty problem complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while reducing the computational time.
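A Monte Carlo version of the decision confidence probability for two hypothetical options is sketched below; the FORM approach in the paper approximates the same quantity from a first-order expansion at the design point with far fewer model evaluations. The impact distributions are invented.

```python
# Monte Carlo estimate of the decision confidence probability for invented impact distributions.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Hypothetical life-cycle impacts (kg CO2-eq) of two options, each driven by uncertain parameters.
impact_a = rng.lognormal(mean=np.log(120), sigma=0.15, size=n)   # option A
impact_b = rng.normal(loc=135, scale=25, size=n)                 # option B

confidence = np.mean(impact_a < impact_b)    # P(option A has the smaller impact)
print(f"decision confidence probability P(A < B) = {confidence:.3f}")
```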
Managing fire and fuels in a warmer climate
David L. Peterson
2010-01-01
This historical perspective on fire provides a window into the future of fire in the Pacific Northwest. Although fire will always be more common in the interior portion of the region, a warmer climate could bring more fire to the westside of the Cascade Range where summers are typically dry and will probably become drier. If future climate resembles the climate now...
Probabilities of Possible Future Prices (Short-Term Energy Outlook Supplement April 2010)
2010-01-01
The Energy Information Administration introduced a monthly analysis of energy price volatility and forecast uncertainty in the October 2009 Short-Term Energy Outlook (STEO). Included in the analysis were charts portraying confidence intervals around the New York Mercantile Exchange (NYMEX) futures prices of West Texas Intermediate (equivalent to light sweet crude oil) and Henry Hub natural gas contracts.
ERIC Educational Resources Information Center
Kyslenko, Dmytro
2017-01-01
The paper discusses the use of information technologies in professional training of future security specialists in the United States, Great Britain, Poland and Israel. The probable use of computer-based techniques being available within the integrated Web-sites have been systematized. It has been suggested that the presented scheme may be of great…
Future fire probability modeling with climate change data and physical chemistry
Richard P. Guyette; Frank R. Thompson; Jodi Whittier; Michael C. Stambaugh; Daniel C. Dey
2014-01-01
Climate has a primary influence on the occurrence and rate of combustion in ecosystems with carbon-based fuels such as forests and grasslands. Society will be confronted with the effects of climate change on fire in future forests. There are, however, few quantitative appraisals of how climate will affect wildland fire in the United States. We demonstrated a method for...
Scenario studies as a synthetic and integrative research activity for Long-Term Ecological Research
Jonathan R. Thompson; Arnim Wiek; Frederick J. Swanson; Stephen R. Carpenter; Nancy Fresco; Teresa Hollingsworth; Thomas A. Spies; David R. Foster
2012-01-01
Scenario studies have emerged as a powerful approach for synthesizing diverse forms of research and for articulating and evaluating alternative socioecological futures. Unlike predictive modeling, scenarios do not attempt to forecast the precise or probable state of any variable at a given point in the future. Instead, comparisons among a set of contrasting scenarios...
Prediction of the future number of wells in production (in Spanish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coca, B.P.
1981-01-01
A method to predict the number of wells that will continue producing at a certain date in the future is presented. The method is applicable to reservoirs of the depletion type and is based on the survival probability concept. This is useful when forecasting by empirical methods. An example of a field in primary production is presented.
Looking to the Future: Will Behavior Analysis Survive and Prosper?
ERIC Educational Resources Information Center
Poling, Alan
2010-01-01
Behavior analysis as a discipline currently is doing relatively well. How it will do in the future is unclear and depends on how the field, and the world at large, changes. Five current characteristics of the discipline that appear to reduce the probability that it will survive and prosper are discussed and suggestions for improvement are offered.…
We'll Meet Again: Revealing Distributional and Temporal Patterns of Social Contact
Pachur, Thorsten; Schooler, Lael J.; Stevens, Jeffrey R.
2014-01-01
What are the dynamics and regularities underlying social contact, and how can contact with the people in one's social network be predicted? In order to characterize distributional and temporal patterns underlying contact probability, we asked 40 participants to keep a diary of their social contacts for 100 consecutive days. Using a memory framework previously used to study environmental regularities, we predicted that the probability of future contact would follow in systematic ways from the frequency, recency, and spacing of previous contact. The distribution of contact probability across the members of a person's social network was highly skewed, following an exponential function. As predicted, it emerged that future contact scaled linearly with frequency of past contact, proportionally to a power function with recency of past contact, and differentially according to the spacing of past contact. These relations emerged across different contact media and irrespective of whether the participant initiated or received contact. We discuss how the identification of these regularities might inspire more realistic analyses of behavior in social networks (e.g., attitude formation, cooperation). PMID:24475073
Use of computers in dysmorphology.
Diliberti, J H
1988-01-01
As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly makes the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092
Development of future indications for BOTOX.
Brin, Mitchell F
2009-10-01
Since the late 1970s, local injections of BoNT have provided clinical benefit for patients with inappropriately contracting muscles with or without pain or sensory disturbance. Marketing authorization for some BoNTs, depending on country, include core indications of dystonia (blepharospasm and cervical dystonia), large muscle spastic disorders (not yet approved in the United States, e.g., adult post-stroke spasticity and equinus foot deformity), hyperhidrosis and aesthetic. Subsequent development has extended to selected conditions characterized by recurrent or chronic pain (migraine headache), and urologic indications (neurogenic/idiopathic overactive bladder; prostate hyperplasia), with multiple additional opportunities available. Portfolio management requires a careful individual opportunity assessment of scientific and technical aspects (basic science foundation, potential to treat unmet medical need, product-specific risk in specific populations, therapeutic margin/safety profile, and probability of successful registration pathway). This article describes ongoing development targets for BOTOX.
NASA Astrophysics Data System (ADS)
Faulkner, B. R.; Lyon, W. G.
2001-12-01
We present a probabilistic model for predicting virus attenuation. The solution employs the assumption of complete mixing. Monte Carlo methods are used to generate ensemble simulations of virus attenuation due to physical, biological, and chemical factors. The model generates a probability of failure to achieve 4-log attenuation. We tabulated data from related studies to develop probability density functions for input parameters, and utilized a database of soil hydraulic parameters based on the 12 USDA soil categories. Regulators can use the model based on limited information such as boring logs, climate data, and soil survey reports for a particular site of interest. Plackett-Burman sensitivity analysis indicated the most important main effects on probability of failure to achieve 4-log attenuation in our model were mean logarithm of saturated hydraulic conductivity (+0.396), mean water content (+0.203), mean solid-water mass transfer coefficient (-0.147), and the mean solid-water equilibrium partitioning coefficient (-0.144). Using the model, we predicted the probability of failure for a one-meter-thick proposed hydrogeologic barrier with a water content of 0.3. With the currently available data and the associated uncertainty, we predicted soils classified as sand would fail (p=0.999), silt loams would also fail (p=0.292), but soils classified as clays would provide the required 4-log attenuation (p=0.001). The model is extendible in the sense that probability density functions of parameters can be modified as future studies refine the uncertainty, and the lightweight object-oriented design of the computer model (implemented in Java) will facilitate reuse with modified classes. This is an abstract of a proposed presentation and does not necessarily reflect EPA policy.
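A minimal sketch of the Monte Carlo idea, with hypothetical parameter distributions standing in for the soil-specific density functions described above:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Hypothetical log10-removal contributions per process; in the actual
# model these come from fitted probability density functions for each
# soil category and transport parameter.
log_removal = (
    rng.normal(2.5, 1.0, n_sim)      # filtration / sorption
    + rng.normal(1.0, 0.5, n_sim)    # inactivation during travel time
)

# Probability of failure to achieve 4-log attenuation
p_fail = np.mean(log_removal < 4.0)
print(f"P(failure to achieve 4-log attenuation) = {p_fail:.3f}")
```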
Cuthbertson, Carmen C; Kucharska-Newton, Anna; Faurot, Keturah R; Stürmer, Til; Jonsson Funk, Michele; Palta, Priya; Windham, B Gwen; Thai, Sydney; Lund, Jennifer L
2018-07-01
Frailty is a geriatric syndrome characterized by weakness and weight loss and is associated with adverse health outcomes. It is often an unmeasured confounder in pharmacoepidemiologic and comparative effectiveness studies using administrative claims data. Among the Atherosclerosis Risk in Communities (ARIC) Study Visit 5 participants (2011-2013; n = 3,146), we conducted a validation study to compare a Medicare claims-based algorithm of dependency in activities of daily living (or dependency) developed as a proxy for frailty with a reference standard measure of phenotypic frailty. We applied the algorithm to the ARIC participants' claims data to generate a predicted probability of dependency. Using the claims-based algorithm, we estimated the C-statistic for predicting phenotypic frailty. We further categorized participants by their predicted probability of dependency (<5%, 5% to <20%, and ≥20%) and estimated associations with difficulties in physical abilities, falls, and mortality. The claims-based algorithm showed good discrimination of phenotypic frailty (C-statistic = 0.71; 95% confidence interval [CI] = 0.67, 0.74). Participants classified with a high predicted probability of dependency (≥20%) had higher prevalence of falls and difficulty in physical ability, and a greater risk of 1-year all-cause mortality (hazard ratio = 5.7 [95% CI = 2.5, 13]) than participants classified with a low predicted probability (<5%). Sensitivity and specificity varied across predicted probability of dependency thresholds. The Medicare claims-based algorithm showed good discrimination of phenotypic frailty and high predictive ability with adverse health outcomes. This algorithm can be used in future Medicare claims analyses to reduce confounding by frailty and improve study validity.
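A small sketch of how the discrimination (C-statistic) and the <5% / 5% to <20% / >=20% categorization could be computed, using simulated stand-in data rather than ARIC or Medicare records:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Simulated stand-in data: 0/1 phenotypic frailty and the claims-based
# predicted probability of dependency for each participant.
rng = np.random.default_rng(0)
frail = rng.integers(0, 2, 500)
pred_prob = np.clip(0.10 + 0.15 * frail + rng.normal(0, 0.08, 500), 0, 1)

# Discrimination of the algorithm against the reference standard
print("C-statistic:", round(roc_auc_score(frail, pred_prob), 2))

# Categorise participants as in the abstract: <5%, 5% to <20%, >=20%
categories = np.digitize(pred_prob, bins=[0.05, 0.20])
print("Counts per category:", np.bincount(categories, minlength=3))
```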
Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.
2007-01-01
Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
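A minimal sketch of a conditional eruption probability computed from a two-component mixed-exponential repose-time model; the parameter values are illustrative, not the paper's fitted estimates:

```python
import math

def mixed_exp_survival(t, p, tau_short, tau_long):
    """Survival function of a two-component mixed exponential model for
    the repose time between eruptions (illustrative parameters)."""
    return p * math.exp(-t / tau_short) + (1 - p) * math.exp(-t / tau_long)

def conditional_eruption_prob(t_since, dt, p, tau_short, tau_long):
    """P(eruption within dt years | no eruption for t_since years)."""
    s_now = mixed_exp_survival(t_since, p, tau_short, tau_long)
    s_later = mixed_exp_survival(t_since + dt, p, tau_short, tau_long)
    return 1.0 - s_later / s_now

# Illustrative values only; the paper fits these to the radiocarbon record.
print(conditional_eruption_prob(t_since=950, dt=1, p=0.8,
                                tau_short=100, tau_long=3500))
```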
Martin, Petra; Leighl, Natasha B.
2017-01-01
This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice. PMID:28607579
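A short sketch of the standard pretest-to-posttest probability update (Bayes' theorem) that underlies this kind of reasoning; the pretest probability, sensitivity, and specificity below are hypothetical:

```python
def posttest_probability(pretest, sensitivity, specificity):
    """Bayes' theorem for a positive test result:
    P(mutation | +) = sens*prior / (sens*prior + (1-spec)*(1-prior))."""
    num = sensitivity * pretest
    den = num + (1 - specificity) * (1 - pretest)
    return num / den

# Hypothetical numbers: 40% pretest probability of an EGFR mutation,
# test sensitivity 95%, specificity 99%
print(round(posttest_probability(0.40, 0.95, 0.99), 3))
```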
Evaluating detection and estimation capabilities of magnetometer-based vehicle sensors
NASA Astrophysics Data System (ADS)
Slater, David M.; Jacyna, Garry M.
2013-05-01
In an effort to secure the northern and southern United States borders, MITRE has been tasked with developing Modeling and Simulation (M&S) tools that accurately capture the mapping between algorithm-level Measures of Performance (MOP) and system-level Measures of Effectiveness (MOE) for current/future surveillance systems deployed by the Customs and Border Protection Office of Technology Innovations and Acquisitions (OTIA). This analysis is part of a larger M&S undertaking. The focus is on two MOPs for magnetometer-based Unattended Ground Sensors (UGS). UGS are placed near roads to detect passing vehicles and estimate properties of the vehicle's trajectory such as bearing and speed. The first MOP considered is the probability of detection. We derive probabilities of detection for a network of sensors over an arbitrary number of observation periods and explore how the probability of detection changes when multiple sensors are employed. The performance of UGS is also evaluated based on the level of variance in the estimation of trajectory parameters. We derive the Cramer-Rao bounds for the variances of the estimated parameters in two cases: when no a priori information is known and when the parameters are assumed to be Gaussian with known variances. Sample results show that UGS perform significantly better in the latter case.
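A minimal sketch of the first MOP, the probability that a network of independent sensors detects a vehicle over one or more observation periods (the per-sensor probabilities are hypothetical):

```python
import numpy as np

def network_detection_prob(per_sensor_probs, n_periods=1):
    """Probability that at least one sensor detects the vehicle in at
    least one observation period, assuming independent detections."""
    p_miss_once = np.prod(1.0 - np.asarray(per_sensor_probs))
    return 1.0 - p_miss_once ** n_periods

# Hypothetical single-look detection probabilities for three magnetometers
print(network_detection_prob([0.6, 0.5, 0.4]))               # one look
print(network_detection_prob([0.6, 0.5, 0.4], n_periods=3))  # three looks
```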
Method for Identifying Probable Archaeological Sites from Remotely Sensed Data
NASA Technical Reports Server (NTRS)
Tilton, James C.; Comer, Douglas C.; Priebe, Carey E.; Sussman, Daniel
2011-01-01
Archaeological sites are being compromised or destroyed at a catastrophic rate in most regions of the world. The best solution to this problem is for archaeologists to find and study these sites before they are compromised or destroyed. One way to facilitate the necessary rapid, wide area surveys needed to find these archaeological sites is through the generation of maps of probable archaeological sites from remotely sensed data. We describe an approach for identifying probable locations of archaeological sites over a wide area based on detecting subtle anomalies in vegetative cover through a statistically based analysis of remotely sensed data from multiple sources. We further developed this approach under a recent NASA ROSES Space Archaeology Program project. Under this project we refined and elaborated this statistical analysis to compensate for potential slight miss-registrations between the remote sensing data sources and the archaeological site location data. We also explored data quantization approaches (required by the statistical analysis approach), and we identified a superior data quantization approached based on a unique image segmentation approach. In our presentation we will summarize our refined approach and demonstrate the effectiveness of the overall approach with test data from Santa Catalina Island off the southern California coast. Finally, we discuss our future plans for further improving our approach.
NASA Astrophysics Data System (ADS)
Moreno, B.; Aune, S.; Ball, J.; Charles, G.; Giganon, A.; Konczykowski, P.; Lahonde-Hamdoun, C.; Moutarde, H.; Procureur, S.; Sabatié, F.
2011-10-01
We present the first discharge rate measurements for Micromegas detectors in the presence of a high longitudinal magnetic field in the GeV kinematical region. Measurements were performed using two Micromegas detectors and a photon beam impinging on a CH2 target in Hall B of the Jefferson Laboratory. One detector was equipped with an additional GEM foil, and a reduction of the discharge probability by two orders of magnitude compared to the stand-alone Micromegas was observed. The detectors were placed in the FROST solenoid providing a longitudinal magnetic field up to 5 T. This allowed precise measurements of the dependence of the discharge probability on a diffusion-reducing magnetic field. Between 0 and 5 T, the discharge probability increased by a factor of 10 for polar angles between 19° and 34°. A GEANT4-based simulation developed for sparking rate calculation was calibrated against these data in order to predict the sparking rate in a high longitudinal magnetic field environment. This simulation is then used to investigate the possible use of Micromegas in the Forward Vertex Tracker (FVT) of the future CLAS12 spectrometer. In the case of the FVT, a sparking rate of 1 Hz per detector was obtained at the anticipated CLAS12 luminosity.
A causal loop analysis of the sustainability of integrated community case management in Rwanda.
Sarriot, Eric; Morrow, Melanie; Langston, Anne; Weiss, Jennifer; Landegger, Justine; Tsuma, Laban
2015-04-01
Expansion of community health services in Rwanda has come with the national scale up of integrated Community Case Management (iCCM) of malaria, pneumonia and diarrhea. We used a sustainability assessment framework as part of a large-scale project evaluation to identify factors affecting iCCM sustainability (2011). We then (2012) used causal-loop analysis to identify systems determinants of iCCM sustainability from a national systems perspective. This allows us to develop three high-probability future scenarios putting the achievements of community health at risk, and to recommend mitigating strategies. Our causal loop diagram highlights both balancing and reinforcing loops of cause and effect in the national iCCM system. Financial, political and technical scenarios carry high probability for threatening the sustainability through: (1) reduction in performance-based financing resources, (2) political shocks and erosion of political commitment for community health, and (3) insufficient progress in resolving district health systems--"building blocks"--performance gaps. In a complex health system, the consequences of choices may be delayed and hard to predict precisely. Causal loop analysis and scenario mapping make explicit complex cause-and-effects relationships and high probability risks, which need to be anticipated and mitigated. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Method for detecting and avoiding flight hazards
NASA Astrophysics Data System (ADS)
von Viebahn, Harro; Schiefele, Jens
1997-06-01
Today's aircraft equipment comprises several independent warning and hazard avoidance systems such as GPWS, TCAS, and weather radar. It is the pilot's task to monitor all these systems and take the appropriate action in case of an emerging hazardous situation. The developed method for detecting and avoiding flight hazards combines all potential external threats for an aircraft into a single system. It is based on a model of the airspace surrounding the aircraft consisting of discrete volume elements. For each element of the volume the threat probability is derived or computed from sensor output, databases, or information provided via datalink. The position of the own aircraft is predicted by utilizing a probability distribution. This approach ensures that all potential positions of the aircraft within the near future are considered while weighting the most likely flight path. A conflict detection algorithm initiates an alarm in case the threat probability exceeds a threshold. An escape manoeuvre is generated taking into account all potential hazards in the vicinity, not only the one which caused the alarm. The pilot receives visual information about the type, location, and severity of the threat. The algorithm was implemented and tested in a flight simulator environment. The current version comprises traffic, terrain, and obstacle hazard avoidance functions. Its general formulation allows easy integration of, for example, weather information or airspace restrictions.
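A toy sketch of the core idea, combining a hypothetical threat probability per volume element with a predicted own-ship position distribution and raising an alarm above a threshold; all values, including the threshold, are illustrative:

```python
import numpy as np

# Toy 1-D corridor of volume elements ahead of the aircraft.
threat_prob = np.array([0.01, 0.02, 0.30, 0.05, 0.01])    # from sensors/databases
position_prob = np.array([0.05, 0.20, 0.50, 0.20, 0.05])  # predicted own-ship position

# Combined conflict probability: chance the aircraft occupies a volume
# element AND that element contains a hazard (independence assumed).
conflict_prob = np.sum(threat_prob * position_prob)

ALARM_THRESHOLD = 0.05   # hypothetical value
if conflict_prob > ALARM_THRESHOLD:
    print(f"ALERT: conflict probability {conflict_prob:.3f} exceeds threshold")
```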
NASA Astrophysics Data System (ADS)
Keyser, Alisa; Westerling, Anthony LeRoy
2017-05-01
A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.
42 CFR 438.700 - Basis for imposition of sanctions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... medical condition or history indicates probable need for substantial future medical services. (4... directly, or indirectly through any agent or independent contractor, marketing materials that have not been...
Socio-hydrology of the Thippagondanahalli catchment in India - from common property to open-access.
NASA Astrophysics Data System (ADS)
Srinivasan, V.; Thomas, B.; Lele, S.
2014-12-01
Developing countries face a difficult challenge: they must adapt to an uncertain climate future even as land use, demography, and the composition of their economies are rapidly changing. Achieving a secure water future requires making reliable predictions of water cycle dynamics in future years. This necessitates understanding societal feedbacks and predicting how these will change in the future. We explore this "Predictions Under Change" problem in the Thippagondanahalli (TG Halli) catchment of the Arkavathy Basin in South India. Here, river flows have declined sharply over the last thirty years. The TG Halli Reservoir that once supplied 148 MLD to Bangalore city only yields 30 MLD today. Our analyses suggest that these declines cannot be attributed to climatic factors; groundwater depletion is probably the major cause. We analysed the interlinked human and hydrologic factors and feedbacks between them that have resulted in the present situation using extensive primary data, including weather stations, stream gaging, soil moisture sensing, household surveys, oral histories, interviews, and secondary data including census data, crop reports, satellite imagery and historical hydro-climatic data. Our analysis suggests that several factors have contributed to a continuous shift from surface to groundwater in the TG Halli catchment. First, cheap borewell technology has made groundwater more accessible. Second, as demand for high-value produce from the city and wealth increased, farmers became increasingly willing to invest in risky borewell drilling. Third, differences in governance in groundwater (open access) versus surface water (community managed tanks) hastened the breakdown of community-managed water systems, allowing unchecked exploitation of groundwater. Finally, the political economy of water spurred groundwater development through provision of free electricity and "watershed development" programmes.
Analysis and Forecasting of Shoreline Position
NASA Astrophysics Data System (ADS)
Barton, C. C.; Tebbens, S. F.
2007-12-01
Analysis of historical shoreline positions on sandy coasts and in the geologic record, together with study of sea-level rise curves, reveals that the dynamics of the underlying processes produce temporal/spatial signals that exhibit power scaling and are therefore self-affine fractals. Self-affine time series signals can be quantified over many orders of magnitude in time and space in terms of persistence, a measure of the degree of correlation between adjacent values in the stochastic portion of a time series. Fractal statistics developed for self-affine time series are used to forecast a probability envelope bounding future shoreline positions. The envelope provides the standard deviation as a function of three variables: persistence, a constant equal to the value of the power spectral density when 1/period equals 1, and the number of time increments. The persistence of a twenty-year time series of the mean-high-water (MHW) shoreline positions was measured for four profiles surveyed at Duck, NC at the Field Research Facility (FRF) by the U.S. Army Corps of Engineers. The four MHW shoreline time series signals are self-affine with persistence ranging between 0.8 and 0.9, which indicates that the shoreline position time series is weakly persistent (where zero is uncorrelated), and has highly varying trends for all time intervals sampled. Forecasts of a probability envelope for future MHW positions are made for the 20 years of record and beyond to 50 years from the start of the data records. The forecasts describe the twenty-year data sets well and indicate that within a 96% confidence envelope, future decadal MHW shoreline excursions should be within 14.6 m of the position at the start of data collection. This is a stable-oscillatory shoreline. The forecasting method introduced here includes the stochastic portion of the time series. The traditional method of predicting shoreline change, by contrast, reduces the time series to a linear trend line fit to historic shoreline positions and extrapolates it linearly to forecast future positions; the resulting linearly increasing mean breaks the confidence envelope eight years into the future and continues to increase. The traditional method is therefore a poor representation of the observed shoreline position time series and a poor basis for extrapolating future shoreline positions.
NASA Astrophysics Data System (ADS)
Skilling, John
2005-11-01
This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. Contents: 1. Logic and probability; 2. Probability and inference; 3. Probability and model selection; 4. Prior probabilities; 5. Probability and frequency; 6. Probability and quantum mechanics; 7. Probability and fundamentalism; 8. Probability and deception; 9. Prediction and truth.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Boyun; Duguid, Andrew; Nygaard, Ronar
The objective of this project is to develop a computerized statistical model with the Integrated Neural-Genetic Algorithm (INGA) for predicting the probability of long-term leakage from wells in CO2 sequestration operations. This objective has been accomplished by conducting research in three phases: 1) data mining of CO2-exposed wells, 2) INGA computer model development, and 3) evaluation of the predictive performance of the computer model with data from field tests. Data mining was conducted for 510 wells in two CO2 sequestration projects in the Texas Gulf Coast region: the Hasting West field and the Oyster Bayou field in southern Texas. Missing wellbore integrity data were estimated using an analytical and Finite Element Method (FEM) model. The INGA was first tested for convergence and computing efficiency on the high-dimensional data set obtained. It was concluded that the INGA can handle the gathered data set with good accuracy and reasonable computing time after a reduction of dimension with a grouping mechanism. A computerized statistical model with the INGA was then developed based on data pre-processing and grouping. Comprehensive training and testing of the model were carried out to ensure that the model is accurate and efficient enough for predicting the probability of long-term well leakage in CO2 sequestration operations. The Cranfield site in southern Mississippi was selected as the test site. Observation wells CFU31F2 and CFU31F3 were used for pressure testing, formation logging, and cement sampling. Tools run in the wells include the Isolation Scanner, Slim Cement Mapping Tool (SCMT), Cased Hole Formation Dynamics Tester (CHDT), and Mechanical Sidewall Coring Tool (MSCT). Analyses of the obtained data indicate no leakage of CO2 across the cap zone, while it is evident that the well cement sheath was invaded by CO2 from the storage zone. This observation is consistent with the result predicted by the INGA model, which indicates the well has a CO2 leak-safe probability of 72%. This comparison implies that the developed INGA model is valid for future use in predicting well leak probability.
NASA Astrophysics Data System (ADS)
Rey Vicario, D.; Holman, I.
2016-12-01
The use of water for irrigation and on-farm reservoir filling is globally important for agricultural production. In humid climates, like the UK, supplemental irrigation can be critical to buffer the effects of rainfall variability and to achieve high quality crops. Given regulatory efforts to secure sufficient environmental river flows and meet rising water demands due to population growth and climate change, increasing water scarcity is likely to compound the drought challenges faced by irrigated agriculture in this region. Currently, water abstraction from surface waters for agricultural irrigation can be restricted by the Environment Agency during droughts under Section 57 of the Water Resources Act (1991), based on abnormally low river flow levels and rainfall forecast, causing significant economic impacts on irrigated agricultural production. The aim of this study is to assess the impact that climate change may have on agricultural abstraction in the UK within the context of the abstraction restriction triggers currently in place. These triggers have been applied to the `Future Flows hydrology' database to assess the likelihood of increasing restrictions on agricultural abstraction in the future by comparing the probability of voluntary and compulsory restrictions in the baseline (1961-1990) and future period (2071-2098) for 282 catchments throughout the whole of the UK. The results of this study show a general increase in the probability of future agricultural irrigation abstraction restrictions in the UK in the summer, particularly in the South West, although there is significant variability between the 11 ensemble members. The results also indicate that UK winters are likely to become wetter in the future, although in some catchments the probability of abstraction restriction in the reservoir refilling winter months (November-February) could increase slightly. An increasing frequency of drought events due to climate change is therefore likely to lead to more water abstraction restrictions, increasing the need for irrigators to adapt their businesses to increase drought resilience and hence food security.
Basics of tumor development and importance of human papilloma virus (HPV) for head and neck cancer
Wittekindt, Claus; Wagner, Steffen; Mayer, Christina Sabine; Klussmann, Jens Peter
2012-01-01
Head and Neck Squamous Cell Carcinomas (HNSCC) are the 6th most common cancers worldwide. While incidence rates for cancer of the hypopharynx and larynx are decreasing, a significant increase in cancer of the oropharynx (OSCC) is observed. Classical risk factors for HNSCC are smoking and alcohol. It has been shown for 25 to 60% of OSCC to be associated with an infection by oncogenic human papilloma viruses (HPV). The development of “common” cancer of the head and neck is substantially enhanced by an accumulation of genetic changes, which lead to an inactivation of tumor suppressor genes or activation of proto-oncogenes. A more or less uniform sequence of different DNA-damages leads to genetic instability. In this context, an early and frequent event is deletion on the short arm of chromosome 9, which results in inactivation of the p16-gene. In contrast, for HPV-induced carcinogenesis, expression of the viral proteins E6 and E7 is most important, since they lead to inactivation of the cellular tumor-suppressor-proteins p53 and Rb. The natural route of transoral infection is a matter of debate; peroral HPV-infections might be frequent and disappear uneventfully in most cases. Smoking seems to increase the probability for developing an HPV-associated OSCC. The association of HNSCC with HPV can be proven with established methods in clinical diagnostics. In addition to classical prognostic factors, diagnosis of HPV-association may become important for selection of future therapies. Prognostic relevance of HPV probably surmounts many known risk-factors, for example regional metastasis. Until now, no other molecular markers are established in clinical routine. Future therapy concepts may vary for the two subgroups of patients, particularly patients with HPV-associated OSCC may take advantage of less aggressive treatments. Finally, an outlook will be given on possible targeted therapies. PMID:23320061
NASA Astrophysics Data System (ADS)
Quinn, N.; Bates, P. D.; Siddall, M.
2013-12-01
The rate at which sea levels will rise in the coming century is of great interest to decision makers tasked with developing mitigation policies to cope with the risk of coastal inundation. Accurate estimates of future sea levels are vital in the provision of effective policy. Recent reports from UK Climate Impacts Programme (UKCIP) suggest that mean sea levels in the UK may rise by as much as 80 cm by 2100; however, a great deal of uncertainty surrounds model predictions, particularly the contribution from ice sheets responding to climatic warming. For this reason, the application of semi-empirical modelling approaches for sea level rise predictions has increased of late, the results from which suggest that the rate of sea level rise may be greater than previously thought, exceeding 1 m by 2100. Furthermore, studies in the Red Sea indicate that rapid sea level rise beyond 1m per century has occurred in the past. In light of such research, the latest UKCIP assessment has included a H++ scenario for sea level rise in the UK of up to 1.9 m which is defined as improbable but, crucially, physically plausible. The significance of such low-probability sea level rise scenarios upon the estimation of future flood risk is assessed using the Somerset levels (UK) as a case study. A simple asymmetric probability distribution is constructed to include sea level rise scenarios of up to 1.9 m by 2100 which are added to a current 1:200 year event water level to force a two-dimensional hydrodynamic model of coastal inundation. From the resulting ensemble predictions an estimation of risk by 2100 is established. The results indicate that although the likelihood of extreme sea level rise due to rapid ice sheet mass loss is low, the resulting hazard can be large, resulting in a significant (27%) increase to the projected annual risk. Furthermore, current defence construction guidelines for the coming century in the UK are expected to account for 95% of the sea level rise distribution presented in this research, while the larger, low probability scenarios beyond this level are estimated to contribute a residual annual risk of approximately £0.45 million. These findings clearly demonstrate that uncertainty in future sea level rise is a vital component of coastal flood risk, and therefore, needs to be accounted for by decision makers when considering mitigation policies related to coastal flooding.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krier, D. J.; Perry, F. V.
Location, timing, volume, and eruptive style of post-Miocene volcanoes have defined the volcanic hazard significant to a proposed high-level radioactive waste (HLW) and spent nuclear fuel (SNF) repository at Yucca Mountain, Nevada, as a low-probability, high-consequence event. Examination of eruptive centers in the region that may be analogues to possible future volcanic activity at Yucca Mountain has aided in defining and evaluating the consequence scenarios for intrusion into and eruption above a repository. The probability of a future event intersecting a repository at Yucca Mountain has a mean value of 1.7 x 10^-8 per year. This probability comes from the Probabilistic Volcanic Hazard Assessment (PVHA) completed in 1996 and updated to reflect changes in repository layout. Since that time, magnetic anomalies representing potential buried volcanic centers have been identified from magnetic surveys; however, these potential buried centers only slightly increase the probability of an event intersecting the repository. The proposed repository will be located in the central portion of Yucca Mountain at approximately 300 m depth. The process for assessing performance of a repository at Yucca Mountain has identified two scenarios for igneous activity that, although having a very low probability of occurrence, could have a significant consequence should an igneous event occur. Either a dike swarm intersecting repository drifts containing waste packages, or a volcanic eruption through the repository, could result in release of radioactive material to the accessible environment. Ongoing investigations are assessing the mechanisms and significance of the consequence scenarios. Lathrop Wells Cone (~80,000 yrs), a key analogue for estimating potential future volcanic activity, is the youngest surface expression of apparent waning basaltic volcanism in the region. Cone internal structure, lavas, and ash-fall tephra have been examined to estimate eruptive volume, eruption type, and subsurface disturbance accompanying conduit growth and eruption. The Lathrop Wells volcanic complex has a total volume estimate of approximately 0.1 km^3. The eruptive products indicate a sequence of initial magmatic fissure fountaining, early Strombolian activity, a brief hydrovolcanic phase, and violent Strombolian phase(s). Lava flows adjacent to the Lathrop Wells Cone probably were emplaced during the mid-eruptive sequence. Ongoing investigations continue to address the potential hazards of a volcanic event at Yucca Mountain.
Analyzing Future Flooding under Climate Change Scenario using CMIP5 Streamflow Data
NASA Astrophysics Data System (ADS)
Nyaupane, Narayan; Parajuli, Ranjan; Kalra, Ajay
2017-12-01
Flooding is the most severe and costliest natural hazard in the US, and the effects of climate change have intensified the problem in recent years. Flood prevention practices, along with a proper understanding of flooding events, can mitigate the risk of such hazards. Floodplain mapping is one technique for quantifying the severity of flooding. Carson City, an agricultural area in the Nevada desert, has experienced peak floods in recent years. The latest Coupled Model Intercomparison Project (CMIP5) streamflow data for the Carson River were analyzed against 27 different statistical distributions to identify the underlying probability distribution for the area. The best-fitting distribution was used to forecast the 100-year flood (design flood). Data from 1950-2099, derived from 31 models and a total of 97 projections, were used to predict future streamflow. The delta change method was adopted to quantify the magnitude of the future (2050-2099) flood. To determine the extent of flooding, three scenarios, (i) the historic design flood, (ii) the 500-year flood, and (iii) the future 100-year flood, were routed through a HEC-RAS model prepared using available terrain data. Some of the climate projections show an extreme increase in the future design flood; the future design flood could exceed the historic 500-year flood. At the same time, the extent of flooding could go beyond that of the historic flood with 0.2% annual probability. This study suggests an approach to quantify future floods and floodplains using climate model projections, and would provide helpful information to facility managers, design engineers, and stakeholders.
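A minimal sketch of the flood-frequency step, fitting one candidate distribution (here a GEV via scipy) to hypothetical annual peak flows and reading off the 1% annual-exceedance (100-year) flood; the study itself compares 27 distributions:

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak flows (m^3/s); not Carson River data.
peaks = np.array([118, 95, 210, 160, 75, 340, 130, 185, 90, 260,
                  145, 170, 110, 300, 125, 200, 85, 155, 230, 105])

# Fit one candidate distribution by maximum likelihood
params = stats.genextreme.fit(peaks)

# 100-year design flood = flow with 1% annual exceedance probability
q100 = stats.genextreme.ppf(1 - 0.01, *params)
print(f"Estimated 100-yr flood: {q100:.0f} m^3/s")
```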
Forecasting the duration of volcanic eruptions: an empirical probabilistic model
NASA Astrophysics Data System (ADS)
Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.
2014-01-01
The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption duration between the years 1600 and 1669 is found to be statistically different from that following it, and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms 'likely' and 'unlikely' to probabilities of 66% or more and 33% or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well-documented volcanoes or for different duration data such as the duration of explosive episodes or the duration of repose periods between eruptions.
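A short sketch of the survivor-function calculations using a log-logistic (Fisk) distribution; the shape and scale values below are illustrative, not the maximum-likelihood estimates from the Etna catalogue:

```python
from scipy import stats

# Illustrative log-logistic (Fisk) parameters for flank-eruption duration in days
c, scale = 1.4, 35.0
dist = stats.fisk(c, scale=scale)

# (a) probability a future eruption exceeds 20 and 86 days
print("P(>20 d):", round(dist.sf(20), 2))
print("P(>86 d):", round(dist.sf(86), 2))

# (b) probability of exceeding 86 days given it has already lasted 30 days
print("P(>86 d | >30 d):", round(dist.sf(86) / dist.sf(30), 2))
```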
ERIC Educational Resources Information Center
Conant, Darcy Lynn
2013-01-01
Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…
Forecasting urban growth across the United States-Mexico border
Norman, L.M.; Feller, M.; Phillip, Guertin D.
2009-01-01
The sister-city area of Nogales, Arizona, and Nogales, Sonora, Mexico, is known collectively as Ambos (both) Nogales. This area was historically one city and was administratively divided by the Gadsden Purchase in 1853. These arid-lands have limited and sensitive natural resources. Environmental planning can support sustainable development to accommodate the predicted influx of population. The objective of this research is to quantify the amount of predicted urban growth for the Ambos Nogales watershed to support future planning for sustainable development. Two modeling regimes are explored. Our goal is to identify possible growth patterns associated with the twin-city area as a whole and with the two cities modeled as separate entities. We analyzed the cross-border watershed using regression analysis from satellite images from 1975, 1983, 1996, and 2002 and created urban area classifications. We used these classifications as input to the urban growth model, SLEUTH, to simulate likely patterns of development and define projected conversion probabilities. Model results indicate that the two cities are undergoing very different patterns of change and identify locations of expected growth based on historical development. Growth in Nogales, Arizona is stagnant while the urban area in Nogales, Sonora is exploding. This paper demonstrates an application that portrays how future binational urban growth could develop and affect the environment. This research also provides locations of potential growth for use in city planning.
Kim, Sanghag; Kochanska, Grazyna; Boldt, Lea J; Nordling, Jamie Koenig; O'Bleness, Jessica J
2014-02-01
Parent-child relationships are critical in development, but much remains to be learned about the mechanisms of their impact. We examined the early parent-child relationship as a moderator of the developmental trajectory from children's affective and behavioral responses to transgressions to future antisocial, externalizing behavior problems in the Family Study (102 community mothers, fathers, and infants, followed through age 8) and the Play Study (186 low-income, diverse mothers and toddlers, followed for 10 months). The relationship quality was indexed by attachment security in the Family Study and maternal responsiveness in the Play Study. Responses to transgressions (tense discomfort and reparation) were observed in laboratory mishaps wherein children believed they had damaged a valued object. Antisocial outcomes were rated by parents. In both studies, early relationships moderated the future developmental trajectory: diminished tense discomfort predicted more antisocial outcomes, but only in insecure or unresponsive relationships. That risk was defused in secure or responsive relationships. Moderated mediation analyses in the Family Study indicated that the links between diminished tense discomfort and future antisocial behavior in insecure parent-child dyads were mediated by stronger discipline pressure from parents. By indirectly influencing future developmental sequelae, early relationships may increase or decrease the probability that the parent-child dyad will embark on a path toward antisocial outcomes.
Epstein, Leonard H; Jankowiak, Noelle; Lin, Henry; Paluch, Rocco; Koffarnus, Mikhail N; Bickel, Warren K
2014-01-01
Background: Low income is related to food insecurity, and research has suggested that a scarcity of resources associated with low income can shift attention to the present, thereby discounting the future. Objective: We tested whether attending to the present and discounting the future may moderate the influence of income on food insecurity. Design: Delay discounting and measures of future time perspective (Zimbardo Time Perspective Inventory, Consideration of Future Consequences Scale, time period of financial planning, and subjective probability of living to age 75 y) were studied as moderators of the relation between income and food insecurity in a diverse sample of 975 adults, 31.8% of whom experienced some degree of food insecurity. Results: Income, financial planning, subjective probability of living to age 75 y, and delay discounting predicted food insecurity as well as individuals who were high in food insecurity. Three-way interactions showed that delay discounting interacted with financial planning and income to predict food insecurity (P = 0.003). At lower levels of income, food insecurity was lowest for subjects who had good financial planning skills and did not discount the future, whereas having good financial skills and discounting the future had minimal influence on food insecurity. The same 3-way interaction was observed when high food insecurity was predicted (P = 0.008). Conclusion: Because of the role of scarce resources on narrowing attention and reducing prospective thinking, research should address whether modifying future orientation may reduce food insecurity even in the face of diminishing financial resources. This trial was registered at clinicaltrials.gov as NCT02099812. PMID:25008855
Continuous-time random-walk model for financial distributions
NASA Astrophysics Data System (ADS)
Masoliver, Jaume; Montero, Miquel; Weiss, George H.
2003-02-01
We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known. These are the probability densities for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar deutsche mark future exchange, finding good agreement between theory and the observed data.
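A minimal simulation sketch of a continuous-time random walk with illustrative pausing-time and jump-magnitude densities (the paper estimates these two densities from the data rather than assuming them):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ctrw(n_jumps, mean_wait, jump_scale):
    """Continuous-time random walk: i.i.d. pausing times between jumps and
    i.i.d. jump magnitudes. The exponential and Laplace densities used here
    are illustrative stand-ins for the empirically estimated densities."""
    waits = rng.exponential(mean_wait, n_jumps)     # pausing-time density
    jumps = rng.laplace(0.0, jump_scale, n_jumps)   # jump-magnitude density
    times = np.cumsum(waits)
    log_price = np.cumsum(jumps)
    return times, log_price

t, x = simulate_ctrw(n_jumps=10_000, mean_wait=30.0, jump_scale=1e-4)
print("final time (s):", t[-1], "final log-price change:", x[-1])
```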
Energy efficient engine: Propulsion system-aircraft integration evaluation
NASA Technical Reports Server (NTRS)
Owens, R. E.
1979-01-01
Flight performance and operating economics of future commercial transports utilizing the energy efficient engine were assessed as well as the probability of meeting NASA's goals for TSFC, DOC, noise, and emissions. Results of the initial propulsion systems aircraft integration evaluation presented include estimates of engine performance, predictions of fuel burns, operating costs of the flight propulsion system installed in seven selected advanced study commercial transports, estimates of noise and emissions, considerations of thrust growth, and the achievement-probability analysis.
On the abundance of extraterrestrial life after the Kepler mission
NASA Astrophysics Data System (ADS)
Wandel, Amri
2015-07-01
The data recently accumulated by the Kepler mission have demonstrated that small planets are quite common and that a significant fraction of all stars may have an Earth-like planet within their habitable zone. These results are combined with a Drake-equation formalism to derive the space density of biotic planets as a function of the relatively modest uncertainty in the astronomical data and of the (yet unknown) probability for the evolution of biotic life, F_b. I suggest that F_b may be estimated by future spectral observations of exoplanet biomarkers. If F_b is in the range 0.001-1, then a biotic planet may be expected within 10-100 light years from Earth. Extending the biotic results to advanced life I derive expressions for the distance to putative civilizations in terms of two additional Drake parameters - the probability for evolution of a civilization, F_c, and its average longevity. For instance, assuming optimistic probability values (F_b ~ F_c ~ 1) and a broadcasting longevity of a few thousand years, the likely distance to the nearest civilizations detectable by searching for intelligent electromagnetic signals is of the order of a few thousand light years. The probability of detecting intelligent signals with present and future radio telescopes is calculated as a function of the Drake parameters. Finally, I describe how the detection of intelligent signals would constrain the Drake parameters.
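A rough sketch of the nearest-neighbour distance scaling implied by a Drake-like space density of biotic planets; the stellar density and habitable-planet fraction used here are illustrative assumptions, not the paper's values:

```python
def nearest_biotic_distance_ly(f_b, stars_per_cubic_ly=0.004,
                               f_hab_planet=0.1):
    """Rough distance to the nearest biotic planet: if the space density of
    biotic planets is n = stellar_density * f_hab_planet * F_b, the typical
    nearest-neighbour distance scales as n**(-1/3). Numbers are illustrative."""
    n = stars_per_cubic_ly * f_hab_planet * f_b
    return n ** (-1.0 / 3.0)

for f_b in (1.0, 0.1, 0.001):
    print(f"F_b = {f_b}: ~{nearest_biotic_distance_ly(f_b):.0f} light years")
```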
NASA Astrophysics Data System (ADS)
Urban, Nathan M.; Keller, Klaus
2010-10-01
How has the Atlantic Meridional Overturning Circulation (AMOC) varied over the past centuries and what is the risk of an anthropogenic AMOC collapse? We report probabilistic projections of the future climate which improve on previous AMOC projection studies by (i) greatly expanding the considered observational constraints and (ii) carefully sampling the tail areas of the parameter probability distribution function (pdf). We use a Bayesian inversion to constrain a simple model of the coupled climate, carbon cycle and AMOC systems using observations to derive multicentury hindcasts and projections. Our hindcasts show considerable skill in representing the observational constraints. We show that robust AMOC risk estimates can require carefully sampling the parameter pdfs. We find a low probability of experiencing an AMOC collapse within the 21st century for a business-as-usual emissions scenario. The probability of experiencing an AMOC collapse within two centuries is 1/10. The probability of crossing a forcing threshold and triggering a future AMOC collapse (by 2300) is approximately 1/30 in the 21st century and over 1/3 in the 22nd. Given the simplicity of the model structure and uncertainty in the forcing assumptions, our analysis should be considered a proof of concept and the quantitative conclusions subject to severe caveats.
Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska
Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.
2007-01-01
We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and assumptions reflecting best current judgments about methodology and approach. These maps have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. These maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May, 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web Site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and in particular explain and justify changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.
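The quoted annual probabilities are consistent with the Poisson relation between an exceedance probability over a 50-year horizon and an annual rate; a short check:

```python
import math

def annual_probability(p_exceed, horizon_years=50):
    """Annual exceedance probability (Poisson assumption) equivalent to a
    probability of exceedance over a given horizon: lambda = -ln(1-P)/T."""
    return -math.log(1.0 - p_exceed) / horizon_years

for p in (0.02, 0.05, 0.10):
    print(f"{p:.0%} in 50 yr -> annual probability {annual_probability(p):.6f}")
# prints ~0.000404, ~0.001026, ~0.002107, matching the values quoted above
```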
Engineering principles to assure compatible docking between future spacecraft of USA and USSR
NASA Technical Reports Server (NTRS)
Johnson, C. C.
1973-01-01
An androgynous peripheral type docking mechanism concept selected by the U.S. and the USSR is described. The rationale supporting the selection of the concept, the mechanical principles inherent to the concept, and the probable nature of future designs stemming from the concept are discussed. Operational situations prior to docking, impact conditions, energy absorption, and structural joining of two spacecraft are examined.
Criminal psychological profiling of serial arson crimes.
Kocsis, Richard N; Cooksey, Ray W
2002-12-01
The practice of criminal psychological profiling is frequently cited as being applicable to serial arson crimes. Despite this claim, there does not appear to be any empirical research that examines serial arson offence behaviors in the context of profiling. This study seeks to develop an empirical model of serial arsonist behaviors that can be systematically associated with probable offender characteristics. Analysis has produced a model of offence behaviors that identify four discrete behavior patterns, all of which share a constellation of common nondiscriminatory behaviors. The inherent behavioral themes of each of these patterns are explored with discussion of their broader implications for our understanding of serial arson and directions for future research.
[Scenario analysis--a method for long-term planning].
Stavem, K
2000-01-10
Scenarios are known from the film industry as detailed descriptions of films. This has given its name to scenario analysis, a method for long-term planning that uses descriptions of composite pictures of the future. This article is an introduction to the scenario method. Scenarios describe plausible, not necessarily probable, developments. They focus on problems and questions that decision makers must be aware of and prepare to deal with, and the consequences of alternative decisions. Scenarios are used in corporate and governmental planning, and they can be useful and complementary to traditional planning and extrapolation of past experience. The method is particularly useful in a rapidly changing world with shifting external conditions.
Hawaii natural compounds are promising to reduce ovarian cancer deaths.
Fei-Zhang, David J; Li, Chunshun; Cao, Shugeng
2016-07-02
The low survival rate of patients with ovarian cancer largely results from advanced ovarian tumors as well as tumor resistance to chemotherapy, leading to metastasis and recurrence. However, an effective therapeutic approach that focuses on these aspects to prolong progression-free survival and decrease mortality in ovarian cancer patients is still lacking. Here, based on our cancer drug discovery studies, we provide prospective insights into the development of a future line of drugs to effectively reduce ovarian cancer deaths. Pathways that increase the probability of cancer, such as the defective Fanconi anemia (FA) pathway, may render cancer cells more sensitive to new drug targeting.
The AIDS epidemic: the spread of a deadly disease in the biotech era.
Noë, A; Verhofstede, C; Plum, J
2004-01-01
In the last two decades we have witnessed the progression of a newly introduced infection in humans. It is sobering that, despite a world-wide effort and tremendous progress in technical capabilities and scientific knowledge, we are still not able to control the global epidemic of HIV. In 2004 more than 40 million people were infected. Educational approaches to modify risk-taking behavior remain the most critical component of prevention and the most important measure to limit the spread of the infection. Vaccine development, which is still far from promising, is probably the only way to control the disease in the future.
Biomarker development in the precision medicine era: lung cancer as a case study.
Vargas, Ashley J; Harris, Curtis C
2016-08-01
Precision medicine relies on validated biomarkers with which to better classify patients by their probable disease risk, prognosis and/or response to treatment. Although affordable 'omics'-based technology has enabled faster identification of putative biomarkers, the validation of biomarkers is still stymied by low statistical power and poor reproducibility of results. This Review summarizes the successes and challenges of using different types of molecule as biomarkers, using lung cancer as a key illustrative example. Efforts at the national level of several countries to tie molecular measurement of samples to patient data via electronic medical records are the future of precision medicine research.
NASA Technical Reports Server (NTRS)
Layland, J. W.
1974-01-01
An approximate analysis of the effect of a noisy carrier reference on the performance of sequential decoding is presented. The analysis uses previously developed techniques for evaluating noisy reference performance for medium-rate uncoded communications adapted to sequential decoding for data rates of 8 to 2048 bits/s. In estimating the 10^-4 deletion probability thresholds for Helios, the model agrees with experimental data to within the experimental tolerances. The computational problem involved in sequential decoding, carrier loop effects, the main characteristics of the medium-rate model, modeled decoding performance, and perspectives on future work are discussed.
Treatment and prophylaxis of melioidosis.
Dance, David
2014-04-01
Melioidosis, infection with Burkholderia pseudomallei, is being recognised with increasing frequency and is probably more common than currently appreciated. Treatment recommendations are based on a series of clinical trials conducted in Thailand over the past 25 years. Treatment is usually divided into two phases: in the first, or acute phase, parenteral drugs are given for ≥10 days with the aim of preventing death from overwhelming sepsis; in the second, or eradication phase, oral drugs are given, usually to complete a total of 20 weeks, with the aim of preventing relapse. Specific treatment for individual patients needs to be tailored according to clinical manifestations and response, and there remain many unanswered questions. Some patients with very mild infections can probably be cured by oral agents alone. Ceftazidime is the mainstay of acute-phase treatment, with carbapenems reserved for severe infections or treatment failures and amoxicillin/clavulanic acid (co-amoxiclav) as second-line therapy. Trimethoprim/sulfamethoxazole (co-trimoxazole) is preferred for the eradication phase, with the alternative of co-amoxiclav. In addition, the best available supportive care is needed, along with drainage of abscesses whenever possible. Treatment for melioidosis is unaffordable for many in endemic areas of the developing world, but the relative costs have reduced over the past decade. Unfortunately there is no likelihood of any new or cheaper options becoming available in the immediate future. Recommendations for prophylaxis following exposure to B. pseudomallei have been made, but the evidence suggests that they would probably only delay rather than prevent the development of infection. Copyright © 2014 The Author. Published by Elsevier B.V. All rights reserved.
Predictive models attribute effects on fish assemblages to toxicity and habitat alteration.
de Zwart, Dick; Dyer, Scott D; Posthuma, Leo; Hawkins, Charles P
2006-08-01
Biological assessments should both estimate the condition of a biological resource (magnitude of alteration) and provide environmental managers with a diagnosis of the potential causes of impairment. Although methods of quantifying condition are well developed, identifying and proportionately attributing impairment to probable causes remain problematic. Furthermore, analyses of both condition and cause have often been difficult to communicate. We developed an approach that (1) links fish, habitat, and chemistry data collected from hundreds of sites in Ohio (USA) streams, (2) assesses the biological condition at each site, (3) attributes impairment to multiple probable causes, and (4) provides the results of the analyses in simple-to-interpret pie charts. The data set was managed using a geographic information system. Biological condition was assessed using a RIVPACS (river invertebrate prediction and classification system)-like predictive model. The model provided probabilities of capture for 117 fish species based on the geographic location of sites and local habitat descriptors. Impaired biological condition was defined as the proportion of those native species predicted to occur at a site that were observed. The potential toxic effects of exposure to mixtures of contaminants were estimated using species sensitivity distributions and mixture toxicity principles. Generalized linear regression models described species abundance as a function of habitat characteristics. Statistically linking biological condition, habitat characteristics including mixture risks, and species abundance allowed us to evaluate the losses of species with environmental conditions. Results were mapped as simple effect and probable-cause pie charts (EPC pie diagrams), with pie sizes corresponding to magnitude of local impairment, and slice sizes to the relative probable contributions of different stressors. The types of models we used have been successfully applied in ecology and ecotoxicology, but they have not previously been used in concert to quantify impairment and its likely causes. Although data limitations constrained our ability to examine complex interactions between stressors and species, the direct relationships we detected likely represent conservative estimates of stressor contributions to local impairment. Future refinements of the general approach and specific methods described here should yield even more promising results.
NASA Astrophysics Data System (ADS)
Lautze, N. C.; Ito, G.; Thomas, D. M.; Hinz, N.; Frazer, L. N.; Waller, D.
2015-12-01
Hawaii offers the opportunity to gain knowledge and develop geothermal energy on the only oceanic hotspot in the U.S. As a remote island state, Hawaii is more dependent on imported fossil fuel than any other state in the U.S., and energy prices are 3 to 4 times higher than the national average. The only proven resource, located on Hawaii Island's active Kilauea volcano, is a region of high geologic risk; other regions of probable resource exist but lack adequate assessment. The last comprehensive statewide geothermal assessment occurred in 1983 and found a potential resource on all islands (Hawaii Institute of Geophysics, 1983). Phase 1 of a Department of Energy funded project to assess the probability of geothermal resource potential statewide in Hawaii was recently completed. The execution of this project was divided into three main tasks: (1) compile all historical and current data for Hawaii that is relevant to geothermal resources into a single Geographic Information System (GIS) project; (2) analyze and rank these datasets in terms of their relevance to the three primary properties of a viable geothermal resource: heat (H), fluid (F), and permeability (P); and (3) develop and apply a Bayesian statistical method to incorporate the ranks and produce probability models that map out Hawaii's geothermal resource potential. Here, we summarize the project methodology and present maps that highlight both high prospect areas as well as areas that lack enough data to make an adequate assessment. We suggest a path for future exploration activities in Hawaii, and discuss how this method of analysis can be adapted to other regions and other types of resources. The figure below shows multiple layers of GIS data for Hawaii Island. Color shades indicate crustal density anomalies produced from inversions of gravity (Flinders et al. 2013). Superimposed on this are mapped calderas, rift zones, volcanic cones, and faults (following Sherrod et al., 2007). These features were used to identify probable locations of intrusive rock (heat) and permeability.
40 CFR 144.61 - Definitions of terms as used in this subpart.
Code of Federal Regulations, 2010 CFR
2010-07-01
... operating cycle of the business. Current liabilities means obligations whose liquidation is reasonably... business community. Assets means all existing and all probable future economic benefits obtained or...
The potential benefits of a new poliovirus vaccine for long-term poliovirus risk management.
Duintjer Tebbens, Radboud J; Thompson, Kimberly M
2016-12-01
To estimate the incremental net benefits (INBs) of a hypothetical ideal vaccine with all of the advantages and no disadvantages of existing oral and inactivated poliovirus vaccines compared with current vaccines available for future outbreak response. INB estimates based on expected costs and polio cases from an existing global model of long-term poliovirus risk management. Excluding the development costs, an ideal poliovirus vaccine could offer expected INBs of US$1.6 billion. The ideal vaccine yields small benefits in most realizations of long-term risks, but great benefits in low-probability-high-consequence realizations. New poliovirus vaccines may offer valuable insurance against long-term poliovirus risks and new vaccine development efforts should continue as the world gathers more evidence about polio endgame risks.
Antarctic glacial history from numerical models and continental margin sediments
Barker, P.F.; Barrett, P.J.; Cooper, A. K.; Huybrechts, P.
1999-01-01
The climate record of glacially transported sediments in prograded wedges around the Antarctic outer continental shelf, and their derivatives in continental rise drifts, may be combined to produce an Antarctic ice sheet history, using numerical models of ice sheet response to temperature and sea-level change. Examination of published models suggests several preliminary conclusions about ice sheet history. The ice sheet's present high sensitivity to sea-level change at short (orbital) periods was developed gradually as its size increased, replacing a declining sensitivity to temperature. Models suggest that the ice sheet grew abruptly to 40% (or possibly more) of its present size at the Eocene-Oligocene boundary, mainly as a result of its own temperature sensitivity. A large but more gradual middle Miocene change was externally driven, probably by development of the Antarctic Circumpolar Current (ACC) and Polar Front, provided that a few million years' delay can be explained. The Oligocene ice sheet varied considerably in size and areal extent, but the late Miocene ice sheet was more stable, though significantly warmer than today's. This difference probably relates to the confining effect of the Antarctic continental margin. Present-day numerical models of ice sheet development are sufficient to guide current sampling plans, but sea-ice formation, polar wander, basal topography and ice streaming can be identified as factors meriting additional modelling effort in the future.
Maru, Duncan Smith-Rohrberg; Kozal, Michael J; Bruce, R Douglas; Springer, Sandra A; Altice, Frederick L
2007-12-15
Directly administered antiretroviral therapy (DAART) is an effective intervention that improves clinical outcomes among HIV-infected drug users. Its effects on antiretroviral drug resistance, however, are unknown. We conducted a community-based, prospective, randomized controlled trial of DAART compared with self-administered therapy (SAT). We performed a modified intention-to-treat analysis among 115 subjects who provided serum samples for HIV genotypic resistance testing at baseline and at follow-up. The main outcome measures included total genotypic sensitivity score (GSS), future drug options (FDO), number of new drug resistance mutations (DRMs), and number of new major International AIDS Society (IAS) mutations. The adjusted probability of developing at least 1 new DRM did not differ between the 2 arms (SAT: 0.41 per person-year [PPY], DAART: 0.49 PPY; adjusted relative risk [RR] = 1.04; P = 0.90), nor did the number of new mutations (SAT: 0.76 PPY, DAART: 0.83 PPY; adjusted RR = 0.99; P = 0.99) or the probability of developing new major IAS drug mutations (SAT: 0.30 PPY, DAART: 0.33 PPY; adjusted RR = 1.12; P = 0.78). On measures of GSS and FDO, the 2 arms also did not differ. In this trial, DAART provided on-treatment virologic benefit for HIV-infected drug users without affecting the rate of development of antiretroviral medication resistance.
Joint probability of statistical success of multiple phase III trials.
Zhang, Jianliang; Zhang, Jenny J
2013-01-01
In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
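A minimal Monte Carlo sketch of the correlation effect described in this abstract, written in Python: a single phase II estimate induces shared uncertainty about the true effect, so the joint probability that two future trials are both significant differs from the naive product of their individual predictive powers. The effect size, standard errors and significance level below are hypothetical, and this is not the authors' closed-form method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical phase II summary: estimated effect and its standard error
theta_hat, se2 = 0.30, 0.15           # phase II estimate and standard error
se3 = 0.10                            # standard error of each future phase III trial
z_crit = stats.norm.ppf(0.975)        # two-sided 5% level, significance in the favourable direction

# Sample the "true" effect from the phase II posterior (flat prior assumed)
theta = rng.normal(theta_hat, se2, size=200_000)

# Conditional on theta, each trial's z-statistic is Normal(theta/se3, 1)
p_sig = stats.norm.sf(z_crit - theta / se3)   # P(trial significant | theta)

marginal = p_sig.mean()                       # predictive power of one trial
joint = (p_sig ** 2).mean()                   # both trials significant (correlated through theta)
naive = marginal ** 2                         # wrong: treats the two trials as independent

print(f"marginal predictive power: {marginal:.3f}")
print(f"joint predictive power:    {joint:.3f}")
print(f"naive product:             {naive:.3f}")
```

Because both trials share the same posterior for the true effect, the joint power is the expectation of p(theta) squared, which by Jensen's inequality is at least the square of the expected p(theta); the naive product therefore understates the joint PoSS in this setting.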
Changes in the prosthodontic literature 1966 to 2042.
Carlsson, Gunnar E
2005-05-01
To describe the growth and content of the prosthodontic literature over the last 4 decades, to make a prognosis on its probable development in the coming 4 decades and to discuss changes in the content of the International Journal of Prosthodontics (IJP) from its start in 1988 to 2004. MEDLINE was searched for articles on prosthodontics published between 1966 and April 2004. All volumes of IJP were examined with respect to type, subject area and geographic origin of articles. Using the term "prosthodontics," the MEDLINE search produced 66,600 hits. The proportion of clinical studies increased from 1% during the first 10-year period to 13% since 2001. Articles on removable dentures decreased during the period reviewed, whereas those on implant prosthodontics increased. Randomized controlled trials were rare and often of inadequate quality. Literature reviews have become popular, but many do not follow current guidelines for systematic reviews. A marked change in geographic origin of articles in IJP has occurred, with a decrease in material from North America and an increase in that from Europe and Asia. The Internet and open-access publishing will probably have a great impact on the future development of the prosthodontic literature. Substantial changes have occurred in the prosthodontic literature between 1966 and 2004, and they can be expected to continue with the rapid development of information technology and increased use of the Internet.
Journy, Neige M Y; Lee, Choonsik; Harbron, Richard W; McHugh, Kieran; Pearce, Mark S; Berrington de González, Amy
2017-01-03
To project risks of developing cancer and the number of cases potentially induced by past, current, and future computed tomography (CT) scans performed in the United Kingdom in individuals aged <20 years. Organ doses were estimated from surveys of individual scan parameters and CT protocols used in the United Kingdom. Frequencies of scans were estimated from the NHS Diagnostic Imaging Dataset. Excess lifetime risks (ELRs) of radiation-related cancer were calculated as cumulative lifetime risks, accounting for survival probabilities, using the RadRAT risk assessment tool. In 2000-2008, ELRs ranged from 0.3 to 1 per 1000 head scans and 1 to 5 per 1000 non-head scans. ELRs per scan were reduced by 50-70% in 2000-2008 compared with 1990-1995, subsequent to dose reduction over time. The 130 750 scans performed in 2015 in the United Kingdom were projected to induce 64 (90% uncertainty interval (UI): 38-113) future cancers. Current practices would lead to about 300 (90% UI: 230-680) future cancers induced by scans performed in 2016-2020. Absolute excess risks from single exposures would be low compared with background risks, but even small increases in annual CT rates over the next years would substantially increase the number of potential subsequent cancers.
Perceived risk of diabetes seriously underestimates actual diabetes risk: The KORA FF4 study
Stang, Andreas; Bongaerts, Brenda; Kuss, Oliver; Herder, Christian; Roden, Michael; Quante, Anne; Holle, Rolf; Huth, Cornelia; Peters, Annette; Meisinger, Christa
2017-01-01
Objective: Early detection of diabetes and prediabetic states is beneficial for patients, but may be delayed by patients' being overly optimistic about their own health. Therefore, we assessed how persons without known diabetes perceive their risk of having or developing diabetes, and we identified factors associated with perception of diabetes risk. Research design and methods: 1,953 participants without previously known diabetes from the population-based, German KORA FF4 Study (59.1 years, 47.8% men) had an oral glucose tolerance test. They estimated their probability of having undiagnosed diabetes mellitus (UDM) on a six-category scale, and assessed whether they were at risk of developing diabetes in the future. We cross-tabulated glycemic status with risk perception, and fitted robust Poisson regression models to identify determinants of diabetes risk perception. Results: 74% (95% CI: 65–82) of persons with UDM believed that their probability of having undetected diabetes was low or very low. 72% (95% CI: 69–75) of persons with prediabetes believed that they were not at risk of developing diabetes. In people with prediabetes, seeing oneself at risk of diabetes was associated with self-rated poor general health (prevalence ratio (PR) = 3.1, 95% CI: 1.4–6.8), parental diabetes (PR = 2.6, 1.9–3.4), high educational level (PR = 1.9, 1.4–2.5), lower age (PR = 0.7, 0.6–0.8, per 1 standard deviation increase), female sex (PR = 1.2, 0.9–1.5) and obesity (PR = 1.5, 1.2–2.0). Conclusions: People with undiagnosed diabetes or prediabetes considerably underestimate their probability of having or developing diabetes. Contrary to associations with actual diabetes risk, perceived diabetes risk was lower in men, lower educated and older persons. PMID:28141837
Yu, Soonyoung; Unger, Andre J A; Parker, Beth; Kim, Taehee
2012-06-15
In this study, we defined risk capital as the contingency fee or insurance premium that a brownfields redeveloper needs to set aside from the sale of each house in case they need to repurchase it at a later date because the indoor air has been detrimentally affected by subsurface contamination. The likelihood that indoor air concentrations will exceed a regulatory level subject to subsurface heterogeneity and source zone location uncertainty is simulated by a physics-based hydrogeological model using Monte Carlo realizations, yielding the probability of failure. The cost of failure is the future value of the house indexed to the stochastic US National Housing index. The risk capital is essentially the probability of failure times the cost of failure with a surcharge to compensate the developer against hydrogeological and financial uncertainty, with the surcharge acting as safety loading reflecting the developers' level of risk aversion. We review five methodologies taken from the actuarial and financial literature to price the risk capital for a highly stylized brownfield redevelopment project, with each method specifically adapted to accommodate our notion of the probability of failure. The objective of this paper is to develop an actuarially consistent approach for combining the hydrogeological and financial uncertainty into a contingency fee that the brownfields developer should reserve (i.e. the risk capital) in order to hedge their risk exposure during the project. Results indicate that the price of the risk capital is much more sensitive to hydrogeological rather than financial uncertainty. We use the Capital Asset Pricing Model to estimate the risk-adjusted discount rate to depreciate all costs to present value for the brownfield redevelopment project. A key outcome of this work is that the presentation of our risk capital valuation methodology is sufficiently generalized for application to a wide variety of engineering projects. Copyright © 2012 Elsevier Ltd. All rights reserved.
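As a rough illustration of the abstract's central quantity (probability of failure times cost of failure plus a surcharge), the sketch below applies only the simplest of the actuarial pricing rules, an expected-value principle with a safety loading; all numbers, including the failure probability and the housing-index returns, are hypothetical and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs (not from the paper)
p_fail = 0.03        # probability that indoor air exceeds the regulatory level (from MC realizations)
n_mc = 10_000        # Monte Carlo realizations of the future house value
value_now = 250_000  # sale price of the house today
growth = rng.normal(0.03, 0.10, size=n_mc)   # stochastic one-year housing-index return
cost_of_failure = value_now * (1 + growth)   # repurchase price if failure occurs

expected_loss = p_fail * cost_of_failure.mean()

theta = 0.25         # safety loading reflecting the developer's risk aversion
risk_capital = (1 + theta) * expected_loss   # expected-value premium principle

print(f"expected loss per house: ${expected_loss:,.0f}")
print(f"risk capital to reserve: ${risk_capital:,.0f}")
```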
Defining Baconian Probability for Use in Assurance Argumentation
NASA Technical Reports Server (NTRS)
Graydon, Patrick J.
2016-01-01
The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated (which they call Baconian probability) and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings, which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.
Parekh, Nikesh; Hodges, Stewart D; Pollock, Allyson M; Kirkwood, Graham
2012-06-01
The communication of injury risk in rugby and other sports is underdeveloped and parents, children and coaches need to be better informed about risk. A Poisson distribution was used to transform population-based incidence of injury into average probabilities of injury to individual players. The incidence of injury in schoolboy rugby matches ranges from 7 to 129.8 injuries per 1000 player-hours; these rates translate to average probabilities of injury to a player of between 12% and 90% over a season. Incidence of injury and average probabilities of injury over a season should be published together in all future epidemiological studies on school rugby and other sports. More research is required on informing and communicating injury risks to parents, staff and children and how it affects monitoring, decision making and prevention strategies.
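The Poisson translation described in this abstract can be reproduced in a few lines: the probability that a player sustains at least one injury over a season is 1 - exp(-rate x exposure). The seasonal exposure of roughly 18 match-hours per player used below is an assumed figure, chosen so that the reported incidence range (7 to 129.8 injuries per 1000 player-hours) maps onto the reported 12% to 90% season probabilities.

```python
import math

def season_injury_probability(incidence_per_1000h, exposure_hours):
    """Poisson model: P(at least one injury) = 1 - exp(-rate * exposure)."""
    rate_per_hour = incidence_per_1000h / 1000.0
    return 1.0 - math.exp(-rate_per_hour * exposure_hours)

exposure = 18.0  # assumed match-hours per player per season (illustrative only)
for incidence in (7.0, 129.8):   # range reported for schoolboy rugby matches
    p = season_injury_probability(incidence, exposure)
    print(f"{incidence:6.1f} injuries/1000 player-hours -> {p:.0%} season probability")
```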
Neural dynamics of reward probability coding: a Magnetoencephalographic study in humans
Thomas, Julie; Vanni-Mercier, Giovanna; Dreher, Jean-Claude
2013-01-01
Prediction of future rewards and discrepancy between actual and expected outcomes (prediction error) are crucial signals for adaptive behavior. In humans, a number of fMRI studies demonstrated that reward probability modulates these two signals in a large brain network. Yet, the spatio-temporal dynamics underlying the neural coding of reward probability remain unknown. Here, using magnetoencephalography, we investigated the neural dynamics of prediction and reward prediction error computations while subjects learned to associate cues of slot machines with monetary rewards with different probabilities. We showed that event-related magnetic fields (ERFs) arising from the visual cortex coded the expected reward value 155 ms after the cue, demonstrating that reward value signals emerge early in the visual stream. Moreover, a prediction error was reflected in an ERF peaking 300 ms after the rewarded outcome and showing decreasing amplitude with higher reward probability. This prediction error signal was generated in a network including the anterior and posterior cingulate cortex. These findings pinpoint the spatio-temporal characteristics underlying reward probability coding. Together, our results provide insights into the neural dynamics underlying the ability to learn probabilistic stimulus-reward contingencies. PMID:24302894
Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.
1998-01-01
The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties in the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes a fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to 2000. The probability of a Mw = 6.9 earthquake within 50 km of Osaka during 1997-2007 is estimated to have risen from 5-6% before the Kobe earthquake to 7-11% afterward; during 1997-2027, it is estimated to have risen from 14-16% before Kobe to 16-22%.
Interstellar Travel and Galactic Colonization: Insights from Percolation Theory and the Yule Process
NASA Astrophysics Data System (ADS)
Lingam, Manasvi
2016-06-01
In this paper, percolation theory is employed to place tentative bounds on the probability p of interstellar travel and the emergence of a civilization (or panspermia) that colonizes the entire Galaxy. The ensuing ramifications with regard to the Fermi paradox are also explored. In particular, it is suggested that the correlation function of inhabited exoplanets can be used to observationally constrain p in the near future. It is shown, by using a mathematical evolution model known as the Yule process, that the probability distribution for civilizations with a given number of colonized worlds is likely to exhibit a power-law tail. Some of the dynamical aspects of this issue, including the question of timescales and generalizations of percolation theory, are also studied. The limitations of these models, and other avenues for future inquiry, are also outlined.
Assessing changes in failure probability of dams in a changing climate
NASA Astrophysics Data System (ADS)
Mallakpour, I.; AghaKouchak, A.; Moftakhari, H.; Ragno, E.
2017-12-01
Dams are crucial infrastructure and provide resilience against hydrometeorological extremes (e.g., droughts and floods). In 2017, California experienced a series of flooding events that terminated a 5-year drought and led to incidents such as the structural failure of Oroville Dam's spillway. Because of the large socioeconomic repercussions of such incidents, it is of paramount importance to evaluate dam failure risks associated with projected shifts in the streamflow regime. This becomes even more important as the current procedures for the design of hydraulic structures (e.g., dams, bridges, spillways) are based on the stationarity assumption. Yet, changes in climate are anticipated to result in changes in the statistics of river flow (e.g., more extreme floods) and possibly to increase the failure probability of already aging dams. Here, we examine changes in discharge under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. In this study, we used routed daily streamflow data from ten global climate models (GCMs) to investigate possible climate-induced changes in streamflow in northern California. Our results show that while the average flow does not show a significant change, extreme floods are projected to increase in the future. Using extreme value theory, we estimate changes in the return periods of 50-year and 100-year floods in the current and future climates. Finally, we use the historical and future return periods to quantify changes in the failure probability of dams in a warming climate.
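A minimal sketch of the extreme-value step described above: fit generalized extreme value (GEV) distributions to historical and future annual-maximum flow series and ask what return period the historical 100-year flood has in the future climate. The series below are synthetic placeholders rather than routed GCM streamflow.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# Synthetic annual-maximum discharge series (placeholders for routed GCM output)
hist = genextreme.rvs(c=-0.1, loc=800, scale=200, size=60, random_state=rng)
fut  = genextreme.rvs(c=-0.1, loc=900, scale=260, size=60, random_state=rng)   # heavier upper tail

hist_fit = genextreme.fit(hist)   # (shape, loc, scale)
fut_fit  = genextreme.fit(fut)

# Historical 100-year flood magnitude (annual exceedance probability 0.01)
q100_hist = genextreme.ppf(1 - 1/100, *hist_fit)

# Return period of that same flood magnitude in the future climate
p_exceed_future = genextreme.sf(q100_hist, *fut_fit)
print(f"historical 100-yr flood: {q100_hist:.0f} m^3/s")
print(f"future return period of that flood: {1/p_exceed_future:.0f} years")
```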
NASA Astrophysics Data System (ADS)
Reeves, K. L.; Samson, C.; Summers, R. S.; Balaji, R.
2017-12-01
Drinking water treatment utilities (DWTUs) are tasked with the challenge of meeting disinfection and disinfection byproduct (DBP) regulations to provide safe, reliable drinking water under changing climate and land surface characteristics. DBPs form in drinking water when disinfectants, commonly chlorine, react with organic matter, as measured by total organic carbon (TOC); physical removal of pathogenic microorganisms is achieved by filtration and monitored by turbidity removal. Turbidity and TOC in influent waters to DWTUs are expected to increase due to variable climate and more frequent fires and droughts. Traditional methods for forecasting turbidity and TOC require catchment-specific data (i.e. streamflow) and have difficulty predicting them under a non-stationary climate. A modelling framework was developed to assist DWTUs with assessing their risk of future compliance with disinfection and DBP regulations under a changing climate. A local polynomial method was developed to predict surface water TOC using climate data collected from NOAA, Normalized Difference Vegetation Index (NDVI) data from the IRI Data Library, and historical TOC data from three DWTUs in diverse geographic locations. Characteristics from the DWTUs were used in the EPA Water Treatment Plant model to determine thresholds for influent TOC that resulted in DBP concentrations within compliance. Lastly, extreme value theory was used to predict probabilities of threshold exceedances under the current climate. Results from the utilities were used to produce a generalized TOC threshold approach that requires only water temperature and bromide concentration. The threshold exceedance model will be used to estimate probabilities of exceedances under projected climate scenarios. Initial results show that TOC can be forecasted from widely available data via statistical methods, with temperature, precipitation, the Palmer Drought Severity Index, and NDVI at various lags shown to be important predictors of TOC, and that TOC thresholds can be determined using water temperature and bromide concentration. Results include a model to predict influent turbidity and turbidity thresholds, similar to the TOC models, as well as probabilities of threshold exceedances for TOC and turbidity under a changing climate.
NASA Astrophysics Data System (ADS)
Rikitake, T.
1999-03-01
In light of newly-acquired geophysical information about earthquake generation in the Tokai area, Central Japan, where occurrence of a great earthquake of magnitude 8 or so has recently been feared, probabilities of earthquake occurrence in the near future are reevaluated. Much of the data used for evaluation here relies on recently-developed paleoseismology, tsunami study and GPS geodesy. The new Weibull distribution analysis of the recurrence tendency of great earthquakes in the Tokai-Nankai zone indicates that the mean return period of great earthquakes there is estimated as 109 yr with a standard deviation amounting to 33 yr. These values do not differ much from those of previous studies (Rikitake, 1976, 1986; Utsu, 1984). Taking the newly-determined velocities of the motion of the Philippine Sea plate at various portions of the Tokai-Nankai zone into account, the ultimate displacements to rupture at the plate boundary are obtained. A Weibull distribution analysis results in the mean ultimate displacement amounting to 4.70 m with a standard deviation estimated as 0.86 m. A return period amounting to 117 yr is obtained at the Suruga Bay portion by dividing the mean ultimate displacement by the relative plate velocity. With the aid of the fault models as determined from the tsunami studies, the increases in the cumulative seismic slips associated with the great earthquakes are examined at various portions of the zone. It appears that a slip-predictable model can better be applied to the occurrence mode of great earthquakes in the zone than a time-predictable model. The crustal strain accumulating over the Tokai area as estimated from the newly-developed geodetic work including the GPS observations is compared to the ultimate strain presumed by the above two models. The probabilities for a great earthquake to recur in the Tokai district are then estimated with the aid of the Weibull analysis parameters obtained for the four cases discussed above. All the probabilities evaluated for the four cases take on values ranging from 35 to 45 percent for a ten-year period following the year 2000.
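The Weibull hazard calculation behind the quoted 35-45 percent figure can be sketched as follows: recover the Weibull shape and scale from the reported mean return period (109 yr) and standard deviation (33 yr), then compute the conditional probability of rupture within ten years given the time elapsed since the last great Tokai earthquake. Taking that last event as the 1854 Ansei-Tokai earthquake is an assumption made here for illustration, not a statement from the abstract.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma

mean_T, sd_T = 109.0, 33.0      # return-period statistics reported in the abstract

# Recover Weibull shape k and scale lam from the mean and standard deviation
def cv_of_shape(k):
    m1 = gamma(1 + 1/k)
    m2 = gamma(1 + 2/k)
    return np.sqrt(m2 - m1**2) / m1    # coefficient of variation as a function of shape

k = brentq(lambda kk: cv_of_shape(kk) - sd_T / mean_T, 1.01, 20.0)
lam = mean_T / gamma(1 + 1/k)

F = lambda t: 1.0 - np.exp(-(t / lam) ** k)   # Weibull CDF of the recurrence interval

t_elapsed = 2000 - 1854                       # assumed: last great Tokai rupture in 1854
dt = 10.0
p_cond = (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))
print(f"shape k = {k:.2f}, scale = {lam:.0f} yr")
print(f"P(rupture in {dt:.0f} yr | quiet for {t_elapsed} yr) = {p_cond:.0%}")
```

With these inputs the conditional ten-year probability comes out in the low-to-mid 40 percent range, consistent with the 35-45 percent quoted for the four cases.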
O'Mahony, James F; Newall, Anthony T; van Rosmalen, Joost
2015-12-01
Time is an important aspect of health economic evaluation, as the timing and duration of clinical events, healthcare interventions and their consequences all affect estimated costs and effects. These issues should be reflected in the design of health economic models. This article considers three important aspects of time in modelling: (1) which cohorts to simulate and how far into the future to extend the analysis; (2) the simulation of time, including the difference between discrete-time and continuous-time models, cycle lengths, and converting rates and probabilities; and (3) discounting future costs and effects to their present values. We provide a methodological overview of these issues and make recommendations to help inform both the conduct of cost-effectiveness analyses and the interpretation of their results. For choosing which cohorts to simulate and how many, we suggest analysts carefully assess potential reasons for variation in cost effectiveness between cohorts and the feasibility of subgroup-specific recommendations. For the simulation of time, we recommend using short cycles or continuous-time models to avoid biases and the need for half-cycle corrections, and provide advice on the correct conversion of transition probabilities in state transition models. Finally, for discounting, analysts should not only follow current guidance and report how discounting was conducted, especially in the case of differential discounting, but also seek to develop an understanding of its rationale. Our overall recommendations are that analysts explicitly state and justify their modelling choices regarding time and consider how alternative choices may impact on results.
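Two of the conversions discussed in this article lend themselves to a short illustration: converting a constant event rate into a per-cycle transition probability (and correctly rescaling an annual probability to a different cycle length), and discounting a future cost to present value. The rate, probability and discount values below are illustrative only.

```python
import math

def rate_to_probability(rate_per_year, cycle_years):
    """Constant-hazard conversion: p = 1 - exp(-r * t)."""
    return 1.0 - math.exp(-rate_per_year * cycle_years)

def rescale_probability(p_annual, cycle_years):
    """Correct rescaling of an annual probability to another cycle length."""
    rate = -math.log(1.0 - p_annual)          # back out the underlying rate first
    return rate_to_probability(rate, cycle_years)

def present_value(cost, years_ahead, discount_rate=0.035):
    """Discount a future cost to present value at a constant annual rate."""
    return cost / (1.0 + discount_rate) ** years_ahead

p_month = rescale_probability(p_annual=0.20, cycle_years=1/12)
print(f"annual probability 0.20 -> monthly probability {p_month:.4f}")   # not simply 0.20/12
print(f"PV of 10,000 incurred in 15 years: {present_value(10_000, 15):,.0f}")
```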
Sakoda, Lori C; Henderson, Louise M; Caverly, Tanner J; Wernli, Karen J; Katki, Hormuzd A
2017-12-01
Risk prediction models may be useful for facilitating effective and high-quality decision-making at critical steps in the lung cancer screening process. This review provides a current overview of published lung cancer risk prediction models and their applications to lung cancer screening and highlights both challenges and strategies for improving their predictive performance and use in clinical practice. Since the 2011 publication of the National Lung Screening Trial results, numerous prediction models have been proposed to estimate the probability of developing or dying from lung cancer or the probability that a pulmonary nodule is malignant. Respective models appear to exhibit high discriminatory accuracy in identifying individuals at highest risk of lung cancer or differentiating malignant from benign pulmonary nodules. However, validation and critical comparison of the performance of these models in independent populations are limited. Little is also known about the extent to which risk prediction models are being applied in clinical practice and influencing decision-making processes and outcomes related to lung cancer screening. Current evidence is insufficient to determine which lung cancer risk prediction models are most clinically useful and how to best implement their use to optimize screening effectiveness and quality. To address these knowledge gaps, future research should be directed toward validating and enhancing existing risk prediction models for lung cancer and evaluating the application of model-based risk calculators and its corresponding impact on screening processes and outcomes.
Probabilistic Risk Assessment for Astronaut Post Flight Bone Fracture
NASA Technical Reports Server (NTRS)
Lewandowski, Beth; Myers, Jerry; Licata, Angelo
2015-01-01
Introduction: Space flight potentially reduces the loading that bone can resist before fracture. This reduction in bone integrity may result from a combination of factors, the most common reported as reduction in astronaut BMD. Although evaluating the condition of bones continues to be a critical aspect of understanding space flight fracture risk, defining the loading regime, whether on earth, in microgravity, or in reduced gravity on a planetary surface, remains a significant component of estimating the fracture risks to astronauts. This presentation summarizes the concepts, development, and application of NASA's Bone Fracture Risk Module (BFxRM) to understanding pre-, post, and in mission astronaut bone fracture risk. The overview includes an assessment of contributing factors utilized in the BFxRM and illustrates how new information, such as biomechanics of space suit design or better understanding of post flight activities may influence astronaut fracture risk. Opportunities for the bone mineral research community to contribute to future model development are also discussed. Methods: To investigate the conditions in which spaceflight induced changes to bone plays a critical role in post-flight fracture probability, we implement a modified version of the NASA Bone Fracture Risk Model (BFxRM). Modifications included incorporation of variations in physiological characteristics, post-flight recovery rate, and variations in lateral fall conditions within the probabilistic simulation parameter space. The modeled fracture probability estimates for different loading scenarios at preflight and at 0 and 365 days post-flight time periods are compared. Results: For simple lateral side falls, mean post-flight fracture probability is elevated over mean preflight fracture probability due to spaceflight induced BMD loss and is not fully recovered at 365 days post-flight. In the case of more energetic falls, such as from elevated heights or with the addition of lateral movement, the contribution of space flight quality changes is much less clear, indicating more granular assessments, such as Finite Element modeling, may be needed to further assess the risks in these scenarios.
Flood Risk Due to Hurricane Flooding
NASA Astrophysics Data System (ADS)
Olivera, Francisco; Hsu, Chih-Hung; Irish, Jennifer
2015-04-01
In this study, we evaluated the expected economic losses caused by hurricane inundation. We used surge response functions, which are physics-based dimensionless scaling laws that give surge elevation as a function of the hurricane's parameters (i.e., central pressure, radius, forward speed, approach angle and landfall location) at specified locations along the coast. These locations were close enough to avoid significant changes in surge elevations between consecutive points, and distant enough to minimize calculations. The probability of occurrence of a surge elevation value at a given location was estimated using a joint probability distribution of the hurricane parameters. The surge elevation, at the shoreline, was assumed to project horizontally inland within a polygon of influence. Individual parcel damage was calculated based on flood water depth and damage vs. depth curves available for different building types from the HAZUS computer application developed by the Federal Emergency Management Agency (FEMA). Parcel data, including property value and building type, were obtained from the county appraisal district offices. The expected economic losses were calculated as the sum of the products of the estimated parcel damages and their probability of occurrence for the different storms considered. Anticipated changes for future climate scenarios were considered by accounting for projected hurricane intensification, as indicated by sea surface temperature rise, and sea level rise, which modify the probability distribution of hurricane central pressure and change the baseline of the damage calculation, respectively. Maps of expected economic losses have been developed for Corpus Christi in Texas, Gulfport in Mississippi and Panama City in Florida. Specifically, for Port Aransas, in the Corpus Christi area, it was found that the expected economic losses were in the range of 1% to 4% of the property value for current climate conditions, of 1% to 8% for the 2030's and of 1% to 14% for the 2080's.
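A stripped-down version of the per-parcel expected-loss calculation described above: interpolate a depth-damage curve at the flood depth implied by each storm's surge elevation and weight the damages by the storms' probabilities. The depth-damage curve and storm set below are placeholders, not HAZUS curves or the joint probability model used in the study.

```python
import numpy as np

# Placeholder depth-damage curve (fraction of structure value vs. flood depth in m)
depths_m    = np.array([0.0, 0.3, 0.6, 1.0, 1.5, 2.5])
damage_frac = np.array([0.0, 0.05, 0.15, 0.30, 0.50, 0.80])

def parcel_expected_loss(property_value, ground_elev_m, surge_elevations_m, storm_probs):
    """Sum over storms of P(storm) * damage(depth), depth = surge elevation - ground elevation."""
    depths = np.clip(np.asarray(surge_elevations_m) - ground_elev_m, 0.0, None)
    frac = np.interp(depths, depths_m, damage_frac)
    return float(np.sum(np.asarray(storm_probs) * frac * property_value))

# Illustrative storm set: annualized probabilities and surge elevations at the nearest coastal point
loss = parcel_expected_loss(
    property_value=300_000, ground_elev_m=2.0,
    surge_elevations_m=[2.2, 2.8, 3.5], storm_probs=[0.02, 0.005, 0.001])
print(f"expected annual loss: ${loss:,.0f}")
```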
Inconvenient Truth or Convenient Fiction? Probable Maximum Precipitation and Nonstationarity
NASA Astrophysics Data System (ADS)
Nielsen-Gammon, J. W.
2017-12-01
According to the inconvenient truth that Probable Maximum Precipitation (PMP) represents a non-deterministic, statistically very rare event, future changes in PMP involve a complex interplay between future frequencies of storm type, storm morphology, and environmental characteristics, many of which are poorly constrained by global climate models. On the other hand, according to the convenient fiction that PMP represents an estimate of the maximum possible precipitation that can occur at a given location, as determined by storm maximization and transposition, the primary climatic driver of PMP change is simply a change in maximum moisture availability. Increases in boundary-layer and total-column moisture have been observed globally, are anticipated from basic physical principles, and are robustly projected to continue by global climate models. Thus, using the same techniques that are used within the PMP storm maximization process itself, future PMP values may be projected. The resulting PMP trend projections are qualitatively consistent with observed trends of extreme rainfall within Texas, suggesting that in this part of the world the inconvenient truth is congruent with the convenient fiction.
Sochan, Anne M
2011-07-01
How should nursing knowledge advance? This exploration contextualizes its evolution past and present. In addressing how it evolved in the past, a probable historical evolution of its development draws on the perspectives of Frank & Gills's World System Theory, Kuhn's treatise on Scientific Revolutions, and Foucault's notions of Discontinuities in scientific knowledge development. By describing plausible scenarios of how nursing knowledge evolved, I create a case for why nursing knowledge developers should adopt a post-structural stance in prioritizing their research agenda(s). Further, by adopting a post-structural stance, I create a case on how nurses can advance their disciplinary knowledge using an engaging post-colonial strategy. Given an interrupted history caused by influence(s) constraining nursing's knowledge development by power structures external, and internal, to nursing, knowledge development can evolve in the future by drawing on post-structural interpretation, and post-colonial strategy. The post-structural writings of Deleuze & Guattari's understanding of 'Nomadology' as a subtle means to resist being constrained by existing knowledge development structures, might be a useful stance to understanding the urgency of why nursing knowledge should advance addressing the structural influences on its development. Furthermore, Bhabha's post-colonial elucidation of 'Hybridity' as an equally discreet means to change the culture of those constraining structures is an appropriate strategy to enact how nursing knowledge developers can engage with existing power structures, and simultaneously influence that engagement. Taken together, 'post-structural stance' and 'post-colonial strategy' can refocus nursing scholarship to learn from its past, in order to develop relevant disciplinary knowledge in its future. © 2011 Blackwell Publishing Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masoumi, Ali; Vilenkin, Alexander; Yamada, Masaki, E-mail: ali@cosmos.phy.tufts.edu, E-mail: vilenkin@cosmos.phy.tufts.edu, E-mail: Masaki.Yamada@tufts.edu
In the landscape perspective, our Universe begins with a quantum tunneling from an eternally-inflating parent vacuum, followed by a period of slow-roll inflation. We investigate the tunneling process and calculate the probability distribution for the initial conditions and for the number of e-folds of slow-roll inflation, modeling the landscape by a small-field one-dimensional random Gaussian potential. We find that such a landscape is fully consistent with observations, but the probability for future detection of spatial curvature is rather low, P ∼ 10⁻³.
Scientific Knowledge and Technology, Animal Experimentation, and Pharmaceutical Development.
Kinter, Lewis B; DeGeorge, Joseph J
2016-12-01
Human discovery of pharmacologically active substances is arguably the oldest of the biomedical sciences with origins >3500 years ago. Since ancient times, four major transformations have dramatically impacted pharmaceutical development, each driven by advances in scientific knowledge, technology, and/or regulation: (1) anesthesia, analgesia, and antisepsis; (2) medicinal chemistry; (3) regulatory toxicology; and (4) targeted drug discovery. Animal experimentation in pharmaceutical development is a modern phenomenon dating from the 20th century and enabling several of the four transformations. While each transformation resulted in more effective and/or safer pharmaceuticals, overall attrition, cycle time, cost, numbers of animals used, and low probability of success for new products remain concerns, and pharmaceutical development remains a very high risk business proposition. In this manuscript we review pharmaceutical development since ancient times, describe its coevolution with animal experimentation, and attempt to predict the characteristics of future transformations. © The Author 2016. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved. For permissions, please email: journals.permissions@oup.com.
NASA Technical Reports Server (NTRS)
Masubuchi, K.; Agapakis, J. E.; Debiccari, A.; Vonalt, C.
1985-01-01
A six-month research program entitled Feasibility of Remotely Manipulated Welding in Space - A Step in the Development of Novel Joining Technologies was performed at the Massachusetts Institute of Technology for the Office of Space Science and Applications, NASA, under Contract No. NASW-3740. The work was performed as a part of the Innovative Utilization of the Space Station Program. The final report from M.I.T. was issued in September 1983. This paper presents a summary of the work performed under this contract. The objective of this research program was to initiate research for the development of packaged, remotely controlled welding systems for space construction and repair. The research effort included the following tasks: (1) identification of probable joining tasks in space; (2) identification of required levels of automation in space welding tasks; (3) development of novel space welding concepts; (4) development of recommended future studies; and (5) preparation of the final report.
Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections
NASA Astrophysics Data System (ADS)
Wakazuki, Y.
2015-12-01
A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty of multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from GCM simulation results were statistically analyzed using singular vector decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes equal to their standard deviations, were extracted and added to the ensemble mean of the climatological increments. The resulting multiple modal increments were used to create multiple modal lateral boundary conditions for the future-climate regional climate model (RCM) simulations by adding them to an objective analysis dataset. This data handling can be regarded as an extension of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of the GCM simulations yields approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for each mode were used to estimate the response to the perturbation of that mode. For the probabilistic analysis, the climatological variables of the RCMs were assumed to respond linearly to the multiple modal perturbations, although non-linearity was seen for local-scale rainfall. The probability distribution of temperature could be estimated with two-mode perturbation simulations, for which the number of RCM simulations for the future climate is five. Local-scale rainfall, on the other hand, required four-mode simulations, for which the number of RCM simulations is nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.
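A minimal numerical sketch of the modal-perturbation idea: take the inter-model spread of the GCM climatological increments, extract its leading singular vectors, scale each to one standard deviation, and add the positive and negative perturbations to the ensemble-mean increment. With two modes this yields the five future-climate boundary-condition fields mentioned in the abstract. The increment fields below are random placeholders, and this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical increments: n_gcm models x n_grid points (e.g. a flattened temperature field)
n_gcm, n_grid = 10, 500
increments = rng.normal(2.0, 0.8, size=(n_gcm, n_grid))   # placeholder GCM warming fields

mean_inc = increments.mean(axis=0)
anomalies = increments - mean_inc                          # inter-model spread

# Leading modes of the spread via SVD; scale each mode to one standard deviation
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
n_modes = 2
modal_fields = [(s[k] / np.sqrt(n_gcm - 1)) * Vt[k] for k in range(n_modes)]

# Boundary-condition increments: ensemble mean plus +/- one-sigma modal perturbations
bc_increments = [mean_inc]
for field in modal_fields:
    bc_increments += [mean_inc + field, mean_inc - field]

print(f"{len(bc_increments)} lateral-boundary increment fields "
      f"(mean and +/- perturbations for {n_modes} modes)")   # two modes -> five RCM runs
```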
NASA Astrophysics Data System (ADS)
MU, J.; Antle, J. M.; Zhang, H.; Capalbo, S. M.; Eigenbrode, S.; Kruger, C.; Stockle, C.; Wolfhorst, J. D.
2013-12-01
Representative Agricultural Pathways (RAPs) are projections of plausible future biophysical and socio-economic conditions used to carry out climate impact assessments for agriculture. The development of RAPs is motivated by the fact that the various global and regional models used for agricultural climate change impact assessment have been implemented with individualized scenarios using various data and model structures, often without transparent documentation or public availability. These practices have hampered attempts at model inter-comparison, improvement, and synthesis of model results across studies. This paper aims to (1) present RAPs developed for the principal wheat-producing region of the Pacific Northwest, and to (2) combine these RAPs with downscaled climate data, crop model simulations and economic model simulations to assess climate change impacts on winter wheat production and farm income. This research was carried out as part of a project funded by the USDA known as the Regional Approaches to Climate Change in the Pacific Northwest (REACCH). The REACCH study region encompasses the major winter wheat production area in the Pacific Northwest, and preliminary research shows that farmers producing winter wheat could benefit from future climate change. However, the future world is uncertain in many dimensions, including commodity and input prices, production technology, and policies, as well as an increased probability of disturbances (pests and diseases) associated with a changing climate. Many of these factors cannot be modeled, so they are represented in the regional RAPs. The regional RAPs are linked to global agricultural and shared socio-economic pathways, and used along with climate change projections to simulate future outcomes for the wheat-based farms in the REACCH region.
ERIC Educational Resources Information Center
Carnegie Council on Policy Studies in Higher Education, Berkeley, CA.
In this look at the near future of higher education, and in light of probable declining enrollments and resources, two perspectives are given, both drawn from existing literature. The first sets forth some of the fears of higher education professionals in one possible, if extreme, scenario; the second, some of their hopes in another. Among the…
Use of the Weibull function to predict future diameter distributions from current plot data
Quang V. Cao
2012-01-01
The Weibull function has been widely used to characterize diameter distributions in forest stands. The future diameter distribution of a forest stand can be predicted by use of a Weibull probability density function from current inventory data for that stand. The parameter recovery approach has been used to "recover" the Weibull parameters from diameter moments or...
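A small sketch of the moment-based parameter recovery mentioned above: solve the Weibull moment equations for the shape and scale that reproduce a projected mean and standard deviation of diameter, then tabulate the predicted proportion of trees by diameter class. The projected moments and class limits are assumed values for illustration, not outputs of the study's growth models.

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.special import gamma
from scipy.stats import weibull_min

# Assumed projections for the future stand (illustrative values, in cm)
mean_dbh, sd_dbh = 22.0, 6.0

# Recover Weibull shape (c) and scale (b) from the projected first two moments
def moment_equations(params):
    c, b = params
    m = b * gamma(1 + 1/c)
    v = b**2 * (gamma(1 + 2/c) - gamma(1 + 1/c)**2)
    return [m - mean_dbh, v - sd_dbh**2]

c, b = fsolve(moment_equations, x0=[3.0, 25.0])

# Predicted proportion of trees by 5-cm diameter class
edges = np.arange(5, 45, 5)
probs = np.diff(weibull_min.cdf(edges, c, scale=b))
for lo, hi, p in zip(edges[:-1], edges[1:], probs):
    print(f"{lo:>2}-{hi:<2} cm: {p:5.1%}")
```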
Experimental evidence for adaptive personalities in a wild passerine bird
Nicolaus, Marion; Tinbergen, Joost M.; Bouwman, Karen M.; Michler, Stephanie P. M.; Ubels, Richard; Both, Christiaan; Kempenaers, Bart; Dingemanse, Niels J.
2012-01-01
Individuals of the same species differ consistently in risky actions. Such ‘animal personality’ variation is intriguing because behavioural flexibility is often assumed to be the norm. Recent theory predicts that between-individual differences in propensity to take risks should evolve if individuals differ in future fitness expectations: individuals with high long-term fitness expectations (i.e. that have much to lose) should behave consistently more cautious than individuals with lower expectations. Consequently, any manipulation of future fitness expectations should result in within-individual changes in risky behaviour in the direction predicted by this adaptive theory. We tested this prediction and confirmed experimentally that individuals indeed adjust their ‘exploration behaviour’, a proxy for risk-taking behaviour, to their future fitness expectations. We show for wild great tits (Parus major) that individuals with experimentally decreased survival probability become faster explorers (i.e. increase risk-taking behaviour) compared to individuals with increased survival probability. We also show, using quantitative genetics approaches, that non-genetic effects (i.e. permanent environment effects) underpin adaptive personality variation in this species. This study thereby confirms a key prediction of adaptive personality theory based on life-history trade-offs, and implies that selection may indeed favour the evolution of personalities in situations where individuals differ in future fitness expectations. PMID:23097506
NASA Astrophysics Data System (ADS)
Ferrés, D.; Reyes Pimentel, T. A.; Espinasa-Pereña, R.; Nieto, A.; Sobradelo, R.; Flores, X.; González Huesca, A. E.; Ramirez, A.
2013-05-01
Popocatépetl volcano is one of the most active in Latin America. During its last cycle of activity, beginning at the end of 1994, more than 40 episodes of dome construction and destruction have occurred inside the summit crater. Most of these episodes ended with eruptions of VEI 1-2. Eruptions of higher intensity, of VEI ≥ 3, were also registered in 1997, 2001 and 2009; these produced eruptive columns up to 8 km high and abundant, frequent ash falls on the villages in the eastern sector of the volcano. The January 22nd 2001 eruption also produced pyroclastic flows that followed several streams on the volcanic cone, reaching 4 to 6 km, and transforming into mudflows that reached up to 15 km. The capital, Mexico City, lies within 80 km of Popocatépetl volcano and can be affected by ash fall during the first months of the rainy season (May to July). Other important cities, such as Puebla and Atlixco, are located 15 to 30 km from the crater. Several villages of the states of México, Puebla and Morelos, which have a total population of 40,000 people, lie within a radius of 12 to 15 km, where the impacts of any of the products of an eruption, including pyroclastic flows, are possible. This high exposure of people and infrastructure around Popocatépetl volcano emphasizes the need for tools for early warning and the development of preventive actions to protect the population from volcanic phenomena. The diagnosis of the volcanic activity, based on the information provided by the monitoring systems, and the prognosis of the evolution of the volcano in the short term are made by the Scientific Advisory Committee, formed by volcanologists of the National Autonomous University of Mexico, and by CENAPRED staff. From this prognosis, the alert level for the population is determined and communicated through the volcanic alert traffic-light code. A volcanic event tree was constructed with the advice of the scientific committee during the recent seismic-eruptive crisis of April-May 2012, in order to identify the most probable processes into which this unrest could have developed and to contribute to the diagnosis task. In this research, we propose a comparison between the processes identified in this preliminary volcanic event tree and another elaborated using the Hazard Assessment Event Tree probability tool (HASSET), built on a Bayesian event tree structure and using mainly the known eruptive history of Popocatépetl. The HASSET method is based on Bayesian inference and is used to assess the volcanic hazard of future eruptive scenarios by evaluating the most relevant sources of uncertainty that play a role in estimating the future probability of occurrence of a specific volcanic event. The final goal is to find the most useful tools to make the diagnosis and prognosis of the Popocatépetl volcanic activity, integrating the known eruptive history of the volcano, the experience of the scientific committee and the information provided by the monitoring systems, in an interactive and user-friendly way.
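For readers unfamiliar with Bayesian event trees, the toy calculation below shows the basic mechanics: each node's branch probabilities are posterior means from a Dirichlet-multinomial update of prior weights with counts of past episodes, and a scenario's probability is the product of branch probabilities along its path. The counts are hypothetical and represent neither Popocatépetl's actual catalogue nor the HASSET implementation.

```python
import numpy as np

def branch_probs(prior_weights, counts):
    """Posterior mean branch probabilities of a Dirichlet-multinomial node."""
    post = np.asarray(prior_weights, float) + np.asarray(counts, float)
    return post / post.sum()

# Hypothetical prior weights and counts of past episodes at each node (NOT the real catalogue)
# Node 1: unrest -> {magmatic, non-magmatic}
p_magmatic, _ = branch_probs([1, 1], counts=[30, 10])
# Node 2 (given magmatic unrest): outcome -> {dome growth only, explosive eruption}
p_dome, p_explosive = branch_probs([1, 1], counts=[35, 5])
# Node 3 (given explosive eruption): size -> {VEI 1-2, VEI 3, VEI >= 4}
p_vei = branch_probs([1, 1, 1], counts=[25, 3, 1])

# Probability of each explosive scenario = product of branch probabilities along its path
for label, p_size in zip(["VEI 1-2", "VEI 3", "VEI >= 4"], p_vei):
    print(f"P(magmatic unrest -> explosive -> {label}) = {p_magmatic * p_explosive * p_size:.3f}")
```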
NASA Astrophysics Data System (ADS)
Zarekarizi, M.; Moradkhani, H.; Yan, H.
2017-12-01
The Operational Probabilistic Drought Forecasting System (OPDFS) is an online tool recently developed at Portland State University for operational agricultural drought forecasting. It is an integrated statistical-dynamical framework issuing probabilistic drought forecasts monthly for lead times of 1, 2, and 3 months. The statistical drought forecasting method utilizes copula functions to condition future soil moisture values on the antecedent states. Due to the stochastic nature of land surface properties, the antecedent soil moisture states are uncertain; therefore, a data assimilation system based on particle filtering (PF) is employed to quantify the uncertainties associated with the initial condition of the land state, i.e. soil moisture. The PF assimilates satellite soil moisture data into the Variable Infiltration Capacity (VIC) land surface model and ultimately updates the simulated soil moisture. The OPDFS builds on NOAA's seasonal drought outlook by offering drought probabilities instead of qualitative ordinal categories and provides the user with the probability maps associated with a particular drought category. A retrospective assessment of the OPDFS showed that forecasts of the 2012 Great Plains and 2014 California droughts were possible at least one month in advance. The OPDFS offers timely assistance to water managers, stakeholders and decision-makers to develop resilience against uncertain upcoming droughts.
Siol, V; Lange, A; Prenzler, A; Neubauer, S; Frank, M
2017-05-01
Objectives: The present study aims to investigate the interest of young adults in predictive oncological genetic testing and their willingness to pay for such a test, and to identify major determinants of these two variables. Methods: 348 students of economics from the Leibniz University of Hanover were surveyed in July 2013 using an extensive questionnaire. Among other things, participants were asked whether they were interested in information about their probability of developing cancer in the future and what they would be willing to pay for such information. Data were analysed using descriptive statistics and ordinal probit regressions; additionally, marginal effects were calculated. Results: About 50% of the students were interested in predictive oncological genetic testing and were willing to pay for the test. Moreover, some of the participants who were willing to pay attached high monetary values to the information that could thus be obtained. The study shows that the students' interest and willingness to pay were primarily influenced by individual attitudes and perceptions. Conclusions: The study indicates that young adults are interested in predictive genetic testing and value information about their probability of developing cancer someday. © Georg Thieme Verlag KG Stuttgart · New York.
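For readers unfamiliar with ordinal probit regression, a minimal sketch using statsmodels (version 0.12 or later) on synthetic survey data follows; the covariate names, cut points and outcome categories are hypothetical, not the questionnaire items used in the study.

# Hedged sketch of an ordered probit on synthetic survey data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 348
df = pd.DataFrame({
    "risk_perception": rng.normal(size=n),
    "family_history": rng.integers(0, 2, size=n),
})
latent = 0.8 * df.risk_perception + 0.5 * df.family_history + rng.normal(size=n)
# Ordinal outcome: no interest < some interest < strong interest.
df["interest"] = pd.cut(latent, bins=[-np.inf, -0.3, 0.7, np.inf],
                        labels=["none", "some", "strong"])

model = OrderedModel(df["interest"], df[["risk_perception", "family_history"]],
                     distr="probit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())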
(abstract) Infrared Cirrus and Future Space Based Astronomy
NASA Technical Reports Server (NTRS)
Gautier, T. N.
1993-01-01
A review of the known properties of the distribution of infrared cirrus is followed by a discussion of the implications of cirrus for observations from space, including the probable limitations that IR cirrus places on space observations.
A Robust Decision-Making Technique for Water Management under Decadal Scale Climate Variability
NASA Astrophysics Data System (ADS)
Callihan, L.; Zagona, E. A.; Rajagopalan, B.
2013-12-01
Robust decision making, a flexible and dynamic approach to managing water resources in light of deep uncertainties associated with climate variability at inter-annual to decadal time scales, is an analytical framework that detects when a system is in or approaching a vulnerable state. It provides decision makers the opportunity to implement strategies that both address the vulnerabilities and perform well over a wide range of plausible future scenarios. A strategy that performs acceptably over a wide range of possible future states is not likely to be optimal with respect to the actual future state. The degree of success, that is, the ability to avoid vulnerable states and operate efficiently, thus depends on the skill in projecting future states and the ability to select the most efficient strategies to address vulnerabilities. This research develops a robust decision making framework that incorporates new methods of decadal-scale projection with the selection of efficient strategies. Previous approaches to water resources planning under inter-annual climate variability, which combine skillful seasonal flow forecasts with climatology for subsequent years, are not skillful for medium-term (i.e., decadal-scale) projections, so decision makers are not able to plan adequately to avoid vulnerabilities. We address this need by integrating skillful decadal-scale streamflow projections into the robust decision making framework and making the probability distribution of each projection available to the decision making logic. The range of possible future hydrologic scenarios can be defined using a variety of nonparametric methods. Once defined, an ensemble projection of decadal flow scenarios is generated with a wavelet-based spectral K-nearest-neighbor resampling approach using historical and paleo-reconstructed data. This method has been shown to generate skillful medium-term projections with a rich variety of natural variability. The current state of the system, in combination with the probability distribution of the projected flow ensembles, enables the selection of appropriate decision options. This process is repeated for each year of the planning horizon, resulting in system outcomes that can be evaluated on their performance and resiliency. The research utilizes the RiverSMART suite of software modeling and analysis tools developed under the Bureau of Reclamation's WaterSMART initiative and built around the RiverWare modeling environment. A case study is developed for the Gunnison and Upper Colorado River Basins. The ability to mitigate vulnerability using the framework is gauged by system performance indicators that measure the ability of the system to meet various water demands (i.e., agriculture, environmental flows, hydropower, etc.). Options and strategies for addressing vulnerabilities include measures such as conservation, reallocation and adjustments to operational policy. In addition to being able to mitigate vulnerabilities, options and strategies are evaluated based on benefits, costs and reliability. Flow ensembles are also simulated to incorporate the mean and variance from climate change projections for the planning horizon, and the above robust decision-making framework is applied to evaluate its performance under a changing climate.
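A minimal sketch of the K-nearest-neighbour resampling idea behind the flow projection step is given below; it uses synthetic annual flows, a plain (non-wavelet) nearest-neighbour bootstrap and invented parameter values, so it illustrates the mechanism rather than the study's actual projection method.

# Minimal KNN-resampling sketch for generating decadal flow traces conditioned
# on the current state (synthetic data, illustrative parameters).
import numpy as np

rng = np.random.default_rng(7)
flows = rng.lognormal(mean=7.0, sigma=0.3, size=120)   # synthetic historical/paleo annual flows

def knn_trace(flows, start, horizon=10, k=10):
    """Generate one projected flow trace by resampling successors of nearest neighbours."""
    trace, current = [], start
    for _ in range(horizon):
        dist = np.abs(flows[:-1] - current)             # neighbours among all but the last year
        idx = np.argsort(dist)[:k]
        w = 1.0 / np.arange(1, k + 1)                   # weight nearer neighbours more heavily
        w /= w.sum()
        pick = rng.choice(idx, p=w)
        current = flows[pick + 1]                       # successor of the chosen neighbour
        trace.append(current)
    return np.array(trace)

ensemble = np.array([knn_trace(flows, start=flows[-1]) for _ in range(500)])
print("Median projected decadal-mean flow:", np.median(ensemble.mean(axis=1)).round(1))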
Planetary Protection for future missions to Europa and other icy moons: the more things change...
NASA Astrophysics Data System (ADS)
Conley, C. A.; Race, M.
2007-12-01
NASA maintains a planetary protection policy regarding contamination of extraterrestrial bodies by terrestrial microorganisms and organic compounds, and sets limits intended to minimize or prevent contamination resulting from spaceflight missions. Europa continues to be a high-priority target for astrobiological investigations, and other icy moons of the outer planets are becoming increasingly interesting as data are returned from current missions. In 2000, a study was released by the NRC that provided recommendations on preventing the forward contamination of Europa. This study addressed a number of issues, including cleaning and sterilization requirements, the applicability of protocols derived from Viking and other missions to Mars, and the need to supplement spore-based culture methods in assessing spacecraft bioload. The committee also identified a number of future studies that would improve knowledge of Europa and better define issues related to forward contamination of that body. The standard recommended by the 2000 study and adopted by NASA uses a probabilistic approach, such that spacecraft sent to Europa must demonstrate a probability of less than 10^-4 per mission of contaminating a europan ocean with one viable terrestrial organism. A number of factors enter into the equation for calculating this probability, including at least the bioload at launch, the probability of survival during flight, the probability of reaching the surface of Europa, and the probability of reaching a europan ocean. Recently, the NASA Planetary Protection Subcommittee of the NASA Advisory Council has recommended that the probabilistic approach recommended for Europa be applied to all outer planet icy moons, until another NRC study can be convened to reevaluate the issues in light of recent data. This presentation will discuss the status of current and anticipated planetary protection considerations for missions to Europa and other icy moons.
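The probabilistic requirement quoted above is essentially a product of factors; the toy calculation below shows the form of such a budget with placeholder values (none of them are actual mission estimates).

# Back-of-the-envelope sketch of a planetary-protection probability budget;
# every factor value below is an assumed placeholder.
bioload_at_launch = 3.0e4   # viable organisms on the spacecraft (assumed)
p_survive_cruise  = 1.0e-4  # fraction surviving radiation/vacuum (assumed)
p_reach_surface   = 1.0e-2  # probability of impacting Europa (assumed)
p_reach_ocean     = 1.0e-3  # probability of transport to a subsurface ocean (assumed)

expected_contaminants = (bioload_at_launch * p_survive_cruise *
                         p_reach_surface * p_reach_ocean)
print(f"Expected viable organisms reaching an ocean: {expected_contaminants:.1e}")
print("Meets 1e-4 requirement:", expected_contaminants < 1e-4)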
Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)
NASA Astrophysics Data System (ADS)
Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro
2013-04-01
The assessment of the probable spatial distribution of new eruptions is useful for managing and reducing volcanic risk. It can be achieved in different ways, but it becomes especially difficult when dealing with volcanic areas that are less studied, poorly monitored and characterized by infrequent activity, such as El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea" El Hierro had been the least studied volcanic island of the Canaries, with historical attention devoted mainly to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of volcano-structural data collected mostly on the island and, secondarily, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate, weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. The probability of future eruptive events on El Hierro is mainly concentrated on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, peaking in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts, and in the submarine parts of the rifts. This map represents the first effort to deal with the volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.
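A rough sketch of how mapped volcano-structural point data can be turned into a spatial vent-opening probability surface is given below; it uses a simple kernel density estimate as the intensity of a non-homogeneous Poisson process on synthetic coordinates, standing in for the weighted multi-dataset procedure described above.

# Minimal sketch: kernel density of structural points -> per-cell vent-opening
# probability (synthetic coordinates, not El Hierro data).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
# Synthetic vent/dyke locations clustered along two "rift" trends (km coordinates).
rift1 = rng.normal([0, 0], [6, 1], size=(60, 2))
rift2 = rng.normal([5, -8], [1, 5], size=(40, 2))
pts = np.vstack([rift1, rift2]).T                      # shape (2, n) for gaussian_kde

kde = gaussian_kde(pts)
x, y = np.meshgrid(np.linspace(-15, 15, 100), np.linspace(-20, 10, 100))
intensity = kde(np.vstack([x.ravel(), y.ravel()])).reshape(x.shape)
prob = intensity / intensity.sum()                     # per-cell vent-opening probability
print("Most probable cell (x, y):",
      (round(x.ravel()[prob.argmax()], 1), round(y.ravel()[prob.argmax()], 1)))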
Clinical case definition for the diagnosis of acute intussusception.
Bines, Julie E; Ivanoff, Bernard; Justice, Frances; Mulholland, Kim
2004-11-01
Because of the reported association between intussusception and a rotavirus vaccine, future clinical trials of rotavirus vaccines will need to include intussusception surveillance in the evaluation of vaccine safety. The aim of this study is to develop and validate a clinical case definition for the diagnosis of acute intussusception. A clinical case definition for the diagnosis of acute intussusception was developed by analysis of an extensive literature review that defined the clinical presentation of intussusception in 70 developed and developing countries. The clinical case definition was then assessed for sensitivity and specificity using a retrospective chart review of hospital admissions. Sensitivity of the clinical case definition was assessed in children diagnosed with intussusception over a 6.5-year period. Specificity was assessed in patients aged <2 years admitted with bowel obstruction and in patients aged <19 years presenting with symptoms that may occur in intussusception. The clinical case definition accurately identified 185 of 191 assessable cases as "probable" intussusception and six cases as "possible" intussusception (sensitivity, 97%). No case of radiologic or surgically proven intussusception failed to be identified by the clinical case definition. The specificity of the definition in correctly identifying patients who did not have intussusception ranged from 87% to 91%. The clinical case definition for intussusception may assist in the prompt identification of patients with intussusception and may provide an important tool for the future trials of enteric vaccines.
Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate
NASA Astrophysics Data System (ADS)
Samaras, C.; Cook, L.
2015-12-01
Infrastructure systems are expected to be functional, durable and safe over long service lives - 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, and institutions and regulations will need to be adaptable for a range of future conditions (conditions of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: involve all stakeholders (owners, financiers, insurers, regulators, the affected public, climate/weather scientists, etc.) in key decisions; use low-regret, adaptive strategies, such as robust decision making and the observational method; comply with relevant standards and regulations, and exceed their requirements where appropriate; and publish design studies and performance/failure investigations to extend the body of knowledge for the advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities, and account rationally for climate change in revised engineering standards and codes. This presentation presents initial research on decision making under uncertainty for climate-resilient infrastructure design.
NASA Astrophysics Data System (ADS)
Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.
2009-04-01
Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006, it is expected that tsunamis are likely to occur also in the near future, owing to increased tectonic tensions leading to abrupt vertical seafloor alterations after a century of relative tectonic silence. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. For tsunami impact, hazard assessment is mostly covered by numerical modelling, because model results normally offer the most precise basis for a hazard analysis: they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is mostly chosen by a worst-case approach; hence the location and magnitude which are likely to occur and which are assumed to generate the worst impact are used to predict the impact in a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must be considered. Thus a multi-scenario tsunami model approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian Early Warning System, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at the coast in this area and the estimated times of arrival (ETAs) of the waves, caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap each other along the coast and must be combined to obtain one comprehensive hazard assessment for all possible future tsunamis in the region under observation. The simplest way to derive the inundation probability along the coast using the multi-scenario approach is to overlay all scenario inundation results and to determine how often a point on land is significantly inundated across the various scenarios. But this does not take into account that the tsunamigenic sources used for the modelled scenarios have different likelihoods of causing a tsunami. Hence a statistical analysis of historical data and of geophysical investigation results based on numerical modelling is added to the hazard assessment, which clearly improves its significance. For this purpose, the present method is developed; it contains a logical combination of the diverse probabilities assessed, such as the probability of occurrence of different earthquake magnitudes at different localities, the probability of occurrence of a specific wave height at the coast, and the probability that each point on land will be hit by a tsunami. The values are combined by a logical tree technique and quantified by statistical analysis of historical data and of the tsunami modelling results as mentioned before.
This results in a tsunami inundation probability map covering the south-west coast of Indonesia which nevertheless shows significant spatial diversity, offering a good basis for evacuation planning and mitigation strategies. Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
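The logical combination of scenario results can be illustrated with a toy calculation: each scenario flags which land points it inundates, each source is assigned an occurrence probability, and the per-point probability of being hit by at least one source follows; all numbers below are invented.

# Toy multi-scenario aggregation: per-point inundation probability from
# scenario masks and assumed source probabilities (illustrative only).
import numpy as np

n_points, n_scenarios = 5, 4
rng = np.random.default_rng(11)

# 1 where scenario s significantly inundates land point i, 0 otherwise.
inundated = rng.integers(0, 2, size=(n_scenarios, n_points))

# Annual occurrence probability of each scenario's source (magnitude/location dependent).
p_source = np.array([0.02, 0.01, 0.005, 0.001])

# Probability that a point is hit by at least one of the (assumed independent) sources.
p_hit = 1.0 - np.prod(1.0 - inundated * p_source[:, None], axis=0)
print("Per-point inundation probability:", np.round(p_hit, 4))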
Worry and perceived threat of proximal and distal undesirable outcomes.
Bredemeier, Keith; Berenbaum, Howard; Spielberg, Jeffrey M
2012-04-01
Individuals who are prone to worry tend to overestimate the likelihoods and costs of future undesirable outcomes. However, it is unclear whether these relations vary as a function of the timeframe of the event in question. In the present study, 342 undergraduate students completed a self-report measure of worry and rated the perceived probabilities and costs of 40 undesirable outcomes. Specifically, each participant estimated the probability that each of these outcomes would occur within three different timeframes: the next month, the next year, and the next 10 years. We found that the strength of the association between worry and probability estimates was strongest for the most proximal timeframe. Probability estimates were more strongly associated with worry for participants with elevated cost estimates, and this interactive effect was strongest for the most distal timeframe. Implications of these findings for understanding the etiology and treatment of excessive worry are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Culture and Probability Judgment Accuracy: The Influence of Holistic Reasoning.
Lechuga, Julia; Wiebe, John S
2011-08-01
A well-established phenomenon in the judgment and decision-making tradition is the overconfidence one places in the amount of knowledge that one possesses. Overconfidence or probability judgment accuracy varies not only individually but also across cultures. However, research efforts to explain cross-cultural variations in the overconfidence phenomenon have seldom been made. In Study 1, the authors compared the probability judgment accuracy of U.S. Americans (N = 108) and Mexican participants (N = 100). In Study 2, they experimentally primed culture by randomly assigning English/Spanish bilingual Mexican Americans (N = 195) to response language. Results of both studies replicated the cross-cultural variation of probability judgment accuracy previously observed in other cultural groups. U.S. Americans displayed less overconfidence when compared to Mexicans. These results were then replicated in bilingual participants, when culture was experimentally manipulated with language priming. Holistic reasoning did not account for the cross-cultural variation of overconfidence. Suggestions for future studies are discussed.
Probabilistic safety analysis of earth retaining structures during earthquakes
NASA Astrophysics Data System (ADS)
Grivas, D. A.; Souflis, C.
1982-07-01
A procedure is presented for determining the probability of failure of earth retaining structures under static or seismic conditions. Four possible modes of failure (overturning, base sliding, bearing capacity, and overall sliding) are examined and their combined effect is evaluated with the aid of combinatorial analysis. The probability of failure is shown to be a more adequate measure of safety than the customary factor of safety. Because earth retaining structures may fail in four distinct modes, a system analysis can provide a single estimate of the probability of failure. A Bayesian formulation of the safety of retaining walls is found to provide an improved measure of the predicted probability of failure under seismic loading. The presented Bayesian analysis can account for the damage incurred by a retaining wall during an earthquake to provide an improved estimate of its probability of failure during future seismic events.
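A minimal sketch of the system-level combination of the four failure modes follows; the mode probabilities are placeholders, and the independence and upper-bound formulas are standard series-system approximations rather than the paper's combinatorial analysis.

# Minimal sketch of combining four failure modes into a system probability of
# failure (placeholder mode probabilities).
import numpy as np

p_mode = {"overturning": 0.010, "base_sliding": 0.020,
          "bearing_capacity": 0.005, "overall_sliding": 0.008}

p = np.array(list(p_mode.values()))
# Series-system estimates: independence gives one value, mutual exclusivity an upper bound.
p_sys_independent = 1.0 - np.prod(1.0 - p)
p_sys_upper_bound = min(1.0, p.sum())
print(f"System P_f (independent modes): {p_sys_independent:.4f}")
print(f"System P_f (simple upper bound): {p_sys_upper_bound:.4f}")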
Constructing event trees for volcanic crises
Newhall, C.; Hoblitt, R.
2002-01-01
Event trees are useful frameworks for discussing probabilities of possible outcomes of volcanic unrest. Each branch of the tree leads from a necessary prior event to a more specific outcome, e.g., from an eruption to a pyroclastic flow. Where volcanic processes are poorly understood, probability estimates might be purely empirical - utilizing observations of past and current activity and an assumption that the future will mimic the past or follow a present trend. If processes are better understood, probabilities might be estimated from a theoretical model, either subjectively or by numerical simulations. Use of Bayes' theorem aids in the estimation of how fresh unrest raises (or lowers) the probabilities of eruptions. Use of event trees during volcanic crises can help volcanologists to critically review their analysis of hazard, and help officials and individuals to compare volcanic risks with more familiar risks. Trees also emphasize the inherently probabilistic nature of volcano forecasts, with multiple possible outcomes.
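The Bayes'-theorem step mentioned above can be made concrete with a two-line calculation; the prior and likelihoods below are illustrative, not calibrated monitoring statistics.

# Minimal sketch of how an observed unrest indicator updates the eruption
# probability via Bayes' theorem (illustrative rates only).
prior_eruption = 0.05                 # long-run chance of eruption in the window
p_sign_given_eruption = 0.80          # unrest sign usually precedes eruptions
p_sign_given_no_eruption = 0.20       # but also occurs without eruption

evidence = (p_sign_given_eruption * prior_eruption +
            p_sign_given_no_eruption * (1.0 - prior_eruption))
posterior = p_sign_given_eruption * prior_eruption / evidence
print(f"P(eruption | unrest sign) = {posterior:.2f}")   # ~0.17 with these numbers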
NASA Astrophysics Data System (ADS)
Gürbüz, Ramazan
2010-09-01
The purpose of this study is to investigate and compare the effects of activity-based and traditional instruction on students' conceptual development of certain probability concepts. The study was conducted using a pretest-posttest control group design with 80 seventh graders. A 'Conceptual Development Test' developed for the study, comprising 12 open-ended questions, was administered to both groups of students before and after the intervention. The data were analysed using analysis of covariance, with the pretest as covariate. The results revealed that activity-based instruction (ABI) outperformed its traditional counterpart in the development of probability concepts. Furthermore, ABI was found to contribute most to students' conceptual development of the concept of 'Probability of an Event' and least to the concept of 'Sample Space'. As a consequence, it can be deduced that the designed instructional process was effective in the instruction of probability concepts.
Building entity models through observation and learning
NASA Astrophysics Data System (ADS)
Garcia, Richard; Kania, Robert; Fields, MaryAnne; Barnes, Laura
2011-05-01
To support the missions and tasks of mixed robotic/human teams, future robotic systems will need to adapt to the dynamic behavior of both teammates and opponents. One of the basic elements of this adaptation is the ability to exploit both long- and short-term temporal data. This adaptation allows robotic systems to predict and anticipate, as well as influence, the future behavior of both opponents and teammates, and affords the system the ability to adjust its own behavior in order to optimize its ability to achieve the mission goals. This work is a preliminary step in the effort to develop online entity behavior models through a combination of learning techniques and observations. As knowledge is extracted from the system through sensor and temporal feedback, agents within the multi-agent system attempt to develop and exploit a basic movement model of an opponent. For the purposes of this work, extraction and exploitation are performed through the use of a discretized two-dimensional game. The game consists of a predetermined number of sentries attempting to keep an unknown intruder agent from penetrating their territory. The sentries use temporal data coupled with past opponent observations to hypothesize the probable locations of the opponent and thus optimize their guarding locations.
Assessment of the potential future market in Sweden for hydrogen as an energy carrier
NASA Astrophysics Data System (ADS)
Carleson, G.
Future hydrogen markets for the period 1980-2025 are projected, the probable range of hydrogen production costs for various manufacturing methods is estimated, and expected market shares in competition with alternative energy carriers are evaluated. A general scenario for economic and industrial development in Sweden for the given period was evaluated, showing the average increase in gross national product to be 1.6% per year. Three different energy scenarios were then developed: alternatives were based on nuclear energy, renewable indigenous energy sources, and the present energy situation with free access to imported natural or synthetic fuels. An analysis was made within each scenario of the competitiveness of hydrogen on both the demand and supply sides in the following sectors: chemical industry, steel industry, peak power production, residential and commercial heating, and transportation. Costs were calculated for the production, storage and transmission of hydrogen according to technically feasible methods and were compared to those of alternative energy carriers. Health, environmental and societal implications were also considered. The market penetration of hydrogen in each sector was estimated, and the required investment capital was shown to be less than 4% of the national gross investment sum.
New and future heat pump technologies
NASA Astrophysics Data System (ADS)
Creswick, F. A.
It is not possible to say for sure what future heat pumps will look like, but there are some interesting possibilities. In the next five years, we are likely to see US heat pumps with two kinds of innovations: capacity modulation and charge control. Capacity modulation will be accomplished by variable-speed compressor motors. The objective of charge control is to keep the refrigerant charge in the system where it belongs for best performance; there are probably many ways to accomplish this. Charge control will improve efficiency and durability; capacity modulation will further improve efficiency and comfort. The Stirling cycle heat pump has several interesting advantages, but it is farther out in time. At present, we don't know how to make it as efficient as the conventional vapor-compression heat pump. Electric utility people should be aware that major advances are being made in gas-fired heat pumps which could provide strong competition in the future. However, even a gas-fired heat pump has a substantial auxiliary electric power requirement. The resources needed to develop advanced heat pumps are substantial and foreign competition will be intense. It will be important for utilities, manufacturers, and the federal government to work in close cooperation.
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had previously been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected by administering a test of mathematical ability and choosing students with high ability. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion; data credibility was established through time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; thus the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers formulate probability learning goals for elementary school students, and teachers could teach probability with regard to gender differences.
Communication theory and the search for effective feedback.
Simonds, S K
1995-01-01
If messages transmitted to the public, patients and health professionals could be assured of being received, understood and acted on as intended by the senders of messages, there would be little need to focus on communications and feedback. That the physician's office, the healthcare system and the community are littered with messages that 'never got through' attests to the problem of ineffective communications and the absence of effective feedback. Communication theorists, health psychologists and thoughtful health professionals, particularly those working in community hypertension programmes, have developed approaches that improve the probabilities of 'getting the message through'. Theory-based communications with built-in feedback and 'feed-forward' enhance the probabilities of success considerably. This presentation explores these problems using the SMCR model of communication. Differences between linear models and transactional models are discussed. On the assumption that the health message environments of the future will be increasingly complex, with highly differentiated target audiences in a rapidly paced, computer- and electronically-driven world, 'getting the message through' will become an even greater challenge than in the recent past. Specific steps to change communication approaches in this setting are proposed.
Influence of age on androgen deprivation therapy-associated Alzheimer’s disease
NASA Astrophysics Data System (ADS)
Nead, Kevin T.; Gaskin, Greg; Chester, Cariad; Swisher-McClure, Samuel; Dudley, Joel T.; Leeper, Nicholas J.; Shah, Nigam H.
2016-10-01
We recently found an association between androgen deprivation therapy (ADT) and Alzheimer’s disease. As Alzheimer’s disease is a disease of advanced age, we hypothesize that older individuals on ADT may be at greatest risk. We conducted a retrospective multi-institutional analysis among 16,888 individuals with prostate cancer using an informatics approach. We tested the effect of ADT on Alzheimer’s disease using Kaplan-Meier age-stratified analyses in a propensity-score-matched cohort. We found a lower cumulative probability of remaining Alzheimer’s disease-free for non-ADT users aged ≥70 compared with those aged <70 years (p < 0.001), and for ADT users compared with non-ADT users aged ≥70 years (p = 0.034). The 5-year probability of developing Alzheimer’s disease was 2.9%, 1.9% and 0.5% among ADT users ≥70, non-ADT users ≥70 and individuals <70 years, respectively. Compared to younger individuals, older men on ADT may have the greatest absolute Alzheimer’s disease risk. Future work should investigate the ADT-Alzheimer’s disease association in advanced age populations given the greater potential clinical impact.
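An age-stratified Kaplan-Meier comparison of the kind reported above can be sketched with the lifelines package on synthetic follow-up data; the hazard rates, group sizes and five-year window below are assumptions for illustration only.

# Hedged sketch of an age-stratified Kaplan-Meier comparison on synthetic data
# (not the study's cohort), using the lifelines package.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(5)
n = 2000
df = pd.DataFrame({
    "adt": rng.integers(0, 2, size=n),
    "age70plus": rng.integers(0, 2, size=n),
})
# Synthetic time-to-Alzheimer's with a higher hazard for older ADT users.
rate = 0.002 + 0.004 * df.adt * df.age70plus + 0.002 * df.age70plus
df["time"] = rng.exponential(1.0 / rate)            # months of follow-up
df["event"] = (df.time < 60).astype(int)            # event observed within 5 years?
df["time"] = df.time.clip(upper=60)

kmf = KaplanMeierFitter()
for label, grp in df.groupby(["adt", "age70plus"]):
    kmf.fit(grp.time, grp.event, label=f"ADT={label[0]}, age70+={label[1]}")
    print(label, "5-yr P(AD) ~", round(1 - kmf.survival_function_.iloc[-1, 0], 3))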
NASA Astrophysics Data System (ADS)
Krakovsky, Y. M.; Luzgin, A. N.; Mikhailova, E. A.
2018-05-01
At present, cyber-security issues associated with industrial information systems occupy one of the key niches in state management. Functional disruption of these systems by cyberattacks may lead to emergencies involving loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. When cyberattacks occur with high intensity, protection against them needs to be developed on the basis of machine learning methods. This paper examines interval forecasting and presents results for a pre-set intensity level. The interval forecasting is carried out using a probabilistic cluster model. This method forecasts which of two predetermined intervals a future value of the indicator will fall into, using probability estimates for this purpose. The dividing boundary between the intervals is determined by a calculation method based on the statistical characteristics of the indicator. The source data comprise hourly cyberattack counts collected with a honeypot from March to September 2013.
Paleolakes and lacustrine basins on Mars
NASA Technical Reports Server (NTRS)
Scott, David H.; Rice, James W., Jr.; Dohm, James M.
1991-01-01
The problems of how warm and wet Mars once was and when climate transitions may have occurred are not well understood. Mars may have had an early environment similar to Earth's that was conducive to the emergence of life. In addition, increasing geologic evidence indicates that water, upon which terrestrial life depends, has been present on Mars throughout its history. This evidence does not detract from the possibility that life may have originated on early Mars, but rather suggests that life could have developed over longer periods of time in longer lasting, more clement local environments than previously envisioned. It is suggested herein that such environments may have been provided by paleolakes, located mostly in the northern lowlands and probably ice covered. Such lakes probably would have had diverse origins. Glacial lakes may have occupied ice eroded hollows or formed in valleys obstructed by moraines or ice barriers. Unlike Earth, the Martian record of the origin and evolution of possible life may have not been erased by extensive deformation of the surface. Thus the basins that may have contained the paleolakes are potential sites for future biological, geological, and climatological study.
Prediction and control of neural responses to pulsatile electrical stimulation
NASA Astrophysics Data System (ADS)
Campbell, Luke J.; Sly, David James; O'Leary, Stephen John
2012-04-01
This paper aims to predict and control the probability of firing of a neuron in response to pulsatile electrical stimulation of the type delivered by neural prostheses such as the cochlear implant, the bionic eye or deep brain stimulation. Using the cochlear implant as a model, we developed an efficient computational model that predicts the responses of auditory nerve fibers to electrical stimulation and evaluated the model's accuracy by comparing the model output with pooled responses from a group of guinea pig auditory nerve fibers. It was found that the model accurately predicted the changes in neural firing probability over time to constant and variable amplitude electrical pulse trains, including speech-derived signals, delivered at rates up to 889 pulses s^-1. A simplified version of the model that did not incorporate adaptation was used to adaptively predict, within its limitations, the pulsatile electrical stimulus required to cause a desired response from neurons up to 250 pulses s^-1. Future stimulation strategies for cochlear implants and other neural prostheses may be enhanced using similar models that account for the way that neural responses are altered by previous stimulation.
Bergström, Richard
2011-04-01
The established market model for pharmaceutical products, as for most other products, is heavily dependent on sales volumes. Thus, it is a primary interest of the producer to sell large quantities. This may be questionable for medicinal products and probably most questionable for antibacterial remedies. For these products, treatment indications are very complex and encompass potential patient benefits, possible adverse effects in the individual patient and, uniquely for this therapeutic class, consideration of what effects the drug use will have on the future therapeutic value of the drug. This is because bacteria are sure to develop resistance. The European Federation of Pharmaceutical Industries and Associations (EFPIA) agrees with the general description of the antibacterial resistance problem and wants to participate in measures to counteract antibacterial resistance. Stakeholders should forge an alliance that will address the need for and prudent use of new antibiotics. A variety of incentives will probably have to be applied, all having in common that the financial return is separated from the use of the product. Copyright © 2011. Published by Elsevier Ltd.
Furuhashi, Tatsuhiko; Moroi, Masao; Joki, Nobuhiko; Hase, Hiroki; Masai, Hirofumi; Kunimasa, Taeko; Fukuda, Hiroshi; Sugi, Kaoru
2013-02-01
Pretest probability of coronary artery disease (CAD) facilitates diagnosis and risk stratification of CAD. Stress myocardial perfusion imaging (MPI) and chronic kidney disease (CKD) are established major predictors of cardiovascular events. However, the role of CKD in assessing the pretest probability of CAD has been unclear. This study evaluates the role of CKD in predicting cardiovascular events, taking pretest probability into account, in patients who underwent stress MPI. Patients with no history of CAD underwent stress MPI (n = 310; male = 166; age = 70; CKD = 111; low/intermediate/high pretest probability = 17/194/99) and were followed for 24 months. Cardiovascular events included cardiac death and nonfatal acute coronary syndrome. Cardiovascular events occurred in 15 of the 310 patients (4.8 %), but not in those with low pretest probability, which included 2 CKD patients. In patients with intermediate to high pretest probability (n = 293), multivariate Cox regression analysis identified only CKD [hazard ratio (HR) = 4.88; P = 0.022] and the summed stress score of stress MPI (HR = 1.50; P < 0.001) as independent and significant predictors of cardiovascular events. Cardiovascular events were not observed in patients with low pretest probability. In patients with intermediate to high pretest probability, CKD and stress MPI are independent predictors of cardiovascular events when the pretest probability of CAD is considered in patients with no history of CAD. In assessing the pretest probability of CAD, CKD might be an important factor for assessing future cardiovascular prognosis.
NASA Astrophysics Data System (ADS)
Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.
2013-05-01
Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent probabilistic earthquake calculations, often used for aftershock hazard assessment, which rest on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ≈ 1. Then, there is good agreement with the OU law with p ≈ 0.5, which indicates that the slow decay was notably significant. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
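The aftershock-statistics calculation combines the two laws named above; the sketch below integrates an Omori-Utsu rate, scales it to the target magnitude with the Gutenberg-Richter relation, and converts the expected count to a probability under a Poisson assumption. All parameter values are illustrative, not the fitted Kanto values.

# Minimal sketch of a GR + Omori-Utsu probability calculation with a Poisson
# assumption (illustrative parameters).
import numpy as np
from scipy.integrate import quad

K, c, p = 1.0, 0.1, 0.5         # Omori-Utsu parameters for M >= Mref events (assumed)
b, Mref = 1.0, 4.0              # Gutenberg-Richter b-value and reference magnitude

def expected_count(Mmin, t1, t2):
    """Expected number of events with M >= Mmin between t1 and t2 days after the mainshock."""
    rate_integral, _ = quad(lambda t: K / (t + c) ** p, t1, t2)
    return rate_integral * 10.0 ** (-b * (Mmin - Mref))

# Probability of at least one M >= 6 event in the next 3 years, starting ~1 year after.
n = expected_count(Mmin=6.0, t1=365.0, t2=365.0 + 3 * 365.0)
print(f"Expected count: {n:.2f};  P(at least one) = {1 - np.exp(-n):.2f}")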
NASA Astrophysics Data System (ADS)
Smith, L. A.
2007-12-01
We question the relevance of climate-model-based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically, we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect; nevertheless the space and time scales on which they provide decision-support-relevant information are expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics-based state-of-the-art models are expected to pass if their output is to be judged decision-support relevant. Probabilistic similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions. Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically adequate; this may help to explain the reluctance of experts to provide information on "parameter uncertainty." Probability statements about the real world are always conditioned on some information set; they may well be conditioned on "False", making them of little value to a rational decision maker. In other instances, they may be conditioned on physical assumptions not held by any of the modellers whose model output is being cast as a probability distribution. Our models will improve a great deal in the next decades, and our insight into the likely climate fifty years hence will improve: maintaining the credibility of the science and the coherence of science-based decision support, as our models improve, requires a clear statement of our current limitations. What evidence do we have that today's state-of-the-art models provide decision-relevant probability forecasts? What space and time scales do we currently have quantitative, decision-relevant information on for 2050? 2080?
NASA Technical Reports Server (NTRS)
Leonard, J. I.; Furukawa, S.; Vannordstrand, P. C.
1975-01-01
The use of automated, analytical techniques to aid medical support teams is suggested. Recommendations are presented for characterizing crew health in terms of: (1) wholebody function including physiological, psychological and performance factors; (2) a combination of critical performance indexes which consist of multiple factors of measurable parameters; (3) specific responses to low noise level stress tests; and (4) probabilities of future performance based on present and periodic examination of past performance. A concept is proposed for a computerized real time biomedical monitoring and health care system that would have the capability to integrate monitored data, detect off-nominal conditions based on current knowledge of spaceflight responses, predict future health status, and assist in diagnosis and alternative therapies. Mathematical models could play an important role in this approach, especially when operating in a real time mode. Recommendations are presented to update the present health monitoring systems in terms of recent advances in computer technology and biomedical monitoring systems.
Nojavan A, Farnaz; Qian, Song S; Paerl, Hans W; Reckhow, Kenneth H; Albright, Elizabeth A
2014-06-15
The present paper utilizes a Bayesian Belief Network (BBN) approach to intuitively present and quantify our current understanding of the complex physical, chemical, and biological processes that lead to eutrophication in an estuarine ecosystem (New River Estuary, North Carolina, USA). The model is further used to explore the effects of plausible future climatic and nutrient pollution management scenarios on water quality indicators. The BBN, through visualizing the structure of the network, facilitates knowledge communication with managers/stakeholders who might not be experts in the underlying scientific disciplines. Moreover, the developed structure of the BBN is transferable to other comparable estuaries. The BBN nodes are discretized exploring a new approach called moment matching method. The conditional probability tables of the variables are driven by a large dataset (four years). Our results show interaction among various predictors and their impact on water quality indicators. The synergistic effects caution future management actions. Copyright © 2014 Elsevier Ltd. All rights reserved.
A Corrosion Risk Assessment Model for Underground Piping
NASA Technical Reports Server (NTRS)
Datta, Koushik; Fraser, Douglas R.
2009-01-01
The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project: a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies for the underground pipes at ARC.
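A Monte Carlo combination of the three sub-models can be sketched as below; the distributions and parameter values are placeholders chosen to show the structure of the calculation, not ARC data.

# Minimal Monte Carlo sketch: corrosion rate + wrap protection + stress check
# combined into a failure probability (placeholder distributions).
import numpy as np

rng = np.random.default_rng(13)
n = 100_000
age_years = 40.0
wall0_mm  = 7.0                                           # nominal wall thickness
corr_rate = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)   # mm/yr
wrap_intact = rng.random(n) < 0.8                         # pipe-wrap protection model
effective_rate = np.where(wrap_intact, 0.3 * corr_rate, corr_rate)

wall_now = wall0_mm - effective_rate * age_years
required_wall = rng.normal(2.5, 0.3, size=n)              # pipe stress model (mm needed)

p_fail = np.mean(wall_now < required_wall)
print(f"Simulated probability of failure: {p_fail:.4f}")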
Magnetohydrodynamic modelling of solar disturbances in the interplanetary medium
NASA Astrophysics Data System (ADS)
Dryer, M.
1985-12-01
A scientifically constructed series of interplanetary magnetohydrodynamic models comprises the foundation for a composite solar-terrestrial environment model. These models, unique in the field of solar wind physics, include both 2.5D and 3D time-dependent codes that will lead to future operational status. We have also developed a geomagnetic storm forecasting strategy, referred to as the Solar Terrestrial Environment Model (STEM/2000), whereby these models would be appended in modular fashion to solar, magnetosphere, ionosphere, thermosphere, and neutral atmosphere models. We stress that these models, while still not appropriate at this date for operational use, outline a strategy or blueprint for the future. This strategy, if implemented in its essential features, offers a high probability of technology transfer from theory to operational testing within approximately a decade. It would ensure that real-time observations are used to drive physically based models, the outputs of which would be used by space environment forecasters.
Experimental Probability in Elementary School
ERIC Educational Resources Information Center
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Future probabilities of coastal floods in Finland
NASA Astrophysics Data System (ADS)
Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.
2018-04-01
Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast in the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.
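Combining a short-term variation distribution with an uncertain long-term mean sea level change amounts to adding two random variables; the Monte Carlo sketch below illustrates this with synthetic Gumbel and normal stand-ins rather than the Finnish tide-gauge distributions.

# Minimal sketch: future flood exceedance probability from short-term variations
# plus an uncertain mean sea level change (synthetic distributions).
import numpy as np

rng = np.random.default_rng(21)
n = 200_000
short_term = rng.gumbel(loc=60.0, scale=25.0, size=n)      # annual max above present MSL (cm)
msl_2100   = rng.normal(loc=30.0, scale=20.0, size=n)      # MSL change minus land uplift (cm)

level_2100 = short_term + msl_2100
threshold = 170.0                                           # a hypothetical critical flood level (cm)
print(f"P(annual max > {threshold:.0f} cm in 2100) = {np.mean(level_2100 > threshold):.4f}")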
Lingam, Manasvi
2016-06-01
In this paper, percolation theory is employed to place tentative bounds on the probability p of interstellar travel and the emergence of a civilization (or panspermia) that colonizes the entire Galaxy. The ensuing ramifications with regard to the Fermi paradox are also explored. In particular, it is suggested that the correlation function of inhabited exoplanets can be used to observationally constrain p in the near future. It is shown, by using a mathematical evolution model known as the Yule process, that the probability distribution for civilizations with a given number of colonized worlds is likely to exhibit a power-law tail. Some of the dynamical aspects of this issue, including the question of timescales and generalizations of percolation theory, are also studied. The limitations of these models, and other avenues for future inquiry, are also outlined. Key Words: Complex life; Extraterrestrial life; Panspermia; Life detection; SETI. Astrobiology 16, 418-426.
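A Yule (preferential-attachment) process of the kind invoked above is easy to simulate; the sketch below uses arbitrary parameters and shows how the distribution of colonized-world counts develops a heavy tail.

# Minimal Yule-process simulation of worlds per civilization (arbitrary parameters).
import numpy as np

rng = np.random.default_rng(16)
sizes = [1]                                   # number of colonized worlds per civilization
for _ in range(20000):
    if rng.random() < 0.2:                    # a new single-world civilization appears
        sizes.append(1)
    else:                                     # an existing one colonizes another world,
        w = np.array(sizes, dtype=float)      # chosen proportionally to its current size
        sizes[rng.choice(len(sizes), p=w / w.sum())] += 1

sizes = np.array(sizes)
for k in (1, 10, 100):
    print(f"P(size >= {k}) = {np.mean(sizes >= k):.4f}")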
Multidisciplinary hydrologic investigations at Yucca Mountain, Nevada
Dudley, William W.
1990-01-01
Future climatic conditions and tectonic processes have the potential to cause significant changes of the hydrologic system in the southern Great Basin, where a nuclear-waste repository is proposed for construction above the water table at Yucca Mountain, Nevada. Geothermal anomalies in the vicinity of Yucca Mountain probably result from the local and regional transport of heat by ground-water flow. Regionally and locally irregular patterns of hydraulic potential, local marsh and pond deposits, and calcite veins in faults and fractures probably are related principally to climatically imposed hydrologic conditions within the geologic and topographic framework. However, tectonic effects on the hydrologic system have also been proposed as the causes of these features, and existing data limitations preclude a full evaluation of these competing hypotheses. A broad program that integrates many disciplines of earth science is required in order to understand the relation of hydrology to past, present and future climates and tectonism.
An Arrival and Departure Time Predictor for Scheduling Communication in Opportunistic IoT.
Pozza, Riccardo; Georgoulas, Stylianos; Moessner, Klaus; Nati, Michele; Gluhak, Alexander; Krco, Srdjan
2016-11-04
In this article, an Arrival and Departure Time Predictor (ADTP) for scheduling communication in opportunistic Internet of Things (IoT) is presented. The proposed algorithm learns about temporal patterns of encounters between IoT devices and predicts future arrival and departure times, therefore future contact durations. By relying on such predictions, a neighbour discovery scheduler is proposed, capable of jointly optimizing discovery latency and power consumption in order to maximize communication time when contacts are expected with high probability and, at the same time, saving power when contacts are expected with low probability. A comprehensive performance evaluation with different sets of synthetic and real world traces shows that ADTP performs favourably with respect to previous state of the art. This prediction framework opens opportunities for transmission planners and schedulers optimizing not only neighbour discovery, but the entire communication process.
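The core idea of an arrival/departure-time predictor can be sketched as learning per-contact timing statistics and switching the discovery duty cycle on the predicted contact probability; the example below uses a simple normal model of arrival times with invented numbers and thresholds, not the ADTP algorithm itself.

# Minimal sketch: predict contact probability from past arrival statistics and
# choose a discovery mode accordingly (synthetic encounter history).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)
# Synthetic history: the neighbour usually arrives near 08:30 and stays ~20 min.
arrivals  = rng.normal(8.5, 0.2, size=30)           # hours of day
durations = rng.normal(20.0, 4.0, size=30)          # minutes

mu_a, sd_a = arrivals.mean(), arrivals.std(ddof=1)
mu_d = durations.mean()

def contact_probability(hour, window=0.25):
    """Probability that the next arrival falls within +/- window hours of `hour`."""
    return norm.cdf(hour + window, mu_a, sd_a) - norm.cdf(hour - window, mu_a, sd_a)

for h in (8.0, 8.5, 12.0):
    p = contact_probability(h)
    mode = "aggressive discovery" if p > 0.3 else "low-power sleep"
    print(f"{h:4.1f} h: P(contact) = {p:.2f} -> {mode}; expected duration ~{mu_d:.0f} min")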
Challenges in making a seismic hazard map for Alaska and the Aleutians
Wesson, R.L.; Boyd, O.S.; Mueller, C.S.; Frankel, A.D.; Freymueller, J.T.
2008-01-01
We present a summary of the data and analyses leading to the revision of the time-independent probabilistic seismic hazard maps of Alaska and the Aleutians. These maps represent a revision of existing maps based on newly obtained data, and reflect best current judgments about methodology and approach. They have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States, and will be proposed for adoption in future revisions to the International Building Code. We present example maps for peak ground acceleration, 0.2 s spectral amplitude (SA), and 1.0 s SA at a probability level of 2% in 50 years (annual probability of 0.000404). In this summary, we emphasize issues encountered in preparation of the maps that motivate or require future investigation and research.
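For reference, the quoted annual probability follows directly from the 2%-in-50-years exceedance level, as the short check below shows.

# Quick check: 2% exceedance probability in 50 years -> annual exceedance probability.
p50 = 0.02
annual = 1.0 - (1.0 - p50) ** (1.0 / 50.0)
print(f"Annual exceedance probability: {annual:.6f}   (return period ~{1/annual:.0f} yr)")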
Developing a probability-based model of aquifer vulnerability in an agricultural region
NASA Astrophysics Data System (ADS)
Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei
2013-04-01
Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of six parameters in the probability-based DRASTIC model were probabilistically characterized according to the parameter classification methods of selecting a maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment provided excellent insight into propagating parameter uncertainty due to limited observation data. To examine the pollutant prediction capacity of the developed probability-based DRASTIC model, medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N exceeding 0.5 mg/L, which indicates anthropogenic groundwater pollution. The analyzed results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.
Changes in Benefits of Flood Protection Standard under Climate Change
NASA Astrophysics Data System (ADS)
Lim, W. H.; Koirala, S.; Yamazaki, D.; Hirabayashi, Y.; Kanae, S.
2014-12-01
Understanding the potential risk of river flooding under future climate scenarios can help in developing risk management strategies (including mitigation and adaptation). Such analyses are typically performed at macro scales (e.g., regional, global) that climate model output can support (e.g., Hirabayashi et al., 2013; Arnell and Gosling, 2014). To understand the potential benefits of infrastructure upgrading as part of climate adaptation strategies, it is also informative to understand the potential impact of different flood protection standards (in terms of return periods) on global river flooding under climate change. In this study, we use a baseline period (forced by observed hydroclimate conditions) and CMIP5 model output (historic and future periods) to drive a global river routing model called CaMa-Flood (Yamazaki et al., 2011) and simulate river water depth at a spatial resolution of 15 arcmin x 15 arcmin. From the simulated results of the baseline period, we use the annual maxima of river water depth to fit the Gumbel distribution and prepare the return period-flood risk relationship (involving population and GDP). From the simulated results of the CMIP5 models, we also use the annual maxima of river water depth to obtain the Gumbel distribution and then estimate the exceedance probability (historic and future periods). We apply the return period-flood risk relationship to the exceedance probability and evaluate the potential risk of river flooding and the changes in the benefits of a flood protection standard (e.g., the 100-year flood of the baseline period) from the past into the future (represented by the representative concentration pathways). In this presentation, we show our preliminary results. References: Arnell, N.W., Gosling, S.N., 2014. The impact of climate change on river flood risk at the global scale. Climatic Change 122: 127-140, doi: 10.1007/s10584-014-1084-5. Hirabayashi et al., 2013. Global flood risk under climate change. Nature Climate Change 3: 816-821, doi: 10.1038/nclimate1911. Yamazaki et al., 2011. A physically based description of floodplain inundation dynamics in a global river routing model. Water Resources Research 47, W04501, doi: 10.1029/2010wr009726.
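A minimal sketch of the Gumbel step described above, not the CaMa-Flood workflow itself: fit annual maxima from a synthetic baseline series, derive the baseline 100-year depth, and evaluate its exceedance probability under a synthetic future series. All values are invented.

```python
import numpy as np
from scipy import stats

# Fit Gumbel distributions to annual maxima of river water depth and compare
# a baseline 100-year level against a hypothetical future series.
rng = np.random.default_rng(0)
baseline_maxima = stats.gumbel_r.rvs(loc=3.0, scale=0.6, size=30, random_state=rng)
future_maxima = stats.gumbel_r.rvs(loc=3.4, scale=0.7, size=30, random_state=rng)

loc_b, scale_b = stats.gumbel_r.fit(baseline_maxima)
depth_100yr = stats.gumbel_r.ppf(1 - 1 / 100, loc=loc_b, scale=scale_b)

loc_f, scale_f = stats.gumbel_r.fit(future_maxima)
p_exceed_future = stats.gumbel_r.sf(depth_100yr, loc=loc_f, scale=scale_f)
# baseline 100-year depth and its annual exceedance probability in the future series
print(depth_100yr, p_exceed_future)
```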
Engineering principles to assure compatible docking between future spacecraft of USA and USSR
NASA Technical Reports Server (NTRS)
Johnson, C. C.
1975-01-01
Working jointly, the USA and the USSR have selected an androgynous, peripheral-type docking mechanism concept. The mechanical principles inherent in the concept, the rationale supporting its selection, and the probable nature of future designs stemming from the concept are described. Operational situations just prior to docking, impact conditions, energy absorption, and the structural joining of the spacecraft are specified. Docking procedures for the Apollo-Soyuz missions are discussed.
Use of passive ambient ozone (O3) samplers in vegetation effects assessment
Krupa, S.; Nosal, M.; Peterson, D.L.
2001-01-01
A stochastic Weibull probability model was developed and verified to simulate the underlying frequency distributions of hourly ozone (O3) concentrations (exposure dynamics) using the single, weekly mean values obtained from a passive (sodium nitrite absorbent) sampler. The simulation was based on data derived from a co-located continuous monitor. Although at present the model output may be considered specific to the elevation and location of the study site, the results were extremely good. This approach to approximating O3 exposure dynamics can be extended to other sites with similar data sets and to developing a generalized understanding of stochastic O3 exposure-plant response relationships, conferring measurable benefits on the future use of passive O3 samplers in the absence of continuous monitoring.
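The following is a hedged sketch of the general idea only, with synthetic data in place of the passive-sampler and continuous-monitor records: approximate hourly O3 concentrations with a Weibull model and read off an exceedance probability.

```python
import numpy as np
from scipy import stats

# Illustrative Weibull approximation of hourly O3 exposure dynamics; the
# synthetic concentrations and the use of weibull_min are assumptions, not
# the published site-specific calibration.
rng = np.random.default_rng(1)
hourly_o3 = stats.weibull_min.rvs(c=2.2, scale=45.0, size=24 * 7, random_state=rng)  # ppb

c, loc, scale = stats.weibull_min.fit(hourly_o3, floc=0)
weekly_mean = hourly_o3.mean()                       # what a passive sampler would report
p_above_80 = stats.weibull_min.sf(80.0, c, loc=loc, scale=scale)
print(weekly_mean, c, scale, p_above_80)             # mean, shape, scale, P(O3 > 80 ppb)
```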
Lin, Chih-Tin; Meyhofer, Edgar; Kurabayashi, Katsuo
2010-01-01
Directional control of microtubule shuttles via microfabricated tracks is key to the development of controlled nanoscale mass transport by kinesin motor molecules. Here we develop and test a model to quantitatively predict the stochastic behavior of microtubule guiding when they mechanically collide with the sidewalls of lithographically patterned tracks. By taking into account appropriate probability distributions of microscopic states of the microtubule system, the model allows us to theoretically analyze the roles of collision conditions and kinesin surface densities in determining how the motion of microtubule shuttles is controlled. In addition, we experimentally observe the statistics of microtubule collision events and compare our theoretical prediction with experimental data to validate our model. The model will direct the design of future hybrid nanotechnology devices that integrate nanoscale transport systems powered by kinesin-driven molecular shuttles.
The use of a very high temperature nuclear reactor in the manufacture of synthetic fuels
NASA Technical Reports Server (NTRS)
Farbman, G. H.; Brecher, L. E.
1976-01-01
The three parts of a program directed toward creating a cost-effective nuclear hydrogen production system are described. The discussion covers the development of a very high temperature nuclear reactor (VHTR) as a nuclear heat and power source capable of producing the high temperature needed for hydrogen production and other processes; the development of a hydrogen generation process based on water decomposition, which can utilize the outputs of the VHTR and be integrated with many different ultimate hydrogen consuming processes; and the evaluation of the process applications of the nuclear hydrogen systems to assess the merits and potential payoffs. It is shown that the use of VHTR for the manufacture of synthetic fuels appears to have a very high probability of making a positive contribution to meeting the nation's energy needs in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.
Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.
Multi-Path Transportation Futures Study. Results from Phase 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phil Patterson, Phil; Singh, Margaret; Plotkin, Steve
2007-03-09
Presentation reporting Phase 1 results, 3/9/2007. Projecting the future role of advanced drivetrains and fuels in the light vehicle market is inherently difficult, given the uncertainty (and likely volatility) of future oil prices, inadequate understanding of likely consumer response to new technologies, the relative infancy of several important new technologies with inevitable future changes in their performance and costs, and the importance (and uncertainty) of future government marketplace interventions (e.g., new regulatory standards or vehicle purchase incentives). The Multi-Path Transportation Futures (MP) Study has attempted to improve our understanding of this future role by examining several scenarios of vehicle costs, fuel prices, government subsidies, and other key factors. These are projections, not forecasts, in that they try to answer a series of “what if” questions without assigning probabilities to most of the basic assumptions.
Thomassen, Henri A.; Fuller, Trevon; Asefi-Najafabady, Salvi; Shiplacoff, Julia A. G.; Mulembakani, Prime M.; Blumberg, Seth; Johnston, Sara C.; Kisalu, Neville K.; Kinkela, Timothée L.; Fair, Joseph N.; Wolfe, Nathan D.; Shongo, Robert L.; LeBreton, Matthew; Meyer, Hermann; Wright, Linda L.; Muyembe, Jean-Jacques; Buermann, Wolfgang; Okitolonda, Emile; Hensley, Lisa E.; Lloyd-Smith, James O.; Smith, Thomas B.; Rimoin, Anne W.
2013-01-01
Climate change is predicted to result in changes in the geographic ranges and local prevalence of infectious diseases, either through direct effects on the pathogen, or indirectly through range shifts in vector and reservoir species. To better understand the occurrence of monkeypox virus (MPXV), an emerging Orthopoxvirus in humans, under contemporary and future climate conditions, we used ecological niche modeling techniques in conjunction with climate and remote-sensing variables. We first created spatially explicit probability distributions of its candidate reservoir species in Africa's Congo Basin. Reservoir species distributions were subsequently used to model current and projected future distributions of human monkeypox (MPX). Results indicate that forest clearing and climate are significant driving factors of the transmission of MPX from wildlife to humans under current climate conditions. Models under contemporary climate conditions performed well, as indicated by high values for the area under the receiver operator curve (AUC), and tests on spatially randomly and non-randomly omitted test data. Future projections were made on IPCC 4th Assessment climate change scenarios for 2050 and 2080, ranging from more conservative to more aggressive, and representing the potential variation within which range shifts can be expected to occur. Future projections showed range shifts into regions where MPX has not been recorded previously. Increased suitability for MPX was predicted in eastern Democratic Republic of Congo. Models developed here are useful for identifying areas where environmental conditions may become more suitable for human MPX; targeting candidate reservoir species for future screening efforts; and prioritizing regions for future MPX surveillance efforts. PMID:23935820
NASA Astrophysics Data System (ADS)
Brown, Tristan R.
The revised Renewable Fuel Standard requires the annual blending of 16 billion gallons of cellulosic biofuel by 2022, up from zero gallons in 2009. The necessary capacity investments have been underwhelming to date, however, and little is known about the likely composition of the future cellulosic biofuel industry as a result. This dissertation develops a framework for identifying and analyzing the industry's likely future composition while also providing a possible explanation for why investment in cellulosic biofuels capacity has been low to date. The results of this dissertation indicate that few cellulosic biofuel pathways will be economically competitive with petroleum on an unsubsidized basis. Of five cellulosic biofuel pathways considered under 20-year price forecasts with volatility, only two achieve positive mean 20-year net present value (NPV) probabilities. Furthermore, recent exploitation of U.S. shale gas reserves and the subsequent fall in U.S. natural gas prices have negatively impacted the economic competitiveness of all but two of the cellulosic biofuel pathways considered; only two of the five pathways achieve substantially higher 20-year NPVs under a post-shale gas economic scenario relative to a pre-shale gas scenario. The economic competitiveness of cellulosic biofuel pathways with petroleum is reduced further when considered under price uncertainty in combination with realistic financial assumptions. This dissertation calculates pathway-specific costs of capital for five cellulosic biofuel pathway scenarios. The analysis finds that the large majority of the scenarios incur costs of capital that are substantially higher than those commonly assumed in the literature. Employing these costs of capital in a comparative techno-economic analysis (TEA) greatly reduces the mean 20-year NPVs for each pathway while increasing their 10-year probabilities of default to above 80% for all five scenarios. Finally, this dissertation quantifies the economic competitiveness of six cellulosic biofuel pathways being commercialized in eight different U.S. states under price uncertainty, utilization of pathway-specific costs of capital, and region-specific economic factors. 10-year probabilities of default in excess of 60% are calculated for all eight location scenarios considered, with default probabilities in excess of 98% calculated for seven of the eight. Negative mean 20-year NPVs are calculated for seven of the eight location scenarios.
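A toy Monte Carlo sketch of the kind of NPV-under-price-uncertainty calculation described above; all figures (capital cost, volumes, margins, volatility, discount rate) are invented and are not taken from the dissertation.

```python
import numpy as np

# Monte Carlo NPV under fuel-price uncertainty for a single hypothetical
# cellulosic biofuel plant; every number here is an illustrative assumption.
rng = np.random.default_rng(42)
n_sims, years = 10_000, 20
capital_cost = 400e6          # upfront investment, $
volume = 50e6                 # gallons per year
discount_rate = 0.10          # stand-in for a pathway-specific cost of capital

# Fuel margin follows a random walk around $0.40/gal with $0.05/gal annual steps.
margins = 0.40 + np.cumsum(rng.normal(0.0, 0.05, size=(n_sims, years)), axis=1)
cash_flows = margins * volume
discount = (1 + discount_rate) ** -np.arange(1, years + 1)
npv = cash_flows @ discount - capital_cost

print(np.mean(npv) / 1e6, "mean 20-year NPV ($M)")
print(np.mean(npv > 0), "probability of positive 20-year NPV")
```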
Wang, Junhua; Kong, Yumeng; Fu, Ting; Stipancic, Joshua
2017-01-01
This paper presents the use of the Aimsun microsimulation program to simulate vehicle violation behaviors and observe their impact on road traffic crash risk. Plugins for violations of speeding, slow driving, and abrupt stopping were developed using Aimsun's API and SDK module. A safety analysis plugin for investigating the probability of rear-end collisions was developed, and a method for analyzing collision risk is proposed. A fuzzy C-means clustering algorithm was developed to identify high-risk states in different road segments over time. Results of a simulation experiment based on the G15 Expressway in Shanghai showed that abrupt stopping had the greatest impact on increasing collision risk, and that the impact of violations increased with traffic volume. The methodology allows for the evaluation and monitoring of risks, alerting of road hazards, and identification of hotspots, and could be applied to the operations of existing facilities or the planning of future ones.
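A minimal fuzzy c-means sketch for grouping segment-by-time risk indicators into risk states; this is a generic textbook implementation, not the Aimsun plugin code, and the three synthetic features are assumptions.

```python
import numpy as np

# Generic fuzzy c-means clustering; the features (e.g., volume, speed
# variance, rear-end risk index per segment and time slice) are invented.
def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(mu, 0.3, size=(50, 3)) for mu in (0.0, 1.5, 3.0)])
centers, U = fuzzy_c_means(X, c=3)
labels = U.argmax(axis=1)                         # hard assignment to a risk state
print(centers.round(2))
```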
Kong, Yumeng; Stipancic, Joshua
2017-01-01
This paper presents the use of the Aimsun microsimulation program to simulate vehicle violation behaviors and observe their impact on road traffic crash risk. Plugins for violations of speeding, slow driving, and abrupt stopping were developed using Aimsun's API and SDK module. A safety analysis plugin for investigating the probability of rear-end collisions was developed, and a method for analyzing collision risk is proposed. A fuzzy C-means clustering algorithm was developed to identify high-risk states in different road segments over time. Results of a simulation experiment based on the G15 Expressway in Shanghai showed that abrupt stopping had the greatest impact on increasing collision risk, and that the impact of violations increased with traffic volume. The methodology allows for the evaluation and monitoring of risks, alerting of road hazards, and identification of hotspots, and could be applied to the operations of existing facilities or the planning of future ones. PMID:28886141
The Hurricane-Flood-Landslide Continuum
NASA Technical Reports Server (NTRS)
Negri, Andrew J.; Burkardt, Nina; Golden, Joseph H.; Halverson, Jeffrey B.; Huffman, George J.; Larsen, Matthew C.; McGinley, John A.; Updike, Randall G.; Verdin, James P.; Wieczorek, Gerald F.
2005-01-01
In August 2004, representatives from NOAA, NASA, the USGS, and other government agencies convened in San Juan, Puerto Rico, for a workshop to discuss a proposed research project called the Hurricane-Flood-Landslide Continuum (HFLC). The essence of the HFLC is to develop and integrate tools across disciplines to enable the issuance of regional guidance products for floods and landslides associated with major tropical rain systems, with sufficient lead time that local emergency managers can protect vulnerable populations and infrastructure. All three lead agencies are independently developing precipitation-flood-debris flow forecasting technologies, and all have a history of work on natural hazards both domestically and overseas. NOAA has the capability to provide tracking and prediction of storm rainfall, trajectory, and landfall, and is developing flood probability and magnitude capabilities. The USGS has the capability to evaluate the ambient stability of natural and man-made landforms, to assess landslide susceptibilities for those landforms, and to establish probabilities for initiation of landslides and debris flows. Additionally, the USGS has well-developed operational capacity for real-time monitoring and reporting of streamflow across distributed networks of automated gaging stations (http://water.usgs.gov/waterwatch/). NASA has the capability to provide sophisticated algorithms for satellite remote sensing of precipitation, land use, and, in the future, soil moisture. The Workshop sought to initiate discussion among the three agencies regarding their specific and highly complementary capabilities. The fundamental goal of the Workshop was to establish a framework that will leverage the strengths of each agency. Once a prototype system is developed, for example, in relatively data-rich Puerto Rico, it could be adapted for use in data-poor, low-infrastructure regions such as the Dominican Republic or Haiti. This paper provides an overview of the Workshop's goals, presentations, and recommendations with respect to the development of the HFLC.
Open-loop-feedback control of serum drug concentrations: pharmacokinetic approaches to drug therapy.
Jelliffe, R W
1983-01-01
Recent developments to optimize open-loop-feedback control of drug dosage regimens, generally applicable to pharmacokinetically oriented therapy with many drugs, involve computation of patient-individualized strategies for obtaining desired serum drug concentrations. Analyses of past therapy are performed by least squares, extended least squares, and maximum a posteriori probability Bayesian methods of fitting pharmacokinetic models to serum level data. Future possibilities for truly optimal open-loop-feedback therapy with full Bayesian methods, and conceivably for optimal closed-loop therapy in such data-poor clinical situations, are also discussed. Implementation of these various therapeutic strategies, using automated, locally controlled infusion devices, has also been achieved in prototype form.
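The least-squares fitting step mentioned above can be illustrated with a toy one-compartment pharmacokinetic model; the drug, dose, measured levels, and target trough are invented, and the published work also uses extended least squares and MAP Bayesian fitting, which are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a one-compartment IV-bolus model to measured serum levels, then choose
# a dose for a desired trough; all numerical values are illustrative.
def one_compartment(t, V, k, dose=500.0):
    """Serum concentration after an IV bolus: C(t) = (dose / V) * exp(-k t)."""
    return (dose / V) * np.exp(-k * t)

t_obs = np.array([1.0, 4.0, 8.0, 12.0])          # hours after a 500 mg dose
c_obs = np.array([11.8, 8.6, 5.3, 3.4])          # measured serum levels, mg/L

(V_hat, k_hat), _ = curve_fit(one_compartment, t_obs, c_obs, p0=[40.0, 0.1])
target_trough = 2.0                               # mg/L desired at 24 h
dose_needed = target_trough * V_hat * np.exp(k_hat * 24.0)
print(V_hat, k_hat, dose_needed)                  # fitted volume, rate constant, next dose
```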
Reducing Our Carbon Footprint: Frontiers in Climate Forecasting (LBNL Science at the Theater)
Collins, Bill [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2018-06-07
Bill Collins directs Berkeley Lab's research dedicated to atmospheric and climate science. Previously, he headed the development of one of the leading climate models used in international studies of global warming. His work has confirmed that man-made greenhouse gases are probably the main culprits of recent warming, and that future warming poses very real challenges for the environment and society. A lead author of the most recent assessment of the science of climate change by the United Nations' Intergovernmental Panel on Climate Change, Collins wants to create a new kind of climate model, one that will integrate cutting-edge climate science with accurate predictions people can use to plan their lives.
Danchenko, V G
2011-05-01
The article is devoted to the reconstruction of the medical uniforms of the Russian navy in the first third of the 18th century. It can be assumed that doctors wore, to varying degrees, the senior officer's dress, though without the braid; there were exceptions, relating to doctors aspiring to a more senior standing. A number of documents from different institutions make it possible to state with high probability that doctors of the various ranks serving in naval units had a uniform dress that largely corresponded to their position in the hierarchy of ranks and that was further developed in the years that followed.
Useful Life Prediction for Payload Carrier Hardware
NASA Technical Reports Server (NTRS)
Ben-Arieh, David
2002-01-01
The Space Shuttle has been identified for use through 2020. Payload carrier systems will be needed to support missions through the same time frame. To support the future decision-making process with reliable systems, it is necessary to analyze design integrity, identify possible sources of undesirable risk, and recognize required upgrades for carrier systems. This project analyzed the information available regarding the carriers and estimated the probability of their becoming obsolete under different scenarios. In addition, this project resulted in a plan for an improved information system that will enhance monitoring and control of the various carriers. The information collected throughout this project is presented in this report as process flows, historical records, and statistical analysis.
A new, high-resolution global mass coral bleaching database
Rickbeil, Gregory J. M.; Heron, Scott F.
2017-01-01
Episodes of mass coral bleaching have been reported in recent decades and have raised concerns about the future of coral reefs on a warming planet. Despite the efforts to enhance and coordinate coral reef monitoring within and across countries, our knowledge of the geographic extent of mass coral bleaching over the past few decades is incomplete. Existing databases, like ReefBase, are limited by the voluntary nature of contributions, geographical biases in data collection, and the variations in the spatial scale of bleaching reports. In this study, we have developed the first-ever gridded, global-scale historical coral bleaching database. First, we conducted a targeted search for bleaching reports not included in ReefBase by personally contacting scientists and divers conducting monitoring in under-reported locations and by extracting data from the literature. This search increased the number of observed bleaching reports by 79%, from 4146 to 7429. Second, we employed spatial interpolation techniques to develop annual 0.04° × 0.04° latitude-longitude global maps of the probability that bleaching occurred for 1985 through 2010. Initial results indicate that the area of coral reefs with a more likely than not (>50%) or likely (>66%) probability of bleaching was eight times higher in the second half of the assessed time period, after the 1997/1998 El Niño. The results also indicate that annual maximum Degree Heating Weeks, a measure of thermal stress, for coral reefs with a high probability of bleaching increased over time. The database will help the scientific community more accurately assess the change in the frequency of mass coral bleaching events, validate methods of predicting mass coral bleaching, and test whether coral reefs are adjusting to rising ocean temperatures. PMID:28445534
Huth, John K.; Silvis, Alexander; Moosman, Paul R.; Ford, W. Mark; Sweeten, Sara E.
2015-01-01
Many aspects of foraging and roosting habitat of Myotis leibii (Eastern Small-Footed Bat), an emergent rock roosting-obligate, are poorly described. Previous comparisons of effectiveness of acoustic sampling and mist-net captures have not included Eastern Small-Footed Bat. Habitat requirements of this species differ from congeners in the region, and it is unclear whether survey protocols developed for other species are applicable. Using data from three overlapping studies at two sampling sites in western Virginia’s central Appalachian Mountains, detection probabilities were examined for three survey methods (acoustic surveys with automated identification of calls, visual searches of rock crevices, and mist-netting) for use in the development of “best practices” for future surveys and monitoring. Observer effects were investigated using an expanded version of visual search data. Results suggested that acoustic surveys with automated call identification are not effective for documenting presence of Eastern Small-Footed Bats on talus slopes (basal detection rate of 0%) even when the species is known to be present. The broadband, high frequency echolocation calls emitted by Eastern Small-Footed Bat may be prone to attenuation by virtue of their high frequencies, and these factors, along with signal reflection, lower echolocation rates or possible misidentification to other bat species over talus slopes may all have contributed to poor acoustic survey success. Visual searches and mist-netting of emergent rock had basal detection probabilities of 91% and 75%, respectively. Success of visual searches varied among observers, but detection probability improved with practice. Additionally, visual searches were considerably more economical than mist-netting.
NASA Astrophysics Data System (ADS)
Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai
2018-05-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with univariate and bivariate hydrological risk analyses. Two catchments located on the Loess Plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The univariate and bivariate return periods, risk, and reliability in the context of uncertainty are assessed for the study regions for the purposes of flood control and sediment transport. The results indicate that, in the UCX and UCH, sedimentation triggers higher risks to the safety of local flood control systems than the event that the annual maximum flood (AMF) exceeds the design flood of downstream hydraulic structures. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges future flood mitigation measures. In addition, the results also confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
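An illustrative sketch of a copula-based joint return period, not the study's calibrated model: a Gumbel-Hougaard copula couples the marginal non-exceedance probabilities of two flood variables, and the "OR" and "AND" return periods follow from the joint CDF. The parameter values are assumed.

```python
import numpy as np

# Gumbel-Hougaard copula and the resulting joint return periods for two
# dependent flood variables (e.g., severity and peak); theta, u, v are assumed.
def gumbel_copula(u, v, theta):
    """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

theta = 2.5          # dependence between the two variables (assumed)
u = 0.99             # marginal non-exceedance probability of the first threshold
v = 0.99             # marginal non-exceedance probability of the second threshold

joint_cdf = gumbel_copula(u, v, theta)
T_or = 1.0 / (1.0 - joint_cdf)              # either variable exceeds its threshold
T_and = 1.0 / (1.0 - u - v + joint_cdf)     # both variables exceed their thresholds
print(T_or, T_and)                          # OR period < marginal 100 yr < AND period
```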
A new, high-resolution global mass coral bleaching database.
Donner, Simon D; Rickbeil, Gregory J M; Heron, Scott F
2017-01-01
Episodes of mass coral bleaching have been reported in recent decades and have raised concerns about the future of coral reefs on a warming planet. Despite the efforts to enhance and coordinate coral reef monitoring within and across countries, our knowledge of the geographic extent of mass coral bleaching over the past few decades is incomplete. Existing databases, like ReefBase, are limited by the voluntary nature of contributions, geographical biases in data collection, and the variations in the spatial scale of bleaching reports. In this study, we have developed the first-ever gridded, global-scale historical coral bleaching database. First, we conducted a targeted search for bleaching reports not included in ReefBase by personally contacting scientists and divers conducting monitoring in under-reported locations and by extracting data from the literature. This search increased the number of observed bleaching reports by 79%, from 4146 to 7429. Second, we employed spatial interpolation techniques to develop annual 0.04° × 0.04° latitude-longitude global maps of the probability that bleaching occurred for 1985 through 2010. Initial results indicate that the area of coral reefs with a more likely than not (>50%) or likely (>66%) probability of bleaching was eight times higher in the second half of the assessed time period, after the 1997/1998 El Niño. The results also indicate that annual maximum Degree Heating Weeks, a measure of thermal stress, for coral reefs with a high probability of bleaching increased over time. The database will help the scientific community more accurately assess the change in the frequency of mass coral bleaching events, validate methods of predicting mass coral bleaching, and test whether coral reefs are adjusting to rising ocean temperatures.
DeWeber, Jefferson Tyrell; Wagner, Tyler
2015-01-01
The Brook Trout Salvelinus fontinalis is an important species of conservation concern in the eastern USA. We developed a model to predict Brook Trout population status within individual stream reaches throughout the species’ native range in the eastern USA. We utilized hierarchical logistic regression with Bayesian estimation to predict Brook Trout occurrence probability, and we allowed slopes and intercepts to vary among ecological drainage units (EDUs). Model performance was similar for 7,327 training samples and 1,832 validation samples based on the area under the receiver operating curve (∼0.78) and Cohen's kappa statistic (0.44). Predicted water temperature had a strong negative effect on Brook Trout occurrence probability at the stream reach scale and was also negatively associated with the EDU average probability of Brook Trout occurrence (i.e., EDU-specific intercepts). The effect of soil permeability was positive but decreased as EDU mean soil permeability increased. Brook Trout were less likely to occur in stream reaches surrounded by agricultural or developed land cover, and an interaction suggested that agricultural land cover also resulted in an increased sensitivity to water temperature. Our model provides a further understanding of how Brook Trout are shaped by habitat characteristics in the region and yields maps of stream-reach-scale predictions, which together can be used to support ongoing conservation and management efforts. These decision support tools can be used to identify the extent of potentially suitable habitat, estimate historic habitat losses, and prioritize conservation efforts by selecting suitable stream reaches for a given action. Future work could extend the model to account for additional landscape or habitat characteristics, include biotic interactions, or estimate potential Brook Trout responses to climate and land use changes.
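A simplified, non-hierarchical analogue of the model described above, on synthetic data: plain logistic regression of occurrence on reach covariates, scored with AUC and Cohen's kappa. The published model uses Bayesian estimation with EDU-varying slopes and intercepts, which this sketch does not attempt.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, cohen_kappa_score

# Synthetic reach-scale covariates and occurrence data; coefficients are invented.
rng = np.random.default_rng(0)
n = 2000
water_temp = rng.normal(18, 3, n)            # predicted water temperature (C)
soil_perm = rng.normal(0, 1, n)              # standardized soil permeability
agriculture = rng.uniform(0, 1, n)           # proportion agricultural land cover

logit = 4.0 - 0.25 * water_temp + 0.6 * soil_perm - 1.5 * agriculture
occupied = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([water_temp, soil_perm, agriculture])
model = LogisticRegression().fit(X, occupied)
p_hat = model.predict_proba(X)[:, 1]
print(roc_auc_score(occupied, p_hat), cohen_kappa_score(occupied, p_hat > 0.5))
```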
Change-in-ratio methods for estimating population size
Udevitz, Mark S.; Pollock, Kenneth H.; McCullough, Dale R.; Barrett, Reginald H.
2002-01-01
Change-in-ratio (CIR) methods can provide an effective, low cost approach for estimating the size of wildlife populations. They rely on being able to observe changes in proportions of population subclasses that result from the removal of a known number of individuals from the population. These methods were first introduced in the 1940’s to estimate the size of populations with 2 subclasses under the assumption of equal subclass encounter probabilities. Over the next 40 years, closed population CIR models were developed to consider additional subclasses and use additional sampling periods. Models with assumptions about how encounter probabilities vary over time, rather than between subclasses, also received some attention. Recently, all of these CIR models have been shown to be special cases of a more general model. Under the general model, information from additional samples can be used to test assumptions about the encounter probabilities and to provide estimates of subclass sizes under relaxations of these assumptions. These developments have greatly extended the applicability of the methods. CIR methods are attractive because they do not require the marking of individuals, and subclass proportions often can be estimated with relatively simple sampling procedures. However, CIR methods require a carefully monitored removal of individuals from the population, and the estimates will be of poor quality unless the removals induce substantial changes in subclass proportions. In this paper, we review the state of the art for closed population estimation with CIR methods. Our emphasis is on the assumptions of CIR methods and on identifying situations where these methods are likely to be effective. We also identify some important areas for future CIR research.
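The classic two-subclass estimator behind these methods can be written in a few lines; the harvest numbers below are invented for illustration.

```python
def change_in_ratio(p1, p2, Rx, R):
    """Two-subclass change-in-ratio estimate of pre-removal population size.

    p1, p2: proportion of subclass x before and after the removal
    Rx: number of subclass-x individuals removed; R: total removals
    Assumes equal encounter probabilities for the two subclasses.
    """
    if p1 == p2:
        raise ValueError("subclass proportions must change for CIR to work")
    N1 = (Rx - R * p2) / (p1 - p2)    # pre-removal total population
    return N1, N1 - R                 # before and after the removal

# Worked example (invented numbers): harvesting 200 males from a herd that was
# 50% male shifts the observed male proportion to 37.5%.
print(change_in_ratio(p1=0.50, p2=0.375, Rx=200, R=200))  # -> (1000.0, 800.0)
```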
NASA Astrophysics Data System (ADS)
Inoue, N.
2017-12-01
The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions, and so on. Toda (2013) pointed out differences in the conditional probability between strike-slip and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture using the following procedures. Fault geometry was determined from randomly generated magnitudes based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. A logistic analysis was performed on two data sets: the surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike-slip faults, and this result agrees with previous similar studies (i.e., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). In contrast, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike-slip and reverse faults, and this trend is similar to the conditional probability in PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The probability from the combined simulated results of thrust and reverse faults is also low. The worldwide compiled reverse fault data include low-dip-angle earthquakes. On the other hand, for Japanese reverse faults, it is possible that the conditional probability of reverse faults, with fewer low-dip-angle earthquakes, is low and similar to that of strike-slip faults (i.e., Takao et al., 2013). In the future, numerical simulations considering the failure conditions of the surface caused by the source fault will be performed in order to examine the amount of displacement and the conditional probability quantitatively.
Changes in the Probability of Extreme Events: Where to Look for their Causes?
NASA Astrophysics Data System (ADS)
Groisman, P. Y.; Gulev, S. K.
2011-12-01
When wet or dry events are extraordinary and are associated with flooding, water shortages, severe vegetation stress, crop failures, property losses, and harm to human health, we call them extremes. Numerous observational studies show that in the past several decades precipitation has become more intense over most of the extra-tropics. At the same time (and often in the same regions), precipitation events may occur more or less frequently or come in sequences of prolonged no-rain and wet periods. Each extreme event that manifests itself is a consequence of individual factors that are difficult to foresee. However, when these events occur more frequently, we must admit that there are changes in the probability of their occurrence and try to estimate why this happens. For example, in attempts to project prolonged extreme events (such as droughts) in a given season, climatologists used to look for their precursors in the Earth system "memory", which includes anomalies in sea ice (SI) and snow cover extent (SCE), sea surface temperature (SST), and soil moisture, and in their patterns (e.g., the Southern Oscillation). However, the major "memory" component of the Earth system is the Earth climate system itself. It began changing (IPCC 2007) and is no longer a constant factor: SST, SI, and SCE anomalies of the past have now become "climatology", and it is time to include this new reality in our analyses of the frequency and intensity of extreme events. Furthermore, land use, urban development, industrial development, and water management keep changing landscapes, and there are good reasons to believe that regional environmental changes feed back, causing changes in the probability of extreme events in some areas. The central United States is among the regions where the strongest increase in intense rainfall in the 20th century has been documented. This raises the question of how precipitation patterns in the central US will evolve in the future: Will the recent trends toward increases in intense rainfall continue? We present and try to substantiate a hypothesis that the observed changes in characteristics of precipitation in the central US during the 20th century have been produced by interactions of local and regional land use change with global climate changes. We shall describe climatological and anthropogenic precursors of several extreme outbreaks over the northern extratropics. These precursors were waiting for their time and manifested themselves when the time became right. For example, in order to anticipate changes in the probability of future heat outbreaks over Europe (including European Russia), the factors that control prolonged summer anticyclone conditions over the region should be thoroughly monitored and skillfully projected. Apparently, anomalies and/or trends in regional mean surface air temperature and precipitation are not the best among these precursors.
The influence of communications on transportation in the future.
DOT National Transportation Integrated Search
1984-01-01
The report examines the influences of communication on transportation from various points of view. First, the historical influences are discussed as they occurred in the United States between 1776 and the present. Second, probable influences to the y...
Technical assistance report : I-73 economic impact analysis.
DOT National Transportation Integrated Search
1995-01-01
This study assessed the probable economic impact of the future Interstate 73 along each of twelve alternative corridors that were proposed for the new highway. The contents of this report were originally distributed in four parts during February and ...
NASA Astrophysics Data System (ADS)
Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.
2004-12-01
The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better than do the blockquake and Parkfield data. This provided opportunities for discussing the difference between Poisson and normal distributions, how those differences affect our estimation of future earthquake probabilities, the importance of both the mean and the standard deviation in predicting future behavior from a sequence of events, and how conditional probability is used to help seismologists predict future earthquakes given a known or theoretical distribution of past earthquakes.
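To make the Poisson comparison concrete, the following sketch (with invented event sequences rather than the actual earthquake or blockquake data) tests whether interevent times are consistent with an exponential distribution:

```python
import numpy as np
from scipy import stats

# Compare interevent times against an exponential (Poisson-process) model
# with a Kolmogorov-Smirnov test; the event sequences are synthetic.
def exponential_fit_check(event_times):
    gaps = np.diff(np.sort(event_times))
    rate = 1.0 / gaps.mean()
    ks_stat, p_value = stats.kstest(gaps, "expon", args=(0, 1.0 / rate))
    return rate, ks_stat, p_value

rng = np.random.default_rng(3)
poisson_like = np.cumsum(rng.exponential(10.0, size=200))     # memoryless sequence
quasi_periodic = np.cumsum(rng.normal(10.0, 1.0, size=200))   # gaps clustered near the mean

print(exponential_fit_check(poisson_like))     # large p-value: consistent with Poisson
print(exponential_fit_check(quasi_periodic))   # tiny p-value: not exponential
```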
Exact probability distribution function for the volatility of cumulative production
NASA Astrophysics Data System (ADS)
Zadourian, Rubina; Klümper, Andreas
2018-04-01
In this paper we study the volatility and its probability distribution function for cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Given its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting the production process.
Journy, Neige M Y; Lee, Choonsik; Harbron, Richard W; McHugh, Kieran; Pearce, Mark S; Berrington de González, Amy
2017-01-01
Background: To project risks of developing cancer and the number of cases potentially induced by past, current, and future computed tomography (CT) scans performed in the United Kingdom in individuals aged <20 years. Methods: Organ doses were estimated from surveys of individual scan parameters and CT protocols used in the United Kingdom. Frequencies of scans were estimated from the NHS Diagnostic Imaging Dataset. Excess lifetime risks (ELRs) of radiation-related cancer were calculated as cumulative lifetime risks, accounting for survival probabilities, using the RadRAT risk assessment tool. Results: In 2000–2008, ELRs ranged from 0.3 to 1 per 1000 head scans and 1 to 5 per 1000 non-head scans. ELRs per scan were reduced by 50–70% in 2000–2008 compared with 1990–1995, subsequent to dose reduction over time. The 130 750 scans performed in 2015 in the United Kingdom were projected to induce 64 (90% uncertainty interval (UI): 38–113) future cancers. Current practices would lead to about 300 (90% UI: 230–680) future cancers induced by scans performed in 2016–2020. Conclusions: Absolute excess risks from single exposures would be low compared with background risks, but even small increases in annual CT rates over the next years would substantially increase the number of potential subsequent cancers. PMID:27824812
The Projection of Space Radiation Environments with a Solar Cycle Statistical Model
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee; Cucinotta, Francis A.; Wilson, John W.
2006-01-01
A solar cycle statistical model has been developed to project sunspot numbers, which represent the variations in the space radiation environment. The resulting projections of sunspot numbers in the near future were coupled to space-related quantities of interest in radiation protection, such as the galactic cosmic radiation (GCR) deceleration potential (f) and the mean occurrence frequency of solar particle events (SPEs). Future GCR fluxes were derived from a predictive model in which the GCR temporal dependence represented by f was derived from GCR flux and ground-based Climax neutron monitor rate measurements over the last four decades. Results showed that the point dose equivalent inside a typical spacecraft in interplanetary radiation fields was influenced by solar modulation by up to a factor of three. One important characteristic of sporadic SPEs is their mean frequency of occurrence, which is dependent on solar activity. Projections of the future mean frequency of SPE occurrence were estimated from a power-law function of sunspot number. Furthermore, the cumulative probabilities of SPEs during short-period missions were defined with the continuous database of SPE proton fluences. The analytic representation of SPE energy spectra was constructed with the Weibull distribution for different event sizes. The representative exposure level at each event size was estimated to guide the design of protection systems for astronauts during future space exploration missions.
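A hedged illustration of how a mean SPE occurrence frequency translates into a cumulative probability for a short mission, assuming Poisson arrivals; the rates and mission lengths are invented, and the paper derives the rate from a power-law function of projected sunspot number.

```python
import numpy as np

# Probability of at least one solar particle event during a mission, assuming
# Poisson arrivals at a given mean annual rate; all numbers are illustrative.
def prob_at_least_one_spe(rate_per_year, mission_days):
    lam = rate_per_year * mission_days / 365.25
    return 1.0 - np.exp(-lam)

for rate in (2.0, 8.0):                       # near solar minimum vs near maximum (assumed)
    for days in (30, 180, 365):
        print(rate, days, round(prob_at_least_one_spe(rate, days), 3))
```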
NASA Astrophysics Data System (ADS)
Mereu, V.; Santini, M.; Dettori, G.; Muresu, P.; Spano, D.; Duce, P.
2009-12-01
Integrated scenarios of future climate and land use represent a useful input for impact studies of global changes. In particular, improving future land use simulations is essential for the agricultural sector, which is influenced by both biogeophysical constraints and human needs. Land use change models are often based mainly on statistical relationships between known land use distributions and biophysical or socio-economic factors, neglecting the physical constraints that interact in making land more or less capable for agriculture and suitable for supporting specific crops. In this study, a well-developed land use change model (CLUE@CMCC) was adapted to the Mediterranean basin case study, focusing on croplands. Several climate scenarios and future demands for croplands were combined to drive the model, while the same climate scenarios were used to allocate crops more reliably in the most suitable areas on the basis of land evaluation techniques. The probability for each map unit to sustain a specific crop, usually related to location characteristics, elasticity to conversion, and competition among land use types, now includes specific crop-favoring location characteristics. The results, besides improving the consistency of the land use change model in allocating land for the future, can feed back to suggest feasible or reasonable thresholds for adjusting land use demands during dynamic simulations.
Stacey, Dawn; Légaré, France; Lyddiatt, Anne; Giguere, Anik M C; Yoganathan, Manosila; Saarimaki, Anton; Pardo, Jordi Pardo; Rader, Tamara; Tugwell, Peter
2016-12-01
The purpose of this study was to translate evidence from Cochrane Reviews into a format that can be used to facilitate shared decision making during the consultation, namely patient decision aids. A systematic development process (a) established a stakeholder committee; (b) developed a prototype according to the International Patient Decision Aid Standards; (c) applied the prototype to a Cochrane Review and used an interview-guided survey to evaluate acceptability/usability; (d) created 12 consult decision aids; and (e) used a Delphi process to reach consensus on considerations for creating a consult decision aid. The 1-page prototype includes (a) a title specifying the decision; (b) information on the health condition, options, benefits/harms with probabilities; (c) an explicit values clarification exercise; and (d) questions to screen for decisional conflict. Hyperlinks provide additional information on definitions, probabilities presented graphically, and references. Fourteen Cochrane Consumer Network members and Cochrane Editorial Unit staff participated. Thirteen reported that it would help patient/clinician discussions and were willing to use and/or recommend it. Seven indicated the right amount of information, six not enough, and one too much. Changes to the prototype were more links to definitions, more white space, and details on GRADE evidence ratings. Creating 12 consult decision aids took about 4 h each. We identified ten considerations when selecting Cochrane Reviews for creating consult decision aids. Using a systematic process, we developed a consult decision aid prototype to be populated with evidence from Cochrane Reviews. It was acceptable and easy to apply. Future studies will evaluate implementation of consult decision aids.
NASA Astrophysics Data System (ADS)
Mortuza, M. R.; Demissie, Y. K.
2015-12-01
In light of the recent and anticipated more severe and frequent drought incidences in the Yakima River Basin (YRB), a reliable and comprehensive drought assessment is deemed necessary to avoid major crop production losses and to better manage water rights issues in the region during years of low precipitation and/or snow accumulation. In this study, we have conducted frequency analysis of hydrological droughts and quantified the associated uncertainty in the YRB under both historical and changing climate. The streamflow drought index (SDI) was employed to identify mutually correlated drought characteristics (e.g., severity, duration, and peak). The historical and future characteristics of drought were estimated by applying a trivariate copula probability distribution, which effectively describes the joint distribution and dependence of drought severity, duration, and peak. The associated prediction uncertainty, related to the parameters of the joint probability and to climate projections, was evaluated using a Bayesian approach with bootstrap resampling. For the climate change scenarios, two future representative concentration pathways (RCP4.5 and RCP8.5) from the University of Idaho's Multivariate Adaptive Constructed Analogs (MACA) database were considered. The results from the study are expected to provide useful information for drought risk management in the YRB under anticipated climate changes.
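A minimal sketch of a streamflow drought index as a standardized rolling cumulative-flow anomaly, with synthetic flows and an assumed aggregation window; the study's SDI and its copula-based severity-duration-peak analysis are not reproduced here.

```python
import numpy as np

# Standardized rolling cumulative streamflow as a simple drought index;
# the window, threshold, and synthetic flow series are illustrative assumptions.
def streamflow_drought_index(monthly_flow, window=6):
    """Standardize rolling cumulative flow; SDI < -1 flags moderate drought."""
    flow = np.asarray(monthly_flow, dtype=float)
    cum = np.convolve(flow, np.ones(window), mode="valid")   # rolling sums
    return (cum - cum.mean()) / cum.std()

rng = np.random.default_rng(7)
flows = rng.gamma(shape=2.0, scale=50.0, size=240)            # 20 years of monthly flow
sdi = streamflow_drought_index(flows)
drought_months = np.sum(sdi < -1.0)
print(drought_months, "of", sdi.size, "windows in at least moderate drought")
```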
Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F
2011-10-04
The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
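One common way to score such gridded forecasts is a cell-wise Poisson joint log-likelihood; the sketch below is in the spirit of the RELM evaluations but does not reproduce any participant's actual test, and the tiny forecast grids and observed counts are invented.

```python
import numpy as np
from scipy import stats

# Poisson joint log-likelihood of observed earthquake counts given a
# gridded forecast of expected counts; higher is better.
def poisson_log_likelihood(expected_counts, observed_counts):
    expected = np.clip(np.asarray(expected_counts, float), 1e-12, None)
    return stats.poisson.logpmf(observed_counts, expected).sum()

observed = np.array([0, 0, 1, 0, 2, 0, 0, 1])          # event counts in 8 cells
forecast_a = np.array([0.1, 0.1, 0.5, 0.1, 1.0, 0.1, 0.1, 0.5])
forecast_b = np.full(8, 0.5)                           # spatially uniform alternative

print(poisson_log_likelihood(forecast_a, observed))
print(poisson_log_likelihood(forecast_b, observed))    # lower score: poorer spatial skill
```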
Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.
2011-01-01
The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M≥4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M≥4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355
Paleoclimate and bubonic plague: a forewarning of future risk?
McMichael, Anthony J
2010-08-27
Pandemics of bubonic plague have occurred in Eurasia since the sixth century AD. Climatic variations in Central Asia affect the population size and activity of the plague bacterium's reservoir rodent species, influencing the probability of human infection. Using innovative time-series analysis of surrogate climate records spanning 1,500 years, a study in BMC Biology concludes that climatic fluctuations may have influenced these pandemics. This has potential implications for health risks from future climate change.
1976-05-01
subjective in nature, it provides a practical method for analyzing a mass of data, including data which can be utilized to predict probable future... nature and administered when the individual student is unable to maintain acceptable performance during the training cycle. Service-wide remedial...are directly related to the curriculum topics of recruit training. Others are of a broader nature related to general Navy problems which present a
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rastogi, Deeksha; Kao, Shih-Chieh; Ashfaq, Moetasim
Probable maximum precipitation (PMP), defined as the largest rainfall depth that could physically occur under a series of adverse atmospheric conditions, has been an important design criterion for critical infrastructures such as dams and nuclear power plants. To understand how PMP may respond to projected future climate forcings, we used a physics-based numerical weather simulation model to estimate PMP across various durations and areas over the Alabama-Coosa-Tallapoosa (ACT) river basin in the southeastern United States. Six sets of Weather Research and Forecasting (WRF) model experiments driven by both reanalysis and global climate model projections, with a total of 120 storms, were conducted. The depth-area-duration relationship was derived for each set of WRF simulations and compared with the conventional PMP estimates. Here, our results showed that PMP driven by projected future climate forcings is higher than 1981-2010 baseline values by around 20% in the 2021-2050 near-future and 44% in the 2071-2100 far-future periods. The additional sensitivity simulations of background air temperature warming also showed an enhancement of PMP, suggesting that atmospheric warming could be one important factor controlling the increase in PMP. In light of the projected increase in precipitation extremes under a warming environment, the reasonableness and role of PMP deserves more in-depth examination.
NASA Astrophysics Data System (ADS)
Rastogi, Deeksha; Kao, Shih-Chieh; Ashfaq, Moetasim; Mei, Rui; Kabela, Erik D.; Gangrade, Sudershan; Naz, Bibi S.; Preston, Benjamin L.; Singh, Nagendra; Anantharaj, Valentine G.
2017-05-01
Probable maximum precipitation (PMP), defined as the largest rainfall depth that could physically occur under a series of adverse atmospheric conditions, has been an important design criterion for critical infrastructures such as dams and nuclear power plants. To understand how PMP may respond to projected future climate forcings, we used a physics-based numerical weather simulation model to estimate PMP across various durations and areas over the Alabama-Coosa-Tallapoosa (ACT) River Basin in the southeastern United States. Six sets of Weather Research and Forecasting (WRF) model experiments driven by both reanalysis and global climate model projections, with a total of 120 storms, were conducted. The depth-area-duration relationship was derived for each set of WRF simulations and compared with the conventional PMP estimates. Our results showed that PMP driven by projected future climate forcings is higher than 1981-2010 baseline values by around 20% in the 2021-2050 near-future and 44% in the 2071-2100 far-future periods. The additional sensitivity simulations of background air temperature warming also showed an enhancement of PMP, suggesting that atmospheric warming could be one important factor controlling the increase in PMP. In light of the projected increase in precipitation extremes under a warming environment, the reasonableness and role of PMP deserve more in-depth examination.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui
Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.
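To make the decision-dependent idea concrete, here is a minimal, hypothetical sketch in which the probability of the favorable wind scenario increases with the wind capacity that is built, so the expected-cost comparison cannot treat scenario probabilities as fixed. All capacities, costs, and the probability rule are invented for illustration; the paper's actual model is a multistage mixed-integer program solved with a quasi-exact reformulation.

```python
# Minimal sketch of decision-dependent stochastic planning (illustrative only):
# the probability of the favorable wind scenario grows with the wind capacity
# invested, so decisions shape the uncertainty they face.
import numpy as np

wind_options = np.array([0, 100, 200, 300])            # MW of wind to build (hypothetical)
build_cost = 1.2                                        # M$/MW (hypothetical)
shortfall_cost = {"high_wind": 0.0, "low_wind": 2.0}    # M$/MW unmet (hypothetical)
demand = 250                                            # MW

def scenario_probability(wind_mw):
    # Decision-dependent probability: more wind investment makes the
    # high-output scenario more likely (toy rule).
    p_high = 0.4 + 0.001 * wind_mw
    return {"high_wind": p_high, "low_wind": 1.0 - p_high}

def expected_cost(wind_mw):
    probs = scenario_probability(wind_mw)
    cost = build_cost * wind_mw
    for scenario, p in probs.items():
        output = wind_mw * (0.9 if scenario == "high_wind" else 0.3)
        unmet = max(demand - output, 0.0)
        cost += p * shortfall_cost[scenario] * unmet
    return cost

best = min(wind_options, key=expected_cost)
print(best, round(expected_cost(best), 1))
```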
Assessing the vulnerability of traditional maize seed systems in Mexico to climate change.
Bellon, Mauricio R; Hodson, David; Hellin, Jon
2011-08-16
Climate change is predicted to have major impacts on small-scale farmers in Mexico whose livelihoods depend on rain-fed maize. We examined the capacity of traditional maize seed systems to provide these farmers with appropriate genetic material under predicted agro-ecological conditions associated with climate change. We studied the structure and spatial scope of seed systems of 20 communities in four transects across an altitudinal gradient from 10-2,980 m above sea level in five states of eastern Mexico. Results indicate that 90% of all seed lots are obtained within 10 km of a community and 87% within an altitudinal range of ±50 m, but with variation across four agro-climate environments: wet lowland, dry lowland, wet upper mid-altitude, and highlands. Climate models suggest a drying and warming trend for the entire study area during the main maize season, leading to substantial shifts in the spatial distribution patterns of agro-climate environments. For all communities except those in the highlands, predicted future maize environments already are represented within the 10-km radial zones, indicating that in the future farmers will have easy access to adapted planting material. Farmers in the highlands are the most vulnerable and probably will need to acquire seed from outside their traditional geographical ranges. This change in seed sources probably will entail important information costs and the development of new seed and associated social networks, including improved linkages between traditional and formal seed systems and more effective and efficient seed-supply chains. The study has implications for analogous areas elsewhere in Mexico and around the world.
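A small, hypothetical geospatial sketch of the spatial criterion used above: a seed-lot source counts as being within a community's traditional scope if it lies within 10 km and within ±50 m of altitude. Coordinates and elevations below are invented.

```python
# Illustrative sketch (hypothetical coordinates): flag seed-lot sources that fall
# within the 10 km radius and +/-50 m altitude band around a community.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in km
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

community = {"lat": 19.50, "lon": -97.10, "alt_m": 2400}      # hypothetical
seed_lots = [
    {"lat": 19.52, "lon": -97.08, "alt_m": 2430},
    {"lat": 19.80, "lon": -96.70, "alt_m": 1200},
]
for lot in seed_lots:
    near = haversine_km(community["lat"], community["lon"], lot["lat"], lot["lon"]) <= 10.0
    same_band = abs(lot["alt_m"] - community["alt_m"]) <= 50.0
    print(lot, "local source" if near and same_band else "outside traditional scope")
```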
NASA Astrophysics Data System (ADS)
Constantinescu, Robert; Robertson, Richard; Lindsay, Jan M.; Tonini, Roberto; Sandri, Laura; Rouwet, Dmitri; Smith, Patrick; Stewart, Roderick
2016-11-01
We report on the first "real-time" application of the BET_UNREST (Bayesian Event Tree for Volcanic Unrest) probabilistic model, during a VUELCO Simulation Exercise carried out on the island of Dominica, Lesser Antilles, in May 2015. Dominica has a concentration of nine potentially active volcanic centers; frequent volcanic earthquake swarms at shallow depths, intense geothermal activity, and recent phreatic explosions (1997) indicate the region is still active. The exercise scenario was developed in secret by a team of scientists from The University of the West Indies (Trinidad and Tobago) and the University of Auckland (New Zealand). The simulated unrest activity was provided to the exercise's Scientific Team in three "phases" through exercise injects comprising processed monitoring data. We applied the newly created BET_UNREST model, through its software implementation PyBetUnrest, to estimate the probabilities of having (i) unrest, of (ii) magmatic, hydrothermal, or tectonic origin, which may or may not lead to (iii) an eruption. The probabilities obtained for each simulated phase raised controversy and intense deliberations among the members of the Scientific Team. The results were often considered to be "too high" and were not included in any of the reports presented to the ODM (Office for Disaster Management), revealing interesting crisis communication challenges. We concluded that the PyBetUnrest application itself was successful and brought the tool one step closer to full implementation. However, as with any newly proposed method, it needs more testing, and, to enable its use in the future, we make a series of recommendations for future applications.
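The following sketch illustrates the general logic of a Bayesian event tree of the BET_UNREST kind, not the PyBetUnrest implementation itself: uncertain conditional probabilities at the unrest, origin, and eruption nodes are sampled from Beta distributions and multiplied along the branch of interest. The Beta parameters are invented placeholders for monitoring evidence.

```python
# Hedged sketch of an event-tree calculation in the spirit of BET_UNREST:
# conditional node probabilities are sampled from Beta distributions and
# multiplied down the branch unrest -> magmatic origin -> eruption.
import numpy as np

rng = np.random.default_rng(1)
n = 10000
# Hypothetical Beta parameters encoding monitoring evidence at each node
p_unrest            = rng.beta(8, 2, n)   # node 1: unrest given observed anomalies
p_magmatic_g_unrest = rng.beta(3, 7, n)   # node 2: magmatic (vs hydrothermal/tectonic) origin
p_erupt_g_magmatic  = rng.beta(2, 8, n)   # node 3: eruption given magmatic unrest

p_eruption = p_unrest * p_magmatic_g_unrest * p_erupt_g_magmatic
print("mean P(eruption) =", p_eruption.mean().round(3),
      "90% interval =", np.percentile(p_eruption, [5, 95]).round(3))
```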
Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean
2018-05-01
A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and inherent characteristics of the system's input material, there are a number of stochastic variables which may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling Engine (SE) power output in the range of 61.5-73 W with a high confidence interval (CI) of 95%. In addition, there is high probability (with 95% CI) that the NMT system can achieve positive net power output between 15.8 and 35 W. A sensitivity study reveals the system power performance is mostly affected by SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements on the system design and operational strategy and this probabilistic assessment framework can also be applied to similar complex engineering systems.
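A minimal sketch of the non-intrusive workflow described above, under invented physics: a handful of deterministic "process simulations" provide training data for an ANN surrogate, and Monte Carlo sampling of uncertain inputs through the surrogate yields a confidence interval for net power. The stand-in simulator, input ranges, and distributions are all assumptions, not the NMT model.

```python
# Minimal sketch: deterministic runs -> ANN surrogate -> Monte Carlo propagation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

def deterministic_simulation(heater_temp_C, feed_moisture):
    # Stand-in for the thermodynamic process model (illustrative formula only)
    return 0.12 * heater_temp_C - 40.0 * feed_moisture + 5.0

# Design points for the "expensive" simulator
X_train = np.column_stack([rng.uniform(300, 500, 40), rng.uniform(0.6, 0.8, 40)])
y_train = np.array([deterministic_simulation(t, m) for t, m in X_train])

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0),
)
surrogate.fit(X_train, y_train)

# Monte Carlo propagation of input uncertainty through the cheap surrogate
X_mc = np.column_stack([rng.normal(420, 25, 20000), rng.normal(0.7, 0.03, 20000)])
power = surrogate.predict(X_mc)
print("net power 95% interval (W):", np.percentile(power, [2.5, 97.5]).round(1))
```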
Accounting for Incomplete Species Detection in Fish Community Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Orth, Dr. Donald J; Jager, Yetta
2013-01-01
Riverine fish assemblages are heterogeneous and very difficult to characterize with a one-size-fits-all approach to sampling. Furthermore, detecting changes in fish assemblages over time requires accounting for variation in sampling designs. We present a modeling approach that permits heterogeneous sampling by accounting for site and sampling covariates (including method) in a model-based framework for estimation (versus a sampling-based framework). We snorkeled during three surveys and electrofished during a single survey in a suite of delineated habitats stratified by reach types. We developed single-species occupancy models to determine covariates influencing patch occupancy and species detection probabilities, whereas community occupancy models estimated species richness in light of incomplete detections. For most species, information-theoretic criteria showed higher support for models that included patch size and reach as covariates of occupancy. In addition, models including patch size and sampling method as covariates of detection probabilities also had higher support. Detection probability estimates for snorkeling surveys were higher for larger non-benthic species, whereas electrofishing was more effective at detecting smaller benthic species. The number of sites and sampling occasions required to accurately estimate occupancy varied among fish species. For rare benthic species, our results suggested that a higher number of occasions, and especially the addition of electrofishing, may be required to improve detection probabilities and obtain accurate occupancy estimates. Community models suggested that richness was 41% higher than the number of species actually observed and that the addition of an electrofishing survey increased estimated richness by 13%. These results can be useful to future fish assemblage monitoring efforts by informing sampling designs, such as site selection (e.g. stratifying based on patch size) and determining the effort required (e.g. number of sites versus occasions).
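As a simplified cousin of the occupancy models described above, the sketch below fits a constant-parameter single-species occupancy model, estimating occupancy probability (psi) and detection probability (p) from detection histories by maximum likelihood; it omits the site and method covariates used in the study, and the detection histories are fabricated.

```python
# Illustrative single-species occupancy model: non-detection is not treated as absence.
import numpy as np
from scipy.optimize import minimize

# Hypothetical detection histories: rows = sites, columns = survey occasions
Y = np.array([
    [1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 0, 0],
    [0, 1, 0], [0, 0, 0], [1, 0, 0], [0, 0, 1],
])

def neg_log_lik(params):
    psi, p = 1 / (1 + np.exp(-params))          # logit scale -> probabilities
    det = Y.sum(axis=1)                         # detections per site
    J = Y.shape[1]                              # occasions per site
    # Occupied-and-sometimes-detected term plus never-occupied term for all-zero histories
    lik = psi * p**det * (1 - p)**(J - det) + (det == 0) * (1 - psi)
    return -np.log(lik).sum()

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = 1 / (1 + np.exp(-res.x))
print(f"psi = {psi_hat:.2f}, p = {p_hat:.2f}")
```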
Fire and climate suitability for woody vegetation communities in the south central United States
Stroh, Esther; Struckhoff, Matthew; Stambaugh, Michael C.; Guyette, Richard P.
2018-01-01
using a physical chemistry fire frequency model. We then used the fire probability data with additional climate parameters to construct maximum entropy environmental suitability models for three south central US vegetation communities. The modeled communities included an oak type (dominated by post oak, Quercus stellata Wangenh., and blackjack oak, Q. marilandica Münchh.), a mesquite type (dominated by honey mesquite, Prosopis glandulosa Torr., and velvet mesquite, P. velutina Wooton), and a pinyon−juniper type (dominated by pinyon pine, Pinus edulis Engelm., and Utah juniper, Juniperus osteosperma [Torr.] Little). We mapped baseline and future mean fire-climate suitability using data from three global climate models for 2040 to 2069 and 2070 to 2099; we also mapped future locations of threshold conditions for which all three models agreed on suitability for each community. Future projections included northward, southward, and eastward shifts in suitable conditions for the oaks along a broad path of fire-climate stability; an overall reduction in suitable area for historic mesquite communities coupled with potential expansion to new areas; and constriction and isolation of suitable conditions for pinyon−juniper communities. The inclusion of fire probability adds an important driver of vegetation distribution to climate envelope modeling. The simple models showed good fit, but future projections failed to account for future management activities or land use changes. Results provided information on potential future de-coupling and spatial re-arrangement of environmental conditions under which these communities have historically persisted and been managed. In particular, consensus threshold maps can inform long-term planning for maintenance or restoration of these communities, and they can be used as a potential tool for other communities in fire-prone environments within the study area and beyond its borders.
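A hedged sketch of the modeling idea, with a logistic model standing in for the maximum-entropy algorithm: fire probability is combined with climate covariates to predict suitability, and a simple consensus check mimics the "all three climate models agree" threshold maps. Covariates, thresholds, and the presence rule are fabricated.

```python
# Sketch: logistic stand-in for a maximum-entropy suitability model, plus a
# consensus test across three hypothetical future climates at one grid cell.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 500
# Fabricated covariates: fire probability, mean annual temperature (C), precipitation (mm)
X = np.column_stack([rng.uniform(0, 0.2, n), rng.uniform(5, 25, n), rng.uniform(300, 1500, n)])
presence = (X[:, 0] > 0.05) & (X[:, 1] > 12) & (X[:, 2] > 600)   # toy presence rule
y = presence.astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Three hypothetical future climates at one location; "consensus" if all exceed 0.5
futures = np.array([[0.08, 18, 700], [0.06, 19, 650], [0.09, 20, 620]])
suitability = model.predict_proba(futures)[:, 1]
print("suitable under all futures:", bool((suitability > 0.5).all()))
```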
Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City
NASA Astrophysics Data System (ADS)
Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo
2014-05-01
Socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in particular in flood risk assessment. The level of future uncertainty that researchers face when dealing with problems in a future perspective with a focus on climate change is known as deep uncertainty (also known as Knightian uncertainty): nobody has experienced those changes before, our knowledge is limited to the extent that we have no notion of probabilities, and therefore consolidated risk management approaches have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or parties to decision making cannot agree on: (i) the appropriate models describing the interaction among system variables, (ii) the probability distributions to represent uncertainty about key parameters in the models, and (iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single optimal solution to the problem at hand, such as crisp estimates of the costs of damages from the natural hazards considered, but with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute robustness for optimality as the decision criterion. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes across unknown future scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that can usually be derived from observed historical data, and we therefore turn to non-statistical measures such as scenario analysis. We construct several plausible scenarios, each being a full description of what may happen in the future, based on a meaningful synthesis of parameter values with control of their correlations to maintain internal consistency. This paper aims at incorporating a set of data mining and sampling tools to assess the uncertainty of model outputs under future climatic and socio-economic changes for Dhaka City, and at providing a decision support system for robust flood management and mitigation policies. After constructing an uncertainty matrix to identify the main sources of uncertainty for Dhaka City, we identify several hazard and vulnerability maps based on future climatic and socio-economic scenarios. The vulnerability of each flood management alternative under different sets of scenarios is determined, and finally the robustness of each plausible solution considered is defined based on the above assessment.
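A minimal numerical sketch of the robustness criterion discussed above: each flood-management alternative is scored across all plausible scenarios, and the alternative with the smallest worst-case regret is preferred over any single-scenario optimum. The alternatives, scenarios, and cost figures are invented.

```python
# Sketch of a robustness (minimax-regret) comparison across plausible scenarios.
import numpy as np

alternatives = ["embankment", "early_warning", "retention_basins"]
scenarios = ["dry_low_growth", "wet_low_growth", "wet_high_growth"]
# Total expected damage + implementation cost (arbitrary units); rows = alternatives
cost = np.array([
    [40, 55, 90],
    [35, 70, 120],
    [50, 52, 75],
])

regret = cost - cost.min(axis=0)        # regret relative to the best choice in each scenario
worst_regret = regret.max(axis=1)       # each alternative's worst case over all scenarios
robust_choice = alternatives[int(worst_regret.argmin())]
print(dict(zip(alternatives, worst_regret)), "->", robust_choice)
```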
Martian Magmatic-Driven Hydrothermal Sites: Potential Sources of Energy, Water, and Life
NASA Technical Reports Server (NTRS)
Anderson, R. C.; Dohm, J. M.; Baker, V. R.; Ferris, J. C.; Hare, T. M.; Tanaka, K. L.; Klemaszewski, J. E.; Skinner, J. A.; Scott, D. H.
2000-01-01
Magmatic-driven processes and impact events dominate the geologic record of Mars. Such recorded geologic activity, coupled with significant evidence of past and present-day water/ice above and below the martian surface, indicates that hydrothermal environments certainly existed in the past and may exist today. The identification of such environments, especially long-lived magmatic-driven hydrothermal environments, provides NASA with significant target sites for future sample return missions, since they (1) could favor the development and sustenance of life, (2) may comprise a large variety of exotic mineral assemblages, and (3) could potentially contain water/ice reservoirs for future Mars-related human activities. If life developed on Mars, the fossil record would presumably be at its greatest concentration and diversity in environments where long-term energy sources and water coexisted, such as at sites where long-lived, magmatic-driven hydrothermal activity occurred. These assertions are supported by terrestrial analogs. Small, single-celled creatures (prokaryotes) are vitally important in the evolution of the Earth; these prokaryotes are environmentally tough and tolerant of environmental extremes of pH, temperature, salinity, and anoxic conditions found around hydrothermal vents. In addition, bacteria have a great ability to survive long periods of geologic time in extreme conditions, including high-temperature hydrogen sulfide and sulfur erupted from the Mount St. Helens volcano. Our team of investigators is conducting a geological investigation using multiple mission-derived datasets (e.g., existing geologic map data, MOC imagery, MOLA, TES image data, geophysical data, etc.) to identify prime target sites of hydrothermal activity for future hydrological, mineralogical, and biological investigations. The identification of these sites will enhance the probability of success for future missions to Mars.
Measures, Probability and Holography in Cosmology
NASA Astrophysics Data System (ADS)
Phillips, Daniel
This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second, related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue, using simple models, that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of classical probability theory to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation. The fourth project looks at finite universes as alternatives to multiverse theories of cosmology. We compare two holographic arguments that impose especially strong bounds on the amount of inflation. One comes from the de Sitter Equilibrium cosmology and the other from the work of Banks and Fischler. We find that simple versions of these two approaches yield the same bound on the number of e-foldings. A careful examination reveals that while these pictures are similar in spirit, they are not necessarily identical prescriptions. We apply the two pictures to specific cosmologies which expose potentially important differences and which also demonstrate ways these seemingly simple proposals can be tricky to implement in practice.
Intra-annual patterns in adult band-tailed pigeon survival estimates
Casazza, Michael L.; Coates, Peter S.; Overton, Cory T.; Howe, Kristy H.
2015-01-01
Implications: We present the first inter-seasonal analysis of survival probability of the Pacific coast race of band-tailed pigeons and illustrate important temporal patterns that may influence future species management including harvest strategies and disease monitoring.
Janssen, Stefan; Schudoma, Christian; Steger, Gerhard; Giegerich, Robert
2011-11-03
Many bioinformatics tools for RNA secondary structure analysis are based on a thermodynamic model of RNA folding. They predict a single, "optimal" structure by free energy minimization, they enumerate near-optimal structures, they compute base pair probabilities and dot plots, representative structures of different abstract shapes, or Boltzmann probabilities of structures and shapes. Although all programs refer to the same physical model, they implement it with considerable variation for different tasks, and little is known about the effects of the heuristic assumptions and model simplifications used by the programs on the outcome of the analysis. We extract four different models of the thermodynamic folding space which underlie the programs RNAFOLD, RNASHAPES, and RNASUBOPT. Their differences lie within the details of the energy model and the granularity of the folding space. We implement probabilistic shape analysis for all models, and introduce the shape probability shift as a robust measure of model similarity. Using four data sets derived from experimentally solved structures, we provide a quantitative evaluation of the model differences. We find that search space granularity affects the computed shape probabilities less than the over- or underapproximation of free energy by a simplified energy model. Still, the approximations perform similarly enough to implementations of the full model to justify their continued use in settings where computational constraints call for simpler algorithms. As an aside, we observe that the rarely used level 2 shapes, which predict the complete arrangement of helices, multiloops, internal loops and bulges, include the "true" shape in a rather small number of predicted high probability shapes. This calls for an investigation of new strategies to extract high probability members from the (very large) level 2 shape space of an RNA sequence. We provide implementations of all four models, written in a declarative style that makes them easy to modify. Based on our study, future work on thermodynamic RNA folding may make a choice of model based on our empirical data. It can take our implementations as a starting point for further program development.
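Since the abstract introduces the shape probability shift as a measure of model similarity without spelling out the formula here, the sketch below shows one plausible realization: half the summed absolute difference between the shape probability distributions of two models (i.e., a total variation distance). The shape strings and probabilities are fabricated, and the paper's exact definition may differ.

```python
# One plausible realization of a "shape probability shift" between two
# folding-space models, computed over their abstract-shape distributions.
def shape_probability_shift(model_a, model_b):
    shapes = set(model_a) | set(model_b)
    return sum(abs(model_a.get(s, 0.0) - model_b.get(s, 0.0)) for s in shapes) / 2.0

# Fabricated shape probability distributions for a full and a simplified model
model_full    = {"[]": 0.55, "[][]": 0.30, "[[]]": 0.15}
model_reduced = {"[]": 0.60, "[][]": 0.33, "[[]]": 0.07}
print(round(shape_probability_shift(model_full, model_reduced), 3))
```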
Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis
NASA Astrophysics Data System (ADS)
George, E.; Chan, K.
2012-09-01
Several relationships are developed relating object size, initial covariance, and range at closest approach to the probability of collision. These relationships address the following questions:
- Given the objects' initial covariance and combined hard body size, what is the maximum possible value of the probability of collision (Pc)?
- Given the objects' initial covariance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
- Given the objects' initial covariance and the combined hard body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit?
- Given the objects' initial covariance and the miss distance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. The application of this pre-filter to present-day catalogs with estimated covariance results in the elimination of approximately 35% of object pairs as unable to ever conjunct with a probability of collision exceeding 1×10^-6. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element set age or quality - a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals that the screening volumes for small objects are much larger than needed, while the screening volumes for pairs of large objects may be inadequate. These relationships may also form the basis of an important metric for catalog maintenance by defining the maximum allowable covariance size for effective conjunction analysis. The application of these techniques promises to greatly improve the efficiency and completeness of conjunction analysis.
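The sketch below illustrates the first relationship in the list above: for a given encounter-plane covariance and combined hard-body radius, the probability of collision is bounded by its value at zero miss distance, and that bound can be compared against a screening tolerance such as 1×10^-6. The covariance and radius values are hypothetical, and the integration is a simple grid quadrature rather than the paper's analytical development.

```python
# Sketch: maximum possible Pc for an object pair, given covariance and hard-body radius.
import numpy as np
from scipy.stats import multivariate_normal

def collision_probability(miss_vec, cov, hard_body_radius_m, n_grid=201):
    """Integrate the encounter-plane relative-position PDF over the combined hard-body disk."""
    r = hard_body_radius_m
    xs = np.linspace(-r, r, n_grid)
    dx = xs[1] - xs[0]
    X, Y = np.meshgrid(xs, xs)
    inside = X**2 + Y**2 <= r**2
    pdf = multivariate_normal(mean=miss_vec, cov=cov).pdf(np.dstack([X, Y]))
    return float((pdf * inside).sum() * dx * dx)

cov = np.diag([200.0**2, 80.0**2])   # encounter-plane covariance (m^2), hypothetical
radius = 10.0                         # combined hard-body radius (m), hypothetical

# For a fixed covariance, Pc is largest when the predicted miss distance is zero,
# so this single evaluation bounds Pc for the object pair.
pc_max = collision_probability([0.0, 0.0], cov, radius)
threshold = 1e-6
print(f"maximum possible Pc ~ {pc_max:.2e};",
      "pair can be screened out" if pc_max < threshold else "keep pair in conjunction analysis")
```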