Assessing the present and future probability of Hurricane Harvey's rainfall
NASA Astrophysics Data System (ADS)
Emanuel, Kerry
2017-11-01
We estimate, for current and future climates, the annual probability of areally averaged hurricane rain of Hurricane Harvey's magnitude by downscaling large numbers of tropical cyclones from three climate reanalyses and six climate models. For the state of Texas, we estimate that the annual probability of 500 mm of area-integrated rainfall was about 1% in the period 1981–2000 and will increase to 18% over the period 2081–2100 under Intergovernmental Panel on Climate Change (IPCC) AR5 representative concentration pathway 8.5. If the frequency of such events is increasing linearly between these two periods, then in 2017 the annual probability would be 6%, a sixfold increase since the late 20th century.
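The 2017 figure quoted above follows from simple linear interpolation between the two period estimates. The sketch below reproduces that arithmetic using the 1% and 18% values from the abstract; centring each 20-year period on its midpoint is an assumption, since the abstract does not specify anchor years.

```python
# Minimal sketch of the linear-interpolation step described in the abstract.
p_hist, year_hist = 0.01, 1990.5   # ~1% annual probability, centre of 1981-2000
p_futr, year_futr = 0.18, 2090.5   # ~18% annual probability, centre of 2081-2100

def annual_probability(year):
    """Linearly interpolate the annual exceedance probability between the two periods."""
    frac = (year - year_hist) / (year_futr - year_hist)
    return p_hist + frac * (p_futr - p_hist)

print(round(annual_probability(2017), 3))  # ~0.055, i.e. roughly the 6% quoted for 2017
```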
Assessing the present and future probability of Hurricane Harvey's rainfall.
Emanuel, Kerry
2017-11-28
We estimate, for current and future climates, the annual probability of areally averaged hurricane rain of Hurricane Harvey's magnitude by downscaling large numbers of tropical cyclones from three climate reanalyses and six climate models. For the state of Texas, we estimate that the annual probability of 500 mm of area-integrated rainfall was about 1% in the period 1981-2000 and will increase to 18% over the period 2081-2100 under Intergovernmental Panel on Climate Change (IPCC) AR5 representative concentration pathway 8.5. If the frequency of such events is increasing linearly between these two periods, then in 2017 the annual probability would be 6%, a sixfold increase since the late 20th century.
Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick
2012-01-01
Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving more confidence in the estimate, and we use those data to calculate the annual probability of a large eruption in the next year at 1.4×10⁻⁵.
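For the exponential (memoryless) recurrence model named above, the annual eruption probability follows directly from the mean repose interval. The sketch below illustrates that calculation with made-up interval data, not the published Lassen chronology.

```python
# Hedged sketch: annual eruption probability under an exponential recurrence model.
# The repose intervals below are placeholders, not the Lassen Volcanic Center data.
import numpy as np

intervals_yr = np.array([9000.0, 5000.0, 8000.0, 6500.0, 7200.0])  # hypothetical repose times
rate = 1.0 / intervals_yr.mean()            # events per year (MLE for an exponential model)
p_next_year = 1.0 - np.exp(-rate * 1.0)     # ~= rate when the rate is small

print(f"annual eruption probability ~ {p_next_year:.1e}")  # ~1.4e-4 for these intervals
```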
Comonotonic bounds on the survival probabilities in the Lee-Carter model for mortality projection
NASA Astrophysics Data System (ADS)
Denuit, Michel; Dhaene, Jan
2007-06-01
In the Lee-Carter framework, future survival probabilities are random variables with an intricate distribution function. In large homogeneous portfolios of life annuities, value-at-risk or conditional tail expectation of the total yearly payout of the company are approximately equal to the corresponding quantities involving random survival probabilities. This paper aims to derive some bounds in the increasing convex (or stop-loss) sense on these random survival probabilities. These bounds are obtained with the help of comonotonic upper and lower bounds on sums of correlated random variables.
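As an illustration of the comonotonic upper bound invoked above: a sum of dependent risks is dominated, in the convex (stop-loss) order, by the sum of the marginal quantile functions evaluated at a single common uniform. The Monte Carlo sketch below uses two illustrative lognormal risks with identical marginals, not the Lee-Carter survival probabilities of the paper.

```python
# Hedged sketch of the comonotonic upper bound on a sum of dependent risks.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sim = 100_000
rho = 0.6  # correlation of the underlying normals

# Two positively correlated lognormal risks with identical marginals.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_sim)
s_actual = np.exp(0.1 + 0.3 * z).sum(axis=1)

# Comonotonic bound: both marginals driven by the same uniform U, so the sum is 2 * F^{-1}(U).
u = rng.uniform(size=n_sim)
s_comono = 2.0 * np.exp(0.1 + 0.3 * stats.norm.ppf(u))

for q in (0.95, 0.995):
    print(q, round(np.quantile(s_actual, q), 3), round(np.quantile(s_comono, q), 3))
# The comonotonic sum has the heavier upper tail, as the convex-order bound requires.
```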
Oddo, Perry C.; Keller, Klaus
2017-01-01
Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies. PMID:28350884
Ruckert, Kelsey L; Oddo, Perry C; Keller, Klaus
2017-01-01
Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.
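The effect described in both records above, that integrating over SLR uncertainty raises the estimated 100-yr flood height relative to using the mean SLR projection, can be reproduced with a toy Monte Carlo. The surge and SLR distributions below are illustrative stand-ins, not the study's calibrated models.

```python
# Hedged sketch: the 1% annual exceedance level with and without sea-level-rise uncertainty.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200_000

surge = stats.gumbel_r.rvs(loc=1.0, scale=0.25, size=n, random_state=rng)   # annual max surge (m)
slr = stats.lognorm.rvs(s=0.6, scale=0.8, size=n, random_state=rng)         # uncertain SLR (m)

flood_mean_slr = surge + slr.mean()   # certainty-equivalent: add the mean SLR projection
flood_uncertain = surge + slr         # full uncertainty: pair each surge with a random SLR draw

aep = 0.01                            # 1% annual exceedance probability (the "100-yr" level)
print("return level, mean SLR:     ", round(np.quantile(flood_mean_slr, 1 - aep), 2))
print("return level, uncertain SLR:", round(np.quantile(flood_uncertain, 1 - aep), 2))
# The second value is larger: ignoring SLR uncertainty understates the design flood height.
```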
Looking to the Future: Will Behavior Analysis Survive and Prosper?
ERIC Educational Resources Information Center
Poling, Alan
2010-01-01
Behavior analysis as a discipline currently is doing relatively well. How it will do in the future is unclear and depends on how the field, and the world at large, changes. Five current characteristics of the discipline that appear to reduce the probability that it will survive and prosper are discussed and suggestions for improvement are offered.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui
Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.
Miladinovic, Branko; Kumar, Ambuj; Mhaskar, Rahul; Djulbegovic, Benjamin
2014-10-21
We sought to understand how often 'breakthroughs,' that is, treatments that significantly improve health outcomes, can be developed. We applied weighted adaptive kernel density estimation to construct the probability density function for observed treatment effects from five publicly funded cohorts and one privately funded group. 820 trials involving 1064 comparisons and enrolling 331,004 patients were conducted by five publicly funded cooperative groups. 40 cancer trials involving 50 comparisons and enrolling a total of 19,889 patients were conducted by GlaxoSmithKline. We calculated that the probability of detecting a treatment with large effects is 10% (5-25%), and that the probability of detecting a treatment with very large effects is 2% (0.3-10%). Researchers themselves judged that they discovered a new, breakthrough intervention in 16% of trials. We propose these figures as the benchmarks against which future development of 'breakthrough' treatments should be measured.
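A minimal sketch of the density-estimation step named above: a weighted Gaussian kernel density over observed treatment effects, followed by the tail probability of a "large" effect. The hazard-ratio data, the weights, and the HR < 0.7 cutoff are all placeholders, not the trial cohorts or definitions used in the paper.

```python
# Hedged sketch: weighted kernel density estimate of treatment effects and a tail probability.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
hazard_ratios = rng.lognormal(mean=-0.05, sigma=0.2, size=500)  # hypothetical observed effects
weights = rng.integers(50, 1000, size=500).astype(float)        # e.g. trial sizes as weights
weights /= weights.sum()

kde = gaussian_kde(hazard_ratios, weights=weights)
p_large = kde.integrate_box_1d(0.0, 0.7)   # P(HR < 0.7), i.e. a "large" treatment benefit
print(f"estimated probability of a large effect ~ {p_large:.1%}")
```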
Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.
2010-01-01
Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will expand or contract species ranges or extirpate species. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.
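The classification-tree approach described above can be sketched as follows; the predictors, training data, and the +3°C perturbation are synthetic stand-ins for the watershed data and scenarios used in the study.

```python
# Hedged sketch: a classification tree giving the probability of fish presence for a reach,
# evaluated for current conditions and for a warmer-water scenario. Data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
n = 1000
water_temp = rng.normal(18, 4, n)        # July water temperature (deg C)
forest_cover = rng.uniform(0, 1, n)      # fraction of catchment forested
# Synthetic "truth": a coldwater species favours cool, forested reaches.
p_true = 1 / (1 + np.exp(0.8 * (water_temp - 17) - 2 * forest_cover))
presence = (rng.uniform(size=n) < p_true).astype(int)

tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=50)
tree.fit(np.column_stack([water_temp, forest_cover]), presence)

reach_now = np.array([[16.0, 0.7]])                 # a cool, forested reach today
reach_warm = reach_now + np.array([[3.0, 0.0]])     # same reach with +3 deg C water temperature
print("P(presence) today :", tree.predict_proba(reach_now)[0, 1])
print("P(presence) +3degC:", tree.predict_proba(reach_warm)[0, 1])
```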
NASA Astrophysics Data System (ADS)
Cocco, M.
2001-12-01
Earthquake stress changes can promote failures on favorably oriented faults and modify the seismicity pattern over broad regions around the causative faults. Because the induced stress perturbations modify the rate of production of earthquakes, they alter the probability of seismic events in a specified time window. Comparing the Coulomb stress changes with the seismicity rate changes and aftershock patterns can statistically test the role of stress transfer in earthquake occurrence. The interaction probability may represent a further tool to test the stress trigger or shadow model. The probability model, which incorporates stress transfer, has the main advantage of including the contributions of the induced stress perturbation (a static step in its present formulation), the loading rate, and the fault constitutive properties. Because the mechanical conditions of the secondary faults at the time of application of the induced load are largely unknown, stress triggering can only be tested on fault populations and not on single earthquake pairs with a specified time delay. The interaction probability can represent the most suitable tool to test the interaction between large magnitude earthquakes. Despite these important implications and the stimulating perspectives, there exist problems in understanding earthquake interaction that should motivate future research but at the same time limit its immediate social applications. One major limitation is that we are unable to predict how and whether the induced stress perturbations modify the ratio between small and large magnitude earthquakes. In other words, we cannot distinguish between a change in this ratio in favor of small events or of large magnitude earthquakes, because the interaction probability is independent of magnitude. Another problem concerns the reconstruction of the stressing history. The interaction probability model is based on the response to a static step; however, we know that other processes contribute to the stressing history perturbing the faults (such as dynamic stress changes, and post-seismic stress changes caused by viscoelastic relaxation or fluid flow). If, for instance, we believe that dynamic stress changes can trigger aftershocks or earthquakes years after the passing of the seismic waves through the fault, the perspective of calculating interaction probability is untenable. It is therefore clear that we have learned a great deal about earthquake interaction by incorporating fault constitutive properties, resolving existing controversies but leaving open questions for future research.
Drought forecasting in Luanhe River basin involving climatic indices
NASA Astrophysics Data System (ADS)
Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.
2017-11-01
Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model that forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of a multivariate normal distribution so that two large-scale climatic indices can be incorporated simultaneously, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI are based on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation, in general, was more likely to be normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. Then, the controlling climatic indices for every gauge are selected by Pearson correlation test, and the multivariate normality of the current-month SPI, the corresponding climatic indices, and the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models that forecast transition probabilities from a current to a future SPI class. The results show that the three proposed models outperform the two traditional models and that incorporating large-scale climatic indices improves forecasting accuracy.
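The conditioning step behind those transition probabilities can be sketched with standard Gaussian formulas: treat (current SPI, the two climatic indices, future SPI) as jointly normal, condition the future SPI on the observed values, and integrate the conditional normal over each SPI class. The correlation matrix and observed values below are illustrative, not fitted to the Luanhe gauges.

```python
# Hedged sketch: transition probabilities to future SPI classes from a conditional Gaussian.
import numpy as np
from scipy.stats import norm

# Variable order: [SPI_now, climatic index 1, climatic index 2, SPI_future]; illustrative values.
cov = np.array([
    [1.00, 0.30, 0.20, 0.50],
    [0.30, 1.00, 0.10, 0.35],
    [0.20, 0.10, 1.00, 0.25],
    [0.50, 0.35, 0.25, 1.00],
])
mu = np.zeros(4)
obs = np.array([-1.2, 0.8, -0.5])   # observed SPI_now and the two indices this month

# Conditional distribution of SPI_future given the observed block.
S11, s12 = cov[:3, :3], cov[:3, 3]
mu_cond = mu[3] + s12 @ np.linalg.solve(S11, obs - mu[:3])
sd_cond = np.sqrt(cov[3, 3] - s12 @ np.linalg.solve(S11, s12))

classes = {"severe or worse": (-np.inf, -1.5), "moderate": (-1.5, -1.0),
           "mild": (-1.0, 0.0), "non-drought": (0.0, np.inf)}
for name, (lo, hi) in classes.items():
    p = norm.cdf(hi, mu_cond, sd_cond) - norm.cdf(lo, mu_cond, sd_cond)
    print(f"P(future SPI class = {name:15s}) = {p:.2f}")
```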
Last of the Monumental Book Catalogs.
ERIC Educational Resources Information Center
Welsh, William J.
1981-01-01
Reviews the history of the National Union Catalog and the publication of the Pre-1956 Imprints. The roles of the ALA and Mansell Publishing in the completion of what is probably the last large-scale nonautomated bibliographic project, editorial problems, and the role of automation in future projects are discussed. (JL)
Bush, Peter W.; Johnston, Richard H.
1988-01-01
A considerable area of the Floridan aquifer system remains where large ground-water supplies may be developed. This area is largely inland from the coasts and characterized by high transmissivity and minimal development prior to the early 1980's. The major constraint on future development probably is degradation of water quality rather than water-quantity limitations.
Projected status of the Pacific walrus (Odobenus rosmarus divergens) in the twenty-first century
Jay, Chadwick V.; Marcot, Bruce G.; Douglas, David C.
2011-01-01
Extensive and rapid losses of sea ice in the Arctic have raised conservation concerns for the Pacific walrus (Odobenus rosmarus divergens), a large pinniped inhabiting arctic and subarctic continental shelf waters of the Chukchi and Bering seas. We developed a Bayesian network model to integrate potential effects of changing environmental conditions and anthropogenic stressors on the future status of the Pacific walrus population at four periods through the twenty-first century. The model framework allowed for inclusion of various sources and levels of knowledge, and representation of structural and parameter uncertainties. Walrus outcome probabilities through the century reflected a clear trend of worsening conditions for the subspecies. From the current observation period to the end of century, the greatest change in walrus outcome probabilities was a progressive decrease in the outcome state of robust and a concomitant increase in the outcome state of vulnerable. The probabilities of rare and extirpated states each progressively increased but remained <10% through the end of the century. The summed probabilities of vulnerable, rare, and extirpated (P(v,r,e)) increased from a current level of 10% in 2004 to 22% by 2050 and 40% by 2095. The degree of uncertainty in walrus outcomes increased monotonically over future periods. In the model, sea ice habitat (particularly for summer/fall) and harvest levels had the greatest influence on future population outcomes. Other potential stressors had much smaller influences on walrus outcomes, mostly because of uncertainty in their future states and our current poor understanding of their mechanistic influence on walrus abundance.
Early warning of climate tipping points
NASA Astrophysics Data System (ADS)
Lenton, Timothy M.
2011-07-01
A climate 'tipping point' occurs when a small change in forcing triggers a strongly nonlinear response in the internal dynamics of part of the climate system, qualitatively changing its future state. Human-induced climate change could push several large-scale 'tipping elements' past a tipping point. Candidates include irreversible melt of the Greenland ice sheet, dieback of the Amazon rainforest and shift of the West African monsoon. Recent assessments give an increased probability of future tipping events, and the corresponding impacts are estimated to be large, making them significant risks. Recent work shows that early warning of an approaching climate tipping point is possible in principle, and could have considerable value in reducing the risk that they pose.
NASA Astrophysics Data System (ADS)
Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.
2017-06-01
In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
NASA Astrophysics Data System (ADS)
Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram
2018-03-01
We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences of specific pattern characteristics. We expect that the analysis methodology can be applied for defect probability modeling as well as general process qualification in the future.
Pallister, J.S.; Hoblitt, R.P.; Crandell, D.R.; Mullineaux, D.R.
1992-01-01
Available geophysical and geologic data provide a simplified model of the current magmatic plumbing system of Mount St. Helens (MSH). This model and new geochemical data are the basis for the revised hazards assessment presented here. The assessment is weighted by the style of eruptions and the chemistry of magmas erupted during the past 500 years, the interval for which the most detailed stratigraphic and geochemical data are available. This interval includes the Kalama (A.D. 1480-1770s?), Goat Rocks (A.D. 1800-1857), and current eruptive periods. In each of these periods, silica content decreased, then increased. The Kalama is a large-amplitude chemical cycle (SiO2: 57%-67%), produced by mixing of arc dacite, which is depleted in high field-strength and incompatible elements, with enriched (OIB-like) basalt. The Goat Rocks and current cycles are of small amplitude (SiO2: 61%-64% and 62%-65%) and are related to the fluid dynamics of magma withdrawal from a zoned reservoir. The cyclic behavior is used to forecast future activity. The 1980-1986 chemical cycle, and consequently the current eruptive period, appears to be virtually complete. This inference is supported by the progressively decreasing volumes and volatile contents of magma erupted since 1980, both changes that suggest a decreasing potential for a major explosive eruption in the near future. However, recent changes in seismicity and a series of small gas-release explosions (beginning in late 1989 and accompanied by eruption of a minor fraction of relatively low-silica tephra on 6 January and 5 November 1990) suggest that the current eruptive period may continue to produce small explosions and that a small amount of magma may still be present within the conduit. The gas-release explosions occur without warning and pose a continuing hazard, especially in the crater area. An eruption as large as or larger than that of 18 May 1980 (≈0.5 km³ dense-rock equivalent) probably will occur only if magma rises from an inferred deep (≈7 km), relatively large (5-7 km³) reservoir. A conservative approach to hazard assessment is to assume that this deep magma is rich in volatiles and capable of erupting explosively to produce voluminous fall deposits and pyroclastic flows. Warning of such an eruption is expectable, however, because magma ascent would probably be accompanied by shallow seismicity that could be detected by the existing seismic-monitoring system. A future large-volume eruption (≥0.1 km³) is virtually certain; the eruptive history of the past 500 years indicates the probability of a large explosive eruption is at least 1% annually. Intervals between large eruptions at Mount St. Helens have varied widely; consequently, we cannot confidently forecast whether the next large eruption will be years, decades, or farther in the future. However, we can forecast the types of hazards, and the areas that will be most affected by future large-volume eruptions, as well as hazards associated with the approaching end of the current eruptive period.
ERIC Educational Resources Information Center
Djerassi, Carl
1972-01-01
Manipulation of genes in human beings on a large scale is not possible under present conditions because it lacks economic potential and other attractions for industry. However, "preventive" genetic engineering may be a field for vast research in the future and will perhaps be approved by governments, parishes, people and industry. (PS)
Andrew Youngblood; Kerry L. Metlen; Eric E. Knapp; Kenneth W. Outcalt; Scott L. Stephens; Thomas A. Waldrop; Daniel Yaussy
2005-01-01
Many fire-dependent forests today are denser, contain fewer large trees, have higher fuel loads, and greater fuel continuity than occurred under historical fire regimes. These conditions increase the probability of unnaturally severe wildfires. Silviculturists are increasingly being asked to design fuel reduction treatments to help protect existing and future forest...
NASA Astrophysics Data System (ADS)
Kaneko, Yoshihiro; Wallace, Laura M.; Hamling, Ian J.; Gerstenberger, Matthew C.
2018-05-01
Slow slip events (SSEs) have been documented in subduction zones worldwide, yet their implications for future earthquake occurrence are not well understood. Here we develop a relatively simple, simulation-based method for estimating the probability of megathrust earthquakes following tectonic events that induce any transient stress perturbations. This method has been applied to the locked Hikurangi megathrust (New Zealand) surrounded on all sides by the 2016 Kaikoura earthquake and SSEs. Our models indicate the annual probability of a M≥7.8 earthquake over 1 year after the Kaikoura earthquake increases by 1.3-18 times relative to the pre-Kaikoura probability, and the absolute probability is in the range of 0.6-7%. We find that probabilities of a large earthquake are mainly controlled by the ratio of the total stressing rate induced by all nearby tectonic sources to the mean stress drop of earthquakes. Our method can be applied to evaluate the potential for triggering a megathrust earthquake following SSEs in other subduction zones.
Neural dynamics of reward probability coding: a Magnetoencephalographic study in humans
Thomas, Julie; Vanni-Mercier, Giovanna; Dreher, Jean-Claude
2013-01-01
Prediction of future rewards and discrepancy between actual and expected outcomes (prediction error) are crucial signals for adaptive behavior. In humans, a number of fMRI studies demonstrated that reward probability modulates these two signals in a large brain network. Yet, the spatio-temporal dynamics underlying the neural coding of reward probability remains unknown. Here, using magnetoencephalography, we investigated the neural dynamics of prediction and reward prediction error computations while subjects learned to associate cues of slot machines with monetary rewards with different probabilities. We showed that event-related magnetic fields (ERFs) arising from the visual cortex coded the expected reward value 155 ms after the cue, demonstrating that reward value signals emerge early in the visual stream. Moreover, a prediction error was reflected in ERF peaking 300 ms after the rewarded outcome and showing decreasing amplitude with higher reward probability. This prediction error signal was generated in a network including the anterior and posterior cingulate cortex. These findings pinpoint the spatio-temporal characteristics underlying reward probability coding. Together, our results provide insights into the neural dynamics underlying the ability to learn probabilistic stimuli-reward contingencies. PMID:24302894
Moxie matters: associations of future orientation with active life expectancy.
Laditka, Sarah B; Laditka, James N
2017-10-01
Being oriented toward the future has been associated with better future health. We studied associations of future orientation with life expectancy and the percentage of life with disability. We used the Panel Study of Income Dynamics (n = 5249). Participants' average age in 1968 was 33.0. Six questions repeatedly measured future orientation, 1968-1976. Seven waves (1999-2011, 33,331 person-years) measured disability in activities of daily living for the same individuals, whose average age in 1999 was 64.0. We estimated monthly probabilities of disability and death with multinomial logistic Markov models adjusted for age, sex, race/ethnicity, childhood health, and education. Using the probabilities, we created large populations with microsimulation, measuring disability in each month for each individual, age 55 through death. Life expectancy from age 55 for white men with high future orientation was age 77.6 (95% confidence interval 75.5-79.0), 6.9% (4.9-7.2) of those years with disability; results with low future orientation were 73.6 (72.2-75.4) and 9.6% (7.7-10.7). Comparable results for African American men were 74.8 (72.9-75.3), 8.1 (5.6-9.3), 71.0 (69.6-72.8), and 11.3 (9.1-11.7). For women, there were no significant differences associated with levels of future orientation for life expectancy. For white women with high future orientation 9.1% of remaining life from age 55 was disabled (6.3-9.9), compared to 12.4% (10.2-13.2) with low future orientation. Disability results for African American women were similar but statistically significant only at age 80 and over. High future orientation during early to middle adult ages may be associated with better health in older age.
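The microsimulation step described above can be sketched with a three-state monthly Markov chain (non-disabled, disabled, dead); tracking each simulated person from age 55 yields life expectancy and the share of remaining life spent disabled. The transition probabilities here are invented for illustration, not the study's fitted multinomial-logistic estimates, and are held constant with age for simplicity.

```python
# Hedged sketch: Markov microsimulation of disability and death from age 55.
import numpy as np

rng = np.random.default_rng(4)
# Monthly transition probabilities, rows = from-state [healthy, disabled, dead].
P = np.array([
    [0.9935, 0.0040, 0.0025],
    [0.0100, 0.9700, 0.0200],
    [0.0000, 0.0000, 1.0000],   # dead is absorbing
])

n_people, max_months = 100_000, 70 * 12    # follow everyone from age 55 to at most 125
state = np.zeros(n_people, dtype=int)      # all start non-disabled
months_alive = np.zeros(n_people)
months_disabled = np.zeros(n_people)

for _ in range(max_months):
    alive = state != 2
    if not alive.any():
        break
    months_alive[alive] += 1
    months_disabled[state == 1] += 1
    # Draw next state for every person at once from the cumulative row probabilities.
    cum = P[state].cumsum(axis=1)
    state = (rng.random(n_people)[:, None] > cum).sum(axis=1)

print(f"life expectancy ~ {55 + months_alive.mean() / 12:.1f} years; "
      f"{100 * months_disabled.sum() / months_alive.sum():.1f}% of remaining life disabled")
```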
ERIC Educational Resources Information Center
Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.; Miech, Richard A.
2014-01-01
Substance use is a leading cause of preventable morbidity and mortality, and is in large part why people in the U.S. have the highest probability among industrialized nations of dying by age 50. Substance use deserves our sustained attention. It is also an important determinant of many social ills including child and spouse abuse, violence more…
Training in Small Business Retailing: Testing Human Capital Theory.
ERIC Educational Resources Information Center
Barcala, Marta Fernandez; Perez, Maria Jose Sanzo; Gutierrez, Juan Antonio Trespalacios
1999-01-01
Looks at four models of training demand: (1) probability of attending training in the near future; (2) probability of having attended training in the past; (3) probability of being willing to follow multimedia and correspondence courses; and (4) probability of repeating the experience of attending another training course in the near future.…
Large-Scale Controls and Characteristics of Fire Activity in Central Chile, 2001-2015
NASA Astrophysics Data System (ADS)
McWethy, D. B.; Pauchard, A.; García, R.; Holz, A.; González, M.; Veblen, T. T.; Stahl, J.
2016-12-01
In recent decades, fire activity has increased in many ecosystems worldwide, even where fuel conditions and natural ignitions historically limited fire activity, and this increase begs questions of whether climate change, land-use change, and/or altered vegetation are responsible. Increased frequency of large fires in these settings has been attributed to drier-than-average summers and longer fire seasons as well as fuel accumulation related to ENSO events, raising concerns about the trajectory of post-fire vegetation dynamics and future fire regimes. In temperate and Mediterranean forests of central Chile, recent large fires associated with altered ecosystems, climate variability and land-use change highlight the risk and hazard of increasing fire activity yet the causes and consequences are poorly understood. To better understand characteristics of recent fire activity, key drivers of fire occurrence and the spatial probability of wildfire we examined the relationship between fire activity derived from MODIS satellite imagery and biophysical, land-cover and land-use variables. The probability of fire occurrence and annual area burned was best predicted by seasonal precipitation, annual temperature and land cover type. The likelihood of fire occurrence was greatest in Matorral shrublands, agricultural lands (including pasture lands) and Pinus and Eucalyptus plantations, highlighting the importance of vegetation type and fuel flammability as a critical control on fire activity. Our results suggest that land-use change responsible for the widespread presence of highly flammable vegetation and projections for continued warming and drying will likely combine to promote the occurrence of large fires in central Chile in the future.
Western North Pacific Tropical Cyclone Model Tracks in Present and Future Climates
NASA Astrophysics Data System (ADS)
Nakamura, Jennifer; Camargo, Suzana J.; Sobel, Adam H.; Henderson, Naomi; Emanuel, Kerry A.; Kumar, Arun; LaRow, Timothy E.; Murakami, Hiroyuki; Roberts, Malcolm J.; Scoccimarro, Enrico; Vidale, Pier Luigi; Wang, Hui; Wehner, Michael F.; Zhao, Ming
2017-09-01
Western North Pacific tropical cyclone (TC) model tracks are analyzed in two large multimodel ensembles, spanning a large variety of models and multiple future climate scenarios. Two methodologies are used to synthesize the properties of TC tracks in this large data set: cluster analysis and mass moment ellipses. First, the models' TC tracks are compared to observed TC tracks' characteristics, and a subset of the models is chosen for analysis, based on the tracks' similarity to observations and sample size. Potential changes in track types in a warming climate are identified by comparing the kernel smoothed probability distributions of various track variables in historical and future scenarios using a Kolmogorov-Smirnov significance test. Two track changes are identified. The first is a statistically significant increase in the north-south expansion, which can also be viewed as a poleward shift, as TC tracks are prevented from expanding equatorward due to the weak Coriolis force near the equator. The second change is an eastward shift in the storm tracks that occur near the central Pacific in one of the multimodel ensembles, indicating a possible increase in the occurrence of storms near Hawaii in a warming climate. The dependence of the results on which model and future scenario are considered emphasizes the necessity of including multiple models and scenarios when considering future changes in TC characteristics.
Potential economic benefits of adapting agricultural production systems to future climate change
Fagre, Daniel B.; Pederson, Gregory; Bengtson, Lindsey E.; Prato, Tony; Qui, Zeyuan; Williams, Jimmie R.
2010-01-01
Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960–2005) and future climate period (2006–2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to future climate change is advantageous (i.e., NFI with adaptation is superior to NFI without adaptation based on SERF), in six of the nine cases in which adaptation is advantageous, NFI with adaptation in the future climate period is inferior to NFI in the historical climate period. Therefore, adaptation of APSs to future climate change in Flathead Valley is insufficient to offset the adverse impacts on NFI of such change.
Potential economic benefits of adapting agricultural production systems to future climate change.
Prato, Tony; Zeyuan, Qiu; Pederson, Gregory; Fagre, Dan; Bengtson, Lindsey E; Williams, Jimmy R
2010-03-01
Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960-2005) and future climate period (2006-2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to future climate change is advantageous (i.e., NFI with adaptation is superior to NFI without adaptation based on SERF), in six of the nine cases in which adaptation is advantageous, NFI with adaptation in the future climate period is inferior to NFI in the historical climate period. Therefore, adaptation of APSs to future climate change in Flathead Valley is insufficient to offset the adverse impacts on NFI of such change.
How Unusual were Hurricane Harvey's Rains?
NASA Astrophysics Data System (ADS)
Emanuel, K.
2017-12-01
We apply an advanced technique for hurricane risk assessment to evaluate the probability of hurricane rainfall of Harvey's magnitude. The technique embeds a detailed computational hurricane model in the large-scale conditions represented by climate reanalyses and by climate models. We simulate 3700 hurricane events affecting the state of Texas, from each of three climate reanalyses spanning the period 1980-2016, and 2000 events from each of six climate models for each of two periods: the period 1981-2000 from historical simulations, and the period 2081-2100 from future simulations under Representative Concentration Pathway (RCP) 8.5. On the basis of these simulations, we estimate that hurricane rain of Harvey's magnitude in the state of Texas would have had an annual probability of 0.01 in the late twentieth century, and will have an annual probability of 0.18 by the end of this century, with remarkably small scatter among the six climate models downscaled. If the event frequency is changing linearly over time, this would yield an annual probability of 0.06 in 2017.
A Bayesian predictive two-stage design for phase II clinical trials.
Sambucini, Valeria
2008-04-15
In this paper, we propose a Bayesian two-stage design for phase II clinical trials, which represents a predictive version of the single threshold design (STD) recently introduced by Tan and Machin. The STD two-stage sample sizes are determined specifying a minimum threshold for the posterior probability that the true response rate exceeds a pre-specified target value and assuming that the observed response rate is slightly higher than the target. Unlike the STD, we do not refer to a fixed experimental outcome, but take into account the uncertainty about future data. In both stages, the design aims to control the probability of getting a large posterior probability that the true response rate exceeds the target value. Such a probability is expressed in terms of prior predictive distributions of the data. The performance of the design is based on the distinction between analysis and design priors, recently introduced in the literature. The properties of the method are studied when all the design parameters vary.
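The two Bayesian ingredients the design rests on can be sketched with a conjugate Beta-Binomial model: the posterior probability that the true response rate exceeds a target, and the prior predictive probability that a planned stage will produce data giving a "large" posterior probability. The prior, target, threshold, and sample size below are illustrative choices, not the paper's.

```python
# Hedged sketch: posterior and prior-predictive quantities for a Beta-Binomial phase II stage.
from scipy import stats

a0, b0 = 1.0, 1.0       # Beta prior on the response rate
p_target = 0.20         # pre-specified target response rate
n1 = 20                 # planned stage-1 sample size
threshold = 0.90        # required posterior probability that the rate exceeds the target

def posterior_prob_exceeds(successes, n):
    """P(response rate > p_target | data) under the conjugate Beta posterior."""
    return 1.0 - stats.beta.cdf(p_target, a0 + successes, b0 + n - successes)

# Prior predictive (Beta-Binomial) probability that stage 1 reaches the posterior threshold.
predictive = stats.betabinom(n1, a0, b0)
p_go = sum(predictive.pmf(x) for x in range(n1 + 1)
           if posterior_prob_exceeds(x, n1) >= threshold)
print(f"prior predictive P(stage-1 'go') = {p_go:.2f}")
```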
Scaling properties and universality of first-passage-time probabilities in financial markets
NASA Astrophysics Data System (ADS)
Perelló, Josep; Gutiérrez-Roig, Mario; Masoliver, Jaume
2011-12-01
Financial markets provide an ideal frame for the study of crossing or first-passage time events of non-Gaussian correlated dynamics, mainly because large data sets are available. Tick-by-tick data of six futures markets are herein considered, resulting in fat-tailed first-passage time probabilities. The scaling of the return with its standard deviation collapses the probabilities of all markets examined—and also for different time horizons—into single curves, suggesting that first-passage statistics is market independent (at least for high-frequency data). On the other hand, a very closely related quantity, the survival probability, shows, away from the center and tails of the distribution, a hyperbolic t^(-1/2) decay typical of Markovian dynamics, despite the existence of memory in markets. Modifications of the Weibull and Student distributions are good candidates for the phenomenological description of first-passage time properties under certain regimes. The scaling strategies shown may be useful for risk control and algorithmic trading.
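The t^(-1/2) survival decay mentioned above is easy to see in a toy simulation of a memoryless (Gaussian random walk) price process; real tick returns would replace the simulated ones, and the threshold is an arbitrary illustrative choice.

```python
# Hedged sketch: first-passage survival probability of a Gaussian random walk, showing
# the roughly constant S(t) * sqrt(t) expected from a t^(-1/2) decay.
import numpy as np

rng = np.random.default_rng(5)
n_paths, n_steps = 10_000, 1_600
level = 1.0   # crossing threshold, in units of the one-step standard deviation

paths = rng.standard_normal((n_paths, n_steps)).cumsum(axis=1)
hit = paths >= level
first_passage = np.where(hit.any(axis=1), hit.argmax(axis=1), n_steps)

for t in (100, 400, 1200):
    surv = (first_passage > t).mean()      # probability the level is still uncrossed at step t
    print(f"S({t}) = {surv:.3f}   S(t)*sqrt(t) = {surv * np.sqrt(t):.2f}")
```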
Global Pyrogeography: the Current and Future Distribution of Wildfire
Krawchuk, Meg A.; Moritz, Max A.; Parisien, Marc-André; Van Dorn, Jeff; Hayhoe, Katharine
2009-01-01
Climate change is expected to alter the geographic distribution of wildfire, a complex abiotic process that responds to a variety of spatial and environmental gradients. How future climate change may alter global wildfire activity, however, is still largely unknown. As a first step to quantifying potential change in global wildfire, we present a multivariate quantification of environmental drivers for the observed, current distribution of vegetation fires using statistical models of the relationship between fire activity and resources to burn, climate conditions, human influence, and lightning flash rates at a coarse spatiotemporal resolution (100 km, over one decade). We then demonstrate how these statistical models can be used to project future changes in global fire patterns, highlighting regional hotspots of change in fire probabilities under future climate conditions as simulated by a global climate model. Based on current conditions, our results illustrate how the availability of resources to burn and climate conditions conducive to combustion jointly determine why some parts of the world are fire-prone and others are fire-free. In contrast to any expectation that global warming should necessarily result in more fire, we find that regional increases in fire probabilities may be counter-balanced by decreases at other locations, due to the interplay of temperature and precipitation variables. Despite this net balance, our models predict substantial invasion and retreat of fire across large portions of the globe. These changes could have important effects on terrestrial ecosystems since alteration in fire activity may occur quite rapidly, generating ever more complex environmental challenges for species dispersing and adjusting to new climate conditions. Our findings highlight the potential for widespread impacts of climate change on wildfire, suggesting severely altered fire regimes and the need for more explicit inclusion of fire in research on global vegetation-climate change dynamics and conservation planning. PMID:19352494
ERIC Educational Resources Information Center
Miech, Richard A.; Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.
2015-01-01
Substance use is a leading cause of preventable morbidity and mortality, and it is in large part why people in the U.S. have the highest probability among 17 high-income nations of dying by age 50. Substance use is also an important contributor to many social ills including child and spouse abuse, violence more generally, theft, suicide, and more;…
De Brigard, Felipe; Giovanello, Kelly S
2012-09-01
Recent findings suggest that our capacity to imagine the future depends on our capacity to remember the past. However, the extent to which episodic memory is involved in our capacity to think about what could have happened in our past, yet did not occur (i.e., episodic counterfactual thinking), remains largely unexplored. The current experiments investigate the phenomenological characteristics and the influence of outcome valence on the experience of past, future and counterfactual thoughts. Participants were asked to mentally simulate past, future, and counterfactual events with positive or negative outcomes. Features of their subjective experiences during each type of simulation were measured using questionnaires and autobiographical interviews. The results suggest that clarity and vividness were higher for past than future and counterfactual simulations. Additionally, emotional intensity was lower for counterfactual simulations than past and future simulations. Finally, outcome valence influenced participants' judgment of probability for future and counterfactual simulations.
Water supply and management concepts
Leopold, Luna Bergere
1965-01-01
If I had to cite one fact about water in the United States which would be not only the most important but also the most informative, the one I would choose would be this: Over 50 percent of all the water presently being used in the United States is used by industry, and nearly all of that is used for cooling. The large amount of attention recently being given to water shortage and the expected rapid increase in demand for water is probably to some extent clouded because there are certain simple facts about water availability and water use which, though readily available, are not generally either known or understood. Probably most people react to information in the public press about present and possible future water shortages with the thought that it is going to be more difficult in the future to supply the ordinary household with water for drinking, washing, and the culinary arts. As a matter of fact that may be true to some extent, but it is not the salient aspect.
Space Radiation Risk Assessment for Future Lunar Missions
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Ponomarev, Artem; Atwell, Bill; Cucinotta, Francis A.
2007-01-01
For lunar exploration mission design, radiation risk assessments require the understanding of future space radiation environments in support of resource management decisions, operational planning, and a go/no-go decision. The future GCR flux was estimated as a function of interplanetary deceleration potential, which was coupled with the estimated neutron monitor rate from the Climax monitor using a statistical model. A probability distribution function for solar particle event (SPE) occurrence was formed from proton fluence measurements of SPEs that occurred during the past 5 solar cycles (19-23). Large proton SPEs identified from impulsive nitrate enhancements in polar ice, for which the fluences are greater than 2×10⁹ protons/cm² for energies greater than 30 MeV, were also combined to extend the probability calculation to high levels of proton fluence. The probability with which any given proton fluence level of an SPE will be exceeded during a space mission of defined duration was then calculated. Analytic energy spectra of SPEs at different ranks of the integral fluences were constructed over broad energy ranges extending out to GeV, and representative exposure levels were analyzed at those fluences. For the development of an integrated strategy for radiation protection on lunar exploration missions, effective doses at various points inside a spacecraft were calculated with detailed geometry models representing proposed transfer vehicle and habitat concepts. Preliminary radiation risk assessments from SPE and GCR were compared for various configuration concepts of radiation shelter in exploratory-class spacecraft.
Assessing changes in failure probability of dams in a changing climate
NASA Astrophysics Data System (ADS)
Mallakpour, I.; AghaKouchak, A.; Moftakhari, H.; Ragno, E.
2017-12-01
Dams are crucial infrastructures and provide resilience against hydrometeorological extremes (e.g., droughts and floods). In 2017, California experienced series of flooding events terminating a 5-year drought, and leading to incidents such as structural failure of Oroville Dam's spillway. Because of large socioeconomic repercussions of such incidents, it is of paramount importance to evaluate dam failure risks associated with projected shifts in the streamflow regime. This becomes even more important as the current procedures for design of hydraulic structures (e.g., dams, bridges, spillways) are based on the so-called stationary assumption. Yet, changes in climate are anticipated to result in changes in statistics of river flow (e.g., more extreme floods) and possibly increasing the failure probability of already aging dams. Here, we examine changes in discharge under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. In this study, we used routed daily streamflow data from ten global climate models (GCMs) in order to investigate possible climate-induced changes in streamflow in northern California. Our results show that while the average flow does not show a significant change, extreme floods are projected to increase in the future. Using the extreme value theory, we estimate changes in the return periods of 50-year and 100-year floods in the current and future climates. Finally, we use the historical and future return periods to quantify changes in failure probability of dams in a warming climate.
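The extreme-value step named above can be sketched by fitting a GEV distribution to annual maximum flows for a historical and a projected period and comparing the 50-yr and 100-yr return levels; the synthetic annual maxima below stand in for the routed GCM streamflow used in the study.

```python
# Hedged sketch: GEV-based return levels for annual maximum streamflow, historical vs future.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
hist_maxima = stats.genextreme.rvs(c=-0.1, loc=1000, scale=300, size=50, random_state=rng)
futr_maxima = stats.genextreme.rvs(c=-0.2, loc=1100, scale=380, size=50, random_state=rng)

def return_levels(annual_maxima, return_periods=(50, 100)):
    c, loc, scale = stats.genextreme.fit(annual_maxima)
    # A T-year return level is the flow exceeded with probability 1/T in any given year.
    return {T: round(float(stats.genextreme.ppf(1 - 1 / T, c, loc, scale)))
            for T in return_periods}

print("historical return levels:", return_levels(hist_maxima))
print("future return levels:    ", return_levels(futr_maxima))
# Equivalently, the return period of the historical 100-yr flow shortens in the future climate.
```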
TORNADO-WARNING PERFORMANCE IN THE PAST AND FUTURE: A Perspective from Signal Detection Theory.
NASA Astrophysics Data System (ADS)
Brooks, Harold E.
2004-06-01
Changes over the years in tornado-warning performance in the United States can be modeled from the perspective of signal detection theory. From this view, it can be seen that there have been distinct periods of change in performance, most likely associated with deployment of radars, and changes in scientific understanding and training. The model also makes it clear that improvements in the false alarm ratio can only occur at the cost of large decreases in the probability of detection, or with large improvements in the overall quality of the warning system.
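The trade-off described above can be made concrete with the standard 2x2 warning verification counts; the numbers here are invented for illustration and are not the NWS verification record.

```python
# Hedged sketch: probability of detection (POD) and false alarm ratio (FAR) from warning counts,
# and how tightening the warning criterion trades FAR against POD.
hits, misses, false_alarms = 800, 400, 1200             # hypothetical warning outcomes

pod = hits / (hits + misses)                             # fraction of tornadoes that were warned
far = false_alarms / (hits + false_alarms)               # fraction of warnings with no tornado
print(f"current criterion : POD = {pod:.2f}, FAR = {far:.2f}")

# A stricter criterion removes many false alarms but also converts some hits into misses.
hits2, false_alarms2 = 600, 700
misses2 = misses + (hits - hits2)
print(f"stricter criterion: POD = {hits2 / (hits2 + misses2):.2f}, "
      f"FAR = {false_alarms2 / (hits2 + false_alarms2):.2f}")
```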
The Everett-Wheeler interpretation and the open future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudbery, Anthony
2011-03-28
I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.
Speirs, Calandra; Huang, Vivian; Konnert, Candace
2017-09-01
Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.
Estimating trends in alligator populations from nightlight survey data
Fujisaki, Ikuko; Mazzotti, Frank J.; Dorazio, Robert M.; Rice, Kenneth G.; Cherkiss, Michael; Jeffery, Brian
2011-01-01
Nightlight surveys are commonly used to evaluate the status and trends of crocodilian populations, but imperfect detection caused by survey- and location-specific factors makes it difficult to draw accurate population inferences from uncorrected data. We used a two-stage hierarchical model comprising population abundance and detection probability to examine recent abundance trends of American alligators (Alligator mississippiensis) in subareas of Everglades wetlands in Florida using nightlight survey data. During 2001–2008, there were declining trends in abundance of small- and/or medium-sized animals in a majority of subareas, whereas abundance of large-sized animals showed either an increasing or an unclear trend. For the small and large size classes, estimated detection probability declined as water depth increased. Detection probability of small animals was much lower than for larger size classes. The declining trend of smaller alligators may reflect a natural population response to the fluctuating environment of Everglades wetlands under modified hydrology. It may have negative implications for the future of alligator populations in this region, particularly if habitat conditions do not favor recruitment of offspring in the near term. Our study provides a foundation to improve inferences made from nightlight surveys of other crocodilian populations.
Influence of internal variability on population exposure to hydroclimatic changes
NASA Astrophysics Data System (ADS)
Mankin, Justin S.; Viviroli, Daniel; Mekonnen, Mesfin M.; Hoekstra, Arjen Y.; Horton, Radley M.; E Smerdon, Jason; Diffenbaugh, Noah S.
2017-04-01
Future freshwater supply, human water demand, and people’s exposure to water stress are subject to multiple sources of uncertainty, including unknown future pathways of fossil fuel and water consumption, and ‘irreducible’ uncertainty arising from internal climate system variability. Such internal variability can conceal forced hydroclimatic changes on multi-decadal timescales and near-continental spatial-scales. Using three projections of population growth, a large ensemble from a single Earth system model, and assuming stationary per capita water consumption, we quantify the likelihoods of future population exposure to increased hydroclimatic deficits, which we define as the average duration and magnitude by which evapotranspiration exceeds precipitation in a basin. We calculate that by 2060, ~31%-35% of the global population will be exposed to >50% probability of hydroclimatic deficit increases that exceed existing hydrological storage, with up to 9% of people exposed to >90% probability. However, internal variability, which is an irreducible uncertainty in climate model predictions that is under-sampled in water resource projections, creates substantial uncertainty in predicted exposure: ~86%-91% of people will reside where irreducible uncertainty spans the potential for both increases and decreases in sub-annual water deficits. In one population scenario, changes in exposure to large hydroclimate deficits vary from -3% to +6% of global population, a range arising entirely from internal variability. The uncertainty in risk arising from irreducible uncertainty in the precise pattern of hydroclimatic change, which is typically conflated with other uncertainties in projections, is critical for climate risk management that seeks to optimize adaptations that are robust to the full set of potential real-world outcomes.
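A toy numpy sketch, under simplified assumptions, of the exposure calculation described above: for each basin, the probability that the deficit increase exceeds existing storage is taken as the fraction of ensemble members in which that happens, and population in basins where that probability passes a threshold is summed. All arrays are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, n_basins = 40, 1000

# Ensemble of projected increases in sub-annual water deficit per basin (mm),
# the hydrological storage available to buffer it (mm), and basin population.
deficit_increase = rng.normal(20.0, 60.0, size=(n_members, n_basins))
storage = rng.uniform(10.0, 80.0, size=n_basins)
population = rng.integers(10_000, 5_000_000, size=n_basins).astype(float)

# Per-basin probability (ensemble fraction) that the deficit increase exceeds storage.
p_exceed = (deficit_increase > storage).mean(axis=0)

exposed_50 = population[p_exceed > 0.5].sum() / population.sum()
exposed_90 = population[p_exceed > 0.9].sum() / population.sum()
print(f"population share exposed to >50% probability: {exposed_50:.1%}; >90%: {exposed_90:.1%}")
```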
Carry-over effects of the social environment on future divorce probability in a wild bird population
Culina, Antica; Hinde, Camilla A.; Sheldon, Ben C.
2015-01-01
Initial mate choice and re-mating strategies (infidelity and divorce) influence individual fitness. Both of these should be influenced by the social environment, which determines the number and availability of potential partners. While most studies looking at this relationship take a population-level approach, individual-level responses to variation in the social environment remain largely unstudied. Here, we explore carry-over effects on future mating decisions of the social environment in which the initial mating decision occurred. Using detailed data on the winter social networks of great tits, we tested whether the probability of subsequent divorce, a year later, could be predicted by measures of the social environment at the time of pairing. We found that males that had a lower proportion of female associates, and whose partner ranked lower among these, as well as inexperienced breeders, were more likely to divorce after breeding. We found no evidence that a female's social environment influenced the probability of divorce. Our findings highlight the importance of the social environment that individuals experience during initial pair formation on later pairing outcomes, and demonstrate that such effects can be delayed. Exploring these extended effects of the social environment can yield valuable insights into processes and selective pressures acting upon the mating strategies that individuals adopt. PMID:26468239
Extreme weather and experience influence reproduction in an endangered bird
Reichert, Brian E.; Cattau, Christopher E.; Fletcher, Robert J.; Kendall, William L.; Kitchens, Wiley M.
2012-01-01
Using a 14-year time series spanning large variation in climatic conditions and the entirety of a population's breeding range, we estimated the effects of extreme weather conditions (drought) on the state-specific probabilities of breeding and survival of an endangered bird, the Florida Snail Kite (Rostrhamus sociabilis plumbeus). Our analysis accounted for uncertainty in breeding status assignment, a common source of uncertainty that is often ignored when states are based on field observations. Breeding probabilities in adult kites (>1 year of age) decreased during droughts, whereas the probability of breeding in young kites (1 year of age) tended to increase. Individuals attempting to breed showed no evidence of reduced future survival. Although population viability analyses of this species and other species often implicitly assume that all adults will attempt to breed, we find that breeding probabilities were significantly <1 for all 13 estimable years considered. Our results suggest that experience is an important factor determining whether or not individuals attempt to breed during harsh environmental conditions and that reproductive effort may be constrained by an individual's quality and/or despotic behavior among individuals attempting to breed.
Future equivalent of 2010 Russian heatwave intensified by weakening soil moisture constraints
NASA Astrophysics Data System (ADS)
Rasmijn, L. M.; van der Schrier, G.; Bintanja, R.; Barkmeijer, J.; Sterl, A.; Hazeleger, W.
2018-05-01
The 2010 heatwave in eastern Europe and Russia ranks among the hottest events ever recorded in the region [1,2]. The excessive summer warmth was related to an anomalously widespread and intense quasi-stationary anticyclonic circulation anomaly over western Russia, reinforced by depletion of spring soil moisture [1,3-5]. At present, high soil moisture levels and strong surface evaporation generally tend to cap maximum summer temperatures [6-8], but these constraints may weaken under future warming [9,10]. Here, we use a data assimilation technique in which future climate model simulations are nudged to realistically represent the persistence and strength of the 2010 blocked atmospheric flow. In the future, synoptically driven extreme warming under favourable large-scale atmospheric conditions will no longer be suppressed by abundant soil moisture, leading to a disproportional intensification of future heatwaves. This implies that future mid-latitude heatwaves analogous to the 2010 event will become even more extreme than previously thought, with temperature extremes increasing by 8.4 °C over western Russia. Thus, the socioeconomic impacts of future heatwaves will probably be amplified beyond current estimates.
Impacts of savanna trees on forage quality for a large African herbivore
De Kroon, Hans; Prins, Herbert H. T.
2008-01-01
Recently, the cover of large trees in African savannas has rapidly declined due to elephant pressure, frequent fires and charcoal production. The reduction in large trees could have consequences for large herbivores through a change in forage quality. In Tarangire National Park, in northern Tanzania, we studied the impact of large savanna trees on forage quality for wildebeest by collecting samples of dominant grass species in open grassland and under and around large Acacia tortilis trees. Grasses growing under trees had a much higher forage quality than grasses from the open field, indicated by a more favourable leaf/stem ratio and higher protein and lower fibre concentrations. Analysing the grass leaf data with a linear programming model indicated that large savanna trees could be essential for the survival of wildebeest, the dominant herbivore in Tarangire. Due to the high fibre content and low nutrient and protein concentrations of grasses from the open field, maximum fibre intake is reached before nutrient requirements are satisfied. All requirements can only be satisfied by combining forage from open grassland with forage from either under or around tree canopies. Forage quality was also higher around dead trees than in the open field, so forage quality does not decline immediately after trees die, which explains why the negative effects of reduced tree numbers probably go unnoticed at first. In conclusion, our results suggest that continued destruction of large trees could affect future numbers of large herbivores in African savannas and that better protection of large trees is probably necessary to sustain high animal densities in these ecosystems. PMID:18309522
Steam explosions, earthquakes, and volcanic eruptions -- what's in Yellowstone's future?
Lowenstern, Jacob B.; Christiansen, Robert L.; Smith, Robert B.; Morgan, Lisa A.; Heasler, Henry
2005-01-01
Yellowstone, one of the world's largest active volcanic systems, has produced several giant volcanic eruptions in the past few million years, as well as many smaller eruptions and steam explosions. Although no eruptions of lava or volcanic ash have occurred for many thousands of years, future eruptions are likely. In the next few hundred years, hazards will most probably be limited to ongoing geyser and hot-spring activity, occasional steam explosions, and moderate to large earthquakes. To better understand Yellowstone's volcano and earthquake hazards and to help protect the public, the U.S. Geological Survey, the University of Utah, and Yellowstone National Park formed the Yellowstone Volcano Observatory, which continuously monitors activity in the region.
NASA Astrophysics Data System (ADS)
Butler, G. V.
1981-04-01
Early space station designs are considered, taking into account Herman Oberth's first space station, the London Daily Mail Study, the first major space station design developed during the moon mission, and the Manned Orbiting Laboratory Program of DOD. Attention is given to Skylab, new space station studies, the Shuttle and Spacelab, communication satellites, solar power satellites, a 30 meter diameter radiometer for geological measurements and agricultural assessments, the mining of the Moon, and questions of international cooperation. It is thought to be very probable that there will be very large space stations at some time in the future. However, for the more immediate future a step-by-step development that will start with Spacelab stations of 3-4 men is envisaged.
NASA Astrophysics Data System (ADS)
Keyser, A.; Westerling, A. L.; Jones, G.; Peery, M. Z.
2017-12-01
Sierra Nevada forests have experienced an increase in very large fires with significant areas of high burn severity, such as the Rim (2013) and King (2014) fires, that have impacted habitat of endangered species such as the California spotted owl. In order to support land managers' forest management planning and risk assessment activities, we used wildfire histories from the Monitoring Trends in Burn Severity project and gridded hydroclimate and land surface characteristics data to develop statistical models that simulate the frequency, location and extent of high-severity burned area in Sierra Nevada forest wildfires as functions of climate and land surface characteristics. We define high severity here as BA90 area: the area comprising patches with ninety percent or more basal area killed within a larger fire. We developed a system of statistical models to characterize the probability of large fire occurrence, the probability of significant BA90 area being present given a large fire, and the total extent of BA90 area in a fire on a 1/16 degree lat/lon grid over the Sierra Nevada. Repeated draws from binomial and generalized Pareto distributions using these probabilities generated a library of simulated histories of high-severity fire for a range of near-term (50 yr) future climate and fuels management scenarios. Fuels management scenarios were provided by USFS Region 5. Simulated BA90 area was then downscaled to 30 m resolution using a statistical model developed with Random Forest techniques to estimate the probability of adjacent 30 m pixels burning with ninety percent basal area kill as a function of fire size and vegetation and topographic features. The result is a library of simulated high-resolution maps of BA90 burned areas for a range of climate and fuels management scenarios, with which we estimated conditional probabilities of owl nesting sites being impacted by high-severity wildfire.
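A minimal sketch of the simulation scheme described above, using assumed parameters rather than the authors' fitted models: per cell-year, draw large-fire occurrence from a binomial, the presence of significant BA90 area from a second binomial, and the BA90 extent from a generalized Pareto distribution.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
n_years, n_cells = 50, 500
p_large_fire = 0.02                # per cell-year probability of a large fire (placeholder)
p_ba90_given_fire = 0.6            # probability of significant BA90 area given a large fire (placeholder)
gpd_shape, gpd_scale = 0.3, 200.0  # hypothetical generalized Pareto parameters for BA90 extent (ha)

fire = rng.binomial(1, p_large_fire, size=(n_years, n_cells))
ba90_present = fire * rng.binomial(1, p_ba90_given_fire, size=fire.shape)
ba90_area = ba90_present * genpareto.rvs(gpd_shape, scale=gpd_scale,
                                         size=fire.shape, random_state=rng)

print("simulated mean annual BA90 area (ha):", ba90_area.sum(axis=1).mean().round(1))
```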
A historical analysis of Plinian unrest and the key promoters of explosive activity.
NASA Astrophysics Data System (ADS)
Winson, A. E. G.; Newhall, C. G.; Costa, F.
2015-12-01
Plinian eruptions are the largest historically recorded volcanic phenomena and have the potential to be widely destructive. Yet when a volcano becomes newly restless we are unable to anticipate whether or not a large eruption is imminent. We present the findings from a multi-parametric study of 42 large explosive eruptions (29 Plinian and 13 sub-Plinian) that form the basis for a new Bayesian belief network that addresses this question. We combine the eruptive history of the volcanoes that have produced these large eruptions with petrological studies and reported unrest phenomena to assess the probability of an eruption being plinian. We find that the 'plinian probability' is increased most strongly by the presence of an exsolved volatile phase in the reservoir prior to an eruption. In our survey, 60% of the plinian eruptions had an excess SO2 gas phase more than double that calculated from petrologic studies alone. Probability is also increased by three related and more easily observable parameters: a high Plinian Ratio (the ratio of VEI≥4 eruptions in a volcano's history to the number of all VEI≥2 eruptions in that history), a repose time of more than 1000 years, and a Repose Ratio (the ratio of the average return period of VEI≥4 eruptions in the volcanic record to the repose time since the last VEI≥4 eruption) of greater than 0.7. We looked for unrest signals that are potentially indicative of future plinian activity and report a few observations from case studies, but cannot say whether these will generally appear. Finally, we present a retrospective analysis of the probabilities of the eruptions in our study becoming plinian, using our Bayesian belief network. We find that these probabilities are up to about 4 times greater than those calculated from an a priori assessment of the global eruptive catalogue.
Stochastic demographic forecasting.
Lee, R D
1992-11-01
"This paper describes a particular approach to stochastic population forecasting, which is implemented for the U.S.A. through 2065. Statistical time series methods are combined with demographic models to produce plausible long run forecasts of vital rates, with probability distributions. The resulting mortality forecasts imply gains in future life expectancy that are roughly twice as large as those forecast by the Office of the Social Security Actuary.... Resulting stochastic forecasts of the elderly population, elderly dependency ratios, and payroll tax rates for health, education and pensions are presented." excerpt
Characterizing uncertain sea-level rise projections to support investment decisions.
Sriver, Ryan L; Lempert, Robert J; Wikman-Svahn, Per; Keller, Klaus
2018-01-01
Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly-localized and context dependent nature of applying Robust Decision Making methods to inform investment decisions.
Reducing Community Vulnerability to Wildland Fires in Southern California
NASA Astrophysics Data System (ADS)
Keeley, J. E.
2010-12-01
In the US, fires are not treated like other hazards such as earthquakes but rather as preventable through landscape fuel treatments and aggressive fire suppression. In southern California, extreme fire weather has made it impossible to control all fires, and thus loss of homes and lives is a constant threat to communities. There is growing evidence that we are unlikely ever to eliminate fires on these landscapes. Thus, it is time to reframe the fire problem and think of fires as we do other natural hazards such as earthquakes. We do not attempt to stop earthquakes; rather, the primary emphasis is on altering human infrastructure in ways that minimize community vulnerability. In other words, we need to change our approach from risk elimination to risk management. This approach means we accept that we cannot eliminate fires and instead learn to live with fire, with communities becoming more fire adapted. We can potentially make great strides in reducing community vulnerability by finding those factors that have high impacts and are sensitive to changes in management. Presently, decision makers have relatively little guidance about which of these is likely to have the greatest impact. Future reductions in fire risk to communities require that we address both the wildland and urban elements that contribute to destructive losses. Damage risk D is determined by D = f(I, S, E, G, H), where I is the probability of a fire starting in the landscape; S is the probability of the fire reaching a size sufficient to reach the urban environment; E is the probability of it encroaching into the urban environment; G is the probability of fire propagating within the built environment; and H is the probability of a fire, once within the built environment, resulting in the destruction of a building. In southern California, reducing I through more strategic fire prevention has potential for reducing fire risk. There are many ignition sources that could be reduced, for example by replacing power line ignitions with underground lines, strategically employing arson patrols during Santa Ana wind events, enforcing regulations on power equipment use in wildland areas, and installing k-rail barriers along roads to reduce fire spread into wildland areas. S, the probability of fire reaching urban environments, has historically been the primary focus of state and federal fire management activities; there is a need for greater focus on understanding the most strategic application of wildland fuel treatments. E, the probability of fire encroaching into the urban environment, has largely been addressed in the past by attention to wildland-urban interface (WUI) fuel treatments. The one factor that has perhaps the greatest potential for impacting E is the pattern of urban growth, both in strategic placement and in spatial patterning within communities, and this is an area where alternative future growth scenarios could have huge impacts on fire outcomes. G, the chance of fire propagating within the urban environment, is a function of urban fuels, which include both home construction and landscaping. This area has the potential for effecting large changes in fire losses, dependent upon future regulations on plantings in the urban environment.
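The abstract leaves the functional form f unspecified; one common simplification treats the factors as a chain of conditional probabilities, so that per-structure damage risk is their product. The values below are illustrative placeholders only, used to show how the sensitivity of D to each management lever could be screened.

```python
# Hypothetical component probabilities (not estimates from the study).
I = 0.30   # P(fire starts in the landscape) in a given year
S = 0.20   # P(fire grows large enough to reach the urban edge | start)
E = 0.50   # P(fire encroaches into the urban environment | large fire)
G = 0.40   # P(fire propagates within the built environment | encroachment)
H = 0.25   # P(a building is destroyed | propagation)

D = I * S * E * G * H
print(f"baseline damage risk D = {D:.4f}")

# Screen management levers: halve one factor at a time and compare.
for name in ["I", "S", "E", "G", "H"]:
    factors = dict(I=I, S=S, E=E, G=G, H=H)
    factors[name] *= 0.5
    d_new = 1.0
    for v in factors.values():
        d_new *= v
    print(f"halving {name}: D = {d_new:.4f}")
```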
Lee, Sunghee; Liu, Mingnan; Hu, Mengyao
2017-06-01
Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying "I don't know" item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research.
Deep Uncertainty Surrounding Coastal Flood Risk Projections: A Case Study for New Orleans
NASA Astrophysics Data System (ADS)
Wong, Tony E.; Keller, Klaus
2017-10-01
Future sea-level rise drives severe risks for many coastal communities. Strategies to manage these risks hinge on a sound characterization of the uncertainties. For example, recent studies suggest that large fractions of the Antarctic ice sheet (AIS) may rapidly disintegrate in response to rising global temperatures, leading to potentially several meters of sea-level rise during the next few centuries. It is deeply uncertain whether such an AIS disintegration will be triggered, how much it would increase sea-level rise, whether extreme storm surges will intensify in a warming climate, and which emissions pathway future societies will choose. Here, we assess the impacts of these deep uncertainties on projected flooding probabilities for a levee ring in New Orleans, LA. We use 18 scenarios, presenting probabilistic projections within each one, to sample key deeply uncertain future projections of sea-level rise, radiative forcing pathways, storm surge characterization, and contributions from rapid AIS mass loss. The implications of these deep uncertainties for projected flood risk are thus characterized by a set of 18 probability distribution functions. We use a global sensitivity analysis to assess which mechanisms contribute to uncertainty in projected flood risk over the course of a 50-year design life. In line with previous work, we find that the uncertain storm surge drives the most substantial risk, followed by general AIS dynamics, in our simple model of future flood risk for New Orleans.
Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.
1998-01-01
The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes a fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to 2000. The probability of a Mw = 6.9 earthquake within 50 km of Osaka during 1997-2007 is estimated to have risen from 5-6% before the Kobe earthquake to 7-11% afterward; during 1997-2027, it is estimated to have risen from 14-16% before Kobe to 16-22%.
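A schematic Monte Carlo sketch of the uncertainty propagation described above, not the authors' code: recurrence on a fault is treated here as a lognormal renewal process, uncertain inputs (mean repeat time, aperiodicity, elapsed time) are sampled, and a permanent stress change is folded in as an equivalent clock advance. All parameter values are hypothetical.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(3)
n_sim, window = 10_000, 30.0                     # number of samples, 30-year window

mean_repeat = rng.normal(1000.0, 200.0, n_sim)   # mean repeat time (yr), placeholder
aperiodicity = rng.uniform(0.3, 0.7, n_sim)      # coefficient of variation, placeholder
elapsed = rng.normal(400.0, 100.0, n_sim)        # time since last event (yr), placeholder
stress_shift = 50.0                              # equivalent clock advance from stress change (yr)

sigma = np.sqrt(np.log(1.0 + aperiodicity**2))
scale = mean_repeat / np.exp(sigma**2 / 2.0)     # chosen so the lognormal mean equals mean_repeat
t0 = np.clip(elapsed + stress_shift, 1.0, None)

cdf_t0 = lognorm.cdf(t0, sigma, scale=scale)
cdf_t1 = lognorm.cdf(t0 + window, sigma, scale=scale)
p_cond = (cdf_t1 - cdf_t0) / (1.0 - cdf_t0)      # P(event in next 30 yr | no event yet)

print(f"30-yr probability: median {np.median(p_cond):.3f}, "
      f"5-95% range {np.percentile(p_cond, 5):.3f}-{np.percentile(p_cond, 95):.3f}")
```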
Maximizing the Detection Probability of Kilonovae Associated with Gravitational Wave Observations
NASA Astrophysics Data System (ADS)
Chan, Man Leong; Hu, Yi-Ming; Messenger, Chris; Hendry, Martin; Heng, Ik Siong
2017-01-01
Estimates of the source sky location for gravitational wave signals are likely to span areas of up to hundreds of square degrees or more, making it very challenging for most telescopes to search for counterpart signals in the electromagnetic spectrum. To boost the chance of successfully observing such counterparts, we have developed an algorithm that optimizes the number of observing fields and their corresponding time allocations by maximizing the detection probability. As a proof-of-concept demonstration, we optimize follow-up observations targeting kilonovae using telescopes including the CTIO-Dark Energy Camera, Subaru-HyperSuprimeCam, Pan-STARRS, and the Palomar Transient Factory. We consider three simulated gravitational wave events with 90% credible error regions spanning areas from ~30 deg² to ~300 deg². Assuming a source at 200 Mpc, we demonstrate that to obtain a maximum detection probability, there is an optimized number of fields for any particular event that a telescope should observe. To inform future telescope design studies, we present the maximum detection probability and corresponding number of observing fields for a combination of limiting magnitudes and fields of view over a range of parameters. We show that for large gravitational wave error regions, telescope sensitivity rather than field of view is the dominating factor in maximizing the detection probability.
Charts designate probable future oceanographic research fields
NASA Technical Reports Server (NTRS)
1968-01-01
Charts outline the questions and problems of oceanographic research in the future. NASA uses the charts to estimate the probable requirements for instrumentation carried by satellites engaged in cooperative programs with other agencies concerned with identification, analysis, and solution of many of these problems.
The impact of land ownership, firefighting, and reserve status on fire probability in California
NASA Astrophysics Data System (ADS)
Starrs, Carlin Frances; Butsic, Van; Stephens, Connor; Stewart, William
2018-03-01
The extent of wildfires in the western United States is increasing, but how land ownership, firefighting, and reserve status influence fire probability is unclear. California serves as a unique natural experiment to estimate the impact of these factors, as ownership is split equally between federal and non-federal landowners; there is a relatively large proportion of reserved lands where extractive uses are prohibited and fire suppression is limited; and land ownership and firefighting responsibility are purposefully not always aligned. Panel Poisson regression techniques and pre-regression matching were used to model changes in annual fire probability from 1950-2015 on reserve and non-reserve lands on federal and non-federal ownerships across four vegetation types: forests, rangelands, shrublands, and forests without commercial species. Fire probability was found to have increased over time across all 32 categories. A marginal effects analysis showed that federal ownership and firefighting was associated with increased fire probability, and that the difference in fire probability on federal versus non-federal lands is increasing over time. Ownership, firefighting, and reserve status, played roughly equal roles in determining fire probability, and were found to have much greater influence than average maximum temperature (°C) during summer months (June, July, August), average annual precipitation (cm), and average annual topsoil moisture content by volume, demonstrating the critical role these factors play in western fire regimes and the importance of including them in future analysis focused on understanding and predicting wildfire in the Western United States.
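A simplified statsmodels sketch of a panel Poisson regression of annual fire counts on ownership, reserve status, and climate covariates, in the spirit of the study described above but not its actual specification; the data frame, column names, and coefficients are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "federal": rng.integers(0, 2, n),          # 1 = federal ownership (placeholder)
    "reserve": rng.integers(0, 2, n),          # 1 = reserved land (placeholder)
    "tmax_summer": rng.normal(28.0, 3.0, n),   # deg C
    "precip": rng.normal(60.0, 20.0, n),       # cm/yr
    "year": rng.integers(1950, 2016, n),
})
# Synthetic fire counts generated from an assumed log-linear rate.
eta = -4.0 + 0.4*df.federal + 0.3*df.reserve + 0.05*df.tmax_summer + 0.01*(df.year - 1950)
df["fires"] = rng.poisson(np.exp(eta))

X = sm.add_constant(df[["federal", "reserve", "tmax_summer", "precip", "year"]])
model = sm.GLM(df["fires"], X, family=sm.families.Poisson()).fit()
print(model.summary().tables[1])   # exponentiated coefficients give rate ratios
```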
The Science-Policy Link: Stakeholder Reactions to the Uncertainties of Future Sea Level Rise
NASA Astrophysics Data System (ADS)
Plag, H.; Bye, B.
2011-12-01
Policy makers and stakeholders in the coastal zone are equally challenged by the risk of an anticipated rise of coastal Local Sea Level (LSL) as a consequence of future global warming. Many low-lying and often densely populated coastal areas are at risk of increased inundation. More than 40% of the global population is living in or near the coastal zone and this fraction is steadily increasing. A rise in LSL will increase the vulnerability of coastal infrastructure and population dramatically, with potentially devastating consequences for the global economy, society, and environment. Policy makers are faced with a trade-off between imposing today the often very high costs of coastal protection and adaptation upon national economies and leaving the costs of potential major disasters to future generations. They are in need of actionable information that provides guidance for the development of coastal zones resilient to future sea level changes. Part of this actionable information comes from risk and vulnerability assessments, which require information on future LSL changes as input. In most cases, a deterministic approach has been applied based on predictions of the plausible range of future LSL trajectories as input. However, there is little consensus in the scientific community on how these trajectories should be determined, and what the boundaries of the plausible range are. Over the last few years, many publications in Science, Nature and other peer-reviewed scientific journals have revealed a broad range of possible futures and significant epistemic uncertainties and gaps concerning LSL changes. Based on the somewhat diffuse science input, policy and decision makers have made rather different choices for mitigation and adaptation in cases such as Venice, The Netherlands, New York City, and the San Francisco Bay area. Replacing the deterministic, prediction-based approach with a statistical one that fully accounts for the uncertainties and epistemic gaps would provide a different kind of science input to policy makers and stakeholders. As in many other insurance problems (for example, earthquakes), where deterministic predictions are not possible and decisions have to be made on the basis of statistics and probabilities, the statistical approach to coastal resilience would require stakeholders to make decisions on the basis of probabilities instead of predictions. The science input for informed decisions on adaptation would consist of general probabilities of decadal to century scale sea level changes derived from paleo records, including the probabilities for large and rapid rises. Similar to other problems where the appearance of a hazard is associated with a high risk (like a fire in a house), this approach would also require a monitoring and warning system (a "smoke detector") capable of detecting any onset of a rapid sea level rise.
NASA Astrophysics Data System (ADS)
Biass, S.; Todde, A.; Cioni, R.; Pistolesi, M.; Geshi, N.; Bonadonna, C.
2017-10-01
We present an exposure analysis of infrastructure and lifelines to tephra fallout for a future large-scale explosive eruption of Sakurajima volcano. An eruption scenario is identified based on the field characterization of the last subplinian eruption at Sakurajima and a review of reports of the eruptions that occurred in the past six centuries. A scenario-based probabilistic hazard assessment is performed using the Tephra2 model, considering various eruption durations to reflect the complex eruptive sequences of all considered reference eruptions. A quantitative exposure analysis of infrastructures and lifelines is presented, primarily using open-access data. The post-event impact assessment of Magill et al. (Earth Planets Space 65:677-698, 2013) after the 2011 VEI 2 eruption of Shinmoedake is used to discuss the vulnerability and resilience of infrastructures during a future large eruption of Sakurajima. Results indicate a main eastward dispersal, with longer eruption durations increasing the probability of tephra accumulation in proximal areas and reducing it in distal areas. The exposure analysis reveals that 2300 km of road network, 18 km² of urban area, and 306 km² of agricultural land have a 50% probability of being affected by a tephra accumulation of 1 kg/m². A simple qualitative exposure analysis suggests that the municipalities of Kagoshima, Kanoya, and Tarumizu are the most likely to suffer impacts. Finally, the 2011 VEI 2 eruption of Shinmoedake demonstrated that the mitigation strategies already implemented have increased resilience and improved recovery of affected infrastructures. Nevertheless, the extent to which these mitigation actions will perform during the VEI 4 eruption considered here is unclear, and our hazard assessment points to possible damage on the Sakurajima peninsula and in the neighboring municipality of Tarumizu.
Maritime transport in the Gulf of Bothnia 2030.
Pekkarinen, Annukka; Repka, Sari
2014-10-01
Scenarios for shipping traffic in the Gulf of Bothnia (GoB) by 2030 are described in order to identify the main factors that should be taken into account when preparing a Maritime Spatial Plan (MSP) for the area. The applicability of futures research methodology to the planning of marine areas was also assessed. The methods include applying existing large-scale quantitative scenarios for maritime traffic in the GoB and using a real-time Delphi process in which an expert group discussed the different factors contributing to future maritime traffic in the GoB and assessed the probability and significance of each factor's impact on maritime traffic. MSP was tested at a transnational scale in the Bothnian Sea area as a pilot project.
Hurricane track forecast cones from fluctuations
Meuel, T.; Prado, G.; Seychelles, F.; Bessafi, M.; Kellay, H.
2012-01-01
Trajectories of tropical cyclones may show large deviations from predicted tracks, leading to uncertainty as to their landfall location, for example. Prediction schemes usually render this uncertainty by showing track forecast cones representing the most probable region for the location of a cyclone during a period of time. By using the statistical properties of these deviations, we propose a simple method to predict possible corridors for the future trajectory of a cyclone. Examples of this scheme are implemented for hurricane Ike and hurricane Jimena. The corridors include the future trajectory up to at least 50 h before landfall. The cones proposed here shed new light on known track forecast cones as they link them directly to the statistics of these deviations. PMID:22701776
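A toy sketch of the general idea described above, not the authors' method: use the empirical distribution of past cross-track deviations, which grow with lead time, to wrap a probable corridor around a deterministic forecast track. The deviation statistics below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)
lead_hours = np.arange(0, 72, 6)
# Hypothetical cross-track deviations (km) from many past forecasts; spread grows with lead time.
deviations = rng.normal(0.0, 15.0 + 2.0 * lead_hours, size=(500, lead_hours.size))

# Corridor half-width: e.g. the 90th percentile of |deviation| at each lead time.
half_width = np.percentile(np.abs(deviations), 90, axis=0)

forecast_cross_track = np.zeros_like(lead_hours, dtype=float)   # corridor centered on the forecast
corridor_low = forecast_cross_track - half_width
corridor_high = forecast_cross_track + half_width
for t, lo, hi in zip(lead_hours, corridor_low, corridor_high):
    print(f"+{t:2d} h corridor: {lo:7.1f} to {hi:6.1f} km about the forecast track")
```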
"If It Is Dreamable It Is Doable": The Role of Desired Job Flexibility in Imagining the Future
ERIC Educational Resources Information Center
Guglielmi, Dina; Chiesa, Rita; Mazzetti, Greta
2016-01-01
Purpose: The purpose of this paper is to compare how the dimension of attitudes toward the future that consists in the perception of a dynamic future may be affected by desirable goals (desired job flexibility) and probable events (probable job flexibility) in a group of permanent vs temporary employees. Moreover, the aim is to explore the gender differences…
NASA Astrophysics Data System (ADS)
Wünnemann, K.; Collins, G. S.; Weiss, R.
2010-12-01
The strike of a cosmic body into a marine environment differs in several respects from impact on land. Oceans cover approximately 70% of the Earth's surface, implying not only that oceanic impact is a very likely scenario for future impacts but also that most impacts in Earth's history must have happened in marine environments. Therefore, the study of oceanic impact is imperative in two respects: (1) to quantify the hazard posed by future oceanic impacts, including the potential threat of large impact-generated tsunami-like waves, and (2) to reconstruct Earth's impact record by accounting for the large number of potentially undiscovered crater structures in the ocean crust. Reconstruction of the impact record is of crucial importance both for assessing the frequency of collision events in the past and for better predicting the probability of future impact. We summarize the advances in the study of oceanic impact over the last decades and focus in particular on how numerical models have improved our understanding of cratering in the oceanic environment and the generation of waves by impact. We focus on insight gleaned from numerical modeling studies into the deceleration of the projectile by the water, cratering of the ocean floor, the late stage modification of the crater due to gravitational collapse, and water resurge. Furthermore, we discuss the generation and propagation of large tsunami-like waves as a result of a strike of a cosmic body in marine environments.
Structurally adaptive space crane concept for assembling space systems on orbit
NASA Technical Reports Server (NTRS)
Dorsey, John T.; Sutter, Thomas R.; Wu, K. Chauncey
1992-01-01
Many future human space exploration missions will probably require large vehicles that must be assembled on orbit. Thus, a device that can move, position, and assemble large and massive spacecraft components on orbit becomes essential for these missions. A concept is described for such a device: a space crane concept that uses erectable truss hardware to achieve high-stiffness and low-mass booms and uses articulating truss joints that can be assembled on orbit. The hardware has been tested and shown to have linear load-deflection response and to be structurally predictable. The hardware also permits the crane to be reconfigured into different geometries to satisfy future assembly requirements. A number of articulating and rotary joint concepts have been sized and analyzed, and the results are discussed. Two strategies were proposed to suppress motion-induced vibration: placing viscous dampers in selected truss struts and preshaping motion commands. Preliminary analyses indicate that these techniques have the potential to greatly enhance structural damping.
A comparison of spacecraft penetration hazards due to meteoroids and manmade earth-orbiting objects
NASA Technical Reports Server (NTRS)
Brooks, D. R.
1976-01-01
The ability of a typical double-walled spacecraft structure to protect against penetration by high-velocity incident objects is reviewed. The hazards presented by meteoroids are compared to the current and potential hazards due to manmade orbiting objects. It is shown that the nature of the meteoroid number-mass relationship makes adequate protection for large space facilities a conceptually straightforward structural problem. The present level of manmade orbiting objects (an estimated 10,000 in early 1975) does not pose an unacceptable risk to manned space operations proposed for the near future, but it does produce penetration probabilities in the range of 1-10 percent for a 100-m diameter sphere in orbit for 1,000 days. The number-size distribution of manmade objects is such that adequate protection is difficult to achieve for large permanent space facilities, to the extent that future restrictions on such facilities may result if the growth of orbiting objects continues at its historical rate.
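A back-of-the-envelope sketch of the penetration-probability estimate quoted above: for a flux of penetrating objects, an exposed area, and a mission duration, Poisson statistics give P(at least one penetration) = 1 − exp(−flux·area·time). The flux value below is a placeholder chosen to land in the cited 1-10% range, not a measured number.

```python
import math

diameter_m = 100.0
area_m2 = math.pi * diameter_m**2   # surface area of a sphere, pi * d^2
days = 1000.0
flux = 1.5e-9                       # penetrating impacts per m^2 per day (hypothetical)

expected_hits = flux * area_m2 * days
p_penetration = 1.0 - math.exp(-expected_hits)   # Poisson probability of >= 1 penetration
print(f"expected penetrations: {expected_hits:.3f}, P(>=1 penetration): {p_penetration:.1%}")
```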
Estimation of State Transition Probabilities: A Neural Network Model
NASA Astrophysics Data System (ADS)
Saito, Hiroshi; Takiyama, Ken; Okada, Masato
2015-12-01
Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
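A minimal sketch of the kind of prediction-error update the abstract describes (an illustration, not necessarily the authors' exact rule): the estimated transition matrix is nudged toward a one-hot observation of the next state using the state prediction error, scaled by a learning rate.

```python
import numpy as np

rng = np.random.default_rng(6)
n_states, alpha, n_steps = 3, 0.05, 20_000

# True transition probabilities used only to generate experience.
T_true = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.3, 0.3, 0.4]])

T_hat = np.full((n_states, n_states), 1.0 / n_states)   # uniform initial estimate
s = 0
for _ in range(n_steps):
    s_next = rng.choice(n_states, p=T_true[s])
    observed = np.eye(n_states)[s_next]          # one-hot observation of the next state
    T_hat[s] += alpha * (observed - T_hat[s])    # prediction-error update; rows remain normalized
    s = s_next

print(np.round(T_hat, 2))   # converges toward T_true for an appropriate learning rate
```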
Probability based models for estimation of wildfire risk
Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit
2004-01-01
We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
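A one-line illustration, with made-up numbers, of how the three probabilities relate: the unconditional probability of a large fire in a 1 km²-day cell is the occurrence probability times the conditional probability of a large fire given ignition.

```python
p_occurrence = 2e-4             # P(fire occurs in the cell-day), hypothetical
p_large_given_ignition = 0.03   # P(fire becomes large | ignition), hypothetical

p_large = p_occurrence * p_large_given_ignition   # unconditional probability of a large fire
print(f"P(large fire in cell-day) = {p_large:.1e}")
```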
Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future
NASA Technical Reports Server (NTRS)
Cates, Grant R.
2014-01-01
The Space Shuttle was launched 135 times, and nearly half of those launches required 2 or more launch attempts. The historical record of 250 Space Shuttle launch countdown attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting the launch countdown performance of future launch vehicles. This paper provides a statistical analysis of all Space Shuttle launch attempts, including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions for future launch vehicles such as NASA's Space Shuttle-derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars, since the launch opportunities are relatively short in duration and one must wait 2 years before a subsequent attempt can begin.
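A small sketch of the empirical statistics described above, under the simplifying assumption that attempts are independent with a constant per-attempt success probability (the paper's analysis uses the actual countdown histories): per-attempt launch probability and the cumulative probability of having launched by the N-th attempt.

```python
attempts = 250   # total countdown attempts
launches = 135   # successful launches

p_launch_per_attempt = launches / attempts   # empirical per-attempt launch probability

# Cumulative probability of having launched within n attempts, assuming independence.
for n in range(1, 6):
    p_by_n = 1 - (1 - p_launch_per_attempt) ** n
    print(f"P(launched within {n} attempt(s)) = {p_by_n:.2f}")
```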
Overwinter survival of neotropical migratory birds in early successional and mature tropical forests
Conway, C.J.; Powell, G.V.N.; Nichols, J.D.
1995-01-01
Many Neotropical migratory species inhabit both mature and early-successional forest on their wintering grounds, yet comparisons of survival rates between habitats are lacking. Consequently, the factors affecting habitat suitability for Neotropical migrants and the potential effects of tropical deforestation on migrants are not well understood. We estimated overwinter survival and capture probabilities of Wood Thrush (Hylocichla mustelina), Ovenbird (Seiurus aurocapillus), Hooded Warbler (Wilsonia citrina), and Kentucky Warbler (Oporornis formosus) inhabiting two common tropical habitat types, mature and early-successional forest. Our results suggest that large differences (for example, a ratio of survival rates (gamma) < 0.85) in overwinter survival between these habitats do not exist for any of these species. Age ratios did not differ between habitats, but males were more common in forest habitats and females more common in successional habitats for Hooded Warblers and Kentucky Warblers. Future research on overwinter survival should address the need for age- and sex-specific survival estimates before we can draw strong conclusions regarding winter habitat suitability. Our estimates of overwinter survival extrapolated to annual survival rates that were generally lower than previous estimates of annual survival of migratory birds. Capture probability differed between habitats for Kentucky Warblers, but our results provide strong evidence against large differences in capture probability between habitats for Wood Thrush, Hooded Warblers, and Ovenbirds. We found no temporal or among-site differences in survival or capture probability for any of the four species. Additional research is needed to examine the effects of winter habitat use on survival during migration and between-winter survival.
Loayza, Andrea P.; Squeo, Francisco A.
2016-01-01
Scatter-hoarding rodents can act as both predators and dispersers for many large-seeded plants because they cache seeds for future use, but occasionally forget them in sites with high survival and establishment probabilities. The most important fruit or seed trait influencing rodent foraging behavior is seed size; rodents prefer large seeds because they have higher nutritional content, but this preference can be counterbalanced by the higher costs of handling larger seeds. We designed a cafeteria experiment to assess whether fruit and seed size of Myrcianthes coquimbensis, an endangered desert shrub, influence the decision-making process during foraging by three species of scatter-hoarding rodents differing in body size: Abrothrix olivaceus, Phyllotis darwini and Octodon degus. We found that the size of fruits and seeds influenced foraging behavior in the three rodent species; the probability of a fruit being harvested and hoarded was higher for larger fruits than for smaller ones. Patterns of fruit size preference were not affected by rodent size; all species were able to hoard fruits within the entire range of sizes offered. Finally, fruit and seed size had no effect on the probability of seed predation, rodents typically ate only the fleshy pulp of the fruits offered and discarded whole, intact seeds. In conclusion, our results reveal that larger M. coquimbensis fruits have higher probabilities of being harvested, and ultimately of its seeds being hoarded and dispersed by scatter-hoarding rodents. As this plant has no other dispersers, rodents play an important role in its recruitment dynamics. PMID:27861550
Mediators of the Availability Heuristic in Probability Estimates of Future Events.
ERIC Educational Resources Information Center
Levi, Ariel S.; Pryor, John B.
Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…
Estimating occupancy probability of moose using hunter survey data
Crum, Nathan J.; Fuller, Angela K.; Sutherland, Christopher S.; Cooch, Evan G.; Hurst, Jeremy E.
2017-01-01
Monitoring rare species can be difficult, especially across large spatial extents, making conventional methods of population monitoring costly and logistically challenging. Citizen science has the potential to produce observational data across large areas that can be used to monitor wildlife distributions using occupancy models. We used citizen science (i.e., hunter surveys) to facilitate monitoring of moose (Alces alces) populations, an especially important endeavor because of their recent apparent declines in the northeastern and upper midwestern regions of the United States. To better understand patterns of occurrence of moose in New York, we used data collected through an annual survey of approximately 11,000 hunters between 2012 and 2014 that recorded detection–non-detection data of moose and other species. We estimated patterns of occurrence of moose in relation to land cover characteristics, climate effects, and interspecific interactions using occupancy models to analyze spatially referenced moose observations. Coniferous and deciduous forest with low prevalence of white-tailed deer (Odocoileus virginianus) had the highest probability of moose occurrence. This study highlights the potential of data collected using citizen science for understanding the spatial distribution of low-density species across large spatial extents and providing key information regarding where and when future research and management activities should be focused.
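A compact sketch of a single-season occupancy likelihood in the spirit of the model described above, without the study's covariates: each site has occupancy probability psi and, if occupied, per-survey detection probability p; sites with no detections may be either unoccupied or occupied but missed. The detection histories are synthetic.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(7)
n_sites, n_surveys = 300, 4
psi_true, p_true = 0.35, 0.4
z = rng.binomial(1, psi_true, n_sites)                            # latent occupancy state
y = rng.binomial(1, p_true, (n_sites, n_surveys)) * z[:, None]    # detection histories

def neg_log_lik(theta):
    psi, p = expit(theta)                                         # keep parameters in (0, 1)
    det = y.sum(axis=1)
    lik = np.where(det > 0,
                   psi * p**det * (1 - p)**(n_surveys - det),     # occupied and detected
                   psi * (1 - p)**n_surveys + (1 - psi))          # never detected: missed or absent
    return -np.log(lik).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = expit(fit.x)
print(f"estimated occupancy psi = {psi_hat:.2f}, detection p = {p_hat:.2f}")
```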
Spatial and temporal variability in rates of landsliding in seismically active mountain ranges
NASA Astrophysics Data System (ADS)
Parker, R.; Petley, D.; Rosser, N.; Densmore, A.; Gunasekera, R.; Brain, M.
2012-04-01
Where earthquake and precipitation driven disasters occur in steep, mountainous regions, landslides often account for a large proportion of the associated damage and losses. This research addresses spatial and temporal variability in rates of landslide occurrence in seismically active mountain ranges as a step towards developing better regional scale prediction of losses in such events. In the first part of this paper we attempt to explain reductively the variability in spatial rates of landslide occurrence, using data from five major earthquakes. This is achieved by fitting a regression-based conditional probability model to spatial probabilities of landslide occurrence, using as predictor variables proxies for spatial patterns of seismic ground motion and modelled hillslope stability. A combined model for all earthquakes performs well in hindcasting spatial probabilities of landslide occurrence as a function of readily-attainable spatial variables. We present validation of the model and demonstrate the extent to which it may be applied globally to derive landslide probabilities for future earthquakes. In part two we examine the temporal behaviour of rates of landslide occurrence. This is achieved through numerical modelling to simulate the behaviour of a hypothetical landscape. The model landscape is composed of hillslopes that continually weaken, fail and reset in response to temporally-discrete forcing events that represent earthquakes. Hillslopes with different geometries require different amounts of weakening to fail, such that they fail and reset at different temporal rates. Our results suggest that probabilities of landslide occurrence are not temporally constant, but rather vary with time, irrespective of changes in forcing event magnitudes or environmental conditions. Various parameters influencing the magnitude and temporal patterns of this variability are identified, highlighting areas where future research is needed. This model has important implications for landslide hazard and risk analysis in mountain areas as existing techniques usually assume that susceptibility to failure does not change with time.
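A schematic sketch of a regression-based conditional probability model of this general kind, fit to synthetic grid cells with a ground-motion proxy and a slope proxy as predictors; the actual predictor variables, landslide inventories, and model form used in the study are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic grid cells: a seismic ground-motion proxy and a hillslope stability proxy
n = 5000
shaking = rng.uniform(0.0, 1.0, n)        # e.g. normalized shaking intensity
slope = rng.uniform(0.0, 60.0, n)         # slope angle in degrees
logit_true = -6.0 + 5.0 * shaking + 0.08 * slope
landslide = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_true))

model = LogisticRegression().fit(np.column_stack([shaking, slope]), landslide)

# Hindcast-style spatial probability of landslide occurrence for a new cell
new_cell = np.array([[0.7, 35.0]])
print("P(landslide) =", round(model.predict_proba(new_cell)[0, 1], 3))
```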
NASA Astrophysics Data System (ADS)
Guillod, B. P.; Massey, N.; Otto, F. E. L.; Allen, M. R.; Jones, R.; Hall, J. W.
2016-12-01
Because extreme events are by definition rare, accurately quantifying the probabilities associated with a given event is difficult. This is particularly true for droughts, for which only a few events are available in the observational record owing to their long-lasting characteristics. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying present and future risks associated with droughts in the UK. To do so, a large number of modelled weather time series for "synthetic" drought events are being fed into hydrological and impact models to assess their impacts on various sectors (social sciences, economy, industry, agriculture, and ecosystems). Here, we present and analyse the hydro-meteorological drought event sets that have been produced with a new version of weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (tens of thousands) of Global Climate Model simulations, downscaled at 25km over Europe by a nested Regional Climate Model. Simulations include the past 100 years as well as two future time slices (2030s and 2080s), and provide a large number of sequences of spatio-temporally coherent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. Besides presenting the methodology and validation of the event sets, we provide insights into drought risk in the UK and the drivers of drought. In particular, we examine their sensitivity to sea surface temperature and sea ice patterns, both in the recent past and for future projections. How drought risk in the UK can be expected to change in the future will also be discussed. Finally, we assess the applicability of this methodology to other regions. Reference: [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.
Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo
2014-04-21
Software defined networking (SDN) has become the focus in the current information and communication technology area because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. Optical transport network is also regarded as an important application scenario for SDN, which is adopted as the enabling technology of data communication networks (DCN) instead of generalized multi-protocol label switching (GMPLS). However, the practical performance of SDN based DCN for large scale optical networks, which is very important for technology selection in future optical network deployment, has not been evaluated up to now. In this paper we have built a large scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of SDN based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time have been demonstrated under various network environments, such as with different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof of concept for future network deployment.
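The blocking probabilities measured on such a testbed depend on traffic patterns and routing/spectrum assignment, none of which are given in the abstract; purely as a generic point of reference, the classical Erlang B formula below relates offered load and channel count to blocking on a single link (an illustrative assumption, not the testbed's model).

```python
def erlang_b(offered_load_erlangs: float, channels: int) -> float:
    """Blocking probability of an M/M/c/c loss system (Erlang B), computed
    with the standard numerically stable recursion."""
    b = 1.0
    for k in range(1, channels + 1):
        b = offered_load_erlangs * b / (k + offered_load_erlangs * b)
    return b

# e.g. 80 erlangs of offered traffic on a link with 100 wavelength slots
print(erlang_b(80.0, 100))  # roughly a fraction of a percent
```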
Long aftershock sequences within continents and implications for earthquake hazard assessment.
Stein, Seth; Liu, Mian
2009-11-05
One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere.
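A back-of-the-envelope rendering of the stated inverse relationship, assuming (hypothetically) that a plate-boundary fault loaded at about 10 mm/yr has an aftershock sequence lasting about a decade; the inverse proportionality then implies much longer sequences for slowly loaded continental interiors.

```python
def aftershock_duration_years(loading_rate_mm_per_yr: float,
                              ref_rate: float = 10.0,
                              ref_duration_yr: float = 10.0) -> float:
    """Sequence duration assumed inversely proportional to fault loading rate."""
    return ref_duration_yr * ref_rate / loading_rate_mm_per_yr

for rate in (10.0, 1.0, 0.1):   # mm/yr
    print(f"{rate:5.1f} mm/yr -> ~{aftershock_duration_years(rate):.0f} yr of aftershocks")
# 10 mm/yr -> ~10 yr; 0.1 mm/yr -> ~1000 yr
```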
Seismic hazard assessment over time: Modelling earthquakes in Taiwan
NASA Astrophysics Data System (ADS)
Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting
2017-04-01
To assess the seismic hazard with temporal change in Taiwan, we develop a new approach, combining both the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters by the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault-rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed time since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased rupture probabilities of several neighbouring seismogenic sources in Southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantages of incorporating both long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published models. It thus offers decision-makers and public officials an adequate basis for rapid evaluations of and response to future emergency scenarios such as victim relocation and sheltering.
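A compact sketch of the long-term renewal (BPT) component of such a model, using the standard inverse-Gaussian form of the BPT distribution; the mean recurrence interval, aperiodicity, and elapsed time below are hypothetical rather than TEM source parameters.

```python
import numpy as np
from scipy.stats import norm

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution with
    mean recurrence interval `mean` and aperiodicity `alpha`."""
    lam = mean / alpha**2                      # inverse-Gaussian shape parameter
    a = np.sqrt(lam / t)
    return (norm.cdf(a * (t / mean - 1))
            + np.exp(2 * lam / mean) * norm.cdf(-a * (t / mean + 1)))

def conditional_rupture_prob(elapsed, window, mean, alpha):
    """P(rupture within `window` yr | no rupture in the `elapsed` yr so far)."""
    f_t = bpt_cdf(elapsed, mean, alpha)
    f_tw = bpt_cdf(elapsed + window, mean, alpha)
    return (f_tw - f_t) / (1.0 - f_t)

# Hypothetical fault: 200-yr mean recurrence, aperiodicity 0.5, 150 yr elapsed
print(conditional_rupture_prob(elapsed=150, window=50, mean=200, alpha=0.5))
```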
Probable Maximum Precipitation in the U.S. Pacific Northwest in a Changing Climate
NASA Astrophysics Data System (ADS)
Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby
2017-11-01
The safety of large and aging water infrastructures is gaining attention in water management given the accelerated rate of change in landscape, climate, and society. In current engineering practice, such safety is ensured by the design of infrastructure for the Probable Maximum Precipitation (PMP). Recently, several numerical modeling approaches have been proposed to modernize the conventional and ad hoc PMP estimation approach. However, the underlying physics have not been fully investigated and thus differing PMP estimates are sometimes obtained without physics-based interpretations. In this study, we present a hybrid approach that takes advantage of both traditional engineering practice and modern climate science to estimate PMP for current and future climate conditions. The traditional PMP approach is modified and applied to five statistically downscaled CMIP5 model outputs, producing an ensemble of PMP estimates in the Pacific Northwest (PNW) during the historical (1970-2016) and future (2050-2099) time periods. The hybrid approach produced historical PMP estimates consistent with the traditional estimates. PMP in the PNW will increase by 50% ± 30% of the current design PMP by 2099 under the RCP8.5 scenario. Most of the increase is caused by warming, which mainly affects moisture availability through increased sea surface temperature, with minor contributions from changes in storm efficiency in the future. Moisture track change tends to reduce the future PMP. Compared with extreme precipitation, PMP exhibits higher internal variability. Thus, long records of high-quality data in both precipitation and related meteorological fields (temperature, wind fields) are required to reduce uncertainties in the ensemble PMP estimates.
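A highly simplified illustration of the moisture-maximization step that underlies traditional PMP estimation (the hybrid method involves far more than this single ratio); the storm depth and precipitable-water values below are hypothetical.

```python
def moisture_maximized_precip(storm_precip_mm: float,
                              storm_precipitable_water_mm: float,
                              max_precipitable_water_mm: float) -> float:
    """Scale an observed storm by the ratio of maximum to observed precipitable
    water (the classical moisture-maximization step of PMP practice)."""
    return storm_precip_mm * max_precipitable_water_mm / storm_precipitable_water_mm

# Hypothetical 72-h storm: 300 mm observed depth, 40 mm precipitable water,
# and a climatological maximum precipitable water of 55 mm for that season/location
print(moisture_maximized_precip(300.0, 40.0, 55.0))  # 412.5 mm
```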
Probable Maximum Precipitation in the U.S. Pacific Northwest in a Changing Climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xiaodong; Hossain, Faisal; Leung, Lai-Yung
2017-12-22
The safety of large and aging water infrastructures is gaining attention in water management given the accelerated rate of change in landscape, climate and society. In current engineering practice, such safety is ensured by the design of infrastructure for the Probable Maximum Precipitation (PMP). Recently, several physics-based numerical modeling approaches have been proposed to modernize the conventional and ad hoc PMP estimation approach. However, the underlying physics has not been investigated and thus differing PMP estimates are obtained without clarity on their interpretation. In this study, we present a hybrid approach that takes advantage of both traditional engineering wisdom and modern climate science to estimate PMP for current and future climate conditions. The traditional PMP approach is improved and applied to outputs from an ensemble of five CMIP5 models. This hybrid approach is applied in the Pacific Northwest (PNW) to produce ensemble PMP estimation for the historical (1970-2016) and future (2050-2099) time periods. The new historical PMP estimates are verified by comparing them with the traditional estimates. PMP in the PNW will increase by 50% of the current level by 2099 under the RCP8.5 scenario. Most of the increase is caused by warming, which mainly affects moisture availability, with minor contributions from changes in storm efficiency in the future. Moisture track change tends to reduce the future PMP. Compared with extreme precipitation, ensemble PMP exhibits higher internal variation. Thus high-quality data of both precipitation and related meteorological fields (temperature, wind fields) are required to reduce uncertainties in the ensemble PMP estimates.
NASA Astrophysics Data System (ADS)
Kalantari, Z.
2015-12-01
In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas only a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. This study builds on a conceptual framework that considers the SedInConnect model, topography, land use, soil data, other PCDs and climate change in an integrated way, to pave the way for more integrated policy making. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding at a large scale. This framework can provide an effective tool to inform a broad range of watershed planning activities within a region. Regional planners, decision-makers, and others can use this tool to identify the most vulnerable points in a watershed and along roads, and to plan interventions and actions that reduce the impacts of high flows and other extreme weather events on road construction. The application of the model over a large scale can give a realistic spatial characterization of sediment connectivity for the optimal management of debris flow to road structures. The ability of the model to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
Arechar, Antonio A; Kouchaki, Maryam; Rand, David G
2018-03-01
We had participants play two sets of repeated Prisoner's Dilemma (RPD) games, one with a large continuation probability and the other with a small continuation probability, as well as Dictator Games (DGs) before and after the RPDs. We find that, regardless of which RPD set is played first, participants typically cooperate when the continuation probability is large and defect when the continuation probability is small. However, there is an asymmetry in behavior when transitioning from one continuation probability to the other. When switching from large to small, transient higher levels of cooperation are observed in the early games of the small continuation set. Conversely, when switching from small to large, cooperation is immediately high in the first game of the large continuation set. We also observe that response times increase when transitioning between sets of RPDs, except for altruistic participants transitioning into the set of RPDs with long continuation probabilities. These asymmetries suggest a bias in favor of cooperation. Finally, we examine the link between altruism and RPD play. We find that small continuation probability RPD play is correlated with giving in DGs played before and after the RPDs, whereas high continuation probability RPD play is not.
NASA Technical Reports Server (NTRS)
Kessler, Donald J.
1988-01-01
The probable amount, sizes, and relative velocities of debris are discussed, giving examples of the damage caused by debris, and focusing on the use of mathematical models to forecast the debris environment and solar activity now and in the future. Most debris are within 2,000 km of the earth's surface. The average velocity of spacecraft-debris collisions varies from 9 km/sec at 30 degrees of inclination to 13 km/sec near polar orbits. Mathematical models predict a 5 percent per year increase in the large-fragment population, producing a small-fragment population increase of 10 percent per year until the year 2060, the time of critical density. A 10 percent increase in the large population would cause the critical density to be reached around 2025.
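A quick numeric restatement of the growth rates quoted above, treating them as simple compound annual growth (an assumption about how the modelled percentages compose over time).

```python
def compound_growth_factor(annual_rate: float, years: int) -> float:
    """Multiplicative growth over `years` at a fixed annual rate."""
    return (1.0 + annual_rate) ** years

# Growth factors over a decade, assuming the quoted rates compound annually
print("large fragments, 10 yr:", round(compound_growth_factor(0.05, 10), 2))  # ~1.63x
print("small fragments, 10 yr:", round(compound_growth_factor(0.10, 10), 2))  # ~2.59x
```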
Genovart, Meritxell; Sanz-Aguilar, Ana; Fernández-Chacón, Albert; Igual, Jose M; Pradel, Roger; Forero, Manuela G; Oro, Daniel
2013-01-01
Large-scale seasonal climatic indices, such as the North Atlantic Oscillation (NAO) index or the Southern Oscillation Index (SOI), account for major variations in weather and climate around the world and may influence population dynamics in many organisms. However, assessing the extent of climate impacts on species and their life-history traits requires reliable quantitative statistical approaches. We used a new analytical tool in mark-recapture, the multi-event modelling, to simultaneously assess the influence of climatic variation on multiple demographic parameters (i.e. adult survival, transient probability, reproductive skipping and nest dispersal) at two Mediterranean colonies of the Cory's shearwater Calonectris diomedea, a trans-equatorial migratory long-lived seabird. We also analysed the impact of climate on the breeding success at the two colonies. We found a clear temporal variation of survival for Cory's shearwaters, strongly associated with the large-scale SOI, especially in one of the colonies (up to 66% of variance explained). The Atlantic hurricane season is modulated by the SOI and coincides with shearwater migration to their wintering areas, directly affecting survival probabilities. However, the SOI was a better predictor of survival probabilities than the frequency of hurricanes; thus, we cannot discard an indirect additive effect of SOI via food availability. Accordingly, the proportion of transients was also correlated with SOI values, indicating higher costs of first reproduction (resulting in either mortality or permanent dispersal) when bad environmental conditions occurred during the winter before reproduction. Breeding success was also affected by climatic factors, the NAO explaining c. 41% of variance, probably as a result of its effect on the timing of peak abundance of squid and small pelagics, the main prey for shearwaters. No climatic effect was found either on reproductive skipping or on nest dispersal. Contrary to what we would expect for a long-lived organism, large-scale climatic indices had a more pronounced effect on survival and transient probabilities than on less sensitive fitness parameters such as reproductive skipping or nest dispersal probabilities. The potential increase in hurricane frequency because of global warming may interact with other global change agents (such as incidental bycatch and predation by alien species) now impacting shearwaters, affecting the future viability of populations. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, P.; Beaudet, P.
1980-01-01
The classification of large dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data is considered. Decision tree classification, a popular approach to the problem, is characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class. An automated technique for effective decision tree design which relies only on a priori statistics is presented. This procedure utilizes a set of two dimensional canonical transforms and Bayes table look-up decision rules. An optimal design at each node is derived based on the associated decision table. A procedure for computing the global probability of correct classification is also provided. An example is given in which class statistics obtained from an actual LANDSAT scene are used as input to the program. The resulting decision tree design has an associated probability of correct classification of .76 compared to the theoretically optimum .79 probability of correct classification associated with a full dimensional Bayes classifier. Recommendations for future research are included.
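A toy Monte Carlo comparison in the same spirit, contrasting a single-threshold decision rule with the full-dimensional Bayes rule on synthetic Gaussian class-conditional data; the canonical transforms, decision tables, and LANDSAT class statistics of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two equally likely classes with 2-D Gaussian class-conditional densities
n = 200_000
labels = rng.integers(0, 2, n)
means = np.array([[0.0, 0.0], [1.5, 1.0]])
x = rng.normal(means[labels], 1.0)          # unit-variance, independent features

# Single-node rule: a threshold on feature 0 only (one decision-tree split)
tree_pred = (x[:, 0] > 0.75).astype(int)

# Full-dimensional Bayes rule for equal covariances: pick the nearer class mean
d0 = ((x - means[0]) ** 2).sum(axis=1)
d1 = ((x - means[1]) ** 2).sum(axis=1)
bayes_pred = (d1 < d0).astype(int)

print("P(correct), single-node rule:", (tree_pred == labels).mean())
print("P(correct), full Bayes rule :", (bayes_pred == labels).mean())
```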
Expressed Likelihood as Motivator: Creating Value through Engaging What’s Real
Higgins, E. Tory; Franks, Becca; Pavarini, Dana; Sehnert, Steen; Manley, Katie
2012-01-01
Our research tested two predictions regarding how likelihood can have motivational effects as a function of how a probability is expressed. We predicted that describing the probability of a future event that could be either A or B using the language of high likelihood (“80% A”) rather than low likelihood (“20% B”), i.e., high rather than low expressed likelihood, would make a present activity more real and engaging, as long as the future event had properties relevant to the present activity. We also predicted that strengthening engagement from the high (vs. low) expressed likelihood of a future event would intensify the value of present positive and negative objects (in opposite directions). Both predictions were supported. There was also evidence that this intensification effect from expressed likelihood was independent of the actual probability or valence of the future event. What mattered was whether high versus low likelihood language was used to describe the future event. PMID:23940411
Trempala, J; Malmberg, L E
1998-05-01
The purpose of this study was to describe the effect of a set of individual resources and cultural factors on adolescents' probability estimations of the occurrence of positive future events in three life domains: education, occupation, and family. The hypothesis was that the effects of culture and individual resources are interwoven in the formation process of future orientation. The sample consisted of 352 17-year-old Polish and Finnish girls and boys from vocational and upper secondary schools. The 78-item questionnaire developed by the authors was used to measure different aspects of future orientation (probability, valence, and extension of future events in three life domains) and individual resources (self-esteem, control beliefs, and social knowledge about normativity and the generation gap). Data analysis showed that culture separately affected individual resources and adolescents' expectations. However, the results broadly confirmed the thesis that culture has a limited effect on adolescents' expectations of the occurrence of future events. Moreover, these data suggested that the influence of sociocultural differences on adolescents' probability estimations is indirect. In the context of the presented data, the authors discuss their model of future orientation.
Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis
NASA Astrophysics Data System (ADS)
Verendel, Vilhelm; Häggström, Olle
2017-01-01
The Great Filter interpretation of Fermi's great silence asserts that Npq is not a very large number, where N is the number of potentially life-supporting planets in the observable universe, p is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and q is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that N is huge, which implies that pq is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards p not being small and therefore a very small q, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.
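A back-of-the-envelope numeric rendering of the Great Filter constraint, with entirely hypothetical magnitudes (the abstract states only that N is huge and Npq is not very large).

```python
N = 1e22          # hypothetical count of potentially life-supporting planets
Npq_upper = 1.0   # reading "Npq is not a very large number" as Npq <= 1

pq_upper = Npq_upper / N
print("pq <=", pq_upper)                      # 1e-22 under these assumptions

# If evidence were to favor a not-so-small p, q must absorb the filter:
for p in (1e-2, 1e-6, 1e-10):
    print(f"p = {p:g}  ->  q <= {pq_upper / p:g}")
```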
Probabilistic Assessment of Cancer Risk from Solar Particle Events
NASA Astrophysics Data System (ADS)
Kim, Myung-Hee Y.; Cucinotta, Francis A.
For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
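An illustrative sketch of drawing SPE occurrence times from a non-homogeneous Poisson process by thinning, assuming a hypothetical sinusoidal rate over an 11-year solar cycle; the rate function actually fitted to the historical proton database is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def spe_rate(t_years):
    """Hypothetical SPE rate (events/year) varying over an 11-year solar cycle."""
    return 6.0 + 5.0 * np.sin(2 * np.pi * t_years / 11.0)

def simulate_nhpp(rate_fn, t_max, rate_max):
    """Thinning (Lewis-Shedler) sampler for a non-homogeneous Poisson process."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)      # candidate from a rate_max process
        if t > t_max:
            return np.array(events)
        if rng.random() < rate_fn(t) / rate_max:  # accept with probability rate/rate_max
            events.append(t)

mission_events = simulate_nhpp(spe_rate, t_max=3.0, rate_max=11.0)  # 3-year mission window
print(len(mission_events), "simulated SPEs at times (yr):", np.round(mission_events, 2))
```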
Probabilistic Assessment of Cancer Risk from Solar Particle Events
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Cucinotta, Francis A.
2010-01-01
For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
NASA Astrophysics Data System (ADS)
Zarola, Amit; Sil, Arjun
2018-04-01
This study presents forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (gamma, lognormal, Weibull and log-logistic) and an updated catalog of earthquakes of magnitude Mw ≥ 6.0 that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy expected to be released per year (a × 10^20 ergs/year) exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better-fitting model. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9) and 24 August 2016 Myanmar earthquake (M 6.8) lie in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area, indicating that the proposed techniques and models yield good forecasting accuracy.
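A minimal sketch of the model-comparison step (fitting the four candidate distributions to inter-event times and comparing log-likelihoods), using hypothetical inter-event times rather than the 1737-2015 catalog.

```python
import numpy as np
from scipy import stats

# Hypothetical inter-event times (years) between Mw >= 6.0 earthquakes
intervals = np.array([3.1, 0.8, 5.6, 2.2, 1.4, 7.9, 4.3, 0.6, 2.9, 3.7,
                      1.1, 6.2, 2.5, 0.9, 4.8])

candidates = {
    "gamma":        stats.gamma,
    "lognormal":    stats.lognorm,
    "weibull":      stats.weibull_min,
    "log-logistic": stats.fisk,       # scipy's name for the log-logistic distribution
}

for name, dist in candidates.items():
    params = dist.fit(intervals, floc=0)              # fix the location at zero
    ln_l = dist.logpdf(intervals, *params).sum()      # higher ln L = better fit
    print(f"{name:12s}  ln L = {ln_l:7.2f}")
```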
Future southcentral US wildfire probability due to climate change
Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.
2018-01-01
Globally, changing fire regimes due to climate are among the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. Greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate conditions at which the sign of the fire probability response (+, −) may change (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.
Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes
ERIC Educational Resources Information Center
Denison, Stephanie; Xu, Fei
2010-01-01
Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas "et al.", 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…
Arechar, Antonio A.; Kouchaki, Maryam; Rand, David G.
2018-01-01
We had participants play two sets of repeated Prisoner’s Dilemma (RPD) games, one with a large continuation probability and the other with a small continuation probability, as well as Dictator Games (DGs) before and after the RPDs. We find that, regardless of which RPD set is played first, participants typically cooperate when the continuation probability is large and defect when the continuation probability is small. However, there is an asymmetry in behavior when transitioning from one continuation probability to the other. When switching from large to small, transient higher levels of cooperation are observed in the early games of the small continuation set. Conversely, when switching from small to large, cooperation is immediately high in the first game of the large continuation set. We also observe that response times increase when transitioning between sets of RPDs, except for altruistic participants transitioning into the set of RPDs with long continuation probabilities. These asymmetries suggest a bias in favor of cooperation. Finally, we examine the link between altruism and RPD play. We find that small continuation probability RPD play is correlated with giving in DGs played before and after the RPDs, whereas high continuation probability RPD play is not. PMID:29809199
Mavromoustakos, Elena; Clark, Gavin I; Rock, Adam J
2016-01-01
Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, Probability Scale (measuring perceived probability of flying-negative events, general-negative and general positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying phobic group estimated the probability of flying negative and general negative events occurring as significantly higher than did the non-flying-phobic group. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined.
Mavromoustakos, Elena; Clark, Gavin I.; Rock, Adam J.
2016-01-01
Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, Probability Scale (measuring perceived probability of flying-negative events, general-negative and general positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying phobic group estimated the probability of flying negative and general negative events occurring as significantly higher than did the non-flying-phobic group. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined. PMID:27557054
Modeling climate change impacts on water trading.
Luo, Bin; Maqsood, Imran; Gong, Yazhen
2010-04-01
This paper presents a new method for evaluating the impacts of climate change on the long-term performance of water trading programs, based on an indicator that measures the mean periodic water volume that can be released by trading within a water-use system. The indicator is computed with a stochastic optimization model which can reflect the random uncertainty of water availability. The developed method was demonstrated in the Swift Current Creek watershed of Prairie Canada under two future scenarios simulated by a Canadian Regional Climate Model, in which total water availabilities under future scenarios were estimated using a monthly water balance model. Frequency analysis was performed to obtain the best probability distributions for both observed and simulated water quantity data. Results from the case study indicate that the performance of a trading system is highly scenario-dependent in a future climate, with trading effectiveness ranging from highly promising to undesirable across the future scenarios. Trading effectiveness also largely depends on trading costs, with high costs resulting in failure of the trading program. (c) 2010 Elsevier B.V. All rights reserved.
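The indicator itself comes from a stochastic optimization model that is not described in the abstract; as a crude stand-in, the sketch below Monte Carlo-estimates the mean periodic water volume in excess of a fixed consumptive demand under a fitted availability distribution, hypothetically treating that surplus as tradable.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical: periodic water availability fitted with a lognormal distribution
availability = stats.lognorm(s=0.5, scale=80.0)   # units: 10^6 m^3 per period
demand = 70.0                                      # fixed consumptive demand, same units

draws = availability.rvs(size=100_000, random_state=rng)
tradable = np.maximum(draws - demand, 0.0)         # surplus assumed tradable

print("mean periodic tradable volume:", round(tradable.mean(), 2))
print("probability of no tradable water:", round((tradable == 0).mean(), 3))
```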
Ensemble tropical-extratropical cyclone coastal flood hazard assessment with climate change
NASA Astrophysics Data System (ADS)
Orton, P. M.; Lin, N.; Colle, B.
2016-12-01
A challenge with quantifying future changes in coastal flooding for the U.S. East Coast is that climate change has varying effects on different types of storms, in addition to raising mean sea levels. Moreover, future flood hazard uncertainties are large and come from many sources. Here, a new coastal flood hazard assessment approach is demonstrated that separately evaluates and then combines probabilities of storm tide generated from tropical cyclones (TCs) and extratropical cyclones (ETCs). The separation enables us to incorporate climate change impacts on both types of storms. The assessment accounts for epistemic storm tide uncertainty using an ensemble of different prior studies and methods of assessment, merged with uncertainty in climate change effects on storm tides and sea levels. The assessment is applied for New York Harbor, under the auspices of the New York City Panel on Climate Change (NPCC). In the New York Bight region and much of the U.S. East Coast, differing flood exceedance curve slopes for TCs and ETCs arise due to their differing physics. It is demonstrated how errors can arise for this region from mixing together storm types in an extreme value statistical analysis, a common practice when using observations. The effects of climate change on TC and ETC flooding have recently been assessed for this region, for TCs using a Global Climate Model (GCM) driven hurricane model with hydrodynamic modeling, and for ETCs using a GCM-driven multilinear regression-based storm surge model. The results of these prior studies are applied to our central estimates of the flood exceedance curve probabilities, transforming them for climate change effects. The results are useful for decision-makers because they highlight the large uncertainty in present-day and future flood risk, and also for scientists because they identify the areas where further research is most needed.
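A schematic of the separate-then-combine idea: if tropical and extratropical storm-tide exceedances are treated as independent, their annual exceedance probabilities combine as shown below (the exceedance values are hypothetical, and the study's actual merging of ensembles and uncertainties is more involved).

```python
import numpy as np

heights = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])          # storm tide (m), hypothetical
p_tc  = np.array([0.20, 0.08, 0.030, 0.012, 0.005, 0.002])  # annual exceedance, TCs
p_etc = np.array([0.50, 0.15, 0.040, 0.008, 0.001, 0.0002]) # annual exceedance, ETCs

# Assuming TC and ETC flooding are independent, the combined annual probability of
# exceeding each height is 1 - P(no TC exceedance) * P(no ETC exceedance)
p_combined = 1.0 - (1.0 - p_tc) * (1.0 - p_etc)

for h, p in zip(heights, p_combined):
    print(f"{h:.1f} m : {p:.4f} per year  (~{1 / p:.0f}-yr return period)")
```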
Scandinavian epidemiological research in gastroenterology and hepatology.
Björnsson, Einar S; Ekbom, Anders
2015-06-01
In recent decades, a large number of epidemiological studies in gastroenterology and hepatology have originated from the Scandinavian countries. With the help of large health databases with good validity, together with other registries related to patient outcomes, researchers from the Scandinavian countries have been able to make some very important contributions to the field. These countries, Sweden, Norway, Finland, Denmark and Iceland, all have universal access to health care and have proved to be ideal settings for epidemiological research. Population-based studies have been frequent and follow-up studies have been able to describe the temporal trends and changes in phenotypes. Our ability in Scandinavia to follow up defined groups of patients over time has been crucial for learning the natural history of many gastrointestinal and liver diseases, often in a population-based setting. Patient-related outcome measures will probably gain increasing importance in the future, but Scandinavian gastroenterologists and surgeons are likely to have a better infrastructure for such endeavors compared to most other populations. Thus, there is a bright future for internationally competitive research within the field of gastrointestinal and liver diseases in Scandinavia.
Computing Earthquake Probabilities on Global Scales
NASA Astrophysics Data System (ADS)
Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.
2016-03-01
Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
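A minimal sketch of the counting idea described above, with hypothetical Weibull parameters: the number of small events since the last large one is converted into a probability through a Weibull law on the count.

```python
import numpy as np

def large_event_probability(n_small_since_last: int,
                            tau: float = 300.0,
                            beta: float = 1.4) -> float:
    """Weibull CDF applied to the small-event count (tau and beta are hypothetical
    scale and shape parameters, not values from the study)."""
    return 1.0 - np.exp(-(n_small_since_last / tau) ** beta)

for n in (50, 150, 300, 600):
    print(f"{n:4d} small events since the last large one -> P = "
          f"{large_event_probability(n):.2f}")
```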
Kennedy, Patricia L.; Bartuszevige, Anne M.; Houle, Marcy; Humphrey, Ann B.; Dugger, Katie M.; Williams, John
2014-01-01
Potential for large prairie remnants to provide habitat for grassland-obligate wildlife may be compromised by nonsustainable range-management practices. In 1979–1980, high nesting densities of 3 species of hawks in the genus Buteo—Ferruginous Hawk (Buteo regalis), Red-tailed Hawk (B. jamaicensis), and Swainson's Hawk (B. swainsoni)—were documented on the Zumwalt Prairie and surrounding agricultural areas (34,361 ha) in northeastern Oregon, USA. This area has been managed primarily as livestock summer range since it was homesteaded. Unlike in other prairie remnants, land management on the Zumwalt Prairie was consistent over the past several decades; thus, we predicted that territory occupancy of these 3 species would be stable. We also predicted that territory occupancy would be positively related to local availability of nesting structures within territories. We evaluated these hypotheses using a historical dataset, current survey and habitat data, and occupancy models. In support of our predictions, territory occupancy of all 3 species has not changed over the study period of ∼25 yr, which suggests that local range-management practices are not negatively affecting these taxa. Probability of Ferruginous Hawk occupancy increased with increasing area of aspen, an important nest structure for this species in grasslands. Probability of Swainson's Hawk occupancy increased with increasing area of large shrubs, and probability of Red-tailed Hawk occupancy was weakly associated with area of conifers. In the study area, large shrubs and conifers are commonly used as nesting structures by Swainson's Hawks and Red-tailed Hawks, respectively. Availability of these woody species is changing (increases in conifers and large shrubs, and decline in aspen) throughout the west, and these changes may result in declines in Ferruginous Hawk occupancy and increases in Swainson's Hawk and Red-tailed Hawk occupancy in the future.
Unprecedented Zipangu Underworld of the Moon Exploration (UZUME)
NASA Astrophysics Data System (ADS)
Haruyama, J.; Kawano, I.; Kubota, T.; Otsuki, M.; Kato, H.; Nishibori, T.; Iwata, T.; Yamamoto, Y.; Nagamatsu, A.; Shimada, K.; Ishihara, Y.; Hasenaka, T.; Morota, T.; Nishino, M. N.; Hashizume, K.; Saiki, K.; Shirao, M.; Komatsu, G.; Hasebe, N.; Shimizu, H.; Miyamoto, H.; Kobayashi, K.; Yokobori, S.; Michikami, T.; Yamamoto, S.; Yokota, Y.; Arisumi, H.; Ishigami, G.; Furutani, K.; Michikawa, Y.
2014-04-01
On the Moon, three huge vertical holes (several tens to a hundred meters in diameter and depth) were discovered in SELENE (nicknamed Kaguya) Terrain Camera data of 10 m pixel resolution. These holes are probably skylights of large underground caverns such as lava tubes or magma chambers. The huge holes and their associated subsurface caverns are among the most important future exploration targets, both from the viewpoint of constructing lunar bases and for many scientific reasons. We are now planning to explore the caverns through the skylight holes. We have named the project UZUME (Unprecedented Zipangu (Japan) Underworld of the Moon Exploration).
Advancing Risk Assessment through the Application of Systems Toxicology
Sauer, John Michael; Kleensang, André; Peitsch, Manuel C.; Hayes, A. Wallace
2016-01-01
Risk assessment is the process of quantifying the probability of a harmful effect to individuals or populations from human activities. Mechanistic approaches to risk assessment have been generally referred to as systems toxicology. Systems toxicology makes use of advanced analytical and computational tools to integrate classical toxicology and quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Three presentations, including two case studies involving both in vitro and in vivo approaches, described the current state of systems toxicology and the potential for its future application in chemical risk assessment. PMID:26977253
General Astrophysics with TPF: Not Just Dark Energy
NASA Technical Reports Server (NTRS)
Kuchner, Marc
2006-01-01
Besides searching for Earth-like planets, TPF can study Jupiters, Neptunes, and all sorts of exotic planets. It can image debris disks, YSO disks, AGN disks, maybe even AGB disks. And you are probably aware that a large optical space telescope like TPF-C or TPF-O can be a fantastic tool for studying the equation of state of dark energy. I will review some of the future science of TPF-C, TPF-I and TPF-O, focusing on the applications of TPF to the study of objects in our Galaxy: especially circumstellar disks and planets other than exo-Earths.
The story of an ambivalent relationship: Sigmund Freud and Eugen Bleuler.
Falzeder, Ernst
2007-06-01
This paper examines the short-lived flirtation between psychoanalysis and academia and psychiatry in Europe and the reasons for, and consequences of, the fact that their paths diverged. It is argued that Bleuler's break with the psychoanalytic movement is a crucial and, until now, largely underestimated turning point. Bleuler's separation from the psychoanalytic movement was probably more important for the course it has since taken than those of Adler, Stekel, or even Jung. Bleuler's analysis by correspondence by Freud, and its failure, was of paramount importance for the future relationship between Freud and Bleuler, and for Bleuler's assessment of psychoanalysis.
Using Extreme Tropical Precipitation Statistics to Constrain Future Climate States
NASA Astrophysics Data System (ADS)
Igel, M.; Biello, J. A.
2017-12-01
Tropical precipitation is characterized by a rapid growth in mean intensity as the column humidity increases. This behavior is examined in both a cloud resolving model and with high-resolution observations of precipitation and column humidity from CloudSat and AIRS, respectively. The model and the observations exhibit remarkable consistency and suggest a new paradigm for extreme precipitation. We show that the total precipitation can be decomposed into a product of contributions from a mean intensity, a probability of precipitation, and a global PDF of column humidity values. We use the modeling and observational results to suggest simple, analytic forms for each of these functions. The analytic representations are then used to construct a simple expression for the global accumulated precipitation as a function of the parameters of each of the component functions. As the climate warms, extreme precipitation intensity and global precipitation are expected to increase, though at different rates. When these predictions are incorporated into the new analytic expression for total precipitation, predictions for changes due to global warming to the probability of precipitation and the PDF of column humidity can be made. We show that strong constraints can be imposed on the future shape of the PDF of column humidity but that only weak constraints can be set on the probability of precipitation. These are largely imposed by the intensification of extreme precipitation. This result suggests that understanding precisely how extreme precipitation responds to climate warming is critical to predicting other impactful properties of global hydrology. The new framework can also be used to confirm and discount existing theories for shifting precipitation.
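A schematic numerical version of the decomposition described above, with assumed functional forms: an exponential pickup of mean intensity above a critical column humidity, a logistic probability of precipitating, and a Gaussian-like PDF of column humidity; integrating their product over humidity yields the accumulated precipitation.

```python
import numpy as np

w = np.linspace(20.0, 75.0, 600)   # column water vapor (mm), hypothetical range
dw = w[1] - w[0]

def mean_intensity(w, w_crit=55.0, i0=0.5, k=0.15):
    """Mean precipitation intensity (mm/hr): rapid growth above w_crit."""
    return i0 * np.exp(k * np.clip(w - w_crit, 0.0, None))

def precip_probability(w, w_mid=50.0, width=3.0):
    """Probability of precipitating at all: a smooth logistic ramp in humidity."""
    return 1.0 / (1.0 + np.exp(-(w - w_mid) / width))

def humidity_pdf(w, mu=45.0, sigma=8.0):
    """PDF of column humidity values (normalized numerically on the grid)."""
    pdf = np.exp(-0.5 * ((w - mu) / sigma) ** 2)
    return pdf / (pdf.sum() * dw)

total = (mean_intensity(w) * precip_probability(w) * humidity_pdf(w)).sum() * dw
print("domain-mean precipitation rate (mm/hr):", round(total, 3))
```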
Reducing the Risk of Human Space Missions with INTEGRITY
NASA Technical Reports Server (NTRS)
Jones, Harry W.; Dillon-Merill, Robin L.; Tri, Terry O.; Henninger, Donald L.
2003-01-01
The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology to decompose the system into subsystems and components, to quantify the failure risk as a function of the design elements and their corresponding probability of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.
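A toy illustration of the PRA idea of composing component failure probabilities into a system-level estimate, with hypothetical subsystem probabilities; a real PRA uses event and fault trees, common-cause terms, and uncertainty distributions rather than this simple independence calculation.

```python
# Hypothetical mission-critical subsystems and their per-mission failure probabilities
subsystem_failure = {
    "life_support": 0.004,
    "power":        0.002,
    "propulsion":   0.006,
    "avionics":     0.001,
}

def loss_of_mission_probability(failure_probs):
    """Series logic: the mission is lost if any subsystem fails (assumed independent)."""
    p_success = 1.0
    for p_fail in failure_probs.values():
        p_success *= (1.0 - p_fail)
    return 1.0 - p_success

print("P(loss of mission)      =", round(loss_of_mission_probability(subsystem_failure), 5))

# Effect of adding a redundant propulsion string (both strings must fail, assumed independent)
subsystem_failure["propulsion"] = 0.006 ** 2
print("with propulsion backup  =", round(loss_of_mission_probability(subsystem_failure), 5))
```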
Quantum probability, choice in large worlds, and the statistical structure of reality.
Ross, Don; Ladyman, James
2013-06-01
Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.
Predicting relapse risk in childhood acute lymphoblastic leukaemia.
Teachey, David T; Hunger, Stephen P
2013-09-01
Intensive multi-agent chemotherapy regimens and the introduction of risk-stratified therapy have substantially improved cure rates for children with acute lymphoblastic leukaemia (ALL). Current risk allocation schemas are imperfect, as some children who are classified as lower-risk and treated with less intensive therapy relapse, while others deemed higher-risk are probably over-treated. Most cooperative groups previously used morphological clearance of blasts in blood and marrow during the initial phases of chemotherapy as a primary factor for risk group allocation; however, this has largely been replaced by the detection of minimal residual disease (MRD). Other than age and white blood cell count (WBC) at presentation, many clinical variables previously used for risk group allocation are no longer prognostic, as MRD and the presence of sentinel genetic lesions are more reliable at predicting outcome. Currently, a number of sentinel genetic lesions are used by most cooperative groups for risk stratification; however, in the near future patients will probably be risk-stratified using genomic signatures and clustering algorithms, rather than individual genetic alterations. This review will describe the clinical, biological, and response-based features known to predict relapse risk in childhood ALL, including those currently used and those likely to be used in the near future to risk-stratify therapy. © 2013 John Wiley & Sons Ltd.
Emotional Stress as a Risk for Hypertension in Sub-Saharan Africans: Are We Ignoring the Odds?
Malan, Leoné; Malan, Nico T
2017-01-01
Globally most interventions focus on improving lifestyle habits and treatment regimens to combat hypertension as a non-communicable disease (NCD). However, despite these interventions and improved medical treatments, blood pressure (BP) values are still on the rise and poorly controlled in sub-Saharan Africa (SSA). Other factors contributing to hypertension prevalence, such as chronic emotional stress, might provide some insight for future health policy approaches. Currently, Hypertension Society guidelines do not mention emotional stress as a probable cause for hypertension. Recently, the 2014 World Global Health reports suggested that African governments should consider using World Health Organization hypertension data as a proxy indicator for social well-being. However, the possibility that a stressful life and taxing environmental factors might disturb central neural control of BP regulation has largely been ignored in SSA. Linking emotional stress to vascular dysregulation is therefore one way to investigate increased cardiometabolic challenges, neurotransmitter depletion and disturbed hemodynamics. Disruption of stress response pathways and subsequent changes in lifestyle habits as ways of coping with a stressful life, as probable causes of hypertension prevalence in SSA, may be included in future preventive measures. We will provide an overview of emotional stress and central neural control of BP and will also include the implications thereof for clinical practice in SSA cohorts.
Volcano hazards in the Mount Hood region, Oregon
Scott, W.E.; Pierson, T.C.; Schilling, S.P.; Costa, J.E.; Gardner, C.A.; Vallance, J.W.; Major, J.J.
1997-01-01
Mount Hood is a potentially active volcano close to rapidly growing communities and recreation areas. The most likely widespread and hazardous consequence of a future eruption will be for lahars (rapidly moving mudflows) to sweep down the entire length of the Sandy (including the Zigzag) and White River valleys. Lahars can be generated by hot volcanic flows that melt snow and ice or by landslides from the steep upper flanks of the volcano. Structures close to river channels are at greatest risk of being destroyed. The degree of hazard decreases as height above a channel increases, but large lahars can affect areas more than 30 vertical meters (100 vertical feet) above river beds. The probability of eruption-generated lahars affecting the Sandy and White River valleys is 1-in-15 to 1-in-30 during the next 30 years, whereas the probability of extensive areas in the Hood River Valley being affected by lahars is about ten times less. The accompanying volcano-hazard-zonation map outlines areas potentially at risk and shows that some areas may be too close for a reasonable chance of escape or survival during an eruption. Future eruptions of Mount Hood could seriously disrupt transportation (air, river, and highway), some municipal water supplies, and hydroelectric power generation and transmission in northwest Oregon and southwest Washington.
Teamwork Training Needs Analysis for Long-Duration Exploration Missions
NASA Technical Reports Server (NTRS)
Smith-Jentsch, Kimberly A.; Sierra, Mary Jane
2016-01-01
The success of future long-duration exploration missions (LDEMs) will be determined largely by the extent to which mission-critical personnel possess and effectively exercise essential teamwork competencies throughout the entire mission lifecycle (e.g., Galarza & Holland, 1999; Hysong, Galarza, & Holland, 2007; Noe, Dachner, Saxton, & Keeton, 2011). To ensure that such personnel develop and exercise these necessary teamwork competencies prior to and over the full course of future LDEMs, it is essential that a teamwork training curriculum be developed and put into place at NASA that is both 1) comprehensive, in that it targets all teamwork competencies critical for mission success and 2) structured around empirically-based best practices for enhancing teamwork training effectiveness. In response to this demand, the current teamwork-oriented training needs analysis (TNA) was initiated to 1) identify the teamwork training needs (i.e., essential teamwork-related competencies) of future LDEM crews, 2) identify critical gaps within NASA’s current and future teamwork training curriculum (i.e., gaps in the competencies targeted and in the training practices utilized) that threaten to impact the success of future LDEMs, and to 3) identify a broad set of practical nonprescriptive recommendations for enhancing the effectiveness of NASA’s teamwork training curriculum in order to increase the probability of future LDEM success.
Array coding for large data memories
NASA Technical Reports Server (NTRS)
Tranter, W. H.
1982-01-01
It is pointed out that an array code is a convenient method for storing large quantities of data. In a typical application, the array consists of N data words having M symbols in each word. The probability of undetected error is considered, taking into account the three symbol error probabilities of interest, and a formula for determining the probability of undetected error is derived. Attention is given to the possibility of reading data into the array using a digital communication system with symbol error probability p. Two different schemes are found to be of interest. The analysis of array coding shows that the probability of undetected error is very small even for relatively large arrays.
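The abstract reports the conclusion but not the formula. As a hedged illustration only, the following Python sketch estimates the undetected-error probability by Monte Carlo for a simple binary array code with one parity bit per row and per column; this scheme, the array size, and the symbol error probability are assumptions for illustration, not necessarily the code analyzed in the paper.

```python
import numpy as np

def undetected_error_rate(n_rows=32, n_cols=32, p=1e-3, trials=20_000, seed=0):
    """Monte Carlo estimate of the undetected-error probability for a binary
    array code with one parity bit per row and per column (illustrative scheme).

    An error pattern escapes detection when it is non-zero yet has an even
    number of flipped bits in every row and every column of the stored array.
    """
    rng = np.random.default_rng(seed)
    undetected = 0
    for _ in range(trials):
        # Error pattern over the stored (data + parity) array.
        errors = rng.random((n_rows + 1, n_cols + 1)) < p
        if not errors.any():
            continue  # no error at all, nothing to detect
        rows_ok = (errors.sum(axis=1) % 2 == 0).all()
        cols_ok = (errors.sum(axis=0) % 2 == 0).all()
        if rows_ok and cols_ok:
            undetected += 1
    return undetected / trials

if __name__ == "__main__":
    # With ~1000 stored bits and p = 1e-3, an undetected pattern needs at
    # least four errors forming a rectangle, so the estimate is typically ~0,
    # consistent with the abstract's "very small" conclusion.
    print(undetected_error_rate())
```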
The global impact distribution of Near-Earth objects
NASA Astrophysics Data System (ADS)
Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.
2016-02-01
Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases with objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted against the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking the impact probabilities into account introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and, thus, represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.
Responses of large mammals to climate change.
Hetem, Robyn S; Fuller, Andrea; Maloney, Shane K; Mitchell, Duncan
2014-01-01
Most large terrestrial mammals, including the charismatic species so important for ecotourism, do not have the luxury of rapid micro-evolution or sufficient range shifts as strategies for adjusting to climate change. The rate of climate change is too fast for genetic adaptation to occur in mammals with longevities of decades, typical of large mammals, and landscape fragmentation and population by humans too widespread to allow spontaneous range shifts of large mammals, leaving only the expression of latent phenotypic plasticity to counter effects of climate change. The expression of phenotypic plasticity includes anatomical variation within the same species, changes in phenology, and employment of intrinsic physiological and behavioral capacity that can buffer an animal against the effects of climate change. Whether that buffer will be realized is unknown, because little is known about the efficacy of the expression of plasticity, particularly for large mammals. Future research in climate change biology requires measurement of physiological characteristics of many identified free-living individual animals for long periods, probably decades, to allow us to detect whether expression of phenotypic plasticity will be sufficient to cope with climate change.
Responses of large mammals to climate change
Hetem, Robyn S; Fuller, Andrea; Maloney, Shane K; Mitchell, Duncan
2014-01-01
Most large terrestrial mammals, including the charismatic species so important for ecotourism, do not have the luxury of rapid micro-evolution or sufficient range shifts as strategies for adjusting to climate change. The rate of climate change is too fast for genetic adaptation to occur in mammals with longevities of decades, typical of large mammals, and landscape fragmentation and population by humans too widespread to allow spontaneous range shifts of large mammals, leaving only the expression of latent phenotypic plasticity to counter effects of climate change. The expression of phenotypic plasticity includes anatomical variation within the same species, changes in phenology, and employment of intrinsic physiological and behavioral capacity that can buffer an animal against the effects of climate change. Whether that buffer will be realized is unknown, because little is known about the efficacy of the expression of plasticity, particularly for large mammals. Future research in climate change biology requires measurement of physiological characteristics of many identified free-living individual animals for long periods, probably decades, to allow us to detect whether expression of phenotypic plasticity will be sufficient to cope with climate change. PMID:27583293
Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F
2010-12-01
We assessed the probability of three major natural hazards--windthrow, drought, and forest fire--in Central and South-Eastern European forests; these hazards are major threats to the provision of forest goods and ecosystem services. In addition, we analyzed their spatial distribution and the implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from the climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of potential evapotranspiration (pET) by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of the probability maps. Regions with high probabilities of natural hazards are identified and management strategies to minimize the probability of natural hazards are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework of adaptive risk management.
Code of Federal Regulations, 2014 CFR
2014-07-01
... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...
Code of Federal Regulations, 2013 CFR
2013-07-01
... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...
Peltier, Drew M P; Ibáñez, Inés
2015-01-01
Predicting future forests' structure and functioning is a critical goal for ecologists, thus information on seedling recruitment will be crucial in determining the composition and structure of future forest ecosystems. In particular, seedlings' photosynthetic response to a changing environment will be a key component determining whether particular species establish enough individuals to maintain populations, as growth is a major determinant of survival. We quantified photosynthetic responses of sugar maple (Acer saccharum Marsh.), pignut hickory (Carya glabra Mill.), northern red oak (Quercus rubra L.) and eastern black oak (Quercus velutina Lam.) seedlings to environmental conditions including light habitat, temperature, soil moisture and vapor pressure deficit (VPD) using extensive in situ gas exchange measurements spanning an entire growing season. We estimated the parameters in a hierarchical Bayesian version of the Farquhar model of photosynthesis, additionally informed by soil moisture and VPD, and found that maximum Rubisco carboxylation (V(cmax)) and electron transport (J(max)) rates showed significant seasonal variation, but not the peaked patterns observed in studies of adult trees. Vapor pressure deficit and soil moisture limited J(max) and V(cmax) for all four species. Predictions indicate large declines in summer carbon assimilation rates under a 3 °C increase in mean annual temperature projected by climate models, while spring and fall assimilation rates may increase. Our model predicts decreases in summer assimilation rates in gap habitats with at least 90% probability, and with 20-99.9% probability in understory habitats depending on species. Predictions also show 70% probability of increases in fall and 52% probability in spring in understory habitats. All species were impacted, but our findings suggest that oak species may be favored in northeastern North America under projected increases in temperature due to superior assimilation rates under these conditions, though as growing seasons become longer, the effects of climate change on seedling photosynthesis may be complex. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
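For orientation, the study's gas-exchange model builds on the Farquhar formulation of leaf photosynthesis. The sketch below shows the textbook form of that model (Rubisco-limited versus electron-transport-limited assimilation); the kinetic constants and the use of Jmax directly in place of the realized electron transport rate J are simplifying assumptions for illustration, not the hierarchical Bayesian fit or the parameter values estimated in the study.

```python
def farquhar_assimilation(ci, vcmax, jmax, rd=1.0, gamma_star=42.75,
                          kc=404.9, ko=278.4, o=210.0):
    """Net assimilation (umol m-2 s-1) from the textbook Farquhar model.

    ci          intercellular CO2 partial pressure (ubar)
    vcmax       maximum Rubisco carboxylation rate
    jmax        maximum electron transport rate (used here in place of J)
    Constants (gamma_star, kc in ubar; ko, o in mbar) are common 25 C values,
    not the values fitted in the cited study.
    """
    # Rubisco-limited carboxylation rate
    ac = vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o / ko))
    # RuBP-regeneration (electron transport) limited rate
    aj = jmax * (ci - gamma_star) / (4.0 * ci + 8.0 * gamma_star)
    # Net assimilation is the more limiting rate minus dark respiration
    return min(ac, aj) - rd

if __name__ == "__main__":
    # Illustrative parameter values only.
    print(farquhar_assimilation(ci=250.0, vcmax=60.0, jmax=120.0))
```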
NASA Astrophysics Data System (ADS)
Kelley, Luke Zoltan; Mandel, Ilya; Ramirez-Ruiz, Enrico
2013-06-01
The detection of an electromagnetic transient which may originate from a binary neutron star merger can increase the probability that a given segment of data from the LIGO-Virgo ground-based gravitational-wave detector network contains a signal from a binary coalescence. Additional information contained in the electromagnetic signal, such as the sky location or distance to the source, can help rule out false alarms and thus lower the necessary threshold for a detection. Here, we develop a framework for determining how much sensitivity is added to a gravitational-wave search by triggering on an electromagnetic transient. We apply this framework to a variety of relevant electromagnetic transients, from short gamma-ray bursts (GRBs) to signatures of r-process heating to optical and radio orphan afterglows. We compute the expected rates of multimessenger observations in the advanced detector era and find that searches triggered on short GRBs—with current high-energy instruments, such as Fermi—and nucleosynthetic “kilonovae”—with future optical surveys, like the Large Synoptic Survey Telescope—can boost the number of multimessenger detections by 15% and 40%, respectively, for a binary neutron star progenitor model. Short GRB triggers offer precise merger timing but suffer from detection rates decreased by beaming and the high a priori probability that the source is outside the LIGO-Virgo sensitive volume. Isotropic kilonovae, on the other hand, could be commonly observed within the LIGO-Virgo sensitive volume with an instrument roughly an order of magnitude more sensitive than current optical surveys. We propose that the most productive strategy for making multimessenger gravitational-wave observations is using triggers from future deep, optical all-sky surveys, with characteristics comparable to the Large Synoptic Survey Telescope, which could make as many as ten such coincident observations a year.
Implementation of a Collision Probability Prediction Technique for Constellation Maneuver Planning
NASA Technical Reports Server (NTRS)
Concha, Marco a.
2007-01-01
On March 22, 2006, the Space Technology 5 (ST5) constellation spacecraft were successfully delivered to orbit by a Pegasus XL launch vehicle. An unexpected relative motion experienced by the constellation after orbit insertion brought about a problem. Soon after launch, the observed relative position of the inert rocket body was between the leading and the middle spacecraft within the constellation. The successful planning and execution of an orbit maneuver that would create a fly-by of the rocket body was required to establish the formation. This maneuver would create a close approach that needed to conform to predefined collision probability requirements. On April 21, 2006, the ST5 "155" spacecraft performed a large orbit maneuver and successfully passed the inert Pegasus 3rd Stage Rocket Body on April 30, 2006 15:20 UTC at a distance of 2.55 km with a Probability of Collision of less than 1.0E-06. This paper will outline the technique that was implemented to establish the safe planning and execution of the fly-by maneuver. The method makes use of Gaussian distribution models of state covariance to determine underlying probabilities of collision that arise under low velocity encounters. Specific numerical examples used for this analysis are discussed in detail. The mechanics of this technique are explained to foster deeper understanding of the concepts presented and to improve existing processes for use in future constellation maneuver planning.
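A common way to turn a Gaussian state covariance into a collision probability for a short encounter is to project the combined position covariance into the encounter plane and integrate a 2-D Gaussian over the combined hard-body circle. The sketch below shows that generic computation; it is not the specific procedure used for ST5, and the example numbers (covariance, radius, relative velocity) are hypothetical.

```python
import numpy as np

def collision_probability(miss_vector, rel_velocity, cov_combined,
                          hard_body_radius, n_grid=201):
    """Short-encounter collision probability from a 2-D Gaussian in the
    encounter plane (generic sketch, not the ST5 operational tool).

    miss_vector       3-vector between objects at closest approach [m]
    rel_velocity      3-vector of relative velocity at closest approach [m/s]
    cov_combined      3x3 combined position covariance [m^2]
    hard_body_radius  combined radius of the two objects [m]
    """
    # Encounter-plane basis orthogonal to the relative velocity.
    w = rel_velocity / np.linalg.norm(rel_velocity)
    u = np.cross(w, miss_vector)
    u /= np.linalg.norm(u)
    v = np.cross(w, u)
    B = np.vstack([u, v])                 # 2x3 projection matrix
    mu = B @ miss_vector                  # projected miss distance
    cov2 = B @ cov_combined @ B.T         # projected 2x2 covariance

    # Numerically integrate the 2-D Gaussian over the hard-body disc.
    xs = np.linspace(-hard_body_radius, hard_body_radius, n_grid)
    X, Y = np.meshgrid(xs, xs)
    inside = X**2 + Y**2 <= hard_body_radius**2
    pts = np.stack([X - mu[0], Y - mu[1]], axis=-1)
    inv = np.linalg.inv(cov2)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov2)))
    pdf = norm * np.exp(-0.5 * np.einsum('...i,ij,...j->...', pts, inv, pts))
    cell = (xs[1] - xs[0]) ** 2
    return float((pdf * inside).sum() * cell)

if __name__ == "__main__":
    # Hypothetical encounter: 2.55 km miss distance, illustrative covariance.
    print(collision_probability(
        miss_vector=np.array([2550.0, 0.0, 0.0]),
        rel_velocity=np.array([0.0, 7500.0, 0.0]),
        cov_combined=np.diag([500.0**2, 300.0**2, 300.0**2]),
        hard_body_radius=10.0))
```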
NASA Astrophysics Data System (ADS)
Muthsam, O.; Vogler, C.; Suess, D.
2017-12-01
Heat-assisted magnetic recording is widely regarded as the recording technique of the future. For pure hard magnetic grains in high density media with an average diameter of 5 nm and a height of 10 nm, the switching probability is not sufficiently high for use in bit-patterned media. Using a bilayer structure with 50% hard magnetic material with a low Curie temperature and 50% soft magnetic material with a high Curie temperature to obtain more than 99.2% switching probability leads to very large jitter. We propose an optimized material composition to reach a switching probability of Pswitch > 99.2% and simultaneously achieve the narrow transition jitter of pure hard magnetic material. Simulations with a continuous laser spot were performed with the atomistic simulation program VAMPIRE for a single cylindrical recording grain with a diameter of 5 nm and a height of 10 nm. Different configurations of soft magnetic material and different amounts of hard and soft magnetic material were tested and discussed. Within our analysis, a composition with 20% soft magnetic and 80% hard magnetic material reaches the best results with a switching probability Pswitch > 99.2%, an off-track jitter parameter σoff,80/20 = 0.46 nm and a down-track jitter parameter σdown,80/20 = 0.49 nm.
NASA Astrophysics Data System (ADS)
Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.
2017-12-01
Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of getting a hit is N%" or "the probability of an earthquake is N%" involves specifying the assumptions made. Different plausible assumptions yield a wide range of estimates. In both seismology and sports, how to better predict future performance remains an important question.
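The spreadsheet-style calculation described above can be reproduced in a few lines. The sketch below contrasts a time-independent (Poisson) estimate with a time-dependent renewal estimate; the choice of a lognormal recurrence distribution and all numerical values are illustrative assumptions, not the paleoseismic estimates for Cascadia.

```python
import numpy as np
from scipy.stats import lognorm

def poisson_prob(mean_recurrence, window):
    """Time-independent probability of >= 1 event in `window` years."""
    return 1.0 - np.exp(-window / mean_recurrence)

def lognormal_conditional_prob(mean_recurrence, cov, elapsed, window):
    """Time-dependent (renewal) probability of an event in the next `window`
    years, given `elapsed` years since the last one, using a lognormal
    recurrence-time distribution with coefficient of variation `cov`."""
    sigma = np.sqrt(np.log(1.0 + cov**2))
    mu = np.log(mean_recurrence) - 0.5 * sigma**2
    dist = lognorm(s=sigma, scale=np.exp(mu))
    return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

if __name__ == "__main__":
    # Illustrative values only: a longer "full record" mean recurrence versus
    # a shorter "within-cluster" mean recurrence, for a 50-year window.
    full_record, cluster, elapsed = 500.0, 330.0, 318.0
    print(poisson_prob(full_record, 50), poisson_prob(cluster, 50))
    print(lognormal_conditional_prob(cluster, cov=0.5, elapsed=elapsed, window=50))
```

Changing the assumed mean recurrence, coefficient of variation, or elapsed time shows directly how different plausible assumptions yield the wide range of quoted probabilities.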
Earthquake probabilities in the San Francisco Bay Region: 2000 to 2030 - a summary of findings
,
1999-01-01
The San Francisco Bay region sits astride a dangerous “earthquake machine,” the tectonic boundary between the Pacific and North American Plates. The region has experienced major and destructive earthquakes in 1838, 1868, 1906, and 1989, and future large earthquakes are a certainty. The ability to prepare for large earthquakes is critical to saving lives and reducing damage to property and infrastructure. An increased understanding of the timing, size, location, and effects of these likely earthquakes is a necessary component in any effective program of preparedness. This study reports on the probabilities of occurrence of major earthquakes in the San Francisco Bay region (SFBR) for the three decades 2000 to 2030. The SFBR extends from Healdsburg on the northwest to Salinas on the southeast and encloses the entire metropolitan area, including its most rapidly expanding urban and suburban areas. In this study a “major” earthquake is defined as one with M≥6.7 (where M is moment magnitude). As experience from the Northridge, California (M6.7, 1994) and Kobe, Japan (M6.9, 1995) earthquakes has shown us, earthquakes of this size can have a disastrous impact on the social and economic fabric of densely urbanized areas. To reevaluate the probability of large earthquakes striking the SFBR, the U.S. Geological Survey solicited data, interpretations, and analyses from dozens of scientists representing a wide cross-section of the Earth-science community (Appendix A). The primary approach of this new Working Group (WG99) was to develop a comprehensive, regional model for the long-term occurrence of earthquakes, founded on geologic and geophysical observations and constrained by plate tectonics. The model considers a broad range of observations and their possible interpretations. Using this model, we estimate the rates of occurrence of earthquakes and 30-year earthquake probabilities. Our study considers a range of magnitudes for earthquakes on the major faults in the region—an innovation over previous studies of the SFBR that considered only a small number of potential earthquakes of fixed magnitude.
A pilot study of naturally occurring high-probability request sequences in hostage negotiations.
Hughes, James
2009-01-01
In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research.
A PILOT STUDY OF NATURALLY OCCURRING HIGH-PROBABILITY REQUEST SEQUENCES IN HOSTAGE NEGOTIATIONS
Hughes, James
2009-01-01
In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research. PMID:19949541
Earthquake forecast for the Wasatch Front region of the Intermountain West
DuRoss, Christopher B.
2016-04-18
The Working Group on Utah Earthquake Probabilities has assessed the probability of large earthquakes in the Wasatch Front region. There is a 43 percent probability of one or more magnitude 6.75 or greater earthquakes and a 57 percent probability of one or more magnitude 6.0 or greater earthquakes in the region in the next 50 years. These results highlight the threat of large earthquakes in the region.
A Deterministic Approach to Active Debris Removal Target Selection
NASA Astrophysics Data System (ADS)
Lidtke, A.; Lewis, H.; Armellin, R.
2014-09-01
Many decisions, with widespread economic, political and legal consequences, are being considered based on space debris simulations that show that Active Debris Removal (ADR) may be necessary as the concerns about the sustainability of spaceflight are increasing. The debris environment predictions are based on low-accuracy ephemerides and propagators. This raises doubts about the accuracy not only of those prognoses themselves but also of the potential ADR target-lists that are produced. Target selection is considered highly important as removal of many objects will increase the overall mission cost. Selecting the most-likely candidates as soon as possible would be desirable as it would enable accurate mission design and allow thorough evaluation of in-orbit validations, which are likely to occur in the near future, before any large investments are made and implementations realized. One of the primary factors that should be used in ADR target selection is the accumulated collision probability of every object. A conjunction detection algorithm, based on the smart sieve method, has been developed. Another algorithm is then applied to the found conjunctions to compute the maximum and true probabilities of collisions taking place. The entire framework has been verified against the Conjunction Analysis Tools in AGI's Systems Toolkit, and a relative probability error smaller than 1.5% has been achieved in the final maximum collision probability. Two target-lists are produced based on the ranking of the objects according to the probability they will take part in any collision over the simulated time window. These probabilities are computed using the maximum probability approach, which is time-invariant, and estimates of the true collision probability that were computed with covariance information. The top-priority targets are compared, and the impacts of the data accuracy and its decay are highlighted. General conclusions regarding the importance of Space Surveillance and Tracking for the purpose of ADR are also drawn and a deterministic method for ADR target selection, which could reduce the number of ADR missions to be performed, is proposed.
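The per-object ranking metric described above (the probability that an object takes part in any collision over the simulated window) can be built from individual conjunction probabilities. The hedged sketch below assumes independent conjunctions and uses purely hypothetical object names and probabilities; it illustrates the combination rule, not the ARMOR-style pipeline itself.

```python
from collections import defaultdict

def accumulate_collision_probability(conjunctions):
    """Rank objects by accumulated collision probability.

    `conjunctions` is an iterable of (object_a, object_b, p_collision) tuples,
    one per detected conjunction. Assuming independent events, the probability
    that an object is involved in at least one collision over the window is
    1 - prod(1 - p_i) over all of its conjunctions.
    """
    survival = defaultdict(lambda: 1.0)
    for a, b, p in conjunctions:
        survival[a] *= (1.0 - p)
        survival[b] *= (1.0 - p)
    accumulated = {obj: 1.0 - s for obj, s in survival.items()}
    return sorted(accumulated.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # Hypothetical conjunction list, for illustration only.
    events = [("OBJ-A", "OBJ-B", 1e-4), ("OBJ-A", "OBJ-C", 5e-5),
              ("OBJ-B", "OBJ-C", 2e-4)]
    for obj, p in accumulate_collision_probability(events):
        print(obj, p)
```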
Coe, J.A.; Michael, J.A.; Crovelli, R.A.; Savage, W.Z.; Laprade, W.T.; Nashem, W.D.
2004-01-01
Ninety years of historical landslide records were used as input to the Poisson and binomial probability models. Results from these models show that, for precipitation-triggered landslides, approximately 9 percent of the area of Seattle has annual exceedance probabilities of 1 percent or greater. Application of the Poisson model for estimating the future occurrence of individual landslides results in a worst-case scenario map, with a maximum annual exceedance probability of 25 percent on a hillslope near Duwamish Head in West Seattle. Application of the binomial model for estimating the future occurrence of a year with one or more landslides results in a map with a maximum annual exceedance probability of 17 percent (also near Duwamish Head). Slope and geology both play a role in localizing the occurrence of landslides in Seattle. A positive correlation exists between slope and mean exceedance probability, with probability tending to increase as slope increases. Sixty-four percent of all historical landslide locations are within 150 m (500 ft, horizontal distance) of the Esperance Sand/Lawton Clay contact, but within this zone, no positive or negative correlation exists between exceedance probability and distance to the contact.
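The two annual exceedance probabilities named above follow directly from the historical counts. As a hedged sketch (the counts below are illustrative, not the Seattle data): the Poisson model turns a per-cell event rate into 1 - exp(-rate), while the binomial model uses the observed fraction of years with at least one landslide.

```python
import math

def poisson_annual_exceedance(n_events, record_years):
    """Annual probability of one or more landslides under a Poisson model,
    with the rate estimated as events per year of historical record."""
    rate = n_events / record_years
    return 1.0 - math.exp(-rate)

def binomial_annual_exceedance(years_with_events, record_years):
    """Annual probability of a year with one or more landslides under a
    binomial model: the observed fraction of 'landslide years'."""
    return years_with_events / record_years

if __name__ == "__main__":
    # Illustrative counts over a 90-year record (not the published inputs).
    print(poisson_annual_exceedance(n_events=26, record_years=90))            # ~0.25
    print(binomial_annual_exceedance(years_with_events=15, record_years=90))  # ~0.17
```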
Predicting the impact of tsunami in California under rising sea level
NASA Astrophysics Data System (ADS)
Dura, T.; Garner, A. J.; Weiss, R.; Kopp, R. E.; Horton, B.
2017-12-01
The flood hazard for the California coast depends not only on the magnitude, location, and rupture length of Alaska-Aleutian subduction zone earthquakes and their resultant tsunamis, but also on rising sea levels, which combine with tsunamis to produce overall flood levels. The magnitude of future sea-level rise remains uncertain even on the decadal scale, with future sea-level projections becoming even more uncertain at timeframes of a century or more. Earthquake statistics indicate that timeframes of ten thousand to one hundred thousand years are needed to capture rare, very large earthquakes. Because of the different timescales between reliable sea-level projections and earthquake distributions, simply combining the different probabilities in the context of a tsunami hazard assessment may be flawed. Here, we considered 15 earthquakes between Mw 8 to Mw 9.4 bound by -171oW and -140oW of the Alaska-Aleutian subduction zone. We employed 24 realizations at each magnitude with random epicenter locations and different fault length-to-width ratios, and simulated the tsunami evolution from these 360 earthquakes at each decade from the years 2000 to 2200. These simulations were then carried out for different sea-level-rise projections to analyze the future flood hazard for California. Looking at the flood levels at tide gauges, we found that the flood level simulated at, for example, the year 2100 (including respective sea-level change) is different from the flood level calculated by adding the flood for the year 2000 to the sea-level change prediction for the year 2100. This is consistent for all sea-level rise scenarios, and this difference in flood levels range between 5% and 12% for the larger half of the given magnitude interval. Focusing on flood levels at the tide gauge in the Port of Los Angeles, the most probable flood level (including all earthquake magnitudes) in the year 2000 was 5 cm. Depending on the sea-level predictions, in the year 2050 the most probable flood levels could rise to 20 to 30 cm, but increase significantly from 2100 to 2200 to between 0.5 m and 2.5 m. Aside from the significant increase in flood level, it should be noted that the range over which potential most probable flood levels can vary is significant and defines a tremendous challenge for long-term planning of hazard mitigating measures.
Methods, apparatus and system for notification of predictable memory failure
Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.
2017-01-03
A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
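The four steps in the abstract map naturally onto a small monitoring routine. The sketch below is a hedged illustration only: the sensor inputs, the logistic probability model, and the cost-based threshold rule are assumptions introduced here, not details from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class MemoryCondition:
    correctable_errors: int   # ECC-corrected errors observed in the window
    temperature_c: float      # module temperature
    age_hours: float          # time in service

def failure_probability(cond: MemoryCondition) -> float:
    """Toy logistic model mapping observed conditions to a failure
    probability; the weights are illustrative, not from the patent."""
    z = (-6.0 + 0.05 * cond.correctable_errors
         + 0.02 * (cond.temperature_c - 40.0)
         + 0.00005 * cond.age_hours)
    return 1.0 / (1.0 + math.exp(-z))

def failure_threshold(checkpoint_cost_s: float, failure_cost_s: float) -> float:
    """One possible threshold rule: act when the expected cost of ignoring
    the prediction exceeds the cost of acting (e.g. checkpointing)."""
    return checkpoint_cost_s / failure_cost_s

def maybe_notify(cond: MemoryCondition) -> bool:
    p = failure_probability(cond)
    if p > failure_threshold(checkpoint_cost_s=30.0, failure_cost_s=3600.0):
        print(f"predicted future memory failure (p={p:.3f}); notifying runtime")
        return True
    return False

if __name__ == "__main__":
    maybe_notify(MemoryCondition(correctable_errors=180,
                                 temperature_c=70.0, age_hours=20000))
```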
United States Geological Survey fire science: fire danger monitoring and forecasting
Eidenshink, Jeff C.; Howard, Stephen M.
2012-01-01
Each day, the U.S. Geological Survey produces 7-day forecasts for all Federal lands of the distributions of number of ignitions, number of fires above a given size, and conditional probabilities of fires growing larger than a specified size. The large fire probability map is an estimate of the likelihood that ignitions will become large fires. The large fire forecast map is a probability estimate of the number of fires on federal lands exceeding 100 acres in the forthcoming week. The ignition forecast map is a probability estimate of the number of fires on Federal land greater than 1 acre in the forthcoming week. The extreme event forecast is the probability estimate of the number of fires on Federal land that may exceed 5,000 acres in the forthcoming week.
Domestication and Breeding of Tomatoes: What have We Gained and What Can We Gain in the Future?
Bai, Yuling; Lindhout, Pim
2007-01-01
Background: It has been shown that a large variation is present and exploitable from wild Solanum species but most of it is still untapped. Considering the thousands of Solanum accessions in different gene banks and probably even more that are still untouched in the Andes, it is a challenge to exploit the diversity of tomato. What have we gained from tomato domestication and breeding and what can we gain in the future? Scope: This review summarizes progress on tomato domestication and breeding and current efforts in tomato genome research. Also, it points out potential challenges in exploiting tomato biodiversity and depicts future perspectives in tomato breeding with the emerging knowledge from tomato-omics. Conclusions: From first domestication to modern breeding, the tomato has been continually subjected to human selection for a wide array of applications in both science and commerce. Current efforts in tomato breeding are focused on discovering and exploiting genes for the most important traits in tomato germplasm. In the future, breeders will design cultivars by a process named ‘breeding by design’ based on the combination of science and technologies from the genomic era as well as their practical skills. PMID:17717024
Dosimetry in nuclear medicine therapy: radiobiology application and results.
Strigari, L; Benassi, M; Chiesa, C; Cremonesi, M; Bodei, L; D'Andrea, M
2011-04-01
The linear quadratic model (LQM) has largely been used to assess the radiobiological damage to tissue by external beam fractionated radiotherapy and more recently has been extended to encompass a general continuous, time-varying dose-rate protocol such as targeted radionuclide therapy (TRT). In this review, we provide the basic aspects of radiobiology, from a theoretical point of view, starting from the "four Rs" of radiobiology and introducing the biologically effective doses, which may be used to quantify the impact of a treatment on both tumors and normal tissues. We also present the main parameters required in the LQM, illustrate the main models of tumor control probability and normal tissue complication probability, and summarize the main dose-effect responses reported in the literature, which demonstrate the tentative link between targeted radiotherapy doses and those used in conventional radiotherapy. A better understanding of the radiobiology and mechanisms of action of TRT could contribute to describe the clinical data and guide the development of future compounds and the designing of prospective clinical trials.
Ochratoxin A in Moroccan foods: occurrence and legislation.
Zinedine, Abdellah
2010-05-01
Ochratoxin A (OTA) is a secondary metabolite naturally produced in food and feed by toxigenic fungi, especially some Aspergillus species and Penicillium verrucosum. OTA is one of the most studied mycotoxins and is of great interest due to its toxic effects on humans and animals. OTA is produced in different food and feed matrices and contaminates a large range of base foods including cereals and derivatives, spices, dried fruits, wine and coffee, etc. Morocco, a North African country, has a climate characterized by high humidity and temperature, which probably favors the growth of molds. This contribution gives an overview of principal investigations about the presence of OTA in foods available in Morocco. Due to its toxicity, OTA presence is increasingly regulated worldwide, especially in countries of the European Union. However, up until now, no regulatory limits have been in force in Morocco, probably due to a lack of awareness of the health and economic problems resulting from OTA contamination. Finally, recommendations and future research directions required to assess the situation completely are given.
Ochratoxin A in Moroccan Foods: Occurrence and Legislation
Zinedine, Abdellah
2010-01-01
Ochratoxin A (OTA) is a secondary metabolite naturally produced in food and feed by toxigenic fungi, especially some Aspergillus species and Penicillium verrucosum. OTA is one of the most studied mycotoxins and is of great interest due to its toxic effects on humans and animals. OTA is produced in different food and feed matrices and contaminates a large range of base foods including cereals and derivatives, spices, dried fruits, wine and coffee, etc. Morocco, a North African country, has a climate characterized by high humidity and temperature, which probably favors the growth of molds. This contribution gives an overview of principal investigations about the presence of OTA in foods available in Morocco. Due to its toxicity, OTA presence is increasingly regulated worldwide, especially in countries of the European Union. However, up until now, no regulatory limits have been in force in Morocco, probably due to a lack of awareness of the health and economic problems resulting from OTA contamination. Finally, recommendations and future research directions required to assess the situation completely are given. PMID:22069630
A temporal forecast of radiation environments for future space exploration missions.
Kim, Myung-Hee Y; Cucinotta, Francis A; Wilson, John W
2007-06-01
The understanding of future space radiation environments is an important goal for space mission operations, design, and risk assessment. We have developed a solar cycle statistical model in which sunspot number is coupled to space-related quantities, such as the galactic cosmic radiation (GCR) deceleration potential (phi) and the mean occurrence frequency of solar particle events (SPEs). Future GCR fluxes were derived from a predictive model, in which the temporal dependence represented by phi was derived from GCR flux and ground-based Climax neutron monitor rate measurements over the last four decades. These results showed that the point dose equivalent inside a typical spacecraft in interplanetary space was influenced by solar modulation by up to a factor of three. It also has been shown that a strong relationship exists between large SPE occurrences and phi. For future space exploration missions, cumulative probabilities of SPEs at various integral fluence levels during short-period missions were defined using a database of proton fluences of past SPEs. Analytic energy spectra of SPEs at different ranks of the integral fluences for energies greater than 30 MeV were constructed over broad energy ranges extending out to GeV for the analysis of representative exposure levels at those fluences. Results will guide the design of protection systems for astronauts during future space exploration missions.
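As a hedged illustration of the mission-level SPE probability described above, the sketch below assumes SPE occurrences follow a homogeneous Poisson process whose rate is taken from a historical event count; the real model in the paper ties occurrence frequency to solar cycle phase, which this simplification ignores, and the example numbers are hypothetical.

```python
import math

def prob_at_least_one_spe(historical_count, historical_years, mission_days):
    """Probability of >= 1 SPE above a chosen fluence threshold during a
    mission, assuming Poisson occurrences at the historically observed rate
    (solar-cycle dependence deliberately neglected in this sketch)."""
    rate_per_day = historical_count / (historical_years * 365.25)
    return 1.0 - math.exp(-rate_per_day * mission_days)

if __name__ == "__main__":
    # Hypothetical inputs: 40 events above the threshold observed over
    # roughly four solar cycles (~44 years), for a 180-day mission.
    print(prob_at_least_one_spe(historical_count=40,
                                historical_years=44,
                                mission_days=180))
```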
Electrofishing effort requirements for estimating species richness in the Kootenai River, Idaho
Watkins, Carson J.; Quist, Michael C.; Shepard, Bradley B.; Ireland, Susan C.
2016-01-01
This study was conducted on the Kootenai River, Idaho to provide insight on sampling requirements to optimize future monitoring effort associated with the response of fish assemblages to habitat rehabilitation. Our objective was to define the electrofishing effort (m) needed to have a 95% probability of sampling 50, 75, and 100% of the observed species richness and to evaluate the relative influence of depth, velocity, and instream woody cover on sample size requirements. Sidechannel habitats required more sampling effort to achieve 75 and 100% of the total species richness than main-channel habitats. The sampling effort required to have a 95% probability of sampling 100% of the species richness was 1100 m for main-channel sites and 1400 m for side-channel sites. We hypothesized that the difference in sampling requirements between main- and side-channel habitats was largely due to differences in habitat characteristics and species richness between main- and side-channel habitats. In general, main-channel habitats had lower species richness than side-channel habitats. Habitat characteristics (i.e., depth, current velocity, and woody instream cover) were not related to sample size requirements. Our guidelines will improve sampling efficiency during monitoring effort in the Kootenai River and provide insight on sampling designs for other large western river systems where electrofishing is used to assess fish assemblages.
A Bayesian blind survey for cold molecular gas in the Universe
NASA Astrophysics Data System (ADS)
Lentati, L.; Carilli, C.; Alexander, P.; Walter, F.; Decarli, R.
2014-10-01
A new Bayesian method for performing an image domain search for line-emitting galaxies is presented. The method uses both spatial and spectral information to robustly determine the source properties, employing either simple Gaussian or other physically motivated models, whilst using the evidence to determine the probability that the source is real. In this paper, we describe the method and its application to both a simulated data set and a blind survey for cold molecular gas using observations of the Hubble Deep Field-North taken with the Plateau de Bure Interferometer. We make a total of six robust detections in the survey, five of which have counterparts in other observing bands. We identify the most secure detections found in a previous investigation, while finding one new probable line source with an optical ID not seen in the previous analysis. This study acts as a pilot application of Bayesian statistics to future searches to be carried out both for low-J CO transitions of high-redshift galaxies using the Jansky Very Large Array (JVLA), and at millimetre wavelengths with the Atacama Large Millimeter/submillimeter Array (ALMA), enabling the inference of robust scientific conclusions about the history of the molecular gas properties of star-forming galaxies in the Universe through cosmic time.
Testing physical models for dipolar asymmetry with CMB polarization
NASA Astrophysics Data System (ADS)
Contreras, D.; Zibin, J. P.; Scott, D.; Banday, A. J.; Górski, K. M.
2017-12-01
The cosmic microwave background (CMB) temperature anisotropies exhibit a large-scale dipolar power asymmetry. To determine whether this is due to a real, physical modulation or is simply a large statistical fluctuation requires the measurement of new modes. Here we forecast how well CMB polarization data from Planck and future experiments will be able to confirm or constrain physical models for modulation. Fitting several such models to the Planck temperature data allows us to provide predictions for polarization asymmetry. While for some models and parameters Planck polarization will decrease error bars on the modulation amplitude by only a small percentage, we show, importantly, that cosmic-variance-limited (and in some cases even Planck) polarization data can decrease the errors by considerably better than the expectation of √2 based on simple ℓ-space arguments. We project that if the primordial fluctuations are truly modulated (with parameters as indicated by Planck temperature data) then Planck will be able to make a 2σ detection of the modulation model with 20%-75% probability, increasing to 45%-99% when cosmic-variance-limited polarization is considered. We stress that these results are quite model dependent. Cosmic variance in temperature is important: combining statistically isotropic polarization with temperature data will spuriously increase the significance of the temperature signal with 30% probability for Planck.
A large silent earthquake and the future rupture of the Guerrero seismic gap
NASA Astrophysics Data System (ADS)
Kostoglodov, V.; Lowry, A.; Singh, S.; Larson, K.; Santiago, J.; Franco, S.; Bilham, R.
2003-04-01
The largest global earthquakes typically occur at subduction zones, at the seismogenic boundary between two colliding tectonic plates. These earthquakes release elastic strains accumulated over many decades of plate motion. Forecasts of these events have large errors resulting from poor knowledge of the seismic cycle. The discovery of slow slip events or "silent earthquakes" in Japan, Alaska, Cascadia and Mexico provides a new glimmer of hope. In these subduction zones, the seismogenic part of the plate interface is loading not steadily as hitherto believed, but incrementally, partitioning the stress buildup with the slow slip events. If slow aseismic slip is limited to the region downdip of the future rupture zone, slip events may increase the stress at the base of the seismogenic region, incrementing it closer to failure. However, if some aseismic slip occurs on the future rupture zone, the partitioning may significantly reduce the stress buildup rate (SBR) and delay a future large earthquake. Here we report characteristics of the largest slow earthquake observed to date (Mw 7.5), and its implications for future failure of the Guerrero seismic gap, Mexico. The silent earthquake began in October 2001 and lasted for 6-7 months. Slow slip produced measurable displacements over an area of 550x250 km2. Average slip on the interface was about 10 cm and the equivalent magnitude, Mw, was 7.5. A shallow subhorizontal configuration of the plate interface in Guerrero is a controlling factor for the physical conditions favorable for such extensive slow slip. The total coupled zone in Guerrero is 120-170 km wide while the seismogenic, shallowest portion is only 50 km. This future rupture zone may slip contemporaneously with the deeper aseismic slip, thereby reducing SBR. The slip partitioning between seismogenic and transition coupled zones may diminish SBR by up to 50%. These two factors are probably responsible for a long quiet period (at least since 1911) on the Guerrero seismic gap in Mexico. The discovery of silent earthquakes in Guerrero in 1972, 1979, 1998, and 2001-2002 calls for a reassessment of the seismic potential and careful seismotectonic monitoring of the seismic gaps in Mexico.
Hawaii natural compounds are promising to reduce ovarian cancer deaths.
Fei-Zhang, David J; Li, Chunshun; Cao, Shugeng
2016-07-02
The low survival rate of patients with ovarian cancer largely results from advanced ovarian tumors as well as tumor resistance to chemotherapy, leading to metastasis and recurrence. However, an effective therapeutic approach that focuses on these aspects to prolong progression-free survival and decrease mortality in ovarian cancer patients is still missing. Here, based on our cancer drug discovery studies, we provide prospective insights into the development of a future line of drugs to effectively reduce ovarian cancer deaths. Pathways that increase the probability of cancer, such as the defective Fanconi anemia (FA) pathway, may render cancer cells more sensitive to new drug targeting.
NASA Astrophysics Data System (ADS)
Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine
2016-04-01
Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists of assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land-use and climate change. Conversely, the major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) the spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event. The dispersion of the distribution gives the uncertainty of the result. Finally, a map is created, displaying a probability of occurrence for each computing cell of the studied area. To take land-use change into account, a complementary module integrating vegetation effects on soil properties has recently been developed. In recent years, the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesia; (ii) at local scale (i.e. 1:10,000) for two complex mountainous areas; and (iii) at site-specific scale (1:2,000) for one landslide. For each study the 3D geotechnical model has been adapted. The different studies have made it possible (i) to discuss the different factors included in the model, especially the initial 3D geotechnical models; (ii) to pinpoint the locations of probable failure under different hydrological scenarios; and (iii) to test the effects of climate change and land use on slopes for two cases. In this way, future changes in temperature, precipitation and vegetation cover can be analyzed, making it possible to address the impacts of global change on landslides. Finally, the results show that it is possible to obtain reliable information about future slope failures at different scales of work for different scenarios with an integrated approach.
The final information about landslide susceptibility (i.e. probability of failure) can be integrated into landslide hazard assessment and could be an essential information source for future land planning. As was done in the ANR project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes a first step in the risk-assessment chain for different climate and economic development scenarios, used to evaluate the resilience of mountainous areas.
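The Monte Carlo step described above (the probability of failure is the fraction of parameter draws giving a safety factor below 1) can be sketched as follows. For brevity this illustration uses a simple infinite-slope factor-of-safety formula and illustrative parameter distributions, not the Morgenstern-Price method, the GARDENIA hydrology, or the 3D geotechnical models actually implemented in ALICE.

```python
import numpy as np

def failure_probability_infinite_slope(slope_deg, depth_m, wetness,
                                       n_draws=100_000, seed=1):
    """Monte Carlo probability that the factor of safety falls below 1 for an
    infinite-slope model. Parameter distributions are illustrative only.

    wetness is the saturated fraction of the soil column (0 to 1).
    """
    rng = np.random.default_rng(seed)
    beta = np.radians(slope_deg)
    gamma, gamma_w = 19.0, 9.81                                     # kN/m^3
    cohesion = rng.normal(5.0, 2.0, n_draws).clip(min=0.1)          # kPa
    phi = np.radians(rng.normal(30.0, 4.0, n_draws).clip(10, 45))   # friction angle

    normal_stress = gamma * depth_m * np.cos(beta) ** 2
    pore_pressure = gamma_w * wetness * depth_m * np.cos(beta) ** 2
    shear_stress = gamma * depth_m * np.sin(beta) * np.cos(beta)
    fs = (cohesion + (normal_stress - pore_pressure) * np.tan(phi)) / shear_stress
    return float((fs < 1.0).mean())

if __name__ == "__main__":
    # Steep, nearly saturated slope: high probability of failure expected.
    print(failure_probability_infinite_slope(slope_deg=35.0, depth_m=2.0, wetness=0.8))
```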
Submarine landslides of the Southern California Borderland
Lee, H.J.; Greene, H. Gary; Edwards, B.D.; Fisher, M.A.; Normark, W.R.
2009-01-01
Conventional bathymetry, sidescan-sonar and seismic-reflection data, and recent multibeam surveys of large parts of the Southern California Borderland disclose the presence of numerous submarine landslides. Most of these features are fairly small, with lateral dimensions less than ~2 km. In areas where multibeam surveys are available, only two large landslide complexes were identified on the mainland slope--the Goleta slide in Santa Barbara Channel and the Palos Verdes debris avalanche on the San Pedro Escarpment south of Palos Verdes Peninsula. Both of these complexes indicate repeated recurrences of catastrophic slope failure. Recurrence intervals are not well constrained but appear to be in the range of 7500 years for the Goleta slide. The most recent major activity of the Palos Verdes debris avalanche occurred roughly 7500 years ago. A small failure deposit in Santa Barbara Channel, the Gaviota mudflow, was perhaps caused by an 1812 earthquake. Most landslides in this region are probably triggered by earthquakes, although the larger failures were likely conditioned by other factors, such as oversteepening, development of shelf-edge deltas, and high fluid pressures. If a subsequent future landslide were to occur in the area of these large landslide complexes, a tsunami would probably result. Runup distances of 10 m over a 30-km-long stretch of the Santa Barbara coastline are predicted for a recurrence of the Goleta slide, and a runup of 3 m over a comparable stretch of the Los Angeles coastline is modeled for the Palos Verdes debris avalanche. © 2009 The Geological Society of America.
Large area robust identification of snow cover from multitemporal COSMO-SkyMed images
NASA Astrophysics Data System (ADS)
Pettinato, S.; Santi, E.; Paloscia, S.; Aiazzi, B.; Baronti, S.; Palchetti, E.; Garzelli, A.
2015-10-01
This paper investigates the ability of the Information Theoretic Snow Detection Algorithm (ITSDA) to detect changes due to snow cover between summer and winter seasons in large-area images acquired by the COSMO-SkyMed constellation. ITSDA is a method for change detection in multitemporal SAR images, which has been recently applied by the authors to a subset of COSMO-SkyMed data. The proposed technique is based on a nonparametric approach in the framework of Shannon's information theory, and in particular it features the conditional probability of the local means between the two images taken at different times. Such an unsupervised approach does not require any preliminary despeckling procedure to be performed before the calculation of the change map. In the case of a low quantity of anomalous changes in relatively small-size images, a mean shift procedure can be utilized for refining the map. However, in the present investigation, the changes to be identified are pervasive in large-size images. Consequently, for computational reasons, the mean shift refinement has been omitted in the present work. A simplified, time-saving implementation of the mean shift procedure may be considered in future work. In any case, the present version of the ITSDA method preserves its flexibility and sensitivity to backscattering changes, thanks to the possibility of setting the number of quantization levels used in the estimation of the conditional probability between the amplitude values at the two acquisition dates.
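The core quantity named above (the conditional probability of local means between the two acquisitions, built with a chosen number of quantization levels) can be illustrated generically. The sketch below is not the published ITSDA implementation; the window size, quantization scheme, and synthetic test data are all assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def conditional_probability_map(img_summer, img_winter, window=5, levels=32):
    """Per-pixel estimate of P(winter local mean | summer local mean),
    built from a joint histogram of quantized local means.

    Pixels with low conditional probability are those whose winter
    backscatter is unusual given their summer backscatter (e.g. snow cover).
    Generic illustration only, not the published ITSDA.
    """
    def quantize(img):
        m = uniform_filter(img.astype(float), size=window)   # local mean
        scaled = (m - m.min()) / (m.max() - m.min() + 1e-12)
        return np.minimum((scaled * levels).astype(int), levels - 1)

    q1, q2 = quantize(img_summer), quantize(img_winter)
    joint = np.zeros((levels, levels))
    np.add.at(joint, (q1.ravel(), q2.ravel()), 1.0)
    cond = joint / joint.sum(axis=1, keepdims=True).clip(min=1)  # P(q2 | q1)
    return cond[q1, q2]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    summer = rng.gamma(4.0, 1.0, (256, 256))            # synthetic amplitudes
    winter = summer * rng.gamma(4.0, 0.25, (256, 256)) * 4.0
    winter[64:128, 64:128] *= 0.2                       # simulated snow-covered block
    print(conditional_probability_map(summer, winter).mean())
```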
Handbook for Conducting Future Studies in Education.
ERIC Educational Resources Information Center
Phi Delta Kappa, Bloomington, IN.
This handbook is designed to aid school administrators, policy-makers, and teachers in bringing a "futures orientation" to their schools. The first part of the book describes a "futuring process" developed as a tool for examining alternative future probabilities. It consists of a series of diverging and converging techniques that alternately…
WKB theory of large deviations in stochastic populations
NASA Astrophysics Data System (ADS)
Assaf, Michael; Meerson, Baruch
2017-06-01
Stochasticity can play an important role in the dynamics of biologically relevant populations. These span a broad range of scales: from intra-cellular populations of molecules to populations of cells and then to groups of plants, animals and people. Large deviations in stochastic population dynamics—such as those determining population extinction, fixation or switching between different states—are presently a focus of attention for statistical physicists. We review recent progress in applying different variants of the dissipative WKB approximation (after Wentzel, Kramers and Brillouin) to this class of problems. The WKB approximation allows one to evaluate the mean time and/or probability of population extinction, fixation and switches resulting from either intrinsic (demographic) noise, or a combination of the demographic noise and environmental variations, deterministic or random. We mostly cover well-mixed populations, single and multiple, but also briefly consider populations on heterogeneous networks and spatial populations. The spatial setting also allows one to study large fluctuations of the speed of biological invasions. Finally, we briefly discuss possible directions of future work.
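For orientation, the standard eikonal construction behind this class of methods can be summarized as follows; this is the textbook form stated under the usual assumptions (a Markov population process with a large typical size N), not text quoted from the review.

```latex
% Master equation for P_n(t), reactions n -> n + r occurring at rates W_r(n):
\[
  \frac{dP_n}{dt} \;=\; \sum_r \Big[ W_r(n-r)\,P_{n-r}(t) \;-\; W_r(n)\,P_n(t) \Big].
\]
% WKB (eikonal) ansatz for large N, with x = n/N and rescaled rates w_r(x) = W_r(Nx)/N:
\[
  P_n(t) \;\simeq\; e^{-N S(x,t)}
  \quad\Longrightarrow\quad
  \partial_t S + H\!\big(x,\partial_x S\big) = 0,
  \qquad
  H(x,p) \;=\; \sum_r w_r(x)\,\big(e^{p r}-1\big).
\]
% The mean extinction or switching time is then exponentially large in N,
% \( \tau \sim e^{N \Delta S} \), where \( \Delta S \) is the action accumulated
% along the optimal (instanton) path connecting the metastable state to the
% absorbing or saddle state.
```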
NASA Astrophysics Data System (ADS)
Gilligan, J. M.; Nay, J. J.; van der Linden, M.
2016-12-01
Despite overwhelming scientific evidence and an almost complete consensus among scientists, a large fraction of the American public is not convinced that global warming is anthropogenic. This doubt correlates strongly with political, ideological, and cultural orientation. [1] It has been proposed that people who do not trust climate scientists tend to trust markets, so prediction markets might be able to influence their beliefs about the causes of climate change. [2] We present results from an agent-based simulation of a prediction market in which traders invest based on their beliefs about what drives global temperature change (here, either CO2 concentration or total solar irradiance (TSI), which is a popular hypothesis among many who doubt the dominant role of CO2). At each time step, traders use historical and observed temperatures and projected future forcings (CO2 or TSI) to update Bayesian posterior probability distributions for future temperatures, conditional on their belief about what drives climate change. Traders then bet on future temperatures by trading in climate futures. Trading proceeds by a continuous double auction. Traders are randomly assigned initial beliefs about climate change, and they have some probability of changing their beliefs to match those of the most successful traders in their social network. We simulate two alternate realities in which the global temperature is controlled either by CO2 or by TSI, with stochastic noise. In both cases traders' beliefs converge, with a large majority reaching agreement on the actual cause of climate change. This convergence is robust, but the speed with which consensus emerges depends on characteristics of the traders' psychology and the structure of the market. Our model can serve as a test-bed for studying how beliefs might evolve under different market structures and different modes of decision-making and belief-change. We will report progress on studying alternate models of belief-change. This work was partially supported by National Science Foundation grants EAR-1416964, EAR-1204685, and IIS-1526860. The model code is available at https://github.com/JohnNay/predMarket [1] A Leiserowitz, E Maibach, & C Roser-Renouf, Global Warming's Six Americas (Yale U., 2009). [2] MP Vandenbergh, KT Raimi, & JM Gilligan. UCLA Law Rev. 61, 1962 (2014).
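A hedged sketch of the belief-update step described above is given below. It is illustrative only (the authors' full model is at https://github.com/JohnNay/predMarket): each trader holds a hypothesis about the temperature driver and updates a Bayesian posterior over the two hypotheses from observed temperatures under a Gaussian error model; the sensitivity, noise level, and forcing series are invented for illustration.

    # Illustrative belief-update step; parameters and forcings are assumptions.
    import numpy as np

    def log_likelihood(temps, forcing, sensitivity, noise_sd):
        # Gaussian observation model: T_t = sensitivity * forcing_t + noise.
        resid = temps - sensitivity * forcing
        return -0.5 * np.sum((resid / noise_sd) ** 2)

    def posterior_co2(temps, co2_forcing, tsi_forcing, prior_co2=0.5, noise_sd=0.1):
        # Posterior probability that CO2 (rather than TSI) drives temperature.
        ll_co2 = log_likelihood(temps, co2_forcing, sensitivity=0.8, noise_sd=noise_sd)
        ll_tsi = log_likelihood(temps, tsi_forcing, sensitivity=0.8, noise_sd=noise_sd)
        log_num = np.log(prior_co2) + ll_co2
        log_den = np.logaddexp(log_num, np.log(1.0 - prior_co2) + ll_tsi)
        return np.exp(log_num - log_den)

    # Example: temperatures that track the CO2 forcing push the posterior toward CO2.
    years = np.arange(10)
    co2_f, tsi_f = 0.05 * years, 0.01 * np.sin(years)
    obs = 0.8 * co2_f + np.random.default_rng(0).normal(0.0, 0.1, 10)
    print(posterior_co2(obs, co2_f, tsi_f))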
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Changzheng; Oak Ridge National Lab.; Lin, Zhenhong
Plug-in electric vehicles (PEVs) are widely regarded as an important component of the technology portfolio designed to accomplish policy goals in sustainability and energy security. However, the future market acceptance of PEVs remains largely uncertain from today's perspective. By integrating a consumer choice model based on nested multinomial logit with Monte Carlo simulation, this study analyzes the uncertainty of PEV market penetration. Results suggest that the future market for PEVs is highly uncertain and that there is a substantial risk of low penetration in the early and midterm market. The top factors contributing to market-share variability are price sensitivities, energy cost, range limitation, and charging availability. The results also illustrate the potential effect of public policies in promoting PEVs through investment in battery technology and infrastructure deployment. Continued improvement of battery technologies and deployment of charging infrastructure alone do not necessarily reduce the spread of the market-share distributions, but they may shift the distributions toward the right, i.e., increase the probability of great market success.
Liu, Changzheng; Oak Ridge National Lab.; Lin, Zhenhong; ...
2016-12-08
Plug-in electric vehicles (PEVs) are widely regarded as an important component of the technology portfolio designed to accomplish policy goals in sustainability and energy security. However, the future market acceptance of PEVs remains largely uncertain from today's perspective. By integrating a consumer choice model based on nested multinomial logit with Monte Carlo simulation, this study analyzes the uncertainty of PEV market penetration. Results suggest that the future market for PEVs is highly uncertain and that there is a substantial risk of low penetration in the early and midterm market. The top factors contributing to market-share variability are price sensitivities, energy cost, range limitation, and charging availability. The results also illustrate the potential effect of public policies in promoting PEVs through investment in battery technology and infrastructure deployment. Continued improvement of battery technologies and deployment of charging infrastructure alone do not necessarily reduce the spread of the market-share distributions, but they may shift the distributions toward the right, i.e., increase the probability of great market success.
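A minimal sketch of the Monte Carlo plus logit approach described in the two records above, assuming a plain (rather than nested) multinomial logit and invented parameter ranges; it is not the ORNL model itself. Uncertain inputs are sampled, vehicle utilities are converted to choice probabilities, and the resulting distribution of PEV market share is summarized.

    # Illustrative Monte Carlo over a plain multinomial logit; all values are invented.
    import numpy as np

    rng = np.random.default_rng(42)
    n_draws = 10_000
    shares = np.empty(n_draws)

    for i in range(n_draws):
        price_sensitivity = rng.uniform(0.5, 2.0)   # disutility per $10k of purchase price
        fuel_cost_saving = rng.uniform(0.0, 3.0)    # PEV operating-cost advantage
        range_penalty = rng.uniform(0.0, 4.0)       # disutility of limited range
        charging_bonus = rng.uniform(0.0, 2.0)      # value of charging availability

        u_pev = -price_sensitivity * 3.5 + fuel_cost_saving - range_penalty + charging_bonus
        u_gas = -price_sensitivity * 2.8
        # Multinomial logit choice probability for the PEV alternative.
        shares[i] = np.exp(u_pev) / (np.exp(u_pev) + np.exp(u_gas))

    print("median PEV share:", np.median(shares))
    print("90% interval:", np.percentile(shares, [5, 95]))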
Bioinformatic analyses to select phenotype affecting polymorphisms in HTR2C gene.
Piva, Francesco; Giulietti, Matteo; Baldelli, Luisa; Nardi, Bernardo; Bellantuono, Cesario; Armeni, Tatiana; Saccucci, Franca; Principato, Giovanni
2011-08-01
Single nucleotide polymorphisms (SNPs) in serotonin-related genes influence mental disorders and responses to pharmacological and psychotherapeutic treatments. In planning association studies, researchers who want to investigate new SNPs must select a few from among a large number of candidates. Our aim is to guide researchers in selecting the polymorphisms most likely to affect phenotype. Here, we studied serotonin receptor 2C (HTR2C) SNPs because, to date, only relatively few of about 2000 have been investigated. We used the most up-to-date and well-assessed bioinformatic tools to predict which variations can give rise to biological effects among 2450 HTR2C SNPs. We suggest 48 SNPs that are worth considering in future association studies in the fields of psychiatry, psychology and pharmacogenomics. Moreover, our analyses point out the biological level probably affected, such as transcription, splicing, miRNA regulation and protein structure, thus allowing us to suggest future molecular investigations. Although few association studies are available in the literature, their results are in agreement with our predictions, showing that our selection methods can help to guide future association studies. Copyright © 2011 John Wiley & Sons, Ltd.
A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size.
Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E
2015-01-01
One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics.
A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size
Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E.
2015-01-01
One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics. PMID:26381745
SignalPlant: an open signal processing software platform.
Plesinger, F; Jurco, J; Halamek, J; Jurak, P
2016-07-01
The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with that of EEGLAB and proved significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.
Model-informed risk assessment for Zika virus outbreaks in the Asia-Pacific regions.
Teng, Yue; Bi, Dehua; Xie, Guigang; Jin, Yuan; Huang, Yong; Lin, Baihan; An, Xiaoping; Tong, Yigang; Feng, Dan
2017-05-01
Recently, Zika virus (ZIKV) has been recognized as a significant threat to global public health. The disease was present in large parts of the Americas and the Caribbean, as well as the western Pacific area and southern Asia, during 2015 and 2016. However, little is known about the factors affecting the transmission of ZIKV. We used Gradient Boosted Regression Tree models to investigate the effects of various potential explanatory variables on the spread of ZIKV, and used current and historical information from a range of sources to assess the risks of future ZIKV outbreaks. Our results indicated that the probability of ZIKV outbreaks increases with vapor pressure, the occurrence of Dengue virus, and population density, but decreases with increasing health expenditure, GDP, and numbers of travelers. The predictive results revealed the countries at potential risk of ZIKV infection in the Asia-Pacific regions between October 2016 and January 2017. We believe that the high-risk conditions would continue in South Asia and Australia over this period. By integrating information on eco-environmental, social-economic, and ZIKV-related niche factors, this study estimated the probability of locally acquired mosquito-borne ZIKV infections in the Asia-Pacific region and improves the ability to forecast, and possibly even prevent, future outbreaks of ZIKV. Copyright © 2017 The British Infection Association. Published by Elsevier Ltd. All rights reserved.
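A hedged sketch of the modeling approach named above (gradient-boosted trees predicting outbreak probability) follows; it is not the authors' pipeline, and the feature set, synthetic data, and hyperparameters are our assumptions chosen only to mirror the reported directions of effect.

    # Illustrative use of gradient-boosted trees for outbreak probability; data are synthetic.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.uniform(10, 35, n),      # vapor pressure (hPa)
        rng.integers(0, 2, n),       # Dengue virus occurrence (0/1)
        rng.uniform(10, 5000, n),    # population density
        rng.uniform(1, 15, n),       # health expenditure (% of GDP)
    ])
    # Synthetic labels loosely following the reported directions of effect.
    logit = 0.1 * X[:, 0] + 1.0 * X[:, 1] + 0.0005 * X[:, 2] - 0.3 * X[:, 3] - 1.0
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.05)
    model.fit(X, y)
    print("predicted outbreak probability:", model.predict_proba(X[:1])[0, 1])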
NASA Astrophysics Data System (ADS)
Arca, B.; Salis, M.; Bacciu, V.; Duce, P.; Pellizzaro, G.; Ventura, A.; Spano, D.
2009-04-01
Although in many countries lightning is the main cause of ignition, in the Mediterranean Basin forest fires are predominantly ignited by arson or by human negligence. The fire-season peaks coincide with extreme weather conditions (mainly strong winds, hot temperatures, and low atmospheric water vapour content) and high tourist presence. Many studies have reported that in the Mediterranean Basin the projected impacts of climate change will cause greater weather variability and more extreme weather conditions, with drier and hotter summers and heat waves. At the long-term scale, climate change could affect the fuel load and the dead/live fuel ratio, and therefore change the flammability of vegetation. At the short-term scale, an increase in extreme weather events could directly affect fuel water status and could increase the occurrence of large fires. In this context, detecting the areas characterized by both a high probability of large-fire occurrence and high fire severity could represent an important component of fire management planning. In this work we compared several fire probability and severity maps (fire occurrence, rate of spread, fireline intensity, flame length) obtained for a study area located in North Sardinia, Italy, using the FlamMap simulator (USDA Forest Service, Missoula). FlamMap computes the potential fire behaviour characteristics over a defined landscape for given weather, wind, and fuel moisture data. Different weather and fuel moisture scenarios were tested to predict the potential impact of climate change on fire parameters. The study area, characterized by a mosaic of urban areas, protected areas, and other areas subject to anthropogenic disturbances, is mainly composed of fire-prone Mediterranean maquis. The input themes needed to run FlamMap were provided as 10-m grids; the wind data, obtained using a computational fluid-dynamics model, were provided as a gridded file with a resolution of 50 m. The analysis revealed high fire probability and severity in most of the area, and therefore a high potential danger. The FlamMap outputs and the derived fire probability maps can be used in decision support systems for fire spread and behaviour and for fire danger assessment under actual and future fire regimes.
McGowan, Conor P.; Allan, Nathan; Servoss, Jeff; Hedwall, Shaula J.; Wooldridge, Brian
2017-01-01
Assessment of a species' status is a key part of management decision making for endangered and threatened species under the U.S. Endangered Species Act. Predicting the future state of the species is an essential part of species status assessment, and projection models can play an important role in developing predictions. We built a stochastic simulation model that incorporated parametric and environmental uncertainty to predict the probable future status of the Sonoran desert tortoise in the southwestern United States and North Central Mexico. The Sonoran desert tortoise was a candidate species for listing under the Endangered Species Act, and decision makers wanted to use model predictions in their decision-making process. The model accounted for future habitat loss and the possible effects of climate-change-induced droughts to predict future population growth rates, abundances, and quasi-extinction probabilities. Our model predicts that the population will likely decline over the next few decades, but there is a very low probability of quasi-extinction less than 75 years into the future. Increases in drought frequency and intensity may increase extinction risk for the species. Our model helped decision makers predict and characterize uncertainty about the future status of the species in their listing decision. We incorporated complex ecological processes (e.g., climate change effects on tortoises) in transparent and explicit ways tailored to support decision-making processes related to endangered species.
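The sketch below illustrates, in simplified form, how a stochastic projection of the kind described above can yield a quasi-extinction probability; it is not the authors' tortoise model, and all rates, the drought effect, the threshold, and the initial abundance are invented for illustration.

    # Illustrative stochastic projection with a quasi-extinction probability; values are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    n_reps, n_years = 1000, 75
    quasi_ext_threshold = 100
    drought_prob, drought_penalty = 0.2, 0.05   # assumed drought frequency and growth penalty

    quasi_extinct = 0
    for _ in range(n_reps):
        n = 2000.0                              # assumed initial abundance
        for _ in range(n_years):
            lam = rng.normal(0.99, 0.03)        # environmental stochasticity in growth rate
            if rng.random() < drought_prob:     # climate-driven drought year
                lam -= drought_penalty
            n *= max(lam, 0.0)
            if n < quasi_ext_threshold:
                quasi_extinct += 1
                break

    print("75-year quasi-extinction probability:", quasi_extinct / n_reps)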
Convergence of Transition Probability Matrix in CLV Markov Models
NASA Astrophysics Data System (ADS)
Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.
2018-04-01
A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its long-run behavior, which is derived from a property of the n-step transition probability matrix: the convergence of the n-step transition matrix as n goes to infinity. Mathematically, establishing this convergence means finding the limit of the transition matrix raised to the power n as n approaches infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of a transition probability matrix is the limiting-distribution approach. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept from linear algebra, namely diagonalization of the matrix. This method has a higher level of complexity because the matrix must be diagonalized, but it has the advantage of yielding a general form for the n-th power of the transition probability matrix, which is useful for examining the transition matrix before it becomes stationary. Example cases are taken from a CLV model based on an MCM, called the CLV-Markov model. Several of its transition probability matrices are examined to find their convergent forms. The result is that the convergence of the transition probability matrix obtained by diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
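A generic linear-algebra illustration of the diagonalization idea follows (an example 3-state matrix of our own, not one of the paper's CLV matrices): writing P^n = V D^n V^(-1) gives a closed form for the n-step transition matrix, whose rows converge to the stationary distribution as n grows.

    # Generic example: closed form of P^n from diagonalization and its stationary limit.
    import numpy as np

    P = np.array([[0.7, 0.2, 0.1],    # example 3-state transition probability matrix
                  [0.3, 0.5, 0.2],
                  [0.2, 0.3, 0.5]])

    eigvals, V = np.linalg.eig(P)
    V_inv = np.linalg.inv(V)

    def n_step(n):
        # General form of P^n obtained from the diagonalization.
        return (V @ np.diag(eigvals ** n) @ V_inv).real

    print(np.round(n_step(2), 4))     # transition matrix before stationarity
    print(np.round(n_step(200), 4))   # rows approach the stationary distribution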
A 30-year history of earthquake crisis communication in California and lessons for the future
NASA Astrophysics Data System (ADS)
Jones, L.
2015-12-01
The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the 30 years since, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time required to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but a statement was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording, and the public response, and compare these with social science research about successful crisis communication, to create recommendations for future advisories.
Knotting probability of self-avoiding polygons under a topological constraint.
Uehara, Erica; Deguchi, Tetsuo
2017-09-07
We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r_ex. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r_ex. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r_ex corresponds to the screening length.
Knotting probability of self-avoiding polygons under a topological constraint
NASA Astrophysics Data System (ADS)
Uehara, Erica; Deguchi, Tetsuo
2017-09-01
We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r_ex. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r_ex. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r_ex corresponds to the screening length.
Reconstructing the deadly eruptive events of 1790 CE at Kīlauea Volcano, Hawai‘i
Swanson, Don; Weaver, Samantha J; Houghton, Bruce F.
2014-01-01
A large number of people died during an explosive eruption of Kīlauea Volcano in 1790 CE. Detailed study of the upper part of the Keanakāko‘i Tephra has identified the deposits that may have been responsible for the deaths. Three successive units record shifts in eruption style that agree well with accounts of the eruption based on survivor interviews 46 yr later. First, a wet fall of very fine, accretionary-lapilli–bearing ash created a “cloud of darkness.” People walked across the soft deposit, leaving footprints as evidence. While the ash was still unconsolidated, lithic lapilli fell into it from a high eruption column that was seen from 90 km away. Either just after this tephra fall or during its latest stage, pulsing dilute pyroclastic density currents, probably products of a phreatic eruption, swept across the western flank of Kīlauea, embedding lapilli in the muddy ash and crossing the trail along which the footprints occur. The pyroclastic density currents were most likely responsible for the fatalities, as judged from the reported condition and probable location of the bodies. This reconstruction is relevant today, as similar eruptions will probably occur in the future at Kīlauea and represent its most dangerous and least predictable hazard.
A causal loop analysis of the sustainability of integrated community case management in Rwanda.
Sarriot, Eric; Morrow, Melanie; Langston, Anne; Weiss, Jennifer; Landegger, Justine; Tsuma, Laban
2015-04-01
Expansion of community health services in Rwanda has come with the national scale-up of integrated Community Case Management (iCCM) of malaria, pneumonia and diarrhea. We used a sustainability assessment framework as part of a large-scale project evaluation to identify factors affecting iCCM sustainability (2011). We then (2012) used causal-loop analysis to identify systems determinants of iCCM sustainability from a national systems perspective. This allowed us to develop three high-probability future scenarios that put the achievements of community health at risk, and to recommend mitigating strategies. Our causal loop diagram highlights both balancing and reinforcing loops of cause and effect in the national iCCM system. Financial, political and technical scenarios carry a high probability of threatening sustainability through: (1) reduction in performance-based financing resources, (2) political shocks and erosion of political commitment for community health, and (3) insufficient progress in resolving performance gaps in the district health systems' "building blocks". In a complex health system, the consequences of choices may be delayed and hard to predict precisely. Causal loop analysis and scenario mapping make explicit complex cause-and-effect relationships and high-probability risks, which need to be anticipated and mitigated. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Keyser, Alisa; Westerling, Anthony LeRoy
2017-05-01
A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.
E-Cigarettes and the Drug Use Patterns of Adolescents.
Miech, Richard A; O'Malley, Patrick M; Johnston, Lloyd D; Patrick, Megan E
2016-05-01
This study examines the role of e-cigarettes in the drug use patterns of adolescents. Of specific interest is whether adolescent e-cigarette users fall into a group of (1) youth who do not use traditional drugs of abuse or (2) polysubstance users. Using latent class analysis, we identify major "classes" of substance users on the basis of recent use of e-cigarettes, alcohol, marijuana, cigarettes, and prescription drugs. Analyses are conducted separately for adolescents in 8th, 10th, and 12th grades. Data come from 16,615 participants in the 2014 Monitoring the Future survey. Youth who do not use traditional drugs of abuse account for about 50% of e-cigarette users in 8th grade, 35% in 10th grade, and 17% in 12th grade. These youth come from a large "low-level users" group found in each grade, characterized by a low probability of use for all substances (e-cigarette probability in this group for 8th graders = .046; 10th graders = .071; 12th graders = .027). Other e-cigarette users come from a smaller "poly-users" group found in each grade, characterized by high-to-moderate probabilities (.83-.21) of using e-cigarettes and other substances. Specific to 12th grade is a third, additional polysubstance group characterized by a high likelihood of e-cigarette use (.93). The proportion of e-cigarette users who do not use traditional drugs of abuse is larger at younger ages. Longitudinal panel studies starting at 8th and 10th grades may best inform the current debate on whether e-cigarette use is a risk or protective factor for future transition to the use of other substances. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Martin, Julien; Runge, Michael C.; Flewelling, Leanne J.; Deutsch, Charles J.; Landsberg, Jan H.
2017-11-20
Red tides (blooms of the harmful alga Karenia brevis) are one of the major sources of mortality for the Florida manatee (Trichechus manatus latirostris), especially in southwest Florida. It has been hypothesized that the frequency and severity of red tides may increase in the future because of global climate change and other factors. To improve our ecological forecast for the effects of red tides on manatee population dynamics and long-term persistence, we conducted a formal expert judgment process to estimate probability distributions for the frequency and relative magnitude of red-tide-related manatee mortality (RTMM) events over a 100-year time horizon in three of the four regions recognized as manatee management units in Florida. This information was used to update a population viability analysis for the Florida manatee (the Core Biological Model). We convened a panel of 12 experts in manatee biology or red-tide ecology; the panel met to frame, conduct, and discuss the elicitation. Each expert provided a best estimate and plausible low and high values (bounding a confidence level of 80 percent) for each parameter in each of three regions (Northwest, Southwest, and Atlantic) of the subspecies’ range (excluding the Upper St. Johns River region) for two time periods (0−40 and 41−100 years from present). We fitted probability distributions for each parameter, time period, and expert by using these three elicited values. We aggregated the parameter estimates elicited from individual experts and fitted a parametric distribution to the aggregated results.Across regions, the experts expected the future frequency of RTMM events to be higher than historical levels, which is consistent with the hypothesis that global climate change (among other factors) may increase the frequency of red-tide blooms. The experts articulated considerable uncertainty, however, about the future frequency of RTMM events. The historical frequency of moderate and intense RTMM (combined) in the Southwest region was 0.35 (80-percent confidence interval [CI]: 0.21−0.52), whereas the forecast probability was 0.48 (80-percent CI: 0.30−0.64) over a 40-year projected time horizon. Moderate and intense RTMM events are expected to continue to be most frequent in the Southwest region, to increase in mean frequency in the Northwest region (historical frequency of moderate and intense RTMM events [combined] in the Northwest region was 0, whereas the forecast probability was 0.12 [80-percent CI: 0.02−0.39] over a 40-year projected time horizon) and in the Atlantic region (historical frequency of moderate and intense RTMM events [combined] in the Atlantic region was 0.05 [80-percent CI: 0.005–0.18], whereas the forecast probability was 0.11 [80-percent CI: 0.03−0.25] over a 40-year projected time horizon), and to remain absent from the Upper St. Johns River region. The impact of red-tide blooms on manatee mortality has been measured for the Southwest region but not for the Northwest and Atlantic regions, where such events have been rare. The expert panel predicted that the median magnitude of RTMM events in the Atlantic and Northwest regions will be much smaller than that in the Southwest; given the large uncertainties, however, they acknowledged the possibility that these events could be larger in their mortality impacts than in the Southwest region. By its nature, forecasting requires expert judgment because it is impossible to have empirical evidence about the future. 
The large uncertainties in parameter estimates over a 100-year timeframe are to be expected and may also indicate that the training provided to panelists successfully minimized one common pitfall of expert judgment, that of overconfidence. This study has provided useful and needed inputs to the Florida manatee population viability analysis associated with an important and recurrent source of mortality from harmful algal blooms.
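The sketch below shows one generic way to fit a distribution to three elicited values (a best estimate bracketed by an 80-percent interval), which is the kind of step described above; it is not necessarily the authors' exact fitting procedure. The example quantiles are the Southwest-region forecast values quoted in the abstract, and the choice of a Beta distribution is our assumption.

    # Generic fit of a Beta distribution to an elicited median and 80-percent interval.
    import numpy as np
    from scipy.stats import beta
    from scipy.optimize import minimize

    elicited = {0.10: 0.30, 0.50: 0.48, 0.90: 0.64}   # elicited 10th/50th/90th percentiles

    def loss(params):
        a, b = np.exp(params)                          # keep shape parameters positive
        qs = beta.ppf(list(elicited.keys()), a, b)
        return np.sum((qs - np.array(list(elicited.values()))) ** 2)

    res = minimize(loss, x0=[1.0, 1.0], method="Nelder-Mead")
    a_hat, b_hat = np.exp(res.x)
    print("fitted Beta shape parameters:", a_hat, b_hat)
    print("check quantiles:", beta.ppf([0.1, 0.5, 0.9], a_hat, b_hat))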
Schwalm, Donelle; Epps, Clinton W; Rodhouse, Thomas J; Monahan, William B; Castillo, Jessica A; Ray, Chris; Jeffress, Mackenzie R
2016-04-01
Ecological niche theory holds that species distributions are shaped by a large and complex suite of interacting factors. Species distribution models (SDMs) are increasingly used to describe species' niches and predict the effects of future environmental change, including climate change. Currently, SDMs often fail to capture the complexity of species' niches, resulting in predictions that are generally limited to climate-occupancy interactions. Here, we explore the potential impact of climate change on the American pika using a replicated place-based approach that incorporates climate, gene flow, habitat configuration, and microhabitat complexity into SDMs. Using contemporary presence-absence data from occupancy surveys, genetic data to infer connectivity between habitat patches, and 21 environmental niche variables, we built separate SDMs for pika populations inhabiting eight US National Park Service units representing the habitat and climatic breadth of the species across the western United States. We then predicted occurrence probability under current (1981-2010) and three future time periods (out to 2100). Occurrence probabilities and the relative importance of predictor variables varied widely among study areas, revealing important local-scale differences in the realized niche of the American pika. This variation resulted in diverse and - in some cases - highly divergent future potential occupancy patterns for pikas, ranging from complete extirpation in some study areas to stable occupancy patterns in others. Habitat composition and connectivity, which are rarely incorporated in SDM projections, were influential in predicting pika occupancy in all study areas and frequently outranked climate variables. Our findings illustrate the importance of a place-based approach to species distribution modeling that includes fine-scale factors when assessing current and future climate impacts on species' distributions, especially when predictions are intended to manage and conserve species of concern within individual protected areas. © 2015 John Wiley & Sons Ltd.
Contrasting effects of climate change on rabbit populations through reproduction.
Tablado, Zulima; Revilla, Eloy
2012-01-01
Climate change is affecting many physical and biological processes worldwide. Anticipating its effects at the level of populations and species is imperative, especially for organisms of conservation or management concern. Previous studies have focused on estimating future species distributions and extinction probabilities directly from current climatic conditions within their geographical ranges. However, relationships between climate and population parameters may be so complex that to make these high-level predictions we need first to understand the underlying biological processes driving population size, as well as their individual response to climatic alterations. Therefore, the objective of this study is to investigate the influence that climate change may have on species population dynamics through altering the breeding season. We used a mechanistic model based on drivers of rabbit reproductive physiology together with demographic simulations to show how future climate-driven changes in breeding season result in contrasting rabbit population trends across Europe. In the Iberian Peninsula, where rabbits are a native species of high ecological and economic value, breeding seasons will shorten and become more variable, leading to population declines, higher extinction risk, and lower resilience to perturbations. In contrast, towards north-eastern countries rabbit numbers are expected to increase through longer and more stable reproductive periods, which will augment the probability of new rabbit invasions in those areas. Our study reveals the type of mechanisms through which climate will cause alterations at the species level and emphasizes the need to focus on them in order to better foresee large-scale complex population trends. This is especially important in species like the European rabbit, whose future responses may aggravate even further its dual keystone/pest problematic. Moreover, this approach allows us to predict not only distribution shifts but also future population status and growth, and to identify the demographic parameters on which to focus to mitigate global change effects.
Contrasting Effects of Climate Change on Rabbit Populations through Reproduction
Tablado, Zulima; Revilla, Eloy
2012-01-01
Background Climate change is affecting many physical and biological processes worldwide. Anticipating its effects at the level of populations and species is imperative, especially for organisms of conservation or management concern. Previous studies have focused on estimating future species distributions and extinction probabilities directly from current climatic conditions within their geographical ranges. However, relationships between climate and population parameters may be so complex that to make these high-level predictions we need first to understand the underlying biological processes driving population size, as well as their individual response to climatic alterations. Therefore, the objective of this study is to investigate the influence that climate change may have on species population dynamics through altering the breeding season. Methodology/Principal Findings We used a mechanistic model based on drivers of rabbit reproductive physiology together with demographic simulations to show how future climate-driven changes in breeding season result in contrasting rabbit population trends across Europe. In the Iberian Peninsula, where rabbits are a native species of high ecological and economic value, breeding seasons will shorten and become more variable, leading to population declines, higher extinction risk, and lower resilience to perturbations. In contrast, towards north-eastern countries rabbit numbers are expected to increase through longer and more stable reproductive periods, which will augment the probability of new rabbit invasions in those areas. Conclusions/Significance Our study reveals the type of mechanisms through which climate will cause alterations at the species level and emphasizes the need to focus on them in order to better foresee large-scale complex population trends. This is especially important in species like the European rabbit, whose future responses may aggravate even further its dual keystone/pest problematic. Moreover, this approach allows us to predict not only distribution shifts but also future population status and growth, and to identify the demographic parameters on which to focus to mitigate global change effects. PMID:23152836
Duarte, Adam; Hatfield, Jeffrey; Swannack, Todd M.; Forstner, Michael R. J.; Green, M. Clay; Weckerly, Floyd W.
2015-01-01
Population viability analyses provide a quantitative approach that seeks to predict the possible future status of a species of interest under different scenarios and, therefore, can be important components of large-scale species’ conservation programs. We created a model and simulated range-wide population and breeding habitat dynamics for an endangered woodland warbler, the golden-cheeked warbler (Setophaga chrysoparia). Habitat-transition probabilities were estimated across the warbler's breeding range by combining National Land Cover Database imagery with multistate modeling. Using these estimates, along with recently published demographic estimates, we examined if the species can remain viable into the future given the current conditions. Lastly, we evaluated if protecting a greater amount of habitat would increase the number of warblers that can be supported in the future by systematically increasing the amount of protected habitat and comparing the estimated terminal carrying capacity at the end of 50 years of simulated habitat change. The estimated habitat-transition probabilities supported the hypothesis that habitat transitions are unidirectional, whereby habitat is more likely to diminish than regenerate. The model results indicated population viability could be achieved under current conditions, depending on dispersal. However, there is considerable uncertainty associated with the population projections due to parametric uncertainty. Model results suggested that increasing the amount of protected lands would have a substantial impact on terminal carrying capacities at the end of a 50-year simulation. Notably, this study identifies the need for collecting the data required to estimate demographic parameters in relation to changes in habitat metrics and population density in multiple regions, and highlights the importance of establishing a common definition of what constitutes protected habitat, what management goals are suitable within those protected areas, and a standard operating procedure to identify areas of priority for habitat conservation efforts. Therefore, we suggest future efforts focus on these aspects of golden-cheeked warbler conservation and ecology.
NASA Astrophysics Data System (ADS)
Taner, M. U.; Ray, P.; Brown, C.
2016-12-01
Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing strategies that are flexible or adaptive holds intuitive appeal, development of well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and most likely conditional on the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.
Van Cauwenberg, Jelle; Clarys, Peter; De Bourdeaudhuij, Ilse; Van Holle, Veerle; Verté, Dominique; De Witte, Nico; De Donder, Liesbeth; Buffel, Tine; Dury, Sarah; Deforche, Benedicte
2013-08-14
The physical environment may play a crucial role in promoting older adults' walking for transportation. However, previous studies on relationships between the physical environment and older adults' physical activity behaviors have reported inconsistent findings. A possible explanation for these inconsistencies is the focus upon studying environmental factors separately rather than simultaneously. The current study aimed to investigate the cumulative influence of perceived favorable environmental factors on older adults' walking for transportation. Additionally, the moderating effect of perceived distance to destinations on this relationship was studied. The sample was comprised of 50,685 non-institutionalized older adults residing in Flanders (Belgium). Cross-sectional data on demographics, environmental perceptions and frequency of walking for transportation were collected by self-administered questionnaires in the period 2004-2010. Perceived distance to destinations was categorized into short, medium, and large distance to destinations. An environmental index (=a sum of favorable environmental factors, ranging from 0 to 7) was constructed to investigate the cumulative influence of favorable environmental factors. Multilevel logistic regression analyses were applied to predict probabilities of daily walking for transportation. For short distance to destinations, probability of daily walking for transportation was significantly higher when seven compared to three, four or five favorable environmental factors were present. For medium distance to destinations, probabilities significantly increased for an increase from zero to four favorable environmental factors. For large distance to destinations, no relationship between the environmental index and walking for transportation was observed. Our findings suggest that the presence of multiple favorable environmental factors can motivate older adults to walk medium distances to facilities. Future research should focus upon the relationship between older adults' physical activity and multiple environmental factors simultaneously instead of separately.
2013-01-01
Background The physical environment may play a crucial role in promoting older adults’ walking for transportation. However, previous studies on relationships between the physical environment and older adults’ physical activity behaviors have reported inconsistent findings. A possible explanation for these inconsistencies is the focus upon studying environmental factors separately rather than simultaneously. The current study aimed to investigate the cumulative influence of perceived favorable environmental factors on older adults’ walking for transportation. Additionally, the moderating effect of perceived distance to destinations on this relationship was studied. Methods The sample was comprised of 50,685 non-institutionalized older adults residing in Flanders (Belgium). Cross-sectional data on demographics, environmental perceptions and frequency of walking for transportation were collected by self-administered questionnaires in the period 2004-2010. Perceived distance to destinations was categorized into short, medium, and large distance to destinations. An environmental index (=a sum of favorable environmental factors, ranging from 0 to 7) was constructed to investigate the cumulative influence of favorable environmental factors. Multilevel logistic regression analyses were applied to predict probabilities of daily walking for transportation. Results For short distance to destinations, probability of daily walking for transportation was significantly higher when seven compared to three, four or five favorable environmental factors were present. For medium distance to destinations, probabilities significantly increased for an increase from zero to four favorable environmental factors. For large distance to destinations, no relationship between the environmental index and walking for transportation was observed. Conclusions Our findings suggest that the presence of multiple favorable environmental factors can motivate older adults to walk medium distances to facilities. Future research should focus upon the relationship between older adults’ physical activity and multiple environmental factors simultaneously instead of separately. PMID:23945285
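A simplified, hedged sketch of the analysis described in the two records above follows: an ordinary (rather than multilevel) logistic regression of daily walking for transportation on an environmental index interacted with perceived distance category, fitted to synthetic data; the variable names and effect sizes are invented.

    # Simplified illustration (ordinary, not multilevel, logistic regression); data are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 2000
    df = pd.DataFrame({
        "env_index": rng.integers(0, 8, n),                       # 0-7 favorable factors
        "distance": rng.choice(["short", "medium", "large"], n),  # perceived distance category
    })
    # Synthetic outcome: the index matters for short/medium distances, not for large ones.
    effect = np.where(df["distance"] == "large", 0.0, 0.3)
    p = 1.0 / (1.0 + np.exp(-(-2.0 + effect * df["env_index"])))
    df["daily_walk"] = (rng.random(n) < p).astype(int)

    model = smf.logit("daily_walk ~ env_index * C(distance)", data=df).fit(disp=False)
    print(model.summary().tables[1])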
Elwood L. Shafer; George H. Moeller; Russell E. Getty
1974-01-01
As an aid to policy- and decision-making about future environmental problems, a panel of experts was asked to predict the probabilities of future events associated with natural-resource management, wildland-recreation management, environmental pollution, population-workforce-leisure, and urban environments. Though some of the predictions projected to the year 2050 may...
Gong, Yan-Xiao; Zhang, ShengLi; Xu, P; Zhu, S N
2016-03-21
We propose to generate a single-mode-squeezing two-mode squeezed vacuum state via a single χ(2) nonlinear photonic crystal. The state is favorable for existing Gaussian entanglement distillation schemes, since local squeezing operations can enhance the final entanglement and the success probability. The crystal is designed to enable three concurrent quasi-phase-matched parametric down-conversions, and hence removes the need for auxiliary on-line local squeezing operations on both sides. The compact source opens up a way for continuous-variable quantum technologies and could find further potential applications in future large-scale quantum networks.
Noninvasive metabolic profiling for painless diagnosis of human diseases and disorders.
Mal, Mainak
2016-06-01
Metabolic profiling provides a powerful diagnostic tool complementary to genomics and proteomics. The pain, discomfort and probable iatrogenic injury associated with invasive or minimally invasive diagnostic methods, render them unsuitable in terms of patient compliance and participation. Metabolic profiling of biomatrices like urine, breath, saliva, sweat and feces, which can be collected in a painless manner, could be used for noninvasive diagnosis. This review article covers the noninvasive metabolic profiling studies that have exhibited diagnostic potential for diseases and disorders. Their potential applications are evident in different forms of cancer, metabolic disorders, infectious diseases, neurodegenerative disorders, rheumatic diseases and pulmonary diseases. Large scale clinical validation of such diagnostic methods is necessary in future.
Noninvasive metabolic profiling for painless diagnosis of human diseases and disorders
Mal, Mainak
2016-01-01
Metabolic profiling provides a powerful diagnostic tool complementary to genomics and proteomics. The pain, discomfort and probable iatrogenic injury associated with invasive or minimally invasive diagnostic methods, render them unsuitable in terms of patient compliance and participation. Metabolic profiling of biomatrices like urine, breath, saliva, sweat and feces, which can be collected in a painless manner, could be used for noninvasive diagnosis. This review article covers the noninvasive metabolic profiling studies that have exhibited diagnostic potential for diseases and disorders. Their potential applications are evident in different forms of cancer, metabolic disorders, infectious diseases, neurodegenerative disorders, rheumatic diseases and pulmonary diseases. Large scale clinical validation of such diagnostic methods is necessary in future. PMID:28031956
Danchenko, V G
2011-05-01
The article is devoted to the reconstruction of the medical uniforms of the Russian navy in the first third of the 18th century. It can be assumed that doctors wore, to varying degrees, the senior officer's dress, though of course without the braid; there were exceptions, relating to doctors who aspired to a more senior standing. A number of documents from different institutions make it possible to state with high probability that doctors of different ranks serving in naval units had a uniform dress that was largely consistent with their position in the hierarchy of ranks and that underwent further development in the near future.
Potential distribution dataset of honeybees in Indian Ocean Islands: Case study of Zanzibar Island.
Mwalusepo, Sizah; Muli, Eliud; Nkoba, Kiatoko; Nguku, Everlyn; Kilonzo, Joseph; Abdel-Rahman, Elfatih M; Landmann, Tobias; Fakih, Asha; Raina, Suresh
2017-10-01
Honeybees (Apis mellifera) are principal insect pollinators, whose worldwide distribution and abundance are known to depend largely on climatic conditions. However, presence-record datasets on the potential distribution of honeybees in the Indian Ocean Islands remain poorly documented. This article provides presence records in shapefile format and the probability of occurrence of honeybees under different temperature-change scenarios across Zanzibar Island. The maximum entropy (Maxent) package was used to analyse the potential distribution of honeybees. The dataset provides information on the current and future distribution of honeybees on Zanzibar Island. The dataset is of great importance for improving stakeholders' understanding of the role of temperature change in the spatial distribution of honeybees.
NASA Astrophysics Data System (ADS)
Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.
2015-12-01
The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised its long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated to be of M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough using a probabilistic approach, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment proceeds as follows. (1) Characterized earthquake fault models (CEFMs) are constructed for each of the 15 hypothetical source areas (HSAs) defined by ERC (2013), following the characterization rule of Toyama et al. (2015, JpGU); in total, we obtained 1441 CEFMs. (2) We calculate the tsunamis due to the CEFMs by solving the nonlinear, finite-amplitude, long-wave equations with advection and bottom-friction terms using a finite-difference method, including run-up computation on land. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013); we fix P30 = 67% by applying a renewal process based on the BPT distribution with this T and an aperiodicity alpha = 0.24. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in that subgroup. Note that this re-distribution of the probability is necessarily tentative, because present-day seismology cannot provide knowledge deep enough to justify it; an epistemic logic-tree approach may be required in the future. (5) We synthesize tsunami hazard curves at every evaluation point along the coast by integrating the 30-year occurrence probabilities P30(i) of all earthquakes (CEFMs) with the calculated maximum coastal tsunami heights. In the synthesis, aleatory uncertainties related to the incompleteness of the governing equations, the CEFM modeling, and the bathymetry and topography data are modeled with a log-normal distribution. Examples of tsunami hazard curves will be presented.
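The sketch below illustrates the renewal-process step mentioned above: the conditional 30-year occurrence probability from a BPT distribution with the mean recurrence interval T = 88.2 years and aperiodicity alpha = 0.24 given in the abstract. The BPT distribution is the inverse Gaussian; the elapsed time used here (years since the 1946 Nankai earthquake, as of early 2013) is our assumption for illustration and is not stated in the abstract.

    # Conditional 30-year probability from a BPT (inverse Gaussian) renewal model.
    from scipy.stats import invgauss

    T, alpha = 88.2, 0.24          # mean recurrence interval and aperiodicity (from the abstract)
    elapsed, window = 66.1, 30.0   # assumed years elapsed since the last event, and forecast window

    # BPT(T, alpha) via scipy's inverse-Gaussian parameterization:
    # mean = mu * scale = T and variance = mu**3 * scale**2 = (alpha * T)**2.
    dist = invgauss(mu=alpha**2, scale=T / alpha**2)

    p30 = (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)
    print(f"conditional 30-year probability: {p30:.2f}")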
Multistate modeling of habitat dynamics: Factors affecting Florida scrub transition probabilities
Breininger, D.R.; Nichols, J.D.; Duncan, B.W.; Stolen, Eric D.; Carter, G.M.; Hunt, D.K.; Drese, J.H.
2010-01-01
Many ecosystems are influenced by disturbances that create specific successional states and habitat structures that species need to persist. Estimating transition probabilities between habitat states and modeling the factors that influence such transitions have many applications for investigating and managing disturbance-prone ecosystems. We identify the correspondence between multistate capture-recapture models and Markov models of habitat dynamics. We exploit this correspondence by fitting and comparing competing models of different ecological covariates affecting habitat transition probabilities in Florida scrub and flatwoods, a habitat important to many unique plants and animals. We subdivided a large scrub and flatwoods ecosystem along central Florida's Atlantic coast into 10-ha grid cells, which approximated average territory size of the threatened Florida Scrub-Jay (Aphelocoma coerulescens), a management indicator species. We used 1.0-m resolution aerial imagery for 1994, 1999, and 2004 to classify grid cells into four habitat quality states that were directly related to Florida Scrub-Jay source-sink dynamics and management decision making. Results showed that static site features related to fire propagation (vegetation type, edges) and temporally varying disturbances (fires, mechanical cutting) best explained transition probabilities. Results indicated that much of the scrub and flatwoods ecosystem was resistant to moving from a degraded state to a desired state without mechanical cutting, an expensive restoration tool. We used habitat models parameterized with the estimated transition probabilities to investigate the consequences of alternative management scenarios on future habitat dynamics. We recommend this multistate modeling approach as being broadly applicable for studying ecosystem, land cover, or habitat dynamics. The approach provides maximum-likelihood estimates of transition parameters, including precision measures, and can be used to assess evidence among competing ecological models that describe system dynamics. © 2010 by the Ecological Society of America.
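The Markov-model side of this correspondence can be illustrated with a small sketch that projects a habitat-state distribution forward with a transition matrix. The four states mirror the habitat-quality classes described above, but the transition probabilities and initial distribution below are purely illustrative, not the estimates reported in the study.

```python
import numpy as np

# Four hypothetical habitat-quality states (ordered from degraded to optimal).
states = ["degraded", "tall-shrub", "optimal-open", "optimal"]

# Illustrative 5-year transition matrix (rows sum to 1); NOT the paper's estimates.
P = np.array([
    [0.80, 0.15, 0.03, 0.02],
    [0.10, 0.70, 0.15, 0.05],
    [0.05, 0.10, 0.70, 0.15],
    [0.02, 0.08, 0.20, 0.70],
])

# Initial distribution of grid cells across states (also illustrative).
x = np.array([0.40, 0.30, 0.20, 0.10])

# Project forward over successive 5-year transition intervals.
for step in range(1, 5):
    x = x @ P
    print(f"after {5 * step:2d} yr:", np.round(x, 3))
```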
The Importance of Studying Past Extreme Floods to Prepare for Uncertain Future Extremes
NASA Astrophysics Data System (ADS)
Burges, S. J.
2016-12-01
Hoyt and Langbein (1955), in their book 'Floods', wrote: "...meteorologic and hydrologic conditions will combine to produce superfloods of unprecedented magnitude. We have every reason to believe that in most rivers past floods may not be an accurate measure of ultimate flood potentialities. It is this superflood with which we are always most concerned". I provide several examples to offer some historical perspective on assessing extreme floods. In one example, flooding in the Miami Valley, OH in 1913 claimed 350 lives. The engineering and socio-economic challenges that the Morgan Engineering Co. faced in deciding how to mitigate future flood damage and loss of life with limited information provide guidance about ways to face an uncertain hydroclimatic future, particularly one of a changed climate. A second example forces us to examine mixed flood populations and illustrates the huge uncertainty in assigning flood magnitude and exceedance probability to extreme floods in such cases. There is large uncertainty in flood frequency estimates; knowledge of the total flood hydrograph, not the peak flood flow rate alone, is what is needed for hazard mitigation assessment or design. Some challenges in estimating the complete flood hydrograph in an uncertain future climate, including demands on hydrologic models and their inputs, are addressed.
Schloesser, J.T.; Paukert, Craig P.; Doyle, W.J.; Hill, Tracy D.; Steffensen, K.D.; Travnichek, Vincent H.
2012-01-01
Occupancy modeling was used to determine (1) if detection probabilities (p) for 7 regionally imperiled Missouri River fishes (Scaphirhynchus albus, Scaphirhynchus platorynchus, Cycleptus elongatus, Sander canadensis, Macrhybopsis aestivalis, Macrhybopsis gelida, and Macrhybopsis meeki) differed among gear types (i.e. stationary gill nets, drifted trammel nets, and otter trawls), and (2) how detection probabilities were affected by habitat (i.e. pool, bar, and open water), longitudinal position (five segments 189 to 367 rkm long), sampling year (2003 to 2006), and season (July 1 to October 30 and October 31 to June 30). Adult, large-bodied fishes were best detected with gill nets (p: 0.02–0.74), but most juvenile large-bodied and all small-bodied species were best detected with otter trawls (p: 0.02–0.58). Trammel nets may be a redundant sampling gear for imperiled fishes in the lower Missouri River because most species had greater detection probabilities with gill nets or otter trawls. Detection probabilities varied with river segment for S. platorynchus, C. elongatus, and all small-bodied fishes, suggesting that habitat differences among river segments influenced gear efficiency or species abundance. Detection probabilities varied by habitat for adult S. albus and S. canadensis, by year for juvenile S. albus, C. elongatus, and S. canadensis, and by season for adult S. albus. Concentrating sampling effort on gears with the greatest detection probabilities may increase species detections to better monitor a population's response to environmental change and the effects of management actions on large-river fishes.
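One practical consequence of gear-specific detection probabilities is how quickly repeated sampling accumulates confidence that a species present at a site will actually be detected. A minimal sketch, using the standard relation P(detected at least once) = 1 - (1 - p)^k for k independent samples at an occupied site; the per-sample p values below are illustrative choices within the reported ranges, not species-specific estimates from the study.

```python
def cumulative_detection(p, k):
    """Probability of detecting a species at least once in k independent
    samples at an occupied site, given per-sample detection probability p."""
    return 1.0 - (1.0 - p) ** k

# Illustrative per-sample detection probabilities within the reported ranges
# (0.02-0.74 for gill nets, 0.02-0.58 for otter trawls); not study estimates.
for gear, p in [("gill net", 0.30), ("otter trawl", 0.15)]:
    for k in (1, 5, 10, 20):
        print(f"{gear:11s} p={p:.2f} k={k:2d} -> "
              f"P(detected >= once) = {cumulative_detection(p, k):.2f}")
```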
Transmission characteristics of MERS and SARS in the healthcare setting: a comparative study.
Chowell, Gerardo; Abdirizak, Fatima; Lee, Sunmi; Lee, Jonggul; Jung, Eunok; Nishiura, Hiroshi; Viboud, Cécile
2015-09-03
The Middle East respiratory syndrome (MERS) coronavirus has caused recurrent outbreaks in the Arabian Peninsula since 2012. Although MERS has low overall human-to-human transmission potential, there is occasional amplification in the healthcare setting, a pattern reminiscent of the dynamics of the severe acute respiratory syndrome (SARS) outbreaks in 2003. Here we provide a head-to-head comparison of exposure patterns and transmission dynamics of large hospital clusters of MERS and SARS, including the most recent South Korean outbreak of MERS in 2015. To assess the unexpected nature of the recent South Korean nosocomial outbreak of MERS and estimate the probability of future large hospital clusters, we compared exposure and transmission patterns for previously reported hospital clusters of MERS and SARS, based on individual-level data and transmission tree information. We carried out simulations of nosocomial outbreaks of MERS and SARS using branching process models rooted in transmission tree data, and inferred the probability and characteristics of large outbreaks. A significant fraction of MERS cases were linked to the healthcare setting, ranging from 43.5 % for the nosocomial outbreak in Jeddah, Saudi Arabia, in 2014 to 100 % for both the outbreak in Al-Hasa, Saudi Arabia, in 2013 and the outbreak in South Korea in 2015. Both MERS and SARS nosocomial outbreaks are characterized by early nosocomial super-spreading events, with the reproduction number dropping below 1 within three to five disease generations. There was a systematic difference in the exposure patterns of MERS and SARS: a majority of MERS cases occurred among patients who sought care in the same facilities as the index case, whereas there was a greater concentration of SARS cases among healthcare workers throughout the outbreak. Exposure patterns differed slightly by disease generation, however, especially for SARS. Moreover, the distributions of secondary cases per single primary case varied highly across individual hospital outbreaks (Kruskal-Wallis test; P < 0.0001), with significantly higher transmission heterogeneity in the distribution of secondary cases for MERS than SARS. Simulations indicate a 2-fold higher probability of occurrence of large outbreaks (>100 cases) for SARS than MERS (2 % versus 1 %); however, owing to higher transmission heterogeneity, the largest outbreaks of MERS are characterized by sharper incidence peaks. The probability of occurrence of MERS outbreaks larger than the South Korean cluster (n = 186) is of the order of 1 %. Our study suggests that the South Korean outbreak followed a similar progression to previously described hospital clusters involving coronaviruses, with early super-spreading events generating a disproportionately large number of secondary infections, and the transmission potential diminishing greatly in subsequent generations. Differences in relative exposure patterns and transmission heterogeneity of MERS and SARS could point to changes in hospital practices since 2003 or differences in transmission mechanisms of these coronaviruses.
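A branching-process simulation of the kind described can be sketched in a few lines: each case generates a negative-binomially distributed number of secondary cases (mean R, dispersion k, the usual way of encoding super-spreading heterogeneity), and the probability of a large outbreak is estimated from many runs. The R and k values below are illustrative placeholders, not the parameters inferred in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_outbreak(r0, k, max_cases=1000, max_generations=20):
    """Simulate one outbreak with negative-binomial offspring (mean r0, dispersion k).
    Returns the final outbreak size (capped at max_cases)."""
    total, active = 1, 1
    for _ in range(max_generations):
        if active == 0 or total >= max_cases:
            break
        # NumPy's negative binomial: n = dispersion k, p = k / (k + r0) gives mean r0.
        offspring = rng.negative_binomial(k, k / (k + r0), size=active).sum()
        total += offspring
        active = offspring
    return total

# Illustrative parameters (NOT the study's estimates): subcritical mean transmission
# with strong super-spreading heterogeneity.
r0, k, runs = 0.9, 0.2, 20000
sizes = np.array([simulate_outbreak(r0, k) for _ in range(runs)])
print("P(outbreak > 100 cases) ~", (sizes > 100).mean())
```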
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, Adam; Connaughton, Valerie; Briggs, Michael S.
We present a method to estimate the jet opening angles of long duration gamma-ray bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma-rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we potentially expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies.
Shiogama, Hideo; Imada, Yukiko; Mori, Masato; ...
2016-08-07
Here, we describe two unprecedentedly large (100-member), long-term (61-year) ensembles based on MRI-AGCM3.2, which were driven by historical and non-warming climate forcing. These ensembles comprise the "Database for Policy Decision making for Future climate change (d4PDF)". We compare these ensembles to large ensembles based on another climate model, as well as to observed data, to investigate the influence of anthropogenic activities on historical changes in the numbers of record-breaking events, including: the annual coldest daily minimum temperature (TNn), the annual warmest daily maximum temperature (TXx) and the annual most intense daily precipitation event (Rx1day). These two climate model ensembles indicate that human activity has already had statistically significant impacts on the number of record-breaking extreme events worldwide, mainly over Northern Hemisphere land. Specifically, human activities have altered the likelihood that a wider area globally would suffer record-breaking TNn, TXx and Rx1day events than that observed over the 2001-2010 period by factors of at least 0.6, 5.4 and 1.3, respectively. However, we also find that the estimated spatial patterns and amplitudes of anthropogenic impacts on the probabilities of record-breaking events are sensitive to the climate model and/or natural-world boundary conditions used in the attribution studies.
Psychophysics of the probability weighting function
NASA Astrophysics Data System (ADS)
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(0) = 0 and w(1) = 1.
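For reference, the Prelec (1998) one-parameter form is straightforward to evaluate; the short sketch below shows the characteristic overweighting of small probabilities and underweighting of large ones (the α value chosen is arbitrary, purely for illustration).

```python
from math import exp, log

def prelec_w(p, alpha=0.65):
    """Prelec (1998) probability weighting function w(p) = exp(-(-ln p)^alpha)."""
    if p <= 0.0:
        return 0.0
    if p >= 1.0:
        return 1.0
    return exp(-((-log(p)) ** alpha))

# Overweighting of small probabilities, underweighting of large ones (alpha arbitrary).
for p in (0.01, 0.1, 0.3, 0.5, 0.7, 0.9, 0.99):
    print(f"p = {p:4.2f}  w(p) = {prelec_w(p):.3f}")
```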
An approach to evaluating reactive airborne wind shear systems
NASA Technical Reports Server (NTRS)
Gibson, Joseph P., Jr.
1992-01-01
An approach to evaluating reactive airborne windshear detection systems was developed to support a deployment study for future FAA ground-based windshear detection systems. The deployment study methodology assesses potential future safety enhancements beyond planned capabilities. The reactive airborne systems will be an integral part of planned windshear safety enhancements. The approach to evaluating reactive airborne systems involves separate analyses for both landing and take-off scenarios. The analysis estimates the probability of effective warning considering several factors including NASA energy height loss characteristics, reactive alert timing, and a probability distribution for microburst strength.
McCalpin, J.P.; Nishenko, S.P.
1996-01-01
The chronology of M>7 paleoearthquakes on the central five segments of the Wasatch fault zone (WFZ) is one of the best dated in the world and contains 16 earthquakes in the past 5600 years with an average repeat time of 350 years. Repeat times for individual segments vary by a factor of 2, and range from about 1200 to 2600 years. Four of the central five segments ruptured between ≈620±30 and 1230±60 calendar years B.P. The remaining segment (Brigham City segment) has not ruptured in the past 2120±100 years. Comparison of the WFZ space-time diagram of paleoearthquakes with synthetic paleoseismic histories indicates that the observed temporal clusters and gaps have about an equal probability (depending on model assumptions) of reflecting random coincidence as opposed to intersegment contagion. Regional seismicity suggests that for exposure times of 50 and 100 years, the probability for an earthquake of M>7 anywhere within the Wasatch Front region, based on a Poisson model, is 0.16 and 0.30, respectively. A fault-specific WFZ model predicts 50 and 100 year probabilities for a M>7 earthquake on the WFZ itself, based on a Poisson model, as 0.13 and 0.25, respectively. In contrast, segment-specific earthquake probabilities that assume quasi-periodic recurrence behavior on the Weber, Provo, and Nephi segments are less (0.01-0.07 in 100 years) than the regional or fault-specific estimates (0.25-0.30 in 100 years), due to the short elapsed times compared to average recurrence intervals on those segments. The Brigham City and Salt Lake City segments, however, have time-dependent probabilities that approach or exceed the regional and fault-specific probabilities. For the Salt Lake City segment, these elevated probabilities are due to the elapsed time being approximately equal to the average late Holocene recurrence time. For the Brigham City segment, the elapsed time is significantly longer than the segment-specific late Holocene recurrence time.
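The Poisson exposure-time probabilities quoted above follow directly from the mean repeat time. A minimal sketch, using only the 350-year average WFZ repeat time stated in the abstract:

```python
from math import exp

def poisson_prob(rate_per_yr, exposure_yr):
    """Probability of at least one event in `exposure_yr` years under a Poisson model."""
    return 1.0 - exp(-rate_per_yr * exposure_yr)

mean_repeat = 350.0           # average WFZ repeat time (yr), from the abstract
rate = 1.0 / mean_repeat
for t in (50, 100):
    print(f"P(M>7 on the WFZ within {t} yr) = {poisson_prob(rate, t):.2f}")
# Reproduces roughly the 0.13 and 0.25 fault-specific values quoted above.
```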
Constructing alternative futures
David N. Wear; Robert Huggett; John G. Greis
2013-01-01
The desired product of the Southern Forest Futures Project is a mechanism that will help southerners think about and prepare for future changes in their forests and the benefits they provide. Because any single projection of the world's (or a region's) biological, physical, and social systems has a high probability of being incorrect, the Futures Project instead...
Nojavan A, Farnaz; Qian, Song S; Paerl, Hans W; Reckhow, Kenneth H; Albright, Elizabeth A
2014-06-15
The present paper utilizes a Bayesian Belief Network (BBN) approach to intuitively present and quantify our current understanding of the complex physical, chemical, and biological processes that lead to eutrophication in an estuarine ecosystem (New River Estuary, North Carolina, USA). The model is further used to explore the effects of plausible future climatic and nutrient pollution management scenarios on water quality indicators. The BBN, by visualizing the structure of the network, facilitates knowledge communication with managers/stakeholders who might not be experts in the underlying scientific disciplines. Moreover, the developed structure of the BBN is transferable to other comparable estuaries. The BBN nodes are discretized using a new approach called the moment matching method. The conditional probability tables of the variables are driven by a large (four-year) dataset. Our results show interactions among various predictors and their impacts on water quality indicators. These synergistic effects provide a caution for future management actions. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Ke; Liao, Hong; Cai, Wenju; Yang, Yang
2018-02-01
Severe haze pollution in eastern China has caused substantial health impacts and economic loss. Conducive atmospheric conditions are an important factor in the occurrence of severe haze events, and circulation changes induced by future global climate warming are projected to increase the frequency of such events. However, the potential contribution of anthropogenic influence to the most severe recent haze events (December 2015 and January 2013) over eastern China remains unclear. Here we show that the anthropogenic influence, which is estimated by using large ensemble runs with a climate model forced with and without anthropogenic forcings, has already increased the probability of the atmospheric patterns conducive to severe haze by at least 45% in January 2013 and 27% in December 2015. We further confirm that simulated atmospheric circulation pattern changes induced by anthropogenic influence are driven mainly by increased greenhouse gas emissions. Our results suggest that stricter reductions in pollutant emissions are needed under future anthropogenic warming.
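Attribution statements of this kind are usually framed as a probability (risk) ratio between a forced ("ALL") and a natural-forcing-only ("NAT") ensemble. The sketch below shows the bookkeeping with hypothetical event counts chosen only so the numbers land near the 45% figure quoted above; they are not the study's data.

```python
def probability_ratio(events_all, size_all, events_nat, size_nat):
    """Risk ratio PR = P(conducive pattern | ALL forcings) / P(pattern | natural only)."""
    p_all = events_all / size_all
    p_nat = events_nat / size_nat
    return p_all / p_nat

# Hypothetical ensemble counts purely for illustration (not the study's data):
# 29 of 100 forced members vs. 20 of 100 natural-only members show the pattern.
pr = probability_ratio(29, 100, 20, 100)
print(f"probability ratio = {pr:.2f}  ->  {100 * (pr - 1):.0f}% increase in likelihood")
```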
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ke; Liao, Hong; Cai, Wenju
Severe haze pollution in eastern China has caused substantial health impacts and economic loss. Conducive atmospheric conditions are an important factor in the occurrence of severe haze events, and circulation changes induced by future global climate warming are projected to increase the frequency of such events. However, the potential contribution of anthropogenic influence to the most severe recent haze events (December 2015 and January 2013) over eastern China remains unclear. Here we show that the anthropogenic influence, which is estimated by using large ensemble runs with a climate model forced with and without anthropogenic forcings, has already increased the probability of the atmospheric patterns conducive to severe haze by at least 45% in January 2013 and 27% in December 2015. We further confirm that simulated atmospheric circulation pattern changes induced by anthropogenic influence are driven mainly by increased greenhouse gas emissions. Our results suggest that stricter reductions in pollutant emissions are needed under future anthropogenic warming.
Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds
Conway, C.J.; Gibbs, J.P.
2011-01-01
Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and the seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.
Xu, Jason; Minin, Vladimir N
2015-07-01
Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes.
Xu, Jason; Minin, Vladimir N.
2016-01-01
Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes. PMID:26949377
Parameterizing deep convection using the assumed probability density function method
Storer, R. L.; Griffin, B. M.; Höft, J.; ...
2014-06-11
Due to their coarse horizontal resolution, present-day climate models must parameterize deep convection. This paper presents single-column simulations of deep convection using a probability density function (PDF) parameterization. The PDF parameterization predicts the PDF of subgrid variability of turbulence, clouds, and hydrometeors. That variability is interfaced to a prognostic microphysics scheme using a Monte Carlo sampling method. The PDF parameterization is used to simulate tropical deep convection, the transition from shallow to deep convection over land, and mid-latitude deep convection. These parameterized single-column simulations are compared with 3-D reference simulations. The agreement is satisfactory except when the convective forcing is weak. The same PDF parameterization is also used to simulate shallow cumulus and stratocumulus layers. The PDF method is sufficiently general to adequately simulate these five deep, shallow, and stratiform cloud cases with a single equation set. This raises hopes that it may be possible in the future, with further refinements at coarse time step and grid spacing, to parameterize all cloud types in a large-scale model in a unified way.
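The purpose of the Monte Carlo coupling between an assumed subgrid PDF and a microphysics scheme can be illustrated with a toy example: when a process rate is a nonlinear function of a subgrid-variable quantity, averaging the rate over samples drawn from the assumed PDF differs from evaluating the rate at the grid-box mean. The lognormal shape, the quadratic rate, and all parameter values below are illustrative assumptions, not the parameterization's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear process rate ~ q**2 (illustrative; real microphysics rates differ).
def process_rate(q):
    return q ** 2

mean_q, rel_std, n_samples = 0.3e-3, 0.8, 10000   # kg/kg; illustrative values

# Lognormal subgrid PDF with the chosen mean and relative standard deviation.
sigma2 = np.log(1.0 + rel_std ** 2)
mu = np.log(mean_q) - 0.5 * sigma2
samples = rng.lognormal(mu, np.sqrt(sigma2), n_samples)

grid_mean_rate = process_rate(mean_q)          # rate evaluated at the grid-box mean
subgrid_rate = process_rate(samples).mean()    # rate averaged over the subgrid PDF
print(f"rate from grid mean      : {grid_mean_rate:.3e}")
print(f"rate from subgrid samples: {subgrid_rate:.3e}  (larger, by Jensen's inequality)")
```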
Ghavami, Behnam; Raji, Mohsen; Pedram, Hossein
2011-08-26
Carbon nanotube field-effect transistors (CNFETs) show great promise as building blocks of future integrated circuits. However, synthesizing single-walled carbon nanotubes (CNTs) with accurate chirality and exact positioning control has been widely acknowledged as an exceedingly complex task. Indeed, density and chirality variations in CNT growth can compromise the reliability of CNFET-based circuits. In this paper, we present a novel statistical compact model to estimate the failure probability of CNFETs to provide some material and process guidelines for the design of CNFETs in gigascale integrated circuits. We use measured CNT spacing distributions within the framework of detailed failure analysis to demonstrate that both the CNT density and the ratio of metallic to semiconducting CNTs play dominant roles in defining the failure probability of CNFETs. Besides, it is argued that the large-scale integration of these devices within an integrated circuit will be feasible only if a specific range of CNT density with an acceptable ratio of semiconducting to metallic CNTs can be adjusted in a typical synthesis process.
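A simplified version of such a failure model can be written down by assuming the number of CNTs bridging a transistor is Poisson-distributed and each grown tube is independently metallic with some probability: a device then fails "open" if it has no semiconducting tube and fails "short" if any metallic tube survives removal. This is a minimal sketch under those assumptions, not the paper's compact model, and all parameter values are illustrative.

```python
from math import exp

def cnfet_failure_prob(mean_tubes, p_metallic, metallic_removal=0.95):
    """Failure probability of a CNFET under a simple Poisson/independence model.

    mean_tubes       : expected number of CNTs under the gate (density x gate width)
    p_metallic       : probability that a grown CNT is metallic
    metallic_removal : fraction of metallic CNTs removed by processing
    """
    lam_semi = mean_tubes * (1.0 - p_metallic)
    lam_metal_left = mean_tubes * p_metallic * (1.0 - metallic_removal)
    p_open = exp(-lam_semi)                 # no semiconducting tube at all
    p_short = 1.0 - exp(-lam_metal_left)    # at least one surviving metallic tube
    # The two counts are independent Poissons, so combine via inclusion-exclusion.
    return p_open + p_short - p_open * p_short

# Illustrative parameter sweep (values are assumptions, not measured data).
for density in (2, 5, 10):
    p = cnfet_failure_prob(mean_tubes=density, p_metallic=1 / 3)
    print(f"mean CNTs = {density:2d}  ->  P(failure) = {p:.3f}")
```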
Parameterizing deep convection using the assumed probability density function method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Storer, R. L.; Griffin, B. M.; Höft, J.
2015-01-06
Due to their coarse horizontal resolution, present-day climate models must parameterize deep convection. This paper presents single-column simulations of deep convection using a probability density function (PDF) parameterization. The PDF parameterization predicts the PDF of subgrid variability of turbulence, clouds, and hydrometeors. That variability is interfaced to a prognostic microphysics scheme using a Monte Carlo sampling method. The PDF parameterization is used to simulate tropical deep convection, the transition from shallow to deep convection over land, and midlatitude deep convection. These parameterized single-column simulations are compared with 3-D reference simulations. The agreement is satisfactory except when the convective forcing is weak. The same PDF parameterization is also used to simulate shallow cumulus and stratocumulus layers. The PDF method is sufficiently general to adequately simulate these five deep, shallow, and stratiform cloud cases with a single equation set. This raises hopes that it may be possible in the future, with further refinements at coarse time step and grid spacing, to parameterize all cloud types in a large-scale model in a unified way.
Parameterizing deep convection using the assumed probability density function method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Storer, R. L.; Griffin, B. M.; Hoft, Jan
2015-01-06
Due to their coarse horizontal resolution, present-day climate models must parameterize deep convection. This paper presents single-column simulations of deep convection using a probability density function (PDF) parameterization. The PDF parameterization predicts the PDF of subgrid variability of turbulence, clouds, and hydrometeors. That variability is interfaced to a prognostic microphysics scheme using a Monte Carlo sampling method. The PDF parameterization is used to simulate tropical deep convection, the transition from shallow to deep convection over land, and mid-latitude deep convection. These parameterized single-column simulations are compared with 3-D reference simulations. The agreement is satisfactory except when the convective forcing is weak. The same PDF parameterization is also used to simulate shallow cumulus and stratocumulus layers. The PDF method is sufficiently general to adequately simulate these five deep, shallow, and stratiform cloud cases with a single equation set. This raises hopes that it may be possible in the future, with further refinements at coarse time step and grid spacing, to parameterize all cloud types in a large-scale model in a unified way.
Bergström, Richard
2011-04-01
The established market model for pharmaceutical products, as for most other products, is heavily dependent on sales volumes. Thus, it is a primary interest of the producer to sell large quantities. This may be questionable for medicinal products and is probably most questionable for antibacterial remedies. For these products, treatment indications are very complex and encompass potential patient benefits, possible adverse effects in the individual patient and, uniquely for this therapeutic class, consideration of what effects the drug use will have on the future therapeutic value of the drug. This is because bacteria are sure to develop resistance. The European Federation of Pharmaceutical Industries and Associations (EFPIA) agrees with the general description of the antibacterial resistance problem and wants to participate in measures to counteract antibacterial resistance. Stakeholders should forge an alliance that will address the need for and prudent use of new antibiotics. A variety of incentives will probably have to be applied, all having in common that the financial return has to be separated from the use of the product. Copyright © 2011. Published by Elsevier Ltd.
Using Unplanned Fires to Help Suppressing Future Large Fires in Mediterranean Forests
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire–succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000–2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18–22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change. PMID:24727853
Using unplanned fires to help suppressing future large fires in Mediterranean forests.
Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís
2014-01-01
Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. We do however suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved, especially in the wider context of climate change.
Small and large wetland fragments are equally suited breeding sites for a ground-nesting passerine.
Pasinelli, Gilberto; Mayer, Christian; Gouskov, Alexandre; Schiegg, Karin
2008-06-01
Large habitat fragments are generally thought to host more species and to offer more diverse and/or better quality habitats than small fragments. However, the importance of small fragments for population dynamics in general and for reproductive performance in particular is highly controversial. Using an information-theoretic approach, we examined reproductive performance and probability of local recruitment of color-banded reed buntings Emberiza schoeniclus in relation to the size of 18 wetland fragments in northeastern Switzerland over 4 years. We also investigated if reproductive performance and recruitment probability were density-dependent. None of the four measures of reproductive performance (laying date, nest failure probability, fledgling production per territory, fledgling condition) nor recruitment probability were found to be related to wetland fragment size. In terms of fledgling production, however, fragment size interacted with year, indicating that small fragments were better reproductive grounds in some years than large fragments. Reproductive performance and recruitment probability were not density-dependent. Our results suggest that small fragments are equally suited as breeding grounds for the reed bunting as large fragments and should therefore be managed to provide a habitat for this and other specialists occurring in the same habitat. Moreover, large fragments may represent sinks in specific years because a substantial percentage of all breeding pairs in our study area breed in large fragments, and reproductive failure in these fragments due to the regularly occurring floods may have a much stronger impact on regional population dynamics than comparable events in small fragments.
The future of emissions trading in light of the acid rain experience
DOE Office of Scientific and Technical Information (OSTI.GOV)
McLean, B.J.; Rico, R.
1995-12-31
The idea of emissions trading was developed more than two decades ago by environmental economists eager to provide new ideas for how to improve the efficiency of environmental protection. However, early emissions trading efforts were built on the historical "command and control" infrastructure which has dominated U.S. environmental protection until today. The "command and control" model initially had advantages that were of a very pragmatic character: it assured large pollution reductions in a time when large, cheap reductions were available and necessary, and it did not require a sophisticated government infrastructure. Within the last five years, large-scale emissions trading programs have been successfully designed and started that are fundamentally different from the earlier efforts, creating a new paradigm for environmental control just when our understanding of environmental problems is changing as well. The purpose of this paper is to focus on the largest national-scale program--the Acid Rain Program--and from that experience, forecast where emissions trading programs may be headed based on our understanding of the factors currently influencing environmental management. The first section of this paper will briefly review the history of emissions trading programs, followed by a summary of the features of the Acid Rain Program, highlighting those features that distinguish it from previous efforts. The last section addresses the opportunities for emissions trading (and its probable future directions).
Rewriting Ice Sheet "Glacier-ology"
NASA Astrophysics Data System (ADS)
Bindschadler, R.
2006-12-01
The revolution in glaciology driven by the suite of increasingly sophisticated satellite instruments has been no more extreme than in the area of ice dynamics. Years ago, glaciologists were (probably unwittingly) selective in which properties of mountain glaciers were also applied to ice sheets. This reinforced the view that they responded slowly to their environment. Notions of rapid response driven by the ideas of John Mercer, Bill Budd and Terry Hughes were politely rejected by the centrists of mainstream glaciological thought. How the tables have turned--and by the ice sheets themselves, captured in the act of rapidly changing by modern remote sensors! The saw-toothed record of sea-level change over past glacial-interglacial cycles required the existence of rapid ice-loss processes. Satellite-based observations, supported by hard-earned field observations, have extended the time scale over which ice sheets can suddenly change to ever shorter intervals: from centuries, to decades, to years, to even minutes. As changes continue to be observed, the scientific community is forced to consider new or previously ignored processes to explain these observations. The ultimate goal of ice-sheet dynamics is to credibly predict the future of both the Greenland and Antarctic ice sheets. In this important endeavor, there is no substitute for our ability to observe. Without the extensive data sets provided by remote sensing, numerical models can be neither tested nor improved. The impact of remote sensing on our existing ability to predict the future must be compared to our probable state of knowledge and ability were these data never collected. Among the many satellite-observed phenomena we would be largely or wholly ignorant of are the recent acceleration of ice throughout much of coastal Greenland; the sudden disintegration of multiple ice shelves along the Antarctic Peninsula; and the dramatic thinning and acceleration of the Amundsen Sea sector of West Antarctica. These observations are driving increased concern about rapidly increasing sea level, a process dominated by ice-sheet dynamics and largely identified, quantified, studied and monitored by satellite sensors.
ERIC Educational Resources Information Center
Idaho State Library, Boise.
In 1998, Idahoans gathered in a series of six Regional Futures Conferences to identify what they thought was probable during the next ten years, what was possible for libraries to do and be, and what a preferred future of Idaho libraries might be. Participants from the regional conferences then convened to refine and focus descriptions of the…
The Future Outlook for School Facilities Planning and Design.
ERIC Educational Resources Information Center
Brubaker, C. William
School design is influenced by four major factors: the education program, the community site, education technology, and building technology. Schools of the future are discussed in relation to the factors affecting school design. It is probable that future schools will be involved in a broader spectrum of programs and will serve a more diverse…
Warship Combat System Selection Methodology Based on Discrete Event Simulation
2010-09-01
[Fragment of the report's abbreviations list: PD, Damage Probability; PHit, Hit Probability; PKill, Kill Probability; RSM, Response Surface Model; SAM, Surface-Air Missile.] ...such a large target allows an assumption that the probability of a hit (PHit) is one. This structure can be considered as a bridge; therefore, the...
Uncertainty plus Prior Equals Rational Bias: An Intuitive Bayesian Probability Weighting Function
ERIC Educational Resources Information Center
Fennell, John; Baddeley, Roland
2012-01-01
Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several…
Tropical and Extratropical Cyclone Damages under Climate Change
NASA Astrophysics Data System (ADS)
Ranson, M.; Kousky, C.; Ruth, M.; Jantarasami, L.; Crimmins, A.; Tarquinio, L.
2014-12-01
This paper provides the first quantitative synthesis of the rapidly growing literature on future tropical and extratropical cyclone losses under climate change. We estimate a probability distribution for the predicted impact of changes in global surface air temperatures on future storm damages, using an ensemble of 296 estimates of the temperature-damage relationship from twenty studies. Our analysis produces three main empirical results. First, we find strong but not conclusive support for the hypothesis that climate change will cause damages from tropical cyclones and wind storms to increase, with most models (84 and 92 percent, respectively) predicting higher future storm damages due to climate change. Second, there is substantial variation in projected changes in losses across regions. Potential changes in damages are greatest in the North Atlantic basin, where the multi-model average predicts that a 2.5°C increase in global surface air temperature would cause hurricane damages to increase by 62 percent. The ensemble predictions for Western North Pacific tropical cyclones and European wind storms (extratropical cyclones) are approximately one third of that magnitude. Finally, our analysis shows that existing models of storm damages under climate change generate a wide range of predictions, ranging from moderate decreases to very large increases in losses.
Persistent Cold Air Outbreaks over North America Under Climate Warming
NASA Astrophysics Data System (ADS)
Gao, Y.; Leung, L. R.; Lu, J.
2014-12-01
This study evaluates the change of cold air outbreaks (CAO) over North America using the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model ensemble of global climate simulations as well as regional high-resolution climate simulations. In the future, while a robust decrease of CAO duration dominates over most of North America, the decrease over the northwestern U.S. is of much smaller magnitude than over the surrounding regions. We found a statistically significant increase of sea level pressure over the Gulf of Alaska, leading to advection of cold air into the northwestern U.S. By shifting the probability distribution of present temperature towards future warmer conditions, we identified that changes in the large-scale circulation contribute about 50% of the enhanced sea level pressure. Using the high-resolution regional climate model results, we found that increases of existing snowpack could potentially trigger an increase of CAO in the near future over the southwestern U.S. and the Rocky Mountains through surface albedo effects. By the end of this century, the top 5 most extreme historical CAO events may still occur, and wind chill warnings will continue to have societal impacts over North America, in particular over the northwestern United States.
Large Instrument Development for Radio Astronomy
NASA Astrophysics Data System (ADS)
Fisher, J. Richard; Warnick, Karl F.; Jeffs, Brian D.; Norrod, Roger D.; Lockman, Felix J.; Cordes, James M.; Giovanelli, Riccardo
2009-03-01
This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.
NASA Astrophysics Data System (ADS)
Biass, Sébastien; Falcone, Jean-Luc; Bonadonna, Costanza; Di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino
2016-10-01
We present a probabilistic approach to quantify the hazard posed by volcanic ballistic projectiles (VBP) and their potential impact on the built environment. A model named Great Balls of Fire (GBF) is introduced to describe ballistic trajectories of VBPs accounting for a variable drag coefficient and topography. It relies on input parameters easily identifiable in the field and is designed to model large numbers of VBPs stochastically. Associated functions come with the GBF code to post-process model outputs into a comprehensive probabilistic hazard assessment for VBP impacts. Outcomes include probability maps to exceed given thresholds of kinetic energies at impact, hazard curves and probabilistic isoenergy maps. Probabilities are calculated either on equally-sized pixels or zones of interest. The approach is calibrated, validated and applied to La Fossa volcano, Vulcano Island (Italy). We constructed a generic eruption scenario based on stratigraphic studies and numerical inversions of the 1888-1890 long-lasting Vulcanian cycle of La Fossa. Results suggest a ~10^-2% probability of occurrence of VBP impacts with kinetic energies ≤ 10^4 J at the touristic locality of Porto. In parallel, the vulnerability to roof perforation was estimated by combining field observations and published literature, allowing for a first estimate of the potential impact of VBPs during future Vulcanian eruptions. Results indicate a high physical vulnerability to the VBP hazard, with, consequently, half of the building stock having a ≥ 2.5 × 10^-3% probability of roof perforation.
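The probabilistic workflow (stochastic VBP launches, ballistic integration with drag, and exceedance probabilities of impact kinetic energy) can be sketched compactly with a flat-topography, constant-drag-coefficient toy model. All launch distributions, block properties, and the 10^4 J threshold below are illustrative assumptions, not the GBF model's inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

RHO_AIR, G = 1.2, 9.81          # air density (kg/m^3), gravity (m/s^2)

def impact_energy(v0, angle_deg, diameter, rho_block=2500.0, cd=1.0, dt=0.02):
    """Integrate a 2-D ballistic trajectory with quadratic drag over flat ground
    and return the kinetic energy (J) at impact."""
    radius = diameter / 2.0
    mass = rho_block * (4.0 / 3.0) * np.pi * radius ** 3
    area = np.pi * radius ** 2
    theta = np.radians(angle_deg)
    z = 0.0
    vx, vz = v0 * np.cos(theta), v0 * np.sin(theta)
    while True:
        speed = np.hypot(vx, vz)
        drag = 0.5 * RHO_AIR * cd * area * speed   # drag force / speed
        vx -= (drag * vx / mass) * dt
        vz -= (G + drag * vz / mass) * dt
        z += vz * dt
        if z <= 0.0 and vz < 0.0:
            return 0.5 * mass * (vx ** 2 + vz ** 2)

# Stochastic launch conditions (illustrative distributions, not GBF inputs).
n = 500
v0 = rng.uniform(50.0, 200.0, n)          # ejection speed (m/s)
angle = rng.uniform(30.0, 80.0, n)        # ejection angle (deg)
diameter = rng.uniform(0.05, 0.5, n)      # block diameter (m)

energies = np.array([impact_energy(v, a, d) for v, a, d in zip(v0, angle, diameter)])
threshold = 1.0e4                          # J, illustrative roof-perforation threshold
print(f"P(impact kinetic energy > {threshold:.0e} J) = {(energies > threshold).mean():.2f}")
```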
NASA Astrophysics Data System (ADS)
Coleman, N.; Abramson, L.
2004-05-01
Yucca Mt. (YM) is a potential repository site for high-level radioactive waste and spent fuel. One issue is the potential for future igneous activity to intersect the repository. If the event probability is <1E-8/yr, it need not be considered in licensing. Plio-Quaternary volcanos and older basalts occur near YM. Connor et al (JGR, 2000) estimate a probability of 1E-8/yr to 1E-7/yr for a basaltic dike to intersect the potential repository. Based on aeromagnetic data, Hill and Stamatakos (CNWRA, 2002) propose that additional volcanos may lie buried in nearby basins. They suggest if these volcanos are part of temporal-clustered volcanic activity, the probability of an intrusion may be as high as 1E-6/yr. We examine whether recurrence probabilities >2E-7/yr are realistic given that no dikes have been found in or above the 1.3E7 yr-old potential repository block. For 2E-7/yr (or 1E-6/yr), the expected number of penetrating dikes is 2.6 (respectively, 13), and the probability of at least one penetration is 0.93 (0.999). These results are not consistent with the exploration evidence. YM is one of the most intensively studied places on Earth. Over 20 yrs of studies have included surface and subsurface mapping, geophysical surveys, construction of 10+ km of tunnels in the mountain, drilling of many boreholes, and construction of many pits (DOE, Site Recommendation, 2002). It seems unlikely that multiple dikes could exist within the proposed repository footprint and escape detection. A dike complex dated 11.7 Ma (Smith et al, UNLV, 1997) or 10 Ma (Carr and Parrish, 1985) does exist NW of YM and west of the main Solitario Canyon Fault. These basalts intruded the Tiva Canyon Tuff (12.7 Ma) in an epoch of caldera-forming pyroclastic eruptions that ended millions of yrs ago. We would conclude that basaltic volcanism related to Miocene silicic volcanism may also have ended. Given the nondetection of dikes in the potential repository, we can use a Poisson model to estimate an upper-bound probability of 2E-7/yr (95% conf. level) for an igneous intrusion over the next 1E4 yrs. If we assume one undiscovered dike exists, the upper-bound probability would rise to 4E-7/yr. Higher probabilities may be possible if conditions that fostered Plio-Quaternary volcanism became enhanced over time. To the contrary, basalts of the past 11 Ma in Crater Flat have erupted in four episodes that together show a declining trend in erupted magma volume (DOE, TBD13, 2003). Smith et al (GSA Today, 2002) suggest there may be a common magma source for volcanism in Crater Flat and the Lunar Crater volcanic field, and that recurrence rates for YM could be underestimated. Their interpretation is highly speculative given the 130-km (80-mi) distance between these zones. A claim that crustal extension at YM is anomalously large, possibly favoring renewed volcanism (Wernicke et al, Science, 1999), was contradicted by later work (Savage et al, JGR, 2000). Spatial-temporal models that predict future intrusion probabilities of >2E-7/yr may be overly conservative and unrealistic. Along with currently planned site characterization activities, realistic models could be developed by considering the non-detection of basaltic dikes in the potential repository footprint. 
(The views expressed are the authors' and do not reflect any final judgment or determination by the Advisory Committee on Nuclear Waste or the Nuclear Regulatory Commission regarding the matters addressed or the acceptability of a license application for a geologic repository at Yucca Mt.)
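The upper-bound rates quoted in the preceding abstract follow from a standard Poisson confidence-bound argument: given n dikes found over an exposure time T, the 95% upper bound on the annual rate is the largest rate for which observing at most n dikes still has a 5% probability. A minimal numerical sketch, assuming only the T = 1.3E7 yr exposure stated above:

```python
from math import exp

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson random variable with mean mu."""
    term, total = exp(-mu), exp(-mu)
    for k in range(1, n + 1):
        term *= mu / k
        total += term
    return total

def rate_upper_bound(n_events, exposure_yr, conf=0.95):
    """One-sided upper confidence bound on a Poisson rate: the largest rate whose
    probability of producing <= n_events in exposure_yr is still (1 - conf)."""
    lo, hi = 0.0, 100.0  # bracket for the expected count mu, found by bisection
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_events, mid) > 1.0 - conf:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi) / exposure_yr

T = 1.3e7  # age of the repository block (yr), from the abstract
for n in (0, 1):
    print(f"{n} dike(s) found -> 95% upper bound on intrusion rate "
          f"= {rate_upper_bound(n, T):.1e}/yr")
# Gives roughly 2E-7/yr for zero detected dikes and 4E-7/yr if one undiscovered
# dike is assumed, consistent with the values quoted above.
```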
Large Deviations: Advanced Probability for Undergrads
ERIC Educational Resources Information Center
Rolls, David A.
2007-01-01
In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…
Focus in High School Mathematics: Statistics and Probability
ERIC Educational Resources Information Center
National Council of Teachers of Mathematics, 2009
2009-01-01
Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…
Stochastic Modeling of Past Volcanic Crises
NASA Astrophysics Data System (ADS)
Woo, Gordon
2018-01-01
The statistical foundation of disaster risk analysis is past experience. From a scientific perspective, history is just one realization of what might have happened, given the randomness and chaotic dynamics of Nature. Stochastic analysis of the past is an exploratory exercise in counterfactual history, considering alternative possible scenarios. In particular, the dynamic perturbations that might have transitioned a volcano from an unrest to an eruptive state need to be considered. The stochastic modeling of past volcanic crises leads to estimates of eruption probability that can illuminate historical volcanic crisis decisions. It can also inform future economic risk management decisions in regions where there has been some volcanic unrest, but no actual eruption for at least hundreds of years. Furthermore, the availability of a library of past eruption probabilities would provide benchmark support for estimates of eruption probability in future volcanic crises.
Predicting potentially toxigenic Pseudo-nitzschia blooms in the Chesapeake Bay
Anderson, C.R.; Sapiano, M.R.P.; Prasad, M.B.K.; Long, W.; Tango, P.J.; Brown, C.W.; Murtugudde, R.
2010-01-01
Harmful algal blooms are now recognized as a significant threat to the Chesapeake Bay as they can severely compromise the economic viability of important recreational and commercial fisheries in the largest estuary of the United States. This study describes the development of empirical models for the potentially domoic acid-producing Pseudo-nitzschia species complex present in the Bay, developed from a 22-year time series of cell abundance and concurrent measurements of hydrographic and chemical properties. Using a logistic Generalized Linear Model (GLM) approach, model parameters and performance were compared over a range of Pseudo-nitzschia bloom thresholds relevant to toxin production by different species. Small-threshold blooms (≥10 cells mL-1) are explained by time of year, location, and variability in surface values of phosphate, temperature, nitrate plus nitrite, and freshwater discharge. Medium- (100 cells mL-1) to large-threshold (1000 cells mL-1) blooms are further explained by salinity, silicic acid, dissolved organic carbon, and light attenuation (Secchi) depth. These predictors are similar to other models for Pseudo-nitzschia blooms on the west coast, suggesting commonalities across ecosystems. Hindcasts of bloom probabilities at a 19% bloom prediction point yield a Heidke Skill Score of ~53%, a Probability of Detection of ~75%, a False Alarm Ratio of ~52%, and a Probability of False Detection of ~9%. The implication of possible future changes in Baywide nutrient stoichiometry on Pseudo-nitzschia blooms is discussed. © 2010 Elsevier B.V.
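The skill metrics reported (Probability of Detection, False Alarm Ratio, Probability of False Detection, Heidke Skill Score) are all functions of a 2x2 contingency table of hindcast versus observed blooms. The sketch below computes them from a hypothetical table chosen only so the resulting scores land near the values quoted above; it is not the study's verification data.

```python
def verification_scores(hits, false_alarms, misses, correct_negatives):
    """Standard categorical verification scores from a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                       # probability of detection
    far = b / (a + b)                       # false alarm ratio
    pofd = b / (b + d)                      # probability of false detection
    hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, pofd, hss

# Hypothetical contingency table for illustration only (not the study's data).
pod, far, pofd, hss = verification_scores(hits=30, false_alarms=32, misses=10,
                                          correct_negatives=400)
print(f"POD={pod:.2f}  FAR={far:.2f}  POFD={pofd:.2f}  HSS={hss:.2f}")
```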
NASA Astrophysics Data System (ADS)
Nasaruddin; Tsujioka, Tetsuo
An optical CDMA (OCDMA) system is a flexible technology for future broadband multiple access networks. A secure OCDMA network in broadband optical access technologies is also becoming an issue of great importance. In this paper, we propose novel reconfigurable wavelength-time (W-T) optical codes that lead to secure transmission in OCDMA networks. The proposed W-T optical codes are constructed by using quasigroups (QGs) for wavelength hopping and one-dimensional optical orthogonal codes (OOCs) for time spreading; we call them QGs/OOCs. Both QGs and OOCs are randomly generated by a computer search to ensure that an eavesdropper could not improve its interception performance by making use of the coding structure. The proposed reconfigurable QGs/OOCs can provide more codewords and many different code set patterns, which differ in both wavelength and time positions for given code parameters. Moreover, the bit error probability of the proposed codes is analyzed numerically. To realize the proposed codes, a secure system is proposed by employing reconfigurable encoders/decoders based on arrayed waveguide gratings (AWGs), which allow the users to change their codeword patterns to protect against eavesdropping. Finally, the probability of breaking a certain codeword in the proposed system is evaluated analytically. The results show that the proposed codes and system can provide a large number of codeword patterns and decrease the probability of breaking a certain codeword, thereby enhancing OCDMA network security.
Potential Future Igneous Activity at Yucca Mountain, Nevada
NASA Astrophysics Data System (ADS)
Cline, M.; Perry, F. V.; Valentine, G. A.; Smistad, E.
2005-12-01
Location, timing, and volumes of post-Miocene volcanic activity, along with expert judgement, provide the basis for assessing the probability of future volcanism intersecting a proposed repository for nuclear waste at Yucca Mountain, Nevada. Analog studies of eruptive centers in the region that may represent the style and extent of possible future igneous activity at Yucca Mountain have aided in defining the consequence scenarios for intrusion into and eruption through a proposed repository. Modeling of magmatic processes related to magma/proposed repository interactions has been used to assess the potential consequences of a future igneous event through a proposed repository at Yucca Mountain. Results of work to date indicate future igneous activity in the Yucca Mountain region has a very low probability of intersecting the proposed repository. The probability of a future event intersecting a proposed repository at Yucca Mountain is approximately 1.7x10-8 per year. Since completion of the Probabilistic Volcanic Hazard Assessment (PVHA) in 1996, anomalies representing potential buried volcanic centers have been identified from aeromagnetic surveys. A re-assessment of the hazard is currently underway to evaluate the probability of intersection in light of new information and to estimate the probability of one or more volcanic conduits located in the proposed repository along a dike that intersects the proposed repository. U.S. Nuclear Regulatory Commission regulations for siting and licensing a proposed repository require that the consequences of a disruptive event (igneous event) with annual probability greater than 1x10-8 be evaluated. Two consequence scenarios are considered: 1) an igneous intrusion-groundwater transport case and 2) a volcanic eruptive case. These scenarios equate to a dike or dike swarm intersecting repository drifts containing waste packages, formation of a conduit leading to a volcanic eruption through the repository that carries the contents of the waste packages into the atmosphere, deposition of a tephra sheet, and redistribution of the contaminated ash. In both cases radioactive material is released to the accessible environment either through groundwater transport or through atmospheric dispersal and deposition. Six Quaternary volcanic centers exist within 20 km of Yucca Mountain. Lathrop Wells cone (LWC), the youngest (approximately 75,000 yrs), is a well-preserved cinder cone with associated flows and tephra sheet that provides an excellent analogue for consequence studies related to future volcanism. Cone, lavas, hydrovolcanic ash, and ash-fall tephra have been examined to estimate eruptive volume and eruption type. LWC ejecta volumes suggest basaltic volcanism may be waning in the Yucca Mountain region. The eruptive products indicate a sequence of initial fissure fountaining, early Strombolian ash and lapilli deposition forming the scoria cone, a brief hydrovolcanic pulse (possibly limited to the NW sector), and a violent Strombolian phase. Mathematical models have been developed to represent magmatic processes and their consequences on proposed repository performance. These models address dike propagation, magma interaction and flow into drifts, eruption through the proposed repository, and post intrusion/eruption effects. These models continue to be refined to reduce the uncertainty associated with the consequences from a possible future igneous event.
NASA Technical Reports Server (NTRS)
Naumann, R. J.; Oran, W. A.; Whymark, R. R.; Rey, C.
1981-01-01
The single axis acoustic levitator that was flown on SPAR VI malfunctioned. The results of a series of tests, analyses, and investigation of hypotheses that were undertaken to determine the probable cause of failure are presented, together with recommendations for future flights of the apparatus. The most probable causes of the SPAR VI failure were lower than expected sound intensity due to mechanical degradation of the sound source, and an unexpected external force that caused the experiment sample to move radially and eventually be lost from the acoustic energy well.
NASA Astrophysics Data System (ADS)
Quinn, N.; Bates, P. D.; Siddall, M.
2013-12-01
The rate at which sea levels will rise in the coming century is of great interest to decision makers tasked with developing mitigation policies to cope with the risk of coastal inundation. Accurate estimates of future sea levels are vital in the provision of effective policy. Recent reports from UK Climate Impacts Programme (UKCIP) suggest that mean sea levels in the UK may rise by as much as 80 cm by 2100; however, a great deal of uncertainty surrounds model predictions, particularly the contribution from ice sheets responding to climatic warming. For this reason, the application of semi-empirical modelling approaches for sea level rise predictions has increased of late, the results from which suggest that the rate of sea level rise may be greater than previously thought, exceeding 1 m by 2100. Furthermore, studies in the Red Sea indicate that rapid sea level rise beyond 1m per century has occurred in the past. In light of such research, the latest UKCIP assessment has included a H++ scenario for sea level rise in the UK of up to 1.9 m which is defined as improbable but, crucially, physically plausible. The significance of such low-probability sea level rise scenarios upon the estimation of future flood risk is assessed using the Somerset levels (UK) as a case study. A simple asymmetric probability distribution is constructed to include sea level rise scenarios of up to 1.9 m by 2100 which are added to a current 1:200 year event water level to force a two-dimensional hydrodynamic model of coastal inundation. From the resulting ensemble predictions an estimation of risk by 2100 is established. The results indicate that although the likelihood of extreme sea level rise due to rapid ice sheet mass loss is low, the resulting hazard can be large, resulting in a significant (27%) increase to the projected annual risk. Furthermore, current defence construction guidelines for the coming century in the UK are expected to account for 95% of the sea level rise distribution presented in this research, while the larger, low probability scenarios beyond this level are estimated to contribute a residual annual risk of approximately £0.45 million. These findings clearly demonstrate that uncertainty in future sea level rise is a vital component of coastal flood risk, and therefore, needs to be accounted for by decision makers when considering mitigation policies related to coastal flooding.
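The risk estimate above combines a sea-level-rise probability distribution with modelled inundation consequences. A minimal Python sketch of that final step; the lognormal shape, the truncation at 1.9 m, and the quadratic damage curve are assumptions made only for illustration and are not the study's distribution or damage model.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical right-skewed distribution of sea-level rise by 2100 (metres),
    # truncated at the H++ value of 1.9 m; the parameters are invented.
    slr = np.minimum(rng.lognormal(mean=np.log(0.5), sigma=0.5, size=100_000), 1.9)

    def annual_damage(rise_m):
        # Placeholder damage curve (million GBP per year), growing with sea-level rise.
        return 2.0 * rise_m ** 2

    expected_risk = annual_damage(slr).mean()
    top5 = slr > np.quantile(slr, 0.95)
    residual_risk = annual_damage(slr[top5]).mean() * 0.05
    print(f"expected annual risk ~ {expected_risk:.2f} M GBP; "
          f"contribution of the top 5% of scenarios ~ {residual_risk:.2f} M GBP")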
Flower, Aquila; G. Gavin, Daniel; Heyerdahl, Emily K.; Parsons, Russell A.; Cohn, Gregory M.
2014-01-01
Insect outbreaks are often assumed to increase the severity or probability of fire occurrence through increased fuel availability, while fires may in turn alter susceptibility of forests to subsequent insect outbreaks through changes in the spatial distribution of suitable host trees. However, little is actually known about the potential synergisms between these natural disturbances. Assessing inter-disturbance synergism is challenging due to the short length of historical records and the confounding influences of land use and climate changes on natural disturbance dynamics. We used dendrochronological methods to reconstruct defoliator outbreaks and fire occurrence at ten sites along a longitudinal transect running from central Oregon to western Montana. We assessed synergism between disturbance types, analyzed long-term changes in disturbance dynamics, and compared these disturbance histories with dendroclimatological moisture availability records to quantify the influence of moisture availability on disturbances. After approximately 1890, fires were largely absent and defoliator outbreaks became longer-lasting, more frequent, and more synchronous at our sites. Fires were more likely to occur during warm-dry years, while outbreaks were most likely to begin near the end of warm-dry periods. Our results show no discernible impact of defoliation events on subsequent fire risk. Any effect from the addition of fuels during defoliation events appears to be too small to detect given the overriding influence of climatic variability. We therefore propose that if there is any relationship between the two disturbances, it is a subtle synergistic relationship wherein climate determines the probability of occurrence of each disturbance type, and each disturbance type damps the severity, but does not alter the probability of occurrence, of the other disturbance type over long time scales. Although both disturbance types may increase in frequency or extent in response to future warming, our records show no precedent that western spruce budworm outbreaks will increase future fire risk. PMID:25526633
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2012-04-01
The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 years, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterise the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of a few data points from large eruptions, since these data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. Shortly after the publication of this method, an eruption on the island of El Hierro took place for the first time in historical times, supporting our method and contributing towards the validation of our results.
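The probability statement at the heart of this approach, the chance of at least one event under a non-homogeneous Poisson process, can be sketched as follows; the intensity function and forecast horizon are placeholders, not the fitted Canary Islands model.

    import numpy as np
    from scipy.integrate import quad

    # For a non-homogeneous Poisson process with intensity lambda(t) (events/year),
    # P(at least one event in [0, T]) = 1 - exp(-integral of lambda over [0, T]).
    # The intensity below is an invented increasing function, used only to illustrate.
    lam = lambda t: 0.02 + 0.001 * t

    T = 20.0  # forecast horizon in years
    expected_events, _ = quad(lam, 0.0, T)
    p_at_least_one = 1.0 - np.exp(-expected_events)
    print(f"P(>=1 eruption of magnitude > 1 within {T:.0f} yr) = {p_at_least_one:.2f}")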
Integrating Future Information through Scenarios. AIR 1985 Annual Forum Paper.
ERIC Educational Resources Information Center
Zentner, Rene D.
The way that higher education planners can take into account changes in the post-industrial society is discussed. The scenario method is proposed as a method of integrating futures information. The planner can be provided with several probable futures, each of which can be incorporated in a scenario. An effective scenario provides the planner…
Global mega forces: Implications for the future of natural resources
George H. Kubik
2012-01-01
The purpose of this paper is to provide an overview of leading global mega forces and their importance to the future of natural resource decisionmaking, policy development, and operation. Global mega forces are defined as a combination of major trends, preferences, and probabilities that come together to produce the potential for future high-impact outcomes. These...
Future-Orientated Approaches to Curriculum Development: Fictive Scripting
ERIC Educational Resources Information Center
Garraway, James
2017-01-01
Though the future cannot be accurately predicted, it is possible to envisage a number of probable developments which can promote thinking about the future and so promote a more informed stance about what should or should not be done. Studies in technology and society have claimed that the use of a type of forecasting using plausible but imaginary…
The Hayward-Rodgers Creek Fault System: Learning from the Past to Forecast the Future
NASA Astrophysics Data System (ADS)
Schwartz, D. P.; Lienkaemper, J. J.; Hecker, S.
2007-12-01
The San Francisco Bay area is located within the Pacific-North American plate boundary. As a result, the region has the highest density of active faults per square kilometer of any urban center in the US. Between the Farallon Islands and Livermore, the faults of the San Andreas fault system are slipping at a rate of about 40 mm/yr. Approximately 25 percent of this rate is accommodated by the Hayward fault and its continuation to the north, the Rodgers Creek fault. The Hayward fault extends 88 km from Warm Springs on the south into San Pablo Bay on the north, traversing the most heavily urbanized part of the Bay Area. The Rodgers Creek fault extends another 63 km, passing through Santa Rosa and ending south of Healdsburg. Geologic, seismologic, and geodetic studies during the past ten years have significantly increased our knowledge of this system. In particular, paleoseismic studies of the timing of past earthquakes have provided critical new information for improving our understanding of how these faults may work in time and space, and for estimating the probability of future earthquakes. The most spectacular result is an 11-earthquake record on the southern Hayward fault that extends back to A.D. 170. It suggests an average time interval between large earthquakes of 170 years for this period, with a shorter interval of 140 years for the five most recent earthquakes. Paleoseismic investigations have also shown that prior to the most recent large earthquake on the southern Hayward fault in 1868, large earthquakes occurred on the southern Hayward fault between 1658 and 1786, on the northern Hayward fault between 1640 and 1776, and on the Rodgers Creek fault between 1690 and 1776. These could have been three separate earthquakes. However, the overlapping radiocarbon dates for these paleoearthquakes allow the possibility that these faults may have ruptured together in several different combinations: a combined southern and northern Hayward fault earthquake, a Rodgers Creek-northern Hayward fault earthquake, or a rupture of all three fault sections. Each of these rupture combinations would produce a magnitude larger than 1868 (M~6.9). In 2003, the Working Group on California Earthquake Probabilities released a new earthquake forecast for the Bay Area. Using the earthquake timing data and alternative fault rupture models, the Working Group estimated a 27 percent likelihood of a M≥6.7 earthquake along the Hayward-Rodgers Creek fault zone by the year 2031. This is the highest probability of any individual fault system in the Bay Area. New paleoseismic data will allow updating of this forecast.
NASA Astrophysics Data System (ADS)
Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.
2009-04-01
Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006, further tsunamis are expected in the near future, because increased tectonic tension after a century of relative tectonic silence can lead to abrupt vertical seafloor alterations. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. For tsunami impact, hazard assessment is mostly based on numerical modelling, because model results normally offer the most precise database for a hazard analysis, as they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is usually chosen by a worst-case approach: the location and magnitude that are likely to occur and that are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must be considered. Thus a multi-scenario tsunami modelling approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian Early Warning System, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at the coast in this area, and the estimated times of arrival (ETAs) of the waves caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap along the coast and must be combined to obtain one comprehensive hazard assessment for all possible future tsunamis in the region under observation. The simplest way to derive the inundation probability along the coast with the multi-scenario approach is to overlay all scenario inundation results and determine how often a point on land is significantly inundated across the various scenarios. But this does not take into account that the tsunamigenic sources used for the modelled scenarios have different likelihoods of causing a tsunami. Hence a statistical analysis of historical data and of geophysical investigation results based on numerical modelling is added to the hazard assessment, which clearly improves its significance. For this purpose the present method is developed; it contains a logical combination of the diverse probabilities assessed, such as the probability of occurrence of different earthquake magnitudes at different localities, the probability of occurrence of a specific wave height at the coast, and the probability that each point on land is hit by a tsunami. The values are combined by a logical-tree technique and quantified by statistical analysis of historical data and of the tsunami modelling results, as mentioned before.
This results in a tsunami inundation probability map covering the south-west coast of Indonesia which nevertheless shows significant spatial diversity, offering a good basis for evacuation planning and mitigation strategies. Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
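A minimal Python sketch of how per-scenario inundation results and source occurrence probabilities can be combined into a point-wise inundation probability, assuming independent sources; the grid, masks, and probabilities are invented, and the logical-tree combination described above involves more probability layers than shown here.

    import numpy as np

    # Assumed annual occurrence probabilities of three tsunamigenic sources.
    p_source = np.array([0.01, 0.004, 0.02])

    # Boolean inundation masks from each scenario run, on a tiny 2x3 grid of land points.
    inundated = np.array([
        [[1, 1, 0], [0, 0, 0]],   # scenario 1
        [[1, 1, 1], [1, 0, 0]],   # scenario 2
        [[0, 1, 0], [0, 0, 0]],   # scenario 3
    ], dtype=bool)

    # Assuming independent sources, a point stays dry only if no inundating scenario occurs:
    # P(inundation) = 1 - product over scenarios of (1 - p_source * inundated).
    p_dry = np.prod(1.0 - p_source[:, None, None] * inundated, axis=0)
    p_inundation = 1.0 - p_dry
    print(p_inundation)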
Saying Goodbye to 'Bonneville' Crater
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site] NASA's Mars Exploration Rover Spirit took this panoramic camera image on sol 86 (March 31, 2004) before driving 36 meters (118 feet) on sol 87 toward its future destination, the Columbia Hills. This is probably the last panoramic camera image that Spirit will take from the high rim of 'Bonneville' crater, and provides an excellent view of the ejecta-covered path the rover has journeyed thus far. The lander can be seen toward the upper right of the frame and is approximately 321 meters (1060 feet) away from Spirit's current location. The large hill on the horizon is Grissom Hill. The Columbia Hills, located to the left, are not visible in this image.
Existential Risk and Cost-Effective Biosecurity
Snyder-Beattie, Andrew
2017-01-01
In the decades to come, advanced bioweapons could threaten human existence. Although the probability of human extinction from bioweapons may be low, the expected value of reducing the risk could still be large, since such risks jeopardize the existence of all future generations. We provide an overview of biotechnological extinction risk, make some rough initial estimates for how severe the risks might be, and compare the cost-effectiveness of reducing these extinction-level risks with existing biosecurity work. We find that reducing human extinction risk can be more cost-effective than reducing smaller-scale risks, even when using conservative estimates. This suggests that the risks are not low enough to ignore and that more ought to be done to prevent the worst-case scenarios. PMID:28806130
Pritychenko, B.; Birch, M.; Singh, B.; ...
2015-11-03
A complete B(E2)↑ evaluation and compilation for even-even nuclei has been presented. The present paper is a continuation of the nuclear data evaluations of P.H. Stelson and L. Grodzins and of S. Raman et al., and was motivated by a large number of new measurements. It extends the list of evaluated nuclides from 328 to 452, and includes an extended list of nuclear reaction kinematics parameters and a comprehensive shell-model analysis. Evaluation policies for the analysis of experimental data are discussed and conclusions are given. Moreover, future plans for B(E2)↑ systematics and experimental technique analyses of even-even nuclei are outlined.
Statistical Characterization and Classification of Edge-Localized Plasma Instabilities
NASA Astrophysics Data System (ADS)
Webster, A. J.; Dendy, R. O.
2013-04-01
The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
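A minimal Python sketch of fitting a Weibull distribution to ELM waiting times by maximum likelihood with SciPy; the waiting times here are synthetic, standing in for measured inter-ELM intervals.

    from scipy import stats

    # Synthetic ELM waiting times (ms); real data would be the measured inter-ELM intervals.
    waits = stats.weibull_min.rvs(c=2.5, scale=12.0, size=2000, random_state=42)

    # Maximum-likelihood Weibull fit with the location parameter fixed at zero.
    shape, loc, scale = stats.weibull_min.fit(waits, floc=0)
    print(f"fitted Weibull shape k = {shape:.2f}, scale = {scale:.2f} ms")

    # k = 1 would correspond to a memoryless (Poisson-like) process; k > 1 indicates
    # suppressed short waiting times, i.e. quasi-periodic ELM behaviour.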
Assessment of undiscovered oil and gas in the arctic
Gautier, Donald L.; Bird, Kenneth J.; Charpentier, Ronald R.; Grantz, Arthur; Houseknecht, David W.; Klett, Timothy R.; Moore, Thomas E.; Pitman, Janet K.; Schenk, Christopher J.; Schuenemeyer, John H.; Sorensen, Kai; Tennyson, Marilyn E.; Valin, Zenon C.; Wandrey, Craig J.
2009-01-01
Among the greatest uncertainties in future energy supply and a subject of considerable environmental concern is the amount of oil and gas yet to be found in the Arctic. By using a probabilistic geology-based methodology, the United States Geological Survey has assessed the area north of the Arctic Circle and concluded that about 30% of the world’s undiscovered gas and 13% of the world’s undiscovered oil may be found there, mostly offshore under less than 500 meters of water. Undiscovered natural gas is three times more abundant than oil in the Arctic and is largely concentrated in Russia. Oil resources, although important to the interests of Arctic countries, are probably not sufficient to substantially shift the current geographic pattern of world oil production.
Maps showing gas-hydrate distribution off the east coast of the United States
Dillon, William P.; Fehlhaber, Kristen L.; Coleman, Dwight F.; Lee, Myung W.; Hutchinson, Deborah R.
1995-01-01
These maps present the inferred distribution of natural-gas hydrate within the sediments of the eastern United States continental margin (Exclusive Economic Zone) in the offshore region from Georgia to New Jersey (fig. 1). The maps, which were created on the basis of seismic interpretations, represent the first attempt to map volume estimates for gas hydrate. Gas hydrate forms a large reservoir for methane in oceanic sediments. Therefore it potentially may represent a future source of energy and it may influence climate change because methane is a very effective greenhouse gas. Hydrate breakdown probably is a controlling factor for sea-floor landslides, and its presence has significant effect on the acoustic velocity of sea-floor sediments.
Neurons That Update Representations of the Future.
Seriès, Peggy
2018-06-11
A recent article shows that the brain automatically estimates the probabilities of possible future actions before it has even received all the information necessary to decide what to do next. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
Kwasniok, Frank
2013-11-01
A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
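The general idea, estimating a density whose parameters drift in time by maximum likelihood and then extrapolating those parameters, can be sketched as follows; the Gaussian with a linearly drifting mean is an assumption made for illustration, not the parametric model of the paper.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(2)

    # Synthetic record with a drifting mean (e.g. a declining sea-ice extent anomaly).
    t = np.linspace(0.0, 1.0, 300)
    x = 2.0 - 1.5 * t + 0.3 * rng.standard_normal(t.size)

    def neg_log_likelihood(params):
        a, b, log_sigma = params
        sigma = np.exp(log_sigma)
        mu = a + b * t                       # time-dependent mean
        # Gaussian negative log-likelihood up to an additive constant.
        return 0.5 * np.sum(((x - mu) / sigma) ** 2) + x.size * log_sigma

    fit = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0, 0.0]))
    a, b, log_sigma = fit.x
    sigma = np.exp(log_sigma)

    # Extrapolate the fitted density beyond the record and evaluate a tail probability.
    t_future = 1.5
    mu_future = a + b * t_future
    print(f"forecast mean at t = {t_future}: {mu_future:.2f}, "
          f"P(x < 0) = {norm.cdf(0.0, loc=mu_future, scale=sigma):.2f}")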
A framework for global river flood risk assessment
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.
2012-04-01
There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) International Financing Institutes and Disaster Management Agencies to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse scale resolution hazard probability distributions, derived from global hydrological model runs (typical scale about 0.5 degree resolution) with high resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value generally are strongly variable in space and time. The framework therefore estimates hazard at a high resolution ( 1 km2) by using a) global forcing data sets of the current (or in scenario mode, future) climate; b) a global hydrological model; c) a global flood routing model, and d) importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models and sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model. Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied on a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establish distributed estimates of GDP and asset exposure to flooding.
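For the risk step, hazard exceedance probabilities and impacts are typically combined into an expected annual damage; a minimal Python sketch with invented return periods and damages (the framework itself works with downscaled hazard maps and spatially distributed exposure layers).

    import numpy as np

    # Return periods (years) of flood events and the damage each causes (million USD);
    # values are placeholders standing in for the downscaled hazard x exposure results.
    return_periods = np.array([2, 5, 10, 25, 50, 100, 250])
    damages = np.array([0.0, 5.0, 20.0, 60.0, 110.0, 180.0, 300.0])

    # Annual exceedance probabilities, and expected annual damage by trapezoidal
    # integration of damage over exceedance probability.
    aep = 1.0 / return_periods
    expected_annual_damage = np.trapz(damages[::-1], x=aep[::-1])
    print(f"expected annual damage ~ {expected_annual_damage:.1f} M USD")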
Niches, models, and climate change: Assessing the assumptions and uncertainties
Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.
2009-01-01
As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option. PMID:19822750
Macroweather Predictions and Climate Projections using Scaling and Historical Observations
NASA Astrophysics Data System (ADS)
Hébert, R.; Lovejoy, S.; Del Rio Amador, L.
2017-12-01
There are two fundamental time scales that are pertinent to decadal forecasts and multidecadal projections. The first is the lifetime of planetary scale structures, about 10 days (equal to the deterministic predictability limit), and the second is - in the anthropocene - the scale at which the forced anthropogenic variability exceeds the internal variability (around 16 - 18 years). These two time scales define three regimes of variability: weather, macroweather and climate, which are respectively characterized by increasing, decreasing and then increasing variability with scale. We discuss how macroweather temperature variability can be skilfully predicted to its theoretical stochastic predictability limits by exploiting its long-range memory with the Stochastic Seasonal and Interannual Prediction System (StocSIPS). At multi-decadal timescales, the temperature response to forcing is approximately linear and this can be exploited to make projections with a Green's function, or Climate Response Function (CRF). To make the problem tractable, we exploit the temporal scaling symmetry and restrict our attention to global mean forcing and temperature response using a scaling CRF characterized by the scaling exponent H and an inner scale of linearity τ. An aerosol linear scaling factor α and a non-linear volcanic damping exponent ν were introduced to account for the large uncertainty in these forcings. We estimate the model and forcing parameters by Bayesian inference using historical data, and these allow us to analytically calculate a median (and likely 66% range) for the transient climate response and for the equilibrium climate sensitivity: 1.6 K ([1.5, 1.8] K) and 2.4 K ([1.9, 3.4] K) respectively. Aerosol forcing typically has large uncertainty and we find a modern (2005) forcing very likely range (90%) of [-1.0, -0.3] Wm-2 with median at -0.7 Wm-2. Projecting to 2100, we find that to keep the warming below 1.5 K, future emissions must undergo cuts similar to Representative Concentration Pathway (RCP) 2.6, for which the probability to remain under 1.5 K is 48%. RCP 4.5 and RCP 8.5-like futures overshoot with very high probability. This underscores that over the next century, the state of the environment will be strongly influenced by past, present and future economic policies.
On the dependence on inclination of capture probability of short-period comets
NASA Astrophysics Data System (ADS)
Yabushita, S.; Tsujii, T.
1990-06-01
Calculation is made of probability of capture whereby a nearly parabolic comet with perihelion near the Jovian orbit comes to have a perihelion distance less than 2.5 AU and a period less than 200 yr. The probability is found to depend strongly on the inclination, in accordance with earlier results of Everhart and of Stagg and Bailey. It is large for orbits close to the ecliptic but decreases drastically for large inclinations. The overall probability of capture from randomly distributed orbits is 0.00044, which shows that either the presently observed short-period comets are not in a steady state or the source flux may be in the Uranus-Neptune zone.
Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M
Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which probabilities associated with the large reinforcer can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on an ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats in the descending schedule, but only MK-801 (0.03mg/kg) increased risk taking in rats on an ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Harris, Adam J. L.; Corner, Adam
2011-01-01
Verbal probability expressions are frequently used to communicate risk and uncertainty. The Intergovernmental Panel on Climate Change (IPCC), for example, uses them to convey risks associated with climate change. Given the potential for human action to mitigate future environmental risks, it is important to understand how people respond to these…
A new algorithm for finding survival coefficients employed in reliability equations
NASA Technical Reports Server (NTRS)
Bouricius, W. G.; Flehinger, B. J.
1973-01-01
Product reliabilities are predicted from past failure rates and a reasonable estimate of future failure rates. The algorithm is used to calculate the probability that the product will function correctly. It sums, over all possible ways in which the product can survive, the probability of each survival pattern multiplied by the number of permutations of that pattern.
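The abstract does not spell out the algorithm, but the pattern-summing idea it describes is the classic k-out-of-n reliability calculation; a minimal Python sketch, with the redundancy structure and survival probability chosen only for illustration.

    from math import comb

    def k_of_n_reliability(n, k, p):
        """Probability that at least k of n identical units survive, each with
        survival probability p: sum over survival patterns, grouping the
        permutations of each pattern with the binomial coefficient."""
        return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

    # Example: a product that works if at least 2 of its 3 redundant units survive.
    print(k_of_n_reliability(n=3, k=2, p=0.9))   # 0.972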
Large number limit of multifield inflation
NASA Astrophysics Data System (ADS)
Guo, Zhong-Kai
2017-12-01
We compute the tensor and scalar spectral indices n_t and n_s, the tensor-to-scalar ratio r, and the consistency relation n_t/r in general monomial multifield slow-roll inflation models with potentials V ~ Σ_i λ_i |ϕ_i|^{p_i}. The general models give a novel relation: n_t, n_s and n_t/r are all proportional to the logarithm of the number of fields N_f when N_f becomes extremely large, of order O(10^40). An upper bound N_f ≲ N_* e^{Z N_*} follows from requiring that the slow-variation parameter be small enough, where N_* is the e-folding number and Z is a function of the distributions of λ_i and p_i. Moreover, n_t/r differs from the single-field result -1/8 with substantial probability except in a few very special cases. Finally, we derive theoretical bounds r > 2/N_* (r ≳ 0.03) and a bound on n_t, which can be tested by observation in the near future.
Large local reactions to insect envenomation.
Carlson, John; Golden, David B K
2016-08-01
Insect stings often induce large local reactions (LLRs) that result in morbidity. These reactions do have an immunologic basis; however, patients presenting with LLRs should be managed differently than those with systemic allergic reactions, as described in this review. Morbidity results from the inflammation itself along with the iatrogenic consequences of treatment. The prescription of antihistamine medications and the use of antibiotics are generally not indicated for patients with LLRs because of the risks/side-effects of these medications and the low probability of benefit. Some patients are also concerned over the possibility that a future sting will evolve into a life-threatening reaction. Although these reactions do involve IgE, patients are not at sufficient risk to warrant prescription of autoinjectable epinephrine. Venom-specific immunotherapy can be considered when LLRs are frequent and associated with significant impairment. Clinicians can reduce morbidity from LLRs by reassuring the patients, avoiding medications that result in side-effects when they are not indicated, and referring to an allergist when there are additional concerns, such as frequent impairment.
Metadata and annotations for multi-scale electrophysiological data.
Bower, Mark R; Stead, Matt; Brinkmann, Benjamin H; Dufendach, Kevin; Worrell, Gregory A
2009-01-01
The increasing use of high-frequency (kHz), long-duration (days) intracranial monitoring from multiple electrodes during pre-surgical evaluation for epilepsy produces large amounts of data that are challenging to store and maintain. Descriptive metadata and clinical annotations of these large data sets also pose challenges to simple, often manual, methods of data analysis. The problems of reliable communication of metadata and annotations between programs, the maintenance of the meanings within that information over long time periods, and the flexibility to re-sort data for analysis place differing demands on data structures and algorithms. Solutions to these individual problem domains (communication, storage and analysis) can be configured to provide easy translation and clarity across the domains. The Multi-scale Annotation Format (MAF) provides an integrated metadata and annotation environment that maximizes code reuse, minimizes error probability and encourages future changes by reducing the tendency to over-fit information technology solutions to current problems. An example of a graphical utility for generating and evaluating metadata and annotations for "big data" files is presented.
Browning, A C
1974-12-01
In this article, an uncommon form of passenger transport is considered, the moving pavement or pedestrian conveyor running at speeds of up to 16 km/h. There are very little relevant ergonomic data for such devices and some specific laboratory experiments have been carried out using 1000 subjects to represent the general public. It is concluded that whilst high speed pedestrian conveyors are quite feasible, stations along them are likely to be large. The most attractive type is a set of parallel surfaces moving at different speeds and with handholds provided in the form of poles. This type could be extremely convenient for certain locations but will probably have to be restricted in its use to fairly fit adults carrying little luggage, and would find applications in situations where a large number of people need to travel in the same direction. Part 2, Ergonomic considerations of complete conveyor systems, will follow.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY... effort to assess the probability of future behavior which could have an effect adverse to the national... the past but necessarily anticipating the future. Rarely is proof of trustworthiness and reliability...
Numerical Simulation of Stress evolution and earthquake sequence of the Tibetan Plateau
NASA Astrophysics Data System (ADS)
Dong, Peiyu; Hu, Caibo; Shi, Yaolin
2015-04-01
The India-Eurasia collision produces N-S compression and results in large thrust faults along the southern edge of the Tibetan Plateau. Differential eastward flow of the lower crust of the plateau leads to large strike-slip faults and normal faults within the plateau. From 1904 to 2014, more than 30 earthquakes of Mw > 6.5 occurred sequentially in this distinctive tectonic environment. How did the stresses evolve during the last 110 years, and how did the earthquakes interact with each other? Can this knowledge help us to forecast future seismic hazards? In this study, we simulated the evolution of the stress field and the earthquake sequence in the Tibetan Plateau over the last 110 years with a 2-D finite element model. Given an initial state of stress, the boundary condition was constrained by present-day GPS observations, assumed to represent a constant loading rate during the 110 years. We calculated the stress evolution year by year, and an earthquake was assumed to occur when the stress exceeded the crustal strength. Stress changes due to each large earthquake in the sequence were calculated and contributed to the stress evolution. A key issue is the choice of the initial stress state of the model, which is actually unknown. Usually, in the study of earthquake triggering, the initial stress is assumed to be zero, and only the stress changes caused by large earthquakes, the Coulomb failure stress changes (ΔCFS), are calculated. To some extent, this simplified method is a powerful tool because it can reveal which fault, or which part of a fault, becomes relatively more risky or safer. Nonetheless, it does not utilize all the information available to us. The earthquake sequence, though far from complete, reveals some information about the stress state in the region. If the entire region is close to a self-organized critical or subcritical state, the earthquake stress drop provides an estimate of the lower limit of the initial stress. For locations where no earthquakes occurred during the period, the initial stress has to be lower than a certain value. For locations where large earthquakes occurred during the 110 years, the initial stresses can be inverted if the strength is estimated and the tectonic loading is assumed constant. Therefore, although the initial stress state is unknown, we can estimate a range for it. In this study, we estimated a reasonable range of initial stress and then used the Coulomb-Mohr criterion to regenerate the earthquake sequence, starting from the Daofu earthquake of 1904. We calculated the stress field evolution of the sequence, considering both the tectonic loading and the interaction between the earthquakes. Ultimately we obtained a sketch of the present stress field. Of course, a single model with one particular initial stress is just one possible model, so a distribution of potential seismic hazards based on a single model is not convincing. We therefore tested hundreds of possible initial stress states, all of which can reproduce the historical earthquake sequence, and summarized the calculated probabilities of future seismic activity. Although we cannot provide the exact future state, we can narrow the estimate of the regions that have a high probability of risk. Our preliminary results indicate that the Xianshuihe fault and the adjacent area form one such zone, with higher risk than other regions in the future. During 2014, there were six earthquakes (M > 5.0) in this region, which corresponds with our result to some degree.
We emphasized the importance of the initial stress field for the earthquake sequence and provided a probabilistic assessment of future seismic hazards. This study may bring new insights into estimating the initial stress, earthquake triggering, and the evolution of the stress field.
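A minimal Python sketch of the kind of procedure described above: constant tectonic loading of fault elements, rupture when stress reaches strength, a co-seismic stress drop, and many random initial-stress states to turn the ensemble into occurrence probabilities. All parameter values are invented, and stress interaction between faults is omitted.

    import numpy as np

    rng = np.random.default_rng(3)

    n_faults, years = 5, 110
    loading_rate, strength, stress_drop = 0.02, 3.0, 2.0   # illustrative units
    n_trials = 1000
    failures = np.zeros(n_faults)

    for _ in range(n_trials):
        # Unknown initial stress: draw each fault's stress below its strength.
        stress = rng.uniform(0.0, strength, size=n_faults)
        for _year in range(years):
            stress += loading_rate                 # constant tectonic loading
            ruptured = stress >= strength          # Coulomb-Mohr style failure
            failures += ruptured
            stress[ruptured] -= stress_drop        # co-seismic stress drop
            # (static stress transfer between faults is not modelled here)

    p_rupture_per_year = failures / (n_trials * years)
    print("per-fault annual rupture probability:", np.round(p_rupture_per_year, 3))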
A post-Tohoku earthquake review of earthquake probabilities in the Southern Kanto District, Japan
NASA Astrophysics Data System (ADS)
Somerville, Paul G.
2014-12-01
The 2011 Mw 9.0 Tohoku earthquake generated an aftershock sequence that affected a large part of northern Honshu, and has given rise to widely divergent forecasts of changes in earthquake occurrence probabilities in northern Honshu. The objective of this review is to assess these forecasts as they relate to potential changes in the occurrence probabilities of damaging earthquakes in the Kanto Region. It is generally agreed that the 2011 Mw 9.0 Tohoku earthquake increased the stress on faults in the southern Kanto district. Toda and Stein (Geophys Res Lett 686, 40: doi:10.1002, 2013) further conclude that the probability of earthquakes in the Kanto Corridor has increased by a factor of 2.5 for the time period 11 March 2013 to 10 March 2018. Estimates of earthquake probabilities in a wider region of the Southern Kanto District by Nanjo et al. (Geophys J Int, doi:10.1093, 2013) indicate that any increase in the probability of earthquakes is insignificant in this larger region. Uchida et al. (Earth Planet Sci Lett 374: 81-91, 2013) conclude that the Philippine Sea plate extends well north of the northern margin of Tokyo Bay, inconsistent with the Kanto Fragment hypothesis of Toda et al. (Nat Geosci, 1:1-6, 2008), which attributes deep earthquakes in this region, which they term the Kanto Corridor, to a broken fragment of the Pacific plate. The results of Uchida and Matsuzawa (J Geophys Res 115:B07309, 2013) support the conclusion that fault creep in southern Kanto may be slowly relaxing the stress increase caused by the Tohoku earthquake without causing more large earthquakes. Stress transfer calculations indicate a large stress transfer to the Off Boso Segment as a result of the 2011 Tohoku earthquake. However, Ozawa et al. (J Geophys Res 117:B07404, 2012) used onshore GPS measurements to infer large post-Tohoku creep on the plate interface in the Off-Boso region, and Uchida and Matsuzawa (ibid.) measured similar large creep off the Boso Peninsula. Thus some of the large stress transfer may be undergoing aseismic release, consistent with pre-Tohoku geodetic data, so a large earthquake on the Off Boso segment may have a low probability.
Analysis of blocking probability for OFDM-based variable bandwidth optical network
NASA Astrophysics Data System (ADS)
Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi
2011-12-01
Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Because of its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems, enabling elastic bandwidth transmission, which is the future trend of optical networking. In OFDM-based optical networks, the blocking probability is an important measure for network assessment. Current research on WDM networks is mostly based on a fixed bandwidth per connection; to accommodate future services and the fast-changing development of optical networks, our study is based on variable-bandwidth OFDM-based optical networks. Using mathematical analysis and theoretical derivation, and building on existing theory and algorithms, we study the blocking probability of variable-bandwidth optical networks and then build a model for the blocking probability.
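The paper's variable-bandwidth model is not reproduced in the abstract; as a point of reference, the classical fixed-capacity baseline for blocking probability is the Erlang-B formula, sketched here in Python with illustrative numbers (this is not the OFDM model developed in the paper).

    def erlang_b(offered_load, servers):
        """Erlang-B blocking probability for `servers` circuits (or spectrum slots)
        offered `offered_load` erlangs of traffic, computed with the stable recursion."""
        b = 1.0
        for m in range(1, servers + 1):
            b = offered_load * b / (m + offered_load * b)
        return b

    # Example: 80 erlangs of traffic offered to 100 slots.
    print(f"blocking probability = {erlang_b(80.0, 100):.4f}")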
A Bayesian Sampler for Optimization of Protein Domain Hierarchies
2014-01-01
The process of identifying and modeling functionally divergent subgroups for a specific protein domain class and arranging these subgroups hierarchically has, thus far, largely been done via manual curation. How to accomplish this automatically and optimally is an unsolved statistical and algorithmic problem that is addressed here via Markov chain Monte Carlo sampling. Taking as input a (typically very large) multiple-sequence alignment, the sampler creates and optimizes a hierarchy by adding and deleting leaf nodes, by moving nodes and subtrees up and down the hierarchy, by inserting or deleting internal nodes, and by redefining the sequences and conserved patterns associated with each node. All such operations are based on a probability distribution that models the conserved and divergent patterns defining each subgroup. When we view these patterns as sequence determinants of protein function, each node or subtree in such a hierarchy corresponds to a subgroup of sequences with similar biological properties. The sampler can be applied either de novo or to an existing hierarchy. When applied to 60 protein domains from multiple starting points in this way, it converged on similar solutions with nearly identical log-likelihood ratio scores, suggesting that it typically finds the optimal peak in the posterior probability distribution. Similarities and differences between independently generated, nearly optimal hierarchies for a given domain help distinguish robust from statistically uncertain features. Thus, a future application of the sampler is to provide confidence measures for various features of a domain hierarchy. PMID:24494927
Michael, Andrew J.
2012-01-01
Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
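A minimal Python sketch of the Gutenberg-Richter clustering piece of such calculations, in the Reasenberg-Jones form: the expected number of triggered events above a target magnitude is integrated over a time window and converted to a probability. The parameter values used are the commonly quoted generic California values, included here only for illustration rather than as the calibrated values of the study.

    import numpy as np

    # Generic Reasenberg-Jones style clustering rate: expected number of triggered
    # events with magnitude >= m_target in the days t1..t2 after an event of
    # magnitude m_trigger.  Parameters are generic illustrations.
    a, b, c, p = -1.67, 0.91, 0.05, 1.08

    def expected_triggered(m_trigger, m_target, t1, t2, n=10_000):
        t = np.linspace(t1, t2, n)
        rate = 10 ** (a + b * (m_trigger - m_target)) / (t + c) ** p
        return np.trapz(rate, t)

    m_trigger, m_target = 4.8, 7.0
    n_expected = expected_triggered(m_trigger, m_target, t1=0.0, t2=3.0)
    p_mainshock = 1.0 - np.exp(-n_expected)
    print(f"P(M>={m_target} within 3 days) ~ {p_mainshock:.5f}")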
Stochastic Modelling of Past Volcanic Crises
NASA Astrophysics Data System (ADS)
Woo, Gordon
2017-04-01
It is customary to have continuous monitoring of volcanoes showing signs of unrest that might lead to an eruption threatening local populations. Despite scientific progress in estimating the probability of an eruption occurring, the concept of continuously tracking eruption probability remains a future aspiration for volcano risk analysts. During some recent major volcanic crises, attempts have been made to estimate the eruption probability in real time to support government decision-making. These include the possibility of an eruption of Katla linked with the eruption of Eyjafjallajökull in 2010, and the Santorini crisis of 2011-2012. However, once a crisis fades, interest in analyzing the probability that there might have been an eruption tends to wane. There is an inherent outcome bias well known to psychologists: if disaster was avoided, there is perceived to be little purpose in exploring scenarios where a disaster might have happened. Yet the better that previous periods of unrest are understood and modelled, the better that the risk associated with future periods of unrest will be quantified. Scenarios are counterfactual histories of the future. The task of quantifying the probability of an eruption for a past period of unrest should not be merely a statistical calculation, but should serve to elucidate and refine geophysical models of the eruptive processes. This is achieved by using a Bayesian Belief Network approach, in which monitoring observations are used to draw inferences on the underlying causal factors. Specifically, risk analysts are interested in identifying what dynamical perturbations might have tipped an unrest period in history over towards an eruption, and assessing what was the likelihood of such perturbations. Furthermore, in what ways might a historical volcano crisis have turned for the worse? Such important counterfactual questions are addressed in this paper.
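A minimal Python sketch of the Bayesian updating step that underlies such a Belief Network: a prior eruption probability for an unrest episode is revised by one monitoring observation via Bayes' rule; all probabilities below are invented for illustration.

    # Prior probability that this unrest episode leads to eruption (illustrative).
    p_eruption = 0.10
    # Likelihood of the observation (e.g. a deep seismic swarm) in each case (illustrative).
    p_obs_given_eruption = 0.80
    p_obs_given_no_eruption = 0.20

    evidence = (p_obs_given_eruption * p_eruption +
                p_obs_given_no_eruption * (1.0 - p_eruption))
    posterior = p_obs_given_eruption * p_eruption / evidence
    print(f"P(eruption | observation) = {posterior:.2f}")   # ~0.31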
History and future of remote sensing technology and education
NASA Technical Reports Server (NTRS)
Colwell, R. N.
1980-01-01
A historical overview of the discovery and development of photography, related sciences, and remote sensing technology is presented. The role of education to date in the development of remote sensing is discussed. The probable future and potential of remote sensing and training are described.
The role of magical thinking in forecasting the future.
Stavrova, Olga; Meckel, Andrea
2017-02-01
This article explores the role of magical thinking in the subjective probabilities of future chance events. In five experiments, we show that individuals tend to predict a more lucky future (reflected in probability judgements of lucky and unfortunate chance events) for someone who happened to purchase a product associated with a highly moral person than for someone who unknowingly purchased a product associated with a highly immoral person. In the former case, positive events were considered more likely than negative events, whereas in the latter case, the difference in the likelihood judgement of positive and negative events disappeared or even reversed. Our results indicate that this effect is unlikely to be driven by participants' immanent justice beliefs, the availability heuristic, or experimenter demand. Finally, we show that individuals rely more heavily on magical thinking when their need for control is threatened, thus suggesting that lack of control represents a factor in driving magical thinking in making predictions about the future. © 2016 The British Psychological Society.
Probability of criminal acts of violence: a test of jury predictive accuracy.
Reidy, Thomas J; Sorensen, Jon R; Cunningham, Mark D
2013-01-01
The ability of capital juries to accurately predict future prison violence at the sentencing phase of aggravated murder trials was examined through retrospective review of the disciplinary records of 115 male inmates sentenced to either life (n = 65) or death (n = 50) in Oregon from 1985 through 2008, with a mean post-conviction time at risk of 15.3 years. Violent prison behavior was completely unrelated to predictions made by capital jurors, with bidirectional accuracy simply reflecting the base rate of assaultive misconduct in the group. Rejection of the special issue predicting future violence enjoyed 90% accuracy. Conversely, predictions that future violence was probable had 90% error rates. More than 90% of the assaultive rule violations committed by these offenders resulted in no harm or only minor injuries. Copyright © 2013 John Wiley & Sons, Ltd.
Visualizing uncertainty about the future.
Spiegelhalter, David; Pearson, Mike; Short, Ian
2011-09-09
We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge.
NASA Astrophysics Data System (ADS)
Rouhani, Hassan; Leconte, Robert
2018-06-01
Climate change will affect precipitation and flood regimes. It is anticipated that the Probable Maximum Precipitation (PMP) and Probable Maximum Flood (PMF) will be modified in a changing climate. This paper aims to quantify and analyze climate change influences on PMP and PMF in three watersheds with different climatic conditions across the province of Québec, Canada. Output data from the Canadian Regional Climate Model (CRCM) were used to estimate PMP and Probable Maximum Snow Accumulation (PMSA) under future climate projections; these estimates were then used to force the SWAT hydrological model to estimate PMF. PMP and PMF values were estimated for two time horizons each spanning 30 years: 1961-1990 (recent past) and 2041-2070 (future). PMP and PMF were separately analyzed for two seasons: summer-fall and spring. Results show that PMF in the watershed located in southern Québec would remain unchanged in the future horizon, but the trend for the watersheds located in the northeastern and northern areas of the province is an increase of up to 11%.
NASA Astrophysics Data System (ADS)
Boslough, M.
2011-12-01
Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates prices, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot be measured directly, it cannot be predicted. However, changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based on global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative distribution function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g. agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.
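A minimal sketch of the proposed construction, assuming a hypothetical family of threshold contracts whose prices are read as exceedance probabilities; the market-implied distribution, median, and spread then follow by interpolation.

```python
import numpy as np

# Hypothetical family of contracts "anomaly > T" with market prices read as
# exceedance probabilities (every number here is invented for illustration).
thresholds = np.array([0.45, 0.55, 0.65, 0.75, 0.85])   # deg C
exceedance = np.array([0.95, 0.80, 0.50, 0.20, 0.05])   # contract prices

cdf = 1.0 - exceedance                                   # market-implied CDF
best_estimate = np.interp(0.5, cdf, thresholds)          # median by interpolation
p16, p84 = np.interp([0.16, 0.84], cdf, thresholds)      # spread as uncertainty

print(best_estimate, (p16, p84))
```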
Capone, A; Cicchetti, A; Mennini, F S; Marcellusi, A; Baio, G; Favato, G
2016-01-01
Healthcare expenses will be the most relevant policy issue for most governments in the EU and the USA. This expenditure can be associated with two key categories of drivers: demographic and economic. Factors driving healthcare expenditure have rarely been recognised, measured and understood. An improvement of health data generation and analysis is mandatory, and in order to tackle healthcare spending growth, it may be useful to design and implement an effective, advanced system to generate and analyse these data. A methodological approach relying upon Health Data Entanglement (HDE) can be a suitable option. By definition, in the HDE a large number of data sets from several sources are functionally interconnected and computed through learning machines that generate patterns of highly probable future health conditions of a population. The entanglement concept is borrowed from quantum physics and means that multiple particles (information) are linked together in such a way that the measurement of one particle's quantum state (individual health conditions and related economic requirements) determines the possible quantum states of the other particles (population health forecasts and their predicted impact). The value created by the HDE is based on the combined evaluation of clinical, economic and social effects generated by health interventions. To predict the future health conditions of a population, analyses of data are performed using self-learning AI, in which sequential decisions are based on Bayesian algorithmic probabilities. HDE and AI-based analysis can be adopted to improve the effectiveness of the health governance system in ways that also lead to better quality of care.
Pathways of fish invasions in the Mid-Atlantic region of the United States
Lapointe, Nicolas W. R.; Fuller, Pam; Neilson, Matthew; Murphy, Brian R.; Angermeier, Paul
2016-01-01
Non-native fish introductions are a major threat to biodiversity and fisheries, and occur through numerous pathways that vary regionally in importance. A key strategy for managing invasions is to focus prevention efforts on pathways posing the greatest risk of future introductions. We identified high-risk pathways for fish establishment in the Mid-Atlantic region of the United States based on estimates of probability of establishment and records of previous introductions, which were considered in the context of emerging socioeconomic trends. We used estimates of propagule pressure, species’ environmental tolerance, and size of species pool to assess the risk of establishment by pathway. Pathways varied considerably in historic importance and species composition, with the majority of species introduced intentionally via stocking (primarily for sport, forage, or biocontrol) or bait release. Bait release, private stocking, illegal introductions intended to establish reproducing populations (e.g., of sport fish), aquaculture, and the sale of live organisms all create risks for future invasions in the Mid-Atlantic region. Of these pathways, bait release probably poses the greatest risk of introductions for the Mid-Atlantic region because propagule pressure is moderate, most released species are tolerant of local environmental conditions, and the pool of species available for transplantation is large. Our findings differ considerably from studies in other regions (e.g., bait release is a dominant pathway in the Mid-Atlantic region, whereas illegal introduction of sport fish is dominant in the western US and aquarium releases are dominant in Florida), demonstrating the need for regional-scale assessments of, and management strategies for, introduction pathways.
Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul SF
2015-01-01
Background Traditional offline assessment of suicide probability is time consuming, and it is difficult to convince at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has an advantage in its efficiency and potential to reach out to hidden individuals, yet little research has focused on this specific field. Objective The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying microblog users with high suicide probability in China through profile and linguistic features extracted from Internet-based data. Methods Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores in the participant sample, were labeled as high-risk individuals, respectively. Profile and linguistic features were fed into two machine learning algorithms (SLR and RF) to train models that aim to identify high-risk individuals in general suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross-validation, in which both the training set and the test set were generated under the stratified random sampling rule from the whole sample. Three classic performance metrics (Precision, Recall, F1 measure) and a specifically defined metric, “Screening Efficiency”, were adopted to evaluate model effectiveness. Results Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening Efficiency of most models varied from 1/4 to 1/2. Precision of the models was generally below 30%. Conclusions Individuals in China with high suicide probability are recognizable by profile and text-based information from microblogs. Although there is still much room to improve the performance of the classification models in the future, this study may shed light on preliminary screening of at-risk individuals via machine learning algorithms, which can work side-by-side with expert scrutiny to increase efficiency in large-scale surveillance of suicide probability from online social media. PMID:26543921
Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul Sf; Zhu, Tingshao
2015-01-01
Traditional offline assessment of suicide probability is time consuming, and it is difficult to convince at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has an advantage in its efficiency and potential to reach out to hidden individuals, yet little research has focused on this specific field. The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying microblog users with high suicide probability in China through profile and linguistic features extracted from Internet-based data. Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores in the participant sample, were labeled as high-risk individuals, respectively. Profile and linguistic features were fed into two machine learning algorithms (SLR and RF) to train models that aim to identify high-risk individuals in general suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross-validation, in which both the training set and the test set were generated under the stratified random sampling rule from the whole sample. Three classic performance metrics (Precision, Recall, F1 measure) and a specifically defined metric, "Screening Efficiency", were adopted to evaluate model effectiveness. Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening Efficiency of most models varied from 1/4 to 1/2. Precision of the models was generally below 30%. Individuals in China with high suicide probability are recognizable by profile and text-based information from microblogs. Although there is still much room to improve the performance of the classification models in the future, this study may shed light on preliminary screening of at-risk individuals via machine learning algorithms, which can work side-by-side with expert scrutiny to increase efficiency in large-scale surveillance of suicide probability from online social media.
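A minimal sketch of such a screening pipeline, using scikit-learn's LogisticRegression and RandomForestClassifier as stand-ins for the study's SLR and RF models, and synthetic features in place of the actual microblog data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, precision_score, recall_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict

# Synthetic stand-ins: 909 users, 20 profile/linguistic features, ~16% labeled
# high-risk (one SD above the mean SPS score). Not the study's actual data.
rng = np.random.default_rng(0)
X = rng.normal(size=(909, 20))
y = (rng.random(909) < 0.16).astype(int)

models = [("SLR", LogisticRegression(max_iter=1000)),
          ("RF", RandomForestClassifier(n_estimators=200, random_state=0))]
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

for name, model in models:
    pred = cross_val_predict(model, X, y, cv=cv)          # stratified 5-fold CV
    print(name,
          precision_score(y, pred, zero_division=0),
          recall_score(y, pred, zero_division=0),
          f1_score(y, pred, zero_division=0))
```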
Fukushima Accident: Sequence of Events and Lessons Learned
NASA Astrophysics Data System (ADS)
Morse, Edward C.
2011-10-01
The Fukushima Dai-Ichi nuclear power station suffered a devastating Richter 9.0 earthquake followed by a 14.0 m tsunami on 11 March 2011. The subsequent loss of power for emergency core cooling systems resulted in damage to the fuel in the cores of three reactors. The relief of pressure from the containment in these three reactors led to sufficient hydrogen gas release to cause explosions in the buildings housing the reactors. There was probably subsequent damage to a spent fuel pool of a fourth reactor caused by debris from one of these explosions. Resultant releases of fission product isotopes in air were significant and have been estimated to be in the 3.7-6.3 × 10¹⁷ Bq range (~10 MCi) for ¹³¹I and ¹³⁷Cs combined, or approximately one tenth that of the Chernobyl accident. A synopsis of the sequence of events leading up to this large release of radioactivity will be presented, along with likely scenarios for stabilization and site cleanup in the future. Some aspects of the isotope monitoring programs, both locally and at large, will also be discussed. An assessment of radiological health risk for the plant workers as well as the general public will also be presented. Finally, the impact of this accident on design and deployment of nuclear generating stations in the future will be discussed.
Fixation probability of a nonmutator in a large population of asexual mutators.
Jain, Kavita; James, Ananthu
2017-11-21
In an adapted population of mutators in which most mutations are deleterious, a nonmutator that lowers the mutation rate is under indirect selection and can sweep to fixation. Using a multitype branching process, we calculate the fixation probability of a rare nonmutator in a large population of asexual mutators. We show that when beneficial mutations are absent, the fixation probability is a nonmonotonic function of the mutation rate of the mutator: it first increases sublinearly and then decreases exponentially. We also find that beneficial mutations can enhance the fixation probability of a nonmutator. Our analysis is relevant to an understanding of recent experiments in which a reduction in the mutation rates has been observed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Toward an Objectivistic Theory of Probability
1956-01-01
…three potential acts: the individual may choose an apple, an orange or a banana. Each of these acts corresponds to a point… its veneer having begun to peel at one corner, etc., etc. Its future there-ness lies in that it may have its legs gnawed at by the new puppy in the…
Estimation of probability of failure for damage-tolerant aerospace structures
NASA Astrophysics Data System (ADS)
Halbert, Keith
The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This dissertation describes and develops new PDTA methodologies that directly address the deficiencies of the currently used tools. The new methods are implemented as a free, publicly licensed and open source R software package that can be downloaded from the Comprehensive R Archive Network. The tools consist of two main components. First, an explicit (and expensive) Monte Carlo approach is presented which simulates the life of an aircraft structural component flight-by-flight. This straightforward MC routine can be used to provide defensible estimates of the failure probabilities for future flights and repair probabilities for future inspections under a variety of failure and maintenance scenarios. This routine is intended to provide baseline estimates against which to compare the results of other, more efficient approaches. Second, an original approach is described which models the fatigue process and future scheduled inspections as a hidden Markov model. This model is solved using a particle-based approximation and the sequential importance sampling algorithm, which provides an efficient solution to the PDTA problem. Sequential importance sampling is an extension of importance sampling to a Markov process, allowing for efficient Bayesian updating of model parameters. This model updating capability, the benefit of which is demonstrated, is lacking in other PDTA approaches. The results of this approach are shown to agree with the results of the explicit Monte Carlo routine for a number of PDTA problems. 
Extensions to the typical PDTA problem, which cannot be solved using currently available tools, are presented and solved in this work. These extensions include incorporating observed evidence (such as non-destructive inspection results), more realistic treatment of possible future repairs, and the modeling of failure involving more than one crack (the so-called continuing damage problem). The described hidden Markov model / sequential importance sampling approach to PDTA has the potential to improve aerospace structural safety and reduce maintenance costs by providing a more accurate assessment of the risk of failure and the likelihood of repairs throughout the life of an aircraft.
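A minimal sketch of a particle-based sequential-importance-sampling treatment of crack growth with inspections, in the spirit described above; the growth, detection, and critical-size numbers are invented for illustration and are not taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles = 10_000

# Hypothetical initial crack sizes (mm) and per-flight growth; none of the
# numbers below come from the dissertation.
a = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=n_particles)
w = np.full(n_particles, 1.0 / n_particles)        # importance weights

def prob_detect(size_mm):
    """Illustrative probability-of-detection curve for an inspection."""
    return 1.0 / (1.0 + np.exp(-(size_mm - 2.0) / 0.5))

for flight in range(1, 5001):
    a *= rng.lognormal(mean=np.log(1.001), sigma=0.002, size=n_particles)
    if flight % 2000 == 0:                          # scheduled inspection, nothing found
        w *= 1.0 - prob_detect(a)                   # Bayesian update on "no detection"
        w /= w.sum()

a_critical = 10.0                                   # hypothetical critical size, mm
print("weighted P(a > a_critical) =", float(np.sum(w[a > a_critical])))
```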
Recent research on the high-probability instructional sequence: A brief review.
Lipschultz, Joshua; Wilder, David A
2017-04-01
The high-probability (high-p) instructional sequence consists of the delivery of a series of high-probability instructions immediately before delivery of a low-probability or target instruction. It is commonly used to increase compliance in a variety of populations. Recent research has described variations of the high-p instructional sequence and examined the conditions under which the sequence is most effective. This manuscript reviews the most recent research on the sequence and identifies directions for future research. Recommendations for practitioners regarding the use of the high-p instructional sequence are also provided. © 2017 Society for the Experimental Analysis of Behavior.
Vacuum quantum stress tensor fluctuations: A diagonalization approach
NASA Astrophysics Data System (ADS)
Schiappacasse, Enrico D.; Fewster, Christopher J.; Ford, L. H.
2018-01-01
Large vacuum fluctuations of a quantum stress tensor can be described by the asymptotic behavior of its probability distribution. Here we focus on stress tensor operators which have been averaged with a sampling function in time. The Minkowski vacuum state is not an eigenstate of the time-averaged operator, but can be expanded in terms of its eigenstates. We calculate the probability distribution and the cumulative probability distribution for obtaining a given value in a measurement of the time-averaged operator taken in the vacuum state. In these calculations, we study a specific operator that contributes to the stress-energy tensor of a massless scalar field in Minkowski spacetime, namely, the normal ordered square of the time derivative of the field. We analyze the rate of decrease of the tail of the probability distribution for different temporal sampling functions, such as compactly supported functions and the Lorentzian function. We find that the tails decrease relatively slowly, as exponentials of fractional powers, in agreement with previous work using the moments of the distribution. Our results lend additional support to the conclusion that large vacuum stress tensor fluctuations are more probable than large thermal fluctuations, and may have observable effects.
Are Earthquake Clusters/Supercycles Real or Random?
NASA Astrophysics Data System (ADS)
Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.
2016-12-01
Long records of earthquakes at plate boundaries such as the San Andreas or Cascadia often show that large earthquakes occur in temporal clusters, also termed supercycles, separated by less active intervals. These are intriguing because the boundary is presumably being loaded by steady plate motion. If so, earthquakes resulting from seismic cycles - in which their probability is small shortly after the past one, and then increases with time - should occur quasi-periodically rather than be more frequent in some intervals than others. We are exploring this issue with two approaches. One is to assess whether the clusters result purely by chance from a time-independent process that has no "memory." Thus a future earthquake is equally likely immediately after the past one and much later, so earthquakes can cluster in time. We analyze the agreement between such a model and inter-event times for Parkfield, Pallet Creek, and other records. A useful tool is transformation by the inverse cumulative distribution function, so the inter-event times have a uniform distribution when the memorylessness property holds. The second is via a time-variable model in which earthquake probability increases with time between earthquakes and decreases after an earthquake. The probability of an event increases with time until one happens, after which it decreases, but not to zero. Hence after a long period of quiescence, the probability of an earthquake can remain higher than the long-term average for several cycles. Thus the probability of another earthquake is path dependent, i.e. depends on the prior earthquake history over multiple cycles. Time histories resulting from simulations give clusters with properties similar to those observed. The sequences of earthquakes result from both the model parameters and chance, so two runs with the same parameters look different. The model parameters control the average time between events and the variation of the actual times around this average, so models can be strongly or weakly time-dependent.
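A minimal sketch of the first approach, assuming a hypothetical set of inter-event times: under a memoryless (exponential) model the inverse-CDF transform of the intervals should be uniform, which can be checked with a standard test.

```python
import numpy as np
from scipy import stats

# Hypothetical inter-event times (years) from a paleoseismic record.
intervals = np.array([45.0, 60.0, 110.0, 130.0, 30.0, 250.0, 80.0, 95.0, 40.0, 160.0])

# Under a memoryless exponential model with the observed mean recurrence,
# u = F(t) = 1 - exp(-t/mean) should be uniform on [0, 1].
u = 1.0 - np.exp(-intervals / intervals.mean())
print(stats.kstest(u, "uniform"))   # large p-value: no evidence against memorylessness
```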
Communicating uncertainty: managing the inherent probabilistic character of hazard estimates
NASA Astrophysics Data System (ADS)
Albarello, Dario
2013-04-01
Science is much more about fixing the limits of our knowledge of possible occurrences than about identifying any "truth". This is particularly true when scientific statements concern the prediction of natural phenomena that largely exceed the laboratory scale, as in the case of seismogenesis. In these cases, many scenarios of future occurrences remain possible (plausible), and the contribution of scientific knowledge (based on available understanding of the underlying processes or on phenomenological studies) mainly consists in attributing to each scenario a different level of likelihood (probability). In other words, scientific predictions in the field of geosciences (hazard assessment) are inherently probabilistic. Despite this, many scientists (seismologists, etc.), in communicating their position in public debates, tend to stress the "truth" of their statements against the fanciful character of pseudo-scientific assertions: the stronger the opposition between science and pseudo-science, the more hidden the probabilistic character of scientific statements becomes. The problem arises when this kind of "probabilistic" knowledge becomes the basis of political action (e.g., imposing expensive forms of risk-reducing activity): in these cases the lack of any definitive "truth" requires a direct assumption of responsibility by the relevant decision-maker (whether a single citizen or the legitimate expression of a larger community) to choose among several possibilities (characterized, however, by different levels of likelihood). In many cases this can be uncomfortable, and the temptation is strong to delegate these decisions to the scientific counterpart. This transfer from the genuinely political field to an improper scientific context is also facilitated by the lack of a widespread culture of "probability" outside the scientific community (and in many cases inside it as well). This is partially the effect of the generalized adoption (by media and science communicators) of a view of probability (the "frequentist" view) that is useful in scientific practice but is very far from the common use of uncertain reasoning (which is nearer to the "epistemic" view). Considering probability a sort of physical measure inherent in the process under examination (like an acceleration value), instead of a rationally inferred degree of belief about statements concerning future occurrences, tends to hide the importance of a shared responsibility for relevant choices that involves scientists and citizens to the same extent.
NASA Astrophysics Data System (ADS)
Brown, Tristan R.
The revised Renewable Fuel Standard requires the annual blending of 16 billion gallons of cellulosic biofuel by 2022, up from zero gallons in 2009. The necessary capacity investments have been underwhelming to date, however, and little is known about the likely composition of the future cellulosic biofuel industry as a result. This dissertation develops a framework for identifying and analyzing the industry's likely future composition while also providing a possible explanation for why investment in cellulosic biofuels capacity has been low to date. The results of this dissertation indicate that few cellulosic biofuel pathways will be economically competitive with petroleum on an unsubsidized basis. Of five cellulosic biofuel pathways considered under 20-year price forecasts with volatility, only two achieve positive mean 20-year net present value (NPV) probabilities. Furthermore, recent exploitation of U.S. shale gas reserves and the subsequent fall in U.S. natural gas prices have negatively impacted the economic competitiveness of all but two of the cellulosic biofuel pathways considered; only two of the five pathways achieve substantially higher 20-year NPVs under a post-shale gas economic scenario relative to a pre-shale gas scenario. The economic competitiveness of cellulosic biofuel pathways with petroleum is reduced further when considered under price uncertainty in combination with realistic financial assumptions. This dissertation calculates pathway-specific costs of capital for five cellulosic biofuel pathway scenarios. The analysis finds that the large majority of the scenarios incur costs of capital that are substantially higher than those commonly assumed in the literature. Employment of these costs of capital in a comparative techno-economic analysis (TEA) greatly reduces the mean 20-year NPVs for each pathway while increasing their 10-year probabilities of default to above 80% for all five scenarios. Finally, this dissertation quantifies the economic competitiveness of six cellulosic biofuel pathways being commercialized in eight different U.S. states under price uncertainty, utilization of pathway-specific costs of capital, and region-specific economic factors. 10-year probabilities of default in excess of 60% are calculated for all eight location scenarios considered, with default probabilities in excess of 98% calculated for seven of the eight. Negative mean 20-year NPVs are calculated for seven of the eight location scenarios.
Modeling Addictive Consumption as an Infectious Disease*
Alamar, Benjamin; Glantz, Stanton A.
2011-01-01
The dominant model of addictive consumption in economics is the theory of rational addiction. The addict in this model chooses how much to consume based upon their level of addiction (past consumption), the current benefits, and all future costs. Several empirical studies of cigarette sales and price data have found a correlation between future prices and current consumption. These studies have argued that the correlation validates the rational addiction model and invalidates any model in which future consumption is not considered. An alternative to the rational addiction model is one in which addiction spreads through a population as if it were an infectious disease, as supported by the large body of empirical research on addictive behaviors. In this model an individual's probability of becoming addicted to a substance is linked to the behavior of their parents, friends, and society. In the infectious disease model, current consumption is based only on the level of addiction and current costs. Price and consumption data from a simulation of the infectious disease model showed a qualitative match to the results of the rational addiction model. The infectious disease model can explain all of the theoretical results of the rational addiction model while additionally explaining initial consumption of the addictive good. PMID:21339848
Do aftershock probabilities decay with time?
Michael, Andrew J.
2012-01-01
So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
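The argument can be checked numerically; the sketch below integrates the p = 1, c ≈ 0 Omori rate over windows whose width grows proportionally to t and shows that the expected count is the same at every horizon (the productivity value is illustrative).

```python
import numpy as np
from scipy.integrate import quad

K, delta = 1.0, 0.10                  # illustrative productivity and +/-10% window
rate = lambda t: K / t                # Omori rate with c ~ 0 and p = 1

for t in [1, 7, 30, 365, 36500]:      # a day, week, month, year, century from now
    n, _ = quad(rate, t * (1 - delta), t * (1 + delta))
    print(t, round(n, 4))             # always K*ln((1+delta)/(1-delta)) ~ 0.2007
```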
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.
2012-01-01
Cancer risk is an important concern for International Space Station (ISS) missions and future exploration missions. An important question concerns the likelihood of a causal association between a crew member's radiation exposure and the occurrence of cancer. The probability of causation (PC), also denoted as attributable risk, is used to make such an estimate. This report summarizes the NASA model of space radiation cancer risks and uncertainties, including improvements to represent uncertainties in tissue-specific cancer incidence models for never-smokers and the U.S. average population. We report on tissue-specific cancer incidence estimates and PC for different post-mission times for ISS and exploration missions. An important conclusion from our analysis is that the NASA policy to limit the risk of exposure-induced death to 3% at the 95% confidence level largely ensures that estimates of the PC for most cancer types would not reach a level of significance. Reducing uncertainties through radiobiological research remains the most efficient method to extend mission length and establish effective mitigators for cancer risks. Efforts to establish biomarkers of space radiation-induced tumors and to estimate PC for rarer tumor types are briefly discussed.
Hazard perception and the economic impact of internment on residential land values
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merz, J.F.
1983-04-01
The potential for large scale natural and man-made hazards exists in the form of hurricanes, earthquakes, volcanoes, floods, dams, accidents involving poisonous, explosive or radioactive materials, and severe pollution or waste disposal mishaps. Regions prone to natural hazards and areas located proximally to technological hazards may be subject to economic losses from low probability-high consequence events. Economic costs may be incurred in: evacuation and relocation of inhabitants; personal, industrial, agricultural, and tax revenue losses; decontamination; property damage or loss of value; and temporary or prolonged internment of land. The value of land in an area subjected to a low probability-high consequence event may decrease, reflecting, a fortiori, a reluctance to continue living in the area or to repopulate a region which had required internment. The future value of such land may be described as a function of location, time, interdiction period (if applicable), and variables reflecting the magnitude of the perceived hazard. This paper presents a study of these variables and proposes a model for land value estimation. As an example, the application of the model to the Love Canal area in Niagara Falls, New York is presented.
Janssen, Eva; van Osch, Liesbeth; Lechner, Lilian; Candel, Math; de Vries, Hein
2012-01-01
Despite the increased recognition of affect in guiding probability estimates, perceived risk has been mainly operationalised in a cognitive way and the differentiation between rational and intuitive judgements is largely unexplored. This study investigated the validity of a measurement instrument differentiating cognitive and affective probability beliefs and examined whether behavioural decision making is mainly guided by cognition or affect. Data were obtained from four surveys focusing on smoking (N=268), fruit consumption (N=989), sunbed use (N=251) and sun protection (N=858). Correlational analyses showed that affective likelihood was more strongly correlated with worry compared to cognitive likelihood and confirmatory factor analysis provided support for a two-factor model of perceived likelihood instead of a one-factor model (i.e. cognition and affect combined). Furthermore, affective likelihood was significantly associated with the various outcome variables, whereas the association for cognitive likelihood was absent in three studies. The findings provide support for the construct validity of the measures used to assess cognitive and affective likelihood. Since affective likelihood might be a better predictor of health behaviour than the commonly used cognitive operationalisation, both dimensions should be considered in future research.
Kendall, C.; Silva, S.R.; Kelly, V.J.
2001-01-01
Riverine particulate organic matter (POM) samples were collected bi-weekly to monthly from 40 sites in the Mississippi, Colorado, Rio Grande, and Columbia River Basins (USA) in 1996-97 and analysed for carbon and nitrogen stable isotopic compositions. These isotopic compositions and C : N ratios were used to identify four endmember sources of POM: plankton, fresh terrestrial plant material, aquatic plants, and soil organic material. This large-scale study also incorporated ancillary chemical and hydrologic data to refine and extend the interpretations of POM sources beyond the source characterizations that could be done solely with isotopic and elemental ratios. The ancillary data were especially useful for differentiating between seasonal changes in POM source materials and the effects of local nutrient sources and in-stream biogeochemical processes. Average values of δ13C and C : N for all four river systems suggested that plankton is the dominant source of POM in these rivers, with higher percentages of plankton downstream of reservoirs. Although the temporal patterns in some rivers are complex, the low δ13C and C : N values in spring and summer probably indicate plankton blooms, whereas relatively elevated values in fall and winter are consistent with greater proportions of decaying aquatic vegetation and/or terrestrial material. Seasonal shifts in the δ13C of POM when the C : N remains relatively constant probably indicate changes in the relative rates of photosynthesis and respiration. Periodic inputs of plant detritus are suggested by C : N ratios >15, principally on the Columbia and Ohio Rivers. The δ15N and δ13C also reflect the importance of internal and external sources of dissolved carbon and nitrogen, and the degree of in-stream processing. Elevated δ15N values at some sites probably reflect inputs from sewage and/or animal waste. This information on the spatial and temporal variation in sources of POM in four major river systems should prove useful in future food web and nutrient transport studies.
The distribution of density in supersonic turbulence
NASA Astrophysics Data System (ADS)
Squire, Jonathan; Hopkins, Philip F.
2017-11-01
We propose a model for the statistics of the mass density in supersonic turbulence, which plays a crucial role in star formation and the physics of the interstellar medium (ISM). The model is derived by considering the density to be arranged as a collection of strong shocks of width ˜ M^{-2}, where M is the turbulent Mach number. With two physically motivated parameters, the model predicts all density statistics for M>1 turbulence: the density probability distribution and its intermittency (deviation from lognormality), the density variance-Mach number relation, power spectra and structure functions. For the proposed model parameters, reasonable agreement is seen between model predictions and numerical simulations, albeit within the large uncertainties associated with current simulation results. More generally, the model could provide a useful framework for more detailed analysis of future simulations and observational data. Due to the simple physical motivations for the model in terms of shocks, it is straightforward to generalize to more complex physical processes, which will be helpful in future more detailed applications to the ISM. We see good qualitative agreement between such extensions and recent simulations of non-isothermal turbulence.
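For reference, the lognormal baseline that such models are usually compared against (and from which the proposed shock-based model allows intermittent deviations) can be written as:

```latex
% Lognormal baseline for s = ln(rho/rho_0) in isothermal supersonic turbulence:
p(s)\,\mathrm{d}s =
  \frac{1}{\sqrt{2\pi\sigma_s^{2}}}
  \exp\!\left[-\frac{\left(s+\sigma_s^{2}/2\right)^{2}}{2\sigma_s^{2}}\right]\mathrm{d}s,
\qquad
\sigma_s^{2} = \ln\!\left(1 + b^{2}\mathcal{M}^{2}\right),
```

where b parametrizes the compressive/solenoidal mix of the driving; the paper's own parameters and intermittency corrections are not reproduced here.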
Multivariate Statistical Inference of Lightning Occurrence, and Using Lightning Observations
NASA Technical Reports Server (NTRS)
Boccippio, Dennis
2004-01-01
Two classes of multivariate statistical inference using TRMM Lightning Imaging Sensor, Precipitation Radar, and Microwave Imager observations are studied, using nonlinear classification neural networks as inferential tools. The very large and globally representative data sample provided by TRMM allows both training and validation (without overfitting) of neural networks with many degrees of freedom. In the first study, the flashing / non-flashing condition of storm complexes is diagnosed using radar, passive microwave and/or environmental observations as neural network inputs. The diagnostic skill of these simple lightning/no-lightning classifiers can be quite high over land (above 80% Probability of Detection; below 20% False Alarm Rate). In the second, passive microwave and lightning observations are used to diagnose radar reflectivity vertical structure. A priori diagnosis of hydrometeor vertical structure is highly important for improved rainfall retrieval from either orbital radars (e.g., the future Global Precipitation Mission "mothership") or radiometers (e.g., operational SSM/I and future Global Precipitation Mission passive microwave constellation platforms); we explore the incremental benefit to such diagnosis provided by lightning observations.
Building a database for statistical characterization of ELMs on DIII-D
NASA Astrophysics Data System (ADS)
Fritch, B. J.; Marinoni, A.; Bortolon, A.
2017-10-01
Edge localized modes (ELMs) are bursty instabilities which occur in the edge region of H-mode plasmas and have the potential to damage in-vessel components of future fusion machines by exposing the divertor region to large energy and particle fluxes during each ELM event. While most ELM studies focus on average quantities (e.g. energy loss per ELM), this work investigates the statistical distributions of ELM characteristics as a function of plasma parameters. A semi-automatic algorithm is being used to create a database documenting trigger times of the tens of thousands of ELMs for DIII-D discharges in scenarios relevant to ITER, thus allowing statistically significant analysis. Probability distributions of inter-ELM periods and energy losses will be determined and related to relevant plasma parameters such as density, stored energy, and current in order to constrain models and improve estimates of the expected inter-ELM periods and sizes, both of which must be controlled in future reactors. Work supported in part by US DoE under the Science Undergraduate Laboratory Internships (SULI) program, DE-FC02-04ER54698 and DE-FG02-94ER54235.
Vijayalaxmi; Obe, Guenter
2005-07-01
During the years 1990-2003, a large number of investigations were conducted using animals, cultured rodent and human cells as well as freshly collected human blood lymphocytes to determine the genotoxic potential of exposure to nonionizing radiation emitted from extremely low frequency electromagnetic fields (EMF). Among the 63 peer reviewed scientific reports, the conclusions from 29 studies (46%) did not indicate increased damage to the genetic material, as assessed from DNA strand breaks, incidence of chromosomal aberrations (CA), micronuclei (MN), and sister chromatid exchanges (SCE), in EMF exposed cells as compared with sham exposed and/or unexposed cells, while those from 14 investigations (22%) have suggested an increase in such damage in EMF exposed cells. The observations from 20 other studies (32%) were inconclusive. This study reviews the investigations published in peer reviewed scientific journals during 1990-2003 and attempts to identify probable reason(s) for the conflicting results. Recommendations are made for future research to address some of the controversial observations. Copyright 2005 Wiley-Liss, Inc.
TANDI: threat assessment of network data and information
NASA Astrophysics Data System (ADS)
Holsopple, Jared; Yang, Shanchieh Jay; Sudit, Moises
2006-04-01
Current practice for combating cyber attacks typically uses Intrusion Detection Sensors (IDSs) to passively detect and block multi-stage attacks. This work leverages Level-2 fusion that correlates IDS alerts belonging to the same attacker, and proposes a threat assessment algorithm to predict potential future attacker actions. The algorithm, TANDI, reduces the problem complexity by separating the models of the attacker's capability and opportunity, and fusing the two to determine the attacker's intent. Unlike traditional Bayesian-based approaches, which require assigning a large number of edge probabilities, the proposed Level-3 fusion procedure uses only 4 parameters. TANDI has been implemented and tested with randomly created attack sequences. The results demonstrate that TANDI predicts future attack actions accurately as long as the attack is not part of a coordinated attack and contains no insider threats. In the presence of abnormal attack events, TANDI will alert the network analyst for further analysis. The attempt to evaluate a threat assessment algorithm via simulation is the first in the literature, and shall open up a new avenue in the area of high-level fusion.
Probability Forecasting Using Monte Carlo Simulation
NASA Astrophysics Data System (ADS)
Duncan, M.; Frisbee, J.; Wysack, J.
2014-09-01
Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potential high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process in which the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach in typical event scenarios. We explore the utility of a probability-based track scenario simulation that models expected tracking data frequency as the tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
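A minimal sketch of the Monte Carlo core of such a forecast, assuming invented relative miss-distance statistics at the time of closest approach rather than propagated state covariances:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Invented relative miss-distance statistics (km) at closest approach; a real
# forecast would propagate the full state vectors and covariances instead.
mean_miss = np.array([0.05, 0.02, 0.01])
cov = np.diag([0.05, 0.04, 0.02]) ** 2
hard_body_radius = 0.02            # combined hard-body radius, km (illustrative)

miss = rng.multivariate_normal(mean_miss, cov, size=n_trials)
p_collision = np.mean(np.linalg.norm(miss, axis=1) < hard_body_radius)
print("Monte Carlo collision probability:", p_collision)
```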
Large-deviation probabilities for correlated Gaussian processes and intermittent dynamical systems
NASA Astrophysics Data System (ADS)
Massah, Mozhdeh; Nicol, Matthew; Kantz, Holger
2018-05-01
In its classical version, the theory of large deviations makes quantitative statements about the probability of outliers when estimating time averages, provided the time series data are independent and identically distributed. We study large-deviation probabilities (LDPs) for time averages in short- and long-range correlated Gaussian processes and show that long-range correlations lead to subexponential decay of LDPs. A particular deterministic intermittent map can, depending on a control parameter, also generate long-range correlated time series. We illustrate numerically, in agreement with the mathematical literature, that this type of intermittency leads to a power-law decay of LDPs. The power-law decay holds irrespective of whether the correlation time is finite or infinite, and hence irrespective of whether the central limit theorem applies or not.
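Schematically, the contrast discussed can be summarized as follows (the exponents and constants are generic placeholders, not the paper's fitted values):

```latex
% Schematic decay laws for the probability of a time-average outlier:
P\!\left(\Bigl|\tfrac{1}{T}\!\int_{0}^{T}\! x(t)\,\mathrm{d}t-\mu\Bigr|>\epsilon\right)\sim
\begin{cases}
  e^{-T\,I(\epsilon)}                  & \text{i.i.d.\ or short-range correlations},\\[3pt]
  e^{-c\,T^{\gamma}},\;\; 0<\gamma<1   & \text{long-range correlated Gaussian process},\\[3pt]
  T^{-\alpha}                          & \text{intermittent deterministic dynamics}.
\end{cases}
```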
Cumplido-Hernández, Gustavo; Campos-Arciniega, María Faustina; Chávez-López, Arturo
2007-01-01
Medical specialty training courses have peculiar characteristics that probably influence the learning process of the residents. These training courses take place in large hospitals; the residents are subjected to a rigorous selection process, and at the same time they are affiliated employees of the institution. They work long shifts and are immersed in complex academic and occupational relationships. This study aims to ascertain the significance that these future specialists give to the environment where the training course takes place in relation to their learning process. We used the social anthropology narrative analysis method. A theoretical social perspective was used to emphasize the context in order to explain the reality in which the residents live. Discipline, workload, conflictive relationships, and the strength of family ties were the most significant elements.
Communicating uncertainty in circulation aspects of climate change
NASA Astrophysics Data System (ADS)
Shepherd, Ted
2017-04-01
The usual way of representing uncertainty in climate change is to define a likelihood range of possible futures, conditioned on a particular pathway of greenhouse gas concentrations (RCPs). Typically these likelihood ranges are derived from multi-model ensembles. However, there is no obvious basis for treating such ensembles as probability distributions. Moreover, for aspects of climate related to atmospheric circulation, such an approach generally leads to large uncertainty and low confidence in projections. Yet this does not mean that the associated climate risks are small. We therefore need to develop suitable ways of communicating climate risk whilst acknowledging the uncertainties. This talk will outline an approach based on conditioning the purely thermodynamic aspects of climate change, concerning which there is comparatively high confidence, on circulation-related aspects, and treating the latter through non-probabilistic storylines.
High-efficiency wavefunction updates for large scale Quantum Monte Carlo
NASA Astrophysics Data System (ADS)
Kent, Paul; McDaniel, Tyler; Li, Ying Wai; D'Azevedo, Ed
Within ab initio Quantum Monte Carlo (QMC) simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunctions. The evaluation of each Monte Carlo move requires finding the determinant of a dense matrix, which is traditionally evaluated iteratively using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. For calculations with thousands of electrons, this operation dominates the execution profile. We propose a novel rank-k delayed update scheme. This strategy enables probability evaluation for multiple successive Monte Carlo moves, with application of accepted moves to the matrices delayed until after a predetermined number of moves, k. Accepted events grouped in this manner are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency. This procedure does not change the underlying Monte Carlo sampling or the sampling efficiency. For large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order-of-magnitude speedups can be obtained on both multi-core CPUs and on GPUs, making this algorithm highly advantageous for current petascale and future exascale computations.
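For context, the rank-1 Sherman-Morrison row update that the delayed scheme generalizes can be sketched as below; the rank-k grouping itself (accumulating k accepted moves and applying them en bloc) is not reproduced here.

```python
import numpy as np

def accept_move(Ainv, i, new_row):
    """Rank-1 Sherman-Morrison update of the inverse Slater matrix when row i
    (one electron's orbital values) is replaced by new_row.
    Returns the updated inverse and the determinant ratio."""
    n = Ainv.shape[0]
    ratio = new_row @ Ainv[:, i]                   # det(A') / det(A)
    correction = new_row @ Ainv - np.eye(n)[i]
    return Ainv - np.outer(Ainv[:, i], correction) / ratio, ratio

# Consistency check against explicit inversion on a random matrix.
rng = np.random.default_rng(3)
n, i = 6, 2
A = rng.normal(size=(n, n))
u = rng.normal(size=n)
Ainv_new, r = accept_move(np.linalg.inv(A), i, u)
A_new = A.copy()
A_new[i] = u
print(np.allclose(Ainv_new, np.linalg.inv(A_new)),
      np.isclose(r, np.linalg.det(A_new) / np.linalg.det(A)))
```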
Quasi-periodic recurrence of large earthquakes on the southern San Andreas fault
Scharer, Katherine M.; Biasi, Glenn P.; Weldon, Ray J.; Fumal, Tom E.
2010-01-01
It has been 153 yr since the last large earthquake on the southern San Andreas fault (California, United States), but the average interseismic interval is only ~100 yr. If the recurrence of large earthquakes is periodic, rather than random or clustered, the length of this period is notable and would generally increase the risk estimated in probabilistic seismic hazard analyses. Unfortunately, robust characterization of a distribution describing earthquake recurrence on a single fault is limited by the brevity of most earthquake records. Here we use statistical tests on a 3000 yr combined record of 29 ground-rupturing earthquakes from Wrightwood, California. We show that earthquake recurrence there is more regular than expected from a Poisson distribution and is not clustered, leading us to conclude that recurrence is quasi-periodic. The observation of unimodal time dependence is persistent across an observationally based sensitivity analysis that critically examines alternative interpretations of the geologic record. The results support formal forecast efforts that use renewal models to estimate probabilities of future earthquakes on the southern San Andreas fault. Only four intervals (15%) from the record are longer than the present open interval, highlighting the current hazard posed by this fault.
Adult Services in the Third Millennium.
ERIC Educational Resources Information Center
Monroe, Margaret E.
1979-01-01
Presents a four-step model for "planning" or "forecasting" the future of adult services in public libraries: (1) identification of forces at work; (2) analysis of probable impacts of one force upon another; (3) identification of preferred (and rejected) elements of the future, together with the forces that control those elements; and (4) strategies to be…
Predicting potentially toxigenic Pseudo-nitzschia blooms in the Chesapeake Bay
NASA Astrophysics Data System (ADS)
Anderson, Clarissa R.; Sapiano, Mathew R. P.; Prasad, M. Bala Krishna; Long, Wen; Tango, Peter J.; Brown, Christopher W.; Murtugudde, Raghu
2010-11-01
Harmful algal blooms are now recognized as a significant threat to the Chesapeake Bay as they can severely compromise the economic viability of important recreational and commercial fisheries in the largest estuary of the United States. This study describes the development of empirical models for the potentially domoic acid-producing Pseudo-nitzschia species complex present in the Bay, developed from a 22-year time series of cell abundance and concurrent measurements of hydrographic and chemical properties. Using a logistic Generalized Linear Model (GLM) approach, model parameters and performance were compared over a range of Pseudo-nitzschia bloom thresholds relevant to toxin production by different species. Small-threshold blooms (≥10 cells mL-1) are explained by time of year, location, and variability in surface values of phosphate, temperature, nitrate plus nitrite, and freshwater discharge. Medium- (100 cells mL-1) to large-threshold (1000 cells mL-1) blooms are further explained by salinity, silicic acid, dissolved organic carbon, and light attenuation (Secchi) depth. These predictors are similar to other models for Pseudo-nitzschia blooms on the west coast, suggesting commonalities across ecosystems. Hindcasts of bloom probabilities at a 19% bloom prediction point yield a Heidke Skill Score of ~53%, a Probability of Detection of ~75%, a False Alarm Ratio of ~52%, and a Probability of False Detection of ~9%. The implication of possible future changes in Baywide nutrient stoichiometry on Pseudo-nitzschia blooms is discussed.
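For readers unfamiliar with the verification scores quoted above, a minimal Python sketch of how POD, FAR, POFD, and the Heidke Skill Score follow from a 2x2 contingency table of hindcasts versus observations is given below; the counts are invented, chosen only to roughly reproduce scores of the magnitude reported, and are not the study's data.

```python
def forecast_skill(hits, false_alarms, misses, correct_negatives):
    """Standard categorical verification scores from a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                      # Probability of Detection
    far = b / (a + b)                      # False Alarm Ratio
    pofd = b / (b + d)                     # Probability of False Detection
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, pofd, hss

# Illustrative (made-up) counts: 30 hits, 33 false alarms, 10 misses, 327 correct negatives.
print(forecast_skill(30, 33, 10, 327))
```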
Medicaid Expansion Under the Affordable Care Act and Insurance Coverage in Rural and Urban Areas.
Soni, Aparna; Hendryx, Michael; Simon, Kosali
2017-04-01
To analyze the differential rural-urban impacts of the Affordable Care Act Medicaid expansion on low-income childless adults' health insurance coverage. Using data from the American Community Survey years 2011-2015, we conducted a difference-in-differences regression analysis to test for changes in the probability of low-income childless adults having insurance in states that expanded Medicaid versus states that did not expand, in rural versus urban areas. Analyses employed survey weights, adjusted for covariates, and included a set of falsification tests as well as sensitivity analyses. Medicaid expansion under the Affordable Care Act increased the probability of Medicaid coverage for targeted populations in rural and urban areas, with a significantly greater increase in rural areas (P < .05), but some of these gains were offset by reductions in individual purchased insurance among rural populations (P < .01). Falsification tests showed that the insurance increases were specific to low-income childless adults, as expected, and were largely insignificant for other populations. The Medicaid expansion increased the probability of having "any insurance" for the pooled urban and rural low-income populations, and it specifically increased Medicaid coverage more in rural versus urban populations. There was some evidence that the expansion was accompanied by some shifting from individual purchased insurance to Medicaid in rural areas, and there is a need for future work to understand the implications of this shift on expenditures, access to care and utilization. © 2017 National Rural Health Association.
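A schematic version of the difference-in-differences specification described above (a sketch only; the file name, column names, and covariates are hypothetical, not the authors' code) could be written with statsmodels as:

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: person-level ACS extracts with hypothetical columns
#   any_insurance (0/1), expansion_state (0/1), post (0/1 for years >= 2014),
#   rural (0/1), demographic covariates, survey weight pweight, and state.
df = pd.read_csv("acs_low_income_childless_adults.csv")  # hypothetical file

# The expansion_state:post interaction captures the effect of Medicaid expansion;
# the triple interaction with rural contrasts rural versus urban impacts.
model = smf.wls(
    "any_insurance ~ expansion_state * post * rural + age + female + education",
    data=df,
    weights=df["pweight"],
).fit(cov_type="cluster", cov_kwds={"groups": df["state"]})
print(model.summary())
```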
Risk, Reward, and Decision-Making in a Rodent Model of Cognitive Aging
Gilbert, Ryan J.; Mitchell, Marci R.; Simon, Nicholas W.; Bañuelos, Cristina; Setlow, Barry; Bizon, Jennifer L.
2011-01-01
Impaired decision-making in aging can directly impact factors (financial security, health care) that are critical to maintaining quality of life and independence at advanced ages. Naturalistic rodent models mimic human aging in other cognitive domains, and afford the opportunity to parse the effects of age on discrete aspects of decision-making in a manner relatively uncontaminated by experiential factors. Young adult (5–7 months) and aged (23–25 months) male F344 rats were trained on a probability discounting task in which they made discrete-trial choices between a small certain reward (one food pellet) and a large but uncertain reward (two food pellets with varying probabilities of delivery ranging from 100 to 0%). Young rats chose the large reward when it was associated with a high probability of delivery and shifted to the small but certain reward as probability of the large reward decreased. As a group, aged rats performed comparably to young, but there was significantly greater variance among aged rats. One subgroup of aged rats showed strong preference for the small certain reward. This preference was maintained under conditions in which large reward delivery was also certain, suggesting decreased sensitivity to reward magnitude. In contrast, another subgroup of aged rats showed strong preference for the large reward at low probabilities of delivery. Interestingly, this subgroup also showed elevated preference for probabilistic rewards when reward magnitudes were equalized. Previous findings using this same aged study population described strongly attenuated discounting of delayed rewards with age, together suggesting that a subgroup of aged rats may have deficits associated with accounting for reward costs (i.e., delay or probability). These deficits in cost-accounting were dissociable from the age-related differences in sensitivity to reward magnitude, suggesting that aging influences multiple, distinct mechanisms that can impact cost–benefit decision-making. PMID:22319463
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... for ABC is the projected yield stream with a 70 percent probability of rebuilding success. The Council... to have an 81 percent chance of rebuilding in 10 years, greater than the 70 percent probability... AM applications. Should this ACT be used in the future to trigger AMs, then it may be expected to...
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Vaughan, William W.
1998-01-01
A summary is presented of basic lightning characteristics/criteria for current and future NASA aerospace vehicles. The paper estimates the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return-to-launch site. A literature search was conducted for previous work concerning occurrence and measurement of peak lightning currents, modeling, and estimating probabilities of launch vehicles/objects being struck by lightning. This paper presents these results.
Janssen, Stefan; Schudoma, Christian; Steger, Gerhard; Giegerich, Robert
2011-11-03
Many bioinformatics tools for RNA secondary structure analysis are based on a thermodynamic model of RNA folding. They predict a single, "optimal" structure by free energy minimization, they enumerate near-optimal structures, they compute base pair probabilities and dot plots, representative structures of different abstract shapes, or Boltzmann probabilities of structures and shapes. Although all programs refer to the same physical model, they implement it with considerable variation for different tasks, and little is known about the effects of heuristic assumptions and model simplifications used by the programs on the outcome of the analysis. We extract four different models of the thermodynamic folding space which underlie the programs RNAFOLD, RNASHAPES, and RNASUBOPT. Their differences lie within the details of the energy model and the granularity of the folding space. We implement probabilistic shape analysis for all models, and introduce the shape probability shift as a robust measure of model similarity. Using four data sets derived from experimentally solved structures, we provide a quantitative evaluation of the model differences. We find that search space granularity affects the computed shape probabilities less than the over- or underapproximation of free energy by a simplified energy model. Still, the approximations perform similarly enough to implementations of the full model to justify their continued use in settings where computational constraints call for simpler algorithms. In passing, we observe that the rarely used level 2 shapes, which predict the complete arrangement of helices, multiloops, internal loops and bulges, include the "true" shape in a rather small number of predicted high probability shapes. This calls for an investigation of new strategies to extract high probability members from the (very large) level 2 shape space of an RNA sequence. We provide implementations of all four models, written in a declarative style that makes them easy to modify. Based on our study, future work on thermodynamic RNA folding may make its choice of model based on our empirical data. It can take our implementations as a starting point for further program development.
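As background for the shape probabilities discussed above (an illustrative, generic formulation, not an equation quoted from the paper), the Boltzmann probability of a structure s and of an abstract shape q are commonly written as

\[ P(s) = \frac{e^{-E(s)/RT}}{Z}, \qquad Z = \sum_{s'} e^{-E(s')/RT}, \qquad P(q) = \sum_{s \in q} P(s), \]

where E(s) is the free energy assigned by the thermodynamic model; a shape probability shift would then compare P(q) between two model variants.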
Becher, M A; Grimm, V; Knapp, J; Horn, J; Twiston-Davies, G; Osborne, J L
2016-11-24
Social bees are central place foragers collecting floral resources from the surrounding landscape, but little is known about the probability of a scouting bee finding a particular flower patch. We therefore developed a software tool, BEESCOUT, to theoretically examine how bees might explore a landscape and distribute their scouting activities over time and space. An image file can be imported, which is interpreted by the model as a "forage map" with certain colours representing certain crops or habitat types as specified by the user. BEESCOUT calculates the size and location of these potential food sources in that landscape relative to a bee colony. An individual-based model then determines the detection probabilities of the food patches by bees, based on parameter values gathered from the flight patterns of radar-tracked honeybees and bumblebees. Various "search modes" describe hypothetical search strategies for the long-range exploration of scouting bees. The resulting detection probabilities of forage patches can be used as input for the recently developed honeybee model BEEHAVE, to explore realistic scenarios of colony growth and death in response to different stressors. In example simulations, we find that detection probabilities for food sources close to the colony fit empirical data reasonably well. However, for food sources further away no empirical data are available to validate model output. The simulated detection probabilities depend largely on the bees' search mode, and whether they exchange information about food source locations. Nevertheless, we show that landscape structure and connectivity of food sources can have a strong impact on the results. We believe that BEESCOUT is a valuable tool to better understand how landscape configurations and searching behaviour of bees affect detection probabilities of food sources. It can also guide the collection of relevant data and the design of experiments to close knowledge gaps, and provides a useful extension to the BEEHAVE honeybee model, enabling future users to explore how landscape structure and food availability affect the foraging decisions and patch visitation rates of the bees and, in consequence, to predict colony development and survival.
Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis
NASA Astrophysics Data System (ADS)
George, E.; Chan, K.
2012-09-01
Several relationships are developed relating object size, initial covariance and range at closest approach to probability of collision. These relationships address the following questions: - Given the objects' initial covariance and combined hard body size, what is the maximum possible value of the probability of collision (Pc)? - Given the objects' initial covariance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit? - Given the objects' initial covariance and the combined hard body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit? - Given the objects' initial covariance and the miss distance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit? The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. The application of this pre-filter to present day catalogs with estimated covariance results in the elimination of approximately 35% of object pairs as unable to ever conjunct with a probability of collision exceeding 1x10-6. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element set age or quality - a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals the screening volumes for small objects are much larger than needed, while the screening volumes for pairs of large objects may be inadequate. These relationships may also form the basis of an important metric for catalog maintenance by defining the maximum allowable covariance size for effective conjunction analysis. The application of these techniques promises to greatly improve the efficiency and completeness of conjunction analysis.
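The abstract does not give its formulas, but a widely used small-object approximation illustrates the kind of relationship involved (an illustrative form, not the authors' derivation): for combined hard-body radius R, miss distance d, and an isotropic in-plane position uncertainty σ,

\[ P_c \approx \frac{R^2}{2\sigma^2}\, e^{-d^2/(2\sigma^2)}, \qquad \max_{\sigma} P_c = \frac{R^2}{e\, d^2} \quad \text{at } \sigma = d/\sqrt{2}, \]

so an object pair whose geometry can never make R²/(e d²) exceed the tolerance limit can be dismissed from conjunction analysis, which is the spirit of the pre-filters described above.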
State-space modeling to support management of brucellosis in the Yellowstone bison population
Hobbs, N. Thompson; Geremia, Chris; Treanor, John; Wallen, Rick; White, P.J.; Hooten, Mevin B.; Rhyan, Jack C.
2015-01-01
The bison (Bison bison) of the Yellowstone ecosystem, USA, exemplify the difficulty of conserving large mammals that migrate across the boundaries of conservation areas. Bison are infected with brucellosis (Brucella abortus) and their seasonal movements can expose livestock to infection. Yellowstone National Park has embarked on a program of adaptive management of bison, which requires a model that assimilates data to support management decisions. We constructed a Bayesian state-space model to reveal the influence of brucellosis on the Yellowstone bison population. A frequency-dependent model of brucellosis transmission was superior to a density-dependent model in predicting out-of-sample observations of horizontal transmission probability. A mixture model including both transmission mechanisms converged on frequency dependence. Conditional on the frequency-dependent model, brucellosis median transmission rate was 1.87 yr−1. The median of the posterior distribution of the basic reproductive ratio (R0) was 1.75. Seroprevalence of adult females varied around 60% over two decades, but only 9.6 of 100 adult females were infectious. Brucellosis depressed recruitment; estimated population growth rate λ averaged 1.07 for an infected population and 1.11 for a healthy population. We used five-year forecasting to evaluate the ability of different actions to meet management goals relative to no action. Annually removing 200 seropositive female bison increased by 30-fold the probability of reducing seroprevalence below 40% and increased by a factor of 120 the probability of achieving a 50% reduction in transmission probability relative to no action. Annually vaccinating 200 seronegative animals increased the likelihood of a 50% reduction in transmission probability by fivefold over no action. However, including uncertainty in the ability to implement management by representing stochastic variation in the number of accessible bison dramatically reduced the probability of achieving goals using interventions relative to no action. Because the width of the posterior predictive distributions of future population states expands rapidly with increases in the forecast horizon, managers must accept high levels of uncertainty. These findings emphasize the necessity of iterative, adaptive management with relatively short-term commitment to action and frequent reevaluation in response to new data and model forecasts. We believe our approach has broad applications.
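For reference (an illustrative formulation, not the authors' exact parameterization), the two transmission mechanisms compared above are usually written as

\[ \text{frequency dependent: } \lambda = \beta \frac{I}{N}, \qquad \text{density dependent: } \lambda = \beta' I, \]

where λ is the per-capita infection hazard for susceptible animals, I the number of infectious animals, and N the population size; under frequency dependence the hazard depends on prevalence I/N rather than on absolute abundance, so reducing population size alone does not reduce transmission.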
Can the World's Farmers Feed a World of 10 Billion People In Spite of Climate Change? (Invited)
NASA Astrophysics Data System (ADS)
Easterling, W. E.
2010-12-01
The rapid rise in agricultural productivity due to technological innovation and science-based methods was one of the great human achievements of the 20th century. We now face the prospect of needing to double agricultural output by the latter third of the current century to match the growth of demand for food and fiber—albeit the pace of growth in demand shows signs of slowing in the future. How farmers and the agricultural industry deal with climate change will, in large measure, determine success or failure. The Earth is committed to about the same amount of warming in the future as has been experienced over the past hundred years regardless of future greenhouse gas emissions trajectories; this committed warming will require adaptive responses by plants, animals, producers and consumers if society's goals for global food security are to be met. In this paper, I summarize the state of the science of how climate change may affect our global agricultural production system. I review the latest thinking on the combined effects of rising atmospheric CO2 concentration and climate changes on crop productivity across the globe. Prospects for adaptation in agriculturally important regions are examined. While it appears that global food production will be adequate to meet global food demand in spite of advancing climate change, it is clear that many parts of the tropics and dry sub-tropics will see yield decreases and possible loss of comparative advantage. In those regions, continued large population growth and deleterious climate changes will contribute to declining per capita agricultural production. An increase in the number of people at risk of hunger is probable there.
NASA Technical Reports Server (NTRS)
1990-01-01
The current state of the study of chemical evolution and planetary biology is reviewed, and the probable future of the field, at least for the near term, is discussed. To this end, the report lists the goals and objectives of future research and makes detailed, comprehensive recommendations for accomplishing them, emphasizing those issues that were inadequately discussed in earlier Space Studies Board reports.
Assessment of potential future hydrogen markets in the U.S.
NASA Technical Reports Server (NTRS)
Kashani, A. K.
1980-01-01
Potential future hydrogen markets in the United States are assessed. Future hydrogen markets for various use sectors are projected, the probable range of hydrogen production costs from various alternatives is estimated, stimuli and barriers to the development of hydrogen markets are discussed, an overview of the status of technologies for the production and utilization of hydrogen is presented, and, finally, societal aspects of hydrogen production and utilization are discussed.
Transition probabilities of health states for workers in Malaysia using a Markov chain model
NASA Astrophysics Data System (ADS)
Samsuddin, Shamshimah; Ismail, Noriszura
2017-04-01
The aim of our study is to estimate the transition probabilities of health states for workers in Malaysia who contribute to the Employment Injury Scheme under the Social Security Organization Malaysia, using a Markov chain model. Our study uses four states of health (active, temporary disability, permanent disability and death) based on data collected from longitudinal studies of workers in Malaysia over 5 years. The transition probabilities vary by health state, age and gender. The results show that male employees are more likely to have higher transition probabilities to any health state than female employees. The transition probabilities can be used to predict the future health of workers as a function of current age, gender and health state.
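A minimal sketch (with hypothetical counts, not the study's data) of how a transition matrix is estimated from year-to-year state counts and projected forward:

```python
import numpy as np

states = ["active", "temporary disability", "permanent disability", "death"]

# Hypothetical counts n[i, j]: workers observed in state i one year and state j the next.
counts = np.array([
    [9500, 300, 150, 50],
    [600, 250, 100, 50],
    [0, 0, 450, 50],
    [0, 0, 0, 100],
], dtype=float)

P = counts / counts.sum(axis=1, keepdims=True)   # maximum-likelihood transition matrix
P5 = np.linalg.matrix_power(P, 5)                # 5-year-ahead transition probabilities

start = np.array([1.0, 0.0, 0.0, 0.0])           # a currently active worker
print(dict(zip(states, start @ P5)))             # probability of each state in 5 years
```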
A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Junghyun; Hayward, Chris; Zeiler, Cleat
Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacing. The four-hour time sequence contained a number of easily identified signals under noise conditions with average RMS amplitudes varying from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, large aperture, small aperture combined with the large aperture, and full array. The full and combined arrays performed the best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced similar results to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both automated detectors and the analysts produced a decrease in detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that the smaller value of consistency for PMCC and the larger p-value for AFD had the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. The detection probability was impacted the most by noise level, with low noise (average RMS amplitude of 1.7 mPa) having an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.
Modeling nonbreeding distributions of shorebirds and waterfowl in response to climate change
Reese, Gordon; Skagen, Susan K.
2017-01-01
To identify areas on the landscape that may contribute to a robust network of conservation areas, we modeled the probabilities of occurrence of several en route migratory shorebirds and wintering waterfowl in the southern Great Plains of North America, including responses to changing climate. We predominantly used data from the eBird citizen-science project to model probabilities of occurrence relative to land-use patterns, spatial distribution of wetlands, and climate. We projected models to potential future climate conditions using five representative general circulation models of the Coupled Model Intercomparison Project 5 (CMIP5). We used Random Forests to model probabilities of occurrence and compared the time periods 1981–2010 (hindcast) and 2041–2070 (forecast) in “model space.” Projected changes in shorebird probabilities of occurrence varied with species-specific general distribution pattern, migration distance, and spatial extent. Species using the western and northern portion of the study area exhibited the greatest likelihoods of decline, whereas species with more easterly occurrences, mostly long-distance migrants, had the greatest projected increases in probability of occurrence. At an ecoregional extent, differences in probabilities of shorebird occurrence ranged from −0.015 to 0.045 when averaged across climate models, with the largest increases occurring early in migration. Spatial shifts are predicted for several shorebird species. Probabilities of occurrence of wintering Mallards and Northern Pintail are predicted to increase by 0.046 and 0.061, respectively, with northward shifts projected for both species. When incorporated into partner land management decision tools, results at ecoregional extents can be used to identify wetland complexes with the greatest potential to support birds in the nonbreeding season under a wide range of future climate scenarios.
Climate change and wetland loss impacts on a Western river's water quality
NASA Astrophysics Data System (ADS)
Records, R. M.; Arabi, M.; Fassnacht, S. R.; Duffy, W. G.; Ahmadi, M.; Hegewisch, K. C.
2014-05-01
An understanding of potential stream water quality conditions under future climate is critical for the sustainability of ecosystems and protection of human health. Changes in wetland water balance under projected climate could alter wetland extent or cause wetland loss. This study assessed the potential climate-induced changes to in-stream sediment and nutrient loads in the historically snowmelt-dominated Sprague River, Oregon, Western United States. Additionally, potential water quality impacts of combined changes in wetland water balance and wetland area under future climatic conditions were evaluated. The study utilized the Soil and Water Assessment Tool (SWAT) forced with statistical downscaling of general circulation model (GCM) data from the Coupled Model Intercomparison Project 5 (CMIP5) using the Multivariate Adaptive Constructed Analogs (MACA) method. Our findings suggest that in the Sprague River (1) mid-21st century nutrient and sediment loads could increase significantly during the high-flow season under warmer-wetter climate projections, or could change only nominally in a warmer and somewhat drier future; (2) although water quality conditions under some future climate scenarios and no wetland loss may be similar to the past, the combined impact of climate change and wetland losses on nutrient loads could be large; (3) increases in stream total phosphorus (TP) concentration with wetland loss under future climate scenarios would be greatest at high-magnitude, low-probability flows; and (4) loss of riparian wetlands in both headwaters and lowlands could increase outlet TP loads to a similar degree, but this could be due to distinctly different mechanisms in different parts of the watershed.
Vocalization behavior and response of black rails
Legare, M.L.; Eddleman, W.R.; Buckley, P.A.; Kelly, C.
1999-01-01
We measured the vocal responses and movements of radio-tagged black rails (Laterallus jamaicensis) (n = 43, 26 males, 17 females) to playback of vocalizations at 2 sites in Florida during the breeding seasons of 1992-95. We used regression coefficients from logistic regression equations to model the probability of a response conditional on the birds' sex, nesting status, distance to playback source, and the time of survey. With a probability of 0.811, non-nesting male black rails were most likely to respond to playback, while nesting females were the least likely to respond (probability = 0.189). Linear regression was used to determine daily, monthly, and annual variation in response from weekly playback surveys along a fixed route during the breeding seasons of 1993-95. Significant sources of variation in the linear regression model were month (F = 3.89, df = 3, p = 0.0140), year (F = 9.37, df = 2, p = 0.0003), temperature (F = 5.44, df = 1, p = 0.0236), and month*year (F = 2.69, df = 5, p = 0.0311). The model was highly significant (p < 0.0001) and explained 53% of the variation in mean response per survey period (R2 = 0.5353). Response probability data obtained from the radio-tagged black rails and data from the weekly playback survey route were combined to provide a density estimate of 0.25 birds/ha for the St. Johns National Wildlife Refuge. Density estimates for black rails may be obtained from playback surveys and fixed-radius circular plots. Circular plots should be considered as having a radius of 80 m and be located so the plot centers are 150 m apart. Playback tapes should contain one series of Kic-kic-kerr and Growl vocalizations recorded within the same geographic region as the study area. Surveys should be conducted from 0-2 hours after sunrise or 0-2 hours before sunset, during the pre-nesting season, and when wind velocity is < 20 kph. Observers should listen for 3-4 minutes after playing the survey tape and record responses heard during that time. Observers should be trained to identify black rail vocalizations and should have acceptable hearing ability. Given the number of variables that may have large effects on the response behavior of black rails to tape playback, we recommend that future studies using playback surveys be cautious when presenting estimates of 'absolute' density. Though our results did account for variation in response behavior, we believe that additional variation in vocal response among sites, breeding statuses, and bird densities remains in question. Playback surveys along fixed routes providing a simple index of abundance would be useful for monitoring populations over large geographic areas and over time. Considering the limitations of most agency resources for webless waterbirds, index surveys may be more appropriate. Future telemetry studies of this type on other species and at other sites would be useful to calibrate information obtained from playback surveys, whether reporting an index of abundance or a density estimate.
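The response probabilities quoted above come from a logistic model; generically (coefficient values are hypothetical), the predicted probability for a bird with covariates x is

\[ \Pr(\text{response}) = \frac{1}{1 + e^{-(\beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k)}}, \]

so, for example, a linear predictor of +1.46 corresponds to the 0.811 probability reported for non-nesting males and one of −1.46 to the 0.189 reported for nesting females.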
Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.
Fennell, John; Baddeley, Roland
2012-10-01
Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Emirates Mars Mission Planetary Protection Plan
NASA Astrophysics Data System (ADS)
Awadhi, Mohsen Al
2016-07-01
The United Arab Emirates is planning to launch a spacecraft to Mars in 2020 as part of the Emirates Mars Mission (EMM). The EMM spacecraft, Amal, will arrive in early 2021 and enter orbit about Mars. Through a sequence of subsequent maneuvers, the spacecraft will enter a large science orbit and remain there throughout the primary mission. This paper describes the planetary protection plan for the EMM mission. The EMM science orbit, where Amal will conduct the majority of its operations, is very large compared to other Mars orbiters. The nominal orbit has a periapse altitude of 20,000 km, an apoapse altitude of 43,000 km, and an inclination of 25 degrees. From this vantage point, Amal will conduct a series of atmospheric investigations. Since Amal's orbit is very large, the planetary protection plan is to demonstrate a very low probability that the spacecraft will ever encounter Mars' surface or lower atmosphere during the mission. The EMM team has prepared methods to demonstrate that (1) the launch vehicle targets support a 0.01% probability of impacting Mars, or less, within 50 years; (2) the spacecraft has a 1% probability or less of impacting Mars during 20 years; and (3) the spacecraft has a 5% probability or less of impacting Mars during 50 years. The EMM mission design resembles the mission design of many previous missions, differing only in the specific parameters and final destination. The following sequence describes the mission: 1. The mission will launch in July, 2020. The launch includes a brief parking orbit and a direct injection to the interplanetary cruise. The launch targets are specified by the hyperbolic departure's energy C3, and the hyperbolic departure's direction in space, captured by the right ascension and declination of the launch asymptote, RLA and DLA, respectively. The targets of the launch vehicle are biased away from Mars such that there is a 0.01% probability or less that the launch vehicle arrives onto a trajectory that impacts Mars. 2. The spacecraft is deployed from the launch vehicle and powers on. 3. Within the first month, the spacecraft executes a trajectory correction maneuver to remove the launch bias. The target of this maneuver may still have a small bias to further reduce the probability of inadvertently impacting Mars. 4. Four additional trajectory correction maneuvers are scheduled and planned in the interplanetary cruise in order to target the precise arrival conditions at Mars. The targeted arrival conditions are specified by an altitude above the surface of Mars and an inclination relative to Mars' equator. The closest approach to Mars during the Mars Orbit Insertion (MOI) is over 600 km and the periapsis altitude of the first orbit about Mars is nominally 500 km. The inclination of the first orbit about Mars is nominally around 18 degrees. 5. The Mars Orbit Insertion is performed as a pitch-over burn, approaching no closer than approximately 600 km, and targeting a capture orbit period of 35-40 hours. 6. The spacecraft Capture Orbit has a nominal periapse altitude of 500 km, a nominal apoapse altitude of approximately 45,000 km, and a nominal period of approximately 35 hours. The mission expects that this orbit will be somewhat different after executing the real MOI due to maneuver execution errors. The full range of expected Capture Orbit sizes is acceptable from a planetary protection perspective. 7. The spacecraft remains in the Capture Orbit for two months. 8. The spacecraft then executes three maneuvers in the Transition to Science phase, raising the orbital periapse, raising the orbit inclination, adjusting the apoapse, and placing the argument of periapse near a value of 177 deg. The three maneuvers are nominally one week apart. The first maneuver is large and will raise the periapse significantly, thereafter significantly reducing the probability of Amal impacting Mars in the future.
Changes in the probability of co-occurring extreme climate events
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.
2017-12-01
Extreme climate events such as floods, droughts, heatwaves, and severe storms exert acute stresses on natural and human systems. When multiple extreme events co-occur, either in space or time, the impacts can be substantially compounded. A diverse set of human interests - including supply chains, agricultural commodities markets, reinsurance, and deployment of humanitarian aid - have historically relied on the rarity of extreme events to provide a geographic hedge against the compounded impacts of co-occuring extremes. However, changes in the frequency of extreme events in recent decades imply that the probability of co-occuring extremes is also changing, and is likely to continue to change in the future in response to additional global warming. This presentation will review the evidence for historical changes in extreme climate events and the response of extreme events to continued global warming, and will provide some perspective on methods for quantifying changes in the probability of co-occurring extremes in the past and future.
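A toy calculation (not taken from the presentation) shows why changes in marginal probabilities compound: if two regions experience a given extreme independently with annual probability p each, the chance of co-occurrence is p²,

\[ p = 0.05 \Rightarrow p^2 = 0.0025, \qquad p = 0.10 \Rightarrow p^2 = 0.01, \]

so doubling each marginal probability quadruples the probability of the joint event, and any positive dependence between the regions raises it further.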
Robust Bayesian Experimental Design for Conceptual Model Discrimination
NASA Astrophysics Data System (ADS)
Pham, H. V.; Tsai, F. T. C.
2015-12-01
A robust Bayesian optimal experimental design under uncertainty is presented to provide firm information for model discrimination, given the least number of pumping wells and observation wells. Firm information is the maximum information about a system that can be guaranteed from an experimental design. The design is based on the Box-Hill expected entropy decrease (EED) before and after the experiment design and the Bayesian model averaging (BMA) framework. A max-min programming problem is introduced to choose the robust design that maximizes the minimal Box-Hill EED, subject to the constraint that the highest expected posterior model probability satisfies a desired probability threshold. The EED is calculated by Gauss-Hermite quadrature. The BMA method is used to predict future observations and to quantify future observation uncertainty arising from conceptual and parametric uncertainties in calculating the EED. A Monte Carlo approach is adopted to quantify the uncertainty in the posterior model probabilities. The optimal experimental design is tested on a synthetic 5-layer anisotropic confined aquifer. Nine conceptual groundwater models are constructed due to uncertain geological architecture and boundary conditions. High-performance computing is used to enumerate all possible design solutions in order to identify the most plausible groundwater model. Results highlight the impacts of scedasticity in future observation data as well as uncertainty sources on potential pumping and observation locations.
Sea-level rise and its possible impacts given a 'beyond 4°C world' in the twenty-first century.
Nicholls, Robert J; Marinova, Natasha; Lowe, Jason A; Brown, Sally; Vellinga, Pier; de Gusmão, Diogo; Hinkel, Jochen; Tol, Richard S J
2011-01-13
The range of future climate-induced sea-level rise remains highly uncertain with continued concern that large increases in the twenty-first century cannot be ruled out. The biggest source of uncertainty is the response of the large ice sheets of Greenland and west Antarctica. Based on our analysis, a pragmatic estimate of sea-level rise by 2100, for a temperature rise of 4°C or more over the same time frame, is between 0.5 m and 2 m--the probability of rises at the high end is judged to be very low, but of unquantifiable probability. However, if realized, an indicative analysis shows that the impact potential is severe, with the real risk of the forced displacement of up to 187 million people over the century (up to 2.4% of global population). This is potentially avoidable by widespread upgrade of protection, albeit rather costly with up to 0.02 per cent of global domestic product needed, and much higher in certain nations. The likelihood of protection being successfully implemented varies between regions, and is lowest in small islands, Africa and parts of Asia, and hence these regions are the most likely to see coastal abandonment. To respond to these challenges, a multi-track approach is required, which would also be appropriate if a temperature rise of less than 4°C was expected. Firstly, we should monitor sea level to detect any significant accelerations in the rate of rise in a timely manner. Secondly, we need to improve our understanding of the climate-induced processes that could contribute to rapid sea-level rise, especially the role of the two major ice sheets, to produce better models that quantify the likely future rise more precisely. Finally, responses need to be carefully considered via a combination of climate mitigation to reduce the rise and adaptation for the residual rise in sea level. In particular, long-term strategic adaptation plans for the full range of possible sea-level rise (and other change) need to be widely developed.
Using harmonic oscillators to determine the spot size of Hermite-Gaussian laser beams
NASA Technical Reports Server (NTRS)
Steely, Sidney L.
1993-01-01
The similarity of the functional forms of quantum mechanical harmonic oscillators and the modes of Hermite-Gaussian laser beams is illustrated. This functional similarity provides a direct correlation to investigate the spot size of large-order mode Hermite-Gaussian laser beams. The classical limits of a corresponding two-dimensional harmonic oscillator provide a definition of the spot size of Hermite-Gaussian laser beams. The classical limits of the harmonic oscillator provide integration limits for the photon probability densities of the laser beam modes to determine the fraction of photons detected therein. Mathematica is used to integrate the probability densities for large-order beam modes and to illustrate the functional similarities. The probabilities of detecting photons within the classical limits of Hermite-Gaussian laser beams asymptotically approach unity in the limit of large-order modes, in agreement with the Correspondence Principle. The classical limits for large-order modes include all of the nodes for Hermite Gaussian laser beams; Sturm's theorem provides a direct proof.
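A short numerical check of the statement above (a Python sketch using the standard dimensionless oscillator eigenfunctions, not the authors' Mathematica code) integrates |psi_n|^2 between the classical turning points ±sqrt(2n+1) and shows the enclosed fraction approaching unity for large n:

```python
import numpy as np

def psi_sq(n, x):
    """|psi_n(x)|^2 for the dimensionless quantum harmonic oscillator,
    computed with a numerically stable normalized recurrence."""
    p_prev = np.pi ** -0.25 * np.exp(-x ** 2 / 2)      # psi_0
    if n == 0:
        return p_prev ** 2
    p = np.sqrt(2.0) * x * p_prev                       # psi_1
    for m in range(1, n):
        p_prev, p = p, np.sqrt(2.0 / (m + 1)) * x * p - np.sqrt(m / (m + 1)) * p_prev
    return p ** 2

for n in (0, 5, 20, 100):
    a = np.sqrt(2 * n + 1)                  # classical turning points at +/- a
    x = np.linspace(-a, a, 200001)
    frac = np.sum(psi_sq(n, x)) * (x[1] - x[0])
    print(n, round(float(frac), 4))         # fraction of probability inside the classical limits
```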
Interfutures: Facing the Future, Mastering the Probable and Managing the Unpredictable.
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France).
This report discusses the findings of the three year Interfutures Project which studied the future development of advanced industrial societies and the relations between these countries and the developing countries. The major emphasis of the project was to analyze economic problems. However, political and social elements were also studied. The…
Bean, Nigel G.; Ruberu, Ravi P.
2017-01-01
Background: The external validity, or generalizability, of trials and guidelines has been considered poor in the context of multiple morbidity. How multiple morbidity might affect the magnitude of benefit of a given treatment, and thereby external validity, has had little study. Objective: To provide a method of decision analysis to quantify the effects of age and comorbidity on the probability of deriving a given magnitude of treatment benefit. Design: We developed a method to calculate probabilistically the effect of all of a patient's comorbidities on their underlying utility, or well-being, at a future time point. From this, we derived a distribution of possible magnitudes of treatment benefit at that future time point. We then expressed this distribution as the probability of deriving at least a given magnitude of treatment benefit. To demonstrate the applicability of this method of decision analysis, we applied it to the treatment of hypercholesterolaemia in a geriatric population of 50 individuals. We highlighted the results of four of these individuals. Results: This method of analysis provided individualized quantifications of the effect of age and comorbidity on the probability of treatment benefit. The average probability of deriving a benefit, of at least 50% of the magnitude of benefit available to an individual without comorbidity, was only 0.8%. Conclusion: The effects of age and comorbidity on the probability of deriving significant treatment benefits can be quantified for any individual. Even without consideration of other factors affecting external validity, these effects may be sufficient to guide decision-making. PMID:29090189
NASA Astrophysics Data System (ADS)
Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren
2017-11-01
Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. Direct variance method was adopted to analyze the manner by which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the most predominant uncertainty source in the projections of extreme high flow, and has a considerable percentage of uncertainty contribution in monthly streamflow projections in July-September. The effects of SD in other cases are negligible. HM is a non-ignorable uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang River basin would be expected. Thus, the necessity of employing effective water-saving techniques and adaptive water resources management strategies for drought disaster mitigation should be addressed.
Gould, A Lawrence; Koglin, Joerg; Bain, Raymond P; Pinto, Cathy-Anne; Mitchel, Yale B; Pasternak, Richard C; Sapre, Aditi
2009-08-01
Studies measuring progression of carotid artery intima-media thickness (cIMT) have been used to estimate the effect of lipid-modifying therapies on cardiovascular event risk. The likelihood that future cIMT clinical trials will detect a true treatment effect is estimated by leveraging results from prior studies. The present analyses assess the impact of between- and within-study variability, based on currently published data from prior clinical studies, on the likelihood that ongoing or future cIMT trials will detect the true treatment effect of lipid-modifying therapies. Published data from six contemporary cIMT studies (ASAP, ARBITER 2, RADIANCE 1, RADIANCE 2, ENHANCE, and METEOR), including data from a total of 3563 patients, were examined. Bayesian and frequentist methods were used to assess the impact of between-study variability on the likelihood of detecting true treatment effects on 1-year cIMT progression/regression and to provide a sample size estimate that would specifically compensate for the effect of between-study variability. In addition to the well-described within-study variability, there is considerable between-study variability associated with the measurement of annualized change in cIMT. Accounting for the additional between-study variability decreases the power for existing study designs. In order to account for the added between-study variability, it is likely that future cIMT studies would require a large increase in sample size in order to provide a substantial probability (≥90%) of having 90% power to detect a true treatment effect. Limitation: Analyses are based on study-level data. Future meta-analyses incorporating patient-level data would be useful for confirmation. Due to substantial within- and between-study variability in the measure of 1-year change of cIMT, as well as uncertainty about progression rates in contemporary populations, future study designs evaluating the effect of new lipid-modifying therapies on atherosclerotic disease progression are likely to be challenged by the need for large sample sizes in order to demonstrate a true treatment effect.
Neutron lifetime measurements with a large gravitational trap for ultracold neutrons
NASA Astrophysics Data System (ADS)
Serebrov, A. P.; Kolomensky, E. A.; Fomin, A. K.; Krasnoshchekova, I. A.; Vassiljev, A. V.; Prudnikov, D. M.; Shoka, I. V.; Chechkin, A. V.; Chaikovskiy, M. E.; Varlamov, V. E.; Ivanov, S. N.; Pirozhkov, A. N.; Geltenbort, P.; Zimmer, O.; Jenke, T.; Van der Grinten, M.; Tucker, M.
2018-05-01
Neutron lifetime is one of the most important physical constants: it determines parameters of the weak interaction and predictions of primordial nucleosynthesis theory. There remains the unsolved problem of a 3.9σ discrepancy between measurements of this lifetime using neutrons in beams and those with stored ultracold neutrons (UCN). In our experiment we measure the lifetime of neutrons trapped by Earth's gravity in an open-topped vessel. Two configurations of the trap geometry are used to change the mean frequency of UCN collisions with the surfaces; this is achieved by plunging an additional surface into the trap without breaking the vacuum. The trap walls are coated with a hydrogen-less fluorine-containing polymer to reduce losses of UCN. The stability of this coating over multiple thermal cycles between 80 and 300 K was tested. At 80 K, the probability of UCN loss due to collisions with the trap walls is just 1.5% of the probability of β decay. The free neutron lifetime is determined by extrapolation to an infinitely large trap with zero collision frequency. The result of these measurements is τn = 881.5 ± 0.7 (stat) ± 0.6 (syst) s, which is consistent with the conventional value of 880.2 ± 1.0 s presented by the Particle Data Group. Future prospects for this experiment are in further cooling to 10 K, which will lead to an improved accuracy of measurement. In conclusion we present an analysis of currently available data on various measurements of the neutron lifetime.
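The extrapolation described above follows the usual storage-experiment logic (a generic formulation, not the collaboration's exact fit):

\[ \frac{1}{\tau_{\mathrm{st}}} = \frac{1}{\tau_n} + \eta\,\gamma(E,T), \]

where τ_st is the measured storage time in a given trap configuration, γ is the mean frequency of wall collisions for UCN of energy E at wall temperature T, and η is the loss probability per collision; measuring τ_st for two or more collision frequencies and extrapolating to γ → 0 yields the free-neutron lifetime τ_n.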
Battlefield Air Interdiction: Airpower for the Future
1980-01-01
recommendations for the effective use of airpower for this purpose are made. A future war will probably be against the Soviet Union or one of its... emphasis will be placed upon the Soviet forces since it is likely that any future belligerence will be against the Soviet Union or one of its... offensive operations (see figure 3) stress rapid, continuous movement. Objectives are established which demand high rates of advance. A regiment, for
Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast
Geist, E.L.; Parsons, T.
2009-01-01
Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
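To illustrate how the Poisson assumption lets source probabilities be aggregated (a generic formulation consistent with, but not quoted from, the study):

\[ P(\ge 1 \text{ tsunami exceeding the threshold in } T \text{ years}) = 1 - \exp\Bigl(-T \sum_i \lambda_i\Bigr), \]

where λ_i is the mean annual rate at which source i produces a tsunami exceeding the chosen size at the coast, with the individual rates derived from the size and recurrence distributions (e.g., tapered power laws for earthquakes) described above.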
Blecha, Kevin A.; Alldredge, Mat W.
2015-01-01
Animal space use studies using GPS collar technology are increasingly incorporating behavior based analysis of spatio-temporal data in order to expand inferences of resource use. GPS location cluster analysis is one such technique applied to large carnivores to identify the timing and location of feeding events. For logistical and financial reasons, researchers often implement predictive models for identifying these events. We present two separate improvements for predictive models that future practitioners can implement. Thus far, feeding prediction models have incorporated a small range of covariates, usually limited to spatio-temporal characteristics of the GPS data. Using GPS collared cougar (Puma concolor) we include activity sensor data as an additional covariate to increase prediction performance of feeding presence/absence. Integral to the predictive modeling of feeding events is a ground-truthing component, in which GPS location clusters are visited by human observers to confirm the presence or absence of feeding remains. Failing to account for sources of ground-truthing false-absences can bias the number of predicted feeding events to be low. Thus we account for some ground-truthing error sources directly in the model with covariates and when applying model predictions. Accounting for these errors resulted in a 10% increase in the number of clusters predicted to be feeding events. Using a double-observer design, we show that the ground-truthing false-absence rate is relatively low (4%) using a search delay of 2–60 days. Overall, we provide two separate improvements to the GPS cluster analysis techniques that can be expanded upon and implemented in future studies interested in identifying feeding behaviors of large carnivores. PMID:26398546
Probabilistic Assessment of Radiation Risk for Solar Particle Events
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Cucinotta, Francis A.
2008-01-01
For long duration missions outside of the protection of the Earth's magnetic field, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-moon or Earth-to-Mars transit. The large majority (90%) of SPEs have small or no health consequences because the doses are low and the particles do not penetrate to organ depths. However, there is an operational challenge to respond to events of unknown size and duration. We have developed a probabilistic approach to SPE risk assessment in support of mission design and operational planning. Using the historical database of proton measurements during the past 5 solar cycles, the functional form of the hazard function of SPE occurrence per cycle was found for a nonhomogeneous Poisson model. A typical hazard function was defined as a function of time within a non-specific future solar cycle of 4000 days duration. Distributions of particle fluences for a specified mission period were simulated ranging from the 5th to the 95th percentile. Organ doses from large SPEs were assessed using NASA's Baryon transport model, BRYNTRN. The SPE risk was analyzed with the organ dose distribution for the given particle fluences during a mission period. In addition to the total particle fluences of SPEs, the detailed energy spectra of protons, especially at high energy levels, were recognized as extremely important for assessing the cancer risk associated with energetic particles for large events. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated for various SPE sizes. This probabilistic approach to SPE protection will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks in future work.
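A nonhomogeneous Poisson process with a known hazard function can be simulated by thinning, which is one way to generate synthetic SPE occurrence times for a 4000-day cycle. The hazard shape below is a placeholder, not the form fitted in the study:

```python
import math
import random

def hazard(t, cycle_len=4000.0):
    """Hypothetical SPE occurrence rate (events/day) peaking mid-cycle.
    Placeholder shape only; the fitted hazard from the study is not reproduced."""
    return 0.02 * math.sin(math.pi * t / cycle_len) ** 2

def simulate_spe_times(cycle_len=4000.0, seed=1):
    """Draw SPE occurrence times by thinning a homogeneous Poisson process."""
    rng = random.Random(seed)
    lam_max = 0.02            # upper bound on the hazard, used for thinning
    t, times = 0.0, []
    while True:
        t += rng.expovariate(lam_max)
        if t > cycle_len:
            return times
        if rng.random() < hazard(t, cycle_len) / lam_max:
            times.append(t)

events = simulate_spe_times()
print(f"{len(events)} SPEs in one simulated cycle")
```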
Stimulus probability effects in absolute identification.
Kent, Christopher; Lamberts, Koen
2016-05-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Occupancy modeling of Parnassius clodius butterfly populations in Grand Teton National Park, Wyoming
Szcodronski, Kimberly E.; Debinski, Diane M.; Klaver, Robert W.
2018-01-01
Estimating occupancy patterns and identifying vegetation characteristics that influence the presence of butterfly species are essential approaches needed for determining how habitat changes may affect butterfly populations in the future. The montane butterfly species, Parnassius clodius, was investigated to identify patterns of occupancy relating to habitat variables in Grand Teton National Park and Bridger-Teton National Forest, Wyoming, United States. A series of presence–absence surveys were conducted in 2013 in 41 mesic to xeric montane meadows that were considered suitable habitat for P. clodius during their flight season (June–July) to estimate occupancy (ψ) and detection probability (p). According to the null constant parameter model, P. clodius had high occupancy of ψ = 0.78 ± 0.07 SE and detection probability of p = 0.75 ± 0.04 SE. In models testing covariates, the most important habitat indicator for the occupancy of P. clodius was a strong negative association with big sagebrush (Artemisia tridentata; β = − 21.39 ± 21.10 SE) and lupine (Lupinus spp.; β = − 20.03 ± 21.24 SE). While P. clodius was found at a high proportion of meadows surveyed, the presence of A. tridentata may limit their distribution within montane meadows at a landscape scale because A. tridentata dominates a large percentage of the montane meadows in our study area. Future climate scenarios predicted for high elevations globally could cause habitat shifts and put populations of P. clodius and similar non-migratory butterfly populations at risk.
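With the reported detection probability of p ≈ 0.75 per survey, the chance of missing the species at an occupied meadow shrinks rapidly with repeat visits, which is the rationale for the repeated presence–absence design. A small sketch using the abstract's ψ and p and a hypothetical number of visits:

```python
psi, p = 0.78, 0.75   # occupancy and per-survey detection probability (from the abstract)
k = 3                 # hypothetical number of survey visits per meadow

p_star = 1 - (1 - p) ** k               # detect at least once, given presence
p_false_absence = psi * (1 - p) ** k    # occupied but never detected across k visits

print(f"P(detected | present, {k} visits) = {p_star:.3f}")
print(f"P(false absence overall)          = {p_false_absence:.4f}")
```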
Long-Term Evolution of the Sun and our Biosphere: Causes and Effects?
NASA Astrophysics Data System (ADS)
Des Marais, D. J.
2000-05-01
The course of early biological evolution felt the environmental consequences of changes in the solar output (discussed here), as well as long-term decreases in planetary heat flow and the flux of extraterrestrial impactors. A large, early UV flux fueled the photodissociation of atmospheric water vapor, sustaining a significant hydrogen flux to space. This flux caused Earth's crust to become oxidized, relative to its mantle. Accordingly, reduced gases and aqueous solutes that were erupted volcanically into the relatively more oxidized surface environment created sources of chemical redox energy for the origin and early evolution of life. Although the solar constant has increased some 30 percent over Earth's lifetime, oceans remained remarkably stable for more than 3.8 billion years. Thus a very effective climate regulation was probably achieved by decreasing over time the atmospheric inventories of greenhouse gases such as carbon dioxide and methane. Such decreases probably had major consequences for the biosphere. Substantial early marine bicarbonate and carbon dioxide inventories sustained abundant abiotic precipitation of carbonates, with consequences for the stability and habitability of key aqueous environments. A long-term decline in carbon dioxide levels increased the bioenergetic requirements for carbon dioxide as well as other aspects of the physiology of photosynthetic microorganisms. The long-term trend of global mean surface temperature is still debated, as is the role of the sun's evolution in that trend. Future increases in the solar constant will drive atmospheric carbon dioxide levels down further, challenging plants to cope with ever-dwindling concentrations of carbon substrates. Climate regulation will be achieved by modulating an increasing abundance of high-albedo water vapor clouds. Future biological evolution defies precise predictions, however it is certain that the sun's continuing evolution will play a key role.
Mueller, Brigitte; Zhang, Xuebin; Zwiers, Francis W.
2016-04-07
We project that within the next two decades, half of the world's population will regularly (every second summer on average) experience regional summer mean temperatures that exceed those of the historically hottest summer, even under the moderate RCP4.5 emissions pathway. This frequency threshold for hot temperatures over land, which have adverse effects on human health, society and economy, might be broached in little more than a decade under the RCP8.5 emissions pathway. These hot summer frequency projections are based on adjusted RCP4.5 and 8.5 temperature projections, where the adjustments are performed with scaling factors determined by regularized optimal fingerprinting analyses that compare historical model simulations with observations over the period 1950-2012. A temperature reconstruction technique is then used to simulate a multitude of possible past and future temperature evolutions, from which the probability of a hot summer is determined for each region, with a hot summer being defined as the historically warmest summer on record in that region. Probabilities with and without external forcing show that hot summers are now about ten times more likely (fraction of attributable risk 0.9) in many regions of the world than they would have been in the absence of past greenhouse gas increases. In conclusion, the adjusted future projections suggest that the Mediterranean, Sahara, large parts of Asia and the Western US and Canada will be among the first regions for which hot summers will become the norm (i.e. occur on average every other year), and that this will occur within the next 1-2 decades.
Predicted deep-sea coral habitat suitability for the U.S. West coast.
Guinotte, John M; Davies, Andrew J
2014-01-01
Regional scale habitat suitability models provide finer scale resolution and more focused predictions of where organisms may occur. Previous modelling approaches have focused primarily on local and/or global scales, while regional scale models have been relatively few. In this study, regional scale predictive habitat models are presented for deep-sea corals for the U.S. West Coast (California, Oregon and Washington). Model results are intended to aid in future research or mapping efforts and to assess potential coral habitat suitability both within and outside existing bottom trawl closures (i.e. Essential Fish Habitat (EFH)) and identify suitable habitat within U.S. National Marine Sanctuaries (NMS). Deep-sea coral habitat suitability was modelled at 500 m×500 m spatial resolution using a range of physical, chemical and environmental variables known or thought to influence the distribution of deep-sea corals. Using a spatial partitioning cross-validation approach, maximum entropy models identified slope, temperature, salinity and depth as important predictors for most deep-sea coral taxa. Large areas of highly suitable deep-sea coral habitat were predicted both within and outside of existing bottom trawl closures and NMS boundaries. Predicted habitat suitability over regional scales is not currently able to identify coral areas with pinpoint accuracy and probably overpredicts actual coral distribution due to model limitations and unincorporated variables (i.e. data on distribution of hard substrate) that are known to limit their distribution. Predicted habitat results should be used in conjunction with multibeam bathymetry, geological mapping and other tools to guide future research efforts to areas with the highest probability of harboring deep-sea corals. Field validation of predicted habitat is needed to quantify model accuracy, particularly in areas that have not been sampled.
NASA Astrophysics Data System (ADS)
Singh, Shailesh Kumar
2014-05-01
Streamflow forecasts are essential for making critical decisions for optimal allocation of water supplies for various demands that include irrigation for agriculture, habitat for fisheries, hydropower production and flood warning. The major objective of this study is to explore the Ensemble Streamflow Prediction (ESP) based forecast in New Zealand catchments and to highlight the present capability of seasonal flow forecasting of the National Institute of Water and Atmospheric Research (NIWA). In this study a probabilistic forecast framework for ESP is presented. The basic assumption in ESP is that future weather patterns have been experienced historically. Hence, past forcing data can be used with the current initial conditions to generate an ensemble of predictions. Small differences in initial conditions can result in large differences in the forecast. The initial state of the catchment is obtained by continuously running the model up to the current time; this initial state is then combined with past forcing data to generate an ensemble of future flows. The approach taken here is to run TopNet hydrological models with a range of past forcing data (precipitation, temperature etc.) with current initial conditions. The collection of runs is called the ensemble. ESP gives probabilistic forecasts of flow. From the ensemble members the probability distributions can be derived. The probability distributions capture part of the intrinsic uncertainty in weather or climate. An ensemble streamflow prediction that provides probabilistic hydrological forecasts with lead times up to 3 months is presented for the Rangitata, Ahuriri, Hooker, and Jollie rivers in the South Island of New Zealand. ESP-based seasonal forecasts have better skill than climatology. This system can provide better overall information for holistic water resource management.
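The ESP loop described above pairs one current initial condition with many historical forcing traces and reads probabilities off the resulting ensemble. The sketch below assumes a generic run_model(initial_state, forcing) stand-in rather than TopNet itself, and uses synthetic forcing data:

```python
import numpy as np

def run_model(initial_state, forcing):
    """Placeholder for a hydrological model run (TopNet in the study);
    returns a seasonal flow total for the given forcing trace."""
    return initial_state + forcing.sum()   # stand-in arithmetic only

# Hypothetical inputs: current catchment state and 30 historical seasons of forcing.
current_state = 100.0
historical_forcings = [np.random.default_rng(i).gamma(2.0, 5.0, size=90) for i in range(30)]

# ESP: pair the single current initial condition with each historical forcing trace.
ensemble = np.array([run_model(current_state, f) for f in historical_forcings])

# Probabilistic forecast: exceedance probability of a threshold read off the ensemble.
threshold = 1050.0   # hypothetical seasonal flow threshold of interest
p_exceed = np.mean(ensemble > threshold)
print(f"P(seasonal flow > {threshold:.0f}) ≈ {p_exceed:.2f}")
```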
Predicted Deep-Sea Coral Habitat Suitability for the U.S. West Coast
Guinotte, John M.; Davies, Andrew J.
2014-01-01
Regional scale habitat suitability models provide finer scale resolution and more focused predictions of where organisms may occur. Previous modelling approaches have focused primarily on local and/or global scales, while regional scale models have been relatively few. In this study, regional scale predictive habitat models are presented for deep-sea corals for the U.S. West Coast (California, Oregon and Washington). Model results are intended to aid in future research or mapping efforts and to assess potential coral habitat suitability both within and outside existing bottom trawl closures (i.e. Essential Fish Habitat (EFH)) and identify suitable habitat within U.S. National Marine Sanctuaries (NMS). Deep-sea coral habitat suitability was modelled at 500 m×500 m spatial resolution using a range of physical, chemical and environmental variables known or thought to influence the distribution of deep-sea corals. Using a spatial partitioning cross-validation approach, maximum entropy models identified slope, temperature, salinity and depth as important predictors for most deep-sea coral taxa. Large areas of highly suitable deep-sea coral habitat were predicted both within and outside of existing bottom trawl closures and NMS boundaries. Predicted habitat suitability over regional scales is not currently able to identify coral areas with pinpoint accuracy and probably overpredicts actual coral distribution due to model limitations and unincorporated variables (i.e. data on distribution of hard substrate) that are known to limit their distribution. Predicted habitat results should be used in conjunction with multibeam bathymetry, geological mapping and other tools to guide future research efforts to areas with the highest probability of harboring deep-sea corals. Field validation of predicted habitat is needed to quantify model accuracy, particularly in areas that have not been sampled. PMID:24759613
van Mantgem, Phillip J.; Caprio, Anthony C.; Stephenson, Nathan L.; Das, Adrian J.
2016-01-01
Prescribed fire is a primary tool used to restore western forests following more than a century of fire exclusion, reducing fire hazard by removing dead and live fuels (small trees and shrubs). It is commonly assumed that the reduced forest density following prescribed fire also reduces competition for resources among the remaining trees, so that the remaining trees are more resistant (more likely to survive) in the face of additional stressors, such as drought. Yet this proposition remains largely untested, so that managers do not have the basic information to evaluate whether prescribed fire may help forests adapt to a future of more frequent and severe drought. During the third year of drought, in 2014, we surveyed 9950 trees in 38 burned and 18 unburned mixed conifer forest plots at low elevation (<2100 m a.s.l.) in Kings Canyon, Sequoia, and Yosemite national parks in California, USA. Fire had occurred in the burned plots from 6 yr to 28 yr before our survey. After accounting for differences in individual tree diameter, common conifer species found in the burned plots had significantly reduced probability of mortality compared to unburned plots during the drought. Stand density (stems ha-1) was significantly lower in burned versus unburned sites, supporting the idea that reduced competition may be responsible for the differential drought mortality response. At the time of writing, we are not sure if burned stands will maintain lower tree mortality probabilities in the face of the continued, severe drought of 2015. Future work should aim to better identify drought response mechanisms and how these may vary across other forest types and regions, particularly in other areas experiencing severe drought in the Sierra Nevada and on the Colorado Plateau.
NASA Astrophysics Data System (ADS)
Mueller, Brigitte; Zhang, Xuebin; Zwiers, Francis W.
2016-04-01
We project that within the next two decades, half of the world’s population will regularly (every second summer on average) experience regional summer mean temperatures that exceed those of the historically hottest summer, even under the moderate RCP4.5 emissions pathway. This frequency threshold for hot temperatures over land, which have adverse effects on human health, society and economy, might be broached in little more than a decade under the RCP8.5 emissions pathway. These hot summer frequency projections are based on adjusted RCP4.5 and 8.5 temperature projections, where the adjustments are performed with scaling factors determined by regularized optimal fingerprinting analyses that compare historical model simulations with observations over the period 1950-2012. A temperature reconstruction technique is then used to simulate a multitude of possible past and future temperature evolutions, from which the probability of a hot summer is determined for each region, with a hot summer being defined as the historically warmest summer on record in that region. Probabilities with and without external forcing show that hot summers are now about ten times more likely (fraction of attributable risk 0.9) in many regions of the world than they would have been in the absence of past greenhouse gas increases. The adjusted future projections suggest that the Mediterranean, Sahara, large parts of Asia and the Western US and Canada will be among the first regions for which hot summers will become the norm (i.e. occur on average every other year), and that this will occur within the next 1-2 decades.
Repeat migration and disappointment.
Grant, E K; Vanderkamp, J
1986-01-01
This article investigates the determinants of repeat migration among the 44 regions of Canada, using information from a large micro-database which spans the period 1968 to 1971. The explanation of repeat migration probabilities is a difficult task, and this attempt is only partly successful. Many of the explanatory variables are not significant, and the overall explanatory power of the equations is not high. In the area of personal characteristics, the variables related to age, sex, and marital status are generally significant and with expected signs. The distance variable has a strongly positive effect on onward move probabilities. Variables related to prior migration experience have an important impact that differs between return and onward probabilities. In particular, the occurrence of prior moves has a striking effect on the probability of onward migration. The variable representing disappointment, or relative success of the initial move, plays a significant role in explaining repeat migration probabilities. The disappointment variable represents the ratio of actual versus expected wage income in the year after the initial move, and its effect on both repeat migration probabilities is always negative and almost always highly significant. The repeat probabilities diminish after a year's stay in the destination region, but disappointment in the most recent year still has a bearing on the delayed repeat probabilities. While the quantitative impact of the disappointment variable is not large, it is difficult to draw comparisons since similar estimates are not available elsewhere.
Enceladus: An Active Cryovolcanic Satellite
NASA Technical Reports Server (NTRS)
Spencer, J. R.; Barr, Amy C.; Esposito, L. W.; Helfenstein, P.; Ingersoll, A. P.; Jaumann, R.; McKay, C. P.; Nimmo, F.; Waite, J. H.
2009-01-01
Enceladus is one of the most remarkable satellites in the solar system, as revealed by Cassini's detection of active plumes erupting from warm fractures near its south pole. This discovery makes Enceladus the only icy satellite known to exhibit ongoing internally driven geological activity. The activity is presumably powered by tidal heating maintained by Enceladus' 2:1 mean-motion resonance with Dione, but many questions remain. For instance, it appears difficult or impossible to maintain the currently observed radiated power (probably at least 6 GW) in steady state. It is also not clear how Enceladus first entered its current self-maintaining warm and dissipative state; initial heating from non-tidal sources is probably required. There are also many unanswered questions about Enceladus' interior. The silicate fraction inferred from its density of 1.68 g per cubic centimeter is probably differentiated into a core, though we have only indirect evidence for differentiation. Above the core there is probably a global or regional water layer, inferred from several models of tidal heating, and an ice shell thick enough to support the 1 kilometer amplitude topography seen on Enceladus. It is possible that dissipation is largely localized beneath the south polar region. Enceladus' surface geology, ranging from moderately cratered terrain to the virtually crater-free active south polar region, is highly diverse, tectonically complex, and remarkably symmetrical about the rotation axis and the direction to Saturn. South polar activity is concentrated along the four tiger stripe fractures, which radiate heat at temperatures up to at least 167 K and are the source of multiple plumes ejecting 200 kilograms per second of H2O vapor along with significant N2 (or C2H4), CO2, CH4, NH3, and higher-mass hydrocarbons. The escaping gas maintains Saturn's neutral gas torus, and the plumes also eject a large number of micron-sized H2O ice grains that populate Saturn's E-ring. The mechanism that powers the plumes is not well understood, and whether liquid water is involved is a subject of active debate (but likely nonetheless). Enceladus provides a promising potential habitat for life in the outer solar system, and the active plumes allow the unique opportunity for direct sampling of that zone. Enceladus is thus a prime target for Cassini's continued exploration of the Saturn system, and will be a tempting target for future missions.
Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2011-10-01
The Canary Islands are an active volcanic region densely populated and visited by several millions of tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 yr, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterize the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies in eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we will use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of few data points from large eruptions, since these data require special methods of analysis. Hence, we will use a statistical method from extreme value theory. In particular, we will apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. This is done in three steps: First, we analyze the historical eruptive series to assess independence and homogeneity of the process. Second, we perform a Weibull analysis of the distribution of repose time between successive eruptions. Third, we analyze the non-homogeneous Poisson process with a generalized Pareto distribution as the intensity function.
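One way to see how the pieces fit together: an eruption-onset rate from the historical record, combined with a heavy-tailed magnitude model, yields the probability of at least one event above a magnitude threshold in a forecast window. The sketch below uses a stationary Poisson rate and hypothetical generalized Pareto parameters, so it is a simplification of the non-homogeneous model described above:

```python
import math

# Hedged sketch, not the paper's fitted model: combine an eruption-onset rate
# with a generalized Pareto tail to get P(>= 1 event above a magnitude threshold).
lam = 20 / 600.0               # ~20 eruptions in 600 yr of written records (from the abstract)
xi, sigma, u = 0.2, 0.8, 1.0   # hypothetical GPD shape, scale and magnitude threshold

def gpd_exceed(m, u, sigma, xi):
    """P(M > m | M > u) under a generalized Pareto tail (xi != 0)."""
    return (1 + xi * (m - u) / sigma) ** (-1 / xi)

T = 20.0   # forecast horizon in years
m = 2.0    # magnitude of interest
rate_above_m = lam * gpd_exceed(m, u, sigma, xi)
p = 1 - math.exp(-rate_above_m * T)
print(f"P(at least one eruption with M > {m} in {T:.0f} yr) ≈ {p:.2f}")
```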
Near-Earth Object Interception Using Nuclear Thermal Rocket Propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
X-L. Zhang; E. Ball; L. Kochmanski
Planetary defense has drawn wide study: despite the low probability of a large-scale impact, its consequences would be disastrous. The study presented here evaluates available protection strategies to identify bottlenecks limiting the scale of near-Earth object that could be deflected, using cutting-edge and near-future technologies. It discusses the use of a nuclear thermal rocket (NTR) as a propulsion device for delivery of thermonuclear payloads to deflect or destroy a long-period comet on a collision course with Earth. A ‘worst plausible scenario’ for the available warning time (10 months) and comet approach trajectory are determined, and empirical data are used to make an estimate of the payload necessary to deflect such a comet. Optimizing the tradeoff between early interception and large deflection payload establishes the ideal trajectory for an interception mission to follow. The study also examines the potential for multiple rocket launch dates. Comparison of propulsion technologies for this mission shows that NTR outperforms other options substantially. The discussion concludes with an estimate of the comet size (5 km) that could be deflected using NTR propulsion, given current launch capabilities.
Deriving photometric redshifts using fuzzy archetypes and self-organizing maps - II. Implementation
NASA Astrophysics Data System (ADS)
Speagle, Joshua S.; Eisenstein, Daniel J.
2017-07-01
With an eye towards the computational requirements of future large-scale surveys such as Euclid and Large Synoptic Survey Telescope (LSST) that will require photometric redshifts (photo-z's) for ≳ 109 objects, we investigate a variety of ways that 'fuzzy archetypes' can be used to improve photometric redshifts and explore their respective statistical interpretations. We characterize their relative performance using an idealized LSST ugrizY and Euclid YJH mock catalogue of 10 000 objects spanning z = 0-6 at Y = 24 mag. We find most schemes are able to robustly identify redshift probability distribution functions that are multimodal and/or poorly constrained. Once these objects are flagged and removed, the results are generally in good agreement with the strict accuracy requirements necessary to meet Euclid weak lensing goals for most redshifts between 0.8 ≲ z ≲ 2. These results demonstrate the statistical robustness and flexibility that can be gained by combining template-fitting and machine-learning methods and provide useful insights into how astronomers can further exploit the colour-redshift relation.
NASA Astrophysics Data System (ADS)
Fisher, J. Richard; Bradley, Richard F.; Brisken, Walter F.; Cotton, William D.; Emerson, Darrel T.; Kerr, Anthony R.; Lacasse, Richard J.; Morgan, Matthew A.; Napier, Peter J.; Norrod, Roger D.; Payne, John M.; Pospieszalski, Marian W.; Symmes, Arthur; Thompson, A. Richard; Webber, John C.
2009-03-01
This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.
Elastic and inelastic neutron scattering cross sections for fission reactor applications
NASA Astrophysics Data System (ADS)
Hicks, S. F.; Chakraborty, A.; Combs, B.; Crider, B. P.; Downes, L.; Girgis, J.; Kersting, L. J.; Kumar, A.; Lueck, C. J.; McDonough, P. J.; McEllistrem, M. T.; Peters, E. E.; Prados-Estevz, F. M.; Schniederjan, J.; Sidwell, L.; Sigillito, A. J.; Vanhoy, J. R.; Watts, D.; Yates, S. W.
2013-04-01
Nuclear data important for the design and development of the next generation of light-water reactors and future fast reactors include neutron elastic and inelastic scattering cross sections on important structural materials, such as Fe, and on coolant materials, such as Na. These reaction probabilities are needed since neutron reactions impact fuel performance during irradiations and the overall efficiency of reactors. While neutron scattering cross sections from these materials are available for certain incident neutron energies, the fast neutron region, particularly above 2 MeV, has large gaps for which no measurements exist, or the existing uncertainties are large. Measurements have been made at the University of Kentucky Accelerator Laboratory to measure neutron scattering cross sections on both Fe and Na in the region where these gaps occur and to reduce the uncertainties on scattering from the ground state and first excited state of these nuclei. Results from measurements on Fe at incident neutron energies between 2 and 4 MeV will be presented and comparisons will be made to model calculations available from data evaluators.
Environmental refugees in a globally warmed world
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, N.
1993-12-01
This paper examines the complex problem of environmental refugees as among the most serious of all the effects of global warming. Shoreline erosion, coastal flooding, and agricultural disruption from drought, soil erosion and desertification are factors now and in the future in creating a group of environmental refugees. Estimates are that at least 10 million such refugees exist today. A preliminary analysis is presented here as a first attempt to understand the full character and extent of the problem. Countries with large delta and coastal areas and large populations are at particular risk from sea-level rise of as little as 0.5-1 meter, compounded by storm surge and salt water intrusions. Bangladesh, Egypt, China, and India are discussed in detail along with island states at risk. Other global warming effects such as shifts in monsoon systems and severe and persistent droughts make agriculture particularly vulnerable. Lack of soil moisture during the growing season will probably be the primary problem. Additional and compounding environmental problems are discussed, and an overview of the economic, sociocultural and political consequences is given. 96 refs., 1 tab.
Application of single-cell sequencing in human cancer.
Rantalainen, Mattias
2017-11-02
Precision medicine is emerging as a cornerstone of future cancer care with the objective of providing targeted therapies based on the molecular phenotype of each individual patient. Traditional bulk-level molecular phenotyping of tumours leads to significant information loss, as the molecular profile represents an average phenotype over large numbers of cells, while cancer is a disease with inherent intra-tumour heterogeneity at the cellular level caused by several factors, including clonal evolution, tissue hierarchies, rare cells and dynamic cell states. Single-cell sequencing provides means to characterize heterogeneity in a large population of cells and opens up opportunity to determine key molecular properties that influence clinical outcomes, including prognosis and probability of treatment response. Single-cell sequencing methods are now reliable enough to be used in many research laboratories, and we are starting to see applications of these technologies for characterization of human primary cancer cells. In this review, we provide an overview of studies that have applied single-cell sequencing to characterize human cancers at the single-cell level, and we discuss some of the current challenges in the field. © The Author 2017. Published by Oxford University Press.
Increasing stress on disaster-risk finance due to large floods
NASA Astrophysics Data System (ADS)
Jongman, Brenden; Hochrainer-Stigler, Stefan; Feyen, Luc; Aerts, Jeroen C. J. H.; Mechler, Reinhard; Botzen, W. J. Wouter; Bouwer, Laurens M.; Pflug, Georg; Rojas, Rodrigo; Ward, Philip J.
2014-04-01
Recent major flood disasters have shown that single extreme events can affect multiple countries simultaneously, which puts high pressure on trans-national risk reduction and risk transfer mechanisms. So far, little is known about such flood hazard interdependencies across regions and the corresponding joint risks at regional to continental scales. Reliable information on correlated loss probabilities is crucial for developing robust insurance schemes and public adaptation funds, and for enhancing our understanding of climate change impacts. Here we show that extreme discharges are strongly correlated across European river basins. We present probabilistic trends in continental flood risk, and demonstrate that observed extreme flood losses could more than double in frequency by 2050 under future climate change and socio-economic development. We suggest that risk management for these increasing losses is largely feasible, and we demonstrate that risk can be shared by expanding risk transfer financing, reduced by investing in flood protection, or absorbed by enhanced solidarity between countries. We conclude that these measures have vastly different efficiency, equity and acceptability implications, which need to be taken into account in broader consultation, for which our analysis provides a basis.
Davies, A R; Ruggles, R; Young, Y; Clark, H; Reddell, P; Verlander, N Q; Arnold, A; Maguire, H
2013-05-01
In September 2009, an outbreak of Salmonella enterica serovar Enteritidis affected 327 of 1419 inmates at a London prison. We applied a cohort design using aggregated data from the kitchen about portions of food distributed, aligned this with individual food histories from 124 cases (18 confirmed, 106 probable) and deduced the exposures of those remaining well. Results showed that prisoners eating egg cress rolls were 26 times more likely to be ill [risk ratio 25.7, 95% confidence interval (CI) 15.5-42.8, P<0.001]. In a case/non-case multivariable analysis the adjusted odds ratio for egg cress rolls was 41.1 (95% CI 10.3-249.7, P<0.001). The epidemiological investigation was strengthened by environmental and microbiological investigations. This paper outlines an approach to investigations in large complex settings where aggregate data for exposures may be available, and led to the development of guidelines for the management of future gastrointestinal outbreaks in prison settings.
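For reference, a cohort risk ratio and its large-sample (Katz) confidence interval can be computed directly from a 2×2 table of exposure by illness. The counts below are illustrative only, chosen to give a ratio of similar size to the one reported; they are not the outbreak data:

```python
import math

# Hypothetical 2x2 layout: ill / well among exposed (ate egg cress rolls)
# and unexposed prisoners; illustrative counts, not the outbreak data.
a, b = 120, 130     # ill, well among exposed
c, d = 22, 1147     # ill, well among unexposed

rr = (a / (a + b)) / (c / (c + d))

# Large-sample (Katz) log-scale confidence interval for a risk ratio.
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```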
CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy
NASA Astrophysics Data System (ADS)
Ball, Nicholas M.
2013-01-01
The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil their potential as a provider of the next generation of science results. To-date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to, a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as Large Synoptic Survey Telescope. The CANFAR+Skytree system is open for use by any interested member of the astronomical community.
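The PDF construction described above, 100 perturbed instances of the photometry per galaxy, can be sketched in a few lines. The point estimator below is a stand-in (the actual system uses machine-learning regression trained on survey data), and the magnitudes and errors are hypothetical:

```python
import numpy as np

def estimate_z(mags):
    """Placeholder point photo-z estimator; a stand-in linear combination of
    colours, not the machine-learning model used in the CANFAR+Skytree system."""
    return max(0.0, 0.4 * (mags[0] - mags[1]) + 0.2 * (mags[1] - mags[2]))

rng = np.random.default_rng(0)
mags = np.array([23.1, 22.6, 22.3])      # hypothetical u, g, r magnitudes
mag_errs = np.array([0.10, 0.05, 0.04])  # hypothetical photometric errors

# Perturb the photometry within its errors 100 times, as in the catalogue
# construction described above, and collect the resulting redshift estimates.
draws = [estimate_z(rng.normal(mags, mag_errs)) for _ in range(100)]

# The empirical distribution of the 100 instances serves as a nonparametric PDF.
pdf, edges = np.histogram(draws, bins=20, density=True)
print(f"median z ≈ {np.median(draws):.2f}")
```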
Martin, Petra; Leighl, Natasha B
2017-06-01
This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice.
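Pretest probability enters test selection through the usual odds form of Bayes' theorem: post-test odds are pretest odds multiplied by the likelihood ratio of the test result. A minimal sketch with hypothetical numbers (the pretest probability and likelihood ratio below are illustrative, not values from the article):

```python
def posttest_probability(pretest_p, likelihood_ratio):
    """Convert a pretest probability to a post-test probability via odds."""
    pre_odds = pretest_p / (1 - pretest_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Illustrative only: a patient subgroup with a high pretest probability of an
# EGFR mutation and a hypothetical positive-test likelihood ratio.
print(posttest_probability(pretest_p=0.45, likelihood_ratio=8.0))  # ≈ 0.87
```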
Lightning Strike Peak Current Probabilities as Related to Space Shuttle Operations
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Vaughan, William W.
2000-01-01
A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates on the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reenter/land, and return-to-launch site. A literature search was conducted for previous work concerning occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.
Current and Future Impact Risks from Small Debris to Operational Satellites
NASA Technical Reports Server (NTRS)
Liou, Jer-Chyi; Kessler, Don
2011-01-01
The collision between Iridium 33 and Cosmos 2251 in 2009 signaled the potential onset of the collision cascade effect, commonly known as the "Kessler Syndrome", in the low Earth orbit (LEO) region. Recent numerical simulations have shown that the 10 cm and larger debris population in LEO will continue to increase even with a good implementation of the commonly-adopted mitigation measures. This increase is driven by collisions involving large and massive intacts, i.e., rocket bodies and spacecraft. Therefore, active debris removal (ADR) of large and massive intacts with high collision probabilities has been argued as a direct and effective means to remediate the environment in LEO. The major risk for operational satellites in the environment, however, comes from impacts with debris just above the threshold of the protection shields. In general, these are debris in the millimeter to centimeter size regime. Although impacts by these objects are insufficient to lead to catastrophic breakup of the entire vehicle, the damage is certainly severe enough to cause critical failure of the key instruments or the entire payload. The focus of this paper is to estimate the impact risks from 5 mm and 1 cm debris to active payloads in LEO (1) in the current environment and (2) in the future environment based on different projection scenarios, including ADR. The goal of the study is to quantify the benefits of ADR in reducing debris impact risks to operational satellites.
Foditsch, Carla; Oikonomou, Georgios; Machado, Vinícius Silva; Bicalho, Marcela Luccas; Ganda, Erika Korzune; Lima, Svetlana Ferreira; Rossi, Rodolfo; Ribeiro, Bruno Leonardo; Kussler, Arieli; Bicalho, Rodrigo Carvalho
2016-01-01
The main objectives of this prospective cohort study were a) to describe lameness prevalence at drying off in large high producing New York State herds based on visual locomotion score (VLS) and identify potential cow and herd level risk factors, and b) to develop a model that will predict the probability of a cow developing claw horn disruption lesions (CHDL) in the subsequent lactation using cow level variables collected at drying off and/or available from farm management software. Data were collected from 23 large commercial dairy farms located in upstate New York. A total of 7,687 dry cows, that were less than 265 days in gestation, were enrolled in the study. Farms were visited between May 2012 and March 2013, and cows were assessed for body condition score (BCS) and VLS. Data on the CHDL events recorded by the farm employees were extracted from the Dairy-Comp 305 database, as well as information regarding the studied cows’ health events, milk production, and reproductive records throughout the previous and subsequent lactation period. Univariable analyses and mixed multivariable logistic regression models were used to analyse the data at the cow level. The overall average prevalence of lameness (VLS > 2) at drying off was 14%. Lactation group, previous CHDL, mature equivalent 305-d milk yield (ME305), season, BCS at drying off and sire PTA for strength were all significantly associated with lameness at the drying off (cow-level). Lameness at drying off was associated with CHDL incidence in the subsequent lactation, as well as lactation group, previous CHDL and ME305. These risk factors for CHDL in the subsequent lactation were included in our predictive model and adjusted predicted probabilities for CHDL were calculated for all studied cows. ROC analysis identified an optimum cut-off point for these probabilities and using this cut-off point we could predict CHDL incidence in the subsequent lactation with an overall specificity of 75% and sensitivity of 59%. Using this approach, we would have detected 33% of the studied population as being at risk, eventually identifying 59% of future CHDL cases. Our predictive model could help dairy producers focusing their efforts on CHDL reduction by implementing aggressive preventive measures for high risk cows. PMID:26795970
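The reported sensitivity and specificity follow from applying a single probability cut-off, chosen by ROC analysis, to each cow's predicted CHDL risk. A small sketch with hypothetical predicted probabilities and outcomes:

```python
import numpy as np

def sens_spec_at_cutoff(probabilities, outcomes, cutoff):
    """Sensitivity and specificity of a predicted-probability cut-off."""
    pred = probabilities >= cutoff
    tp = np.sum(pred & (outcomes == 1))
    fn = np.sum(~pred & (outcomes == 1))
    tn = np.sum(~pred & (outcomes == 0))
    fp = np.sum(pred & (outcomes == 0))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical predicted CHDL probabilities and observed outcomes for 10 cows.
p = np.array([0.05, 0.12, 0.30, 0.45, 0.60, 0.22, 0.08, 0.75, 0.40, 0.18])
y = np.array([0,    0,    1,    1,    1,    0,    0,    1,    0,    0])
print(sens_spec_at_cutoff(p, y, cutoff=0.35))   # (sensitivity, specificity)
```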
Megacity Megaquakes: Two Near-misses, and the Clues they Leave for Earthquake Interaction
NASA Astrophysics Data System (ADS)
Shapiro, S. A.; Krüger, O.; Dinske, C.; Langenbruch, C.
2011-12-01
Two recent earthquakes left their mark on cities lying well beyond the mainshock rupture zones, raising questions of their future vulnerability, and about earthquake interaction broadly. The 27 February 2010 M=8.8 Maule earthquake struck the Chilean coast, killing 550 people. Chile's capital of Santiago lies 400 km from the high-slip portion of the rupture, and 100 km beyond its edge. The 11 March 2011 M=9.0 Tohoku oki earthquake struck the coast of Japan, its massive tsunami claiming most of its 18,564 victims. Reminiscent of Santiago, Japan's capital of Tokyo lies 400 km from the high-slip portion of the rupture, and 100 km beyond its edge. Because of this distance, both cities largely escaped damage. But it may not have been a clean get-away: The rate of small shocks beneath each city jumped by a factor of about 10 immediately after its megaquake. At Santiago, the quake rate remains two times higher today than it was before the Maule shock; at Tokyo it is three times higher. What this higher rate of moderate (M<6) quakes portends for the likelihood of large ones is difficult--but imperative--to answer, as Tokyo and Santiago are probably just the most striking cases of a common phenomenon: Seismicity increases well beyond the rupture zone, as also seen in the 1999 Izmit-Düzce and 2010 Darfield-Christchurch sequences. Are the Tokyo and Santiago earthquakes, 100 km from the fault rupture, aftershocks? The seismicity beneath Santiago is occurring on the adjacent unruptured section of the Chile-Peru trench megathrust, whereas shocks beneath Tokyo illuminate a deeper, separate fault system. In both cases, the rate of shocks underwent an Omori decay, although the decay ceased beneath Tokyo about a year after the mainshock. Coulomb calculations suggest that the stress imparted by the nearby megaquakes brought the faults beneath Santiago and Tokyo closer to failure (Lorito et al., Nature Geoscience 2010; Toda and Stein, GRL 2013). So, they are aftershocks in the sense that they are contingent on the mainshock, and underwent at least an initial decay. But aftershocks do not necessarily signal a heightened likelihood of large shocks. They could instead accompany post-seismic creep, with the creep shedding the stress imposed by the megaquakes. These aftershocks are too deep for GPS observations to reveal unequivocally whether the faults are locked or creeping. But one clue is that the ratio of small to large shocks was not changed by the megaquakes. This distribution could be a reliable pointer for the probability of larger quakes, and so large shocks may now indeed be more probable than before the megaquakes--by a factor of at least two.
Megacity Megaquakes: Two Near-misses, and the Clues they Leave for Earthquake Interaction
NASA Astrophysics Data System (ADS)
Stein, R. S.; Toda, S.
2013-12-01
Two recent earthquakes left their mark on cities lying well beyond the mainshock rupture zones, raising questions of their future vulnerability, and about earthquake interaction broadly. The 27 February 2010 M=8.8 Maule earthquake struck the Chilean coast, killing 550 people. Chile's capital of Santiago lies 400 km from the high-slip portion of the rupture, and 100 km beyond its edge. The 11 March 2011 M=9.0 Tohoku oki earthquake struck the coast of Japan, its massive tsunami claiming most of its 18,564 victims. Reminiscent of Santiago, Japan's capital of Tokyo lies 400 km from the high-slip portion of the rupture, and 100 km beyond its edge. Because of this distance, both cities largely escaped damage. But it may not have been a clean get-away: The rate of small shocks beneath each city jumped by a factor of about 10 immediately after its megaquake. At Santiago, the quake rate remains two times higher today than it was before the Maule shock; at Tokyo it is three times higher. What this higher rate of moderate (M<6) quakes portends for the likelihood of large ones is difficult--but imperative--to answer, as Tokyo and Santiago are probably just the most striking cases of a common phenomenon: Seismicity increases well beyond the rupture zone, as also seen in the 1999 Izmit-Düzce and 2010 Darfield-Christchurch sequences. Are the Tokyo and Santiago earthquakes, 100 km from the fault rupture, aftershocks? The seismicity beneath Santiago is occurring on the adjacent unruptured section of the Chile-Peru trench megathrust, whereas shocks beneath Tokyo illuminate a deeper, separate fault system. In both cases, the rate of shocks underwent an Omori decay, although the decay ceased beneath Tokyo about a year after the mainshock. Coulomb calculations suggest that the stress imparted by the nearby megaquakes brought the faults beneath Santiago and Tokyo closer to failure (Lorito et al., Nature Geoscience 2010; Toda and Stein, GRL 2013). So, they are aftershocks in the sense that they are contingent on the mainshock, and underwent at least an initial decay. But aftershocks do not necessarily signal a heightened likelihood of large shocks. They could instead accompany post-seismic creep, with the creep shedding the stress imposed by the megaquakes. These aftershocks are too deep for GPS observations to reveal unequivocally whether the faults are locked or creeping. But one clue is that the ratio of small to large shocks was not changed by the megaquakes. This distribution could be a reliable pointer for the probability of larger quakes, and so large shocks may now indeed be more probable than before the megaquakes--by a factor of at least two.
Sample size requirements for the design of reliability studies: precision consideration.
Shieh, Gwowen
2014-09-01
In multilevel modeling, the intraclass correlation coefficient based on the one-way random-effects model is routinely employed to measure the reliability or degree of resemblance among group members. To facilitate the advocated practice of reporting confidence intervals in future reliability studies, this article presents exact sample size procedures for precise interval estimation of the intraclass correlation coefficient under various allocation and cost structures. Although the suggested approaches do not admit explicit sample size formulas and require special algorithms for carrying out iterative computations, they are more accurate than the closed-form formulas constructed from large-sample approximations with respect to the expected width and assurance probability criteria. This investigation notes the deficiency of existing methods and expands the sample size methodology for the design of reliability studies that have not previously been discussed in the literature.
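For the balanced one-way random-effects design, the exact confidence interval that the expected-width and assurance criteria refer to can be computed from F quantiles. A sketch assuming a balanced design and SciPy; the sample-size search itself (iterating over candidate group counts until the expected width is acceptable) is omitted:

```python
from scipy.stats import f as f_dist

def icc_confidence_interval(f_obs, n_groups, k_per_group, alpha=0.05):
    """Exact CI for the one-way random-effects ICC from the observed
    between/within mean-square ratio F = MSB/MSW (balanced design)."""
    df1, df2 = n_groups - 1, n_groups * (k_per_group - 1)
    f_lower = f_obs / f_dist.ppf(1 - alpha / 2, df1, df2)
    f_upper = f_obs * f_dist.ppf(1 - alpha / 2, df2, df1)
    lo = (f_lower - 1) / (f_lower + k_per_group - 1)
    hi = (f_upper - 1) / (f_upper + k_per_group - 1)
    return max(lo, 0.0), min(hi, 1.0)

# Hypothetical design: 30 groups of 4 raters each, observed F ratio of 6.
print(icc_confidence_interval(f_obs=6.0, n_groups=30, k_per_group=4))
```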
Wan, Zhaofei; Liu, Xiaojun; Wang, Xinhong; Liu, Fuqiang; Liu, Weimin; Wu, Yue; Pei, Leilei; Yuan, Zuyi
2014-04-01
Arterial elasticity has been shown to predict cardiovascular disease (CVD) in apparently healthy populations. The present study aimed to explore whether arterial elasticity could predict CVD events in Chinese patients with angiographic coronary artery disease (CAD). Arterial elasticity of 365 patients with angiographic CAD was measured. During follow-up (48 months; range 6-65), 140 CVD events occurred (including 34 deaths). Univariate Cox analysis demonstrated that both large arterial elasticity and small arterial elasticity were significant predictors of CVD events. Multivariate Cox analysis indicated that small arterial elasticity remained significant. Kaplan-Meier analysis showed that the probability of having a CVD event/CVD death increased with a decrease of small arterial elasticity (P < .001, respectively). Decreased small arterial elasticity independently predicts the risk of CVD events in Chinese patients with angiographic CAD.
Theoretical comparison of maser materials for a 32-GHz maser amplifier
NASA Technical Reports Server (NTRS)
Lyons, James R.
1988-01-01
The computational results of a comparison of maser materials for a 32 GHz maser amplifier are presented. The search for a better maser material is prompted by the relatively large amount of pump power required to sustain a population inversion in ruby at frequencies on the order of 30 GHz and above. The general requirements of a maser material and the specific problems with ruby are outlined. The spin Hamiltonian is used to calculate energy levels and transition probabilities for ruby and twelve other materials. A table is compiled of several attractive operating points for each of the materials analyzed. All the materials analyzed possess operating points that could be superior to ruby. To complete the evaluation of the materials, measurements of inversion ratio and pump power requirements must be made in the future.
Sun to breathe new life into old reservoir. [Texas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bleakley, W.B.
1981-04-01
The McCleskey sand reservoir, Ranger field, Eastland County, Texas, approx. 40 miles east of Abilene, has had a short, frustrating early life and a disappointing maturity, but promises an exciting future. Sun Production Co. feels that application of new technologies in reservoir management, production techniques, and geologic interpretation will pave the way to recovery of a large portion of the 41 API oil remaining in place. Estimates indicate that the early productive life of the field, which yielded approx. 27 million bbl, accounted for as little as 15% of the original oil in place, and probably not more than 20%. A pilot waterflood will get underway soon, and careful monitoring of that project should provide answers to remaining questions. All signs are favorable, and Sun is optimistic about final results.
Automatic spacecraft detumbling by internal mass motion
NASA Technical Reports Server (NTRS)
Edwards, T. L.; Kaplan, M. H.
1974-01-01
In the operation of future manned space vehicles, there will always be a finite probability that an accident will occur which results in uncontrolled tumbling of a craft. Hard docking by a manned rescue vehicle is not acceptable because of the hazardous environment to which rescue crewmen would be exposed and excessive maneuvering accelerations during docking operations. A movable-mass control concept, which is activated upon initiation of tumbling and is autonomous, can convert tumbling motion into simple spin. The complete equations of motion for an asymmetric rigid spacecraft containing a movable mass are presented, and appropriate control law and system parameters are selected to minimize kinetic energy, resulting in simple spin about the major principal axis. Simulations indicate that for a large space station experiencing a collision, which results in tumbling, a 1% movable mass is capable of stabilizing motion in 2 hr.
Development of future indications for BOTOX.
Brin, Mitchell F
2009-10-01
Since the late 1970s, local injections of BoNT have provided clinical benefit for patients with inappropriately contracting muscles with or without pain or sensory disturbance. Marketing authorization for some BoNTs, depending on country, includes core indications of dystonia (blepharospasm and cervical dystonia), large muscle spastic disorders (not yet approved in the United States, e.g., adult post-stroke spasticity and equinus foot deformity), hyperhidrosis, and aesthetic use. Subsequent development has extended to selected conditions characterized by recurrent or chronic pain (migraine headache), and urologic indications (neurogenic/idiopathic overactive bladder; prostate hyperplasia), with multiple additional opportunities available. Portfolio management requires a careful individual opportunity assessment of scientific and technical aspects (basic science foundation, potential to treat unmet medical need, product-specific risk in specific populations, therapeutic margin/safety profile, and probability of successful registration pathway). This article describes ongoing development targets for BOTOX.
Temperature and tree growth [editorial]
Michael G. Ryan
2010-01-01
Tree growth helps US forests take up 12% of the fossil fuels emitted in the USA (Woodbury et al. 2007), so predicting tree growth for future climates matters. Predicting future climates themselves is uncertain, but climate scientists probably have the most confidence in predictions for temperature. Temperatures are projected to rise by 0.2 °C in the next two decades,...
20 Years of Research into Violence and Trauma: Past and Future Developments
ERIC Educational Resources Information Center
Kamphuis, Jan H.; Emmelkamp, Paul M. G.
2005-01-01
This reflection on major developments in the past, present, and future of the wider field of violence and trauma is a personal (and probably biased) sampling of what the authors hold to be important. The authors reviewed advances for victims and perpetrators of violence separately. For victims, the authors note that empirical research has…
Coalescence computations for large samples drawn from populations of time-varying sizes
Polanski, Andrzej; Szczesna, Agnieszka; Garbulowski, Mateusz; Kimmel, Marek
2017-01-01
We present new results concerning probability distributions of times in the coalescence tree and expected allele frequencies for the coalescent with large sample size. The obtained results are based on computational methodologies, which involve combining coalescence time scale changes with techniques of integral transformations and using analytical formulae for infinite products. We show applications of the proposed methodologies for computing probability distributions of times in the coalescence tree and their limits, for evaluation of accuracy of approximate expressions for times in the coalescence tree and expected allele frequencies, and for analysis of a large human mitochondrial DNA dataset. PMID:28170404
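For orientation, here is a minimal sketch of the textbook constant-population-size Kingman coalescent quantities on which such computations build (this is not the paper's variable-size, large-sample methodology): while k lineages remain, the waiting time is exponential with mean 2/(k(k-1)) in coalescent units, so the expected time to the most recent common ancestor is 2(1 - 1/n).

```python
import numpy as np

def expected_coalescent_times(n):
    """Expected durations E[T_k] while k lineages remain, for a constant-size
    Kingman coalescent, in coalescent units: E[T_k] = 2 / (k(k-1))."""
    k = np.arange(2, n + 1)
    return 2.0 / (k * (k - 1))

def expected_tmrca(n):
    """Expected time to the most recent common ancestor: 2 * (1 - 1/n)."""
    return expected_coalescent_times(n).sum()

if __name__ == "__main__":
    for n in (10, 100, 10_000):
        print(f"n = {n:>6}: E[T_MRCA] = {expected_tmrca(n):.4f} (limit = 2)")
```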
Performance evaluation of an importance sampling technique in a Jackson network
NASA Astrophysics Data System (ADS)
Mahdipour, Ebrahim; Masoud Rahmani, Amir; Setayeshi, Saeed
2014-03-01
Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. This article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of missing the deadline of customers for different loads and deadlines. We have finally shown that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
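To illustrate the basic mechanics of importance sampling for a rare queueing event, here is a minimal sketch for a single M/M/1-type queue rather than the modulated two-node network studied in the article: the overflow probability of the embedded random walk is estimated under the classical change of measure that swaps the up- and down-step probabilities, with the constant likelihood ratio restoring unbiasedness. All rates and the overflow level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

lam, mu, B = 0.3, 1.0, 15          # illustrative arrival rate, service rate, overflow level
p = lam / (lam + mu)               # embedded random walk: probability of an up step
q = 1.0 - p                        # probability of a down step

def exact_overflow(p, q, B):
    """Gambler's-ruin probability of reaching B before 0, starting from level 1."""
    r = q / p
    return (r - 1.0) / (r ** B - 1.0)

def simulate(prob_up, n_paths):
    """Indicator, per path, of reaching B before 0 when starting from level 1."""
    hits = np.zeros(n_paths, dtype=bool)
    for i in range(n_paths):
        x = 1
        while 0 < x < B:
            x += 1 if rng.random() < prob_up else -1
        hits[i] = (x == B)
    return hits

n = 20_000
naive_est = simulate(p, n).mean()            # plain Monte Carlo: almost always 0
tilted_hits = simulate(q, n)                 # tilted measure: up/down probabilities swapped
likelihood_ratio = (p / q) ** (B - 1)        # constant on every path that reaches B
is_est = likelihood_ratio * tilted_hits.mean()

print(f"exact       = {exact_overflow(p, q, B):.3e}")
print(f"naive MC    = {naive_est:.3e}")
print(f"importance  = {is_est:.3e}")
```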
Sensitivity of feedforward neural networks to weight errors
NASA Technical Reports Server (NTRS)
Stevenson, Maryhelen; Widrow, Bernard; Winter, Rodney
1990-01-01
An analysis is made of the sensitivity of feedforward layered networks of Adaline elements (threshold logic units) to weight errors. An approximation is derived which expresses the probability of error for an output neuron of a large network (a network with many neurons per layer) as a function of the percentage change in the weights. As would be expected, the probability of error increases with the number of layers in the network and with the percentage change in the weights. The probability of error is essentially independent of the number of weights per neuron and of the number of neurons per layer, as long as these numbers are large (on the order of 100 or more).
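As a hedged numerical companion to this result, the sketch below estimates, by direct Monte Carlo rather than the paper's analytic approximation, the probability that a single threshold unit changes its decision when every weight is perturbed by a given relative error; the bipolar input statistics and the perturbation model are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def flip_probability(n_inputs, rel_error, n_trials=5_000):
    """Monte Carlo estimate of the probability that a single threshold unit
    (sign of a weighted sum of bipolar inputs) changes its decision when every
    weight is perturbed by +/- rel_error of its own magnitude."""
    flips = 0
    for _ in range(n_trials):
        x = rng.choice([-1.0, 1.0], size=n_inputs)             # bipolar inputs
        w = rng.normal(size=n_inputs)                           # nominal weights
        dw = rel_error * np.abs(w) * rng.choice([-1.0, 1.0], size=n_inputs)
        flips += np.sign(x @ w) != np.sign(x @ (w + dw))
    return flips / n_trials

for n_inputs in (10, 100, 1000):
    probs = [flip_probability(n_inputs, e) for e in (0.05, 0.20, 0.50)]
    print(n_inputs, [f"{p:.3f}" for p in probs])
```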
Wavefront sensing, control, and pointing
NASA Technical Reports Server (NTRS)
Pitts, Thomas; Sevaston, George; Agronin, Michael; Bely, Pierre; Colavita, Mark; Clampin, Mark; Harvey, James; Idell, Paul; Sandler, Dave; Ulmer, Melville
1992-01-01
A majority of future NASA astrophysics missions from orbiting interferometers to 16-m telescopes on the Moon have, as a common requirement, the need to bring light from a large entrance aperture to the focal plane in a way that preserves the spatial coherence properties of the starlight. Only by preserving the phase of the incoming wavefront, can many scientific observations be made, observations that range from measuring the red shift of quasi-stellar objects (QSO's) to detecting the IR emission of a planet in orbit around another star. New technologies for wavefront sensing, control, and pointing hold the key to advancing our observatories of the future from those already launched or currently under development. As the size of the optical system increases, either to increase the sensitivity or angular resolution of the instrument, traditional technologies for maintaining optical wavefront accuracy become prohibitively expensive or completely impractical. For space-based instruments, the low mass requirement and the large temperature excursions further challenge existing technologies. The Hubble Space Telescope (HST) is probably the last large space telescope to rely on passive means to keep its primary optics stable and the optical system aligned. One needs only look to the significant developments in wavefront sensing, control, and pointing that have occurred over the past several years to appreciate the potential of this technology for transforming the capability of future space observatories. Future developments in space-borne telescopes will be based in part on developments in ground-based systems. Telescopes with rigid primary mirrors much larger than 5 m in diameter are impractical because of gravity loading. New technologies are now being introduced, such as active optics, that address the scale problem and that allow very large telescopes to be built. One approach is a segmented design such as that being pioneered by the W.M. Keck telescope now under construction at the Mauna Kea Observatory. It consists of 36 hexagonal mirror segments, supported on a framework structure, which are positioned by actuators located between the structure and the mirrors. The figure of the telescope is initialized by making observations of a bright star using a Shack Hartmann sensor integrated with a white light interferometer. Then, using sensed data from the mirror edges to control these actuators, the figure of the mosaic of 36 segments is maintained as if it were a rigid primary mirror. Another active optics approach is the use of a thin meniscus mirror with actuators. This technique was demonstrated on the European Southern Observatory's New Technology Telescope (NTT) and is planned for use in the Very Large Telescope (consists of four 8-m apertures), which is now entering the design phase.
Spatial organization of foreshocks as a tool to forecast large earthquakes.
Lippiello, E; Marzocchi, W; de Arcangelis, L; Godano, C
2012-01-01
An increase in the number of smaller magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from the one induced by aftershocks and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the last 19 years of the Southern California catalog shows that the daily occurrence probability presents isolated peaks closely located in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04 × 0.04 deg²), with significant probability gains with respect to standard models.
Spatial organization of foreshocks as a tool to forecast large earthquakes
Lippiello, E.; Marzocchi, W.; de Arcangelis, L.; Godano, C.
2012-01-01
An increase in the number of smaller magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from the one induced by aftershocks and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the last 19 years of the Southern California catalog shows that the daily occurrence probability presents isolated peaks closely located in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04 × 0.04 deg²), with significant probability gains with respect to standard models. PMID:23152938
Evaluation of a seismic quiescence pattern in southeastern sicily
NASA Astrophysics Data System (ADS)
Mulargia, F.; Broccio, F.; Achilli, V.; Baldi, P.
1985-07-01
Southeastern Sicily experienced a very peculiar seismic activity in historic times, with a long series of ruinous earthquakes. The last large event, with magnitude probably in excess of 7.5, occurred on Jan. 11, 1693, totally destroying the city of Catania and killing 60,000 people. Only a few moderate events have been reported since then, and a seismic gap issue has been proposed on this basis. A close scrutiny of the available data further shows that all significant seismic activity ceased after year 1850, suggesting one of the largest quiescence patterns ever encountered. This is examined together with the complex tectonic setting of the region, characterized by a wrenching mechanism with most significant seismicity located in its northern graben structure. An attempt to ascertain the imminence and the size of a future earthquake through commonly accepted empirical relations based on size and duration of the quiescence pattern did not provide any feasible result. A precision levelling survey which we recently completed yielded a relative subsidence of ~ 3 mm/yr, consistent with an aseismic slip on the northern graben structure at a rate of ~ 15 mm/yr. Comparing these results with sedimentological and tidal data suggests that the area is undergoing an accelerated deformation process; this issue is further supported by Rikitake's ultimate strain statistics. Although the imminence of a damaging ( M = 5.4) event is strongly favoured by Weibull statistics applied to the time series of occurrence of large events, the accumulated strain does not appear sufficient for a large earthquake ( M ≳ 7.0). Within the limits of reliability of present semi-empirical approaches we conclude that the available evidence is consistent with the occurrence of a moderate-to-large ( M ≅ 6.0) event in the near future. Several questions regarding the application of simple models to real (and complex) tectonic settings remain nevertheless unanswered.
NASA Astrophysics Data System (ADS)
Fu, R.; Fernando, D. N.; YANG, Z.; Solis, R.
2013-12-01
'Flash' droughts refer to those droughts that intensify rapidly in spring and summer, coupled with a strong increase of summer extreme temperatures, such as those that occurred over Texas in 2011 and the Great Plains in 2012. These droughts represent a great threat to North American water security. Climate models have failed to predict these 'flash' droughts and are ambiguous in projecting their future changes largely because of models' weaknesses in predicting summer rainfall and soil moisture feedbacks. By contrast, climate models are more reliable in simulating changes of large-scale circulation and warming of temperatures during the winter and spring seasons. We present a prototype of an early warning indicator for the risk of 'flash' droughts in summer by using the large-scale circulation and land surface conditions in winter and spring based on observed relationships between these conditions and their underlying physical mechanisms established by previous observations and numerical model simulations. This prototype 'flash' drought indicator (IFDW) currently uses global and regional reanalysis products (e.g., CFSR, MERRA, NLDAS products) in winter and spring to provide an assessment of summer drought severity similar to drought severity indices like PDSI (Palmer Drought Severity Index), SPI (Standardized Precipitation Index), etc., provided by the National Integrated Drought Information System (NIDIS), with additional information about uncertainty and past probability distributions of IFDW. Preliminary evaluation of hindcasts suggests that the indicator captures the occurrences of all the regional severe to extreme summer droughts during the past 63 years (1949-2011) over the US Great Plains, and 95% of the drought endings. This prototype IFDW has several advantages over the available drought indices that simply track local drought conditions in the past, present and future: 1) It mitigates the weakness of current climate models in predicting future summer droughts and takes advantage of model strengths and our understanding of the mechanisms that control 'flash' droughts; 2) It provides actionable drought risk information for stakeholders before droughts become fully developed in the current climate; 3) It can potentially link the future increase of temperatures in winter and spring to the risk of 'flash' droughts in summer. Such a link would make the projected changes of the 'flash' droughts more intuitive and compelling to high-level decision makers and the public.
NASA Astrophysics Data System (ADS)
Shkolnik, Igor; Pavlova, Tatiana; Efimov, Sergey; Zhuravlev, Sergey
2018-01-01
A climate change simulation based on a 30-member ensemble of the Voeikov Main Geophysical Observatory RCM (resolution 25 km) for northern Eurasia is used to drive the hydrological model CaMa-Flood. Using this modeling framework, we evaluate the uncertainties in the future projection of the peak river discharge and flood hazard by 2050-2059 relative to 1990-1999 under the IPCC RCP8.5 scenario. Large ensemble size, along with reasonably high modeling resolution, allows one to efficiently sample natural climate variability and increase our ability to predict future changes in the hydrological extremes. It has been shown that the annual maximum river discharge can almost double by the mid-21st century in the outlets of major Siberian rivers. In the western regions, there is a weak signal in the river discharge and flood hazard, hardly discernible above climate variability. Annual maximum flood area is projected to increase across Siberia mostly by 2-5% relative to the baseline period. A contribution of natural climate variability at different temporal scales to the uncertainty of ensemble prediction is discussed. The analysis shows that considerable changes are expected in the extreme river discharge probability at the locations of key hydropower facilities. This suggests that extensive impact studies are required to develop recommendations for maintaining regional energy security.
NASA Astrophysics Data System (ADS)
Ji, J.
2014-07-01
Primitive asteroids are remnant building blocks of Solar System formation. They provide key clues to an in-depth understanding of the process of planetary formation, the complex environment of the early Solar nebula, and even the occurrence of life on the Earth. On 13 December 2012, Chang'e-2 completed a successful flyby of the near-Earth asteroid (4179) Toutatis at a closest distance of 770 meters from the asteroid's surface. The observations show that Toutatis has an irregular surface and its shape resembles a ginger-root with a smaller lobe (head) and a larger lobe (body). Such a bifurcated configuration is indicative of a contact binary origin for Toutatis. In addition, the images with a 3-m resolution or higher provide a number of new discoveries about this asteroid, such as an 800-meter basin at the end of the large lobe, a sharply perpendicular silhouette near the neck region, and direct evidence of boulders and regolith, indicating that Toutatis is probably a rubble-pile asteroid. The Chang'e-2 observations have provided significant new insights into the geological features and the formation and evolution of this asteroid. Moreover, a conceptual introduction to future Chinese missions to asteroids, such as the major scientific objectives, scientific payloads, and potential targets, will be briefly given. The proposed mission would benefit greatly from potential international collaboration in the future.
Durand, Casey P; Tang, Xiaohui; Gabriel, Kelley P; Sener, Ipek N; Oluyomi, Abiodun O; Knell, Gregory; Porter, Anna K; Hoelscher, Deanna M; Kohl, Harold W
2016-06-01
Use of public transit is cited as a way to help individuals incorporate regular physical activity into their day. As a novel research topic, however, there is much we do not know. The aim of this analysis was to identify the correlation between distance to a transit stop and the probability it will be accessed by walking. We also sought to understand if this relation was moderated by trip, personal or household factors. Data from the 2012 California Household Travel Survey was used for this cross-sectional analysis. 2,573 individuals were included, representing 6,949 transit trips. Generalized estimating equations modeled the probability of actively accessing public transit as a function of distance from origin to transit stop, and multiple trip, personal and household variables. Analyses were conducted in 2014 and 2015. For each mile increase in distance from the point of origin to the transit stop, the probability of active access decreased by 12%. With other factors held equal, at two miles from a transit stop there is a 50% chance someone will walk to a stop versus non-active means. The distance-walking relation was modified by month the trips were taken. Individuals appear to be willing to walk further to reach transit than existing guidelines indicate. This implies that for any given transit stop, the zone of potential riders who will walk to reach transit is relatively large. Future research should clarify who transit-related walkers are, and why some are more willing to walk longer distances to transit than others.
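As a purely illustrative companion to these two summary numbers, the sketch below treats the reported 12% per-mile decrease as a multiplicative change in the odds of walking and anchors the curve at the reported 50% probability at two miles; both the functional form and the anchoring are assumptions made for illustration, not the paper's fitted GEE model.

```python
import numpy as np

# Hypothetical illustration only: interpret the reported "12% decrease per mile"
# as a multiplicative decrease in the odds of walking, anchored at the reported
# 50% probability of walking at 2 miles.
OR_PER_MILE = 0.88
P_AT_2_MILES = 0.50

def p_walk(distance_miles):
    odds_at_2 = P_AT_2_MILES / (1.0 - P_AT_2_MILES)
    odds = odds_at_2 * OR_PER_MILE ** (distance_miles - 2.0)
    return odds / (1.0 + odds)

for d in np.arange(0.0, 4.5, 0.5):
    print(f"{d:3.1f} mi -> P(walk to transit) ≈ {p_walk(d):.2f}")
```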
The abundance of biotic exoplanets and life on planets of Red Dwarf stars
NASA Astrophysics Data System (ADS)
Wandel, Amri; Gale, Joseph
2016-07-01
The Kepler mission has shown that Earthlike planets orbiting within the Habitable Zones of their host stars are common. We derive an expression for the abundance of life-bearing (biotic) extra-solar-system planets (exoplanets) in terms of the (yet unknown) probability for the evolution of biotic life. This "biotic probability" may be estimated by future missions and observations, e.g. spectral analyses of the atmospheres of exoplanets, looking for biomarkers. We show that a biotic probability in the range 0.001-1 implies that a biotic planet may be expected within ~10-100 light years from Earth. Of particular interest in the search for exolife are planets orbiting Red Dwarf (RD) stars, the most frequent stellar type. Previous research suggested that conditions on planets near RDs would be inimical to life, e.g. the Habitable Zone of RDs is small, so their habitable planets would be close enough to be tidally locked. Recent calculations show that this and other properties of RDs, presumed hostile for the evolution of life, are less severe than originally estimated. We conclude that RD planets could be hospitable for the evolution of life as we know it, not less so than planets of solar-type stars. This result, together with the large number of RDs and their Kepler planet-statistics, makes finding life on RD planets ~10-1000 times more likely than on planets of solar-type stars. Our nearest biotic RD-planet is likely to be 2-10 times closer than the nearest solar-type one.
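A back-of-envelope version of the distance estimate, under assumed round numbers (a local stellar density of roughly 0.1 stars per cubic parsec and roughly 20% of stars hosting a habitable-zone Earth-size planet; both are illustrative, not the paper's values): the expected distance to the nearest biotic planet follows from the local number density of biotic planets.

```python
import numpy as np

STARS_PER_PC3 = 0.1      # assumed local stellar density (order of magnitude)
F_HZ_EARTHLIKE = 0.2     # assumed fraction of stars with an HZ Earth-size planet
PC_TO_LY = 3.26

def nearest_biotic_planet_ly(f_biotic):
    """Distance at which one biotic planet is expected, from the local density
    of biotic planets n = n_star * f_HZ * f_biotic and d = (3 / (4 pi n))^(1/3)."""
    n = STARS_PER_PC3 * F_HZ_EARTHLIKE * f_biotic
    return (3.0 / (4.0 * np.pi * n)) ** (1.0 / 3.0) * PC_TO_LY

for f in (1.0, 0.1, 0.01, 0.001):
    print(f"f_biotic = {f:5.3f} -> nearest biotic planet ≈ {nearest_biotic_planet_ly(f):6.1f} ly")
```

With these assumed densities the estimate spans roughly 10 to 100 light years across the quoted range of biotic probabilities, consistent with the abstract's statement.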
Solving the chemical master equation using sliding windows
2010-01-01
Background The chemical master equation (CME) is a system of ordinary differential equations that describes the evolution of a network of chemical reactions as a stochastic process. Its solution yields the probability density vector of the system at each point in time. Solving the CME numerically is in many cases computationally expensive or even infeasible as the number of reachable states can be very large or infinite. We introduce the sliding window method, which computes an approximate solution of the CME by performing a sequence of local analysis steps. In each step, only a manageable subset of states is considered, representing a "window" into the state space. In subsequent steps, the window follows the direction in which the probability mass moves, until the time period of interest has elapsed. We construct the window based on a deterministic approximation of the future behavior of the system by estimating upper and lower bounds on the populations of the chemical species. Results In order to show the effectiveness of our approach, we apply it to several examples previously described in the literature. The experimental results show that the proposed method speeds up the analysis considerably, compared to a global analysis, while still providing high accuracy. Conclusions The sliding window method is a novel approach to address the performance problems of numerical algorithms for the solution of the chemical master equation. The method efficiently approximates the probability distributions at the time points of interest for a variety of chemically reacting systems, including systems for which no upper bound on the population sizes of the chemical species is known a priori. PMID:20377904
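The sliding-window idea can be sketched on the simplest possible case, a one-species birth-death process, by repeatedly exponentiating the generator restricted to a moving block of states and recentring the block on the probability mass. This toy omits the paper's deterministic bounding step for general reaction networks and is purely illustrative.

```python
import numpy as np
from scipy.linalg import expm

# Toy sliding-window solution of the CME for a birth-death process with
# production rate K and per-molecule degradation rate G (stationary mean K/G).
# Only a window of WIDTH states is kept; mass leaking past the window edges is
# renormalised away. Illustrative only; not the paper's bounding construction.
K, G, WIDTH = 50.0, 1.0, 40

def generator(lo):
    n = np.arange(lo, lo + WIDTH)
    A = np.zeros((WIDTH, WIDTH))
    A[np.arange(1, WIDTH), np.arange(WIDTH - 1)] = K          # birth: x -> x + 1
    A[np.arange(WIDTH - 1), np.arange(1, WIDTH)] = G * n[1:]   # death: x -> x - 1
    return A - np.diag(K + G * n)                              # outflow on the diagonal

lo = 0
p = np.zeros(WIDTH)
p[0] = 1.0                                   # start with zero molecules
for _ in range(10):                          # advance in steps of dt = 0.2
    p = expm(generator(lo) * 0.2) @ p
    p /= p.sum()
    mean = float(np.arange(lo, lo + WIDTH) @ p)
    shift = max(0, int(mean) - WIDTH // 2 - lo)   # slide the window upward if needed
    if shift:
        q = np.zeros(WIDTH)
        q[:WIDTH - shift] = p[shift:]
        p, lo = q / q.sum(), lo + shift

print(f"window [{lo}, {lo + WIDTH - 1}], mean copy number ≈ "
      f"{float(np.arange(lo, lo + WIDTH) @ p):.1f}")
```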
A computational framework to empower probabilistic protein design
Fromer, Menachem; Yanover, Chen
2008-01-01
Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the distributions predicted significantly differ from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717
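The underlying distribution can be made concrete on a toy problem small enough for exhaustive enumeration, which is exactly what is infeasible for real proteins and what belief propagation approximates: sequences over a tiny stand-in alphabet are weighted by a Boltzmann factor of a random pairwise "energy", and per-position amino acid marginals are read off by summing. All numbers below are placeholders, not real energy functions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

ALPHABET = "ACDE"          # tiny stand-in alphabet
L, kT = 3, 1.0
pair_E = {(i, j): rng.normal(size=(len(ALPHABET), len(ALPHABET)))
          for i in range(L) for j in range(i + 1, L)}

def energy(seq):
    """Sum of pairwise energies for a sequence given as a tuple of letter indices."""
    return sum(pair_E[(i, j)][seq[i], seq[j]]
               for i in range(L) for j in range(i + 1, L))

# Boltzmann-weight every sequence, then marginalise per position.
seqs = list(itertools.product(range(len(ALPHABET)), repeat=L))
weights = np.array([np.exp(-energy(s) / kT) for s in seqs])
weights /= weights.sum()

marginals = np.zeros((L, len(ALPHABET)))
for s, w in zip(seqs, weights):
    for pos, aa in enumerate(s):
        marginals[pos, aa] += w

for pos in range(L):
    probs = ", ".join(f"{a}:{p:.2f}" for a, p in zip(ALPHABET, marginals[pos]))
    print(f"position {pos}: {probs}")
```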
NASA Astrophysics Data System (ADS)
Zuluaga, Jorge I.; Sánchez-Hernández, Oscar; Sucerquia, Mario; Ferrín, Ignacio
2018-06-01
With the advent of more and deeper sky surveys, the discovery of interstellar small objects entering into the solar system has been finally possible. On 2017 October 19, using observations of the Pan-STARRS survey, a fast-moving object, now officially named 1I/2017 U1 (‘Oumuamua), was discovered in a heliocentric unbound trajectory, suggesting an interstellar origin. Assessing the provenance of interstellar small objects is key for understanding their distribution, spatial density, and the processes responsible for their ejection from planetary systems. However, their peculiar trajectories place a limit on the number of observations available to determine a precise orbit. As a result, when its position is propagated ∼10⁵–10⁶ years backward in time, small errors in orbital elements become large uncertainties in position in the interstellar space. In this paper we present a general method for assigning probabilities to nearby stars of being the parent system of an observed interstellar object. We describe the method in detail and apply it for assessing the origin of ‘Oumuamua. A preliminary list of potential progenitors and their corresponding probabilities is provided. In the future, when further information about the object and/or the nearby stars is refined, the probabilities computed with our method can be updated. We provide all the data and codes we developed for this purpose in the form of an open source C/C++/Python package, iWander, which is publicly available at http://github.com/seap-udea/iWander.
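The scale of the problem can be seen with simple arithmetic (back-of-envelope reasoning, not the iWander methodology): because 1 km/s corresponds to roughly 1 pc per million years, even a modest velocity uncertainty grows into a substantial positional uncertainty over the quoted backward-propagation times, comparable to typical separations between nearby stars.

```python
KM_S_TO_PC_PER_MYR = 1.0227   # 1 km/s ≈ 1.02 pc per million years

def position_spread_pc(delta_v_kms, t_myr):
    """Crude linear growth of positional uncertainty when propagating an orbit
    backward in time: delta_x ≈ delta_v * t (gravitational focusing ignored)."""
    return delta_v_kms * KM_S_TO_PC_PER_MYR * t_myr

for dv in (0.01, 0.1, 1.0):          # km/s velocity uncertainty (illustrative)
    for t in (0.1, 1.0):             # Myr of backward propagation
        print(f"dv = {dv:4.2f} km/s over {t:3.1f} Myr -> ~{position_spread_pc(dv, t):6.2f} pc")
```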
Durand, Casey P.; Tang, Xiaohui; Gabriel, Kelley P.; Sener, Ipek N.; Oluyomi, Abiodun O.; Knell, Gregory; Porter, Anna K.; Hoelscher, Deanna M.; Kohl, Harold W.
2015-01-01
Introduction Use of public transit is cited as a way to help individuals incorporate regular physical activity into their day. As a novel research topic, however, there is much we do not know. The aim of this analysis was to identify the correlation between distance to a transit stop and the probability it will be accessed by walking. We also sought to understand if this relation was moderated by trip, personal or household factors. Methods Data from the 2012 California Household Travel Survey was used for this cross-sectional analysis. 2,573 individuals were included, representing 6,949 transit trips. Generalized estimating equations modeled the probability of actively accessing public transit as a function of distance from origin to transit stop, and multiple trip, personal and household variables. Analyses were conducted in 2014 and 2015. Results For each mile increase in distance from the point of origin to the transit stop, the probability of active access decreased by 12%. With other factors held equal, at two miles from a transit stop there is a 50% chance someone will walk to a stop versus non-active means. The distance-walking relation was modified by month the trips were taken. Conclusions Individuals appear to be willing to walk further to reach transit than existing guidelines indicate. This implies that for any given transit stop, the zone of potential riders who will walk to reach transit is relatively large. Future research should clarify who transit-related walkers are, and why some are more willing to walk longer distances to transit than others. PMID:27429905
Epstein, Leonard H; Jankowiak, Noelle; Lin, Henry; Paluch, Rocco; Koffarnus, Mikhail N; Bickel, Warren K
2014-09-01
Low income is related to food insecurity, and research has suggested that a scarcity of resources associated with low income can shift attention to the present, thereby discounting the future. We tested whether attending to the present and discounting the future may moderate the influence of income on food insecurity. Delay discounting and measures of future time perspective (Zimbardo Time Perspective Inventory, Consideration of Future Consequences Scale, time period of financial planning, and subjective probability of living to age 75 y) were studied as moderators of the relation between income and food insecurity in a diverse sample of 975 adults, 31.8% of whom experienced some degree of food insecurity. Income, financial planning, subjective probability of living to age 75 y, and delay discounting predicted food insecurity as well as individuals who were high in food insecurity. Three-way interactions showed that delay discounting interacted with financial planning and income to predict food insecurity (P = 0.003). At lower levels of income, food insecurity was lowest for subjects who had good financial planning skills and did not discount the future, whereas having good financial skills and discounting the future had minimal influence on food insecurity. The same 3-way interaction was observed when high food insecurity was predicted (P = 0.008). Because of the role of scarce resources on narrowing attention and reducing prospective thinking, research should address whether modifying future orientation may reduce food insecurity even in the face of diminishing financial resources. © 2014 American Society for Nutrition.
Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties
NASA Astrophysics Data System (ADS)
Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris
2014-08-01
We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
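To make the risk metric concrete, here is a minimal Monte Carlo sketch: many synthetic futures are drawn, each with its own annual shortage probability, and the risk is the fraction of futures whose simulated shortage frequency over the planning horizon exceeds the planned Level of Service. The shortage model (a Beta-distributed annual probability) and all numbers are placeholders, not the UKCP09-driven system model of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

N_FUTURES, HORIZON_YEARS = 5_000, 25
PLANNED_FREQ = 0.10                      # planned Level of Service: 1 shortage year in 10

annual_p = rng.beta(2.0, 18.0, size=N_FUTURES)        # mean 0.1, spread across futures
shortages = rng.binomial(HORIZON_YEARS, annual_p)      # shortage years in each future
freq = shortages / HORIZON_YEARS
risk = np.mean(freq > PLANNED_FREQ)

print(f"P(shortage frequency exceeds planned Level of Service) ≈ {risk:.2f}")
```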
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Gang
Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed to both the lack of spatial resolution in the models, and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configuration of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km – 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large scale circulation will be quantified both on the basis of the well-tested preferred circulation regime approach, and very recently developed measures, the finite amplitude Wave Activity (FAWA) and its spectrum. The fine scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large scale measures as indicators of the probability of occurrence of the finer scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.
More than Just Convenient: The Scientific Merits of Homogeneous Convenience Samples
Jager, Justin; Putnick, Diane L.; Bornstein, Marc H.
2017-01-01
Despite their disadvantaged generalizability relative to probability samples, non-probability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examine developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability relative to conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. PMID:28475254
Climate change and wetland loss impacts on a western river's water quality
NASA Astrophysics Data System (ADS)
Records, R. M.; Arabi, M.; Fassnacht, S. R.; Duffy, W. G.; Ahmadi, M.; Hegewisch, K. C.
2014-11-01
An understanding of potential stream water quality conditions under future climate is critical for the sustainability of ecosystems and the protection of human health. Changes in wetland water balance under projected climate could alter wetland extent or cause wetland loss (e.g., via increased evapotranspiration and lower growing season flows leading to reduced riparian wetland inundation) or altered land use patterns. This study assessed the potential climate-induced changes to in-stream sediment and nutrient loads in the snowmelt-dominated Sprague River, Oregon, western US. Additionally, potential water quality impacts of combined changes in wetland water balance and wetland area under future climatic conditions were evaluated. The study utilized the Soil and Water Assessment Tool (SWAT) forced with statistical downscaling of general circulation model (GCM) data from the Coupled Model Intercomparison Project 5 (CMIP5) using the Multivariate Adaptive Constructed Analogs (MACA) method. Our findings suggest that, in the Sprague River, (1) mid-21st century nutrient and sediment loads could increase significantly during the high-flow season under warmer, wetter climate projections or could change only nominally in a warmer and somewhat drier future; (2) although water quality conditions under some future climate scenarios and no wetland loss may be similar to the past, the combined impact of climate change and wetland losses on nutrient loads could be large; (3) increases in stream total phosphorus (TP) concentration with wetland loss under future climate scenarios would be greatest at high-magnitude, low-probability flows; and (4) loss of riparian wetlands in both headwaters and lowlands could increase outlet TP loads to a similar degree, but this could be due to distinctly different mechanisms in different parts of the watershed.
Sansone, Genevieve; Fong, Geoffrey T; Hall, Peter A; Guignard, Romain; Beck, François; Mons, Ute; Pötschke-Langer, Martina; Yong, Hua-Hie; Thompson, Mary E; Omar, Maizurah; Jiang, Yuan
2013-04-15
Prior studies have demonstrated that time perspective (the propensity to consider short- versus long-term consequences of one's actions) is a potentially important predictor of health-related behaviors, including smoking. However, most prior studies have been conducted within single high-income countries. The aim of this study was to examine whether time perspective was associated with the likelihood of being a smoker or non-smoker across five countries that vary in smoking behavior and strength of tobacco control policies. The data were from the International Tobacco Control (ITC) Surveys in five countries with large probability samples of both smokers (N=10,341) and non-smokers (N=4,955): Scotland, France, Germany, China, and Malaysia. The surveys were conducted between 2005-2008. Survey respondents indicated their smoking status (smoker vs. non-smoker) and time perspective (future-oriented vs. not future-oriented) and provided demographic information. Across all five countries, non-smokers were significantly more likely to be future-oriented (66%) than were smokers (57%), χ²(1, N = 15,244) = 120.64, p < .001. This bivariate relationship between time perspective and smoking status held in a multivariate analysis. After controlling for country, age, sex, income, education, and ethnicity (language in France), those who were future-oriented had 36% greater odds of being a non-smoker than a smoker (95% CI: 1.22 to 1.51, p<.001). These findings establish time perspective as an important predictor of smoking status across multiple countries and suggest the potential value of incorporating material to enhance future orientation in smoking cessation interventions.
2013-01-01
Background Prior studies have demonstrated that time perspective—the propensity to consider short- versus long-term consequences of one’s actions—is a potentially important predictor of health-related behaviors, including smoking. However, most prior studies have been conducted within single high-income countries. The aim of this study was to examine whether time perspective was associated with the likelihood of being a smoker or non-smoker across five countries that vary in smoking behavior and strength of tobacco control policies. Methods The data were from the International Tobacco Control (ITC) Surveys in five countries with large probability samples of both smokers (N=10,341) and non-smokers (N=4,955): Scotland, France, Germany, China, and Malaysia. The surveys were conducted between 2005–2008. Survey respondents indicated their smoking status (smoker vs. non-smoker) and time perspective (future-oriented vs. not future-oriented) and provided demographic information. Results Across all five countries, non-smokers were significantly more likely to be future-oriented (66%) than were smokers (57%), χ²(1, N = 15,244) = 120.64, p < .001. This bivariate relationship between time perspective and smoking status held in a multivariate analysis. After controlling for country, age, sex, income, education, and ethnicity (language in France), those who were future-oriented had 36% greater odds of being a non-smoker than a smoker (95% CI: 1.22 to 1.51, p<.001). Conclusion These findings establish time perspective as an important predictor of smoking status across multiple countries and suggest the potential value of incorporating material to enhance future orientation in smoking cessation interventions. PMID:23587205
Yoshioka, Akio; Fukuzawa, Kaori; Mochizuki, Yuji; Yamashita, Katsumi; Nakano, Tatsuya; Okiyama, Yoshio; Nobusawa, Eri; Nakajima, Katsuhisa; Tanaka, Shigenori
2011-09-01
Ab initio electronic-state calculations for influenza virus hemagglutinin (HA) trimer complexed with Fab antibody were performed on the basis of the fragment molecular orbital (FMO) method at the second and third-order Møller-Plesset (MP2 and MP3) perturbation levels. For the protein complex containing 2351 residues and 36,160 atoms, the inter-fragment interaction energies (IFIEs) were evaluated to illustrate the effective interactions between all the pairs of amino acid residues. By analyzing the calculated data on the IFIEs, we first discussed the interactions and their fluctuations between multiple domains contained in the trimer complex. Next, by combining the IFIE data between the Fab antibody and each residue in the HA antigen with experimental data on the hemadsorption activity of HA mutants, we proposed a protocol to predict probable mutations in HA. The proposed protocol based on the FMO-MP2.5 calculation can explain the historical facts concerning the actual mutations after the emergence of A/Hong Kong/1/68 influenza virus with subtype H3N2, and thus provides a useful methodology to enumerate those residue sites likely to mutate in the future. Copyright © 2011 Elsevier Inc. All rights reserved.
Challenges in assessing seismic hazard in intraplate Europe
NASA Astrophysics Data System (ADS)
Brooks, Edward; Stein, Seth; Liu, Mian; Camelbeeck, Thierry; Merino, Miguel; Landgraf, Angela; Hintersberger, Esther; Kübler, Simon
2016-04-01
Intraplate seismicity is often characterized by episodic, clustered and migrating earthquakes and extended aftershock sequences. Can these observations - primarily from North America, China and Australia - usefully be applied to seismic hazard assessment for intraplate Europe? Existing assessments are based on instrumental and historical seismicity of the past c. 1000 years, as well as some data for active faults. This time span probably fails to capture typical large-event recurrence intervals of the order of tens of thousands of years. Palaeoseismology helps to lengthen the observation window, but preferentially produces data in regions suspected to be seismically active. Thus the expected maximum magnitudes of future earthquakes are fairly uncertain, possibly underestimated, and earthquakes are likely to occur in unexpected locations. These issues particularly arise in considering the hazards posed by low-probability events to both heavily populated areas and critical facilities. For example, are the variations in seismicity (and thus assumed seismic hazard) along the Rhine Graben a result of short sampling or are they real? In addition to a better assessment of hazards with new data and models, it is important to recognize and communicate uncertainties in hazard estimates. The more users know about how much confidence to place in hazard maps, the more effectively the maps can be used.
Kurlan, R; Whitmore, D; Irvine, C; McDermott, M P; Como, P G
1994-04-01
To determine whether children requiring special education represent a high-risk group for identifying Tourette's syndrome (TS), we performed direct examinations for the presence of tics in 35 special education and 35 regular classroom students from a single school district. Of the special education students, nine (26%) had definite or probable tics as compared with only two (6%) of the regular classroom students. About one-third of the students with tics currently meet diagnostic criteria for TS and probably more will do so in the future. About one-half of the subjects with tics have evidence of obsessive-compulsive behavior (OCB) or an attention-deficit hyperactivity disorder (ADHD). For three randomly selected students with definite tics, direct examinations of first-degree relatives revealed the presence of tics in all families. Subject to the limitations of this pilot study, we conclude that TS and related tic disorders are commonly associated with the need for special education in this single school district. TS might also be an important contributor to school problems in the childhood population at large and may be a highly prevalent condition. In addition, we conclude that childhood tics are associated with OCB and ADHD, are genetically determined, and are part of the TS clinical spectrum.
Heterologous expression of an active chitin synthase from Rhizopus oryzae.
Salgado-Lugo, Holjes; Sánchez-Arreguín, Alejandro; Ruiz-Herrera, José
2016-12-01
Chitin synthases are highly important enzymes in nature, where they synthesize structural components in species belonging to different eukaryotic kingdoms, including kingdom Fungi. Unfortunately, their structure and the molecular mechanism of synthesis of their microfibrillar product remain largely unknown, probably because no active fungal chitin synthases have been isolated, possibly due to their extreme hydrophobicity. In this study we have turned to the heterologous expression of the transcript from a small chitin synthase of Rhizopus oryzae (RO3G_00942, Chs1) in Escherichia coli. The enzyme was active, but accumulated mostly in inclusion bodies. High concentrations of arginine or urea solubilized the enzyme, but their dilution led to its denaturation and precipitation. Nevertheless, use of urea permitted the purification of small amounts of the enzyme. The properties of Chs1 (Km, optimum temperature and pH, effect of GlcNAc) were abnormal, probably because it lacks the hydrophobic transmembrane regions characteristic of chitin synthases. The product of the enzyme, in contrast with chitin made by membrane-bound Chs's and chitosomes, was only partially in the form of short microfibrils of low crystallinity. This approach may lead to future developments to obtain active chitin synthases that permit an understanding of their molecular mechanism of activity and of microfibril assembly. Copyright © 2016. Published by Elsevier Inc.
ASAR images a diverse set of deformation patterns at Kilauea volcano, Hawai'i
Poland, Michael P.
2007-01-01
Since 2003, 27 independent look angles have been acquired by ENVISAT’s Advanced Synthetic Aperture Radar (ASAR) instrument over the island of Hawai`i, allowing for the formation of thousands of interferograms showing deformation of the ground surface. On Kīlauea volcano, a transition from minor to broad-scale summit inflation was observed by interferograms that span 2003 to 2006. In addition, radar interferometry (InSAR) observations of Kīlauea led to the discovery of several previously unknown areas of localized subsidence in the caldera and along the volcano’s east rift zone. These features are probably caused by the cooling and contraction of accumulated lavas. After November 2005, a surface instability near the point that lava entered the ocean on the south flank of Kīlauea was observed in interferograms. The motion is most likely a result of unbuttressing of a portion of the coast following the collapse of a large lava delta in November 2005. InSAR data can also be used to map lava flow development over time, providing ~30 m spatial resolution maps at approximately monthly intervals. Future applications of InSAR to Kīlauea will probably result in more discoveries and insights, both as the style of volcano deformation changes and as data from new instruments are acquired.
Comments on potential geologic and seismic hazards affecting coastal Ventura County, California
Ross, Stephanie L.; Boore, David M.; Fisher, Michael A.; Frankel, Arthur D.; Geist, Eric L.; Hudnut, Kenneth W.; Kayen, Robert E.; Lee, Homa J.; Normark, William R.; Wong, Florence L.
2004-01-01
This report examines the regional seismic and geologic hazards that could affect proposed liquefied natural gas (LNG) facilities in coastal Ventura County, California. Faults throughout this area are thought to be capable of producing earthquakes of magnitude 6.5 to 7.5, which could produce surface fault offsets of as much as 15 feet. Many of these faults are sufficiently well understood to be included in the current generation of the National Seismic Hazard Maps; others may become candidates for inclusion in future revisions as research proceeds. Strong shaking is the primary hazard that causes damage from earthquakes and this area is zoned with a high level of shaking hazard. The estimated probability of a magnitude 6.5 or larger earthquake (comparable in size to the 2003 San Simeon quake) occurring in the next 30 years within 30 miles of Platform Grace is 50-60%; for Cabrillo Port, the estimate is a 35% likelihood. Combining these probabilities of earthquake occurrence with relationships that give expected ground motions yields the estimated seismic-shaking hazard. In parts of the project area, the estimated shaking hazard is as high as along the San Andreas Fault. The combination of long-period basin waves and LNG installations with large long-period resonances potentially increases this hazard.
Guagliardi, Ilaria; Cicchella, Domenico; De Rosa, Rosanna; Buttafuoco, Gabriele
2015-07-01
Exposure to lead (Pb) may adversely affect human health. Mapping soil Pb contents is essential to obtain a quantitative estimate of potential risk of Pb contamination. The main aim of this paper was to determine the soil Pb concentrations in the urban and peri-urban area of Cosenza-Rende to map their spatial distribution and assess the probability that soil Pb concentration exceeds a critical threshold that might cause concern for human health. Samples were collected at 149 locations from residual and non-residual topsoil in gardens, parks, flower-beds, and agricultural fields. The fine earth fraction of soil samples was analyzed by X-ray Fluorescence spectrometry. Stochastic images generated by the sequential Gaussian simulation were jointly combined to calculate the probability of exceeding the critical threshold that could be used to delineate the potentially risky areas. Results showed areas in which Pb concentration values were higher than the Italian regulatory values. These polluted areas were quite large and could likely create a significant health risk for human beings and vegetation in the near future. The results demonstrated that the proposed approach can be used to study soil contamination to produce geochemical maps, and identify hot-spot areas for soil Pb concentration. Copyright © 2015. Published by Elsevier B.V.
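The post-processing step that turns simulated realizations into an exceedance-probability map can be sketched very simply; in the placeholder below, random fields stand in for sequential Gaussian simulation output, and the threshold and decision level are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(4)

# Given a stack of equally probable realizations of the Pb concentration field,
# the per-cell probability of exceeding a critical threshold is the fraction of
# realizations above it; cells with a high probability delineate risky areas.
N_REAL, NX, NY = 200, 50, 50
THRESHOLD = 100.0                                   # illustrative threshold, mg/kg

trend = np.linspace(3.5, 5.5, NX)                   # assumed spatial trend in log-concentration
log_mean = np.tile(trend[:, None], (1, NY))
realizations = rng.lognormal(mean=log_mean, sigma=0.4, size=(N_REAL, NX, NY))

prob_exceed = (realizations > THRESHOLD).mean(axis=0)
risky_cells = prob_exceed > 0.8                     # delineate potentially risky areas

print(f"cells with P(Pb > {THRESHOLD:.0f} mg/kg) > 0.8: {risky_cells.sum()} of {NX * NY}")
```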
Rodhouse, Thomas J.; Ormsbee, Patricia C.; Irvine, Kathryn M.; Vierling, Lee A.; Szewczak, Joseph M.; Vierling, Kerri T.
2012-01-01
Despite its common status, M. lucifugus was only detected during ∼50% of the surveys in occupied sample units. The overall naïve estimate for the proportion of the study region occupied by the species was 0.69, but after accounting for imperfect detection, this increased to ∼0.90. Our models provide evidence of an association between NPP and forest cover and M. lucifugus distribution, with implications for the projected effects of accelerated climate change in the region, which include net aridification as snowpack and stream flows decline. Annual turnover, the probability that an occupied sample unit was a newly occupied one, was estimated to be low (∼0.04–0.14), resulting in a flat trend estimate with relatively high precision (SD = 0.04). We mapped the variation in predicted occurrence probabilities and corresponding prediction uncertainty along the productivity gradient. Our results provide a much-needed baseline against which future anticipated declines in M. lucifugus occurrence can be measured. The dynamic distribution modeling approach has broad applicability to regional bat monitoring efforts now underway in several countries and we suggest ways to improve and expand our grid-based monitoring program to gain robust insights into bat population status and trend across large portions of North America.
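The gap between the naïve and corrected occupancy estimates can be illustrated with the simplest detection correction (this is back-of-envelope arithmetic, not the dynamic occupancy model fitted in the study, and the number of surveys per unit is a guess): an occupied unit is detected at least once with probability 1 - (1 - p)^n, so true occupancy is roughly the naïve estimate divided by that quantity.

```python
def corrected_occupancy(naive_psi, p_detect, n_surveys):
    """Back-of-envelope correction: with per-survey detection probability p and
    n independent surveys, an occupied unit is detected at least once with
    probability 1 - (1 - p)^n, so true occupancy ≈ naive / that quantity."""
    p_detected_at_least_once = 1.0 - (1.0 - p_detect) ** n_surveys
    return naive_psi / p_detected_at_least_once

# Illustrative values loosely echoing the abstract (p ≈ 0.5, naive ψ ≈ 0.69);
# the survey counts below are guesses, not taken from the study design.
for n in (2, 3, 4):
    print(f"{n} surveys: corrected occupancy ≈ {corrected_occupancy(0.69, 0.5, n):.2f}")
```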
Precipitation forecast verification over Brazilian watersheds on present and future climate
NASA Astrophysics Data System (ADS)
Xavier, L.; Bruyere, C. L.; Rotunno, O.
2016-12-01
Evaluating the quality of precipitation forecasts is an essential step for hydrological studies, among other applications, which is particularly relevant when taking into account climate change and the consequent likely modification of precipitation patterns. In this study we analyzed daily precipitation forecasts given by the global model CESM and the regional model WRF on present and future climate. For present runs, CESM data have been considered from 1980 to 2005, and WRF data from 1990 to 2000. CESM future runs were available for 3 RCP scenarios (4.5, 6.0 and 8.5), over the 2005-2100 period; for WRF, future runs spanned 4 different 11-year periods (2020-2030, 2030-2040, 2050-2060 and 2080-2090). WRF simulations had been driven by bias-corrected forcings, and had been done on present climate for a 24-member ensemble created by varying the adopted parameterization schemes. For WRF future climate simulations, data from 3 members of the original ensemble were available. Precipitation data have been spatially averaged over some large Brazilian watersheds (Amazon and subbasins, Tocantins, Sao Francisco, 4 of Parana's subbasins) and have been evaluated for present climate against a gauge gridded dataset and ERA Interim data both spanning the 1980-2013 period. The evaluation focused on the analysis of the precipitation forecast probability distributions. Taking into account daily and monthly mean precipitation aggregated on 3-month periods (DJF, MAM, JJA, SON), we adopted some skill measures, amongst them the Perkins Skill Score (PSS). From the results we verified that, for the present climate, the WRF ensemble mean led to clearly better results when compared with CESM data for the Amazon, Tocantins and Sao Francisco, but the model was not as skillful for the other basins, a pattern also observed for the future climate. PSS results from future runs showed that few changes would be observed over the different periods for the considered basins.
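The Perkins Skill Score measures how much two empirical probability density functions overlap; a minimal sketch follows, with synthetic gamma-distributed daily precipitation standing in for model and observed series and an illustrative binning choice (not the study's actual data or bins).

```python
import numpy as np

rng = np.random.default_rng(5)

def perkins_skill_score(model, obs, bins):
    """Perkins Skill Score: overlap of two empirical PDFs, i.e. the sum over
    bins of the minimum of the two binned relative frequencies (1 = identical
    distributions, 0 = no overlap). Values beyond the last bin are ignored here."""
    f_model, _ = np.histogram(model, bins=bins)
    f_obs, _ = np.histogram(obs, bins=bins)
    f_model = f_model / f_model.sum()
    f_obs = f_obs / f_obs.sum()
    return float(np.minimum(f_model, f_obs).sum())

# Synthetic daily precipitation (mm/day) standing in for model output and gauges.
obs = rng.gamma(shape=0.6, scale=8.0, size=5_000)
model = rng.gamma(shape=0.5, scale=10.0, size=5_000)
bins = np.arange(0.0, 101.0, 1.0)                   # illustrative 1 mm/day bins
print(f"PSS ≈ {perkins_skill_score(model, obs, bins):.2f}")
```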
Neural response to reward anticipation under risk is nonlinear in probabilities.
Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F
2009-02-18
A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that the neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing, and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.
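One commonly used one-parameter probability weighting function, the Tversky and Kahneman (1992) form, makes the overweighting of small and underweighting of large probabilities concrete; the sketch below uses the canonical parameter estimate for gains and is offered as a standard reference form, not the specific function fitted in this study.

```python
import numpy as np

def tk_weight(p, gamma=0.61):
    """Tversky & Kahneman (1992) probability weighting function:
    w(p) = p^g / (p^g + (1-p)^g)^(1/g).
    gamma < 1 overweights small probabilities and underweights large ones."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(f"p = {p:4.2f} -> w(p) = {tk_weight(p):.3f}")
```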
Hoeffding Type Inequalities and their Applications in Statistics and Operations Research
NASA Astrophysics Data System (ADS)
Daras, Tryfon
2007-09-01
Large Deviation theory is the branch of Probability theory that deals with rare events. Sometimes, these events can be described by a sum of random variables that deviates from its mean by more than a "normal" amount. A precise calculation of the probabilities of such events turns out to be crucial in a variety of different contexts (e.g., in Probability Theory, Statistics, Operations Research, Statistical Physics, Financial Mathematics, etc.). Recent applications of the theory deal with random walks in random environments, interacting diffusions, heat conduction, polymer chains [1]. In this paper we prove an inequality of exponential type, namely Theorem 2.1, which gives a large deviation upper bound for a specific sequence of r.v.s. Inequalities of this type have many applications in Combinatorics [2]. The inequality generalizes already proven results of this type, in the case of symmetric probability measures. We get as consequences of the inequality: (a) large deviations upper bounds for exchangeable Bernoulli sequences of random variables, generalizing results proven for independent and identically distributed Bernoulli sequences of r.v.s. and (b) a general form of Bernstein's inequality. We compare the inequality with large deviation results already proven by the author and try to see its advantages. Finally, using the inequality, we solve one of the basic problems of Operations Research (the bin packing problem) in the case of exchangeable r.v.s.
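For reference, the classical Hoeffding inequality that results of this type generalize reads as follows; this is the standard form for independent, bounded random variables, not the paper's Theorem 2.1 for exchangeable sequences.

```latex
% Classical Hoeffding bound: X_1, ..., X_n independent with a_i <= X_i <= b_i,
% and S_n = X_1 + ... + X_n. Then for every t > 0,
\Pr\bigl(S_n - \mathbb{E}[S_n] \ge t\bigr)
  \;\le\; \exp\!\left(-\frac{2t^{2}}{\sum_{i=1}^{n}(b_i - a_i)^{2}}\right).
```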
Ramalho, Cristina E; Ottewell, Kym M; Chambers, Brian K; Yates, Colin J; Wilson, Barbara A; Bencini, Roberta; Barrett, Geoff
2018-01-01
The rapid and large-scale urbanization of peri-urban areas poses major and complex challenges for wildlife conservation. We used population viability analysis (PVA) to evaluate the influence of urban encroachment, fire, and fauna crossing structures, with and without accounting for inbreeding effects, on the metapopulation viability of a medium-sized ground-dwelling mammal, the southern brown bandicoot (Isoodon obesulus), in the rapidly expanding city of Perth, Australia. We surveyed two metapopulations over one and a half years, and parameterized the PVA models using largely field-collected data. The models revealed that spatial isolation imposed by housing and road encroachment has major impacts on I. obesulus. Although the species is known to persist in small metapopulations at moderate levels of habitat fragmentation, the models indicate that these populations become highly vulnerable to demographic decline, genetic deterioration, and local extinction under increasing habitat connectivity loss. Isolated metapopulations were also predicted to be highly sensitive to fire, with large-scale fires having greater negative impacts on population abundance than small-scale ones. To reduce the risk of decline and local extirpation of I. obesulus and other small- to medium-sized ground-dwelling mammals in urbanizing, fire prone landscapes, we recommend that remnant vegetation and vegetated, structurally-complex corridors between habitat patches be retained. Well-designed road underpasses can be effective to connect habitat patches and reduce the probability of inbreeding and genetic differentiation; however, adjustment of fire management practices to limit the size of unplanned fires and ensure the retention of long unburnt vegetation will also be required to ensure persistence. Our study supports the evidence that in rapidly urbanizing landscapes, a pro-active conservation approach is required that manages species at the metapopulation level and that prioritizes metapopulations and habitat with greater long-term probability of persistence and conservation capacity, respectively. This strategy may help us prevent future declines and local extirpations, and currently relatively common species from becoming rare.
Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake
NASA Astrophysics Data System (ADS)
Durukal, E.; Sesetyan, K.; Erdik, M.
2009-04-01
The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure, and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified individually but would instead be calculated directly from indexed ground-motion levels and damage. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing period, which would certainly be a major difficulty for the expected low-frequency/high-intensity loss case of Istanbul.
First Detected Arrival of a Quantum Walker on an Infinite Line
NASA Astrophysics Data System (ADS)
Thiel, Felix; Barkai, Eli; Kessler, David A.
2018-01-01
The first detection of a quantum particle on a graph is shown to depend sensitively on the distance ξ between the detector and initial location of the particle, and on the sampling time τ . Here, we use the recently introduced quantum renewal equation to investigate the statistics of first detection on an infinite line, using a tight-binding lattice Hamiltonian with nearest-neighbor hops. Universal features of the first detection probability are uncovered and simple limiting cases are analyzed. These include the large ξ limit, the small τ limit, and the power law decay with the attempt number of the detection probability over which quantum oscillations are superimposed. For large ξ the first detection probability assumes a scaling form and when the sampling time is equal to the inverse of the energy band width nonanalytical behaviors arise, accompanied by a transition in the statistics. The maximum total detection probability is found to occur for τ close to this transition point. When the initial location of the particle is far from the detection node we find that the total detection probability attains a finite value that is distance independent.
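As an illustration of the stroboscopic first-detection protocol summarized above, the probabilities F_n that the particle is first detected at the n-th attempt can be computed by alternating free evolution over the sampling time τ with a projection that removes the amplitude at the detector site. This is a minimal numerical sketch under standard assumptions (a finite lattice stands in for the infinite line; the lattice size, hopping amplitude, sampling time, and site labels are arbitrary choices, not values from the paper):

    import numpy as np
    from scipy.linalg import expm

    N = 201                       # lattice sites (finite proxy for the infinite line)
    gamma, tau = 1.0, 0.25        # hopping amplitude and sampling time (assumed values)
    x0, xd = N // 2, N // 2 + 10  # initial site and detector site (distance xi = 10)

    # Tight-binding Hamiltonian with nearest-neighbor hops
    H = -gamma * (np.eye(N, k=1) + np.eye(N, k=-1))
    U = expm(-1j * H * tau)       # unitary evolution over one sampling interval

    psi = np.zeros(N, dtype=complex)
    psi[x0] = 1.0

    F = []                         # F[n-1] = probability of first detection at attempt n
    for n in range(1, 401):
        psi = U @ psi              # evolve freely for time tau
        F.append(abs(psi[xd])**2)  # detection probability at this attempt
        psi[xd] = 0.0              # failed detection: project out the detector site (no renormalization)

    print("total detection probability after 400 attempts:", sum(F))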
Smith, Christopher D.; Quist, Michael C.; Hardy, Ryan S.
2015-01-01
Research comparing different sampling techniques helps improve the efficiency and efficacy of sampling efforts. We compared the effectiveness of three sampling techniques (small-mesh hoop nets, benthic trawls, boat-mounted electrofishing) for 30 species in the Green (WY, USA) and Kootenai (ID, USA) rivers by estimating conditional detection probabilities (probability of detecting a species given its presence at a site). Electrofishing had the highest detection probabilities (generally greater than 0.60) for most species (88%), but hoop nets also had high detectability for several taxa (e.g., adult burbot Lota lota, juvenile northern pikeminnow Ptychocheilus oregonensis). Benthic trawls had low detection probabilities (<0.05) for most taxa (84%). Gear-specific effects were present for most species indicating large differences in gear effectiveness among techniques. In addition to gear effects, habitat characteristics also influenced detectability of fishes. Most species-specific habitat relationships were idiosyncratic and reflected the ecology of the species. Overall findings of our study indicate that boat-mounted electrofishing and hoop nets are the most effective techniques for sampling fish assemblages in large, coldwater rivers.
Keefer, David K.; Harp, Edwin L.; Griggs, Gary B.; Evans, Stephen G.; DeGraff, Jerome V.
2002-01-01
The Villa Del Monte landslide was one of 20 large and complex landslides triggered by the 1989 Loma Prieta, California, earthquake in a zone of pervasive coseismic ground cracking near the fault rupture. The landslide was approximately 980 m long, 870 m wide, and encompassed an area of approximately 68 ha. Drilling data suggested that movement may have extended to depths as great as 85 m below the ground surface. Even though the landslide moved <1 m, it caused substantial damage to numerous dwellings and other structures, primarily as a result of differential displacements and internal fissuring. Surface cracks, scarps, and compression features delineating the Villa Del Monte landslide were discontinuous, probably because coseismic displacements were small; such discontinuous features were also characteristic of the other large, coseismic landslides in the area, which also moved only short distances during the earthquake. Because features marking landslide boundaries were discontinuous and because other types of coseismic ground cracks were widespread in the area, identification of the landslides required detailed mapping and analysis. Recognition that landslides such as that at Villa Del Monte may occur near earthquake-generating fault ruptures should aid in future hazard evaluations of areas along active faults.
Risk of collective failure provides an escape from the tragedy of the commons.
Santos, Francisco C; Pacheco, Jorge M
2011-06-28
From group hunting to global warming, how to deal with collective action may be formulated in terms of a public goods game of cooperation. In most cases, contributions depend on the risk of future losses. Here, we introduce an evolutionary dynamics approach to a broad class of cooperation problems in which attempting to minimize future losses turns the risk of failure into a central issue in individual decisions. We find that decisions within small groups under high risk and stringent requirements for success significantly raise the chances of coordinating actions and escaping the tragedy of the commons. We also offer insights on the scale at which public goods problems of cooperation are best solved. Instead of large-scale endeavors involving most of the population, which, as we argue, may be counterproductive for achieving cooperation, the combination of local agreements within groups that are small compared with the population at risk tends to significantly raise the probability of success. In addition, our model predicts that, if one takes into consideration that groups of different sizes are interwoven in complex networks of contacts, the chances for global coordination in an overall cooperating state are further enhanced.
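For concreteness, one common payoff structure for such a collective-risk dilemma is sketched below; the threshold, endowment, contribution fraction, and risk values are illustrative assumptions, not parameters taken from the paper:

    def payoffs(k, M=3, b=1.0, c=0.1, r=0.9):
        """Expected payoffs in a group with k cooperators.

        Cooperators contribute a fraction c of the endowment b up front. If at
        least M group members cooperate, the collective target is met; otherwise
        every member loses the remaining endowment with probability r.
        """
        keep = b if k >= M else (1.0 - r) * b  # expected endowment kept
        defector = keep
        cooperator = keep - c * b              # the contribution is sunk regardless of outcome
        return cooperator, defector

    print(payoffs(2))  # target missed: (0.0, 0.1)
    print(payoffs(4))  # target met:    (0.9, 1.0)

Under this structure, contributing is worthwhile only when the risk r of losing everything is large relative to the cost of contributing, which is the intuition behind the result that high risk favors coordination.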
The dynamic evolutionary history of genome size in North American woodland salamanders.
Newman, Catherine E; Gregory, T Ryan; Austin, Christopher C
2017-04-01
The genus Plethodon is the most species-rich salamander genus in North America, and nearly half of its species face an uncertain future. It also belongs to one of the most diverse salamander families in terms of genome size, which ranges from 1C = 18.2 to 69.3 pg, or 5-20 times larger than the human genome. Large genome size in salamanders results in part from accumulation of transposable elements and is associated with various developmental and physiological traits. However, genome sizes have been reported for only 25% of the species of Plethodon (14 of 55). We collected genome size data for Plethodon serratus to supplement an ongoing phylogeographic study, reconstructed the evolutionary history of genome size in Plethodontidae, and inferred probable genome sizes for the 41 species missing empirical data. Results revealed multiple genome size changes in Plethodon: genomes of western Plethodon increased, whereas genomes of eastern Plethodon decreased, followed by additional decreases or subsequent increases. The estimated genome size of P. serratus was 21 pg. New understanding of variation in genome size evolution, along with genome size inferences for previously unstudied taxa, provides a foundation for future studies on the biology of plethodontid salamanders.
Pre-seismic anomalies from optical satellite observations: a review
NASA Astrophysics Data System (ADS)
Jiao, Zhong-Hu; Zhao, Jing; Shan, Xinjian
2018-04-01
Detecting various anomalies using optical satellite data prior to strong earthquakes is key to understanding and forecasting earthquake activities because of its recognition of thermal-radiation-related phenomena in seismic preparation phases. Data from satellite observations serve as a powerful tool in monitoring earthquake preparation areas at a global scale and in a nearly real-time manner. Over the past several decades, many new different data sources have been utilized in this field, and progressive anomaly detection approaches have been developed. This paper reviews the progress and development of pre-seismic anomaly detection technology in this decade. First, precursor parameters, including parameters from the top of the atmosphere, in the atmosphere, and on the Earth's surface, are stated and discussed. Second, different anomaly detection methods, which are used to extract anomalous signals that probably indicate future seismic events, are presented. Finally, certain critical problems with the current research are highlighted, and new developing trends and perspectives for future work are discussed. The development of Earth observation satellites and anomaly detection algorithms can enrich available information sources, provide advanced tools for multilevel earthquake monitoring, and improve short- and medium-term forecasting, which play a large and growing role in pre-seismic anomaly detection research.
Physics of risk and uncertainty in quantum decision making
NASA Astrophysics Data System (ADS)
Yukalov, V. I.; Sornette, D.
2009-10-01
The Quantum Decision Theory, developed recently by the authors, is applied to clarify the role of risk and uncertainty in decision making and in particular in relation to the phenomenon of dynamic inconsistency. By formulating this notion in precise mathematical terms, we distinguish three types of inconsistency: time inconsistency, planning paradox, and inconsistency occurring in some discounting effects. While time inconsistency is well accounted for in classical decision theory, the planning paradox is in contradiction with classical utility theory. It finds a natural explanation in the frame of the Quantum Decision Theory. Different types of discounting effects are analyzed and shown to enjoy a straightforward explanation within the suggested theory. We also introduce a general methodology based on self-similar approximation theory for deriving the evolution equations for the probabilities of future prospects. This provides a novel classification of possible discount factors, which include the previously known cases (exponential or hyperbolic discounting), but also predicts a novel class of discount factors that decay to a strictly positive constant for very large future time horizons. This class may be useful to deal with very long-term discounting situations associated with intergenerational public policy choices, encompassing issues such as global warming and nuclear waste disposal.
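As a purely illustrative example of a discount factor with the property described above, i.e., decay to a strictly positive constant at long horizons (the specific functional class derived in the paper may differ), one can take

    D(t) = c + (1 - c)\,e^{-\lambda t}, \qquad 0 < c < 1,\ \lambda > 0,

so that D(0) = 1 while D(t) \to c > 0 as t \to \infty, in contrast to exponential discounting, which decays to zero.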
Female competition and aggression: interdisciplinary perspectives
Stockley, Paula; Campbell, Anne
2013-01-01
This paper introduces a Theme Issue combining interdisciplinary perspectives in the study of female competition and aggression. Despite a history of being largely overlooked, evidence is now accumulating for the widespread evolutionary significance of female competition. Here, we provide a synthesis of contributions to this Theme Issue on humans and other vertebrates, and highlight directions for future research. Females compete for resources needed to survive and reproduce, and for preferred mates. Although female aggression takes diverse forms, under most circumstances relatively low-risk competitive strategies are favoured, most probably due to constraints of offspring production and care. In social species, dominance relationships and threats of punishment can resolve social conflict without resort to direct aggression, and coalitions or alliances may reduce risk of retaliation. Consistent with these trends, indirect aggression is a low cost but effective form of competition among young women. Costs are also minimized by flexibility in expression of competitive traits, with aggressive behaviour and competitive signalling tailored to social and ecological conditions. Future research on female competition and the proximate mediators of female aggression will be greatly enhanced by opportunities for interdisciplinary exchange, as evidenced by contributions to this Theme Issue. PMID:24167303
Predicting space climate change
NASA Astrophysics Data System (ADS)
Balcerak, Ernie
2011-10-01
Galactic cosmic rays and solar energetic particles can be hazardous to humans in space, damage spacecraft and satellites, pose threats to aircraft electronics, and expose aircrew and passengers to radiation. A new study shows that these threats are likely to increase in coming years as the Sun approaches the end of the period of high solar activity known as “grand solar maximum,” which has persisted through the past several decades. High solar activity can help protect the Earth by repelling incoming galactic cosmic rays. Understanding the past record can help scientists predict future conditions. Barnard et al. analyzed a 9300-year record of galactic cosmic ray and solar activity based on cosmogenic isotopes in ice cores as well as on neutron monitor data. They used this to predict future variations in galactic cosmic ray flux, near-Earth interplanetary magnetic field, sunspot number, and probability of large solar energetic particle events. The researchers found that the risk of space weather radiation events will likely increase noticeably over the next century compared with recent decades and that lower solar activity will lead to increased galactic cosmic ray levels. (Geophysical Research Letters, doi:10.1029/2011GL048489, 2011)
Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-09-20
These are slides from a seminar given to the University of Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation – making full use of today’s computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.
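For context, the central relation behind such sensitivity-uncertainty methods is the standard "sandwich rule" for the nuclear-data-induced uncertainty of k_eff; this is a textbook formula stated here as background, not a quotation from the slides:

    \sigma_k^2 = \mathbf{S}^{\mathsf{T}}\,\mathbf{C}\,\mathbf{S},

where S is the vector of (relative) sensitivities of k_eff to the nuclear data, obtainable in continuous-energy Monte Carlo via adjoint-weighted (iterated fission probability) tallies, and C is the corresponding (relative) covariance matrix of the nuclear data.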
Ou, Jian; Chen, Yongguang; Zhao, Feng; Liu, Jin; Xiao, Shunping
2017-03-19
The extensive applications of multi-function radars (MFRs) have presented a great challenge to the technologies of radar countermeasures (RCMs) and electronic intelligence (ELINT). The recently proposed cognitive electronic warfare (CEW) provides a good solution, whose crux is to perceive present and future MFR behaviours, including the operating modes, waveform parameters, scheduling schemes, etc. Due to the variety and complexity of MFR waveforms, the existing approaches have the drawbacks of inefficiency and weak practicability in prediction. A novel method for MFR behaviour recognition and prediction is proposed based on predictive state representation (PSR). With the proposed approach, operating modes of MFR are recognized by accumulating the predictive states, instead of using fixed transition probabilities that are unavailable in the battlefield. It helps to reduce the dependence of MFR on prior information. And MFR signals can be quickly predicted by iteratively using the predicted observation, avoiding the very large computation brought by the uncertainty of future observations. Simulations with a hypothetical MFR signal sequence in a typical scenario are presented, showing that the proposed methods perform well and efficiently, which attests to their validity.
Simple gain probability functions for large reflector antennas of JPL/NASA
NASA Technical Reports Server (NTRS)
Jamnejad, V.
2003-01-01
Simple models for the patterns as well as their cumulative gain probability and probability density functions of the Deep Space Network antennas are developed. These are needed for the study and evaluation of interference from unwanted sources such as the emerging terrestrial system, High Density Fixed Service, with the Ka-band receiving antenna systems in Goldstone Station of the Deep Space Network.
Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey
Parsons, T.
2004-01-01
New earthquake probability calculations are made for the Sea of Marmara region and the city of Istanbul, providing a revised forecast and an evaluation of time-dependent interaction techniques. Calculations incorporate newly obtained bathymetric images of the North Anatolian fault beneath the Sea of Marmara [Le Pichon et al., 2001; Armijo et al., 2002]. Newly interpreted fault segmentation enables an improved regional A.D. 1500-2000 earthquake catalog and interevent model, which form the basis for time-dependent probability estimates. Calculations presented here also employ detailed models of coseismic and postseismic slip associated with the 17 August 1999 M = 7.4 Izmit earthquake to investigate effects of stress transfer on seismic hazard. Probability changes caused by the 1999 shock depend on Marmara Sea fault-stressing rates, which are calculated with a new finite element model. The combined 2004-2034 regional Poisson probability of M ≥ 7 earthquakes is ~38%, the regional time-dependent probability is 44 ± 18%, and incorporation of stress transfer raises it to 53 ± 18%. The most important effect of adding time dependence and stress transfer to the calculations is an increase in the 30 year probability of an M ≥ 7 earthquake affecting Istanbul. The 30 year Poisson probability at Istanbul is 21%, and the addition of time dependence and stress transfer raises it to 41 ± 14%. The ranges given on probability values are sensitivities of the calculations to input parameters determined by Monte Carlo analysis; 1000 calculations are made using parameters drawn at random from distributions. Sensitivities are large relative to mean probability values and enhancements caused by stress transfer, reflecting a poor understanding of large-earthquake aperiodicity.
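For readers converting between the quoted time windows, the Poisson figures above follow from a constant annual rate λ through a standard relation; the numbers below are rounded and given only for illustration:

    P(T) = 1 - e^{-\lambda T} \quad\Rightarrow\quad \lambda = -\ln(1 - P)/T,

so the 30 year Istanbul value P = 0.21 corresponds to λ ≈ 0.0079 per year, i.e., an annual probability of roughly 0.8%.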
Large Aperture "Photon Bucket" Optical Receiver Performance in High Background Environments
NASA Technical Reports Server (NTRS)
Vilnrotter, Victor A.; Hoppe, D.
2011-01-01
The potential development of large aperture groundbased "photon bucket" optical receivers for deep space communications, with acceptable performance even when pointing close to the sun, is receiving considerable attention. Sunlight scattered by the atmosphere becomes significant at micron wavelengths when pointing to a few degrees from the sun, even with the narrowest bandwidth optical filters. In addition, high quality optical apertures in the 10-30 meter range are costly and difficult to build with accurate surfaces to ensure narrow fields-of-view (FOV). One approach currently under consideration is to polish the aluminum reflector panels of large 34-meter microwave antennas to high reflectance, and accept the relatively large FOV generated by state-of-the-art polished aluminum panels with rms surface accuracies on the order of a few microns, corresponding to several-hundred micro-radian FOV, hence generating centimeter-diameter focused spots at the Cassegrain focus of 34-meter antennas. Assuming pulse-position modulation (PPM) and Poisson-distributed photon-counting detection, a "polished panel" photon-bucket receiver with large FOV will collect hundreds of background photons per PPM slot, along with comparable signal photons due to its large aperture. It is demonstrated that communications performance in terms of PPM symbol-error probability in high-background high-signal environments depends more strongly on signal than on background photons, implying that large increases in background energy can be compensated by a disproportionally small increase in signal energy. This surprising result suggests that large optical apertures with relatively poor surface quality may nevertheless provide acceptable performance for deep-space optical communications, potentially enabling the construction of cost-effective hybrid RF/optical receivers in the future.
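A minimal Monte Carlo sketch of the symbol-error behavior described above, assuming M-ary PPM with Poisson photon counting and ties between slots resolved at random; the PPM order and mean photon counts per slot are illustrative assumptions, not values from the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    M = 16                 # PPM order (assumed)
    Ks, Kb = 50.0, 200.0   # mean signal and background photons per slot (illustrative high-background case)
    trials = 100_000

    signal = rng.poisson(Ks + Kb, trials)        # photon count in the signal slot
    noise = rng.poisson(Kb, (trials, M - 1))     # counts in the remaining M-1 slots
    noise_max = noise.max(axis=1)

    p_err = (noise_max > signal).mean()          # a background slot clearly wins
    p_err += 0.5 * (noise_max == signal).mean()  # ties broken at random (approximated as a coin flip)

    print(f"estimated PPM symbol-error probability: {p_err:.4f}")

Re-running the sketch with a modestly larger Ks and a much larger Kb illustrates the point made above: the error probability depends more strongly on the signal count than on the background count.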
Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis
2016-03-01
Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders make future decisions. However, uncertainty complicates both the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method requires substantial computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while also reducing the computational time.
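A brief Monte Carlo sketch of the decision confidence probability described above, i.e., the probability that option A has a smaller environmental impact than option B; the lognormal impact models and their parameters are assumptions made purely for illustration:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # Monte Carlo sample size

    # Hypothetical uncertain impact scores for two options (distribution choices are assumptions)
    impact_a = rng.lognormal(mean=1.00, sigma=0.30, size=n)
    impact_b = rng.lognormal(mean=1.15, sigma=0.35, size=n)

    confidence = np.mean(impact_a < impact_b)  # decision confidence probability P(A < B)
    print(f"P(option A has the smaller impact) ~= {confidence:.3f}")

The FORM approach estimates the same probability by approximating the surface where the two impacts are equal, which typically requires far fewer model evaluations than brute-force sampling.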
NASA Astrophysics Data System (ADS)
Haeberli, W.
2012-12-01
As a consequence of rapid glacier vanishing, an increasing number of smaller and larger lakes are forming in high-mountain regions worldwide. Such new lakes can be touristic landscape attractions and may also offer interesting potential for hydropower production. However, they increasingly form at the foot of very large and steep icy mountain walls, which are progressively destabilizing due to changing surface and subsurface ice conditions. The probability of far-reaching flood and debris flow catastrophes caused by impact waves from large rock/ice avalanches into lakes may still appear to be small now but steadily increases for long time periods to come. Corresponding projects related to hazard protection and sustainable use should be combined in an integrative and participatory planning process. This planning process must start soon, because the development in nature is fast and most likely accelerating. Technical tools for creating the necessary scientific knowledge basis at local to regional scales exist and can be used. The location of future new lakes in topographic bed depressions of now still glacier-covered areas can be quite safely assessed on the basis of morphological criteria or by applying ice thickness estimates using digital terrain information. Models for ice-thickness estimates couple the depth to bedrock via the basal shear stress with the surface slope and provide a (relative) bed topography which is much more robust than the (absolute) value of the calculated ice thickness. Numerical models at various levels of sophistication can be used to simulate possible future glacier changes in order to establish the probable time of lake formation and the effects of glacier shrinking on runoff seasonality and water supply. The largest uncertainties relate to the (absolute) ice thickness and to the mass/energy fluxes at the surface (climate scenarios, precipitation and albedo changes, etc.). Combined glacier/runoff models can be directly built into models of hydropower operation and economics to test the suitability and feasibility of potential projects. Assessments of hazards and risks must consider the entire chain of processes from slope instability in icy or potentially de-buttressed rock walls via impact waves, breaching of moraine dams, floods and debris flows in river channels and, especially, vulnerability and potential damage to people and infrastructure. High-mountain slope stability under conditions of climate change still constitutes the main weakness in the related knowledge basis and represents a corresponding challenge for focused research.
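The ice-thickness relation referred to above, which couples the depth to bedrock to the surface slope via an assumed basal shear stress, is commonly written in the following form; this is a widely used parameterization of that type, and the exact constants adopted in any particular assessment may differ:

    h = \frac{\tau_b}{f\,\rho\,g\,\sin\alpha},

where h is the local ice thickness, τ_b the basal shear stress, f a valley shape factor (often taken near 0.8), ρ the ice density, g the gravitational acceleration, and α the surface slope; the (relative) bed topography follows by subtracting h from the surface elevation.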
Managing fire and fuels in a warmer climate
David L. Peterson
2010-01-01
This historical perspective on fire provides a window into the future of fire in the Pacific Northwest. Although fire will always be more common in the interior portion of the region, a warmer climate could bring more fire to the westside of the Cascade Range where summers are typically dry and will probably become drier. If future climate resembles the climate now...
Probabilities of Possible Future Prices (Short-Term Energy Outlook Supplement April 2010)
2010-01-01
The Energy Information Administration introduced a monthly analysis of energy price volatility and forecast uncertainty in the October 2009 Short-Term Energy Outlook (STEO). Included in the analysis were charts portraying confidence intervals around the New York Mercantile Exchange (NYMEX) futures prices of West Texas Intermediate (equivalent to light sweet crude oil) and Henry Hub natural gas contracts.
ERIC Educational Resources Information Center
Kyslenko, Dmytro
2017-01-01
The paper discusses the use of information technologies in the professional training of future security specialists in the United States, Great Britain, Poland and Israel. The probable uses of computer-based techniques available within integrated websites have been systematized. It has been suggested that the presented scheme may be of great…
Future fire probability modeling with climate change data and physical chemistry
Richard P. Guyette; Frank R. Thompson; Jodi Whittier; Michael C. Stambaugh; Daniel C. Dey
2014-01-01
Climate has a primary influence on the occurrence and rate of combustion in ecosystems with carbon-based fuels such as forests and grasslands. Society will be confronted with the effects of climate change on fire in future forests. There are, however, few quantitative appraisals of how climate will affect wildland fire in the United States. We demonstrated a method for...
Scenario studies as a synthetic and integrative research activity for Long-Term Ecological Research
Jonathan R. Thompson; Arnim Wiek; Frederick J. Swanson; Stephen R. Carpenter; Nancy Fresco; Teresa Hollingsworth; Thomas A. Spies; David R. Foster
2012-01-01
Scenario studies have emerged as a powerful approach for synthesizing diverse forms of research and for articulating and evaluating alternative socioecological futures. Unlike predictive modeling, scenarios do not attempt to forecast the precise or probable state of any variable at a given point in the future. Instead, comparisons among a set of contrasting scenarios...
Prediction of the future number of wells in production (in Spanish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coca, B.P.
1981-01-01
A method to predict the number of wells that will continue producing at a certain date in the future is presented. The method is applicable to reservoirs of the depletion type and is based on the survival probability concept. This is useful when forecasting by empirical methods. An example of a field in primary production is presented.
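As a hedged sketch of the survival-probability concept mentioned above (the exponential form and its parameter are illustrative assumptions, not the paper's fitted model), the expected number of wells still producing at a future time t can be written as

    N(t) = N_0\,S(t), \qquad \text{e.g. } S(t) = e^{-t/\tau},

where N_0 is the current number of producing wells, S(t) the probability that a given well is still producing at time t, and τ a characteristic productive life estimated from field data.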
We'll Meet Again: Revealing Distributional and Temporal Patterns of Social Contact
Pachur, Thorsten; Schooler, Lael J.; Stevens, Jeffrey R.
2014-01-01
What are the dynamics and regularities underlying social contact, and how can contact with the people in one's social network be predicted? In order to characterize distributional and temporal patterns underlying contact probability, we asked 40 participants to keep a diary of their social contacts for 100 consecutive days. Using a memory framework previously used to study environmental regularities, we predicted that the probability of future contact would follow in systematic ways from the frequency, recency, and spacing of previous contact. The distribution of contact probability across the members of a person's social network was highly skewed, following an exponential function. As predicted, it emerged that future contact scaled linearly with frequency of past contact, proportionally to a power function with recency of past contact, and differentially according to the spacing of past contact. These relations emerged across different contact media and irrespective of whether the participant initiated or received contact. We discuss how the identification of these regularities might inspire more realistic analyses of behavior in social networks (e.g., attitude formation, cooperation). PMID:24475073
Steve Ostro and the Near-Earth Asteroid Impact Hazard
NASA Astrophysics Data System (ADS)
Chapman, Clark R.
2009-09-01
The late Steve Ostro, whose scientific interests in Near-Earth Asteroids (NEAs) primarily related to his planetary radar research in the 1980s, soon became an expert on the impact hazard. He quickly realized that radar provided perspectives on close-approaching NEAs that were both very precise as well as complementary to traditional astrometry, enabling good predictions of future orbits and collision probabilities extending for centuries into the future. He also was among the few astronomers who considered the profound issues raised by this newly recognized hazard and by early suggestions of how to mitigate the hazard. With Carl Sagan, Ostro articulated the "deflection dilemma" and other potential low-probability but real dangers of mitigation technologies that might be more serious than the low-probability impact hazard itself. Yet Ostro maintained a deep interest in developing responsible mitigation technologies, in educating the public about the nature of the impact hazard, and in learning more about the population of threatening bodies, especially using the revealing techniques of delay-doppler radar mapping of NEAs and their satellites.
Real-Time Safety Monitoring and Prediction for the National Airspace System
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil
2016-01-01
As new operational paradigms and additional aircraft are being introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have both an overview of the current safety of the airspace at different levels of granularity, as well as an understanding of how the state of safety will evolve in the future given the anticipated flight plans, weather forecasts, predicted health of assets in the airspace, and so on. To this end, we have developed a Real-Time Safety Monitoring (RTSM) framework that first estimates the state of the NAS using dynamic models. Then, given the state estimate and a probability distribution of future inputs to the NAS, the framework predicts the evolution of the NAS, i.e., the future state, and analyzes these future states to predict the occurrence of unsafe events. The entire probability distribution of airspace safety metrics is computed, not just point estimates, without significant assumptions regarding the distribution type and/or parameters. We demonstrate our overall approach by predicting the occurrence of some unsafe events and show how these predictions evolve in time as flight operations progress.
Antipoaching standards in onshore hydrocarbon concessions drawn from a Central African case study.
Vanthomme, Hadrien P A; Tobi, Elie; Todd, Angelique F; Korte, Lisa; Alonso, Alfonso
2017-06-01
Unsustainable hunting outside protected areas is threatening tropical biodiversity worldwide and requires conservationists to engage increasingly in antipoaching activities. Following the example of ecocertified logging companies, we argue that other extractive industries managing large concessions should engage in antipoaching activities as part of their environmental management plans. Onshore hydrocarbon concessions should also adopt antipoaching protocols as a standard because they represent a biodiversity threat comparable to logging. We examined the spatiotemporal patterns of small- and large-mammal poaching in an onshore oil concession in Gabon, Central Africa, with a Bayesian occupancy model based on signs of poaching collected from 2010 to 2015 on antipoaching patrols. Patrol locations were initially determined based on local intelligence and past patrol successes (adaptive management) and subsequently with a systematic sampling of the concession. We generated maps of poaching probability in the concession and determined the temporal trends of this threat over 5 years. The spatiotemporal patterns of large- and small-mammal poaching differed throughout the concession, and likely these groups will need different management strategies. By elucidating the relationship between site-specific sampling effort and detection probability, the Bayesian method allowed us to set goals for future antipoaching patrols. Our results indicate that a combination of systematic sampling and adaptive management data is necessary to infer spatiotemporal patterns with the statistical method we used. On the basis of our case study, we recommend hydrocarbon companies interested in implementing efficient antipoaching activities in their onshore concessions to lay the foundation of long-needed industry standards by: adequately measuring antipoaching effort; mixing adaptive management and balanced sampling; setting goals for antipoaching effort; pairing patrols with large-mammal monitoring; supporting antipoaching patrols across the landscape; restricting access to their concessions; performing random searches for bushmeat and mammal products at points of entry; controlling urban and agricultural expansion; supporting bushmeat alternatives; and supporting land-use planning. Published 2016. This article is a U.S. Government work and is in the public domain in the USA. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
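For background, the site-level likelihood of a basic single-season occupancy model, which underlies such estimates of detection probability, is given below; this is a textbook form shown for orientation, and the Bayesian spatial model used in the study adds covariates and priors on top of this structure:

    L(\psi, p \mid y_1,\dots,y_K) = \psi \prod_{k=1}^{K} p^{y_k}(1-p)^{1-y_k} + (1-\psi)\,\mathbf{1}\Big\{\sum_{k} y_k = 0\Big\},

where ψ is the probability that poaching occurs at a site, p the per-visit detection probability, and y_k the detection outcome (1 or 0) on patrol visit k.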
Space Situational Awareness of Large Numbers of Payloads From a Single Deployment
NASA Astrophysics Data System (ADS)
Segerman, A.; Byers, J.; Emmert, J.; Nicholas, A.
2014-09-01
The nearly simultaneous deployment of a large number of payloads from a single vehicle presents a new challenge for space object catalog maintenance and space situational awareness (SSA). Following two cubesat deployments last November, it took five weeks to catalog the resulting 64 orbits. The upcoming Kicksat mission will present an even greater SSA challenge, with its deployment of 128 chip-sized picosats. Although all of these deployments are in short-lived orbits, future deployments will inevitably occur at higher altitudes, with a longer term threat of collision with active spacecraft. With such deployments, individual scientific payload operators require rapid precise knowledge of their satellites' locations. Following the first November launch, the cataloguing did not initially associate a payload with each orbit, leaving this to the satellite operators. For short duration missions, the time required to identify an experiment's specific orbit may easily be a large fraction of the spacecraft's lifetime. For a Kicksat-type deployment, present tracking cannot collect enough observations to catalog each small object. The current approach is to treat the chip cloud as a single catalog object. However, the cloud dissipates into multiple subclouds and, ultimately, tiny groups of untrackable chips. One response to this challenge may be to mandate installation of a transponder on each spacecraft. Directional transponder transmission detections could be used as angle observations for orbit cataloguing. Of course, such an approach would only be employable with cooperative spacecraft. In other cases, a probabilistic association approach may be useful, with the goal being to establish the probability of an element being at a given point in space. This would permit more reliable assessment of the probability of collision of active spacecraft with any cloud element. This paper surveys the cataloguing challenges presented by large scale deployments of small spacecraft, examining current methods. Potential new approaches are discussed, including simulations to evaluate their utility. Acknowledgement: This work was supported by the Office of the Assistant Secretary of Defense for R&E, via the Data-to-Decisions program.
Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.
2007-01-01
Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
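A short sketch of the conditional-probability calculation described above for a mixed-exponential interevent model; the mixture weights and rates below are placeholders, not the fitted Medicine Lake parameters:

    import numpy as np

    # Mixed-exponential model for intervals between eruptions (parameters are illustrative only)
    w = np.array([0.7, 0.3])             # mixture weights
    lam = np.array([1/300.0, 1/8000.0])  # rates in 1/years

    def survival(t):
        """P(interevent time > t) under the mixed-exponential model."""
        return float(np.sum(w * np.exp(-lam * t)))

    def conditional_prob(t_since, dt=1.0):
        """P(eruption within the next dt years | t_since years since the last eruption)."""
        return 1.0 - survival(t_since + dt) / survival(t_since)

    print(conditional_prob(t_since=1000.0))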
Generating intrinsically disordered protein conformational ensembles from a Markov chain
NASA Astrophysics Data System (ADS)
Cukier, Robert I.
2018-03-01
Intrinsically disordered proteins (IDPs) sample a diverse conformational space. They are important to signaling and regulatory pathways in cells. An entropy penalty must be paid when an IDP becomes ordered upon interaction with another protein or a ligand. Thus, the degree of conformational disorder of an IDP is of interest. We create a dichotomic Markov model that can explore entropic features of an IDP. The Markov condition introduces local (neighbor residues in a protein sequence) rotamer dependences that arise from van der Waals and other chemical constraints. A protein sequence of length N is characterized by its (information) entropy and mutual information, MIMC, the latter providing a measure of the dependence among the random variables describing the rotamer probabilities of the residues that comprise the sequence. For a Markov chain, the MIMC is proportional to the pair mutual information MI, which depends on the singlet and pair probabilities of neighbor residue rotamer sampling. All 2^N sequence states are generated, along with their probabilities, and contrasted with the probabilities under the assumption of independent residues. An efficient method to generate realizations of the chain is also provided. The chain entropy, MIMC, and state probabilities provide the ingredients to distinguish different scenarios using the terminologies: MoRF (molecular recognition feature), not-MoRF, and not-IDP. A MoRF corresponds to large entropy and large MIMC (strong dependence among the residues' rotamer sampling), a not-MoRF corresponds to large entropy but small MIMC, and not-IDP corresponds to low entropy irrespective of the MIMC. We show that MoRFs are most appropriate as descriptors of IDPs. They provide a reasonable number of high-population states that reflect the dependences between neighbor residues, thus classifying them as IDPs, yet without very large entropy that might lead to an excessively high entropy penalty.
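A compact numerical sketch of the dichotomic Markov construction described above, computing the stationary rotamer probabilities, the neighbor-pair mutual information, and the chain entropy rate from a two-state transition matrix; the transition probabilities are arbitrary illustrations, not values from the paper:

    import numpy as np

    # Two-state (dichotomic) Markov chain: T[i, j] = P(next rotamer state j | current state i)
    T = np.array([[0.8, 0.2],
                  [0.3, 0.7]])  # illustrative transition probabilities

    # Stationary distribution pi solves pi = pi T (Perron eigenvector of T transposed)
    evals, evecs = np.linalg.eig(T.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()

    # Neighbor-pair probabilities P(i, j) = pi_i * T_ij and pair mutual information MI
    pair = pi[:, None] * T
    MI = np.sum(pair * np.log2(pair / (pi[:, None] * pi[None, :])))

    # Entropy rate per residue and singlet entropy
    H_rate = -np.sum(pi[:, None] * T * np.log2(T))
    H_singlet = -np.sum(pi * np.log2(pi))

    print(f"MI = {MI:.4f} bits, entropy rate = {H_rate:.4f} bits, singlet entropy = {H_singlet:.4f} bits")

Large MI together with near-maximal entropy would correspond to the MoRF-like regime discussed above, while MI near zero with large entropy corresponds to the not-MoRF case.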
Projecting Future Sea Level Rise for Water Resources Planning in California
NASA Astrophysics Data System (ADS)
Anderson, J.; Kao, K.; Chung, F.
2008-12-01
Sea level rise is one of the major concerns for the management of California's water resources. Higher water levels and salinity intrusion into the Sacramento-San Joaquin Delta could affect water supplies, water quality, levee stability, and aquatic and terrestrial flora and fauna species and their habitat. Over the 20th century, sea levels near San Francisco Bay increased by over 0.6ft. Some tidal gauge and satellite data indicate that rates of sea level rise are accelerating. Sea levels are expected to continue to rise due to increasing air temperatures causing thermal expansion of the ocean and melting of land-based ice such as ice on Greenland and in southeastern Alaska. For water planners, two related questions are raised on the uncertainty of future sea levels. First, what is the expected sea level at a specific point in time in the future, e.g., what is the expected sea level in 2050? Second, what is the expected point of time in the future when sea levels will exceed a certain height, e.g., what is the expected range of time when the sea level rises by one foot? To address these two types of questions, two factors are considered: (1) long term sea level rise trend, and (2) local extreme sea level fluctuations. A two-step approach will be used to develop sea level rise projection guidelines for decision making that takes both of these factors into account. The first step is developing global sea level rise probability distributions for the long term trends. The second step will extend the approach to take into account the effects of local astronomical tides, changes in atmospheric pressure, wind stress, floods, and the El Niño/Southern Oscillation. In this paper, the development of the first step approach is presented. To project the long term sea level rise trend, one option is to extend the current rate of sea level rise into the future. However, since recent data indicate rates of sea level rise are accelerating, methods for estimating sea level rise that account for this acceleration are needed. One such method is an empirical relationship between air temperatures and global sea levels. The air temperature-sea level rise relationship was applied to the 12 climate change projections selected by the California Climate Action Team to estimate future sea levels. The 95% confidence level developed from the historical data was extrapolated to estimate the uncertainties in the future projections. To create sea level rise trend probability distributions, a lognormal probability distribution and a generalized extreme value probability distribution are used. Parameter estimations for these distributions are subjective and inevitably involve uncertainties, which will be improved as more research is conducted in this area.
Subjective Probabilities in Household Surveys
Hurd, Michael D.
2011-01-01
Subjective probabilities are now collected on a number of large household surveys with the objective of providing data to better understand inter-temporal decision making. Comparison of subjective probabilities with actual outcomes shows that the probabilities have considerable predictive power in situations where individuals have substantial private information, such as survival and retirement. In contrast, the subjective probability of a stock market gain varies greatly across individuals even though no one has private information and the outcome is the same for everyone. An explanation is that there is considerable variation in accessing and processing information. Further, the subjective probability of a stock market gain is considerably lower than historical averages, providing an explanation for the relatively low frequency of stock holding. An important research objective will be to understand how individuals form their subjective probabilities. PMID:21643535
Probability for Weather and Climate
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design for supporting decision making versus advancing science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future even probabilistically are not communicated clearly.
To the horizon and beyond: Weak lensing of the CMB and binary inspirals into horizonless objects
NASA Astrophysics Data System (ADS)
Kesden, Michael
This thesis examines two predictions of general relativity: weak lensing and gravitational waves. The cosmic microwave background (CMB) is gravitationally lensed by the large-scale structure between the observer and the last-scattering surface. This weak lensing induces non-Gaussian correlations that can be used to construct estimators for the deflection field. The error and bias of these estimators are derived and used to analyze the viability of lensing reconstruction for future CMB experiments. Weak lensing also affects the one-point probability distribution function of the CMB. The skewness and kurtosis induced by lensing and the Sunyaev-Zel'dovich (SZ) effect are calculated as functions of the angular smoothing scale of the map. While these functions offer the advantage of easy computability, only the skewness from lensing-SZ correlations can potentially be detected, even in the limit of the largest amplitude fluctuations allowed by observation. Lensing estimators are also essential to constrain inflation, the favored explanation for large-scale isotropy and the origin of primordial perturbations. B-mode polarization is considered to be a "smoking-gun" signature of inflation, and lensing estimators can be used to recover primordial B-modes from lensing-induced contamination. The ability of future CMB experiments to constrain inflation is assessed as functions of survey size and instrumental sensitivity. A final application of lensing estimators is to constrain a possible cutoff in primordial density perturbations on near-horizon scales. The paucity of independent modes on such scales limits the statistical certainty of such a constraint. Measurements of the deflection field can be used to constrain at the 3σ level the existence of a cutoff large enough to account for current CMB observations. A final chapter of this thesis considers an independent topic: the gravitational-wave (GW) signature of a binary inspiral into a horizonless object. If the supermassive objects at galactic centers lack the horizons of traditional black holes, inspiraling objects could emit GWs after passing within their surfaces. The GWs produced by such an inspiral are calculated, revealing distinctive features potentially observable by future GW observatories.
Martin, Petra; Leighl, Natasha B.
2017-01-01
This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice. PMID:28607579
A Sensitivity Study on the Effectiveness of Active Debris Removal in LEO
NASA Technical Reports Server (NTRS)
Liou, J. C.; Johnson, Nicholas L.
2007-01-01
The near-Earth orbital debris population will continue to increase in the future due to ongoing space activities, on-orbit explosions, and accidental collisions among resident space objects. Commonly adopted mitigation measures, such as limiting postmission orbital lifetimes of satellites to less than 25 years, will slow down the population growth, but may be insufficient to stabilize the environment. The nature of the growth, in the low Earth orbit (LEO) region, is further demonstrated by a recent study where no future space launches were conducted in the environment projection simulations. The results indicate that, even with no new launches, the LEO debris population would remain relatively constant for only the next 50 years. Beyond that, the debris population would begin to increase noticeably, due to the production of collisional debris. Therefore, to better limit the growth of the future debris population and protect the environment, a remediation option, i.e., removing existing large and massive objects from orbit, needs to be considered. This paper does not intend to address the technical or economic issues for active debris removal. Rather, the objective is to provide a sensitivity study to quantify the effectiveness of various remediation options. A removal criterion based upon mass and collision probability is developed to rank objects at the beginning of each projection year. This study includes simulations with removal rates ranging from 2 to 20 objects per year, starting in the year 2020. The outcome of each simulation is analyzed and compared with the others. The summary of the study serves as a general guideline for future debris removal consideration.
The safety of high-hazard water infrastructures in the U.S. Pacific Northwest in a changing climate
NASA Astrophysics Data System (ADS)
Chen, X.; Hossain, F.; Leung, L. R.
2017-12-01
The safety of large and aging water infrastructures is gaining attention in water management given the accelerated rate of change in landscape, climate and society. In current engineering practice, such safety is ensured by the design of infrastructure for the Probable Maximum Precipitation (PMP). Recently, several numerical modeling approaches have been proposed to modernize the conventional and ad hoc PMP estimation approach. However, the underlying physics have not been investigated and thus differing PMP estimates are obtained without clarity on their interpretation. In this study, we present a hybrid approach that takes advantage of both traditional engineering practice and modern climate science to estimate PMP for current and future climate conditions. The traditional PMP approach is improved and applied to five statistically downscaled CMIP5 model outputs, producing an ensemble of PMP estimates in the Pacific Northwest (PNW) during the historical (1970-2016) and future (2050-2099) time periods. The new historical PMP estimates are verified against the traditional estimates. PMP in the PNW will increase by 50%±30% of the current level by 2099 under the RCP8.5 scenario. Most of the increase is caused by warming, which mainly affects moisture availability through increased sea surface temperature, with minor contributions from changes in storm efficiency in the future. Moist track change tends to reduce the future PMP. Compared with extreme precipitation, PMP exhibits higher internal variability. Thus long-time records of high-quality data in both precipitation and related meteorological fields (temperature, wind fields) are required to reduce uncertainties in the ensemble PMP estimates.
Using scenarios to assess possible future impacts of invasive species in the Laurentian Great Lakes
Lauber, T. Bruce; Stedman, Richard C.; Connelly, Nancy A; Rudstam, Lars G.; Ready, Richard C; Poe, Gregory L; Bunnell, David B.; Hook, Tomas O.; Koops, Marten A.; Ludsin, Stuart A.; Rutherford, Edward S; Wittmann, Marion E.
2016-01-01
The expected impacts of invasive species are key considerations in selecting policy responses to potential invasions. But predicting the impacts of invasive species is daunting, particularly in large systems threatened by multiple invasive species, such as North America’s Laurentian Great Lakes. We developed and evaluated a scenario-building process that relied on an expert panel to assess possible future impacts of aquatic invasive species on recreational fishing in the Great Lakes. To maximize its usefulness to policy makers, this process was designed to be implemented relatively rapidly and consider a range of species. The expert panel developed plausible, internally-consistent invasion scenarios for 5 aquatic invasive species, along with subjective probabilities of those scenarios. We describe these scenarios and evaluate this approach for assessing future invasive species impacts. The panel held diverse opinions about the likelihood of the scenarios, and only one scenario with impacts on sportfish species was considered likely by most of the experts. These outcomes are consistent with the literature on scenario building, which advocates for developing a range of plausible scenarios in decision making because the uncertainty of future conditions makes the likelihood of any particular scenario low. We believe that this scenario-building approach could contribute to policy decisions about whether and how to address the possible impacts of invasive species. In this case, scenarios could allow policy makers to narrow the range of possible impacts on Great Lakes fisheries they consider and help set a research agenda for further refining invasive species predictions.
Garske, Tini; Van Kerkhove, Maria D; Yactayo, Sergio; Ronveaux, Olivier; Lewis, Rosamund F; Staples, J Erin; Perea, William; Ferguson, Neil M
2014-05-01
Yellow fever is a vector-borne disease affecting humans and non-human primates in tropical areas of Africa and South America. While eradication is not feasible due to the wildlife reservoir, large scale vaccination activities in Africa during the 1940s to 1960s reduced yellow fever incidence for several decades. However, after a period of low vaccination coverage, yellow fever has resurged in the continent. Since 2006 there has been substantial funding for large preventive mass vaccination campaigns in the most affected countries in Africa to curb the rising burden of disease and control future outbreaks. Contemporary estimates of the yellow fever disease burden are lacking, and the present study aimed to update the previous estimates on the basis of more recent yellow fever occurrence data and improved estimation methods. Generalised linear regression models were fitted to a dataset of the locations of yellow fever outbreaks within the last 25 years to estimate the probability of outbreak reports across the endemic zone. Environmental variables and indicators for the surveillance quality in the affected countries were used as covariates. By comparing probabilities of outbreak reports estimated in the regression with the force of infection estimated for a limited set of locations for which serological surveys were available, the detection probability per case and the force of infection were estimated across the endemic zone. The yellow fever burden in Africa was estimated for the year 2013 as 130,000 (95% CI 51,000-380,000) cases with fever and jaundice or haemorrhage including 78,000 (95% CI 19,000-180,000) deaths, taking into account the current level of vaccination coverage. The impact of the recent mass vaccination campaigns was assessed by evaluating the difference between the estimates obtained for the current vaccination coverage and for a hypothetical scenario excluding these vaccination campaigns. Vaccination campaigns were estimated to have reduced the number of cases and deaths by 27% (95% CI 22%-31%) across the region, achieving up to an 82% reduction in countries targeted by these campaigns. A limitation of our study is the high level of uncertainty in our estimates arising from the sparseness of data available from both surveillance and serological surveys. With the estimation method presented here, spatial estimates of transmission intensity can be combined with vaccination coverage levels to evaluate the impact of past or proposed vaccination campaigns, thereby helping to allocate resources efficiently for yellow fever control. This method has been used by the Global Alliance for Vaccines and Immunization (GAVI Alliance) to estimate the potential impact of future vaccination campaigns.
Garske, Tini; Van Kerkhove, Maria D.; Yactayo, Sergio; Ronveaux, Olivier; Lewis, Rosamund F.; Staples, J. Erin; Perea, William; Ferguson, Neil M.
2014-01-01
Background Yellow fever is a vector-borne disease affecting humans and non-human primates in tropical areas of Africa and South America. While eradication is not feasible due to the wildlife reservoir, large scale vaccination activities in Africa during the 1940s to 1960s reduced yellow fever incidence for several decades. However, after a period of low vaccination coverage, yellow fever has resurged in the continent. Since 2006 there has been substantial funding for large preventive mass vaccination campaigns in the most affected countries in Africa to curb the rising burden of disease and control future outbreaks. Contemporary estimates of the yellow fever disease burden are lacking, and the present study aimed to update the previous estimates on the basis of more recent yellow fever occurrence data and improved estimation methods. Methods and Findings Generalised linear regression models were fitted to a dataset of the locations of yellow fever outbreaks within the last 25 years to estimate the probability of outbreak reports across the endemic zone. Environmental variables and indicators for the surveillance quality in the affected countries were used as covariates. By comparing probabilities of outbreak reports estimated in the regression with the force of infection estimated for a limited set of locations for which serological surveys were available, the detection probability per case and the force of infection were estimated across the endemic zone. The yellow fever burden in Africa was estimated for the year 2013 as 130,000 (95% CI 51,000–380,000) cases with fever and jaundice or haemorrhage including 78,000 (95% CI 19,000–180,000) deaths, taking into account the current level of vaccination coverage. The impact of the recent mass vaccination campaigns was assessed by evaluating the difference between the estimates obtained for the current vaccination coverage and for a hypothetical scenario excluding these vaccination campaigns. Vaccination campaigns were estimated to have reduced the number of cases and deaths by 27% (95% CI 22%–31%) across the region, achieving up to an 82% reduction in countries targeted by these campaigns. A limitation of our study is the high level of uncertainty in our estimates arising from the sparseness of data available from both surveillance and serological surveys. Conclusions With the estimation method presented here, spatial estimates of transmission intensity can be combined with vaccination coverage levels to evaluate the impact of past or proposed vaccination campaigns, thereby helping to allocate resources efficiently for yellow fever control. This method has been used by the Global Alliance for Vaccines and Immunization (GAVI Alliance) to estimate the potential impact of future vaccination campaigns. Please see later in the article for the Editors' Summary PMID:24800812
NASA Technical Reports Server (NTRS)
Norbury, John W.
1992-01-01
The very large electromagnetic dissociation (EMD) cross sections recently observed by Hill, Wohn, Schwellenbach, and Smith do not agree with Weizsacker-Williams (WW) theory or any simple modification thereof. Calculations are presented of the reaction probabilities for this experiment and for the entire single- and double-nucleon removal EMD data set. It is found that for those few reactions where theory and experiment disagree, the probabilities are exceptionally large. This indicates that WW theory is not valid for these reactions and that one must consider higher-order corrections and perhaps even a non-perturbative approach to quantum electrodynamics (QED).
Multiple interactions and rapidity gap survival
NASA Astrophysics Data System (ADS)
Khoze, V. A.; Martin, A. D.; Ryskin, M. G.
2018-05-01
Observations of rare processes containing large rapidity gaps at high energy colliders may be exceptionally informative. However, the cross sections of these events are small in comparison with those for the inclusive processes, since there is a large probability that the gaps may be filled by secondary particles arising from additional soft interactions or from gluon radiation. Here we review the calculations of the probability that the gaps survive population by particles from these effects for a wide range of different processes.
Current Fluctuations in Stochastic Lattice Gases
NASA Astrophysics Data System (ADS)
Bertini, L.; de Sole, A.; Gabrielli, D.; Jona-Lasinio, G.; Landim, C.
2005-01-01
We study current fluctuations in lattice gases in the macroscopic limit, extending the dynamic approach for density fluctuations developed in previous articles. More precisely, we establish a large deviation theory for the space-time fluctuations of the empirical current which includes the previous results. We then estimate the probability of a fluctuation of the average current over a large time interval. It turns out that recent results by Bodineau and Derrida [Phys. Rev. Lett. 92, 180601 (2004)] in certain cases underestimate this probability due to the occurrence of dynamical phase transitions.
42 CFR 438.700 - Basis for imposition of sanctions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... medical condition or history indicates probable need for substantial future medical services. (4... directly, or indirectly through any agent or independent contractor, marketing materials that have not been...
Hardebeck, Jeanne L.
2012-01-01
The 2003 M 6.5 San Simeon, California, earthquake caused significant damage in the city of Paso Robles and a persistent cluster of aftershocks close to Paso Robles near the Rinconada fault. Given the importance of secondary aftershock triggering in sequences of large events, a concern is whether this cluster of events could trigger another damaging earthquake near Paso Robles. An epidemic‐type aftershock sequence (ETAS) model is fit to the Rinconada seismicity, and multiple realizations indicate a 0.36% probability of at least one M≥6.0 earthquake during the next 30 years. However, this probability estimate is only as good as the projection into the future of the ETAS model. There is evidence that the seismicity may be influenced by fluid pressure changes, which cannot be forecasted using ETAS. The strongest evidence for fluids is the delay between the San Simeon mainshock and a high rate of seismicity in mid to late 2004. This delay can be explained as having been caused by a pore pressure decrease due to an undrained response to the coseismic dilatation, followed by increased pore pressure during the return to equilibrium. Seismicity migration along the fault also suggests fluid involvement, although the migration is too slow to be consistent with pore pressure diffusion. All other evidence, including focal mechanisms and b‐value, is consistent with tectonic earthquakes. This suggests a model where the role of fluid pressure changes is limited to the first seven months, while the fluid pressure equilibrates. The ETAS modeling adequately fits the events after July 2004 when the pore pressure stabilizes. The ETAS models imply that while the probability of a damaging earthquake on the Rinconada fault has approximately doubled due to the San Simeon earthquake, the absolute probability remains low.
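The ETAS ingredients behind this forecast are not reproduced in the abstract; the sketch below is a minimal, hypothetical illustration of the standard pieces such a forecast combines: an Omori-Utsu aftershock intensity with exponential magnitude productivity, integrated over a 30-year window and combined with a Gutenberg-Richter magnitude distribution to give the probability of at least one M ≥ 6.0 event. All parameter values and catalog entries are placeholders, not the fitted Rinconada values, and secondary triggering within the forecast window is ignored.

```python
import numpy as np

# Placeholder ETAS parameters (illustrative, not the fitted Rinconada values).
mu    = 0.05   # background rate, events/day above magnitude m0
K     = 0.02   # productivity constant
alpha = 1.0    # productivity scaling with magnitude
c     = 0.01   # days
p     = 1.1    # Omori decay exponent
m0    = 2.5    # minimum magnitude of the modeled catalog
b     = 1.0    # Gutenberg-Richter b-value

# Hypothetical past catalog: (time in days before forecast start, magnitude).
past = [(3000.0, 6.5), (2990.0, 4.8), (1500.0, 4.2)]

def expected_count(t_start, t_end):
    """Expected number of m >= m0 events in [t_start, t_end] days from now,
    from the background plus direct aftershocks of the past catalog
    (secondary triggering inside the window is neglected)."""
    n = mu * (t_end - t_start)
    for t_i, m_i in past:
        a = t_start + t_i + c
        bnd = t_end + t_i + c
        # Integral of K*exp(alpha*(m_i-m0)) * (t + t_i + c)^(-p) over the window.
        n += K * np.exp(alpha * (m_i - m0)) * (a**(1 - p) - bnd**(1 - p)) / (p - 1)
    return n

horizon_days = 30 * 365.25
n_total = expected_count(0.0, horizon_days)
frac_m6 = 10 ** (-b * (6.0 - m0))          # G-R fraction of events with m >= 6.0
p_at_least_one = 1.0 - np.exp(-n_total * frac_m6)
print(f"P(at least one M>=6.0 in 30 yr) ~ {p_at_least_one:.3%}")
```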
Third-party punishment as a costly signal of high continuation probabilities in repeated games.
Jordan, Jillian J; Rand, David G
2017-05-21
Why do individuals pay costs to punish selfish behavior, even as third-party observers? A large body of research suggests that reputation plays an important role in motivating such third-party punishment (TPP). Here we focus on a recently proposed reputation-based account (Jordan et al., 2016) that invokes costly signaling. This account proposed that "trustworthy type" individuals (who are incentivized to cooperate with others) typically experience lower costs of TPP, and thus that TPP can function as a costly signal of trustworthiness. Specifically, it was argued that some but not all individuals face incentives to cooperate, making them high-quality and trustworthy interaction partners; and, because the same mechanisms that incentivize cooperation also create benefits for using TPP to deter selfish behavior, these individuals are likely to experience reduced costs of punishing selfishness. Here, we extend this conceptual framework by providing a concrete, "from-the-ground-up" model demonstrating how this process could work in the context of repeated interactions incentivizing both cooperation and punishment. We show how individual differences in the probability of future interaction can create types that vary in whether they find cooperation payoff-maximizing (and thus make high-quality partners), as well as in their net costs of TPP - because a higher continuation probability increases the likelihood of receiving rewards from the victim of the punished transgression (thus offsetting the cost of punishing). We also provide a simple model of dispersal that demonstrates how types that vary in their continuation probabilities can stably coexist, because the payoff from remaining in one's local environment (i.e. not dispersing) decreases with the number of others who stay. Together, this model demonstrates, from the group up, how TPP can serve as a costly signal of trustworthiness arising from exposure to repeated interactions. Copyright © 2017 Elsevier Ltd. All rights reserved.
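The paper's own model is not reproduced in the abstract; the sketch below only illustrates the underlying repeated-game logic with standard, assumed ingredients: a prisoner's dilemma with grim-trigger play, in which cooperation is payoff-maximizing only when the continuation probability is high enough, and in which the immediate cost of third-party punishment can be offset by an assumed stream of future rewards from the victim.

```python
# Illustrative repeated prisoner's dilemma payoffs (assumed, not from the paper).
T, R, P, S = 5.0, 3.0, 1.0, 0.0   # temptation, reward, punishment, sucker

def cooperation_pays(delta):
    """Grim trigger: cooperating forever beats a one-shot defection iff
    R/(1-delta) >= T + delta*P/(1-delta)."""
    return R / (1 - delta) >= T + delta * P / (1 - delta)

def net_punishment_cost(delta, cost=1.0, reward_per_round=0.5):
    """Immediate cost of third-party punishment, offset by an expected stream of
    rewards from the victim in future rounds (purely illustrative numbers)."""
    return cost - delta * reward_per_round / (1 - delta)

for delta in (0.2, 0.5, 0.8):
    print(f"delta={delta}: cooperate? {cooperation_pays(delta)}, "
          f"net TPP cost={net_punishment_cost(delta):+.2f}")
```

With these placeholder payoffs, cooperation becomes payoff-maximizing at continuation probabilities of about 0.5 and above, and the net cost of punishing turns negative as the continuation probability rises, mirroring the qualitative claim of the abstract.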
Patriarca, Peter A; Van Auken, R Michael; Kebschull, Scott A
2018-01-01
Benefit-risk evaluations of drugs have been conducted since the introduction of modern regulatory systems more than 50 years ago. Such judgments are typically made on the basis of qualitative or semiquantitative approaches, often without the aid of quantitative assessment methods, the latter having often been applied asymmetrically to place emphasis on benefit more so than harm. In an effort to preliminarily evaluate the utility of lives lost or saved, or quality-adjusted life-years (QALY) lost and gained as a means of quantitatively assessing the potential benefits and risks of a new chemical entity, we focused our attention on the unique scenario in which a drug was initially approved based on one set of data, but later withdrawn from the market based on a second set of data. In this analysis, a dimensionless risk to benefit ratio was calculated in each instance, based on the risk and benefit quantified in similar units. The results indicated that FDA decisions to approve the drug corresponded to risk to benefit ratios less than or equal to 0.136, and that decisions to withdraw the drug from the US market corresponded to risk to benefit ratios greater than or equal to 0.092. The probability of FDA approval was then estimated using logistic regression analysis. The results of this analysis indicated that there was a 50% probability of FDA approval if the risk to benefit ratio was 0.121, and that the probability approaches 100% for values much less than 0.121, and the probability approaches 0% for values much greater than 0.121. The large uncertainty in these estimates due to the small sample size and overlapping data may be addressed in the future by applying the methodology to other drugs.
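The abstract reports the fitted 50% point (a risk-to-benefit ratio of 0.121) but not the slope of the logistic curve; the sketch below assumes a logistic form centered at that value with an arbitrary steepness parameter, purely to illustrate how such a regression maps a ratio onto an approval probability.

```python
import math

X50 = 0.121   # ratio at which P(approval) = 50%, as reported in the abstract
K   = 60.0    # steepness of the curve -- an assumed value, not estimated in the paper

def p_approval(ratio):
    """Illustrative logistic curve for P(FDA approval) vs. risk-to-benefit ratio."""
    return 1.0 / (1.0 + math.exp(K * (ratio - X50)))

for r in (0.05, 0.092, 0.121, 0.136, 0.25):
    print(f"ratio={r:.3f} -> P(approval) ~ {p_approval(r):.2f}")
```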
Wildfire risk in the wildland-urban interface: A simulation study in northwestern Wisconsin
Massada, Avi Bar; Radeloff, Volker C.; Stewart, Susan I.; Hawbaker, Todd J.
2009-01-01
The rapid growth of housing in and near the wildland-urban interface (WUI) increases wildfire risk to lives and structures. To reduce fire risk, it is necessary to identify WUI housing areas that are more susceptible to wildfire. This is challenging, because wildfire patterns depend on fire behavior and spread, which in turn depend on ignition locations, weather conditions, the spatial arrangement of fuels, and topography. The goal of our study was to assess wildfire risk to a 60,000 ha WUI area in northwestern Wisconsin while accounting for all of these factors. We conducted 6000 simulations with two dynamic fire models, Fire Area Simulator (FARSITE) and Minimum Travel Time (MTT), in order to map the spatial pattern of burn probabilities. Simulations were run under normal and extreme weather conditions to assess the effect of weather on fire spread, burn probability, and risk to structures. The resulting burn probability maps were intersected with maps of structure locations and land cover types. The simulations revealed clear hotspots of wildfire activity and a large range of wildfire risk to structures in the study area. As expected, the extreme weather conditions yielded higher burn probabilities over the entire landscape, as well as to different land cover classes and individual structures. Moreover, the spatial pattern of risk was significantly different between extreme and normal weather conditions. The results highlight the fact that extreme weather conditions not only produce higher fire risk than normal weather conditions, but also change the fine-scale locations of high risk areas in the landscape, which is of great importance for fire management in WUI areas. In addition, the choice of weather data may limit the potential for comparisons of risk maps for different areas and for extrapolating risk maps to future scenarios where weather conditions are unknown. Our approach to modeling wildfire risk to structures can aid fire risk reduction management activities by identifying areas with elevated wildfire risk and those most vulnerable under extreme weather conditions.
NASA Technical Reports Server (NTRS)
Massey, J. L.
1976-01-01
The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
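The abstract does not reproduce Massey's extended confidence construction; as a stand-in, the sketch below uses the standard exact (Clopper-Pearson) binomial interval to show how little two observed decoding errors in a large number of trials constrain the true error probability. The trial counts are illustrative.

```python
from scipy.stats import beta

def clopper_pearson(k, n, conf=0.95):
    """Exact two-sided binomial confidence interval for an error probability,
    given k observed errors in n independent trials."""
    alpha = 1.0 - conf
    lo = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lo, hi

# Two decoding errors observed in one million decoding trials (illustrative numbers).
lo, hi = clopper_pearson(k=2, n=1_000_000)
print(f"95% CI for the decoder error probability: [{lo:.2e}, {hi:.2e}]")
```

Even with a million trials, the upper and lower bounds differ by more than an order of magnitude, which is the practical difficulty the paper addresses.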
NASA Astrophysics Data System (ADS)
Rey Vicario, D.; Holman, I.
2016-12-01
The use of water for irrigation and on-farm reservoir filling is globally important for agricultural production. In humid climates, like the UK, supplemental irrigation can be critical to buffer the effects of rainfall variability and to achieve high quality crops. Given regulatory efforts to secure sufficient environmental river flows and meet rising water demands due to population growth and climate change, increasing water scarcity is likely to compound the drought challenges faced by irrigated agriculture in this region. Currently, water abstraction from surface waters for agricultural irrigation can be restricted by the Environment Agency during droughts under Section 57 of the Water Resources Act (1991), based on abnormally low river flow levels and rainfall forecast, causing significant economic impacts on irrigated agricultural production. The aim of this study is to assess the impact that climate change may have on agricultural abstraction in the UK within the context of the abstraction restriction triggers currently in place. These triggers have been applied to the `Future Flows hydrology' database to assess the likelihood of increasing restrictions on agricultural abstraction in the future by comparing the probability of voluntary and compulsory restrictions in the baseline (1961-1990) and future period (2071-2098) for 282 catchments throughout the whole of the UK. The results of this study show a general increase in the probability of future agricultural irrigation abstraction restrictions in the UK in the summer, particularly in the South West, although there is significant variability between the 11 ensemble members. The results also indicate that UK winters are likely to become wetter in the future, although in some catchments the probability of abstraction restriction in the reservoir refilling winter months (November-February) could increase slightly. An increasing frequency of drought events due to climate change is therefore likely to lead to more water abstraction restrictions, increasing the need for irrigators to adapt their businesses to increase drought resilience and hence food security.
Vrijheid, Martine; Deltour, Isabelle; Krewski, Daniel; Sanchez, Marie; Cardis, Elisabeth
2006-07-01
This paper examines the effects of systematic and random errors in recall and of selection bias in case-control studies of mobile phone use and cancer. These sensitivity analyses are based on Monte-Carlo computer simulations and were carried out within the INTERPHONE Study, an international collaborative case-control study in 13 countries. Recall error scenarios simulated plausible values of random and systematic, non-differential and differential recall errors in amount of mobile phone use reported by study subjects. Plausible values for the recall error were obtained from validation studies. Selection bias scenarios assumed varying selection probabilities for cases and controls, mobile phone users, and non-users. Where possible these selection probabilities were based on existing information from non-respondents in INTERPHONE. Simulations used exposure distributions based on existing INTERPHONE data and assumed varying levels of the true risk of brain cancer related to mobile phone use. Results suggest that random recall errors of plausible levels can lead to a large underestimation in the risk of brain cancer associated with mobile phone use. Random errors were found to have larger impact than plausible systematic errors. Differential errors in recall had very little additional impact in the presence of large random errors. Selection bias resulting from underselection of unexposed controls led to J-shaped exposure-response patterns, with risk apparently decreasing at low to moderate exposure levels. The present results, in conjunction with those of the validation studies conducted within the INTERPHONE study, will play an important role in the interpretation of existing and future case-control studies of mobile phone use and cancer risk, including the INTERPHONE study.
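As a toy companion to these simulations, the sketch below (with entirely invented parameters, not the INTERPHONE values) shows the well-known attenuation effect: adding random, non-differential noise to a continuous exposure before dichotomizing it pulls the observed odds ratio toward 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# True exposure (e.g., log amount of phone use) and a true odds ratio of 1.5
# per unit exposure -- all values are illustrative.
x_true = rng.normal(0.0, 1.0, n)
p_case = 1.0 / (1.0 + np.exp(-(-2.0 + np.log(1.5) * x_true)))
case = rng.random(n) < p_case

# Reported exposure = true exposure plus random recall error.
x_reported = x_true + rng.normal(0.0, 1.0, n)

def odds_ratio(exposure, case):
    """Crude OR comparing above- vs below-median exposure."""
    high = exposure > np.median(exposure)
    a = np.sum(case & high);  b = np.sum(~case & high)
    c = np.sum(case & ~high); d = np.sum(~case & ~high)
    return (a * d) / (b * c)

print("OR with true exposure:    ", round(odds_ratio(x_true, case), 2))
print("OR with reported exposure:", round(odds_ratio(x_reported, case), 2))
```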
Impacts of future urban expansion on summer climate and heat-related human health in eastern China.
Cao, Qian; Yu, Deyong; Georgescu, Matei; Wu, Jianguo; Wang, Wei
2018-03-01
China is the largest and most rapidly urbanizing nation in the world, and is projected to add an additional 200 million city dwellers by the end of 2030. While this rapid urbanization will lead to vast expansion of built-up areas, the possible climate effect and associated human health impact remain poorly understood. Using a coupled urban-atmospheric model, we first examine potential effects of three urban expansion scenarios to 2030 on summer climate in eastern China. Our simulations indicate extensive warming up to 5°C, 3°C, and 2°C in regard to low- (>0%), high- (>75%), and 100% probability urban growth scenarios, respectively. The partitioning of available energy largely explains the changes in 2-m air temperatures, and increased sensible heat flux with higher roughness length of the underlying urban surface is responsible for the increase of nighttime planetary boundary layer height. In the extreme case (the low-probability expansion pathway), the agglomeration of impervious surfaces substantially reduces low-level atmospheric moisture, consequently resulting in large-scale precipitation reduction. However, the effect of near-surface warming far exceeds that of moisture reduction and imposes non-negligible thermal loads on urban residents. Our study, using a scenario-based approach that accounts for the full range of urban growth uncertainty by 2030, helps better evaluate possible regional climate effects and associated human health outcomes in the most rapidly urbanizing areas of China, and has practical implications for the development of sustainable urban regions that are resilient to changes in both mean and extreme conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, S. R.; Vallisneri, M.; Ellis, J. A.
2016-03-01
Decade-long timing observations of arrays of millisecond pulsars have placed highly constraining upper limits on the amplitude of the nanohertz gravitational-wave stochastic signal from the mergers of supermassive black hole binaries (∼10⁻¹⁵ strain at f = 1 yr⁻¹). These limits suggest that binary merger rates have been overestimated, or that environmental influences from nuclear gas or stars accelerate orbital decay, reducing the gravitational-wave signal at the lowest, most sensitive frequencies. This prompts the question whether nanohertz gravitational waves (GWs) are likely to be detected in the near future. In this Letter, we answer this question quantitatively using simple statistical estimates, deriving the range of true signal amplitudes that are compatible with current upper limits, and computing expected detection probabilities as a function of observation time. We conclude that small arrays consisting of the pulsars with the least timing noise, which yield the tightest upper limits, have discouraging prospects of making a detection in the next two decades. By contrast, we find large arrays are crucial to detection because the quadrupolar spatial correlations induced by GWs can be well sampled by many pulsar pairs. Indeed, timing programs that monitor a large and expanding set of pulsars have an ∼80% probability of detecting GWs within the next 10 years, under assumptions on merger rates and environmental influences ranging from optimistic to conservative. Even in the extreme case where 90% of binaries stall before merger and environmental coupling effects diminish low-frequency gravitational-wave power, detection is delayed by at most a few years.
Einhäuser, Wolfgang; Nuthmann, Antje
2016-09-01
During natural scene viewing, humans typically attend and fixate selected locations for about 200-400 ms. Two variables characterize such "overt" attention: the probability of a location being fixated, and the fixation's duration. Both variables have been widely researched, but little is known about their relation. We use a two-step approach to investigate the relation between fixation probability and duration. In the first step, we use a large corpus of fixation data. We demonstrate that fixation probability (empirical salience) predicts fixation duration across different observers and tasks. Linear mixed-effects modeling shows that this relation is explained neither by joint dependencies on simple image features (luminance, contrast, edge density) nor by spatial biases (central bias). In the second step, we experimentally manipulate some of these features. We find that fixation probability from the corpus data still predicts fixation duration for this new set of experimental data. This holds even if stimuli are deprived of low-level images features, as long as higher level scene structure remains intact. Together, this shows a robust relation between fixation duration and probability, which does not depend on simple image features. Moreover, the study exemplifies the combination of empirical research on a large corpus of data with targeted experimental manipulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krier, D. J.; Perry, F. V.
Location, timing, volume, and eruptive style of post-Miocene volcanoes have defined the volcanic hazard significant to a proposed high-level radioactive waste (HLW) and spent nuclear fuel (SNF) repository at Yucca Mountain, Nevada, as a low-probability, high-consequence event. Examination of eruptive centers in the region that may be analogues to possible future volcanic activity at Yucca Mountain has aided in defining and evaluating the consequence scenarios for intrusion into and eruption above a repository. The probability of a future event intersecting a repository at Yucca Mountain has a mean value of 1.7 × 10⁻⁸ per year. This probability comes from the Probabilistic Volcanic Hazard Assessment (PVHA) completed in 1996 and updated to reflect changes in repository layout. Since that time, magnetic anomalies representing potential buried volcanic centers have been identified from magnetic surveys; however, these potential buried centers only slightly increase the probability of an event intersecting the repository. The proposed repository will be located in the central portion of Yucca Mountain at approximately 300 m depth. The process for assessing performance of a repository at Yucca Mountain has identified two scenarios for igneous activity that, although having a very low probability of occurrence, could have a significant consequence should an igneous event occur. Either a dike swarm intersecting repository drifts containing waste packages, or a volcanic eruption through the repository, could result in release of radioactive material to the accessible environment. Ongoing investigations are assessing the mechanisms and significance of the consequence scenarios. Lathrop Wells Cone (≈80,000 yrs), a key analogue for estimating potential future volcanic activity, is the youngest surface expression of apparent waning basaltic volcanism in the region. Cone internal structure, lavas, and ash-fall tephra have been examined to estimate eruptive volume, eruption type, and subsurface disturbance accompanying conduit growth and eruption. The Lathrop Wells volcanic complex has a total volume estimate of approximately 0.1 km³. The eruptive products indicate a sequence of initial magmatic fissure fountaining, early Strombolian activity, a brief hydrovolcanic phase, and violent Strombolian phase(s). Lava flows adjacent to the Lathrop Wells Cone probably were emplaced during the mid-eruptive sequence. Ongoing investigations continue to address the potential hazards of a volcanic event at Yucca Mountain.
Study on Effects of the Stochastic Delay Probability for 1d CA Model of Traffic Flow
NASA Astrophysics Data System (ADS)
Xue, Yu; Chen, Yan-Hong; Kong, Ling-Jiang
Considering the effects of different factors on the stochastic delay probability, the delay probability has been classified into three cases. The first case, corresponding to the brake state, has a large delay probability if the anticipated velocity is larger than the gap between successive cars. The second, corresponding to the following-the-leader rule, has an intermediate delay probability if the anticipated velocity is equal to the gap. Finally, the third case is acceleration, which has the minimum delay probability. The fundamental diagram obtained by numerical simulation shows different properties compared to that of the NaSch model: there exist two different regions, corresponding to the coexistence state and the jamming state, respectively.
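The full update rules are not given in the abstract; the sketch below is a minimal NaSch-style cellular automaton on a ring, with the randomization probability chosen per vehicle according to the three cases described above. The specific probability values and the definition of the anticipated velocity are assumptions made for this illustration, not the authors' calibrated model.

```python
import numpy as np

rng = np.random.default_rng(1)
L, N, VMAX = 200, 50, 5                        # road length (cells), cars, max speed
P_BRAKE, P_FOLLOW, P_ACC = 0.6, 0.3, 0.05      # assumed delay probabilities for the three cases

pos = np.sort(rng.choice(L, N, replace=False))  # initial positions on the ring
vel = np.zeros(N, dtype=int)

def step(pos, vel):
    gaps = (np.roll(pos, -1) - pos - 1) % L     # empty cells ahead of each car
    v_ant = np.minimum(vel + 1, VMAX)           # anticipated velocity (assumed definition)
    # Case-dependent randomization probability: brake, follow-the-leader, accelerate.
    p = np.where(v_ant > gaps, P_BRAKE, np.where(v_ant == gaps, P_FOLLOW, P_ACC))
    v = np.minimum(np.minimum(vel + 1, VMAX), gaps)   # accelerate, then avoid collisions
    v = np.maximum(v - (rng.random(N) < p), 0)        # stochastic delay
    return (pos + v) % L, v

for _ in range(1000):
    pos, vel = step(pos, vel)
print("flow (density x mean speed):", N * vel.mean() / L)
```

Sweeping the car number N and recording the flow would trace out the fundamental diagram discussed in the abstract.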
Analyzing Future Flooding under Climate Change Scenario using CMIP5 Streamflow Data
NASA Astrophysics Data System (ADS)
Nyaupane, Narayan; Parajuli, Ranjan; Kalra, Ajay
2017-12-01
Flooding is the most severe and costliest natural hazard in the US. The effects of climate change have intensified the scenario in recent years. Flood prevention practice, along with a proper understanding of flooding events, can mitigate the risk of such hazards. Floodplain mapping is one technique for quantifying the severity of flooding. Carson City, an agricultural area in the desert of Nevada, has experienced peak floods in recent years. To find the underlying probability distribution for the area, the latest Coupled Model Intercomparison Project (CMIP5) streamflow data for the Carson River were analyzed against 27 different statistical distributions. The best-fitted distribution was used to forecast the 100-yr flood (design flood). Data from 1950-2099, derived from 31 models and a total of 97 projections, were used to predict future streamflow. The delta change method is adopted to quantify the magnitude of the future (2050-2099) flood. To determine the extent of flooding, 3 scenarios, (i) historic design flood, (ii) 500-yr flood, and (iii) future 100-yr flood, were routed on a HEC-RAS model prepared using available terrain data. Some of the climate projections show an extreme increase in the future design flood. The future design flood could be larger than the historic 500-yr flood. At the same time, the extent of flooding could go beyond that of the historic flood of 0.2% annual probability. This study suggests an approach to quantify the future flood and floodplain using climate model projections. The study would provide helpful information to facility managers, design engineers, and stakeholders.
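As a compact illustration of the frequency-analysis step (not the authors' actual workflow, and with synthetic data standing in for the CMIP5-derived series), the sketch below fits a few candidate distributions to annual peak flows, picks the best by log-likelihood, and reads off the 1% annual-exceedance-probability (100-yr) flood from the fitted distribution.

```python
import numpy as np
from scipy import stats

# Synthetic annual peak flows (m^3/s), for illustration only.
peaks = stats.genextreme.rvs(c=-0.1, loc=300, scale=80, size=60, random_state=7)

candidates = {
    "GEV": stats.genextreme,
    "log-normal": stats.lognorm,
    "Pearson III": stats.pearson3,
    "Gumbel": stats.gumbel_r,
}

best_name, best_dist, best_params, best_ll = None, None, None, -np.inf
for name, dist in candidates.items():
    params = dist.fit(peaks)                     # maximum likelihood fit
    ll = np.sum(dist.logpdf(peaks, *params))
    if ll > best_ll:
        best_name, best_dist, best_params, best_ll = name, dist, params, ll

q100 = best_dist.ppf(1 - 0.01, *best_params)     # 1% AEP = 100-yr design flood
print(f"best fit: {best_name}; estimated 100-yr flood ~ {q100:.0f} m^3/s")
```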
NASA Astrophysics Data System (ADS)
Timpanaro, André M.; Prado, Carmen P. C.
2014-05-01
We discuss the exit probability of the one-dimensional q-voter model and present tools to obtain estimates about this probability, both through simulations in large networks (around 10^7 sites) and analytically in the limit where the network is infinitely large. We argue that the result E(ρ) = ρ^q/[ρ^q + (1-ρ)^q], that was found in three previous works [F. Slanina, K. Sznajd-Weron, and P. Przybyła, Europhys. Lett. 82, 18006 (2008), 10.1209/0295-5075/82/18006; R. Lambiotte and S. Redner, Europhys. Lett. 82, 18007 (2008), 10.1209/0295-5075/82/18007, for the case q = 2; and P. Przybyła, K. Sznajd-Weron, and M. Tabiszewski, Phys. Rev. E 84, 031117 (2011), 10.1103/PhysRevE.84.031117, for q > 2] using small networks (around 10^3 sites), is a good approximation, but there are noticeable deviations that appear even for small systems and that do not disappear when the system size is increased (with the notable exception of the case q = 2). We also show that, under some simple and intuitive hypotheses, the exit probability must obey the inequality ρ^q/[ρ^q + (1-ρ)] ≤ E(ρ) ≤ ρ/[ρ + (1-ρ)^q] in the infinite size limit. We believe this settles in the negative the suggestion made in [S. Galam and A. C. R. Martins, Europhys. Lett. 95, 48005 (2011), 10.1209/0295-5075/95/48005] that this result would be a finite size effect, with the exit probability actually being a step function. We also show how the result that the exit probability cannot be a step function can be reconciled with the Galam unified frame, which was also a source of controversy.
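To make the quantities concrete, here is a small, hedged sketch comparing the approximate formula E(ρ) = ρ^q/[ρ^q + (1-ρ)^q] with a Monte Carlo estimate on a short ring. The update rule used (a random site adopts the unanimous opinion of the q sites immediately to its right) is one common variant of the 1D q-voter model and may differ in detail from the papers cited above; runs that fail to reach consensus within the step cap are simply discarded.

```python
import numpy as np

rng = np.random.default_rng(3)

def exit_prob_formula(rho, q):
    return rho**q / (rho**q + (1 - rho)**q)

def exit_prob_mc(rho, q=3, n_sites=20, n_runs=50, max_steps=50_000):
    """Fraction of converged runs ending in all-(+1) consensus from initial density rho."""
    wins = converged = 0
    for _ in range(n_runs):
        s = np.where(rng.random(n_sites) < rho, 1, -1)
        for _ in range(max_steps):
            total = s.sum()
            if abs(total) == n_sites:              # consensus reached
                wins += total > 0
                converged += 1
                break
            i = rng.integers(n_sites)
            panel = s[(i + 1 + np.arange(q)) % n_sites]
            if np.all(panel == panel[0]):          # unanimous q-panel: site i conforms
                s[i] = panel[0]
    return wins / max(converged, 1)

for rho in (0.3, 0.5, 0.7):
    print(f"rho={rho}: formula={exit_prob_formula(rho, 3):.3f}, "
          f"simulation~{exit_prob_mc(rho):.2f}")
```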
O'Connell, Allan F.; Talancy, Neil W.; Bailey, Larissa L.; Sauer, John R.; Cook, Robert; Gilbert, Andrew T.
2006-01-01
Large-scale, multispecies monitoring programs are widely used to assess changes in wildlife populations but they often assume constant detectability when documenting species occurrence. This assumption is rarely met in practice because animal populations vary across time and space. As a result, detectability of a species can be influenced by a number of physical, biological, or anthropogenic factors (e.g., weather, seasonality, topography, biological rhythms, sampling methods). To evaluate some of these influences, we estimated site occupancy rates using species-specific detection probabilities for meso- and large terrestrial mammal species on Cape Cod, Massachusetts, USA. We used model selection to assess the influence of different sampling methods and major environmental factors on our ability to detect individual species. Remote cameras detected the most species (9), followed by cubby boxes (7) and hair traps (4) over a 13-month period. Estimated site occupancy rates were similar among sampling methods for most species when detection probabilities exceeded 0.15, but we question estimates obtained from methods with detection probabilities between 0.05 and 0.15, and we consider methods with lower probabilities unacceptable for occupancy estimation and inference. Estimated detection probabilities can be used to accommodate variation in sampling methods, which allows for comparison of monitoring programs using different protocols. Vegetation and seasonality produced species-specific differences in detectability and occupancy, but differences were not consistent within or among species, which suggests that our results should be considered in the context of local habitat features and life history traits for the target species. We believe that site occupancy is a useful state variable and suggest that monitoring programs for mammals using occupancy data consider detectability prior to making inferences about species distributions or population change.
Forecasting the duration of volcanic eruptions: an empirical probabilistic model
NASA Astrophysics Data System (ADS)
Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.
2014-01-01
The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption duration between the years 1600 and 1669 is found to be statistically different from that following it, and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms `likely' and `unlikely' to probabilities of 66% or more and 33% or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well-documented volcanoes or for different duration data such as the duration of explosive episodes or the duration of repose periods between eruptions.
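As a hedged illustration of the workflow (with synthetic durations, not the Etna catalogue), the sketch below fits a log-logistic distribution by maximum likelihood and evaluates the three survivor-function quantities listed above: an exceedance probability, a conditional exceedance probability given an eruption has already lasted some days, and the duration exceeded with a given probability.

```python
from scipy import stats

# Synthetic eruption durations in days (the log-logistic is scipy's 'fisk').
durations = stats.fisk.rvs(c=1.5, scale=25, size=80, random_state=42)

# Maximum likelihood fit; the location parameter is pinned at zero.
c_hat, loc, scale_hat = stats.fisk.fit(durations, floc=0)
sf = lambda d: stats.fisk.sf(d, c_hat, loc=loc, scale=scale_hat)

print("P(duration > 20 days)              =", round(sf(20), 2))            # (a)
print("P(duration > 86 days | already 20) =", round(sf(86) / sf(20), 2))   # (b)
print("duration exceeded with prob 1/3    =",
      round(stats.fisk.isf(1/3, c_hat, loc=loc, scale=scale_hat), 1), "days")  # (c)
```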
Epstein, Leonard H; Jankowiak, Noelle; Lin, Henry; Paluch, Rocco; Koffarnus, Mikhail N; Bickel, Warren K
2014-01-01
Background: Low income is related to food insecurity, and research has suggested that a scarcity of resources associated with low income can shift attention to the present, thereby discounting the future. Objective: We tested whether attending to the present and discounting the future may moderate the influence of income on food insecurity. Design: Delay discounting and measures of future time perspective (Zimbardo Time Perspective Inventory, Consideration of Future Consequences Scale, time period of financial planning, and subjective probability of living to age 75 y) were studied as moderators of the relation between income and food insecurity in a diverse sample of 975 adults, 31.8% of whom experienced some degree of food insecurity. Results: Income, financial planning, subjective probability of living to age 75 y, and delay discounting predicted food insecurity as well as individuals who were high in food insecurity. Three-way interactions showed that delay discounting interacted with financial planning and income to predict food insecurity (P = 0.003). At lower levels of income, food insecurity was lowest for subjects who had good financial planning skills and did not discount the future, whereas having good financial skills and discounting the future had minimal influence on food insecurity. The same 3-way interaction was observed when high food insecurity was predicted (P = 0.008). Conclusion: Because of the role of scarce resources on narrowing attention and reducing prospective thinking, research should address whether modifying future orientation may reduce food insecurity even in the face of diminishing financial resources. This trial was registered at clinicaltrials.gov as NCT02099812. PMID:25008855
Mohamed Yusoff, Aini; Tan, Tze King; Hari, Ranjeev; Koepfli, Klaus-Peter; Wee, Wei Yee; Antunes, Agostinho; Sitam, Frankie Thomas; Rovie-Ryan, Jeffrine Japning; Karuppannan, Kayal Vizi; Wong, Guat Jah; Lipovich, Leonard; Warren, Wesley C.; O’Brien, Stephen J.; Choo, Siew Woh
2016-01-01
Pangolins are scale-covered mammals, containing eight endangered species. Maintaining pangolins in captivity is a significant challenge, in part because little is known about their genetics. Here we provide the first large-scale sequencing of the critically endangered Manis javanica transcriptomes from eight different organs using Illumina HiSeq technology, yielding ~75 gigabases and 89,754 unigenes. We found some unigenes involved in the insect hormone biosynthesis pathway and also 747 lipid metabolism-related unigenes that may provide insight into the lipid metabolism system in pangolins. Comparative analysis between M. javanica and other mammals revealed many pangolin-specific genes significantly over-represented in stress-related processes, cell proliferation and external stimulus, probably reflecting the traits and adaptations of the analyzed pregnant female M. javanica. Our study provides an invaluable resource for future functional works that may be highly relevant for the conservation of pangolins. PMID:27618997
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kreuzer, Helen W.; Horita, Juske; Moran, James J.
Sodium and potassium cyanide are highly toxic, produced in large amounts by the chemical industry, and linked to numerous high-profile crimes. The U.S. Centers for Disease Control and Prevention has identified cyanide as one of the most probable agents to be used in a future chemical terrorism event. We investigated whether stable C and N isotopic content of sodium and potassium cyanide could serve as a forensic signature for sample matching, using a collection of 65 cyanide samples. A few of these samples displayed non-homogeneous isotopic content associated with degradation to a carbonate salt and loss of hydrogen cyanide. Most samples had highly reproducible isotope content. Of these, >95% could be properly matched based on C and N isotope ratios, with a false match rate <3%. These results suggest that stable C and N isotope ratios are a useful forensic signature for matching cyanide samples.
Working group report on advanced high-voltage high-power and energy-storage space systems
NASA Technical Reports Server (NTRS)
Cohen, H. A.; Cooke, D. L.; Evans, R. W.; Hastings, D.; Jongeward, G.; Laframboise, J. G.; Mahaffey, D.; Mcintyre, B.; Pfizer, K. A.; Purvis, C.
1986-01-01
Space systems in the future will probably include high-voltage, high-power energy-storage and -production systems. Two such technologies are high-voltage ac and dc systems and high-power electrodynamic tethers. The working group identified several plasma interaction phenomena that will occur in the operation of these power systems. The working group felt that building an understanding of these critical interaction issues required filling several gaps in our knowledge, although certain aspects of dc power systems have become fairly well understood; examples are current collection in quiescent plasmas and snapover effects. However, high-voltage dc and almost all ac phenomena are, at best, inadequately understood. In addition, there is major uncertainty in the knowledge of coupling between plasmas and large-scale current flows in space plasmas. These gaps in the knowledge are addressed.
Implementation of the semiclassical quantum Fourier transform in a scalable system.
Chiaverini, J; Britton, J; Leibfried, D; Knill, E; Barrett, M D; Blakestad, R B; Itano, W M; Jost, J D; Langer, C; Ozeri, R; Schaetz, T; Wineland, D J
2005-05-13
We report the implementation of the semiclassical quantum Fourier transform in a system of three beryllium ion qubits (two-level quantum systems) confined in a segmented multizone trap. The quantum Fourier transform is the crucial final step in Shor's algorithm, and it acts on a register of qubits to determine the periodicity of the quantum state's amplitudes. Because only probability amplitudes are required for this task, a more efficient semiclassical version can be used, for which only single-qubit operations conditioned on measurement outcomes are required. We apply the transform to several input states of different periodicities; the results enable the location of peaks corresponding to the original periods. This demonstration incorporates the key elements of a scalable ion-trap architecture, suggesting the future capability of applying the quantum Fourier transform to a large number of qubits as required for a useful quantum factoring algorithm.
Estimating annual suspended-sediment loads in the northern and central Appalachian Coal region
Koltun, G.F.
1985-01-01
Multiple-regression equations were developed for estimating the annual suspended-sediment load, for a given year, from small to medium-sized basins in the northern and central parts of the Appalachian coal region. The regression analysis was performed with data for land use, basin characteristics, streamflow, rainfall, and suspended-sediment load for 15 sites in the region. Two variables, the maximum mean-daily discharge occurring within the year and the annual peak discharge, explained much of the variation in the annual suspended-sediment load. Separate equations were developed employing each of these discharge variables. Standard errors for both equations are relatively large, which suggests that future predictions will probably have a low level of precision. This level of precision, however, may be acceptable for certain purposes. It is therefore left to the user to assess whether the level of precision provided by these equations is acceptable for the intended application.
Climate Impacts on Tropospheric Ozone and Hydroxyl
NASA Technical Reports Server (NTRS)
Shindell, Drew T.; Bell, N.; Faluvegi, G.
2003-01-01
Climate change may influence tropospheric ozone and OH via several main pathways: (1) altering chemistry via temperature and humidity changes, (2) changing ozone and precursor sources via surface emissions, stratosphere-troposphere exchange, and lightning, and (3) affecting trace gas sinks via the hydrological cycle and dry deposition. We report results from a set of coupled chemistry-climate model simulations designed to systematically study these effects. We compare the various effects with one another and with past and projected future changes in anthropogenic and natural emissions of ozone precursors. We find that while the overall impact of climate on ozone is probably small compared to emission changes, some significant seasonal and regional effects are apparent. The global effect on hydroxyl is quite large, however, similar in size to the effect of emission changes. Additionally, we show that many of the chemistry-climate links that are not yet adequately modeled are potentially important.
Mallory, Michael J.; Swain, Lindsay A.; Tyley, Stephen J.
1980-01-01
This report presents a preliminary evaluation of the geohydrologic factors affecting storage of water by artificial recharge in the upper Coachella Valley, Calif. The ground-water basin of the upper Coachella Valley seems to be geologically suitable for large-scale artificial recharge. A minimum of 900,000 acre-feet of water could probably be stored in the basin without raising basinwide water levels above those that existed in 1945. Preliminary tests indicate that a long-term artificial recharge rate of 5 feet per day may be feasible for spreading grounds in the basin if such factors as sediment and bacterial clogging can be controlled. The California Department of Water Resources, through the Future Water Supply Program, is investigating the use of ground-water basins for storage of State Water Project water in order to help meet maximum annual entitlements to water project contractors. (USGS)
Moroz, Andrei; Deffune, Elenice
2013-11-01
Platelet-rich plasma has been largely used as a therapeutic option for the treatment of chronic wounds of different etiologies. The enhanced regeneration observed after the use of platelet-rich plasma has been systematically attributed to the growth factors that are present inside platelets' granules. We hypothesize that the remaining plasma and platelet-bound fibronectin may act as a further bioactive protein in platelet-rich plasma preparations. Recent reports were analyzed and presented as direct evidence of this hypothesis. Fibronectin may directly influence the extracellular matrix remodeling during wound repair. This effect is probably through matrix metalloproteinase expression, thus exerting an extra effect on chronic wound regeneration. Physicians should be well aware of the possible fibronectin-induced effects in their future endeavors with PRP in chronic wound treatment. Copyright © 2013 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Kairouz, Sylvia; Paradis, Catherine; Nadeau, Louise; Tovar, Marie-Line; Pousset, Maud
2016-12-01
Few empirical studies have examined the relationships between differing regulatory approaches and patterns of gambling behaviors. This article reports on a correlational cross-cultural comparison of differences in the regulatory approaches and gambling behavior among general adult populations in France and Québec, Canada. We drew data from two large population surveys conducted in France and Québec (N=27 653 and N=11 888, respectively). We found diverging and converging aspects of government regulatory policies. Statistical analyses demonstrated significantly higher participation rates and prevalence of 'assiduous gamblers' in Québec. In France, among assiduous gamblers, the proportion of moderate-risk and probable pathological gamblers is significantly higher. Future research should examine environmental conditions and varying gambling offerings, as well as gambling regulation, to determine their potential influence on gambling behaviors.
Discussion for possibility of some aerodynamic ground effect craft
NASA Astrophysics Data System (ADS)
Tanabe, Yoshikazu
1990-05-01
Some type of pleasant, convenient, safe, and economical transportation method to supplement airplane transportation is currently required. This paper proposes an Aerodynamic Ground Effect Craft (AGEC) as this new transportation method and studies its qualitative feasibility in comparison with present typical transportation methods such as transporter airplanes, flying boats, and linear motor cars, which also share the common characteristic of ultra-low-altitude cruising. Noteworthy points of AGEC are its effective energy consumption relative to transportation capacity (exergy) and its ultra-low-altitude cruising, which is relatively safer in an emergency landing than a subsonic airplane's belly landing. Though AGEC has a shorter cruising range and smaller transportation capacity, its transportation efficiency is superior to that of airplanes and linear motor cars. There is no critical difficulty in building AGEC at large size, and AGEC is thought to be a very probable candidate to supplement airplane transportation in the near future.
Fast, noise-free memory for photon synchronization at room temperature.
Finkelstein, Ran; Poem, Eilon; Michel, Ohad; Lahad, Ohr; Firstenberg, Ofer
2018-01-01
Future quantum photonic networks require coherent optical memories for synchronizing quantum sources and gates of probabilistic nature. We demonstrate a fast ladder memory (FLAME) mapping the optical field onto the superposition between electronic orbitals of rubidium vapor. Using a ladder-level system of orbital transitions with nearly degenerate frequencies simultaneously enables high bandwidth, low noise, and long memory lifetime. We store and retrieve 1.7-ns-long pulses, containing 0.5 photons on average, and observe short-time external efficiency of 25%, memory lifetime (1/e) of 86 ns, and below 10⁻⁴ added noise photons. Consequently, coupling this memory to a probabilistic source would enhance the on-demand photon generation probability by a factor of 12, the highest number yet reported for a noise-free, room temperature memory. This paves the way toward the controlled production of large quantum states of light from probabilistic photon sources.
Extraterrestrial materials processing
NASA Technical Reports Server (NTRS)
Steurer, W. H.
1982-01-01
The first year results of a multi-year study of processing extraterrestrial materials for use in space are summarized. Theoretically, there are potential major advantages to be derived from the use of such materials for future space endeavors. The types of known or postulated starting raw materials are described, including silicate-rich mixed oxides on the Moon, some asteroids, and Mars; free metals in some asteroids and in small quantities in the lunar soil; and probably volatiles like water and CO2 on Mars and some asteroids. Candidate processes for space materials are likely to be significantly different from their terrestrial counterparts largely because of: absence of atmosphere; lack of readily available working fluids; low- or micro-gravity; no carbon-based fuels; readily available solar energy; and severe constraints on manned intervention. The extraction of metals and oxygen from lunar material by magma electrolysis or by vapor/ion phase separation appears practical.
Crystal gazing. Part 2: Implications of advances in digital data storage technology
NASA Technical Reports Server (NTRS)
Wells, D. C.
1984-01-01
During the next 5-10 years it is likely that the bit density available in digital mass storage systems (magnetic tapes, optical and magnetic disks) will be increased to such an extent that it will greatly exceed that of the conventional photographic emulsions, like IIIaJ, which are used in astronomy. These developments imply that it will soon be advantageous for astronomers to use microdensitometers to completely digitize all photographic plates soon after they are developed. Distribution of digital copies of sky surveys and the contents of plate vaults will probably become feasible within ten years. Copies of other astronomical archives (e.g., Space Telescope) could also be distributed with the same techniques. The implications for designers of future microdensitometers are: (1) there will be a continuing need for precision digitization of large-format photographic imagery, and (2) the need for real-time analysis of the output of microdensitometers will decrease.
An investigation of potential applications of OP-SAPS: Operational sampled analog processors
NASA Technical Reports Server (NTRS)
Parrish, E. A.; Mcvey, E. S.
1976-01-01
The impact of charge-coupled device (CCD) processors on future instrumentation was investigated. The CCD devices studied process sampled analog data and are referred to as OP-SAPS - operational sampled analog processors. Preliminary studies into various architectural configurations for systems composed of OP-SAPS show that they have potential in such diverse applications as pattern recognition and automatic control. It appears probable that OP-SAPS may be used to construct computing structures which can serve as special peripherals to large-scale computer complexes used in real time flight simulation. The research was limited to the following benchmark programs: (1) face recognition, (2) voice command and control, (3) terrain classification, and (4) terrain identification. A small amount of effort was spent on examining a method by which OP-SAPS may be used to decrease the limiting ground sampling distance encountered in remote sensing from satellites.
Not feeling well … true or exaggerated? Self-assessed health as a leading health indicator.
Becchetti, Leonardo; Bachelet, Maria; Riccardini, Fabiola
2018-02-01
We provide original, international evidence documenting that self-assessed health (SAH) is a leading health indicator, that is, a significant predictor of future changes in health conditions, in a large sample of Europeans aged above 50 and living in 13 different countries. We find that, after controlling for attrition bias, lagged SAH is significantly and negatively correlated with changes in the number of chronic diseases, net of the correlations with levels, and changes in sociodemographic factors and health styles, country and regional health system effects, and declared symptoms. Illness-specific estimates document that lagged SAH significantly correlates with arthritis, cholesterol, and lung diseases (and weakly so with ulcer, hypertension, and cataracts) and has a significant correlation with the probability of contracting cancer. Interpretations and policy implications of our findings are discussed in the paper. Copyright © 2017 John Wiley & Sons, Ltd.
How constraints affect the hunter's decision to shoot a deer.
Diekert, Florian K; Richter, Andries; Rivrud, Inger Maren; Mysterud, Atle
2016-12-13
Hunting is the predominant way of controlling many wildlife populations devoid of large carnivores. It subjects animals to mortality rates that far exceed natural rates and that differ markedly in which age, sex, or size classes are removed relative to those of natural predators. To explain the emerging selection pattern we develop behavioral microfoundations for a hunting model, emphasizing in particular the constraints given by the formal and informal norms, rules, and regulations that govern the hunter's choice. We show how a shorter remaining season, competition among hunters, lower sighting probabilities, and higher costs all lead to lower reservation values, i.e., an increased likelihood of shooting a particular animal. Using a unique dataset on seen and shot deer from Norway, we test and confirm the theoretical predictions in a recreational and meat-motivated hunting system. To achieve sustainability, future wildlife management should account for this predictable selection pressure.
How constraints affect the hunter’s decision to shoot a deer
Diekert, Florian K.; Richter, Andries; Rivrud, Inger Maren; Mysterud, Atle
2016-01-01
Hunting is the predominant way of controlling many wildlife populations devoid of large carnivores. It subjects animals to mortality rates that far exceed natural rates and that differ markedly in which age, sex, or size classes are removed relative to those of natural predators. To explain the emerging selection pattern we develop behavioral microfoundations for a hunting model, emphasizing in particular the constraints given by the formal and informal norms, rules, and regulations that govern the hunter’s choice. We show how a shorter remaining season, competition among hunters, lower sighting probabilities, and higher costs all lead to lower reservation values, i.e., an increased likelihood of shooting a particular animal. Using a unique dataset on seen and shot deer from Norway, we test and confirm the theoretical predictions in a recreational and meat-motivated hunting system. To achieve sustainability, future wildlife management should account for this predictable selection pressure. PMID:27911775
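The reservation-value logic described in the abstract can be made concrete with a toy optimal-stopping calculation. The sketch below is not the authors' model: the uniform distribution of animal values, the sighting probability q, and the per-day cost c are assumptions chosen only to show that fewer remaining days, a lower sighting probability, and higher costs all lower the threshold at which shooting becomes optimal.

```python
# Toy optimal-stopping sketch (not the paper's model): an animal's "trophy value"
# v is Uniform(0, 1); on each remaining day the hunter sights an animal with
# probability q and pays a search cost c.  Backward induction gives the value of
# continuing the hunt, which is the reservation value: shoot iff v >= continuation value.

def reservation_values(days_left, q, c):
    """Reservation value for each remaining day (index 0 = first of the remaining days)."""
    v_next = 0.0                        # season over: no further value
    res = []
    for _ in range(days_left):
        # E[max(v, v_next)] for v ~ Uniform(0, 1)
        emax = 0.5 if v_next <= 0 else (1.0 + min(v_next, 1.0) ** 2) / 2.0
        v_today = -c + q * emax + (1.0 - q) * v_next
        res.append(v_next)              # shoot today iff sighted value >= v_next
        v_next = v_today
    return res[::-1]

print(reservation_values(days_left=10, q=0.6, c=0.01))  # thresholds fall toward season end
print(reservation_values(days_left=10, q=0.3, c=0.01))  # lower sighting probability -> lower thresholds
print(reservation_values(days_left=10, q=0.6, c=0.05))  # higher cost -> lower thresholds
```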
Chang'e-2 spacecraft observations of asteroid 4179 Toutatis
NASA Astrophysics Data System (ADS)
Ji, Jianghui; Jiang, Yun; Zhao, Yuhui; Wang, Su; Yu, Liangliang
2016-01-01
On 13 December 2012, Chang'e-2 completed a successful flyby of the near-Earth asteroid 4179 Toutatis at a closest distance of 770 meters from the asteroid's surface. The observations show that Toutatis has an irregular surface and that its shape resembles a ginger root, with a smaller lobe (head) and a larger lobe (body). Such a bilobate shape is indicative of a contact-binary origin for Toutatis. In addition, the high-resolution images, with a resolution better than 3 meters, provide a number of new discoveries about this asteroid, such as an 800-meter depression at the end of the large lobe, a sharply perpendicular silhouette near the neck region, and boulders, indicating that Toutatis is probably a rubble-pile asteroid. The Chang'e-2 observations have revealed significant new insights into the geological features and the formation and evolution of this asteroid. Finally, we briefly describe the future Chinese asteroid mission concept.
Continuous-time random-walk model for financial distributions
NASA Astrophysics Data System (ADS)
Masoliver, Jaume; Montero, Miquel; Weiss, George H.
2003-02-01
We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known: the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-deutsche mark futures exchange, finding good agreement between theory and the observed data.
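A minimal simulation sketch of the continuous-time random walk just described: waiting times are drawn from a pausing-time density, price increments from a jump-magnitude density, and the price change over a horizon is the sum of the jumps that occurred. The exponential waiting times and Laplace-distributed jumps below are illustrative assumptions; the paper determines both densities from the data.

```python
import numpy as np

rng = np.random.default_rng(0)

def ctrw_price_changes(t_max, mean_wait=1.0, jump_scale=0.01, n_paths=10_000):
    """Simulate price changes over horizon t_max with a simple CTRW:
    exponential pausing times and Laplace-distributed jump magnitudes
    (illustrative choices; the paper fits both densities to the data)."""
    changes = np.zeros(n_paths)
    for i in range(n_paths):
        t, x = 0.0, 0.0
        while True:
            t += rng.exponential(mean_wait)   # pausing time until the next jump
            if t > t_max:
                break
            x += rng.laplace(scale=jump_scale)  # jump magnitude
        changes[i] = x
    return changes

# Distribution of price changes at two horizons; longer horizons aggregate more jumps.
for horizon in (1.0, 10.0):
    sample = ctrw_price_changes(horizon)
    tail_share = (np.abs(sample) > 3 * sample.std()).mean()
    print(f"t = {horizon:>4}: std = {sample.std():.4f}, fraction beyond 3 std = {tail_share:.4f}")
```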
Energy efficient engine: Propulsion system-aircraft integration evaluation
NASA Technical Reports Server (NTRS)
Owens, R. E.
1979-01-01
Flight performance and operating economics of future commercial transports utilizing the energy efficient engine were assessed, as well as the probability of meeting NASA's goals for TSFC, DOC, noise, and emissions. Results of the initial propulsion system-aircraft integration evaluation presented here include estimates of engine performance, predictions of fuel burn, operating costs of the flight propulsion system installed in seven selected advanced study commercial transports, estimates of noise and emissions, considerations of thrust growth, and the achievement-probability analysis.
On the abundance of extraterrestrial life after the Kepler mission
NASA Astrophysics Data System (ADS)
Wandel, Amri
2015-07-01
The data recently accumulated by the Kepler mission have demonstrated that small planets are quite common and that a significant fraction of all stars may have an Earth-like planet within their habitable zone. These results are combined with a Drake-equation formalism to derive the space density of biotic planets as a function of the relatively modest uncertainty in the astronomical data and of the (yet unknown) probability for the evolution of biotic life, F_b. I suggest that F_b may be estimated by future spectral observations of exoplanet biomarkers. If F_b is in the range 0.001-1, then a biotic planet may be expected within 10-100 light years from Earth. Extending the biotic results to advanced life, I derive expressions for the distance to putative civilizations in terms of two additional Drake parameters: the probability for evolution of a civilization, F_c, and its average longevity. For instance, assuming optimistic probability values (F_b ~ F_c ~ 1) and a broadcasting longevity of a few thousand years, the likely distance to the nearest civilizations detectable by searching for intelligent electromagnetic signals is of the order of a few thousand light years. The probability of detecting intelligent signals with present and future radio telescopes is calculated as a function of the Drake parameters. Finally, I describe how the detection of intelligent signals would constrain the Drake parameters.
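The scaling from a space density of biotic planets to a typical nearest-neighbor distance can be illustrated with a back-of-the-envelope calculation. In the sketch below, the local stellar density (~0.004 stars per cubic light year) and the fraction of stars hosting a habitable-zone Earth-like planet (0.1) are assumed round numbers, not values from the paper; F_b is the unknown biotic probability discussed in the abstract.

```python
import math

def nearest_distance_ly(n_star_per_ly3=0.004, f_habitable=0.1, f_biotic=1.0):
    """Typical distance to the nearest biotic planet, assuming a uniform space
    density of stars (assumed ~0.004 per cubic light year near the Sun), of which
    a fraction f_habitable host an Earth-like habitable-zone planet and a fraction
    f_biotic of those develop life (the unknown F_b of the abstract).  The distance
    is where the expected count inside a sphere reaches one."""
    n = n_star_per_ly3 * f_habitable * f_biotic
    return (3.0 / (4.0 * math.pi * n)) ** (1.0 / 3.0)

for fb in (1.0, 0.1, 0.01, 0.001):
    print(f"F_b = {fb:>6}: nearest biotic planet within ~{nearest_distance_ly(f_biotic=fb):.0f} ly")
```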
NASA Astrophysics Data System (ADS)
Urban, Nathan M.; Keller, Klaus
2010-10-01
How has the Atlantic Meridional Overturning Circulation (AMOC) varied over the past centuries and what is the risk of an anthropogenic AMOC collapse? We report probabilistic projections of the future climate which improve on previous AMOC projection studies by (i) greatly expanding the considered observational constraints and (ii) carefully sampling the tail areas of the parameter probability distribution function (pdf). We use a Bayesian inversion to constrain a simple model of the coupled climate, carbon cycle and AMOC systems using observations to derive multicentury hindcasts and projections. Our hindcasts show considerable skill in representing the observational constraints. We show that robust AMOC risk estimates can require carefully sampling the parameter pdfs. We find a low probability of experiencing an AMOC collapse within the 21st century for a business-as-usual emissions scenario. The probability of experiencing an AMOC collapse within two centuries is 1/10. The probability of crossing a forcing threshold and triggering a future AMOC collapse (by 2300) is approximately 1/30 in the 21st century and over 1/3 in the 22nd. Given the simplicity of the model structure and uncertainty in the forcing assumptions, our analysis should be considered a proof of concept and the quantitative conclusions subject to severe caveats.
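The paper's point that robust tail-risk estimates require careful sampling of the parameter pdf can be illustrated with a generic example, unrelated to the actual climate model: estimating a small exceedance probability for a single parameter by plain Monte Carlo versus importance sampling that deliberately places samples in the tail. The Gaussian distribution and the threshold below are arbitrary assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Target: P(theta > threshold) for theta ~ N(mu, sigma).  Plain Monte Carlo sees
# very few samples beyond the threshold; importance sampling from a proposal
# shifted into the tail recovers the small probability much more precisely.
mu, sigma, threshold, n = 0.0, 1.0, 3.5, 20_000

plain = (rng.normal(mu, sigma, n) > threshold).mean()

prop_mu = threshold                              # proposal centered on the tail
x = rng.normal(prop_mu, sigma, n)
log_w = -0.5 * ((x - mu) ** 2 - (x - prop_mu) ** 2) / sigma**2   # target/proposal density ratio
importance = np.mean((x > threshold) * np.exp(log_w))

exact = 0.5 * math.erfc((threshold - mu) / (sigma * math.sqrt(2.0)))
print(f"exact: {exact:.6f}  plain MC: {plain:.6f}  importance sampling: {importance:.6f}")
```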
Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska
Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.
2007-01-01
We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and assumptions reflecting best current judgments about methodology and approach. These maps have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. These maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May, 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web Site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and in particular explain and justify changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.
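The annual probabilities quoted for the map's hazard levels follow from the time-independent (Poisson) assumption: a probability P of exceedance in T years corresponds to an annual rate of -ln(1 - P)/T. The short sketch below reproduces the stated values of 0.000404, 0.001026, and 0.0021.

```python
import math

def annual_rate(p_exceed, years=50.0):
    """Annual exceedance rate for a time-independent (Poisson) hazard model,
    given the probability of exceedance p_exceed over `years` years."""
    return -math.log(1.0 - p_exceed) / years

for p in (0.02, 0.05, 0.10):
    print(f"{p:.0%} in 50 years  ->  annual probability {annual_rate(p):.6f}")
# 2% -> 0.000404, 5% -> 0.001026, 10% -> 0.002107 (the map's stated levels)
```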
Engineering principles to assure compatible docking between future spacecraft of USA and USSR
NASA Technical Reports Server (NTRS)
Johnson, C. C.
1973-01-01
An androgynous peripheral type docking mechanism concept selected by the U.S. and the USSR is described. The rationale supporting the selection of the concept, the mechanical principles inherent to the concept, and the probable nature of future designs stemming from the concept are discussed. Operational situations prior to docking, impact conditions, energy absorption, and structural joining of two spacecraft are examined.
Unequal Probability Marking Approach to Enhance Security of Traceback Scheme in Tree-Based WSNs.
Huang, Changqin; Ma, Ming; Liu, Xiao; Liu, Anfeng; Zuo, Zhengbang
2017-06-17
Fog (from core to edge) computing is a newly emerging computing platform, which utilizes a large number of network devices at the edge of a network to provide ubiquitous computing, thus having great development potential. However, the issue of security poses an important challenge for fog computing. In particular, the Internet of Things (IoT) that constitutes the fog computing platform is crucial for preserving the security of a huge number of wireless sensors, which are vulnerable to attack. In this paper, a new unequal probability marking approach is proposed to enhance the security performance of logging and migration traceback (LM) schemes in tree-based wireless sensor networks (WSNs). The main contribution of this paper is to overcome the deficiencies of the LM scheme, achieving a longer network lifetime and a larger storage space. In the unequal probability marking logging and migration (UPLM) scheme of this paper, different marking probabilities are adopted for different nodes according to their distances to the sink. A large marking probability is assigned to nodes in remote areas (areas at a long distance from the sink), while a small marking probability is applied to nodes in nearby areas (areas at a short distance from the sink). This reduces the consumption of storage and energy in addition to enhancing the security performance, lifetime, and storage capacity. Marking information is migrated to nodes at a longer distance from the sink to increase the amount of stored marking information, thus enhancing the security performance in the process of migration. The experimental simulation shows that for general tree-based WSNs, the UPLM scheme proposed in this paper can store 1.12-1.28 times the amount of marking information that the equal probability marking approach achieves, and has 1.15-1.26 times the storage utilization efficiency compared with other schemes.
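The distance-dependent marking rule can be sketched as follows. The increasing-with-distance shape follows the abstract; the linear form and the probability bounds are assumptions for illustration, not the formula from the UPLM paper.

```python
# Illustrative assignment of marking probabilities by hop distance to the sink.
# Nodes far from the sink mark packets more often than nodes near the sink; the
# linear ramp and the bounds p_min/p_max are assumed, not the paper's actual rule.

def marking_probability(hops_to_sink, max_hops, p_min=0.05, p_max=0.6):
    """Marking probability that grows with distance (in hops) from the sink."""
    frac = hops_to_sink / max_hops
    return p_min + (p_max - p_min) * frac

max_hops = 10
for h in range(1, max_hops + 1):
    print(f"hops = {h:2d}: marking probability = {marking_probability(h, max_hops):.2f}")
```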
The ranking probability approach and its usage in design and analysis of large-scale studies.
Kuo, Chia-Ling; Zaykin, Dmitri
2013-01-01
In experiments with many statistical tests there is a need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal [Formula: see text]-level such as 0.05 is adjusted by the number of tests, [Formula: see text], i.e., as 0.05/[Formula: see text]. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed [Formula: see text]-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability [Formula: see text] is controlled, defined as the probability of making at least [Formula: see text] correct rejections while rejecting hypotheses with [Formula: see text] smallest P-values. The two approaches are statistically related. The probability that the smallest P-value is a true signal (i.e., [Formula: see text]) is equal to the power at the level [Formula: see text], to an excellent approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when [Formula: see text] is very large and [Formula: see text] is small. Precision is largely restored when three values with the respective abundances are used instead of a single typical effect size value.
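The ranking probability is straightforward to approximate by simulation. In the sketch below, the total number of tests, the number of true signals, and a common effect size are illustrative assumptions (the paper works with a distribution of effect sizes); one-sided z-tests stand in for whatever test statistics a real study would use.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Monte Carlo sketch of the ranking probability: the chance that at least u of the
# k smallest P-values come from true signals.  m, n_signals and the effect size
# are assumed values for illustration only.

def ranking_probability(u, k, m=10_000, n_signals=20, effect=4.0, n_rep=200):
    hits = 0
    for _ in range(n_rep):
        z = rng.standard_normal(m)
        z[:n_signals] += effect                 # true signals get a mean shift
        pvals = norm.sf(z)                      # one-sided P-values
        top_k = np.argsort(pvals)[:k]           # indices of the k smallest P-values
        if np.sum(top_k < n_signals) >= u:      # how many of them are true signals
            hits += 1
    return hits / n_rep

print(ranking_probability(u=1, k=1))    # ~ probability the smallest P-value is a true signal
print(ranking_probability(u=5, k=10))
```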
Unequal Probability Marking Approach to Enhance Security of Traceback Scheme in Tree-Based WSNs
Huang, Changqin; Ma, Ming; Liu, Xiao; Liu, Anfeng; Zuo, Zhengbang
2017-01-01
Fog (from core to edge) computing is a newly emerging computing platform, which utilizes a large number of network devices at the edge of a network to provide ubiquitous computing, thus having great development potential. However, the issue of security poses an important challenge for fog computing. In particular, the Internet of Things (IoT) that constitutes the fog computing platform is crucial for preserving the security of a huge number of wireless sensors, which are vulnerable to attack. In this paper, a new unequal probability marking approach is proposed to enhance the security performance of logging and migration traceback (LM) schemes in tree-based wireless sensor networks (WSNs). The main contribution of this paper is to overcome the deficiencies of the LM scheme, achieving a longer network lifetime and a larger storage space. In the unequal probability marking logging and migration (UPLM) scheme of this paper, different marking probabilities are adopted for different nodes according to their distances to the sink. A large marking probability is assigned to nodes in remote areas (areas at a long distance from the sink), while a small marking probability is applied to nodes in nearby areas (areas at a short distance from the sink). This reduces the consumption of storage and energy in addition to enhancing the security performance, lifetime, and storage capacity. Marking information is migrated to nodes at a longer distance from the sink to increase the amount of stored marking information, thus enhancing the security performance in the process of migration. The experimental simulation shows that for general tree-based WSNs, the UPLM scheme proposed in this paper can store 1.12–1.28 times the amount of marking information that the equal probability marking approach achieves, and has 1.15–1.26 times the storage utilization efficiency compared with other schemes. PMID:28629135
Determining Optimal Evacuation Decision Policies for Disasters
2012-03-01
Table-of-contents fragments only: Calculating the Hit Probability (Phit); Phit versus Vertical Volatility; Large Probability Matrix (Map); Particle Trajectory with Phit data; Cost-To ...
40 CFR 144.61 - Definitions of terms as used in this subpart.
Code of Federal Regulations, 2010 CFR
2010-07-01
... operating cycle of the business. Current liabilities means obligations whose liquidation is reasonably... business community. Assets means all existing and all probable future economic benefits obtained or...
Research on quantitative relationship between NIIRS and the probabilities of discrimination
NASA Astrophysics Data System (ADS)
Bai, Honggang
2011-08-01
There are a large number of electro-optical (EO) and infrared (IR) sensors used on military platforms, including ground vehicles, low-altitude air vehicles, high-altitude air vehicles, and satellite systems. Ground vehicle and low-altitude air vehicle (rotary and fixed-wing aircraft) sensors typically use the probabilities of discrimination (detection, recognition, and identification) as design requirements and system performance indicators. High-altitude air vehicles and satellite sensors have traditionally used the National Imagery Interpretability Rating Scale (NIIRS) performance measures for guidance in design and as measures of system performance. Recently, there has been a large effort to make strategic sensor information available to tactical forces and to make target acquisition information usable by strategic systems. In this paper, the two techniques for sensor design, the probabilities of discrimination and NIIRS, are presented separately. For typical infrared remote sensor design parameters, the probability of recognition and the NIIRS scale are given as functions of range R for two targets of different size, the Standard NATO Target and the M1 Abrams, based on algorithms for predicting field performance and NIIRS. For four targets of different size (the Standard NATO Target, M1 Abrams, F-15, and B-52), the conversion from NIIRS to the probabilities of discrimination is derived and calculated, and the similarities and differences between NIIRS and the probabilities of discrimination are analyzed on the basis of the results. Comparisons with preliminary calculation results show that a conversion between NIIRS and the probabilities of discrimination is feasible, although more validation experiments are needed.
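The probabilities-of-discrimination side of this comparison is commonly computed with the Johnson criteria and the target transfer probability function, as in the hedged sketch below. This is the standard target-acquisition formulation, not necessarily the exact algorithm of the paper; the sensor resolving power (cycles per milliradian) and the target critical dimension are assumed values.

```python
# Classical Johnson-criteria model with the target transfer probability function.
# N50 values of 0.75 / 3 / 6 cycles for detection / recognition / identification
# are the conventional criteria; the sensor resolving power and target size are
# assumed for illustration only.

def p_discrimination(resolvable_cycles, n50):
    """Probability of performing a task given resolvable cycles across the target."""
    ratio = resolvable_cycles / n50
    e = 2.7 + 0.7 * ratio
    return ratio**e / (1.0 + ratio**e)

def cycles_on_target(range_km, critical_dim_m=2.3, freq_cyc_per_mrad=8.0):
    """Resolvable cycles across a target of critical dimension d at range R,
    for a sensor resolving freq_cyc_per_mrad (assumed value)."""
    return freq_cyc_per_mrad * critical_dim_m / range_km

N50 = {"detection": 0.75, "recognition": 3.0, "identification": 6.0}
for r in (1.0, 2.0, 4.0, 8.0):
    n = cycles_on_target(r)
    probs = ", ".join(f"P_{task[:3]} = {p_discrimination(n, n50):.2f}" for task, n50 in N50.items())
    print(f"R = {r:>4.1f} km, N = {n:5.2f} cycles: {probs}")
```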
Significance of stress transfer in time-dependent earthquake probability calculations
Parsons, T.
2005-01-01
A sudden change in stress is seen to modify earthquake rates, but should it also revise earthquake probability? The data used to derive input parameters permit an array of forecasts, so how large a static stress change is required to cause a statistically significant earthquake probability change? To answer that question, the effects of parameter and philosophical choices are examined through all phases of sample calculations. Drawing at random from distributions of recurrence-aperiodicity pairs identifies many that recreate long paleoseismic and historic earthquake catalogs. Probability density functions built from the recurrence-aperiodicity pairs give the range of possible earthquake forecasts under a point-process renewal model. Consequences of choices made in stress transfer calculations, such as different slip models, fault rake, dip, and friction, are tracked. For interactions among large faults, calculated peak stress changes may be localized, with most of the receiving fault area changed less than the mean. Thus, to avoid overstating probability change on segments, stress change values should be drawn from a distribution reflecting the spatial pattern rather than using the segment mean. Disparity resulting from interaction probability methodology is also examined. For a fault with a well-understood earthquake history, a minimum stress change to stressing rate ratio of 10:1 to 20:1 is required to significantly skew probabilities with >80-85% confidence. That ratio must be closer to 50:1 to exceed 90-95% confidence levels. Thus, revision to earthquake probability is achievable when a perturbing event is very close to the fault in question or the tectonic stressing rate is low.
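The kind of calculation at issue can be sketched with a renewal-model conditional probability in which a static stress step advances the elapsed time on the fault by Δτ divided by the tectonic stressing rate. The lognormal recurrence model and all parameter values below are illustrative assumptions, not the distributions explored in the paper.

```python
import math

def lognormal_cdf(t, mean_recurrence, aperiodicity):
    """CDF of a lognormal recurrence model parameterized by mean and aperiodicity (CoV)."""
    sigma2 = math.log(1.0 + aperiodicity**2)
    mu = math.log(mean_recurrence) - 0.5 * sigma2
    return 0.5 * math.erfc(-(math.log(t) - mu) / math.sqrt(2.0 * sigma2))

def conditional_probability(elapsed, window, mean_recurrence, aperiodicity,
                            stress_step=0.0, stressing_rate=1.0):
    """P(event in the next `window` years | quiet for `elapsed` years), with the
    elapsed time advanced by stress_step / stressing_rate (permanent clock change)."""
    t_eff = elapsed + stress_step / stressing_rate
    f_now = lognormal_cdf(t_eff, mean_recurrence, aperiodicity)
    f_end = lognormal_cdf(t_eff + window, mean_recurrence, aperiodicity)
    return (f_end - f_now) / (1.0 - f_now)

base = conditional_probability(150.0, 30.0, 200.0, 0.5)
# A 0.5-bar stress step on a fault stressed at 0.05 bar/yr advances the clock by 10 yr.
perturbed = conditional_probability(150.0, 30.0, 200.0, 0.5,
                                    stress_step=0.5, stressing_rate=0.05)
print(f"30-yr probability: {base:.3f} -> {perturbed:.3f} after the stress step")
```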
Defining Baconian Probability for Use in Assurance Argumentation
NASA Technical Reports Server (NTRS)
Graydon, Patrick J.
2016-01-01
The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated (which they call Baconian probability) and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings, which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.