Science.gov

Sample records for 5-percent probability billion

  1. 30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...

  2. 30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...

  3. 30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...

  4. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 4 2014-04-01 2014-04-01 false Definitions and rules relating to a 5-percent..., 1990, but with respect to any group of persons that pursuant to a formal or informal understanding among themselves makes a coordinated acquisition of stock before November 20, 1990, only if the...

  5. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... shareholder. 1.382-3 Section 1.382-3 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... relating to a 5-percent shareholder. (a) Definitions—(1) Entity—(i) In general. An entity is any... old shareholders of L (a total of 60 percent of L stock), with each member purchasing 30 shares....

  6. Evaluation of EA-934NA with 2.5 percent Cab-O-Sil

    NASA Technical Reports Server (NTRS)

    Caldwell, Gordon A.

    1990-01-01

    Currently, Hysol adhesive EA-934NA is used to bond the Field Joint Protection System on the Shuttle rocket motors at Kennedy Space Center. However, processing problems have arisen, and an adhesive with a higher viscosity is needed to alleviate them. One possible solution is to add Cab-O-Sil to the current adhesive. The adhesive strength and bond strengths that can be obtained when 2.5 percent Cab-O-Sil is added to adhesive EA-934NA are examined and tested over a range of test temperatures from -20 to 300 F. Tensile adhesion button and lap shear specimens were bonded to D6AC steel, and uniaxial tensile specimens (testing for strength, initial tangent modulus, elongation and Poisson's ratio) were prepared using Hysol adhesive EA-934NA with 2.5 percent Cab-O-Sil added. These specimens were tested at -20, 20, 75, 100, 125, 150, 200, 250, and 300 F. Additional tensile adhesion button specimens bonding Rust-Oleum primed and painted D6AC steel to itself and to cork using adhesive EA-934NA with 2.5 percent Cab-O-Sil added were tested at 20, 75, 125, 200, and 300 F. Results generally show decreasing strength values with increasing test temperature. The bond strengths obtained using cork as a substrate were totally dependent on the cohesive strength of the cork.

  7. 25 CFR 134.1 - Partial reimbursement of irrigation charges; 5 percent per annum of cost of system, June 30, 1920.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Partial reimbursement of irrigation charges; 5 percent..., DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES PARTIAL PAYMENT CONSTRUCTION CHARGES ON INDIAN IRRIGATION PROJECTS § 134.1 Partial reimbursement of irrigation charges; 5 percent per annum of cost of system,...

  8. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... methane is reduced to less than 0.5 percent, electrical power shall be deenergized in affected areas, except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part...

  9. 30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Actions at 2.0 to 2.5 percent methane in...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches...

  10. 30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Actions at 2.0 to 2.5 percent methane in...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches...

  11. 30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 2.0 to 2.5 percent methane in...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches...

  12. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part 18... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation §...

  13. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part 18... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation §...

  14. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part 18... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation §...

  15. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines). 57.22232 Section 57.22232 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL MINES...

  16. 8.5 percent efficient screen-printed CdS/CdTe solar cell produced on a 5-cm x 10-cm glass substrate

    NASA Astrophysics Data System (ADS)

    Matsumoto, H.; Nakano, A.; Komatsu, Y.; Uda, H.; Kuribayashi, K.; Ikegami, S.

    1983-02-01

    The preparation conditions of CdS sintered film for 5-cm x 10-cm screen-printed CdS/CdTe solar cells were investigated. Increasing the belt speed of the belt furnace increased the residual amount of Cl ions in the CdS sintered film and lowered the efficiency of the cell. The optimum belt speed was 2 cm/min, corresponding to a sintering time of 90 min. The thickness of the CdS film was changed by changing the screen thickness. Increasing the thickness of the CdS film lowered its surface resistivity and improved the fill factor of a cell. A solar cell of 8.5 percent intrinsic efficiency was obtained from CdS film printed by an 80 mesh screen and sintered at 690 C at a belt speed of 2 cm/min.

  17. Induced Probabilities.

    ERIC Educational Resources Information Center

    Neel, John H.

    Induced probabilities have been largely ignored by educational researchers. Simply stated, if a new random variable is defined in terms of a first random variable, then the induced probability is the probability or density of the new random variable, which can be found by summation or integration over the appropriate domains of the original random…

  18. Probability Theory

    NASA Astrophysics Data System (ADS)

    Jaynes, E. T.; Bretthorst, G. Larry

    2003-04-01

    Foreword; Preface; Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'; Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory; References; Appendix A. Other approaches to probability theory; Appendix B. Mathematical formalities and style; Appendix C. Convolutions and cumulants.

  19. World population beyond six billion.

    PubMed

    Gelbard, A; Haub, C; Kent, M M

    1999-03-01

    This world report reviews population growth pre-1900, population change during 1900-50 and 1950-2000, causes and effects of population change and projections to 2050. World population grew from 2 billion in 1900 to almost 6 billion in 2000. Population showed more rapid growth in the 17th and 18th centuries. Better hygiene and public sanitation in the 19th century led to expanded life expectancies and quicker growth, primarily in developed countries. Demographic transition in the 19th and 20th centuries was the result of shifts from high to low mortality and fertility. The pace of change varies with culture, level of economic development, and other factors. Not all countries follow the same path of change. The reproductive revolution in the mid-20th century and modern contraception led to greater individual control of fertility and the potential for rapid fertility decline. Political and cultural barriers that limit access affect the pace of decline. Population change is also affected by migration. Migration has the largest effect on the distribution of population. Bongaarts explains differences in fertility by the proportion in unions, contraceptive prevalence, infertility, and abortion. Educational status has a strong impact on adoption of family planning. Poverty is associated with multiple risks. In 2050, population could reach 10.7 billion or remain low at 7.3 billion.

  20. Impingement of Cloud Droplets on 36.5-Percent-Thick Joukowski Airfoil at Zero Angle of Attack and Discussion of Use as Cloud Measuring Instrument in Dye-Tracer Technique

    NASA Technical Reports Server (NTRS)

    Brun, R. J.; Vogt, Dorothea E.

    1957-01-01

    The trajectories of droplets in the air flowing past a 36.5-percent-thick Joukowski airfoil at zero angle of attack were determined. The amount of water in droplet form impinging on the airfoil, the area of droplet impingement, and the rate of droplet impingement per unit area on the airfoil surface were calculated from the trajectories and cover a large range of flight and atmospheric conditions. With the detailed impingement information available, the 36.5-percent-thick Joukowski airfoil can serve a dual purpose: as the principal element in instruments for making measurements in clouds, and as a basic shape for estimating impingement on a thick streamlined body. Methods and examples are presented to illustrate some limitations when the airfoil is used as the principal element in the dye-tracer technique.

  1. Survival probability in patients with liver trauma.

    PubMed

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.

  2. Survival probability in patients with liver trauma.

    PubMed

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma. PMID:27477933

  3. Spend Billions and They Will Come

    ERIC Educational Resources Information Center

    Fox, Bette-Lee

    2004-01-01

    People look at one billion dollars in one of two ways: if it is the result of the long, hard effort of years of fundraising, they rejoice; if it signifies an astronomical budget deficit, they cringe. How, then, should people respond as a community to reaching the $1 billion mark ($1,242,436,438, to be exact) in this year's spending for public…

  4. Countdown to Six Billion Teaching Kit.

    ERIC Educational Resources Information Center

    Zero Population Growth, Inc., Washington, DC.

    This teaching kit features six activities focused on helping students understand the significance of the world population reaching six billion for our society and our environment. Featured activities include: (1) History of the World: Part Six Billion; (2) A Woman's Place; (3) Baby-O-Matic; (4) Earth: The Apple of Our Eye; (5) Needs vs. Wants; and…

  5. Life with Four Billion Atoms

    SciTech Connect

    Knight, Thomas

    2013-04-10

    Today it is commonplace to design and construct single silicon chips with billions of transistors. These are complex systems, difficult (but possible) to design, test, and fabricate. Remarkably, simple living systems can be assembled from a similar number of atoms, most of them in water molecules. In this talk I will present the current status of our attempts at full understanding and complexity reduction of one of the simplest living systems, the free-living bacterial species Mesoplasma florum. This 400 nm diameter cell thrives and replicates every 40 minutes with a genome of only 800 kilobases. Our recent experiments using transposon gene knockouts identified 354 of 683 annotated genes as inessential in laboratory culture when inactivated individually. While a functional redesigned genome will certainly not remove all of those genes, this suggests that roughly half the genome can be removed in an intentional redesign. I will discuss our recent knockout results and methodology, and our future plans for Genome re-engineering using targeted knock-in/knock-out double recombination; whole cell metabolic models; comprehensive whole cell metabolite measurement techniques; creation of plug-and-play metabolic modules for the simplified organism; inherent and engineered biosafety control mechanisms. This redesign is part of a comprehensive plan to lay the foundations for a new discipline of engineering biology. Engineering biological systems requires a fundamentally different viewpoint from that taken by the science of biology. Key engineering principles of modularity, simplicity, separation of concerns, abstraction, flexibility, hierarchical design, isolation, and standardization are of critical importance. The essence of engineering is the ability to imagine, design, model, build, and characterize novel systems to achieve specific goals. Current tools and components for these tasks are primitive. Our approach is to create and distribute standard biological parts

  6. Knot probabilities in random diagrams

    NASA Astrophysics Data System (ADS)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
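
    A minimal sketch in Python of the Zipf-style check described above, using made-up knot-type counts (the paper's actual frequencies are in its supplementary data):

        import numpy as np

        # Hypothetical counts of diagrams by knot type, ordered from most to least common.
        counts = np.array([1_250_000, 310_000, 95_000, 41_000, 18_500, 9_200, 4_800])
        prob = counts / counts.sum()
        rank = np.arange(1, len(prob) + 1)

        # Fit log(probability) = a + b*log(rank); an approximately linear fit with b < 0
        # is the analogue of Zipf's law for word frequencies noted in the abstract.
        b, a = np.polyfit(np.log(rank), np.log(prob), 1)
        print("fitted log-log slope:", b)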

  7. High-Reynolds-Number Test of a 5-Percent-Thick Low-Aspect-Ratio Semispan Wing in the Langley 0.3-Meter Transonic Cryogenic Tunnel: Wing Pressure Distributions

    NASA Technical Reports Server (NTRS)

    Chu, Julio; Lawing, Pierce L.

    1990-01-01

    A high Reynolds number test of a 5 percent thick low aspect ratio semispan wing was conducted in the adaptive wall test section of the Langley 0.3 m Transonic Cryogenic Tunnel. The model tested had a planform and a NACA 64A-105 airfoil section similar to those of the pressure-instrumented canard on the X-29 experimental aircraft. Chordwise pressure data for Mach numbers of 0.3, 0.7, and 0.9 were measured for an angle-of-attack range of -4 to 15 deg. The associated Reynolds numbers, based on the geometric mean chord, encompass most of the flight regime of the canard. This test was a free-transition investigation. A summary of the wing pressures is presented without analysis, along with the adapted test section top and bottom wall pressure signatures. However, the presented graphical data indicate Reynolds-number-dependent complex leading-edge separation phenomena. This data set supplements the existing high Reynolds number database and is useful for comparison with computational codes.

  8. Atmospheric oxygenation three billion years ago.

    PubMed

    Crowe, Sean A; Døssing, Lasse N; Beukes, Nicolas J; Bau, Michael; Kruger, Stephanus J; Frei, Robert; Canfield, Donald E

    2013-09-26

    It is widely assumed that atmospheric oxygen concentrations remained persistently low (less than 10(-5) times present levels) for about the first 2 billion years of Earth's history. The first long-term oxygenation of the atmosphere is thought to have taken place around 2.3 billion years ago, during the Great Oxidation Event. Geochemical indications of transient atmospheric oxygenation, however, date back to 2.6-2.7 billion years ago. Here we examine the distribution of chromium isotopes and redox-sensitive metals in the approximately 3-billion-year-old Nsuze palaeosol and in the near-contemporaneous Ijzermyn iron formation from the Pongola Supergroup, South Africa. We find extensive mobilization of redox-sensitive elements through oxidative weathering. Furthermore, using our data we compute a best minimum estimate for atmospheric oxygen concentrations at that time of 3 × 10(-4) times present levels. Overall, our findings suggest that there were appreciable levels of atmospheric oxygen about 3 billion years ago, more than 600 million years before the Great Oxidation Event and some 300-400 million years earlier than previous indications for Earth surface oxygenation.

  9. Atmospheric oxygenation three billion years ago.

    PubMed

    Crowe, Sean A; Døssing, Lasse N; Beukes, Nicolas J; Bau, Michael; Kruger, Stephanus J; Frei, Robert; Canfield, Donald E

    2013-09-26

    It is widely assumed that atmospheric oxygen concentrations remained persistently low (less than 10(-5) times present levels) for about the first 2 billion years of Earth's history. The first long-term oxygenation of the atmosphere is thought to have taken place around 2.3 billion years ago, during the Great Oxidation Event. Geochemical indications of transient atmospheric oxygenation, however, date back to 2.6-2.7 billion years ago. Here we examine the distribution of chromium isotopes and redox-sensitive metals in the approximately 3-billion-year-old Nsuze palaeosol and in the near-contemporaneous Ijzermyn iron formation from the Pongola Supergroup, South Africa. We find extensive mobilization of redox-sensitive elements through oxidative weathering. Furthermore, using our data we compute a best minimum estimate for atmospheric oxygen concentrations at that time of 3 × 10(-4) times present levels. Overall, our findings suggest that there were appreciable levels of atmospheric oxygen about 3 billion years ago, more than 600 million years before the Great Oxidation Event and some 300-400 million years earlier than previous indications for Earth surface oxygenation. PMID:24067713

  10. Billion shot flashlamp for spaceborne lasers

    NASA Technical Reports Server (NTRS)

    Richter, Linda; Schuda, Felix; Degnan, John

    1990-01-01

    A billion-shot flashlamp developed under a NASA contract for spaceborne laser missions is presented. Lifetime-limiting mechanisms are identified and addressed. Two energy loadings of 15 and 44 Joules were selected for the initial accelerated life testing. A fluorescence-efficiency test station was used for measuring the useful-light output degradation of the lamps. The design characteristics meeting NASA specifications are outlined. Attention is focused on the physical properties of tungsten-matrix cathodes, the chemistry of dispenser cathodes, and anode degradation. It is reported that out of the total 83 lamps tested in the program, 4 lamps reached a billion shots and one lamp is beyond 1.7 billion shots, while at 44 Joules, 4 lamps went beyond 100 million shots and one lamp reached 500 million shots.

  11. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
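
    The record does not list the article's three problems; one classical example of the phenomenon (an illustration here, not necessarily one of the article's problems) is the matching problem: the probability that a random permutation of n letters places none in its own envelope is

        P(\text{no match}) = \sum_{k=0}^{n} \frac{(-1)^k}{k!} \;\xrightarrow[n\to\infty]{}\; e^{-1} \approx 0.368.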

  12. Where Have All the Billions Gone?

    ERIC Educational Resources Information Center

    Leask, Linda; And Others

    1987-01-01

    Providing a basis to help Alaskans determine future spending levels and priorities, this report traces how the state spent more than $26 billion in general funds from fiscal years 1981 through 1986 before oil prices crashed and brought state revenues tumbling down with them. Figures indicate that cumulative general fund expenditures over the…

  13. Thirteen Billion Years in Half AN Hour

    NASA Astrophysics Data System (ADS)

    Bassett, Bruce A.

    2005-10-01

    We take a high-speed tour of the approximately thirteen billion-year history of our universe focusing on unsolved mysteries and the key events that have sculpted and shaped it - from inflation in the first split second to the dark energy which is currently causing the expansion of the cosmos to accelerate.

  14. On Probability Domains III

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2015-12-01

    Domains of generalized probability have been introduced in order to provide a general construction of random events, observables and states. It is based on the notion of a cogenerator and the properties of product. We continue our previous study and show how some other quantum structures fit our categorical approach. We discuss how various epireflections implicitly used in the classical probability theory are related to the transition to fuzzy probability theory and describe the latter probability theory as a genuine categorical extension of the former. We show that the IF-probability can be studied via the fuzzy probability theory. We outline a "tensor modification" of the fuzzy probability theory.

  15. Probability forecast of the suspended sediment concentration using copula

    NASA Astrophysics Data System (ADS)

    Yu, Kun-xia; Li, Peng; Li, Zhanbin

    2016-04-01

    An approach for probability forecasting of suspended sediment loads is presented. The probability forecast model is based on the joint probability distribution of water discharge and suspended sediment concentration: once that joint distribution is constructed, the conditional distribution function of suspended sediment concentration given water discharge is evaluated, and the probability forecast of suspended sediment concentration is carried out in terms of this conditional distribution. The approach is exemplified using annual data sets from ten watersheds in the middle Yellow River, a region characterized by heavy sediment. The three-parameter Gamma distribution is employed to fit the marginal distributions of annual water discharge and annual suspended sediment concentration, and the Gumbel copula describes well the dependence structure between them. Annual suspended sediment concentration estimated from the conditional distribution function at a forecast probability of 50 percent agrees better with the observed suspended sediment concentration values than the traditional sediment rating curve method given the same water discharge values. The overwhelming majority of observed suspended sediment concentration points lie between the forecasts at probabilities of 5 percent and 95 percent, which can be taken as the lower and upper bounds of a 90 percent uncertainty band for the predicted observation. The results indicate that probability forecasting based on the conditional distribution function is a potential alternative for estimating suspended sediment and other hydrological variables.
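
    A minimal sketch in Python of the forecasting recipe described above (Gamma marginals, Gumbel copula, conditional quantiles); the synthetic data, parameter values, and function names are illustrative assumptions, not the authors' code or data:

        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(0)
        # Synthetic annual water discharge (q) and suspended sediment concentration (s), correlated.
        q = stats.gamma.rvs(a=3.0, scale=50.0, size=60, random_state=rng)
        s = 0.4 * q + stats.gamma.rvs(a=2.0, scale=10.0, size=60, random_state=rng)

        # Fit three-parameter Gamma marginals (shape, location, scale), as in the study.
        q_params = stats.gamma.fit(q)
        s_params = stats.gamma.fit(s)

        # Gumbel copula parameter from Kendall's tau: theta = 1 / (1 - tau).
        tau = stats.kendalltau(q, s)[0]
        theta = 1.0 / (1.0 - tau)

        def gumbel_cond_cdf(v, u, theta):
            """P(V <= v | U = u) for the Gumbel copula, i.e. the partial derivative dC(u, v)/du."""
            lu, lv = -np.log(u), -np.log(v)
            w = (lu**theta + lv**theta) ** (1.0 / theta)
            return np.exp(-w) * lu**(theta - 1.0) * w**(1.0 - theta) / u

        def forecast_s(q_obs, prob):
            """Sediment concentration with non-exceedance probability `prob`, given discharge q_obs."""
            u = stats.gamma.cdf(q_obs, *q_params)
            root = optimize.brentq(lambda v: gumbel_cond_cdf(v, u, theta) - prob, 1e-9, 1 - 1e-9)
            return stats.gamma.ppf(root, *s_params)

        q_obs = 180.0
        print("median forecast:", forecast_s(q_obs, 0.50))
        print("5%-95% band    :", forecast_s(q_obs, 0.05), forecast_s(q_obs, 0.95))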

  16. Four billion people facing severe water scarcity

    PubMed Central

    Mekonnen, Mesfin M.; Hoekstra, Arjen Y.

    2016-01-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare. PMID:26933676

  17. Four billion people facing severe water scarcity.

    PubMed

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare.

  18. Four billion people facing severe water scarcity.

    PubMed

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare. PMID:26933676

  19. Probability on a Budget.

    ERIC Educational Resources Information Center

    Ewbank, William A.; Ginther, John L.

    2002-01-01

    Describes how to use common dice numbered 1-6 for simple mathematical situations including probability. Presents a lesson using regular dice and specially marked dice to explore some of the concepts of probability. (KHR)

  20. Is quantum probability rational?

    PubMed

    Houston, Alasdair I; Wiesner, Karoline

    2013-06-01

    We concentrate on two aspects of the article by Pothos & Busemeyer (P&B): the relationship between classical and quantum probability and quantum probability as a basis for rational decisions. We argue that the mathematical relationship between classical and quantum probability is not quite what the authors claim. Furthermore, it might be premature to regard quantum probability as the best practical rational scheme for decision making.

  1. Predicted probabilities' relationship to inclusion probabilities.

    PubMed

    Fang, Di; Chong, Jenny; Wilson, Jeffrey R

    2015-05-01

    It has been shown that under a general multiplicative intercept model for risk, case-control (retrospective) data can be analyzed by maximum likelihood as if they had arisen prospectively, up to an unknown multiplicative constant, which depends on the relative sampling fraction. (1) With suitable auxiliary information, retrospective data can also be used to estimate response probabilities. (2) In other words, predictive probabilities obtained without adjustments from retrospective data will likely be different from those obtained from prospective data. We highlighted this using binary data from Medicare to determine the probability of readmission into the hospital within 30 days of discharge, which is particularly timely because Medicare has begun penalizing hospitals for certain readmissions. (3).
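
    A standard way to make the needed adjustment concrete (a textbook sketch of the multiplicative-intercept result, not a formula quoted from the article): under case-control sampling with sampling fractions \pi_1 for cases and \pi_0 for controls, the retrospective logistic fit recovers the prospective slopes \beta but shifts the intercept,

        \alpha_{\text{retro}} = \alpha_{\text{pro}} + \log\frac{\pi_1}{\pi_0},
        \qquad
        \hat{P}(Y = 1 \mid x) = \operatorname{expit}\!\big(\hat{\alpha}_{\text{retro}} - \log(\pi_1/\pi_0) + x^{\top}\hat{\beta}\big),

    so predicted probabilities taken directly from retrospective data, without the log sampling-fraction correction, differ from the prospective ones exactly as the abstract warns.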

  2. Racing To Understand Probability.

    ERIC Educational Resources Information Center

    Van Zoest, Laura R.; Walker, Rebecca K.

    1997-01-01

    Describes a series of lessons designed to supplement textbook instruction of probability by addressing the ideas of "equally likely,""not equally likely," and "fairness," as well as to introduce the difference between theoretical and experimental probability. Presents four lessons using The Wind Racer games to study probability. (ASK)

  3. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  4. Searching with probabilities

    SciTech Connect

    Palay, A.J.

    1985-01-01

    This book examines how probability distributions can be used as a knowledge representation technique. It presents a mechanism that can be used to guide a selective search algorithm to solve a variety of tactical chess problems. Topics covered include probabilities and searching, the B* algorithm, chess probabilities in practice, examples, results, and future work.

  5. In All Probability, Probability is not All

    ERIC Educational Resources Information Center

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  6. Probability of satellite collision

    NASA Technical Reports Server (NTRS)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  7. The updated billion-ton resource assessment

    SciTech Connect

    Turhollow, Anthony; Perlack, Robert; Eaton, Laurence; Langholtz, Matthew; Brandt, Craig; Downing, Mark; Wright, Lynn; Skog, Kenneth; Hellwinckel, Chad; Stokes, Bryce; Lebow, Patricia

    2014-10-03

    This paper summarizes the results of an update to a resource assessment, published in 2005, commonly referred to as the billion-ton study (BTS). The updated results are consistent with the 2005 BTS in terms of overall magnitude. However, in looking at the major categories of feedstocks, the forest residue biomass potential was determined to be less owing to tighter restrictions on forest residue supply, including restrictions due to limited projected increase in traditional harvest for pulpwood and sawlogs. The crop residue potential was also determined to be less because of the consideration of soil carbon and not allowing residue removal from conventionally tilled corn acres. The energy crop potential was estimated to be much greater, largely because of land availability and modeling of competition among various competing uses of the land. Generally, the scenario assumptions in the updated assessment are much more plausible to show a billion-ton resource, which would be sufficient to displace 30% or more of the country's present petroleum consumption.

  8. The updated billion-ton resource assessment

    DOE PAGES

    Turhollow, Anthony; Perlack, Robert; Eaton, Laurence; Langholtz, Matthew; Brandt, Craig; Downing, Mark; Wright, Lynn; Skog, Kenneth; Hellwinckel, Chad; Stokes, Bryce; et al

    2014-10-03

    This paper summarizes the results of an update to a resource assessment, published in 2005, commonly referred to as the billion-ton study (BTS). The updated results are consistent with the 2005 BTS in terms of overall magnitude. However, in looking at the major categories of feedstocks, the forest residue biomass potential was determined to be less owing to tighter restrictions on forest residue supply, including restrictions due to limited projected increase in traditional harvest for pulpwood and sawlogs. The crop residue potential was also determined to be less because of the consideration of soil carbon and not allowing residue removal from conventionally tilled corn acres. The energy crop potential was estimated to be much greater, largely because of land availability and modeling of competition among various competing uses of the land. Generally, the scenario assumptions in the updated assessment are much more plausible to show a billion-ton resource, which would be sufficient to displace 30% or more of the country's present petroleum consumption.

  9. Endemic Cardiovascular Diseases of the Poorest Billion.

    PubMed

    Kwan, Gene F; Mayosi, Bongani M; Mocumbi, Ana O; Miranda, J Jaime; Ezzati, Majid; Jain, Yogesh; Robles, Gisela; Benjamin, Emelia J; Subramanian, S V; Bukhman, Gene

    2016-06-14

    The poorest billion people are distributed throughout the world, though most are concentrated in rural sub-Saharan Africa and South Asia. Cardiovascular disease (CVD) data can be sparse in low- and middle-income countries beyond urban centers. Despite this urban bias, CVD registries from the poorest countries have long revealed a predominance of nonatherosclerotic stroke, hypertensive heart disease, nonischemic and Chagas cardiomyopathies, rheumatic heart disease, and congenital heart anomalies, among others. Ischemic heart disease has been relatively uncommon. Here, we summarize what is known about the epidemiology of CVDs among the world's poorest people and evaluate the relevance of global targets for CVD control in this population. We assessed both primary data sources and modeled estimates from the 2013 Global Burden of Disease Study in the world's 16 poorest countries, where 62% of the population are among the poorest billion. We found that ischemic heart disease accounted for only 12% of the combined CVD and congenital heart anomaly disability-adjusted life years (DALYs) in the poorest countries, compared with 51% of DALYs in high-income countries. We found that as little as 53% of the combined CVD and congenital heart anomaly burden (1629/3049 DALYs per 100 000) was attributed to behavioral or metabolic risk factors in the poorest countries (eg, in Niger, 82% of the population among the poorest billion) compared with 85% of the combined CVD and congenital heart anomaly burden (4439/5199 DALYs) in high-income countries. Further, of the combined CVD and congenital heart anomaly burden, 34% was accrued in people under age 30 years in the poorest countries, while only 3% was accrued under age 30 years in high-income countries. We conclude that although the current global targets for noncommunicable disease and CVD control will help diminish premature CVD death in the poorest populations, they are not sufficient. Specifically, the current framework (1) excludes deaths of

  10. Simulating Billion-Task Parallel Programs

    SciTech Connect

    Perumalla, Kalyan S; Park, Alfred J

    2014-01-01

    In simulating large parallel systems, bottom-up approaches exercise detailed hardware models with effects from simplified software models or traces, whereas top-down approaches evaluate the timing and functionality of detailed software models over coarse hardware models. Here, we focus on the top-down approach and significantly advance the scale of the simulated parallel programs. Via the direct execution technique combined with parallel discrete event simulation, we stretch the limits of the top-down approach by simulating message passing interface (MPI) programs with millions of tasks. Using a timing-validated benchmark application, a proof-of-concept scaling level is achieved to over 0.22 billion virtual MPI processes on 216,000 cores of a Cray XT5 supercomputer, representing one of the largest direct execution simulations to date, combined with a multiplexing ratio of 1024 simulated tasks per real task.
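
    The two scale figures quoted above are mutually consistent (assuming one real MPI task per core):

        216{,}000 \text{ cores} \times 1024 \text{ simulated tasks per real task} = 221{,}184{,}000 \approx 0.22 \times 10^{9} \text{ virtual MPI processes}.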

  11. Abstract Models of Probability

    NASA Astrophysics Data System (ADS)

    Maximov, V. M.

    2001-12-01

    Probability theory presents a mathematical formalization of intuitive ideas of independent events and of probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements have been proposed for such notions as events, independence, random value, etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. On the other hand, physicists have tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently, the need to formalize quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore, a natural question arises: how does one describe algebraic structures whose elements can be used as a measure of randomness? As a consequence, it becomes necessary to define the type of randomness corresponding to every such algebraic structure. Possibly, this leads to another concept of randomness whose nature differs from the combinatorial-metric conception of Kolmogorov. Apparently, a discrepancy between the real type of randomness underlying some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.

  12. Probability with Roulette

    ERIC Educational Resources Information Center

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
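
    A worked expected-value computation of the kind the article builds on (the wheel type is not specified in this record; an American double-zero wheel with 38 pockets is assumed): for a $1 straight-up bet paying 35 to 1,

        E[\text{net payout}] = \frac{1}{38}(+35) + \frac{37}{38}(-1) = -\frac{2}{38} \approx -0.0526,

    i.e. the player loses about 5.3 cents per dollar wagered on average, which is the house edge that the betting-system patterns fluctuate around.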

  13. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  14. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
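
    The report's exact closed form is not reproduced in this record; a common formulation consistent with the description (position covariances, combined object size, nominal miss distance) writes the collision probability as a bivariate normal density integrated over the combined hard-body disc in the encounter plane:

        P_c = \iint_{x^2 + y^2 \le R^2} \frac{1}{2\pi\sqrt{\det C}}
              \exp\!\Big(-\tfrac{1}{2}\,(\mathbf{r} - \boldsymbol{\mu})^{\top} C^{-1} (\mathbf{r} - \boldsymbol{\mu})\Big)\, dx\, dy,

    where R is the sum of the two objects' effective radii, \boldsymbol{\mu} is the nominal separation at the point of closest approach projected into the encounter plane, and C is the combined relative position covariance in that plane.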

  15. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  16. Life: the first two billion years.

    PubMed

    Knoll, Andrew H; Bergmann, Kristin D; Strauss, Justin V

    2016-11-01

    Microfossils, stromatolites, preserved lipids and biologically informative isotopic ratios provide a substantial record of bacterial diversity and biogeochemical cycles in Proterozoic (2500-541 Ma) oceans that can be interpreted, at least broadly, in terms of present-day organisms and metabolic processes. Archean (more than 2500 Ma) sedimentary rocks add at least a billion years to the recorded history of life, with sedimentological and biogeochemical evidence for life at 3500 Ma, and possibly earlier; phylogenetic and functional details, however, are limited. Geochemistry provides a major constraint on early evolution, indicating that the first bacteria were shaped by anoxic environments, with distinct patterns of major and micronutrient availability. Archean rocks appear to record the Earth's first iron age, with reduced Fe as the principal electron donor for photosynthesis, oxidized Fe the most abundant terminal electron acceptor for respiration, and Fe a key cofactor in proteins. With the permanent oxygenation of the atmosphere and surface ocean ca 2400 Ma, photic zone O2 limited the access of photosynthetic bacteria to electron donors other than water, while expanding the inventory of oxidants available for respiration and chemoautotrophy. Thus, halfway through Earth history, the microbial underpinnings of modern marine ecosystems began to take shape.This article is part of the themed issue 'The new bacteriology'.

  17. Eight billion asteroids in the Oort cloud

    NASA Astrophysics Data System (ADS)

    Shannon, Andrew; Jackson, Alan P.; Veras, Dimitri; Wyatt, Mark

    2015-01-01

    The Oort cloud is usually thought of as a collection of icy comets inhabiting the outer reaches of the Solar system, but this picture is incomplete. We use simulations of the formation of the Oort cloud to show that ~4 per cent of the small bodies in the Oort cloud should have formed within 2.5 au of the Sun, and hence be ice-free rock-iron bodies. If we assume that these Oort cloud asteroids have the same size distribution as their cometary counterparts, the Large Synoptic Survey Telescope should find roughly a dozen Oort cloud asteroids during 10 years of operations. Measurement of the asteroid fraction within the Oort cloud can serve as an excellent test of the Solar system's formation and dynamical history. Oort cloud asteroids could be of particular concern as impact hazards as their high mass density, high impact velocity, and low visibility make them both hard to detect and hard to divert or destroy. However, they should be a rare class of object, and we estimate globally catastrophic collisions should only occur about once per billion years.

  18. Life: the first two billion years.

    PubMed

    Knoll, Andrew H; Bergmann, Kristin D; Strauss, Justin V

    2016-11-01

    Microfossils, stromatolites, preserved lipids and biologically informative isotopic ratios provide a substantial record of bacterial diversity and biogeochemical cycles in Proterozoic (2500-541 Ma) oceans that can be interpreted, at least broadly, in terms of present-day organisms and metabolic processes. Archean (more than 2500 Ma) sedimentary rocks add at least a billion years to the recorded history of life, with sedimentological and biogeochemical evidence for life at 3500 Ma, and possibly earlier; phylogenetic and functional details, however, are limited. Geochemistry provides a major constraint on early evolution, indicating that the first bacteria were shaped by anoxic environments, with distinct patterns of major and micronutrient availability. Archean rocks appear to record the Earth's first iron age, with reduced Fe as the principal electron donor for photosynthesis, oxidized Fe the most abundant terminal electron acceptor for respiration, and Fe a key cofactor in proteins. With the permanent oxygenation of the atmosphere and surface ocean ca 2400 Ma, photic zone O2 limited the access of photosynthetic bacteria to electron donors other than water, while expanding the inventory of oxidants available for respiration and chemoautotrophy. Thus, halfway through Earth history, the microbial underpinnings of modern marine ecosystems began to take shape.This article is part of the themed issue 'The new bacteriology'. PMID:27672146

  19. Uranium in Canada: A billion dollar industry

    SciTech Connect

    Ruzicka, V.

    1989-12-01

    In 1988, Canada maintained its position as the world's leading producer of uranium with an output of more than 12,400 MT of uranium in concentrates, worth $1.1 billion Canadian. As domestic requirements represent only 15% of current Canadian production, most of the output was exported. With current implementation of the Canada/US Free Trade Agreement, the US has become Canada's major uranium export customer. With a large share of the world's known uranium resources, Canada remains the focus of international uranium exploration activity. In 1988, the uranium exploration expenditures in Canada exceeded $58 million Canadian. The principal exploration targets were deposits associated with Proterozoic unconformities in Saskatchewan and Northwest Territories, particularly those in the Athabasca and Thelon basin regions of the Canadian Shield. Major attention was also paid to polymetallic deposits in which uranium is associated with precious metals, such as gold and platinum group elements. Conceptual genetic models for these deposit types represent useful tools to guide exploration.

  20. Site geotechnical considerations for expansion of the Strategic Petroleum Reserve (SPR) to one billion barrels

    SciTech Connect

    Neal, J.T.; Whittington, D.W.; Magorian, T.R.

    1991-01-01

    Eight Gulf Coast salt domes have emerged as candidate sites for possible expansion of the Strategic Petroleum Reserve (SPR) to one billion barrels. Two existing SPR sites, Big Hill, TX, and Weeks Island, LA, are among the eight that are being considered. To achieve the billion barrel capacity, some 25 new leached caverns would be constructed, and would probably be established in two separate sites in Louisiana and Texas because of distribution requirements. Geotechnical factors involved in siting studies have centered first and foremost on cavern integrity and environmental acceptability, once logistical suitability is realized. Other factors have involved subsidence and flooding potential, loss of coastal marshlands, seismicity, brine injection well utility, and co-use by multiple operators. 5 refs., 11 figs., 2 tabs.

  1. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  2. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    This paper is part of a discussion on Monte Carlo methods and outlines how to use probability expectations to approximate the value of a definite integral. The purpose of the paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
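
    A minimal sketch of the technique in Python (the article itself uses Visual Basic); the integrand and sample size are arbitrary choices:

        import random

        def mc_integral(f, a, b, n=100_000):
            """Estimate the integral of f over [a, b] as (b - a) * E[f(U)], U ~ Uniform(a, b)."""
            total = sum(f(random.uniform(a, b)) for _ in range(n))
            return (b - a) * total / n

        # Example: the integral of x**2 over [0, 1] is exactly 1/3.
        print(mc_integral(lambda x: x * x, 0.0, 1.0))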

  3. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
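
    A short illustration in Python (standing in for the article's Excel work) of the chain of ideas listed above; the parameter values are arbitrary:

        import numpy as np

        rng = np.random.default_rng(1)
        n, p, trials = 50, 0.3, 100_000

        # A Binomial(n, p) variable is the sum of n independent Bernoulli(p) variables,
        # so its mean is n*p and its variance n*p*(1-p).
        x = rng.binomial(n, p, size=trials)
        print("sample mean    :", x.mean(), " theory:", n * p)
        print("sample variance:", x.var(), " theory:", n * p * (1 - p))

        # Central limit theorem: the standardized sum is approximately standard normal,
        # so P(Z <= 1) should be close to 0.8413.
        z = (x - n * p) / np.sqrt(n * p * (1 - p))
        print("P(Z <= 1)      :", (z <= 1).mean())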

  4. On Probability Domains

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, ..., x_n) ∈ I^n : x_1 + x_2 + ... + x_n ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  5. Colleges Angle for Billions to Build Obama's Broadband Network

    ERIC Educational Resources Information Center

    Parry, Marc

    2009-01-01

    As the federal government prepares to pour billions of stimulus dollars into increased broadband Internet access, colleges are trying to claim much of the money and shape the emerging national networking policy. Their focus is $4.7-billion that will be doled out under a new grant program administered by a small Commerce Department agency called…

  6. FAO aims for audience of two billion.

    PubMed

    Along with large programs to produce more food, the United Nations Food and Agriculture Organization (FAO) is 1 of the largest communicators and teachers of social development information to the developing world's agricultural and rural citizenry. The U.N. has given the FAO responsibility to raise rural levels of living. FAO does not try to reach directly its audience of 2 billion people who earn their living from agriculture; it works with communicators and educators located within national institutions in each country. FAO's communication program begins at the top-most level of national government in attempts to gain the attention and priority that the agricultural sector of the economy merits in total national development. The main activities of the FAO's population education program fall under several headings: 1) assistance to countries in collecting data and preparing projections on agricultural population and labor force; 2) dissemination of information to countries about implications of population trends on food supply and demand, agricultural employment, and assistance in research activities related to these implications; 3) training of national agricultural planners in demographic aspects of agricultural development; 4) assistance to countries that want to include population concepts in the curricula of agricultural training institutions and training programs for rural development staff; 5) preparation of training and communications materials related to the introduction of population concepts into agricultural training institutions and programs; 6) orienting special types of agricultural institutions toward population motivation; and 7) assistance to countries to strengthen population programs in the context of rural development needs. FAO is emphatic that its population education programs do not include family planning communication, for it does not want to introduce information about contraceptive methods or where to get contraceptive services.

  7. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  8. Fractal probability laws.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  9. Waste Package Misload Probability

    SciTech Connect

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
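
    The calculation itself reduces to counting categorized events and dividing by the number of assembly movements. The sketch below shows that bookkeeping step with invented placeholder numbers; the real counts come from the Framatome ANP report and are not reproduced here.

        # Hypothetical event records; actual figures come from the source report.
        events = [
            {"category": "misload", "assemblies": 2},
            {"category": "damage",  "assemblies": 1},
            {"category": "misload", "assemblies": 1},
        ]
        total_fa_movements = 50_000  # hypothetical total number of FA movements

        def probability_of(category):
            affected = sum(e["assemblies"] for e in events if e["category"] == category)
            return affected / total_fa_movements

        print("P(misload per movement):", probability_of("misload"))
        print("P(damage per movement): ", probability_of("damage"))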

  10. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, B.M.; Karlinger, M.R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on average every 4.5 years.
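
    As a toy version of the Monte Carlo procedure, the sketch below assumes an equicorrelated Gaussian model for the standardized annual peaks; since exceedance is judged against a quantile, only ranks matter, echoing the paper's point that the at-site distribution need not be known. The correlation value and horizon are hypothetical.

        import math
        import random
        from statistics import NormalDist

        def regional_flood_prob(n_sites=193, T=100, rho=0.3, years=20_000):
            """Monte Carlo estimate of P(at least one site sees a T-year flood in a year)
            under a hypothetical equicorrelated Gaussian model for standardized peaks."""
            z_T = NormalDist().inv_cdf(1 - 1 / T)   # T-year threshold on the standard scale
            hits = 0
            for _ in range(years):
                w = random.gauss(0, 1)              # shared regional component
                if any(math.sqrt(rho) * w + math.sqrt(1 - rho) * random.gauss(0, 1) > z_T
                       for _ in range(n_sites)):
                    hits += 1
            return hits / years

        print(regional_flood_prob())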

  11. Academic Pork Barrel Tops $2-Billion for the First Time.

    ERIC Educational Resources Information Center

    Brainard, Jeffrey; Borrego, Anne Marie

    2003-01-01

    Describes how, despite the growing budget deficit, Congress directed a record $2 billion to college projects in 2003, many of them dealing with security and bioterrorism. Includes data tables on the earmarks. (EV)

  12. Harnessing Energy from the Sun for Six Billion People

    SciTech Connect

    Daniel Nocera

    2011-09-12

    Daniel Nocera, a Massachusetts Institute of Technology professor whose recent research focuses on solar-powered fuels, presents a Brookhaven Science Associates Distinguished Lecture, titled "Harnessing Energy from the Sun for Six Billion People -- One at a Time."

  13. NASA Now Minute: Earth and Space Science: 100 Billion Planets

    NASA Video Gallery

    Stephen Kane, co-author of the article “Study Shows Our Galaxy Has 100 Billion Planets,” reveals details about this incredible study and explains just how common planets are in our Milky Way galaxy...

  14. Harnessing Energy from the Sun for Six Billion People

    ScienceCinema

    Daniel Nocera

    2016-07-12

    Daniel Nocera, a Massachusetts Institute of Technology professor whose recent research focuses on solar-powered fuels, presents a Brookhaven Science Associates Distinguished Lecture, titled "Harnessing Energy from the Sun for Six Billion People -- One at a Time."

  15. Emptiness Formation Probability

    NASA Astrophysics Data System (ADS)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d=1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  16. People's conditional probability judgments follow probability theory (plus noise).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
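
    The sketch below illustrates the general "probability theory plus noise" mechanism rather than the specific conditional identities tested in the paper: estimates are noisy sample proportions with a read-out flip rate d, the addition-law combination P(A)+P(B)-P(A∧B)-P(A∨B) cancels the noise in expectation, while P(A)-P(A∧B)-P(A∧¬B) is biased by -d. All probabilities and the flip rate are invented for the demonstration.

        import random

        def noisy_estimate(p, d=0.1, m=100):
            # Frequency-based estimate of p from m sampled instances, each read
            # incorrectly (flipped) with probability d.
            flips = [(random.random() < p) != (random.random() < d) for _ in range(m)]
            return sum(flips) / m

        pA, pB, pAandB = 0.6, 0.5, 0.35
        pAorB = pA + pB - pAandB
        pAnotB = pA - pAandB

        K = 5_000
        cancel = sum(noisy_estimate(pA) + noisy_estimate(pB)
                     - noisy_estimate(pAandB) - noisy_estimate(pAorB) for _ in range(K)) / K
        biased = sum(noisy_estimate(pA) - noisy_estimate(pAandB)
                     - noisy_estimate(pAnotB) for _ in range(K)) / K
        print("noise-cancelling identity (should be near 0):", cancel)
        print("non-cancelling expression (should be near -0.1):", biased)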

  17. People's conditional probability judgments follow probability theory (plus noise).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities. PMID:27570097

  18. Early Archean (3.3-billion to 3.5-billion-year-old) microfossils from Warrawoona Group, Australia.

    PubMed

    Schopf, J W; Packer, B M

    1987-07-01

    Cellularly preserved filamentous and colonial fossil microorganisms have been discovered in bedded carbonaceous cherts from the Early Archean Apex Basalt and Towers Formation of northwestern Western Australia. The cell types detected suggest that cyanobacteria, and therefore oxygen-producing photosynthesis, may have been extant as early as 3.3 billion to 3.5 billion years ago. These fossils are among the oldest now known from the geologic record; their discovery substantiates previous reports of Early Archean microfossils in Warrawoona Group strata.

  19. Probability distributions for magnetotellurics

    SciTech Connect

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
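
    A quick way to see the behavior the record describes is to simulate the ratio of two noisy complex numbers and look at the spread of the phase and of the log squared magnitude. Everything in the sketch (true values, 10% errors) is hypothetical, and the per-component noise model is only one simple reading of "percentage error".

        import cmath
        import math
        import random

        def complex_noise(sigma):
            return complex(random.gauss(0, sigma), random.gauss(0, sigma))

        N0, D0 = 2.0 + 1.0j, 1.0 + 0.5j      # hypothetical true numerator and denominator
        err_n, err_d = 0.10, 0.10            # 10% errors in each

        samples = [(N0 + complex_noise(err_n * abs(N0))) /
                   (D0 + complex_noise(err_d * abs(D0))) for _ in range(20_000)]

        def mean_std(xs):
            m = sum(xs) / len(xs)
            return m, math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

        # For small errors, error propagation predicts a phase spread of roughly
        # sqrt(err_n**2 + err_d**2) radians and a near-normal log squared magnitude.
        print("phase   :", mean_std([cmath.phase(z) for z in samples]))
        print("log|Z|^2:", mean_std([math.log(abs(z) ** 2) for z in samples]))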

  20. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  1. The Probability of Causal Conditionals

    ERIC Educational Resources Information Center

    Over, David E.; Hadjichristidis, Constantinos; Evans, Jonathan St. B. T.; Handley, Simon J.; Sloman, Steven A.

    2007-01-01

    Conditionals in natural language are central to reasoning and decision making. A theoretical proposal called the Ramsey test implies the conditional probability hypothesis: that the subjective probability of a natural language conditional, P(if p then q), is the conditional subjective probability, P(q|p). We report three experiments on…

  2. Winglets Save Billions of Dollars in Fuel Costs

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The upturned ends now featured on many airplane wings are saving airlines billions of dollars in fuel costs. Called winglets, the drag-reducing technology was advanced through the research of Langley Research Center engineer Richard Whitcomb and through flight tests conducted at Dryden Flight Research Center. Seattle-based Aviation Partners Boeing -- a partnership between Aviation Partners Inc., of Seattle, and The Boeing Company, of Chicago -- manufactures Blended Winglets, a unique design featured on Boeing aircraft around the world. These winglets have saved more than 2 billion gallons of jet fuel to date, representing a cost savings of more than $4 billion and a reduction of almost 21.5 million tons in carbon dioxide emissions.

  3. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. Sixty-two fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not lead to a lower score in probability performance. The study also revealed that motivated students benefited from the probability workshop, showing a positive improvement in their probability performance compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  4. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  5. Colleges' Billion-Dollar Campaigns Feel the Economy's Sting

    ERIC Educational Resources Information Center

    Masterson, Kathryn

    2009-01-01

    The economy's collapse has caught up with the billion-dollar campaign. In the past 12 months, the amount of money raised by a dozen of the colleges engaged in higher education's biggest fund-raising campaigns fell 32 percent from the year before. The decline, which started before the worst of the recession, has forced colleges to postpone…

  6. Congress Gives Colleges a Billion-Dollar Bonanza.

    ERIC Educational Resources Information Center

    Brainard, Jeffrey; Southwick, Ron

    2000-01-01

    Reports that Congress has earmarked a record amount of money (more than $1 billion) for projects involving specific colleges in the 2000 fiscal year. Notes that such "pork-barrel" spending has tripled since 1996. Charts show trends in earmarks since 1989, year 2000 earmarks by agency, the top 20 recipients of earmarked grants, and ranking of…

  7. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  8. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  9. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  10. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
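
    The sketch below is only a schematic of the partition-and-weight step: species are split by detection frequency at an arbitrary threshold, a group-level extinction proportion stands in for the capture-recapture estimates used in the paper, and the groups are recombined with weights proportional to group size. All names, numbers and the threshold are hypothetical.

        def weighted_extinction_estimate(species, threshold=2):
            """species: dicts with a detection count and a 0/1 local-extinction flag."""
            low = [s for s in species if s["detections"] <= threshold]
            high = [s for s in species if s["detections"] > threshold]
            groups = [g for g in (low, high) if g]
            total = sum(len(g) for g in groups)
            estimate = 0.0
            for g in groups:
                p_ext = sum(s["extinct"] for s in g) / len(g)   # stand-in estimator
                estimate += (len(g) / total) * p_ext
            return estimate

        demo = [{"detections": 1, "extinct": 1}, {"detections": 1, "extinct": 0},
                {"detections": 3, "extinct": 1}, {"detections": 5, "extinct": 0},
                {"detections": 8, "extinct": 0}]
        print(weighted_extinction_estimate(demo))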

  11. Constraining the last 7 billion years of galaxy evolution in semi-analytic models

    NASA Astrophysics Data System (ADS)

    Mutch, Simon J.; Poole, Gregory B.; Croton, Darren J.

    2013-01-01

    We investigate the ability of the Croton et al. semi-analytic model to reproduce the evolution of observed galaxies across the final 7 billion years of cosmic history. Using Monte Carlo Markov Chain techniques we explore the available parameter space to produce a model which attempts to achieve a statistically accurate fit to the observed stellar mass function at z = 0 and z ≈ 0.8, as well as the local black hole-bulge relation. We find that in order to be successful we are required to push supernova feedback efficiencies to extreme limits which are, in some cases, unjustified by current observations. This leads us to the conclusion that the current model may be incomplete. Using the posterior probability distributions provided by our fitting, as well as the qualitative details of our produced stellar mass functions, we suggest that any future model improvements must act to preferentially bolster star formation efficiency in the most massive haloes at high redshift.

  12. Probability Interpretation of Quantum Mechanics.

    ERIC Educational Resources Information Center

    Newton, Roger G.

    1980-01-01

    This paper draws attention to the frequency meaning of the probability concept and its implications for quantum mechanics. It emphasizes that the very meaning of probability implies the ensemble interpretation of both pure and mixed states. As a result some of the "paradoxical" aspects of quantum mechanics lose their counterintuitive character.…

  13. The Probabilities of Conditionals Revisited

    ERIC Educational Resources Information Center

    Douven, Igor; Verbrugge, Sara

    2013-01-01

    According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of…

  14. Minimizing the probable maximum flood

    SciTech Connect

    Woodbury, M.S.; Pansic, N. ); Eberlein, D.T. )

    1994-06-01

    This article examines Wisconsin Electric Power Company's efforts to determine an economical way to comply with Federal Energy Regulatory Commission requirements at two hydroelectric developments on the Michigamme River. Their efforts included refinement of the area's probable maximum flood model based, in part, on a newly developed probable maximum precipitation estimate.

  15. Conservation of protein structure over four billion years

    PubMed Central

    Ingles-Prieto, Alvaro; Ibarra-Molero, Beatriz; Delgado-Delgado, Asuncion; Perez-Jimenez, Raul; Fernandez, Julio M.; Gaucher, Eric A.; Sanchez-Ruiz, Jose M.; Gavira, Jose A.

    2013-01-01

    Little is known with certainty about the evolution of protein structures in general and the degree of protein structure conservation over planetary time scales in particular. Here we report the X-ray crystal structures of seven laboratory resurrections of Precambrian thioredoxins dating back up to ~4 billion years before present. Despite considerable sequence differences compared with extant enzymes, the ancestral proteins display the canonical thioredoxin fold while only small structural changes have occurred over 4 billion years. This remarkable degree of structure conservation since a time near the last common ancestor of life supports a punctuated-equilibrium model of structure evolution in which the generation of new folds occurs over comparatively short periods of time and is followed by long periods of structural stasis. PMID:23932589

  16. Conservation of protein structure over four billion years.

    PubMed

    Ingles-Prieto, Alvaro; Ibarra-Molero, Beatriz; Delgado-Delgado, Asuncion; Perez-Jimenez, Raul; Fernandez, Julio M; Gaucher, Eric A; Sanchez-Ruiz, Jose M; Gavira, Jose A

    2013-09-01

    Little is known about the evolution of protein structures and the degree of protein structure conservation over planetary time scales. Here, we report the X-ray crystal structures of seven laboratory resurrections of Precambrian thioredoxins dating up to approximately four billion years ago. Despite considerable sequence differences compared with extant enzymes, the ancestral proteins display the canonical thioredoxin fold, whereas only small structural changes have occurred over four billion years. This remarkable degree of structure conservation since a time near the last common ancestor of life supports a punctuated-equilibrium model of structure evolution in which the generation of new folds occurs over comparatively short periods and is followed by long periods of structural stasis. PMID:23932589

  17. Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs

    NASA Technical Reports Server (NTRS)

    2015-01-01

    A Langley Research Center engineer’s work in the 1960s and ’70s to develop a wing with better performance near the speed of sound resulted in a significant increase in subsonic efficiency. The design was shared with industry. Today, Renton, Washington-based Boeing Commercial Airplanes, as well as most other plane manufacturers, apply it to all their aircraft, saving the airline industry billions of dollars in fuel every year.

  18. Holographic probabilities in eternal inflation.

    PubMed

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  19. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  20. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  1. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224
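
    The "split the difference" prediction is easy to state concretely: the estimate for a conjunction is a mixture of the two conjunct estimates, which can exceed the smaller conjunct and thereby violate the probability calculus. The weighting below is a hypothetical illustration, not the parameterization of the authors' program.

        def predicted_conjunction(p_a, p_b, w=0.5):
            # Splitting-the-difference estimate for P(A and B); probability theory
            # instead requires P(A and B) <= min(P(A), P(B)).
            return w * p_a + (1 - w) * p_b

        pa, pb = 0.8, 0.3
        est = predicted_conjunction(pa, pb)
        print("estimate:", est, "violates the calculus:", est > min(pa, pb))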

  2. Joint probabilities and quantum cognition

    SciTech Connect

    Acacio de Barros, J.

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  3. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  4. Joint probability distributions for projection probabilities of random orthonormal states

    NASA Astrophysics Data System (ADS)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
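
    A direct numerical check of this setup, under the assumption that NumPy is available, is to draw Haar-random states as columns of random unitaries (QR of a complex Ginibre matrix) and look at the projection probabilities; marginally each one follows a Beta(1, n-1) law with mean 1/n. The dimension and sample size below are arbitrary.

        import numpy as np

        def haar_random_state(n, rng):
            # QR of a complex Ginibre matrix gives a Haar-random unitary once the
            # column phases are fixed; its first column is a random orthonormal state.
            z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
            q, r = np.linalg.qr(z)
            q = q * (np.diagonal(r) / np.abs(np.diagonal(r)))
            return q[:, 0]

        rng = np.random.default_rng(0)
        n, samples = 8, 10_000
        probs = np.array([np.abs(haar_random_state(n, rng)) ** 2 for _ in range(samples)])

        # Each row is a point on the probability simplex; each component has mean 1/n.
        print("mean of p_0:", probs[:, 0].mean(), "theory:", 1 / n)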

  5. Opioid Epidemic Costs U.S. $78.5 Billion Annually: CDC

    MedlinePlus

    Economic burden includes health care, lost productivity and treatment ... powerful prescription painkillers called opioids costs the U.S. economy $78.5 billion a year, according to a ...

  6. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.
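
    As one tiny instance of an imprecise-probability object, the sketch below builds a p-box for a normal variable whose standard deviation is known but whose mean is only known to lie in an interval; the bounds on P(X <= x) are the envelope of the CDFs in that set. The numbers are arbitrary.

        from statistics import NormalDist

        def pbox_bounds(x, mu_lo=-0.5, mu_hi=0.5, sigma=1.0):
            # The normal CDF at x decreases as the mean increases, so the envelope of
            # the CDF set is attained at the two endpoints of the mean interval.
            lower = NormalDist(mu_hi, sigma).cdf(x)
            upper = NormalDist(mu_lo, sigma).cdf(x)
            return lower, upper

        print("bounds on P(X <= 0):", pbox_bounds(0.0))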

  7. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
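
    A simulation-based stand-in for such intervals (assuming NumPy, and not the paper's exact construction) calibrates a common per-point level for the standardized order statistics so that all n points fall inside their intervals simultaneously with probability about 1 - alpha.

        import numpy as np

        def simultaneous_envelopes(n, alpha=0.05, sims=5_000, seed=1):
            rng = np.random.default_rng(seed)
            x = rng.normal(size=(sims, n))
            z = np.sort((x - x.mean(axis=1, keepdims=True))
                        / x.std(axis=1, ddof=1, keepdims=True), axis=1)
            # Shrink the per-point level gamma until the joint coverage reaches 1 - alpha.
            for gamma in np.linspace(alpha, alpha / (5 * n), 200):
                lo = np.quantile(z, gamma / 2, axis=0)
                hi = np.quantile(z, 1 - gamma / 2, axis=0)
                coverage = np.mean(np.all((z >= lo) & (z <= hi), axis=1))
                if coverage >= 1 - alpha:
                    break
            return lo, hi, coverage

        lo, hi, cov = simultaneous_envelopes(n=30)
        print("joint coverage in the calibration sample:", cov)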

  8. Children's understanding of posterior probability.

    PubMed

    Girotto, Vittorio; Gonzalez, Michel

    2008-01-01

    Do young children have a basic intuition of posterior probability? Do they update their decisions and judgments in the light of new evidence? We hypothesized that they can do so extensionally, by considering and counting the various ways in which an event may or may not occur. The results reported in this paper showed that from the age of five, children's decisions under uncertainty (Study 1) and judgments about random outcomes (Study 2) are correctly affected by posterior information. From the same age, children correctly revise their decisions in situations in which they face a single, uncertain event, produced by an intentional agent (Study 3). The finding that young children have some understanding of posterior probability supports the theory of naive extensional reasoning, and contravenes some pessimistic views of probabilistic reasoning, in particular the evolutionary claim that the human mind cannot deal with single-case probability. PMID:17391661

  9. Billion particle linac simulations for future light sources

    SciTech Connect

    Ryne, R. D.; Venturini, M.; Zholents, A. A.; Qiang, J.

    2008-09-25

    In this paper we report on multi-physics, multi-billion macroparticle simulation of beam transport in a free electron laser (FEL) linac for future light source applications. The simulation includes a self-consistent calculation of 3D space-charge effects, short-range geometry wakefields, longitudinal coherent synchrotron radiation (CSR) wakefields, and detailed modeling of RF acceleration and focusing. We discuss the need for and the challenges associated with such large-scale simulation. Applications to the study of the microbunching instability in an FEL linac are also presented.

  10. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  11. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  12. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  13. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  14. Some Surprising Probabilities from Bingo.

    ERIC Educational Resources Information Center

    Mercer, Joseph O.

    1993-01-01

    Investigates the probability of winning the largest prize at Bingo through a series of five simpler problems. Investigations are conducted with the aid of either BASIC computer programs, spreadsheets, or a computer algebra system such as Mathematica. Provides sample data tables to illustrate findings. (MDH)

  15. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates one-on-one foul shooting in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)
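
    The second activity is the classic collection problem, which is straightforward to simulate; the sketch below (with an arbitrary number of trials) estimates how many boxes are needed on average to complete the set of six prizes.

        import random

        def boxes_needed(n_prizes=6):
            # Buy boxes, each containing one equally likely prize, until all are collected.
            collected, boxes = set(), 0
            while len(collected) < n_prizes:
                collected.add(random.randrange(n_prizes))
                boxes += 1
            return boxes

        trials = 10_000
        avg = sum(boxes_needed() for _ in range(trials)) / trials
        # Coupon-collector theory: 6 * (1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6) = 14.7 boxes.
        print("average boxes needed:", avg)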

  16. Comments on quantum probability theory.

    PubMed

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  17. Scalable in-memory RDFS closure on billions of triples.

    SciTech Connect

    Goodman, Eric L.; Mizell, David

    2010-06-01

    We present an RDFS closure algorithm, specifically designed and implemented on the Cray XMT supercomputer, that obtains inference rates of 13 million inferences per second on the largest system configuration we used. The Cray XMT, with its large global memory (4TB for our experiments), permits the construction of a conceptually straightforward algorithm, fundamentally a series of operations on a shared hash table. Each thread is given a partition of triple data to process, a dedicated copy of the ontology to apply to the data, and a reference to the hash table into which it inserts inferred triples. The global nature of the hash table allows the algorithm to avoid a common obstacle for distributed memory machines: the creation of duplicate triples. On LUBM data sets ranging between 1.3 billion and 5.3 billion triples, we obtain nearly linear speedup except for two portions: file I/O, which can be ameliorated with additional service nodes, and data structure initialization, which requires nearly constant time for runs involving 32 processors or more.
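
    The single-threaded toy below conveys only the core idea of that algorithm: triples live in one hash set (which deduplicates inferred triples for free) and two representative RDFS rules are applied until a fixed point is reached. It does not attempt to model the Cray XMT threading, partitioning or scale.

        RDFS_SUBCLASS = "rdfs:subClassOf"
        RDF_TYPE = "rdf:type"

        def rdfs_closure(triples):
            # rdfs11: (A subClassOf B), (B subClassOf C) -> (A subClassOf C)
            # rdfs9 : (x type A), (A subClassOf B)       -> (x type B)
            closed = set(triples)
            changed = True
            while changed:
                changed = False
                sub = [(s, o) for s, p, o in closed if p == RDFS_SUBCLASS]
                typ = [(s, o) for s, p, o in closed if p == RDF_TYPE]
                new = {(a, RDFS_SUBCLASS, d) for a, b in sub for c, d in sub if b == c}
                new |= {(x, RDF_TYPE, b) for x, a in typ for c, b in sub if a == c}
                if not new <= closed:
                    closed |= new
                    changed = True
            return closed

        data = {("ex:Cat", RDFS_SUBCLASS, "ex:Mammal"),
                ("ex:Mammal", RDFS_SUBCLASS, "ex:Animal"),
                ("ex:felix", RDF_TYPE, "ex:Cat")}
        for t in sorted(rdfs_closure(data)):
            print(t)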

  18. $75 Billion in Formula Grants Failed to Drive Reform. Can $5 Billion in Competitive Grants Do the Job? Education Stimulus Watch. Special Report 2

    ERIC Educational Resources Information Center

    Smarick, Andy

    2009-01-01

    In early 2009, Congress passed and President Barack Obama signed into law the American Recovery and Reinvestment Act (ARRA), the federal government's nearly $800 billion stimulus legislation. According to key members of Congress and the Obama administration, the education portions of the law, totaling about $100 billion, were designed both to…

  19. Understanding Y haplotype matching probability.

    PubMed

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, compared with autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important, while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  20. Probability distributions for multimeric systems.

    PubMed

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and, the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left and the right hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  1. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e., to give a basic introduction to the analysis and application of probabilistic concepts to the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of statistical mechanics of complex systems. The review also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the purpose of making these ideas easier to understand and to apply.

  2. European cogeneration market estimated at $12 billion by 1995

    SciTech Connect

    Not Available

    1987-09-01

    A new study by an international market research firm projects that European companies will install some $12 billion worth of cogeneration equipment by 1995. Products for waste heat recovery, steam and turbine generators made up a $993 million cogeneration market in 1986 in Europe. West Germany is expected to account for nearly a quarter of all volume throughout the 1986-95 period. In 1986, industry there spent some $218 million on cogeneration equipment. France and the U.K. are expected to account for another 17% and 16% of sales, respectively. The report discusses the various industrial end users, and on average finds that commercial/institutional establishments; food, beverage and tobacco producers; fuels processors; and the pulp and paper industry each represent between 10% and 15% of cogeneration purchases.

  3. Objective Probability and Quantum Fuzziness

    NASA Astrophysics Data System (ADS)

    Mohrhoff, U.

    2009-02-01

    This paper offers a critique of the Bayesian interpretation of quantum mechanics with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the “objective preparations view” or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes. Most important among these are the objective fuzziness of all relative positions and momenta and the consequent incomplete spatiotemporal differentiation of the physical world. The latter makes it possible to draw a clear distinction between the macroscopic and the microscopic. This in turn makes it possible to understand the special status of measurements in all standard formulations of the theory. Whereas Bayesians have written contemptuously about the “folly” of conjoining “objective” to “probability,” there are various reasons why quantum-mechanical probabilities can be considered objective, not least the fact that they are needed to quantify an objective fuzziness. But this cannot be appreciated without giving thought to the makeup of the world, which

  4. Bigger, Better Catalog Unveils Half a Billion Celestial Objects

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These frames are samples from the photographic sky surveys, which have been digitized by a technical team at the Space Telescope Science Institute to support the Hubble Space Telescope operations. The team processed these images to create a new astronomical catalog, called the Guide Star Catalog II. This project was undertaken by the Space Telescope Science Institute as an upgrade to an earlier sky survey and catalog (DSS-I and GSC-I), initially done to provide guide stars for pointing the Hubble Space Telescope. By virtue of its sheer size, the DSS-II and GSC-II have many research applications for both professional and amateur astronomers. [Top] An example from the DSS-II shows the Rosette Nebula, (originally photographed by the Palomar Observatory) as digitized in the DSS-I (left) and DSS-II (right). The DSS-II includes views of the sky at both red and blue wavelengths, providing invaluable color information on about one billion deep-sky objects. [Bottom] This blow-up of the inset box in the raw DSS-I scan shows examples of the GSC-I and the improved GSC-II catalogs. Astronomers extracted the stars from the scanned plate of the Rosette and listed them in the catalogs. The new GSC-II catalog provides the colors, positions, and luminosities of nearly half a billion stars -- over 20 times as many as the original GSC-I. The GSC-II contains information on stars as dim as the 19th magnitude. Credit: NASA, the DSS-II and GSC-II Consortia (with images from the Palomar Observatory-STScI Digital Sky Survey of the northern sky, based on scans of the Second Palomar Sky Survey are copyright c 1993-1999 by the California Institute of Technology)

  5. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  6. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew R.; Piro, Anthony; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we investigate the probability that a star will make a BH as a function of its ZAMS mass. Although the shape of the black hole formation probability function is poorly constrained by current measurements, we believe that this framework is an important new step toward better understanding BH formation. We also consider some of the implications of this probability distribution, from its impact on the chemical enrichment from massive stars, to its connection with the structure of the core at the time of collapse, to the birth kicks that black holes receive. A probabilistic description of BH formation will be a useful input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
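
    The abstract does not specify a functional form for the black hole formation probability. As a purely illustrative sketch of how such a function could feed a population synthesis calculation, the snippet below assumes a hypothetical logistic P_BH(M_ZAMS) and a crude Salpeter-like mass distribution; both choices, and all numbers, are placeholders rather than results from the paper.

      import math
      import random

      def p_bh(m_zams, m_half=20.0, width=3.0):
          # Hypothetical logistic form for the BH formation probability; the
          # paper does not give this shape, it is for illustration only.
          return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

      def remnant(m_zams, rng):
          # Draw "BH" or "NS" for one star according to p_bh.
          return "BH" if rng.random() < p_bh(m_zams) else "NS"

      rng = random.Random(1)
      # Toy population: crude Salpeter-like IMF between 8 and 60 solar masses.
      masses = [8.0 * (1.0 - rng.random() * (1.0 - (60.0 / 8.0) ** -1.35)) ** (-1.0 / 1.35)
                for _ in range(10000)]
      bh_fraction = sum(remnant(m, rng) == "BH" for m in masses) / len(masses)
      print(f"BH fraction in toy population: {bh_fraction:.3f}")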

  7. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
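
    The chapter itself contains worked examples; the toy below is not one of them, but it illustrates the kind of calculation covered: a maximum-likelihood estimate of a two-state Markov chain's transition probabilities from an observed sequence, obtained simply by normalising transition counts.

      from collections import Counter

      def mle_transition_matrix(seq, states=("A", "B")):
          # Maximum-likelihood transition probabilities: observed transition
          # counts, normalised per source state.
          counts = Counter(zip(seq[:-1], seq[1:]))
          matrix = {}
          for s in states:
              total = sum(counts[(s, t)] for t in states)
              matrix[s] = {t: (counts[(s, t)] / total if total else 0.0) for t in states}
          return matrix

      print(mle_transition_matrix(list("AABABBBABAABBBAB")))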

  8. Persistence probabilities for stream populations.

    PubMed

    Samia, Yasmine; Lutscher, Frithjof

    2012-07-01

    Individuals in streams and rivers are constantly at risk of being washed downstream and thereby lost to their population. The possibility of diffusion-mediated persistence of populations in advective environments has been the focus of a multitude of recent modeling efforts. Most of these recent models are deterministic, and they predict the existence of a critical advection velocity, above which a population cannot persist. In this work, we present a stochastic approach to the persistence problem in streams and rivers. We use the dominant eigenvalue of the advection-diffusion operator to transition from a spatially explicit description to a spatially implicit birth-death process, in which individual washout from the domain appears as an additional death term. We find that the deterministic persistence threshold is replaced by a smooth transition from almost sure persistence to extinction as advection velocity increases. More interestingly, we explore how temporal variation in flow rate and other parameters affect the persistence probability. In line with general expectations, we find that temporal variation often decreases the persistence probability, and we focus on a few examples of how variation can increase population persistence.
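
    As a rough sketch of the spatially implicit view described above, the following simulates a linear birth-death process in which washout appears as an extra per-capita death rate, and estimates the probability that the population persists to a fixed time. The rates, initial population, and time horizon are invented for illustration and are not the paper's parameter values.

      import random

      def persists(n0, birth, death, washout, t_end, rng):
          # Linear birth-death process; washout adds to the per-capita death rate.
          # Returns True if the population is still positive at t_end.
          n, t = n0, 0.0
          loss = death + washout
          while n > 0:
              dt = rng.expovariate(n * (birth + loss))
              if t + dt > t_end:
                  break
              t += dt
              n += 1 if rng.random() < birth / (birth + loss) else -1
          return n > 0

      rng = random.Random(42)
      trials = 1000
      for washout in (0.3, 0.5, 0.7):   # illustrative rates only
          p = sum(persists(10, 1.0, 0.6, washout, 30.0, rng) for _ in range(trials)) / trials
          print(f"washout rate {washout}: persistence probability ~ {p:.3f}")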

  9. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  10. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = √10 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history’s ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
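
    A heavily simplified sketch of the counting procedure described above is given below: each simulated "time history" produces a failure count for a single component, a Bayesian-updated unreliability, and a crude linear ΔCDF surrogate, and the false-positive probability is estimated as the fraction of histories crossing the 10^-6/yr threshold when the true state is green. The beta prior, demand count, importance measure, and all other numbers are placeholders, not the plant model, CNI prior, or values used in the paper.

      import random

      random.seed(0)
      N_HIST, DEMANDS = 20_000, 200          # the paper used 100,000 time histories
      BASELINE_P = 0.02                      # placeholder component unreliability
      IMPORTANCE = 1.0e-4                    # placeholder Birnbaum-like importance (per yr)
      PRIOR_A, PRIOR_B = 0.5, 24.5           # placeholder beta prior, NOT the CNI prior of MSPI
      THRESHOLD = 1.0e-6                     # green/white boundary on delta-CDF (per yr)

      false_pos = 0
      for _ in range(N_HIST):
          failures = sum(random.random() < BASELINE_P for _ in range(DEMANDS))
          post_mean = (PRIOR_A + failures) / (PRIOR_A + PRIOR_B + DEMANDS)  # Bayesian-updated unreliability
          delta_cdf = IMPORTANCE * (post_mean - BASELINE_P)                 # crude linear delta-CDF surrogate
          if delta_cdf > THRESHOLD:          # true state is green, so white or above is a false positive
              false_pos += 1

      print(f"Estimated false-positive probability: {false_pos / N_HIST:.4f}")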

  11. Lévy laws in free probability

    PubMed Central

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes. PMID:12473744

  12. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  13. Imprecise probability for non-commuting observables

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.

    2015-08-01

    It is known that non-commuting observables in quantum mechanics do not have a joint probability. This statement refers to the precise (additive) probability model. I show that the joint distribution of any non-commuting pair of variables can be quantified via upper and lower probabilities, i.e. the joint probability is described by an interval instead of a number (imprecise probability). I propose transparent axioms from which the upper and lower probability operators follow. The imprecise probability depends on the non-commuting observables, is linear over the state (density matrix), and reverts to the usual expression for commuting observables.

  14. Orbital forcing of climate 1.4 billion years ago

    PubMed Central

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U.; Wang, Huajian; Costa, M. Mafalda; Bjerrum, Christian J.; Connelly, James N.; Zhang, Baomin; Bian, Lizeng; Canfield, Donald E.

    2015-01-01

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes in the location of the Xiamaling relative to the position of the Intertropical Convergence Zone. On shorter time scales, and within a precisely calibrated stratigraphic framework, cyclicity in sediment geochemical dynamics is consistent with orbital control. In particular, sediment geochemical fluctuations reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment. PMID:25775605

  15. Nine Billion Years: Past and Future of the Solar System

    NASA Astrophysics Data System (ADS)

    Leubner, I. H.

    2013-05-01

    As the Sun is losing mass and thus gravity by radiation and solar wind, solar-planetary energy balances diminish. Since the planets are only weakly bound to the Sun, the planets have been moving away from the Sun, causing their orbital radii and orbital periods to increase. This is modeled for selected planets from Mercury to Sedna and from the formation of the Solar system at -4.5 to +4.5 billion years (Byr). Planets were initially significantly closer to the Sun, suggesting that modeling of the formation of the solar system needs to be revisited. By +4.5 Byr planets beyond Saturn will have separated from the Solar System. The presently outermost solar object, Sedna, is in the process of separation. Climate changes of Mars and Earth are modeled as a function of time. The prediction of the transition of Mars from water to ice at -3.6 Byr is in agreement with observations (-2.9 to -3.7 Byr). This provides for the first time answers to the why and when of the water-to-ice transition on Mars. Earth temperatures are predicted to decrease by 38, 24, and 20 C between -4.5 Byr and +4.5 Byr for present temperatures of +50, 0, and -50 C, respectively.

  16. Thermal Evolution of the Earth During the First Billion Years

    NASA Astrophysics Data System (ADS)

    Sotin, Christophe

    There is good evidence that life occurred on Earth during the first billion years of its history. Modelling the dynamics of the Earth at this period of time is critical to understand the conditions of the emergence of life. These conditions are the result of the coupling between the inner and outer envelopes of the Earth. Several processes such as volcanism, magnetic field and plate tectonics originate in the Earth's deep layers. They control the physical and chemical conditions of the outer layers (atmosphere, hydrosphere, and crust) where life appeared and developed. The goal of this chapter is to describe these internal processes and to present models for Earth's evolution. After a descriptive summary of our current knowledge of the Earth's deep interior, this chapter explains the mechanisms of heat transfer to the surface by subsolidus thermal convection, a process that drives the Earth's surface dynamics (volcanism and plate tectonics). The last part of this chapter addresses the Earth's magnetic field and how it prevents atmospheric escape and preserves the present atmosphere. Throughout this chapter, references to conditions existing on Earth-like planets are given to illustrate how the knowledge of these planets contributes to a better understanding of the history of our own planet.

  17. The nuclear interaction at Oklo 2 billion years ago

    NASA Astrophysics Data System (ADS)

    Fujii, Yasunori; Iwamoto, Akira; Fukahori, Tokio; Ohnuki, Toshihiko; Nakagawa, Masayuki; Hidaka, Hiroshi; Oura, Yasuji; Möller, Peter

    2000-05-01

    We re-examine the effort to constrain the time variability of the coupling constants of the fundamental interactions by studying the anomalous isotopic abundance of Sm observed at the remnants of the natural reactors which were in operation at Oklo about 2 billion years ago, in terms of a possible deviation of the resonance energy from the value observed today. We rely on new samples that were carefully collected to minimize natural contamination and also on a careful temperature estimate of the reactors. We obtain the upper bound (-0.2±0.8)×10^-17 y^-1 on the fractional rate of change of the electromagnetic as well as the strong interaction coupling constants. Our result basically agrees with and even suggests some improvement of the recent result of Damour and Dyson. Strictly speaking, however, we find another choice of the resonance energy shift indicating a non-zero time variation of the constants. Nevertheless, we find a rather strong but still tentative indication that this non-null range can be ruled out by including Gd data, for which it is essential to take the effect of contamination into account.

  18. Deep space communication - A one billion mile noisy channel

    NASA Technical Reports Server (NTRS)

    Smith, J. G.

    1982-01-01

    Deep space exploration is concerned with the study of natural phenomena in the solar system with the aid of measurements made at spacecraft on deep space missions. Deep space communication refers to communication between earth and spacecraft in deep space. The Deep Space Network is an earth-based facility employed for deep space communication. It includes a network of large tracking antennas located at various positions around the earth. The goals and achievements of deep space exploration over the past 20 years are discussed along with the broad functional requirements of deep space missions. Attention is given to the differences in space loss between communication satellites and deep space vehicles, effects of the long round-trip light time on spacecraft autonomy, requirements for the use of massive nuclear power plants on spacecraft at large distances from the sun, and the kinds of scientific return provided by a deep space mission. Problems concerning a deep space link of one billion miles are also explored.

  19. Fuel efficient stoves for the poorest two billion

    NASA Astrophysics Data System (ADS)

    Gadgil, Ashok

    2012-03-01

    About 2 billion people cook their daily meals on generally inefficient, polluting biomass cookstoves. The fuels include twigs and leaves, agricultural waste, animal dung, firewood, and charcoal. Exposure to the resulting smoke leads to acute respiratory illness and cancers, particularly among women cooks and the infant children near them. The resulting annual mortality estimate is almost 2 million deaths, higher than that from malaria or tuberculosis. There is a large diversity of cooking methods (baking, boiling, long simmers, braising and roasting), and a diversity of pot shapes and sizes in which the cooking is undertaken. Fuel-efficiency and emissions depend on the tending of the fire (and thermal power), type of fuel, stove characteristics, and fit of the pot to the stove. Thus, no one perfect fuel-efficient low-emitting stove can suit all users. Affordability imposes a further severe constraint on the stove design. For various economic strata within the users, a variety of stove designs may be appropriate and affordable. In some regions, biomass is harvested non-renewably for cooking fuel. There is also increasing evidence that black carbon emitted from stoves is a significant contributor to atmospheric forcing. Thus improved biomass stoves can also help mitigate global climate change. The speaker will describe specific work undertaken to design, develop, test, and disseminate affordable fuel-efficient stoves for internally displaced persons (IDPs) of Darfur, Sudan, where the IDPs face hardship, humiliation, hunger, and risk of sexual assault owing to their dependence on local biomass for cooking their meals.

  20. Earth's air pressure 2.7 billion years ago constrained to less than half of modern levels

    NASA Astrophysics Data System (ADS)

    Som, Sanjoy M.; Buick, Roger; Hagadorn, James W.; Blake, Tim S.; Perreault, John M.; Harnmeijer, Jelte P.; Catling, David C.

    2016-06-01

    How the Earth stayed warm several billion years ago when the Sun was considerably fainter is the long-standing problem of the `faint young Sun paradox'. Because of negligible O2 and only moderate CO2 levels in the Archaean atmosphere, methane has been invoked as an auxiliary greenhouse gas. Alternatively, pressure broadening in a thicker atmosphere with a N2 partial pressure around 1.6-2.4 bar could have enhanced the greenhouse effect. But fossilized raindrop imprints indicate that air pressure 2.7 billion years ago (Gyr) was below twice modern levels and probably below 1.1 bar, precluding such pressure enhancement. This result is supported by nitrogen and argon isotope studies of fluid inclusions in 3.0-3.5 Gyr rocks. Here, we calculate absolute Archaean barometric pressure using the size distribution of gas bubbles in basaltic lava flows that solidified at sea level ~2.7 Gyr in the Pilbara Craton, Australia. Our data indicate a surprisingly low surface atmospheric pressure of Patm = 0.23 +/- 0.23 (2σ) bar, and, combined with previous studies, suggest ~0.5 bar as an upper limit to late Archaean Patm. The result implies that the thin atmosphere was rich in auxiliary greenhouse gases and that Patm fluctuated over geologic time to a previously unrecognized extent.

  1. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross

  2. Early-type Galaxy Evolution: The Last 8 Billion Years

    NASA Astrophysics Data System (ADS)

    Kaviraj, Sugata; Yi, S. K.; Ellis, R.; Schawinski, K.; Gawiser, E.; Silk, J.; van Dokkum, P.; Urry, M.

    2010-01-01

    I review our current understanding of the star formation histories of early-type galaxies, in the context of recent observational studies of their rest-frame ultraviolet (UV) properties. By combining GALEX and SDSS photometry at low redshift, and exploiting (deep) optical surveys (MUSYC/COMBO-17/GEMS/COSMOS) at intermediate redshift, we are able to put unprecedented constraints on the formation and evolution of these galaxies over the last 8 billion years. In agreement with previous (optical) studies, the results indicate that the bulk of the stellar mass in early-types forms at high redshift (z > 1), possibly over short timescales (< 1 Gyr). Nevertheless, early-types of all luminosities form stars over the lifetime of the Universe, with the most luminous (-23 < M(V) < -21) systems forming up to 10-15% of their stellar mass after z = 1 (with a scatter to higher values), while their less massive counterparts form up to 30-60% of their mass in the same redshift range. The intensity of recent star formation and the UV colour distribution is quantitatively consistent with what might be expected from minor mergers (mass ratios < 1:3) in a ΛCDM cosmology. This is supported by visual inspection of HST images of early-types around z ~ 0.5, which show a remarkable correspondence between the presence of morphological disturbances and UV excess. We use our results to speculate on the potentially significant role of minor merging on the evolution of the massive galaxy population at late epochs and the possible characteristics that future surveys will have to possess to study the minor merger process. This research was supported by a Fellowship from the Royal Commission for the Exhibition of 1851 and a Senior Research Fellowship from Worcester College, Oxford (SK).

  3. 3.5 billion years of reshaped Moho, southern Africa

    NASA Astrophysics Data System (ADS)

    Stankiewicz, Jacek; de Wit, Maarten

    2013-12-01

    According to some previous studies, Archean continental crust is, on global average, apparently thinner than Proterozoic crust. Subsequently, the validity of this statement has been questioned. To provide an additional perspective on this issue, we present analyses of Moho signatures derived from recent seismic data along swaths 2000 km in length across southern Africa and its flanking ocean. The imaged crust has a near continuous age range between ca. 0.1 and 3.7 billion years, and the seismic data allow direct comparison of Moho depths between adjacent Archean, Proterozoic and Phanerozoic crust. We find no simple secular change in depth to Moho over this time period. In contrast, there is significant variation in depth to Moho beneath both Archean and Proterozoic crust; Archean crust of southern Africa displays as much crustal diversity in thickness as the adjacent Proterozoic crust. The Moho beneath all crustal provinces that we have analysed has been severely altered by tectono-metamorphic and igneous processes, in many cases more than once, and cannot provide unequivocal data for geodynamic models dealing with secular changes in continental crust formation. These results and conclusions are similar to those documented along ca. 2000 km swaths across the Canadian Shield recorded by Lithoprobe. Tying the age and character of the Precambrian crust of southern Africa to their depth diversities is clearly related to manifold processes of tectono-thermal ‘surgery’ subsequent to their origin, the details of which are still to be resolved, as they are in most Precambrian terranes. Reconstructing pristine Moho of the early Earth therefore remains a formidable challenge. In South Africa, better knowledge of ‘fossilised’ Archean crustal sections ‘turned-on-edge’, such as at the Vredefort impact crater (for the continental crust), and from the Barberton greenstone belt (for oceanic crust) is needed to characterize potential pristine Archean Moho transitions.

  4. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  5. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  6. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  7. Constraining the geodynamo and magnetopause during Earth's first billion years

    NASA Astrophysics Data System (ADS)

    Cottrell, R. D.; Tarduno, J. A.; Davis, W. J.; Mamajek, E.

    2013-12-01

    A key parameter in determining solar-terrestrial interactions for the early Earth is the magnetopause standoff distance, determined by the balance between the geomagnetic field and solar wind pressure. The oldest constraints are for 3.45 Ga, during which the magnetopause standoff was less than half the distance of present-day, suggesting an environment where enhanced volatile loss (including water) from the atmosphere seems unavoidable (Tarduno et al., Science, 2010). As we look further back in time there are two vastly different, but currently viable models for the geodynamo. In one the dynamo started shortly after core formation, whereas in the other the dynamo was delayed by as much as 1 billion years by slow lower mantle cooling. A further uncertainty in standoff calculations is solar mass loss for the first 700 million years of the young Sun. Here we address both the uncertainties in solar winds and Earth's dipole moment. We constrain solar mass loss using a new model for the evolution of solar magnetic topology with time, allowing us to extend our prior calculations to the earliest Sun. Extant rocks suitable for paleomagnetic analysis are not available older than ca. 3.47 Ga, however, silicate minerals containing magnetic inclusions composing sedimentary rocks could preserve an ancient record of the geodynamo. Among these, the Jack Hills metaconglomerate (Yilgarn craton, Western Australia) is a promising unit because cobbles pass a conglomerate test (Tarduno and Cottrell, EPSL, 2013). Following our work on zircons and other single silicate crystals hosting magnetic inclusions in the Rochester laboratory since 1997, we discuss the first successful Thellier-Thellier paleointensity results on zircons measured in situ in quartz and as isolated crystals. We employ a CO2 laser demagnetization system and a small bore (6.3 mm) 3-component DC SQUID magnetometer; the latter offers the highest currently available moment resolution. We will discuss our related

  8. Lévy processes in free probability

    PubMed Central

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability. PMID:12473745

  9. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  10. Pre-Service Teachers' Conceptions of Probability

    ERIC Educational Resources Information Center

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  11. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  12. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  13. Illustrating Basic Probability Calculations Using "Craps"

    ERIC Educational Resources Information Center

    Johnson, Roger W.

    2006-01-01

    Instructors may use the gambling game of craps to illustrate the use of a number of fundamental probability identities. For the "pass-line" bet we focus on the chance of winning and the expected game length. To compute these, probabilities of unions of disjoint events, probabilities of intersections of independent events, conditional probabilities…
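
    The identities mentioned in the abstract can be shown concretely: the pass-line win probability is a union of disjoint events (a come-out 7 or 11, or establishing and then making a point), and each point branch combines independent-roll and conditional "point before 7" probabilities. A minimal exact calculation (standard craps arithmetic, not code from the article):

      from fractions import Fraction

      # Ways to roll each total with two fair dice, out of 36 equally likely outcomes.
      ways = {t: sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == t)
              for t in range(2, 13)}

      def p(t):
          return Fraction(ways[t], 36)

      win = p(7) + p(11)                     # come-out win: union of disjoint events
      for point in (4, 5, 6, 8, 9, 10):      # establish a point, then make it before a 7
          win += p(point) * p(point) / (p(point) + p(7))

      print(win, float(win))                 # 244/495, about 0.4929

    The result, 244/495 ≈ 0.4929, is the well-known pass-line winning probability.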

  14. Subjective and objective probabilities in quantum mechanics

    SciTech Connect

    Srednicki, Mark

    2005-05-15

    We discuss how the apparently objective probabilities predicted by quantum mechanics can be treated in the framework of Bayesian probability theory, in which all probabilities are subjective. Our results are in accord with earlier work by Caves, Fuchs, and Schack, but our approach and emphasis are different. We also discuss the problem of choosing a noninformative prior for a density matrix.

  15. Preservation of hydrocarbons and biomarkers in oil trapped inside fluid inclusions for >2 billion years

    NASA Astrophysics Data System (ADS)

    George, Simon C.; Volk, Herbert; Dutkiewicz, Adriana; Ridley, John; Buick, Roger

    2008-02-01

    Oil-bearing fluid inclusions occur in a ca. 2.45 Ga fluvial metaconglomerate of the Matinenda Formation at Elliot Lake, Canada. The oil, most likely derived from the conformably overlying deltaic McKim Formation, was trapped in quartz and feldspar during diagenesis and early metamorphism of the host rock, probably before ca. 2.2 Ga. Molecular geochemical analyses of the oil reveal a wide range of compounds, including CH4, CO2, n-alkanes, isoprenoids, monomethylalkanes, aromatic hydrocarbons, low molecular weight cyclic hydrocarbons, and trace amounts of complex multi-ring biomarkers. Maturity ratios show that the oil was generated in the oil window, with no evidence of extensive thermal cracking. This is remarkable, given that the oils were exposed to upper prehnite-pumpellyite facies metamorphism (280-350 °C) either during migration or after entrapment. The fluid inclusions are closed systems, with high fluid pressures, and contain no clays or other minerals or metals that might catalyse oil-to-gas cracking. These three attributes may all contribute to the thermal stability of the included oil and enable survival of biomarkers and molecular ratios over billions of years. The biomarker geochemistry of the oil in the Matinenda Formation fluid inclusions enables inferences about the organisms that contributed to the organic matter deposited in the Palaeoproterozoic source rocks from which the analysed oil was generated and expelled. The presence of biomarkers produced by cyanobacteria and eukaryotes that are derived from and trapped in rocks deposited before ca. 2.2 Ga is consistent with an earlier evolution of oxygenic photosynthesis and suggests that some aquatic settings had become sufficiently oxygenated for sterol biosynthesis by this time. The extraction of biomarker molecules from Palaeoproterozoic oil-bearing fluid inclusions thus establishes a new method, using low detection limits and system blank levels, to trace evolution through Earth's early history

  16. Calibrating Subjective Probabilities Using Hierarchical Bayesian Models

    NASA Astrophysics Data System (ADS)

    Merkle, Edgar C.

    A body of psychological research has examined the correspondence between a judge's subjective probability of an event's outcome and the event's actual outcome. The research generally shows that subjective probabilities are noisy and do not match the "true" probabilities. However, subjective probabilities are still useful for forecasting purposes if they bear some relationship to true probabilities. The purpose of the current research is to exploit relationships between subjective probabilities and outcomes to create improved, model-based probabilities for forecasting. Once the model has been trained in situations where the outcome is known, it can then be used in forecasting situations where the outcome is unknown. These concepts are demonstrated using experimental psychology data, and potential applications are discussed.
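
    The hierarchical Bayesian machinery itself is not reproduced here; as a minimal non-hierarchical stand-in, the sketch below fits a logistic recalibration of stated probabilities to observed outcomes by maximum likelihood and then applies it to a new forecast. The synthetic data, learning rate, and functional form are all illustrative assumptions, not the models used in this work.

      import math
      import random

      def logit(p):
          p = min(max(p, 1e-6), 1 - 1e-6)
          return math.log(p / (1 - p))

      def fit_recalibration(stated, outcomes, steps=3000, lr=0.1):
          # Fit sigmoid(a + b*logit(p)) to 0/1 outcomes by gradient ascent on the log-likelihood.
          a, b, n = 0.0, 1.0, len(stated)
          for _ in range(steps):
              ga = gb = 0.0
              for p, y in zip(stated, outcomes):
                  x = logit(p)
                  q = 1.0 / (1.0 + math.exp(-(a + b * x)))
                  ga += y - q
                  gb += (y - q) * x
              a += lr * ga / n
              b += lr * gb / n
          return a, b

      # Synthetic "overconfident judge": the true probability is pulled toward 0.5.
      rng = random.Random(3)
      stated = [rng.uniform(0.05, 0.95) for _ in range(500)]
      outcomes = [1 if rng.random() < 0.5 + 0.6 * (p - 0.5) else 0 for p in stated]
      a, b = fit_recalibration(stated, outcomes)
      calibrated = 1.0 / (1.0 + math.exp(-(a + b * logit(0.9))))
      print(f"a={a:.2f}, b={b:.2f}; stated 0.9 -> calibrated {calibrated:.2f}")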

  17. The uncertainty in earthquake conditional probabilities

    USGS Publications Warehouse

    Savage, J.C.

    1992-01-01

    The Working Group on California Earthquake Probabilities (WGCEP) questioned the relevance of uncertainty intervals assigned to earthquake conditional probabilities on the basis that the uncertainty in the probability estimate seemed to be greater the smaller the intrinsic breadth of the recurrence-interval distribution. It is shown here that this paradox depends upon a faulty measure of uncertainty in the conditional probability and that with a proper measure of uncertainty no paradox exists. The assertion that the WGCEP probability assessment in 1988 correctly forecast the 1989 Loma Prieta earthquake is also challenged by showing that posterior probability of rupture inferred after the occurrence of the earthquake from the prior WGCEP probability distribution reverts to a nearly informationless distribution. -Author

  18. Integrated statistical modelling of spatial landslide probability

    NASA Astrophysics Data System (ADS)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
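
    Step (v) gives an explicit per-pixel combination rule: the integrated probability is the maximum of the release probability and the product of the impact probability and the zonal release probability. A tiny sketch of that rule, using made-up grids in place of the outputs of steps (ii)-(iv):

      # Per-pixel combination rule from step (v); the three grids are placeholders
      # standing in for the outputs of steps (ii)-(iv).
      p_release = [[0.02, 0.10], [0.00, 0.30]]   # pixel-based release probability
      p_impact  = [[0.50, 0.05], [0.80, 0.10]]   # impact probability from the angle-of-reach CDF
      p_zonal   = [[0.40, 0.40], [0.60, 0.60]]   # zonal release probability for each pixel's zone

      p_integrated = [[max(r, i * z) for r, i, z in zip(rr, ri, rz)]
                      for rr, ri, rz in zip(p_release, p_impact, p_zonal)]
      print(p_integrated)                        # [[0.2, 0.1], [0.48, 0.3]]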

  19. A SWIRE Picture is Worth Billions of Years

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [Figures removed for brevity; see original site. Figure 1: SWIRE View of Distant Galaxies. Figures 2, 3, and 4 are described in the text below.]

    These spectacular images, taken by the Spitzer Wide-area Infrared Extragalactic (SWIRE) Legacy project, encapsulate one of the primary objectives of the Spitzer mission: to connect the evolution of galaxies from the distant, or early, universe to the nearby, or present day, universe.

    The Tadpole galaxy (main image) is the result of a recent galactic interaction in the local universe. Although these galactic mergers are rare in the universe's recent history, astronomers believe that they were much more common in the early universe. Thus, SWIRE team members will use this detailed image of the Tadpole galaxy to help understand the nature of the 'faint red-orange specks' of the early universe.

    The larger picture (figure 2) depicts one-sixteenth of the SWIRE survey field called ELAIS-N1. In this image, the bright blue sources are hot stars in our own Milky Way, which range anywhere from 3 to 60 times the mass of our Sun. The fainter green spots are cooler stars and galaxies beyond the Milky Way whose light is dominated by older stellar populations. The red dots are dusty galaxies that are undergoing intense star formation. The faintest specks of red-orange are galaxies billions of light-years away in the distant universe.

    Figure 3 features an unusual ring-like galaxy called CGCG 275-022. The red spiral arms indicate that this galaxy is very dusty and perhaps undergoing intense star formation. The star-forming activity could have been initiated by a near head-on collision with another galaxy.

    The most distant galaxies that SWIRE is able to detect are revealed in a zoom of deep space (figure 4). The colors in this feature represent the same objects as those in the larger field image of ELAIS

  20. Bell Could Become the Copernicus of Probability

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
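
    The CHSH test mentioned at the end illustrates the point numerically: for the singlet state the quantum correlation is E(a,b) = -cos(a-b), and with the standard analyser angles the CHSH combination reaches 2*sqrt(2), above the bound of 2 that any Kolmogorovian (local hidden-variable) model must satisfy. A quick check using these textbook values (not code from the paper):

      import math

      def E(a, b):
          # Singlet-state spin correlation for analyser angles a, b (radians).
          return -math.cos(a - b)

      # Standard CHSH angle choices that maximise the quantum value.
      a1, a2, b1, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
      S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
      print(abs(S), 2 * math.sqrt(2))   # both about 2.83, above the classical bound of 2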

  1. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  2. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  3. Rubidium-strontium date of possibly 3 billion years for a granitic rock from antarctica.

    PubMed

    Halpern, M

    1970-09-01

    A single total rock sample of biotite granite from Jule Peaks, Antarctica, has been dated by the rubidium-strontium method at about 3 billion years. The juxtaposition of this sector of Antarctica with Africa in the Dietz and Sproll continental drift reconstruction results in a possible geochronologic fit of the Princess Martha Coast of Antarctica with a covered possible northeastern extension of the African Swaziland Shield, which contains granitic rocks that are also 3 billion years old.

  4. Indian farmers need help to feed over 1.5 billion people in 2030.

    PubMed

    Jagadish, Mittur N

    2012-01-01

    In view of the enormous challenge and pressure on farmers to feed the 9 billion-plus people and billions of animals that will be living on our planet in 2050, new technologies must be invented, assessed and adapted. Farmer welfare and the provision of resources required for their work are of paramount importance. India has benefited from Bt cotton technology and will certainly benefit from other biotech crops that have been carefully developed and assessed for consumption and environmental safety.

  5. Rules Set for $4 Billion Race to Top Contest: Final Rules Give States Detailed Map in Quest for $4 Billion in Education Stimulus Aid

    ERIC Educational Resources Information Center

    McNeil, Michele

    2009-01-01

    For a good shot at $4 billion in grants from the federal Race to the Top Fund, states will need to make a persuasive case for their education reform agendas, demonstrate significant buy-in from local school districts, and devise plans to evaluate teachers and principals based on student performance, according to final regulations released last…

  6. Entropy analysis of systems exhibiting negative probabilities

    NASA Astrophysics Data System (ADS)

    Tenreiro Machado, J. A.

    2016-07-01

    This paper addresses the concept of negative probability and its impact upon entropy. An analogy between the probability generating functions, in the scope of quasiprobability distributions, and the Grünwald-Letnikov definition of fractional derivatives, is explored. Two distinct cases producing negative probabilities are formulated and their distinct meaning clarified. Numerical calculations using the Shannon entropy further characterize the two limit cases.
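
    The paper's exact formulation is not reproduced here, but a simple way to see why negative "probabilities" complicate entropy is that the Shannon summand -p ln p requires a complex logarithm once p < 0. The toy signed distribution below is my own illustration, not an example from the paper.

      import cmath

      quasi = [0.6, 0.5, -0.1]   # signed "quasiprobability" values summing to 1
      assert abs(sum(quasi) - 1.0) < 1e-12

      H = -sum(p * cmath.log(p) for p in quasi)   # complex log handles the negative entry
      print(H)   # the negative entry contributes an imaginary part to the "entropy"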

  7. Calculating the CEP (Circular Error Probable)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report compares the probability contained in the Circular Error Probable associated with an Elliptical Error Probable to that of the EEP at a given confidence level. The levels examined are 50 percent and 95 percent. The CEP is found to be both more conservative and less conservative than the associated EEP, depending on the eccentricity of the ellipse. The formulas used are derived in the appendix.
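
    The report's formulas are in its appendix and are not reproduced here. As a rough numerical illustration of why the answer depends on eccentricity, the sketch below estimates, for a zero-mean bivariate normal error with independent components, the probability contained in a circle having the same area as the 50 percent error ellipse; equating areas is my own stand-in for "the CEP associated with an EEP", not the report's definition.

      import math
      import random

      def circle_coverage(sx, sy, level=0.5, n=100_000, seed=0):
          # Fraction of zero-mean bivariate-normal shots falling inside a circle whose
          # area equals that of the `level` probability ellipse (semi-axes k*sx, k*sy).
          k2 = -2.0 * math.log(1.0 - level)      # squared Mahalanobis radius of the ellipse
          r2 = k2 * sx * sy                      # equal-area circle: pi*r^2 = pi*(k*sx)*(k*sy)
          rng = random.Random(seed)
          hits = sum(rng.gauss(0, sx) ** 2 + rng.gauss(0, sy) ** 2 <= r2 for _ in range(n))
          return hits / n

      for sy in (1.0, 0.5, 0.2):                 # increasing eccentricity
          print(f"sigma_y/sigma_x = {sy}: circle contains ~{circle_coverage(1.0, sy):.3f}")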

  8. Expert fears doom if world population hits 12-15 billion.

    PubMed

    1994-02-22

    Earth's land, water and cropland are disappearing so rapidly that the world population must be slashed to 2 billion or less by 2100 to provide prosperity for all in that year, says a study released yesterday. The alternative, if current trends continue, is a population of 12 billion to 15 billion people and an apocalyptic worldwide scene of "absolute misery, poverty, disease and starvation," said the study's author, David Pimentel, an ecologist at Cornell University. In the US, the population would climb to 500 million and the standard of living would decline to slightly better than in present-day China, Mr. Pimentel said at the annual meeting of the American Association for the Advancement of Science. Even now, the world population of 6 billion is at least 3 times what the Earth's battered natural resources and depleted energy reserves would be able to comfortably support in 2100, Mr. Pimentel said. Mr. Pimentel defines "comfortably support" as providing something close to the current American standard of living, but with wiser use of energy and natural resources. Although a decline to 1 billion or 2 billion people over the next century sounds nearly impossible, it could be done by limiting families around the world to an average of 1.5 children, Mr. Pimentel said. Currently, US women have an average of 2.1 children, while the average in Rwanda is 8.5.

  9. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, w(0) = 0, w(1/e) = 1/e, and w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
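
    The Prelec weighting function w(p) = exp(-(-ln p)^α) can be checked numerically: it has a fixed point at p = 1/e for any α and, for 0 < α < 1, overweights small probabilities and underweights large ones. A short check with an arbitrary illustrative α:

      import math

      def prelec_w(p, alpha=0.65):
          # Prelec (1998) probability weighting function w(p) = exp(-(-ln p)**alpha).
          return math.exp(-((-math.log(p)) ** alpha))

      print(abs(prelec_w(1 / math.e) - 1 / math.e) < 1e-12)   # fixed point at p = 1/e
      for p in (0.01, 0.1, 0.5, 0.9, 0.99):
          print(f"w({p:.2f}) = {prelec_w(p):.4f}")   # overweights small p, underweights large p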

  10. Is probability of frequency too narrow?

    SciTech Connect

    Martz, H.F.

    1993-10-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed.

  11. The Yatela gold deposit: 2 billion years in the making

    NASA Astrophysics Data System (ADS)

    Hein, K. A. A.; Matsheka, I. R.; Bruguier, O.; Masurel, Q.; Bosch, D.; Caby, R.; Monié, P.

    2015-12-01

    Gold mineralisation in the Yatela Main gold mine is hosted in a saprolitic residuum situated above Birimian supracrustal rocks, and at depth. The supracrustal rocks comprise metamorphosed calcitic and dolomitic marbles that were intruded by diorite (2106 ± 10 Ma, 207Pb/206Pb), and sandstone-siltstone-shale sequences (youngest detrital zircon population dated at 2139 ± 6 Ma). In-situ gold-sulphide mineralisation is associated with hydrothermal activity synchronous to emplacement of the diorite and forms a sub-economic resource; however, the overlying saprolitic residuum hosts economic gold mineralisation in friable lateritized palaeosols and aeolian sands (loess). Samples of saprolitic residuum were studied to investigate the morphology and composition of gold grains as a proxy for distance from source (and possible exploration vector) because the deposit hosts both angular and detrital gold suggesting both proximal and distal sources. U-Pb geochronology of detrital zircons also indicated a proximal and distal source, with the age spectra giving Archaean (2.83-3.28 Ga), and Palaeoproterozoic (1.95-2.20 Ga) to Neoproterozoic (1.1-1.8 Ga) zircons in the Yatela depocentre. The 1.1-1.8 Ga age spectrum restricts the maximum age for the first deposition of the sedimentary units in the Neoproterozoic, or during early deposition in the Taoudeni Basin. Models for formation of the residuum include distal and proximal sources for detritus into the depocentre, however, it is more likely that material was sourced locally and included recycled material. The creation of a deep laterite weathering profile and supergene enrichment of the residuum probably took place during the mid-Cretaceous-early Tertiary.

  12. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... this section shall be construed as prohibiting the disclosure of any fiber present in a textile...

  13. A 25.5 percent AM0 gallium arsenide grating solar cell

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Godlewski, M. P.

    1985-01-01

    Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AM0) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.

  14. A 25.5 percent AM0 gallium arsenide grating solar cell

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Godlewski, M. P.

    1985-01-01

    Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AM0) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.

  15. Corrosion behavior of aluminum-alumina composites in aerated 3.5 percent chloride solution

    NASA Astrophysics Data System (ADS)

    Acevedo Hurtado, Paul Omar

    Aluminum-based metal matrix composites are finding many applications in engineering. Of these, Al-Al2O3 composites appear to have promise in a number of defense applications because of their mechanical properties. However, their corrosion behavior remains suspect, especially in marine environments. While efforts are being made to improve the corrosion resistance of Al-Al2O3 composites, the mechanism of corrosion is not well known. In this study, the corrosion behavior of powder metallurgy processed Al-Cu alloy reinforced with 10, 15, 20 and 25 vol. % Al2O3 particles (XT 1129, XT 2009, XT 2048, XT 2031) was evaluated in aerated 3.5% NaCl solution using microstructural and electrochemical measurements. AA1100-O and AA2024-T4 monolithic alloys were also studied for comparison purposes. The composites and unreinforced alloys were subjected to potentiodynamic polarization and Electrochemical Impedance Spectroscopy (EIS) testing. Addition of 25 vol. % Al2O3 to the base alloys was found to increase their corrosion resistance considerably. Microstructural studies revealed the presence of intermetallic Al2Cu particles in these composites that appeared to play an important role in the observations. Pitting potential for these composites was near corrosion potential values, and repassivation potential was below the corresponding corrosion potential, indicating that these materials begin to corrode spontaneously as soon as they come in contact with the 3.5% NaCl solution. EIS measurements indicate the occurrence of adsorption/diffusion phenomena at the interface of the composites which ultimately initiate localized or pitting corrosion. Polarization resistance values were extracted from the EIS data for all the materials tested. Electrically equivalent circuits are proposed to describe and substantiate the corrosive processes occurring in these Al-Al2O3 composite materials.

  16. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... options are members of a pre-existing public group. Moreover, if transferable options are issued to...

  17. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... options are members of a pre-existing public group. Moreover, if transferable options are issued to...

  18. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... members. However, the participation by creditors in formulating a plan for an insolvency workout or a... receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or reorganization do... options are members of a pre-existing public group. Moreover, if transferable options are issued to...

  19. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  20. Probability: A Matter of Life and Death

    ERIC Educational Resources Information Center

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  1. Average Transmission Probability of a Random Stack

    ERIC Educational Resources Information Center

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  2. Teaching Probability: A Socio-Constructivist Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  3. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  4. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computed to no less than three significant digits. Probabilities will be truncated to the number of significant digits used in a particular lottery. (b) Divide the total number of applicants into 1.00 to... than .40, then multiply each such intermediate probability by the ratio of .40 to such sum. Divide...

  5. Correlation as Probability of Common Descent.

    ERIC Educational Resources Information Center

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…

  6. Phonotactic Probabilities in Young Children's Speech Production

    ERIC Educational Resources Information Center

    Zamuner, Tania S.; Gerken, Louann; Hammond, Michael

    2004-01-01

    This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…

  7. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  8. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  9. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  10. Teenagers' Perceived and Actual Probabilities of Pregnancy.

    ERIC Educational Resources Information Center

    Namerow, Pearila Brickner; And Others

    1987-01-01

    Explored adolescent females' (N=425) actual and perceived probabilities of pregnancy. Subjects estimated their likelihood of becoming pregnant the last time they had intercourse, and indicated the dates of last intercourse and last menstrual period. Found that the distributions of perceived probability of pregnancy were nearly identical for both…

  11. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  12. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  13. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a…

  14. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  15. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.
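
    As a rough illustration of the kind of per-cell calculation the abstract describes, the sketch below evaluates a time-dependent conditional failure probability over a small raster. The logistic form and every coefficient are invented assumptions for illustration, not the calibrated model from the study.

        # Hypothetical sketch: per-cell, per-hour conditional soil-slip probability.
        # The logistic link and the coefficients b0..b3 are illustrative assumptions.
        import numpy as np

        def soil_slip_probability(slope_deg, cum_rain_mm, rain_rate_mm_hr,
                                  b0=-8.0, b1=0.08, b2=0.03, b3=0.2):
            """Toy logistic model of P(failure) from slope and rainfall history."""
            z = b0 + b1 * slope_deg + b2 * cum_rain_mm + b3 * rain_rate_mm_hr
            return 1.0 / (1.0 + np.exp(-z))

        # A 3x3 raster of hillside slopes (degrees) and one hour of rainfall forcing.
        slope = np.array([[12., 25., 33.], [18., 40., 29.], [8., 22., 35.]])
        p_map = soil_slip_probability(slope, cum_rain_mm=150.0, rain_rate_mm_hr=20.0)
        print(np.round(p_map, 3))  # per-cell probabilities that could be mapped in a GIS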

  16. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  17. Quantum probability assignment limited by relativistic causality

    PubMed Central

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether such a change in the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  18. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial parameters and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
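
    To make the data-generating model concrete, here is a minimal simulation, under simplifying assumptions (two categories, symmetric misclassification, invented parameter values), of counts observed when the correct-classification probability varies among sampling units on the logit scale; it illustrates why an analysis that assumes a constant classification probability becomes biased.

        # Simulation sketch of the heterogeneous-classification data model.
        # All parameter values are illustrative, not from the paper.
        import numpy as np

        rng = np.random.default_rng(1)
        n_units, n_items = 50, 30
        pi_true = 0.6                  # true proportion of category 1
        mu, sigma = 1.5, 0.8           # logit-scale mean/sd of correct-classification prob.

        obs_counts = []
        for _ in range(n_units):
            theta = rng.binomial(n_items, pi_true)                 # true category-1 items
            p_correct = 1 / (1 + np.exp(-rng.normal(mu, sigma)))   # logit-normal draw
            # observed category-1 count = correctly kept 1s + misclassified 0s
            y = rng.binomial(theta, p_correct) + rng.binomial(n_items - theta, 1 - p_correct)
            obs_counts.append(y)

        print("naive estimate of pi:", np.mean(obs_counts) / n_items)  # biased when sigma > 0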

  19. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
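
    The bookkeeping behind a Sagan-Coleman style calculation extended over several release mechanisms can be sketched as below. The mechanism names, bio-burdens and growth probabilities are made-up numbers for illustration, and treating growth events as Poisson is an added assumption rather than the paper's exact formulation.

        # Toy contamination-probability bookkeeping over three release mechanisms.
        import math

        release_mechanisms = {
            # mechanism: (expected viable microbes released, P(growth) per released microbe)
            "surface_erosion":  (1e3, 1e-9),
            "hard_impact":      (1e5, 1e-8),
            "aeroshell_burnup": (1e2, 1e-10),
        }

        p_no_contamination = 1.0
        for name, (expected_release, p_growth) in release_mechanisms.items():
            expected_growing = expected_release * p_growth     # Sagan-Coleman product per mechanism
            p_no_contamination *= math.exp(-expected_growing)  # Poisson P(no growth event)

        print("P(contamination) ≈", 1.0 - p_no_contamination)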

  20. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA)  =  0.25g, probabilities range from 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  1. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  2. The Animism Controversy Revisited: A Probability Analysis

    ERIC Educational Resources Information Center

    Smeets, Paul M.

    1973-01-01

    Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)

  3. Classical and Quantum Spreading of Position Probability

    ERIC Educational Resources Information Center

    Farina, J. E. G.

    1977-01-01

    Demonstrates that the standard deviation of the position probability of a particle moving freely in one dimension is a function of the standard deviation of its velocity distribution and time in classical or quantum mechanics. (SL)

  4. Inclusion probability with dropout: an operational formula.

    PubMed

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
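
    For context, the classic no-dropout version of the calculation is easy to state in code: at each locus the probability of inclusion is the squared sum of the visible-allele frequencies, and loci are combined by multiplication into the cumulative PI. The allele frequencies below are invented, and the dropout-aware formula in the paper additionally requires fixing the number of contributors.

        # Classic RMNE / probability-of-inclusion sketch (no dropout).
        from math import prod

        visible_allele_freqs = {
            "D3S1358": [0.25, 0.21, 0.15],
            "vWA":     [0.28, 0.19],
            "FGA":     [0.22, 0.14, 0.11, 0.09],
        }

        # Per-locus PI = (sum of visible-allele frequencies)^2
        pi_per_locus = {locus: sum(f) ** 2 for locus, f in visible_allele_freqs.items()}
        cumulative_pi = prod(pi_per_locus.values())
        print(pi_per_locus)
        print("cumulative probability of inclusion:", cumulative_pi)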

  5. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed.

  6. The cognitive substrate of subjective probability.

    PubMed

    Nilsson, Håkan; Olsson, Henrik; Juslin, Peter

    2005-07-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as prototype similarity, relative likelihood, or evidential support accumulation (ESAM; D. J. Koehler, C. M. White, & R. Grondin, 2003); cue-based relative frequency; and exemplar memory, implemented by probabilities from exemplars (PROBEX; P. Juslin & M. Persson, 2002). Three experiments with different task structures consistently demonstrate that exemplar memory is the best account of the data whereas the results are inconsistent with extant formulations of the representativeness heuristic and cue-based relative frequency. PMID:16060768

  7. Rare Gases Transition Probabilities for Plasma Diagnostics

    SciTech Connect

    Katsonis, K.; Siskos, A.; Ndiaye, A.; Clark, R. E. H.; Cornille, M.; Abdallah, J. Jr.

    2006-01-15

    Evaluation of Ar and Xe transition probabilities to be used in Collisional-Radiative models for plasma diagnostics is addressed. Partial results are given for the typical case of the 4p ← 4d Ar III multiplet.

  8. Teaching Elementary Probability Through its History.

    ERIC Educational Resources Information Center

    Kunoff, Sharon; Pines, Sylvia

    1986-01-01

    Historical problems are presented which can readily be solved by students once some elementary probability concepts are developed. The Duke of Tuscany's Problem; the problem of points; and the question of proportions, divination, and Bertrand's Paradox are included. (MNS)

  9. Determining Probabilities by Examining Underlying Structure.

    ERIC Educational Resources Information Center

    Norton, Robert M.

    2001-01-01

    Discusses how dice games pose fairness issues that appeal to students and examines a structure for three games involving two dice in a way that leads directly to the theoretical probabilities for all possible outcomes. (YDS)

  10. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.

  11. Non-Gaussian Photon Probability Distribution

    NASA Astrophysics Data System (ADS)

    Solomon, Benjamin T.

    2010-01-01

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma mΓ distribution (whose parameters are α = r, β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints to quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of the photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub wavelength confinement; thereby providing a strong case that

  12. Probability distribution of the vacuum energy density

    SciTech Connect

    Duplancic, Goran; Stefancic, Hrvoje; Glavan, Drazen

    2010-12-15

    As the vacuum state of a quantum field is not an eigenstate of the Hamiltonian density, the vacuum energy density can be represented as a random variable. We present an analytical calculation of the probability distribution of the vacuum energy density for real and complex massless scalar fields in Minkowski space. The obtained probability distributions are broad and the vacuum expectation value of the Hamiltonian density is not fully representative of the vacuum energy density.

  13. Robust satisficing and the probability of survival

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2014-01-01

    Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.

  14. When probability trees don't work

    NASA Astrophysics Data System (ADS)

    Chan, K. C.; Lenard, C. T.; Mills, T. M.

    2016-08-01

    Tree diagrams arise naturally in courses on probability at high school or university, even at an elementary level. Often they are used to depict outcomes and associated probabilities from a sequence of games. A subtle issue is whether or not the Markov condition holds in the sequence of games. We present two examples that illustrate the importance of this issue. Suggestions as to how these examples may be used in a classroom are offered.

  15. The spline probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Sithiravel, Rajiv; Tharmarasa, Ratnasingham; McDonald, Mike; Pelletier, Michel; Kirubarajan, Thiagalingam

    2012-06-01

    The Probability Hypothesis Density Filter (PHD) is a multitarget tracker for recursively estimating the number of targets and their state vectors from a set of observations. The PHD filter is capable of working well in scenarios with false alarms and missed detections. Two distinct PHD filter implementations are available in the literature: the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) and the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filters. The SMC-PHD filter uses particles to provide target state estimates, which can lead to a high computational load, whereas the GM-PHD filter does not use particles, but restricts to linear Gaussian mixture models. The SMC-PHD filter technique provides only weighted samples at discrete points in the state space instead of a continuous estimate of the probability density function of the system state and thus suffers from the well-known degeneracy problem. This paper proposes a B-Spline based Probability Hypothesis Density (S-PHD) filter, which has the capability to model any arbitrary probability density function. The resulting algorithm can handle linear, non-linear, Gaussian, and non-Gaussian models and the S-PHD filter can also provide continuous estimates of the probability density function of the system state. In addition, by moving the knots dynamically, the S-PHD filter ensures that the splines cover only the region where the probability of the system state is significant, hence the high efficiency of the S-PHD filter is maintained at all times. Also, unlike the SMC-PHD filter, the S-PHD filter is immune to the degeneracy problem due to its continuous nature. The S-PHD filter derivations and simulations are provided in this paper.

  16. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
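
    A minimal sketch of the zero-inflated binomial mixture likelihood described above, assuming a Beta mixing distribution for detection probability p and integrating it numerically; the values of psi and the Beta parameters are illustrative only.

        # Zero-inflated binomial mixture likelihood for one site (illustrative sketch).
        import numpy as np
        from scipy.stats import binom, beta

        def site_likelihood(y, J, psi, a, b, n_grid=2000):
            """P(y detections in J visits) when p ~ Beta(a, b) varies among sites."""
            p = np.linspace(1e-6, 1 - 1e-6, n_grid)
            detect = np.sum(binom.pmf(y, J, p) * beta.pdf(p, a, b)) * (p[1] - p[0])
            return psi * detect + (1 - psi) * (y == 0)   # zero-inflation: site may be unoccupied

        print(site_likelihood(y=0, J=5, psi=0.7, a=2, b=5))  # absent, or present but undetected
        print(site_likelihood(y=3, J=5, psi=0.7, a=2, b=5))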

  17. Familiarity and preference for pitch probability profiles.

    PubMed

    Cui, Anja-Xiaoxing; Collett, Meghan J; Troje, Niko F; Cuddy, Lola L

    2015-05-01

    We investigated familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, we asked participants either to identify the more familiar or to identify the preferred tone sequence in a two-alternative forced-choice task. The task paired a tone sequence generated from the pitch probability profile they had been exposed to and a tone sequence generated from another pitch probability profile at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile to which they had been exposed. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference to be consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile to which they had been exposed suggests that statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge. PMID:25838257

  18. 3.4-Billion-year-old biogenic pyrites from Barberton, South Africa: sulfur isotope evidence.

    PubMed

    Ohmoto, H; Kakegawa, T; Lowe, D R

    1993-10-22

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>10(-13) of the present atmospheric level) of free oxygen. PMID:11539502

  19. 3.4-Billion-year-old biogenic pyrites from Barberton, South Africa: sulfur isotope evidence

    NASA Technical Reports Server (NTRS)

    Ohmoto, H.; Kakegawa, T.; Lowe, D. R.

    1993-01-01

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>>10(-13) of the present atmospheric level) of free oxygen.

  20. NOAA Budget Increases to $4.1 Billion, But Some Key Items Are Reduced

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2008-02-01

    The Bush administration has proposed a US$4.1 billion budget for fiscal year (FY) 2009 for the U.S. National Oceanic and Atmospheric Administration (NOAA). The proposed budget, which would be the agency's largest ever, is $202.6 million, or 5.2%, above the FY 2008 enacted budget. By topping $4 billion and the amount Congress passed for FY 2008, the budget proposal crosses into "a new threshold," according to Navy Vice Admiral Conrad Lautenbacher, undersecretary of commerce for oceans and atmosphere and NOAA administrator.

  1. 3.4-Billion-Year-Old Biogenic Pyrites from Barberton, South Africa: Sulfur Isotope Evidence

    NASA Astrophysics Data System (ADS)

    Ohmoto, Hiroshi; Kakegawa, Takeshi; Lowe, Donald R.

    1993-10-01

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (> > 10-13 of the present atmospheric level) of free oxygen.

  2. 3.4-Billion-year-old biogenic pyrites from Barberton, South Africa: sulfur isotope evidence.

    PubMed

    Ohmoto, H; Kakegawa, T; Lowe, D R

    1993-10-22

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>10(-13) of the present atmospheric level) of free oxygen.

  3. Billions for biodefense: federal agency biodefense funding, FY2001-FY2005.

    PubMed

    Schuler, Ari

    2004-01-01

    Over the past several years, the United States government has spent substantial resources on preparing the nation against a bioterrorist attack. This article analyzes the civilian biodefense funding by the federal government from fiscal years 2001 through 2005, specifically analyzing the budgets and allocations for biodefense at the Department of Health and Human Services, the Department of Homeland Security, the Department of Defense, the Department of Agriculture, the Environmental Protection Agency, the National Science Foundation, and the Department of State. In total, approximately $14.5 billion has been funded for civilian biodefense through FY2004, with an additional $7.6 billion in the President's budget request for FY2005. PMID:15225402

  4. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potential high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. So constructing a method for estimating how the collision probability will evolve improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process where the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a
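
    A toy version of the Monte Carlo step described above: sample relative miss vectors at the time of closest approach from an assumed Gaussian combined state uncertainty and count the fraction falling inside a hard-body radius. The mean miss vector, covariance and radius below are invented for illustration.

        # Toy Monte Carlo collision-probability estimate at closest approach.
        import numpy as np

        rng = np.random.default_rng(0)
        mean_miss = np.array([120.0, -40.0, 15.0])      # metres, nominal relative position at TCA
        cov = np.diag([80.0, 60.0, 25.0]) ** 2          # combined positional covariance (m^2)
        hard_body_radius = 20.0                         # metres

        samples = rng.multivariate_normal(mean_miss, cov, size=200_000)
        miss_distance = np.linalg.norm(samples, axis=1)
        p_collision = np.mean(miss_distance < hard_body_radius)
        print(f"estimated collision probability: {p_collision:.2e}")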

  5. Segmentation and automated measurement of chronic wound images: probability map approach

    NASA Astrophysics Data System (ADS)

    Ahmad Fauzi, Mohammad Faizal; Khansa, Ibrahim; Catignani, Karen; Gordillo, Gayle; Sen, Chandan K.; Gurcan, Metin N.

    2014-03-01

    An estimated 6.5 million patients in the United States are affected by chronic wounds, with more than 25 billion US dollars and countless hours spent annually for all aspects of chronic wound care. There is a need to develop software tools to analyze wound images that characterize wound tissue composition, measure their size, and monitor changes over time. This process, when done manually, is time-consuming and subject to intra- and inter-reader variability. In this paper, we propose a method that can characterize chronic wounds containing granulation, slough and eschar tissues. First, we generate a Red-Yellow-Black-White (RYKW) probability map, which then guides the region growing segmentation process. The red, yellow and black probability maps are designed to handle the granulation, slough and eschar tissues, respectively, found in wound tissues, while the white probability map is designed to detect the white label card for measurement calibration purposes. The innovative aspects of this work include: 1) Definition of a wound-characteristics-specific probability map for segmentation; 2) Computationally efficient region growing on a 4D map; 3) Auto-calibration of measurements with the content of the image. The method was applied to 30 wound images provided by the Ohio State University Wexner Medical Center, with the ground truth independently generated by the consensus of two clinicians. While the inter-reader agreement between the readers is 85.5%, the computer achieves an accuracy of 80%.
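
    In the spirit of the RYKW map, a per-pixel class probability can be sketched from distances to reference red/yellow/black/white colours, as below. The reference colours and the exponential weighting are assumptions for illustration, not the authors' calibrated model, and the subsequent region-growing step is omitted.

        # Illustrative per-pixel RYKW-style probability maps from colour distances.
        import numpy as np

        reference = {
            "red":    np.array([180,  40,  40], float),
            "yellow": np.array([200, 190,  90], float),
            "black":  np.array([ 40,  30,  30], float),
            "white":  np.array([240, 240, 240], float),
        }

        def rykw_probability_map(image, tau=40.0):
            """image: H x W x 3 uint8 array -> dict of H x W probability maps."""
            img = image.astype(float)
            scores = {k: np.exp(-np.linalg.norm(img - c, axis=-1) / tau)
                      for k, c in reference.items()}
            total = sum(scores.values())
            return {k: s / total for k, s in scores.items()}

        demo = np.tile(np.array([175, 50, 45], np.uint8), (4, 4, 1))  # a small reddish patch
        print(np.round(rykw_probability_map(demo)["red"], 2))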

  6. Tsunami probability in the Caribbean Region

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2008-01-01

    We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30% regionally. © Birkhäuser 2008.
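
    Under the Poissonian model, the exceedance probability for a coastal cell over a fixed exposure window follows directly from its mean runup rate; the two example rates below are invented for illustration.

        # Poissonian exceedance probability per coastal cell:
        # P(at least one runup > 0.5 m in T years) = 1 - exp(-rate * T)
        import math

        def exceedance_probability(mean_annual_rate, window_years=30):
            return 1.0 - math.exp(-mean_annual_rate * window_years)

        for cell, rate in {"cell_A": 1 / 250.0, "cell_B": 1 / 1200.0}.items():
            print(cell, round(exceedance_probability(rate), 3))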

  7. The rapid assembly of an elliptical galaxy of 400 billion solar masses at a redshift of 2.3.

    PubMed

    Fu, Hai; Cooray, Asantha; Feruglio, C; Ivison, R J; Riechers, D A; Gurwell, M; Bussmann, R S; Harris, A I; Altieri, B; Aussel, H; Baker, A J; Bock, J; Boylan-Kolchin, M; Bridge, C; Calanog, J A; Casey, C M; Cava, A; Chapman, S C; Clements, D L; Conley, A; Cox, P; Farrah, D; Frayer, D; Hopwood, R; Jia, J; Magdis, G; Marsden, G; Martínez-Navajas, P; Negrello, M; Neri, R; Oliver, S J; Omont, A; Page, M J; Pérez-Fournon, I; Schulz, B; Scott, D; Smith, A; Vaccari, M; Valtchanov, I; Vieira, J D; Viero, M; Wang, L; Wardlow, J L; Zemcov, M

    2013-06-20

    Stellar archaeology shows that massive elliptical galaxies formed rapidly about ten billion years ago with star-formation rates of above several hundred solar masses per year. Their progenitors are probably the submillimetre bright galaxies at redshifts z greater than 2. Although the mean molecular gas mass (5 × 10(10) solar masses) of the submillimetre bright galaxies can explain the formation of typical elliptical galaxies, it is inadequate to form elliptical galaxies that already have stellar masses above 2 × 10(11) solar masses at z ≈ 2. Here we report multi-wavelength high-resolution observations of a rare merger of two massive submillimetre bright galaxies at z = 2.3. The system is seen to be forming stars at a rate of 2,000 solar masses per year. The star-formation efficiency is an order of magnitude greater than that of normal galaxies, so the gas reservoir will be exhausted and star formation will be quenched in only around 200 million years. At a projected separation of 19 kiloparsecs, the two massive starbursts are about to merge and form a passive elliptical galaxy with a stellar mass of about 4 × 10(11) solar masses. We conclude that gas-rich major galaxy mergers with intense star formation can form the most massive elliptical galaxies by z ≈ 1.5.

  8. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
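
    A hedged sketch of the counting idea: the number of small events since the last large one acts as a "natural time" that a Weibull law converts into a conditional probability. The mean inter-event count and the Weibull shape parameter below are illustrative, not the calibrated values from the study.

        # Count of small events since the last large event -> Weibull probability.
        import math

        def large_event_probability(small_count, mean_count_between_large=1000.0, shape=1.4):
            """Weibull CDF evaluated in 'natural time' measured by the small-event count."""
            scale = mean_count_between_large / math.gamma(1.0 + 1.0 / shape)
            return 1.0 - math.exp(-((small_count / scale) ** shape))

        for n in (250, 500, 1000, 2000):
            print(n, round(large_event_probability(n), 3))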

  9. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.

  10. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
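
    For scale, a plain Monte Carlo estimate of a small failure probability, with a normal-approximation confidence interval, looks like the sketch below; the limit-state function is an arbitrary stand-in, and the example illustrates why conditional sampling inside analytically bounded failure sets becomes attractive when the failure probability is small.

        # Plain Monte Carlo failure-probability estimate (illustrative limit state).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000
        x = rng.normal(size=(n, 2))                    # two uncertain parameters
        failures = (x[:, 0] + 0.5 * x[:, 1] > 4.0)     # failure set {g(x) > 0}, illustrative
        p_hat = failures.mean()
        half_width = 1.96 * np.sqrt(p_hat * (1 - p_hat) / n)
        print(f"p_f ≈ {p_hat:.2e}  (95% CI ± {half_width:.1e})")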

  11. Classical and Quantum Probability for Biologists - Introduction

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei.

    2010-01-01

    The aim of this review (oriented to biologists looking for applications of QM) is to provide a detailed comparative analysis of classical (Kolmogorovian) and quantum (Dirac-von Neumann) models. We will stress differences in the definition of conditional probability and, as a consequence, in the structure of matrices of transition probabilities, especially the condition of double stochasticity which arises naturally in QM. One of the most fundamental differences between the two models is the deformation of the classical formula of total probability (FTP), which plays an important role in statistics and decision making. An additional term appears in the QM version of FTP, the so-called interference term. Finally, we discuss Bell's inequality and show that the common viewpoint that its violation induces either nonlocality or "death of realism" has not been completely justified. For us it is merely a sign of non-Kolmogorovianity of probabilistic data collected in a few experiments with incompatible setups of measurement devices.
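
    For reference, the classical formula of total probability and the two-alternative quantum-like version with an interference term can be written as below; this is the standard textbook form consistent with the abstract, and the phase θ of the interference term is context-dependent and not specified here.

        \[
        P(B) \;=\; \sum_{i} P(A_i)\, P(B \mid A_i)
        \qquad \text{(classical FTP)}
        \]
        \[
        P(b) \;=\; \sum_{i=1,2} P(a_i)\, P(b \mid a_i)
        \;+\; 2\cos\theta \,\sqrt{P(a_1) P(b \mid a_1)\, P(a_2) P(b \mid a_2)}
        \qquad \text{(QM version with interference term)}
        \]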

  12. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into amplitude peaks, a simple threshold decision can detect the Extend-Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noise environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (pd) for fluctuating and non-fluctuating targets has been deduced. Also, a comparison of the pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is not smaller than 6, the detection performance of the EBPSK-MODEM system is better than that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided for illustrating the performance.

  13. Transition Probabilities for Hydrogen-Like Atoms

    NASA Astrophysics Data System (ADS)

    Jitrik, Oliverio; Bunge, Carlos F.

    2004-12-01

    E1, M1, E2, M2, E3, and M3 transition probabilities for hydrogen-like atoms are calculated with point-nucleus Dirac eigenfunctions for Z=1-118 and up to large quantum numbers l=25 and n=26, increasing existing data more than a thousandfold. A critical evaluation of the accuracy shows a higher reliability with respect to previous works. Tables for hydrogen containing a subset of the results are given explicitly, listing the states involved in each transition, wavelength, term energies, statistical weights, transition probabilities, oscillator strengths, and line strengths. The complete results, including 1 863 574 distinct transition probabilities, lifetimes, and branching fractions are available at http://www.fisica.unam.mx/research/tables/spectra/1el

  14. Independent events in elementary probability theory

    NASA Astrophysics Data System (ADS)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E 1, E 2, … , E n are jointly independent then any two events A and B built in finitely many steps from two disjoint subsets of E 1, E 2, … , E n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.

  15. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z² : x + y even, 0 ≤ y ≤ n, −y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and the site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β_g. An analysis based on the Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.

  16. Sampling Quantum Nonlocal Correlations with High Probability

    NASA Astrophysics Data System (ADS)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩)_{i,j=1}^n, where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the vectors are sampled according to the Haar measure on the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.

  17. Match probabilities in racially admixed populations.

    PubMed

    Lange, K

    1993-02-01

    The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors.
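
    For readers unfamiliar with the product rule mentioned above, the following sketch shows the textbook calculation it refers to (not the paper's bounding method): under Hardy-Weinberg and linkage equilibrium, the profile match probability is the product of per-locus genotype frequencies. The allele frequencies used here are hypothetical.

```python
# Textbook product-rule calculation (not the paper's bounding method).
# Allele frequencies below are hypothetical.
def genotype_frequency(p, q=None):
    """Hardy-Weinberg genotype frequency: p**2 for a homozygote, 2pq for a heterozygote."""
    return p * p if q is None else 2.0 * p * q

per_locus = [
    genotype_frequency(0.12, 0.08),  # heterozygote at locus 1
    genotype_frequency(0.20),        # homozygote at locus 2
    genotype_frequency(0.05, 0.31),  # heterozygote at locus 3
]

match_probability = 1.0
for frequency in per_locus:
    match_probability *= frequency   # linkage equilibrium: multiply across loci
print(match_probability)             # ~2.4e-5 for these made-up numbers
```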

  18. Genotypic probabilities for pairs of inbred relatives.

    PubMed

    Liu, Wenlei; Weir, B S

    2005-07-29

    Expressions for the joint genotypic probabilities of two related individuals are used in many population and quantitative genetic analyses. These expressions, resting on a set of 15 probabilities of patterns of identity by descent among the four alleles at a locus carried by the relatives, are generally well known. There has been recent interest in special cases where the two individuals are both related and inbred, although there have been differences among published results. Here, we return to the original 15-probability treatment and show appropriate reductions for relatives when they are drawn from a population that itself is inbred or when the relatives have parents who are related. These results have application in affected-relative tests for linkage, and in methods for interpreting forensic genetic profiles.

  19. Cancer costs projected to reach at least $158 billion in 2020

    Cancer.gov

    Based on growth and aging of the U.S. population, medical expenditures for cancer in the year 2020 are projected to reach at least $158 billion (in 2010 dollars) – an increase of 27 percent over 2010. If newly developed tools for cancer diagnosis, treatme

  20. Two Billion Cars: What it Means for Climate and Energy Policy

    ScienceCinema

    Daniel Sperling

    2016-07-12

    April 13, 2009: Daniel Sperling, director of the Institute of Transportation Studies at UC Davis, presents the next installment of Berkeley Lab's Environmental Energy Technologies Division's Distinguished Lecture series. He discusses Two Billion Cars and What it Means for Climate and Energy Policy.

  1. Nitrogen, phosphorus, and potassium requirements to support a multi-billion gallon biofuel industry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    To accomplish the goals for biofuel and bioenergy production, 1 billion tons of biomass will need to be produced annually by the year 2030. Crop production data from a joint study by the U.S. Department of Energy (US DOE) and the U.S. Department of Agriculture (USDA) demonstrated how this goal could...

  2. Two Billion Cars: What it Means for Climate and Energy Policy

    SciTech Connect

    Daniel Sperling

    2009-04-15

    April 13, 2009: Daniel Sperling, director of the Institute of Transportation Studies at UC Davis, presents the next installment of Berkeley Lab's Environmental Energy Technologies Division's Distinguished Lecture series. He discusses Two Billion Cars and What it Means for Climate and Energy Policy.

  3. Universities Report $1.8-Billion in Earnings on Inventions in 2011

    ERIC Educational Resources Information Center

    Blumenstyk, Goldie

    2012-01-01

    Universities and their inventors earned more than $1.8-billion from commercializing their academic research in the 2011 fiscal year, collecting royalties from new breeds of wheat, from a new drug for the treatment of HIV, and from longstanding arrangements over enduring products like Gatorade. Northwestern University earned the most of any…

  4. Multi-Billion Shot, High-Fluence Exposure of Cr(4+): YAG Passive Q-Switch

    NASA Technical Reports Server (NTRS)

    Stephen, Mark A.; Dallas, Joseph L.; Afzal, Robert S.

    1997-01-01

    NASA's Goddard Space Flight Center is developing the Geoscience Laser Altimeter System (GLAS), employing a diode-pumped, Q-switched Nd:YAG laser operating at a 40 Hz repetition rate. To meet the five-year mission lifetime goal, a single transmitter would accumulate over 6.3 billion shots. Cr(4+):YAG is a promising candidate material for passively Q-switching the laser. Historically, the performance of saturable absorbers has degraded over long-duration usage. To measure the multi-billion-shot performance of Cr(4+):YAG, a passively Q-switched, GLAS-like oscillator was tested at an accelerated repetition rate of 500 Hz. The intracavity fluence was calculated to be approximately 2.5 J/cm^2. The laser was monitored autonomously for 165 days. There was no evidence of change in the material's optical properties during the 7.2 billion shot test. All observed changes in laser operation could be attributed to pump laser diode aging. This is the first demonstration of multi-billion-shot exposure testing of Cr(4+):YAG in this pulse energy regime.

  5. Conservation in a World of Six Billion: A Grassroots Action Guide.

    ERIC Educational Resources Information Center

    Hren, Benedict J.

    This grassroots action guide features a conservation initiative working to bring the impacts of human population growth, economic development, and natural resource consumption into balance with the limits of nature for the benefit of current and future generations. Contents include information sheets entitled "Six Billion People and Growing,""The…

  6. US Physician Practices Spend More Than $15.4 Billion Annually To Report Quality Measures.

    PubMed

    Casalino, Lawrence P; Gans, David; Weber, Rachel; Cea, Meagan; Tuchovsky, Amber; Bishop, Tara F; Miranda, Yesenia; Frankel, Brittany A; Ziehler, Kristina B; Wong, Meghan M; Evenson, Todd B

    2016-03-01

    Each year US physician practices in four common specialties spend, on average, 785 hours per physician and more than $15.4 billion dealing with the reporting of quality measures. While much is to be gained from quality measurement, the current system is unnecessarily costly, and greater effort is needed to standardize measures and make them easier to report. PMID:26953292

  7. High-Stakes Hustle: Public Schools and the New Billion Dollar Accountability

    ERIC Educational Resources Information Center

    Baines, Lawrence A.; Stanley, Gregory Kent

    2004-01-01

    High-stakes testing costs up to $50 billion per annum, has no impact on student achievement, and has changed the focus of American public schools. This article analyzes the benefits and costs of the accountability movement, as well as discusses its roots in the eugenics movements of the early 20th century.

  8. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property, known for the two-qubit state in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single-qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and that in the single qudit is presented in an integral form, with an intertwining kernel calculated explicitly in tomographic probability terms.

  9. Conditional Probabilities and Collapse in Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.

  10. Survival probability for the stadium billiard

    NASA Astrophysics Data System (ADS)

    Dettmann, Carl P.; Georgiou, Orestis

    2009-12-01

    We consider the open stadium billiard, consisting of two semicircles joined by parallel straight sides, with one hole situated somewhere on one of the sides. Due to the hyperbolic nature of the stadium billiard, the initial decay of trajectories, due to loss through the hole, appears exponential. However, some trajectories (bouncing-ball orbits) persist and survive for long times and therefore form the main contribution to the survival probability function at long times. Using both numerical and analytical methods, we concur with previous studies that the long-time survival probability for a reasonably small hole drops like Constant × (time)^{-1}; here we obtain an explicit expression for the Constant.

  11. Does Probability Interference Exist In Social Science?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.; Haven, Emmanuel

    2007-02-01

    In this paper we discuss the rationale for why sub- (super-) additive probabilities in a psychological setting could be explained via quantum probability interference. We propose to measure the complementarity of two variables: i) the time of processing (by experiment participants) of (non-moving) images and ii) the ability (of experiment participants) to recognize deformations of (non-moving) pictures. We argue in the paper why we cannot find this complementarity using the Heisenberg Uncertainty Principle. The paper provides the details of the experimental setup for testing the complementarity.

  12. Random walks with similar transition probabilities

    NASA Astrophysics Data System (ADS)

    Schiefermayr, Klaus

    2003-04-01

    We consider random walks on the nonnegative integers with a possible absorbing state at -1. A random walk is called α-similar to another random walk if there exist constants C_ij relating the two walks' corresponding n-step transition probabilities for all i, j ≥ 0. We give necessary and sufficient conditions for the α-similarity of two random walks, both in terms of the parameters and in terms of the corresponding spectral measures which appear in the spectral representation of the n-step transition probabilities developed by Karlin and McGregor.

  13. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  14. Nonstationary envelope process and first excursion probability.

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1972-01-01

    The definition of the stationary random envelope proposed by Cramer and Leadbetter is extended to the envelope of nonstationary random processes possessing evolutionary power spectral densities. The density function, the joint density function, the moment function, and the level-crossing rate of the nonstationary envelope process are derived. Based on the envelope statistics, approximate solutions to the first excursion probability of nonstationary random processes are obtained. In particular, applications of the first excursion probability to earthquake engineering problems are demonstrated in detail.

  15. Probabilities for separating sets of order statistics.

    PubMed

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that of the smallest statistics a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
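
    For context, the Benjamini-Hochberg step-up procedure referred to above can be sketched as follows. This only implements the procedure and counts rejections; the joint distribution of rejections and false rejections is what the paper derives, and the p-values shown are made up.

```python
# Benjamini-Hochberg step-up procedure: given p-values and a target false
# discovery rate q, count how many hypotheses are rejected.
def benjamini_hochberg_rejections(p_values, q=0.05):
    m = len(p_values)
    ranked = sorted(p_values)
    k = 0
    for i, p in enumerate(ranked, start=1):
        if p <= i * q / m:
            k = i            # largest index satisfying the step-up condition
    return k                 # reject the k smallest p-values

print(benjamini_hochberg_rejections([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3]))  # 2
```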

  16. Electric quadrupole transition probabilities for atomic lithium

    SciTech Connect

    Çelik, Gültekin; Gökçe, Yasin; Yıldız, Murat

    2014-05-15

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  17. Non-Gaussian Photon Probability Distribution

    SciTech Connect

    Solomon, Benjamin T.

    2010-01-28

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma (mΓ) distribution (with parameters α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, the arc from the pinhole, and a line parallel to the photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact P_i, the probabilistic function, and the ability to interact A_i, the electromagnetic function. Splitting the probability function P_i from the electromagnetic function A_i enables the investigation of photon behavior from a purely probabilistic P_i perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function P_i and the ability to interact A_i, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon P_i of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub

  18. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    SciTech Connect

    Vourdas, A.

    2014-08-15

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  19. Technique for Evaluating Multiple Probability Occurrences /TEMPO/

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1970-01-01

    Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.

  20. The Smart Potential behind Probability Matching

    ERIC Educational Resources Information Center

    Gaissmaier, Wolfgang; Schooler, Lael J.

    2008-01-01

    Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…
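
    A small simulation makes the anomaly concrete. If one outcome occurs with probability p > 0.5, always predicting it yields accuracy p, while matching the outcome probabilities yields only p^2 + (1 - p)^2. The sketch below uses a hypothetical p = 0.7:

```python
# Numerical illustration: probability matching versus always predicting the
# likelier outcome, for an outcome that occurs with probability p.
import random

def simulate(p=0.7, trials=100_000, seed=1):
    rng = random.Random(seed)
    match_hits = maximize_hits = 0
    for _ in range(trials):
        outcome = rng.random() < p                 # True with probability p
        match_prediction = rng.random() < p        # matching strategy
        maximize_prediction = True                 # always predict the likelier outcome
        match_hits += (match_prediction == outcome)
        maximize_hits += (maximize_prediction == outcome)
    return match_hits / trials, maximize_hits / trials

print(simulate())  # roughly (0.58, 0.70) for p = 0.7
```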

  1. Assessing Schematic Knowledge of Introductory Probability Theory

    ERIC Educational Resources Information Center

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  2. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  3. Probability & Perception: The Representativeness Heuristic in Action

    ERIC Educational Resources Information Center

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  4. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  5. Probable Bright Supernovae discovered by PSST

    NASA Astrophysics Data System (ADS)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-01-01

    Three bright transients, which are probable supernovae, have been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  6. Probable Bright Supernova discovered by PSST

    NASA Astrophysics Data System (ADS)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-09-01

    A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  7. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  8. Confusion between Odds and Probability, a Pandemic?

    ERIC Educational Resources Information Center

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  9. Posterior Probabilities for a Consensus Ordering.

    ERIC Educational Resources Information Center

    Fligner, Michael A.; Verducci, Joseph S.

    1990-01-01

    The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)

  10. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
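
    As a point of reference for the baseline the abstract argues can be improved upon, the count-based ("rote") estimate of a conditional probability table can be sketched as below. The toy records, column names, and Laplace smoothing constant are assumptions for illustration, not the paper's car-insurance network.

```python
# Count-based ("rote") conditional probability table: P(child | parents)
# estimated from co-occurrence counts with Laplace smoothing.
from collections import Counter

def learn_cpt(records, parent_cols, child_col, child_values, alpha=1.0):
    joint = Counter()
    parent_counts = Counter()
    for row in records:
        parents = tuple(row[c] for c in parent_cols)
        joint[(parents, row[child_col])] += 1
        parent_counts[parents] += 1
    cpt = {}
    for parents, total in parent_counts.items():
        denom = total + alpha * len(child_values)
        cpt[parents] = {v: (joint[(parents, v)] + alpha) / denom for v in child_values}
    return cpt

# Hypothetical toy data.
data = [
    {"age": "young", "claims": "high"},
    {"age": "young", "claims": "high"},
    {"age": "young", "claims": "low"},
    {"age": "old", "claims": "low"},
]
print(learn_cpt(data, parent_cols=["age"], child_col="claims", child_values=["high", "low"]))
```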

  11. Probability & Statistics: Modular Learning Exercises. Student Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  12. Quantum temporal probabilities in tunneling systems

    NASA Astrophysics Data System (ADS)

    Anastopoulos, Charis; Savvidou, Ntina

    2013-09-01

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines 'classical' time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems.

  13. Teaching Mathematics with Technology: Probability Simulations.

    ERIC Educational Resources Information Center

    Bright, George W.

    1989-01-01

    Discussed is the use of probability simulations in a mathematics classroom. Computer simulations using regular dice and special dice are described. Sample programs used to generate 100 rolls of a pair of dice in the BASIC and Logo languages are provided. (YP)
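
    The article's listings are in BASIC and Logo and are not reproduced here; a Python analogue of the same classroom simulation (100 rolls of a pair of dice, tallying the sums) might look like:

```python
# 100 rolls of a pair of dice and the observed frequency of each sum.
import random
from collections import Counter

rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100)]
frequencies = Counter(rolls)
for total in range(2, 13):
    print(total, frequencies.get(total, 0))
```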

  14. Conceptual Variation and Coordination in Probability Reasoning

    ERIC Educational Resources Information Center

    Nilsson, Per

    2009-01-01

    This study investigates students' conceptual variation and coordination among theoretical and experimental interpretations of probability. In the analysis we follow how Swedish students (12-13 years old) interact with a dice game, specifically designed to offer the students opportunities to elaborate on the logic of sample space,…

  15. Probability in Action: The Red Traffic Light

    ERIC Educational Resources Information Center

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  16. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E1,…

  17. Large Deviations: Advanced Probability for Undergrads

    ERIC Educational Resources Information Center

    Rolls, David A.

    2007-01-01

    In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…

  18. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  19. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri

    USGS Publications Warehouse

    Southard, Rodney E.; Veilleux, Andrea G.

    2014-01-01

    similar and related to three primary physiographic provinces. The final regional regression analyses resulted in three sets of equations. For Regions 1 and 2, the basin characteristics of drainage area and basin shape factor were statistically significant. For Region 3, because of the small amount of data from streamgages, only drainage area was statistically significant. Average standard errors of prediction ranged from 28.7 to 38.4 percent for flood Region 1, 24.1 to 43.5 percent for flood Region 2, and 25.8 to 30.5 percent for Region 3. The regional regression equations are only applicable to stream sites in Missouri with flows not significantly affected by regulation, channelization, backwater, diversion, or urbanization. Basins with about 5 percent or less impervious area were considered to be rural. Applicability of the equations is limited to basins with drainage areas from 0.11 to 8,212.38 square miles (mi2) and basin shape factors from 2.25 to 26.59 for Region 1, drainage areas from 0.17 to 4,008.92 mi2 and basin shape factors from 2.04 to 26.89 for Region 2, and drainage areas from 2.12 to 2,177.58 mi2 for Region 3. Annual peak data from streamgages were used to qualitatively assess the largest floods recorded at streamgages in Missouri since the 1915 water year. Based on existing streamgage data, the 1983 flood event was the largest flood event on record since 1915. The next five largest flood events, in descending order, took place in 1993, 1973, 2008, 1994, and 1915. Since 1915, five of the six largest floods on record occurred from 1973 to 2012.

  20. Quantum temporal probabilities in tunneling systems

    SciTech Connect

    Anastopoulos, Charis Savvidou, Ntina

    2013-09-15

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines ‘classical’ time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems. -- Highlights: •Present a general methodology for deriving temporal probabilities in tunneling systems. •Treatment applies to relativistic particles interacting through quantum fields. •Derive a new expression for tunneling time. •Identify new time parameters relevant to tunneling. •Propose a resolution of the superluminality paradox in tunneling.

  1. The albedo effect on neutron transmission probability.

    PubMed

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight and that after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel, we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
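
    A heavily simplified sketch in the spirit of the abstract is given below: a one-dimensional homogeneous slab with thickness measured in mean free paths, where the particle's statistical weight is multiplied by the scattering probability Ps at every collision instead of killing the particle on absorption. The vacuum channel, the albedo boundary condition, and the exponential biasing of the paper are all omitted, so this illustrates the statistical-weight idea only.

```python
# Simplified 1-D slab transmission Monte Carlo with statistical weights:
# free paths in units of the mean free path, survival weight *= Ps per collision.
import math
import random

def transmission_probability(thickness=20.0, ps=0.9, histories=20_000, seed=2):
    rng = random.Random(seed)
    transmitted_weight = 0.0
    for _ in range(histories):
        x, mu, weight = 0.0, 1.0, 1.0                  # position, direction cosine, weight
        while weight > 1e-6:
            x += mu * (-math.log(1.0 - rng.random()))  # exponential free path
            if x >= thickness:
                transmitted_weight += weight           # leaks out of the right face
                break
            if x < 0.0:
                break                                  # leaks back out of the left face
            weight *= ps                               # survival weight at the collision
            mu = rng.uniform(-1.0, 1.0)                # isotropic re-emission
    return transmitted_weight / histories

print(transmission_probability())
```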

  2. Beaufortian stratigraphic plays in the National Petroleum Reserve - Alaska (NPRA)

    USGS Publications Warehouse

    Houseknecht, David W.

    2003-01-01

    The Beaufortian megasequence in the National Petroleum Reserve in Alaska (NPRA) includes Jurassic through lower Cretaceous (Neocomian) strata of the Kingak Shale and the overlying pebble shale unit. These strata are part of a composite total petroleum system involving hydrocarbons expelled from source rocks in three stratigraphic intervals, the Lower Jurassic part of the Kingak Shale, the Triassic Shublik Formation, and the Lower Cretaceous gamma-ray zone (GRZ) and associated strata. The potential for undiscovered oil and gas resources in the Beaufortian megasequence in NPRA was assessed by defining eight plays (assessment units), two in lower Cretaceous (Neocomian) topset seismic facies, four in Upper Jurassic topset seismic facies, one in Lower Jurassic topset seismic facies, and one in Jurassic through lower Cretaceous (Neocomian) clinoform seismic facies. The Beaufortian Cretaceous Topset North Play is estimated to contain between 0 (95-percent probability) and 239 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 103 million barrels. The Beaufortian Cretaceous Topset North Play is estimated to contain between 0 (95-percent probability) and 1,162 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 405 billion cubic feet. The Beaufortian Cretaceous Topset South Play is estimated to contain between 635 (95-percent probability) and 4,004 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 2,130 billion cubic feet. No technically recoverable oil is assessed in the Beaufortian Cretaceous Topset South Play, as it lies at depths that are entirely in the gas window. The Beaufortian Upper Jurassic Topset Northeast Play is estimated to contain between 2,744 (95-percent probability) and 8,086 (5-percent probability) million barrels of technically recoverable oil

  3. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    ERIC Educational Resources Information Center

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…

  4. Killeen's Probability of Replication and Predictive Probabilities: How to Compute, Use, and Interpret Them

    ERIC Educational Resources Information Center

    Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques

    2010-01-01

    P. R. Killeen's (2005a) probability of replication ("p[subscript rep]") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p[subscript rep]" is now routinely reported in "Psychological Science" and has also begun to appear in other journals. However, there is…

  5. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    ERIC Educational Resources Information Center

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  6. You Say "Probable" and I Say "Likely": Improving Interpersonal Communication With Verbal Probability Phrases

    ERIC Educational Resources Information Center

    Karelitz, Tzur M.; Budescu, David V.

    2004-01-01

    When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…

  7. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    ERIC Educational Resources Information Center

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  8. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    SciTech Connect

    Stewart, Jeffrey S.

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  9. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for ''closure'' of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive ''eruption'' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  10. Brookian stratigraphic plays in the National Petroleum Reserve - Alaska (NPRA)

    USGS Publications Warehouse

    Houseknecht, David W.

    2003-01-01

    The Brookian megasequence in the National Petroleum Reserve in Alaska (NPRA) includes bottomset and clinoform seismic facies of the Torok Formation (mostly Albian age) and generally coeval, topset seismic facies of the uppermost Torok Formation and the Nanushuk Group. These strata are part of a composite total petroleum system involving hydrocarbons expelled from three stratigraphic intervals of source rocks, the Lower Cretaceous gamma-ray zone (GRZ), the Lower Jurassic Kingak Shale, and the Triassic Shublik Formation. The potential for undiscovered oil and gas resources in the Brookian megasequence in NPRA was assessed by defining five plays (assessment units), one in the topset seismic facies and four in the bottomset-clinoform seismic facies. The Brookian Topset Play is estimated to contain between 60 (95-percent probability) and 465 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 239 million barrels. The Brookian Topset Play is estimated to contain between 0 (95-percent probability) and 679 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 192 billion cubic feet. The Brookian Clinoform North Play, which extends across northern NPRA, is estimated to contain between 538 (95-percent probability) and 2,257 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 1,306 million barrels. The Brookian Clinoform North Play is estimated to contain between 0 (95-percent probability) and 1,969 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 674 billion cubic feet. The Brookian Clinoform Central Play, which extends across central NPRA, is estimated to contain between 299 (95-percent probability) and 1,849 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 973

  11. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.

  12. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, …, m_i10) denote the credit ratings of the ten companies in the i-th quarter. The vector m_i+1 in the next quarter is modelled as dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) of getting m_i+1,j = l given that m_i,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies gives an indication of the possible transition of the credit rating of the j-th company in the near future.
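
    The paper derives the transition probabilities from a 20-dimensional power-normal mixture, which is not reproduced here. Purely to fix the notation P_kl, a much simpler count-based estimator of a rating transition matrix, applied to hypothetical quarterly ratings, is sketched below:

```python
# Count-based estimate of a rating transition matrix (not the paper's
# power-normal mixture approach).
import numpy as np

def empirical_transition_matrix(rating_series, n_ratings):
    counts = np.zeros((n_ratings, n_ratings))
    for series in rating_series:                  # one series per company
        for current, nxt in zip(series[:-1], series[1:]):
            counts[current, nxt] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Hypothetical quarterly ratings (0 = best) for two companies.
series = [[0, 0, 1, 1, 2, 1], [1, 1, 1, 0, 0, 0]]
print(empirical_transition_matrix(series, n_ratings=3))
```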

  13. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction and is less certain the farther in advance the prediction is made, however. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
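
    The ingredients described above can be illustrated with a small sketch: the two (assumed Gaussian) position-error covariances are summed into a single covariance of the relative position, and the conflict probability is the probability that the miss distance falls below a protected radius. The paper derives an analytical solution; the sketch below simply samples, and all numbers are hypothetical.

```python
# Monte Carlo estimate of conflict probability from a combined relative-position
# covariance (illustrative only; the paper uses an analytical solution).
import numpy as np

def conflict_probability(rel_mean, cov1, cov2, protected_radius, samples=200_000, seed=3):
    rng = np.random.default_rng(seed)
    combined_cov = np.asarray(cov1) + np.asarray(cov2)   # covariance of the relative position
    pts = rng.multivariate_normal(rel_mean, combined_cov, size=samples)
    return float(np.mean(np.linalg.norm(pts, axis=1) < protected_radius))

# Hypothetical numbers: 6-unit predicted miss distance, 5-unit protected radius.
print(conflict_probability(rel_mean=[6.0, 0.0],
                           cov1=[[4.0, 0.0], [0.0, 1.0]],
                           cov2=[[3.0, 0.0], [0.0, 2.0]],
                           protected_radius=5.0))
```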

  14. A quantum probability perspective on borderline vagueness.

    PubMed

    Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter

    2013-10-01

    The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental finding and extends Alxatib's and Pelletier's () theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can be explained then as a quantum interference phenomenon. PMID:24039093

  15. Approximate probability distributions of the master equation

    NASA Astrophysics Data System (ADS)

    Thomas, Philipp; Grima, Ramon

    2015-07-01

    Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.

  16. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on a choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  17. Cheating Probabilities on Multiple Choice Tests

    NASA Astrophysics Data System (ADS)

    Rizzuto, Gaspard T.; Walters, Fred

    1997-10-01

    This paper is strictly based on mathematical statistics; as such it does not depend on prior performance and assumes the probability of each choice to be identical. In a real-life situation, the probability of two students having identical responses becomes larger the better the students are. However, the mathematical model is developed for all responses, both correct and incorrect, and provides a baseline for evaluation. David Harpp and coworkers (2, 3) at McGill University have evaluated ratios of exact errors in common (EEIC) to errors in common (EIC) and differences (D). In pairings where the ratio EEIC/EIC was greater than 0.75, the pair had unusually high odds against their answer pattern being random. Detection of copying via EEIC/D ratios at values >1.0 indicates that pairs of these students were seated adjacent to one another and copied from one another. The original papers should be examined for details.
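
    Under the usual reading of the quantities named above (EIC: items both students answered incorrectly; EEIC: items where both chose the same wrong option; D: items where their answers differ), the indices can be computed as in the sketch below; the answer strings are made up.

```python
# Compute EIC, EEIC, D and the two ratios from two answer strings and a key.
def copying_indices(answers_a, answers_b, key):
    eic = eeic = d = 0
    for a, b, k in zip(answers_a, answers_b, key):
        if a != b:
            d += 1
        if a != k and b != k:
            eic += 1
            if a == b:
                eeic += 1
    return {"EEIC": eeic, "EIC": eic, "D": d,
            "EEIC/EIC": eeic / eic if eic else float("nan"),
            "EEIC/D": eeic / d if d else float("inf")}

print(copying_indices("ABCDAACBDD", "ABCDABCBDD", "ABCDABCDAA"))
```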

  18. A quantum probability perspective on borderline vagueness.

    PubMed

    Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter

    2013-10-01

    The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental finding and extends Alxatib's and Pelletier's () theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can be explained then as a quantum interference phenomenon.

  19. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  20. Nuclear data uncertainties: I, Basic concepts of probability

    SciTech Connect

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
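
    As a minimal illustration of the conditional-probability and Bayes' theorem material listed above, the following Python sketch (with toy numbers of our own, not taken from the report) updates a prior over two hypotheses after a single observation.

```python
# Minimal Bayes' theorem sketch; priors and likelihoods are illustrative, not from the report.
priors = {"H1": 0.7, "H2": 0.3}          # P(H)
likelihoods = {"H1": 0.2, "H2": 0.9}     # P(data | H) for one observed event

evidence = sum(priors[h] * likelihoods[h] for h in priors)               # P(data)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}  # Bayes' theorem

for h, p in posteriors.items():
    print(f"P({h} | data) = {p:.3f}")
```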

  1. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-09-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, also known as Popescu's and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, our approach allows straightforward generalization to more complicated systems.

  2. Probability of photoassociation from a quasicontinuum approach

    NASA Astrophysics Data System (ADS)

    Javanainen, Juha; Mackie, Matt

    1998-08-01

    We examine photoassociation by using a quasicontinuum to describe the colliding atoms. The quasicontinuum system is analyzed using methods adapted from the theory of laser spectroscopy and quantum optics, and a continuum limit is then taken. In a degenerate gas the equilibrium probability of photoassociation may be close to unity. In the continuum limit, for a thermal atomic sample, the stimulated Raman adiabatic passage (STIRAP) mechanism cannot be employed to eliminate unwanted spontaneous transitions.

  3. Neural coding of uncertainty and probability.

    PubMed

    Ma, Wei Ji; Jazayeri, Mehrdad

    2014-01-01

    Organisms must act in the face of sensory, motor, and reward uncertainty stemming from a pandemonium of stochasticity and missing information. In many tasks, organisms can make better decisions if they have at their disposal a representation of the uncertainty associated with task-relevant variables. We formalize this problem using Bayesian decision theory and review recent behavioral and neural evidence that the brain may use knowledge of uncertainty, confidence, and probability.

  4. Computational methods for probability of instability calculations

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system than can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the roots of the characteristics equation or Routh-Hurwitz test functions are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
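
    A brute-force Monte Carlo sketch of the root-based instability criterion mentioned above is shown below; the toy second-order system and the parameter distributions are our own assumptions, and the sketch does not reproduce the reliability-analysis or importance-sampling machinery of the paper.

```python
# Monte Carlo estimate of the probability of instability for a second-order system
#   m*x'' + c*x' + k*x = 0, with characteristic equation m*s^2 + c*s + k = 0.
# The system is unstable if any root has a positive real part.
# Parameter distributions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 20_000

m = rng.normal(1.0, 0.05, n_samples)    # mass
c = rng.normal(0.02, 0.05, n_samples)   # damping (can go negative -> unstable roots)
k = rng.normal(4.0, 0.2, n_samples)     # stiffness

unstable = 0
for mi, ci, ki in zip(m, c, k):
    roots = np.roots([mi, ci, ki])
    if np.any(roots.real > 0.0):
        unstable += 1

print(f"Estimated probability of instability: {unstable / n_samples:.4f}")
```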

  5. Probability and Statistics in Aerospace Engineering

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  6. Neural coding of uncertainty and probability.

    PubMed

    Ma, Wei Ji; Jazayeri, Mehrdad

    2014-01-01

    Organisms must act in the face of sensory, motor, and reward uncertainty stemming from a pandemonium of stochasticity and missing information. In many tasks, organisms can make better decisions if they have at their disposal a representation of the uncertainty associated with task-relevant variables. We formalize this problem using Bayesian decision theory and review recent behavioral and neural evidence that the brain may use knowledge of uncertainty, confidence, and probability. PMID:25032495

  7. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a javascript object notation format.

  8. Understanding Deutsch's probability in a deterministic multiverse

    NASA Astrophysics Data System (ADS)

    Greaves, H.

    2004-09-01

    Difficulties over probability have often been considered fatal to the Everett interpretation of quantum mechanics. Here I argue that the Everettian can have everything she needs from 'probability' without recourse to indeterminism, ignorance, primitive identity over time or subjective uncertainty: all she needs is a particular rationality principle. The decision-theoretic approach recently developed by Deutsch and Wallace claims to provide just such a principle. But, according to Wallace, decision theory is itself applicable only if the correct attitude to a future Everettian measurement outcome is subjective uncertainty. I argue that subjective uncertainty is not available to the Everettian, but I offer an alternative: we can justify the Everettian application of decision theory on the basis that an Everettian should care about all her future branches. The probabilities appearing in the decision-theoretic representation theorem can then be interpreted as the degrees to which the rational agent cares about each future branch. This reinterpretation, however, reduces the intuitive plausibility of one of the Deutsch-Wallace axioms (measurement neutrality).

  9. The Probability Distribution of Daily Streamflow

    NASA Astrophysics Data System (ADS)

    Blum, A.; Vogel, R. M.

    2015-12-01

    Flow duration curves (FDCs) are a graphical illustration of the cumulative distribution of streamflow. Daily streamflows often range over many orders of magnitude, making it extremely challenging to find a probability distribution function (pdf) which can mimic the steady state or period of record FDC (POR-FDC). Median annual FDCs (MA-FDCs) describe the pdf of daily streamflow in a typical year. For POR- and MA-FDCs, L-moment diagrams, visual assessments of FDCs and Quantile-Quantile probability plot correlation coefficients are used to evaluate goodness of fit (GOF) of candidate probability distributions. FDCs reveal that both four-parameter kappa (KAP) and three-parameter generalized Pareto (GP3) models result in very high GOF for the MA-FDC and a relatively lower GOF for POR-FDCs at over 500 rivers across the coterminous U.S. Physical basin characteristics, such as baseflow index as well as hydroclimatic indices such as the aridity index and the runoff ratio are found to be correlated with one of the shape parameters (kappa) of the KAP and GP3 pdfs. Our work also reveals several important areas for future research including improved parameter estimators for the KAP pdf, as well as increasing our understanding of the conditions which give rise to improved GOF of analytical pdfs to large samples of daily streamflows.

  10. Detection probabilities in fuel cycle oriented safeguards

    SciTech Connect

    Canty, J.J.; Stein, G.; Avenhaus, R.

    1987-01-01

    An intensified discussion of evaluation criteria for International Atomic Energy Agency (IAEA) safeguards effectiveness is currently under way. Considerations basic to the establishment of such criteria are derived from the model agreement INFCIRC/153 and include threshold amounts, strategic significance, conversion times, required assurances, cost-effectiveness, and nonintrusiveness. In addition to these aspects, the extent to which fuel cycle characteristics are taken into account in safeguards implementations (Article 81c of INFCIRC/153) will be reflected in the criteria. The effectiveness of safeguards implemented under given manpower constraints is evaluated. As the significant quantity and timeliness criteria have established themselves within the safeguards community, these are taken as fixed. Detection probabilities, on the other hand, still provide a certain degree of freedom in interpretation. The problem of randomization of inspection activities across a fuel cycle, or portions thereof, is formalized as a two-person zero-sum game, the payoff function of which is the detection probability achieved by the inspectorate. It is argued, from the point of view of risk of detection, that fuel cycle-independent, minimally accepted threshold criteria for such detection probabilities cannot and should not be applied.

  11. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  12. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    NASA Technical Reports Server (NTRS)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

    The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of +/- 30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  13. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collision between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell and cells that do not undergo the threshold number of collisions are expected to survive.
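
    The collision-threshold hypothesis above lends itself to a simple Poisson sketch: if collisions with biocide-loaded micelles occur at a rate proportional to filter thickness, and a bacterium is killed only after a threshold number of collisions, then survival probability drops sharply with the number of filter layers. The rate, threshold, and Poisson assumption below are ours and purely illustrative.

```python
# Illustrative Poisson collision-threshold model for bacterial survival in a biocidal filter.
# Assumes collisions ~ Poisson(rate * layers) and death after k_threshold or more collisions.
# All parameter values are hypothetical.
from scipy.stats import poisson

rate_per_layer = 3.0   # mean micelle collisions per filter layer (assumed)
k_threshold = 5        # collisions needed to transfer a lethal biocide dose (assumed)

for layers in (1, 2, 3, 4):
    mean_collisions = rate_per_layer * layers
    survival = poisson.cdf(k_threshold - 1, mean_collisions)  # P(fewer than k_threshold collisions)
    print(f"{layers} layer(s): P(survival) = {survival:.4f}")
```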

  14. How to Bring Solar Energy to Seven Billion People (LBNL Science at the Theater)

    ScienceCinema

    Wadia, Cyrus

    2016-07-12

    By exploiting the powers of nanotechnology and taking advantage of non-toxic, Earth-abundant materials, Berkeley Lab's Cyrus Wadia has fabricated new solar cell devices that have the potential to be several orders of magnitude less expensive than conventional solar cells. And by mastering the chemistry of these materials-and the economics of solar energy-he envisions bringing electricity to the 1.2 billion people now living without it.

  15. How to Bring Solar Energy to Seven Billion People (LBNL Science at the Theater)

    SciTech Connect

    Wadia, Cyrus

    2009-04-06

    By exploiting the powers of nanotechnology and taking advantage of non-toxic, Earth-abundant materials, Berkeley Lab's Cyrus Wadia has fabricated new solar cell devices that have the potential to be several orders of magnitude less expensive than conventional solar cells. And by mastering the chemistry of these materials-and the economics of solar energy-he envisions bringing electricity to the 1.2 billion people now living without it.

  16. Instability of Wave Trains and Wave Probabilities

    NASA Astrophysics Data System (ADS)

    Babanin, Alexander

    2013-04-01

    Design criteria in ocean engineering, whether for a one-in-50-years or a one-in-5000-years event, are hardly ever based on measurements, but rather on statistical distributions of relevant metocean properties. Of utmost interest is the tail of the distribution, that is, rare events such as the highest waves with low probability. Engineers have long since realised that the superposition of linear waves with a narrow-banded spectrum, as described by the Rayleigh distribution, underestimates the probability of extreme wave heights and crests, which is a critical shortcoming as far as engineering design is concerned. Ongoing theoretical and experimental efforts have been under way for decades to address this issue. The typical approach is to treat all possible waves in the ocean, or at a particular location, as a single ensemble for which some comprehensive solution can be obtained. Oceanographic knowledge, however, now indicates that no single, unified comprehensive solution is available. We would expect the probability distributions of wave height to depend on a) whether the waves are at the spectral peak or at the tail; b) the wave spectrum and mean steepness in the wave field; c) the directional distribution of the peak waves; d) whether the waves are in deep water, intermediate depth or shallow water; e) wave breaking; f) the wind, particularly if it is very strong, and the currents if they have suitable horizontal gradients. Probability distributions in the different circumstances corresponding to these groups of conditions should be different, and combining them together introduces inevitable scatter. The scatter and the accuracy will not improve by increasing bulk data quality and quantity, and it hides the actual distribution of extremes. The groups have to be separated and their probability
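
    For reference, the narrow-banded linear baseline mentioned above is the Rayleigh distribution of wave heights, under which the exceedance probability of a height H is exp(-2(H/Hs)^2), with Hs the significant wave height. The short sketch below simply evaluates that baseline for a few heights (the values are illustrative); the abstract's point is precisely that observed extreme waves exceed these probabilities.

```python
# Rayleigh exceedance probability for wave heights, P(H > h) = exp(-2 * (h / Hs)**2).
# This is the narrow-banded linear baseline the abstract says underestimates extremes.
import math

Hs = 4.0  # significant wave height in metres (illustrative)
for h in (4.0, 6.0, 8.0):  # 1.0*Hs, 1.5*Hs, 2.0*Hs
    p_exceed = math.exp(-2.0 * (h / Hs) ** 2)
    print(f"P(H > {h:.1f} m) = {p_exceed:.2e}")
```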

  17. Two ten-billion-solar-mass black holes at the centres of giant elliptical galaxies.

    PubMed

    McConnell, Nicholas J; Ma, Chung-Pei; Gebhardt, Karl; Wright, Shelley A; Murphy, Jeremy D; Lauer, Tod R; Graham, James R; Richstone, Douglas O

    2011-12-01

    Observational work conducted over the past few decades indicates that all massive galaxies have supermassive black holes at their centres. Although the luminosities and brightness fluctuations of quasars in the early Universe suggest that some were powered by black holes with masses greater than 10 billion solar masses, the remnants of these objects have not been found in the nearby Universe. The giant elliptical galaxy Messier 87 hosts the hitherto most massive known black hole, which has a mass of 6.3 billion solar masses. Here we report that NGC 3842, the brightest galaxy in a cluster at a distance from Earth of 98 megaparsecs, has a central black hole with a mass of 9.7 billion solar masses, and that a black hole of comparable or greater mass is present in NGC 4889, the brightest galaxy in the Coma cluster (at a distance of 103 megaparsecs). These two black holes are significantly more massive than predicted by linearly extrapolating the widely used correlations between black-hole mass and the stellar velocity dispersion or bulge luminosity of the host galaxy. Although these correlations remain useful for predicting black-hole masses in less massive elliptical galaxies, our measurements suggest that different evolutionary processes influence the growth of the largest galaxies and their black holes.

  18. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    PubMed

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes. PMID:27049949

  19. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    PubMed

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  20. Probability sampling in legal cases: Kansas cellphone users

    NASA Astrophysics Data System (ADS)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  1. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
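
    A minimal sketch of the kind of conditional probability such a tool computes is shown below: the probability of observing an ecological impairment given that a stressor exceeds a threshold, estimated from paired observations. The data and threshold are invented for illustration; this is not the CPROB implementation itself.

```python
# Illustrative conditional probability: P(impairment | stressor > threshold),
# estimated from paired (stressor, impairment) field observations. Data are invented.
import numpy as np

stressor = np.array([0.2, 0.5, 0.9, 1.3, 1.8, 2.4, 3.1, 3.7, 4.2, 5.0])
impaired = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

threshold = 1.0
exceeds = stressor > threshold
p_conditional = impaired[exceeds].mean()   # P(impairment | stressor > threshold)
p_marginal = impaired.mean()               # unconditional P(impairment)

print(f"P(impairment | stressor > {threshold}) = {p_conditional:.2f}")
print(f"P(impairment)                        = {p_marginal:.2f}")
```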

  2. Air density 2.7 billion years ago limited to less than twice modern levels by fossil raindrop imprints.

    PubMed

    Som, Sanjoy M; Catling, David C; Harnmeijer, Jelte P; Polivka, Peter M; Buick, Roger

    2012-04-19

    According to the 'Faint Young Sun' paradox, during the late Archaean eon a Sun approximately 20% dimmer warmed the early Earth such that it had liquid water and a clement climate. Explanations for this phenomenon have invoked a denser atmosphere that provided warmth by nitrogen pressure broadening or enhanced greenhouse gas concentrations. Such solutions are allowed by geochemical studies and numerical investigations that place approximate concentration limits on Archaean atmospheric gases, including methane, carbon dioxide and oxygen. But no field data constraining ground-level air density and barometric pressure have been reported, leaving the plausibility of these various hypotheses in doubt. Here we show that raindrop imprints in tuffs of the Ventersdorp Supergroup, South Africa, constrain surface air density 2.7 billion years ago to less than twice modern levels. We interpret the raindrop fossils using experiments in which water droplets of known size fall at terminal velocity into fresh and weathered volcanic ash, thus defining a relationship between imprint size and raindrop impact momentum. Fragmentation following raindrop flattening limits raindrop size to a maximum value independent of air density, whereas raindrop terminal velocity varies as the inverse of the square root of air density. If the Archaean raindrops reached the modern maximum measured size, air density must have been less than 2.3 kg m(-3), compared to today's 1.2 kg m(-3), but because such drops rarely occur, air density was more probably below 1.3 kg m(-3). The upper estimate for air density renders the pressure broadening explanation possible, but it is improbable under the likely lower estimates. Our results also disallow the extreme CO(2) levels required for hot Archaean climates. PMID:22456703

  3. Air density 2.7 billion years ago limited to less than twice modern levels by fossil raindrop imprints.

    PubMed

    Som, Sanjoy M; Catling, David C; Harnmeijer, Jelte P; Polivka, Peter M; Buick, Roger

    2012-04-19

    According to the 'Faint Young Sun' paradox, during the late Archaean eon a Sun approximately 20% dimmer warmed the early Earth such that it had liquid water and a clement climate. Explanations for this phenomenon have invoked a denser atmosphere that provided warmth by nitrogen pressure broadening or enhanced greenhouse gas concentrations. Such solutions are allowed by geochemical studies and numerical investigations that place approximate concentration limits on Archaean atmospheric gases, including methane, carbon dioxide and oxygen. But no field data constraining ground-level air density and barometric pressure have been reported, leaving the plausibility of these various hypotheses in doubt. Here we show that raindrop imprints in tuffs of the Ventersdorp Supergroup, South Africa, constrain surface air density 2.7 billion years ago to less than twice modern levels. We interpret the raindrop fossils using experiments in which water droplets of known size fall at terminal velocity into fresh and weathered volcanic ash, thus defining a relationship between imprint size and raindrop impact momentum. Fragmentation following raindrop flattening limits raindrop size to a maximum value independent of air density, whereas raindrop terminal velocity varies as the inverse of the square root of air density. If the Archaean raindrops reached the modern maximum measured size, air density must have been less than 2.3 kg m(-3), compared to today's 1.2 kg m(-3), but because such drops rarely occur, air density was more probably below 1.3 kg m(-3). The upper estimate for air density renders the pressure broadening explanation possible, but it is improbable under the likely lower estimates. Our results also disallow the extreme CO(2) levels required for hot Archaean climates.

  4. [Subjective probability of reward receipt and the magnitude effect in probability discounting].

    PubMed

    Isomura, Mieko; Aoyama, Kenjiro

    2008-06-01

    Previous research suggested that larger probabilistic rewards were discounted more steeply than smaller probabilistic rewards (the magnitude effect). This research tests the hypothesis that the magnitude effect reflects the extent to which individuals distrust the stated probability of receiving different amounts of rewards. The participants were 105 college students. Probability discounting of two different amounts of rewards (5 000 yen and 100 000 yen) and the subjective probability of reward receipt of the different amounts (5 000 yen, 100 000 yen and 1 000 000 yen) were measured. The probabilistic 100 000 yen was discounted more steeply than the probabilistic 5 000 yen. The subjective probability of reward receipt was higher in the 5 000 yen than in the 100 000 yen condition. The proportion of subjective probability of receiving 5 000 yen to that of receiving 100 000 yen was significantly correlated with the proportion of degree of probability discounting for 5 000 yen to that for 100 000 yen. These results were consistent with the hypothesis stated above.
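
    Probability discounting of the kind measured above is commonly summarised with a hyperbolic function of the odds against receipt, V = A / (1 + h*theta) with theta = (1 - p)/p, where a larger h means steeper discounting. In the sketch below (parameter values are ours, purely for illustration) a larger h for the larger amount reproduces the magnitude effect described in the abstract.

```python
# Hyperbolic probability-discounting sketch: V = A / (1 + h * theta), theta = (1 - p) / p.
# A larger discounting parameter h for the larger reward illustrates the magnitude effect.
# Parameter values are illustrative only.
def discounted_value(amount, p, h):
    theta = (1.0 - p) / p            # odds against receiving the reward
    return amount / (1.0 + h * theta)

p = 0.5                              # stated probability of receipt
h_small, h_large = 1.0, 3.0          # assumed: steeper discounting for the larger amount

v_small = discounted_value(5_000, p, h_small)
v_large = discounted_value(100_000, p, h_large)
print(f"5 000 yen gamble valued at   {v_small:8.0f} yen ({v_small / 5_000:.0%} of face value)")
print(f"100 000 yen gamble valued at {v_large:8.0f} yen ({v_large / 100_000:.0%} of face value)")
```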

  5. On the universality of knot probability ratios

    NASA Astrophysics Data System (ADS)

    Janse van Rensburg, E. J.; Rechnitzer, A.

    2011-04-01

    Let $p_n$ denote the number of self-avoiding polygons of length $n$ on a regular three-dimensional lattice, and let $p_n(K)$ be the number which have knot type $K$. The probability that a random polygon of length $n$ has knot type $K$ is $p_n(K)/p_n$ and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of $p_n(K)$, but there is substantial numerical evidence (Orlandini et al 1988 J. Phys. A: Math. Gen. 31 5953-67, Marcone et al 2007 Phys. Rev. E 75 41105, Rawdon et al 2008 Macromolecules 41 4444-51, Janse van Rensburg and Rechnitzer 2008 J. Phys. A: Math. Theor. 41 105002) that $p_n(K)$ grows as $p_n(K) \simeq C_K \mu_\emptyset^n n^{\alpha-3+N_K}$ as $n \rightarrow \infty$, where $N_K$ is the number of prime components of the knot type $K$. It is believed that the entropic exponent, $\alpha$, is universal, while the exponential growth rate, $\mu_\emptyset$, is independent of the knot type but varies with the lattice. The amplitude, $C_K$, depends on both the lattice and the knot type. The above asymptotic form implies that the relative probability of a random polygon of length $n$ having prime knot type $K$ over prime knot type $L$ is $\frac{p_n(K)/p_n}{p_n(L)/p_n} = \frac{p_n(K)}{p_n(L)} \simeq \frac{C_K}{C_L}$. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types $K$ and $L$. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot

  6. Snell Envelope with Small Probability Criteria

    SciTech Connect

    Del Moral, Pierre Hu, Peng; Oudjane, Nadia

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criteria to optimize is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criteria. The theoretical analysis of this new algorithm provides non asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  7. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
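
    A toy illustration of the histogram idea described above: two carrier "states" that differ in their sample-value histograms can be told apart at the receiver by binning the received samples and comparing against reference PDFs. Everything below (waveform choices, bin counts, decision rule) is an invented sketch, not the method of the NASA report.

```python
# Toy sketch of modulation by waveform probability density function (PDF):
#   symbol 0 -> sinusoid (arcsine-shaped amplitude histogram)
#   symbol 1 -> triangle wave (flat amplitude histogram)
# The receiver histograms the samples and picks the closest reference PDF.
# All design choices are illustrative.
import numpy as np

N = 2048
BINS = np.linspace(-1.0, 1.0, 21)

def waveform(symbol, n=N):
    t = np.linspace(0.0, 1.0, n, endpoint=False)
    if symbol == 0:
        return np.sin(2.0 * np.pi * 8.0 * t)                   # sinusoid
    return 2.0 * np.abs(2.0 * ((8.0 * t) % 1.0) - 1.0) - 1.0   # triangle wave

def hist_pdf(samples):
    counts, _ = np.histogram(samples, bins=BINS, density=True)
    return counts

REFERENCES = {s: hist_pdf(waveform(s)) for s in (0, 1)}

def demodulate(samples):
    h = hist_pdf(samples)
    return min(REFERENCES, key=lambda s: np.sum((h - REFERENCES[s]) ** 2))

rng = np.random.default_rng(1)
for sent in (0, 1, 1, 0):
    received = waveform(sent) + rng.normal(0.0, 0.05, N)   # mild additive noise
    print(f"sent {sent} -> demodulated {demodulate(received)}")
```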

  8. Symmetry, probability, and recognition in face space.

    PubMed

    Sirovich, Lawrence; Meytlis, Marsha

    2009-04-28

    The essential midline symmetry of human faces is shown to play a key role in facial coding and recognition. This also has deep and important connections with recent explorations of the organization of primate cortex, as well as human psychophysical experiments. Evidence is presented that the dimension of face recognition space for human faces is dramatically lower than previous estimates. One result of the present development is the construction of a probability distribution in face space that produces an interesting and realistic range of (synthetic) faces. Another is a recognition algorithm that by reasonable criteria is nearly 100% accurate.

  9. The Prediction of Spatial Aftershock Probabilities (PRESAP)

    NASA Astrophysics Data System (ADS)

    McCloskey, J.

    2003-12-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the great number of local, telemeter seismic networks, the rapid acquisition of data from satellites coupled with the speed of modern telecommunications and data transfer all mean that it may be possible that these new techniques could be applied in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real-time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days after the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real-time, as data of better quality become available over the following day to tens of days. Specifically, the project aim is to assess the

  10. Mapping probability of shipping sound exposure level.

    PubMed

    Gervaise, Cédric; Aulanier, Florian; Simard, Yvan; Roy, Nathalie

    2015-06-01

    Mapping vessel noise is emerging as one method of identifying areas where sound exposure due to shipping noise could have negative impacts on aquatic ecosystems. The probability distribution function (pdf) of sound exposure levels (SEL) is an important metric for identifying areas of concern. In this paper a probabilistic shipping SEL modeling method is described to obtain the pdf of SEL using the sonar equation and statistical relations linking the pdfs of ship traffic density, source levels, and transmission losses to their products and sums.
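
    In its simplest form, the sonar-equation relation underlying such a method is received level = source level - transmission loss (in dB); when SL and TL are described by pdfs, the pdf of the received level is the distribution of their difference. The Monte Carlo sketch below assumes normal distributions with invented parameters; it is not the probabilistic traffic model of the paper.

```python
# Monte Carlo sketch of a received-level pdf from the sonar equation RL = SL - TL (dB).
# Source-level and transmission-loss distributions are assumed normal; all values invented.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

source_level = rng.normal(180.0, 5.0, n)        # dB re 1 uPa at 1 m (assumed)
transmission_loss = rng.normal(60.0, 8.0, n)    # dB (assumed)

received_level = source_level - transmission_loss
threshold = 130.0                                # an arbitrary exposure level of interest, dB

print(f"mean RL = {received_level.mean():.1f} dB")
print(f"P(RL > {threshold:.0f} dB) = {np.mean(received_level > threshold):.3f}")
```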

  11. Probability of detection calculations using MATLAB

    NASA Astrophysics Data System (ADS)

    Wei, Yung-Chung

    1993-06-01

    A set of highly efficient computer programs based on Marcum's and Swerling's analyses of radar detection has been written in MATLAB to evaluate the probability of detection. The programs are based on accurate methods, unlike the detectability method, which is based on approximation. This thesis also outlines radar detection theory and target models as background. The goal of this effort is to provide a set of efficient computer programs for student use and as a teaching aid. The programs are designed to be user friendly and to run on personal computers.
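
    As a small illustration of the kind of exact computation referred to above, the single-pulse, nonfluctuating-target (Swerling 0) case can be written with a noncentral chi-squared survival function: with threshold T = -2 ln(Pfa) on the normalised square-law detector output, Pd = P(chi'^2(df=2, nc=2*SNR) > T). The Python/SciPy sketch below uses that standard relation; it is our own restatement, not the MATLAB programs from the thesis.

```python
# Single-pulse probability of detection for a nonfluctuating (Swerling 0) target,
# computed via the noncentral chi-squared survival function:
#   threshold T = -2 ln(Pfa),  Pd = P( chi'^2(df=2, nc=2*SNR) > T ).
# This is a restatement of the classic Marcum result, not the thesis code.
import numpy as np
from scipy.stats import ncx2

def pd_swerling0(snr_db, pfa):
    snr = 10.0 ** (snr_db / 10.0)
    threshold = -2.0 * np.log(pfa)
    return ncx2.sf(threshold, df=2, nc=2.0 * snr)

if __name__ == "__main__":
    pfa = 1e-6
    for snr_db in (8, 10, 12, 14):
        print(f"SNR = {snr_db:2d} dB, Pfa = {pfa:.0e} -> Pd = {pd_swerling0(snr_db, pfa):.3f}")
```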

  12. Assessment of potential oil and gas resources in source rocks of the Alaska North Slope, 2012

    USGS Publications Warehouse

    Houseknecht, David W.; Rouse, William A.; Garrity, Christopher P.; Whidden, Katherine J.; Dumoulin, Julie A.; Schenk, Christopher J.; Charpentier, Ronald R.; Cook, Troy A.; Gaswirth, Stephanie B.; Kirschbaum, Mark A.; Pollastro, Richard M.

    2012-01-01

    The U.S. Geological Survey estimated potential, technically recoverable oil and gas resources for source rocks of the Alaska North Slope. Estimates (95-percent to 5-percent probability) range from zero to 2 billion barrels of oil and from zero to nearly 80 trillion cubic feet of gas.

  13. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    NASA Astrophysics Data System (ADS)

    Tan, Elcin

    physically possible upper limits of precipitation due to climate change. The simulation results indicate that the meridional shift in atmospheric conditions is the optimum method to determine maximum precipitation in consideration of cost and efficiency. Finally, exceedance probability analyses of the model results of 42 historical extreme precipitation events demonstrate that the 72-hr basin averaged probable maximum precipitation is 21.72 inches for the exceedance probability of 0.5 percent. On the other hand, the current operational PMP estimation for the American River Watershed is 28.57 inches as published in the hydrometeorological report no. 59 and a previous PMP value was 31.48 inches as published in the hydrometeorological report no. 36. According to the exceedance probability analyses of this proposed method, the exceedance probabilities of these two estimations correspond to 0.036 percent and 0.011 percent, respectively.

  14. Uncertainty analysis for Probable Maximum Precipitation estimates

    NASA Astrophysics Data System (ADS)

    Micovic, Zoran; Schaefer, Melvin G.; Taylor, George H.

    2015-02-01

    An analysis of uncertainty associated with Probable Maximum Precipitation (PMP) estimates is presented. The focus of the study is firmly on PMP estimates derived through meteorological analyses and not on statistically derived PMPs. Theoretical PMP cannot be computed directly and operational PMP estimates are developed through a stepwise procedure using a significant degree of subjective professional judgment. This paper presents a methodology for portraying the uncertain nature of PMP estimation by analyzing individual steps within the PMP derivation procedure whereby for each parameter requiring judgment, a set of possible values is specified and accompanied by expected probabilities. The resulting range of possible PMP values can be compared with the previously derived operational single-value PMP, providing measures of the conservatism and variability of the original estimate. To our knowledge, this is the first uncertainty analysis conducted for a PMP derived through meteorological analyses. The methodology was tested on the La Joie Dam watershed in British Columbia. The results indicate that the commonly used single-value PMP estimate could be more than 40% higher when possible changes in various meteorological variables used to derive the PMP are considered. The findings of this study imply that PMP estimates should always be characterized as a range of values recognizing the significant uncertainties involved in PMP estimation. In fact, we do not know at this time whether precipitation is actually upper-bounded, and if precipitation is upper-bounded, how closely PMP estimates approach the theoretical limit.

  15. On the probability of matching DNA fingerprints.

    PubMed

    Risch, N J; Devlin, B

    1992-02-01

    Forensic scientists commonly assume that DNA fingerprint patterns are infrequent in the general population and that genotypes are independent across loci. To test these assumptions, the number of matching DNA patterns in two large databases from the Federal Bureau of Investigation (FBI) and from Lifecodes was determined. No deviation from independence across loci in either database was apparent. For the Lifecodes database, the probability of a three-locus match ranges from 1 in 6,233 in Caucasians to 1 in 119,889 in Blacks. When considering all trios of five loci in the FBI database, there was only a single match observed out of more than 7.6 million comparisons. If independence is assumed, the probability of a five-locus match ranged from 1.32 x 10(-12) in Southeast Hispanics to 5.59 x 10(-14) in Blacks, implying that the minimum number of possible patterns for each ethnic group is several orders of magnitude greater than their corresponding population sizes in the United States. The most common five-locus pattern can have a frequency no greater than about 10(-6). Hence, individual five-locus DNA profiles are extremely uncommon, if not unique. PMID:1738844
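
    Under the independence (product-rule) assumption examined above, a multi-locus match probability is simply the product of the single-locus genotype frequencies. The sketch below uses invented frequencies, not the FBI or Lifecodes figures.

```python
# Multi-locus DNA match probability under the independence (product-rule) assumption.
# Per-locus genotype frequencies below are invented for illustration.
locus_frequencies = [0.08, 0.05, 0.11, 0.03, 0.06]   # five hypothetical loci

p_match = 1.0
for f in locus_frequencies:
    p_match *= f

print(f"Five-locus match probability (independence assumed): {p_match:.2e}")
print(f"Roughly 1 in {1.0 / p_match:,.0f}")
```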

  16. Estimating flood exceedance probabilities in estuarine regions

    NASA Astrophysics Data System (ADS)

    Westra, Seth; Leonard, Michael

    2016-04-01

    Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence between these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic-scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury Nepean River (New South Wales).

  17. Probably maximum flood of the Sava River

    NASA Astrophysics Data System (ADS)

    Brilly, Mitja; Vidmar, Andrej; Šraj, Mojca

    2010-05-01

    The Nuclear Power Plant Krško (NEK) is situated on the left bank of the Sava River close to the border with Croatia. The Probable Maximum Flood at the location of the NEK could result from a combination of probable maximum precipitation, a sequential storm before the PMP, or snowmelt on the Sava River watershed. The Mediterranean climate is characterised by very high precipitation and a temporarily high snow pack. The HBV-96 model, as part of the Integrated Hydrological Modelling System (IHMS), was used for modelling. The model was first calibrated and verified at a daily time step for the period 1190-2006. Calibration and verification at an hourly time step were done for the period 1998-1999. The stream routing parameters were calibrated for the flood events in 1998 and 2007 and then verified for the flood event in 1990. Analysis of the discharge routing data showed that possible inundation of the Ljubljana and Savinja valleys was not properly estimated. The flood areas are protected with levees, and water did not spread over the flooded areas in the events used for calibration. Because the inundated areas in the Ljubljana and Savinja valleys are protected by levees, the model could not properly simulate inundation under the PMF. We recalibrated the parameters controlling inundation in those areas for the worst-case scenario. The calculated PMF values dropped tremendously after recalibration.

  18. Measures, Probability and Holography in Cosmology

    NASA Astrophysics Data System (ADS)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant $\Lambda$ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for $\Lambda$ may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than $\rho_k = 40\rho_m$ are disfavored by more than 99.99%, with a peak value at $\rho_\Lambda = 7.9 \times 10^{-123}$ and $\rho_k = 4.3\rho_m$ for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  19. Significance of "high probability/low damage" versus "low probability/high damage" flood events

    NASA Astrophysics Data System (ADS)

    Merz, B.; Elmer, F.; Thieken, A. H.

    2009-06-01

    The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage. Traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus the contribution of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than it is expressed by the expected annual damage. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced in the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making. Different flood mitigation decisions are probable, when risk aversion is taken into account.
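
    The expected annual damage referred to above is the integral of damage over the annual exceedance probability of flood events, so the relative contribution of frequent, small floods versus rare, large ones can be read off a damage-probability curve. The sketch below integrates an invented curve with the trapezoidal rule; all numbers are illustrative.

```python
# Expected annual damage (EAD) as the integral of damage over annual exceedance probability (AEP).
# The damage-probability pairs below are invented for illustration.
import numpy as np

aep    = np.array([0.5, 0.2, 0.1, 0.02, 0.01, 0.001])   # annual exceedance probability
damage = np.array([0.0, 2.0, 8.0, 40.0, 60.0, 150.0])   # damage in arbitrary monetary units

ead = -np.trapz(damage, aep)                   # sign flip because aep is decreasing
ead_frequent = -np.trapz(damage[:4], aep[:4])  # contribution of events with AEP >= 0.02

print(f"Expected annual damage: {ead:.2f}")
print(f"Share from events with AEP >= 0.02: {ead_frequent / ead:.0%}")
```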

  20. States' Spending on Colleges Rises 19 Pct. in 2 Years, Nears $31-Billion for'85-86.

    ERIC Educational Resources Information Center

    Evangelauf, Jean

    1985-01-01

    The rise in U.S. states' expenditures to nearly $31 billion in tax money marks a continuing recovery in support for higher education. Shaping this year's appropriations levels were concerns about tuition and efforts to promote economic development. (MLW)

  1. Economic choices reveal probability distortion in macaque monkeys.

    PubMed

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing.
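
    The inverted-S probability weighting reported above is commonly summarised with a one-parameter weighting function; one standard choice is Prelec's w(p) = exp(-(-ln p)^alpha), which for alpha < 1 overweights small probabilities and underweights large ones. The sketch below simply evaluates that form; the alpha value is illustrative and not a fitted value from the study.

```python
# One-parameter Prelec probability weighting function, w(p) = exp(-(-ln p)**alpha).
# alpha < 1 produces the inverted-S shape: small probabilities are overweighted,
# large probabilities underweighted. The alpha value is illustrative.
import numpy as np

def prelec_weight(p, alpha=0.6):
    p = np.asarray(p, dtype=float)
    return np.exp(-((-np.log(p)) ** alpha))

probabilities = np.array([0.01, 0.1, 0.3, 0.5, 0.7, 0.9, 0.99])
for p, w in zip(probabilities, prelec_weight(probabilities)):
    direction = "over" if w > p else "under"
    print(f"p = {p:4.2f} -> w(p) = {w:5.3f}  ({direction}weighted)")
```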

  2. Economic Choices Reveal Probability Distortion in Macaque Monkeys

    PubMed Central

    Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-01-01

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. PMID:25698750

  3. Model estimates hurricane wind speed probabilities

    NASA Astrophysics Data System (ADS)

    Mumane, Richard J.; Barton, Chris; Collins, Eric; Donnelly, Jeffrey; Eisner, James; Emanuel, Kerry; Ginis, Isaac; Howard, Susan; Landsea, Chris; Liu, Kam-biu; Malmquist, David; McKay, Megan; Michaels, Anthony; Nelson, Norm; O Brien, James; Scott, David; Webb, Thompson, III

    In the United States, intense hurricanes (category 3, 4, and 5 on the Saffir/Simpson scale) with winds greater than 50 m s(-1) have caused more damage than any other natural disaster [Pielke and Pielke, 1997]. Accurate estimates of wind speed exceedance probabilities (WSEP) due to intense hurricanes are therefore of great interest to (re)insurers, emergency planners, government officials, and populations in vulnerable coastal areas. The historical record of U.S. hurricane landfall is relatively complete only from about 1900, and most model estimates of WSEP are derived from this record. During the 1899-1998 period, only two category-5 and 16 category-4 hurricanes made landfall in the United States. The historical record therefore provides only a limited sample of the most intense hurricanes.

  4. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field. It has also recently been used for biometric and multimedia information retrieval systems. This technology stems from successive research on audio feature extraction analysis. The probability distribution function (PDF) is a statistical method that is usually used as one of the steps in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed in which the PDF alone is used as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.

  5. Probability density function learning by unsupervised neurons.

    PubMed

    Fiori, S

    2001-10-01

    In a recent work, we introduced the concept of pseudo-polynomial adaptive activation function neuron (FAN) and presented an unsupervised information-theoretic learning theory for such structure. The learning model is based on entropy optimization and provides a way of learning probability distributions from incomplete data. The aim of the present paper is to illustrate some theoretical features of the FAN neuron, to extend its learning theory to asymmetrical density function approximation, and to provide an analytical and numerical comparison with other known density function estimation methods, with special emphasis to the universal approximation ability. The paper also provides a survey of PDF learning from incomplete data, as well as results of several experiments performed on real-world problems and signals. PMID:11709808

  6. Probability of Brownian motion hitting an obstacle

    SciTech Connect

    Knessl, C.; Keller, J.B.

    2000-02-01

    The probability p(x) that Brownian motion with drift, starting at x, hits an obstacle is analyzed. The obstacle {Omega} is a compact subset of R{sup n}. It is shown that p(x) is expressible in terms of the field U(x) scattered by {Omega} when it is hit by a plane wave. Therefore results for U(x), and methods for finding U(x), can be used to determine p(x). The authors illustrate this by obtaining exact and asymptotic results for p(x) when {Omega} is a slit in R{sup 2}, and asymptotic results when {Omega} is a disc in R{sup 3}.
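
    The paper obtains p(x) analytically through the scattered field U(x); as a purely numerical cross-check, the same hitting probability can be estimated by simulating drifting Brownian paths. The sketch below does this for a disc obstacle in R^2 (the drift, radius, starting point and time horizon are arbitrary illustrative choices).

```python
import numpy as np

def hit_probability(x0, drift, radius=1.0, dt=1e-2, t_max=20.0, n_paths=2000, seed=0):
    """Fraction of drifting Brownian paths started at x0 that hit a disc of given radius at the origin."""
    rng = np.random.default_rng(seed)
    drift = np.asarray(drift, dtype=float)
    x = np.tile(np.asarray(x0, dtype=float), (n_paths, 1))
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(int(t_max / dt)):
        alive = np.flatnonzero(~hit)
        if alive.size == 0:
            break
        x[alive] += drift * dt + np.sqrt(dt) * rng.standard_normal((alive.size, 2))
        hit[alive] = np.linalg.norm(x[alive], axis=1) <= radius
    return hit.mean()

# Drift pointing toward the obstacle makes a hit likely; drift pointing away makes it rare.
print(hit_probability([4.0, 0.0], drift=[-0.5, 0.0]))
```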

  7. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2004-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ONEs or ZEROs. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental natural laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
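
    A toy digital-domain sketch of the idea: two waveform states that differ only in their sample distribution (Gaussian versus uniform, both zero-mean and unit-variance), detected by accumulating a per-segment statistic. The segment length, distributions and kurtosis threshold are assumptions for illustration, not the scheme described in the report.

```python
import numpy as np

rng = np.random.default_rng(1)
SEG = 2000                                   # samples per symbol (assumed)

def encode(bits):
    """ONE -> Gaussian-distributed segment, ZERO -> uniform segment (same mean and variance)."""
    return np.concatenate([rng.standard_normal(SEG) if b
                           else rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), SEG) for b in bits])

def decode(signal):
    """Recover bits from per-segment excess kurtosis (~0 for Gaussian, ~-1.2 for uniform)."""
    bits = []
    for i in range(len(signal) // SEG):
        seg = signal[i * SEG:(i + 1) * SEG]
        kurt = np.mean((seg - seg.mean())**4) / seg.var()**2 - 3.0
        bits.append(1 if kurt > -0.6 else 0)
    return bits

message = [1, 0, 1, 1, 0]
print(decode(encode(message)), message)      # the decoded bits should match the message
```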

  8. On the probability of dinosaur fleas.

    PubMed

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-11

    Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data.

  9. Parabolic Ejecta Features on Titan? Probably Not

    NASA Astrophysics Data System (ADS)

    Lorenz, R. D.; Melosh, H. J.

    1996-03-01

    Radar mapping of Venus by Magellan indicated a number of dark parabolic features, associated with impact craters. A suggested mechanism for generating such features is that ejecta from the impact event is 'winnowed' by the zonal wind field, with smaller ejecta particles falling out of the atmosphere more slowly, and hence drifting further. What discriminates such features from simple wind streaks is the 'stingray' or parabolic shape. This is due to the ejecta's spatial distribution prior to being winnowed during fallout, and this distribution is generated by the explosion plume of the impact piercing the atmosphere, allowing the ejecta to disperse pseudoballistically before re-entering the atmosphere, decelerating to terminal velocity and then being winnowed. Here we apply this model to Titan, which has a zonal wind field similar to that of Venus. We find that Cassini will probably not find parabolic features, as the winds stretch the deposition so far that ejecta will form streaks or bands instead.

  10. Trending in Probability of Collision Measurements

    NASA Technical Reports Server (NTRS)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of Probabilities of Collision (P(sub c)) for conjunction events. The model attempts to predict the location and magnitude of the peak P(sub c) value for an event by assuming the progression of P(sub c) values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P(sub c) and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
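
    A minimal sketch of the first-order idea, leaving out the Bayesian prior-information step: fit a downward-opening parabola to a short series of P(sub c) values and read off the predicted time of the peak. The numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical Pc history for one conjunction event (days to closest approach, Pc values).
t = np.array([-5.0, -4.0, -3.0, -2.0, -1.0])
pc = np.array([5e-6, 2e-5, 4e-5, 4.5e-5, 3.5e-5])

a, b, c = np.polyfit(t, pc, 2)               # first-order model: downward-opening parabola (a < 0)
t_peak = -b / (2.0 * a)                      # vertex = predicted time of the peak Pc
pc_peak = np.polyval([a, b, c], t_peak)
print(f"a = {a:.2e} (negative), predicted peak at t = {t_peak:.2f} d with Pc ~ {pc_peak:.1e}")
```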

  11. Quantum probabilities for inflation from holography

    SciTech Connect

    Hartle, James B.; Hawking, S.W.; Hertog, Thomas

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in h-bar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  12. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2006-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.

  13. 5426 Sharp: A Probable Hungaria Binary

    NASA Astrophysics Data System (ADS)

    Warner, Brian D.; Benishek, Vladimir; Ferrero, Andrea

    2015-07-01

    Initial CCD photometry observations of the Hungaria asteroid 5426 Sharp in 2014 December and 2015 January at the Center of Solar System Studies-Palmer Divide Station in Landers, CA, showed attenuations from the general lightcurve, indicating the possibility of the asteroid being a binary system. The secondary period was almost exactly an Earth day, prompting a collaboration to be formed with observers in Europe, which eventually allowed two periods to be established: P1 = 4.5609 ± 0.0003 h, A1 = 0.18 ± 0.01 mag and P2 = 24.22 ± 0.02 h, A2 = 0.08 ± 0.01 mag. No mutual events, i.e., occultations and/or eclipses, were seen; therefore the asteroid is considered a probable, but not confirmed, binary.

  14. On the probability of dinosaur fleas.

    PubMed

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-01

    Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data. PMID:26754250

  15. Evolution probabilities and phylogenetic distance of dinucleotides.

    PubMed

    Michel, Christian J

    2007-11-21

    We develop here an analytical evolution model based on a 16 x 16 dinucleotide mutation matrix with six substitution parameters associated with the three types of substitutions in the two dinucleotide sites. It generalizes the previous models based on 4 x 4 nucleotide mutation matrices. It determines at some time t the exact occurrence probabilities of dinucleotides mutating randomly according to these six substitution parameters. Furthermore, several properties and two applications of this model allow the derivation of 16 evolutionary analytical solutions of dinucleotides and also of a dinucleotide phylogenetic distance. Finally, based on this mathematical model, the SED (Stochastic Evolution of Dinucleotides) web server has been developed for deriving evolutionary analytical solutions of dinucleotides.

  16. Homonymous Hemianopsia Associated with Probable Alzheimer's Disease.

    PubMed

    Ishiwata, Akiko; Kimura, Kazumi

    2016-01-01

    Posterior cortical atrophy (PCA) is a rare neurodegenerative disorder that has cerebral atrophy in the parietal, occipital, or occipitotemporal cortices and is characterized by visuospatial and visuoperceptual impairments. Most cases are pathologically compatible with Alzheimer's disease (AD). We describe a case of PCA in which a combination of imaging methods, in conjunction with symptoms and neurological and neuropsychological examinations, led to its being diagnosed and to AD being identified as its probable cause. Treatment with donepezil for 6 months mildly improved alexia symptoms, but other symptoms remained unchanged. A 59-year-old Japanese woman with progressive alexia, visual deficit, and mild memory loss was referred to our neurologic clinic for the evaluation of right homonymous hemianopsia. Our neurological examination showed alexia, constructional apraxia, mild disorientation, short-term memory loss, and right homonymous hemianopsia. These findings resulted in a score of 23 (of 30) points on the Mini-Mental State Examination. Occipital atrophy was identified, with magnetic resonance imaging (MRI) showing left-side dominance. The MRI data were quantified with voxel-based morphometry, and PCA was diagnosed on the basis of these findings. Single photon emission computed tomography with (123)I-N-isopropyl-p-iodoamphetamine showed hypoperfusion in the corresponding voxel-based morphometry occipital lobes. Additionally, the finding of hypoperfusion in the posterior associate cortex, posterior cingulate gyrus, and precuneus was consistent with AD. Therefore, the PCA was considered to be a result of AD. We considered Lewy body dementia as a differential diagnosis because of the presence of hypoperfusion in the occipital lobes. However, the patient did not meet the criteria for Lewy body dementia during the course of the disease. We therefore consider including PCA in the differential diagnoses to be important for patients with visual deficit, cognitive

  17. Repetition probability effects for inverted faces.

    PubMed

    Grotheer, Mareike; Hermann, Petra; Vidnyánszky, Zoltán; Kovács, Gyula

    2014-11-15

    It has been shown that the repetition-related reduction of the blood-oxygen level dependent (BOLD) signal is modulated by the probability of repetitions (P(rep)) for faces (Summerfield et al., 2008), providing support for the predictive coding (PC) model of visual perception (Rao and Ballard, 1999). However, the stage of face processing where repetition suppression (RS) is modulated by P(rep) is still unclear. Face inversion is known to interrupt higher level configural/holistic face processing steps, and if modulation of RS by P(rep) takes place at these stages of face processing, P(rep) effects are expected to be reduced for inverted when compared to upright faces. Therefore, here we aimed at investigating whether P(rep) effects on RS observed for face stimuli originate at the higher-level configural/holistic stages of face processing by comparing these effects for upright and inverted faces. Similarly to previous studies, we manipulated P(rep) for pairs of stimuli in individual blocks of fMRI recordings. This manipulation significantly influenced repetition suppression in the posterior FFA, the OFA and the LO, independently of stimulus orientation. Our results thus reveal that RS in the ventral visual stream is modulated by P(rep) even in the case of face inversion and hence strongly compromised configural/holistic face processing. An additional whole-brain analysis could not identify any areas where the modulatory effect of probability was orientation specific either. These findings imply that P(rep) effects on RS might originate from the earlier stages of face processing.

  18. Practical implementation of joint multitarget probabilities

    NASA Astrophysics Data System (ADS)

    Musick, Stanton; Kastella, Keith D.; Mahler, Ronald P. S.

    1998-07-01

    A Joint Multitarget Probability (JMP) is a posterior probability density p_T(x_1, ..., x_T | Z) that there are T targets (T an unknown number) with unknown locations specified by the multitarget state X = (x_1, ..., x_T)^T, conditioned on a set of observations Z. This paper presents a numerical approximation for implementing JMP in detection, tracking and sensor management applications. A problem with direct implementation of JMP is that, if each x_t, t = 1, ..., T, is discretized on a grid of N elements, N^T variables are required to represent JMP on the T-target sector. This produces a large computational requirement even for small values of N and T. However, when the sensor easily separates targets, the resulting JMP factorizes and can be approximated by a product representation requiring only O(T^2 N) variables. Implementation of JMP for multitarget tracking requires a Bayes' rule step for measurement update and a Markov transition step for time update. If the measuring sensor is only influenced by the cell it observes, the JMP product representation is preserved under measurement update. However, the product form is not quite preserved by the Markov time update, but can be restored using a minimum discrimination approach. All steps for the approximation can be performed with O(N) effort. This notion is developed and demonstrated in numerical examples with at most two targets in a 1-dimensional surveillance region. In this case, numerical results for detection and tracking for the product approximation and the full JMP are very similar.

  19. [The year 2000: one billion couples of child-bearing age].

    PubMed

    Lintong, L J

    1988-04-01

    Out of 1 billion couples there are only 124 million who use modern and effective contraceptives. World abortions number 33 million/year. 250 million sexually active women of child-bearing age in developing countries outside China do not use modern and effective contraceptives. Fertility control costs on the average US$2.5 billion a year in each developing country, 20% of which is assistance from developed countries. Expanding the family planning service to the 250 million sexually active women of child-bearing age costs an additional U.S. $5 billion yearly. A family planning accessibility survey was conducted by the Population Crisis Committee. PCC divided the countries into 2 categories: developed and developing countries. The 110 countries (15 developed and 95 developing) covered 96% of the world population. The survey placed the countries in 5 classes according to accessibility levels: Excellent, good, fair, poor, very poor. The developed countries were analyzed according to effective contraceptive methods, service to the poor and minorities, sex education in the schools, and family planning information and advertisement. The developing countries were analyzed according to effective contraceptive methods, performance of service and distribution, public information and education, private sector participation, government finance and policies. Of the 15 developed countries, 43% were excellent, 22% good, 24% fair, and 2% poor. Of the 95 developing countries, 5 were excellent, 10 good, 16 fair, and 64 either poor or very poor in respect to family planning accessibility. In the face of a population explosion in the year 2000, many countries lack government support for family planning programs. After 30 years of world effort in population control, half of the world population still has no effective family planning services.

  20. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core

    NASA Astrophysics Data System (ADS)

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J.; Greene, Jenny E.; Blakeslee, John P.; Janish, Ryan

    2016-04-01

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day ‘dormant’ descendants of this population of ‘active’ black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall—the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600—a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  1. Probability matching involves rule-generating ability: a neuropsychological mechanism dealing with probabilities.

    PubMed

    Unturbe, Jesús; Corominas, Josep

    2007-09-01

    Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically disperse manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
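
    For readers unfamiliar with why matching is non-optimal, the short simulation below compares probability matching with always choosing the richer alternative on a two-choice task with a 0.7 reinforcement contingency (an assumed value; the SPT procedure itself is not reproduced).

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.7                                       # reinforcement contingency of the better alternative
n_trials = 10_000

outcomes = rng.random(n_trials) < p           # True when the better alternative pays off

matching = rng.random(n_trials) < p           # matching: choose the better option w.p. p
match_acc = np.mean(matching == outcomes)
max_acc = np.mean(outcomes)                   # maximizing: always choose the better option

print(f"matching   ~ {match_acc:.3f}  (theory: p^2 + (1-p)^2 = {p*p + (1-p)**2:.2f})")
print(f"maximizing ~ {max_acc:.3f}  (theory: p = {p})")
```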

  2. Mathematical Model for the 0.5 Billion Years Aged Sun

    NASA Astrophysics Data System (ADS)

    Tatomir, E.

    An algorithm is given for constructing evolutionary tracks for a star with a mass equal to one solar mass. The presented model can be applied to stars belonging to the lower main sequence, which have the proton-proton reaction as their energy source and have a radiative core and a convective shell. This paper presents an original way of solving the system of equations corresponding to the radiative core by using Taylor series in the close vicinity of the center of the Sun. It also presents the numerical integration and the results for a solar model aged 0.5 billion years.

  3. Electron microscopy reveals unique microfossil preservation in 1 billion-year-old lakes

    NASA Astrophysics Data System (ADS)

    Saunders, M.; Kong, C.; Menon, S.; Wacey, D.

    2014-06-01

    Electron microscopy was applied to the study of 1 billion-year-old microfossils from northwest Scotland in order to investigate their 3D morphology and mode of fossilization. 3D-FIB-SEM revealed high quality preservation of organic cell walls with only minor amounts of post-mortem decomposition, followed by variable degrees of morphological alteration (folding and compression of cell walls) during sediment compaction. EFTEM mapping plus SAED revealed a diverse fossilizing mineral assemblage including K-rich clay, Fe-Mg-rich clay and calcium phosphate, with each mineral occupying specific microenvironments in proximity to carbonaceous microfossil cell walls.

  4. The First Billion Years: The Growth of Galaxies in the Reionization Epoch

    NASA Astrophysics Data System (ADS)

    Illingworth, Garth

    2015-08-01

    Detection and measurement of the earliest galaxies in the first billion years only became possible after the Hubble Space Telescope was updated in 2009 with the infrared WFC3/IR camera during Shuttle servicing mission SM4. The first billion years is a fascinating epoch, not just because of the earliest galaxies known from about 450 Myr after the Big Bang, but also because it encompasses the reionization epoch that peaked around z~9, as Planck has recently shown, and ended around redshift z~6 at 900 Myr. Before 2009 just a handful of galaxies were known in the reionization epoch at z>6. But within the last 5 years, with the first HUDF09 survey, the HUDF12, CANDELS and numerous other surveys on the GOODS and CANDELS fields, as well as detections from the cluster lensing programs like CLASH and the Frontier Fields, the number of galaxies at redshifts 7-10 has exploded, with some 700 galaxies being found and characterized. The first billion years was a period of extraordinary growth in the galaxy population with rapid growth in the star formation rate density and global mass density in galaxies. Spitzer observations in the infrared of these Hubble fields are establishing masses as well as giving insights into the nature and timescales of star formation from the very powerful emission lines being revealed by the Spitzer IRAC data. I will discuss what we understand about the growth of galaxies in this epoch from the insights gained from remarkable deep fields like the XDF, as well as the wide-area GOODS/CANDELS fields, the detection of unexpectedly luminous galaxies at redshifts 8-10, the impact of early galaxies on reionization, confirmation of a number of galaxies at z~7-8 from ground-based spectroscopic measurements, and the indications of a change in the growth of the star formation rate around 500 Myr. The first billion years was a time of dramatic growth and change in the early galaxy population.

  5. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    SciTech Connect

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S{sub st}, S{sub st}, p{sub st}) for stochastic uncertainty, a probability space (S{sub su}, S{sub su}, p{sub su}) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S{sub st}, S{sub st}, p{sub st}) and (S{sub su}, S{sub su}, p{sub su}). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  6. Exercise in probability and statistics, or the probability of winning at tennis

    NASA Astrophysics Data System (ADS)

    Fischer, Gaston

    1980-01-01

    The relationships between the probabilities p, x, s, and M, of winning, respectively, a point, a game, a set, or a match have been derived. The calculations are carried out under the assumption that these probabilities are averages. For example, x represents an average probability of winning a game when serving and receiving, and the same value of x is assumed to hold also for tie-break games. The formulas derived are for sets played with a tie-break game at the level of 6-6, as well as for the traditional rule requiring an advantage of two games to win a set. Matches to the best of three and five sets are considered. As is to be expected, a small advantage in the probability p of winning a point leads to advantages which are amplified by large factors: 2.5 for games, 7.1 for sets with tie-break at 6-6, 10.6 for matches to the best of three sets, and 13.3 for matches to the best of five sets. When sets are decided according to the traditional rule, the last three factors become, respectively, 7.4, 11.1, and 13.8. The theoretical calculations are compared with real and synthetic tennis scores and good agreement is found. The scatter of the data is seen to obey the predictions of a normal distribution. Some classroom problems are suggested at the end.
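
    The game-level relationship can be written in closed form. The sketch below uses the standard expression for the probability of winning a game given the point probability p (assumed here to be the relation underlying the quoted amplification factor of about 2.5 for games).

```python
def game_win_prob(p):
    """P(win a game) when each point is won independently with probability p (standard closed form)."""
    q = 1.0 - p
    straight = p**4 * (1 + 4*q + 10*q**2)            # win to love, 15 or 30
    deuce = 20 * p**3 * q**3 * p**2 / (1 - 2*p*q)    # reach deuce, then win by two points
    return straight + deuce

for p in (0.50, 0.51, 0.55, 0.60):
    print(f"p = {p:.2f} -> P(win game) = {game_win_prob(p):.3f}")
print(f"amplification near p = 0.5: {(game_win_prob(0.55) - 0.5) / 0.05:.2f}  (about 2.5, as quoted)")
```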

  7. Parametrization and Classification of 20 Billion LSST Objects: Lessons from SDSS

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Becker, A.C.; Becla, J.; Borne, K.; Burke, David L.; Claver, C.F.; Cook, K.H.; Connolly, A.; Gilmore, D.K.; Jones, R.L.; Juric, M.; Kahn, Steven M.; Lim, K-T.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Sesar, B.; Stubbs, Christopher W.; Tyson, J. Anthony

    2011-11-10

    The Large Synoptic Survey Telescope (LSST) will be a large, wide-field ground-based system designed to obtain, starting in 2015, multiple images of the sky that is visible from Cerro Pachon in Northern Chile. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg{sup 2} region about 1000 times during the anticipated 10 years of operations (distributed over six bands, ugrizy). Each 30-second long visit will deliver 5{sigma} depth for point sources of r {approx} 24.5 on average. The co-added map will be about 3 magnitudes deeper, and will include 10 billion galaxies and a similar number of stars. We discuss various measurements that will be automatically performed for these 20 billion sources, and how they can be used for classification and determination of source physical and other properties. We provide a few classification examples based on SDSS data, such as color classification of stars, color-spatial proximity search for wide-angle binary stars, orbital-color classification of asteroid families, and the recognition of main Galaxy components based on the distribution of stars in the position-metallicity-kinematics space. Guided by these examples, we anticipate that two grand classification challenges for LSST will be (1) rapid and robust classification of sources detected in difference images, and (2) simultaneous treatment of diverse astrometric and photometric time series measurements for an unprecedentedly large number of objects.

  8. Greenhouse gas implications of a 32 billion gallon bioenergy landscape in the US

    NASA Astrophysics Data System (ADS)

    DeLucia, E. H.; Hudiburg, T. W.; Wang, W.; Khanna, M.; Long, S.; Dwivedi, P.; Parton, W. J.; Hartman, M. D.

    2015-12-01

    Sustainable bioenergy for transportation fuel and greenhouse gas (GHGs) reductions may require considerable changes in land use. Perennial grasses have been proposed because of their potential to yield substantial biomass on marginal lands without displacing food and reduce GHG emissions by storing soil carbon. Here, we implemented an integrated approach to planning bioenergy landscapes by combining spatially-explicit ecosystem and economic models to predict a least-cost land allocation for a 32 billion gallon (121 billion liter) renewable fuel mandate in the US. We find that 2022 GHG transportation emissions are decreased by 7% when 3.9 million hectares of eastern US land are converted to perennial grasses supplemented with corn residue to meet cellulosic ethanol requirements, largely because of gasoline displacement and soil carbon storage. If renewable fuel production is accompanied by a cellulosic biofuel tax credit, CO2 equivalent emissions could be reduced by 12%, because it induces more cellulosic biofuel and land under perennial grasses (10 million hectares) than under the mandate alone. While GHG reducing bioenergy landscapes that meet RFS requirements and do not displace food are possible, the reductions in GHG emissions are 50% less compared to previous estimates that did not account for economically feasible land allocation.

  9. The evolution in the stellar mass of brightest cluster galaxies over the past 10 billion years

    NASA Astrophysics Data System (ADS)

    Bellstedt, Sabine; Lidman, Chris; Muzzin, Adam; Franx, Marijn; Guatelli, Susanna; Hill, Allison R.; Hoekstra, Henk; Kurinsky, Noah; Labbe, Ivo; Marchesini, Danilo; Marsan, Z. Cemile; Safavi-Naeini, Mitra; Sifón, Cristóbal; Stefanon, Mauro; van de Sande, Jesse; van Dokkum, Pieter; Weigel, Catherine

    2016-08-01

    Using a sample of 98 galaxy clusters recently imaged in the near-infrared with the European Southern Observatory (ESO) New Technology Telescope, WIYN telescope and William Herschel Telescope, supplemented with 33 clusters from the ESO archive, we measure how the stellar mass of the most massive galaxies in the universe, namely brightest cluster galaxies (BCGs), increases with time. Most of the BCGs in this new sample lie in the redshift range 0.2 < z < 0.6, which has been noted in recent works to mark an epoch over which the growth in the stellar mass of BCGs stalls. From this sample of 132 clusters, we create a subsample of 102 systems that includes only those clusters that have estimates of the cluster mass. We combine the BCGs in this subsample with BCGs from the literature, and find that the growth in stellar mass of BCGs from 10 billion years ago to the present epoch is broadly consistent with recent semi-analytic and semi-empirical models. As in other recent studies, tentative evidence indicates that the stellar mass growth rate of BCGs may be slowing in the past 3.5 billion years. Further work in collecting larger samples, and in better comparing observations with theory using mock images, is required if a more detailed comparison between the models and the data is to be made.

  10. A massive galaxy in its core formation phase three billion years after the Big Bang.

    PubMed

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha Förster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison; Leja, Joel; Rix, Hans-Walter; Skelton, Rosalind; van der Wel, Arjen; Whitaker, Katherine; Wuyts, Stijn

    2014-09-18

    Most massive galaxies are thought to have formed their dense stellar cores in early cosmic epochs. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes, but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we report a candidate core in the process of formation 11 billion years ago, at redshift z = 2.3. This galaxy, GOODS-N-774, has a stellar mass of 100 billion solar masses, a half-light radius of 1.0 kiloparsecs and a star formation rate of solar masses per year. The star-forming gas has a velocity dispersion of 317 ± 30 kilometres per second. This is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774, which are compact quiescent galaxies at z ≈ 2 (refs 8-11) and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 seem to be rare; however, from the star formation rate and size of this galaxy we infer that many star-forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys.

  11. A massive galaxy in its core formation phase three billion years after the Big Bang.

    PubMed

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha Förster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison; Leja, Joel; Rix, Hans-Walter; Skelton, Rosalind; van der Wel, Arjen; Whitaker, Katherine; Wuyts, Stijn

    2014-09-18

    Most massive galaxies are thought to have formed their dense stellar cores in early cosmic epochs. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes, but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we report a candidate core in the process of formation 11 billion years ago, at redshift z = 2.3. This galaxy, GOODS-N-774, has a stellar mass of 100 billion solar masses, a half-light radius of 1.0 kiloparsecs and a star formation rate of solar masses per year. The star-forming gas has a velocity dispersion of 317 ± 30 kilometres per second. This is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774, which are compact quiescent galaxies at z ≈ 2 (refs 8-11) and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 seem to be rare; however, from the star formation rate and size of this galaxy we infer that many star-forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys. PMID:25162527

  12. Evidence for arsenic metabolism and cycling by microorganisms 2.7 billion years ago

    NASA Astrophysics Data System (ADS)

    Sforna, Marie Catherine; Philippot, Pascal; Somogyi, Andrea; van Zuilen, Mark A.; Medjoubi, Kadda; Schoepp-Cothenet, Barbara; Nitschke, Wolfgang; Visscher, Pieter T.

    2014-11-01

    The ability of microbes to metabolize arsenic may have emerged more than 3.4 billion years ago. Some of the modern environments in which prominent arsenic metabolism occurs are anoxic, as were the Precambrian oceans. Early oceans may also have had a relatively high abundance of arsenic. However, it is unclear whether arsenic cycling occurred in ancient environments. Here we assess the chemistry and nature of cell-like globules identified in salt-encrusted portions of 2.72-billion-year-old fossil stromatolites from Western Australia. We use Raman spectroscopy and X-ray fluorescence to show that the globules are composed of organic carbon and arsenic (As). We argue that our data are best explained by the occurrence of a complete arsenic cycle at this site, with As(III) oxidation and As(V) reduction by microbes living in permanently anoxic conditions. We therefore suggest that arsenic cycling could have occurred more widely in marine environments in the several hundred million years before the Earth’s atmosphere and shallow oceans were oxygenated.

  13. The Value Of The Nonprofit Hospital Tax Exemption Was $24.6 Billion In 2011.

    PubMed

    Rosenbaum, Sara; Kindig, David A; Bao, Jie; Byrnes, Maureen K; O'Laughlin, Colin

    2015-07-01

    The federal government encourages public support for charitable activities by allowing people to deduct donations to tax-exempt organizations on their income tax returns. Tax-exempt hospitals are major beneficiaries of this policy because it encourages donations to the hospitals while shielding them from federal and state tax liability. In exchange, these hospitals must engage in community benefit activities, such as providing care to indigent patients and participating in Medicaid. The congressional Joint Committee on Taxation estimated the value of the nonprofit hospital tax exemption at $12.6 billion in 2002--a number that included forgone taxes, public contributions, and the value of tax-exempt bond financing. In this article we estimate that the size of the exemption reached $24.6 billion in 2011. The Affordable Care Act (ACA) brings a new focus on community benefit activities by requiring tax-exempt hospitals to engage in communitywide planning efforts to improve community health. The magnitude of the tax exemption, coupled with ACA reforms, underscores the public's interest not only in community benefit spending generally but also in the extent to which nonprofit hospitals allocate funds for community benefit expenditures that improve the overall health of their communities.

  14. Oxygen and hydrogen isotope evidence for a temperate climate 3.42 billion years ago.

    PubMed

    Hren, M T; Tice, M M; Chamberlain, C P

    2009-11-12

    Stable oxygen isotope ratios (delta(18)O) of Precambrian cherts have been used to establish much of our understanding of the early climate history of Earth and suggest that ocean temperatures during the Archaean era ( approximately 3.5 billion years ago) were between 55 degrees C and 85 degrees C (ref. 2). But, because of uncertainty in the delta(18)O of the primitive ocean, there is considerable debate regarding this conclusion. Examination of modern and ancient cherts indicates that another approach, using a combined analysis of delta(18)O and hydrogen isotopes (deltaD) rather than delta(18)O alone, can provide a firmer constraint on formational temperatures without independent knowledge of the isotopic composition of ambient waters. Here we show that delta(18)O and deltaD sampled from 3.42-billion-year-old Buck Reef Chert rocks in South Africa are consistent with formation from waters at varied low temperatures. The most (18)O-enriched Buck Reef Chert rocks record the lowest diagenetic temperatures and were formed in equilibrium with waters below approximately 40 degrees C. Geochemical and sedimentary evidence suggests that the Buck Reef Chert was formed in shallow to deep marine conditions, so our results indicate that the Palaeoarchaean ocean was isotopically depleted relative to the modern ocean and far cooler (

  15. Oxygen and hydrogen isotope evidence for a temperate climate 3.42 billion years ago.

    PubMed

    Hren, M T; Tice, M M; Chamberlain, C P

    2009-11-12

    Stable oxygen isotope ratios (delta(18)O) of Precambrian cherts have been used to establish much of our understanding of the early climate history of Earth and suggest that ocean temperatures during the Archaean era ( approximately 3.5 billion years ago) were between 55 degrees C and 85 degrees C (ref. 2). But, because of uncertainty in the delta(18)O of the primitive ocean, there is considerable debate regarding this conclusion. Examination of modern and ancient cherts indicates that another approach, using a combined analysis of delta(18)O and hydrogen isotopes (deltaD) rather than delta(18)O alone, can provide a firmer constraint on formational temperatures without independent knowledge of the isotopic composition of ambient waters. Here we show that delta(18)O and deltaD sampled from 3.42-billion-year-old Buck Reef Chert rocks in South Africa are consistent with formation from waters at varied low temperatures. The most (18)O-enriched Buck Reef Chert rocks record the lowest diagenetic temperatures and were formed in equilibrium with waters below approximately 40 degrees C. Geochemical and sedimentary evidence suggests that the Buck Reef Chert was formed in shallow to deep marine conditions, so our results indicate that the Palaeoarchaean ocean was isotopically depleted relative to the modern ocean and far cooler (

  16. The Value Of The Nonprofit Hospital Tax Exemption Was $24.6 Billion In 2011.

    PubMed

    Rosenbaum, Sara; Kindig, David A; Bao, Jie; Byrnes, Maureen K; O'Laughlin, Colin

    2015-07-01

    The federal government encourages public support for charitable activities by allowing people to deduct donations to tax-exempt organizations on their income tax returns. Tax-exempt hospitals are major beneficiaries of this policy because it encourages donations to the hospitals while shielding them from federal and state tax liability. In exchange, these hospitals must engage in community benefit activities, such as providing care to indigent patients and participating in Medicaid. The congressional Joint Committee on Taxation estimated the value of the nonprofit hospital tax exemption at $12.6 billion in 2002--a number that included forgone taxes, public contributions, and the value of tax-exempt bond financing. In this article we estimate that the size of the exemption reached $24.6 billion in 2011. The Affordable Care Act (ACA) brings a new focus on community benefit activities by requiring tax-exempt hospitals to engage in communitywide planning efforts to improve community health. The magnitude of the tax exemption, coupled with ACA reforms, underscores the public's interest not only in community benefit spending generally but also in the extent to which nonprofit hospitals allocate funds for community benefit expenditures that improve the overall health of their communities. PMID:26085486

  17. Enhanced awakening probability of repetitive impulse sounds.

    PubMed

    Vos, Joos; Houben, Mark M J

    2013-09-01

    In the present study relations between the level of impulse sounds and the observed proportion of behaviorally confirmed awakening reactions were determined. The sounds (shooting sounds, bangs produced by door slamming or by container transshipment, aircraft landings) were presented by means of loudspeakers in the bedrooms of 50 volunteers. The fragments for the impulse sounds consisted of single or multiple events. The sounds were presented during a 6-h period that started 75 min after the subjects wanted to sleep. In order to take account of habituation, each subject participated during 18 nights. At equal indoor A-weighted sound exposure levels, the proportion of awakening for the single impulse sounds was equal to that for the aircraft sounds. The proportion of awakening induced by the multiple impulse sounds, however, was significantly higher. For obtaining the same rate of awakening, the sound level of each of the successive impulses in a fragment had to be about 15-25 dB lower than the level of one single impulse. This level difference was largely independent of the degree of habituation. Various explanations for the enhanced awakening probability are discussed. PMID:23967934

  18. Lectures on probability and statistics. Revision

    SciTech Connect

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut and dried "best" solution - "best" according to every criterion.
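
    A worked instance of the a priori calculation mentioned above, for the probability that two fair dice sum to seven:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))      # all 36 equally likely rolls of two fair dice
p_seven = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print(p_seven)                                       # 1/6
```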

  19. The probability of finding suitable directed donors.

    PubMed

    Kanter, M; Selvin, S; Myhre, B A

    1989-02-01

    A series of tables based on mathematical calculations is given as guidelines for the number of directed donors needed by members of various ethnic/racial groups to provide a desired number of units of blood with a selected probability of achieving this result. From these tables, certain conclusions can be drawn. Unrelated donors who do not know their blood type are an inefficient source of directed donors. Rh-negative patients are unlikely to obtain enough directed-donor units from either related or unrelated donors with confidence unless these donors know their blood type. In general, siblings, parents, and offspring are the most efficient directed donors from the standpoint of compatibility. Cousins, uncles, aunts, nieces, and nephews are not much more likely to be compatible than unrelated donors are. It is easier to obtain suitable directed-donor units among Hispanics than among whites, blacks, or Asians, due to their skewed blood group frequencies. In general, using O-negative directed donors for Rh-positive recipients does not significantly increase the likelihood of finding suitable donors.
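
    The tables themselves are not reproduced here, but the underlying calculation is essentially binomial. The sketch below computes the probability of obtaining a desired number of compatible units from n independent donors; the 45% compatibility figure for unrelated donors of unknown type is an assumed, illustrative value, not one taken from the paper.

```python
from math import comb

def prob_enough_units(n_donors, p_compatible, units_needed):
    """P(at least `units_needed` of `n_donors` independent donors are compatible)."""
    return sum(comb(n_donors, k) * p_compatible**k * (1 - p_compatible)**(n_donors - k)
               for k in range(units_needed, n_donors + 1))

# Hypothetical example: chance that 10 unrelated donors of unknown type yield >= 3 usable units
# when roughly 45% of the donor pool is compatible with the recipient (assumed figure).
print(f"{prob_enough_units(10, 0.45, 3):.2f}")
```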

  20. Probability of rupture of multiple fault segments

    USGS Publications Warehouse

    Andrews, D.J.; Schwerer, E.

    2000-01-01

    Fault segments identified from geologic and historic evidence have sometimes been adopted as features limiting the likely extents of earthquake ruptures. There is no doubt that individual segments can sometimes join together to produce larger earthquakes. This work is a trial of an objective method to determine the probability of multisegment ruptures. The frequency of occurrence of events on all conjectured combinations of adjacent segments in northern California is found by fitting to both geologic slip rates and to an assumed distribution of event sizes for the region as a whole. Uncertainty in the shape of the distribution near the maximum magnitude has a large effect on the solution. Frequencies of individual events cannot be determined, but it is possible to find a set of frequencies to fit a model closely. A robust conclusion for the San Francisco Bay region is that large multisegment events occur on the San Andreas and San Gregorio faults, but single-segment events predominate on the extended Hayward and Calaveras strands of segments.

  1. Essays on probability elicitation scoring rules

    NASA Astrophysics Data System (ADS)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas for the cases where Θ is aleatory, highlighting advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and also to consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
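
    A compressed illustration of the two ingredients discussed above: a classic score contrasting f(θ) with the indicator d(θ) (here the quadratic, Brier-type form) and an entropy-like measure of the adherence of f(θ) to a distribution g(θ) for the aleatory case (here the Kullback-Leibler divergence). The specific formulas proposed in the paper are not reproduced.

```python
import numpy as np

def quadratic_score(f, outcome_index):
    """Classic SR: squared distance between the forecast pmf f and the 0-1 indicator of the outcome."""
    d = np.zeros_like(f)
    d[outcome_index] = 1.0
    return float(np.sum((f - d)**2))

def kl_adherence(f, g):
    """Entropy-like adherence of the elicited pmf f to the distribution g governing the aleatory unknown."""
    return float(np.sum(g * np.log(g / f)))

f = np.array([0.2, 0.5, 0.3])    # expert's elicited distribution over three outcomes
g = np.array([0.1, 0.6, 0.3])    # distribution assumed to generate the aleatory quantity
print(quadratic_score(f, outcome_index=1), kl_adherence(f, g))
```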

  2. Atomic Transition Probabilities for Neutral Cerium

    NASA Astrophysics Data System (ADS)

    Lawler, J. E.; den Hartog, E. A.; Wood, M. P.; Nitz, D. E.; Chisholm, J.; Sobeck, J.

    2009-10-01

    The spectra of neutral cerium (Ce I) and singly ionized cerium (Ce II) are more complex than spectra of other rare earth species. The resulting high density of lines in the visible makes Ce ideal for use in metal halide (MH) High Intensity Discharge (HID) lamps. Inclusion of cerium-iodide in a lamp dose can improve both the Color Rendering Index and luminous efficacy of a MH-HID lamp. Basic spectroscopic data including absolute atomic transition probabilities for Ce I and Ce II are needed for diagnosing and modeling these MH-HID lamps. Recent work on Ce II [1] is now being augmented with similar work on Ce I. Radiative lifetimes from laser induced fluorescence measurements [2] on neutral Ce are being combined with emission branching fractions from spectra recorded using a Fourier transform spectrometer. A total of 14 high resolution spectra are being analyzed to determine branching fractions for 2000 to 3000 lines from 153 upper levels in neutral Ce. Representative data samples and progress to date will be presented. [1] J. E. Lawler, C. Sneden, J. J. Cowan, I. I. Ivans, and E. A. Den Hartog, Astrophys. J. Suppl. Ser. 182, 51-79 (2009). [2] E. A. Den Hartog, K. P. Buettner, and J. E. Lawler, J. Phys. B: Atomic, Molecular & Optical Physics 42, 085006 (7pp) (2009).

  3. Do aftershock probabilities decay with time?

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
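
    The thought experiment can be checked numerically: with p = 1 and c ignored, the expected number of aftershocks in a window whose width grows in proportion to t is the same at every horizon. The productivity constant below is arbitrary.

```python
import numpy as np

K = 100.0                                    # illustrative Omori productivity; p = 1, c ~ 0 assumed

def expected_count(t, frac=0.10):
    """Expected aftershocks in a window of half-width frac*t centred on time t (integral of K/t)."""
    return K * np.log((t * (1 + frac)) / (t * (1 - frac)))

for t in (1, 7, 30, 365, 36500):             # a day, a week, a month, a year, a century from now
    print(f"t = {t:6d} d   expected events in a +/-10% window: {expected_count(t):.1f}")
# The t cancels inside the logarithm, so the expected count is identical at every horizon.
```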

  4. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
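
    As context for the Gaussian baseline that the paper generalizes, a minimal sketch: model the pervasive differences between two co-registered images with a joint Gaussian over per-pixel value pairs, then flag pixels whose Mahalanobis distance from that model is large. The non-Gaussian families developed in the paper are not implemented here.

```python
import numpy as np

def change_anomaly_scores(img_x, img_y):
    """Fit a joint Gaussian to per-pixel (x, y) pairs (the pervasive differences), then score each
    pixel by its squared Mahalanobis distance from that model; large scores = candidate changes."""
    z = np.column_stack([img_x.ravel(), img_y.ravel()]).astype(float)
    d = z - z.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(z, rowvar=False))
    scores = np.einsum('ij,jk,ik->i', d, inv_cov, d)
    return scores.reshape(img_x.shape)

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = 0.8 * a + 0.05 * rng.random((64, 64))    # pervasive (global) difference between the two images
b[10:12, 10:12] += 0.5                       # a small genuine change
print(np.unravel_index(np.argmax(change_anomaly_scores(a, b)), a.shape))   # falls in the changed patch
```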

  5. Probability judgments under ambiguity and conflict

    PubMed Central

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081

  6. Levetiracetam: Probably Associated Diurnal Frequent Urination.

    PubMed

    Ju, Jun; Zou, Li-Ping; Shi, Xiu-Yu; Hu, Lin-Yan; Pang, Ling-Yu

    2016-01-01

    Diurnal frequent urination is a common condition in elementary school children who are especially at risk for associated somatic and behavioral problems. Levetiracetam (LEV) is a broad-spectrum antiepileptic drug that has been used in both partial and generalized seizures and less commonly causes adverse effects including psychiatric and behavioral problems. Diurnal frequent urination is not a well-known adverse effect of LEV. Here, we report 2 pediatric cases with epilepsy that developed diurnal frequent urination after LEV administration. Case 1 was a 6-year-old male patient who presented with urinary frequency and urgency in the daytime from the third day after LEV was given as adjunctive therapy. Symptoms increased as the dosage of LEV was raised. Laboratory tests and auxiliary examinations did not find evidence of organic disease. Diurnal frequent urination due to LEV was suspected, and then the drug was discontinued. As expected, his frequency of urination returned to normal levels. Another 13-year-old female patient developed similar clinical manifestations after oral LEV monotherapy, and the symptoms became aggravated under stress. Since the most common causes of frequent micturition had been ruled out, the patient was considered to have LEV-associated psychogenic frequent urination. The dosage of LEV was reduced to one-third, and the frequency of urination was reduced by 60%. Both patients had a Naranjo score of 6, which indicated that LEV was a "probable" cause of diurnal frequent urination. Although a definite causal link between LEV and diurnal urinary frequency in the 2 cases remains to be established, we argue that diurnal frequent urination associated with LEV deserves clinicians' attention. PMID:26938751

  7. Deposition of 1.88-billion-year-old iron formations as a consequence of rapid crustal growth.

    PubMed

    Rasmussen, Birger; Fletcher, Ian R; Bekker, Andrey; Muhling, Janet R; Gregory, Courtney J; Thorne, Alan M

    2012-04-26

    Iron formations are chemical sedimentary rocks comprising layers of iron-rich and silica-rich minerals whose deposition requires anoxic and iron-rich (ferruginous) sea water. Their demise after the rise in atmospheric oxygen by 2.32 billion years (Gyr) ago has been attributed to the removal of dissolved iron through progressive oxidation or sulphidation of the deep ocean. Therefore, a sudden return of voluminous iron formations nearly 500 million years later poses an apparent conundrum. Most late Palaeoproterozoic iron formations are about 1.88 Gyr old and occur in the Superior region of North America. Major iron formations are also preserved in Australia, but these were apparently deposited after the transition to a sulphidic ocean at 1.84 Gyr ago that should have terminated iron formation deposition, implying that they reflect local marine conditions. Here we date zircons in tuff layers to show that iron formations in the Frere Formation of Western Australia are about 1.88 Gyr old, indicating that the deposition of iron formations from two disparate cratons was coeval and probably reflects global ocean chemistry. The sudden reappearance of major iron formations at 1.88 Gyr ago--contemporaneous with peaks in global mafic-ultramafic magmatism, juvenile continental and oceanic crust formation, mantle depletion and volcanogenic massive sulphide formation--suggests deposition of iron formations as a consequence of major mantle activity and rapid crustal growth. Our findings support the idea that enhanced submarine volcanism and hydrothermal activity linked to a peak in mantle melting released large volumes of ferrous iron and other reductants that overwhelmed the sulphate and oxygen reservoirs of the ocean, decoupling atmospheric and seawater redox states, and causing the return of widespread ferruginous conditions. Iron formations formed on clastic-starved coastal shelves where dissolved iron upwelled and mixed with oxygenated surface water. The

  9. Probably good diagrams for learning: representational epistemic recodification of probability theory.

    PubMed

    Cheng, Peter C-H

    2011-07-01

    The representational epistemic approach to the design of visual displays and notation systems advocates encoding the fundamental conceptual structure of a knowledge domain directly in the structure of a representational system. It is claimed that representations so designed will benefit from greater semantic transparency, which enhances comprehension and ease of learning, and plastic generativity, which makes the meaningful manipulation of the representation easier and less error prone. Epistemic principles for encoding fundamental conceptual structures directly in representational schemes are described. The diagrammatic recodification of probability theory is undertaken to demonstrate how the fundamental conceptual structure of a knowledge domain can be analyzed, how the identified conceptual structure may be encoded in a representational system, and the cognitive benefits that follow. An experiment shows the new probability space diagrams are superior to the conventional approach for learning this conceptually challenging topic.

  10. What is preexisting strength? Predicting free association probabilities, similarity ratings, and cued recall probabilities.

    PubMed

    Nelson, Douglas L; Dyrdal, Gunvor M; Goodmon, Leilani B

    2005-08-01

    Measuring lexical knowledge poses a challenge to the study of the influence of preexisting knowledge on the retrieval of new memories. Many tasks focus on word pairs, but words are embedded in associative networks, so how should preexisting pair strength be measured? It has been measured by free association, similarity ratings, and co-occurrence statistics. Researchers interpret free association response probabilities as unbiased estimates of forward cue-to-target strength. In Study 1, analyses of large free association and extralist cued recall databases indicate that this interpretation is incorrect. Competitor and backward strengths bias free association probabilities, and as with other recall tasks, preexisting strength is described by a ratio rule. In Study 2, associative similarity ratings are predicted by forward and backward, but not by competitor, strength. Preexisting strength is not a unitary construct, because its measurement varies with method. Furthermore, free association probabilities predict extralist cued recall better than do ratings and co-occurrence statistics. The measure that most closely matches the criterion task may provide the best estimate of the identity of preexisting strength. PMID:16447386

  12. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. PMID:25704578

  13. On lacunary statistical convergence of order α in probability

    NASA Astrophysics Data System (ADS)

    Işık, Mahmut; Et, Kübra Elif

    2015-09-01

    In this study, we examine the concepts of lacunary statistical convergence of order α in probability and Nθ—convergence of order α in probability. We give some relations connected to these concepts.

  14. Probability-summation model of multiple laser-exposure effects.

    PubMed

    Menendez, A R; Cheney, F E; Zuclich, J A; Crump, P

    1993-11-01

    A probability-summation model is introduced to provide quantitative criteria for discriminating independent from interactive effects of multiple laser exposures on biological tissue. Data that differ statistically from predictions of the probability-summation model indicate the action of sensitizing (synergistic/positive) or desensitizing (hardening/negative) biophysical interactions. Interactions are indicated when response probabilities vary with changes in the spatial or temporal separation of exposures. In the absence of interactions, probability-summation parsimoniously accounts for "cumulative" effects. Data analyzed using the probability-summation model show instances of both sensitization and desensitization of retinal tissue by laser exposures. Other results are shown to be consistent with probability-summation. The relevance of the probability-summation model to previous laser-bioeffects studies, models, and safety standards is discussed and an appeal is made for improved empirical estimates of response probabilities for single exposures.
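
    The independence benchmark used by the model is the complement rule for independent events; a minimal sketch (NumPy assumed; the numbers are illustrative):

      import numpy as np

      def probability_summation(single_exposure_probs):
          """Predicted response probability if multiple exposures act
          independently: P = 1 - prod(1 - p_i)."""
          p = np.asarray(single_exposure_probs, dtype=float)
          return 1.0 - np.prod(1.0 - p)

      # Two exposures that each damage tissue with probability 0.30 predict
      # 1 - 0.7 * 0.7 = 0.51 under independence; observed rates well above
      # this suggest sensitization, well below it desensitization.
      print(probability_summation([0.30, 0.30]))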

  15. Probability in Theories With Complex Dynamics and Hardy's Fifth Axiom

    NASA Astrophysics Data System (ADS)

    Burić, Nikola

    2010-08-01

    L. Hardy has formulated an axiomatization program of quantum mechanics and generalized probability theories that has been quite influential. In this paper, properties of typical Hamiltonian dynamical systems are used to argue that there are applications of probability in physical theories of systems with dynamical complexity that require continuous spaces of pure states. Hardy’s axiomatization program does not deal with such theories. In particular Hardy’s fifth axiom does not differentiate between such applications of classical probability and quantum probability.

  16. Pretest probability assessment derived from attribute matching

    PubMed Central

    Kline, Jeffrey A; Johnson, Charles L; Pollack, Charles V; Diercks, Deborah B; Hollander, Judd E; Newgard, Craig D; Garvey, J Lee

    2005-01-01

    Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for ability to produce PTP estimation <2% in a validation set of 8,120 patients evaluated for possible ACS who did not have ST segment elevation on ECG. 1,061 patients were excluded prior to validation analysis because of ST-segment elevation (713), missing data (77) or being lost to follow-up (271). Results In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for the attribute matching curve and 0.68 (95% CI 0.62 to 0.77) for LRE. The attribute matching system categorized 1,670 (24%, 95% CI = 23–25%) patients as having a PTP < 2.0%; 28 developed ACS (1.7%, 95% CI = 1.1–2.4%). The LRE categorized 244 (4%, 95% CI = 3–4%) with PTP < 2.0%; four developed ACS (1.6%, 95% CI = 0.4–4.1%). Conclusion Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE. PMID:16095534
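
    The core of attribute matching is an exact look-up in the reference database followed by an observed outcome rate; a minimal sketch (pandas assumed; the column names and the outcome field are hypothetical, not those of the study):

      import pandas as pd

      def attribute_matching_ptp(reference_db, profile, outcome_col="acs"):
          """Pretest probability = outcome rate among reference patients whose
          attributes exactly match the queried profile."""
          mask = pd.Series(True, index=reference_db.index)
          for attribute, value in profile.items():
              mask &= reference_db[attribute] == value
          matches = reference_db[mask]
          if matches.empty:
              return None, 0          # no exact profile in the database
          return matches[outcome_col].mean(), len(matches)

      # ptp, n = attribute_matching_ptp(db, {"age_band": "40-49", "sex": "F", ...})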

  17. A change in the geodynamics of continental growth 3 billion years ago.

    PubMed

    Dhuime, Bruno; Hawkesworth, Chris J; Cawood, Peter A; Storey, Craig D

    2012-03-16

    Models for the growth of continental crust rely on knowing the balance between the generation of new crust and the reworking of old crust throughout Earth's history. The oxygen isotopic composition of zircons, for which uranium-lead and hafnium isotopic data provide age constraints, is a key archive of crustal reworking. We identified systematic variations in hafnium and oxygen isotopes in zircons of different ages that reveal the relative proportions of reworked crust and of new crust through time. Growth of continental crust appears to have been a continuous process, albeit at variable rates. A marked decrease in the rate of crustal growth at ~3 billion years ago may be linked to the onset of subduction-driven plate tectonics. PMID:22422979

  18. Atmospheric carbon dioxide: a driver of photosynthetic eukaryote evolution for over a billion years?

    PubMed

    Beerling, David J

    2012-02-19

    Exciting evidence from diverse fields, including physiology, evolutionary biology, palaeontology, geosciences and molecular genetics, is providing an increasingly secure basis for robustly formulating and evaluating hypotheses concerning the role of atmospheric carbon dioxide (CO(2)) in the evolution of photosynthetic eukaryotes. Such studies span over a billion years of evolutionary change, from the origins of eukaryotic algae through to the evolution of our present-day terrestrial floras, and have relevance for plant and ecosystem responses to future global CO(2) increases. The papers in this issue reflect the breadth and depth of approaches being adopted to address this issue. They reveal new discoveries pointing to deep evidence for the role of CO(2) in shaping evolutionary changes in plants and ecosystems, and establish an exciting cross-disciplinary research agenda for uncovering new insights into feedbacks between biology and the Earth system.

  19. Billion-atom synchronous parallel kinetic Monte Carlo simulations of critical 3D Ising systems

    SciTech Connect

    Martinez, E.; Monasterio, P.R.; Marian, J.

    2011-02-20

    An extension of the synchronous parallel kinetic Monte Carlo (spkMC) algorithm developed by Martinez et al. [J. Comp. Phys. 227 (2008) 3804] to discrete lattices is presented. The method solves the master equation synchronously by recourse to null events that keep all processors' time clocks current in a global sense. Boundary conflicts are resolved by adopting a chessboard decomposition into non-interacting sublattices. We find that the bias introduced by the spatial correlations attendant to the sublattice decomposition is within the standard deviation of serial calculations, which confirms the statistical validity of our algorithm. We have analyzed the parallel efficiency of spkMC and find that it scales consistently with problem size and sublattice partition. We apply the method to the calculation of scale-dependent critical exponents in billion-atom 3D Ising systems, with very good agreement with state-of-the-art multispin simulations.
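
    A highly simplified sketch of the null-event idea at the heart of synchronous parallel kMC (NumPy assumed; a real implementation also needs the chessboard sublattice decomposition and boundary-conflict handling described above): all domains advance by a common time increment set by the globally largest rate, and less active domains idle with the complementary probability.

      import numpy as np

      rng = np.random.default_rng(0)

      def spkmc_step(local_rates):
          """One synchronous step: a shared time increment drawn from the
          maximum total rate; each domain executes a real event with
          probability R_k / R_max and a null event otherwise."""
          rates = np.asarray(local_rates, dtype=float)
          r_max = rates.max()
          dt = rng.exponential(1.0 / r_max)            # common clock advance
          real_event = rng.random(rates.size) < rates / r_max
          return dt, real_event

      dt, real = spkmc_step([4.0, 1.0, 2.5])
      print(dt, real)   # the low-rate domains are the most likely to idle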

  20. Investigation of Radar Propagation in Buildings: A 10 Billion Element Cartesian-Mesh FETD Simulation

    SciTech Connect

    Stowell, M L; Fasenfest, B J; White, D A

    2008-01-14

    In this paper large scale full-wave simulations are performed to investigate radar wave propagation inside buildings. In principle, a radar system combined with sophisticated numerical methods for inverse problems can be used to determine the internal structure of a building. The composition of the walls (cinder block, re-bar) may affect the propagation of the radar waves in a complicated manner. In order to provide a benchmark solution of radar propagation in buildings, including the effects of typical cinder block and re-bar, we performed large scale full wave simulations using a Finite Element Time Domain (FETD) method. This particular FETD implementation is tuned for the special case of an orthogonal Cartesian mesh and hence resembles FDTD in accuracy and efficiency. The method was implemented on a general-purpose massively parallel computer. In this paper we briefly describe the radar propagation problem, the FETD implementation, and we present results of simulations that used over 10 billion elements.

  1. Star Formation in Galaxy Clusters Over the Past 10 Billion Years

    NASA Astrophysics Data System (ADS)

    Tran, Kim-Vy

    2012-01-01

    Galaxy clusters are the largest gravitationally bound systems in the universe and include the most massive galaxies in the universe; this makes galaxy clusters ideal laboratories for disentangling the nature versus nurture aspect of how galaxies evolve. Understanding how galaxies form and evolve in clusters continues to be a fundamental question in astronomy. The ages and assembly histories of galaxies in rich clusters test both stellar population models and hierarchical formation scenarios. Is star formation in cluster galaxies simply accelerated relative to their counterparts in the lower density field, or do cluster galaxies assemble their stars in a fundamentally different manner? To answer this question, I review multi-wavelength results on star formation in galaxy clusters from Coma to the most distant clusters yet discovered at look-back times of 10 billion years (z ~ 2).

  2. Barium fluoride whispering-gallery-mode disk-resonator with one billion quality-factor.

    PubMed

    Lin, Guoping; Diallo, Souleymane; Henriet, Rémi; Jacquot, Maxime; Chembo, Yanne K

    2014-10-15

    We demonstrate a monolithic optical whispering-gallery-mode resonator fabricated with barium fluoride (BaF₂) with an ultra-high quality (Q) factor above 10⁹ at 1550 nm, measured with both the linewidth and cavity-ring-down methods. Vertical scanning optical profilometry shows that a root mean square surface roughness of 2 nm is achieved for our mm-size disk. To the best of our knowledge, we show for the first time that a one-billion Q-factor is achievable by precision polishing in relatively soft crystals with a Mohs hardness of 3. We show that complex thermo-optical dynamics can take place in these resonators. Besides the usual applications in nonlinear optics and microwave photonics, high-energy particle scintillation detection utilizing monolithic BaF₂ resonators potentially becomes feasible. PMID:25361142

  3. An exhumation history of continents over billion-year time scales.

    PubMed

    Blackburn, Terrence J; Bowring, Samuel A; Perron, J Taylor; Mahan, Kevin H; Dudas, Francis O; Barnhart, Katherine R

    2012-01-01

    The continental lithosphere contains the oldest and most stable structures on Earth, where fragments of ancient material have eluded destruction by tectonic and surface processes operating over billions of years. Although present-day erosion of these remnants is slow, a record of how they have uplifted, eroded, and cooled over Earth's history can provide insight into the physical properties of the continents and the forces operating to exhume them over geologic time. We constructed a continuous record of ancient lithosphere cooling with the use of uranium-lead (U-Pb) thermochronology on volcanically exhumed lower crustal fragments. Combining these measurements with thermal and Pb-diffusion models constrains the range of possible erosion histories. Measured U-Pb data are consistent with extremely low erosion rates persisting over time scales approaching the age of the continents themselves.

  4. Extraterrestrial demise of banded iron formations 1.85 billion years ago

    USGS Publications Warehouse

    Slack, J.F.; Cannon, W.F.

    2009-01-01

    In the Lake Superior region of North America, deposition of most banded iron formations (BIFs) ended abruptly 1.85 Ga ago, coincident with the oceanic impact of the giant Sudbury extraterrestrial bolide. We propose a new model in which this impact produced global mixing of shallow oxic and deep anoxic waters of the Paleoproterozoic ocean, creating a suboxic redox state for deep seawater. This suboxic state, characterized by only small concentrations of dissolved O2 (~1 μM), prevented transport of hydrothermally derived Fe(II) from the deep ocean to continental-margin settings, ending an ~1.1 billion-year-long period of episodic BIF mineralization. The model is supported by the nature of Precambrian deep-water exhalative chemical sediments, which changed from predominantly sulfide facies prior to ca. 1.85 Ga to mainly oxide facies thereafter. © 2009 Geological Society of America.

  5. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
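
    The conjugate update behind such an analysis is one line; the sketch below (NumPy assumed) uses made-up pseudo-counts and roll counts purely for illustration, not values from the instruction manual or the article.

      import numpy as np

      def dirichlet_posterior(prior_alpha, counts):
          """Dirichlet(alpha) prior + multinomial counts -> Dirichlet(alpha + counts)."""
          alpha_post = np.asarray(prior_alpha, float) + np.asarray(counts, float)
          return alpha_post, alpha_post / alpha_post.sum()   # params, posterior mean

      prior = [10, 6, 3, 1]     # hypothetical pseudo-counts for four outcomes
      rolls = [48, 22, 9, 2]    # hypothetical observed rolls
      alpha_post, posterior_mean = dirichlet_posterior(prior, rolls)
      print(posterior_mean)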

  6. Factors influencing reporting and harvest probabilities in North American geese

    USGS Publications Warehouse

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
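
    The study fits a joint multinomial model, but the underlying logic can be shown with a simpler ratio estimator (plain Python; the recovery rates below are hypothetical): if the highest-reward bands are assumed to be reported with certainty, the reporting probability of standard bands is the ratio of the direct-recovery rates, and harvest probability is the recovery rate corrected by that reporting probability.

      def reporting_probability(recovery_rate_standard, recovery_rate_max_reward):
          """Reporting probability of standard bands, assuming maximum-reward
          bands are always reported."""
          return recovery_rate_standard / recovery_rate_max_reward

      def harvest_probability(recovery_rate_standard, reporting_prob):
          """Recovery rate corrected for bands harvested but never reported."""
          return recovery_rate_standard / reporting_prob

      lam = reporting_probability(0.044, 0.060)    # hypothetical rates -> ~0.73
      print(lam, harvest_probability(0.044, lam))  # corrected harvest probability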

  7. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  8. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  9. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  10. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  11. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  12. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 9 2013-04-01 2013-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  13. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  14. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  15. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  16. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  17. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  18. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  19. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  20. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  1. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  2. An anoxic, Fe(II)-rich, U-poor ocean 3.46 billion years ago

    NASA Astrophysics Data System (ADS)

    Li, Weiqiang; Czaja, Andrew D.; Van Kranendonk, Martin J.; Beard, Brian L.; Roden, Eric E.; Johnson, Clark M.

    2013-11-01

    The oxidation state of the atmosphere and oceans on the early Earth remains controversial. Although it is accepted by many workers that the Archean atmosphere and ocean were anoxic, hematite in the 3.46 billion-year-old (Ga) Marble Bar Chert (MBC) from Pilbara Craton, NW Australia, has figured prominently in arguments that the Paleoarchean atmosphere and ocean were fully oxygenated. In this study, we report the Fe isotope compositions and U concentrations of the MBC, and show that the samples have extreme heavy Fe isotope enrichment, where δ56Fe values range between +1.5‰ and +2.6‰, the highest δ56Fe values for bulk samples yet reported. The high δ56Fe values of the MBC require very low levels of oxidation and, in addition, point to a Paleoarchean ocean that had high aqueous Fe(II) contents. A dispersion/reaction model indicates that O2 contents in the photic zone of the ocean were less than 10(-3) μM, which suggests that the ocean was essentially anoxic. An independent test of anoxic conditions is provided by U-Th-Pb isotope systematics, which show that U contents in the Paleoarchean ocean were likely below 0.02 ppb, two orders of magnitude lower than the modern ocean. Collectively, the Fe and U data indicate a reduced, Fe(II)-rich, U-poor environment in the Archean oceans at 3.46 billion years ago. Given the evidence for photosynthetic communities provided by broadly coeval stromatolites, these results suggest that an important photosynthetic pathway in the Paleoarchean oceans may have been anoxygenic photosynthetic Fe(II) oxidation.

  3. IRON AND α-ELEMENT PRODUCTION IN THE FIRST ONE BILLION YEARS AFTER THE BIG BANG

    SciTech Connect

    Becker, George D.; Carswell, Robert F.; Sargent, Wallace L. W.; Rauch, Michael

    2012-01-10

    We present measurements of carbon, oxygen, silicon, and iron in quasar absorption systems existing when the universe was roughly one billion years old. We measure column densities in nine low-ionization systems at 4.7 < z < 6.3 using Keck, Magellan, and Very Large Telescope optical and near-infrared spectra with moderate to high resolution. The column density ratios among C II, O I, Si II, and Fe II are nearly identical to sub-damped Lyα systems (sub-DLAs) and metal-poor ([M/H] ≤ -1) DLAs at lower redshifts, with no significant evolution over 2 ≲ z ≲ 6. The estimated intrinsic scatter in the ratio of any two elements is also small, with a typical rms deviation of ≲ 0.1 dex. These facts suggest that dust depletion and ionization effects are minimal in our z > 4.7 systems, as in the lower-redshift DLAs, and that the column density ratios are close to the intrinsic relative element abundances. The abundances in our z > 4.7 systems are therefore likely to represent the typical integrated yields from stellar populations within the first gigayear of cosmic history. Due to the time limit imposed by the age of the universe at these redshifts, our measurements thus place direct constraints on the metal production of massive stars, including iron yields of prompt supernovae. The lack of redshift evolution further suggests that the metal inventories of most metal-poor absorption systems at z ≳ 2 are also dominated by massive stars, with minimal contributions from delayed Type Ia supernovae or winds from asymptotic giant branch stars. The relative abundances in our systems broadly agree with those in very metal-poor, non-carbon-enhanced Galactic halo stars. This is consistent with the picture in which present-day metal-poor stars were potentially formed as early as one billion years after the big bang.

  4. Searching for Organics Preserved in 4.5 Billion Year Old Salt

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael E.; Fries, M.; Steele, A.; Bodnar, R.

    2012-01-01

    Our understanding of early solar system fluids took a dramatic turn a decade ago with the discovery of fluid inclusion-bearing halite (NaCl) crystals in the matrix of two freshly fallen brecciated H chondrite falls, Monahans and Zag. Both meteorites are regolith breccias, and contain xenolithic halite (and minor admixed sylvite -- KCl) crystals in their regolith lithologies. The halites are purple to dark blue, due to the presence of color centers (electrons in anion vacancies) which slowly accumulated as 40K (in sylvite) decayed over billions of years. The halites were dated by K-Ar, Rb-Sr and I-Xe systematics to be 4.5 billion years old. The "blue" halites were a fantastic discovery for the following reasons: (1) Halite+sylvite can be dated (K is in sylvite and will substitute for Na in halite, Rb substitutes in halite for Na, and I substitutes for Cl). (2) The blue color is lost if the halite dissolves on Earth and reprecipitates (because the newly-formed halite has no color centers), so the color serves as a "freshness" or pristinity indicator. (3) Halite frequently contains aqueous fluid inclusions. (4) Halite contains no structural oxygen, carbon or hydrogen, making it an ideal material to measure these isotopic systems in any fluid inclusions. (5) It is possible to directly measure fluid inclusion formation temperatures, and thus directly measure the temperature of the mineralizing aqueous fluid. In addition to these two ordinary chondrites, halite grains have been reliably reported in several ureilites, an additional ordinary chondrite (Jilin), and in the carbonaceous chondrite (Murchison), although these reports were unfortunately not taken seriously. We have lately found additional fluid inclusions in carbonates in several additional carbonaceous chondrites. Meteoritic aqueous fluid inclusions are apparently relatively widespread in meteorites, though very small and thus difficult to analyze.

  5. China. Country profile. [China's billion consumers are a rapidly changing market].

    PubMed

    Hardee, K

    1984-10-01

    This article provides a summary of demographic, social, and economic characteristics of the People's Republic of China. Chinese leaders project that achievement of the 4 modernizations (agriculture, industry, science, and technology) will double the per capita income level to $800/year by 2000. Although industrial and agricultural growth have outpaced population growth, stringent population control is considered necessary for continued economic development. China's 1982 population was 1.008 billion, with a birth rate of 20.91, a death rate of 6.36, and a 14.55 rate of natural increase. The growth rate declined from 1.3% in 1982 to 1.15% in 1983. To achieve its goal of preventing the population from exceeding 1.2 billion by the year 2000, the government urges couples to have only 1 child. This policy has been successful in the cities but faces opposition in the rural areas. The sex ratio is 106 males to every 100 females, and there is concern about female infanticide. In 1982 the average household size ranged from a high of 5.2 persons in Qinghai and Yunnan to a low of 3.6 persons in Shanghai. 39% of the population lives in nuclear families without relatives. The literacy rate stood at 77% of those over 12 years of age in 1982, but males outnumber females at higher levels of education. China's campaign to improve health has focused on preventive measures, and there are an estimated 3-5 million health care workers. The 1982 labor force participation rate for those 15-64 years of age was 87.7%, with 44% of workers employed in agriculture. 76.6% of women work, primarily in labor-intensive, low-wage occupations.

  6. Pattern formation, logistics, and maximum path probability

    NASA Astrophysics Data System (ADS)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  7. Probability Distribution for Flowing Interval Spacing

    SciTech Connect

    S. Kuzio

    2004-09-22

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be
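
    The report develops an uncertainty distribution for flowing-interval spacing from measured data; the sketch below (SciPy assumed; the spacing values and the choice of a log-normal form are illustrative only, not taken from the report) shows the general mechanics of fitting such a distribution and extracting quantiles for downstream transport modeling.

      import numpy as np
      from scipy import stats

      # Hypothetical flowing-interval spacings (metres) from flow-meter surveys.
      spacings = np.array([12.0, 35.0, 8.0, 60.0, 22.0, 95.0, 15.0, 41.0])

      shape, loc, scale = stats.lognorm.fit(spacings, floc=0.0)
      dist = stats.lognorm(shape, loc=loc, scale=scale)

      # Median and a 90% interval to feed an uncertainty distribution downstream.
      print(dist.median(), dist.ppf([0.05, 0.95]))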

  8. Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-04-01

    Ensemble forecasting has for a long time been used as a method in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters which are different in space and time, but still can give a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high-flows rather than the forecast skill of lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to find a total probability, the post-processed mean and uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but assuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. This can in some cases have a slight negative
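
    For concreteness, a minimal EMOS-style post-processing sketch in the spirit of Gneiting et al. (2005) is given below (NumPy/SciPy assumed; fitting by maximum likelihood rather than CRPS, and variable names are illustrative): the calibrated forecast is a normal distribution whose mean and variance are affine functions of the ensemble mean and ensemble variance.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def fit_emos(ens_mean, ens_var, obs):
          """Fit obs ~ N(a + b*ens_mean, c + d*ens_var) by maximum likelihood."""
          def negloglik(params):
              a, b, c, d = params
              var = np.maximum(c + d * ens_var, 1e-6)
              return -norm.logpdf(obs, loc=a + b * ens_mean,
                                  scale=np.sqrt(var)).sum()
          return minimize(negloglik, x0=[0.0, 1.0, 1.0, 1.0],
                          method="Nelder-Mead").x

      # a, b, c, d = fit_emos(train_mean, train_var, train_obs)
      # Quantiles of N(a + b*m, c + d*v) then give the calibrated uncertainty bounds.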

  9. Galaxy evolution. Evidence for mature bulges and an inside-out quenching phase 3 billion years after the Big Bang.

    PubMed

    Tacchella, S; Carollo, C M; Renzini, A; Förster Schreiber, N M; Lang, P; Wuyts, S; Cresci, G; Dekel, A; Genzel, R; Lilly, S J; Mancini, C; Newman, S; Onodera, M; Shapley, A; Tacconi, L; Woo, J; Zamorani, G

    2015-04-17

    Most present-day galaxies with stellar masses ≥10(11) solar masses show no ongoing star formation and are dense spheroids. Ten billion years ago, similarly massive galaxies were typically forming stars at rates of hundreds of solar masses per year. It is debated how star formation ceased, on which time scales, and how this "quenching" relates to the emergence of dense spheroids. We measured stellar mass and star-formation rate surface density distributions in star-forming galaxies at redshift 2.2 with ~1-kiloparsec resolution. We find that, in the most massive galaxies, star formation is quenched from the inside out, on time scales less than 1 billion years in the inner regions, up to a few billion years in the outer disks. These galaxies sustain high star-formation activity at large radii, while hosting fully grown and already quenched bulges in their cores. PMID:25883353

  10. Switching To Less-Expensive Blindness Drug Could Save Medicare Part B $18 Billion Over A Ten-Year Period

    PubMed Central

    Hutton, DW; Newman-Casey, PA; Tavag, M; Zacks, DN; Stein, JD

    2014-01-01

    The biologic drugs bevacizumab and ranibizumab have revolutionized treatment of diabetic macular edema and macular degeneration, leading causes of blindness. Ophthalmologic use of these drugs has increased, now accounting for roughly one-sixth of the Medicare Part B drug budget. Ranibizumab and bevacizumab have similar efficacy and potentially minor differences in adverse event rates, but at $2,023 per dose, ranibizumab costs forty times more than bevacizumab. Using modeling methods, we predict ten-year (2010–2020) population-level costs and health benefits of using bevacizumab and ranibizumab. Our results show that if all patients were treated with the less-expensive bevacizumab instead of current usage patterns, Medicare Part B, patients, and the health care system would save $18 billion, $4.6 billion, and $29 billion, respectively. Altering patterns of use with these therapies by encouraging bevacizumab use and hastening approval of biosimilar therapies would dramatically reduce spending without substantially affecting patient outcomes. PMID:24889941

  12. Tensile and fatigue data for irradiated and unirradiated AISI 310 stainless steel and titanium - 5 percent aluminum - 2.5 percent tin: Application of the method of universal slopes

    NASA Technical Reports Server (NTRS)

    Debogdan, C. E.

    1973-01-01

    Irradiated and unirradiated tensile and fatigue specimens of AISI 310 stainless steel and Ti-5Al-2.5Sn were tested in the range of 100 to 10,000 cycles to failure to determine the applicability of the method of universal slopes to irradiated materials. Tensile data for both materials showed a decrease in ductility and increase in ultimate tensile strength due to irradiation. Irradiation caused a maximum change in fatigue life of only 15 to 20 percent for both materials. The method of universal slopes predicted all the fatigue data for the 310 SS (irradiated as well as unirradiated) within a life factor of 2. For the titanium alloy, 95 percent of the data was predicted within a life factor of 3.
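
    The method of universal slopes referred to above estimates strain-life fatigue curves from tensile properties alone; a sketch of the commonly cited Manson form follows (Python; treat the formula statement and the sample property values as assumptions for illustration, not data from this report).

      import numpy as np

      def universal_slopes_strain_range(n_f, sigma_u, modulus, reduction_in_area):
          """Manson's method of universal slopes:
          total strain range = 3.5*(Su/E)*Nf**-0.12 + D**0.6 * Nf**-0.6,
          with ductility D = ln(1 / (1 - RA))."""
          ductility = np.log(1.0 / (1.0 - reduction_in_area))
          elastic = 3.5 * (sigma_u / modulus) * n_f ** -0.12
          plastic = ductility ** 0.6 * n_f ** -0.6
          return elastic + plastic

      # Hypothetical properties (MPa, MPa, fraction) over 100 to 10,000 cycles:
      cycles = np.logspace(2, 4, 5)
      print(universal_slopes_strain_range(cycles, sigma_u=600.0,
                                          modulus=200e3, reduction_in_area=0.4))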

  13. Surprisingly rational: probability theory plus noise explains biases in judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. PMID:25090427
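
    The account described above can be reproduced in a few lines: probabilities are estimated by counting noisy samples of past events, each read out incorrectly with some small probability d, which by itself pushes estimates toward 0.5. A minimal simulation (NumPy assumed; the sample size and noise rate are illustrative):

      import numpy as np

      rng = np.random.default_rng(1)

      def noisy_probability_estimate(p_true, n_samples=100, d_noise=0.1):
          """Count-based estimate where each sampled event is misread with
          probability d_noise, so E[estimate] = p_true*(1 - 2d) + d."""
          events = rng.random(n_samples) < p_true
          flips = rng.random(n_samples) < d_noise
          return np.mean(events ^ flips)

      # Small probabilities are overestimated and large ones underestimated
      # (conservatism) purely as a consequence of the symmetric read-out noise.
      for p in (0.05, 0.50, 0.95):
          print(p, np.mean([noisy_probability_estimate(p) for _ in range(2000)]))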

  14. Probability shapes perceptual precision: A study in orientation estimation.

    PubMed

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention."

  15. On the shape of the probability weighting function.

    PubMed

    Gonzalez, R; Wu, G

    1999-02-01

    Empirical studies have shown that decision makers do not usually treat probabilities linearly. Instead, people tend to overweight small probabilities and underweight large probabilities. One way to model such distortions in decision making under risk is through a probability weighting function. We present a nonparametric estimation procedure for assessing the probability weighting function and value function at the level of the individual subject. The evidence in the domain of gains supports a two-parameter weighting function, where each parameter is given a psychological interpretation: one parameter measures how the decision maker discriminates probabilities, and the other parameter measures how attractive the decision maker views gambling. These findings are consistent with a growing body of empirical and theoretical work attempting to establish a psychological rationale for the probability weighting function. PMID:10090801
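
    One widely used two-parameter form with exactly this interpretation is the linear-in-log-odds weighting function, sketched below (NumPy assumed; the parameter values are illustrative): gamma governs curvature (how sharply probabilities are discriminated) and delta governs elevation (how attractive gambling appears).

      import numpy as np

      def weight(p, gamma=0.6, delta=0.8):
          """w(p) = delta*p**gamma / (delta*p**gamma + (1 - p)**gamma)."""
          p = np.asarray(p, dtype=float)
          num = delta * p ** gamma
          return num / (num + (1.0 - p) ** gamma)

      # The inverse-S shape: small probabilities overweighted, large underweighted.
      print(weight([0.01, 0.10, 0.50, 0.90, 0.99]))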

  16. Nonlinear Neurobiological Probability Weighting Functions For Aversive Outcomes

    PubMed Central

    Berns, Gregory S.; Capra, C. Monica; Chappelow, Jonathan; Moore, Sara; Noussair, Charles

    2008-01-01

    While mainstream economic models assume that individuals treat probabilities objectively, many people tend to overestimate the likelihood of improbable events and underestimate the likelihood of probable events. However, a biological account for why probabilities would be treated this way does not yet exist. While undergoing fMRI, we presented individuals with a series of lotteries, defined by the voltage of an impending cutaneous electric shock and the probability with which the shock would be received. During the prospect phase, neural activity that tracked the probability of the expected outcome was observed in a circumscribed network of brain regions that included the anterior cingulate, visual, parietal, and temporal cortices. Most of these regions displayed responses to probabilities consistent with nonlinear probability weighting. The neural responses to passive lotteries predicted 79% of subsequent decisions when individuals were offered choices between different lotteries, and exceeded that predicted by behavior alone near the indifference point. PMID:18060809

  17. Sub-parts-per-billion level detection of NO2 using room-temperature quantum cascade lasers

    PubMed Central

    Pushkarsky, Michael; Tsekoun, Alexei; Dunayevskiy, Ilya G.; Go, Rowel; Patel, C. Kumar N.

    2006-01-01

    We report the sub-parts-per-billion-level detection of NO2 using tunable laser-based photoacoustic spectroscopy where the laser radiation is obtained from a room-temperature continuous-wave high-power quantum cascade laser operating in an external grating cavity configuration. The continuously tunable external grating cavity quantum cascade laser produces maximum single-frequency output of ≈300 mW tunable over ≈350 nm centered at 6.25 μm. We demonstrate minimum detection level of ≈0.5 parts per billion of NO2 in the presence of humidified air. PMID:16829569

  18. The 18O/16O Ratio of 2-Billion-Year-Old Seawater Inferred from Ancient Oceanic Crust.

    PubMed

    Holmden, C; Muehlenbachs, K

    1993-03-19

    An oxygen isotope profile of the 2-billion-year-old Purtuniq ophiolite overlaps with similar profiles of younger ophiolites and the modern oceanic crust. This overlap implies (i) that there was a similar style of seawater-ocean crust interaction during the past 2 billion years; (ii) that the oxygen isotope composition of early Proterozoic seawater was similar to the modern value; (iii) that early Proterozoic sea-floor spreading rates were similar to, or greater than, average modern rates; and (iv) that early Proterozoic carbonate rocks and cherts with low (18)O/(16)O ratios do not reflect global-scale (18)O depletion of early Proterozoic oceans. PMID:17816892

  19. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes with magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan (the western, eastern, and northeastern regions) using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, high probability values are usually associated with clustered events, such as events with foreshocks and sequences of events occurring within a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show high probability values around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the maps are not random forecasts, but they also suggest that lowering the magnitude threshold for forecasted large earthquakes may not improve the forecast method itself. Both the probability time series and the probability maps show that the probability obtained from the quiescence model increases before a large earthquake, whereas the probability obtained from the activation model increases as the large earthquakes occur. These results lead us to conclude that the quiescence model has better forecast potential than the activation model.
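    The record does not spell out the estimator, so the sketch below is only schematic: a hypothetical Python routine that estimates the empirical conditional probability of a large event within a forecast horizon, given that the preceding rate of smaller (above-completeness) earthquakes is in a quiescent or activated state. All field names and thresholds are assumptions, not the authors' implementation:

      import numpy as np

      def conditional_probability(times, mags, m_c, m_large, window, horizon,
                                  rate_thresh, mode="quiescence"):
          # Empirical P(large event within `horizon` | small-event rate condition).
          # times, mags : event catalog (times in years, magnitudes)
          # m_c         : completeness magnitude; events >= m_c define the rate
          # m_large     : magnitude threshold for "large" target events
          # window      : length (years) over which the small-event rate is measured
          # horizon     : forecast horizon (years)
          # rate_thresh : rate (events/yr) separating quiescent from activated states
          times = np.asarray(times, dtype=float)
          mags = np.asarray(mags, dtype=float)
          small_t = times[mags >= m_c]
          large_t = times[mags >= m_large]
          t_grid = np.arange(times.min() + window, times.max() - horizon, horizon)
          hits = trials = 0
          for t in t_grid:
              rate = np.sum((small_t > t - window) & (small_t <= t)) / window
              in_state = rate < rate_thresh if mode == "quiescence" else rate >= rate_thresh
              if in_state:
                  trials += 1
                  hits += int(np.any((large_t > t) & (large_t <= t + horizon)))
          return hits / trials if trials else float("nan")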

  20. Young Star Probably Ejected From Triple System

    NASA Astrophysics Data System (ADS)

    2003-01-01

    Astronomers analyzing nearly 20 years of data from the National Science Foundation's Very Large Array radio telescope have discovered that a small star in a multiple-star system in the constellation Taurus probably has been ejected from the system after a close encounter with one of the system's more-massive components, presumed to be a compact double star. This is the first time any such event has been observed. "Our analysis shows a drastic change in the orbit of this young star after it made a close approach to another object in the system," said Luis Rodriguez of the Institute of Astronomy of the National Autonomous University of Mexico (UNAM). "The young star was accelerated to a large velocity by the close approach, and certainly now is in a very different, more remote orbit, and may even completely escape its companions," said Laurent Loinard, leader of the research team that also included Monica Rodriguez in addition to Luis Rodriguez. The UNAM astronomers presented their findings at the American Astronomical Society's meeting in Seattle, WA. The discovery of this chaotic event will be important for advancing our understanding of classical dynamic astronomy and of how stars evolve, including possibly providing an explanation for the production of the mysterious "brown dwarfs," the astronomers said. The scientists analyzed VLA observations of T Tauri, a multiple system of young stars some 450 light-years from Earth. The observations were made from 1983 to 2001. The T Tauri system includes a "Northern" star, the famous star that gives its name to the class of young visible stars, and a "Southern" system of stars, all orbiting each other. The VLA data were used to track the orbit of the smaller Southern star around the larger Southern object, presumed to be a pair of stars orbiting each other closely. The astronomers' plot of the smaller star's orbit shows that it followed an apparently elliptical orbit around its twin companions

  1. U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    SciTech Connect

    Downing, Mark; Eaton, Laurence M; Graham, Robin Lambert; Langholtz, Matthew H; Perlack, Robert D; Turhollow Jr, Anthony F; Stokes, Bryce; Brandt, Craig C

    2011-08-01

    The report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of 'potential' biomass based on numerous assumptions about current and future inventory, production capacity, availability, and technology. The analysis was made to determine if conterminous U.S. agriculture and forestry resources had the capability to produce at least one billion dry tons of sustainable biomass annually to displace 30% or more of the nation's present petroleum consumption. An effort was made to use conservative estimates to assure confidence in having sufficient supply to reach the goal. The potential biomass was projected to be reasonably available around mid-century when large-scale biorefineries are likely to exist. The study emphasized primary sources of forest- and agriculture-derived biomass, such as logging residues, fuel treatment thinnings, crop residues, and perennially grown grasses and trees. These primary sources have the greatest potential to supply large, reliable, and sustainable quantities of biomass. While the primary sources were emphasized, estimates of secondary residue and tertiary waste resources of biomass were also provided. The original Billion-Ton Resource Assessment, published in 2005, was divided into two parts-forest-derived resources and agriculture-derived resources. The forest resources included residues produced during the harvesting of merchantable timber, forest residues, and small-diameter trees that could become available through initiatives to reduce fire hazards and improve forest health; forest residues from land conversion; fuelwood extracted from forests; residues generated at primary forest product processing mills; and urban wood wastes, municipal solid wastes (MSW), and construction and demolition (C&D) debris. For these forest resources, only residues, wastes, and small-diameter trees were

  2. A physical-space approach for the probability hypothesis density and cardinalized probability hypothesis density filters

    NASA Astrophysics Data System (ADS)

    Erdinc, Ozgur; Willett, Peter; Bar-Shalom, Yaakov

    2006-05-01

    The probability hypothesis density (PHD) filter, an automatically track-managed multi-target tracker, is attracting increasing but cautious attention. Its derivation is elegant and mathematical, and thus of course many engineers fear it; perhaps that is currently limiting the number of researchers working on the subject. In this paper, we explore a physical-space approach - a bin model - which leads us to arrive at the same filter equations as the PHD. Unlike the original derivation of the PHD filter, the concepts used are the familiar ones of conditional probability. The original PHD suffers from a "target-death" problem in which even a single missed detection can lead to the apparent disappearance of a target. To obviate this, PHD originator Mahler has recently developed a new "cardinalized" version of PHD (CPHD). We are able to extend our physical-space derivation to the CPHD case as well. We stress that the original derivations are mathematically correct, and need no embellishment from us; our contribution here is to offer an alternative derivation, one that we find appealing.

  3. A resolution to express the sense of the Senate in support of reducing its budget by at least 5 percent.

    THOMAS, 112th Congress

    Sen. Wicker, Roger F. [R-MS]

    2011-03-08

    03/16/2011 Resolution agreed to in Senate without amendment and with a preamble by Unanimous Consent. (text: CR S1768)

  4. GNSS integer ambiguity validation based on posterior probability

    NASA Astrophysics Data System (ADS)

    Wu, Zemin; Bian, Shaofeng

    2015-10-01

    GNSS integer ambiguity validation has been considered a challenging task for decades. Several kinds of validation tests have been developed and are widely used, but their theoretical basis remains a weakness. Ambiguity validation is, at heart, a hypothesis-testing problem. In the framework of Bayesian hypothesis testing, the posterior probability is the canonical standard on which a statistical decision should be based. In this contribution, (i) we derive the posterior probability of the fixed ambiguity based on the Bayesian principle and modify it for practical ambiguity validation. (ii) The optimality of the posterior probability test is proved based on an extended Neyman-Pearson lemma. Since the failure rate is the issue users are most concerned about, (iii) we derive an upper bound on the failure rate of the posterior probability test, so the user can apply the test either with a fixed posterior probability threshold or with a fixed failure rate. Simulated as well as real observed data are used for experimental validation. The results show that (i) the posterior probability test is the most effective among the R-ratio test, the difference test, the ellipsoidal integer aperture test, and the posterior probability test; (ii) the posterior probability test is computationally efficient; and (iii) the failure rate estimate for the posterior probability test is useful.
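    A minimal sketch of the posterior probability computation is given below in Python, assuming a Gaussian float solution, a flat prior over integer vectors, and a brute-force search over a small neighbourhood of candidates; the paper's exact search strategy and decision thresholds are not reproduced here:

      import numpy as np
      from itertools import product

      def posterior_probabilities(a_float, Q, search_radius=1):
          # Posterior probability of each integer candidate z, proportional to
          # exp(-0.5 * (a_float - z)^T Q^{-1} (a_float - z)) under a flat prior.
          a_float = np.asarray(a_float, dtype=float)
          Q_inv = np.linalg.inv(Q)
          base = np.round(a_float).astype(int)
          cands, logw = [], []
          for off in product(range(-search_radius, search_radius + 1), repeat=len(a_float)):
              z = base + np.array(off)
              r = a_float - z
              cands.append(z)
              logw.append(-0.5 * r @ Q_inv @ r)
          logw = np.array(logw)
          w = np.exp(logw - logw.max())          # guard against underflow
          p = w / w.sum()
          order = np.argsort(-p)
          return [cands[i] for i in order], p[order]

      # The ambiguity is accepted ("fixed") when the leading posterior probability
      # exceeds a chosen threshold or meets the derived failure-rate bound.
      cands, probs = posterior_probabilities([1.2, -3.7], np.diag([0.04, 0.09]))
      print(cands[0], round(float(probs[0]), 3))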

  5. A redox-stratified ocean 3.2 billion years ago

    NASA Astrophysics Data System (ADS)

    Satkoski, Aaron M.; Beukes, Nicolas J.; Li, Weiqiang; Beard, Brian L.; Johnson, Clark M.

    2015-11-01

    It has traditionally been thought that, before the Great Oxidation Event (GOE) 2.4-2.2 billion years ago, oceanic water columns were uniformly anoxic due to a lack of oxygen-producing microorganisms. Recently, however, it has been proposed that transient oxygenation of shallow seawater occurred between 2.8 and 3.0 billion years ago. Here, we present a novel combination of stable Fe and radiogenic U-Th-Pb isotope data that demonstrate significant oxygen contents in the shallow oceans at 3.2 Ga, based on analysis of the Manzimnyama Banded Iron Formation (BIF), Fig Tree Group, South Africa. This unit is exceptional in that proximal, shallow-water and distal, deep-water facies are preserved. When compared to the distal, deep-water facies, the proximal samples show elevated U concentrations and moderately positive δ56Fe values, indicating vertical stratification in dissolved oxygen contents. The oxidizing conditions inferred from U abundances are robustly constrained by using samples shown by U-Th-Pb geochronology to have remained closed to U and Pb mobility. Although redox-sensitive elements have commonly been used to infer redox conditions in ancient rocks, post-depositional element mobility has rarely been tested, and U-Th-Pb geochronology can constrain open- or closed-system behavior. The U abundances and δ56Fe values of the Manzimnyama BIF suggest the proximal, shallow-water samples record precipitation under more strongly oxidizing conditions than the distal, deeper-water facies, which in turn indicates the existence of a discrete redox boundary between deep and shallow ocean waters at this time; this work, therefore, documents the oldest known preserved marine redox gradient in the rock record. The relative enrichment of O2 in the upper water column is likely due to the existence of oxygen-producing microorganisms such as cyanobacteria. These results provide a new approach for identifying free oxygen in Earth's ancient oceans, including confirming the age of redox

  6. The First Billion Years project: dark matter haloes going from contraction to expansion and back again

    NASA Astrophysics Data System (ADS)

    Davis, Andrew J.; Khochfar, Sadegh; Dalla Vecchia, Claudio

    2014-09-01

    We study the effect of baryons on the inner dark matter profile of the first galaxies using the First Billion Years simulation between z = 16 and 6, before secular evolution sets in. Using a large statistical sample from two simulations of the same volume and cosmological initial conditions, one with and one without baryons, we are able to directly compare haloes with their baryon-free counterparts, allowing a detailed study of the modifications to the dark matter density profile due to the presence of baryons during the first billion years of galaxy formation. For each of the ≈5000 haloes in our sample (3 × 10^7 M⊙ ≤ M_tot ≤ 5 × 10^9 M⊙), we quantify the impact of the baryons using η, defined as the ratio of the dark matter mass enclosed within 100 pc in the baryonic run to that of its counterpart in the run without baryons. During this epoch of rapid growth of galaxies, we find that many haloes of these first galaxies show an enhancement of dark matter in the halo centre compared to the baryon-free simulation, while many others show a deficit. We find that the mean value of η is close to unity, but there is a large dispersion, with a standard deviation of 0.677. The enhancement is cyclical in time and tracks the star formation cycle of the galaxy; as gas falls to the centre and forms stars, the dark matter moves in as well. Supernova (SN) feedback then removes the gas, and the dark matter again responds to the changing potential. We study three physical models relating the motion of baryons to that of the dark matter: adiabatic contraction, dynamical friction, and rapid outflows. We find that dynamical friction plays only a very minor role, while adiabatic contraction and the rapid outflows due to feedback describe well the enhancement (or decrement) of dark matter. For haloes which show significant decrements of dark matter in the core, we find that removing the dark matter requires an energy input between 10^51 and 10^53 erg. For our SN feedback prescription, this requires as a

  7. Dynamics of opinion formation with strengthen selection probability

    NASA Astrophysics Data System (ADS)

    Zhang, Haifeng; Jin, Zhen; Wang, Binghong

    2014-04-01

    The local majority rule is widely accepted as a paradigmatic model of opinion formation. In this paper, we study a model of opinion formation in which the opinion update rule is based neither on the majority rule nor on a linear selection probability, but on a strengthened selection probability controlled by an adjustable parameter β. In particular, our proposed probability function can approximately reproduce the two extreme cases (the linear probability function and the majority rule) and interpolate between them for different values of β. By studying this model on different kinds of networks, including regular networks and complex networks, we find that there exists an optimal value of β giving the most efficient convergence to consensus regardless of the topology of the network. This work reveals that, compared with the majority rule and the linear selection probability, the strengthened selection probability might be a more appropriate model for understanding the formation of opinions in society.
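    The record does not give the exact functional form; one plausible parameterization with the stated limiting behaviour (linear at β = 1, local majority rule as β grows large) is sketched below in Python, where x is the local fraction of neighbours holding opinion +1:

      import numpy as np

      def adopt_probability(x, beta):
          # Probability of adopting opinion +1 given a fraction x of +1 neighbours.
          # beta = 1 recovers the linear selection probability; large beta
          # approaches the deterministic local majority rule.
          x = np.asarray(x, dtype=float)
          return x**beta / (x**beta + (1.0 - x)**beta)

      for beta in (1, 3, 50):
          print(beta, adopt_probability([0.2, 0.5, 0.8], beta))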

  8. Industrial R&D Expenditures Rise to $22 Billion in 1974. Science Resources Studies Highlights, January 14, 1976.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    Reported in this newsletter in narrative, graphical, and tabular form are data related to industrial research and development expenditures in 1974, showing a seven percent increase over 1973. It is noted that more than 80 percent of a total of $22.3 billion was spent by five industries; these included electrical equipment and communication,…

  9. 77 FR 29458 - Supervisory Guidance on Stress Testing for Banking Organizations With More Than $10 Billion in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... Prudential Standards and Early Remediation Requirements for Covered Companies, 77 FR 594 (Jan. 5, 2012... organizations with consolidated assets of $10 billion or less. \\1\\ See 76 FR 35072 (June 15, 2011). All banking... (Pillar 2) Related to the Implementation of the Basel II Advanced Capital Framework, 73 FR 44620 (July...

  10. Mental Disorders Top The List Of The Most Costly Conditions In The United States: $201 Billion.

    PubMed

    Roehrig, Charles

    2016-06-01

    Estimates of annual health spending for a comprehensive set of medical conditions are presented for the entire US population and with totals benchmarked to the National Health Expenditure Accounts. In 2013 mental disorders topped the list of most costly conditions, with spending at $201 billion. PMID:27193027

  11. $100 Billion: For Reform...or to Subsidize the Status Quo? Education Stimulus Watch. Special Report 1

    ERIC Educational Resources Information Center

    Smarick, Andy

    2009-01-01

    This is the first in a quarterly series of special reports on the K-12 education implications of the federal government's economic stimulus package, the American Recovery and Reinvestment Act (ARRA). That the ARRA, which was signed into law in February, will pump nearly $100 billion--an unprecedented sum of federal money--into K-12 education is…

  12. Measurement outcomes and probability in Everettian quantum mechanics

    NASA Astrophysics Data System (ADS)

    Baker, David J.

    The decision-theoretic account of probability in the Everett or many-worlds interpretation, advanced by David Deutsch and David Wallace, is shown to be circular. Talk of probability in Everett presumes the existence of a preferred basis to identify measurement outcomes for the probabilities to range over. But the existence of a preferred basis can only be established by the process of decoherence, which is itself probabilistic.

  13. Path probability of stochastic motion: A functional approach

    NASA Astrophysics Data System (ADS)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of a functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock prices in finance.
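    The record does not reproduce the functional itself; for the familiar special case of one-dimensional overdamped Langevin dynamics with drift f(x) and diffusion constant D, the path probability carries the standard Onsager-Machlup weight (the paper's general functional treatment may differ in detail):

      P[x(t)] \propto \exp\!\left[-\int_{0}^{T}\frac{\bigl(\dot{x}(t)-f(x(t))\bigr)^{2}}{4D}\,dt\right]

    The probability of finding the motion inside a tube of half-width ε around a reference path x0(t) is then the integral of this weight over all paths satisfying |x(t) - x0(t)| ≤ ε for all t in [0, T].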

  14. Code System to Calculate Pressure Vessel Failure Probabilities.

    2001-03-27

    Version 00 OCTAVIA (Operationally Caused Transients And Vessel Integrity Analysis) calculates the probability of pressure vessel failure from operationally-caused pressure transients which can occur in a pressurized water reactor (PWR). For specified vessel and operating environment characteristics the program computes the failure pressure at which the vessel will fail for different-sized flaws existing in the beltline and the probability of vessel failure per reactor year due to the flaw. The probabilities are summed over the various flaw sizes to obtain the total vessel failure probability. Sensitivity studies can be performed to investigate different vessel or operating characteristics in the same computer run.
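    The summation OCTAVIA performs can be illustrated with a short sketch; the flaw-size bins and probabilities below are hypothetical placeholders, not the code system's actual data or fracture-mechanics models:

      # Total vessel failure probability per reactor-year, summed over flaw sizes:
      # P(fail) = sum_i P(flaw of size i exists) * P(transient exceeds failure pressure | flaw i)
      flaw_depth_mm     = [2.0, 5.0, 10.0, 20.0]       # flaw-size bins (hypothetical)
      p_flaw_exists     = [1e-2, 3e-3, 5e-4, 5e-5]     # probability a flaw of that size is present
      p_fail_given_flaw = [1e-6, 1e-4, 3e-3, 5e-2]     # per-reactor-year failure probability given the flaw

      total = sum(pf * pc for pf, pc in zip(p_flaw_exists, p_fail_given_flaw))
      print(f"total vessel failure probability: {total:.2e} per reactor-year")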

  15. Anytime synthetic projection: Maximizing the probability of goal satisfaction

    NASA Technical Reports Server (NTRS)

    Drummond, Mark; Bresina, John L.

    1990-01-01

    A projection algorithm is presented for incremental control rule synthesis. The algorithm synthesizes an initial set of goal achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic. This set of control rules has a certain probability of satisfying the given goal. The probability is incrementally increased by synthesizing additional control rules to handle 'error' situations the execution system is likely to encounter when following the initial control rules. By using situation probabilities, the algorithm achieves a computationally effective balance between the limited robustness of triangle tables and the absolute robustness of universal plans.

  16. Oscillations in probability distributions for stochastic gene expression

    SciTech Connect

    Petrosyan, K. G.; Hu, Chin-Kun

    2014-05-28

    The phenomenon of oscillations in the probability distribution functions of the number of components is found for a model of stochastic gene expression. It takes place in cases of low levels of molecules or strong intracellular noise. The oscillations distinguish between more probable even and less probable odd numbers of particles. The even-odd symmetry is restored as the number of molecules increases, with the probability distribution function tending to a Poisson distribution. We discuss the possibility of observing the phenomenon in gene, protein, and mRNA expression experiments.

  17. Challenges for Enriching the Curriculum: Statistics and Probability.

    ERIC Educational Resources Information Center

    Swift, Jim

    1983-01-01

    Three probability problems designed to challenge students are presented: Liars and Diamonds, Heads Wins, and Random Walks. Other statistics problems that could involve computer simulations are also suggested. (MNS)

  18. The controversial "Cambrian" fossils of the Vindhyan are real but more than a billion years older.

    PubMed

    Bengtson, Stefan; Belivanova, Veneta; Rasmussen, Birger; Whitehouse, Martin

    2009-05-12

    The age of the Vindhyan sedimentary basin in central India is controversial, because geochronology indicating early Proterozoic ages clashes with reports of Cambrian fossils. We present here an integrated paleontologic-geochronologic investigation to resolve this conundrum. New sampling of Lower Vindhyan phosphoritic stromatolitic dolomites from the northern flank of the Vindhyans confirms the presence of fossils most closely resembling those found elsewhere in Cambrian deposits: annulated tubes, embryo-like globules with polygonal surface pattern, and filamentous and coccoidal microbial fabrics similar to Girvanella and Renalcis. None of the fossils, however, can be ascribed to uniquely Cambrian or Ediacaran taxa. Indeed, the embryo-like globules are not interpreted as fossils at all but as former gas bubbles trapped in mucus-rich cyanobacterial mats. Direct dating of the same fossiliferous phosphorite yielded a Pb-Pb isochron of 1,650 +/- 89 (2sigma) million years ago, confirming the Paleoproterozoic age of the fossils. New U-Pb geochronology of zircons from tuffaceous mudrocks in the Lower Vindhyan Porcellanite Formation on the southern flank of the Vindhyans give comparable ages. The Vindhyan phosphorites provide a window of 3-dimensionally preserved Paleoproterozoic fossils resembling filamentous and coccoidal cyanobacteria and filamentous eukaryotic algae, as well as problematic forms. Like Neoproterozoic phosphorites a billion years later, the Vindhyan deposits offer important new insights into the nature and diversity of life, and in particular, the early evolution of multicellular eukaryotes.

  19. Ballography: A Billion Nanosecond History of the Bee Bluff Impact Crater of South Texas

    NASA Astrophysics Data System (ADS)

    Graham, R. A.

    2006-07-01

    The Bee Bluff Structure of South Texas in Zavala County near Uvalde has been found to exhibit unusual features permitting study of impactites and meteorite impact processes from the standpoint of grain-level, nanosecond shock-compression science. The site is characterized by a thin cap of Carrizo Sandstone covering a thin hard Indio fm calcareous siltstone. A soft calcareous silt lies below the hard cap. Calculations based on the Earth Impact Effects web-based program indicate that the site is best described by a 60 m diameter iron meteorite striking the ground at 11 km/sec. Such an impact into sandstone is expected to produce a shock pressure of 250 GPa. A large release wave originates from the bottom of the hard target with upward moving melt-vaporization waves of solid, liquid and vapor products that become trapped at the impact interface. Numerous distinctive types of impactites result from this `bottom-up' release behavior. Evidence for hydrodynamic instabilities and resulting density gradients are abundant at the impact interface. An unusually valuable breccia sample called `The Uvalde Crater Rosetta Stone' contains at least seven types of impactites in a well defined arrangement that can be used to read the billion nanosecond history of the impact and identify scattered impactites relative to their place in that history.

  20. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    SciTech Connect

    Bell, Elizabeth A.; Boehnke, Patrick; Harrison, T. Mark; Mao, Wendy L.

    2015-10-19

    Here, evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ~3.5 billion years (Ga), the chemofossil record arguably to ~3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of –24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ~300 My earlier than has been previously proposed.

  1. A billion years of environmental stability and the emergence of eukaryotes: new data from northern Australia.

    PubMed

    Brasier, M D; Lindsay, J F

    1998-06-01

    Carbon isotopes through 6 km of fully cored drill holes in 1.7 to 1.5 Ga carbonates of the Mount Isa and McArthur basins, Australia (which host the earliest known eukaryote biomarkers) provide the most comprehensive and best-dated delta 13C stratigraphy yet obtained from such ancient rocks. Both basins reveal remarkably stable temporal delta 13C trends (mean of -0.6‰ +/- 2‰ PDB [Peedee belemnite]) and confirm the impression of delta 13C stasis between 2.0 and 1.0 Ga, which, together with other evidence, suggest a prolonged period of stability in crustal dynamics, redox state of surface environments, and planetary climate. This delta 13C stasis is consistent with great stability in the carbon cycle controlled, we suggest, by P limitation of primary productivity. Recent evidence shows that P depletion is a major factor in obligate associations between photosymbionts and host cells. We argue that a billion years of stability in the carbon and nutrient cycles may have been the driving force that propelled prokaryotes toward photosymbiosis and the emergence of the autotrophic eukaryote cell.

  2. Rapid oxygenation of Earth's atmosphere 2.33 billion years ago.

    PubMed

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J; Wang, David T; Xie, Shucheng; Summons, Roger E

    2016-05-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth's biogeochemical cycles. Although "whiffs" of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition (that is, from being mass-independent to becoming mass-dependent) in multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly-within 1 to 10 million years-and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among GOE, "Snowball Earth" glaciation, and biogeochemical cycling will require further stratigraphic correlation supported with precise chronologies and paleolatitude reconstructions.

  3. Rapid oxygenation of Earth’s atmosphere 2.33 billion years ago

    PubMed Central

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J.; Wang, David T.; Xie, Shucheng; Summons, Roger E.

    2016-01-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth’s biogeochemical cycles. Although “whiffs” of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition (that is, from being mass-independent to becoming mass-dependent) in multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly—within 1 to 10 million years—and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among GOE, “Snowball Earth” glaciation, and biogeochemical cycling will require further stratigraphic correlation supported with precise chronologies and paleolatitude reconstructions. PMID:27386544

  4. The $17.1 billion problem: the annual cost of measurable medical errors.

    PubMed

    Van Den Bos, Jill; Rustagi, Karan; Gray, Travis; Halford, Michael; Ziemkiewicz, Eva; Shreve, Jonathan

    2011-04-01

    At a minimum, high-quality health care is care that does not harm patients, particularly through medical errors. The first step in reducing the large number of harmful medical errors that occur today is to analyze them. We used an actuarial approach to measure the frequency and costs of measurable US medical errors, identified through medical claims data. This method focuses on the analysis of comparative rates of illness, using mathematical models to assess the risk of occurrence and to project costs to the total population. We estimate that the annual cost of measurable medical errors that harm patients was $17.1 billion in 2008. Pressure ulcers were the most common measurable medical error, followed by postoperative infections and by postlaminectomy syndrome, a condition characterized by persistent pain following back surgery. A total of ten types of errors account for more than two-thirds of the total cost of errors, and these errors should be the first targets of prevention efforts.

  5. Constraints on the first billion years of the geodynamo from paleointensity studies of zircons

    NASA Astrophysics Data System (ADS)

    Tarduno, John; Cottrell, Rory; Davis, William

    2014-05-01

    Several lines of reasoning, including new ideas on core thermal conductivity, suggest that onset of a strong geomagnetic field might have been delayed by one billion years (or more) after the lunar forming event. Here we extend the Proterozoic/Archean to Paleoarchean record of the geomagnetic field constrained by single crystal paleointensity (SCP) analyses (Tarduno et al., Science, 2010) to older times using zircons containing minute magnetic inclusions. Specifically, we focus on samples from the Jack Hills (Yilgarn Craton, Western Australia). We employ a CO2 laser demagnetization system and a small bore (6.3 mm) 3-component DC SQUID magnetometer; the latter offers the highest currently available moment resolution. Sample age is analyzed using SHRIMP U-Pb geochronology. Preliminary data support the presence of a relatively strong Paleoarchean field produced by a core dynamo, extending the known record by at least 100 million years, to approximately 3.55 Ga. These data only serve to exacerbate the apparent problem posed by the presence of a Paleoarchean dynamo. Alternative dynamo driving mechanisms, or efficient core/lowermost mantle heat loss processes unique to the Paleoarchean (and older times) might have been at work. We will discuss these processes, and our efforts to study even older Eoarchean-Hadean zircons.

  6. Providing safe drinking water to 1.2 billion unserved people

    SciTech Connect

    Gadgil, Ashok J.; Derby, Elisabeth A.

    2003-06-01

    Despite substantial advances in the past 100 years in public health, technology and medicine, 20% of the world population, mostly the poor in developing countries (DCs), still does not have access to safe drinking water. To reach the United Nations (UN) Millennium Goal of halving the number of people without access to safe water by 2015, the global community will need to provide an additional one billion urban residents and 600 million rural residents with safe water within the next twelve years. This paper examines current water treatment measures and implementation methods for delivery of safe drinking water, and offers suggestions for making progress towards the goal of providing a timely and equitable solution for safe water provision. For water treatment, based on the serious limitations of boiling water and chlorination, we suggest an approach based on filtration coupled with ultraviolet (UV) disinfection, combined with public education. Additionally, owing to the limited capacity of non-governmental organizations (NGOs) to take on this task primarily on their own, we suggest a strategy based on financially sustainable models that include the private sector as well as NGOs.

  7. A Massive Galaxy in Its Core Formation Phase Three Billion Years After the Big Bang

    NASA Technical Reports Server (NTRS)

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha M. Forster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison; Leja, Joel; Rix, Hans-Walter; Skelton, Rosalind; van der Wel, Arjen; Whitaker, Katherine; Wuyts, Stijn

    2014-01-01

    Most massive galaxies are thought to have formed their dense stellar cores at early cosmic epochs. However, cores in their formation phase have not yet been observed. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we present a candidate core in formation 11 billion years ago, at z = 2.3. GOODS-N-774 has a stellar mass of 1.0 × 10^11 solar masses, a half-light radius of 1.0 kpc, and a star formation rate of 90 (+45/-20) solar masses per year. The star-forming gas has a velocity dispersion of 317 plus or minus 30 km/s, amongst the highest ever measured. It is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774, compact quiescent galaxies at z approximately equal to 2 (refs 8-11) and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 appear to be rare; however, from the star formation rate and size of the galaxy we infer that many star-forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys.

  8. A billion years of environmental stability and the emergence of eukaryotes: New data from northern Australia

    NASA Astrophysics Data System (ADS)

    Brasier, M. D.; Lindsay, J. F.

    1998-06-01

    Carbon isotopes through 6 km of fully cored drill holes in 1.7 to 1.5 Ga carbonates of the Mount Isa and McArthur basins, Australia (which host the earliest known eukaryote biomarkers) provide the most comprehensive and best-dated δ13C stratigraphy yet obtained from such ancient rocks. Both basins reveal remarkably stable temporal δ13C trends (mean of -0.6‰ ± 2‰ PDB [Peedee belemnite]) and confirm the impression of δ13C stasis between 2.0 and 1.0 Ga, which, together with other evidence, suggest a prolonged period of stability in crustal dynamics, redox state of surface environments, and planetary climate. This δ13C stasis is consistent with great stability in the carbon cycle controlled, we suggest, by P limitation of primary productivity. Recent evidence shows that P depletion is a major factor in obligate associations between photosymbionts and host cells. We argue that a billion years of stability in the carbon and nutrient cycles may have been the driving force that propelled prokaryotes toward photosymbiosis and the emergence of the autotrophic eukaryote cell.

  9. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes.

    PubMed

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D

    2014-07-28

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments.

  10. Evidence for oxygenic photosynthesis half a billion years before the Great Oxidation Event

    NASA Astrophysics Data System (ADS)

    Planavsky, Noah J.; Asael, Dan; Hofmann, Axel; Reinhard, Christopher T.; Lalonde, Stefan V.; Knudsen, Andrew; Wang, Xiangli; Ossa Ossa, Frantz; Pecoits, Ernesto; Smith, Albertus J. B.; Beukes, Nicolas J.; Bekker, Andrey; Johnson, Thomas M.; Konhauser, Kurt O.; Lyons, Timothy W.; Rouxel, Olivier J.

    2014-04-01

    The early Earth was characterized by the absence of oxygen in the ocean-atmosphere system, in contrast to the well-oxygenated conditions that prevail today. Atmospheric concentrations first rose to appreciable levels during the Great Oxidation Event, roughly 2.5-2.3 Gyr ago. The evolution of oxygenic photosynthesis is generally accepted to have been the ultimate cause of this rise, but it has proved difficult to constrain the timing of this evolutionary innovation. The oxidation of manganese in the water column requires substantial free oxygen concentrations, and thus any indication that Mn oxides were present in ancient environments would imply that oxygenic photosynthesis was ongoing. Mn oxides are not commonly preserved in ancient rocks, but there is a large fractionation of molybdenum isotopes associated with the sorption of Mo onto the Mn oxides that would be retained. Here we report Mo isotopes from rocks of the Sinqeni Formation, Pongola Supergroup, South Africa. These rocks formed no less than 2.95 Gyr ago in a nearshore setting. The Mo isotopic signature is consistent with interaction with Mn oxides. We therefore infer that oxygen produced through oxygenic photosynthesis began to accumulate in shallow marine settings at least half a billion years before the accumulation of significant levels of atmospheric oxygen.

  11. A large neutral fraction of cosmic hydrogen a billion years after the Big Bang.

    PubMed

    Wyithe, J Stuart B; Loeb, Abraham

    2004-02-26

    The fraction of ionized hydrogen left over from the Big Bang provides evidence for the time of formation of the first stars and quasar black holes in the early Universe; such objects provide the high-energy photons necessary to ionize hydrogen. Spectra of the two most distant known quasars show nearly complete absorption of photons with wavelengths shorter than the Lyman alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift of z approximately 6.3, about one billion years after the Big Bang. Here we show that the IGM surrounding these quasars had a neutral hydrogen fraction of tens of per cent before the quasar activity started, much higher than the previous lower limits of approximately 0.1 per cent. Our results, when combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination therefore suggest the presence of a second peak in the mean ionization history of the Universe.

  12. Precambrian crustal evolution of Peninsular India: A 3.0 billion year odyssey

    NASA Astrophysics Data System (ADS)

    Meert, Joseph G.; Pandit, Manoj K.; Pradhan, Vimal R.; Banks, Jonathan; Sirianni, Robert; Stroud, Misty; Newstead, Brittany; Gifford, Jennifer

    2010-11-01

    The Precambrian geologic history of Peninsular India covers nearly 3.0 billion years of time. India is presently attached to the Eurasian continent, although it remains (for now) a separate plate. It comprises several cratonic nuclei, namely the Aravalli-Bundelkhand, Eastern Dharwar, Western Dharwar, Bastar and Singhbhum Cratons, along with the Southern Granulite Province. Cratonization of India was polyphase, but a stable configuration between the major elements was largely complete by 2.5 Ga. Each of the major cratons was intruded by granitoids, mafic dykes and ultramafic bodies of various ages throughout the Proterozoic. The Vindhyan, Chhattisgarh, Cuddapah, Pranhita-Godavari, Indravati, Bhima-Kaladgi, Kurnool and Marwar basins are the major Meso- to Neoproterozoic sedimentary repositories. In this paper we review the major tectonic and igneous events that led to the formation of Peninsular India and provide an up-to-date geochronologic summary of the Precambrian. India is thought to have played a role in a number of supercontinental cycles including (from oldest to youngest) Ur, Columbia, Rodinia, Gondwana and Pangea. This paper gives an overview of the deep history of Peninsular India as an introduction to this special TOIS volume.

  13. Broadcasts for a billion: the growth of commercial television in China.

    PubMed

    Schmuck, C

    1987-01-01

    At present, Chinese television reaches 35% of the population (80-90% in urban areas) and is used by the government as a source of education and information. In recognition of the potential market represented by 1.1 billion consumers, Western advertisers have commissioned elaborate market research studies. Drama, sports, news, and movies are consistently identified as the favorite types of programming among Chinese television viewers. About 75% of Beijing adults watch television daily, making the medium both an important target for advertising campaigns and a way for Westerners to influence Chinese business and government leaders. Western advertisers have tended to concentrate their investments in the more urban, affluent regions where products have the greatest likelihood of being sold. There has been a recent trend, however, toward industrial commercials, with British and French companies buying television time to promote their image as partners in China's modernization. The relationship between the national network and local stations is key to the future of commercial advertising on Chinese television. In many provinces, local television stations have developed a unique character and portray different sociocultural values than the national channel. Outside advertisers have sometimes experienced problems with local networks that substitute local advertising without informing the network. To correct this situation, the government is enacting pro-sponsor regulations that forbid the preemption of the national channel and its advertisements. At the same time, efforts are being made to improve relationships with local television stations by either paying them a fee or airing local commercials on the national network. PMID:12342936

  14. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon.

    PubMed

    Bell, Elizabeth A; Boehnke, Patrick; Harrison, T Mark; Mao, Wendy L

    2015-11-24

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼ 3.5 billion years (Ga), the chemofossil record arguably to ∼ 3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ(13)CPDB of -24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼ 300 My earlier than has been previously proposed.

  15. Sharing global CO2 emission reductions among one billion high emitters.

    PubMed

    Chakravarty, Shoibal; Chikkatur, Ananth; de Coninck, Heleen; Pacala, Stephen; Socolow, Robert; Tavoni, Massimo

    2009-07-21

    We present a framework for allocating a global carbon reduction target among nations, in which the concept of "common but differentiated responsibilities" refers to the emissions of individuals instead of nations. We use the income distribution of a country to estimate how its fossil fuel CO(2) emissions are distributed among its citizens, from which we build up a global CO(2) distribution. We then propose a simple rule to derive a universal cap on global individual emissions and find corresponding limits on national aggregate emissions from this cap. All of the world's high CO(2)-emitting individuals are treated the same, regardless of where they live. Any future global emission goal (target and time frame) can be converted into national reduction targets, which are determined by "Business as Usual" projections of national carbon emissions and in-country income distributions. For example, reducing projected global emissions in 2030 by 13 GtCO(2) would require the engagement of 1.13 billion high emitters, roughly equally distributed in 4 regions: the U.S., the OECD minus the U.S., China, and the non-OECD minus China. We also modify our methodology to place a floor on emissions of the world's lowest CO(2) emitters and demonstrate that climate mitigation and alleviation of extreme poverty are largely decoupled.
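    The key computational step, finding a universal individual cap consistent with a given global reduction, can be sketched as follows. This is a minimal Python illustration using a synthetic emissions distribution; the paper works with fitted national income and emission distributions, not the toy data used here:

      import numpy as np

      def universal_cap(emissions, reduction):
          # Find the per-capita cap c such that trimming everyone above c down
          # to c removes `reduction` tonnes of CO2 in total.
          lo, hi = 0.0, float(emissions.max())
          for _ in range(60):                    # bisection: the amount removed is monotone in c
              c = 0.5 * (lo + hi)
              removed = np.clip(emissions - c, 0.0, None).sum()
              if removed > reduction:
                  lo = c                         # removed too much; raise the cap
              else:
                  hi = c
          return c

      # Illustrative synthetic distribution of individual emissions (tCO2/yr).
      rng = np.random.default_rng(0)
      em = rng.lognormal(mean=1.0, sigma=1.2, size=100_000)
      cap = universal_cap(em, reduction=0.10 * em.sum())
      print(f"cap = {cap:.2f} tCO2/yr, high emitters above cap = {(em > cap).sum()}")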

  16. Large data analysis: automatic visual personal identification in a demography of 1.2 billion persons

    NASA Astrophysics Data System (ADS)

    Daugman, John

    2014-05-01

    The largest biometric deployment in history is now underway in India, where the Government is enrolling the iris patterns (among other data) of all 1.2 billion citizens. The purpose of the Unique Identification Authority of India (UIDAI) is to ensure fair access to welfare benefits and entitlements, to reduce fraud, and enhance social inclusion. Only a minority of Indian citizens have bank accounts; only 4 percent possess passports; and less than half of all aid money reaches its intended recipients. A person who lacks any means of establishing their identity is excluded from entitlements and does not officially exist; thus the slogan of UIDAI is: "To give the poor an identity." This ambitious program enrolls a million people every day, across 36,000 stations run by 83 agencies, with a 3-year completion target for the entire national population. The halfway point was recently passed with more than 600 million persons now enrolled. In order to detect and prevent duplicate identities, every iris pattern that is enrolled is first compared against all others enrolled so far; thus the daily workflow now requires 600 trillion (or 600 million-million) iris cross-comparisons. Avoiding identity collisions (False Matches) requires high biometric entropy, and achieving the tremendous match speed requires phase bit coding. Both of these requirements are being delivered operationally by wavelet methods developed by the author for encoding and comparing iris patterns, which will be the focus of this "Large Data Award" presentation.
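    The comparison step described above is the standard masked fractional Hamming distance between binary iris codes. A minimal sketch in Python follows; the 2048-bit code length, the random test data, and the illustrative decision threshold are assumptions for illustration, and the rotation compensation and score normalization used operationally are omitted:

      import numpy as np

      def fractional_hamming_distance(code_a, mask_a, code_b, mask_b):
          # Fraction of mutually valid phase bits on which the two codes disagree.
          # Bits hidden by eyelids, lashes or reflections are excluded via the masks.
          valid = mask_a & mask_b
          disagreements = (code_a ^ code_b) & valid
          return disagreements.sum() / valid.sum()

      # Unrelated eyes give distances near 0.5; the same eye gives much smaller
      # values, so a threshold (e.g. ~0.32, an illustrative figure) separates
      # matches from non-matches across very large galleries.
      rng = np.random.default_rng(1)
      code_a = rng.integers(0, 2, 2048).astype(bool)
      code_b = rng.integers(0, 2, 2048).astype(bool)
      mask = np.ones(2048, dtype=bool)
      print(fractional_hamming_distance(code_a, mask, code_b, mask))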

  17. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon.

    PubMed

    Bell, Elizabeth A; Boehnke, Patrick; Harrison, T Mark; Mao, Wendy L

    2015-11-24

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼ 3.5 billion years (Ga), the chemofossil record arguably to ∼ 3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ(13)CPDB of -24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼ 300 My earlier than has been previously proposed. PMID:26483481

  18. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes

    NASA Astrophysics Data System (ADS)

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D.

    2014-07-01

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments.

  19. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    PubMed Central

    Bell, Elizabeth A.; Boehnke, Patrick; Harrison, T. Mark; Mao, Wendy L.

    2015-01-01

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼3.5 billion years (Ga), the chemofossil record arguably to ∼3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of −24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼300 My earlier than has been previously proposed. PMID:26483481

  20. Analysis of precious metals at parts-per-billion levels in industrial applications

    NASA Astrophysics Data System (ADS)

    Tickner, James; O'Dwyer, Joel; Roach, Greg; Smith, Michael; Van Haarlem, Yves

    2015-11-01

    Precious metals, including gold and the platinum group metals (notably Pt, Pd and Rh), are mined commercially at concentrations of a few parts-per-million and below. Mining and processing operations demand sensitive and rapid analysis at concentrations down to about 100 parts-per-billion (ppb). In this paper, we discuss two technologies being developed to meet this challenge: X-ray fluorescence (XRF) and gamma-activation analysis (GAA). We have designed on-stream XRF analysers capable of measuring targeted elements in slurries with precisions in the 35-70 ppb range. For the past two years, two on-stream analysers have been in continuous operation at a precious metals concentrator plant. The simultaneous measurement of feed and waste stream grades provides real-time information on metal recovery, allowing changes in operating conditions and plant upsets to be detected and corrected more rapidly. Separately, we have been developing GAA for the measurement of gold as a replacement for the traditional laboratory fire-assay process. High-energy Bremsstrahlung X-rays are used to excite gold via the 197Au(γ,γ′)197Au-M reaction, and the gamma-rays released in the decay of the meta-state are then counted. We report on work to significantly improve accuracy and detection limits.

  1. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone

    NASA Astrophysics Data System (ADS)

    Lowenstern, J. B.; Evans, W. C.; Bergfeld, D.; Hunt, A. G.

    2014-02-01

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  2. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone.

    PubMed

    Lowenstern, J B; Evans, W C; Bergfeld, D; Hunt, A G

    2014-02-20

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  3. Rapid oxygenation of Earth's atmosphere 2.33 billion years ago.

    PubMed

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J; Wang, David T; Xie, Shucheng; Summons, Roger E

    2016-05-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth's biogeochemical cycles. Although "whiffs" of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition from mass-independent to mass-dependent multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly, within 1 to 10 million years, and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among the GOE, "Snowball Earth" glaciation, and biogeochemical cycling will require further stratigraphic correlation supported by precise chronologies and paleolatitude reconstructions. PMID:27386544

  4. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes.

    PubMed

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D

    2014-01-01

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments. PMID:25068404

  5. Rapid analysis of perchlorate in drinking water at parts per billion levels using microchip electrophoresis.

    PubMed

    Gertsch, Jana C; Noblitt, Scott D; Cropek, Donald M; Henry, Charles S

    2010-05-01

    A microchip capillary electrophoresis (MCE) system has been developed for the determination of perchlorate in drinking water. The United States Environmental Protection Agency (USEPA) recently proposed a health advisory limit for perchlorate in drinking water of 15 parts per billion (ppb), a level requiring large, sophisticated instrumentation, such as ion chromatography coupled with mass spectrometry (IC-MS), for detection. An inexpensive, portable system is desired for routine online monitoring applications of perchlorate in drinking water. Here, we present an MCE method using contact conductivity detection for perchlorate determination. The method has several advantages, including reduced analysis times relative to IC, inherent portability, high selectivity, and minimal sample pretreatment. Resolution of perchlorate from more abundant ions was achieved using zwitterionic, sulfobetaine surfactants, N-hexadecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (HDAPS) and N-tetradecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (TDAPS). The system performance and the optimization of the separation chemistry, including the use of these surfactants to resolve perchlorate from other anions, are discussed in this work. The system is capable of detection limits of 3.4 ± 1.8 ppb (n = 6) in standards and 5.6 ± 1.7 ppb (n = 6) in drinking water.

  6. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone.

    PubMed

    Lowenstern, J B; Evans, W C; Bergfeld, D; Hunt, A G

    2014-02-20

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions. PMID:24553240

  7. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    DOE PAGES

    Bell, Elizabeth A.; Boehnke, Patrick; Harrison, T. Mark; Mao, Wendy L.

    2015-10-19

    Here, evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ~3.5 billion years (Ga), the chemofossil record arguably to ~3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host, as shown by transmission X-ray microscopy, and their crystal habit. Their δ13CPDB of –24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ~300 My earlier than has been previously proposed.

  8. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone

    USGS Publications Warehouse

    Lowenstern, Jacob B.; Evans, William C.; Bergfeld, D.; Hunt, Andrew G.

    2014-01-01

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  9. Methods for fitting a parametric probability distribution to most probable number data.

    PubMed

    Williams, Michael S; Ebel, Eric D

    2012-07-01

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganisms per milliliter or real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
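
    The per-sample building block underlying both approaches is the MPN likelihood for a dilution series: a tube inoculated with volume v is positive with probability 1 - exp(-c·v) when the true concentration is c. The sketch below is a hypothetical Python example under that standard Poisson assumption, not the authors' implementation, and the dilution design and tube counts are invented for illustration.

      import math
      from scipy.optimize import minimize_scalar

      def neg_log_likelihood(conc, volumes_ml, n_tubes, n_positive):
          """Negative log-likelihood of one MPN dilution series.

          A tube inoculated with volume v is positive with probability
          1 - exp(-conc * v), assuming organisms are Poisson-distributed.
          """
          ll = 0.0
          for v, n, k in zip(volumes_ml, n_tubes, n_positive):
              p = 1.0 - math.exp(-conc * v)
              p = min(max(p, 1e-12), 1.0 - 1e-12)   # guard against log(0)
              ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
          return -ll

      # Illustrative 3-dilution design: 10, 1 and 0.1 mL, five tubes each
      volumes, tubes, positives = [10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 1]
      fit = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded",
                            args=(volumes, tubes, positives))
      print(f"MPN point estimate: {fit.x:.2f} organisms/mL")

    In the paper's second (Bayesian latent variable) method, it is these counts of positive tubes at each dilution, rather than the per-sample point estimates, that are used to estimate the parameters of the across-sample contamination distribution.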

  10. A Mathematical Microworld for Students to Learn Introductory Probability.

    ERIC Educational Resources Information Center

    Jiang, Zhonghong; Potter, Walter D.

    1993-01-01

    Describes the Microworld Chance, a simulation-oriented computer environment that allows students to explore probability concepts in five subenvironments: coins, dice, spinners, thumbtacks, and marbles. Results of a teaching experiment to examine the effectiveness of the microworld in changing students' misconceptions about probability are…
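
    The style of experimentation such a microworld supports can be mimicked in a few lines of code. The sketch below is a hypothetical Python stand-in, not the Microworld Chance software: it lets observed frequencies from repeated coin flips be compared with the theoretical probability of 0.5.

      import random

      def empirical_heads_frequency(n_flips, seed=None):
          """Flip a fair coin n_flips times and return the observed frequency of heads."""
          rng = random.Random(seed)
          heads = sum(rng.random() < 0.5 for _ in range(n_flips))
          return heads / n_flips

      # Observed frequencies settle toward the theoretical value 0.5 as the sample grows
      for n in (10, 100, 1000, 10000):
          print(n, round(empirical_heads_frequency(n, seed=1), 3))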

  11. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    ERIC Educational Resources Information Center

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…
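
    The link between histograms and continuous probability distributions that this approach exploits can be made concrete numerically: a density-scaled histogram of simulated data approximates the underlying probability density function, and the approximation improves as the sample grows. The sketch below is a hypothetical Python illustration; the choice of a standard normal distribution is an assumption of the example, not something taken from the article.

      import numpy as np

      rng = np.random.default_rng(0)
      samples = rng.normal(loc=0.0, scale=1.0, size=100_000)

      # density=True scales bar heights so the total histogram area is 1
      heights, edges = np.histogram(samples, bins=50, density=True)
      centers = 0.5 * (edges[:-1] + edges[1:])
      pdf = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)   # exact N(0, 1) density

      print("max |histogram - pdf| =", round(float(np.max(np.abs(heights - pdf))), 3))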

  12. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  13. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  14. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  15. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  16. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  17. Multiple-event probability in general-relativistic quantum mechanics

    SciTech Connect

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-04-15

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times for an ensemble of simultaneous commuting measurements on the joint system+apparatus system. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse.
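
    The reduction described above can be checked numerically in a toy setting. The sketch below is a hypothetical Python example, not the authors' formalism: a single qubit is measured in the Z basis, evolved by a rotation U, and measured again. The sequential probability computed with the collapse rule is compared with a single simultaneous measurement on the joint system-and-apparatus state, where a CNOT stands in for the apparatus recording the first outcome.

      import numpy as np

      # Projectors onto the Z-basis outcomes 0 and 1
      P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

      theta = 0.7
      U = np.array([[np.cos(theta), -np.sin(theta)],     # evolution between the two measurements
                    [np.sin(theta),  np.cos(theta)]])
      psi = np.array([0.6, 0.8])                          # normalised initial system state

      # CNOT (system controls the apparatus qubit): the apparatus records the first outcome
      CNOT = np.eye(4)[[0, 1, 3, 2]]

      for a in (0, 1):
          for b in (0, 1):
              # (i) sequence of measurements, evaluated with the wave-function collapse rule
              p_seq = np.linalg.norm(P[b] @ U @ P[a] @ psi) ** 2
              # (ii) one simultaneous measurement on system (x) apparatus after the recording interaction
              joint = np.kron(U, np.eye(2)) @ CNOT @ np.kron(psi, np.array([1.0, 0.0]))
              p_joint = np.linalg.norm(np.kron(P[b], P[a]) @ joint) ** 2
              print(a, b, round(p_seq, 6), round(p_joint, 6))

    The two columns agree for every outcome pair, illustrating the point that a sequence of noncommuting measurements can be traded for commuting measurements on the enlarged system.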

  18. Preservice Elementary Teachers and the Fundamentals of Probability

    ERIC Educational Resources Information Center

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  19. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  20. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    ERIC Educational Resources Information Center

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response…