Science.gov

Sample records for 5-percent probability billion

  1. Missing billions.

    PubMed

    Conly, S

    1997-01-01

    This article discusses funding of population programs that support the Cairo International Conference on Population and Development's Plan of Action. The Plan of Action calls for a quadrupling of annual financial commitments for population programs to $17 billion by the year 2000 and $22 billion by 2015. The increased expenditures would cover the increased demand for services from unmet need and population growth. Donor countries are expected to increase their share from the current 25% to about 33%, or $5.7 billion by the year 2000. The estimates are in 1993 constant dollars. The $17 billion is less than the $40 billion that is spent worldwide on playing golf. During 1993-94, general donor support increased to $1.2 billion. Denmark, Germany, Japan, the Netherlands, the United Kingdom, and the United States increased their support. The United States doubled its support for population programs during 1992-95 to $583 million. During 1996-97 the US Congress cut funding back to the 1995 level. France, Italy, Spain, Belgium, and Austria have lagged in their past and present support for population programs. Equal burden sharing would require the US to increase funding to $1.9 billion. Developed country assistance declined to the lowest share of combined gross national product since 1970. This shifts the burden to multilateral sources. The European Union is committed to increasing its funding, and the World Bank increased funding for population and reproductive health to about $600 million in 1996 from $424 million in 1994. Bangladesh, China, India, Indonesia, Mexico, South Africa, and Turkey accounted for 85% of all government expenditures on family planning in developing countries. External donors in Africa are the main support of family planning. Private consumers in Latin America pay most of the costs of family planning. External assistance will be needed for some time.

  2. 30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...

  3. 30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...

  4. Median CBO Salary Rises by 4.5 Percent, Annual Study Finds.

    ERIC Educational Resources Information Center

    Business Officer, 1997

    1997-01-01

    An annual national survey of college and university salaries found chief business officers' salaries rose 4.5 percent in 1996-97, a smaller increase than the previous year. Salaries of women and minority CBOs continued to gain equity with those of men. Rates of increase varied by institution type. Salary gains for all administrative job types were less than in…

  5. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Fibers present in amounts of less than 5... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... Act, as amended, no fiber present in the amount of less than 5 percent of the total fiber weight...

  6. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Fibers present in amounts of less than 5... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... Act, as amended, no fiber present in the amount of less than 5 percent of the total fiber weight...

  7. Evaluation of EA-934NA with 2.5 percent Cab-O-Sil

    NASA Technical Reports Server (NTRS)

    Caldwell, Gordon A.

    1990-01-01

    Currently, Hysol adhesive EA-934NA is used to bond the Field Joint Protection System on the Shuttle rocket motors at Kennedy Space Center. However, due to processing problems, an adhesive with a higher viscosity is needed to alleviate these difficulties. One possible solution is to add Cab-O-Sil to the current adhesive. The adhesive strength and bond strengths that can be obtained when 2.5 percent Cab-O-Sil is added to adhesive EA-934NA are examined and tested over a range of test temperatures from -20 to 300 F. Tensile adhesion button and lap shear specimens were bonded to D6AC steel and uniaxial tensile specimens (testing for strength, initial tangent modulus, elongation and Poisson's ratio) were prepared using Hysol adhesive EA-934NA with 2.5 percent Cab-O-Sil added. These specimens were tested at -20, 20, 75, 100, 125, 150, 200, 250, and 300 F, respectively. Additional tensile adhesion button specimens bonding Rust-Oleum primed and painted D6AC steel to itself and to cork using adhesive EA-934NA with 2.5 percent Cab-O-Sil added were tested at 20, 75, 125, 200, and 300 F, respectively. Results generally show decreasing strength values with increasing test temperatures. The bond strengths obtained using cork as a substrate were totally dependent on the cohesive strength of the cork.

  8. Nine billion or bust?

    NASA Astrophysics Data System (ADS)

    nerd, nerd; Pepperday, Mike; Szautner, a. a. z.

    2014-02-01

    In reply to a review of Tony Ryan and Steve McKevitt's book Project Sunshine, which explores ways in which the Earth could support a future population of nine billion people (Letting the sunshine in, November 2013 pp50-51, http://ow.ly/r0FTM).

  9. 43 CFR 30.183 - Who may receive a renounced interest of less than 5 percent in trust or restricted land?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... less than 5 percent in trust or restricted land? 30.183 Section 30.183 Public Lands: Interior Office of... may receive a renounced interest of less than 5 percent in trust or restricted land? You may renounce... less than 5 percent of the entire undivided ownership of a parcel of land only in favor of: (a)...

  10. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22232 Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines). If methane reaches...

  11. 30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Actions at 2.0 to 2.5 percent methane in...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches...

  12. 30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 2.0 to 2.5 percent methane in...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches...

  13. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22232 Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines). If methane reaches...

  14. World population beyond six billion.

    PubMed

    Gelbard, A; Haub, C; Kent, M M

    1999-03-01

    This world report reviews population growth pre-1900, population change during 1900-50 and 1950-2000, causes and effects of population change, and projections to 2050. World population grew from under 2 billion in 1900 to almost 6 billion in 2000. Population showed more rapid growth in the 17th and 18th centuries. Better hygiene and public sanitation in the 19th century led to expanded life expectancies and quicker growth, primarily in developed countries. Demographic transition in the 19th and 20th centuries was the result of shifts from high to low mortality and fertility. The pace of change varies with culture, level of economic development, and other factors. Not all countries follow the same path of change. The reproductive revolution in the mid-20th century and modern contraception led to greater individual control of fertility and the potential for rapid fertility decline. Political and cultural barriers that limit access affect the pace of decline. Population change is also affected by migration. Migration has the largest effect on the distribution of population. Bongaarts explains differences in fertility by the proportion in unions, contraceptive prevalence, infertility, and abortion. Educational status has a strong impact on adoption of family planning. Poverty is associated with multiple risks. In 2050, population could reach 10.7 billion or remain as low as 7.3 billion.

  15. 5.5 billion -- and growing.

    PubMed

    Robey, B

    1992-07-17

    On World Population Day in 1992 the total world population reached 5.5 billion, 100 million more than on July 11, 1991, and it is expected to pass the 6 billion figure by 2000. In developed countries the average number of children is 2/family, while in developing countries the norm is 4 children, a substantial drop from 6 children in the 1960s. The 2 billion level was reached in 1930, when the world population had doubled from 1 billion in 100 years. Reaching the 3 billion mark in 1960 took only 30 years, the 4 billion figure was reached in 1975 in only 15 years, and to grow to 5 billion in 1987 took only 12 years. A 20-year delay to reach replacement level adds another 1 billion to the population size. If the 2-children-per-family size had been reached in 1990, the total would still have increased to 8 billion by 2100. Provided the present trend of 3.4 children/couple continues, there will be 102 billion people in 2100. Some claim that the resources of the Earth are already overtaxed with ever-worsening environmental pollution. Family planning information and services have to be made available to those millions who want to avoid pregnancy and ensure a better future for fewer children.

  16. Life with Four Billion Atoms

    SciTech Connect

    Knight, Thomas

    2013-04-10

    Today it is commonplace to design and construct single silicon chips with billions of transistors. These are complex systems, difficult (but possible) to design, test, and fabricate. Remarkably, simple living systems can be assembled from a similar number of atoms, most of them in water molecules. In this talk I will present the current status of our attempts at full understanding and complexity reduction of one of the simplest living systems, the free-living bacterial species Mesoplasma florum. This 400 nm diameter cell thrives and replicates every 40 minutes with a genome of only 800 kilobases. Our recent experiments using transposon gene knockouts identified 354 of 683 annotated genes as inessential in laboratory culture when inactivated individually. While a functional redesigned genome will certainly not remove all of those genes, this suggests that roughly half the genome can be removed in an intentional redesign. I will discuss our recent knockout results and methodology, and our future plans: genome re-engineering using targeted knock-in/knock-out double recombination; whole cell metabolic models; comprehensive whole cell metabolite measurement techniques; creation of plug-and-play metabolic modules for the simplified organism; and inherent and engineered biosafety control mechanisms. This redesign is part of a comprehensive plan to lay the foundations for a new discipline of engineering biology. Engineering biological systems requires a fundamentally different viewpoint from that taken by the science of biology. Key engineering principles of modularity, simplicity, separation of concerns, abstraction, flexibility, hierarchical design, isolation, and standardization are of critical importance. The essence of engineering is the ability to imagine, design, model, build, and characterize novel systems to achieve specific goals. Current tools and components for these tasks are primitive. Our approach is to create and distribute standard biological parts.

  17. Countdown to Six Billion Teaching Kit.

    ERIC Educational Resources Information Center

    Zero Population Growth, Inc., Washington, DC.

    This teaching kit features six activities focused on helping students understand the significance of the world population reaching six billion for our society and our environment. Featured activities include: (1) History of the World: Part Six Billion; (2) A Woman's Place; (3) Baby-O-Matic; (4) Earth: The Apple of Our Eye; (5) Needs vs. Wants; and…

  18. Spend Billions and They Will Come

    ERIC Educational Resources Information Center

    Fox, Bette-Lee

    2004-01-01

    People look at one billion dollars in one of two ways: if it is the result of the long, hard effort of years of fundraising, they rejoice; if it signifies an astronomical budget deficit, they cringe. How, then, should people respond as a community to reaching the $1 billion mark ($1,242,436,438, to be exact) in this year's spending for public…

  19. Americans Are Spending Billions Nipping and Tucking

    MedlinePlus

    ... New report details costs of the most popular plastic surgery procedures ... A new report from the American Society of Plastic Surgeons (ASPS) found that Americans spent $16 billion ...

  20. Atmospheric oxygenation three billion years ago.

    PubMed

    Crowe, Sean A; Døssing, Lasse N; Beukes, Nicolas J; Bau, Michael; Kruger, Stephanus J; Frei, Robert; Canfield, Donald E

    2013-09-26

    It is widely assumed that atmospheric oxygen concentrations remained persistently low (less than 10^-5 times present levels) for about the first 2 billion years of Earth's history. The first long-term oxygenation of the atmosphere is thought to have taken place around 2.3 billion years ago, during the Great Oxidation Event. Geochemical indications of transient atmospheric oxygenation, however, date back to 2.6-2.7 billion years ago. Here we examine the distribution of chromium isotopes and redox-sensitive metals in the approximately 3-billion-year-old Nsuze palaeosol and in the near-contemporaneous Ijzermyn iron formation from the Pongola Supergroup, South Africa. We find extensive mobilization of redox-sensitive elements through oxidative weathering. Furthermore, using our data we compute a best minimum estimate for atmospheric oxygen concentrations at that time of 3 × 10^-4 times present levels. Overall, our findings suggest that there were appreciable levels of atmospheric oxygen about 3 billion years ago, more than 600 million years before the Great Oxidation Event and some 300-400 million years earlier than previous indications for Earth surface oxygenation.

  1. Emergence of modern continental crust about 3 billion years ago

    NASA Astrophysics Data System (ADS)

    Dhuime, Bruno; Wuestefeld, Andreas; Hawkesworth, Chris J.

    2015-07-01

    The continental crust is the principal record of conditions on the Earth during the past 4.4 billion years. However, how the continental crust formed and evolved through time remains highly controversial. In particular, the composition and thickness of juvenile continental crust are unknown. Here we show that Rb/Sr ratios can be used as a proxy for both the silica content and the thickness of the continental crust. We calculate Rb/Sr ratios of the juvenile crust for over 13,000 samples, with Nd model ages ranging from the Hadean to Phanerozoic. The ratios were calculated based on the evolution of Sr isotopes in the period between the TDM Nd model age and the crystallization of the samples analysed. We find that the juvenile crust had a low silica content and was largely mafic in composition during the first 1.5 billion years of Earth’s evolution, consistent with magmatism on a pre-plate tectonics planet. About 3 billion years ago, the Rb/Sr ratios of the juvenile continental crust increased, indicating that the newly formed crust became more silica-rich and probably thicker. This transition is in turn linked to the onset of plate tectonics and an increase of continental detritus into the oceans.

  2. Four billion people facing severe water scarcity

    PubMed Central

    Mekonnen, Mesfin M.; Hoekstra, Arjen Y.

    2016-01-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare. PMID:26933676

  3. Four billion people facing severe water scarcity.

    PubMed

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare.

  4. Great Plains makes 100 billion cubic feet

    SciTech Connect

    Not Available

    1987-03-01

    The Great Plains coal gasification plant on January 18, 1987 produced its 100 billionth cubic foot of gas since start-up July 28, 1984. Owned by the Department of Energy and operated by ANG Coal Gasification Company, the plant uses the Lurgi process to produce about 50 billion cubic feet per year of gas from five million tons per year of lignite. The plant has been performing at well above design capacity.

  5. Teledesic pushes $9-billion, 900-satellite system

    NASA Astrophysics Data System (ADS)

    1994-03-01

    Teledesic Corp. is seeking FCC approval to deploy a communication satellite system, costing $9 billion and using more than 900 satellites in low Earth orbit. This system would provide telephone and broadband data service to remote areas and developing countries. The two major stockholders in Teledesic are William Gates (of Microsoft Corp.) and Craig McCaw (of McCaw Cellular Communications). Each satellite would act as a node in a packet-switching network. The satellites would provide continuous global coverage.

  6. The updated billion-ton resource assessment

    SciTech Connect

    Turhollow, Anthony; Perlack, Robert; Eaton, Laurence; Langholtz, Matthew; Brandt, Craig; Downing, Mark; Wright, Lynn; Skog, Kenneth; Hellwinckel, Chad; Stokes, Bryce; Lebow, Patricia

    2014-10-03

    This paper summarizes the results of an update to a resource assessment, published in 2005, commonly referred to as the billion-ton study (BTS). The updated results are consistent with the 2005 BTS in terms of overall magnitude. However, in looking at the major categories of feedstocks, the forest residue biomass potential was determined to be less owing to tighter restrictions on forest residue supply including restrictions due to limited projected increase in traditional harvest for pulpwood and sawlogs. The crop residue potential was also determined to be less because of the consideration of soil carbon and not allowing residue removal from conventionally tilled corn acres. The energy crop potential was estimated to be much greater largely because of land availability and modeling of competition among various competing uses of the land. Generally, the scenario assumptions in the updated assessment are much more plausible to show a billion-ton resource, which would be sufficient to displace 30% or more of the country's present petroleum consumption.

  7. The updated billion-ton resource assessment

    DOE PAGES

    Turhollow, Anthony; Perlack, Robert; Eaton, Laurence; ...

    2014-10-03

    This paper summarizes the results of an update to a resource assessment, published in 2005, commonly referred to as the billion-ton study (BTS). The updated results are consistent with the 2005 BTS in terms of overall magnitude. However, in looking at the major categories of feedstocks, the forest residue biomass potential was determined to be less owing to tighter restrictions on forest residue supply including restrictions due to limited projected increase in traditional harvest for pulpwood and sawlogs. The crop residue potential was also determined to be less because of the consideration of soil carbon and not allowing residue removal from conventionally tilled corn acres. The energy crop potential was estimated to be much greater largely because of land availability and modeling of competition among various competing uses of the land. Generally, the scenario assumptions in the updated assessment are much more plausible to show a billion-ton resource, which would be sufficient to displace 30% or more of the country's present petroleum consumption.

  8. Simulating Billion-Task Parallel Programs

    SciTech Connect

    Perumalla, Kalyan S; Park, Alfred J

    2014-01-01

    In simulating large parallel systems, bottom-up approaches exercise detailed hardware models with effects from simplified software models or traces, whereas top-down approaches evaluate the timing and functionality of detailed software models over coarse hardware models. Here, we focus on the top-down approach and significantly advance the scale of the simulated parallel programs. Via the direct execution technique combined with parallel discrete event simulation, we stretch the limits of the top-down approach by simulating message passing interface (MPI) programs with millions of tasks. Using a timing-validated benchmark application, a proof-of-concept scaling level is achieved to over 0.22 billion virtual MPI processes on 216,000 cores of a Cray XT5 supercomputer, representing one of the largest direct execution simulations to date, combined with a multiplexing ratio of 1024 simulated tasks per real task.

  9. Probability Theory

    NASA Astrophysics Data System (ADS)

    Jaynes, E. T.; Bretthorst, G. Larry

    2003-04-01

    Foreword; Preface; Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'; Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory; References; Appendix A. Other approaches to probability theory; Appendix B. Mathematical formalities and style; Appendix C. Convolutions and cumulants.

  10. Life: the first two billion years.

    PubMed

    Knoll, Andrew H; Bergmann, Kristin D; Strauss, Justin V

    2016-11-05

    Microfossils, stromatolites, preserved lipids and biologically informative isotopic ratios provide a substantial record of bacterial diversity and biogeochemical cycles in Proterozoic (2500-541 Ma) oceans that can be interpreted, at least broadly, in terms of present-day organisms and metabolic processes. Archean (more than 2500 Ma) sedimentary rocks add at least a billion years to the recorded history of life, with sedimentological and biogeochemical evidence for life at 3500 Ma, and possibly earlier; phylogenetic and functional details, however, are limited. Geochemistry provides a major constraint on early evolution, indicating that the first bacteria were shaped by anoxic environments, with distinct patterns of major and micronutrient availability. Archean rocks appear to record the Earth's first iron age, with reduced Fe as the principal electron donor for photosynthesis, oxidized Fe the most abundant terminal electron acceptor for respiration, and Fe a key cofactor in proteins. With the permanent oxygenation of the atmosphere and surface ocean ca 2400 Ma, photic zone O2 limited the access of photosynthetic bacteria to electron donors other than water, while expanding the inventory of oxidants available for respiration and chemoautotrophy. Thus, halfway through Earth history, the microbial underpinnings of modern marine ecosystems began to take shape.This article is part of the themed issue 'The new bacteriology'.

  11. Eight billion asteroids in the Oort cloud

    NASA Astrophysics Data System (ADS)

    Shannon, Andrew; Jackson, Alan P.; Veras, Dimitri; Wyatt, Mark

    2015-01-01

    The Oort cloud is usually thought of as a collection of icy comets inhabiting the outer reaches of the Solar system, but this picture is incomplete. We use simulations of the formation of the Oort cloud to show that ~4 per cent of the small bodies in the Oort cloud should have formed within 2.5 au of the Sun, and hence be ice-free rock-iron bodies. If we assume that these Oort cloud asteroids have the same size distribution as their cometary counterparts, the Large Synoptic Survey Telescope should find roughly a dozen Oort cloud asteroids during 10 years of operations. Measurement of the asteroid fraction within the Oort cloud can serve as an excellent test of the Solar system's formation and dynamical history. Oort cloud asteroids could be of particular concern as impact hazards as their high mass density, high impact velocity, and low visibility make them both hard to detect and hard to divert or destroy. However, they should be a rare class of object, and we estimate globally catastrophic collisions should only occur about once per billion years.

  12. Uranium in Canada: A billion dollar industry

    SciTech Connect

    Ruzicka, V.

    1989-12-01

    In 1988, Canada maintained its position as the world's leading producer of uranium with an output of more than 12,400 MT of uranium in concentrates, worth $1.1 billion Canadian. As domestic requirements represent only 15% of current Canadian production, most of the output was exported. With current implementation of the Canada/US Free Trade Agreement, the US has become Canada's major uranium export customer. With a large share of the world's known uranium resources, Canada remains the focus of international uranium exploration activity. In 1988, the uranium exploration expenditures in Canada exceeded $58 million Canadian. The principal exploration targets were deposits associated with Proterozoic unconformities in Saskatchewan and Northwest Territories, particularly those in the Athabasca and Thelon basin regions of the Canadian Shield. Major attention was also paid to polymetallic deposits in which uranium is associated with precious metals, such as gold and platinum group elements. Conceptual genetic models for these deposit types represent useful tools to guide exploration.

  13. Agroecohydrology: Key to Feeding 9 Billion?

    NASA Astrophysics Data System (ADS)

    Herrick, J.

    2011-12-01

    Agricultural production necessary to feed 9 billion people in 2050 depends on increased production on existing croplands, and expanding onto 'marginal' lands. A high proportion of these lands are marginal because they are too steep or too dry to reliably support crop production. These same characteristics increase their susceptibility to accelerated erosion, leading (for most soil profiles) to further reductions in plant available water as infiltration and soil profile water holding capacity decline. Sustaining production on these marginal lands will require careful land use planning. In this paper, we present a land use planning framework that integrates 4 elements: (1) potential production (based on soil profile characteristics), (2) edaphic, topographic and climatic limitations to production, (3) soil resistance to degradation, and (4) resilience. This framework expands existing land capability classification systems through the integration of biophysical feedbacks and thresholds. State and transition models, similar to those currently applied to rangelands in the United States and other countries, are used to organize and communicate knowledge about the sustainability of different land use changes and management actions at field to regional scales. This framework emphasizes hydrologic characteristics of soil profiles and landscapes over fertility because fertility declines are more easily addressed through increased inputs. The presentation will conclude with a discussion of how research in ecohydrology can be more effectively focused to support sustainable food production in the context of increasingly rapid social and economic changes throughout the world.

  14. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    DTIC Science & Technology

    2009-11-11

    the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in ... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U ...
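
    Restated in display form (a cleaned-up transcription of the three conditions quoted in the excerpt, with the space (W, F) and the family F′ as given there):

      \begin{align*}
      \text{CP1.}\ \ & \mu(U \mid U) = 1 && \text{if } U \in F',\\
      \text{CP2.}\ \ & \mu(V_1 \cup V_2 \mid U) = \mu(V_1 \mid U) + \mu(V_2 \mid U) && \text{if } V_1 \cap V_2 = \emptyset,\ U \in F',\ V_1, V_2 \in F,\\
      \text{CP3.}\ \ & \mu(V \mid U) = \mu(V \mid X)\,\mu(X \mid U) && \text{if } V \subseteq X \subseteq U,\ U, X \in F',\ V \in F.
      \end{align*}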

  15. Sneak Peek to the 2016 Billion-Ton Report

    SciTech Connect

    2016-06-01

    The 2005 Billion-Ton Study became a landmark resource for bioenergy stakeholders, detailing for the first time the potential to produce at least one billion dry tons of biomass annually in a sustainable manner from U.S. agriculture and forest resources. The 2011 U.S. Billion-Ton Update expanded and updated the analysis, and in 2016, the U.S. Department of Energy’s Bioenergy Technologies Office plans to release the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy.

  16. Confidence Probability versus Detection Probability

    SciTech Connect

    Axelrod, M

    2005-08-18

    In a discovery sampling activity the auditor seeks to vet an inventory by measuring (or inspecting) a random sample of items from the inventory. When the auditor finds every sample item in compliance, he must then make a confidence statement about the whole inventory. For example, the auditor might say: ''We believe that this inventory of 100 items contains no more than 5 defectives with 95% confidence.'' Note this is a retrospective statement in that it asserts something about the inventory after the sample was selected and measured. Contrast this to the prospective statement: ''We will detect the existence of more than 5 defective items in this inventory with 95% probability.'' The former uses confidence probability while the latter uses detection probability. For a given sample size, the two probabilities need not be equal, indeed they could differ significantly. Both these probabilities critically depend on the auditor's prior belief about the number of defectives in the inventory and how he defines non-compliance. In other words, the answer strongly depends on how the question is framed.
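
    As an illustration of the arithmetic behind this contrast (not taken from the record), the sketch below computes both quantities for the abstract's example of a 100-item inventory with a clean sample; the sample sizes and the uniform prior over the number of defectives are assumptions made here for the example.

      from math import comb

      N = 100          # inventory size from the example above
      THRESHOLD = 5    # claim: "no more than 5 defectives"

      def p_clean_sample(defectives, sample_size, total=N):
          """P(sample shows zero defectives | inventory holds `defectives`)."""
          return comb(total - defectives, sample_size) / comb(total, sample_size)

      def detection_probability(sample_size, defectives=THRESHOLD + 1):
          """Prospective: chance of finding at least one defective if more than 5 exist."""
          return 1.0 - p_clean_sample(defectives, sample_size)

      def confidence_probability(sample_size):
          """Retrospective: P(defectives <= 5 | clean sample), assumed uniform prior on 0..N."""
          likelihood = [p_clean_sample(d, sample_size) for d in range(N + 1)]
          return sum(likelihood[:THRESHOLD + 1]) / sum(likelihood)

      for n in (20, 40, 60):
          print(n, round(detection_probability(n), 3), round(confidence_probability(n), 3))

    For the same sample size the two numbers differ, which is the abstract's point: the prospective detection statement needs no prior, while the retrospective confidence statement depends on one.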

  17. EPA Survey Shows $271 Billion Needed for Nations Wastewater Infrastructure

    EPA Pesticide Factsheets

    WASHINGTON - The U.S. Environmental Protection Agency (EPA) today released a survey showing that $271 billion is needed to maintain and improve the nation's wastewater infrastructure, including the pipes that carry wastewater to treatment plants, th

  18. Harnessing Energy from the Sun for Six Billion People

    ScienceCinema

    Daniel Nocera

    2016-07-12

    Daniel Nocera, a Massachusetts Institute of Technology professor whose recent research focuses on solar-powered fuels, presents a Brookhaven Science Associates Distinguished Lecture, titled "Harnessing Energy from the Sun for Six Billion People -- One at a Time."

  19. NASA Now Minute: Earth and Space Science: 100 Billion Planets

    NASA Video Gallery

    Stephen Kane, co-author of the article “Study Shows Our Galaxy Has 100 Billion Planets,” reveals details about this incredible study and explains just how common planets are in our Milky Way galaxy...

  20. Academic Pork Barrel Tops $2-Billion for the First Time.

    ERIC Educational Resources Information Center

    Brainard, Jeffrey; Borrego, Anne Marie

    2003-01-01

    Describes how, despite the growing budget deficit, Congress directed a record $2 billion to college projects in 2003, many of them dealing with security and bioterrorism. Includes data tables on the earmarks. (EV)

  1. Summary and Comparison of the 2016 Billion-Ton Report with the 2011 U.S. Billion-Ton Update

    SciTech Connect

    2016-06-01

    In terms of the magnitude of the resource potential, the results of the 2016 Billion-Ton Report (BT16) are consistent with the original 2005 Billion-Ton Study (BTS) and the 2011 report, U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry (BT2). An effort was made to reevaluate the potential forestland, agricultural, and waste resources at the roadside, then extend the analysis by adding transportation costs to a biorefinery under specified logistics assumptions to major resource fractions.

  2. White Nail Radio Transmitter: Billion Dollar Savings through Energy Efficiency

    DTIC Science & Technology

    2011-05-10

    Briefing-slide excerpts; recoverable figures: a goal of reducing Navy shore energy consumption by 50 percent (CNO, Navy Energy Vision, p. 10); Navy energy use of 7.3 billion kWh against an estimated 11 billion kWh in cell-phone transmitter savings (about one and a half times Navy use); an estimated 3,000,000 transmitters; and an estimated total power saving of 1,250,000,000 W (1.25 gigawatts).

  3. High-Reynolds-Number Test of a 5-Percent-Thick Low-Aspect-Ratio Semispan Wing in the Langley 0.3-Meter Transonic Cryogenic Tunnel: Wing Pressure Distributions

    NASA Technical Reports Server (NTRS)

    Chu, Julio; Lawing, Pierce L.

    1990-01-01

    A high Reynolds number test of a 5 percent thick low aspect ratio semispan wing was conducted in the adaptive wall test section of the Langley 0.3 m Transonic Cryogenic Tunnel. The model tested had a planform and a NACA 64A-105 airfoil section similar to those of the pressure instrumented canard on the X-29 experimental aircraft. Chordwise pressure data for Mach numbers of 0.3, 0.7, and 0.9 were measured for an angle-of-attack range of -4 to 15 deg. The associated Reynolds numbers, based on the geometric mean chord, encompass most of the flight regime of the canard. This test was a free transition investigation. A summary of the wing pressures is presented without analysis as well as adapted test section top and bottom wall pressure signatures. However, the presented graphical data indicate Reynolds number dependent complex leading edge separation phenomena. This data set supplements the existing high Reynolds number database and is useful for comparisons with computational codes.

  4. Winglets Save Billions of Dollars in Fuel Costs

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The upturned ends now featured on many airplane wings are saving airlines billions of dollars in fuel costs. Called winglets, the drag-reducing technology was advanced through the research of Langley Research Center engineer Richard Whitcomb and through flight tests conducted at Dryden Flight Research Center. Seattle-based Aviation Partners Boeing -- a partnership between Aviation Partners Inc., of Seattle, and The Boeing Company, of Chicago -- manufactures Blended Winglets, a unique design featured on Boeing aircraft around the world. These winglets have saved more than 2 billion gallons of jet fuel to date, representing a cost savings of more than $4 billion and a reduction of almost 21.5 million tons in carbon dioxide emissions.

  5. Projecting a world of 10.4 billion.

    PubMed

    Yanagishita, M

    1988-01-01

    Summary data are presented from the World Bank's "World Population 1987-88: Short and Long-Term Estimates by Age and Sex with Related Demographic Statistics." The projections do not differ much from those in the World Bank's 1985 projection except for large upward revisions for South Asian and West Asian countries and especially large upward revisions for Kenya, Ethiopia, Burkina Faso, Nigeria, and Egypt. World population is expected to reach 10.4 billion in 2100 and to stabilize at 10 billion around year 2070. Intermediate figures are given for year 2000 (6.2 billion) and year 2050 (9.5 billion). The fifteen most populous countries in 2100 will be (in millions) China (1683), India (1678), Nigeria (529), Pakistan (395), USSR (385), Indonesia (363), Brazil (292), US (279), Ethiopia (204), Mexico (197), Iran (157), Philippines (137), Egypt (132), Japan (124), and Tanzania (123). The world's annual growth rate (currently 1.7%) will decrease to .9% in 2025 and .07% in 2100 due to decreasing birth rates, especially in Africa. Nevertheless, the population of Sub-Saharan Africa will be 5 times its present size. The slowest annual growth will be for Europe, North America, and China; and the highest for Sri Lanka, Pakistan, and Bangladesh.

  6. Bill and Melinda Gates Pledge $1-Billion for Minority Scholarships.

    ERIC Educational Resources Information Center

    Monaghan, Peter; Lederman, Douglas; van der Werf, Martin; Pulley, John

    1999-01-01

    Reports on a $1-billion grant from Bill and Melinda Gates to send 20,000 low-income minority students to college. The Gates Millennium Scholars Program will require students to demonstrate financial need and maintain a 3.0 grade point average in college. A list of the largest private gifts to higher education since 1967 is also provided. (DB)

  7. Colleges' Billion-Dollar Campaigns Feel the Economy's Sting

    ERIC Educational Resources Information Center

    Masterson, Kathryn

    2009-01-01

    The economy's collapse has caught up with the billion-dollar campaign. In the past 12 months, the amount of money raised by a dozen of the colleges engaged in higher education's biggest fund-raising campaigns fell 32 percent from the year before. The decline, which started before the worst of the recession, has forced colleges to postpone…

  8. Skeptics Say Billions for Education Won't Stimulate Economy

    ERIC Educational Resources Information Center

    Field, Kelly

    2009-01-01

    Skeptics question whether infusion of billions of dollars for education in the economic-stimulus bill before Congress would actually give a healthy jolt to the economy. The bill would help thousands of students pay for college and could give colleges money to fix crumbling buildings. Some members of Congress are calling for the removal of…

  9. Four laser companies to exceed $1 billion revenue in 2016

    NASA Astrophysics Data System (ADS)

    Thoss, Andreas F.

    2017-02-01

    It seems very likely that, for the first time, four companies will exceed $1 billion in revenue in 2016. This comes along with substantial changes in the market for lasers and laser systems. The article analyzes some of the changes and looks at the individual success strategies of the major players in these markets.

  10. Conservation of protein structure over four billion years

    PubMed Central

    Ingles-Prieto, Alvaro; Ibarra-Molero, Beatriz; Delgado-Delgado, Asuncion; Perez-Jimenez, Raul; Fernandez, Julio M.; Gaucher, Eric A.; Sanchez-Ruiz, Jose M.; Gavira, Jose A.

    2013-01-01

    Little is known with certainty about the evolution of protein structures in general and the degree of protein structure conservation over planetary time scales in particular. Here we report the X-ray crystal structures of seven laboratory resurrections of Precambrian thioredoxins dating back up to ~4 billion years before present. Despite considerable sequence differences compared with extant enzymes, the ancestral proteins display the canonical thioredoxin fold while only small structural changes have occurred over 4 billion years. This remarkable degree of structure conservation since a time near the last common ancestor of life supports a punctuated-equilibrium model of structure evolution in which the generation of new folds occurs over comparatively short periods of time and is followed by long periods of structural stasis. PMID:23932589

  11. Conservation of protein structure over four billion years.

    PubMed

    Ingles-Prieto, Alvaro; Ibarra-Molero, Beatriz; Delgado-Delgado, Asuncion; Perez-Jimenez, Raul; Fernandez, Julio M; Gaucher, Eric A; Sanchez-Ruiz, Jose M; Gavira, Jose A

    2013-09-03

    Little is known about the evolution of protein structures and the degree of protein structure conservation over planetary time scales. Here, we report the X-ray crystal structures of seven laboratory resurrections of Precambrian thioredoxins dating up to approximately four billion years ago. Despite considerable sequence differences compared with extant enzymes, the ancestral proteins display the canonical thioredoxin fold, whereas only small structural changes have occurred over four billion years. This remarkable degree of structure conservation since a time near the last common ancestor of life supports a punctuated-equilibrium model of structure evolution in which the generation of new folds occurs over comparatively short periods and is followed by long periods of structural stasis.

  12. Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs

    NASA Technical Reports Server (NTRS)

    2015-01-01

    A Langley Research Center engineer’s work in the 1960s and ’70s to develop a wing with better performance near the speed of sound resulted in a significant increase in subsonic efficiency. The design was shared with industry. Today, Renton, Washington-based Boeing Commercial Airplanes, as well as most other plane manufacturers, apply it to all their aircraft, saving the airline industry billions of dollars in fuel every year.

  13. Oncology pharma costs to exceed $150 billion by 2020.

    PubMed

    2016-10-01

    Worldwide costs of oncology drugs will rise above $150 billion by 2020, according to a report by the IMS Institute for Healthcare Informatics. Many factors are in play, according to IMS, including the new wave of expensive immunotherapies. Pembrolizumab (Keytruda), priced at $150,000 per year per patient, and nivolumab (Opdivo), priced at $165,000, may be harbingers of the market for cancer immunotherapies.

  14. President Carter signs $227 billion excise tax measure

    SciTech Connect

    Miller, C.J.; McAfee, J.; Dibona, C.J.; Carter, J.

    1980-04-07

    According to President J. Carter, who signed into law a $227 billion excise tax (windfall profits tax) on revenue from decontrolled U.S. crude oil production, the new tax program will provide the U.S. with the incentive and the means to produce and conserve domestic oil and replace more oil with alternative sources of energy. According to C. DiBona (API), the new tax will discourage the increased amount of domestic production required to compensate, by the mid-to-late 1980's, for a 1.7 million bbl/day shortfall, which will have to be made up with imports from foreign producers. According to J. McAfee of Gulf Oil Corp., only a token amount, about $34 billion of the $227 billion which will be raised by the new tax over the next decade, will be devoted to energy development and mass transit. According to C. J. Miller of the Independent Petroleum Association of America, the tax's complex and sometimes conflicting regulations will pose harsh problems for smaller producers.

  15. On the constancy of the lunar cratering flux over the past 3.3 billion yr

    NASA Technical Reports Server (NTRS)

    Guinness, E. A.; Arvidson, R. E.

    1977-01-01

    Utilizing a method that minimizes random fluctuations in sampling crater populations, it can be shown that the ejecta deposit of Tycho, the floor of Copernicus, and the region surrounding the Apollo 12 landing site have incremental crater size-frequency distributions that can be expressed as log-log linear functions over the diameter range from 0.1 to 1 km. Slopes are indistinguishable for the three populations, probably indicating that the surfaces are dominated by primary craters. Treating the crater populations of Tycho, the floor of Copernicus, and Apollo 12 as primary crater populations contaminated, but not overwhelmed, with secondaries, allows an attempt at calibration of the post-heavy bombardment cratering flux. Using the age of Tycho as 109 m.y., Copernicus as 800 m.y., and Apollo 12 as 3.26 billion yr, there is no basis for assuming that the flux has changed over the past 3.3 billion yr. This result can be used for dating intermediate aged surfaces by crater density.
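
    As a minimal illustration of the "log-log linear" description used above (not the authors' code), the sketch below fits a power law N(D) = a·D^b to binned incremental crater counts by ordinary least squares in log space; the bin diameters and counts are invented for the example.

      import math

      diameters = [0.125, 0.25, 0.5, 1.0]   # km, bin centers (made up)
      counts = [5200, 1300, 310, 80]         # craters per unit area (made up)

      xs = [math.log10(d) for d in diameters]
      ys = [math.log10(n) for n in counts]

      # Ordinary least squares for y = b*x + log10(a)
      n = len(xs)
      xbar, ybar = sum(xs) / n, sum(ys) / n
      b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
      log_a = ybar - b * xbar
      print(f"slope b = {b:.2f}, intercept log10(a) = {log_a:.2f}")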

  16. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
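
    One classic appearance of this phenomenon (chosen here for illustration, since the article's own three problems are not quoted) is the probability that a random permutation has no fixed points, which tends to 1/e; a quick simulation:

      import math, random

      def no_fixed_points(n):
          perm = list(range(n))
          random.shuffle(perm)
          return all(perm[i] != i for i in range(n))

      trials, n = 100_000, 10
      hits = sum(no_fixed_points(n) for _ in range(trials))
      print("simulated:", hits / trials, " 1/e:", 1 / math.e)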

  17. Constraining the last 7 billion years of galaxy evolution in semi-analytic models

    NASA Astrophysics Data System (ADS)

    Mutch, Simon J.; Poole, Gregory B.; Croton, Darren J.

    2013-01-01

    We investigate the ability of the Croton et al. semi-analytic model to reproduce the evolution of observed galaxies across the final 7 billion years of cosmic history. Using Monte Carlo Markov Chain techniques we explore the available parameter space to produce a model which attempts to achieve a statistically accurate fit to the observed stellar mass function at z = 0 and z ≈ 0.8, as well as the local black hole-bulge relation. We find that in order to be successful we are required to push supernova feedback efficiencies to extreme limits which are, in some cases, unjustified by current observations. This leads us to the conclusion that the current model may be incomplete. Using the posterior probability distributions provided by our fitting, as well as the qualitative details of our produced stellar mass functions, we suggest that any future model improvements must act to preferentially bolster star formation efficiency in the most massive haloes at high redshift.
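
    A minimal sketch of the Markov Chain Monte Carlo idea described above, not the Croton et al. model itself: a Metropolis loop adjusts one toy parameter so a toy "stellar mass function" matches mock data points. The data, the error size, and the predict_smf function are all assumptions made for the example.

      import math, random

      observed = [(9.5, -2.1), (10.5, -2.6), (11.0, -3.4)]   # (log10 mass, log10 phi), mock data
      sigma = 0.1                                            # assumed uniform error in dex

      def predict_smf(efficiency, log_mass):
          # Toy stand-in for a semi-analytic model prediction of the mass function.
          return -1.5 - 0.8 * (log_mass - 9.0) * efficiency

      def log_likelihood(efficiency):
          chi2 = sum(((phi - predict_smf(efficiency, m)) / sigma) ** 2 for m, phi in observed)
          return -0.5 * chi2

      def metropolis(n_steps=20000, step=0.05, start=1.0):
          theta, logl = start, log_likelihood(start)
          chain = []
          for _ in range(n_steps):
              proposal = theta + random.gauss(0.0, step)
              proposal_logl = log_likelihood(proposal)
              # Accept with probability min(1, exp(change in log-likelihood)).
              if random.random() < math.exp(min(0.0, proposal_logl - logl)):
                  theta, logl = proposal, proposal_logl
              chain.append(theta)
          return chain

      chain = metropolis()
      print("posterior mean of the toy parameter:", sum(chain) / len(chain))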

  18. Scalable in-memory RDFS closure on billions of triples.

    SciTech Connect

    Goodman, Eric L.; Mizell, David

    2010-06-01

    We present an RDFS closure algorithm, specifically designed and implemented on the Cray XMT supercomputer, that obtains inference rates of 13 million inferences per second on the largest system configuration we used. The Cray XMT, with its large global memory (4TB for our experiments), permits the construction of a conceptually straightforward algorithm, fundamentally a series of operations on a shared hash table. Each thread is given a partition of triple data to process, a dedicated copy of the ontology to apply to the data, and a reference to the hash table into which it inserts inferred triples. The global nature of the hash table allows the algorithm to avoid a common obstacle for distributed memory machines: the creation of duplicate triples. On LUBM data sets ranging between 1.3 billion and 5.3 billion triples, we obtain nearly linear speedup except for two portions: file I/O, which can be ameliorated with the additional service nodes, and data structure initialization, which requires nearly constant time for runs involving 32 processors or more.
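
    A minimal sketch of the fixpoint computation described above, with a Python set standing in for the paper's shared, de-duplicating hash table (the triples and the restriction to two RDFS rules are assumptions for the example; the actual system distributes triple partitions across Cray XMT threads):

      RDFS_SUBCLASS, RDF_TYPE = "rdfs:subClassOf", "rdf:type"

      triples = {
          ("ex:Student", RDFS_SUBCLASS, "ex:Person"),
          ("ex:Person", RDFS_SUBCLASS, "ex:Agent"),
          ("ex:alice", RDF_TYPE, "ex:Student"),
      }

      def rdfs_closure(store):
          changed = True
          while changed:                      # iterate to a fixpoint
              changed = False
              new = set()
              sub = {(s, o) for s, p, o in store if p == RDFS_SUBCLASS}
              for a, b in sub:                # rdfs11: subClassOf is transitive
                  for c, d in sub:
                      if b == c:
                          new.add((a, RDFS_SUBCLASS, d))
              for s, p, o in store:           # rdfs9: propagate rdf:type up the hierarchy
                  if p == RDF_TYPE:
                      for a, b in sub:
                          if o == a:
                              new.add((s, RDF_TYPE, b))
              before = len(store)
              store |= new                    # set insertion deduplicates, like the shared hash table
              changed = len(store) > before
          return store

      print(sorted(rdfs_closure(set(triples))))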

  19. Well servicing boom pushes costs over $3 billion

    SciTech Connect

    Not Available

    1981-07-01

    Results of the Annual Petroleum Engineer Well Servicing Survey are presented. The most significant change in well servicing trends was reduced abandonments - the number of abandoned wells dropped from 9011 in 1979 to 3021 in 1980. For the second year in a row, producers will spend more than $3 billion for well services. Well servicing operators performed nearly 568,000 servicing and repair jobs on 734,728 operating wells in the US during 1980. In addition, 71,000 wells were completed or recompleted in 1980. Tables of data are summarized for completion, workover, and servicing activities and for servicing operations for 11 individual US regions, including Appalachia, California on shore/off shore, Four Corners, Great Lakes, Gulf of Mexico, Louisiana (on shore), mid-continent, Rocky-Williston, Texas (on shore, includes SE New Mexico), and the southeast. The US total data exclude wells in Alaska.

  20. Ultrarelativistic heavy ion collisions: the first billion seconds

    NASA Astrophysics Data System (ADS)

    Baym, Gordon

    2016-12-01

    I first review the early history of the ultrarelativistic heavy ion program, starting with the 1974 Bear Mountain Workshop, and the 1983 Aurora meeting of the U.S. Nuclear Science Committee, just one billion seconds ago, which laid out the initial science goals of an ultrarelativistic collider. The primary goal, to discover the properties of nuclear matter at the highest energy densities, included finding new states of matter - primarily the quark-gluon plasma - and using collisions to open a new window on related problems of matter in cosmology, neutron stars, supernovae, and elsewhere. To bring out how the study of heavy ions and hot, dense matter in QCD has been fulfilling these goals, I concentrate on a few topics: the phase diagram of matter in QCD, and connections of heavy ion physics to cold atoms, cosmology, and neutron stars.

  1. Fast scalable visualization techniques for interactive billion-particle walkthrough

    NASA Astrophysics Data System (ADS)

    Liu, Xinlian

    This research develops a comprehensive framework for interactive walkthrough involving one billion particles in an immersive virtual environment to enable interrogative visualization of large atomistic simulation data. As a mixture of scientific and engineering approaches, the framework is based on four key techniques: adaptive data compression based on space-filling curves, octree-based visibility and occlusion culling, predictive caching based on machine learning, and scalable data reduction based on parallel and distributed processing. In terms of parallel rendering, this system combines functional parallelism, data parallelism, and temporal parallelism to improve interactivity. The visualization framework will be applicable not only to material simulation, but also to computational biology, applied mathematics, mechanical engineering, and nanotechnology, etc.
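
    One ingredient named above, adaptive data compression based on space-filling curves, can be illustrated with a small sketch (not code from the dissertation) that orders particle positions along a Z-order (Morton) curve; the 1024^3 grid and the helper names are assumptions for the example.

      def part1by2(n):
          """Spread the low 10 bits of n so two zero bits separate each bit."""
          n &= 0x000003FF
          n = (n ^ (n << 16)) & 0xFF0000FF
          n = (n ^ (n << 8)) & 0x0300F00F
          n = (n ^ (n << 4)) & 0x030C30C3
          n = (n ^ (n << 2)) & 0x09249249
          return n

      def morton3d(ix, iy, iz):
          """Interleave three 10-bit grid indices into one 30-bit Morton key."""
          return part1by2(ix) | (part1by2(iy) << 1) | (part1by2(iz) << 2)

      def sort_by_morton(positions, grid=1024, box=1.0):
          """Sort particle coordinates (in a cubic box of side `box`) along the Z-order curve."""
          def key(p):
              ix, iy, iz = (min(grid - 1, int(c / box * grid)) for c in p)
              return morton3d(ix, iy, iz)
          return sorted(positions, key=key)

      print(sort_by_morton([(0.9, 0.1, 0.2), (0.05, 0.05, 0.05), (0.5, 0.5, 0.5)]))

    Particles that are close in space land close together in the sorted order, which is what makes block-wise compression and octree traversal cache-friendly.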

  2. Bigger, Better Catalog Unveils Half a Billion Celestial Objects

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These frames are samples from the photographic sky surveys, which have been digitized by a technical team at the Space Telescope Science Institute to support the Hubble Space Telescope operations. The team processed these images to create a new astronomical catalog, called the Guide Star Catalog II. This project was undertaken by the Space Telescope Science Institute as an upgrade to an earlier sky survey and catalog (DSS-I and GSC-I), initially done to provide guide stars for pointing the Hubble Space Telescope. By virtue of its sheer size, the DSS-II and GSC-II have many research applications for both professional and amateur astronomers. [Top] An example from the DSS-II shows the Rosette Nebula, (originally photographed by the Palomar Observatory) as digitized in the DSS-I (left) and DSS-II (right). The DSS-II includes views of the sky at both red and blue wavelengths, providing invaluable color information on about one billion deep-sky objects. [Bottom] This blow-up of the inset box in the raw DSS-I scan shows examples of the GSC-I and the improved GSC-II catalogs. Astronomers extracted the stars from the scanned plate of the Rosette and listed them in the catalogs. The new GSC-II catalog provides the colors, positions, and luminosities of nearly half a billion stars -- over 20 times as many as the original GSC-I. The GSC-II contains information on stars as dim as the 19th magnitude. Credit: NASA, the DSS-II and GSC-II Consortia (with images from the Palomar Observatory-STScI Digital Sky Survey of the northern sky, based on scans of the Second Palomar Sky Survey are copyright c 1993-1999 by the California Institute of Technology)

  3. Probability and Relative Frequency

    NASA Astrophysics Data System (ADS)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
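
    The equality mentioned above can be stated in one line, assuming n trials with indicator outcomes X_i and predicted probability p (notation chosen here for illustration, not necessarily the paper's):

      \[
      X_i \in \{0,1\}, \quad \Pr(X_i = 1) = p
      \;\Longrightarrow\;
      \mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
      = \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}[X_i] = p .
      \]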

  4. $75 Billion in Formula Grants Failed to Drive Reform. Can $5 Billion in Competitive Grants Do the Job? Education Stimulus Watch. Special Report 2

    ERIC Educational Resources Information Center

    Smarick, Andy

    2009-01-01

    In early 2009, Congress passed and President Barack Obama signed into law the American Recovery and Reinvestment Act (ARRA), the federal government's nearly $800 billion stimulus legislation. According to key members of Congress and the Obama administration, the education portions of the law, totaling about $100 billion, were designed both to…

  5. What Are Probability Surveys?

    EPA Pesticide Factsheets

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  6. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
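    For readers unfamiliar with the underlying definition, the display below recalls the independence criterion and gives one small uniform finite space in which no pair of nontrivial events can be independent; whether this matches the paper's exact definition of a dependent space is not asserted here.

```latex
% Independence: P(A \cap B) = P(A)P(B). On a uniform space with a prime
% number p of outcomes, P(A \cap B) = k/p while P(A)P(B) = mn/p^2, and
% p cannot divide mn for 0 < m, n < p, so no nontrivial pair is independent.
\[
  \Omega=\{1,\dots,5\},\ \ P(\{\omega\})=\tfrac15,\ \
  A=\{1,2\},\ B=\{2,3\}:\qquad
  P(A\cap B)=\tfrac15 \;\neq\; \tfrac25\cdot\tfrac25 = P(A)P(B).
\]
```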

  7. Orbital forcing of climate 1.4 billion years ago.

    PubMed

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U; Wang, Huajian; Costa, M Mafalda; Bjerrum, Christian J; Connelly, James N; Zhang, Baomin; Bian, Lizeng; Canfield, Donald E

    2015-03-24

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes in the location of the Xiamaling relative to the position of the Intertropical Convergence Zone. On shorter time scales, and within a precisely calibrated stratigraphic framework, cyclicity in sediment geochemical dynamics is consistent with orbital control. In particular, sediment geochemical fluctuations reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment.

  8. Nine Billion Years: Past and Future of the Solar System

    NASA Astrophysics Data System (ADS)

    Leubner, I. H.

    2013-05-01

    As the Sun loses mass, and thus gravity, through radiation and the solar wind, the solar-planetary energy balance diminishes. Because the planets are only weakly bound to the Sun, they have been moving away from it, causing increases in orbital radii and orbital periods. This is modeled for selected planets from Mercury to Sedna, from the formation of the Solar System at -4.5 billion years (Byr) to +4.5 Byr. Planets were initially significantly closer to the Sun, suggesting that modeling of the formation of the Solar System needs to be revisited. By +4.5 Byr, planets beyond Saturn will have separated from the Solar System; the presently outermost solar object, Sedna, is already in the process of separation. Climate changes of Mars and Earth are modeled as a function of time. The predicted transition of Mars from water to ice at -3.6 Byr agrees with observations (-2.9 to -3.7 Byr), providing for the first time answers to the why and when of the water-to-ice transition on Mars. Earth temperatures are predicted to decrease by 38, 24, and 20 C between -4.5 Byr and +4.5 Byr for present temperatures of +50, 0, and -50 C, respectively.

  9. Fuel efficient stoves for the poorest two billion

    NASA Astrophysics Data System (ADS)

    Gadgil, Ashok

    2012-03-01

    About 2 billion people cook their daily meals on generally inefficient, polluting, biomass cookstoves. The fuels include twigs and leaves, agricultural waste, animal dung, firewood, and charcoal. Exposure to the resulting smoke leads to acute respiratory illness and cancers, particularly among women cooks and the infant children near them. The resulting annual mortality estimate is almost 2 million deaths, higher than that from malaria or tuberculosis. There is a large diversity of cooking methods (baking, boiling, long simmering, braising, and roasting), and a diversity of pot shapes and sizes in which the cooking is undertaken. Fuel efficiency and emissions depend on the tending of the fire (and its thermal power), the type of fuel, the stove characteristics, and the fit of the pot to the stove. Thus, no one perfect fuel-efficient, low-emitting stove can suit all users. Affordability imposes a further severe constraint on the stove design. For the various economic strata among the users, a variety of stove designs may be appropriate and affordable. In some regions, biomass is harvested non-renewably for cooking fuel. There is also increasing evidence that black carbon emitted from stoves is a significant contributor to atmospheric forcing. Thus, improved biomass stoves can also help mitigate global climate change. The speaker will describe specific work undertaken to design, develop, test, and disseminate affordable fuel-efficient stoves for internally displaced persons (IDPs) of Darfur, Sudan, where the IDPs face hardship, humiliation, hunger, and risk of sexual assault owing to their dependence on local biomass for cooking their meals.

  10. Orbital forcing of climate 1.4 billion years ago

    PubMed Central

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U.; Wang, Huajian; Costa, M. Mafalda; Bjerrum, Christian J.; Connelly, James N.; Zhang, Baomin; Bian, Lizeng; Canfield, Donald E.

    2015-01-01

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes in the location of the Xiamaling relative to the position of the Intertropical Convergence Zone. On shorter time scales, and within a precisely calibrated stratigraphic framework, cyclicity in sediment geochemical dynamics is consistent with orbital control. In particular, sediment geochemical fluctuations reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment. PMID:25775605

  11. Dynamical Simulation of Probabilities

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without using any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computation, are discussed. Special attention is focused on coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.
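    The paper's construction couples chaos with non-Lipschitz dynamics and is not reproduced here; the sketch below is only a loose illustration of the narrower point that a deterministic chaotic map can emulate Bernoulli(1/2) outcomes without calling a random number generator.

```python
# Loose illustration only (not the paper's non-Lipschitz construction):
# the fully chaotic logistic map (r = 4) has an invariant density that is
# symmetric about 0.5, so thresholding its orbit yields ~fair coin flips.

def chaotic_bits(x0: float, n: int):
    """Yield n pseudo-random bits from the logistic map in its chaotic regime."""
    x = x0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)       # deterministic chaotic update
        yield 1 if x > 0.5 else 0     # threshold at the symmetry point

if __name__ == "__main__":
    bits = list(chaotic_bits(x0=0.1234567, n=100_000))
    print("empirical P(1) =", sum(bits) / len(bits))   # close to 0.5
```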

  12. 3.5 billion years of reshaped Moho, southern Africa

    NASA Astrophysics Data System (ADS)

    Stankiewicz, Jacek; de Wit, Maarten

    2013-12-01

    According to some previous studies, Archean continental crust is, on global average, apparently thinner than Proterozoic crust. Subsequently, the validity of this statement has been questioned. To provide an additional perspective on this issue, we present analyses of Moho signatures derived from recent seismic data along swaths 2000 km in length across southern Africa and its flanking ocean. The imaged crust has a near continuous age range between ca. 0.1 and 3.7 billion years, and the seismic data allow direct comparison of Moho depths between adjacent Archean, Proterozoic and Phanerozoic crust. We find no simple secular change in depth to Moho over this time period. In contrast, there is significant variation in depth to Moho beneath both Archean and Proterozoic crust; Archean crust of southern Africa displays as much crustal diversity in thickness as the adjacent Proterozoic crust. The Moho beneath all crustal provinces that we have analysed has been severely altered by tectono-metamorphic and igneous processes, in many cases more than once, and cannot provide unequivocal data for geodynamic models dealing with secular changes in continental crust formation. These results and conclusions are similar to those documented along ca. 2000 km swaths across the Canadian Shield recorded by Lithoprobe. Tying the age and character of the Precambrian crust of southern Africa to their depth diversities is clearly related to manifold processes of tectono-thermal ‘surgery’ subsequent to their origin, the details of which are still to be resolved, as they are in most Precambrian terranes. Reconstructing pristine Moho of the early Earth therefore remains a formidable challenge. In South Africa, better knowledge of ‘fossilised’ Archean crustal sections ‘turned-on-edge’, such as at the Vredefort impact crater (for the continental crust), and from the Barberton greenstone belt (for oceanic crust) is needed to characterize potential pristine Archean Moho transitions.

  13. Visualization Challenges of a Subduction Simulation Using One Billion Markers

    NASA Astrophysics Data System (ADS)

    Rudolph, M. L.; Gerya, T. V.; Yuen, D. A.

    2004-12-01

    Recent advances in supercomputing technology have permitted us to study the multiscale, multicomponent fluid dynamics of subduction zones at unprecedented resolutions down to about the length of a football field. We have performed numerical simulations using one billion tracers over a grid of about 80 thousand points in two dimensions. These runs have been performed using a thermal-chemical simulation that accounts for hydration and partial melting in the thermal, mechanical, petrological, and rheological domains. From these runs, we have observed several geophysically interesting phenomena including the development of plumes with unmixed mantle composition as well as plumes with mixed mantle/crust components. Unmixed plumes form at depths greater than 100km (5-10 km above the upper interface of subducting slab) and consist of partially molten wet peridotite. Mixed plumes form at lesser depth directly from the subducting slab and contain partially molten hydrated oceanic crust and sediments. These high resolution simulations have also spurred the development of new visualization methods. We have created a new web-based interface to data from our subduction simulation and other high-resolution 2D data that uses an hierarchical data format to achieve response times of less than one second when accessing data files on the order of 3GB. This interface, WEB-IS4, uses a Javascript and HTML frontend coupled with a C and PHP backend and allows the user to perform region of interest zooming, real-time colormap selection, and can return relevant statistics relating to the data in the region of interest.
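    The abstract describes WEB-IS4 with a C and PHP backend; the sketch below is an assumed Python/h5py stand-in showing the underlying idea of region-of-interest access against a hierarchical data format, where only the requested window of a multi-gigabyte dataset is read from disk. The file name "subduction.h5" and dataset name "temperature" are hypothetical.

```python
# Illustrative region-of-interest (ROI) read from a hierarchical data file.
import h5py
import numpy as np

def roi_stats(path: str, dataset: str, y0: int, y1: int, x0: int, x1: int):
    """Read only the requested 2D window from disk and return basic statistics."""
    with h5py.File(path, "r") as f:
        window = f[dataset][y0:y1, x0:x1]   # partial I/O: only the ROI is read
    return {
        "min": float(np.min(window)),
        "max": float(np.max(window)),
        "mean": float(np.mean(window)),
    }

# Example call with hypothetical file and window:
# print(roi_stats("subduction.h5", "temperature", 2000, 2512, 4000, 4512))
```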

  14. Probability and radical behaviorism

    PubMed Central

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  15. Statistics and Probability

    NASA Astrophysics Data System (ADS)

    Laktineh, Imad

    2010-04-01

    This course constitutes a brief introduction to probability applications in high energy physics. First, the mathematical tools related to the different probability concepts are introduced. The probability distributions that are commonly used in high energy physics and their characteristics are then presented and commented on. The central limit theorem and its consequences are analysed. Finally, some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
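    The lecture does not say which numerical methods it covers; inverse-transform sampling is one standard example of producing samples from a target distribution, sketched below for the exponential case.

```python
# Inverse-transform sampling: if U ~ Uniform(0, 1) and F is the target CDF,
# then F^{-1}(U) has distribution F. For Exponential(lambda),
# F^{-1}(u) = -ln(1 - u) / lambda.
import numpy as np

def sample_exponential(lam: float, size: int, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(0.0, 1.0, size)
    return -np.log1p(-u) / lam

if __name__ == "__main__":
    x = sample_exponential(lam=2.0, size=100_000)
    print(x.mean())   # should be close to 1/lambda = 0.5
```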

  16. Probability of satellite collision

    NASA Technical Reports Server (NTRS)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
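    The report's actual formulation is not reproduced here; the sketch below is a standard "kinetic gas" style estimate of collision probability over a mission duration, with all parameter values hypothetical.

```python
# Flux-based collision probability estimate: P = 1 - exp(-n * sigma * v * T),
# where n is the spatial density of objects, sigma the combined cross-section,
# v the mean relative speed, and T the mission duration.
import math

def collision_probability(density_km3: float, r1_m: float, r2_m: float,
                          v_rel_km_s: float, duration_s: float) -> float:
    sigma_km2 = math.pi * ((r1_m + r2_m) / 1000.0) ** 2   # combined cross-section
    rate = density_km3 * sigma_km2 * v_rel_km_s           # collisions per second
    return 1.0 - math.exp(-rate * duration_s)

# Hypothetical numbers: 1e-8 objects/km^3, 5 m + 5 m radii, 10 km/s, 5 years.
print(collision_probability(1e-8, 5.0, 5.0, 10.0, 5 * 365.25 * 86400))
```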

  17. PROBABILITY AND STATISTICS.

    DTIC Science & Technology

    STATISTICAL ANALYSIS, PROBABILITY, REPORTS, INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  18. Probability and Statistics.

    ERIC Educational Resources Information Center

    Barnes, Bernis, Ed.; And Others

    This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…

  19. Teachers' Understandings of Probability

    ERIC Educational Resources Information Center

    Liu, Yan; Thompson, Patrick

    2007-01-01

    Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…

  20. A SWIRE Picture is Worth Billions of Years

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [Figures 1-4 removed for brevity, see original site. Figure 1: SWIRE View of Distant Galaxies; Figures 2-4 are described in the text below.]

    These spectacular images, taken by the Spitzer Wide-area Infrared Extragalactic (SWIRE) Legacy project, encapsulate one of the primary objectives of the Spitzer mission: to connect the evolution of galaxies from the distant, or early, universe to the nearby, or present day, universe.

    The Tadpole galaxy (main image) is the result of a recent galactic interaction in the local universe. Although these galactic mergers are rare in the universe's recent history, astronomers believe that they were much more common in the early universe. Thus, SWIRE team members will use this detailed image of the Tadpole galaxy to help understand the nature of the 'faint red-orange specks' of the early universe.

    The larger picture (figure 2) depicts one-sixteenth of the SWIRE survey field called ELAIS-N1. In this image, the bright blue sources are hot stars in our own Milky Way, which range anywhere from 3 to 60 times the mass of our Sun. The fainter green spots are cooler stars and galaxies beyond the Milky Way whose light is dominated by older stellar populations. The red dots are dusty galaxies that are undergoing intense star formation. The faintest specks of red-orange are galaxies billions of light-years away in the distant universe.

    Figure 3 features an unusual ring-like galaxy called CGCG 275-022. The red spiral arms indicate that this galaxy is very dusty and perhaps undergoing intense star formation. The star-forming activity could have been initiated by a near head-on collision with another galaxy.

    The most distant galaxies that SWIRE is able to detect are revealed in a zoom of deep space (figure 4). The colors in this feature represent the same objects as those in the larger field image of ELAIS

  1. Iapetus: 4.5 Billion Years of Contamination by Phoebe Dust

    NASA Astrophysics Data System (ADS)

    Hamilton, Douglas P.

    1997-07-01

    One of the strangest satellites in the Solar System is Saturn's tidally-locked Iapetus which has a bright white trailing hemisphere and a jet-black leading face. It has long been suspected that dark dusty debris, originating from Saturn's outermost satellite Phoebe and brought inward by Poynting-Robertson drag, is responsible for Iapetus' striking albedo asymmetry. The Phoebe-dust model is very compelling because it naturally explains why the leading face of Iapetus, the side that is receiving dust from Phoebe, is dark. The model has not gained universal acceptance, however, primarily due to the following dynamical problems: i) the distribution of dark material on Iapetus does not precisely match predicted contours of constant dust flux from Phoebe, ii) there are dark-floored craters in Iapetus' high-albedo hemisphere, and iii) Iapetus' north pole is brighter than parts of the trailing hemisphere. These problems are greatly reduced with the realization that Iapetus took nearly a billion years to become tidally locked to Saturn. I suggest the following scenario for the origin of the black/white dichotomy on Iapetus. Phoebe was probably captured early in the Solar System's history, well before Iapetus' spin slowed to its present synchronous rate. While Iapetus was spinning rapidly, dust from Phoebe accumulated at all longitudes on Iapetus uniformly. The accumulation was greatest near Iapetus' equator and decreased with roughly a cosine dependence toward the poles where the dust flux was lowest. After Iapetus became tidally locked, its trailing side was shielded from Phoebe dust, and volatile ice accumulated there burying the dark Phoebe material. Large impacts on the trailing side have dredged up some ancient Phoebe debris, creating the dark-floored craters. In addition, impacts have mixed dark debris with icy material, thereby lowering the albedo of the trailing side. Iapetus' poles are the brightest parts of the satellite simply because little Phoebe dust ever

  2. Guide star probabilities

    NASA Technical Reports Server (NTRS)

    Soneira, R. M.; Bahcall, J. N.

    1981-01-01

    Probabilities are calculated for acquiring suitable guide stars (GS) with the fine guidance system (FGS) of the space telescope. A number of the considerations and techniques described are also relevant for other space astronomy missions. The constraints of the FGS are reviewed. The available data on bright star densities are summarized and a previous error in the literature is corrected. Separate analytic and Monte Carlo calculations of the probabilities are described. A simulation of space telescope pointing is carried out using the Weistrop north galactic pole catalog of bright stars. Sufficient information is presented so that the probabilities of acquisition can be estimated as a function of position in the sky. The probability of acquiring suitable guide stars is greatly increased if the FGS can allow an appreciable difference between the (bright) primary GS limiting magnitude and the (fainter) secondary GS limiting magnitude.
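    The paper's analytic and Monte Carlo calculations are not reproduced here; the sketch below is a simple Poisson approximation of guide-star acquisition given a local density of suitable stars, with hypothetical numbers for the density and the FGS search area.

```python
# If suitable stars are roughly Poisson-distributed on the sky with density
# rho (per square degree) and the FGS searches an area A, the count of
# candidates has mean mu = rho * A.
import math

def p_at_least(k: int, rho_per_deg2: float, area_deg2: float) -> float:
    """Probability of finding at least k suitable guide stars in the field."""
    mu = rho_per_deg2 * area_deg2
    p_less = sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k))
    return 1.0 - p_less

# Hypothetical numbers: 40 usable stars per square degree, 0.05 deg^2 field.
print(p_at_least(1, 40.0, 0.05))   # at least one guide star
print(p_at_least(2, 40.0, 0.05))   # a primary and a secondary
```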

  3. Stimulus Plan Aids Education: House Bill Could Provide $100 Billion to K-12 Schools

    ERIC Educational Resources Information Center

    Klein, Alyson

    2009-01-01

    Cash-strapped school districts could see an unprecedented $100 billion infusion of federal aid under a massive economic-stimulus package unveiled by House Democrats this week. The overall measure, put forth January 15 by the House Appropriations Committee, is aimed at providing a $825 billion jolt to the stumbling U.S. economy, and to help avert…

  4. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  5. Rationalizing Hybrid Earthquake Probabilities

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Reasenberg, P.; Beeler, N.; Cocco, M.; Belardinelli, M.

    2003-12-01

    An approach to including stress transfer and frictional effects in estimates of the probability of failure of a single fault affected by a nearby earthquake has been suggested in Stein et al. (1997). This `hybrid' approach combines conditional probabilities, which depend on the time elapsed since the last earthquake on the affected fault, with Poissonian probabilities that account for friction and depend only on the time since the perturbing earthquake. The latter are based on the seismicity rate change model developed by Dieterich (1994) to explain the temporal behavior of aftershock sequences in terms of rate-state frictional processes. The model assumes an infinite population of nucleation sites that are near failure at the time of the perturbing earthquake. In the hybrid approach, assuming the Dieterich model can lead to significant transient increases in failure probability. We explore some of the implications of applying the Dieterich model to a single fault and its impact on the hybrid probabilities. We present two interpretations that we believe can rationalize the use of the hybrid approach. In the first, a statistical distribution representing uncertainties in elapsed and/or mean recurrence time on the fault serves as a proxy for Dieterich's population of nucleation sites. In the second, we imagine a population of nucleation patches distributed over the fault with a distribution of maturities. In both cases we find that the probability depends on the time since the last earthquake. In particular, the size of the transient probability increase may only be significant for faults already close to failure. Neglecting the maturity of a fault may lead to overestimated rate and probability increases.
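    As a reference point for the "conditional probability" ingredient of such hybrid estimates, the sketch below evaluates the renewal-model probability of failure in the next interval given the elapsed time, for a lognormal recurrence distribution; the Dieterich rate-state transient that the hybrid approach adds on top is not implemented, and all parameter values are hypothetical.

```python
# Conditional (time-dependent) rupture probability for a renewal model:
# P(next event in [t_e, t_e + dT] | no event in [0, t_e])
#   = (F(t_e + dT) - F(t_e)) / (1 - F(t_e)),  F = lognormal recurrence CDF.
import math

def lognormal_cdf(t: float, median: float, sigma: float) -> float:
    return 0.5 * (1.0 + math.erf(math.log(t / median) / (sigma * math.sqrt(2.0))))

def conditional_probability(t_elapsed: float, dT: float,
                            median: float, sigma: float) -> float:
    F1 = lognormal_cdf(t_elapsed, median, sigma)
    F2 = lognormal_cdf(t_elapsed + dT, median, sigma)
    return (F2 - F1) / (1.0 - F1)

# Hypothetical fault: median recurrence 150 yr, sigma = 0.5, 120 yr elapsed,
# 30-yr forecast window.
print(conditional_probability(120.0, 30.0, 150.0, 0.5))
```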

  6. Asteroidal collision probabilities

    NASA Astrophysics Data System (ADS)

    Bottke, W. F.; Greenberg, R.

    1993-05-01

    Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.

  7. Probabilities in implicit learning.

    PubMed

    Tseng, Philip; Hsu, Tzu-Yu; Tzeng, Ovid J L; Hung, Daisy L; Juan, Chi-Hung

    2011-01-01

    The visual system possesses a remarkable ability in learning regularities from the environment. In the case of contextual cuing, predictive visual contexts such as spatial configurations are implicitly learned, retained, and used to facilitate visual search-all without one's subjective awareness and conscious effort. Here we investigated whether implicit learning and its facilitatory effects are sensitive to the statistical property of such implicit knowledge. In other words, are highly probable events learned better than less probable ones even when such learning is implicit? We systematically varied the frequencies of context repetition to alter the degrees of learning. Our results showed that search efficiency increased consistently as contextual probabilities increased. Thus, the visual contexts, along with their probability of occurrences, were both picked up by the visual system. Furthermore, even when the total number of exposures was held constant between each probability, the highest probability still enjoyed a greater cuing effect, suggesting that the temporal aspect of implicit learning is also an important factor to consider in addition to the effect of mere frequency. Together, these findings suggest that implicit learning, although bypassing observers' conscious encoding and retrieval effort, behaves much like explicit learning in the sense that its facilitatory effect also varies as a function of its associative strengths.

  8. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
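    The report derives a closed-form solution; the sketch below is only a brute-force numerical version of the same kind of quantity, integrating a bivariate Gaussian relative-position error over a disc whose radius is the combined size of the two objects, centred at the nominal miss distance in the encounter plane. The axis-aligned, uncorrelated covariance and all numbers are assumptions.

```python
import numpy as np

def collision_probability(miss, sigma_x, sigma_y, combined_radius, n=2001):
    """Integrate an uncorrelated bivariate normal over the hard-body disc
    centred at (miss, 0) in the encounter plane."""
    r = combined_radius
    x = np.linspace(miss - r, miss + r, n)
    y = np.linspace(-r, r, n)
    X, Y = np.meshgrid(x, y)
    inside = (X - miss) ** 2 + Y ** 2 <= r ** 2
    pdf = (np.exp(-0.5 * ((X / sigma_x) ** 2 + (Y / sigma_y) ** 2))
           / (2.0 * np.pi * sigma_x * sigma_y))
    dA = (x[1] - x[0]) * (y[1] - y[0])
    return float(np.sum(pdf[inside]) * dA)

# Hypothetical encounter: 200 m miss, 100 m / 300 m sigmas, 12 m combined radius.
print(collision_probability(200.0, 100.0, 300.0, 12.0))
```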

  9. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  10. The perception of probability.

    PubMed

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  11. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  12. The Yatela gold deposit: 2 billion years in the making

    NASA Astrophysics Data System (ADS)

    Hein, K. A. A.; Matsheka, I. R.; Bruguier, O.; Masurel, Q.; Bosch, D.; Caby, R.; Monié, P.

    2015-12-01

    Gold mineralisation in the Yatela Main gold mine is hosted in a saprolitic residuum situated above Birimian supracrustal rocks, and at depth. The supracrustal rocks comprise metamorphosed calcitic and dolomitic marbles that were intruded by diorite (2106 ± 10 Ma, 207Pb/206Pb), and sandstone-siltstone-shale sequences (youngest detrital zircon population dated at 2139 ± 6 Ma). In-situ gold-sulphide mineralisation is associated with hydrothermal activity synchronous to emplacement of the diorite and forms a sub-economic resource; however, the overlying saprolitic residuum hosts economic gold mineralisation in friable lateritized palaeosols and aeolian sands (loess). Samples of saprolitic residuum were studied to investigate the morphology and composition of gold grains as a proxy for distance from source (and possible exploration vector) because the deposit hosts both angular and detrital gold suggesting both proximal and distal sources. U-Pb geochronology of detrital zircons also indicated a proximal and distal source, with the age spectra giving Archaean (2.83-3.28 Ga), and Palaeoproterozoic (1.95-2.20 Ga) to Neoproterozoic (1.1-1.8 Ga) zircons in the Yatela depocentre. The 1.1-1.8 Ga age spectrum restricts the maximum age for the first deposition of the sedimentary units in the Neoproterozoic, or during early deposition in the Taoudeni Basin. Models for formation of the residuum include distal and proximal sources for detritus into the depocentre, however, it is more likely that material was sourced locally and included recycled material. The creation of a deep laterite weathering profile and supergene enrichment of the residuum probably took place during the mid-Cretaceous-early Tertiary.

  13. Billion here, a billion there - a review and analysis of synthetic-fuels development under Title I of the Energy Security Act

    SciTech Connect

    Contratto, D.C.

    1980-01-01

    Title I of the Energy Security Act launched a synthetic fuels program that could produce 2 million barrels of fuel per day by 1992 and could cost $88 billion. A review of the Act's statutory language, undertaken to see how implementation will take place and to identify potential problems and opportunities, concludes that there is room for creative use of the money in the institutional structure. It will be up to those in charge of implementing the Act to seek out and develop these opportunities. 271 references.

  14. Earth's air pressure 2.7 billion years ago constrained to less than half of modern levels

    NASA Astrophysics Data System (ADS)

    Som, Sanjoy M.; Buick, Roger; Hagadorn, James W.; Blake, Tim S.; Perreault, John M.; Harnmeijer, Jelte P.; Catling, David C.

    2016-06-01

    How the Earth stayed warm several billion years ago when the Sun was considerably fainter is the long-standing problem of the `faint young Sun paradox'. Because of negligible O2 and only moderate CO2 levels in the Archaean atmosphere, methane has been invoked as an auxiliary greenhouse gas. Alternatively, pressure broadening in a thicker atmosphere with a N2 partial pressure around 1.6-2.4 bar could have enhanced the greenhouse effect. But fossilized raindrop imprints indicate that air pressure 2.7 billion years ago (Gyr) was below twice modern levels and probably below 1.1 bar, precluding such pressure enhancement. This result is supported by nitrogen and argon isotope studies of fluid inclusions in 3.0-3.5 Gyr rocks. Here, we calculate absolute Archaean barometric pressure using the size distribution of gas bubbles in basaltic lava flows that solidified at sea level ~2.7 Gyr in the Pilbara Craton, Australia. Our data indicate a surprisingly low surface atmospheric pressure of Patm = 0.23 +/- 0.23 (2σ) bar, and combined with previous studies suggests ~0.5 bar as an upper limit to late Archaean Patm. The result implies that the thin atmosphere was rich in auxiliary greenhouse gases and that Patm fluctuated over geologic time to a previously unrecognized extent.

  15. Estimating tail probabilities

    SciTech Connect

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
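    The specific exponentially weighted estimators studied in the paper are not reproduced here; the sketch below shows the general idea of tail estimation with extrapolation beyond the data, using the empirical survival function plus an exponential fit to the upper tail.

```python
import numpy as np

def tail_probability(data, x, tail_fraction=0.1):
    """Estimate P(X > x): empirical below a threshold, exponential-tail
    extrapolation above it."""
    data = np.sort(np.asarray(data, dtype=float))
    n = len(data)
    k = max(int(n * tail_fraction), 2)           # number of upper-tail points
    threshold = data[n - k]
    if x <= threshold:
        return float(np.mean(data > x))          # empirical survival function
    excesses = data[n - k:] - threshold
    rate = 1.0 / np.mean(excesses)               # MLE of exponential excess rate
    return (k / n) * float(np.exp(-rate * (x - threshold)))

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=5000)
print(tail_probability(sample, 8.0))             # true value exp(-8) = 3.4e-4
```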

  16. Survival of Pure Disk Galaxies over the Last 8 Billion Years

    NASA Astrophysics Data System (ADS)

    Sachdeva, Sonali; Saha, Kanak

    2016-03-01

    Pure disk galaxies without any bulge component, i.e., bulges that are neither classical nor pseudo, seem to have escaped the effects of merger activity that are inherent to hierarchical galaxy formation models as well as strong internal secular evolution. We discover that a significant fraction (~15%-18%) of disk galaxies in the Hubble Deep Field (0.4 < z < 1.0) and in the local universe (0.02 < z < 0.05) are such pure disk systems (PDSs). The spatial distribution of light in these PDSs is well-described by a single exponential function from the outskirts to the center and appears to have remained intact over the last 8 billion years, keeping the mean central surface brightness and scale-length nearly constant. These two disk parameters of PDSs are brighter and shorter, respectively, than those of disks which are part of disk galaxies with bulges. Since the fraction of PDSs, as well as their profile-defining parameters, do not change, this indicates that these galaxies have not witnessed either major mergers or multiple minor mergers since z ~ 1. However, there is a substantial increase in their total stellar mass and total size over the same time range. This suggests that smooth accretion of cold gas via cosmic filaments is the most probable mode of their evolution. We speculate that PDSs are dynamically hotter and cushioned in massive dark matter halos, which may prevent them from undergoing strong secular evolution.

  17. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
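    The article's demonstration is built in Excel; the following Python analogue walks the same chain of ideas: a binomial as a sum of independent Bernoulli variables, its mean and variance, and the central limit theorem for the standardized sum.

```python
import numpy as np

n, p, trials = 50, 0.3, 100_000
rng = np.random.default_rng(1)

# Simulate the binomial as a sum of n independent Bernoulli(p) variables.
sums = rng.binomial(1, p, size=(trials, n)).sum(axis=1)

print("sample mean:", sums.mean(), " theory:", n * p)              # 15.0
print("sample var :", sums.var(),  " theory:", n * p * (1 - p))    # 10.5

# CLT check: the standardized sum should behave like a standard normal.
z = (sums - n * p) / np.sqrt(n * p * (1 - p))
print("P(|Z| < 1.96):", np.mean(np.abs(z) < 1.96), " theory: ~0.95")
```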

  18. Varga: On Probability.

    ERIC Educational Resources Information Center

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  19. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  20. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, the authors outline how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
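    The paper's examples are written in Visual Basic; the sketch below shows the same technique in Python: since the expectation of f(U) for U uniform on [a, b] equals the integral of f divided by (b - a), averaging f over uniform draws approximates the definite integral.

```python
import numpy as np

def mc_integral(f, a, b, n=100_000, rng=None):
    """Estimate the integral of f over [a, b] as (b - a) times the sample
    mean of f at uniformly drawn points."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.uniform(a, b, n)
    return (b - a) * np.mean(f(x))

print(mc_integral(np.sin, 0.0, np.pi))         # exact value: 2
print(mc_integral(lambda x: x**2, 0.0, 1.0))   # exact value: 1/3
```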

  1. Rules Set for $4 Billion Race to Top Contest: Final Rules Give States Detailed Map in Quest for $4 Billion in Education Stimulus Aid

    ERIC Educational Resources Information Center

    McNeil, Michele

    2009-01-01

    For a good shot at $4 billion in grants from the federal Race to the Top Fund, states will need to make a persuasive case for their education reform agendas, demonstrate significant buy-in from local school districts, and devise plans to evaluate teachers and principals based on student performance, according to final regulations released last…

  2. Lewin estimates 2 billion barrels of US tar sand recoverable at mid $20/bbl

    SciTech Connect

    Not Available

    1986-12-01

    In 1983, Lewin and Associates prepared a report which established that the US tar sands resource amounts to over 60 billion barrels of bitumen in-place. However, no estimate was made of the technically or economically recoverable portion of this resource. More recent work carried out by Lewin for the US Department of Energy presents an appraisal of technically and economically recoverable tar sands. The paper describes the tar sand resource in-place, tar sand recovery models used in the study, engineering cost models, the economics of the steam soak prospect, and the economics of a surface mining prospect. The results of the Lewin study show that 5.7 billion barrels of domestic tar sand are technically recoverable, using cyclic steam injection and surface extractive mining. Of this, 4.9 billion barrels are technically recoverable from surface mining methods, with 0.8 billion recoverable from steam soak applications. 1 figure, 3 tables.

  3. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  4. Superpositions of probability distributions

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ2 play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
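    A textbook example of such a superposition (chosen for familiarity, not necessarily one that preserves the semigroup property discussed in the paper) is inverse-gamma smearing of the variance, which produces Student-t-like heavy tails:

```latex
% Smearing a zero-mean Gaussian over its variance v with weight w(v):
%   P(x) = \int_0^\infty dv\, w(v)\, (2\pi v)^{-1/2} e^{-x^2/(2v)}.
% For an inverse-gamma weight the integral is elementary and gives
\[
  w(v)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,v^{-\alpha-1}e^{-\beta/v}
  \quad\Longrightarrow\quad
  P(x)\;\propto\;\Bigl(1+\frac{x^{2}}{2\beta}\Bigr)^{-(\alpha+\frac12)} .
\]
```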

  5. Efficient Probability Sequences

    DTIC Science & Technology

    2014-08-18

    Ungar (2014), to produce a distinct forecasting system. The system consists of the method for eliciting individual subjective forecasts together with ... E. Stone, and L. H. Ungar (2014). Two reasons to make aggregated probability forecasts more extreme. Decision Analysis 11 (2), 133–145. Bickel, J. E. ... Letters 91 (3), 425–429. Mellers, B., L. Ungar, J. Baron, J. Ramos, B. Gurcay, K. Fincher, S. E. Scott, D. Moore, P. Atanasov, S. A. Swift, et al. (2014

  6. Searching with Probabilities

    DTIC Science & Technology

    1983-07-26

    DeGroot, Morris H. Probability and Statistics. Addison-Wesley Publishing Company, Reading, Massachusetts, 1975. [Gillogly 78] Gillogly, J.J. Performance... distribution [DeGroot 75] has just begun. The beta distribution has several features that might make it a more reasonable choice. As with the normal-based... 1982. [Cooley 65] Cooley, J.M. and Tukey, J.W. An algorithm for the machine calculation of complex Fourier series. Math. Comp. 19, 1965. [DeGroot 75

  7. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, B.M.; Karlinger, M.R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank-transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on the average every 4.5 years.
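    The paper's Monte Carlo procedure, based on rank-transformed flows and a fitted spatial correlation function, is not reproduced here; the sketch below is a simplified stand-in that treats the sites as equicorrelated under a Gaussian dependence structure, with the correlation value chosen arbitrarily.

```python
import numpy as np
from statistics import NormalDist

def regional_flood_probability(n_sites, T, rho, n_years=50_000, seed=0):
    """Monte Carlo estimate of P(at least one site sees a T-year flood in a year)
    for equicorrelated sites (one-factor Gaussian construction)."""
    rng = np.random.default_rng(seed)
    common = rng.standard_normal((n_years, 1))
    local = rng.standard_normal((n_years, n_sites))
    z = np.sqrt(rho) * common + np.sqrt(1.0 - rho) * local
    threshold = NormalDist().inv_cdf(1.0 - 1.0 / T)   # T-year level at one site
    return float((z > threshold).any(axis=1).mean())

print(regional_flood_probability(n_sites=193, T=100, rho=0.5))
print(1.0 - (1.0 - 1.0 / 100) ** 193)   # independent-sites benchmark
```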

  8. Expert fears doom if world population hits 12-15 billion.

    PubMed

    1994-02-22

    Earth's land, water and cropland are disappearing so rapidly that the world population must be slashed to 2 billion or less by 2100 to provide prosperity for all in that year, says a study released yesterday. The alternative, if current trends continue, is a population of 12 billion to 15 billion people and an apocalyptic worldwide scene of "absolute misery, poverty, disease and starvation," said the study's author, David Pimentel, an ecologist at Cornell University. In the US, the population would climb to 500 million and the standard of living would decline to slightly better than in present-day China, Mr. Pimentel said at the annual meeting of the American Association for the Advancement of Science. Even now, the world population of 6 billion is at least 3 times what the Earth's battered natural resources and depleted energy reserves would be able to comfortably support in 2100, Mr. Pimentel said. Mr. Pimentel defines "comfortably support" as providing something close to the current American standard of living, but with wiser use of energy and natural resources. Although a decline to 1 billion or 2 billion people over the next century sounds nearly impossible, it could be done by limiting families around the world to an average of 1.5 children, Mr. Pimentel said. Currently, US women have an average of 2.1 children, while the average in Rwanda is 8.5.

  9. Retrieve Tether Survival Probability

    DTIC Science & Technology

    2007-11-02

    cuts of the tether by meteorites and orbital debris, is calculated to be 99.934% for the planned experiment duration of six months or less. This is ... due to the unlikely event of a strike by a large piece of orbital debris greater than 1 meter in size cutting all the lines of the tether at once. The ... probability of the tether surviving multiple cuts by meteoroid and orbital debris impactors smaller than 5 cm in diameter is 99.9993% at six months

  10. Regional Feedstock Partnership Summary Report: Enabling the Billion-Ton Vision

    SciTech Connect

    Owens, Vance N.; Karlen, Douglas L.; Lacey, Jeffrey A.

    2016-07-12

    The U.S. Department of Energy (DOE) and the Sun Grant Initiative established the Regional Feedstock Partnership (referred to as the Partnership) to address information gaps associated with enabling the vision of a sustainable, reliable, billion-ton U.S. bioenergy industry by the year 2030 (i.e., the Billion-Ton Vision). Over the past 7 years (2008–2014), the Partnership has been successful at advancing the biomass feedstock production industry in the United States, with notable accomplishments. The Billion-Ton Study identifies the technical potential to expand domestic biomass production to offset up to 30% of U.S. petroleum consumption, while continuing to meet demands for food, feed, fiber, and export. This study verifies for the biofuels and chemical industries that a real and substantial resource base could justify the significant investment needed to develop robust conversion technologies and commercial-scale facilities. DOE and the Sun Grant Initiative established the Partnership to demonstrate and validate the underlying assumptions underpinning the Billion-Ton Vision to supply a sustainable and reliable source of lignocellulosic feedstock to a large-scale bioenergy industry. This report discusses the accomplishments of the Partnership, with references to accompanying scientific publications. These accomplishments include advances in sustainable feedstock production, feedstock yield, yield stability and stand persistence, energy crop commercialization readiness, information transfer, assessment of the economic impacts of achieving the Billion-Ton Vision, and the impact of feedstock species and environment conditions on feedstock quality characteristics.

  11. People's conditional probability judgments follow probability theory (plus noise).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
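    The abstract does not list the identities that were tested; the display below simply recalls two exact identities of probability theory of the general kind at issue, the addition law and the law of total probability.

```latex
\[
  P(A) + P(B) \;=\; P(A\cup B) + P(A\cap B),
  \qquad
  P(A) \;=\; P(A\mid B)\,P(B) + P(A\mid \neg B)\,P(\neg B).
\]
```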

  12. Birth of the Kaapvaal tectosphere 3.08 billion years ago.

    PubMed

    Moser, D E; Flowers, R M; Hart, R J

    2001-01-19

    The crustal remnants of Earth's Archean continents have been shielded from mantle convection by thick roots of ancient mantle lithosphere. The precise time of crust-root coupling (tectosphere birth) is poorly known but is needed to test competing theories of continental plate genesis. Our mapping and geochronology of an impact-generated section through the Mesoarchean crust of the Kaapvaal craton indicates tectosphere birth at 3.08 +/- 0.01 billion years ago, roughly 0.12 billion years after crust assembly. Growth of the southern African mantle root by subduction processes occurred within about 0.2 billion years. The assembly of crust before mantle may be common to the tectosphere.

  13. 3.4-Billion-year-old biogenic pyrites from Barberton, South Africa: sulfur isotope evidence.

    PubMed

    Ohmoto, H; Kakegawa, T; Lowe, D R

    1993-10-22

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>10^-13 of the present atmospheric level) of free oxygen.

  14. 3.4-Billion-year-old biogenic pyrites from Barberton, South Africa: sulfur isotope evidence

    NASA Technical Reports Server (NTRS)

    Ohmoto, H.; Kakegawa, T.; Lowe, D. R.

    1993-01-01

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>10^-13 of the present atmospheric level) of free oxygen.

  15. Billions for biodefense: federal agency biodefense funding, FY2001-FY2005.

    PubMed

    Schuler, Ari

    2004-01-01

    Over the past several years, the United States government has spent substantial resources on preparing the nation against a bioterrorist attack. This article analyzes the civilian biodefense funding by the federal government from fiscal years 2001 through 2005, specifically analyzing the budgets and allocations for biodefense at the Department of Health and Human Services, the Department of Homeland Security, the Department of Defense, the Department of Agriculture, the Environmental Protection Agency, the National Science Foundation, and the Department of State. In total, approximately $14.5 billion has been funded for civilian biodefense through FY2004, with an additional $7.6 billion in the President's budget request for FY2005.

  16. Probability state modeling theory.

    PubMed

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  17. Multi-Billion Shot, High-Fluence Exposure of Cr(4+): YAG Passive Q-Switch

    NASA Technical Reports Server (NTRS)

    Stephen, Mark A.; Dallas, Joseph L.; Afzal, Robert S.

    1997-01-01

    NASA's Goddard Space Flight Center is developing the Geoscience Laser Altimeter System (GLAS) employing a diode-pumped, Q-switched Nd:YAG laser operating at a 40 Hz repetition rate. To meet the five-year mission lifetime goal, a single transmitter would accumulate over 6.3 billion shots. Cr(4+):YAG is a promising candidate material for passively Q-switching the laser. Historically, the performance of saturable absorbers has degraded over long-duration usage. To measure the multi-billion shot performance of Cr(4+):YAG, a passively Q-switched GLAS-like oscillator was tested at an accelerated repetition rate of 500 Hz. The intracavity fluence was calculated to be approximately 2.5 J/cm^2. The laser was monitored autonomously for 165 days. There was no evidence of change in the material optical properties during the 7.2 billion shot test. All observed changes in laser operation could be attributed to pump laser diode aging. This is the first demonstration of multi-billion shot exposure testing of Cr(4+):YAG in this pulse energy regime.
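
    As a quick consistency check on the quoted figures, the arithmetic below (a back-of-envelope sketch assuming uninterrupted 500 Hz operation for the full 165-day test) reproduces the reported shot count.

      # Back-of-envelope check of the reported shot count, assuming
      # continuous 500 Hz operation over the full 165-day test.
      rep_rate_hz = 500
      seconds_per_day = 86_400
      days = 165
      shots = rep_rate_hz * seconds_per_day * days
      print(f"{shots:.2e} shots")  # ~7.13e9, consistent with the quoted 7.2 billion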

  18. Two Billion Cars: What it Means for Climate and Energy Policy

    ScienceCinema

    Daniel Sperling

    2016-07-12

    April 13, 2009: Daniel Sperling, director of the Institute of Transportation Studies at UC Davis, presents the next installment of Berkeley Lab's Environmental Energy Technologies Division's Distinguished Lecture series. He discusses Two Billion Cars and What it Means for Climate and Energy Policy.

  19. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy

    SciTech Connect

    Langholtz, M. H.; Stokes, B. J.; Eaton, L. M.

    2016-07-06

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified the broad biophysical potential of biomass nationally, and BT2 elucidated the potential economic availability of these resources. These reports clearly established the potential availability of up to one billion tons of biomass resources nationally. However, many questions remain, including but not limited to crop yields, climate change impacts, logistical operations, and systems integration across production, harvest, and conversion. The present report aims to address many of these questions through empirically modeled energy crop yields, scenario analysis of resources delivered to biorefineries, and the addition of new feedstocks. Volume 2 of the 2016 Billion-Ton Report is expected to be released by the end of 2016. It seeks to evaluate environmental sustainability indicators of select scenarios from volume 1 and potential climate change impacts on future supplies.

  20. The uncertain timing of reaching 8 billion, peak world population, and other demographic milestones.

    PubMed

    Scherbov, Sergei; Lutz, Wolfgang; Sanderson, Warren C

    2011-01-01

    We present new probabilistic forecasts of the timing of the world's population reaching 8 billion, the world's peak population, and the date at which one-third or more of the world's population would be 60+ years old. The timing of these milestones, as well as the timing of the Day of 7 Billion, is uncertain. We compute that the 60 percent prediction interval for the Day of 8 Billion is between 2024 and 2033. Our figures show that there is around a 60 percent chance that one-third of the world's population would be 60+ years old in 2100. In the UN 2010 medium variant, that proportion never reaches one-third. As in our past forecasts (Lutz et al. 2001, 2008), we find the chance that the world's population will peak in this century to be around 84 percent and the timing of that peak to be highly uncertain. Focal days, like the Day of 7 Billion, play a role in raising public awareness of population issues, but they give a false sense of the certainty of our knowledge. The uncertainty of the timing of demographic milestones is not a constant of nature. Understanding the true extent of our demographic uncertainty can help motivate governments and other agencies to make the investments necessary to reduce it.
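
    To illustrate how such a prediction interval can be read off a set of probabilistic projections, the sketch below runs a toy Monte Carlo population model and reports the central 60 percent interval for the year the 8 billion threshold is first crossed. The starting population, growth rates, and rate-decline assumptions are invented for illustration; this is not the authors' cohort-component projection.

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative trajectories: start near 6.9 billion in 2010 with an
      # uncertain annual growth rate that slowly declines over time.
      years = np.arange(2010, 2101)
      n_sims = 10_000
      start = 6.9e9
      growth0 = rng.normal(0.011, 0.002, n_sims)       # initial growth rate
      decline = rng.uniform(0.00005, 0.00025, n_sims)  # yearly decline of that rate

      pop = np.empty((n_sims, years.size))
      pop[:, 0] = start
      for t in range(1, years.size):
          rate = np.maximum(growth0 - decline * t, -0.005)
          pop[:, t] = pop[:, t - 1] * (1.0 + rate)

      # Year each trajectory first reaches 8 billion (NaN if it never does).
      crossed = pop >= 8e9
      first = np.where(crossed.any(axis=1), years[crossed.argmax(axis=1)], np.nan)

      # Central 60% prediction interval for the "Day of 8 Billion" year.
      lo, hi = np.nanpercentile(first, [20, 80])
      print(f"60% prediction interval: {lo:.0f} to {hi:.0f}")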

  1. High-Stakes Hustle: Public Schools and the New Billion Dollar Accountability

    ERIC Educational Resources Information Center

    Baines, Lawrence A.; Stanley, Gregory Kent

    2004-01-01

    High-stakes testing costs up to $50 billion per annum, has no impact on student achievement, and has changed the focus of American public schools. This article analyzes the benefits and costs of the accountability movement, as well as discusses its roots in the eugenics movements of the early 20th century.

  2. Conservation in a World of Six Billion: A Grassroots Action Guide.

    ERIC Educational Resources Information Center

    Hren, Benedict J.

    This grassroots action guide features a conservation initiative working to bring the impacts of human population growth, economic development, and natural resource consumption into balance with the limits of nature for the benefit of current and future generations. Contents include information sheets entitled "Six Billion People and Growing," "The…

  3. U.S. Health Care Costs from Birth Defects Total Almost $23 Billion a Year

    MedlinePlus

    ... https://medlineplus.gov/news/fullstory_163141.html U.S. Health Care Costs From Birth Defects Total Almost $23 Billion ... U.S. newborns have a serious birth defect, and health care costs tied to these difficulties total almost $23 ...

  4. Universities Report $1.8-Billion in Earnings on Inventions in 2011

    ERIC Educational Resources Information Center

    Blumenstyk, Goldie

    2012-01-01

    Universities and their inventors earned more than $1.8-billion from commercializing their academic research in the 2011 fiscal year, collecting royalties from new breeds of wheat, from a new drug for the treatment of HIV, and from longstanding arrangements over enduring products like Gatorade. Northwestern University earned the most of any…

  5. Mass spectrometry at and below 0.1 parts per billion

    SciTech Connect

    Bradley, M.; Palmer, F.; Pritchard, D.E.

    1994-12-31

    The single ion Penning trap mass spectrometer at M.I.T. can compare masses to within 0.1 parts per billion. We have created a short table of fundamental atomic masses and made measurements useful for calibrating the X-ray standard, and determining Avogadro's number, the molar Planck constant, and the fine structure constant.
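
    To put the quoted precision in scale, the arithmetic below (an illustrative aside, not taken from the abstract) shows what 0.1 parts per billion means for an ion of mass 12 u.

      # What 0.1 parts per billion means for a 12 u ion: the comparison
      # resolves mass differences of roughly 1.2e-9 atomic mass units.
      mass_u = 12.0
      fractional_precision = 0.1e-9
      print(mass_u * fractional_precision)  # ~1.2e-09 u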

  6. Two Billion Cars: What it Means for Climate and Energy Policy

    SciTech Connect

    Daniel Sperling

    2009-04-15

    April 13, 2009: Daniel Sperling, director of the Institute of Transportation Studies at UC Davis, presents the next installment of Berkeley Lab's Environmental Energy Technologies Division's Distinguished Lecture series. He discusses Two Billion Cars and What it Means for Climate and Energy Policy.

  7. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
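
    Because the abstract turns on how the underlying scenario determines the answer, a short enumeration (an illustrative sketch of the classic two-child problem, not code from the article) makes the two competing answers explicit.

      from itertools import product

      # Enumerate the four equally likely (first child, second child) families.
      families = list(product("BG", repeat=2))

      # Scenario 1: we learn only that at least one child is a boy.
      s1 = [f for f in families if "B" in f]
      p_both_boys_1 = sum(f == ("B", "B") for f in s1) / len(s1)   # 1/3

      # Scenario 2: we learn that a specific child (say the first) is a boy.
      s2 = [f for f in families if f[0] == "B"]
      p_both_boys_2 = sum(f == ("B", "B") for f in s2) / len(s2)   # 1/2

      print(p_both_boys_1, p_both_boys_2)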

  8. Coherent Assessment of Subjective Probability

    DTIC Science & Technology

    1981-03-01

    known results of de Finetti (1937, 1972, 1974), Smith (1961), and Savage (1971) and some recent results of Lindley (1980) concerning the use of...provides the motivation for de Finetti's definition of subjective probabilities as coherent bet prices. From the definition of the probability measure...subjective probability, the probability laws which are traditionally stated as axioms or definitions are obtained instead as theorems.

  9. Probabilities of transversions and transitions.

    PubMed

    Vol'kenshtein, M V

    1976-01-01

    The values of the mean relative probabilities of transversions and transitions have been refined on the basis of the data collected by Jukes and found to be equal to 0.34 and 0.66, respectively. Evolutionary factors increase the probability of transversions to 0.44. The relative probabilities of individual substitutions have been determined, and a detailed classification of the nonsense mutations has been given. Such mutations are especially probable in the UGG (Trp) codon. The highest probability of AG, GA transitions correlates with the lowest mean change in the hydrophobic nature of the amino acids coded.
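
    For readers unfamiliar with the terminology, the helper below (an illustrative sketch, not code from the paper) classifies a nucleotide substitution and notes why a uniform-rate null model gives probabilities close to the quoted 0.34 and 0.66.

      # Transitions swap purine<->purine (A<->G) or pyrimidine<->pyrimidine
      # (C<->T); all other substitutions are transversions.
      PURINES = {"A", "G"}
      PYRIMIDINES = {"C", "T"}

      def substitution_type(ref: str, alt: str) -> str:
          if ref == alt:
              raise ValueError("not a substitution")
          same_class = {ref, alt} <= PURINES or {ref, alt} <= PYRIMIDINES
          return "transition" if same_class else "transversion"

      # Of the 12 possible substitutions, 4 are transitions and 8 are
      # transversions, so a uniform-rate model gives 1/3 and 2/3,
      # close to the refined values of 0.34 and 0.66 quoted above.
      print(substitution_type("A", "G"))  # transition
      print(substitution_type("A", "T"))  # transversion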

  10. The Stability Of Disk Barred Galaxies Over the Past 7 Billion Years

    NASA Astrophysics Data System (ADS)

    Tapia, Amauri; Simmons, Brooke

    2017-01-01

    A recently released model of interacting disk galaxies provides a hypothesis for the origins of off-center bars in disks. No systematic search for offset bars in the early universe has yet been undertaken. The Galaxy Zoo project has produced data regarding the large-scale bars of many galaxies. Using these data alongside images collected by the Hubble Space Telescope and other sources, we have examined 5190 galaxies for signatures of off-center bars. Less than 5 percent of the sample shows clear signs of an offset bar. We describe the overall properties of this sub-sample and compare the properties of galaxies with offset bars to those with centered bars. We assess the feasibility of the proposed model and place these galaxies in the context of the overall evolution of galaxies.

  11. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level, which means that a higher level of statistics anxiety will not cause a lower score in probability topic performance. The study also revealed that students who were motivated by the probability workshop improved their performance in the probability topic compared with their performance before the workshop. In addition, there exists a significant difference in students' performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  12. The rapid assembly of an elliptical galaxy of 400 billion solar masses at a redshift of 2.3.

    PubMed

    Fu, Hai; Cooray, Asantha; Feruglio, C; Ivison, R J; Riechers, D A; Gurwell, M; Bussmann, R S; Harris, A I; Altieri, B; Aussel, H; Baker, A J; Bock, J; Boylan-Kolchin, M; Bridge, C; Calanog, J A; Casey, C M; Cava, A; Chapman, S C; Clements, D L; Conley, A; Cox, P; Farrah, D; Frayer, D; Hopwood, R; Jia, J; Magdis, G; Marsden, G; Martínez-Navajas, P; Negrello, M; Neri, R; Oliver, S J; Omont, A; Page, M J; Pérez-Fournon, I; Schulz, B; Scott, D; Smith, A; Vaccari, M; Valtchanov, I; Vieira, J D; Viero, M; Wang, L; Wardlow, J L; Zemcov, M

    2013-06-20

    Stellar archaeology shows that massive elliptical galaxies formed rapidly about ten billion years ago with star-formation rates of above several hundred solar masses per year. Their progenitors are probably the submillimetre bright galaxies at redshifts z greater than 2. Although the mean molecular gas mass (5 × 10^10 solar masses) of the submillimetre bright galaxies can explain the formation of typical elliptical galaxies, it is inadequate to form elliptical galaxies that already have stellar masses above 2 × 10^11 solar masses at z ≈ 2. Here we report multi-wavelength high-resolution observations of a rare merger of two massive submillimetre bright galaxies at z = 2.3. The system is seen to be forming stars at a rate of 2,000 solar masses per year. The star-formation efficiency is an order of magnitude greater than that of normal galaxies, so the gas reservoir will be exhausted and star formation will be quenched in only around 200 million years. At a projected separation of 19 kiloparsecs, the two massive starbursts are about to merge and form a passive elliptical galaxy with a stellar mass of about 4 × 10^11 solar masses. We conclude that gas-rich major galaxy mergers with intense star formation can form the most massive elliptical galaxies by z ≈ 1.5.

  13. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  14. Two billion years of magmatism recorded from a single Mars meteorite ejection site.

    PubMed

    Lapen, Thomas J; Righter, Minako; Andreasen, Rasmus; Irving, Anthony J; Satkoski, Aaron M; Beard, Brian L; Nishiizumi, Kunihiko; Jull, A J Timothy; Caffee, Marc W

    2017-02-01

    The timing and nature of igneous activity recorded at a single Mars ejection site can be determined from the isotope analyses of Martian meteorites. Northwest Africa (NWA) 7635 has an Sm-Nd crystallization age of 2.403 ± 0.140 billion years, and isotope data indicate that it is derived from an incompatible trace element-depleted mantle source similar to that which produced a geochemically distinct group of 327- to 574-million-year-old "depleted" shergottites. Cosmogenic nuclide data demonstrate that NWA 7635 was ejected from Mars 1.1 million years ago (Ma), as were at least 10 other depleted shergottites. The shared ejection age is consistent with a common ejection site for these meteorites. The spatial association of 327- to 2403-Ma depleted shergottites indicates >2 billion years of magmatism from a long-lived and geochemically distinct volcanic center near the ejection site.

  15. Two billion years of magmatism recorded from a single Mars meteorite ejection site

    PubMed Central

    Lapen, Thomas J.; Righter, Minako; Andreasen, Rasmus; Irving, Anthony J.; Satkoski, Aaron M.; Beard, Brian L.; Nishiizumi, Kunihiko; Jull, A. J. Timothy; Caffee, Marc W.

    2017-01-01

    The timing and nature of igneous activity recorded at a single Mars ejection site can be determined from the isotope analyses of Martian meteorites. Northwest Africa (NWA) 7635 has an Sm-Nd crystallization age of 2.403 ± 0.140 billion years, and isotope data indicate that it is derived from an incompatible trace element–depleted mantle source similar to that which produced a geochemically distinct group of 327- to 574-million-year-old “depleted” shergottites. Cosmogenic nuclide data demonstrate that NWA 7635 was ejected from Mars 1.1 million years ago (Ma), as were at least 10 other depleted shergottites. The shared ejection age is consistent with a common ejection site for these meteorites. The spatial association of 327- to 2403-Ma depleted shergottites indicates >2 billion years of magmatism from a long-lived and geochemically distinct volcanic center near the ejection site. PMID:28164153

  16. Billions for biodefense: federal agency biodefense funding, FY2009-FY2010.

    PubMed

    Franco, Crystal

    2009-09-01

    Since 2001, the United States government has spent substantial resources on preparing the nation against a bioterrorist attack. Earlier articles in this series analyzed civilian biodefense funding by the federal government for fiscal years (FY) 2001 through 2009. This article updates those figures with budgeted amounts for FY2010, specifically analyzing the budgets and allocations for biodefense at the Departments of Health and Human Services, Defense, Homeland Security, Agriculture, and State; the Environmental Protection Agency; and the National Science Foundation. This year's article also provides an assessment of the proportion of the biodefense budget that serves multiple programmatic goals and benefits, including research into infectious disease pathogenesis and immunology, public health planning and preparedness, and disaster response efforts. The FY2010 federal budget for civilian biodefense totals $6.05 billion. Of that total, $4.96 billion is budgeted for programs that serve multiple goals and provide manifold benefits.

  17. How to Bring Solar Energy to Seven Billion People (LBNL Science at the Theater)

    ScienceCinema

    Wadia, Cyrus

    2016-07-12

    By exploiting the powers of nanotechnology and taking advantage of non-toxic, Earth-abundant materials, Berkeley Lab's Cyrus Wadia has fabricated new solar cell devices that have the potential to be several orders of magnitude less expensive than conventional solar cells. And by mastering the chemistry of these materials, and the economics of solar energy, he envisions bringing electricity to the 1.2 billion people now living without it.

  18. Medicare, Medicaid fraud a billion-dollar art form in the US

    PubMed Central

    Korcok, M

    1997-01-01

    Medicare and Medicaid fraud costs billions of dollars each year in the US. Investigators have shown that fraud is found in all segments of the health care system. Even though the Canadian system has stricter regulations and tighter controls, can regulators here afford to be complacent about believing that such abuse would not happen here? One province has established an antifraud unit to monitor its health insurance scheme; it already has 1 prosecution under its belt. PMID:9141996

  19. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    probabilities into quantum mechanics, and some psychologists have argued that they have a role to play in accounting for errors in judgment [30]. But, in...Discussion The mechanisms underlying naive estimates of the probabilities of unique events are largely inaccessible to consciousness, but they...Can quantum probability provide a new direction for cognitive modeling? Behavioral and Brain Sciences (in press).

  20. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  1. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP) can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  2. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  3. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    PubMed

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  4. Two ten-billion-solar-mass black holes at the centres of giant elliptical galaxies.

    PubMed

    McConnell, Nicholas J; Ma, Chung-Pei; Gebhardt, Karl; Wright, Shelley A; Murphy, Jeremy D; Lauer, Tod R; Graham, James R; Richstone, Douglas O

    2011-12-08

    Observational work conducted over the past few decades indicates that all massive galaxies have supermassive black holes at their centres. Although the luminosities and brightness fluctuations of quasars in the early Universe suggest that some were powered by black holes with masses greater than 10 billion solar masses, the remnants of these objects have not been found in the nearby Universe. The giant elliptical galaxy Messier 87 hosts the hitherto most massive known black hole, which has a mass of 6.3 billion solar masses. Here we report that NGC 3842, the brightest galaxy in a cluster at a distance from Earth of 98 megaparsecs, has a central black hole with a mass of 9.7 billion solar masses, and that a black hole of comparable or greater mass is present in NGC 4889, the brightest galaxy in the Coma cluster (at a distance of 103 megaparsecs). These two black holes are significantly more massive than predicted by linearly extrapolating the widely used correlations between black-hole mass and the stellar velocity dispersion or bulge luminosity of the host galaxy. Although these correlations remain useful for predicting black-hole masses in less massive elliptical galaxies, our measurements suggest that different evolutionary processes influence the growth of the largest galaxies and their black holes.

  5. Planet Earth 2025. A look into a future world of 8 billion humans.

    PubMed

    Hinrichsen, D; Rowley, J

    1999-01-01

    Population projections for the next quarter century are reasonably predictable, and related resource challenges are quite visible. The world's population is expected to grow to around 8 billion by 2025. According to the International Food Policy Research Institute, if current levels of investments in agriculture and social welfare continue, food grain production will increase by about 1.5% and livestock production by 2.7% a year over the next 2 decades. These levels are much lower now compared to previous decades, and population could outstrip supply unless there is a big increase in developing country imports. The continued destruction of the earth's forest mantle as a result of human activities is another desperate concern. By 2025, some 3 billion people will live in land-short countries and another 2 billion will be living in urban areas with high levels of air pollution. In addition, coastal ecosystems, which are already exposed to unbridled coastal development and mounting pollution loads, will experience more pressures as the number of people living near them increases in the next 25 years. One final challenge is the unprecedented rate of habitat loss and species extinction. Ecosystem destruction is so severe that as many as 60,000 plant species could be lost by the year 2025.

  6. MMap: Fast Billion-Scale Graph Computation on a PC via Memory Mapping

    PubMed Central

    Lin, Zhiyuan; Kahng, Minsuk; Sabrin, Kaeser Md.; Chau, Duen Horng (Polo); Lee, Ho; Kang, U

    2015-01-01

    Graph computation approaches such as GraphChi and TurboGraph recently demonstrated that a single PC can perform efficient computation on billion-node graphs. To achieve high speed and scalability, they often need sophisticated data structures and memory management strategies. We propose a minimalist approach that forgoes such requirements, by leveraging the fundamental memory mapping (MMap) capability found on operating systems. We contribute: (1) a new insight that MMap is a viable technique for creating fast and scalable graph algorithms that surpasses some of the best techniques; (2) the design and implementation of popular graph algorithms for billion-scale graphs with little code, thanks to memory mapping; (3) extensive experiments on real graphs, including the 6.6 billion edge YahooWeb graph, and show that this new approach is significantly faster or comparable to the highly-optimized methods (e.g., 9.5× faster than GraphChi for computing PageRank on 1.47B edge Twitter graph). We believe our work provides a new direction in the design and development of scalable algorithms. Our packaged code is available at http://poloclub.gatech.edu/mmap/. PMID:25866846
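
    The core idea, mapping the on-disk edge list into virtual memory and letting the operating system handle paging, can be sketched in a few lines. The snippet below is an illustrative sketch, not the authors' implementation; the file name edges.bin and the (src, dst) uint32 layout are assumptions.

      import numpy as np

      # Map a flat binary edge list of (src, dst) uint32 pairs into memory;
      # the OS pages it in on demand instead of loading the graph into RAM.
      edges = np.memmap("edges.bin", dtype=np.uint32, mode="r").reshape(-1, 2)
      n_nodes = int(edges.max()) + 1

      # One out-degree-normalized, PageRank-style pass over the mapped edges.
      rank = np.full(n_nodes, 1.0 / n_nodes)
      out_deg = np.bincount(edges[:, 0], minlength=n_nodes)
      contrib = rank[edges[:, 0]] / np.maximum(out_deg[edges[:, 0]], 1)
      new_rank = 0.15 / n_nodes + 0.85 * np.bincount(
          edges[:, 1], weights=contrib, minlength=n_nodes
      )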

  7. Severe Obesity In Adults Cost State Medicaid Programs Nearly $8 Billion In 2013.

    PubMed

    Wang, Y Claire; Pamplin, John; Long, Michael W; Ward, Zachary J; Gortmaker, Steven L; Andreyeva, Tatiana

    2015-11-01

    Efforts to expand Medicaid while controlling spending must be informed by a deeper understanding of the extent to which the high medical costs associated with severe obesity (having a body mass index of [Formula: see text] or higher) determine spending at the state level. Our analysis of population-representative data indicates that in 2013, severe obesity cost the nation approximately $69 billion, which accounted for 60 percent of total obesity-related costs. Approximately 11 percent of the cost of severe obesity was paid for by Medicaid, 30 percent by Medicare and other federal health programs, 27 percent by private health plans, and 30 percent out of pocket. Overall, severe obesity cost state Medicaid programs almost $8 billion a year, ranging from $5 million in Wyoming to $1.3 billion in California. These costs are likely to increase following Medicaid expansion and enhanced coverage of weight loss therapies in the form of nutrition consultation, drug therapy, and bariatric surgery. Ensuring and expanding Medicaid-eligible populations' access to cost-effective treatment for severe obesity should be part of each state's strategy to mitigate rising obesity-related health care costs.

  8. Information Processing Using Quantum Probability

    NASA Astrophysics Data System (ADS)

    Behera, Laxmidhar

    2006-11-01

    This paper presents an information processing paradigm that introduces collective response of multiple agents (computational units) while the level of intelligence associated with the information processing has been increased manifold. It is shown that if the potential field of the Schroedinger wave equation is modulated using a self-organized learning scheme, then the probability density function associated with the stochastic data is transferred to the probability amplitude function which is the response of the Schroedinger wave equation. This approach illustrates that information processing of data with stochastic behavior can be efficiently done using quantum probability instead of classical probability. The proposed scheme has been demonstrated through two applications: denoising and adaptive control.

  9. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
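
    The partition-and-reweight step described above can be made concrete with a small sketch (the detection and extinction values are invented for illustration; this is not the authors' code). Species are split at the median detection frequency, a vital rate is estimated within each group, and the group estimates are combined with weights equal to the proportion of species in each group.

      import numpy as np

      # Hypothetical data: detection frequencies and extinction outcomes.
      detections = np.array([1, 2, 2, 9, 11, 3, 14, 1, 8, 12])
      extinct    = np.array([1, 0, 1, 0,  0, 1,  0, 1, 0,  0])  # 1 = locally extinct

      # Partition species into low- and high-detection groups.
      low = detections <= np.median(detections)
      groups = [extinct[low], extinct[~low]]

      rates = [g.mean() for g in groups]                  # per-group extinction rate
      weights = [g.size / extinct.size for g in groups]   # proportion of species
      weighted_rate = sum(w * r for w, r in zip(weights, rates))
      print(rates, weighted_rate)  # low-detection species go extinct more often here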

  10. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.

  11. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    SciTech Connect

    Stewart, Jeffrey S.

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  12. Capture probabilities for secondary resonances

    NASA Technical Reports Server (NTRS)

    Malhotra, Renu

    1990-01-01

    A perturbed pendulum model is used to analyze secondary resonances, and it is shown that a self-similarity between secondary and primary resonances exists. Henrard's (1982) theory is used to obtain formulas for the capture probability into secondary resonances. The tidal evolution of Miranda and Umbriel is considered as an example, and significant probabilities of capture into secondary resonances are found.

  13. Definition of the Neutrosophic Probability

    NASA Astrophysics Data System (ADS)

    Smarandache, Florentin

    2014-03-01

    Neutrosophic probability (or likelihood) [1995] is a particular case of the neutrosophic measure. It is an estimation of an event (different from indeterminacy) to occur, together with an estimation that some indeterminacy may occur, and the estimation that the event does not occur. The classical probability deals with fair dice, coins, roulettes, spinners, decks of cards, and random walks, while neutrosophic probability deals with unfair, imperfect such objects and processes. For example, if we toss a regular die on an irregular surface which has cracks, then it is possible to get the die stuck on one of its edges or vertices in a crack (indeterminate outcome). The sample space in this case is {1, 2, 3, 4, 5, 6, indeterminacy}. So the probability of getting, for example, 1 is less than 1/6, since there are seven outcomes. The neutrosophic probability is a generalization of the classical probability because, when the chance of determinacy of a stochastic process is zero, these two probabilities coincide. The neutrosophic probability that an event A occurs is NP(A) = (ch(A), ch(indet A), ch(not A)) = (T, I, F), where T, I, F are subsets of [0,1]; T is the chance that A occurs, denoted ch(A); I is the indeterminate chance related to A, ch(indet A); and F is the chance that A does not occur, ch(not A). So, NP is a generalization of the Imprecise Probability as well. If T, I, and F are crisp numbers then -0 <= T + I + F <= 3+. We used the same notations (T, I, F) as in neutrosophic logic and set.
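
    A tiny numerical illustration of the (T, I, F) triple for the cracked-die example above; the 10 percent indeterminacy figure is an assumption chosen only to show the format.

      # Neutrosophic probability of rolling a 1 on a die that can lodge in a
      # crack; the 0.10 indeterminacy chance is an illustrative assumption.
      ch_indet = 0.10                      # chance of an indeterminate outcome
      ch_one = (1 - ch_indet) / 6          # chance of a 1, now less than 1/6
      NP_one = (ch_one, ch_indet, 1 - ch_one - ch_indet)   # (T, I, F)
      print(NP_one)  # here T + I + F = 1, though only -0 <= T + I + F <= 3+ is required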

  14. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
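
    The central claim, that setting a threshold from estimated parameters yields a realized failure frequency above the nominal level, can be illustrated with a small Monte Carlo sketch. This is a toy log-normal example with invented sample sizes, not the authors' exact calculation.

      import numpy as np

      rng = np.random.default_rng(1)

      # Toy setup: losses are log-normal; the decisionmaker estimates the
      # parameters from n_data past observations and sets the threshold at
      # the estimated 99th percentile (nominal failure probability 1%).
      nominal_p, n_data, n_trials = 0.01, 30, 20_000
      true_mu, true_sigma = 0.0, 1.0
      z_99 = 2.3263                       # standard normal 99th percentile

      failures = 0
      for _ in range(n_trials):
          past = rng.lognormal(true_mu, true_sigma, n_data)
          mu_hat = np.log(past).mean()
          sigma_hat = np.log(past).std(ddof=1)
          threshold = np.exp(mu_hat + z_99 * sigma_hat)
          failures += rng.lognormal(true_mu, true_sigma) > threshold

      print(failures / n_trials, "vs nominal", nominal_p)  # typically a bit above 0.01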

  15. Cluster membership probability: polarimetric approach

    NASA Astrophysics Data System (ADS)

    Medhi, Biman J.; Tamura, Motohide

    2013-04-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q (per cent) and u (per cent) for the proper-motion member stars depends on the interstellar and intracluster differential reddening in the open cluster. It is found that this method could be used to estimate the cluster membership probability if we have additional polarimetric and photometric information for a star to identify it as a probable member/non-member of a particular cluster, such as the maximum wavelength value (λmax), the unit weight error of the fit (σ1), the dispersion in the polarimetric position angles (ε̄), reddening (E(B - V)) or the differential intracluster reddening (ΔE(B - V)). This method could also be used to estimate the membership probability of known member stars having no membership probability as well as to resolve disagreements about membership among different proper-motion surveys.

  16. Holographic Probabilities in Eternal Inflation

    NASA Astrophysics Data System (ADS)

    Bousso, Raphael

    2006-11-01

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  17. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core

    NASA Astrophysics Data System (ADS)

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J.; Greene, Jenny E.; Blakeslee, John P.; Janish, Ryan

    2016-04-01

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day ‘dormant’ descendants of this population of ‘active’ black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall—the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600—a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  18. The First Billion Years: The Growth of Galaxies in the Reionization Epoch

    NASA Astrophysics Data System (ADS)

    Illingworth, Garth

    2015-08-01

    Detection and measurement of the earliest galaxies in the first billion years only became possible after the Hubble Space Telescope was updated in 2009 with the infrared WFC3/IR camera during Shuttle servicing mission SM4. The first billion years is a fascinating epoch, not just because of the earliest galaxies known from about 450 Myr after the Big Bang, but also because it encompasses the reionization epoch that peaked around z~9, as Planck has recently shown, and ended around redshift z~6 at 900 Myr. Before 2009 just a handful of galaxies were known in the reionization epoch at z>6. But within the last 5 years, with the first HUDF09 survey, the HUDF12, CANDELS and numerous other surveys on the GOODS and CANDELS fields, as well as detections from the cluster lensing programs like CLASH and the Frontier Fields, the number of galaxies at redshifts 7-10 has exploded, with some 700 galaxies being found and characterized. The first billion years was a period of extraordinary growth in the galaxy population with rapid growth in the star formation rate density and global mass density in galaxies. Spitzer observations in the infrared of these Hubble fields are establishing masses as well as giving insights into the nature and timescales of star formation from the very powerful emission lines being revealed by the Spitzer IRAC data. I will discuss what we understand about the growth of galaxies in this epoch from the insights gained from remarkable deep fields like the XDF, as well as the wide-area GOODS/CANDELS fields, the detection of unexpectedly luminous galaxies at redshifts 8-10, the impact of early galaxies on reionization, confirmation of a number of galaxies at z~7-8 from ground-based spectroscopic measurements, and the indications of a change in the growth of the star formation rate around 500 Myr. The first billion years was a time of dramatic growth and change in the early galaxy population.

  19. The $1.5 billion question: Can the US Global Change Research Program deliver on its promises

    SciTech Connect

    Monastersky, R.

    1993-09-04

    President Clinton has continued the funding for scientific investigations of global climatic change, increasing funds to a total of $1.5 billion spread among 11 different agencies. However, a growing number of critics warn that the program appears to be heading toward failure. The main issue is relevancy. Almost everyone agrees that the research effort will support important scientific work over the next decade, but it will not necessarily provide the information policymakers need to address the threat of climatic change, ozone depletion, deforestation, desertification, and similar issues. This article summarizes the concerns and comments of critics, and the gap between the climate scientists and governmental policymakers.

  20. Exploring for Galaxies in the First Billion Years with Hubble and Spitzer - Pathfinding for JWST

    NASA Astrophysics Data System (ADS)

    Illingworth, Garth D.

    2017-01-01

    Hubble has revolutionized the field of distant galaxies through its deep imaging surveys, starting with the Hubble Deep Field (HDF) in 1995. That first deep survey revealed galaxies at redshift z~1-3 that provided insights into the development of the Hubble sequence. Each new HST instrument has explored new regimes, through the peak of star formation at z~2-3, just 2-3 billion years after the Big Bang, to our first datasets at a billion years at z~6, and then earlier to z~11. HST's survey capabilities were enhanced by 40X with ACS, and then similarly with the WFC3/IR, which opened up the first billion years to an unforeseen degree. I will discuss what we have learned from the remarkable HST and Spitzer imaging surveys (HUDF, GOODS, HUDF09/12 and CANDELS), as well as surveys of clusters like the Hubble Frontier Fields (HFF). Lensing clusters provide extraordinary opportunities for characterizing the faintest earliest galaxies, but also present extraordinary challenges. Together these surveys have resulted in the measurement of the volume density of galaxies in the first billion years down to astonishingly faint levels. The role of faint galaxies in reionizing the universe is still much-discussed, but there is no doubt that such galaxies contribute greatly to the UV ionizing flux, as shown by deep luminosity function studies. Together Hubble and Spitzer have also established the stellar-mass buildup over 97% of cosmic history. Yet some of the greatest surprises have come from the discovery of very luminous galaxies at z~8-11, around 400-650 million years after the Big Bang. Spectroscopic followup by Keck of some of these very rare, bright galaxies has confirmed redshifts from z~7 to z~9, and revealed, surprisingly, strong Lyα emission near the peak of reionization when the HI fraction in the IGM is high. The recent confirmation of a z=11.1 galaxy, just 400 million years after the Big Bang, by a combination of Hubble and Spitzer data, moved Hubble into JWST territory.

  1. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  2. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  3. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  4. Air density 2.7 billion years ago limited to less than twice modern levels by fossil raindrop imprints.

    PubMed

    Som, Sanjoy M; Catling, David C; Harnmeijer, Jelte P; Polivka, Peter M; Buick, Roger

    2012-03-28

    According to the 'Faint Young Sun' paradox, during the late Archaean eon a Sun approximately 20% dimmer warmed the early Earth such that it had liquid water and a clement climate. Explanations for this phenomenon have invoked a denser atmosphere that provided warmth by nitrogen pressure broadening or enhanced greenhouse gas concentrations. Such solutions are allowed by geochemical studies and numerical investigations that place approximate concentration limits on Archaean atmospheric gases, including methane, carbon dioxide and oxygen. But no field data constraining ground-level air density and barometric pressure have been reported, leaving the plausibility of these various hypotheses in doubt. Here we show that raindrop imprints in tuffs of the Ventersdorp Supergroup, South Africa, constrain surface air density 2.7 billion years ago to less than twice modern levels. We interpret the raindrop fossils using experiments in which water droplets of known size fall at terminal velocity into fresh and weathered volcanic ash, thus defining a relationship between imprint size and raindrop impact momentum. Fragmentation following raindrop flattening limits raindrop size to a maximum value independent of air density, whereas raindrop terminal velocity varies as the inverse of the square root of air density. If the Archaean raindrops reached the modern maximum measured size, air density must have been less than 2.3 kg m^-3, compared to today's 1.2 kg m^-3, but because such drops rarely occur, air density was more probably below 1.3 kg m^-3. The upper estimate for air density renders the pressure broadening explanation possible, but it is improbable under the likely lower estimates. Our results also disallow the extreme CO2 levels required for hot Archaean climates.
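
    A short worked check of the scaling quoted above: for a drop of fixed size, terminal velocity varies as the inverse square root of air density. The 9 m/s figure for a large modern raindrop is an assumed illustrative value, not a number from the paper.

      import math

      # Terminal-velocity scaling v ~ 1/sqrt(air density) for a fixed drop size.
      rho_modern = 1.2    # kg m^-3, present-day surface air density
      rho_upper = 2.3     # kg m^-3, the paper's upper Archaean estimate
      v_modern = 9.0      # m/s, assumed terminal velocity of a large drop today
      v_archaean = v_modern * math.sqrt(rho_modern / rho_upper)
      print(f"{v_archaean:.1f} m/s")  # ~6.5 m/s: denser air means slower, gentler impacts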

  5. A massive galaxy in its core formation phase three billion years after the Big Bang.

    PubMed

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha Förster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison; Leja, Joel; Rix, Hans-Walter; Skelton, Rosalind; van der Wel, Arjen; Whitaker, Katherine; Wuyts, Stijn

    2014-09-18

    Most massive galaxies are thought to have formed their dense stellar cores in early cosmic epochs. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes, but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we report a candidate core in the process of formation 11 billion years ago, at redshift z = 2.3. This galaxy, GOODS-N-774, has a stellar mass of 100 billion solar masses, a half-light radius of 1.0 kiloparsecs and a star formation rate of solar masses per year. The star-forming gas has a velocity dispersion of 317 ± 30 kilometres per second. This is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774, which are compact quiescent galaxies at z ≈ 2 (refs 8-11) and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 seem to be rare; however, from the star formation rate and size of this galaxy we infer that many star-forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys.

  6. Greenhouse gas implications of a 32 billion gallon bioenergy landscape in the US

    NASA Astrophysics Data System (ADS)

    DeLucia, E. H.; Hudiburg, T. W.; Wang, W.; Khanna, M.; Long, S.; Dwivedi, P.; Parton, W. J.; Hartman, M. D.

    2015-12-01

    Sustainable bioenergy for transportation fuel and greenhouse gas (GHGs) reductions may require considerable changes in land use. Perennial grasses have been proposed because of their potential to yield substantial biomass on marginal lands without displacing food and reduce GHG emissions by storing soil carbon. Here, we implemented an integrated approach to planning bioenergy landscapes by combining spatially-explicit ecosystem and economic models to predict a least-cost land allocation for a 32 billion gallon (121 billion liter) renewable fuel mandate in the US. We find that 2022 GHG transportation emissions are decreased by 7% when 3.9 million hectares of eastern US land are converted to perennial grasses supplemented with corn residue to meet cellulosic ethanol requirements, largely because of gasoline displacement and soil carbon storage. If renewable fuel production is accompanied by a cellulosic biofuel tax credit, CO2 equivalent emissions could be reduced by 12%, because it induces more cellulosic biofuel and land under perennial grasses (10 million hectares) than under the mandate alone. While GHG reducing bioenergy landscapes that meet RFS requirements and do not displace food are possible, the reductions in GHG emissions are 50% less compared to previous estimates that did not account for economically feasible land allocation.

  7. Energy tax price tag for CPI: $1.2 billion, jobs, and production

    SciTech Connect

    Begley, R.

    1993-03-03

    If President Clinton's proposed energy tax had been fully in place last year, it would have cost the US chemical industry an additional $1.2 billion and 9,900 jobs, according to Chemical Manufacturers Association (CMA; Washington) estimates. It also would have driven output down 3% and prices up 5%, CMA says. Allen Lenz, CMA director/trade and economics, says the increase in production costs that would accompany the tax will not be shared by foreign competitors, cannot be neutralized with higher border taxes because of existing trade agreements, and provides another reason to move production offshore. Worse, the US chemical industry's generally impressive trade surplus declined by $2.5 billion last year, and a further drop is projected for this year. The margin of error gets thinner all the time as competition increases, Lenz says. We're not concerned only with the chemical industry, but the rest of US-based manufacturing, because they take half our output, he adds. One problem is the energy intensiveness of the chemical process industries: a CMA report says that 55% of the cost of producing ethylene glycol is energy related. And double taxation of such things as coproducts returned for credit to oil refineries could add up to $115 million/year, the report says.

  8. The Value Of The Nonprofit Hospital Tax Exemption Was $24.6 Billion In 2011.

    PubMed

    Rosenbaum, Sara; Kindig, David A; Bao, Jie; Byrnes, Maureen K; O'Laughlin, Colin

    2015-07-01

    The federal government encourages public support for charitable activities by allowing people to deduct donations to tax-exempt organizations on their income tax returns. Tax-exempt hospitals are major beneficiaries of this policy because it encourages donations to the hospitals while shielding them from federal and state tax liability. In exchange, these hospitals must engage in community benefit activities, such as providing care to indigent patients and participating in Medicaid. The congressional Joint Committee on Taxation estimated the value of the nonprofit hospital tax exemption at $12.6 billion in 2002--a number that included forgone taxes, public contributions, and the value of tax-exempt bond financing. In this article we estimate that the size of the exemption reached $24.6 billion in 2011. The Affordable Care Act (ACA) brings a new focus on community benefit activities by requiring tax-exempt hospitals to engage in communitywide planning efforts to improve community health. The magnitude of the tax exemption, coupled with ACA reforms, underscores the public's interest not only in community benefit spending generally but also in the extent to which nonprofit hospitals allocate funds for community benefit expenditures that improve the overall health of their communities.

  9. Increased subaerial volcanism and the rise of atmospheric oxygen 2.5 billion years ago.

    PubMed

    Kump, Lee R; Barley, Mark E

    2007-08-30

    The hypothesis that the establishment of a permanently oxygenated atmosphere at the Archaean-Proterozoic transition (approximately 2.5 billion years ago) occurred when oxygen-producing cyanobacteria evolved is contradicted by biomarker evidence for their presence in rocks 200 million years older. To sustain vanishingly low oxygen levels despite near-modern rates of oxygen production from approximately 2.7-2.5 billion years ago thus requires that oxygen sinks must have been much larger than they are now. Here we propose that the rise of atmospheric oxygen occurred because the predominant sink for oxygen in the Archaean era, enhanced submarine volcanism, was abruptly and permanently diminished during the Archaean-Proterozoic transition. Observations are consistent with the corollary that subaerial volcanism only became widespread after a major tectonic episode of continental stabilization at the beginning of the Proterozoic. Submarine volcanoes are more reducing than subaerial volcanoes, so a shift from predominantly submarine to a mix of subaerial and submarine volcanism more similar to that observed today would have reduced the overall sink for oxygen and led to the rise of atmospheric oxygen.

  10. The evolution in the stellar mass of brightest cluster galaxies over the past 10 billion years

    NASA Astrophysics Data System (ADS)

    Bellstedt, Sabine; Lidman, Chris; Muzzin, Adam; Franx, Marijn; Guatelli, Susanna; Hill, Allison R.; Hoekstra, Henk; Kurinsky, Noah; Labbe, Ivo; Marchesini, Danilo; Marsan, Z. Cemile; Safavi-Naeini, Mitra; Sifón, Cristóbal; Stefanon, Mauro; van de Sande, Jesse; van Dokkum, Pieter; Weigel, Catherine

    2016-08-01

    Using a sample of 98 galaxy clusters recently imaged in the near-infrared with the European Southern Observatory (ESO) New Technology Telescope, WIYN telescope and William Herschel Telescope, supplemented with 33 clusters from the ESO archive, we measure how the stellar mass of the most massive galaxies in the universe, namely brightest cluster galaxies (BCGs), increases with time. Most of the BCGs in this new sample lie in the redshift range 0.2 < z < 0.6, which has been noted in recent works to mark an epoch over which the growth in the stellar mass of BCGs stalls. From this sample of 132 clusters, we create a subsample of 102 systems that includes only those clusters that have estimates of the cluster mass. We combine the BCGs in this subsample with BCGs from the literature, and find that the growth in stellar mass of BCGs from 10 billion years ago to the present epoch is broadly consistent with recent semi-analytic and semi-empirical models. As in other recent studies, tentative evidence indicates that the stellar mass growth rate of BCGs may have slowed over the past 3.5 billion years. Further work in collecting larger samples, and in better comparing observations with theory using mock images, is required if a more detailed comparison between the models and the data is to be made.

  11. Parametrization and Classification of 20 Billion LSST Objects: Lessons from SDSS

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Becker, A.C.; Becla, J.; Borne, K.; Burke, David L.; Claver, C.F.; Cook, K.H.; Connolly, A.; Gilmore, D.K.; Jones, R.L.; Juric, M.; Kahn, Steven M.; Lim, K-T.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Sesar, B.; Stubbs, Christopher W.; Tyson, J.Anthony; /UC, Davis

    2011-11-10

    The Large Synoptic Survey Telescope (LSST) will be a large, wide-field ground-based system designed to obtain, starting in 2015, multiple images of the sky that is visible from Cerro Pachon in Northern Chile. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times during the anticipated 10 years of operations (distributed over six bands, ugrizy). Each 30-second long visit will deliver 5σ depth for point sources of r ≈ 24.5 on average. The co-added map will be about 3 magnitudes deeper, and will include 10 billion galaxies and a similar number of stars. We discuss various measurements that will be automatically performed for these 20 billion sources, and how they can be used for classification and determination of source physical and other properties. We provide a few classification examples based on SDSS data, such as color classification of stars, color-spatial proximity search for wide-angle binary stars, orbital-color classification of asteroid families, and the recognition of main Galaxy components based on the distribution of stars in the position-metallicity-kinematics space. Guided by these examples, we anticipate that two grand classification challenges for LSST will be (1) rapid and robust classification of sources detected in difference images, and (2) simultaneous treatment of diverse astrometric and photometric time series measurements for an unprecedentedly large number of objects.

  12. A stimulating conversation. Healthcare organizations praise the economic stimulus law, start considering ways to use the $150 billion in relief.

    PubMed

    Lubell, Jennifer

    2009-02-23

    The industry eagerly awaits its $150 billion under the stimulus package, but not everyone will win, experts say. Still, "The initial stimulus package was a very solid start," said system exec Conway Collis, left.

  13. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
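
    The construction described above can be made concrete with a short simulation. The Python sketch below calibrates pointwise quantile bands for the standardized order statistics so that, under normality, all n plotted points fall inside them jointly with probability about 1-α; a sample is then declared consistent with normality exactly when every point lies inside its band. This is an illustration of the idea only, not the authors' exact intervals, and all function names and settings are assumptions.

        import numpy as np

        def simultaneous_envelope(n, alpha=0.05, n_sim=5000, seed=0):
            # Monte Carlo envelope for the n standardized order statistics of a
            # normal sample, widened until all n points fall inside jointly with
            # probability ~1 - alpha.
            rng = np.random.default_rng(seed)
            sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
            sims = (sims - sims.mean(axis=1, keepdims=True)) / sims.std(axis=1, ddof=1, keepdims=True)
            for q in np.linspace(0.499, 0.0, 500):
                lo, hi = np.quantile(sims, q, axis=0), np.quantile(sims, 1.0 - q, axis=0)
                joint_coverage = np.mean(np.all((sims >= lo) & (sims <= hi), axis=1))
                if joint_coverage >= 1.0 - alpha:
                    return lo, hi
            return sims.min(axis=0), sims.max(axis=0)

        def consistent_with_normal(x, alpha=0.05):
            # Objective rule: accept normality iff every standardized order
            # statistic lies inside the simultaneous envelope.
            x = np.sort(np.asarray(x, dtype=float))
            z = (x - x.mean()) / x.std(ddof=1)
            lo, hi = simultaneous_envelope(len(x), alpha)
            return bool(np.all((z >= lo) & (z <= hi)))

        rng = np.random.default_rng(1)
        print(consistent_with_normal(rng.normal(size=50)))       # normal data: usually True
        print(consistent_with_normal(rng.exponential(size=50)))  # skewed data: usually False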

  14. Detonation probabilities of high explosives

    SciTech Connect

    Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.

    1995-07-01

    The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.

  15. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  16. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  17. Evidence for Early Life in ˜3.5 Billion-Year-Old Pillow Lavas

    NASA Astrophysics Data System (ADS)

    Banerjee, N. R.; Furnes, H.; Muehlenbachs, K.; Staudigel, H.; de Wit, M.

    2004-12-01

    Recently discovered biosignatures in the formerly glassy rims of ˜3.5 billion-year-old pillow lavas from the Barberton Greenstone Belt (BGB) in South Africa suggest they were colonized by microbes early in Earth's history. These subaqueous volcanic rocks represent a new geological setting in the search for early life on Earth. This is not entirely surprising since microbial alteration of basaltic glass in pillow lavas and volcaniclastic rocks has been well documented from recent oceanic crust and well-preserved ophiolites. The BGB magmatic sequence contains exceptionally well-preserved mafic to ultramafic pillow lavas, sheet flows, and intrusions interpreted to represent 3.48 to 3.22 billion-year-old oceanic crust and island arc assemblages. We observed micron-sized tubular structures mineralized by titanite in the formerly glassy rims of the BGB pillow lavas. Based on their similarity to textures observed in recent glassy pillow basalts we interpret these structures to represent ancient traces of microbial activity formed during biogenic etching of the originally glassy pillow rims as microbes colonized the glass surface. Petrographic observations coupled with overlapping metamorphic and magmatic dates indicate this process occurred soon after eruption of the pillow lavas. Subsequent greenschist facies seafloor hydrothermal alteration caused the structures to be mineralized by titanite; a process also observed in ophiolitic pillow lavas of much younger age. X-ray mapping reveals the presence of carbon along the margins of the tubular structures interpreted as residual organic material. Disseminated carbonates within the microbially-altered BGB pillow rims have low carbon isotope values consistent with microbial oxidation of organic matter. In contrast, disseminated carbonate in the crystalline pillow interiors have carbon isotope values bracketed between Archean marine carbonate and mantle carbon dioxide. It remains to be seen how deep into the Archean oceanic

  18. The VIRMOS-VLT Deep Survey: the Last 10 Billion Years of Evolution of Galaxy Clustering

    NASA Astrophysics Data System (ADS)

    Pollo, A.; Guzzo, L.; Le Fèvre, O.; Meneux, B.; Cappi, A.; McCracken, H. J.; Iovino, A.; Marinoni, C.; Bottini, D.; Garilli, B.; Le Brun, V. L.; Maccagni, D.; Picat, J. P.; Scaramella, R.; Scodeggio, M.; Tresse, L.; Vettolani, G.; Zanichelli, A.; Adami, C.; Arnouts, S.; Bardelli, S.; Bolzonella, M.; Charlot, S.; Ciliegi, P.; Contini, T.; Foucaud, S.; Franzetti, P.; Gavignaud, I.; Ilbert, O.; Marano, B.; Mazure, A.; Merighi, R.; Paltani, S.; Pellò, R.; Pozzetti, L.; Radovich, M.; Zamorani, G.; Zucca, E.; Bondi, M.; Bongiorno, A.; Brinchmann, J.; Cucciati, O.; de la Torre, S.; Lamareille, F.; Mellier, Y.; Merluzzi, P.; Temporin, S.; Vergani, D.; Walcher, C. J.

    2007-12-01

    We discuss the evolution of clustering of galaxies in the Universe from the present epoch back to z ˜ 2, using the first-epoch data from the VIMOS-VLT Deep Survey (VVDS). We present the evolution of the projected two-point correlation function of galaxies for the global galaxy population, as well as its dependence on galaxy intrinsic luminosities and spectral types. While we do not find strong variations of the correlation function parameters with redshift for the global galaxy population, the clustering of objects with different intrinsic luminosities evolved significantly during the last 8-10 billion years. Our findings indicate that bright galaxies in the past traced higher density peaks than they do now and that the shape of the correlation function of the most luminous galaxies is different from that observed for their local counterparts, which is supporting evidence for a non-trivial evolution of the galaxy versus dark matter bias.

  19. Collision-free spatial hash functions for structural analysis of billion-vertex chemical bond networks

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Bansal, Bhupesh; Branicio, Paulo S.; Kalia, Rajiv K.; Nakano, Aiichiro; Sharma, Ashish; Vashishta, Priya

    2006-09-01

    State-of-the-art molecular dynamics (MD) simulations generate massive datasets involving billion-vertex chemical bond networks, which makes data mining based on graph algorithms such as K-ring analysis a challenge. This paper proposes an algorithm to improve the efficiency of ring analysis of large graphs, exploiting properties of K-rings and spatial correlations of vertices in the graph. The algorithm uses dual-tree expansion (DTE) and spatial hash-function tagging (SHAFT) to optimize computation and memory access. Numerical tests show nearly perfect linear scaling of the algorithm. Also a parallel implementation of the DTE + SHAFT algorithm achieves high scalability. The algorithm has been successfully employed to analyze large MD simulations involving up to 500 million atoms.
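
    As a small illustration of the spatial-hash idea behind SHAFT, the Python sketch below bins vertex positions into cells and packs the integer cell coordinates into a single 64-bit key, which is collision-free as long as each axis has fewer than 2^21 cells; ring or neighbour searches then only need to visit nearby cells rather than scan the whole graph. This is a generic sketch under assumed names and parameters, not the paper's DTE + SHAFT implementation.

        import numpy as np

        def spatial_hash(positions, box, cell_size):
            # Bin positions into cells and pack (ix, iy, iz) into one 64-bit key.
            ncell = np.floor(box / cell_size).astype(np.int64)              # cells per axis
            idx = np.floor(positions / cell_size).astype(np.int64) % ncell  # periodic wrap
            assert np.all(ncell < (1 << 21)), "axis too large for collision-free packing"
            return (idx[:, 0] << 42) | (idx[:, 1] << 21) | idx[:, 2]

        rng = np.random.default_rng(0)
        box = np.array([100.0, 100.0, 100.0])
        pos = rng.uniform(0.0, 100.0, size=(10_000, 3))
        keys = spatial_hash(pos, box, cell_size=5.0)

        cells = {}
        for i, k in enumerate(keys):
            cells.setdefault(int(k), []).append(i)  # bucket vertex indices by cell key
        print(len(cells), "occupied cells")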

  20. Enumeration of 166 billion organic small molecules in the chemical universe database GDB-17.

    PubMed

    Ruddigkeit, Lars; van Deursen, Ruud; Blum, Lorenz C; Reymond, Jean-Louis

    2012-11-26

    Drug molecules consist of a few tens of atoms connected by covalent bonds. How many such molecules are possible in total and what is their structure? This question is of pressing interest in medicinal chemistry to help solve the problems of drug potency, selectivity, and toxicity and reduce attrition rates by pointing to new molecular series. To better define the unknown chemical space, we have enumerated 166.4 billion molecules of up to 17 atoms of C, N, O, S, and halogens forming the chemical universe database GDB-17, covering a size range containing many drugs and typical for lead compounds. GDB-17 contains millions of isomers of known drugs, including analogs with high shape similarity to the parent drug. Compared to known molecules in PubChem, GDB-17 molecules are much richer in nonaromatic heterocycles, quaternary centers, and stereoisomers, densely populate the third dimension in shape space, and represent many more scaffold types.

  1. Geodynamo, solar wind, and magnetopause 3.4 to 3.45 billion years ago.

    PubMed

    Tarduno, John A; Cottrell, Rory D; Watkeys, Michael K; Hofmann, Axel; Doubrovine, Pavel V; Mamajek, Eric E; Liu, Dunji; Sibeck, David G; Neukirch, Levi P; Usui, Yoichi

    2010-03-05

    Stellar wind standoff by a planetary magnetic field prevents atmospheric erosion and water loss. Although the early Earth retained its water and atmosphere, and thus evolved as a habitable planet, little is known about Earth's magnetic field strength during that time. We report paleointensity results from single silicate crystals bearing magnetic inclusions that record a geodynamo 3.4 to 3.45 billion years ago. The measured field strength is approximately 50 to 70% that of the present-day field. When combined with a greater Paleoarchean solar wind pressure, the paleofield strength data suggest steady-state magnetopause standoff distances of ≤5 Earth radii, similar to values observed during recent coronal mass ejection events. The data also suggest lower-latitude aurora and increases in polar cap area, as well as heating, expansion, and volatile loss from the exosphere that would have affected long-term atmospheric composition.

  2. Early formation of the Moon 4.51 billion years ago

    PubMed Central

    Barboni, Melanie; Boehnke, Patrick; Keller, Brenhin; Kohl, Issaku E.; Schoene, Blair; Young, Edward D.; McKeegan, Kevin D.

    2017-01-01

    Establishing the age of the Moon is critical to understanding solar system evolution and the formation of rocky planets, including Earth. However, despite its importance, the age of the Moon has never been accurately determined. We present uranium-lead dating of Apollo 14 zircon fragments that yield highly precise, concordant ages, demonstrating that they are robust against postcrystallization isotopic disturbances. Hafnium isotopic analyses of the same fragments show extremely low initial 176Hf/177Hf ratios corrected for cosmic ray exposure that are near the solar system initial value. Our data indicate differentiation of the lunar crust by 4.51 billion years, indicating the formation of the Moon within the first ~60 million years after the birth of the solar system. PMID:28097222

  3. Investigation of Radar Propagation in Buildings: A 10 Billion Element Cartesian-Mesh FETD Simulation

    SciTech Connect

    Stowell, M L; Fasenfest, B J; White, D A

    2008-01-14

    In this paper large scale full-wave simulations are performed to investigate radar wave propagation inside buildings. In principle, a radar system combined with sophisticated numerical methods for inverse problems can be used to determine the internal structure of a building. The composition of the walls (cinder block, re-bar) may affect the propagation of the radar waves in a complicated manner. In order to provide a benchmark solution of radar propagation in buildings, including the effects of typical cinder block and re-bar, we performed large scale full wave simulations using a Finite Element Time Domain (FETD) method. This particular FETD implementation is tuned for the special case of an orthogonal Cartesian mesh and hence resembles FDTD in accuracy and efficiency. The method was implemented on a general-purpose massively parallel computer. In this paper we briefly describe the radar propagation problem, the FETD implementation, and we present results of simulations that used over 10 billion elements.

  4. Billion-atom synchronous parallel kinetic Monte Carlo simulations of critical 3D Ising systems

    SciTech Connect

    Martinez, E.; Monasterio, P.R.; Marian, J.

    2011-02-20

    An extension of the synchronous parallel kinetic Monte Carlo (spkMC) algorithm developed by Martinez et al. [J. Comp. Phys. 227 (2008) 3804] to discrete lattices is presented. The method solves the master equation synchronously by recourse to null events that keep all processors' time clocks current in a global sense. Boundary conflicts are resolved by adopting a chessboard decomposition into non-interacting sublattices. We find that the bias introduced by the spatial correlations attendant to the sublattice decomposition is within the standard deviation of serial calculations, which confirms the statistical validity of our algorithm. We have analyzed the parallel efficiency of spkMC and find that it scales consistently with problem size and sublattice partition. We apply the method to the calculation of scale-dependent critical exponents in billion-atom 3D Ising systems, with very good agreement with state-of-the-art multispin simulations.
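
    The null-event mechanism that keeps all processors' clocks current can be illustrated with a few lines of Python. In the sketch below every domain advances by the same exponential time step drawn from the maximum total rate, and a domain whose own rate is lower executes a null event with the complementary probability; the chessboard sublattice sweep that resolves boundary conflicts in the paper is omitted. Names and rates are assumptions for illustration.

        import numpy as np

        def synchronous_kmc_step(domain_rates, rng):
            # One synchronous step: a common time increment for all domains,
            # with real vs. null events decided per domain.
            r_max = max(domain_rates)
            dt = rng.exponential(1.0 / r_max)                  # shared time step
            fired = [rng.random() < r / r_max for r in domain_rates]
            return dt, fired

        rng = np.random.default_rng(0)
        rates = [3.0, 1.2, 2.5, 0.7]                           # total event rate per domain
        t = 0.0
        for _ in range(5):
            dt, fired = synchronous_kmc_step(rates, rng)
            t += dt
            print(f"t = {t:.3f}, domains firing real events: {fired}")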

  5. Extraterrestrial demise of banded iron formations 1.85 billion years ago

    USGS Publications Warehouse

    Slack, J.F.; Cannon, W.F.

    2009-01-01

    In the Lake Superior region of North America, deposition of most banded iron formations (BIFs) ended abruptly 1.85 Ga ago, coincident with the oceanic impact of the giant Sudbury extraterrestrial bolide. We propose a new model in which this impact produced global mixing of shallow oxic and deep anoxic waters of the Paleoproterozoic ocean, creating a suboxic redox state for deep seawater. This suboxic state, characterized by only small concentrations of dissolved O2 (~1 μM), prevented transport of hydrothermally derived Fe(II) from the deep ocean to continental-margin settings, ending an ~1.1 billion-year-long period of episodic BIF mineralization. The model is supported by the nature of Precambrian deep-water exhalative chemical sediments, which changed from predominantly sulfide facies prior to ca. 1.85 Ga to mainly oxide facies thereafter. © 2009 Geological Society of America.

  6. Barium fluoride whispering-gallery-mode disk-resonator with one billion quality-factor.

    PubMed

    Lin, Guoping; Diallo, Souleymane; Henriet, Rémi; Jacquot, Maxime; Chembo, Yanne K

    2014-10-15

    We demonstrate a monolithic optical whispering-gallery-mode resonator fabricated with barium fluoride (BaF₂) with an ultra-high quality (Q) factor above 10⁹ at 1550 nm, measured with both the linewidth and cavity-ring-down methods. Vertical scanning optical profilometry shows that a root mean square surface roughness of 2 nm is achieved for our mm-size disk. To the best of our knowledge, we show for the first time that a one billion Q-factor is achievable by precision polishing in relatively soft crystals with a Mohs hardness of 3. We show that complex thermo-optical dynamics can take place in these resonators. Besides the usual applications in nonlinear optics and microwave photonics, high-energy particle scintillation detection utilizing monolithic BaF₂ resonators potentially becomes feasible.
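
    For readers unfamiliar with the two measurement routes mentioned above, the quick calculation below shows how a quality factor above one billion follows either from the resonance linewidth (Q = ν₀/Δν) or from the cavity ring-down time (Q = 2πν₀τ). The linewidth and ring-down time used here are illustrative assumptions, not values from the paper.

        import math

        c = 299_792_458.0              # speed of light, m/s
        nu0 = c / 1550e-9              # optical carrier frequency at 1550 nm (~193 THz)
        delta_nu = 150e3               # assumed measured resonance linewidth, Hz
        tau = 1.0e-6                   # assumed measured photon ring-down time, s

        print(f"Q from linewidth : {nu0 / delta_nu:.2e}")           # ~1.3e9
        print(f"Q from ring-down : {2 * math.pi * nu0 * tau:.2e}")  # ~1.2e9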

  7. Star Formation in Galaxy Clusters Over the Past 10 Billion Years

    NASA Astrophysics Data System (ADS)

    Tran, Kim-Vy

    2012-01-01

    Galaxy clusters are the largest gravitationally bound systems in the universe and host its most massive galaxies; this makes galaxy clusters ideal laboratories for disentangling the nature versus nurture aspect of how galaxies evolve. Understanding how galaxies form and evolve in clusters continues to be a fundamental question in astronomy. The ages and assembly histories of galaxies in rich clusters test both stellar population models and hierarchical formation scenarios. Is star formation in cluster galaxies simply accelerated relative to their counterparts in the lower density field, or do cluster galaxies assemble their stars in a fundamentally different manner? To answer this question, I review multi-wavelength results on star formation in galaxy clusters from Coma to the most distant clusters yet discovered at look-back times of 10 billion years (z ≈ 2).

  8. Atmospheric carbon dioxide: a driver of photosynthetic eukaryote evolution for over a billion years?

    PubMed

    Beerling, David J

    2012-02-19

    Exciting evidence from diverse fields, including physiology, evolutionary biology, palaeontology, geosciences and molecular genetics, is providing an increasingly secure basis for robustly formulating and evaluating hypotheses concerning the role of atmospheric carbon dioxide (CO2) in the evolution of photosynthetic eukaryotes. Such studies span over a billion years of evolutionary change, from the origins of eukaryotic algae through to the evolution of our present-day terrestrial floras, and have relevance for plant and ecosystem responses to future global CO2 increases. The papers in this issue reflect the breadth and depth of approaches being adopted to address this issue. They reveal new discoveries pointing to deep evidence for the role of CO2 in shaping evolutionary changes in plants and ecosystems, and establish an exciting cross-disciplinary research agenda for uncovering new insights into feedbacks between biology and the Earth system.

  9. Atmospheric carbon dioxide: a driver of photosynthetic eukaryote evolution for over a billion years?

    PubMed Central

    Beerling, David J.

    2012-01-01

    Exciting evidence from diverse fields, including physiology, evolutionary biology, palaeontology, geosciences and molecular genetics, is providing an increasingly secure basis for robustly formulating and evaluating hypotheses concerning the role of atmospheric carbon dioxide (CO2) in the evolution of photosynthetic eukaryotes. Such studies span over a billion years of evolutionary change, from the origins of eukaryotic algae through to the evolution of our present-day terrestrial floras, and have relevance for plant and ecosystem responses to future global CO2 increases. The papers in this issue reflect the breadth and depth of approaches being adopted to address this issue. They reveal new discoveries pointing to deep evidence for the role of CO2 in shaping evolutionary changes in plants and ecosystems, and establish an exciting cross-disciplinary research agenda for uncovering new insights into feedbacks between biology and the Earth system. PMID:22232760

  10. Dust production 0.7-1.5 billion years after the Big Bang

    NASA Astrophysics Data System (ADS)

    Michałowski, Michał J.

    2016-06-01

    Cosmic dust is an important component of the Universe, and its origin, especially at high redshifts, is still unknown. I present a simple but powerful method of assessing whether dust observed in a given galaxy could in principle have been formed by asymptotic giant branch (AGB) stars or supernovae (SNe). Using this method I show that for most of the galaxies with detected dust emission between z=4 and z=7.5 (1.5-0.7 billion years after the Big Bang) AGB stars are not numerous and efficient enough to be responsible for the measured dust masses. Supernovae could account for most of the dust, but only if all of them had efficiencies close to the maximal theoretically allowed value. This suggests that a different mechanism is responsible for dust production at high redshifts, and the most likely possibility is the grain growth in the interstellar medium.

  11. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  12. Risk estimation using probability machines

    PubMed Central

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
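
    A minimal sketch of a "probability machine" in this sense is shown below: a random forest is used as a nonparametric estimator of the conditional probability P(Y=1|X), and an effect size is read off as a counterfactual risk difference. The scikit-learn class and the simulated logistic data-generating model are assumptions for illustration, not the paper's software.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 5000
        X = rng.normal(size=(n, 2))
        logit = -0.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.6 * X[:, 0] * X[:, 1]
        p_true = 1.0 / (1.0 + np.exp(-logit))
        y = rng.binomial(1, p_true)

        forest = RandomForestClassifier(n_estimators=500, min_samples_leaf=25, random_state=0)
        forest.fit(X, y)
        p_hat = forest.predict_proba(X)[:, 1]        # conditional probability estimates

        # Effect size as a counterfactual contrast: shift x1 by +1 and compare risks.
        X_shift = X.copy()
        X_shift[:, 0] += 1.0
        risk_diff = forest.predict_proba(X_shift)[:, 1] - p_hat
        print("mean |error| of P(Y=1|X):", float(np.abs(p_hat - p_true).mean()))
        print("average risk difference for +1 in x1:", float(risk_diff.mean()))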

  13. On probability-possibility transformations

    NASA Technical Reports Server (NTRS)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
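
    For readers who have not seen such transformations, the sketch below implements one widely used probability-to-possibility transformation (order the outcomes by decreasing probability and set each possibility equal to the corresponding tail sum). It is shown only as an example of the kind of mapping being compared; the paper evaluates several transformations, not necessarily this one.

        import numpy as np

        def prob_to_poss(p):
            # Possibility of outcome i = sum of the probabilities that do not
            # exceed p_i (tail sum after sorting in decreasing order).
            p = np.asarray(p, dtype=float)
            order = np.argsort(p)[::-1]             # indices, largest probability first
            tail = np.cumsum(p[order][::-1])[::-1]  # tail sums of the sorted probabilities
            poss = np.empty_like(p)
            poss[order] = tail
            return poss

        print(prob_to_poss([0.5, 0.3, 0.2]))   # -> [1.0, 0.5, 0.2]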

  14. Children's Understanding of Posterior Probability

    ERIC Educational Resources Information Center

    Girotto, Vittorio; Gonzalez, Michael

    2008-01-01

    Do young children have a basic intuition of posterior probability? Do they update their decisions and judgments in the light of new evidence? We hypothesized that they can do so extensionally, by considering and counting the various ways in which an event may or may not occur. The results reported in this paper showed that from the age of five,…

  15. Comments on quantum probability theory.

    PubMed

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  16. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-on-one foul shot situation in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)

  17. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  18. Searching for Organics Preserved in 4.5 Billion Year Old Salt

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael E.; Fries, M.; Steele, A.; Bodnar, R.

    2012-01-01

    Our understanding of early solar system fluids took a dramatic turn a decade ago with the discovery of fluid inclusion-bearing halite (NaCl) crystals in the matrix of two freshly fallen brecciated H chondrite falls, Monahans and Zag. Both meteorites are regolith breccias, and contain xenolithic halite (and minor admixed sylvite, KCl) crystals in their regolith lithologies. The halites are purple to dark blue, due to the presence of color centers (electrons in anion vacancies) which slowly accumulated as 40K (in sylvite) decayed over billions of years. The halites were dated by K-Ar, Rb-Sr and I-Xe systematics to be 4.5 billion years old. The "blue" halites were a fantastic discovery for the following reasons: (1) Halite+sylvite can be dated (K is in sylvite and will substitute for Na in halite, Rb substitutes in halite for Na, and I substitutes for Cl). (2) The blue color is lost if the halite dissolves on Earth and reprecipitates (because the newly-formed halite has no color centers), so the color serves as a "freshness" or pristinity indicator. (3) Halite frequently contains aqueous fluid inclusions. (4) Halite contains no structural oxygen, carbon or hydrogen, making it an ideal material for measuring these isotopic systems in any fluid inclusions. (5) It is possible to directly measure fluid inclusion formation temperatures, and thus directly measure the temperature of the mineralizing aqueous fluid. In addition to these two ordinary chondrites, halite grains have been reliably reported in several ureilites, an additional ordinary chondrite (Jilin), and in the carbonaceous chondrite (Murchison), although these reports were unfortunately not taken seriously. We have lately found additional fluid inclusions in carbonates in several additional carbonaceous chondrites. Meteoritic aqueous fluid inclusions are apparently relatively widespread in meteorites, though very small and thus difficult to analyze.

  19. Time-dependent earthquake probabilities

    USGS Publications Warehouse

    Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.

    2005-01-01

    We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have developed a framework based on a simple, generalized rate-change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault, in which the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (probability density function, or PDF) that describes a population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
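
    The basic time-dependent calculation underlying such estimates can be written in a few lines: given a recurrence-time distribution, the conditional probability of failure in the next Δt years, given t years since the last event, is [F(t+Δt)-F(t)]/[1-F(t)]. The Python sketch below uses a lognormal recurrence model purely for illustration; the paper's framework additionally folds stress-perturbation-driven rate changes into such estimates, which is not shown here.

        import numpy as np
        from scipy import stats

        def conditional_probability(t_elapsed, dt, mean_recurrence, cv=0.5):
            # Lognormal recurrence-time model with the requested mean and
            # coefficient of variation; returns P(failure in (t, t+dt] | no
            # failure by t).
            sigma = np.sqrt(np.log(1.0 + cv**2))
            mu = np.log(mean_recurrence) - 0.5 * sigma**2
            F = stats.lognorm(s=sigma, scale=np.exp(mu)).cdf
            return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

        # Example: 120 years since the last event, 150-year mean recurrence,
        # probability of an earthquake in the next 30 years.
        print(conditional_probability(t_elapsed=120.0, dt=30.0, mean_recurrence=150.0))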

  20. Understanding Y haplotype matching probability.

    PubMed

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  1. U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    SciTech Connect

    Downing, Mark; Eaton, Laurence M; Graham, Robin Lambert; Langholtz, Matthew H; Perlack, Robert D; Turhollow Jr, Anthony F; Stokes, Bryce; Brandt, Craig C

    2011-08-01

    The report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of 'potential' biomass based on numerous assumptions about current and future inventory, production capacity, availability, and technology. The analysis was made to determine if conterminous U.S. agriculture and forestry resources had the capability to produce at least one billion dry tons of sustainable biomass annually to displace 30% or more of the nation's present petroleum consumption. An effort was made to use conservative estimates to assure confidence in having sufficient supply to reach the goal. The potential biomass was projected to be reasonably available around mid-century when large-scale biorefineries are likely to exist. The study emphasized primary sources of forest- and agriculture-derived biomass, such as logging residues, fuel treatment thinnings, crop residues, and perennially grown grasses and trees. These primary sources have the greatest potential to supply large, reliable, and sustainable quantities of biomass. While the primary sources were emphasized, estimates of secondary residue and tertiary waste resources of biomass were also provided. The original Billion-Ton Resource Assessment, published in 2005, was divided into two parts-forest-derived resources and agriculture-derived resources. The forest resources included residues produced during the harvesting of merchantable timber, forest residues, and small-diameter trees that could become available through initiatives to reduce fire hazards and improve forest health; forest residues from land conversion; fuelwood extracted from forests; residues generated at primary forest product processing mills; and urban wood wastes, municipal solid wastes (MSW), and construction and demolition (C&D) debris. For these forest resources, only residues, wastes, and small-diameter trees were

  2. Probability summation--a critique.

    PubMed

    Laming, Donald

    2013-03-01

    This Discussion Paper seeks to kill off probability summation, specifically the high-threshold assumption, as an explanatory idea in visual science. In combination with a Weibull function with an exponent of about 4, probability summation can accommodate, to within the limits of experimental error, the shape of the detectability function for contrast, the reduction in threshold that results from the combination of widely separated grating components, summation with respect to duration at threshold, and some instances, but not all, of spatial summation. But it has repeated difficulty with stimuli below threshold, because it denies the availability of input from such stimuli. All the phenomena listed above, and many more, can be accommodated equally accurately by signal-detection theory combined with an accelerated nonlinear transform of small, near-threshold contrasts. This is illustrated with a transform that is the fourth power for the smallest contrasts, but tends to linear above threshold. Moreover, this particular transform can be derived from elementary properties of sensory neurons. Probability summation cannot be regarded as a special case of a more general theory, because it depends essentially on the 19th-century notion of a high fixed threshold. It is simply an obstruction to further progress.
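
    The quantitative content of the probability-summation account being criticized is easy to state: each component is detected with a Weibull probability and the compound stimulus is detected if any component is, so P = 1 - Π(1 - P_i). The Python sketch below writes this out; the exponent of about 4 follows the abstract, while the threshold parameters are assumptions for illustration.

        import numpy as np

        def weibull_detect(contrast, alpha, beta=4.0):
            # High-threshold Weibull psychometric function:
            # P(detect) = 1 - exp(-(c / alpha) ** beta).
            return 1.0 - np.exp(-(np.asarray(contrast, dtype=float) / alpha) ** beta)

        def probability_summation(contrasts, alphas, beta=4.0):
            # Detect the compound stimulus if any independent component is detected.
            p_each = [weibull_detect(c, a, beta) for c, a in zip(contrasts, alphas)]
            return 1.0 - np.prod([1.0 - p for p in p_each])

        # Two widely separated grating components, each at 80% of its own threshold:
        print(probability_summation(contrasts=[0.8, 0.8], alphas=[1.0, 1.0]))  # ~0.56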

  3. Objective Probability and Quantum Fuzziness

    NASA Astrophysics Data System (ADS)

    Mohrhoff, U.

    2009-02-01

    This paper offers a critique of the Bayesian interpretation of quantum mechanics with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the “objective preparations view” or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes. Most important among these are the objective fuzziness of all relative positions and momenta and the consequent incomplete spatiotemporal differentiation of the physical world. The latter makes it possible to draw a clear distinction between the macroscopic and the microscopic. This in turn makes it possible to understand the special status of measurements in all standard formulations of the theory. Whereas Bayesians have written contemptuously about the “folly” of conjoining “objective” to “probability,” there are various reasons why quantum-mechanical probabilities can be considered objective, not least the fact that they are needed to quantify an objective fuzziness. But this cannot be appreciated without giving thought to the makeup of the world, which

  4. Cooling and exhumation of continents at billion-year time scales

    NASA Astrophysics Data System (ADS)

    Blackburn, T.; Bowring, S. A.; Perron, T.; Mahan, K. H.; Dudas, F. O.

    2011-12-01

    The oldest rocks on Earth are preserved within the continental lithosphere, where assembled fragments of ancient orogenic belts have survived erosion and destruction by plate tectonic and surface processes for billions of years. Though the rate of orogenic exhumation and erosion has been measured for segments of an orogenic history, it remains unclear how these exhumation rates have changed over the lifetime of any terrane. Because the exhumation of the lithospheric surface has a direct effect on the rate of heat loss within the lithosphere, a continuous record of lithosphere exhumation can be reconstructed through the use of thermochronology. Thermochronologic studies have typically employed systems sensitive to cooling at temperatures <300 °C, such as the (U-Th)/He and 40Ar/39Ar systems. This largely restricts their application to measuring cooling in rocks from the outer 10 km of the Earth's crust, resulting in a thermal history that is controlled by either upper crustal flexure and faulting and/or isotherm inflections related to surface topography. Combining these biases with the uplift, erosion and recycling of these shallow rocks results in a poor preservation potential of any long-term record. Here, an ancient and long-term record of lithosphere exhumation is constructed using U-Pb thermochronology, a geochronologic system sensitive to cooling at temperatures found at 20-50 km depth (400-650 °C). Lower crustal xenoliths provide material that resided at these depths for billions of years or more, recording a thermal history that is buried deep enough to remain insensitive to upper crustal deformation and instead is dominated by the vertical motions of the continents. We show how this temperature-sensitive system can produce a long-term integrated measure of continental exhumation and erosion. Preserved beneath Phanerozoic sedimentary rocks within Montana, USA, the Great Falls Tectonic Zone formed when two Archean cratons, the Wyoming Province and Medicine

  5. A redox-stratified ocean 3.2 billion years ago

    NASA Astrophysics Data System (ADS)

    Satkoski, Aaron M.; Beukes, Nicolas J.; Li, Weiqiang; Beard, Brian L.; Johnson, Clark M.

    2015-11-01

    Before the Great Oxidation Event (GOE) 2.4-2.2 billion years ago it has been traditionally thought that oceanic water columns were uniformly anoxic due to a lack of oxygen-producing microorganisms. Recently, however, it has been proposed that transient oxygenation of shallow seawater occurred between 2.8 and 3.0 billion years ago. Here, we present a novel combination of stable Fe and radiogenic U-Th-Pb isotope data that demonstrate significant oxygen contents in the shallow oceans at 3.2 Ga, based on analysis of the Manzimnyama Banded Iron Formation (BIF), Fig Tree Group, South Africa. This unit is exceptional in that proximal, shallow-water and distal, deep-water facies are preserved. When compared to the distal, deep-water facies, the proximal samples show elevated U concentrations and moderately positive δ56Fe values, indicating vertical stratification in dissolved oxygen contents. Confirmation of oxidizing conditions using U abundances is robustly constrained using samples that have been closed to U and Pb mobility using U-Th-Pb geochronology. Although redox-sensitive elements have been commonly used in ancient rocks to infer redox conditions, post-depositional element mobility has been rarely tested, and U-Th-Pb geochronology can constrain open- or closed-system behavior. The U abundances and δ56Fe values of the Manzimnyama BIF suggest the proximal, shallow-water samples record precipitation under stronger oxidizing conditions compared to the distal deeper-water facies, which in turn indicates the existence of a discrete redox boundary between deep and shallow ocean waters at this time; this work, therefore, documents the oldest known preserved marine redox gradient in the rock record. The relative enrichment of O2 in the upper water column is likely due to the existence of oxygen-producing microorganisms such as cyanobacteria. These results provide a new approach for identifying free oxygen in Earth's ancient oceans, including confirming the age of redox

  6. Galaxy evolution. Evidence for mature bulges and an inside-out quenching phase 3 billion years after the Big Bang.

    PubMed

    Tacchella, S; Carollo, C M; Renzini, A; Förster Schreiber, N M; Lang, P; Wuyts, S; Cresci, G; Dekel, A; Genzel, R; Lilly, S J; Mancini, C; Newman, S; Onodera, M; Shapley, A; Tacconi, L; Woo, J; Zamorani, G

    2015-04-17

    Most present-day galaxies with stellar masses ≥10¹¹ solar masses show no ongoing star formation and are dense spheroids. Ten billion years ago, similarly massive galaxies were typically forming stars at rates of hundreds of solar masses per year. It is debated how star formation ceased, on which time scales, and how this "quenching" relates to the emergence of dense spheroids. We measured stellar mass and star-formation rate surface density distributions in star-forming galaxies at redshift 2.2 with ~1-kiloparsec resolution. We find that, in the most massive galaxies, star formation is quenched from the inside out, on time scales of less than 1 billion years in the inner regions and up to a few billion years in the outer disks. These galaxies sustain high star-formation activity at large radii, while hosting fully grown and already quenched bulges in their cores.

  7. Switching To Less-Expensive Blindness Drug Could Save Medicare Part B $18 Billion Over A Ten-Year Period

    PubMed Central

    Hutton, DW; Newman-Casey, PA; Tavag, M; Zacks, DN; Stein, JD

    2014-01-01

    The biologic drugs bevacizumab and ranibizumab have revolutionized treatment of diabetic macular edema and macular degeneration, leading causes of blindness. Ophthalmologic use of these drugs has increased, now accounting for roughly one-sixth of the Medicare Part B drug budget. Ranibizumab and bevacizumab have similar efficacy and potentially minor differences in adverse event rates, but at $2,023 per dose, ranibizumab costs forty times more than bevacizumab. Using modeling methods, we predict ten-year (2010–2020) population-level costs and health benefits of using bevacizumab and ranibizumab. Our results show that if all patients were treated with the less-expensive bevacizumab instead of current usage patterns, Medicare Part B, patients, and the health care system would save $18 billion, $4.6 billion, and $29 billion, respectively. Altering patterns of use with these therapies by encouraging bevacizumab use and hastening approval of biosimilar therapies would dramatically reduce spending without substantially affecting patient outcomes. PMID:24889941

  8. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  9. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew R.; Piro, Anthony; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we investigate the probability that a star will make a BH as a function of its ZAMS mass. Although the shape of the black hole formation probability function is poorly constrained by current measurements, we believe that this framework is an important new step toward better understanding BH formation. We also consider some of the implications of this probability distribution, from its impact on the chemical enrichment from massive stars, to its connection with the structure of the core at the time of collapse, to the birth kicks that black holes receive. A probabilistic description of BH formation will be a useful input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
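
    To make the probabilistic framework concrete, the toy Python sketch below draws ZAMS masses from a Salpeter-like initial mass function and assigns each star a black-hole outcome according to a probability function of its mass. The logistic shape and every parameter value here are placeholders chosen for illustration; the paper constrains the actual probability function from the Galactic black-hole mass distribution.

        import numpy as np

        rng = np.random.default_rng(0)

        def p_bh(m_zams, m_mid=25.0, width=5.0):
            # Toy probability that a star of a given ZAMS mass leaves a black hole
            # rather than a neutron star (logistic placeholder).
            return 1.0 / (1.0 + np.exp(-(m_zams - m_mid) / width))

        # Inverse-CDF sampling of a Salpeter-like IMF between 8 and 100 Msun.
        lo_m, hi_m, slope = 8.0, 100.0, -2.35
        u = rng.random(100_000)
        masses = ((hi_m**(slope + 1) - lo_m**(slope + 1)) * u + lo_m**(slope + 1)) ** (1.0 / (slope + 1))

        is_bh = rng.random(masses.size) < p_bh(masses)
        print(f"fraction of massive stars forming black holes: {is_bh.mean():.2f}")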

  10. Analogues of primeval galaxies two billion years after the Big Bang

    NASA Astrophysics Data System (ADS)

    Amorín, Ricardo; Fontana, Adriano; Pérez-Montero, Enrique; Castellano, Marco; Guaita, Lucia; Grazian, Andrea; Fèvre, Olivier Le; Ribeiro, Bruno; Schaerer, Daniel; Tasca, Lidia A. M.; Thomas, Romain; Bardelli, Sandro; Cassarà, Letizia; Cassata, Paolo; Cimatti, Andrea; Contini, Thierry; Barros, Stephane De; Garilli, Bianca; Giavalisco, Mauro; Hathi, Nimish; Koekemoer, Anton; Le Brun, Vincent; Lemaux, Brian C.; Maccagni, Dario; Pentericci, Laura; Pforr, Janine; Talia, Margherita; Tresse, Laurence; Vanzella, Eros; Vergani, Daniela; Zamorani, Giovanni; Zucca, Elena; Merlin, Emiliano

    2017-03-01

    Deep observations are revealing a growing number of young galaxies in the first billion years of cosmic time. Compared to typical galaxies at later times, they show more extreme emission-line properties, higher star formation rates, lower masses, and smaller sizes. However, their faintness precludes studies of their chemical abundances and ionization conditions, strongly limiting our understanding of the physics driving early galaxy build-up and metal enrichment. Here we study a rare population of ultraviolet-selected, low-luminosity galaxies at redshift 2.4 < z < 3.5 that exhibit all the rest-frame properties expected from primeval galaxies. These low-mass, highly compact systems are rapidly forming galaxies able to double their stellar mass in only a few tens of millions of years. They are characterized by very blue ultraviolet spectra with weak absorption features and bright nebular emission lines, which imply hard radiation fields from young hot massive stars. Their highly ionized gas phase has strongly sub-solar carbon and oxygen abundances, with metallicities more than a factor of two lower than that found in typical galaxies of similar mass and star formation rate at z ≤ 2.5. These young galaxies reveal an early and short stage in the assembly of their galactic structures and their chemical evolution, a vigorous phase that is likely to be dominated by the effects of gas-rich mergers, accretion of metal-poor gas and strong outflows.

  11. Rapid analysis of perchlorate in drinking water at parts per billion levels using microchip electrophoresis.

    PubMed

    Gertsch, Jana C; Noblitt, Scott D; Cropek, Donald M; Henry, Charles S

    2010-05-01

    A microchip capillary electrophoresis (MCE) system has been developed for the determination of perchlorate in drinking water. The United States Environmental Protection Agency (USEPA) recently proposed a health advisory limit for perchlorate in drinking water of 15 parts per billion (ppb), a level requiring large, sophisticated instrumentation, such as ion chromatography coupled with mass spectrometry (IC-MS), for detection. An inexpensive, portable system is desired for routine online monitoring applications of perchlorate in drinking water. Here, we present an MCE method using contact conductivity detection for perchlorate determination. The method has several advantages, including reduced analysis times relative to IC, inherent portability, high selectivity, and minimal sample pretreatment. Resolution of perchlorate from more abundant ions was achieved using zwitterionic, sulfobetaine surfactants, N-hexadecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (HDAPS) and N-tetradecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (TDAPS). The system performance and the optimization of the separation chemistry, including the use of these surfactants to resolve perchlorate from other anions, are discussed in this work. The system is capable of detection limits of 3.4 ± 1.8 ppb (n = 6) in standards and 5.6 ± 1.7 ppb (n = 6) in drinking water.

  12. A Massive Galaxy in Its Core Formation Phase Three Billion Years After the Big Bang

    NASA Technical Reports Server (NTRS)

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha M. Forster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison; Leja, Joel; Rix, Hans-Walter; Skelton, Rosalind; van der Wel, Arjen; Whitaker, Katherine; Wuyts, Stijn

    2014-01-01

    Most massive galaxies are thought to have formed their dense stellar cores at early cosmic epochs. However, cores in their formation phase have not yet been observed. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we present a candidate core in formation 11 billion years ago, at z = 2.3. GOODS-N-774 has a stellar mass of 1.0 × 10^11 solar masses, a half-light radius of 1.0 kpc, and a star formation rate of 90 (+45/-20) solar masses per year. The star-forming gas has a velocity dispersion of 317 ± 30 km/s, amongst the highest ever measured. It is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774, compact quiescent galaxies at z ≈ 2 [8-11] and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 appear to be rare; however, from the star formation rate and size of the galaxy we infer that many star forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys.

  13. A large neutral fraction of cosmic hydrogen a billion years after the Big Bang.

    PubMed

    Wyithe, J Stuart B; Loeb, Abraham

    2004-02-26

    The fraction of ionized hydrogen left over from the Big Bang provides evidence for the time of formation of the first stars and quasar black holes in the early Universe; such objects provide the high-energy photons necessary to ionize hydrogen. Spectra of the two most distant known quasars show nearly complete absorption of photons with wavelengths shorter than the Lyman alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift of z approximately 6.3, about one billion years after the Big Bang. Here we show that the IGM surrounding these quasars had a neutral hydrogen fraction of tens of per cent before the quasar activity started, much higher than the previous lower limits of approximately 0.1 per cent. Our results, when combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination therefore suggest the presence of a second peak in the mean ionization history of the Universe.

  14. A role for copper in protozoan grazing - two billion years selecting for bacterial copper resistance.

    PubMed

    Hao, Xiuli; Lüthje, Freja; Rønn, Regin; German, Nadezhda A; Li, Xuanji; Huang, Fuyi; Kisaka, Javan; Huffman, David; Alwathnani, Hend A; Zhu, Yong-Guan; Rensing, Christopher

    2016-11-01

    The Great Oxidation Event resulted in integration of soft metals in a wide range of biochemical processes including, in our opinion, killing of bacteria by protozoa. Compared to the pressure from anthropogenic copper contamination, little is known about the impact of protozoan predation on the maintenance of copper resistance determinants in bacteria. To evaluate the role of copper and other soft metals in predatory mechanisms of protozoa, we examined survival of bacteria mutated in different transition metal efflux or uptake systems in the social amoeba Dictyostelium discoideum. Our data demonstrated a strong correlation between the presence of copper/zinc efflux as well as iron/manganese uptake, and bacterial survival in amoebae. The growth of protozoa, in turn, was dependent on bacterial copper sensitivity. The phagocytosis of bacteria induced upregulation of Dictyostelium genes encoding the copper uptake transporter p80 and a triad of Cu(I)-translocating P1B-type ATPases. Accumulated Cu(I) in Dictyostelium was monitored using a copper biosensor bacterial strain. Altogether, our data demonstrate that Cu(I) is ultimately involved in protozoan predation of bacteria, supporting our hypothesis that protozoan grazing has selected for the presence of copper resistance determinants for about two billion years.

  15. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon.

    PubMed

    Bell, Elizabeth A; Boehnke, Patrick; Harrison, T Mark; Mao, Wendy L

    2015-11-24

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼ 3.5 billion years (Ga), the chemofossil record arguably to ∼ 3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ(13)CPDB of -24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼ 300 My earlier than has been previously proposed.

  16. The controversial "Cambrian" fossils of the Vindhyan are real but more than a billion years older.

    PubMed

    Bengtson, Stefan; Belivanova, Veneta; Rasmussen, Birger; Whitehouse, Martin

    2009-05-12

    The age of the Vindhyan sedimentary basin in central India is controversial, because geochronology indicating early Proterozoic ages clashes with reports of Cambrian fossils. We present here an integrated paleontologic-geochronologic investigation to resolve this conundrum. New sampling of Lower Vindhyan phosphoritic stromatolitic dolomites from the northern flank of the Vindhyans confirms the presence of fossils most closely resembling those found elsewhere in Cambrian deposits: annulated tubes, embryo-like globules with polygonal surface pattern, and filamentous and coccoidal microbial fabrics similar to Girvanella and Renalcis. None of the fossils, however, can be ascribed to uniquely Cambrian or Ediacaran taxa. Indeed, the embryo-like globules are not interpreted as fossils at all but as former gas bubbles trapped in mucus-rich cyanobacterial mats. Direct dating of the same fossiliferous phosphorite yielded a Pb-Pb isochron of 1,650 +/- 89 (2sigma) million years ago, confirming the Paleoproterozoic age of the fossils. New U-Pb geochronology of zircons from tuffaceous mudrocks in the Lower Vindhyan Porcellanite Formation on the southern flank of the Vindhyans give comparable ages. The Vindhyan phosphorites provide a window of 3-dimensionally preserved Paleoproterozoic fossils resembling filamentous and coccoidal cyanobacteria and filamentous eukaryotic algae, as well as problematic forms. Like Neoproterozoic phosphorites a billion years later, the Vindhyan deposits offer important new insights into the nature and diversity of life, and in particular, the early evolution of multicellular eukaryotes.

  17. Rapid oxygenation of Earth's atmosphere 2.33 billion years ago.

    PubMed

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J; Wang, David T; Xie, Shucheng; Summons, Roger E

    2016-05-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth's biogeochemical cycles. Although "whiffs" of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition (that is, from being mass-independent to becoming mass-dependent) in multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly, within 1 to 10 million years, and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among GOE, "Snowball Earth" glaciation, and biogeochemical cycling will require further stratigraphic correlation supported with precise chronologies and paleolatitude reconstructions.

  18. Large data analysis: automatic visual personal identification in a demography of 1.2 billion persons

    NASA Astrophysics Data System (ADS)

    Daugman, John

    2014-05-01

    The largest biometric deployment in history is now underway in India, where the Government is enrolling the iris patterns (among other data) of all 1.2 billion citizens. The purpose of the Unique Identification Authority of India (UIDAI) is to ensure fair access to welfare benefits and entitlements, to reduce fraud, and enhance social inclusion. Only a minority of Indian citizens have bank accounts; only 4 percent possess passports; and less than half of all aid money reaches its intended recipients. A person who lacks any means of establishing their identity is excluded from entitlements and does not officially exist; thus the slogan of UIDAI is: "To give the poor an identity." This ambitious program enrolls a million people every day, across 36,000 stations run by 83 agencies, with a 3-year completion target for the entire national population. The halfway point was recently passed with more than 600 million persons now enrolled. In order to detect and prevent duplicate identities, every iris pattern that is enrolled is first compared against all others enrolled so far; thus the daily workflow now requires 600 trillion (or 600 million-million) iris cross-comparisons. Avoiding identity collisions (False Matches) requires high biometric entropy, and achieving the tremendous match speed requires phase bit coding. Both of these requirements are being delivered operationally by wavelet methods developed by the author for encoding and comparing iris patterns, which will be the focus of this "Large Data Award" presentation.
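    The quoted daily workload follows from simple arithmetic on the figures in the abstract; a minimal sketch (the round numbers are the abstract's own, not operational data):

        # Daily iris cross-comparison workload implied by the UIDAI de-duplication scheme:
        # every newly enrolled iris pattern is compared against all patterns enrolled so far.
        enrolled_so_far = 600e6   # ~600 million persons already enrolled (figure from the abstract)
        new_per_day     = 1e6     # ~1 million new enrollments per day (figure from the abstract)

        comparisons_per_day = enrolled_so_far * new_per_day
        print(f"{comparisons_per_day:.1e} comparisons/day")   # 6.0e+14, i.e. 600 trillion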

  19. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone

    NASA Astrophysics Data System (ADS)

    Lowenstern, J. B.; Evans, W. C.; Bergfeld, D.; Hunt, A. G.

    2014-02-01

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  20. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone.

    PubMed

    Lowenstern, J B; Evans, W C; Bergfeld, D; Hunt, A G

    2014-02-20

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  1. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone

    USGS Publications Warehouse

    Lowenstern, Jacob B.; Evans, William C.; Bergfeld, D.; Hunt, Andrew G.

    2014-01-01

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents [1]. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot [2]. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  2. Sharing global CO2 emission reductions among one billion high emitters.

    PubMed

    Chakravarty, Shoibal; Chikkatur, Ananth; de Coninck, Heleen; Pacala, Stephen; Socolow, Robert; Tavoni, Massimo

    2009-07-21

    We present a framework for allocating a global carbon reduction target among nations, in which the concept of "common but differentiated responsibilities" refers to the emissions of individuals instead of nations. We use the income distribution of a country to estimate how its fossil fuel CO2 emissions are distributed among its citizens, from which we build up a global CO2 distribution. We then propose a simple rule to derive a universal cap on global individual emissions and find corresponding limits on national aggregate emissions from this cap. All of the world's high CO2-emitting individuals are treated the same, regardless of where they live. Any future global emission goal (target and time frame) can be converted into national reduction targets, which are determined by "Business as Usual" projections of national carbon emissions and in-country income distributions. For example, reducing projected global emissions in 2030 by 13 GtCO2 would require the engagement of 1.13 billion high emitters, roughly equally distributed in 4 regions: the U.S., the OECD minus the U.S., China, and the non-OECD minus China. We also modify our methodology to place a floor on emissions of the world's lowest CO2 emitters and demonstrate that climate mitigation and alleviation of extreme poverty are largely decoupled.
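    The "simple rule" is in essence a cap-finding problem: choose a universal per-person ceiling such that capping every individual at that level meets the global reduction target. A minimal sketch with synthetic data (the emission distribution and target below are illustrative assumptions, not the paper's dataset or exact procedure):

        import numpy as np

        # Synthetic stand-in for the global distribution of individual CO2 emissions (tCO2/person/yr).
        rng = np.random.default_rng(0)
        emissions = rng.lognormal(mean=1.0, sigma=1.0, size=1_000_000)

        def universal_cap(emissions, target_total):
            """Find the per-person cap c such that sum(min(e, c)) ~= target_total, by bisection."""
            lo, hi = 0.0, emissions.max()
            for _ in range(60):
                c = 0.5 * (lo + hi)
                if np.minimum(emissions, c).sum() > target_total:
                    hi = c   # capped total still above target: tighten the cap
                else:
                    lo = c
            return c

        # Require, say, a 20% cut relative to the uncapped (business-as-usual) total.
        target = 0.8 * emissions.sum()
        cap = universal_cap(emissions, target)
        print(f"cap = {cap:.2f} tCO2/person; individuals above the cap = {(emissions > cap).sum()}")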

  3. Genetic Code Mutations: The Breaking of a Three Billion Year Invariance

    PubMed Central

    Mat, Wai-Kin; Xue, Hong; Wong, J. Tze-Fei

    2010-01-01

    The genetic code has been unchanging for some three billion years in its canonical ensemble of encoded amino acids, as indicated by the universal adoption of this ensemble by all known organisms. Code mutations beginning with the encoding of 4-fluoro-Trp by Bacillus subtilis, initially replacing and eventually displacing Trp from the ensemble, first revealed the intrinsic mutability of the code. This has since been confirmed by a spectrum of other experimental code alterations in both prokaryotes and eukaryotes. To shed light on the experimental conversion of a rigidly invariant code to a mutating code, the present study examined code mutations determining the propagation of Bacillus subtilis on Trp and 4-, 5- and 6-fluoro-tryptophans. The results obtained with the mutants with respect to cross-inhibitions between the different indole amino acids, and the growth effects of individual nutrient withdrawals rendering essential their biosynthetic pathways, suggested that oligogenic barriers comprising sensitive proteins which malfunction with amino acid analogues provide effective mechanisms for preserving the invariance of the code through immemorial time, and mutations of these barriers open up the code to continuous change. PMID:20808824

  4. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    DOE PAGES

    Bell, Elizabeth A.; Boehnke, Patrick; Harrison, T. Mark; ...

    2015-10-19

    Here, evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ~3.5 billion years (Ga), the chemofossil record arguably to ~3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of –24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ~300 My earlier than has been previously proposed.

  5. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    SciTech Connect

    Bell, Elizabeth A.; Boehnke, Patrick; Harrison, T. Mark; Mao, Wendy L.

    2015-10-19

    Here, evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ~3.5 billion years (Ga), the chemofossil record arguably to ~3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of –24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ~300 My earlier than has been previously proposed.

  6. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes.

    PubMed

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D

    2014-07-28

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments.

  7. If slow rate of health care spending growth persists, projections may be off by $770 billion.

    PubMed

    Cutler, David M; Sahni, Nikhil R

    2013-05-01

    Despite earlier forecasts to the contrary, US health care spending growth has slowed in the past four years, continuing a trend that began in the early 2000s. In this article we attempt to identify why US health care spending growth has slowed, and we explore the spending implications if the trend continues for the next decade. We find that the 2007-09 recession, a one-time event, accounted for 37 percent of the slowdown between 2003 and 2012. A decline in private insurance coverage and cuts to some Medicare payment rates accounted for another 8 percent of the slowdown, leaving 55 percent of the spending slowdown unexplained. We conclude that a host of fundamental changes--including less rapid development of imaging technology and new pharmaceuticals, increased patient cost sharing, and greater provider efficiency--were responsible for the majority of the slowdown in spending growth. If these trends continue during 2013-22, public-sector health care spending will be as much as $770 billion less than predicted. Such lower levels of spending would have an enormous impact on the US economy and on government and household finances.

  8. Rapid oxygenation of Earth’s atmosphere 2.33 billion years ago

    PubMed Central

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J.; Wang, David T.; Xie, Shucheng; Summons, Roger E.

    2016-01-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth’s biogeochemical cycles. Although “whiffs” of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition (that is, from being mass-independent to becoming mass-dependent) in multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly—within 1 to 10 million years—and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among GOE, “Snowball Earth” glaciation, and biogeochemical cycling will require further stratigraphic correlation supported with precise chronologies and paleolatitude reconstructions. PMID:27386544

  9. GERLUMPH Data Release 2: 2.5 Billion Simulated Microlensing Light Curves

    NASA Astrophysics Data System (ADS)

    Vernardos, G.; Fluke, C. J.; Bate, N. F.; Croton, D.; Vohl, D.

    2015-04-01

    In the upcoming synoptic all-sky survey era of astronomy, thousands of new multiply imaged quasars are expected to be discovered and monitored regularly. Light curves from the images of gravitationally lensed quasars are further affected by superimposed variability due to microlensing. In order to disentangle the microlensing from the intrinsic variability of the light curves, the time delays between the multiple images have to be accurately measured. The resulting microlensing light curves can then be analyzed to reveal information about the background source, such as the size of the quasar accretion disk. In this paper we present the most extensive and coherent collection of simulated microlensing light curves; we have generated more than 2.5 billion light curves using the GERLUMPH high resolution microlensing magnification maps. Our simulations can be used to train algorithms to measure lensed quasar time delays, plan future monitoring campaigns, and study light curve properties throughout parameter space. Our data are openly available to the community and are complemented by online eResearch tools, located at http://gerlumph.swin.edu.au.

  10. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    PubMed Central

    Bell, Elizabeth A.; Harrison, T. Mark; Mao, Wendy L.

    2015-01-01

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼3.5 billion years (Ga), the chemofossil record arguably to ∼3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of −24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼300 My earlier than has been previously proposed. PMID:26483481

  11. Providing safe drinking water to 1.2 billion unserved people

    SciTech Connect

    Gadgil, Ashok J.; Derby, Elisabeth A.

    2003-06-01

    Despite substantial advances in the past 100 years in public health, technology and medicine, 20% of the world population, mostly comprising the poor population segments in developing countries (DCs), still does not have access to safe drinking water. To reach the United Nations (UN) Millennium Goal of halving the number of people without access to safe water by 2015, the global community will need to provide an additional one billion urban residents and 600 million rural residents with safe water within the next twelve years. This paper examines current water treatment measures and implementation methods for delivery of safe drinking water, and offers suggestions for making progress towards the goal of providing a timely and equitable solution for safe water provision. For water treatment, based on the serious limitations of boiling water and chlorination, we suggest an approach based on filtration coupled with ultraviolet (UV) disinfection, combined with public education. Additionally, owing to the capacity limitations for non-governmental organizations (NGOs) to take on this task primarily on their own, we suggest a strategy based on financially sustainable models that include the private sector as well as NGOs.

  12. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes

    PubMed Central

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D.

    2014-01-01

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments. PMID:25068404

  13. Constraints on the first billion years of the geodynamo from paleointensity studies of zircons

    NASA Astrophysics Data System (ADS)

    Tarduno, John; Cottrell, Rory; Davis, William

    2014-05-01

    Several lines of reasoning, including new ideas on core thermal conductivity, suggest that onset of a strong geomagnetic field might have been delayed by one billion years (or more) after the lunar forming event. Here we extend the Proterozoic/Archean to Paleoarchean record of the geomagnetic field constrained by single crystal paleointensity (SCP) analyses (Tarduno et al., Science, 2010) to older times using zircons containing minute magnetic inclusions. Specifically, we focus on samples from the Jack Hills (Yilgarn Craton, Western Australia). We employ a CO2 laser demagnetization system and a small bore (6.3 mm) 3-component DC SQUID magnetometer; the latter offers the highest currently available moment resolution. Sample age is analyzed using SHRIMP U-Pb geochronology. Preliminary data support the presence of a relatively strong Paleoarchean field produced by a core dynamo, extending the known record by at least 100 million years, to approximately 3.55 Ga. These data only serve to exacerbate the apparent problem posed by the presence of a Paleoarchean dynamo. Alternative dynamo driving mechanisms, or efficient core/lowermost mantle heat loss processes unique to the Paleoarchean (and older times) might have been at work. We will discuss these processes, and our efforts to study even older Eoarchean-Hadean zircons.

  14. Archean rocks in antarctica: 2.5-billion-year uranium-lead ages of pegmatites in enderby land.

    PubMed

    Grew, E S; Manton, W I

    1979-10-26

    Uranium-lead isotopic data indicate that the granulite-facies Napier complex of Enderby Land, Antarctica, was cut by charnockitic pegmatites 2.5 billion years ago and by pegmatites lacking hypersthene 0.52 billion years ago. The 4-billion-year lead-lead ages (whole rock) reported for the Napier complex are rejected since these leads developed in three stages. Reconstructions of Gondwanaland suggest that the Napier complex may be a continuation of the Archean granulitic terrain of southern India.

  15. Megascopic Eukaryotic Algae from the 2.1-Billion-Year-Old Negaunee Iron-Formation, Michigan

    NASA Astrophysics Data System (ADS)

    Han, Tsu-Ming; Runnegar, Bruce

    1992-07-01

    Hundreds of specimens of spirally coiled, megascopic, carbonaceous fossils resembling Grypania spiralis (Walcott), have been found in the 2.1-billion-year-old Negaunee Iron-Formation at the Empire Mine, near Marquette, Michigan. This occurrence of Grypania is 700 million to 1000 million years older than fossils from previously known sites in Montana, China, and India. As Grypania appears to have been a photosynthetic alga, this discovery places the origin of organelle-bearing eukaryotic cells prior to 2.1 billion years ago.

  16. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
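    A tiny worked example of the "forward" problem described here, the probability of a specified outcome for fair dice, offered as an illustrative sketch rather than material from the lectures:

        from itertools import product
        from fractions import Fraction

        # With two fair six-sided dice, every ordered outcome is equally likely a priori,
        # so P(event) = (number of outcomes in the event) / 36.
        outcomes = list(product(range(1, 7), repeat=2))
        p_sum_7  = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
        p_double = Fraction(sum(1 for a, b in outcomes if a == b), len(outcomes))

        print(p_sum_7)    # 1/6
        print(p_double)   # 1/6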

  17. Modality, probability, and mental models.

    PubMed

    Hinterecker, Thomas; Knauff, Markus; Johnson-Laird, P N

    2016-10-01

    We report 3 experiments investigating novel sorts of inference, such as: A or B or both. Therefore, possibly (A and B). Where the contents were sensible assertions, for example, Space tourism will achieve widespread popularity in the next 50 years or advances in material science will lead to the development of antigravity materials in the next 50 years, or both. Most participants accepted the inferences as valid, though they are invalid in modal logic and in probabilistic logic too. But, the theory of mental models predicts that individuals should accept them. In contrast, inferences of this sort—A or B but not both. Therefore, A or B or both—are both logically valid and probabilistically valid. Yet, as the model theory also predicts, most reasoners rejected them. The participants’ estimates of probabilities showed that their inferences tended not to be based on probabilistic validity, but that they did rate acceptable conclusions as more probable than unacceptable conclusions. We discuss the implications of the results for current theories of reasoning.
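    The logical (as opposed to probabilistic) status of the two inferences can be checked by brute-force truth tables; a minimal sketch, not the authors' materials:

        from itertools import product

        cases = list(product([False, True], repeat=2))

        # Premise 1: "A or B or both" (inclusive or); putative conclusion: "A and B".
        incl_or = [(a, b) for a, b in cases if a or b]
        print(all(a and b for a, b in incl_or))   # False: A-and-B is not entailed,
                                                  # so "possibly (A and B)" is invalid in modal logic
        print(any(a and b for a, b in incl_or))   # True: A-and-B holds in one model of the premise,
                                                  # which is why model theory predicts acceptance

        # Premise 2: "A or B but not both" (exclusive or); conclusion: "A or B or both".
        excl_or = [(a, b) for a, b in cases if (a or b) and not (a and b)]
        print(all(a or b for a, b in excl_or))    # True: logically valid, yet typically rejected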

  18. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be a false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and "false" would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = √10 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history's ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
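    The counting scheme described above amounts to a straightforward Monte Carlo estimate; a schematic sketch with a made-up ΔCDF model (the distributions, noise model and numerical values below are placeholders, not the MSPI calculation itself):

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000            # number of simulated time histories, as in the paper
        THRESHOLD = 1e-6       # green/white boundary on delta-CDF (per yr)

        def simulated_delta_cdf(true_mean, n):
            """Placeholder for the Bayesian-corrected delta-CDF computed from each time history."""
            # Lognormal scatter around the true value stands in for estimation noise.
            return true_mean * rng.lognormal(mean=0.0, sigma=0.5, size=n)

        # Case 1: parameters at baseline (true indication green):
        # any indication above the threshold is a false positive.
        dcdf = simulated_delta_cdf(true_mean=3e-7, n=N)
        p_false_positive = np.mean(dcdf > THRESHOLD)

        # Case 2: a component degraded (true indication mid-white):
        # any indication below the threshold is a false negative.
        dcdf = simulated_delta_cdf(true_mean=3e-6, n=N)
        p_false_negative = np.mean(dcdf < THRESHOLD)

        print(p_false_positive, p_false_negative)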

  19. WITPO (What Is the Probability Of).

    ERIC Educational Resources Information Center

    Ericksen, Donna Bird; And Others

    1991-01-01

    Included in this probability board game are the requirements, the rules, the board, and 44 sample questions. This game can be used as a probability unit review for practice on basic skills and algorithms, such as computing compound probability and using Pascal's triangle to solve binomial probability problems. (JJK)
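    As an illustration of the kind of exercise the game targets, binomial probabilities can be read directly off a row of Pascal's triangle; a small sketch, not part of the game materials:

        def pascal_row(n):
            """Return row n of Pascal's triangle, i.e. the binomial coefficients C(n, 0..n)."""
            row = [1]
            for k in range(n):
                row.append(row[-1] * (n - k) // (k + 1))
            return row

        # P(exactly k heads in n fair coin flips) = C(n, k) / 2**n
        n = 4
        row = pascal_row(n)
        probs = [c / 2**n for c in row]
        print(row)    # [1, 4, 6, 4, 1]
        print(probs)  # [0.0625, 0.25, 0.375, 0.25, 0.0625]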

  20. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.
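    For orientation, one standard textbook presentation (not a claim about which functional equation Snow discusses) is that the associativity of conjunction imposes an associativity equation on the function combining degrees of belief, which in turn forces the product rule up to rescaling:

        F\bigl(F(x, y), z\bigr) = F\bigl(x, F(y, z)\bigr)
        \quad\Longrightarrow\quad
        p(A \wedge B \mid C) = p(A \mid B \wedge C)\, p(B \mid C).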

  1. Herschel-ATLAS: rapid evolution of dust in galaxies over the last 5 billion years

    NASA Astrophysics Data System (ADS)

    Dunne, L.; Gomez, H. L.; da Cunha, E.; Charlot, S.; Dye, S.; Eales, S.; Maddox, S. J.; Rowlands, K.; Smith, D. J. B.; Auld, R.; Baes, M.; Bonfield, D. G.; Bourne, N.; Buttiglione, S.; Cava, A.; Clements, D. L.; Coppin, K. E. K.; Cooray, A.; Dariush, A.; de Zotti, G.; Driver, S.; Fritz, J.; Geach, J.; Hopwood, R.; Ibar, E.; Ivison, R. J.; Jarvis, M. J.; Kelvin, L.; Pascale, E.; Pohlen, M.; Popescu, C.; Rigby, E. E.; Robotham, A.; Rodighiero, G.; Sansom, A. E.; Serjeant, S.; Temi, P.; Thompson, M.; Tuffs, R.; van der Werf, P.; Vlahakis, C.

    2011-10-01

    We present the first direct and unbiased measurement of the evolution of the dust mass function of galaxies over the past 5 billion years of cosmic history using data from the Science Demonstration Phase of the Herschel-Astrophysical Terahertz Large Area Survey (Herschel-ATLAS). The sample consists of galaxies selected at 250 μm which have reliable counterparts from the Sloan Digital Sky Survey (SDSS) at z < 0.5, and contains 1867 sources. Dust masses are calculated using both a single-temperature grey-body model for the spectral energy distribution and also a model with multiple temperature components. The dust temperature for either model shows no trend with redshift. Splitting the sample into bins of redshift reveals a strong evolution in the dust properties of the most massive galaxies. At z = 0.4-0.5, massive galaxies had dust masses about five times larger than in the local Universe. At the same time, the dust-to-stellar mass ratio was about three to four times larger, and the optical depth derived from fitting the UV-sub-mm data with an energy balance model was also higher. This increase in the dust content of massive galaxies at high redshift is difficult to explain using standard dust evolution models and requires a rapid gas consumption time-scale together with either a more top-heavy initial mass function (IMF), efficient mantle growth, less dust destruction or combinations of all three. This evolution in dust mass is likely to be associated with a change in overall interstellar medium mass, and points to an enhanced supply of fuel for star formation at earlier cosmic epochs.
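    For context, the single-temperature grey-body estimate referred to here is, in its simplest form, M_dust = S_ν D² / (κ_ν B_ν(T_d)); a rough sketch with illustrative numbers (the opacity, temperature, flux and distance below are assumptions for the example, not values from the survey):

        import numpy as np

        h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23      # SI constants

        def planck_nu(nu, T):
            """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
            return (2 * h * nu**3 / c**2) / np.expm1(h * nu / (k_B * T))

        def dust_mass(S_nu_Jy, D_Mpc, T_d=20.0, nu=1.2e12, kappa=0.89):
            """Grey-body dust mass M_d = S_nu D^2 / (kappa_nu B_nu(T_d)), in solar masses.
            kappa is an assumed illustrative opacity (m^2 kg^-1) near 250 um; S_nu in Jy; D in Mpc."""
            S_nu = S_nu_Jy * 1e-26                     # Jy -> W m^-2 Hz^-1
            D = D_Mpc * 3.086e22                       # Mpc -> m
            M_kg = S_nu * D**2 / (kappa * planck_nu(nu, T_d))
            return M_kg / 1.989e30                     # kg -> M_sun

        print(f"{dust_mass(S_nu_Jy=0.05, D_Mpc=1000):.2e} M_sun")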

  2. ESA's billion star surveyor - Flight operations experience from Gaia's first 1.5 Years

    NASA Astrophysics Data System (ADS)

    Milligan, D.; Rudolph, A.; Whitehead, G.; Loureiro, T.; Serpell, E.; di Marco, F.; Marie, J.; Ecale, E.

    2016-10-01

    This paper details the initial in-flight mission operations experience from ESA's ultra-precise Gaia spacecraft. Tasked with mapping the positions and movements of 1 billion stars to unprecedented precision (at the tens of micro-arcseconds level, comparable to the width of a coin on the Moon as viewed from Earth), ESA's science cornerstone mission is also expected to discover and chart hundreds of thousands of new objects including near-Earth asteroids, exoplanets, brown dwarfs and quasars. After a flawless launch on 19 Dec 2013, Gaia was brought across the roughly 1.5 million km to L2 via a sequence of technically demanding orbit transfer manoeuvres using on-board thrusters in thrust-vectoring mode. Starting in parallel to this, and lasting 6 months, the full spacecraft was commissioned and brought gradually up to the highest operational mode. A number of problems were detected and tackled during commissioning and early routine-phase operations. An apparent dimming of the on-board laser and of imaged stars was tracked down to water ice building up inside the telescope enclosure. Also apparent was more straylight than expected. Elsewhere, a micro-propulsion thruster developed unexpected performance levels and a back-up chemical thruster suffered a failed latch valve. These issues, like several others, were dealt with and solved in a series of review meetings, in-orbit special operations, and newly developed procedures and on-board software changes. After commissioning, Gaia was working so well that it was producing approximately 45% more science data than originally foreseen, primarily because it was able to see stars fainter than required. The mission operations concept was quickly adapted to partially automate ground operations and increase ground station time to allow the full scientific potential of Gaia to be realised.

  3. Searching for the birthplaces of open clusters with ages of several billion years

    NASA Astrophysics Data System (ADS)

    Acharova, I. A.; Shevtsova, E. S.

    2016-01-01

    We discuss the possibility of finding the birthplaces of open clusters (OCs) with ages of several billion years. The proposed method is based on comparing the results of chemical evolution modeling of the Galactic disk with the parameters of the cluster. Five OCs older than 7 Gyr are known: NGC 6791, BH 176, Collinder 261, Berkeley 17, and Berkeley 39. The oxygen and iron abundances in NGC 6791 and the oxygen abundance in BH 176 are twice the solar level; the heavy-element abundances in the other clusters are close to the corresponding solar values. According to chemical evolution models, at the time of formation of the objects considered, the regions where the oxygen and iron abundances reached the corresponding levels extended out to 5 kpc from the Galactic center. At the present time the OCs considered are located several kpc from the Galactic center. Some of these clusters are located extremely high, about 1 kpc above the disk midplane, i.e., they have been subject to some mechanism that has carried them into orbits uncharacteristic of this type of object. It follows from a comparison with the results of chemical evolution that younger clusters with ages of 4-5 Gyr, e.g., NGC 1193, M67, and others, may have formed in a broad range of Galactocentric distances. Their large heights above the disk midplane are sufficient to suggest that these clusters have moved away from their likely birthplaces. Clusters continue to be carried far from the Galactic disk up to the present time: about 40 clusters with ages from 0 to 2 Gyr are observed at heights ranging from 300 to 750 pc.

  4. Layout finishing of a 28nm, 3 billion transistors, multi-core processor

    NASA Astrophysics Data System (ADS)

    Morey-Chaisemartin, Philippe; Beisser, Eric

    2013-06-01

    Designing a fully new 256-core processor is a great challenge for a fabless startup. In addition to all the architecture, functionality and timing issues, the layout by itself is a bottleneck due to all the process constraints of a 28nm technology. As developers of advanced layout finishing solutions, we were involved in the design flow of this huge chip with its 3 billion transistors. We had to face the issue of dummy pattern instantiation with respect to design constraints. All the design rules to generate the "dummies" are clearly defined in the Design Rule Manual, and some automatic procedures are provided by the foundry itself, but these routines do not take the designer's requests into account. Such a chip embeds both digital parts and analog modules for clock and power management. These two types of design each have their own set of constraints. In both cases, the insertion of dummies should not introduce unexpected variations leading to malfunctions. For example, on digital parts where signal race conditions are critical on long wires or buses, the introduction of uncontrolled parasitics along these nets is highly critical. For analog devices such as high-frequency, high-sensitivity comparators, the exact symmetry of the two parts of a current mirror generator should be guaranteed. Thanks to the easily customizable features of our dummy insertion tool, we were able to configure it to meet all the designer's requirements as well as the process constraints. This paper will present all these advanced key features as well as the layout tricks used to fulfill all requirements.

  5. Large molecular gas reservoirs in ancestors of Milky Way-mass galaxies nine billion years ago

    NASA Astrophysics Data System (ADS)

    Papovich, C.; Labbé, I.; Glazebrook, K.; Quadri, R.; Bekiaris, G.; Dickinson, M.; Finkelstein, S. L.; Fisher, D.; Inami, H.; Livermore, R. C.; Spitler, L.; Straatman, C.; Tran, K.-V.

    2016-12-01

    The gas accretion and star formation histories of galaxies like the Milky Way remain an outstanding problem in astrophysics [1,2]. Observations show that 8 billion years ago, the progenitors to Milky Way-mass galaxies were forming stars 30 times faster than today and were predicted to be rich in molecular gas [3], in contrast to the low present-day gas fractions (<10%) [4-6]. Here we show the detection of molecular gas from the CO (J = 3-2) emission (rest-frame 345.8 GHz) in galaxies at redshifts z = 1.2-1.3, selected to have the stellar mass and star formation rate of the progenitors of today's Milky Way-mass galaxies. The CO emission reveals large molecular gas masses, comparable to or exceeding the galaxy stellar masses, implying that most of the baryons are in cold gas, not stars. The galaxies' total star-formation luminosities and CO luminosities yield long gas consumption timescales. Compared to local spiral galaxies, the star formation efficiency, estimated from the ratio of total infrared luminosity (L_IR) to CO emission, has remained nearly constant since redshift z = 1.2, despite the order-of-magnitude decrease in gas fraction, consistent with the results for other galaxies at this epoch [7-10]. Therefore, the physical processes that determine the rate at which gas cools to form stars in distant galaxies appear to be similar to those in local galaxies.
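    The "gas consumption timescale" quoted here is simply the ratio of molecular gas mass to star formation rate, and the star formation efficiency its inverse; in the notation conventionally used in such studies (a generic definition, not reproduced from the paper):

        t_{\rm dep} \equiv \frac{M_{\rm gas}}{\mathrm{SFR}},
        \qquad
        \mathrm{SFE} \equiv \frac{1}{t_{\rm dep}} \propto \frac{L_{\rm IR}}{L'_{\rm CO}}.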

  6. The formation of submillimetre-bright galaxies from gas infall over a billion years.

    PubMed

    Narayanan, Desika; Turk, Matthew; Feldmann, Robert; Robitaille, Thomas; Hopkins, Philip; Thompson, Robert; Hayward, Christopher; Ball, David; Faucher-Giguère, Claude-André; Kereš, Dušan

    2015-09-24

    Submillimetre-bright galaxies at high redshift are the most luminous, heavily star-forming galaxies in the Universe and are characterized by prodigious emission in the far-infrared, with a flux of at least five millijanskys at a wavelength of 850 micrometres. They reside in haloes with masses about 10^13 times that of the Sun, have low gas fractions compared to main-sequence disks at a comparable redshift, trace complex environments and are not easily observable at optical wavelengths. Their physical origin remains unclear. Simulations have been able to form galaxies with the requisite luminosities, but have otherwise been unable to simultaneously match the stellar masses, star formation rates, gas fractions and environments. Here we report a cosmological hydrodynamic galaxy formation simulation that is able to form a submillimetre galaxy that simultaneously satisfies the broad range of observed physical constraints. We find that groups of galaxies residing in massive dark matter haloes have increasing rates of star formation that peak at collective rates of about 500-1,000 solar masses per year at redshifts of two to three, by which time the interstellar medium is sufficiently enriched with metals that the region may be observed as a submillimetre-selected system. The intense star formation rates are fuelled in part by the infall of a reservoir gas supply enabled by stellar feedback at earlier times, not through major mergers. With a lifetime of nearly a billion years, our simulations show that the submillimetre-bright phase of high-redshift galaxies is prolonged and associated with significant mass buildup in early-Universe proto-clusters, and that many submillimetre-bright galaxies are composed of numerous unresolved components (for which there is some observational evidence).

  7. Atmospheric sulfur rearrangement 2.7 billion years ago: Evidence for oxygenic photosynthesis

    NASA Astrophysics Data System (ADS)

    Kurzweil, Florian; Claire, Mark; Thomazo, Christophe; Peters, Marc; Hannington, Mark; Strauss, Harald

    2013-03-01

    Mass-independently fractionated sulfur isotopes (MIF-S) provide strong evidence for an anoxic atmosphere during the Archean. Moreover, the temporal evolution of MIF-S shows increasing magnitudes between 2.7 and 2.5 Ga until the start of the Great Oxidation Event (G.O.E.) at around 2.4 Ga. The conclusion of a completely anoxic atmosphere up to the G.O.E. is in contrast to recent studies on redox-sensitive elements, which suggest slightly oxidizing conditions during continental weathering already several hundred million years prior to the G.O.E. In order to investigate this apparent inconsistency, we present multiple sulfur isotopes for 2.71 Ga pyritic black shales derived from the Kidd Creek area, Ontario, Canada. These samples display high positive Δ33S values up to 3.8‰ and the typical late Archean slope in Δ36S/Δ33S of -0.9. In contrast, the time period before (3.2-2.73 Ga) is characterized by greatly attenuated MIF-S magnitudes and a slope in Δ36S/Δ33S of -1.5. We attribute the increase in Δ33S magnitude as well as the contemporaneous change in the slope of Δ36S/Δ33S to changes in the relative reaction rate of different MIF-S source reactions and changes in atmospheric sulfur exit channels. Both of these are dependent on atmospheric CH4:CO2 and O2 mixing ratios. We propose a distinct change in atmospheric composition at 2.7 Ga resulting from increased fluxes of oxygen and methane as the best explanation for the observed Neoarchean MIF-S record. Our data and modeling results suggest that oxygenic photosynthesis was a major contributor to primary productivity 2.7 billion years ago.
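    For readers unfamiliar with the Δ-notation, the mass-independent signals discussed here are conventionally defined as deviations from the mass-dependent expectation (standard definitions, not specific to this study):

        \Delta^{33}\mathrm{S} = \delta^{33}\mathrm{S} - 1000\left[\left(1+\frac{\delta^{34}\mathrm{S}}{1000}\right)^{0.515}-1\right],
        \qquad
        \Delta^{36}\mathrm{S} = \delta^{36}\mathrm{S} - 1000\left[\left(1+\frac{\delta^{34}\mathrm{S}}{1000}\right)^{1.90}-1\right].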

  8. The transition to a sulphidic ocean approximately 1.84 billion years ago.

    PubMed

    Poulton, Simon W; Fralick, Philip W; Canfield, Donald E

    2004-09-09

    The Proterozoic aeon (2.5 to 0.54 billion years (Gyr) ago) marks the time between the largely anoxic world of the Archean (> 2.5 Gyr ago) and the dominantly oxic world of the Phanerozoic (< 0.54 Gyr ago). The course of ocean chemistry through the Proterozoic has traditionally been explained by progressive oxygenation of the deep ocean in response to an increase in atmospheric oxygen around 2.3 Gyr ago. This postulated rise in the oxygen content of the ocean is in turn thought to have led to the oxidation of dissolved iron, Fe(II), thus ending the deposition of banded iron formations (BIF) around 1.8 Gyr ago. An alternative interpretation suggests that the increasing atmospheric oxygen levels enhanced sulphide weathering on land and the flux of sulphate to the oceans. This increased rates of sulphate reduction, resulting in Fe(II) removal in the form of pyrite as the oceans became sulphidic. Here we investigate sediments from the approximately 1.8-Gyr-old Animikie group, Canada, which were deposited during the final stages of the main global period of BIF deposition. This allows us to evaluate the two competing hypotheses for the termination of BIF deposition. We use iron-sulphur-carbon (Fe-S-C) systematics to demonstrate continued ocean anoxia after the final global deposition of BIF and show that a transition to sulphidic bottom waters was ultimately responsible for the termination of BIF deposition. Sulphidic conditions may have persisted until a second major rise in oxygen between 0.8 to 0.58 Gyr ago, possibly reducing global rates of primary production and arresting the pace of algal evolution.

  9. The transition to a sulphidic ocean ~ 1.84 billion years ago

    NASA Astrophysics Data System (ADS)

    Poulton, Simon W.; Fralick, Philip W.; Canfield, Donald E.

    2004-09-01

    The Proterozoic aeon (2.5 to 0.54 billion years (Gyr) ago) marks the time between the largely anoxic world of the Archean (> 2.5 Gyr ago) and the dominantly oxic world of the Phanerozoic (< 0.54 Gyr ago). The course of ocean chemistry through the Proterozoic has traditionally been explained by progressive oxygenation of the deep ocean in response to an increase in atmospheric oxygen around 2.3 Gyr ago. This postulated rise in the oxygen content of the ocean is in turn thought to have led to the oxidation of dissolved iron, Fe(II), thus ending the deposition of banded iron formations (BIF) around 1.8 Gyr ago. An alternative interpretation suggests that the increasing atmospheric oxygen levels enhanced sulphide weathering on land and the flux of sulphate to the oceans. This increased rates of sulphate reduction, resulting in Fe(II) removal in the form of pyrite as the oceans became sulphidic. Here we investigate sediments from the ~1.8-Gyr-old Animikie group, Canada, which were deposited during the final stages of the main global period of BIF deposition. This allows us to evaluate the two competing hypotheses for the termination of BIF deposition. We use iron-sulphur-carbon (Fe-S-C) systematics to demonstrate continued ocean anoxia after the final global deposition of BIF and show that a transition to sulphidic bottom waters was ultimately responsible for the termination of BIF deposition. Sulphidic conditions may have persisted until a second major rise in oxygen between 0.8 to 0.58 Gyr ago, possibly reducing global rates of primary production and arresting the pace of algal evolution.

  10. No Photon Left Behind: How Billions of Spectral Lines are Transforming Planetary Sciences

    NASA Astrophysics Data System (ADS)

    Villanueva, Geronimo L.

    2014-06-01

    With the advent of realistic potential energy surface (PES) and dipole moment surface (DMS) descriptions, theoretically computed linelists can now synthesize accurate spectral parameters for billions of spectral lines sampling the untamed high-energy molecular domain. Although the initial driver for these databases was the characterization of stellar spectra, in combination with decades of precise experimental studies (nicely compiled in community databases such as HITRAN and GEISA) they are leading to unprecedented precision in the characterization of planetary atmospheres. Cometary sciences are among the most affected by this spectroscopic revolution. Even though comets are relatively cold bodies (T ~ 100 K), their infrared molecular emission is mainly defined by non-LTE solar fluorescence induced by a high-energy source (the Sun, T ~ 5600 K). In order to interpret high-resolution spectra of comets acquired with extremely powerful telescopes (e.g., Keck, VLT, NASA-IRTF), we have developed advanced non-LTE fluorescence models that integrate the high-energy dynamic range of ab initio databases (e.g., BT2, VTT, HPT2, BYTe, TROVE) and the precision of laboratory and semi-empirical compilations (e.g., HITRAN, GEISA, CDMS, WKMC, SELP, IUPAC). These new models allow us to calculate realistic non-LTE pumps, cascades, branching ratios, and emission rates for a broad range of excitation regimes for H2O, HDO, HCN, HNC and NH3. We have applied elements of these compilations to the study of Mars spectra, and we are now exploring their application to modeling non-LTE emission in exoplanets. In this presentation, we present the application of these advanced models to interpret high-resolution spectra of comets, Mars and exoplanets.

  11. Industrial R&D Spending Reached $26.6 Billion in 1976. Science Resources Studies Highlights, May 5, 1978.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    This report presents data compiled as part of a comprehensive program to measure and analyze the nation's resources expended for research and development (R&D). Industry, which carries out 69% of the R&D in the United States, spent $26.6 billion on these activities in 1976, 10% above the 1975 level. In constant dollars, this represents an…

  12. Switching to less expensive blindness drug could save medicare part B $18 billion over a ten-year period.

    PubMed

    Hutton, David; Newman-Casey, Paula Anne; Tavag, Mrinalini; Zacks, David; Stein, Joshua

    2014-06-01

    The biologic drugs bevacizumab and ranibizumab have revolutionized treatment of diabetic macular edema and neovascular age-related macular degeneration, leading causes of blindness. Ophthalmologic use of these drugs has increased and now accounts for roughly one-sixth of the Medicare Part B drug budget. The two drugs have similar efficacy and potentially minor differences in adverse-event rates; however, at $2,023 per dose, ranibizumab costs forty times more than bevacizumab. Using modeling methods, we predict ten-year (2010-20) population-level costs and health benefits of using bevacizumab and ranibizumab. Our results show that if all patients were treated with the less expensive bevacizumab instead of current usage patterns, savings would amount to $18 billion for Medicare Part B and nearly $5 billion for patients. With an additional $6 billion savings in other health care expenses, the total savings would be almost $29 billion. Altering patterns of use with these therapies by encouraging bevacizumab use and hastening approval of biosimilar therapies would dramatically reduce spending without substantially affecting patient outcomes.

  13. $100 Billion: For Reform...or to Subsidize the Status Quo? Education Stimulus Watch. Special Report 1

    ERIC Educational Resources Information Center

    Smarick, Andy

    2009-01-01

    This is the first in a quarterly series of special reports on the K-12 education implications of the federal government's economic stimulus package, the American Recovery and Reinvestment Act (ARRA). That the ARRA, which was signed into law in February, will pump nearly $100 billion--an unprecedented sum of federal money--into K-12 education is…

  14. 77 FR 3075 - Resolution Plans Required for Insured Depository Institutions With $50 Billion or More in Total...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-23

    ... Plan will describe the plan to resolve each parent holding company under the Bankruptcy Code, the Rule... insurance fund or the economy, or if the parent company has been designated as a systemically important... association is over $50 billion and receives a CAMELS rating of 3 or worse or its parent receives...

  15. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ∼5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross
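
    The extraction step described above rests on the standard factorization of the evaporation-residue cross section, σ_ER = σ_capture × P_CN × W_survival. The following minimal sketch (not the authors' analysis code) illustrates that bookkeeping with hypothetical cross sections and survival probabilities, purely to show how ⟨PCN⟩ falls out of the comparison.

```python
# Minimal sketch (not the authors' analysis code): the standard factorization
# sigma_ER = sigma_capture * P_CN * W_survival, rearranged to pull an average
# fusion probability <P_CN> out of measured ER cross sections and model inputs.
# All numbers below are hypothetical.

def extract_pcn(sigma_er_mb, sigma_cap_mb, w_sur):
    """Fusion probability implied by one energy point."""
    return sigma_er_mb / (sigma_cap_mb * w_sur)

# Hypothetical energy scan ~5-35% above the potential barrier:
measurements = [
    # (sigma_ER [mb], sigma_capture [mb], survival probability)
    (0.8, 120.0, 0.010),
    (1.1, 180.0, 0.008),
    (1.3, 240.0, 0.006),
]

pcn_values = [extract_pcn(*m) for m in measurements]
avg_pcn = sum(pcn_values) / len(pcn_values)
print(f"<P_CN> ~ {avg_pcn:.2f}")   # values below 1 signal non-CN fission
```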

  16. Trajectory versus probability density entropy.

    PubMed

    Bologna, M; Grigolini, P; Karagiorgis, M; Rosa, A

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  17. Strongly baryon-dominated disk galaxies at the peak of galaxy formation ten billion years ago

    NASA Astrophysics Data System (ADS)

    Genzel, R.; Schreiber, N. M. Förster; Übler, H.; Lang, P.; Naab, T.; Bender, R.; Tacconi, L. J.; Wisnioski, E.; Wuyts, S.; Alexander, T.; Beifiori, A.; Belli, S.; Brammer, G.; Burkert, A.; Carollo, C. M.; Chan, J.; Davies, R.; Fossati, M.; Galametz, A.; Genel, S.; Gerhard, O.; Lutz, D.; Mendel, J. T.; Momcheva, I.; Nelson, E. J.; Renzini, A.; Saglia, R.; Sternberg, A.; Tacchella, S.; Tadaki, K.; Wilman, D.

    2017-03-01

    In the cold dark matter cosmology, the baryonic components of galaxies—stars and gas—are thought to be mixed with and embedded in non-baryonic and non-relativistic dark matter, which dominates the total mass of the galaxy and its dark-matter halo. In the local (low-redshift) Universe, the mass of dark matter within a galactic disk increases with disk radius, becoming appreciable and then dominant in the outer, baryonic regions of the disks of star-forming galaxies. This results in rotation velocities of the visible matter within the disk that are constant or increasing with disk radius—a hallmark of the dark-matter model. Comparisons between the dynamical mass, inferred from these velocities in rotational equilibrium, and the sum of the stellar and cold-gas mass at the peak epoch of galaxy formation ten billion years ago, inferred from ancillary data, suggest high baryon fractions in the inner, star-forming regions of the disks. Although this implied baryon fraction may be larger than in the local Universe, the systematic uncertainties (owing to the chosen stellar initial-mass function and the calibration of gas masses) render such comparisons inconclusive in terms of the mass of dark matter. Here we report rotation curves (showing rotation velocity as a function of disk radius) for the outer disks of six massive star-forming galaxies, and find that the rotation velocities are not constant, but decrease with radius. We propose that this trend arises because of a combination of two main factors: first, a large fraction of the massive high-redshift galaxy population was strongly baryon-dominated, with dark matter playing a smaller part than in the local Universe; and second, the large velocity dispersion in high-redshift disks introduces a substantial pressure term that leads to a decrease in rotation velocity with increasing radius. The effect of both factors appears to increase with redshift. Qualitatively, the observations suggest that baryons in the early

  18. The Other Inconvenient Truth: Feeding 9 Billion While Sustaining the Earth System

    NASA Astrophysics Data System (ADS)

    Foley, J. A.

    2010-12-01

    As the international community focuses on climate change as the great challenge of our era, we have been largely ignoring another looming problem — the global crisis in agriculture, food security and the environment. Our use of land, particularly for agriculture, is absolutely essential to the success of the human race: we depend on agriculture to supply us with food, feed, fiber, and, increasingly, biofuels. Without a highly efficient, productive, and resilient agricultural system, our society would collapse almost overnight. But we are demanding more and more from our global agricultural systems, pushing them to their very limits. Continued population growth (adding more than 70 million people to the world every year), changing dietary preferences (including more meat and dairy consumption), rising energy prices, and increasing needs for bioenergy sources are putting tremendous pressure on the world’s resources. And, if we want any hope of keeping up with these demands, we’ll need to double the agricultural production of the planet in the next 30 to 40 years. Meeting these huge new agricultural demands will be one of the greatest challenges of the 21st century. At present, it is completely unclear how (and if) we can do it. If this wasn’t enough, we must also address the massive environmental impacts of our current agricultural practices, which new evidence indicates rival the impacts of climate change. Simply put, providing for the basic needs of 9 billion-plus people, without ruining the biosphere in the process, will be one of the greatest challenges our species has ever faced. In this presentation, I will present a new framework for evaluating and assessing global patterns of agriculture, food / fiber / fuel production, and their relationship to the earth system, particularly in terms of changing stocks and flows of water, nutrients and carbon in our planetary environment. This framework aims to help us manage the challenges of increasing global food

  19. Strongly baryon-dominated disk galaxies at the peak of galaxy formation ten billion years ago.

    PubMed

    Genzel, R; Schreiber, N M Förster; Übler, H; Lang, P; Naab, T; Bender, R; Tacconi, L J; Wisnioski, E; Wuyts, S; Alexander, T; Beifiori, A; Belli, S; Brammer, G; Burkert, A; Carollo, C M; Chan, J; Davies, R; Fossati, M; Galametz, A; Genel, S; Gerhard, O; Lutz, D; Mendel, J T; Momcheva, I; Nelson, E J; Renzini, A; Saglia, R; Sternberg, A; Tacchella, S; Tadaki, K; Wilman, D

    2017-03-15

    In the cold dark matter cosmology, the baryonic components of galaxies-stars and gas-are thought to be mixed with and embedded in non-baryonic and non-relativistic dark matter, which dominates the total mass of the galaxy and its dark-matter halo. In the local (low-redshift) Universe, the mass of dark matter within a galactic disk increases with disk radius, becoming appreciable and then dominant in the outer, baryonic regions of the disks of star-forming galaxies. This results in rotation velocities of the visible matter within the disk that are constant or increasing with disk radius-a hallmark of the dark-matter model. Comparisons between the dynamical mass, inferred from these velocities in rotational equilibrium, and the sum of the stellar and cold-gas mass at the peak epoch of galaxy formation ten billion years ago, inferred from ancillary data, suggest high baryon fractions in the inner, star-forming regions of the disks. Although this implied baryon fraction may be larger than in the local Universe, the systematic uncertainties (owing to the chosen stellar initial-mass function and the calibration of gas masses) render such comparisons inconclusive in terms of the mass of dark matter. Here we report rotation curves (showing rotation velocity as a function of disk radius) for the outer disks of six massive star-forming galaxies, and find that the rotation velocities are not constant, but decrease with radius. We propose that this trend arises because of a combination of two main factors: first, a large fraction of the massive high-redshift galaxy population was strongly baryon-dominated, with dark matter playing a smaller part than in the local Universe; and second, the large velocity dispersion in high-redshift disks introduces a substantial pressure term that leads to a decrease in rotation velocity with increasing radius. The effect of both factors appears to increase with redshift. Qualitatively, the observations suggest that baryons in the early (high

  20. Gaia Data Release 1. Astrometry: one billion positions, two million proper motions and parallaxes

    NASA Astrophysics Data System (ADS)

    Lindegren, L.; Lammers, U.; Bastian, U.; Hernández, J.; Klioner, S.; Hobbs, D.; Bombrun, A.; Michalik, D.; Ramos-Lerate, M.; Butkevich, A.; Comoretto, G.; Joliet, E.; Holl, B.; Hutton, A.; Parsons, P.; Steidelmüller, H.; Abbas, U.; Altmann, M.; Andrei, A.; Anton, S.; Bach, N.; Barache, C.; Becciani, U.; Berthier, J.; Bianchi, L.; Biermann, M.; Bouquillon, S.; Bourda, G.; Brüsemeister, T.; Bucciarelli, B.; Busonero, D.; Carlucci, T.; Castañeda, J.; Charlot, P.; Clotet, M.; Crosta, M.; Davidson, M.; de Felice, F.; Drimmel, R.; Fabricius, C.; Fienga, A.; Figueras, F.; Fraile, E.; Gai, M.; Garralda, N.; Geyer, R.; González-Vidal, J. J.; Guerra, R.; Hambly, N. C.; Hauser, M.; Jordan, S.; Lattanzi, M. G.; Lenhardt, H.; Liao, S.; Löffler, W.; McMillan, P. J.; Mignard, F.; Mora, A.; Morbidelli, R.; Portell, J.; Riva, A.; Sarasso, M.; Serraller, I.; Siddiqui, H.; Smart, R.; Spagna, A.; Stampa, U.; Steele, I.; Taris, F.; Torra, J.; van Reeven, W.; Vecchiato, A.; Zschocke, S.; de Bruijne, J.; Gracia, G.; Raison, F.; Lister, T.; Marchant, J.; Messineo, R.; Soffel, M.; Osorio, J.; de Torres, A.; O'Mullane, W.

    2016-11-01

    Context. Gaia Data Release 1 (DR1) contains astrometric results for more than 1 billion stars brighter than magnitude 20.7 based on observations collected by the Gaia satellite during the first 14 months of its operational phase. Aims: We give a brief overview of the astrometric content of the data release and of the model assumptions, data processing, and validation of the results. Methods: For stars in common with the Hipparcos and Tycho-2 catalogues, complete astrometric single-star solutions are obtained by incorporating positional information from the earlier catalogues. For other stars only their positions are obtained, essentially by neglecting their proper motions and parallaxes. The results are validated by an analysis of the residuals, through special validation runs, and by comparison with external data. Results: For about two million of the brighter stars (down to magnitude 11.5) we obtain positions, parallaxes, and proper motions to Hipparcos-type precision or better. For these stars, systematic errors depending for example on position and colour are at a level of ± 0.3 milliarcsecond (mas). For the remaining stars we obtain positions at epoch J2015.0 accurate to 10 mas. Positions and proper motions are given in a reference frame that is aligned with the International Celestial Reference Frame (ICRF) to better than 0.1 mas at epoch J2015.0, and non-rotating with respect to ICRF to within 0.03 mas yr-1. The Hipparcos reference frame is found to rotate with respect to the Gaia DR1 frame at a rate of 0.24 mas yr-1. Conclusions: Based on less than a quarter of the nominal mission length and on very provisional and incomplete calibrations, the quality and completeness of the astrometric data in Gaia DR1 are far from what is expected for the final mission products. The present results nevertheless represent a huge improvement in the available fundamental stellar data and practical definition of the optical reference frame.

  1. A 25.5 percent AM0 gallium arsenide grating solar cell

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Godlewski, M. P.

    1985-01-01

    Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AM0) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.

  2. A 25.5 percent AMO gallium arsenide grating solar cell

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Godlewski, M. P.

    1985-01-01

    Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AMO) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.

  3. 5 Percent Ares I Scale Model Acoustic Test: Overpressure Characterization and Analysis

    NASA Technical Reports Server (NTRS)

    Alvord, David; Casiano, Matthew; McDaniels, Dave

    2011-01-01

    During the ignition of a ducted solid rocket motor (SRM), rapid expansion of injected hot gases from the motor into a confined volume causes the development of a steep fronted wave. This low frequency transient wave propagates outward from the exhaust duct, impinging the vehicle and ground structures. An unsuppressed overpressure wave can potentially cause modal excitation in the structures and vehicle, subsequently leading to damage. This presentation details the ignition transient findings from the 5% Ares I Scale Model Acoustic Test (ASMAT). The primary events of the ignition transient environment induced by the SRM are the ignition overpressure (IOP), duct overpressure (DOP), and source overpressure (SOP). The resulting observations include successful knockdown of the IOP environment through use of a Space Shuttle derived IOP suppression system, a potential load applied to the vehicle stemming from instantaneous asymmetrical IOP and DOP wave impingement, and launch complex geometric influences on the environment. The results are scaled to a full-scale Ares I equivalent and compared with heritage data including Ares I-X and both suppressed and unsuppressed Space Shuttle IOP environments.

  4. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... best efforts underwriting) for a primary or secondary offering of L stock. (iv) Assume that the...

  5. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... members. However, the participation by creditors in formulating a plan for an insolvency workout or a... receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or reorganization do... advisor is also the underwriter (without regard to whether it is a firm commitment or best...

  6. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... best efforts underwriting) for a primary or secondary offering of L stock. (iv) Assume that the...

  7. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... members. However, the participation by creditors in formulating a plan for an insolvency workout or a... receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or reorganization do... advisor is also the underwriter (without regard to whether it is a firm commitment or best...

  8. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... best efforts underwriting) for a primary or secondary offering of L stock. (iv) Assume that the...

  9. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  10. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
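
    As a toy illustration of what a probabilistic description P_BH(M_ZAMS) could look like in practice, the sketch below parameterizes it as a logistic curve in ZAMS mass and samples stochastic outcomes. The functional form and every number are assumptions for demonstration only, not the constraint derived in the paper.

```python
import math
import random

# Toy illustration only: one possible smooth parameterization of a
# probabilistic black-hole-formation function P_BH(M_ZAMS).  The logistic
# form and the parameters below are assumptions for demonstration, not the
# constraint derived by Clausen, Piro & Ott.

def p_bh(m_zams, m_half=25.0, width=5.0):
    """Probability that a star of the given ZAMS mass (solar masses) leaves a BH."""
    return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

random.seed(0)
for m in (15.0, 25.0, 40.0):
    outcome = "BH" if random.random() < p_bh(m) else "NS"
    print(f"M_ZAMS = {m:4.1f} Msun   P_BH = {p_bh(m):.2f}   sampled outcome: {outcome}")
```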

  11. The continuing cost of privatization: extra payments to Medicare Advantage plans jump to $11.4 billion in 2009.

    PubMed

    Biles, Brian; Pozen, Jonah; Guterman, Stuart

    2009-05-01

    The Medicare Modernization Act of 2003 explicitly increased Medicare payments to private Medicare Advantage (MA) plans. As a result, MA plans have, for the past six years, been paid more for their enrollees than they would be expected to cost in traditional fee-for-service Medicare. Payments to MA plans in 2009 are projected to be 13 percent greater than the corresponding costs in traditional Medicare--an average of $1,138 per MA plan enrollee, for a total of $11.4 billion. Although the extra payments are used to provide enrollees additional benefits, those benefits are not available to all beneficiaries--but they are financed by general program funds. If payments to MA plans were instead equal to the spending level under traditional Medicare, the more than $150 billion in savings over 10 years could be used to finance improved benefits for the low-income elderly and disabled, or for expanding health-insurance coverage.
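
    A quick arithmetic check of the quoted figures (a 13 percent premium, $1,138 per enrollee, $11.4 billion in total) gives the implied enrollment and the implied traditional-Medicare cost per enrollee; this is only a consistency sketch of numbers stated in the abstract.

```python
# Back-of-the-envelope check of the 2009 figures quoted in the abstract.
extra_per_enrollee = 1138            # dollars per MA enrollee
total_extra = 11.4e9                 # dollars in total

implied_enrollment = total_extra / extra_per_enrollee
print(f"Implied MA enrollment: {implied_enrollment / 1e6:.1f} million")   # ~10 million

# A 13 percent premium over traditional Medicare implies a per-enrollee
# fee-for-service cost of roughly:
implied_ffs_cost = extra_per_enrollee / 0.13
print(f"Implied traditional-Medicare cost per enrollee: ${implied_ffs_cost:,.0f}")
```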

  12. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  13. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  14. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  15. Survival Of Pure Disc Galaxies Over The Last 8 Billion Years

    NASA Astrophysics Data System (ADS)

    Sachdeva, Sonali

    2016-09-01

    The presence of pure disk galaxies without any bulge component, i.e., neither classical nor pseudo, poses a severe challenge not just to the hierarchical galaxy formation models but also to the theories of internal secular evolution. We discover that a significant fraction of disk galaxies (∼15-18%) in the Hubble Deep Field (0.4 < z < 1.0) as well as in the local Universe (0.02 < z < 0.05) are such pure disk systems (PDS). We trace the evolution of this population to find how they survived the merger violence and other disk instabilities to remain dynamically undisturbed. We find that smooth accretion of cold gas via cosmic filaments is the most probable mode of their growth in mass and size since z ∼ 1. We speculate that PDSs are dynamically hotter and cushioned in massive dark matter haloes, which may prevent them from undergoing strong secular evolution.

  16. A billion pixels, a billion stars

    NASA Astrophysics Data System (ADS)

    Gilmore, Gerry; van Leeuwen, Floor

    2016-09-01

    The Gaia spacecraft is conducting the most ambitious and thorough census of our galaxy ever attempted, gathering data on 100,000 stars every hour. With the mission's first major data release due this month, Gerry Gilmore and Floor van Leeuwen explain how the spacecraft works and assess its likely impact on the field of astrophysics

  17. Intergalactic Lyman continuum photon budget in the past 5 billion years

    NASA Astrophysics Data System (ADS)

    Gaikwad, Prakash; Khaire, Vikram; Choudhury, Tirthankar Roy; Srianand, Raghunathan

    2017-04-01

    We constrain the H I photoionization rate (Γ_HI) at z ≲ 0.45 by comparing the flux probability distribution function and power spectrum of the Lyα forest data along 82 Quasi-Stellar Object (QSO) sightlines obtained using the Cosmic Origins Spectrograph with models generated from smoothed particle hydrodynamic simulations. We have developed a module named 'Code for Ionization and Temperature Evolution (CITE)' for calculating the intergalactic medium (IGM) temperature evolution from high to low redshifts by post-processing the GADGET-2 simulation outputs. Our method, which produces results consistent with other simulations, is computationally less expensive, thus allowing us to explore a large parameter space. It also allows rigorous estimation of the error covariance matrix for various statistical quantities of interest. We find that the best-fitting Γ_HI(z) increases with z and follows (4 ± 0.1) × 10^-14 (1 + z)^(4.99 ± 0.12) s^-1. At any given z, the typical uncertainties ΔΓ_HI/Γ_HI are ∼25 per cent; these contain not only the statistical errors but also those arising from possible degeneracy with the thermal history of the IGM and cosmological parameters, and uncertainties in fitting the QSO continuum. These values of Γ_HI favour the scenario where only QSOs contribute to the ionizing background at z < 2. Our derived 3σ upper limit on the average escape fraction is 0.008, consistent with measurements of low-z galaxies.
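
    The best-fitting power law quoted above is straightforward to evaluate directly; the sketch below simply plugs redshifts into Γ_HI(z) = 4 × 10^-14 (1 + z)^4.99 s^-1, using central values only and ignoring the quoted uncertainties.

```python
# Evaluate the best-fitting H I photoionization rate quoted in the abstract,
# Gamma_HI(z) = 4e-14 * (1 + z)**4.99 s^-1, valid for z <~ 0.45 (central
# values only; the quoted uncertainties are ignored here).

def gamma_hi(z, amplitude=4e-14, slope=4.99):
    return amplitude * (1.0 + z) ** slope

for z in (0.0, 0.2, 0.45):
    print(f"z = {z:.2f}   Gamma_HI ~ {gamma_hi(z):.2e} s^-1")
```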

  18. UT Biomedical Informatics Lab (BMIL) probability wheel

    NASA Astrophysics Data System (ADS)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  19. UT Biomedical Informatics Lab (BMIL) Probability Wheel.

    PubMed

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K

    2016-01-01

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
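
    Conceptually, a two-slice probability wheel is nothing more than a probability p and its complement rendered as angular sweeps that the participant can nudge. The minimal sketch below illustrates that bookkeeping; it is not the BMIL app's actual code, and the class and method names are invented for illustration.

```python
# Minimal sketch (not the BMIL app's code): a two-slice probability wheel is
# a probability p and its complement rendered as angular sweeps.  The class
# and method names are invented for illustration.

class ProbabilityWheel:
    def __init__(self, p=0.5):
        self.p = p

    def adjust(self, delta):
        """Nudge the probability of the first slice, keeping it in [0, 1]."""
        self.p = min(1.0, max(0.0, self.p + delta))

    def slice_angles(self):
        """Angular sizes (degrees) of the two colored slices."""
        return self.p * 360.0, (1.0 - self.p) * 360.0

wheel = ProbabilityWheel(0.30)
wheel.adjust(+0.05)                       # participant nudges the first slice larger
print(wheel.p, wheel.slice_angles())      # ~0.35 and the corresponding slice angles
```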

  20. Derivation of quantum probability from measurement

    NASA Astrophysics Data System (ADS)

    Herbut, Fedor

    2016-05-01

    To begin with, it is pointed out that the form of the quantum probability formula originates in the very initial state of the object system as seen when the state is expanded with the eigenprojectors of the measured observable. Making use of the probability reproducibility condition, which is a key concept in unitary measurement theory, one obtains the relevant coherent distribution of the complete-measurement results in the final unitary-measurement state in agreement with the mentioned probability formula. Treating the transition from the final unitary, or premeasurement, state, where all possible results are present, to one complete-measurement result sketchily in the usual way, the well-known probability formula is derived. In conclusion it is pointed out that the entire argument is only formal unless one makes it physical assuming that the quantum probability law is valid in the extreme case of probability-one (certain) events (projectors).

  1. Error probability performance of unbalanced QPSK receivers

    NASA Technical Reports Server (NTRS)

    Simon, M. K.

    1978-01-01

    A simple technique for calculating the error probability performance and associated noisy reference loss of practical unbalanced QPSK receivers is presented. The approach is based on expanding the error probability conditioned on the loop phase error in a power series in the loop phase error and then, keeping only the first few terms of this series, averaging this conditional error probability over the probability density function of the loop phase error. Doing so results in an expression for the average error probability which is in the form of a leading term representing the ideal (perfect synchronization references) performance plus a term proportional to the mean-squared crosstalk. Thus, the additional error probability due to noisy synchronization references occurs as an additive term proportional to the mean-squared phase jitter directly associated with the receiver's tracking loop. Similar arguments are advanced to give closed-form results for the noisy reference loss itself.
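
    The averaging step described above can be illustrated numerically: condition the error probability on a static loop phase error, weight it by a phase-error density, and integrate. The sketch below uses a Tikhonov phase-error density and a simple coherent-detection error model as stand-ins; these are textbook choices for illustration, not Simon's exact unbalanced-QPSK expressions.

```python
import math

from scipy import integrate, special

# Illustrative numerical sketch of the averaging step described in the
# abstract: weight a conditional error probability P(E | phi) by the density
# p(phi) of the carrier-tracking phase error and integrate.  The Tikhonov
# density and the static-phase-offset error model are textbook stand-ins.

def q_func(x):
    return 0.5 * special.erfc(x / math.sqrt(2.0))

def tikhonov_pdf(phi, rho):
    """Phase-error density for loop SNR rho (overflow-safe, uses scaled Bessel i0e)."""
    return math.exp(rho * (math.cos(phi) - 1.0)) / (2.0 * math.pi * special.i0e(rho))

def conditional_error(phi, ebn0_lin):
    """Coherent detection degraded by a static phase offset phi (illustrative model)."""
    return q_func(math.sqrt(2.0 * ebn0_lin) * math.cos(phi))

def average_error(ebn0_db, loop_snr_db):
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    rho = 10.0 ** (loop_snr_db / 10.0)
    value, _ = integrate.quad(
        lambda phi: conditional_error(phi, ebn0) * tikhonov_pdf(phi, rho),
        -math.pi, math.pi)
    return value

ebn0_db = 8.0
ideal = q_func(math.sqrt(2.0 * 10.0 ** (ebn0_db / 10.0)))   # perfect reference term
noisy = average_error(ebn0_db, loop_snr_db=15.0)            # noisy carrier reference
print(f"ideal ~ {ideal:.2e}   noisy ~ {noisy:.2e}   noisy-reference loss ~ {noisy / ideal:.2f}x")
```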

  2. UT Biomedical Informatics Lab (BMIL) Probability Wheel

    PubMed Central

    Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    2016-01-01

    A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant,” about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences. PMID:28105462

  3. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  4. Location probability learning requires focal attention.

    PubMed

    Kabata, Takashi; Yokoyama, Takemasa; Noguchi, Yasuki; Kita, Shinichi

    2014-01-01

    Target identification is related to the frequency with which targets appear at a given location, with greater frequency enhancing identification. This phenomenon suggests that location probability learned through repeated experience with the target modulates cognitive processing. However, it remains unclear whether attentive processing of the target is required to learn location probability. Here, we used a dual-task paradigm to test the location probability effect of attended and unattended stimuli. Observers performed an attentionally demanding central-letter task and a peripheral-bar discrimination task in which location probability was manipulated. Thus, we were able to compare performance on the peripheral task when attention was fully engaged to the target (single-task condition) versus when attentional resources were drawn away by the central task (dual-task condition). The location probability effect occurred only in the single-task condition, when attention resources were fully available. This suggests that location probability learning requires attention to the target stimuli.

  5. Experience matters: information acquisition optimizes probability gain.

    PubMed

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information-information gain, Kullback-Leibler distance, probability gain (error minimization), and impact-are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.
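
    The competing value-of-information measures named above share one template: the expected improvement of some usefulness function when moving from the prior to the posterior. The sketch below implements that template for a binary hypothesis and binary query with toy numbers; the definitions follow the standard formulations, and expected Kullback-Leibler distance coincides with information gain in expectation, so it is not listed separately.

```python
import math

# Minimal implementations of the value-of-information measures compared in the
# abstract, for a binary hypothesis H and a binary query F, with toy numbers.

def entropy(ps):
    return -sum(p * math.log2(p) for p in ps if p > 0)

def expected_usefulness(prior, likelihoods, usefulness):
    """E_f[ usefulness(posterior after F = f) ] - usefulness(prior)."""
    total = 0.0
    for f in (0, 1):
        p_f = sum(prior[h] * likelihoods[h][f] for h in (0, 1))
        posterior = [prior[h] * likelihoods[h][f] / p_f for h in (0, 1)]
        total += p_f * usefulness(posterior)
    return total - usefulness(prior)

prior = [0.7, 0.3]                        # P(H = h)
likelihoods = [[0.9, 0.1], [0.3, 0.7]]    # P(F = f | H = h)

probability_gain = expected_usefulness(prior, likelihoods, max)
information_gain = expected_usefulness(prior, likelihoods, lambda p: -entropy(p))
impact = expected_usefulness(
    prior, likelihoods,
    lambda p: 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, prior)))
print(f"probability gain {probability_gain:.3f}, "
      f"information gain {information_gain:.3f} bits, impact {impact:.3f}")
```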

  6. Total variation denoising of probability measures using iterated function systems with probabilities

    NASA Astrophysics Data System (ADS)

    La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.

    2017-01-01

    In this paper we present a total variation denoising problem for probability measures using the set of fixed point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.

  7. NREL Helps Clean Cities Displace Billions of Gallons of Petroleum, One Vehicle at a Time (Fact Sheet)

    SciTech Connect

    Not Available

    2010-10-01

    With more than 15 years and nearly 3 billion gallons of displaced petroleum under its belt, the Clean Cities program relies on the support and expertise of the National Renewable Energy Laboratory (NREL). An initiative of the U.S. Department of Energy (DOE), Clean Cities creates public-private partnerships with a common mission: to reduce petroleum consumption in the transportation sector. Since the inception of Clean Cities in 1993, NREL has played a central role in supporting the program, an effort that stems from the laboratory's strategy to put scientific innovation into action in the marketplace.

  8. 1.8 Billion Years of Detrital Zircon Recycling Calibrates a Refractory Part of Earth's Sedimentary Cycle.

    PubMed

    Hadlari, Thomas; Swindles, Graeme T; Galloway, Jennifer M; Bell, Kimberley M; Sulphur, Kyle C; Heaman, Larry M; Beranek, Luke P; Fallas, Karen M

    2015-01-01

    Detrital zircon studies are providing new insights on the evolution of sedimentary basins but the role of sedimentary recycling remains largely undefined. In a broad region of northwestern North America, this contribution traces the pathway of detrital zircon sand grains from Proterozoic sandstones through Phanerozoic strata and argues for multi-stage sedimentary recycling over more than a billion years. As a test of our hypothesis, integrated palynology and detrital zircon provenance provides clear evidence for erosion of Carboniferous strata in the northern Cordillera as a sediment source for Upper Cretaceous strata. Our results help to calibrate Earth's sedimentary cycle by showing that recycling dominates sedimentary provenance for the refractory mineral zircon.

  9. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  10. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  11. Correlation as Probability of Common Descent.

    ERIC Educational Resources Information Center

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the…

  12. Probability: A Matter of Life and Death

    ERIC Educational Resources Information Center

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
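
    A tiny worked fragment makes the link between the probabilities and the table concrete: hypothetical probabilities of dying q_x generate survivors l_x and a truncated life expectancy. All numbers are illustrative.

```python
# Tiny life-table fragment: hypothetical probabilities of dying q_x at each age
# generate survivors l_x and a truncated life expectancy.  Numbers are
# illustrative, not taken from any real population.

qx = [0.005, 0.001, 0.001, 0.002, 0.004]    # q_x for ages 0..4 (hypothetical)
lx = [100_000.0]                            # radix: survivors at exact age 0
for q in qx:
    lx.append(lx[-1] * (1.0 - q))

# Person-years lived, assuming deaths fall mid-year: L_x ~ (l_x + l_{x+1}) / 2
Lx = [(a + b) / 2.0 for a, b in zip(lx[:-1], lx[1:])]
truncated_e0 = sum(Lx) / lx[0]
print([round(v) for v in lx])
print(f"expected years lived over these 5 ages: {truncated_e0:.3f}")
```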

  13. Phonotactic Probabilities in Young Children's Speech Production

    ERIC Educational Resources Information Center

    Zamuner, Tania S.; Gerken, Louann; Hammond, Michael

    2004-01-01

    This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…

  14. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than...

  15. Teaching Statistics and Probability: 1981 Yearbook.

    ERIC Educational Resources Information Center

    Shulte, Albert P., Ed.; Smart, James R., Ed.

    This 1981 yearbook of the National Council of Teachers of Mathematics (NCTM) offers classroom ideas for teaching statistics and probability, viewed as important topics in the school mathematics curriculum. Statistics and probability are seen as appropriate because they: (1) provide meaningful applications of mathematics at all levels; (2) provide…

  16. Teaching Probability: A Socio-Constructivist Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  17. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  18. WPE: A Mathematical Microworld for Learning Probability

    ERIC Educational Resources Information Center

    Kiew, Su Ding; Sam, Hong Kian

    2006-01-01

    In this study, the researchers developed the Web-based Probability Explorer (WPE), a mathematical microworld and investigated the effectiveness of the microworld's constructivist learning environment in enhancing the learning of probability and improving students' attitudes toward mathematics. This study also determined the students' satisfaction…

  19. Malawian Students' Meanings for Probability Vocabulary

    ERIC Educational Resources Information Center

    Kazima, Mercy

    2007-01-01

    The paper discusses findings of a study that investigated Malawian students' meanings for some probability vocabulary. The study explores the meanings that, prior to instruction, students assign to some words that are commonly used in teaching probability. The aim is to have some insight into the meanings that students bring to the classroom. The…

  20. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  1. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We…

  2. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  3. Average Transmission Probability of a Random Stack

    ERIC Educational Resources Information Center

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  4. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
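
    The Sagan-Coleman style bookkeeping described above multiplies an expected number of viable microbes released by a probability of growth, here summed over the three release mechanisms the model distinguishes. The sketch below is a hypothetical illustration of that arithmetic, not the paper's parameter values.

```python
import math

# Hypothetical illustration of the Sagan-Coleman style bookkeeping: expected
# viable microbes released times a probability of growth, summed over the
# three release mechanisms distinguished by the model.  All numbers invented.

release_mechanisms = {
    # mechanism: (expected viable microbes released, probability of growth)
    "aeolian_erosion":  (2.0e2, 1e-8),
    "hard_impact":      (5.0e4, 1e-7),
    "nominal_landing":  (1.0e3, 1e-9),
}

expected_growing = sum(n * p_growth for n, p_growth in release_mechanisms.values())
p_contamination = 1.0 - math.exp(-expected_growing)   # P(at least one microbe grows)
print(f"P(contamination) ~ {p_contamination:.2e}")
```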

  5. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  6. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
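
    The binomial arithmetic behind a 29-flaw point-estimate demonstration is compact enough to show directly: the probability of passing the demonstration (PPD) is the chance that all 29 flaws are detected given the true POD, and a clean 29-of-29 run is what ties the result to roughly 90% POD at 95% confidence (0.9^29 ≈ 0.047). The sketch below assumes a zero-miss acceptance criterion.

```python
from math import comb

# Binomial arithmetic of a point-estimate POD demonstration: the probability
# of passing (PPD) is the chance of at most `max_misses` misses in `n_flaws`
# attempts at the true POD.  A zero-miss, 29-flaw criterion is assumed here.

def prob_pass_demo(true_pod, n_flaws=29, max_misses=0):
    return sum(comb(n_flaws, k) * (1.0 - true_pod) ** k * true_pod ** (n_flaws - k)
               for k in range(max_misses + 1))

print(f"PPD at true POD = 0.90: {prob_pass_demo(0.90):.3f}")   # ~0.047
print(f"PPD at true POD = 0.98: {prob_pass_demo(0.98):.3f}")   # ~0.56
# Passing 29 of 29 therefore rejects POD <= 0.90 at roughly 95% confidence,
# but the flaw set must have a true POD well above 0.90 to pass reliably.
```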

  7. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.
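
    One hypothetical way to express such a time-dependent conditional probability is a logistic function of site covariates and the rainfall history up to a given hour, evaluated cell by cell for display in a GIS. The functional form and every coefficient in the sketch below are invented for illustration and are not the calibrated model from the case studies.

```python
import math

# Hypothetical sketch only: a logistic time-dependent conditional probability
# of soil slip driven by site covariates and the rainfall history up to hour t,
# in the spirit of the hazard model described in the abstract.  The functional
# form and every coefficient are invented for illustration.

def p_slip(slope_deg, soil_index, cumulative_rain_mm, rain_rate_mm_per_hr):
    x = (-8.0
         + 0.08 * slope_deg
         + 1.50 * soil_index
         + 0.02 * cumulative_rain_mm
         + 0.15 * rain_rate_mm_per_hr)
    return 1.0 / (1.0 + math.exp(-x))

# Hourly update for one hypothetical grid cell during a storm:
for hour, (cum_mm, rate_mm_hr) in enumerate([(20, 5), (60, 10), (120, 15)], start=1):
    print(f"hour {hour}: P(slip) = {p_slip(30.0, 0.5, cum_mm, rate_mm_hr):.3f}")
```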

  8. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
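
    A small simulation makes the bias mechanism easy to see: when each sampling unit classifies categories with its own (logit-normally drawn) accuracy, the raw category proportions drift away from the true multinomial probabilities. The sketch below simulates that data-generating structure only; it is not the authors' estimation code, and the uniform misclassification scheme is an assumption.

```python
import numpy as np

# Simulation of the data-generating structure discussed in the abstract: true
# categories are multinomial, but each sampling unit classifies them with its
# own correct-classification probability drawn logit-normally.  This is an
# illustration of why naive proportions are biased, not the authors' code.

rng = np.random.default_rng(1)
true_probs = np.array([0.6, 0.3, 0.1])      # true multinomial category probabilities
n_units, n_per_unit = 50, 40

def inv_logit(x):
    return 1.0 / (1.0 + np.exp(-x))

observed = []
for _ in range(n_units):
    p_correct = inv_logit(rng.normal(loc=1.5, scale=0.7))   # unit-level accuracy
    truth = rng.choice(3, size=n_per_unit, p=true_probs)
    missed = rng.random(n_per_unit) > p_correct
    shift = rng.integers(1, 3, size=n_per_unit)             # move to one of the other categories
    observed.append(np.where(missed, (truth + shift) % 3, truth))

counts = np.bincount(np.concatenate(observed), minlength=3)
print("naive observed proportions:", np.round(counts / counts.sum(), 3))
print("true category probabilities:", true_probs)
```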

  9. An introductory analysis of satellite collision probabilities

    NASA Astrophysics Data System (ADS)

    Carlton-Wippern, Kitt C.

    This paper addresses a probabilistic approach in assessing the probabilities of a satellite collision occurring due to relative trajectory analyses and probability density functions representing the satellites' position/momentum vectors. The paper is divided into 2 parts: Static and Dynamic Collision Probabilities. In the Static Collision Probability section, the basic phenomenon under study is: given the mean positions and associated position probability density functions for the two objects, calculate the probability that the two objects collide (defined as being within some distance of each other). The paper presents the classic Laplace problem of the probability of arrival, using standard uniform distribution functions. This problem is then extrapolated to show how 'arrival' can be classified as 'collision', how the arrival space geometries map to collision space geometries and how arbitrary position density functions can then be included and integrated into the analysis. In the Dynamic Collision Probability section, the nature of collisions based upon both trajectory and energy considerations is discussed, and it is shown that energy states alone cannot be used to completely describe whether or not a collision occurs. This fact invalidates some earlier work on the subject and demonstrates why Liouville's theorem cannot be used in general to describe the constant density of the position/momentum space in which a collision may occur. Future position probability density functions are then shown to be the convolution of the current position and momentum density functions (linear analysis), and the paper further demonstrates the dependency of the future position density functions on time. Strategies for assessing the collision probabilities for two point masses with uncertainties in position and momentum at some given time, and these integrated with some arbitrary impact volume schema, are then discussed. This presentation concludes with the formulation of a high level design
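
    The static collision probability described above can be approximated by brute force: draw positions for both objects from their probability density functions and count how often they fall within a combined hard-body radius. The Monte Carlo sketch below assumes Gaussian position uncertainties and invented geometry purely for illustration.

```python
import numpy as np

# Static collision probability sketch: given mean positions and Gaussian
# position uncertainties for two objects, estimate by Monte Carlo the chance
# that they come within a combined hard-body radius.  Geometry and numbers
# are illustrative assumptions.

rng = np.random.default_rng(42)
n_samples = 200_000
hard_body_radius_m = 20.0

mean_1 = np.array([0.0, 0.0, 0.0])
mean_2 = np.array([35.0, 10.0, 0.0])             # ~36 m mean separation
cov_1 = np.diag([15.0, 25.0, 10.0]) ** 2         # covariance, metres^2
cov_2 = np.diag([20.0, 20.0, 12.0]) ** 2

pos_1 = rng.multivariate_normal(mean_1, cov_1, n_samples)
pos_2 = rng.multivariate_normal(mean_2, cov_2, n_samples)
miss_distance = np.linalg.norm(pos_1 - pos_2, axis=1)
p_collision = np.mean(miss_distance < hard_body_radius_m)
print(f"P(collision) ~ {p_collision:.4f}")
```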

  10. An ultraluminous quasar with a twelve-billion-solar-mass black hole at redshift 6.30.

    PubMed

    Wu, Xue-Bing; Wang, Feige; Fan, Xiaohui; Yi, Weimin; Zuo, Wenwen; Bian, Fuyan; Jiang, Linhua; McGreer, Ian D; Wang, Ran; Yang, Jinyi; Yang, Qian; Thompson, David; Beletsky, Yuri

    2015-02-26

    So far, roughly 40 quasars with redshifts greater than z = 6 have been discovered. Each quasar contains a black hole with a mass of about one billion solar masses (10⁹ M☉). The existence of such black holes when the Universe was less than one billion years old presents substantial challenges to theories of the formation and growth of black holes and the coevolution of black holes and galaxies. Here we report the discovery of an ultraluminous quasar, SDSS J010013.02+280225.8, at redshift z = 6.30. It has an optical and near-infrared luminosity a few times greater than those of previously known z > 6 quasars. On the basis of the deep absorption trough on the blue side of the Lyman-α emission line in the spectrum, we estimate the proper size of the ionized proximity zone associated with the quasar to be about 26 million light years, larger than found with other z > 6.1 quasars with lower luminosities. We estimate (on the basis of a near-infrared spectrum) that the black hole has a mass of ∼1.2 × 10¹⁰ M☉, which is consistent with the 1.3 × 10¹⁰ M☉ derived by assuming an Eddington-limited accretion rate.

  11. Compound-specific carbon and hydrogen isotope analysis of sub-parts per billion level waterborne petroleum hydrocarbons

    USGS Publications Warehouse

    Wang, Y.; Huang, Y.; Huckins, J.N.; Petty, J.D.

    2004-01-01

    Compound-specific carbon and hydrogen isotope analysis (CSCIA and CSHIA) has been increasingly used to study the source, transport, and bioremediation of organic contaminants such as petroleum hydrocarbons. In natural aquatic systems, dissolved contaminants represent the bioavailable fraction that generally is of the greatest toxicological significance. However, determining the isotopic ratios of waterborne hydrophobic contaminants in natural waters is very challenging because of their extremely low concentrations (often at sub-parts per billion, or even lower). To acquire sufficient quantities of polycyclic aromatic hydrocarbons with 10 ng/L concentration for CSHIA, more than 1000 L of water must be extracted. Conventional liquid/liquid or solid-phase extraction is not suitable for such large volume extractions. We have developed a new approach that is capable of efficiently sampling sub-parts per billion level waterborne petroleum hydrocarbons for CSIA. We use semipermeable membrane devices (SPMDs) to accumulate hydrophobic contaminants from polluted waters and then recover the compounds in the laboratory for CSIA. In this study, we demonstrate, under a variety of experimental conditions (different concentrations, temperatures, and turbulence levels), that SPMD-associated processes do not induce C and H isotopic fractionations. The applicability of SPMD-CSIA technology to natural systems is further demonstrated by determining the δ13C and δD values of petroleum hydrocarbons present in the Pawtuxet River, RI. Our results show that the combined SPMD-CSIA is an effective tool to investigate the source and fate of hydrophobic contaminants in aquatic environments.

  12. White Light Demonstration of One Hundred Parts per Billion Irradiance Suppression in Air by New Starshade Occulters

    NASA Technical Reports Server (NTRS)

    Levinton, Douglas B.; Cash, Webster C.; Gleason, Brian; Kaiser, Michael J.; Levine, Sara A.; Lo, Amy S.; Schindhelm, Eric; Shipley, Ann F.

    2007-01-01

    A new mission concept for the direct imaging of exo-solar planets called the New Worlds Observer (NWO) has been proposed. The concept involves flying a meter-class space telescope in formation with a newly-conceived, specially-shaped, deployable star-occulting shade several meters across at a separation of some tens of thousands of kilometers. The telescope would make its observations from behind the starshade in a volume of high suppression of incident irradiance from the star around which planets orbit. The required level of irradiance suppression created by the starshade for an efficacious mission is of order 0.1 to 10 parts per billion in broadband light. This paper discusses the experimental setup developed to accurately measure the suppression ratio of irradiance produced at the null position behind candidate starshade forms to these levels. It also presents results of broadband measurements which demonstrated suppression levels of just under 100 parts per billion in air using the Sun as a light source. Analytical modeling of spatial irradiance distributions surrounding the null are presented and compared with photographs of irradiance captured in situ behind candidate starshades.

  13. An age difference of two billion years between a metal-rich and a metal-poor globular cluster.

    PubMed

    Hansen, B M S; Kalirai, J S; Anderson, J; Dotter, A; Richer, H B; Rich, R M; Shara, M M; Fahlman, G G; Hurley, J R; King, I R; Reitzel, D; Stetson, P B

    2013-08-01

    Globular clusters trace the formation history of the spheroidal components of our Galaxy and other galaxies, which represent the bulk of star formation over the history of the Universe. The clusters exhibit a range of metallicities (abundances of elements heavier than helium), with metal-poor clusters dominating the stellar halo of the Galaxy, and higher-metallicity clusters found within the inner Galaxy, associated with the stellar bulge, or the thick disk. Age differences between these clusters can indicate the sequence in which the components of the Galaxy formed, and in particular which clusters were formed outside the Galaxy and were later engulfed along with their original host galaxies, and which were formed within it. Here we report an absolute age of 9.9 ± 0.7 billion years (at 95 per cent confidence) for the metal-rich globular cluster 47 Tucanae, determined by modelling the properties of the cluster's white-dwarf cooling sequence. This is about two billion years younger than has been inferred for the metal-poor cluster NGC 6397 from the same models, and provides quantitative evidence that metal-rich clusters like 47 Tucanae formed later than metal-poor halo clusters like NGC 6397.

  14. A spin-down clock for cool stars from observations of a 2.5-billion-year-old cluster.

    PubMed

    Meibom, Søren; Barnes, Sydney A; Platais, Imants; Gilliland, Ronald L; Latham, David W; Mathieu, Robert D

    2015-01-29

    The ages of the most common stars--low-mass (cool) stars like the Sun, and smaller--are difficult to derive because traditional dating methods use stellar properties that either change little as the stars age or are hard to measure. The rotation rates of all cool stars decrease substantially with time as the stars steadily lose their angular momenta. If properly calibrated, rotation therefore can act as a reliable determinant of their ages based on the method of gyrochronology. To calibrate gyrochronology, the relationship between rotation period and age must be determined for cool stars of different masses, which is best accomplished with rotation period measurements for stars in clusters with well-known ages. Hitherto, such measurements have been possible only in clusters with ages of less than about one billion years, and gyrochronology ages for older stars have been inferred from model predictions. Here we report rotation period measurements for 30 cool stars in the 2.5-billion-year-old cluster NGC 6819. The periods reveal a well-defined relationship between rotation period and stellar mass at the cluster age, suggesting that ages with a precision of order 10 per cent can be derived for large numbers of cool Galactic field stars.

  15. A handful of 'antipoverty' vaccines exist for neglected diseases, but the world's poorest billion people need more.

    PubMed

    Hotez, Peter

    2011-06-01

    So-called neglected tropical diseases are the most common infections of the world's poor. Almost all of the "bottom billion"--the 1.4 billion people who live below the poverty level defined by the World Bank--suffer from one or more neglected diseases including hookworm infection, sleeping sickness, or Chagas disease. These diseases are actually a cause of poverty because of their adverse effects on child growth and development and worker productivity. Vaccines to combat such diseases have come to be known as "antipoverty vaccines." Unfortunately, the recent surge in the development and delivery of vaccines to combat the major childhood killers--such as pneumococcal pneumonia and measles--has bypassed neglected diseases. Nevertheless, some vaccines for these neglected diseases are now entering the clinical pipeline. In this article I describe how some antipoverty vaccine development is proceeding and offer recommendations for stimulating further development such as through pooled funding for innovation, developing-country manufacturers, and public-private partnerships for product development.

  16. Interaction, at Ambient Temperature and 80 °C, between Minerals and Artificial Seawaters Resembling the Present Ocean Composition and that of 4.0 Billion Years Ago

    NASA Astrophysics Data System (ADS)

    Carneiro, Cristine E. A.; Stabile, Antonio C.; Gomes, Frederico P.; da Costa, Antonio C. S.; Zaia, Cássia T. B. V.; Zaia, Dimas A. M.

    2016-10-01

    Probably one of the most important roles played by minerals in the origin of life on Earth was to pre-concentrate biomolecules from the prebiotic seas. There are other ways to pre-concentrate biomolecules such as wetting/drying cycles and freezing/sublimation. However, adsorption is the most important. If the pre-concentration did not occur—because of degradation of the minerals—other roles played by them such as protection against degradation, formation of polymers, or even as primitive cell walls would be seriously compromised. We studied the interaction of two artificial seawaters with kaolinite, bentonite, montmorillonite, goethite, ferrihydrite and quartz. One seawater has a major cation and anion composition similar to that of the oceans of the Earth 4.0 billion years ago (ASW 4.0 Ga). In the other, the major cations and anions are an average of the compositions of the seawaters of today (ASWT). When ASWT, which is rich in Na+ and Cl-, interacted with bentonite and montmorillonite, structural collapse occurred on the 001 plane. However, ASW 4.0 Ga, which is rich in Mg2+ and SO4 2-, did not induce this behavior. When ASW 4.0 Ga was reacted with the minerals for 24 h at room temperature and 80 °C, the release of Si and Al to the fluid was below 1% of the amount in the minerals—meaning that dissolution of the minerals did not occur. In general, minerals adsorbed Mg2+ and K+ from the ASW 4.0 Ga and these cations could be used for the formation of polymers. Also, when the minerals were mixed with ASW 4.0 Ga at 80 °C and ASWT at room temperature or 80 °C, it caused the precipitation of CaSO4•2H2O and halite, respectively. Finally, further experiments (adsorption, formation of polymers, protection of molecules against degradation, primitive cell wall formation) performed under the conditions described in this paper will probably be more representative of what happened on the prebiotic Earth.

  17. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities range up to 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  18. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  19. Bernoulli, Darwin, and Sagan: the probability of life on other planets

    NASA Astrophysics Data System (ADS)

    Rossmo, D. Kim

    2017-04-01

    The recent discovery that billions of planets in the Milky Way Galaxy may be in circumstellar habitable zones has renewed speculation over the possibility of extraterrestrial life. The Drake equation is a probabilistic framework for estimating the number of technologically advanced civilizations in our Galaxy; however, many of the equation's component probabilities are either unknown or have large error intervals. In this paper, a different method of examining this question is explored, one that replaces the various Drake factors with the single estimate for the probability of life existing on Earth. This relationship can be described by the binomial distribution if the presence of life on a given number of planets is equated to successes in a Bernoulli trial. The question of exoplanet life may then be reformulated as follows: given the probability of one or more independent successes for a given number of trials, what is the probability of two or more successes? Some of the implications of this approach for finding life on exoplanets are discussed.
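
    The reformulation in the abstract comes down to elementary binomial arithmetic. The sketch below computes the conditional probability of at least two life-bearing planets given at least one, for a Binomial(n, p) model; the per-planet probabilities and planet count are purely illustrative, not estimates from the paper.

    ```python
    def prob_at_least_two_given_at_least_one(p, n):
        """P(X >= 2 | X >= 1) for X ~ Binomial(n, p)."""
        p_zero = (1.0 - p) ** n
        p_one = n * p * (1.0 - p) ** (n - 1)
        return (1.0 - p_zero - p_one) / (1.0 - p_zero)

    # Illustrative values only: p is the unknown per-planet probability of life arising,
    # n a rough count of habitable-zone planets in the Galaxy.
    n_planets = 40_000_000_000
    for p in (1e-11, 1e-10, 1e-9):
        print(f"p = {p:.0e}:  P(X>=2 | X>=1) = "
              f"{prob_at_least_two_given_at_least_one(p, n_planets):.3f}")
    ```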

  20. Segmentation and automated measurement of chronic wound images: probability map approach

    NASA Astrophysics Data System (ADS)

    Ahmad Fauzi, Mohammad Faizal; Khansa, Ibrahim; Catignani, Karen; Gordillo, Gayle; Sen, Chandan K.; Gurcan, Metin N.

    2014-03-01

    An estimated 6.5 million patients in the United States are affected by chronic wounds, with more than 25 billion US dollars and countless hours spent annually on all aspects of chronic wound care. There is a need to develop software tools that analyze wound images, characterize wound tissue composition, measure wound size, and monitor changes over time. This process, when done manually, is time-consuming and subject to intra- and inter-reader variability. In this paper, we propose a method that can characterize chronic wounds containing granulation, slough and eschar tissues. First, we generate a Red-Yellow-Black-White (RYKW) probability map, which then guides the region growing segmentation process. The red, yellow and black probability maps are designed to handle the granulation, slough and eschar tissues, respectively, found in wound tissues, while the white probability map is designed to detect the white label card used for measurement calibration. The innovative aspects of this work include: 1) definition of a wound-characteristics-specific probability map for segmentation; 2) computationally efficient region growing on a 4D map; and 3) auto-calibration of measurements using the content of the image. The method was applied to 30 wound images provided by the Ohio State University Wexner Medical Center, with the ground truth independently generated by the consensus of two clinicians. While the inter-reader agreement is 85.5%, the computer achieves an accuracy of 80%.
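
    A much-simplified sketch of the probability-map idea is given below: each pixel gets a normalized score for red, yellow, black and white based on its distance to a reference color. It omits the region growing and calibration steps, and the reference colors and bandwidth are assumed placeholders rather than the paper's values.

    ```python
    import numpy as np

    def rykw_probability_maps(image, sigma=40.0):
        """Toy per-pixel Red-Yellow-Black-White probability maps.

        image: (H, W, 3) RGB array with values in 0-255. Reference colors and the
        Gaussian bandwidth sigma are illustrative assumptions, not the paper's values.
        """
        refs = {
            "red":    np.array([200.0, 40.0, 40.0]),
            "yellow": np.array([220.0, 200.0, 60.0]),
            "black":  np.array([30.0, 30.0, 30.0]),
            "white":  np.array([240.0, 240.0, 240.0]),
        }
        dists = {k: np.linalg.norm(image - c, axis=-1) for k, c in refs.items()}
        scores = {k: np.exp(-(d ** 2) / (2.0 * sigma ** 2)) for k, d in dists.items()}
        total = sum(scores.values()) + 1e-12
        return {k: s / total for k, s in scores.items()}   # per-pixel maps summing to 1

    # Tiny synthetic example: one reddish and one yellowish pixel.
    img = np.array([[[190, 50, 45], [210, 190, 70]]], dtype=float)
    print({k: np.round(v, 2) for k, v in rykw_probability_maps(img).items()})
    ```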

  1. The probability distribution of intense daily precipitation

    NASA Astrophysics Data System (ADS)

    Cavanaugh, Nicholas R.; Gershunov, Alexander; Panorska, Anna K.; Kozubowski, Tomasz J.

    2015-03-01

    The probability tail structure of over 22,000 weather stations globally is examined in order to identify the physically and mathematically consistent distribution type for modeling the probability of intense daily precipitation and extremes. Results indicate that when aggregating data annually, most locations are to be considered heavy tailed with statistical significance. When aggregating data by season, it becomes evident that the thickness of the probability tail is related to the variability in precipitation causing events and thus that the fundamental cause of precipitation volatility is weather diversity. These results have both theoretical and practical implications for the modeling of high-frequency climate variability worldwide.

  2. Class probability estimation for medical studies.

    PubMed

    Simon, Richard

    2014-07-01

    I provide a commentary on two papers "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. Those papers provide an up-to-date review of some popular machine learning methods for class probability estimation and compare those methods to logistic regression modeling in real and simulated datasets.

  3. Objective and subjective probability in gene expression.

    PubMed

    Velasco, Joel D

    2012-09-01

    In this paper I address the question of whether the probabilities that appear in models of stochastic gene expression are objective or subjective. I argue that while our best models of the phenomena in question are stochastic models, this fact should not lead us to automatically assume that the processes are inherently stochastic. After distinguishing between models and reality, I give a brief introduction to the philosophical problem of the interpretation of probability statements. I argue that the objective vs. subjective distinction is a false dichotomy and is an unhelpful distinction in this case. Instead, the probabilities in our models of gene expression exhibit standard features of both objectivity and subjectivity.

  4. Characteristic length of the knotting probability revisited

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-09-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA.
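
    The characteristic length can be read off from simulated knotting frequencies by a linear fit of log probability against N, since p(N) ≈ C exp(-N/N_K) implies a slope of -1/N_K. The data points below are made-up placeholders just to show the fitting step, not simulation output from the paper.

    ```python
    import numpy as np

    # Made-up (N, knotting probability) pairs standing in for SAP simulation output.
    N = np.array([200.0, 400.0, 600.0, 800.0, 1000.0])
    p_knot = np.array([0.61, 0.37, 0.22, 0.135, 0.08])

    # If p(N) ~ C * exp(-N / N_K), then log p(N) is linear in N with slope -1/N_K.
    slope, intercept = np.polyfit(N, np.log(p_knot), 1)
    N_K = -1.0 / slope
    print(f"Estimated characteristic length N_K ≈ {N_K:.0f} segments")
    ```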

  5. Transition Probability and the ESR Experiment

    ERIC Educational Resources Information Center

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  6. Inclusion probability with dropout: an operational formula.

    PubMed

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
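
    For context, the classic probability of inclusion without dropout is the squared sum of the frequencies of the alleles visible at a locus, multiplied across loci to give the cumulative PI. The sketch below implements only that baseline, not the dropout-adjusted formula derived in the paper, and the allele frequencies are invented.

    ```python
    from functools import reduce

    def locus_pi(allele_freqs):
        """Classic one-locus probability of inclusion (RMNE), no dropout:
        the probability that a random person shows only alleles visible in the mixture."""
        return sum(allele_freqs) ** 2

    def cumulative_pi(loci):
        """Multiply one-locus PIs across independent loci."""
        return reduce(lambda acc, freqs: acc * locus_pi(freqs), loci, 1.0)

    # Invented population frequencies of the alleles observed at three loci.
    observed_loci = [
        [0.12, 0.08, 0.21],   # locus 1
        [0.05, 0.30],         # locus 2
        [0.18, 0.11, 0.07],   # locus 3
    ]
    print(f"Cumulative probability of inclusion: {cumulative_pi(observed_loci):.4g}")
    ```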

  7. The low synaptic release probability in vivo.

    PubMed

    Borst, J Gerard G

    2010-06-01

    The release probability, the average probability that an active zone of a presynaptic terminal releases one or more vesicles following an action potential, is tightly regulated. Measurements in cultured neurons or in slices indicate that this probability can vary greatly between synapses, but on average it is estimated to be as high as 0.5. In vivo, however, the size of synaptic potentials is relatively independent of recent history, suggesting that release probability is much lower. Possible causes for this discrepancy include maturational differences, a higher spontaneous activity, a lower extracellular calcium concentration and more prominent tonic inhibition by ambient neurotransmitters during in vivo recordings. Existing evidence thus suggests that under physiological conditions in vivo, presynaptic action potentials trigger the release of neurotransmitter much less frequently than what is observed in in vitro preparations.

  8. Classical and Quantum Spreading of Position Probability

    ERIC Educational Resources Information Center

    Farina, J. E. G.

    1977-01-01

    Demonstrates that the standard deviation of the position probability of a particle moving freely in one dimension is a function of the standard deviation of its velocity distribution and time in classical or quantum mechanics. (SL)

  9. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.

  10. Robust satisficing and the probability of survival

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2014-01-01

    Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.

  11. Probability, clinical decision making and hypothesis testing

    PubMed Central

    Banerjee, A.; Jadhav, S. L.; Bhawalkar, J. S.

    2009-01-01

    Few clinicians grasp the true concept of probability expressed in the ‘P value.’ For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing. PMID:21234167

  12. Grounding quantum probability in psychological mechanism.

    PubMed

    Love, Bradley C

    2013-06-01

    Pothos & Busemeyer (P&B) provide a compelling case that quantum probability (QP) theory is a better match to human judgment than is classical probability (CP) theory. However, any theory (QP, CP, or other) phrased solely at the computational level runs the risk of being underconstrained. One suggestion is to ground QP accounts in mechanism, to leverage a wide range of process-level data.

  13. A Manual for Encoding Probability Distributions.

    DTIC Science & Technology

    1978-09-01

    Some terms in the literature are used synonymously with encoding a probability distribution: assessment, and assignment (the latter used for single events). The material draws on encoding sessions conducted as parts of practical decision analyses as well as on experimental evidence in the literature. Probability encoding can be applied...

  14. Imprecise Probability Methods for Weapons UQ

    SciTech Connect

    Picard, Richard Roy; Vander Wiel, Scott Alan

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  15. Probability distribution of the vacuum energy density

    SciTech Connect

    Duplancic, Goran; Stefancic, Hrvoje; Glavan, Drazen

    2010-12-15

    As the vacuum state of a quantum field is not an eigenstate of the Hamiltonian density, the vacuum energy density can be represented as a random variable. We present an analytical calculation of the probability distribution of the vacuum energy density for real and complex massless scalar fields in Minkowski space. The obtained probability distributions are broad and the vacuum expectation value of the Hamiltonian density is not fully representative of the vacuum energy density.

  16. When probability trees don't work

    NASA Astrophysics Data System (ADS)

    Chan, K. C.; Lenard, C. T.; Mills, T. M.

    2016-08-01

    Tree diagrams arise naturally in courses on probability at high school or university, even at an elementary level. Often they are used to depict outcomes and associated probabilities from a sequence of games. A subtle issue is whether or not the Markov condition holds in the sequence of games. We present two examples that illustrate the importance of this issue. Suggestions as to how these examples may be used in a classroom are offered.

  17. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
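
    As a sketch of the likelihood structure described above (not the article's code), the function below evaluates a zero-inflated binomial mixture in which detection probability p varies across sites according to a Beta mixing distribution; the marginal over p is computed numerically and all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy import integrate, stats

    def site_likelihood(y, J, psi, a, b):
        """Zero-inflated binomial mixture likelihood contribution for one site.

        y: detections out of J visits; psi: occupancy probability;
        detection probability p ~ Beta(a, b) across occupied sites (assumed mixing class).
        """
        binom_marginal, _ = integrate.quad(
            lambda p: stats.binom.pmf(y, J, p) * stats.beta.pdf(p, a, b), 0.0, 1.0)
        return psi * binom_marginal + (1.0 - psi) * (y == 0)

    # Illustrative detection histories summarized as counts out of J = 5 visits per site.
    counts = [0, 0, 1, 3, 0, 2]
    loglik = sum(np.log(site_likelihood(y, J=5, psi=0.6, a=2.0, b=3.0)) for y in counts)
    print(f"log-likelihood at psi = 0.6 with Beta(2, 3) detection: {loglik:.3f}")
    ```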

  18. Constraint on a varying proton-electron mass ratio 1.5 billion years after the big bang.

    PubMed

    Bagdonaite, J; Ubachs, W; Murphy, M T; Whitmore, J B

    2015-02-20

    A molecular hydrogen absorber at a lookback time of 12.4 billion years, corresponding to 10% of the age of the Universe today, is analyzed to put a constraint on a varying proton-electron mass ratio, μ. A high resolution spectrum of the J1443+2724 quasar, which was observed with the Very Large Telescope, is used to create an accurate model of 89 Lyman and Werner band transitions whose relative frequencies are sensitive to μ, yielding a limit on the relative deviation from the current laboratory value of Δμ/μ = (-9.5 ± 5.4(stat) ± 5.3(syst)) × 10⁻⁶.

  19. Taking out one billion tons of carbon: the magic of China's 11th Five-Year Plan

    SciTech Connect

    Lin, Jiang; Zhou, Nan; Levine, Mark D.; Fridley, David

    2007-05-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20 percent from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20 percent energy intensity target also translates into an annual reduction of over one billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20 percent target using a detailed end-use energy model.

  20. Vaccine Assistance To Low- And Middle-Income Countries Increased To $3.6 Billion In 2014.

    PubMed

    Haakenstad, Annie; Birger, Maxwell; Singh, Lavanya; Liu, Patrick; Lim, Stephen; Ng, Marie; Dieleman, Joseph L

    2016-02-01

    In the 2012 Global Vaccine Action Plan, development assistance partners committed to providing sustainable financing for vaccines and expanding vaccination coverage to all children in low- and middle-income countries by 2020. To assess progress toward these goals, the Institute for Health Metrics and Evaluation produced estimates of development assistance for vaccinations. These estimates reveal major increases in the assistance provided since 2000. In 2014, $3.6 billion in development assistance for vaccinations was provided for low- and middle-income countries, up from $822 million in 2000. The funding increase was driven predominantly by the establishment of Gavi, the Vaccine Alliance, supported by the Bill & Melinda Gates Foundation and the governments of the United States and United Kingdom. Despite stagnation in total development assistance for health from donors from 2010 onward, development assistance for vaccination has continued to grow.

  1. United States menhaden oil could save billions in U.S. health care costs and improve IQ in children.

    PubMed

    Bibus, Douglas M

    2016-02-01

    The United States menhaden oil annual production is sufficient to supply all of the recommended long-chain omega-3s for Americans over 55 with coronary heart disease (CHD) and pregnant and lactating women. According to a recent study, the utilization of preventive intake levels could potentially save up to $1.7 billion annually in hospital costs alone. In addition, the remaining oil could be used to support a culture of enough Atlantic salmon to provide every pregnant and lactating woman in the U.S. with 8-12 ounces of fish per week, as recommended by the Food and Drug Administration (FDA), throughout the duration of pregnancy and lactation. Based on the FDA's quantitative assessment, this may result in a net increase of IQ by 5.5 points in children and improve their early age verbal development.

  2. The Archean Dongwanzi ophiolite complex, North China craton: 2.505-billion-year-old oceanic crust and mantle.

    PubMed

    Kusky, T M; Li, J H; Tucker, R D

    2001-05-11

    We report a thick, laterally extensive 2505 ± 2.2-million-year-old (uranium-lead ratio in zircon) Archean ophiolite complex in the North China craton. Basal harzburgite tectonite is overlain by cumulate ultramafic rocks, a mafic-ultramafic transition zone of interlayered gabbro and ultramafic cumulates, compositionally layered olivine-gabbro and pyroxenite, and isotropic gabbro. A sheeted dike complex is rooted in the gabbro and overlain by a mixed dike-pillow lava section, chert, and banded iron formation. The documentation of a complete Archean ophiolite implies that mechanisms of oceanic crustal accretion similar to those of today were in operation by 2.5 billion years ago at divergent plate margins and that the temperature of the early mantle was not extremely elevated, as compared to the present-day temperature. Plate tectonic processes similar to those of the present must also have emplaced the ophiolite in a convergent margin setting.

  3. 1.8 Billion Years of Detrital Zircon Recycling Calibrates a Refractory Part of Earth’s Sedimentary Cycle

    PubMed Central

    Hadlari, Thomas; Swindles, Graeme T.; Galloway, Jennifer M.; Bell, Kimberley M.; Sulphur, Kyle C.; Heaman, Larry M.; Beranek, Luke P.; Fallas, Karen M.

    2015-01-01

    Detrital zircon studies are providing new insights on the evolution of sedimentary basins but the role of sedimentary recycling remains largely undefined. In a broad region of northwestern North America, this contribution traces the pathway of detrital zircon sand grains from Proterozoic sandstones through Phanerozoic strata and argues for multi-stage sedimentary recycling over more than a billion years. As a test of our hypothesis, integrated palynology and detrital zircon provenance provides clear evidence for erosion of Carboniferous strata in the northern Cordillera as a sediment source for Upper Cretaceous strata. Our results help to calibrate Earth's sedimentary cycle by showing that recycling dominates sedimentary provenance for the refractory mineral zircon. PMID:26658165

  4. Taking out 1 billion tons of CO2: The magic of China's 11th Five-Year Plan?

    SciTech Connect

    Zhou, Nan; Lin, Jiang; Zhou, Nan; Levine, Mark; Fridley, David

    2007-07-01

    China's 11th Five-Year Plan (FYP) sets an ambitious target for energy-efficiency improvement: energy intensity of the country's gross domestic product (GDP) should be reduced by 20% from 2005 to 2010 (NDRC, 2006). This is the first time that a quantitative and binding target has been set for energy efficiency, and signals a major shift in China's strategic thinking about its long-term economic and energy development. The 20% energy intensity target also translates into an annual reduction of over 1.5 billion tons of CO2 by 2010, making the Chinese effort one of the most significant carbon mitigation efforts in the world today. While it is still too early to tell whether China will achieve this target, this paper attempts to understand the trend in energy intensity in China and to explore a variety of options toward meeting the 20% target using a detailed end-use energy model.

  5. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  6. Beaufortian stratigraphic plays in the National Petroleum Reserve - Alaska (NPRA)

    USGS Publications Warehouse

    Houseknecht, David W.

    2003-01-01

    The Beaufortian megasequence in the National Petroleum Reserve in Alaska (NPRA) includes Jurassic through lower Cretaceous (Neocomian) strata of the Kingak Shale and the overlying pebble shale unit. These strata are part of a composite total petroleum system involving hydrocarbons expelled from source rocks in three stratigraphic intervals, the Lower Jurassic part of the Kingak Shale, the Triassic Shublik Formation, and the Lower Cretaceous gamma-ray zone (GRZ) and associated strata. The potential for undiscovered oil and gas resources in the Beaufortian megasequence in NPRA was assessed by defining eight plays (assessment units), two in lower Cretaceous (Neocomian) topset seismic facies, four in Upper Jurassic topset seismic facies, one in Lower Jurassic topset seismic facies, and one in Jurassic through lower Cretaceous (Neocomian) clinoform seismic facies. The Beaufortian Cretaceous Topset North Play is estimated to contain between 0 (95-percent probability) and 239 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 103 million barrels. The Beaufortian Cretaceous Topset North Play is estimated to contain between 0 (95-percent probability) and 1,162 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 405 billion cubic feet. The Beaufortian Cretaceous Topset South Play is estimated to contain between 635 (95-percent probability) and 4,004 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 2,130 billion cubic feet. No technically recoverable oil is assessed in the Beaufortian Cretaceous Topset South Play, as it lies at depths that are entirely in the gas window. The Beaufortian Upper Jurassic Topset Northeast Play is estimated to contain between 2,744 (95-percent probability) and 8,086 (5-percent probability) million barrels of technically recoverable oil

  7. Tsunami probability in the Caribbean Region

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2008-01-01

    We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30% regionally. © Birkhäuser 2008.
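
    The Poissonian model referred to above converts a mean runup rate for a coastal cell into the probability of at least one exceedance during an exposure window, P = 1 - exp(-λT). The rate used in the sketch below is an arbitrary illustration, not a value from the study.

    ```python
    import math

    def poisson_exceedance_probability(rate_per_year, exposure_years):
        """P(at least one tsunami runup >= 0.5 m) under a Poisson occurrence model."""
        return 1.0 - math.exp(-rate_per_year * exposure_years)

    # Illustrative rate: one exceedance per 250 years in a coastal cell, 30-year window.
    print(poisson_exceedance_probability(rate_per_year=1.0 / 250.0, exposure_years=30.0))
    ```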

  8. Probability detection mechanisms and motor learning.

    PubMed

    Lungu, O V; Wächter, T; Liu, T; Willingham, D T; Ashe, J

    2004-11-01

    The automatic detection of patterns or regularities in the environment is central to certain forms of motor learning, which are largely procedural and implicit. The rules underlying the detection and use of probabilistic information in the perceptual-motor domain are largely unknown. We conducted two experiments involving a motor learning task with direct and crossed mapping of motor responses in which probabilities were present at the stimulus set level, the response set level, and at the level of stimulus-response (S-R) mapping. We manipulated only one level at a time, while controlling for the other two. The results show that probabilities were detected only when present at the S-R mapping and motor levels, but not at the perceptual one (experiment 1), unless the perceptual features have a dimensional overlap with the S-R mapping rule (experiment 2). The effects of probability detection were mostly facilitatory at the S-R mapping, both facilitatory and inhibitory at the perceptual level, and predominantly inhibitory at the response-set level. The facilitatory effects were based on learning the absolute frequencies first and transitional probabilities later (for the S-R mapping rule) or both types of information at the same time (for perceptual level), whereas the inhibitory effects were based on learning first the transitional probabilities. Our data suggest that both absolute frequencies and transitional probabilities are used in motor learning, but in different temporal orders, according to the probabilistic properties of the environment. The results support the idea that separate neural circuits may be involved in detecting absolute frequencies as compared to transitional probabilities.

  9. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors, in the case of DNA where N is 4 and the components of the probability vector are the frequency of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0)=a and p(1)=b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method of iterating Newton's method on solutions of a two point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non

  10. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
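
    A generic version of the idea, under the assumption that the failure set is enclosed in a bounding set whose probability can be computed analytically, is to sample only within the bounding set and rescale: P(fail) = P(bound) x P(fail | bound). The toy two-parameter failure region below is illustrative, not the authors' formulation.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Toy failure set: both standard-normal parameters exceed 3 and their sum exceeds 6.8.
    def fails(x):
        return (x[:, 0] > 3.0) & (x[:, 1] > 3.0) & (x[:, 0] + x[:, 1] > 6.8)

    # The failure set lies inside the box [3, inf) x [3, inf), whose probability
    # under independent standard normals is known analytically.
    p_box = (1.0 - stats.norm.cdf(3.0)) ** 2

    # Sample directly from the conditional distribution inside the box
    # (inverse-CDF sampling restricted to [3, inf) in each coordinate).
    n = 200_000
    u = rng.uniform(stats.norm.cdf(3.0), 1.0, size=(n, 2))
    x = stats.norm.ppf(u)

    p_fail = p_box * np.mean(fails(x))
    print(f"estimated failure probability: {p_fail:.3e}")
    ```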

  11. Causal inference, probability theory, and graphical insights.

    PubMed

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.

  12. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
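
    The count-to-probability conversion described above can be sketched as a Weibull law applied to the number of small events since the last large one; the shape and scale values below are arbitrary illustrations, not fitted values from the study.

    ```python
    import math

    def large_event_probability(small_event_count, scale=300.0, shape=1.4):
        """Weibull-type conversion of a small-event count into a conditional probability
        that the next large event is due. Parameters are illustrative placeholders."""
        return 1.0 - math.exp(-((small_event_count / scale) ** shape))

    for n_small in (50, 150, 300, 600):
        print(n_small, round(large_event_probability(n_small), 3))
    ```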

  13. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.

  14. Brookian stratigraphic plays in the National Petroleum Reserve - Alaska (NPRA)

    USGS Publications Warehouse

    Houseknecht, David W.

    2003-01-01

    The Brookian megasequence in the National Petroleum Reserve in Alaska (NPRA) includes bottomset and clinoform seismic facies of the Torok Formation (mostly Albian age) and generally coeval, topset seismic facies of the uppermost Torok Formation and the Nanushuk Group. These strata are part of a composite total petroleum system involving hydrocarbons expelled from three stratigraphic intervals of source rocks, the Lower Cretaceous gamma-ray zone (GRZ), the Lower Jurassic Kingak Shale, and the Triassic Shublik Formation. The potential for undiscovered oil and gas resources in the Brookian megasequence in NPRA was assessed by defining five plays (assessment units), one in the topset seismic facies and four in the bottomset-clinoform seismic facies. The Brookian Topset Play is estimated to contain between 60 (95-percent probability) and 465 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 239 million barrels. The Brookian Topset Play is estimated to contain between 0 (95-percent probability) and 679 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 192 billion cubic feet. The Brookian Clinoform North Play, which extends across northern NPRA, is estimated to contain between 538 (95-percent probability) and 2,257 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 1,306 million barrels. The Brookian Clinoform North Play is estimated to contain between 0 (95-percent probability) and 1,969 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 674 billion cubic feet. The Brookian Clinoform Central Play, which extends across central NPRA, is estimated to contain between 299 (95-percent probability) and 1,849 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 973

  15. CETA's $11 Billion

    ERIC Educational Resources Information Center

    Hersher, Judy

    1978-01-01

    The Comprehensive Employment and Training Act (CETA) is now before Congress for review and reenactment. This article examines previous CETA program efforts and the new provisions intended to target jobs and training to the most disadvantaged in terms of income and length of unemployment. (Author/AM)

  16. Developing a Billion Leaders

    ERIC Educational Resources Information Center

    Gergen, Christopher; Rego, Lyndon; Wright, Joel

    2014-01-01

    Intentionally developing the leadership capacity of all students is a necessary requirement for schools around the world. The Center for Creative Leadership in Greensboro, N.C., has been at the center of this work and presents three schools as examples: Ravenscroft School in Raleigh, N.C., the African Leadership Academy in Johannesburg, South…

  17. Probability, arrow of time and decoherence

    NASA Astrophysics Data System (ADS)

    Bacciagaluppi, Guido

    This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.

  18. Pointwise probability reinforcements for robust statistical inference.

    PubMed

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.

  19. Match probabilities in racially admixed populations.

    PubMed Central

    Lange, K

    1993-01-01

    The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors. PMID:8430693
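
    The product rule mentioned above is easy to illustrate in code. The sketch below computes a multi-locus match probability under Hardy-Weinberg and linkage equilibrium; the allele frequencies and loci are hypothetical placeholders, not values from the paper, and the paper's ancestral-population bounds are not reproduced here.

    ```python
    # Minimal sketch of the product rule for multi-locus match probabilities,
    # assuming Hardy-Weinberg and linkage equilibrium. Frequencies are hypothetical.

    def genotype_frequency(p, q=None):
        """HWE genotype frequency: p^2 for a homozygote, 2pq for a heterozygote."""
        return p * p if q is None else 2.0 * p * q

    # Hypothetical profile: (locus name, allele frequencies of the observed genotype)
    profile = [
        ("locus_1", (0.12,)),        # homozygote with allele frequency 0.12
        ("locus_2", (0.08, 0.21)),   # heterozygote with allele frequencies 0.08, 0.21
        ("locus_3", (0.05, 0.30)),
    ]

    match_probability = 1.0
    for locus, freqs in profile:
        match_probability *= genotype_frequency(*freqs)

    print(f"product-rule match probability across 3 loci: {match_probability:.3e}")
    ```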

  20. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z²: x + y even, 0 ≤ y ≤ n, −y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and the site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β^g. An analysis based on the Padé approximants shows β^l = 2β^g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.
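
    A Monte Carlo estimate of such a local connection probability is straightforward to sketch. The example below assumes directed *bond* percolation with bond probability p on the rotated square lattice; the paper's precise model (site vs. bond) and estimator may differ, so treat this only as an illustration of the quantity P_n^l.

    ```python
    # Monte Carlo estimate of P_n^l = P(origin connects to the site (0, n)),
    # assuming directed bond percolation with bond probability p.
    import random

    def local_percolation_probability(p, n, trials=20000, seed=0):
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            reachable = {0}                      # x-coordinates reachable at level y
            for _y in range(n):
                nxt = set()
                for x in reachable:
                    if rng.random() < p:         # bond to (x - 1, y + 1)
                        nxt.add(x - 1)
                    if rng.random() < p:         # bond to (x + 1, y + 1)
                        nxt.add(x + 1)
                reachable = nxt
                if not reachable:
                    break
            if 0 in reachable:                   # site (0, n) exists only for even n
                hits += 1
        return hits / trials

    print(local_percolation_probability(p=0.7, n=10))
    ```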

  1. Derivation of the collision probability between orbiting objects The lifetimes of Jupiter's outer moons

    NASA Technical Reports Server (NTRS)

    Kessler, D. J.

    1981-01-01

    A general form is derived for Opik's equations relating the probability of collision between two orbiting objects to their orbital elements, and is used to determine the collisional lifetime of the eight outer moons of Jupiter. The derivation is based on a concept of spatial density, or average number of objects found in a unit volume, and results in a set of equations that are easily applied to a variety of orbital collision problems. When applied to the outer satellites, which are all in irregular orbits, the equations predict a relatively long collisional lifetime for the four retrograde moons (about 270 billion years on the average) and a shorter time for the four posigrade moons (0.9 billion years). This short time is suggestive of a past collision history, and may account for the orbiting dust detected by Pioneers 10 and 11.
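
    The spatial-density idea behind the derivation reduces, in its simplest form, to a rate equation: mean collision rate = spatial density × cross-section × relative velocity, and collisional lifetime = 1/rate. The numbers below are hypothetical and merely chosen to give a lifetime of order a billion years; they are not the paper's values.

    ```python
    # Toy illustration of the spatial-density approach: rate = S * sigma * v_rel,
    # lifetime = 1 / rate. All numbers are hypothetical.
    import math

    S = 1.6e-21        # spatial density of target objects, objects per km^3 (hypothetical)
    radius_sum = 50.0  # sum of the two radii, km (sets the collision cross-section)
    v_rel = 2.5        # mean relative velocity, km/s (hypothetical)

    sigma = math.pi * radius_sum ** 2                 # collision cross-section, km^2
    rate = S * sigma * v_rel                          # collisions per second
    lifetime_yr = (1.0 / rate) / (365.25 * 24 * 3600)
    print(f"collisional lifetime ~ {lifetime_yr:.2e} years")   # ~1e9 yr with these inputs
    ```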

  2. Exact probability distribution functions for Parrondo's games

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
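
    The exact Fourier-transform treatment in the paper is not reproduced here; as a point of comparison, the sketch below is a plain Monte Carlo simulation of the capital-dependent Parrondo games, using the textbook parameters (ε = 0.005 and the modulo-3 rule for game B) and random mixing of the two games. These parameters are an assumption for illustration, not necessarily the paper's setup.

    ```python
    # Monte Carlo sketch of the capital-dependent Parrondo games with random mixing.
    import random
    from collections import Counter

    EPS = 0.005

    def play_A(rng):
        return 1 if rng.random() < 0.5 - EPS else -1

    def play_B(rng, capital):
        p = (0.10 - EPS) if capital % 3 == 0 else (0.75 - EPS)
        return 1 if rng.random() < p else -1

    def capital_distribution(rounds=100, trials=50000, seed=1):
        """Play A or B with probability 1/2 each round and record the final capital."""
        rng = random.Random(seed)
        counts = Counter()
        for _ in range(trials):
            capital = 0
            for _ in range(rounds):
                capital += play_A(rng) if rng.random() < 0.5 else play_B(rng, capital)
            counts[capital] += 1
        return counts

    dist = capital_distribution()
    mean = sum(c * n for c, n in dist.items()) / sum(dist.values())
    print(f"mean capital after 100 randomly mixed rounds: {mean:+.2f}")  # typically > 0
    ```

    Replacing the random mixing with pure game A or pure game B in the same simulation shows each losing on its own, which is the paradox the exact distributions quantify.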

  3. Intrinsic Probability of a Multifractal Set

    NASA Astrophysics Data System (ADS)

    Hosokawa, Iwao

    1991-12-01

    It is shown that a self-similar measure isotropically distributed in a d-dimensional set should have its own intermittency exponents equivalent to its own generalized dimensions (in the sense of Hentschel and Procaccia), and that the intermittency exponents are completely designated by an intrinsic probability which governs the spatial distribution of the measure. Based on this, it is proven that the intrinsic probability uniquely determines the spatial distribution of the scaling index α of the measure as well as the so-called f-α spectrum of the multifractal set.

  4. Atomic transition probabilities of Nd I

    NASA Astrophysics Data System (ADS)

    Stockett, M. H.; Wood, M. P.; Den Hartog, E. A.; Lawler, J. E.

    2011-12-01

    Fourier transform spectra are used to determine emission branching fractions for 236 lines of the first spectrum of neodymium (Nd i). These branching fractions are converted to absolute atomic transition probabilities using radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 225001). The wavelength range of the data set is from 390 to 950 nm. These transition probabilities from emission and laser measurements are compared to relative absorption measurements in order to assess the importance of unobserved infrared branches from selected upper levels.
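
    The conversion step described above is the standard one: the absolute transition probability of a line equals its emission branching fraction divided by the radiative lifetime of the upper level, A_ul = BF_ul / τ_u. The level lifetime and line list below are hypothetical, not values from the Nd I data set.

    ```python
    # Sketch of converting branching fractions to transition probabilities,
    # A_ul = BF_ul / tau_u. All numbers are hypothetical.

    tau_upper = 250e-9  # radiative lifetime of the upper level, seconds (hypothetical)

    # Emission branching fractions of the lines depopulating this level (sum to 1).
    branching_fractions = {
        "564.1 nm": 0.45,
        "612.7 nm": 0.35,
        "488.9 nm": 0.20,
    }

    for line, bf in branching_fractions.items():
        A = bf / tau_upper  # transition probability in s^-1
        print(f"{line}: A = {A:.2e} s^-1")
    ```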

  5. Probabilities for separating sets of order statistics.

    PubMed

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with their own, possibly different distribution functions. The probability that these order statistics fall in disjoint, ordered intervals and that, of the smallest statistics, a certain number come from the first population is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
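
    The paper's closed-form result is not reproduced here, but the target quantity for the Benjamini-Hochberg application can be estimated by simulation, which is a useful cross-check. The sketch below assumes a simple normal-shift model for the non-null p-values and q = 0.05; both are assumptions for illustration only.

    ```python
    # Monte Carlo sketch of the joint distribution of (rejections R, false rejections V)
    # under the Benjamini-Hochberg procedure. Settings are illustrative assumptions.
    import math, random
    from collections import Counter

    def phi(x):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def bh_reject(pvals, q):
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])
        k = 0
        for rank, i in enumerate(order, start=1):
            if pvals[i] <= rank * q / m:
                k = rank                       # largest rank satisfying the BH condition
        return set(order[:k])                  # indices of rejected hypotheses

    def joint_RV(m0=40, m1=10, shift=2.5, q=0.05, trials=20000, seed=2):
        rng = random.Random(seed)
        counts = Counter()
        for _ in range(trials):
            p_null = [rng.random() for _ in range(m0)]                  # true nulls
            p_alt = [1.0 - phi(rng.gauss(shift, 1.0)) for _ in range(m1)]
            rejected = bh_reject(p_null + p_alt, q)
            V = sum(1 for i in rejected if i < m0)                      # false rejections
            counts[(len(rejected), V)] += 1
        return {rv: n / trials for rv, n in counts.items()}

    probs = joint_RV()
    print(sorted(probs.items())[:5])
    ```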

  6. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  7. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property, known for the two-qubit state in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single-qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and in the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.

  8. Determining system maintainability as a probability

    SciTech Connect

    Wright, R.E.; Atwood, C.L.

    1988-01-01

    Maintainability has often been defined in principle as the probability that a system or component can be repaired in a specific time given that it is in a failed state, but presented in practice in terms of mean-time-to-repair. In this paper, formulas are developed for maintainability as a probability, analogous to those for reliability and availability. This formulation is expressed in terms of cut sets, and leads to a natural definition of unmaintainability importance for cut sets and basic events. 6 refs.
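
    The distinction drawn above is easy to make concrete. If repair times are exponentially distributed (an assumption for illustration, not the paper's cut-set formulation), maintainability as a probability is M(t) = 1 − exp(−t/MTTR), so quoting only the MTTR hides the fact that only about 63 percent of repairs finish within one MTTR.

    ```python
    # Maintainability as a probability, assuming exponential repair times.
    import math

    def maintainability(t_hours, mttr_hours):
        """P(repair completed by time t | the item is in a failed state)."""
        return 1.0 - math.exp(-t_hours / mttr_hours)

    mttr = 8.0  # hypothetical mean time to repair, hours
    for t in (4.0, 8.0, 24.0):
        print(f"P(repaired within {t:>4.1f} h) = {maintainability(t, mttr):.3f}")
    ```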

  9. Probability in biology: overview of a comprehensive theory of probability in living systems.

    PubMed

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems.

  10. Composition and syngeneity of molecular fossils from the 2.78 to 2.45 billion-year-old Mount Bruce Supergroup, Pilbara Craton, Western Australia

    NASA Astrophysics Data System (ADS)

    Brocks, Jochen J.; Buick, Roger; Logan, Graham A.; Summons, Roger E.

    2003-11-01

    Shales of very low metamorphic grade from the 2.78 to 2.45 billion-year-old (Ga) Mount Bruce Supergroup, Pilbara Craton, Western Australia, were analyzed for solvent extractable hydrocarbons. Samples were collected from ten drill cores and two mines in a sampling area centered in the Hamersley Basin near Wittenoom and ranging 200 km to the southeast, 100 km to the southwest and 70 km to the northwest. Almost all analyzed kerogenous sedimentary rocks yielded solvent extractable organic matter. Concentrations of total saturated hydrocarbons were commonly in the range of 1 to 20 ppm (μg/g rock) but reached maximum values of 1000 ppm. The abundance of aromatic hydrocarbons was ˜1 to 30 ppm. Analysis of the extracts by gas chromatography-mass spectrometry (GC-MS) and GC-MS metastable reaction monitoring (MRM) revealed the presence of n-alkanes, mid- and end-branched monomethylalkanes, ω-cyclohexylalkanes, acyclic isoprenoids, diamondoids, tri- to pentacyclic terpanes, steranes, aromatic steroids and polyaromatic hydrocarbons. Neither plant biomarkers nor hydrocarbon distributions indicative of Phanerozoic contamination were detected. The host kerogens of the hydrocarbons were depleted in 13C by 2 to 21‰ relative to n-alkanes, a pattern typical of, although more extreme than, other Precambrian samples. Acyclic isoprenoids showed carbon isotopic depletion relative to n-alkanes and concentrations of 2α-methylhopanes were relatively high, features rarely observed in the Phanerozoic but characteristic of many other Precambrian bitumens. Molecular parameters, including sterane and hopane ratios at their apparent thermal maxima, condensate-like alkane profiles, high mono- and triaromatic steroid maturity parameters, high methyladamantane and methyldiamantane indices and high methylphenanthrene maturity ratios, indicate thermal maturities in the wet-gas generation zone. Additionally, extracts from shales associated with iron ore deposits at Tom Price and Newman have

  11. Probability learning and Piagetian probability conceptions in children 5 to 12 years old.

    PubMed

    Kreitler, S; Zigler, E; Kreitler, H

    1989-11-01

    This study focused on the relations between performance on a three-choice probability-learning task and conceptions of probability as outlined by Piaget concerning mixture, normal distribution, random selection, odds estimation, and permutations. The probability-learning task and four Piagetian tasks were administered randomly to 100 male and 100 female, middle SES, average IQ children in three age groups (5 to 6, 8 to 9, and 11 to 12 years old) from different schools. Half the children were from Middle Eastern backgrounds, and half were from European or American backgrounds. As predicted, developmental level of probability thinking was related to performance on the probability-learning task. The more advanced the child's probability thinking, the higher his or her level of maximization and hypothesis formulation and testing and the lower his or her level of systematically patterned responses. The results suggest that the probability-learning and Piagetian tasks assess similar cognitive skills and that performance on the probability-learning task reflects a variety of probability concepts.

  12. A One Billion Year Martian Climate Model: The Importance of Seasonally Resolved Polar Caps and the Role of Wind

    NASA Technical Reports Server (NTRS)

    Armstrong, J. C.; Leovy, C. B.; Quinn, T. R.; Haberle, R. M.; Schaeffer, J.

    2003-01-01

    Wind deflation and deposition are powerful agents of surface change in the present Mars climate regime. Recent studies indicate that, while the distribution of regions of potential deflation (or erosion) and deposition is remarkably insensitive to changes in orbital parameters (obliquity, timing of perihelion passage, etc.), rates of aeolian surface modification may be highly sensitive to these parameters even if the atmospheric mass remains constant. But previous work suggested the atmospheric mass is likely to be sensitive to obliquity, especially if a significant mass of carbon dioxide can be stored in the regolith or deposited in the form of massive polar caps. Deflation and erosion are highly sensitive to surface pressure, so feedback between orbit variations and surface pressure can greatly enhance the sensitivity of aeolian modification rates to orbital parameters. We used statistics derived from a 1 Gyr orbital integration of the spin axis of Mars, coupled with 3D general circulation models (GCMs) at a variety of orbital conditions and pressures, to explore this feedback. We also employed a seasonally resolved 1D energy balance model to illuminate the gross characteristics of the long-term atmospheric evolution, wind erosion and deposition over one billion years. We find that seasonal polar cycles have a critical influence on the ability of the regolith to release CO2 at high obliquities, and find that the atmospheric CO2 actually decreases at high obliquities due to the cooling effect of polar deposits at latitudes where seasonal caps form. At low obliquity, the formation of massive, permanent polar caps depends critically on the values of the frost albedo, A(sub frost), and frost emissivity, E(sub frost). Using our 1D model with values of A(sub frost) = 0.67 and E(sub frost) = 0.55, matched to the NASA Ames GCM results, we find that permanent caps only form at low obliquities (< 10 degrees). Thus, contrary to expectations, the Martian atmospheric pressure
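
    The role of the frost albedo and emissivity can be seen in a back-of-the-envelope polar energy balance, (1 − A)·F = E·σ·T⁴. The sketch below uses the A(sub frost) = 0.67 and E(sub frost) = 0.55 quoted in the abstract, but the insolation values and the roughly 148 K CO2 frost point are rough outside assumptions, not output of the paper's 1D model.

    ```python
    # Back-of-the-envelope polar frost energy balance, using the abstract's
    # albedo/emissivity; insolation values and the CO2 frost point are assumptions.
    SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
    A_FROST, E_FROST = 0.67, 0.55
    T_FROST_CO2 = 148.0   # approximate CO2 condensation temperature at a few mbar, K

    def equilibrium_temperature(insolation_wm2):
        return ((1.0 - A_FROST) * insolation_wm2 / (E_FROST * SIGMA)) ** 0.25

    for F in (30.0, 50.0, 100.0, 150.0):   # assumed mean polar insolation values, W/m^2
        T = equilibrium_temperature(F)
        state = "CO2 condenses" if T < T_FROST_CO2 else "no CO2 frost"
        print(f"F = {F:5.1f} W/m^2 -> T_eq = {T:5.1f} K ({state})")
    ```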

  13. Investigating Probability with the NBA Draft Lottery.

    ERIC Educational Resources Information Center

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…
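
    A short simulation captures the kind of weighted lottery the lesson explores: the worst teams hold more of 1000 "combinations", the top three picks are drawn without replacement, and the remaining teams pick in inverse order of record. The weights below are illustrative only, not the exact values the NBA used in any particular season.

    ```python
    # Quick simulation of a weighted draft lottery (illustrative weights).
    import random
    from collections import Counter

    weights = [250, 199, 156, 119, 88, 63, 43, 28, 17, 11, 8, 7, 6, 5]  # team 0 = worst
    teams = list(range(len(weights)))

    def draw_top_three(rng):
        pool, w = teams[:], weights[:]
        picks = []
        for _ in range(3):
            i = rng.choices(range(len(pool)), weights=w)[0]  # weighted draw
            picks.append(pool.pop(i))
            w.pop(i)                                         # without replacement
        return picks

    def first_pick_probabilities(trials=200000, seed=3):
        rng = random.Random(seed)
        counts = Counter(draw_top_three(rng)[0] for _ in range(trials))
        return {team: counts[team] / trials for team in teams}

    probs = first_pick_probabilities()
    print(f"estimated P(worst team gets #1 pick) = {probs[0]:.3f}")  # ~0.25 with these weights
    ```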

  14. Confusion between Odds and Probability, a Pandemic?

    ERIC Educational Resources Information Center

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…
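
    The distinction at issue reduces to two one-line conversions, shown below for concreteness: odds = p/(1 − p) and p = odds/(1 + odds).

    ```python
    # Converting between probability and odds.
    def probability_to_odds(p):
        return p / (1.0 - p)

    def odds_to_probability(odds):
        return odds / (1.0 + odds)

    print(probability_to_odds(0.75))   # probability 0.75 -> odds of 3 (i.e. 3:1)
    print(odds_to_probability(3.0))    # odds of 3:1      -> probability 0.75
    ```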

  15. Probability distribution functions of the Grincevicjus series

    NASA Astrophysics Data System (ADS)

    Kapica, Rafal; Morawiec, Janusz

    2008-06-01

    Given a sequence (ξ_n, η_n) of independent identically distributed vectors of random variables, we consider the Grincevicjus series and a functional-integral equation connected with it. We prove that the equation characterizes all probability distribution functions of the Grincevicjus series. Moreover, some application of this characterization to a continuous refinement equation is presented.

  16. Time Required to Compute A Posteriori Probabilities,

    DTIC Science & Technology

    The paper discusses the time required to compute a posteriori probabilities using Bayes' Theorem. In a two-hypothesis example it is shown that, to... Bayes' Theorem as the group operation. Winograd's results concerning the lower bound on the time required to perform a group operation on a finite group using logical circuitry are therefore applicable. (Author)
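
    For readers unfamiliar with the computation being timed, a two-hypothesis a posteriori probability is just a direct application of Bayes' theorem; the data and likelihoods in the sketch below are hypothetical, and the circuit-complexity analysis of the paper is of course not reproduced.

    ```python
    # Two-hypothesis posterior computation via Bayes' theorem (hypothetical numbers).
    def posterior(prior_h1, likelihood_h1, likelihood_h2):
        """P(H1 | data) for two exhaustive, mutually exclusive hypotheses."""
        prior_h2 = 1.0 - prior_h1
        evidence = prior_h1 * likelihood_h1 + prior_h2 * likelihood_h2
        return prior_h1 * likelihood_h1 / evidence

    # Sequential updating: the posterior after one observation becomes the prior
    # for the next.
    p_h1 = 0.5
    for lik_h1, lik_h2 in [(0.8, 0.3), (0.6, 0.4), (0.9, 0.2)]:
        p_h1 = posterior(p_h1, lik_h1, lik_h2)
    print(f"P(H1 | all observations) = {p_h1:.3f}")
    ```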

  17. Interstitial lung disease probably caused by imipramine.

    PubMed

    Deshpande, Prasanna R; Ravi, Ranjani; Gouda, Sinddalingana; Stanley, Weena; Hande, Manjunath H

    2014-01-01

    Drugs are rarely associated with causing interstitial lung disease (ILD). We report a case of a 75-year-old woman who developed ILD after exposure to imipramine. To our knowledge, this is one of the rare cases of ILD probably caused by imipramine. There is a need to report such rare adverse effects related to ILD and drugs for better management of ILD.

  18. The Smart Potential behind Probability Matching

    ERIC Educational Resources Information Center

    Gaissmaier, Wolfgang; Schooler, Lael J.

    2008-01-01

    Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…

  19. Probability of boundary conditions in quantum cosmology

    NASA Astrophysics Data System (ADS)

    Suenobu, Hiroshi; Nambu, Yasusada

    2017-02-01

    One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe which can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well-known boundary condition proposals: the no-boundary proposal and the tunneling proposal. We specify the exact solutions by introducing two real parameters to discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.

  20. Idempotent probability measures on ultrametric spaces

    NASA Astrophysics Data System (ADS)

    Hubal, Oleksandra; Zarichnyi, Mykhailo

    2008-07-01

    Following the construction due to Hartog and Vink we introduce a metric on the set of idempotent probability measures (Maslov measures) defined on an ultrametric space. This construction determines a functor on the category of ultrametric spaces and nonexpanding maps. We prove that this functor is the functorial part of a monad on this category. This monad turns out to contain the hyperspace monad.

  1. Five-Parameter Bivariate Probability Distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.

  2. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1,…

  3. Geometric Probability and the Areas of Leaves

    ERIC Educational Resources Information Center

    Hoiberg, Karen Bush; Sharp, Janet; Hodgson, Ted; Colbert, Jim

    2005-01-01

    This article describes how a group of fifth-grade mathematics students measured irregularly shaped objects using geometric probability theory. After learning how to apply a ratio procedure to find the areas of familiar shapes, students extended the strategy for use with irregularly shaped objects, in this case, leaves. (Contains 2 tables and 8…

  4. Assessing Schematic Knowledge of Introductory Probability Theory

    ERIC Educational Resources Information Center

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  5. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  6. Probability from a Socio-Cultural Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  7. Probability & Perception: The Representativeness Heuristic in Action

    ERIC Educational Resources Information Center

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  8. Posterior Probabilities for a Consensus Ordering.

    ERIC Educational Resources Information Center

    Fligner, Michael A.; Verducci, Joseph S.

    1990-01-01

    The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)

  9. Phonotactic Probability Effects in Children Who Stutter

    ERIC Educational Resources Information Center

    Anderson, Julie D.; Byrd, Courtney T.

    2008-01-01

    Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…

  10. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
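
    The "rote learning" baseline criticized above is simply counting: estimate each conditional probability table (CPT) from the observed frequencies of a child variable given its parents. The sketch below shows that baseline with optional Laplace smoothing; the paper's point is that a trainable model (e.g., a neural network) can replace this step, which is not shown here, and the variables and data are hypothetical.

    ```python
    # Rote (counting) estimation of one CPT, with Laplace smoothing as a mild upgrade.
    from collections import Counter, defaultdict

    def learn_cpt(records, child, parents, child_values, alpha=1.0):
        """records: list of dicts mapping variable name -> value."""
        joint = defaultdict(Counter)
        for r in records:
            joint[tuple(r[p] for p in parents)][r[child]] += 1
        cpt = {}
        for parent_cfg, counts in joint.items():
            total = sum(counts.values()) + alpha * len(child_values)
            cpt[parent_cfg] = {v: (counts[v] + alpha) / total for v in child_values}
        return cpt

    data = [
        {"age": "young", "risk": "high"},
        {"age": "young", "risk": "high"},
        {"age": "young", "risk": "low"},
        {"age": "old", "risk": "low"},
    ]
    print(learn_cpt(data, child="risk", parents=["age"], child_values=["high", "low"]))
    ```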

  11. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  12. Probability & Statistics: Modular Learning Exercises. Student Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  13. Spatial Probability Cuing and Right Hemisphere Damage

    ERIC Educational Resources Information Center

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  14. Learning a Probability Distribution Efficiently and Reliably

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
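
    The CDF-Inversion Algorithm itself is not reproduced here; the sketch below only illustrates the accuracy/confidence contract described in the abstract, using a naive empirical estimator over a finite set and a Hoeffding-plus-union-bound sample size n ≥ ln(2k/δ)/(2ε²), which guarantees every estimated probability is within ε of the truth with probability at least 1 − δ. The true distribution is a made-up example.

    ```python
    # Learning a distribution over a finite set to accuracy eps with confidence 1 - delta,
    # via the empirical estimator and a Hoeffding/union-bound sample size.
    import math, random
    from collections import Counter

    def required_samples(k, eps, delta):
        return math.ceil(math.log(2 * k / delta) / (2 * eps ** 2))

    def learn_distribution(sampler, support, eps=0.02, delta=0.05, seed=4):
        rng = random.Random(seed)
        n = required_samples(len(support), eps, delta)
        counts = Counter(sampler(rng) for _ in range(n))
        return {x: counts[x] / n for x in support}, n

    true_dist = {"a": 0.5, "b": 0.3, "c": 0.2}   # hypothetical target distribution
    def sampler(rng):
        return rng.choices(list(true_dist), weights=list(true_dist.values()))[0]

    estimate, n = learn_distribution(sampler, list(true_dist))
    print(n, estimate)
    ```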

  15. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  16. Overcoming Challenges in Learning Probability Vocabulary

    ERIC Educational Resources Information Center

    Groth, Randall E.; Butler, Jaime; Nelson, Delmar

    2016-01-01

    Students can struggle to understand and use terms that describe probabilities. Such struggles lead to difficulties comprehending classroom conversations. In this article, we describe some specific misunderstandings a group of students (ages 11-12) held in regard to vocabulary such as "certain", "likely" and…

  17. Activities in Elementary Probability, Monograph No. 9.

    ERIC Educational Resources Information Center

    Fouch, Daniel J.

    This monograph on elementary probability for middle school, junior high, or high school consumer mathematics students is divided into two parts. Part one emphasizes lessons which cover the fundamental counting principle, permutations, and combinations. The 5 lessons of part I indicate the objectives, examples, methods, application, and problems…

  18. Probability in Action: The Red Traffic Light

    ERIC Educational Resources Information Center

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  19. Technique for Evaluating Multiple Probability Occurrences /TEMPO/

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1970-01-01

    Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.

  20. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  1. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri

    USGS Publications Warehouse

    Southard, Rodney E.; Veilleux, Andrea G.

    2014-01-01

    similar and related to three primary physiographic provinces. The final regional regression analyses resulted in three sets of equations. For Regions 1 and 2, the basin characteristics of drainage area and basin shape factor were statistically significant. For Region 3, because of the small amount of data from streamgages, only drainage area was statistically significant. Average standard errors of prediction ranged from 28.7 to 38.4 percent for flood region 1, 24.1 to 43.5 percent for flood region 2, and 25.8 to 30.5 percent for region 3. The regional regression equations are only applicable to stream sites in Missouri with flows not significantly affected by regulation, channelization, backwater, diversion, or urbanization. Basins with about 5 percent or less impervious area were considered to be rural. Applicability of the equations is limited to the basin characteristic values that range from 0.11 to 8,212.38 square miles (mi2) and basin shape from 2.25 to 26.59 for Region 1, 0.17 to 4,008.92 mi2 and basin shape 2.04 to 26.89 for Region 2, and 2.12 to 2,177.58 mi2 for Region 3. Annual peak data from streamgages were used to qualitatively assess the largest floods recorded at streamgages in Missouri since the 1915 water year. Based on existing streamgage data, the 1983 flood event was the largest flood event on record since 1915. The next five largest flood events, in descending order, took place in 1993, 1973, 2008, 1994 and 1915. Since 1915, five of the six largest floods on record occurred from 1973 to 2012.

  2. A large population of galaxies 9 to 12 billion years back in the history of the Universe

    NASA Astrophysics Data System (ADS)

    Le Fèvre, O.; Paltani, S.; Arnouts, S.; Charlot, S.; Foucaud, S.; Ilbert, O.; McCracken, H. J.; Zamorani, G.; Bottini, D.; Garilli, B.; Le Brun, V.; Maccagni, D.; Picat, J. P.; Scaramella, R.; Scodeggio, M.; Tresse, L.; Vettolani, G.; Zanichelli, A.; Adami, C.; Bardelli, S.; Bolzonella, M.; Cappi, A.; Ciliegi, P.; Contini, T.; Franzetti, P.; Gavignaud, I.; Guzzo, L.; Iovino, A.; Marano, B.; Marinoni, C.; Mazure, A.; Meneux, B.; Merighi, R.; Pellò, R.; Pollo, A.; Pozzetti, L.; Radovich, M.; Zucca, E.; Arnaboldi, M.; Bondi, M.; Bongiorno, A.; Busarello, G.; Gregorini, L.; Lamareille, F.; Mathez, G.; Mellier, Y.; Merluzzi, P.; Ripepi, V.; Rizzo, D.

    2005-09-01

    To understand the evolution of galaxies, we need to know as accurately as possible how many galaxies were present in the Universe at different epochs. Galaxies in the young Universe have hitherto mainly been identified using their expected optical colours, but this leaves open the possibility that a significant population remains undetected because their colours are the result of a complex mix of stars, gas, dust or active galactic nuclei. Here we report the results of a flux-limited I-band survey of galaxies at look-back times of 9 to 12 billion years. We find 970 galaxies with spectroscopic redshifts between 1.4 and 5. This population is 1.6 to 6.2 times larger than previous estimates, with the difference increasing towards brighter magnitudes. Strong ultraviolet continua (in the rest frame of the galaxies) indicate vigorous star formation rates of more than 10-100 solar masses per year. As a consequence, the cosmic star formation rate representing the volume-averaged production of stars is higher than previously measured at redshifts of 3 to 4.

  3. Organic-walled microfossils in 3.2-billion-year-old shallow-marine siliciclastic deposits.

    PubMed

    Javaux, Emmanuelle J; Marshall, Craig P; Bekker, Andrey

    2010-02-18

    Although the notion of an early origin and diversification of life on Earth during the Archaean eon has received increasing support in geochemical, sedimentological and palaeontological evidence, ambiguities and controversies persist regarding the biogenicity and syngeneity of the record older than Late Archaean. Non-biological processes are known to produce morphologies similar to some microfossils, and hydrothermal fluids have the potential to produce abiotic organic compounds with depleted carbon isotope values, making it difficult to establish unambiguous traces of life. Here we report the discovery of a population of large (up to about 300 μm in diameter) carbonaceous spheroidal microstructures in Mesoarchaean shales and siltstones of the Moodies Group, South Africa, the Earth's oldest siliciclastic alluvial to tidal-estuarine deposits. These microstructures are interpreted as organic-walled microfossils on the basis of petrographic and geochemical evidence for their endogenicity and syngeneity, their carbonaceous composition, cellular morphology and ultrastructure, occurrence in populations, taphonomic features of soft wall deformation, and the geological context plausible for life, as well as a lack of abiotic explanation falsifying a biological origin. These are the oldest and largest Archaean organic-walled spheroidal microfossils reported so far. Our observations suggest that relatively large microorganisms cohabited with earlier reported benthic microbial mats in the photic zone of marginal marine siliciclastic environments 3.2 billion years ago.

  4. Sulfur isotopes of organic matter preserved in 3.45-billion-year-old stromatolites reveal microbial metabolism.

    PubMed

    Bontognali, Tomaso R R; Sessions, Alex L; Allwood, Abigail C; Fischer, Woodward W; Grotzinger, John P; Summons, Roger E; Eiler, John M

    2012-09-18

    The 3.45-billion-year-old Strelley Pool Formation of Western Australia preserves stromatolites that are considered among the oldest evidence for life on Earth. In places of exceptional preservation, these stromatolites contain laminae rich in organic carbon, interpreted as the fossil remains of ancient microbial mats. To better understand the biogeochemistry of these rocks, we performed microscale in situ sulfur isotope measurements of the preserved organic sulfur, including both Δ(33)S and δ(34)S. This approach allows us to tie physiological inference from isotope ratios directly to fossil biomass, providing a means to understand sulfur metabolism that is complementary to, and independent from, inorganic proxies (e.g., pyrite). Δ(33)S values of the kerogen reveal mass-anomalous fractionations expected of the Archean sulfur cycle, whereas δ(34)S values show large fractionations at very small spatial scales, including values below -15‰. We interpret these isotopic patterns as recording the process of sulfurization of organic matter by H(2)S in heterogeneous mat pore-waters influenced by respiratory S metabolism. Positive Δ(33)S anomalies suggest that disproportionation of elemental sulfur would have been a prominent microbial process in these communities.

  5. Decimetre-scale multicellular eukaryotes from the 1.56-billion-year-old Gaoyuzhuang Formation in North China

    PubMed Central

    Zhu, Shixing; Zhu, Maoyan; Knoll, Andrew H.; Yin, Zongjun; Zhao, Fangchen; Sun, Shufen; Qu, Yuangao; Shi, Min; Liu, Huan

    2016-01-01

    Fossils of macroscopic eukaryotes are rarely older than the Ediacaran Period (635–541 million years (Myr)), and their interpretation remains controversial. Here, we report the discovery of macroscopic fossils from the 1,560-Myr-old Gaoyuzhuang Formation, Yanshan area, North China, that exhibit both large size and regular morphology. Preserved as carbonaceous compressions, the Gaoyuzhuang fossils have statistically regular linear to lanceolate shapes up to 30 cm long and nearly 8 cm wide, suggesting that the Gaoyuzhuang fossils record benthic multicellular eukaryotes of unprecedentedly large size. Syngenetic fragments showing closely packed ∼10 μm cells arranged in a thick sheet further reinforce the interpretation. Comparisons with living thalloid organisms suggest that these organisms were photosynthetic, although their phylogenetic placement within the Eukarya remains uncertain. The new fossils provide the strongest evidence yet that multicellular eukaryotes with decimetric dimensions and a regular developmental program populated the marine biosphere at least a billion years before the Cambrian Explosion. PMID:27186667

  6. The economic downturn and its lingering effects reduced medicare spending growth by $4 billion in 2009-12.

    PubMed

    Dranove, David; Garthwaite, Craig; Ody, Christopher

    2015-08-01

    Previous work has found a strong connection between the most recent economic recession and reductions in private health spending. However, the effect of economic downturns on Medicare spending is less clear. In contrast to studies involving earlier time periods, our study found that when the macroeconomy slowed during the Great Recession of 2007-09, so did Medicare spending growth. A small (14 percent) but significant share of the decline in Medicare spending growth from 2009 to 2012 relative to growth from 2004 to 2009 can be attributed to lingering effects of the recession. Absent the economic downturn, Medicare spending would have been $4 billion higher in 2009-12. A major reason for the relatively small impact of the macroeconomy is the relative lack of labor-force participation among people ages sixty-five and older. We estimate that if they had been working at the same rate as the nonelderly before the recession, the effect of the downturn on Medicare spending growth would have been twice as large.

  7. Mobile hydrocarbon microspheres from >2-billion-year-old carbon-bearing seams in the South African deep subsurface.

    PubMed

    Wanger, G; Moser, D; Hay, M; Myneni, S; Onstott, T C; Southam, G

    2012-11-01

    By ~2.9 Ga, the time of the deposition of the Witwatersrand Supergroup, life is believed to have been well established on Earth. Carbon remnants of the microbial biosphere from this time period are evident in sediments from around the world. In the Witwatersrand Supergroup, the carbonaceous material is often concentrated in seams, closely associated with the gold deposits and may have been a mobile phase 2 billion years ago. Whereas today the carbon in the Witwatersrand Supergroup is presumed to be immobile, hollow hydrocarbon spheres ranging in size from <1 μm to >50 μm were discovered emanating from a borehole drilled through the carbon-bearing seams suggesting that a portion of the carbon may still be mobile in the deep subsurface. ToF-SIMS and STXM analyses revealed that these spheres contain a suite of alkane, alkenes, and aromatic compounds consistent with the described organic-rich carbon seams within the Witwatersrand Supergroup's auriferous reef horizons. Analysis by electron microscopy and ToF-SIMS, however, revealed that these spheres, although most likely composed of biogenic carbon and resembling biological organisms, do not retain any true structural, that is, fossil, information and were formed by an abiogenic process.

  8. Malthus is still wrong: we can feed a world of 9-10 billion, but only by reducing food demand.

    PubMed

    Smith, Pete

    2015-08-01

    In 1798, Thomas Robert Malthus published 'An essay on the principle of population' in which he concluded that: 'The power of population is so superior to the power of the earth to produce subsistence for man, that premature death must in some shape or other visit the human race.' Over the following century he was criticised for underestimating the potential for scientific and technological innovation to provide positive change. Since then, he has been proved wrong, with a number of papers published during the past few decades pointing out why he has been proved wrong so many times. In the present paper, I briefly review the main changes in food production in the past that have allowed us to continue to meet ever growing demand for food, and I examine the possibility of these same innovations delivering food security in the future. On the basis of recent studies, I conclude that technological innovation can no longer be relied upon to prove Malthus wrong as we strive to feed 9-10 billion people by 2050. Unless we are prepared to accept a wide range of significant, undesirable environmental consequences, technology alone cannot provide food security in 2050. Food demand, particularly the demand for livestock products, will need to be managed if we are to continue to prove Malthus wrong into the future.

  9. Decimetre-scale multicellular eukaryotes from the 1.56-billion-year-old Gaoyuzhuang Formation in North China.

    PubMed

    Zhu, Shixing; Zhu, Maoyan; Knoll, Andrew H; Yin, Zongjun; Zhao, Fangchen; Sun, Shufen; Qu, Yuangao; Shi, Min; Liu, Huan

    2016-05-17

    Fossils of macroscopic eukaryotes are rarely older than the Ediacaran Period (635-541 million years (Myr)), and their interpretation remains controversial. Here, we report the discovery of macroscopic fossils from the 1,560-Myr-old Gaoyuzhuang Formation, Yanshan area, North China, that exhibit both large size and regular morphology. Preserved as carbonaceous compressions, the Gaoyuzhuang fossils have statistically regular linear to lanceolate shapes up to 30 cm long and nearly 8 cm wide, suggesting that the Gaoyuzhuang fossils record benthic multicellular eukaryotes of unprecedentedly large size. Syngenetic fragments showing closely packed ∼10 μm cells arranged in a thick sheet further reinforce the interpretation. Comparisons with living thalloid organisms suggest that these organisms were photosynthetic, although their phylogenetic placement within the Eukarya remains uncertain. The new fossils provide the strongest evidence yet that multicellular eukaryotes with decimetric dimensions and a regular developmental program populated the marine biosphere at least a billion years before the Cambrian Explosion.

  10. Sulfur isotopes of organic matter preserved in 3.45-billion-year-old stromatolites reveal microbial metabolism

    PubMed Central

    Bontognali, Tomaso R. R.; Sessions, Alex L.; Allwood, Abigail C.; Fischer, Woodward W.; Grotzinger, John P.; Summons, Roger E.; Eiler, John M.

    2012-01-01

    The 3.45-billion-year-old Strelley Pool Formation of Western Australia preserves stromatolites that are considered among the oldest evidence for life on Earth. In places of exceptional preservation, these stromatolites contain laminae rich in organic carbon, interpreted as the fossil remains of ancient microbial mats. To better understand the biogeochemistry of these rocks, we performed microscale in situ sulfur isotope measurements of the preserved organic sulfur, including both Δ33S and δ34S. This approach allows us to tie physiological inference from isotope ratios directly to fossil biomass, providing a means to understand sulfur metabolism that is complementary to, and independent from, inorganic proxies (e.g., pyrite). Δ33S values of the kerogen reveal mass-anomalous fractionations expected of the Archean sulfur cycle, whereas δ34S values show large fractionations at very small spatial scales, including values below -15‰. We interpret these isotopic patterns as recording the process of sulfurization of organic matter by H2S in heterogeneous mat pore-waters influenced by respiratory S metabolism. Positive Δ33S anomalies suggest that disproportionation of elemental sulfur would have been a prominent microbial process in these communities. PMID:22949693

  11. A large population of galaxies 9 to 12 billion years back in the history of the Universe.

    PubMed

    Le Fèvre, O; Paltani, S; Arnouts, S; Charlot, S; Foucaud, S; Ilbert, O; McCracken, H J; Zamorani, G; Bottini, D; Garilli, B; Le Brun, V; Maccagni, D; Picat, J P; Scaramella, R; Scodeggio, M; Tresse, L; Vettolani, G; Zanichelli, A; Adami, C; Bardelli, S; Bolzonella, M; Cappi, A; Ciliegi, P; Contini, T; Franzetti, P; Gavignaud, I; Guzzo, L; Iovino, A; Marano, B; Marinoni, C; Mazure, A; Meneux, B; Merighi, R; Pellò, R; Pollo, A; Pozzetti, L; Radovich, M; Zucca, E; Arnaboldi, M; Bondi, M; Bongiorno, A; Busarello, G; Gregorini, L; Lamareille, F; Mathez, G; Mellier, Y; Merluzzi, P; Ripepi, V; Rizzo, D

    2005-09-22

    To understand the evolution of galaxies, we need to know as accurately as possible how many galaxies were present in the Universe at different epochs. Galaxies in the young Universe have hitherto mainly been identified using their expected optical colours, but this leaves open the possibility that a significant population remains undetected because their colours are the result of a complex mix of stars, gas, dust or active galactic nuclei. Here we report the results of a flux-limited I-band survey of galaxies at look-back times of 9 to 12 billion years. We find 970 galaxies with spectroscopic redshifts between 1.4 and 5. This population is 1.6 to 6.2 times larger than previous estimates, with the difference increasing towards brighter magnitudes. Strong ultraviolet continua (in the rest frame of the galaxies) indicate vigorous star formation rates of more than 10-100 solar masses per year. As a consequence, the cosmic star formation rate representing the volume-averaged production of stars is higher than previously measured at redshifts of 3 to 4.

  12. Multi million-to-Billion Atom Molecular Dynamics Simulations of Cavitation-Induced Damage on a Silica Slab

    NASA Astrophysics Data System (ADS)

    Shekhar, Adarsh; Nomura, Ken-Ichi; Kalia, Rajiv; Nakano, Aiichiro; Vashishta, Priya

    2012-02-01

    Cavitation bubble collapse causes severe damage to materials. For example, cavitation erosion is a major threat to the safety of nuclear power plants. The cavitation bubbles may also be utilized for preventing stress corrosion cracking with water jet peening technology. We have performed multi-million-to-billion-atom molecular dynamics simulations to investigate the shock-induced cavitation damage mechanism on an amorphous silica slab in water. The system consists of a 60 nm thick silica slab immersed in water in an MD box of dimension 285 x 200 x 200 nm3. A nanobubble is created by removing water molecules within a sphere of radius 100 nm. To apply a planar shock, we assign a uniform particle velocity vp to the entire system towards a planar momentum mirror. We have performed the simulation with two kinds of bubbles, an empty bubble and a bubble filled with inert gas. The simulation results reveal nanojet formation during bubble collapse causing damage on the silica surface; however, the damage was significantly reduced in the case of the filled bubble. We will discuss the effect of the presence of inert gas inside the nanobubble on the pressure distribution, the extent of damage, and collapse behavior corresponding to the shock front.

  13. Evidence from massive siderite beds for a CO2-rich atmosphere before approximately 1.8 billion years ago

    NASA Technical Reports Server (NTRS)

    Ohmoto, Hiroshi; Watanabe, Yumiko; Kumazawa, Kazumasa

    2004-01-01

    It is generally thought that, in order to compensate for lower solar flux and maintain liquid oceans on the early Earth, methane must have been an important greenhouse gas before approximately 2.2 billion years (Gyr) ago. This is based upon a simple thermodynamic calculation that relates the absence of siderite (FeCO3) in some pre-2.2-Gyr palaeosols to atmospheric CO2 concentrations that would have been too low to have provided the necessary greenhouse effect. Using multi-dimensional thermodynamic analyses and geological evidence, we show here that the absence of siderite in palaeosols does not constrain atmospheric CO2 concentrations. Siderite is absent in many palaeosols (both pre- and post-2.2-Gyr in age) because the O2 concentrations and pH conditions in well-aerated soils have favoured the formation of ferric (Fe3+)-rich minerals, such as goethite, rather than siderite. Siderite, however, has formed throughout geological history in subsurface environments, such as euxinic seas, where anaerobic organisms created H2-rich conditions. The abundance of large, massive siderite-rich beds in pre-1.8-Gyr sedimentary sequences and their carbon isotope ratios indicate that the atmospheric CO2 concentration was more than 100 times greater than today, causing the rain and ocean waters to be more acidic than today. We therefore conclude that CO2 alone (without a significant contribution from methane) could have provided the necessary greenhouse effect to maintain liquid oceans on the early Earth.

  14. Impacts of a 32-billion-gallon bioenergy landscape on land and fossil fuel use in the US

    NASA Astrophysics Data System (ADS)

    Hudiburg, Tara W.; Wang, Weiwei; Khanna, Madhu; Long, Stephen P.; Dwivedi, Puneet; Parton, William J.; Hartman, Melannie; Delucia, Evan H.

    2016-01-01

    Sustainable transportation biofuels may require considerable changes in land use to meet mandated targets. Understanding the possible impact of different policies on land use and greenhouse gas emissions has typically proceeded by exploring either ecosystem or economic modelling. Here we integrate such models to assess the potential for the US Renewable Fuel Standard to reduce greenhouse gas emissions from the transportation sector through the use of cellulosic biofuels. We find that 2022 US emissions are decreased by 7.0 ± 2.5% largely through gasoline displacement and soil carbon storage by perennial grasses. If the Renewable Fuel Standard is accompanied by a cellulosic biofuel tax credit, these emissions could be reduced by 12.3 ± 3.4%. Our integrated approach indicates that transitioning to cellulosic biofuels can meet a 32-billion-gallon Renewable Fuel Standard target with negligible effects on food crop production, while reducing fossil fuel use and greenhouse gas emissions. However, emissions savings are lower than previous estimates that did not account for economic constraints.

  15. Assessment of potential oil and gas resources in source rocks of the Alaska North Slope, 2012

    USGS Publications Warehouse

    Houseknecht, David W.; Rouse, William A.; Garrity, Christopher P.; Whidden, Katherine J.; Dumoulin, Julie A.; Schenk, Christopher J.; Charpentier, Ronald R.; Cook, Troy A.; Gaswirth, Stephanie B.; Kirschbaum, Mark A.; Pollastro, Richard M.

    2012-01-01

    The U.S. Geological Survey estimated potential, technically recoverable oil and gas resources for source rocks of the Alaska North Slope. Estimates (95-percent to 5-percent probability) range from zero to 2 billion barrels of oil and from zero to nearly 80 trillion cubic feet of gas.

  16. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    ERIC Educational Resources Information Center

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  17. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    ERIC Educational Resources Information Center

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  18. You Say "Probable" and I Say "Likely": Improving Interpersonal Communication With Verbal Probability Phrases

    ERIC Educational Resources Information Center

    Karelitz, Tzur M.; Budescu, David V.

    2004-01-01

    When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…

  19. Killeen's Probability of Replication and Predictive Probabilities: How to Compute, Use, and Interpret Them

    ERIC Educational Resources Information Center

    Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques

    2010-01-01

    P. R. Killeen's (2005a) probability of replication ("p[subscript rep]") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p[subscript rep]" is now routinely reported in "Psychological Science" and has also begun to appear in…

  20. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    ERIC Educational Resources Information Center

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
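
    As a minimal illustration of the prior-to-posterior computation this record refers to, here is a short sketch of Bayes' rule for a binary hypothesis; the function name and all numeric values are illustrative and are not taken from the article.

```python
# Minimal sketch of Bayes' rule for a binary hypothesis H given evidence E.
# The prior and likelihood values below are illustrative, not from the article.

def posterior(prior_h: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) using Bayes' rule over the two-way partition {H, not-H}."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1.0 - prior_h)  # total probability
    return p_e_given_h * prior_h / p_e

# Example: prior P(H) = 0.3, P(E | H) = 0.8, P(E | not-H) = 0.2
print(posterior(0.3, 0.8, 0.2))  # -> about 0.632
```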

  1. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  2. Cheating Probabilities on Multiple Choice Tests

    NASA Astrophysics Data System (ADS)

    Rizzuto, Gaspard T.; Walters, Fred

    1997-10-01

    This paper is strictly based on mathematical statistics and as such does not depend on prior performance; it assumes the probability of each choice to be identical. In a real-life situation, the probability of two students having identical responses becomes larger the better the students are. However, the mathematical model is developed for all responses, both correct and incorrect, and provides a baseline for evaluation. David Harpp and coworkers (2, 3) at McGill University have evaluated ratios of exact errors in common (EEIC) to errors in common (EIC) and differences (D). In pairings where the ratio EEIC/EIC was greater than 0.75, the pair had unusually high odds against their answer pattern being random. For detection of copying, EEIC/D ratios at values >1.0 indicate that the pairs of students were seated adjacent to one another and copied from one another. The original papers should be examined for details.
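
    As a minimal sketch of the statistics quoted above, the following computes EIC, EEIC, and D for one pair of answer strings; the answer key, the responses, and the helper name are hypothetical, and the thresholds mentioned in the abstract are only echoed in the comments.

```python
# Sketch: compare two students' multiple-choice answer strings against a key.
# EIC  = errors in common (both students wrong on the same item)
# EEIC = exact errors in common (both wrong AND chose the same distractor)
# D    = number of items on which the two response strings differ
# Key, responses, and function name are hypothetical.

def copying_statistics(key: str, a: str, b: str):
    eic = eeic = d = 0
    for k, x, y in zip(key, a, b):
        if x != y:
            d += 1
        if x != k and y != k:      # both answers wrong
            eic += 1
            if x == y:             # ... and identical
                eeic += 1
    return eic, eeic, d

key = "ABCDABCDAB"
s1 = "ABCDABCDCC"                  # illustrative responses
s2 = "ABCDABCDCD"
eic, eeic, d = copying_statistics(key, s1, s2)
print(eic, eeic, d)                         # -> 2 1 1
print("EEIC/EIC =", eeic / eic)             # compare against the 0.75 threshold
print("EEIC/D   =", eeic / d)               # compare against the 1.0 threshold
```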

  3. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
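
    The sketch below illustrates the general idea of producing a spread of Pc values instead of a single point estimate; the encounter geometry, covariance, and hard-body radius are made up, and the simple covariance-scaling sweep stands in for the authors' resampling methodology.

```python
# Sketch: turn a single 2-D probability-of-collision (Pc) estimate into a spread
# of Pc values by perturbing an uncertain input (here, the covariance scale).
# All numbers are illustrative; this is not the authors' method.
import numpy as np

rng = np.random.default_rng(0)

def pc_monte_carlo(miss, cov, hbr, n=200_000):
    """Fraction of sampled relative positions (encounter plane) inside the
    combined hard-body radius, i.e. a Monte Carlo estimate of the 2-D Pc."""
    pts = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr)

miss = np.array([120.0, 40.0])        # nominal relative position, metres
cov = np.diag([80.0**2, 60.0**2])     # nominal relative-position covariance
hbr = 20.0                            # combined hard-body radius, metres

print("point estimate Pc:", pc_monte_carlo(miss, cov, hbr))

# Crude uncertainty sweep: rescale the covariance to mimic input uncertainty.
pcs = [pc_monte_carlo(miss, s * cov, hbr) for s in rng.uniform(0.5, 2.0, size=20)]
print("Pc spread under covariance scaling:", min(pcs), "to", max(pcs))
```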

  4. A probability distribution model for rain rate

    NASA Technical Reports Server (NTRS)

    Kedem, Benjamin; Pavlopoulos, Harry; Guan, Xiaodong; Short, David A.

    1994-01-01

    A systematic approach is suggested for modeling the probability distribution of rain rate. Rain rate, conditional on rain and averaged over a region, is modeled as a temporally homogeneous diffusion process with appropriate boundary conditions. The approach requires a drift coefficient (the conditional average instantaneous rate of change of rain intensity) as well as a diffusion coefficient (the conditional average magnitude of the rate of growth and decay of rain rate about its drift). Under certain assumptions on the drift and diffusion coefficients compatible with rain rate, a new parametric family, containing the lognormal distribution, is obtained for the continuous part of the stationary limit probability distribution. The family is fitted to tropical rainfall from Darwin and Florida, and it is found that the lognormal distribution provides adequate fits as compared with other members of the family and also with the gamma distribution.

  5. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on a choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  6. Complex analysis methods in noncommutative probability

    NASA Astrophysics Data System (ADS)

    Belinschi, Serban Teodor

    2006-02-01

    In this thesis we study convolutions that arise from noncommutative probability theory. We prove several regularity results for free convolutions, and for measures in partially defined one-parameter free convolution semigroups. We discuss connections between Boolean and free convolutions and, in the last chapter, we prove that any infinitely divisible probability measure with respect to monotonic additive or multiplicative convolution belongs to a one-parameter semigroup with respect to the corresponding convolution. Earlier versions of some of the results in this thesis have already been published, while some others have been submitted for publication. We have preserved almost entirely the specific format for PhD theses required by Indiana University. This adds several unnecessary pages to the document, but we wanted to preserve the specificity of the document as a PhD thesis at Indiana University.

  7. A quantum probability perspective on borderline vagueness.

    PubMed

    Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter

    2013-10-01

    The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental finding and extends Alxatib's and Pelletier's () theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can be explained then as a quantum interference phenomenon.

  8. Approximate probability distributions of the master equation.

    PubMed

    Thomas, Philipp; Grima, Ramon

    2015-07-01

    Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.

  9. Transit probabilities for debris around white dwarfs

    NASA Astrophysics Data System (ADS)

    Lewis, John Arban; Johnson, John A.

    2017-01-01

    The discovery of WD 1145+017 (Vanderburg et al. 2015), a metal-polluted white dwarf with an infrared-excess and transits confirmed the long held theory that at least some metal-polluted white dwarfs are actively accreting material from crushed up planetesimals. A statistical understanding of WD 1145-like systems would inform us on the various pathways for metal-pollution and the end states of planetary systems around medium- to high-mass stars. However, we only have one example and there are presently no published studies of transit detection/discovery probabilities for white dwarfs within this interesting regime. We present a preliminary look at the transit probabilities for metal-polluted white dwarfs and their projected space density in the Solar Neighborhood, which will inform future searches for analogs to WD 1145+017.

  10. Volcano shapes, entropies, and eruption probabilities

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
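
    As a small illustration of the entropy measures mentioned above, the sketch below evaluates the Gibbs-Shannon entropy of a binned (discrete) eruption-location distribution and the differential entropy of a normal profile; the example distributions are invented for illustration.

```python
# Sketch: Gibbs-Shannon entropy of a discrete (binned) distribution and the
# differential entropy of a normal profile. Example distributions are invented.
import numpy as np

def shannon_entropy(weights):
    """S = -sum p_i ln p_i for a discrete distribution given as bin weights."""
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A flat field (uniform over 10 bins) has maximum entropy ln(10) ~ 2.30,
# i.e. maximum uncertainty about the location of the next eruption.
print(shannon_entropy(np.ones(10)))

# A sharply peaked, edifice-like profile has much lower entropy.
print(shannon_entropy([1, 2, 8, 30, 8, 2, 1]))

# Differential entropy of a normal profile with standard deviation sigma:
sigma = 2.0
print(0.5 * np.log(2 * np.pi * np.e * sigma**2))
```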

  11. Probability of identity by descent in metapopulations.

    PubMed Central

    Kaj, I; Lascoux, M

    1999-01-01

    Equilibrium probabilities of identity by descent (IBD), for pairs of genes within individuals, for genes between individuals within subpopulations, and for genes between subpopulations are calculated in metapopulation models with fixed or varying colony sizes. A continuous-time analog to the Moran model was used in either case. For fixed-colony size both propagule and migrant pool models were considered. The varying population size model is based on a birth-death-immigration (BDI) process, to which migration between colonies is added. Wright's F statistics are calculated and compared to previous results. Adding between-island migration to the BDI model can have an important effect on the equilibrium probabilities of IBD and on Wright's index. PMID:10388835

  12. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
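
    The covariance-combination step described above can be sketched as follows; the paper derives an analytical solution via a coordinate transformation, whereas this illustration simply sums the two covariances and checks the separation criterion by Monte Carlo with made-up numbers.

```python
# Sketch: combine two aircraft prediction-error covariances into a single
# covariance of the relative position, then estimate the probability that the
# relative position at closest approach violates the separation requirement.
# All numbers are illustrative; the paper's analytical solution is not used here.
import numpy as np

rng = np.random.default_rng(1)

cov_a = np.diag([1.0**2, 0.5**2])   # aircraft A position-error covariance (nmi^2)
cov_b = np.diag([1.2**2, 0.6**2])   # aircraft B position-error covariance (nmi^2)
cov_rel = cov_a + cov_b             # equivalent covariance of the relative position

rel_mean = np.array([6.0, 1.0])     # predicted relative position at closest approach (nmi)
separation = 5.0                    # required horizontal separation (nmi)

samples = rng.multivariate_normal(rel_mean, cov_rel, size=500_000)
p_conflict = np.mean(np.hypot(samples[:, 0], samples[:, 1]) < separation)
print("estimated conflict probability:", p_conflict)
```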

  13. Approximate probability distributions of the master equation

    NASA Astrophysics Data System (ADS)

    Thomas, Philipp; Grima, Ramon

    2015-07-01

    Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.

  14. Computing association probabilities using parallel Boltzmann machines.

    PubMed

    Iltis, R A; Ting, P Y

    1993-01-01

    A new computational method is presented for solving the data association problem using parallel Boltzmann machines. It is shown that the association probabilities can be computed with arbitrarily small errors if a sufficient number of parallel Boltzmann machines are available. The probability beta(i)(j) that the ith measurement emanated from the jth target can be obtained simply by observing the relative frequency with which neuron v(i,j) in a two-dimensional network is on throughout the layers. Some simple tracking examples comparing the performance of the Boltzmann algorithm with the exact data association solution and with the performance of an alternative parallel method using the Hopfield neural network are also presented.

  15. Nuclear data uncertainties: I, Basic concepts of probability

    SciTech Connect

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  16. The Origin of Probability and Entropy

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2008-11-01

    Measuring is the quantification of ordering. Thus the process of ordering elements of a set is a more fundamental activity than measuring. Order theory, also known as lattice theory, provides a firm foundation on which to build measure theory. The result is a set of new insights that cast probability theory and information theory in a new light, while simultaneously opening the door to a better understanding of measures as a whole.

  17. Calculating Cumulative Binomial-Distribution Probabilities

    NASA Technical Reports Server (NTRS)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
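
    The cumulative-binomial quantity that CUMBIN tabulates can be sketched in a few lines; the snippet below (not the NASA program, and with illustrative parameter values) computes the reliability of a k-out-of-n system of independent components.

```python
# Sketch: reliability of a k-out-of-n system = P(at least k of n independent
# components work), a cumulative binomial probability. Values are illustrative;
# this is not the CUMBIN program itself.
from math import comb

def k_out_of_n_reliability(k: int, n: int, p: float) -> float:
    """P(at least k successes in n independent Bernoulli(p) trials)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Example: a 2-out-of-3 system of components that each work with probability 0.95
print(k_out_of_n_reliability(2, 3, 0.95))  # -> 0.99275
```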

  18. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a javascript object notation format.

  19. SureTrak Probability of Impact Display

    NASA Technical Reports Server (NTRS)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  20. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-09-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu's and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, our approach allows straightforward generalization to more complicated systems.

  1. Probability and Statistics in Aerospace Engineering

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  2. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  3. Theoretical Analysis of Rain Attenuation Probability

    NASA Astrophysics Data System (ADS)

    Roy, Surendra Kr.; Jha, Santosh Kr.; Jha, Lallan

    2007-07-01

    Satellite communication technologies are now highly developed, and high-quality, distance-independent services have expanded over a very wide area. The system design of the Hokkaido integrated telecommunications (HIT) network must first overcome outages of satellite links due to rain attenuation in Ka frequency bands. In this paper a theoretical analysis of rain attenuation probability on a slant path has been made. The proposed formula is based on the Weibull distribution and incorporates recent ITU-R recommendations concerning the necessary rain-rate and rain-height inputs. The error behaviour of the model was tested against the rain attenuation prediction model recommended by ITU-R for a large number of experiments at different probability levels. Compared to the ITU-R model, the novel slant-path rain attenuation prediction model exhibits similar behaviour at low time percentages and a better root-mean-square error performance for probability levels above 0.02%. The set of presented models has the advantage of low implementation complexity and is considered useful for educational and back-of-the-envelope computations.

  4. The Probability Distribution of Daily Streamflow

    NASA Astrophysics Data System (ADS)

    Blum, A.; Vogel, R. M.

    2015-12-01

    Flow duration curves (FDCs) are a graphical illustration of the cumulative distribution of streamflow. Daily streamflows often range over many orders of magnitude, making it extremely challenging to find a probability distribution function (pdf) which can mimic the steady state or period of record FDC (POR-FDC). Median annual FDCs (MA-FDCs) describe the pdf of daily streamflow in a typical year. For POR- and MA-FDCs, L-moment diagrams, visual assessments of FDCs, and quantile-quantile probability plot correlation coefficients are used to evaluate goodness of fit (GOF) of candidate probability distributions. FDCs reveal that both four-parameter kappa (KAP) and three-parameter generalized Pareto (GP3) models result in very high GOF for the MA-FDC and a relatively lower GOF for POR-FDCs at over 500 rivers across the coterminous U.S. Physical basin characteristics, such as the baseflow index, as well as hydroclimatic indices such as the aridity index and the runoff ratio, are found to be correlated with one of the shape parameters (kappa) of the KAP and GP3 pdfs. Our work also reveals several important areas for future research, including improved parameter estimators for the KAP pdf, as well as increasing our understanding of the conditions which give rise to improved GOF of analytical pdfs to large samples of daily streamflows.
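
    A period-of-record FDC of the kind analysed above can be sketched directly from a daily flow series; the synthetic lognormal flows and the use of Weibull plotting positions below are illustrative choices, not the authors' data or method.

```python
# Sketch: empirical flow duration curve (FDC) from daily streamflows.
# Synthetic lognormal flows and Weibull plotting positions are illustrative.
import numpy as np

rng = np.random.default_rng(42)
flows = rng.lognormal(mean=3.0, sigma=1.2, size=3650)   # ~10 years of daily flows

sorted_flows = np.sort(flows)[::-1]                     # largest flow first
n = sorted_flows.size
exceedance = np.arange(1, n + 1) / (n + 1)              # Weibull plotting positions

# Flows exceeded 5% and 95% of the time (interpolated from the empirical FDC)
print("Q5 :", np.interp(0.05, exceedance, sorted_flows))
print("Q95:", np.interp(0.95, exceedance, sorted_flows))
```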

  5. Probability of metastable states in Yukawa clusters

    NASA Astrophysics Data System (ADS)

    Ludwig, Patrick; Kaehlert, Hanno; Baumgartner, Henning; Bonitz, Michael

    2008-11-01

    Finite strongly coupled systems of charged particles in external traps are of high interest in many fields. Here we analyze the occurrence probabilities of ground- and metastable states of spherical, three-dimensional Yukawa clusters by means of molecular dynamics and Monte Carlo simulations and an analytical method. We find that metastable states can occur with a higher probability than the ground state, thus confirming recent dusty plasma experiments with so-called Yukawa balls [1]. The analytical method [2], based on the harmonic approximation of the potential energy, allows for a very intuitive explanation of the probabilities when combined with the simulation results [3].[1] D. Block, S. Käding, A. Melzer, A. Piel, H. Baumgartner, and M. Bonitz, Physics of Plasmas 15, 040701 (2008)[2] F. Baletto and R. Ferrando, Reviews of Modern Physics 77, 371 (2005)[3] H. Kählert, P. Ludwig, H. Baumgartner, M. Bonitz, D. Block, S. Käding, A. Melzer, and A. Piel, submitted for publication (2008)

  6. Atomic Transition Probabilities for Rare Earths

    NASA Astrophysics Data System (ADS)

    Curry, J. J.; Anderson, Heidi M.; den Hartog, E. A.; Wickliffe, M. E.; Lawler, J. E.

    1996-10-01

    Accurate absolute atomic transition probabilities for selected neutral and singly ionized rare earth elements including Tm, Dy, and Ho are being measured. The increasing use of rare earths in high intensity discharge lamps provides motivation; the data are needed for diagnosing and modeling the lamps. Radiative lifetimes, measured using time resolved laser induced fluorescence (LIF), are combined with branching fractions, measured using a large Fourier transform spectrometer (FTS), to determine accurate absolute atomic transition probabilities. More than 15,000 LIF decay curves from Tm and Dy atoms and ions in slow beams have been recorded and analyzed. Radiative lifetimes for 298 levels of TmI and TmII and for 450 levels of DyI and DyII are determined. Branching fractions are extracted from spectra recorded using the 1.0 m FTS at the National Solar Observatory. Branching fractions and absolute transition probabilities for 500 of the strongest TmI and TmII lines are complete. Representative lifetime and branching fraction data will be presented and discussed. Supported by Osram Sylvania Inc. and the NSF.

  7. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action, which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through a process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collisions between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell, and cells that do not undergo the threshold number of collisions are expected to survive.

  8. A quantum probability model of causal reasoning.

    PubMed

    Trueblood, Jennifer S; Busemeyer, Jerome R

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  9. The probability and severity of decompression sickness

    PubMed Central

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild (Type I; manifestations 4-6) and serious (Type II; manifestations 1-3). Additionally, we considered an alternative grouping of mild (Type A; manifestations 3-6) and serious (Type B; manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p << 0.01) improvement in trinomial model fit over the binomial (2-state) model. With the Type I/II definition, we found that the predicted probability of 'mild' DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  10. The probability and severity of decompression sickness.

    PubMed

    Howle, Laurens E; Weber, Paul W; Hada, Ethan A; Vann, Richard D; Denoble, Petar J

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild (Type I; manifestations 4-6) and serious (Type II; manifestations 1-3). Additionally, we considered an alternative grouping of mild (Type A; manifestations 3-6) and serious (Type B; manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p < 0.01) improvement in trinomial model fit over the binomial (2-state) model. With the Type I/II definition, we found that the predicted probability of 'mild' DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed.

  11. Evaluation testing of a portable vapor detector for Part-Per-Billion (PPB) level UDMH and N2H4

    NASA Technical Reports Server (NTRS)

    Curran, Dan; Lueck, Dale E.

    1995-01-01

    Trace-level detection of hydrazine (N2H4), monomethyl hydrazine (MMH), and unsymmetrical dimethylhydrazine (UDMH) has been receiving increased attention over the past several years. In May 1995 the American Conference of Governmental Industrial Hygienists (ACGIH) lowered its acceptable threshold limit value (TLV) from 100 parts-per-billion (ppb) to 10 ppb. Several types of ppb-level detectors are being developed by the United States Air Force (USAF) Space and Missile Systems Center (SMSC). A breadboard version of a portable, lightweight hydrazine detection sensor was developed and produced by Giner Corp. for the USAF. This sensor was designed for ppb-level UDMH and N2H4 vapor detection in near real-time. This instrument employs electrochemical sensing, utilizing a three-electrode cell with an anion-exchange polymer electrolyte membrane as the only electrolyte in the system. The sensing, counter, and reference electrodes are bonded to the membrane, forming a single component. The only liquid required to maintain the sensor is deionized water, which hydrates the membrane. At the request of the USAF SMSC, independent testing and evaluation of the breadboard instrument was performed at NASA's Toxic Vapor Detection Laboratory (TVDL) for response to ppb-level N2H4, UDMH, and MMH. The TVDL, located at Kennedy Space Center (KSC), has the unique ability to generate calibrated sample vapor streams of N2H4, UDMH, and MMH over a range from less than 10 ppb to thousands of parts per million (ppm), with full environmental control of relative humidity (0-90%) and temperature (0-50 C). The TVDL routinely performs these types of tests. The referenced sensors were subjected to extensive testing, including precision, linearity, response/recovery times, zero and span drift, humidity and temperature effects, as well as ammonia interference. Results of these tests and general operation characteristics are reported.

  12. Validation of an evacuated canister method for measuring part-per-billion levels of chemical warfare agent simulants.

    PubMed

    Coffey, Christopher C; LeBouf, Ryan F; Calvert, Catherine A; Slaven, James E

    2011-08-01

    The National Institute for Occupational Safety and Health (NIOSH) research on direct-reading instruments (DRIs) needed an instantaneous sampling method to provide independent confirmation of the concentrations of chemical warfare agent (CWA) simulants. It was determined that evacuated canisters would be the method of choice. There is no method specifically validated for volatile organic compounds (VOCs) in the NIOSH Manual of Analytical Methods. The purpose of this study was to validate an evacuated canister method for sampling seven specific VOCs that can be used as a simulant for CWA agents (cyclohexane) or influence the DRI measurement of CWA agents (acetone, chloroform, methylene chloride, methyl ethyl ketone, hexane, and carbon tetrachloride [CCl4]). The method used 6-L evacuated stainless-steel fused silica-lined canisters to sample the atmosphere containing VOCs. The contents of the canisters were then introduced into an autosampler/preconcentrator using a microscale purge and trap (MPT) method. The MPT method trapped and concentrated the VOCs in the air sample and removed most of the carbon dioxide and water vapor. After preconcentration, the samples were analyzed using a gas chromatograph with a mass selective detector. The method was tested, evaluated, and validated using the NIOSH recommended guidelines. The evaluation consisted of determining the optimum concentration range for the method; the sample stability over 30 days; and the accuracy, precision, and bias of the method. This method meets the NIOSH guidelines for six of the seven compounds (excluding acetone) tested in the range of 2.3-50 parts per billion (ppb), making it suitable for sampling of these VOCs at the ppb level.

  13. New Schools, Overcrowding Relief, and Achievement Gains in Los Angeles--Strong Returns from a $19.5 Billion Investment. Policy Brief 12-2

    ERIC Educational Resources Information Center

    Welsh, William; Coghlan, Erin; Fuller, Bruce; Dauter, Luke

    2012-01-01

    Aiming to relieve overcrowded schools operating on multiple tracks, the Los Angeles Unified School District (LAUSD) has invested more than $19 billion to build 130 new facilities over the past decade. District leaders asked researchers at Berkeley to estimate the achievement effects of this massive initiative--benefits that may stem from entering…

  14. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  15. Probability sampling in legal cases: Kansas cellphone users

    NASA Astrophysics Data System (ADS)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  16. Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation

    NASA Astrophysics Data System (ADS)

    Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin

    2016-12-01

    If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km2 and 15,280 km2) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10-7 and 10-6, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
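
    The total-probability combination at the heart of the SST method can be sketched as below; the storm "arrival" and "transposition" probabilities are invented, and joint occurrences of these rare storms are neglected for simplicity.

```python
# Sketch of the total-probability combination used in stochastic storm
# transposition: annual exceedance probability of a target catchment rainfall
# depth ~ sum over storms of (arrival probability) x (probability the transposed
# storm position delivers at least the target depth). Storm values are invented,
# and joint occurrences of rare storms are ignored.
storms = [
    # (annual arrival probability, P(transposed position gives >= target depth))
    (0.020, 0.10),
    (0.010, 0.25),
    (0.005, 0.60),
]

aep = sum(p_arrival * p_position for p_arrival, p_position in storms)
print(f"annual exceedance probability ~ {aep:.4f}")  # -> ~0.0075
```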

  17. Elemental mercury poisoning probably causes cortical myoclonus.

    PubMed

    Ragothaman, Mona; Kulkarni, Girish; Ashraf, Valappil V; Pal, Pramod K; Chickabasavaiah, Yasha; Shankar, Susarla K; Govindappa, Srikanth S; Satishchandra, Parthasarthy; Muthane, Uday B

    2007-10-15

    Mercury toxicity causes postural tremors, commonly referred to as "mercurial tremors," and cerebellar dysfunction. A 23-year-old woman developed disabling generalized myoclonus and ataxia 2 years after injecting herself with elemental mercury. Electrophysiological studies confirmed that the myoclonus was probably of cortical origin. Her deficits progressed over 2 years and improved after subcutaneous mercury deposits at the injection site were surgically cleared. Myoclonus of cortical origin has never been described in mercury poisoning. It is important to ask patients presenting with jerks about exposure to elemental mercury, even if they have a progressive illness, as it is a potentially reversible condition, as in our patient.

  18. The Prediction of Spatial Aftershock Probabilities (PRESAP)

    NASA Astrophysics Data System (ADS)

    McCloskey, J.

    2003-12-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the great number of local, telemetered seismic networks, and the rapid acquisition of data from satellites, coupled with the speed of modern telecommunications and data transfer, all mean that it may now be possible to apply these new techniques in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real-time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days after the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real-time, as data of better quality become available over the following day to tens of days. Specifically, the project aim is to assess the

  19. Probable interaction between trazodone and carbamazepine.

    PubMed

    Sánchez-Romero, A; Mayordomo-Aranda, A; García-Delgado, R; Durán-Quintana, J A

    2011-06-01

    The need to maintain long-term treatment of chronic pathologies makes the appearance of interactions possible when such therapies incorporate other drugs to deal with the aggravation of the same or other intercurrent pathologies. A case is presented in which the addition of trazodone to a chronic treatment with carbamazepine (CBZ) is associated with symptoms typical for intoxication by this antiepileptic, accompanied by a raised serum concentration. When the trazodone was suspended, these symptoms lessened and the concentration of CBZ decreased progressively, suggesting a probable interaction between the 2 drugs.

  20. Atomic transition probabilities of Gd I

    NASA Astrophysics Data System (ADS)

    Lawler, J. E.; Bilty, K. A.; Den Hartog, E. A.

    2011-05-01

    Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd I). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of the first and second spectra lines is also described and demonstrated in this work.

  1. Atomic transition probabilities of Er I

    NASA Astrophysics Data System (ADS)

    Lawler, J. E.; Wyart, J.-F.; Den Hartog, E. A.

    2010-12-01

    Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.

  2. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
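
    The histogram-of-samples idea can be sketched as below for a plain sinusoidal carrier; the sample rate, carrier frequency, and bin count are illustrative, and the snippet only builds the PDF, not the modulation scheme itself.

```python
# Sketch: sample one cycle of a sinusoidal carrier and build the histogram
# (empirical PDF) of the sample values. Parameters are illustrative.
import numpy as np

fs, f0 = 10_000.0, 100.0                     # sample rate and carrier frequency (Hz)
t = np.arange(0.0, 1.0 / f0, 1.0 / fs)       # one full carrier cycle (>= half cycle)
samples = np.sin(2 * np.pi * f0 * t)

pdf, edges = np.histogram(samples, bins=16, density=True)
print(pdf)   # arcsine-shaped PDF: values near +1 and -1 occur most often
```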

  3. Probability density functions in turbulent channel flow

    NASA Technical Reports Server (NTRS)

    Dinavahi, Surya P. G.

    1992-01-01

    The probability density functions (pdf's) of the fluctuating velocity components, as well as their first and second derivatives, are calculated using data from the direct numerical simulations (DNS) of fully developed turbulent channel flow. It is observed that, beyond the buffer region, the pdf of each of these quantities is independent of the distance from the channel wall. It is further observed that, beyond the buffer region, the pdf's for all the first derivatives collapse onto a single universal curve and those of the second derivatives also collapse onto another universal curve, irrespective of the distance from the wall. The kinetic-energy dissipation rate exhibits log normal behavior.

  4. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    NASA Astrophysics Data System (ADS)

    Tan, Elcin

    physically possible upper limits of precipitation due to climate change. The simulation results indicate that the meridional shift in atmospheric conditions is the optimum method to determine maximum precipitation in consideration of cost and efficiency. Finally, exceedance probability analyses of the model results of 42 historical extreme precipitation events demonstrate that the 72-hr basin averaged probable maximum precipitation is 21.72 inches for the exceedance probability of 0.5 percent. On the other hand, the current operational PMP estimation for the American River Watershed is 28.57 inches as published in the hydrometeorological report no. 59 and a previous PMP value was 31.48 inches as published in the hydrometeorological report no. 36. According to the exceedance probability analyses of this proposed method, the exceedance probabilities of these two estimations correspond to 0.036 percent and 0.011 percent, respectively.
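
    For readers relating the exceedance probabilities quoted above to return periods, the reciprocal relation gives a quick conversion; the sketch below simply applies it to the three percentages in this record.

```python
# Convert the annual exceedance probabilities (AEP) quoted above into
# approximate return periods using the reciprocal relation T ~ 1 / AEP.
for aep_percent in (0.5, 0.036, 0.011):
    aep = aep_percent / 100.0
    print(f"AEP {aep_percent}% ~ 1-in-{1 / aep:,.0f}-year event")
```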

  5. Estimating flood exceedance probabilities in estuarine regions

    NASA Astrophysics Data System (ADS)

    Westra, Seth; Leonard, Michael

    2016-04-01

    Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence between these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic-scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury Nepean River (New South Wales).

  6. An all-timescales rainfall probability distribution

    NASA Astrophysics Data System (ADS)

    Papalexiou, S. M.; Koutsoyiannis, D.

    2009-04-01

    The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance, as the hydraulic design typically depends strongly on the choice of rainfall model. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable of adequately describing rainfall at all timescales, is a difficult task. A search in the hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, three- and four-parameter Kappa distribution, and many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions, and is capable of describing rainfall over a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution in various rainfall samples from different areas and for timescales varying from sub-hourly to annual.

  7. Computation-distributed probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Wang, Junjie; Zhao, Lingling; Su, Xiaohong; Shi, Chunmei; Ma, JiQuan

    2016-12-01

    Particle probability hypothesis density filtering has become a promising approach for multi-target tracking due to its capability of handling an unknown and time-varying number of targets in a nonlinear, non-Gaussian system. However, its computational complexity increases linearly with the number of obtained observations and the number of particles, which can be very time consuming, particularly when numerous targets and clutter exist in the surveillance region. To address this issue, we present a distributed-computation particle probability hypothesis density (PHD) filter for target tracking. It runs several decomposed local particle PHD filters in parallel on separate processing elements. Each processing element takes responsibility for a portion of the particles but all measurements and provides local estimates. A central unit controls particle exchange among the processing elements and specifies a fusion rule to match and fuse the estimates from the different local filters. The proposed framework is suitable for parallel implementation. Simulations verify that the proposed method achieves a significant speedup while maintaining accuracy comparable to the standard particle PHD filter.
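
    A schematic of the partition-and-fuse structure described above (not a PHD filter proper; no birth/death intensities or clutter model) is sketched below for a one-dimensional, single-target toy problem. The number of processing elements, the Gaussian likelihood, and the mass-weighted fusion rule are assumptions for illustration.

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        N_PE = 4                  # number of processing elements (assumed)
        N_PARTICLES = 4000        # total number of particles (assumed)
        SIGMA = 0.5               # measurement noise standard deviation (assumed)

        def local_filter(particles, weights, measurements):
            """One processing element: owns a slice of the particles but sees every
            measurement; returns its local state estimate and total weight mass."""
            for z in measurements:
                weights = weights * np.exp(-0.5 * ((particles - z) / SIGMA) ** 2)
            mass = weights.sum()
            estimate = (weights @ particles) / mass if mass > 0 else particles.mean()
            return estimate, mass

        def fuse(local_results):
            """Central unit: combine local estimates, weighting each by its mass."""
            estimates, masses = zip(*local_results)
            masses = np.asarray(masses)
            return float(np.dot(estimates, masses) / masses.sum())

        rng = np.random.default_rng(0)
        measurements = 3.0 + SIGMA * rng.standard_normal(5)   # true state is 3.0

        particles = rng.uniform(-10.0, 10.0, N_PARTICLES)
        weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)
        chunks = zip(np.array_split(particles, N_PE), np.array_split(weights, N_PE))

        with ThreadPoolExecutor(max_workers=N_PE) as pool:
            results = list(pool.map(lambda c: local_filter(c[0], c[1], measurements), chunks))

        print("fused estimate:", fuse(results))   # should be close to 3.0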

  8. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasability of a Billion-Ton Annual Supply

    SciTech Connect

    Perlack, R.D.

    2005-12-15

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country's present petroleum consumption--the goal set by the Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  9. Dipolar geomagnetic field and low orbital obliquity during the last two billion years: Evidence from paleomagnetism of evaporite basins

    NASA Astrophysics Data System (ADS)

    Evans, D. A.

    2006-05-01

    Paleomagnetism of climatically sensitive sedimentary rock types, such as glacial deposits and evaporites, can test the uniformitarianism of ancient geomagnetic fields and paleoclimatic zones. Precambrian glacial deposits laid down in near-equatorial paleomagnetic latitudes indicate a paleoclimatic paradox that can be explained either by Snowball Earth episodes, or high orbital obliquity, or dramatically non-uniformitarian geomagnetic fields. Here I present the first global paleomagnetic compilation of the Earth's entire basin-scale evaporite record. Evaporation exceeds precipitation in today's subtropical desert belts, generally within a zone of 15-35° from the equator. Assuming a geocentric axial dipole (GAD) magnetic field for Cenozoic-Mesozoic time, evaporite basins of the past 250 Myr have a volume-weighted mean paleolatitude of 23±4°, also squarely within the subtropics. Carboniferous-Permian evaporites have an indistinguishable weighted-mean paleolatitude of 22±4°, which does not change significantly when recently hypothesized octupolar field components are included in the calculations. Early Paleozoic (including late Ediacaran) evaporites are lower-latitude (weighted mean 10±5°), but detailed analyses of individual examples show this cannot be attributed solely to nondipolar field components or sedimentary inclination biases; the cause may be due to particular paleogeographic effects on regional tropical climates, or incomplete sampling by the paleomagnetic data. Proterozoic (pre-Ediacaran) evaporite basins have a volume-weighted mean inclination of 33±4°, which would correspond to a mean paleolatitude of 18±3° for a pure GAD field. This latter mean is indistinguishable, within error, from the Cenozoic-Mesozoic mean and demonstrates the success of the GAD model as a first-order description of the geomagnetic field for the last two billion years. Also, general circulation climate models of a high-obliquity Earth predict either no strong zonal
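
    The inclination-to-paleolatitude conversion quoted above follows the standard geocentric axial dipole relation tan I = 2 tan(latitude); a short check of the 33° to 18° figure:

        import math

        def gad_paleolatitude(inclination_deg):
            """Paleolatitude implied by a magnetic inclination under a pure geocentric
            axial dipole field: tan(I) = 2 * tan(latitude)."""
            return math.degrees(math.atan(math.tan(math.radians(inclination_deg)) / 2.0))

        # The Proterozoic mean inclination of 33 deg maps to a paleolatitude of ~18 deg.
        print(round(gad_paleolatitude(33.0), 1))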

  10. Who'll have to pay? The cost of dealing with AIDS in Asia will run into the billions.

    PubMed

    1993-11-03

    In September 1993, at a meeting funded by the Asian Development Bank and the United Nations Development Program, researchers, economists, and government health officials from China, India, Indonesia, South Korea, Burma, the Philippines, Sri Lanka, and Thailand met to discuss the economic effects of human immunodeficiency virus (HIV) and acquired immunodeficiency syndrome (AIDS) on Asia. The World Health Organization (WHO) places the estimate of the number of people in India who are infected with HIV at around 1 million. However, Jacob John of Vellore Medical College (who first discovered the virus in India) places the estimate at higher than 2.5 million with an increase to 9-18 million by the year 2000. Charles Myers of Harvard University, Mechai Viravaidya of Bangkok's Population and Community Development Association, and Stasia Obremskey (a health and development consultant) predict 3.4-4.3 million Thais will be infected by that year. According to Obremskey, the number of AIDS cases will reach 650,000, of which 500,000 will die. Health care for full-blown AIDS costs $1016/yr, while lost productivity due to early death costs $22,000 per victim. Myers, Mechai and Obremskey state that Thailand could prevent 3.5 million cases and save $5.1 billion, if people ceased high-risk behavior and the treatment of sexually transmitted diseases was given the highest priority. In the Philippines there are only 416 reported cases of HIV and AIDS, but Dennis Maducduc of the Department of Health AIDS program states that Filipinos are secretive about this, and Orville Solon of the University of the Philippines suggests there are 100 cases for each reported case. Solon believes $15 million has been lost due to infection and death of overseas contract workers who account for 8% of the country's foreign exchange earnings. New studies in Africa, where, as in Thailand, mortality is less than predicted, suggest a less virulent strain of HIV. This apparent fact and prevention, especially

  11. Measures, Probability and Holography in Cosmology

    NASA Astrophysics Data System (ADS)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 x 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  12. Significance of "high probability/low damage" versus "low probability/high damage" flood events

    NASA Astrophysics Data System (ADS)

    Merz, B.; Elmer, F.; Thieken, A. H.

    2009-06-01

    The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage. Traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus the contribution of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than is expressed by the expected annual damage. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced in the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making. Different flood mitigation decisions are probable when risk aversion is taken into account.
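
    A minimal sketch of the expected annual damage calculation underlying this comparison, using a hypothetical damage-frequency curve rather than the paper's case-study data:

        import numpy as np

        # Hypothetical damage-frequency curve: annual exceedance probability of each
        # flood level and the damage (million EUR) it causes; the values are
        # illustrative only, not taken from the paper's case studies.
        p = np.array([0.002, 0.01, 0.02, 0.1, 0.2, 0.5])   # exceedance probabilities
        d = np.array([30.0, 15.0, 10.0, 5.0, 3.0, 0.0])    # damage at those levels

        # Expected annual damage = integral of damage over exceedance probability,
        # approximated here with the trapezoidal rule.
        ead = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))

        # Share contributed by events rarer than the 100-year flood (p <= 0.01).
        extreme = 0.5 * (d[0] + d[1]) * (p[1] - p[0])
        print(f"EAD = {ead:.2f} M EUR/yr, extreme-event share = {100 * extreme / ead:.0f}%")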

  13. Not All Probabilities Are Equivalent: Evidence From Orientation Versus Spatial Probability Learning.

    PubMed

    Jabar, Syaheed B; Anderson, Britt

    2017-02-23

    Frequent targets are detected faster, probable locations are searched earlier, and likely orientations are estimated more precisely. Are these all consequences of a single, domain-general "attentional" effect? To examine this issue, participants were shown brief instances of spatial gratings and were tasked to draw their location and orientation. Unknown to participants, either the location or the orientation probability of these gratings was manipulated. While orientation probability affected the precision of orientation reports, spatial probability did not. Further, utilising lowered stimulus contrast (via a staircase procedure) and a combination of behavioral precision and confidence self-report, we separated trials with perceived stimuli from trials where the target was not detected: spatial probability modulated only the likelihood of stimulus detection, but did not modulate perceptual precision. Even when no physical attentional cues are present, acquired probabilistic information on space versus orientation leads to separable 'attention-like' effects on behaviour. We discuss how this could be linked to distinct underlying neural mechanisms.

  14. A reconstruction of Archean biological diversity based on molecular fossils from the 2.78 to 2.45 billion-year-old Mount Bruce Supergroup, Hamersley Basin, Western Australia

    NASA Astrophysics Data System (ADS)

    Brocks, Jochen J.; Buick, Roger; Summons, Roger E.; Logan, Graham A.

    2003-11-01

    Bitumens extracted from 2.7 to 2.5 billion-year-old (Ga) shales of the Fortescue and Hamersley Groups in the Pilbara Craton, Western Australia, contain traces of molecular fossils. Based on a combination of molecular characteristics typical of many Precambrian bitumens, their consistently and unusually high thermal maturities, and their widespread distribution throughout the Hamersley Basin, the bitumens can be characterized as 'probably of Archean age'. Accepting this interpretation, the biomarkers open a new window on Archean biodiversity. The presence of hopanes in the Archean rocks confirms the antiquity of the domain Bacteria, and high relative concentrations of 2α-methylhopanes indicate that cyanobacteria were important primary producers. Oxygenic photosynthesis therefore evolved > 2.7 Ga ago, and well before independent evidence suggests significant levels of oxygen accumulated in the atmosphere. Moreover, the abundance of cyanobacterial biomarkers in shales interbedded with oxide-facies banded iron formations (BIF) indicates that although some Archean BIF might have been formed by abiotic photochemical processes or anoxygenic phototrophic bacteria, those in the Hamersley Group formed as a direct consequence of biological oxygen production. Biomarkers of the 3β-methylhopane series suggest that microaerophilic heterotrophic bacteria, probably methanotrophs or methylotrophs, were active in late Archean environments. The presence of steranes in a wide range of structures with relative abundances like those from late Paleoproterozoic to Phanerozoic sediments is convincing evidence for the existence of eukaryotes in the late Archean, 900 Ma before visible fossil evidence indicates that the lineage arose. Sterol biosynthesis in extant eukaryotes requires molecular oxygen. The presence of steranes together with biomarkers of oxygenic photosynthetic cyanobacteria suggests that the concentration of dissolved oxygen in some regions of the upper water column was

  15. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    ERIC Educational Resources Information Center

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  16. Economic choices reveal probability distortion in macaque monkeys.

    PubMed

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing.
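
    The inverted-S weighting functions reported here are commonly parameterized, for example with the one-parameter Tversky-Kahneman form; the sketch below evaluates that form with an illustrative curvature parameter, not a value fitted to the monkeys' choices.

        def weight_tk(p, gamma=0.6):
            """One-parameter Tversky-Kahneman probability weighting function; gamma < 1
            gives the inverted-S shape (low probabilities overweighted, high
            probabilities underweighted). gamma = 0.6 is illustrative, not fitted."""
            return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

        for p in (0.05, 0.25, 0.5, 0.75, 0.95):
            print(f"p = {p:.2f}  ->  w(p) = {weight_tk(p):.2f}")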

  17. On the probability of dinosaur fleas.

    PubMed

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-11

    Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data.

  18. Quantum probabilities for inflation from holography

    SciTech Connect

    Hartle, James B.; Hawking, S.W.; Hertog, Thomas E-mail: S.W.Hawking@damtp.cam.ac.uk

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in h-bar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  19. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2006-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.

  20. Probability-one homotopies in computational science

    NASA Astrophysics Data System (ADS)

    Watson, Layne T.

    2002-03-01

    Probability-one homotopy algorithms are a class of methods for solving nonlinear systems of equations that, under mild assumptions, are globally convergent for a wide range of problems in science and engineering. Convergence theory, robust numerical algorithms, and production quality mathematical software exist for general nonlinear systems of equations, and special cases such as Brouwer fixed point problems, polynomial systems, and nonlinear constrained optimization. Using a sample of challenging scientific problems as motivation, some pertinent homotopy theory and algorithms are presented. The problems considered are analog circuit simulation (for nonlinear systems), reconfigurable space trusses (for polynomial systems), and fuel-optimal orbital rendezvous (for nonlinear constrained optimization). The mathematical software packages HOMPACK90 and POLSYS_PLP are also briefly described.
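
    A naive fixed-step homotopy tracker, sketched below for an illustrative two-equation system, conveys the basic idea: deform from the trivial problem x - a = 0 to F(x) = 0 while correcting with Newton steps. Production codes such as HOMPACK90 instead track the zero curve by arc length, which is what underpins the probability-one guarantee; this sketch is only a schematic under assumed, well-behaved conditions.

        import numpy as np

        def F(x):
            """Illustrative nonlinear system F(x) = 0 (not taken from the paper)."""
            return np.array([x[0]**2 + x[1]**2 - 4.0,
                             np.exp(x[0]) + x[1] - 1.0])

        def jacobian(x, h=1e-7):
            """Forward-difference Jacobian of F."""
            n, f0 = len(x), F(x)
            jac = np.empty((n, n))
            for i in range(n):
                xp = x.copy()
                xp[i] += h
                jac[:, i] = (F(xp) - f0) / h
            return jac

        def homotopy_solve(a, steps=200, newton_iters=5):
            """Track H(lam, x) = lam*F(x) + (1 - lam)*(x - a) from lam = 0 (trivial
            root x = a) to lam = 1 (a root of F), correcting with Newton steps. The
            starting point a plays the role of the random point in probability-one
            theory; production codes track the zero curve by arc length instead."""
            x = a.copy()
            for lam in np.linspace(0.0, 1.0, steps + 1)[1:]:
                for _ in range(newton_iters):
                    H = lam * F(x) + (1.0 - lam) * (x - a)
                    dH = lam * jacobian(x) + (1.0 - lam) * np.eye(len(x))
                    x = x - np.linalg.solve(dH, H)
            return x

        root = homotopy_solve(np.array([1.0, 1.0]))
        print(root, F(root))   # F(root) should be essentially zero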

  1. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field. It has also recently been used in biometric and multimedia information retrieval systems. This technology stems from successive research on audio feature extraction analysis. The probability distribution function (PDF) is a statistical method that is usually used as one of the processing steps in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed in which the PDF alone serves as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of sampled voice signals obtained from a number of individuals are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.

  2. Microtechnique for most-probable-number analysis.

    PubMed

    Rowe, R; Todd, R; Waide, J

    1977-03-01

    A microtechnique based on the most-probable-number (MPN) method has been developed for the enumeration of the ammonium-oxidizing population in soil samples. An MPN table for a research design (8 by 12; i.e., 12 dilutions, 8 replicates per dilution) is presented. A correlation of 0.68 was found between MPNs determined by the microtechnique and the standard tube technique. Higher MPNs were obtained with the microtechnique, with increased accuracy in endpoint determinations being a possible cause. Considerable savings of time, space, equipment, and reagents are observed using this method. The microtechnique described may be adapted to other microbial populations using various types of media and endpoint determinations.
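
    For orientation, the most-probable-number estimate itself is the density that maximizes the binomial likelihood of the observed pattern of positive tubes across dilutions; the sketch below does this by grid search for a hypothetical 3-dilution, 8-replicate series (the paper's design uses 12 dilutions with 8 replicates each).

        import numpy as np

        def mpn_estimate(volumes, positives, tubes):
            """Maximum-likelihood most-probable-number estimate (organisms per unit
            volume) for a dilution series: volume of original sample inoculated per
            tube, number of positive tubes, and number of replicate tubes at each
            dilution. A dense log-spaced grid search stands in for the usual
            root-finding; published MPN tables tabulate the same calculation."""
            v = np.asarray(volumes, dtype=float)[:, None]
            pos = np.asarray(positives, dtype=float)[:, None]
            n = np.asarray(tubes, dtype=float)[:, None]
            lam = np.logspace(-3, 4, 100_000)              # candidate densities
            loglik = (pos * np.log1p(-np.exp(-lam * v)) - (n - pos) * lam * v).sum(axis=0)
            return lam[np.argmax(loglik)]

        # Hypothetical 3-dilution, 8-replicate series (the paper's design uses 12
        # dilutions with 8 replicates); volumes are ml of original sample per tube.
        print(round(mpn_estimate([0.1, 0.01, 0.001], [8, 5, 1], [8, 8, 8]), 1))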

  3. Continuity of percolation probability on hyperbolic graphs

    NASA Astrophysics Data System (ADS)

    Wu, C. Chris

    1997-05-01

    Let T_k be a forwarding tree of degree k where each vertex other than the origin has k children and one parent and the origin has k children but no parent (k ≥ 2). Define G to be the graph obtained by adding to T_k nearest-neighbor bonds connecting the vertices which are in the same generation. G is regarded as a discretization of the hyperbolic plane H^2 in the same sense that Z^d is a discretization of R^d. Independent percolation on G has been proved to have multiple phase transitions. We prove that the percolation probability θ(p) is continuous on [0,1] as a function of p.

  4. Probability and delay discounting of erotic stimuli.

    PubMed

    Lawyer, Steven R

    2008-09-01

    Adult undergraduate men (n=38) and women (n=33) were categorized as erotica "users" (n=34) and "non-users" (n=37) based on their responses to screening questions and completed computerized delay and probability discounting tasks concerning hypothetical money and erotica. Erotica users discounted the value of erotica similarly to money on three of the four erotica tasks; erotica non-users discounted the value of money consistent with erotica users, but not the value of erotica. Erotica users were disproportionately male, scored higher on several psychometric measures of sexuality-related constructs, and exhibited more impulsive choice patterns on the delay discounting for money task than erotica non-users did. These findings suggest that discounting processes generalize to erotic outcomes for some individuals.
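
    Delay and probability discounting tasks of this kind are commonly summarized with hyperbolic discounting functions; the sketch below shows those standard forms with illustrative parameters, not values fitted to this study's participants.

        def delay_discounted_value(amount, delay, k):
            """Mazur's hyperbolic delay discounting: V = A / (1 + k*D)."""
            return amount / (1.0 + k * delay)

        def probability_discounted_value(amount, p, h):
            """Hyperbolic probability discounting over the odds against winning,
            theta = (1 - p) / p: V = A / (1 + h*theta)."""
            return amount / (1.0 + h * (1.0 - p) / p)

        # Illustrative parameters only (not fitted to these participants); a more
        # impulsive chooser corresponds to a larger k, i.e. steeper devaluation.
        print(delay_discounted_value(100.0, delay=30, k=0.05))        # -> 40.0
        print(probability_discounted_value(100.0, p=0.25, h=1.0))     # -> 25.0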

  5. Trending in Probability of Collision Measurements

    NASA Technical Reports Server (NTRS)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of Probabilities of Collision (P_c) for conjunction events. The model attempts to predict the location and magnitude of the peak P_c value for an event by assuming the progression of P_c values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P_c and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
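
    A least-squares stand-in for the paper's Bayesian treatment, fitting a downward-opening parabola to hypothetical log10(Pc) reports and reading off the predicted peak:

        import numpy as np

        # Hypothetical series of collision-probability reports as an event develops,
        # indexed by days to time of closest approach (TCA); values are illustrative.
        days_to_tca = np.array([6.0, 5.0, 4.0, 3.0, 2.5, 2.0])
        pc          = np.array([2e-7, 8e-7, 3e-6, 7e-6, 9e-6, 8.5e-6])

        # First-order model from the abstract: the Pc progression follows a
        # downward-opening parabola. Ordinary least squares on log10(Pc) stands in
        # for the paper's Bayesian treatment with priors from past conjunctions.
        a, b, c = np.polyfit(days_to_tca, np.log10(pc), deg=2)

        if a < 0:                          # parabola opens downward, so a peak exists
            t_peak = -b / (2.0 * a)        # vertex of the fitted parabola
            pc_peak = 10.0 ** np.polyval([a, b, c], t_peak)
            print(f"predicted peak Pc ~ {pc_peak:.1e} at {t_peak:.1f} days to TCA")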

  6. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2004-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ONEs or ZEROs. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental natural laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.

  7. Microtechnique for Most-Probable-Number Analysis

    PubMed Central

    Rowe, R.; Todd, R.; Waide, J.

    1977-01-01

    A microtechnique based on the most-probable-number (MPN) method has been developed for the enumeration of the ammonium-oxidizing population in soil samples. An MPN table for a research design (8 by 12; i.e., 12 dilutions, 8 replicates per dilution) is presented. A correlation of 0.68 was found between MPNs determined by the microtechnique and the standard tube technique. Higher MPNs were obtained with the microtechnique, with increased accuracy in endpoint determinations being a possible cause. Considerable savings of time, space, equipment, and reagents are observed using this method. The microtechnique described may be adapted to other microbial populations using various types of media and endpoint determinations. PMID: 16345226

  8. Parabolic Ejecta Features on Titan? Probably Not

    NASA Astrophysics Data System (ADS)

    Lorenz, R. D.; Melosh, H. J.

    1996-03-01

    Radar mapping of Venus by Magellan indicated a number of dark parabolic features, associated with impact craters. A suggested mechanism for generating such features is that ejecta from the impact event is 'winnowed' by the zonal wind field, with smaller ejecta particles falling out of the atmosphere more slowly, and hence drifting further. What discriminates such features from simple wind streaks is the 'stingray' or parabolic shape. This is due to the ejecta's spatial distribution prior to being winnowed during fallout, and this distribution is generated by the explosion plume of the impact piercing the atmosphere, allowing the ejecta to disperse pseudoballistically before re-entering the atmosphere, decelerating to terminal velocity and then being winnowed. Here we apply this model to Titan, which has a zonal wind field similar to that of Venus. We find that Cassini will probably not find parabolic features, as the winds stretch the deposition so far that ejecta will form streaks or bands instead.

  9. Bayesian probability approach to ADHD appraisal.

    PubMed

    Robeva, Raina; Penberthy, Jennifer Kim

    2009-01-01

    Accurate diagnosis of attentional disorders such as attention-deficit hyperactivity disorder (ADHD) is imperative because there are multiple negative psychosocial sequelae related to undiagnosed and untreated ADHD. Early and accurate detection can lead to effective intervention and prevention of negative sequelae. Unfortunately, diagnosing ADHD presents a challenge to traditional assessment paradigms because there is no single test that definitively establishes its presence. Even though ADHD is a physiologically based disorder with a multifactorial etiology, the diagnosis has been traditionally based on a subjective history of symptoms. In this chapter we outline a stochastic method that utilizes a Bayesian interface for quantifying and assessing ADHD. It can be used to combine a variety of psychometric tests and physiological markers into a single standardized instrument that, at each step, refines a probability of ADHD for each individual based on the information provided by the individual assessments. The method is illustrated with data from a small study of six female college students with ADHD and six matched controls in which the method achieves correct classification for all participants, where none of the individual assessments was capable of achieving perfect classification. Further, we provide a framework for applying this Bayesian method for performing meta-analysis of data obtained from disparate studies and using disparate tests for ADHD based on calibration of the data into a unified probability scale. We use this method to combine data from five studies that examine the diagnostic abilities of different behavioral rating scales and EEG assessments of ADHD, enrolling a total of 56 ADHD and 55 control subjects of different age groups and gender.
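
    The general flavor of such a stepwise Bayesian refinement can be sketched as repeated application of Bayes' rule, one assessment at a time, assuming conditional independence of the assessments; the sensitivities, specificities, and base rate below are hypothetical, not the chapter's calibrated values.

        def bayes_update(prior, sensitivity, specificity, positive):
            """One Bayes step: revise P(ADHD) after a single assessment outcome, given
            that assessment's sensitivity and specificity (assessments are treated as
            conditionally independent)."""
            if positive:
                like_adhd, like_control = sensitivity, 1.0 - specificity
            else:
                like_adhd, like_control = 1.0 - sensitivity, specificity
            numerator = like_adhd * prior
            return numerator / (numerator + like_control * (1.0 - prior))

        # Hypothetical chain of assessments (rating scale, continuous-performance
        # test, EEG marker) with made-up operating characteristics; each step refines
        # the probability produced by the previous one.
        p = 0.10                                        # assumed prior / base rate
        for sens, spec, result in [(0.80, 0.75, True), (0.70, 0.85, True), (0.65, 0.90, False)]:
            p = bayes_update(p, sens, spec, result)
            print(round(p, 3))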

  10. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  11. a Billion Years of Steady-State Crustal Growth; the 1.9-0.9 GA Evolution of SW Fennoscandia

    NASA Astrophysics Data System (ADS)

    Slagstad, T.; Roberts, N. M.; Andersen, T.; Sauer, S.; Røhr, T. S.

    2012-12-01

    continent-continent collision, but has more recently been placed in the context of orogenesis within an accretionary orogen. Either way, the 1.02-0.98 Ga period was an extensive compressional phase that saw the final accretion and stabilization of all existing fragments of this margin. An extensional phase resumed after 0.96 Ga, and subduction either ceased altogether or jumped outboard again. Steady-state crustal growth characterizes much of this billion-year period. Variations in trench-continent movement (tectonic switching) produce periods of more crustal reworking alternating with periods of greater juvenile addition; alternating compressional and extensional environments stabilize as well as produce arc-derived continental crust. The preserved record, however, is not representative of the crust produced; whereas the dominantly extensional period has produced a large expanse of crust that is partially preserved, a dominantly compressional environment removed crust to the mantle. Although a full continent-continent collision may not have occurred, the regional geodynamics, possibly related to Rodinia formation, have biased the preserved record of crustal growth. Our model supports the concept of steady-state crustal growth, with a preservation bias that leaves an episodic record.

  12. Recrystallized Granite Surface Fissures Of The Wasatch Range, Produced Not Later Than 1/4 Billion Years Ago

    NASA Astrophysics Data System (ADS)

    McDonald, Keith L.

    2000-05-01

    Our studies of numerous recrystallized fissures in 4 granite plutons of the Wasatch Range, namely, Mount Tuscarora-Wolverine-Millicent,^1,6,7 Bonanza Peak-Midway,^2 Little Cottonwood Canyon and Ferguson Canyon plutons, all of which formed magma chambers reaching the Earth-atmosphere interface, establish that they resulted from high thermal gradients rather than passages of earthquake waves. Magma chambers formed and solidified during the Permo-Carboniferous Ice Age (roughly 1/3...1/4 billion yr ago, a time interval preceding the period of extrusion of the Rocky Mountains, 10^8 yr ago), and while fluid, belched lava flows^5 extending over their reservoir walls to run hundreds of meters. We have shown how the magma melts, dilutes and replaces overlying metamorphic rock^7 to reach Earth's surface, so that a pluton containing large amounts of dross (Fe-ores, etc.) had a short fluid lifetime. We also described how offshoots from a long-running main fissure form acute angles with that fissure.^3 Such recrystallized fissures, reaching depths of perhaps 100 m, have initial fractures near the time of solidification of the uppermost portion of the magma chamber while still hot (<= 1600 °F), a time when maximum stresses occur near the granite surface due to high thermal gradients, owing to snow coverage, cold water contacts due to rain, stream flow over the granite surface, partial coverage by ocean, etc., wherever heat sinks might occur, during the P-C ice age--when the region of the Wasatch Range existed at sea level, Salt Lake Valley being covered entirely by ocean water and the region east of Wasatch Boulevard rising gently above the Pacific Ocean to elevations of possibly 500-1000 ft, say, at a distance of 10-15 mi to the east. This fact is implied by the Chinese Wall of white limestone on Grandeur Peak, unequivocally, and similarly another in Neff's Canyon running e. from the n. ridge of the 9200 ft. saddle-summit, as well as a dozen other ancient calcified stream beds emptying into the ocean to the w., in Salt Lake Valley. This existed prior to regional

  13. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply, April 2005

    SciTech Connect

    2005-04-01

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country’s present petroleum consumption – the goal set by the Biomass R&D Technical Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  14. Medical migration. A study projects Americans spending up to $68 billion abroad by 2010 for treatment, but some doubt the trend's momentum.

    PubMed

    Rhea, Shawn

    2008-05-05

    As soaring costs are sending Americans abroad for healthcare, a new study projects they'll spend $68 billion annually by 2010 for overseas treatment. Besides that cost, it could pressure U.S. providers to deliver quality for less. "As you see other cultures where the outcomes look very favorable, then we'll see more people questioning why we can't provide that level of care at similar costs," says Jeffrey Moe, left.

  15. Effects of pH changes of stock normal saline solution on 5 percent red cell suspension.

    PubMed

    Martin, Gregorio L; Caraan, P J M; Chua, Jared Jenrik S; Crescini, Jules Albert L; Diokno, Joseph Martin C; Javier, Christopher Bernard D; Reyes, Kristina Barbara O; Soliven, Rianne Y

    2014-01-01

    The red cell suspension (RCS) is a universally used indicator system to demonstrate antigen and antibody reactions in vitro. Saline solutions that are used in its preparation are preferred to be fresh to avoid changes in pH that may affect the results. Thus, buffered saline such as phosphate-buffered saline (PBS) is the ideal diluent because its pH is maintained for a certain period. However, normal saline solution (NSS) is more commonly used because it is inexpensive and easy to make. pH changes in the saline solutions and the RCSs were monitored for 1 week. Macroscopic examination of changes in degree of redness of RCS was also observed. Red blood cell (RBC) indices of the ethylenediaminetetraacetic acid (EDTA)-anticoagulated blood used in the preparation of the RCS were measured in the performance of an automated complete blood count. Qualitative examination of the crenation of RBCs was done on the prepared blood smears and graded by three registered medical technologists. Percentage crenation was then determined using an improved Neubauer counting chamber. Three trials were performed, and results were averaged. Statistical analysis showed that there were significant differences in the average pH of PBS and NSS and the average pH of RCS-PBS and RCS-NSS over 1 week. RBC indices measured in EDTA-anticoagulated donor blood showed no significant difference. Macroscopic examination of changes in degree of redness of the RCS showed that color darkened over 1 week but only by a small degree. Qualitative and quantitative examination of crenation of RBCs in RCS-PBS and RCS-NSS both showed no significant differences over 1 week. The experimental group (RCS-NSS) consistently showed a higher grade of crenation than the control group (RCS-PBS). Crenation of RBCs still manifests microscopically despite the lack of a significant relationship between the pH of the saline solutions and the degree and percentage of crenation. Crenation, therefore, cannot be attributed to pH alone but occurs as a result of other factors.

  16. A resolution to express the sense of the Senate in support of reducing its budget by at least 5 percent.

    THOMAS, 112th Congress

    Sen. Wicker, Roger F. [R-MS

    2011-03-08

    03/16/2011 Resolution agreed to in Senate without amendment and with a preamble by Unanimous Consent. (text: CR S1768) Status: Passed Senate.

  17. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    SciTech Connect

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
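
    A toy numerical illustration of this three-component structure (with an invented consequence function, not a WIPP result): each sample from the subjective-uncertainty space yields one CCDF, estimated by Monte Carlo sampling over the stochastic-uncertainty space, so the output is a family of CCDFs.

        import numpy as np

        rng = np.random.default_rng(42)

        def consequence(stochastic, subjective):
            """Toy random variable on the product of the two probability spaces; it
            stands in for a performance-assessment result such as a normalized release."""
            return subjective * np.exp(stochastic)

        def ccdf(values, thresholds):
            """Complementary cumulative distribution function estimated from samples:
            P(consequence > threshold) for each threshold."""
            return np.array([(values > c).mean() for c in thresholds])

        thresholds = np.logspace(-2, 2, 9)
        n_subjective, n_stochastic = 5, 10_000

        # One CCDF per subjective-uncertainty sample (e.g. per sampled parameter set);
        # the spread across curves displays subjective uncertainty, while each curve
        # itself integrates over stochastic uncertainty.
        for k in range(n_subjective):
            subj = rng.lognormal(mean=0.0, sigma=0.5)     # hypothetical epistemic draw
            stoch = rng.normal(size=n_stochastic)         # hypothetical aleatory draws
            print(f"curve {k}:", np.round(ccdf(consequence(stoch, subj), thresholds), 3))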

  18. Transit probabilities around hypervelocity and runaway stars

    NASA Astrophysics Data System (ADS)

    Fragione, G.; Ginsburg, I.

    2017-04-01

    In the blooming field of exoplanetary science, NASA's Kepler Space Telescope has revolutionized our understanding of exoplanets. Kepler's very precise and long-duration photometry is ideal for detecting planetary transits around Sun-like stars. The forthcoming Transiting Exoplanet Survey Satellite (TESS) is expected to continue Kepler's legacy. Along with transits, the Doppler technique remains an invaluable tool for discovering planets. The next generation of spectrographs, such as G-CLEF, promise precision radial velocity measurements. In this paper, we explore the possibility of detecting planets around hypervelocity and runaway stars, which should host very compact planetary systems as a consequence of their turbulent origin. We find that the probability of a multiplanetary transit is 10^-3 ≲ P ≲ 10^-1. We therefore need to observe ∼10-1000 high-velocity stars to spot a transit. However, even if transits are rare around runaway and hypervelocity stars, the chances of detecting such planets using radial velocity surveys are high. We predict that the European Gaia satellite, along with TESS and the new-generation spectrographs G-CLEF and ESPRESSO, will spot planetary systems orbiting high-velocity stars.
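
    For scale, the geometric transit probability of a single planet on a circular orbit is roughly R_star/a; the sketch below evaluates this for a hypothetical compact system together with a crude independence-based bound for at least one transit, which is not the paper's Monte Carlo estimate of a multiplanetary transit.

        import numpy as np

        R_SUN_AU = 0.00465                 # solar radius in astronomical units

        def transit_probability(a_au, r_star_rsun=1.0):
            """Geometric probability that a planet on a circular orbit of semi-major
            axis a transits its host star, roughly R_star / a."""
            return r_star_rsun * R_SUN_AU / a_au

        # Hypothetical compact system around a Sun-like high-velocity star.
        a = np.array([0.05, 0.1, 0.2])     # semi-major axes in AU (assumed)
        p = transit_probability(a)

        # Crude bound for 'at least one planet transits', treating planets as
        # independent; real multiplanet geometries are correlated, and the paper's
        # quoted 1e-3 to 1e-1 refers to a multiplanetary transit, a different quantity.
        p_any = 1.0 - np.prod(1.0 - p)
        print(np.round(p, 4), round(p_any, 4))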

  19. Essays on probability elicitation scoring rules

    NASA Astrophysics Data System (ADS)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and also to consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
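
    A small numerical illustration of the two perspectives, using a discretized unknown with invented distributions: the classic quadratic (Brier) score contrasts f(θ) with the indicator d(θ) of the realized value, while an entropy-like divergence measures the adherence of f(θ) to g(θ) in the spirit of the proposal above.

        import numpy as np

        # Expert's elicited distribution f over a 5-point discretized support for
        # Theta and, for the aleatory case discussed in the paper, a reference
        # distribution g describing the randomness of Theta; both are invented.
        f = np.array([0.05, 0.15, 0.40, 0.30, 0.10])
        g = np.array([0.10, 0.20, 0.35, 0.25, 0.10])

        def brier_score(f, observed_index):
            """Classic quadratic scoring rule: discrepancy between f and the zero-one
            indicator d(theta) of the realized value (lower is better)."""
            d = np.zeros_like(f)
            d[observed_index] = 1.0
            return np.sum((f - d) ** 2)

        def kl_divergence(g, f):
            """Entropy-like measure of the adherence of f to g, in the spirit of
            replacing d(theta) by g(theta) when Theta is aleatory."""
            return np.sum(g * np.log(g / f))

        print("Brier score given observation theta* = 2:", round(float(brier_score(f, 2)), 3))
        print("KL(g || f):", round(float(kl_divergence(g, f)), 4))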

  20. Probability judgments under ambiguity and conflict.

    PubMed

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at "best" probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of "best" estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity.