Science.gov

Sample records for 5-percent probability billion

  1. 26 CFR 301.6226(b)-1 - 5-percent group.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... beginning prior to October 4, 2001, see § 301.6226(b)-1T contained in 26 CFR part 1, revised April 1, 2001. ... 26 Internal Revenue 18 2010-04-01 2010-04-01 false 5-percent group. 301.6226(b)-1 Section 301.6226... ADMINISTRATION PROCEDURE AND ADMINISTRATION Assessment In General § 301.6226(b)-1 5-percent group. (a) In...

  2. 26 CFR 301.6226(b)-1 - 5-percent group.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... beginning prior to October 4, 2001, see § 301.6226(b)-1T contained in 26 CFR part 1, revised April 1, 2001. ... 26 Internal Revenue 18 2011-04-01 2011-04-01 false 5-percent group. 301.6226(b)-1 Section 301.6226... ADMINISTRATION PROCEDURE AND ADMINISTRATION Assessment In General § 301.6226(b)-1 5-percent group. (a) In...

  3. 30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...

  4. Missing billions.

    PubMed

    Conly, S

    1997-01-01

    This article discusses funding of population programs that support the Cairo International Conference on Population and Development's Plan of Action. The Plan of Action calls for a quadrupling of annual financial commitments for population programs to $17 billion by the year 2000 and $22 billion by 2015. The increased expenditures would cover the increased demand for services from unmet need and population growth. Donor countries are expected to increase their share from the current 25% to about 33%, or $5.7 billion by the year 2000. The estimates are in 1993 constant dollars. The $17 billion is less than the $40 billion spent worldwide on playing golf. During 1993-94, general donor support increased to $1.2 billion. Denmark, Germany, Japan, the Netherlands, the United Kingdom, and the United States increased their support. The United States doubled its support for population programs during 1992-95 to $583 million. During 1996-97 the US Congress cut funding back to the 1995 level. France, Italy, Spain, Belgium, and Austria have lagged in support for population programs, both past and present. Equal burden sharing would require the US to increase funding to $1.9 billion. Developed-country assistance declined to its lowest share of combined gross national product since 1970. This shifts the burden to multilateral sources. The European Union is committed to increasing its funding, and the World Bank increased funding for population and reproductive health to about $600 million in 1996 from $424 million in 1994. Bangladesh, China, India, Indonesia, Mexico, South Africa, and Turkey accounted for 85% of all government expenditures on family planning in developing countries. External donors in Africa are the main support of family planning. Private consumers in Latin America pay most of the costs of family planning. External assistance will be needed for some time. PMID:12321013

  5. Median CBO Salary Rises by 4.5 Percent, Annual Study Finds.

    ERIC Educational Resources Information Center

    Business Officer, 1997

    1997-01-01

    An annual national survey of college and university salaries found that chief business officers' salaries rose 4.5 percent in 1996-97, a smaller increase than the previous year. Salaries of women and minority CBOs continued to gain equity with those of men. Rates of increase varied by institution type. Salary gains for all administrative job types were less than in…

  6. The JPL 1.5-meter clear aperture antenna with 84.5 percent efficiency

    NASA Astrophysics Data System (ADS)

    Cha, A. G.

    1983-05-01

    The theoretical and experimental study of a 1.5-meter offset dual shaped reflector at 31.4 GHz is detailed. An efficiency of 84.5 percent, a likely new record for reflector antennas, was ascertained through careful measurements. For larger low noise reflector systems, a 2- to 3-dB improvement in G/T performance over the state of the art ultra low noise ground stations and 90 percent or better aperture efficiency now appear feasible.

  7. The JPL 1.5-meter Clear Aperture Antenna with 84.5 Percent Efficiency

    NASA Technical Reports Server (NTRS)

    Cha, A. G.

    1983-01-01

    The theoretical and experimental study of a 1.5-meter offset dual shaped reflector at 31.4 GHz is detailed. An efficiency of 84.5 percent, a likely new record for reflector antennas, was ascertained through careful measurements. For larger low noise reflector systems, a 2- to 3-dB improvement in G/T performance over the state of the art ultra low noise ground stations and 90 percent or better aperture efficiency now appear feasible.

  8. Evaluation of EA-934NA with 2.5 percent Cab-O-Sil

    NASA Technical Reports Server (NTRS)

    Caldwell, Gordon A.

    1990-01-01

    Currently, Hysol adhesive EA-934NA is used to bond the Field Joint Protection System on the Shuttle rocket motors at Kennedy Space Center. However, due to processing problems, an adhesive with a higher viscosity is needed to alleviate these difficulties. One possible solution is to add Cab-O-Sil to the current adhesive. The adhesive strength and bond strengths that can be obtained when 2.5 percent Cab-O-Sil is added to adhesive EA-934NA are examined and tested over a range of test temperatures from -20 to 300 F. Tensile adhesion button and lap shear specimens were bonded to D6AC steel and uniaxial tensile specimens (testing for strength, initial tangent modulus, elongation and Poisson's ratio) were prepared using Hysol adhesive EA-934NA with 2.5 percent Cab-O-Sil added. These specimens were tested at -20, 20, 75, 100, 125, 150, 200, 250, and 300 F, respectively. Additional tensile adhesion button specimens bonding Rust-Oleum primed and painted D6AC steel to itself and to cork using adhesive EA-934NA with 2.5 percent Cab-O-Sil added were tested at 20, 75, 125, 200, and 300 F, respectively. Results generally show decreasing strength values with increasing test temperatures. The bond strengths obtained using cork as a substrate were totally dependent on the cohesive strength of the cork.

  9. Hemodynamic, hematologic and eicosanoid mediated mechanisms in 7.5 percent sodium chloride treatment of uncontrolled hemorrhagic shock.

    PubMed

    Rabinovici, R; Yue, T L; Krausz, M M; Sellers, T S; Lynch, K M; Feuerstein, G

    1992-10-01

    Hypertonic saline solution (HTS) (7.5 percent sodium chloride [NaCl]) treatment (5 milliliters per kilogram) of rats subjected to uncontrolled hemorrhagic shock (n = 7) caused an initial partial recovery of blood pressure (+38 +/- 5 percent, p<0.05) and cardiac index (+48 +/- 6 percent, p<0.01) followed by increased bleeding (+53 +/- 5 percent versus rats treated with 0.9 percent NaCl, p<0.05), secondary shock (mean arterial pressure [MAP] 23 +/- 7 millimeters of mercury, p<0.01) and decreased survival (-54 +/- 15 minutes versus control, p<0.05). The increased blood loss resulted from: (1) increased vascular pressure and vasodilatation (total peripheral resistance index -27 +/- 5 percent, p<0.05), as initial bleeding occurred when MAP and cardiac index were increased compared with the control group (+88 +/- 10 percent, p<0.05 and +82 +/- 7 percent, p<0.01, respectively) and as the concomitant infusion of angiotensin II, a potent vasoconstrictor, delayed the HTS-induced bleeding (resumed at 60 minutes), and (2) a defect in platelet aggregation reflected by decreased adenosine diphosphate (ADP)-induced maximal aggregation (-79 percent versus rats treated with 0.9 percent NaCl, p<0.05) and increased EC50 of ADP (+159 percent, p<0.05). These hemodynamic and hematologic responses might be mediated at least in part by prostacyclin, a vasodilator and antiplatelet aggregator, as HTS-treated rats showed a markedly elevated 6-keto-PGF1 alpha per thromboxane B2 ratio (+140 +/- 12 percent, p<0.01) and pretreatment with indomethacin decreased blood loss and improved MAP and survival. These data point out potential untoward hemodynamic and hematologic consequences of HTS treatment in traumatic injury in which control of bleeding cannot be confirmed. PMID:1411892

  10. 30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches 2... reduced to less than 2.0 percent within 30 minutes, or if methane levels reach 2.5 percent, all...

  11. 30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches 2... reduced to less than 2.0 percent within 30 minutes, or if methane levels reach 2.5 percent, all...

  12. 25 CFR 134.1 - Partial reimbursement of irrigation charges; 5 percent per annum of cost of system, June 30, 1920.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Partial reimbursement of irrigation charges; 5 percent..., DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES PARTIAL PAYMENT CONSTRUCTION CHARGES ON INDIAN IRRIGATION PROJECTS § 134.1 Partial reimbursement of irrigation charges; 5 percent per annum of cost of system,...

  13. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... methane is reduced to less than 0.5 percent, electrical power shall be deenergized in affected areas, except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part...

  14. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... methane is reduced to less than 0.5 percent, electrical power shall be deenergized in affected areas, except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part...

  15. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... methane is reduced to less than 0.5 percent, electrical power shall be deenergized in affected areas, except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part...

  16. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... methane is reduced to less than 0.5 percent, electrical power shall be deenergized in affected areas, except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part...

  17. 30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... methane is reduced to less than 0.5 percent, electrical power shall be deenergized in affected areas, except power to monitoring equipment determined by MSHA to be intrinsically safe under 30 CFR part...

  18. Nine billion or bust?

    NASA Astrophysics Data System (ADS)

    nerd, nerd; Pepperday, Mike; Szautner, a. a. z.

    2014-02-01

    In reply to a review of Tony Ryan and Steve McKevitt's book Project Sunshine, which explores ways in which the Earth could support a future population of nine billion people (Letting the sunshine in, November 2013 pp50-51, http://ow.ly/r0FTM).

  19. 25 CFR 134.1 - Partial reimbursement of irrigation charges; 5 percent per annum of cost of system, June 30, 1920.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... per annum of cost of system, June 30, 1920. 134.1 Section 134.1 Indians BUREAU OF INDIAN AFFAIRS... PROJECTS § 134.1 Partial reimbursement of irrigation charges; 5 percent per annum of cost of system, June... certain ones mentioned therein, where approved by the Department June 21, 1920, and require that...

  20. Life with Four Billion Atoms

    SciTech Connect

    Knight, Thomas

    2013-04-10

    Today it is commonplace to design and construct single silicon chips with billions of transistors. These are complex systems, difficult (but possible) to design, test, and fabricate. Remarkably, simple living systems can be assembled from a similar number of atoms, most of them in water molecules. In this talk I will present the current status of our attempts at full understanding and complexity reduction of one of the simplest living systems, the free-living bacterial species Mesoplasma florum. This 400 nm diameter cell thrives and replicates every 40 minutes with a genome of only 800 kilobases. Our recent experiments using transposon gene knockouts identified 354 of 683 annotated genes as inessential in laboratory culture when inactivated individually. While a functional redesigned genome will certainly not remove all of those genes, this suggests that roughly half the genome can be removed in an intentional redesign. I will discuss our recent knockout results and methodology, and our future plans for Genome re-engineering using targeted knock-in/knock-out double recombination; whole cell metabolic models; comprehensive whole cell metabolite measurement techniques; creation of plug-and-play metabolic modules for the simplified organism; inherent and engineered biosafety control mechanisms. This redesign is part of a comprehensive plan to lay the foundations for a new discipline of engineering biology. Engineering biological systems requires a fundamentally different viewpoint from that taken by the science of biology. Key engineering principles of modularity, simplicity, separation of concerns, abstraction, flexibility, hierarchical design, isolation, and standardization are of critical importance. The essence of engineering is the ability to imagine, design, model, build, and characterize novel systems to achieve specific goals. Current tools and components for these tasks are primitive. Our approach is to create and distribute standard biological parts

  1. Why Probability?

    ERIC Educational Resources Information Center

    Weatherly, Myra S.

    1984-01-01

    Instruction in mathematical probability to enhance higher levels of critical and creative thinking with gifted students is described. Among thinking skills developed by such an approach are analysis, synthesis, evaluation, fluency, and complexity. (CL)

  2. Countdown to Six Billion Teaching Kit.

    ERIC Educational Resources Information Center

    Zero Population Growth, Inc., Washington, DC.

    This teaching kit features six activities focused on helping students understand the significance of the world population reaching six billion for our society and our environment. Featured activities include: (1) History of the World: Part Six Billion; (2) A Woman's Place; (3) Baby-O-Matic; (4) Earth: The Apple of Our Eye; (5) Needs vs. Wants; and…

  3. Spend Billions and They Will Come

    ERIC Educational Resources Information Center

    Fox, Bette-Lee

    2004-01-01

    People look at one billion dollars in one of two ways: if it is the result of the long, hard effort of years of fundraising, they rejoice; if it signifies an astronomical budget deficit, they cringe. How, then, should people respond as a community to reaching the $1 billion mark ($1,242,436,438, to be exact) in this year's spending for public…

  4. Impingement of Cloud Droplets on 36.5-Percent-Thick Joukowski Airfoil at Zero Angle of Attack and Discussion of Use as Cloud Measuring Instrument in Dye-Tracer Technique

    NASA Technical Reports Server (NTRS)

    Brun, R. J.; Vogt, Dorothea E.

    1957-01-01

    The trajectories of droplets in the air flowing past a 36.5-percent-thick Joukowski airfoil at zero angle of attack were determined. The amount of water in droplet form impinging on the airfoil, the area of droplet impingement, and the rate of droplet impingement per unit area on the airfoil surface were calculated from the trajectories and cover a large range of flight and atmospheric conditions. With the detailed impingement information available, the 36.5-percent-thick Joukowski airfoil can serve the dual purpose of use as the principal element in instruments for making measurements in clouds and of a basic shape for estimating impingement on a thick streamlined body. Methods and examples are presented for illustrating some limitations when the airfoil is used as the principal element in the dye-tracer technique.

  5. Atmospheric oxygenation three billion years ago.

    PubMed

    Crowe, Sean A; Døssing, Lasse N; Beukes, Nicolas J; Bau, Michael; Kruger, Stephanus J; Frei, Robert; Canfield, Donald E

    2013-09-26

    It is widely assumed that atmospheric oxygen concentrations remained persistently low (less than 10^-5 times present levels) for about the first 2 billion years of Earth's history. The first long-term oxygenation of the atmosphere is thought to have taken place around 2.3 billion years ago, during the Great Oxidation Event. Geochemical indications of transient atmospheric oxygenation, however, date back to 2.6-2.7 billion years ago. Here we examine the distribution of chromium isotopes and redox-sensitive metals in the approximately 3-billion-year-old Nsuze palaeosol and in the near-contemporaneous Ijzermyn iron formation from the Pongola Supergroup, South Africa. We find extensive mobilization of redox-sensitive elements through oxidative weathering. Furthermore, using our data we compute a best minimum estimate for atmospheric oxygen concentrations at that time of 3 × 10^-4 times present levels. Overall, our findings suggest that there were appreciable levels of atmospheric oxygen about 3 billion years ago, more than 600 million years before the Great Oxidation Event and some 300-400 million years earlier than previous indications for Earth surface oxygenation. PMID:24067713

  6. Billion shot flashlamp for spaceborne lasers

    NASA Technical Reports Server (NTRS)

    Richter, Linda; Schuda, Felix; Degnan, John

    1990-01-01

    A billion-shot flashlamp developed under a NASA contract for spaceborne laser missions is presented. Lifetime-limiting mechanisms are identified and addressed. Two energy loadings of 15 and 44 Joules were selected for the initial accelerated life testing. A fluorescence-efficiency test station was used for measuring the useful-light output degradation of the lamps. The design characteristics meeting NASA specifications are outlined. Attention is focused on the physical properties of tungsten-matrix cathodes, the chemistry of dispenser cathodes, and anode degradation. It is reported that out of the total 83 lamps tested in the program, 4 lamps reached a billion shots and one lamp is beyond 1.7 billion shots, while at 44 Joules, 4 lamps went beyond 100 million shots and one lamp reached 500 million shots.

  7. Where Have All the Billions Gone?

    ERIC Educational Resources Information Center

    Leask, Linda; And Others

    1987-01-01

    Providing a basis to help Alaskans determine future spending levels and priorities, this report traces how the state spent more than $26 billion in general funds from fiscal years 1981 through 1986 before oil prices crashed and brought state revenues tumbling down with them. Figures indicate that cumulative general fund expenditures over the…

  8. Thirteen billion years in half an hour

    NASA Astrophysics Data System (ADS)

    Bassett, Bruce A.

    2005-10-01

    We take a high-speed tour of the approximately thirteen billion-year history of our universe focusing on unsolved mysteries and the key events that have sculpted and shaped it - from inflation in the first split second to the dark energy which is currently causing the expansion of the cosmos to accelerate.

  9. Survival probability in patients with liver trauma.

    PubMed

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma. PMID:27477933
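
    A logistic model of this kind maps a linear combination of predictors to a probability between 0 and 1. The following minimal sketch illustrates the form of such a prediction; the coefficients and predictors are hypothetical and are not the values estimated in the study.

      import math

      def logistic_survival_probability(intercept, coeffs, predictors):
          # p = 1 / (1 + exp(-(b0 + b1*x1 + ... + bk*xk)))
          linear = intercept + sum(b * x for b, x in zip(coeffs, predictors))
          return 1.0 / (1.0 + math.exp(-linear))

      # Hypothetical patient: injury grade 3, 10 days hospitalized, operative treatment (1)
      p = logistic_survival_probability(3.0, [-0.9, 0.05, -0.4], [3, 10, 1])
      print(round(p, 3))  # predicted survival probability for this hypothetical case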

  10. Four billion people facing severe water scarcity.

    PubMed

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare. PMID:26933676

  11. Four billion people facing severe water scarcity

    PubMed Central

    Mekonnen, Mesfin M.; Hoekstra, Arjen Y.

    2016-01-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare. PMID:26933676

  12. Teledesic pushes $9-billion, 900-satellite system

    NASA Astrophysics Data System (ADS)

    1994-03-01

    Teledesic Corp. is seeking FCC approval to deploy a communication satellite system, costing $9 billion and using more than 900 satellites in low Earth orbit. This system would provide telephone and broadband data service to remote areas and developing countries. The two major stockholders in Teledesic are William Gates (of Microsoft Corp.) and Craig McCaw (of McCaw Cellular Communications). Each satellite would act as a node in a packet-switching network. The satellites would provide continuous global coverage.

  13. The nonprofit sector's $100 billion opportunity.

    PubMed

    Bradley, Bill; Jansen, Paul; Silverman, Les

    2003-05-01

    Imagine what an extra $100 billion a year could do for philanthropic and other nonprofit institutions. According to a new study, the nonprofit sector could free that amount--maybe even more--by making five changes in the way it operates. The study asked two central questions: Does the sector's money flow from its source to its ultimate use as efficiently and effectively as possible? If not, where are the big opportunities to increase social benefit? According to former senator Bill Bradley and McKinsey's Paul Jansen and Les Silverman, nonprofits could save roughly $25 billion a year by changing the way they raise funds. By distributing funds more quickly, they could put an extra $30 billion to work. Organizations could generate more than $60 billion a year by streamlining and restructuring the way in which they provide services and by reducing administrative costs. And they could free up even more money--an amount impossible to estimate--by better allocating funds among service providers. The authors admit that making those changes won't be easy. The nonprofit world, historically seen as a collection of locally focused charities, has become an enormous sector, but it lacks the managerial processes and incentives that help keep the for-profit world on track. And when the baby boomers start to retire in less than a decade, public budgets will be squeezed even more than they are today. If the nonprofit sector is to help the nation cope with the stresses ahead, it must become more efficient and challenge its traditional concepts of stewardship. PMID:12747166

  14. Medicare Spends Billions on Chronic Kidney Disease, Study Finds

    MedlinePlus

    ... nlm.nih.gov/medlineplus/news/fullstory_158020.html Medicare Spends Billions on Chronic Kidney Disease, Study Finds ... affects nearly 14 percent of Americans and costs Medicare billions of dollars a year, a new study ...

  15. High-Reynolds-Number Test of a 5-Percent-Thick Low-Aspect-Ratio Semispan Wing in the Langley 0.3-Meter Transonic Cryogenic Tunnel: Wing Pressure Distributions

    NASA Technical Reports Server (NTRS)

    Chu, Julio; Lawing, Pierce L.

    1990-01-01

    A high Reynolds number test of a 5 percent thick low aspect ratio semispan wing was conducted in the adaptive wall test section of the Langley 0.3 m Transonic Cryogenic Tunnel. The model tested had a planform and a NACA 64A-105 airfoil section similar to those of the pressure instrumented canard on the X-29 experimental aircraft. Chordwise pressure data for Mach numbers of 0.3, 0.7, and 0.9 were measured for an angle-of-attack range of -4 to 15 deg. The associated Reynolds numbers, based on the geometric mean chord, encompass most of the flight regime of the canard. This test was a free transition investigation. A summary of the wing pressures is presented without analysis, as well as adapted test section top and bottom wall pressure signatures. However, the presented graphical data indicate Reynolds number dependent complex leading edge separation phenomena. This data set supplements the existing high Reynolds number database and is useful for comparison with computational codes.

  16. Delivering on Obama's renewables promise will cost billions

    SciTech Connect

    2009-04-15

    For wind energy in the eastern half of the U.S., costs would be $50 billion to $80 billion for transmission lines, in addition to the $700 billion to $1.1 trillion to build the wind farms to generate power.

  17. Endemic Cardiovascular Diseases of the Poorest Billion.

    PubMed

    Kwan, Gene F; Mayosi, Bongani M; Mocumbi, Ana O; Miranda, J Jaime; Ezzati, Majid; Jain, Yogesh; Robles, Gisela; Benjamin, Emelia J; Subramanian, S V; Bukhman, Gene

    2016-06-14

    The poorest billion people are distributed throughout the world, though most are concentrated in rural sub-Saharan Africa and South Asia. Cardiovascular disease (CVD) data can be sparse in low- and middle-income countries beyond urban centers. Despite this urban bias, CVD registries from the poorest countries have long revealed a predominance of nonatherosclerotic stroke, hypertensive heart disease, nonischemic and Chagas cardiomyopathies, rheumatic heart disease, and congenital heart anomalies, among others. Ischemic heart disease has been relatively uncommon. Here, we summarize what is known about the epidemiology of CVDs among the world's poorest people and evaluate the relevance of global targets for CVD control in this population. We assessed both primary data sources and the 2013 Global Burden of Disease Study modeled estimates in the world's 16 poorest countries, where 62% of the population are among the poorest billion. We found that ischemic heart disease accounted for only 12% of the combined CVD and congenital heart anomaly disability-adjusted life years (DALYs) in the poorest countries, compared with 51% of DALYs in high-income countries. We found that as little as 53% of the combined CVD and congenital heart anomaly burden (1629/3049 DALYs per 100 000) was attributed to behavioral or metabolic risk factors in the poorest countries (eg, in Niger, 82% of the population among the poorest billion) compared with 85% of the combined CVD and congenital heart anomaly burden (4439/5199 DALYs) in high-income countries. Further, of the combined CVD and congenital heart anomaly burden, 34% was accrued in people under age 30 years in the poorest countries, while only 3% was accrued under age 30 years in high-income countries. We conclude that although the current global targets for noncommunicable disease and CVD control will help diminish premature CVD death in the poorest populations, they are not sufficient. Specifically, the current framework (1) excludes deaths of

  18. Simulating Billion-Task Parallel Programs

    SciTech Connect

    Perumalla, Kalyan S; Park, Alfred J

    2014-01-01

    In simulating large parallel systems, bottom-up approaches exercise detailed hardware models with effects from simplified software models or traces, whereas top-down approaches evaluate the timing and functionality of detailed software models over coarse hardware models. Here, we focus on the top-down approach and significantly advance the scale of the simulated parallel programs. Via the direct execution technique combined with parallel discrete event simulation, we stretch the limits of the top-down approach by simulating message passing interface (MPI) programs with millions of tasks. Using a timing-validated benchmark application, we demonstrate proof-of-concept scaling to over 0.22 billion virtual MPI processes on 216,000 cores of a Cray XT5 supercomputer, with a multiplexing ratio of 1024 simulated tasks per real task, representing one of the largest direct execution simulations to date.
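
    The quoted scale is consistent with the stated multiplexing ratio, as a quick check shows (numbers taken from the abstract above):

      real_cores = 216_000   # physical cores on the Cray XT5
      multiplex = 1024       # simulated MPI tasks per real task
      print(real_cores * multiplex)  # 221,184,000, i.e. about 0.22 billion virtual MPI processes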

  19. Uranium in Canada: A billion dollar industry

    SciTech Connect

    Ruzicka, V. )

    1989-12-01

    In 1988, Canada maintained its position as the world's leading producer of uranium with an output of more than 12,400 MT of uranium in concentrates, worth $1.1 billion Canadian. As domestic requirements represent only 15% of current Canadian production, most of the output was exported. With current implementation of the Canada/US Free Trade Agreement, the US has become Canada's major uranium export customer. With a large share of the world's known uranium resources, Canada remains the focus of international uranium exploration activity. In 1988, the uranium exploration expenditures in Canada exceeded $58 million Canadian. The principal exploration targets were deposits associated with Proterozoic unconformities in Saskatchewan and Northwest Territories, particularly those in the Athabasca and Thelon basin regions of the Canadian Shield. Major attention was also paid to polymetallic deposits in which uranium is associated with precious metals, such as gold and platinum group elements. Conceptual genetic models for these deposit types represent useful tools to guide exploration.

  20. Agroecohydrology: Key to Feeding 9 Billion?

    NASA Astrophysics Data System (ADS)

    Herrick, J.

    2011-12-01

    Agricultural production necessary to feed 9 billion people in 2050 depends on increased production on existing croplands, and expanding onto 'marginal' lands. A high proportion of these lands are marginal because they are too steep or too dry to reliably support crop production. These same characteristics increase their susceptibility to accelerated erosion, leading (for most soil profiles) to further reductions in plant available water as infiltration and soil profile water holding capacity decline. Sustaining production on these marginal lands will require careful land use planning. In this paper, we present a land use planning framework that integrates 4 elements: (1) potential production (based on soil profile characteristics), (2) edaphic, topographic and climatic limitations to production, (3) soil resistance to degradation, and (4) resilience. This framework expands existing land capability classification systems through the integration of biophysical feedbacks and thresholds. State and transition models, similar to those currently applied to rangelands in the United States and other countries, are used to organize and communicate knowledge about the sustainability of different land use changes and management actions at field to regional scales. This framework emphasizes hydrologic characteristics of soil profiles and landscapes over fertility because fertility declines are more easily addressed through increased inputs. The presentation will conclude with a discussion of how research in ecohydrology can be more effectively focused to support sustainable food production in the context of increasingly rapid social and economic changes throughout the world.

  1. Eight billion asteroids in the Oort cloud

    NASA Astrophysics Data System (ADS)

    Shannon, Andrew; Jackson, Alan P.; Veras, Dimitri; Wyatt, Mark

    2015-01-01

    The Oort cloud is usually thought of as a collection of icy comets inhabiting the outer reaches of the Solar system, but this picture is incomplete. We use simulations of the formation of the Oort cloud to show that ˜4 per cent of the small bodies in the Oort cloud should have formed within 2.5 au of the Sun, and hence be ice-free rock-iron bodies. If we assume that these Oort cloud asteroids have the same size distribution as their cometary counterparts, the Large Synoptic Survey Telescope should find roughly a dozen Oort cloud asteroids during 10 years of operations. Measurement of the asteroid fraction within the Oort cloud can serve as an excellent test of the Solar system's formation and dynamical history. Oort cloud asteroids could be of particular concern as impact hazards as their high mass density, high impact velocity, and low visibility make them both hard to detect and hard to divert or destroy. However, they should be a rare class of object, and we estimate globally catastrophic collisions should only occur about once per billion years.

  2. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
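
    One classic example of this kind (offered here as an illustration; it is not necessarily one of the article's three problems) is the matching problem: the probability that a random permutation of n items leaves no item in its original position approaches 1/e as n grows. A short simulation:

      import math
      import random

      def no_fixed_point(n):
          # True if a random permutation of range(n) has no fixed points (a derangement)
          perm = list(range(n))
          random.shuffle(perm)
          return all(perm[i] != i for i in range(n))

      n, trials = 20, 100_000
      estimate = sum(no_fixed_point(n) for _ in range(trials)) / trials
      print(round(estimate, 4), round(1 / math.e, 4))  # both are close to 0.3679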

  3. On Probability Domains III

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2015-12-01

    Domains of generalized probability have been introduced in order to provide a general construction of random events, observables and states. It is based on the notion of a cogenerator and the properties of product. We continue our previous study and show how some other quantum structures fit our categorical approach. We discuss how various epireflections implicitly used in the classical probability theory are related to the transition to fuzzy probability theory and describe the latter probability theory as a genuine categorical extension of the former. We show that the IF-probability can be studied via the fuzzy probability theory. We outline a "tensor modification" of the fuzzy probability theory.

  4. Economic toll of AIDS put at $10 billion in Canada.

    PubMed

    1996-11-29

    John McCallum, Chief economist at the Royal Bank of Canada, announced that AIDS has cost the nation's economy $10 billion since 1981. These calculations included losses in both direct medical care and human capital. This monetary figure is expected to rise to $36 billion by 2010. An estimated 42,500 to 45,000 Canadians are infected with HIV. PMID:11364044

  5. Probability and Relative Frequency

    NASA Astrophysics Data System (ADS)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
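
    The stated equality between the expectation of the relative frequency and the underlying probability follows from linearity of expectation; in standard notation (a sketch, not the paper's own formalism), with X_i the indicator of the event on the i-th trial and p its probability:

      E\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
        = \frac{1}{n}\sum_{i=1}^{n} E[X_i]
        = \frac{1}{n}\cdot np
        = p.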

  6. Evolution and Probability.

    ERIC Educational Resources Information Center

    Bailey, David H.

    2000-01-01

    Some of the most impressive-sounding criticisms of the conventional theory of biological evolution involve probability. Presents a few examples of how probability should and should not be used in discussing evolution. (ASK)

  7. BIODEGRADATION PROBABILITY PROGRAM (BIODEG)

    EPA Science Inventory

    The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...

  8. Probability on a Budget.

    ERIC Educational Resources Information Center

    Ewbank, William A.; Ginther, John L.

    2002-01-01

    Describes how to use common dice numbered 1-6 for simple mathematical situations including probability. Presents a lesson using regular dice and specially marked dice to explore some of the concepts of probability. (KHR)

  9. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  10. Searching with probabilities

    SciTech Connect

    Palay, A.J.

    1985-01-01

    This book examines how probability distributions can be used as a knowledge representation technique. It presents a mechanism that can be used to guide a selective search algorithm to solve a variety of tactical chess problems. Topics covered include probabilities and searching, the B algorithm, and chess probabilities - in practice, examples, results, and future work.

  11. Gaia: how to map a billion stars with a billion pixels

    NASA Astrophysics Data System (ADS)

    de Bruijne, J. H. J.

    2008-07-01

    Gaia, ESA's ambitious star-mapper mission due for launch late-2011, will provide multi-epoch micro-arcsecond astrometric and milli-magnitude photometric data for the brightest one billion objects in the sky, down to at least magnitude 20. Spectroscopic data will simultaneously be collected for the subset of the brightest 100 million stars, down to about magnitude 17. This massive data volume will allow astronomers to reconstruct the structure, evolution and formation history of the Milky Way. It will also revolutionize studies of the solar system and stellar physics and will contribute to diverse research areas, ranging from extra-solar planets to general relativity. Underlying Gaia's scientific harvest will be a Catalogue, built on the fundamental space-based measurements. During the 5-year nominal operational lifetime, Gaia's payload, with its CCD mosaic containing 1 billion pixels, will autonomously detect all objects of interest and observe them throughout their passage of the focal plane. This paper discusses the workings of the Gaia instrument, details its payload, and discusses in depth how the scientific measurements will be collected. It addresses issues like maintenance of the scanning law, on-board data processing, the detection and confirmation of objects (single and multiple stars), the detection and rejection of cosmic rays and solar protons, the fundamental science measurements themselves, composed of windows of CCD samples (pixels), and special strategies employed to maximize the science return for moving (i.e., solar-system) objects. The paper also explains how an on-board priority scheme will ensure catalogue completeness down to the faintest magnitudes possible, despite the limited ground-station availability and the enormous data volume that will be sent to the ground.

  12. In All Probability, Probability is not All

    ERIC Educational Resources Information Center

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  13. NASA Now Minute: Earth and Space Science: 100 Billion Planets

    NASA Video Gallery

    Stephen Kane, co-author of the article “Study Shows Our Galaxy has 100 Billion Planets,” reveals details about this incredible study and explains just how common planets are in our Milky Way galaxy...

  14. Harnessing Energy from the Sun for Six Billion People

    ScienceCinema

    Daniel Nocera

    2013-07-19

    Daniel Nocera, a Massachusetts Institute of Technology professor whose recent research focuses on solar-powered fuels, presents a Brookhaven Science Associates Distinguished Lecture, titled "Harnessing Energy from the Sun for Six Billion People -- One at a Time."

  15. Circadian biology: a 2.5 billion year old clock.

    PubMed

    Loudon, Andrew S I

    2012-07-24

    A recent study suggests that circadian clocks may have evolved at the time of the Great Oxidation Event 2.5 billion years ago in order to drive detoxification of reactive oxygen species. PMID:22835791

  16. Academic Pork Barrel Tops $2-Billion for the First Time.

    ERIC Educational Resources Information Center

    Brainard, Jeffrey; Borrego, Anne Marie

    2003-01-01

    Describes how, despite the growing budget deficit, Congress directed a record $2 billion to college projects in 2003, many of them dealing with security and bioterrorism. Includes data tables on the earmarks. (EV)

  17. Harnessing Energy from the Sun for Six Billion People

    SciTech Connect

    Daniel Nocera

    2011-09-12

    Daniel Nocera, a Massachusetts Institute of Technology professor whose recent research focuses on solar-powered fuels, presents a Brookhaven Science Associates Distinguished Lecture, titled "Harnessing Energy from the Sun for Six Billion People -- One at a Time."

  18. Triploblastic animals more than 1 billion years ago: trace fossil evidence from india

    PubMed

    Seilacher; Bose; Pfluger

    1998-10-01

    Some intriguing bedding plane features that were observed in the Mesoproterozoic Chorhat Sandstone are biological and can be interpreted as the burrows of wormlike undermat miners (that is, infaunal animals that excavated tunnels underneath microbial mats). These burrows suggest that triploblastic animals existed more than a billion years ago. They also suggest that the diversification of animal designs proceeded very slowly before the appearance of organisms with hard skeletons, which was probably the key event in the Cambrian evolutionary explosion, and before the ecological changes that accompanied that event. PMID:9756480

  19. A Posteriori Transit Probabilities

    NASA Astrophysics Data System (ADS)

    Stevens, Daniel J.; Gaudi, B. Scott

    2013-08-01

    Given the radial velocity (RV) detection of an unseen companion, it is often of interest to estimate the probability that the companion also transits the primary star. Typically, one assumes a uniform distribution for the cosine of the inclination angle i of the companion's orbit. This yields the familiar estimate for the prior transit probability of ~R*/a, given the primary radius R* and orbital semimajor axis a, and assuming small companions and a circular orbit. However, the posterior transit probability depends not only on the prior probability distribution of i but also on the prior probability distribution of the companion mass Mc, given a measurement of the product of the two (the minimum mass Mc sin i) from an RV signal. In general, the posterior can be larger or smaller than the prior transit probability. We derive analytic expressions for the posterior transit probability assuming a power-law form for the distribution of true masses, dΓ/dMc ∝ Mc^α, for integer values -3 <= α <= 3. We show that for low transit probabilities, these probabilities reduce to a constant multiplicative factor fα of the corresponding prior transit probability, where fα in general depends on α and an assumed upper limit on the true mass. The prior and posterior probabilities are equal for α = -1. The posterior transit probability is ~1.5 times larger than the prior for α = -3 and is ~4/π times larger for α = -2, but is less than the prior for α >= 0, and can be arbitrarily small for α > 1. We also calculate the posterior transit probability in different mass regimes for two physically-motivated mass distributions of companions around Sun-like stars. We find that for Jupiter-mass planets, the posterior transit probability is roughly equal to the prior probability, whereas the posterior is likely higher for Super-Earths and Neptunes (10 M⊕ - 30 M⊕) and Super-Jupiters (3 MJup - 10 MJup), owing to the predicted steep rise in the mass function toward smaller
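
    As a numerical illustration of the prior estimate quoted above (with assumed values, not numbers from the paper), the geometric transit probability for a small companion on a circular orbit is roughly R*/a:

      R_SUN_AU = 0.00465  # solar radius in astronomical units (approximate)

      def prior_transit_probability(r_star_rsun, a_au):
          # Prior transit probability ~ R*/a for a small companion on a circular orbit
          return (r_star_rsun * R_SUN_AU) / a_au

      # Hypothetical hot Jupiter around a Sun-like star at a = 0.05 au
      print(round(prior_transit_probability(1.0, 0.05), 3))  # about 0.093, i.e. roughly 9 percent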

  20. Single-case probabilities

    NASA Astrophysics Data System (ADS)

    Miller, David

    1991-12-01

    The propensity interpretation of probability, bred by Popper in 1957 (K. R. Popper, in Observation and Interpretation in the Philosophy of Physics, S. Körner, ed. (Butterworth, London, 1957, and Dover, New York, 1962), p. 65; reprinted in Popper Selections, D. W. Miller, ed. (Princeton University Press, Princeton, 1985), p. 199) from pure frequency stock, is the only extant objectivist account that provides any proper understanding of single-case probabilities as well as of probabilities in ensembles and in the long run. In Sec. 1 of this paper I recall salient points of the frequency interpretations of von Mises and of Popper himself, and in Sec. 2 I filter out from Popper's numerous expositions of the propensity interpretation its most interesting and fertile strain. I then go on to assess it. First I defend it, in Sec. 3, against recent criticisms (P. Humphreys, Philos. Rev. 94, 557 (1985); P. Milne, Erkenntnis 25, 129 (1986)) to the effect that conditional [or relative] probabilities, unlike absolute probabilities, can only rarely be made sense of as propensities. I then challenge its predominance, in Sec. 4, by outlining a rival theory: an irreproachably objectivist theory of probability, fully applicable to the single case, that interprets physical probabilities as instantaneous frequencies.

  1. Winglets Save Billions of Dollars in Fuel Costs

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The upturned ends now featured on many airplane wings are saving airlines billions of dollars in fuel costs. Called winglets, the drag-reducing technology was advanced through the research of Langley Research Center engineer Richard Whitcomb and through flight tests conducted at Dryden Flight Research Center. Seattle-based Aviation Partners Boeing -- a partnership between Aviation Partners Inc., of Seattle, and The Boeing Company, of Chicago -- manufactures Blended Winglets, a unique design featured on Boeing aircraft around the world. These winglets have saved more than 2 billion gallons of jet fuel to date, representing a cost savings of more than $4 billion and a reduction of almost 21.5 million tons in carbon dioxide emissions.

  2. Probability with Roulette

    ERIC Educational Resources Information Center

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
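
    For instance, the expected value of a $1 straight-up bet on an American (38-pocket) wheel, which pays 35 to 1, can be computed directly; this worked example is illustrative and is not taken from the article:

      from fractions import Fraction

      p_win = Fraction(1, 38)                        # one winning pocket out of 38
      expected_value = p_win * 35 + (1 - p_win) * (-1)
      print(expected_value, float(expected_value))   # -1/19, about -0.053 dollars per $1 bet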

  3. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
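
    A common simplified form of such a calculation (a sketch under stated assumptions, not the closed-form solution derived in the report) integrates the relative-position error density, taken here as an uncorrelated two-dimensional Gaussian in the encounter plane, over the combined hard-body disk of the two objects:

      import math

      def collision_probability(miss_distance, combined_radius, sigma_x, sigma_y, n=400):
          # Midpoint-rule integration of the Gaussian density over the hard-body disk.
          # The disk is centred on the target; the density is centred at the nominal miss.
          r, total = combined_radius, 0.0
          h = 2.0 * r / n
          for i in range(n):
              x = -r + (i + 0.5) * h
              for j in range(n):
                  y = -r + (j + 0.5) * h
                  if x * x + y * y <= r * r:
                      gx = math.exp(-0.5 * ((x - miss_distance) / sigma_x) ** 2)
                      gy = math.exp(-0.5 * (y / sigma_y) ** 2)
                      total += gx * gy * h * h
          return total / (2.0 * math.pi * sigma_x * sigma_y)

      # Hypothetical encounter: 200 m nominal miss, 10 m combined radius, 100 m and 50 m sigmas
      print(collision_probability(200.0, 10.0, 100.0, 50.0))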

  4. Bill and Melinda Gates Pledge $1-Billion for Minority Scholarships.

    ERIC Educational Resources Information Center

    Monaghan, Peter; Lederman, Douglas; van der Werf, Martin; Pulley, John

    1999-01-01

    Reports on a $1-billion grant from Bill and Melinda Gates to send 20,000 low-income minority students to college. The Gates Millennium Scholars Program will require students to demonstrate financial need and maintain a 3.0 grade point average in college. A list of the largest private gifts to higher education since 1967 is also provided. (DB)

  5. The BIA As Banker: "Trust" Is Hard When Billions Disappear.

    ERIC Educational Resources Information Center

    Johansen, Bruce E.

    1997-01-01

    The federal government's trust responsibility toward Native Americans involves protection of their lands, resources, and right to self-government and provision of services (including education). However, the Bureau of Indian Affairs has misplaced billions of dollars owed Native American individuals and tribes and now faces class-action litigation.…

  6. Congress Gives Colleges a Billion-Dollar Bonanza.

    ERIC Educational Resources Information Center

    Brainard, Jeffrey; Southwick, Ron

    2000-01-01

    Reports that Congress has earmarked a record amount of money (more than $1 billion) for projects involving specific colleges in the 2000 fiscal year. Notes that such "pork-barrel" spending has tripled since 1996. Charts show trends in earmarks since 1989, year 2000 earmarks by agency, the top 20 recipients of earmarked grants, and ranking of…

  7. Colleges' Billion-Dollar Campaigns Feel the Economy's Sting

    ERIC Educational Resources Information Center

    Masterson, Kathryn

    2009-01-01

    The economy's collapse has caught up with the billion-dollar campaign. In the past 12 months, the amount of money raised by a dozen of the colleges engaged in higher education's biggest fund-raising campaigns fell 32 percent from the year before. The decline, which started before the worst of the recession, has forced colleges to postpone…

  8. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
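
    A computer analogue of such an experiment (an illustrative sketch, not material from the article) is to roll a simulated die many times and compare each observed relative frequency with the theoretical value of 1/6:

      import random
      from collections import Counter

      rolls = 10_000
      counts = Counter(random.randint(1, 6) for _ in range(rolls))
      for face in range(1, 7):
          print(face, counts[face] / rolls)  # each frequency should be near 1/6, about 0.167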

  9. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  10. Acceptance, values, and probability.

    PubMed

    Steel, Daniel

    2015-10-01

    This essay makes a case for regarding personal probabilities used in Bayesian analyses of confirmation as objects of acceptance and rejection. That in turn entails that personal probabilities are subject to the argument from inductive risk, which aims to show non-epistemic values can legitimately influence scientific decisions about which hypotheses to accept. In a Bayesian context, the argument from inductive risk suggests that value judgments can influence decisions about which probability models to accept for likelihoods and priors. As a consequence, if the argument from inductive risk is sound, then non-epistemic values can affect not only the level of evidence deemed necessary to accept a hypothesis but also degrees of confirmation themselves. PMID:26386533

  11. Manpower Considers CETA 5 Percent Set-Aside

    ERIC Educational Resources Information Center

    American Vocational Journal, 1978

    1978-01-01

    Joan Wills, from the National Governor's Association (NGA), explains NGA's recommendation to eliminate CETA's five percent set-aside for vocational education. (This article summarizes her presentation at the annual vocational convention.) (Editor)

  12. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, the authors outline how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
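
    The technique the abstract describes can be sketched in a few lines (a generic illustration in Python rather than the article's Visual Basic, with a made-up integrand): since the integral of f over [a, b] equals (b - a) times the expected value of f at a uniformly random point of [a, b], averaging f over random draws approximates the integral.

        import random

        def mc_integrate(f, a, b, n=100_000):
            """Approximate the definite integral of f on [a, b] using the
            identity: integral = (b - a) * E[f(U)], with U ~ Uniform(a, b)."""
            total = 0.0
            for _ in range(n):
                total += f(random.uniform(a, b))
            return (b - a) * total / n

        # Example: the integral of x^2 on [0, 1] is 1/3.
        print(mc_integrate(lambda x: x * x, 0.0, 1.0))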

  13. Varga: On Probability.

    ERIC Educational Resources Information Center

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  14. Application of Quantum Probability

    NASA Astrophysics Data System (ADS)

    Bohdalová, Mária; Kalina, Martin; Nánásiová, Ol'ga

    2009-03-01

    This is the first attempt to smooth time series using estimators that apply quantum probability with causality (non-commutative s-maps on an orthomodular lattice). In this context, this means that we use a non-symmetric covariance matrix in the construction of our estimator.

  15. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  16. On the constancy of the lunar cratering flux over the past 3.3 billion yr

    NASA Technical Reports Server (NTRS)

    Guinness, E. A.; Arvidson, R. E.

    1977-01-01

    Utilizing a method that minimizes random fluctuations in sampling crater populations, it can be shown that the ejecta deposit of Tycho, the floor of Copernicus, and the region surrounding the Apollo 12 landing site have incremental crater size-frequency distributions that can be expressed as log-log linear functions over the diameter range from 0.1 to 1 km. Slopes are indistinguishable for the three populations, probably indicating that the surfaces are dominated by primary craters. Treating the crater populations of Tycho, the floor of Copernicus, and Apollo 12 as primary crater populations contaminated, but not overwhelmed, with secondaries, allows an attempt at calibration of the post-heavy bombardment cratering flux. Using the age of Tycho as 109 m.y., Copernicus as 800 m.y., and Apollo 12 as 3.26 billion yr, there is no basis for assuming that the flux has changed over the past 3.3 billion yr. This result can be used for dating intermediate aged surfaces by crater density.

  17. Conservation of protein structure over four billion years

    PubMed Central

    Ingles-Prieto, Alvaro; Ibarra-Molero, Beatriz; Delgado-Delgado, Asuncion; Perez-Jimenez, Raul; Fernandez, Julio M.; Gaucher, Eric A.; Sanchez-Ruiz, Jose M.; Gavira, Jose A.

    2013-01-01

    SUMMARY Little is known with certainty about the evolution of protein structures in general and the degree of protein structure conservation over planetary time scales in particular. Here we report the X-ray crystal structures of seven laboratory resurrections of Precambrian thioredoxins dating back up to ~4 billion years before present. Despite considerable sequence differences compared with extant enzymes, the ancestral proteins display the canonical thioredoxin fold while only small structural changes have occurred over 4 billion years. This remarkable degree of structure conservation since a time near the last common ancestor of life supports a punctuated-equilibrium model of structure evolution in which the generation of new folds occurs over comparatively short periods of time and is followed by long periods of structural stasis. PMID:23932589

  18. BLINK: Billion Lines INdexing in a clicK

    NASA Astrophysics Data System (ADS)

    Kamennoff, N.; Foucaud, S.; Reybier, S.; Tsai, M.-F.; Tang, C.-H.

    2012-09-01

    The coming generation of sky surveys will provide measurements of object properties for numbers of objects never reached before. Astronomical databases will have to deal with requests covering several billion entries at once, and therefore a new computational framework is vital for the next generation of data centers. As part of the efforts linked to the setting up of the Taiwan Extragalactic Astronomical Data Center (TWEA-DC), Billion Lines INdexing in a clicK (BLINK) is being developed to fill this role. BLINK is a framework that aims to ease access to large amounts of data and to share analysis software amongst users. BLINK is also designed to be parallelized and distributed over a large pool of heterogeneous resources. BLINK will initially provide a very fast indexing algorithm and cross-matching capability, enabling users to gather multiwavelength information for large chunks of the sky in a very limited period of time.

  19. Conservation of protein structure over four billion years.

    PubMed

    Ingles-Prieto, Alvaro; Ibarra-Molero, Beatriz; Delgado-Delgado, Asuncion; Perez-Jimenez, Raul; Fernandez, Julio M; Gaucher, Eric A; Sanchez-Ruiz, Jose M; Gavira, Jose A

    2013-09-01

    Little is known about the evolution of protein structures and the degree of protein structure conservation over planetary time scales. Here, we report the X-ray crystal structures of seven laboratory resurrections of Precambrian thioredoxins dating up to approximately four billion years ago. Despite considerable sequence differences compared with extant enzymes, the ancestral proteins display the canonical thioredoxin fold, whereas only small structural changes have occurred over four billion years. This remarkable degree of structure conservation since a time near the last common ancestor of life supports a punctuated-equilibrium model of structure evolution in which the generation of new folds occurs over comparatively short periods and is followed by long periods of structural stasis. PMID:23932589

  20. Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs

    NASA Technical Reports Server (NTRS)

    2015-01-01

    A Langley Research Center engineer’s work in the 1960s and ’70s to develop a wing with better performance near the speed of sound resulted in a significant increase in subsonic efficiency. The design was shared with industry. Today, Renton, Washington-based Boeing Commercial Airplanes, as well as most other plane manufacturers, apply it to all their aircraft, saving the airline industry billions of dollars in fuel every year.

  1. Waste Package Misload Probability

    SciTech Connect

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  2. Probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
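
    The post-processing step described here reduces to counting, per parcel, how many equally likely simulations exceed the threshold; a minimal sketch with hypothetical arrays (not the Fernald data or the authors' geostatistical code) might look like this:

        import numpy as np

        def exceedance_probability_map(simulations, threshold):
            """simulations: array of shape (n_realizations, n_rows, n_cols),
            each an equally likely simulated contamination field.
            Returns the per-parcel probability of exceeding `threshold`."""
            return (simulations > threshold).mean(axis=0)

        # Hypothetical example: 500 realizations on a 100 x 80 parcel grid.
        rng = np.random.default_rng(0)
        sims = rng.lognormal(mean=3.0, sigma=1.0, size=(500, 100, 80))
        prob_map = exceedance_probability_map(sims, threshold=50.0)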

  3. Measurement Uncertainty and Probability

    NASA Astrophysics Data System (ADS)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  4. Billion particle linac simulations for future light sources

    SciTech Connect

    Ryne, R. D.; Venturini, M.; Zholents, A. A.; Qiang, J.

    2008-09-25

    In this paper we report on multi-physics, multi-billion macroparticle simulation of beam transport in a free electron laser (FEL) linac for future light source applications. The simulation includes a self-consistent calculation of 3D space-charge effects, short-range geometry wakefields, longitudinal coherent synchrotron radiation (CSR) wakefields, and detailed modeling of RF acceleration and focusing. We discuss the need for and the challenges associated with such large-scale simulation. Applications to the study of the microbunching instability in an FEL linac are also presented.

  5. Emptiness Formation Probability

    NASA Astrophysics Data System (ADS)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the side length of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  6. Scalable in-memory RDFS closure on billions of triples.

    SciTech Connect

    Goodman, Eric L.; Mizell, David

    2010-06-01

    We present an RDFS closure algorithm, specifically designed and implemented on the Cray XMT supercomputer, that obtains inference rates of 13 million inferences per second on the largest system configuration we used. The Cray XMT, with its large global memory (4TB for our experiments), permits the construction of a conceptually straightforward algorithm, fundamentally a series of operations on a shared hash table. Each thread is given a partition of triple data to process, a dedicated copy of the ontology to apply to the data, and a reference to the hash table into which it inserts inferred triples. The global nature of the hash table allows the algorithm to avoid a common obstacle for distributed memory machines: the creation of duplicate triples. On LUBM data sets ranging between 1.3 billion and 5.3 billion triples, we obtain nearly linear speedup except for two portions: file I/O, which can be ameliorated with additional service nodes, and data structure initialization, which requires nearly constant time for runs involving 32 processors or more.
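
    The Cray XMT implementation is not reproduced here, but the core idea, applying entailment rules to triples and inserting the inferences into a single duplicate-free table, can be sketched serially; in this Python toy (two RDFS rules only, with invented IRIs), a set stands in for the shared hash table.

        # Minimal serial sketch of RDFS closure over two rules
        # (rdfs9: type propagation, rdfs11: subClassOf transitivity).
        # The shared `closure` set stands in for the XMT's global hash table.
        TYPE, SUBCLASS = "rdf:type", "rdfs:subClassOf"

        def rdfs_closure(triples):
            closure = set(triples)
            changed = True
            while changed:
                changed = False
                new = set()
                for (s, p, o) in closure:
                    if p == SUBCLASS:
                        for (s2, p2, o2) in closure:
                            if p2 == TYPE and o2 == s:          # rdfs9
                                new.add((s2, TYPE, o))
                            if p2 == SUBCLASS and s2 == o:      # rdfs11
                                new.add((s, SUBCLASS, o2))
                fresh = new - closure
                if fresh:
                    closure |= fresh
                    changed = True
            return closure

        triples = {("ex:Cat", SUBCLASS, "ex:Mammal"),
                   ("ex:Mammal", SUBCLASS, "ex:Animal"),
                   ("ex:felix", TYPE, "ex:Cat")}
        print(rdfs_closure(triples))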

  7. $75 Billion in Formula Grants Failed to Drive Reform. Can $5 Billion in Competitive Grants Do the Job? Education Stimulus Watch. Special Report 2

    ERIC Educational Resources Information Center

    Smarick, Andy

    2009-01-01

    In early 2009, Congress passed and President Barack Obama signed into law the American Recovery and Reinvestment Act (ARRA), the federal government's nearly $800 billion stimulus legislation. According to key members of Congress and the Obama administration, the education portions of the law, totaling about $100 billion, were designed both to…

  8. Galaxy Evolution over the Last Eight Billion Years

    NASA Astrophysics Data System (ADS)

    Zhu, Guangtun; Blanton, M. R.; Hogg, D. W.; Eisenstein, D. J.; Coil, A. L.; Cool, R. J.; Moustakas, J.; Wong, K. C.

    2011-01-01

    We study galaxy evolution over the last eight billion years with large, deep galaxy surveys, PRIMUS, SDSS and DEEP2. Galaxies have changed dramatically over this period of time. The global star formation rate has declined by roughly an order-of-magnitude. Red galaxies have grown substantially in number and mass. Blue galaxies have faded and grown redder as their star formation rate dropped. I demonstrate these evolutionary features with new results from these surveys. I also introduce PRIMUS, the largest faint galaxy survey to date. We have measured 140,000 robust redshifts to depths of i(AB) ~ 23 up to z ~ 1, covering 9.1 square degrees of the sky. I show that with the existing deep multi-wavelength imaging in PRIMUS fields we are able to study the evolution in greater detail and investigate proposed physical mechanisms responsible for the evolution.

  9. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  10. Bigger, Better Catalog Unveils Half a Billion Celestial Objects

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These frames are samples from the photographic sky surveys, which have been digitized by a technical team at the Space Telescope Science Institute to support the Hubble Space Telescope operations. The team processed these images to create a new astronomical catalog, called the Guide Star Catalog II. This project was undertaken by the Space Telescope Science Institute as an upgrade to an earlier sky survey and catalog (DSS-I and GSC-I), initially done to provide guide stars for pointing the Hubble Space Telescope. By virtue of its sheer size, the DSS-II and GSC-II have many research applications for both professional and amateur astronomers. [Top] An example from the DSS-II shows the Rosette Nebula (originally photographed by the Palomar Observatory) as digitized in the DSS-I (left) and DSS-II (right). The DSS-II includes views of the sky at both red and blue wavelengths, providing invaluable color information on about one billion deep-sky objects. [Bottom] This blow-up of the inset box in the raw DSS-I scan shows examples of the GSC-I and the improved GSC-II catalogs. Astronomers extracted the stars from the scanned plate of the Rosette and listed them in the catalogs. The new GSC-II catalog provides the colors, positions, and luminosities of nearly half a billion stars -- over 20 times as many as the original GSC-I. The GSC-II contains information on stars as dim as the 19th magnitude. Credit: NASA, the DSS-II and GSC-II Consortia (with images from the Palomar Observatory-STScI Digital Sky Survey of the northern sky, based on scans of the Second Palomar Sky Survey, copyright © 1993-1999 by the California Institute of Technology)

  11. The Probability of Causal Conditionals

    ERIC Educational Resources Information Center

    Over, David E.; Hadjichristidis, Constantinos; Evans, Jonathan St. B. T.; Handley, Simon J.; Sloman, Steven A.

    2007-01-01

    Conditionals in natural language are central to reasoning and decision making. A theoretical proposal called the Ramsey test implies the conditional probability hypothesis: that the subjective probability of a natural language conditional, P(if p then q), is the conditional subjective probability, P(q|p). We report three experiments on…

  12. Quantum probability and many worlds

    NASA Astrophysics Data System (ADS)

    Hemmo, Meir; Pitowsky, Itamar

    We discuss the meaning of probabilities in the many worlds interpretation of quantum mechanics. We start by presenting very briefly the many worlds theory, how the problem of probability arises, and some unsuccessful attempts to solve it in the past. Then we criticize a recent attempt by Deutsch to derive the quantum mechanical probabilities from the non-probabilistic parts of quantum mechanics and classical decision theory. We further argue that the Born probability does not make sense even as an additional probability rule in the many worlds theory. Our conclusion is that the many worlds theory fails to account for the probabilistic statements of standard (collapse) quantum mechanics.

  13. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. Sixty-two fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level, meaning that a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that motivated students gained from the probability workshop, with their performance in the probability topic showing a positive improvement compared with before the workshop. In addition, there is a significant difference in students' performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  14. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  15. Orbital forcing of climate 1.4 billion years ago.

    PubMed

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U; Wang, Huajian; Costa, M Mafalda; Bjerrum, Christian J; Connelly, James N; Zhang, Baomin; Bian, Lizeng; Canfield, Donald E

    2015-03-24

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes in the location of the Xiamaling relative to the position of the Intertropical Convergence Zone. On shorter time scales, and within a precisely calibrated stratigraphic framework, cyclicity in sediment geochemical dynamics is consistent with orbital control. In particular, sediment geochemical fluctuations reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment. PMID:25775605

  16. Fuel efficient stoves for the poorest two billion

    NASA Astrophysics Data System (ADS)

    Gadgil, Ashok

    2012-03-01

    About 2 billion people cook their daily meals on generally inefficient, polluting biomass cookstoves. The fuels include twigs and leaves, agricultural waste, animal dung, firewood, and charcoal. Exposure to the resulting smoke leads to acute respiratory illness and cancers, particularly among women cooks and their infant children near them. The resulting annual mortality estimate is almost 2 million deaths, higher than that from malaria or tuberculosis. There is a large diversity of cooking methods (baking, boiling, long simmers, brazing and roasting), and a diversity of pot shapes and sizes in which the cooking is undertaken. Fuel efficiency and emissions depend on the tending of the fire (and thermal power), type of fuel, stove characteristics, and fit of the pot to the stove. Thus, no one perfect fuel-efficient low-emitting stove can suit all users. Affordability imposes a further severe constraint on the stove design. For various economic strata within the users, a variety of stove designs may be appropriate and affordable. In some regions, biomass is harvested non-renewably for cooking fuel. There is also increasing evidence that black carbon emitted from stoves is a significant contributor to atmospheric forcing. Thus improved biomass stoves can also help mitigate global climate change. The speaker will describe specific work undertaken to design, develop, test, and disseminate affordable fuel-efficient stoves for internally displaced persons (IDPs) of Darfur, Sudan, where the IDPs face hardship, humiliation, hunger, and risk of sexual assault owing to their dependence on local biomass for cooking their meals.

  17. Orbital forcing of climate 1.4 billion years ago

    PubMed Central

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U.; Wang, Huajian; Costa, M. Mafalda; Bjerrum, Christian J.; Connelly, James N.; Zhang, Baomin; Bian, Lizeng; Canfield, Donald E.

    2015-01-01

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes in the location of the Xiamaling relative to the position of the Intertropical Convergence Zone. On shorter time scales, and within a precisely calibrated stratigraphic framework, cyclicity in sediment geochemical dynamics is consistent with orbital control. In particular, sediment geochemical fluctuations reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment. PMID:25775605

  18. Semantic Sensor Observation Networks in a Billion-Sensor World

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Bogden, P.; Creager, G.; Graybeal, J.

    2008-12-01

    In 2010, there will be 10,000 telemetric devices for every human on the planet (prediction by Ernst and Young). Some of these devices will be collecting data from coastal phenomena. Some will be connected to adaptive sampling systems, which allow observing a phenomenon, forecasting its advance, and triggering other numerical models, new missions, or changes to the sampling frequency of other sensors. These highly sophisticated autonomous and adaptive sensors will help improve the understanding of coastal phenomena; however, collaborative arrangements among communities need to happen to be able to interoperate in a world of billions of sensors. Arrangements will allow discovery and sharing of sensor descriptions and understanding and usage of observed data. OOSTethys is an open source collaborative project that helps implement ocean observing system components. Some of these components include sensor interfaces, catalogs of services, and semantic mediators. The OOSTethys team seeks to speed up collaborative arrangements by studying the best standards available, creating easy-to-adopt toolkits, and publishing guides that facilitate the implementation of these components. The interaction of some observing system components, and lessons learned about developing Semantic Sensor Networks using OGC Sensor Observation Services and ontologies, will be discussed.

  19. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  20. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  1. How to make a billion-barrel oil field in offshore California commercial

    SciTech Connect

    Patterson, J.C.; Ballard, J.H.

    1988-02-01

    The major obstacles and challenges involved in exploration and development of a giant deep-water low-gravity oil field are exemplified in the undeveloped Sword field of offshore southern California. In 1979, Conoco Exploration identified a northeast-southwest-trending basement high in the 800 to 2000-ft deep federal waters 12 mi southwest of Pt. Conception at the western end of the Santa Barbara Channel. The intended reservoir was fractured Miocene Monterey chert, silicic shales/siltstones, and dolomites that are draped over the axially faulted structure. Drilling of the initial well in OCS P-0322 in 1982 resulted in discovering the giant Sword field. A confirmation well drilled in OCS P-0320 indicates in-place reserves of well over 1 billion bbl. While the discovered potential is significant, the low gravity (8.5°-10.5° API) of the oils discovered to date, along with water depths in excess of 1500 ft, currently pose economic challenges to successful field development. Conoco and its partners are addressing the current economic barriers on a number of fronts. Three-dimensional seismic surveys are being conducted to better delineate reservoir geometry and to define probable variations in lithology, fracturing, and oil gravity. A market feasibility study will be undertaken to assess the demand for low-gravity crude from offshore California.

  2. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
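
    A rough numerical sketch of the partition-and-combine idea (not the authors' estimator; the per-group "extinction estimate" below is just a naive proportion, and the species data are invented): split species into low- and high-detection groups, estimate a rate within each group, and combine the group estimates with weights proportional to group size.

        import numpy as np

        def weighted_extinction_estimate(detection_freq, extinct_flags, cutoff):
            """detection_freq: per-species detection frequencies (0..1);
            extinct_flags: 1 if the species was not detected again later
            (a crude stand-in for a local-extinction estimate), else 0.
            Species are partitioned at `cutoff`, and the two group estimates
            are combined with weights proportional to group size."""
            detection_freq = np.asarray(detection_freq, dtype=float)
            extinct_flags = np.asarray(extinct_flags, dtype=float)
            low = detection_freq < cutoff
            groups = [extinct_flags[low], extinct_flags[~low]]
            weights = [g.size for g in groups]
            estimates = [g.mean() if g.size else 0.0 for g in groups]
            return np.average(estimates, weights=weights)

        # Hypothetical 10-species community.
        freq = [0.1, 0.2, 0.15, 0.3, 0.7, 0.8, 0.9, 0.65, 0.75, 0.85]
        ext  = [1,   1,   0,    1,   0,   0,   0,   1,    0,    0  ]
        print(weighted_extinction_estimate(freq, ext, cutoff=0.5))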

  3. The Probabilities of Conditionals Revisited

    ERIC Educational Resources Information Center

    Douven, Igor; Verbrugge, Sara

    2013-01-01

    According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of…

  4. Minimizing the probable maximum flood

    SciTech Connect

    Woodbury, M.S.; Pansic, N. ); Eberlein, D.T. )

    1994-06-01

    This article examines Wisconsin Electric Power Company's efforts to determine an economical way to comply with Federal Energy Regulatory Commission requirements at two hydroelectric developments on the Michigamme River. Their efforts included refinement of the area's probable maximum flood model based, in part, on a newly developed probable maximum precipitation estimate.

  5. Decision analysis with approximate probabilities

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas

    1992-01-01

    This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied. This is due to the fact that some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decisionmaking using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.
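
    As a loose illustration (toy payoffs of my own; the paper's Maximin and Extended Laplace criteria operate on the full set of probability vectors consistent with the rounding, which is not reproduced here), the snippet below compares the action chosen under renormalized rounded probabilities, under the Standard Laplace uniform weighting, and under a pure worst-case rule.

        import numpy as np

        # Toy payoff matrix: rows = actions, columns = three states of nature.
        payoffs = np.array([[10.0,  2.0, -5.0],
                            [ 4.0,  4.0,  3.0],
                            [ 0.0,  6.0,  1.0]])

        rounded = np.array([0.5, 0.3, 0.1])   # rounded probabilities, summing to 0.9

        # Renormalize the rounded probabilities before taking expectations.
        probs = rounded / rounded.sum()
        print("renormalized-rounded choice:", int(np.argmax(payoffs @ probs)))

        # Standard Laplace: ignore the elicited probabilities, weight states equally.
        print("Laplace choice:", int(np.argmax(payoffs.mean(axis=1))))

        # Pure worst case: pick the action with the best minimum payoff.
        print("worst-case choice:", int(np.argmax(payoffs.min(axis=1))))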

  6. Probability of sea level rise

    SciTech Connect

    Titus, J.G.; Narayanan, V.K.

    1995-10-01

    The report develops probability-based projections that can be added to local tide-gage trends to estimate future sea level at particular locations. It uses the same models employed by previous assessments of sea level rise. The key coefficients in those models are based on subjective probability distributions supplied by a cross-section of climatologists, oceanographers, and glaciologists.
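
    A minimal sketch of how such projections are meant to be used (illustrative numbers only, not the report's coefficients or models): sampled values of the projected additional rise are added to the extrapolated local tide-gage trend, and percentiles of the sum give probability-based estimates of future local sea level.

        import numpy as np

        def local_sea_level(year, base_year, local_trend_mm_per_yr, rise_samples_mm):
            """Combine a local tide-gage trend with probabilistic samples of the
            additional projected rise; returns the 5th/50th/95th percentiles (mm)."""
            years = year - base_year
            totals = local_trend_mm_per_yr * years + np.asarray(rise_samples_mm)
            return np.percentile(totals, [5, 50, 95])

        # Hypothetical: 2.1 mm/yr local trend, Monte Carlo samples of added rise by 2100.
        rng = np.random.default_rng(3)
        samples = rng.lognormal(mean=5.5, sigma=0.5, size=10_000)  # mm, illustrative only
        print(local_sea_level(2100, 1990, 2.1, samples))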

  7. Computation of Most Probable Numbers

    PubMed Central

    Russek, Estelle; Colwell, Rita R.

    1983-01-01

    A rapid computational method for maximum likelihood estimation of most-probable-number values, incorporating a modified Newton-Raphson method, is presented. The method offers a much greater reliability for the most-probable-number estimate of total viable bacteria, i.e., those capable of growth in laboratory media. PMID:6870242
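
    A compact sketch of this kind of computation, assuming the standard MPN likelihood and hypothetical tube counts (this is not the authors' published routine): the most probable number lambda maximizes L(lambda) = sum_i [ p_i * ln(1 - exp(-lambda * v_i)) - (n_i - p_i) * lambda * v_i ], and Newton-Raphson iterates lambda <- lambda - L'(lambda) / L''(lambda).

        import math

        def mpn_newton(volumes, n_tubes, positives, lam=1.0, tol=1e-10, max_iter=100):
            """Maximum likelihood most-probable-number estimate by Newton-Raphson.
            volumes[i]  : sample volume inoculated per tube at dilution i
            n_tubes[i]  : number of tubes at dilution i
            positives[i]: number of positive tubes at dilution i"""
            for _ in range(max_iter):
                d1 = d2 = 0.0
                for v, n, p in zip(volumes, n_tubes, positives):
                    e = math.exp(-lam * v)
                    d1 += p * v * e / (1.0 - e) - (n - p) * v   # dL/dlambda
                    d2 -= p * v * v * e / (1.0 - e) ** 2        # d2L/dlambda2
                step = d1 / d2
                lam -= step
                if abs(step) < tol:
                    break
            return lam

        # Hypothetical 3-dilution series (10, 1, 0.1 mL; 5 tubes each).
        print(mpn_newton([10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 1]))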

  8. 3.5 billion years of reshaped Moho, southern Africa

    NASA Astrophysics Data System (ADS)

    Stankiewicz, Jacek; de Wit, Maarten

    2013-12-01

    According to some previous studies, Archean continental crust is, on global average, apparently thinner than Proterozoic crust. Subsequently, the validity of this statement has been questioned. To provide an additional perspective on this issue, we present analyses of Moho signatures derived from recent seismic data along swaths 2000 km in length across southern Africa and its flanking ocean. The imaged crust has a near continuous age range between ca. 0.1 and 3.7 billion years, and the seismic data allow direct comparison of Moho depths between adjacent Archean, Proterozoic and Phanerozoic crust. We find no simple secular change in depth to Moho over this time period. In contrast, there is significant variation in depth to Moho beneath both Archean and Proterozoic crust; Archean crust of southern Africa displays as much crustal diversity in thickness as the adjacent Proterozoic crust. The Moho beneath all crustal provinces that we have analysed has been severely altered by tectono-metamorphic and igneous processes, in many cases more than once, and cannot provide unequivocal data for geodynamic models dealing with secular changes in continental crust formation. These results and conclusions are similar to those documented along ca. 2000 km swaths across the Canadian Shield recorded by Lithoprobe. Tying the age and character of the Precambrian crust of southern Africa to their depth diversities is clearly related to manifold processes of tectono-thermal ‘surgery’ subsequent to their origin, the details of which are still to be resolved, as they are in most Precambrian terranes. Reconstructing pristine Moho of the early Earth therefore remains a formidable challenge. In South Africa, better knowledge of ‘fossilised’ Archean crustal sections ‘turned-on-edge’, such as at the Vredefort impact crater (for the continental crust), and from the Barberton greenstone belt (for oceanic crust) is needed to characterize potential pristine Archean Moho transitions.

  9. A Multi-billion Parcel Atmospheric Trajectory Model

    NASA Astrophysics Data System (ADS)

    Cruz, C.; Clune, T. L.; Lait, L. R.; Ranawake, U.; Burns, R. W.

    2009-12-01

    We present a new parallel implementation of an atmospheric trajectory modelling framework which provides improved numerical accuracy, greater flexibility for specifying experiments, and sufficient raw performance to simultaneously simulate billions of parcel trajectories on suitable computing platforms. The application is parallelized using the Message Passing Interface (MPI) library and can scale efficiently on a wide variety of modern computing platforms. The ability to treat such large numbers of parcels is expected to enable a new generation of experiments to explore questions related to global stratosphere-troposphere exchange, age-of-air spectra, and transport of trace gases and aerosols. The modelling framework is written in C++ for easy integration with other computing technologies. It also provides a great deal of flexibility by allowing users to select from (or add to) alternative subclasses for vertical coordinates (pressure, potential temperature), integration schemes (Runge-Kutta, Euler), meteorological data sources (NCEP/NCAR Reanalsyis, MERRA), data interpolation methods (linear, log-linear, splines), and output (parcel histories, summary statistics, min/max quantities encountered). Significantly improved numerical accuracy, especially near the poles, is provided by expressing integration in terms of purely geometric constructs which avoid various complications associated with spherical coordinates near the poles. The entire package has been rigorously developed using Test-Driven Development (TDD) which both provides confidence in the implementation and should also assist other developers that wish to extend the framework. Several tests are performed to demonstrate the fourth-order Runge-Kutta integration scheme with our spherical geometric constructs. Tilted solid body rotation provides a baseline synthetic wind field for assessing model performance, and a time-varying case is used to examine the errors introduced by interpolating linearly in time
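
    A simplified sketch of the geometric approach described (Python rather than the framework's C++, with the tilted solid-body wind reduced to plain rotation about the z-axis): the parcel position is a 3-D unit vector, the wind supplies a tangent velocity, and the fourth-order Runge-Kutta step never touches latitude-longitude coordinates, so nothing special happens at the poles.

        import numpy as np

        R_EARTH = 6.371e6  # metres

        def solid_body_wind(x, omega=np.array([0.0, 0.0, 7.0e-5])):
            """Tangent velocity (m/s) of solid-body rotation about `omega`
            at the unit-sphere position x."""
            return np.cross(omega, x) * R_EARTH

        def rk4_step(x, dt, wind):
            """Advance a unit position vector x by dt seconds using RK4,
            working entirely in Cartesian coordinates (no pole problems)."""
            def f(p):
                return wind(p / np.linalg.norm(p)) / R_EARTH  # d(position)/dt for a unit vector
            k1 = f(x)
            k2 = f(x + 0.5 * dt * k1)
            k3 = f(x + 0.5 * dt * k2)
            k4 = f(x + dt * k3)
            x_new = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            return x_new / np.linalg.norm(x_new)  # re-project onto the sphere

        # One parcel started at 60N, 0E, advected for a day in 15-minute steps.
        lat, lon = np.radians(60.0), np.radians(0.0)
        x = np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])
        for _ in range(96):
            x = rk4_step(x, 900.0, solid_body_wind)
        print(np.degrees(np.arcsin(x[2])), np.degrees(np.arctan2(x[1], x[0])))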

  10. Earth's air pressure 2.7 billion years ago constrained to less than half of modern levels

    NASA Astrophysics Data System (ADS)

    Som, Sanjoy M.; Buick, Roger; Hagadorn, James W.; Blake, Tim S.; Perreault, John M.; Harnmeijer, Jelte P.; Catling, David C.

    2016-06-01

    How the Earth stayed warm several billion years ago when the Sun was considerably fainter is the long-standing problem of the `faint young Sun paradox'. Because of negligible O2 and only moderate CO2 levels in the Archaean atmosphere, methane has been invoked as an auxiliary greenhouse gas. Alternatively, pressure broadening in a thicker atmosphere with a N2 partial pressure around 1.6-2.4 bar could have enhanced the greenhouse effect. But fossilized raindrop imprints indicate that air pressure 2.7 billion years ago (Gyr) was below twice modern levels and probably below 1.1 bar, precluding such pressure enhancement. This result is supported by nitrogen and argon isotope studies of fluid inclusions in 3.0-3.5 Gyr rocks. Here, we calculate absolute Archaean barometric pressure using the size distribution of gas bubbles in basaltic lava flows that solidified at sea level ~2.7 Gyr in the Pilbara Craton, Australia. Our data indicate a surprisingly low surface atmospheric pressure of Patm = 0.23 +/- 0.23 (2σ) bar, and combined with previous studies suggests ~0.5 bar as an upper limit to late Archaean Patm. The result implies that the thin atmosphere was rich in auxiliary greenhouse gases and that Patm fluctuated over geologic time to a previously unrecognized extent.

  11. The probabilities of unique events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  12. Transition probabilities of Br II

    NASA Technical Reports Server (NTRS)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  13. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  14. VESPA: False positive probabilities calculator

    NASA Astrophysics Data System (ADS)

    Morton, Timothy D.

    2015-03-01

    Validation of Exoplanet Signals using a Probabilistic Algorithm (VESPA) calculates false positive probabilities and statistically validates transiting exoplanets. Written in Python, it uses isochrones [ascl:1503.010] and the package simpledist.

  15. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  16. Joint probabilities and quantum cognition

    SciTech Connect

    Acacio de Barros, J.

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  17. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  18. Evaluation of microbial release probabilities

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Work undertaken to improve the estimation of the probability of release of microorganisms from unmanned Martian landing spacecraft is summarized. An analytical model is described for the development of numerical values for release parameters and release mechanisms applicable to flight missions are defined. Laboratory test data are used to evolve parameter values for use by flight projects in estimating numerical values for release probabilities. The analysis treats microbial burden located on spacecraft surfaces, between mated surfaces, and encapsulated within materials.

  19. Joint probability distributions for projection probabilities of random orthonormal states

    NASA Astrophysics Data System (ADS)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
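
    A quick numerical illustration of the objects under study (a sketch, not the paper's analytic results): draw a Haar-random unitary column by QR-decomposing a complex Gaussian matrix and fixing phases, then look at the squared moduli of its components, which are the projection probabilities for one random orthonormal state.

        import numpy as np

        def haar_random_state(n, rng):
            """First column of a Haar-distributed unitary, obtained by QR
            decomposition of a complex Gaussian matrix with a phase fix."""
            z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
            q, r = np.linalg.qr(z)
            q = q * (np.diag(r) / np.abs(np.diag(r)))  # make the distribution Haar
            return q[:, 0]

        rng = np.random.default_rng(1)
        n, samples = 8, 20_000
        probs = np.array([np.abs(haar_random_state(n, rng)) ** 2 for _ in range(samples)])
        # Each row sums to 1; the mean projection probability should be close to 1/n.
        print(probs.sum(axis=1)[:3], probs.mean())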

  20. A SWIRE Picture is Worth Billions of Years

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [Figures removed for brevity, see original site. Figure 1: SWIRE View of Distant Galaxies; Figures 2-4 are described in the text below.]

    These spectacular images, taken by the Spitzer Wide-area Infrared Extragalactic (SWIRE) Legacy project, encapsulate one of the primary objectives of the Spitzer mission: to connect the evolution of galaxies from the distant, or early, universe to the nearby, or present day, universe.

    The Tadpole galaxy (main image) is the result of a recent galactic interaction in the local universe. Although these galactic mergers are rare in the universe's recent history, astronomers believe that they were much more common in the early universe. Thus, SWIRE team members will use this detailed image of the Tadpole galaxy to help understand the nature of the 'faint red-orange specks' of the early universe.

    The larger picture (figure 2) depicts one-sixteenth of the SWIRE survey field called ELAIS-N1. In this image, the bright blue sources are hot stars in our own Milky Way, which range anywhere from 3 to 60 times the mass of our Sun. The fainter green spots are cooler stars and galaxies beyond the Milky Way whose light is dominated by older stellar populations. The red dots are dusty galaxies that are undergoing intense star formation. The faintest specks of red-orange are galaxies billions of light-years away in the distant universe.

    Figure 3 features an unusual ring-like galaxy called CGCG 275-022. The red spiral arms indicate that this galaxy is very dusty and perhaps undergoing intense star formation. The star-forming activity could have been initiated by a near head-on collision with another galaxy.

    The most distant galaxies that SWIRE is able to detect are revealed in a zoom of deep space (figure 4). The colors in this feature represent the same objects as those in the larger field image of ELAIS

  1. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.

  2. Measure and probability in cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua S.; Wald, Robert M.

    2012-07-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. In ordinary statistical physics, the Liouville measure is used to compute probabilities of macrostates, and it would seem natural to use the similar measure arising in general relativity to compute probabilities in cosmology, such as the probability that the Universe underwent an era of inflation. Indeed, a number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)—namely, the Gibbons-Hawking-Stewart measure—to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account (we illustrate how) even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines “nearly homogeneous.” (4) In a Universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the Universe to retrodict the likelihood of past conditions.

  3. Flood hazard probability mapping method

    NASA Astrophysics Data System (ADS)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying a flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to subtract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high probability flood regions were made, regardless of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
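
    The weighted-index step can be sketched as follows (layer names and weights are hypothetical placeholders, not the study's calibrated PCDs; in practice each descriptor would also need its direction of influence set correctly): each grid cell receives a flood-probability index as a weighted sum of min-max normalized descriptor rasters.

        import numpy as np

        def flood_index(layers, weights):
            """layers: dict of equally shaped rasters of catchment descriptors;
            weights: dict of relative importances (same keys). Each layer is
            rescaled to 0..1, then combined as a weighted sum per cell."""
            total_w = sum(weights.values())
            index = None
            for name, raster in layers.items():
                r = np.asarray(raster, dtype=float)
                scaled = (r - r.min()) / (r.max() - r.min() + 1e-12)
                term = weights[name] / total_w * scaled
                index = term if index is None else index + term
            return index

        # Hypothetical descriptors on a 50 x 50 grid.
        rng = np.random.default_rng(2)
        layers = {"slope": rng.random((50, 50)),
                  "soil_wetness": rng.random((50, 50)),
                  "impervious_fraction": rng.random((50, 50))}
        weights = {"slope": 0.2, "soil_wetness": 0.5, "impervious_fraction": 0.3}
        idx = flood_index(layers, weights)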

  4. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  5. Probability as a Physical Motive

    NASA Astrophysics Data System (ADS)

    Martin, Peter

    2007-06-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  6. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  7. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  8. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    ERIC Educational Resources Information Center

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  9. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-on-one foul shot simulation in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)

  10. Some Surprising Probabilities from Bingo.

    ERIC Educational Resources Information Center

    Mercer, Joseph O.

    1993-01-01

    Investigates the probability of winning the largest prize at Bingo through a series of five simpler problems. Investigations are conducted with the aid of either BASIC computer programs, spreadsheets, or a computer algebra system such as Mathematica. Provides sample data tables to illustrate findings. (MDH)

  11. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  12. Conditional Independence in Applied Probability.

    ERIC Educational Resources Information Center

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  13. Dynamic SEP event probability forecasts

    NASA Astrophysics Data System (ADS)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
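
    A minimal sketch of one way such a time-decaying forecast could be implemented, assuming an empirical sample of flare-to-onset delay times and a simple Bayesian update; this is an illustration, not the authors' published algorithm.

      # Sketch: decay a SEP event probability forecast as time passes with no
      # onset.  F(t) is the empirical CDF of X-ray-peak-to-SEP-onset delays.
      import numpy as np

      def dynamic_sep_probability(p0, elapsed_hours, onset_delays_hours):
          delays = np.sort(np.asarray(onset_delays_hours, dtype=float))
          f_t = np.searchsorted(delays, elapsed_hours, side="right") / delays.size
          still_possible = p0 * (1.0 - f_t)      # event could still arrive
          return still_possible / (still_possible + (1.0 - p0))

      delays = [2, 3, 4, 6, 8, 10, 14, 20, 30, 48]   # toy delay sample, hours
      for t in (0, 6, 12, 24, 48):
          print(t, round(dynamic_sep_probability(0.40, t, delays), 3))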

  14. The Yatela gold deposit: 2 billion years in the making

    NASA Astrophysics Data System (ADS)

    Hein, K. A. A.; Matsheka, I. R.; Bruguier, O.; Masurel, Q.; Bosch, D.; Caby, R.; Monié, P.

    2015-12-01

    Gold mineralisation in the Yatela Main gold mine is hosted in a saprolitic residuum situated above Birimian supracrustal rocks, and at depth. The supracrustal rocks comprise metamorphosed calcitic and dolomitic marbles that were intruded by diorite (2106 ± 10 Ma, 207Pb/206Pb), and sandstone-siltstone-shale sequences (youngest detrital zircon population dated at 2139 ± 6 Ma). In-situ gold-sulphide mineralisation is associated with hydrothermal activity synchronous to emplacement of the diorite and forms a sub-economic resource; however, the overlying saprolitic residuum hosts economic gold mineralisation in friable lateritized palaeosols and aeolian sands (loess). Samples of saprolitic residuum were studied to investigate the morphology and composition of gold grains as a proxy for distance from source (and possible exploration vector) because the deposit hosts both angular and detrital gold, suggesting both proximal and distal sources. U-Pb geochronology of detrital zircons also indicated a proximal and distal source, with the age spectra giving Archaean (2.83-3.28 Ga), and Palaeoproterozoic (1.95-2.20 Ga) to Neoproterozoic (1.1-1.8 Ga) zircons in the Yatela depocentre. The 1.1-1.8 Ga age spectrum restricts the maximum age for the first deposition of the sedimentary units to the Neoproterozoic, or during early deposition in the Taoudeni Basin. Models for formation of the residuum include distal and proximal sources for detritus into the depocentre; however, it is more likely that material was sourced locally and included recycled material. The creation of a deep laterite weathering profile and supergene enrichment of the residuum probably took place during the mid-Cretaceous-early Tertiary.

  15. Rules Set for $4 Billion Race to Top Contest: Final Rules Give States Detailed Map in Quest for $4 Billion in Education Stimulus Aid

    ERIC Educational Resources Information Center

    McNeil, Michele

    2009-01-01

    For a good shot at $4 billion in grants from the federal Race to the Top Fund, states will need to make a persuasive case for their education reform agendas, demonstrate significant buy-in from local school districts, and devise plans to evaluate teachers and principals based on student performance, according to final regulations released last…

  16. Corporations Give Record $1.6 Billion to Colleges and Universities in 1984-85; Total Giving Reaches $6.3 Billion.

    ERIC Educational Resources Information Center

    CFAE Newsletter, 1986

    1986-01-01

    Findings from the publication, "Voluntary Support of Education 1984-85," are summarized. The survey report includes contributions to 1,114 colleges and universities. Highlights of findings show that: total estimated voluntary support was $6.32 billion in 1984-1985; for the first time, corporations contributed more than any other donor group ($1.57…

  17. If 1 in 10 U.S. Smokers Quits, $63 Billion Saved

    MedlinePlus

    ... nih.gov/medlineplus/news/fullstory_158758.html If 1 in 10 U.S. Smokers Quits, $63 Billion Saved ... money. That's because health care costs plummet just one year after stopping, new research shows. A 10 ...

  18. Probability densities in strong turbulence

    NASA Astrophysics Data System (ADS)

    Yakhot, Victor

    2006-03-01

    In this work we, using Mellin’s transform combined with the Gaussian large-scale boundary condition, calculate probability densities (PDFs) of velocity increments P(δu,r), velocity derivatives P(u,r) and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the Log-normal PDF P(δu,r) often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics which is responsible for the deviation of P(δu,r) from P(δu,r). An expression for the function D(h) of the multifractal theory, free from spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97] is also obtained.

  19. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to carry out a basic introduction to the analysis and applications of probabilistic concepts to the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present researches and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  20. Probability for primordial black holes

    NASA Astrophysics Data System (ADS)

    Bousso, R.; Hawking, S. W.

    1995-11-01

    We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.

  1. Relative transition probabilities of cobalt

    NASA Technical Reports Server (NTRS)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.

  2. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  3. Probability of Detection Demonstration Transferability

    NASA Technical Reports Server (NTRS)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  4. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters. PMID:22407706

  5. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  6. Measure and Probability in Cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua; Wald, Robert

    2012-03-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. A number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)---namely, the Gibbons-Hawking-Stewart measure---to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines ``nearly homogeneous''. (4) In a universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the universe to ``retrodict'' the likelihood of past conditions.

  7. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = 10 × 10⁻⁶/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10⁻⁶/yr. If the parameters were at their baseline values, and ΔCDF > 10⁻⁶/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10⁻⁶/yr but that time history’s ΔCDF < 10⁻⁶/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
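
    The counting procedure can be sketched as below; the lognormal scatter model for the estimated ΔCDF is a placeholder assumption standing in for the full MSPI Bayesian calculation.

      # Sketch: estimate false indication probability by simulating many time
      # histories of an estimated delta-CDF and counting threshold crossings.
      import numpy as np

      def false_indication_probability(true_delta_cdf, threshold=1e-6,
                                       n_histories=100_000, sigma=0.5, seed=1):
          rng = np.random.default_rng(seed)
          estimates = true_delta_cdf * rng.lognormal(0.0, sigma, n_histories)
          if true_delta_cdf <= threshold:                   # truth is green
              false_count = np.sum(estimates > threshold)   # false positives
          else:                                             # truth is white or above
              false_count = np.sum(estimates <= threshold)  # false negatives
          return false_count / n_histories

      print(false_indication_probability(2e-7))  # baseline (green) case
      print(false_indication_probability(2e-6))  # degraded (white) case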

  8. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  9. Imprecise probability for non-commuting observables

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.

    2015-08-01

    It is known that non-commuting observables in quantum mechanics do not have a joint probability. This statement refers to the precise (additive) probability model. I show that the joint distribution of any non-commuting pair of variables can be quantified via upper and lower probabilities, i.e. the joint probability is described by an interval instead of a number (imprecise probability). I propose transparent axioms from which the upper and lower probability operators follow. The imprecise probability depends on the non-commuting observables, is linear in the state (density matrix), and reverts to the usual expression for commuting observables.

  10. 3.4-Billion-year-old biogenic pyrites from Barberton, South Africa: sulfur isotope evidence

    NASA Technical Reports Server (NTRS)

    Ohmoto, H.; Kakegawa, T.; Lowe, D. R.

    1993-01-01

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>>10(-13) of the present atmospheric level) of free oxygen.

  11. 3.4-Billion-year-old biogenic pyrites from Barberton, South Africa: sulfur isotope evidence.

    PubMed

    Ohmoto, H; Kakegawa, T; Lowe, D R

    1993-10-22

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>10(-13) of the present atmospheric level) of free oxygen. PMID:11539502

  12. NOAA Budget Increases to $4.1 Billion, But Some Key Items Are Reduced

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2008-02-01

    The Bush administration has proposed a US$4.1 billion budget for fiscal year (FY) 2009 for the U.S. National Oceanic and Atmospheric Administration (NOAA). The proposed budget, which would be the agency's largest ever, is $202.6 million, or 5.2%, above the FY 2008 enacted budget. By topping $4 billion and the amount Congress passed for FY 2008, the budget proposal crosses into ``a new threshold,'' according to Navy Vice Admiral Conrad Lautenbacher, undersecretary of commerce for oceans and atmosphere and NOAA administrator.

  13. Billions for biodefense: federal agency biodefense funding, FY2001-FY2005.

    PubMed

    Schuler, Ari

    2004-01-01

    Over the past several years, the United States government has spent substantial resources on preparing the nation against a bioterrorist attack. This article analyzes the civilian biodefense funding by the federal government from fiscal years 2001 through 2005, specifically analyzing the budgets and allocations for biodefense at the Department of Health and Human Services, the Department of Homeland Security, the Department of Defense, the Department of Agriculture, the Environmental Protection Agency, the National Science Foundation, and the Department of State. In total, approximately $14.5 billion has been funded for civilian biodefense through FY2004, with an additional $7.6 billion in the President's budget request for FY2005. PMID:15225402

  14. 3. 4-billion-year-old biogenic pyrites from Barberton, South Africa: Sulfur isotope evidence

    SciTech Connect

    Ohmoto, H.; Kakegawa, T.; Lowe, D. R.

    1993-10-22

    Laser ablation mass spectroscopy analyses of sulfur isotopic compositions of microscopic-sized grains of pyrite that formed about 3.4 billion years ago in the Barberton Greenstone Belt, South Africa, show that the pyrite formed by bacterial reduction of seawater sulfate. These data imply that by about 3.4 billion years ago sulfate-reducing bacteria had become active, the oceans were rich in sulfate, and the atmosphere contained appreciable amounts (>>10(-13) of the present atmospheric level) of free oxygen.

  15. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, Z_pZ_t), as well as with fissility of the CN, χ_CN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross
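
    In analyses of this kind the average fusion probability is typically extracted from the factorisation of the evaporation-residue cross section; a schematic statement in LaTeX notation (the capture and survival inputs vary between studies, so this is a sketch rather than the paper's exact prescription):

      \sigma_{\mathrm{ER}}(E) = \sigma_{\mathrm{cap}}(E)\, P_{\mathrm{CN}}(E)\, W_{\mathrm{sur}}(E)
      \quad \Longrightarrow \quad
      \langle P_{\mathrm{CN}} \rangle \simeq
      \frac{\sigma_{\mathrm{ER}}^{\,\mathrm{expt}}}{\sigma_{\mathrm{cap}}\, W_{\mathrm{sur}}^{\mathrm{SM}}} .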

  16. Exploring the Overestimation of Conjunctive Probabilities

    PubMed Central

    Nilsson, Håkan; Rieskamp, Jörg; Jenny, Mirjam A.

    2013-01-01

    People often overestimate probabilities of conjunctive events. The authors explored whether the accuracy of conjunctive probability estimates can be improved by increased experience with relevant constituent events and by using memory aids. The first experiment showed that increased experience with constituent events increased the correlation between the estimated and the objective conjunctive probabilities, but that it did not reduce overestimation of conjunctive probabilities. The second experiment showed that reducing cognitive load with memory aids for the constituent probabilities led to improved estimates of the conjunctive probabilities and to decreased overestimation of conjunctive probabilities. To explain the cognitive process underlying people’s probability estimates, the configural weighted average model was tested against the normative multiplicative model. The configural weighted average model generates conjunctive probabilities that systematically overestimate objective probabilities although the generated probabilities still correlate strongly with the objective probabilities. For the majority of participants this model was better than the multiplicative model in predicting the probability estimates. However, when memory aids were provided, the predictive accuracy of the multiplicative model increased. In sum, memory tools can improve people’s conjunctive probability estimates. PMID:23460026
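
    The two candidate models can be contrasted in a few lines; the weight placed on the smaller constituent probability is illustrative, not the fitted value from the paper.

      # Sketch: normative multiplicative model vs. a configural weighted
      # average that over-weights the smaller constituent probability.
      def multiplicative(p_a, p_b):
          return p_a * p_b          # independent constituents

      def configural_weighted_average(p_a, p_b, w_low=0.7):
          lo, hi = min(p_a, p_b), max(p_a, p_b)
          return w_low * lo + (1.0 - w_low) * hi

      for p_a, p_b in [(0.9, 0.8), (0.6, 0.3), (0.2, 0.1)]:
          print(p_a, p_b, round(multiplicative(p_a, p_b), 3),
                round(configural_weighted_average(p_a, p_b), 3))

    Because the weighted average always lies at or above the smaller constituent, it reproduces the systematic overestimation of conjunctions described above while still correlating with the objective values.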

  17. Direct probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.

    1993-09-17

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.
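
    The post-processing step (turning many equally likely realizations into a probability-of-exceedance map) reduces to a per-cell frequency count; a minimal sketch on synthetic stand-in realizations:

      # Sketch: per-cell probability of exceeding a cleanup threshold, given a
      # stack of equally likely simulated contamination maps (synthetic here).
      import numpy as np

      def exceedance_probability(realizations, threshold):
          # realizations: (n_sims, ny, nx) -> probability map of shape (ny, nx)
          return np.mean(realizations > threshold, axis=0)

      rng = np.random.default_rng(42)
      sims = rng.lognormal(mean=3.0, sigma=0.8, size=(500, 50, 50))
      prob_map = exceedance_probability(sims, threshold=35.0)
      print(prob_map.shape, float(prob_map.min()), float(prob_map.max()))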

  18. Trajectory versus probability density entropy.

    PubMed

    Bologna, M; Grigolini, P; Karagiorgis, M; Rosa, A

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy. PMID:11461383

  19. Trajectory versus probability density entropy

    NASA Astrophysics Data System (ADS)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  20. Probability distributions of turbulent energy.

    PubMed

    Momeni, Mahdi; Müller, Wolf-Christian

    2008-05-01

    Probability density functions (PDFs) of scale-dependent energy fluctuations, P[δE(l)], are studied in high-resolution direct numerical simulations of Navier-Stokes and incompressible magnetohydrodynamic (MHD) turbulence. MHD flows with and without a strong mean magnetic field are considered. For all three systems it is found that the PDFs of inertial range energy fluctuations exhibit self-similarity and monoscaling in agreement with recent solar-wind measurements [Hnat, Geophys. Res. Lett. 29, 86 (2002)]. Furthermore, the energy PDFs exhibit similarity over all scales of the turbulent system showing no substantial qualitative change of shape as the scale of the fluctuations varies. This is in contrast to the well-known behavior of PDFs of turbulent velocity fluctuations. In all three cases under consideration the P[δE(l)] resemble Lévy-type gamma distributions, approximately Δ^(-1) exp(-|δE|/Δ)|δE|^(-γ). The observed gamma distributions exhibit a scale-dependent width Δ(l) and a system-dependent γ. The monoscaling property reflects the inertial-range scaling of the Elsässer-field fluctuations due to lacking Galilei invariance of δE. The appearance of Lévy distributions is made plausible by a simple model of energy transfer. PMID:18643170

  1. Two Billion Cars: What it Means for Climate and Energy Policy

    SciTech Connect

    Daniel Sperling

    2009-04-15

    April 13, 2009: Daniel Sperling, director of the Institute of Transportation Studies at UC Davis, presents the next installment of Berkeley Lab's Environmental Energy Technologies Division's Distinguished Lecture series. He discusses Two Billion Cars and What it Means for Climate and Energy Policy.

  2. Two Billion Cars: What it Means for Climate and Energy Policy

    ScienceCinema

    Daniel Sperling

    2010-01-08

    April 13, 2009: Daniel Sperling, director of the Institute of Transportation Studies at UC Davis, presents the next installment of Berkeley Lab's Environmental Energy Technologies Division's Distinguished Lecture series. He discusses Two Billion Cars and What it Means for Climate and Energy Policy.

  3. Nitrogen, phosphorus, and potassium requirements to support a multi-billion gallon biofuel industry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    To accomplish the goals for biofuel and bioenergy production, 1 billion tons of biomass will need to be produced annually by the year 2030. Crop production data from a joint study by the U.S. Department of Energy (US DOE) and the U.S. Department of Agriculture (USDA) demonstrated how this goal could...

  4. Multi-Billion Shot, High-Fluence Exposure of Cr(4+): YAG Passive Q-Switch

    NASA Technical Reports Server (NTRS)

    Stephen, Mark A.; Dallas, Joseph L.; Afzal, Robert S.

    1997-01-01

    NASA's Goddard Space Flight Center is developing the Geoscience Laser Altimeter System (GLAS) employing a diode-pumped, Q-switched Nd:YAG laser operating at a 40 Hz repetition rate. To meet the five-year mission lifetime goal, a single transmitter would accumulate over 6.3 billion shots. Cr(4+):YAG is a promising candidate material for passively Q-switching the laser. Historically, the performance of saturable absorbers has degraded over long-duration usage. To measure the multi-billion shot performance of Cr(4+):YAG, a passively Q-switched GLAS-like oscillator was tested at an accelerated repetition rate of 500 Hz. The intracavity fluence was calculated to be approximately 2.5 J/cm^2. The laser was monitored autonomously for 165 days. There was no evidence of change in the material optical properties during the 7.2 billion shot test. All observed changes in laser operation could be attributed to pump laser diode aging. This is the first demonstration of multi-billion shot exposure testing of Cr(4+):YAG in this pulse energy regime.

  5. High-Stakes Hustle: Public Schools and the New Billion Dollar Accountability

    ERIC Educational Resources Information Center

    Baines, Lawrence A.; Stanley, Gregory Kent

    2004-01-01

    High-stakes testing costs up to $50 billion per annum, has no impact on student achievement, and has changed the focus of American public schools. This article analyzes the benefits and costs of the accountability movement, as well as discusses its roots in the eugenics movements of the early 20th century.

  6. Conservation in a World of Six Billion: A Grassroots Action Guide.

    ERIC Educational Resources Information Center

    Hren, Benedict J.

    This grassroots action guide features a conservation initiative working to bring the impacts of human population growth, economic development, and natural resource consumption into balance with the limits of nature for the benefit of current and future generations. Contents include information sheets entitled "Six Billion People and Growing,""The…

  7. US Physician Practices Spend More Than $15.4 Billion Annually To Report Quality Measures.

    PubMed

    Casalino, Lawrence P; Gans, David; Weber, Rachel; Cea, Meagan; Tuchovsky, Amber; Bishop, Tara F; Miranda, Yesenia; Frankel, Brittany A; Ziehler, Kristina B; Wong, Meghan M; Evenson, Todd B

    2016-03-01

    Each year US physician practices in four common specialties spend, on average, 785 hours per physician and more than $15.4 billion dealing with the reporting of quality measures. While much is to be gained from quality measurement, the current system is unnecessarily costly, and greater effort is needed to standardize measures and make them easier to report. PMID:26953292

  8. Universities Report $1.8-Billion in Earnings on Inventions in 2011

    ERIC Educational Resources Information Center

    Blumenstyk, Goldie

    2012-01-01

    Universities and their inventors earned more than $1.8-billion from commercializing their academic research in the 2011 fiscal year, collecting royalties from new breeds of wheat, from a new drug for the treatment of HIV, and from longstanding arrangements over enduring products like Gatorade. Northwestern University earned the most of any…

  9. Cancer costs projected to reach at least $158 billion in 2020

    Cancer.gov

    Based on growth and aging of the U.S. population, medical expenditures for cancer in the year 2020 are projected to reach at least $158 billion (in 2010 dollars) – an increase of 27 percent over 2010. If newly developed tools for cancer diagnosis, treatme

  10. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  11. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
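
    As a purely illustrative example of what a probabilistic description P_BH(M_ZAMS) might look like, a smooth logistic turn-on in ZAMS mass can be written down; the functional form and parameter values below are assumptions for illustration, not the paper's result.

      # Toy P_BH(M_ZAMS): logistic turn-on with assumed midpoint and width.
      import math

      def p_bh(m_zams, m_half=25.0, width=5.0):
          # Probability of black hole (rather than neutron star) formation.
          return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

      for m in (10, 20, 25, 30, 40, 60):
          print(m, round(p_bh(m), 3))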

  12. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  13. Subjective and objective probabilities in quantum mechanics

    SciTech Connect

    Srednicki, Mark

    2005-05-15

    We discuss how the apparently objective probabilities predicted by quantum mechanics can be treated in the framework of Bayesian probability theory, in which all probabilities are subjective. Our results are in accord with earlier work by Caves, Fuchs, and Schack, but our approach and emphasis are different. We also discuss the problem of choosing a noninformative prior for a density matrix.

  14. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  15. Illustrating Basic Probability Calculations Using "Craps"

    ERIC Educational Resources Information Center

    Johnson, Roger W.

    2006-01-01

    Instructors may use the gambling game of craps to illustrate the use of a number of fundamental probability identities. For the "pass-line" bet we focus on the chance of winning and the expected game length. To compute these, probabilities of unions of disjoint events, probabilities of intersections of independent events, conditional probabilities…
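
    The pass-line calculation the article builds on is a short exercise in conditional probability; a sketch using exact fractions:

      # Pass-line win probability: win on a come-out 7 or 11; otherwise a
      # "point" of 4, 5, 6, 8, 9 or 10 must repeat before a 7.
      from fractions import Fraction

      pmf = {}
      for a in range(1, 7):
          for b in range(1, 7):
              pmf[a + b] = pmf.get(a + b, Fraction(0)) + Fraction(1, 36)

      p_win = pmf[7] + pmf[11]
      for point in (4, 5, 6, 8, 9, 10):
          # P(point before 7) = P(point) / (P(point) + P(7))
          p_win += pmf[point] * pmf[point] / (pmf[point] + pmf[7])
      print(p_win, float(p_win))   # 244/495, about 0.4929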

  16. Pre-Service Teachers' Conceptions of Probability

    ERIC Educational Resources Information Center

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  17. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  18. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  19. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    SciTech Connect

    Stewart, Jeffrey S.

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  20. How to Bring Solar Energy to Seven Billion People (LBNL Science at the Theater)

    ScienceCinema

    Wadia, Cyrus

    2011-04-28

    By exploiting the powers of nanotechnology and taking advantage of non-toxic, Earth-abundant materials, Berkeley Lab's Cyrus Wadia has fabricated new solar cell devices that have the potential to be several orders of magnitude less expensive than conventional solar cells. And by mastering the chemistry of these materials, and the economics of solar energy, he envisions bringing electricity to the 1.2 billion people now living without it.

  1. Aid to families with dependent children: who receives more than $22 billion and why?

    PubMed

    Waldman, H B

    1996-01-01

    A general outline of the Aid to Families with Dependent Children program is provided. The $22 billion program provides financial support to 14 million persons (including more than 9 million children). The changing character of the family structure is considered in terms of efforts to control AFDC spending. Additional programs to assist children (Social Security, Supplemental Security Insurance and Food Stamps) are reviewed. PMID:8708125

  2. Severe Obesity In Adults Cost State Medicaid Programs Nearly $8 Billion In 2013.

    PubMed

    Wang, Y Claire; Pamplin, John; Long, Michael W; Ward, Zachary J; Gortmaker, Steven L; Andreyeva, Tatiana

    2015-11-01

    Efforts to expand Medicaid while controlling spending must be informed by a deeper understanding of the extent to which the high medical costs associated with severe obesity (having a body mass index of [Formula: see text] or higher) determine spending at the state level. Our analysis of population-representative data indicates that in 2013, severe obesity cost the nation approximately $69 billion, which accounted for 60 percent of total obesity-related costs. Approximately 11 percent of the cost of severe obesity was paid for by Medicaid, 30 percent by Medicare and other federal health programs, 27 percent by private health plans, and 30 percent out of pocket. Overall, severe obesity cost state Medicaid programs almost $8 billion a year, ranging from $5 million in Wyoming to $1.3 billion in California. These costs are likely to increase following Medicaid expansion and enhanced coverage of weight loss therapies in the form of nutrition consultation, drug therapy, and bariatric surgery. Ensuring and expanding Medicaid-eligible populations' access to cost-effective treatment for severe obesity should be part of each state's strategy to mitigate rising obesity-related health care costs. PMID:26526251

  3. Spatial variability in oceanic redox structure 1.8 billion years ago

    NASA Astrophysics Data System (ADS)

    Poulton, Simon W.; Fralick, Philip W.; Canfield, Donald E.

    2010-07-01

    The evolution of ocean chemistry during the Proterozoic eon (2.5-0.542 billion years ago) is thought to have played a central role in both the timing and rate of eukaryote evolution. The timing of the deposition of iron formations implies that, early in the Earth's history, oceans were predominantly anoxic and rich in dissolved iron. However, global deposition of iron formations ceased about 1.84 billion years ago. This termination indicates a major upheaval in ocean chemistry, but the precise nature of this change remains debated. Here we use iron and sulphur systematics to reconstruct oceanic redox conditions from the 1.88- to 1.83-billion-year-old Animikie group from the Superior region, North America. We find that surface waters were oxygenated, whereas at mid-depths, anoxic and sulphidic (euxinic) conditions extended over 100 km from the palaeoshoreline. The spatial extent of euxinia varied through time, but deep ocean waters remained rich in dissolved iron. Widespread euxinia along continental margins would have removed dissolved iron from the water column through the precipitation of pyrite, which would have reduced the supply of dissolved iron and resulted in the global cessation of the deposition of `Superior-type' iron formations. We suggest that incursions of sulphide from the mid-depths into overlying oxygenated surface waters may have placed severe constraints on eukaryotic evolution.

  4. MMap: Fast Billion-Scale Graph Computation on a PC via Memory Mapping

    PubMed Central

    Lin, Zhiyuan; Kahng, Minsuk; Sabrin, Kaeser Md.; Chau, Duen Horng (Polo); Lee, Ho; Kang, U

    2015-01-01

    Graph computation approaches such as GraphChi and TurboGraph recently demonstrated that a single PC can perform efficient computation on billion-node graphs. To achieve high speed and scalability, they often need sophisticated data structures and memory management strategies. We propose a minimalist approach that forgoes such requirements, by leveraging the fundamental memory mapping (MMap) capability found on operating systems. We contribute: (1) a new insight that MMap is a viable technique for creating fast and scalable graph algorithms that surpasses some of the best techniques; (2) the design and implementation of popular graph algorithms for billion-scale graphs with little code, thanks to memory mapping; (3) extensive experiments on real graphs, including the 6.6 billion edge YahooWeb graph, and show that this new approach is significantly faster or comparable to the highly-optimized methods (e.g., 9.5× faster than GraphChi for computing PageRank on 1.47B edge Twitter graph). We believe our work provides a new direction in the design and development of scalable algorithms. Our packaged code is available at http://poloclub.gatech.edu/mmap/. PMID:25866846
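
    A minimal sketch of the memory-mapping idea, assuming an int64 (src, dst) edge list stored in a flat binary file; chunked streaming and dangling-node mass, which a production implementation would handle, are omitted. The file name, layout and damping factor are illustrative, not the paper's code.

      # Sketch: PageRank over a memory-mapped edge list (dangling mass ignored).
      import numpy as np

      def pagerank_mmap(edge_file, n_nodes, n_edges, d=0.85, iters=20):
          edges = np.memmap(edge_file, dtype=np.int64, mode="r",
                            shape=(n_edges, 2))
          out_deg = np.bincount(edges[:, 0], minlength=n_nodes)
          rank = np.full(n_nodes, 1.0 / n_nodes)
          for _ in range(iters):
              contrib = rank / np.maximum(out_deg, 1)      # per-edge share
              new_rank = np.zeros(n_nodes)
              np.add.at(new_rank, edges[:, 1], contrib[edges[:, 0]])
              rank = (1.0 - d) / n_nodes + d * new_rank
          return rank

      # Toy usage: write a 4-node graph to disk, then rank it.
      toy = np.array([[0, 1], [1, 2], [2, 0], [2, 3], [3, 0]], dtype=np.int64)
      toy.tofile("toy_edges.bin")
      print(pagerank_mmap("toy_edges.bin", n_nodes=4, n_edges=len(toy)))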

  5. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    PubMed

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes. PMID:27049949

  6. Two ten-billion-solar-mass black holes at the centres of giant elliptical galaxies.

    PubMed

    McConnell, Nicholas J; Ma, Chung-Pei; Gebhardt, Karl; Wright, Shelley A; Murphy, Jeremy D; Lauer, Tod R; Graham, James R; Richstone, Douglas O

    2011-12-01

    Observational work conducted over the past few decades indicates that all massive galaxies have supermassive black holes at their centres. Although the luminosities and brightness fluctuations of quasars in the early Universe suggest that some were powered by black holes with masses greater than 10 billion solar masses, the remnants of these objects have not been found in the nearby Universe. The giant elliptical galaxy Messier 87 hosts the hitherto most massive known black hole, which has a mass of 6.3 billion solar masses. Here we report that NGC 3842, the brightest galaxy in a cluster at a distance from Earth of 98 megaparsecs, has a central black hole with a mass of 9.7 billion solar masses, and that a black hole of comparable or greater mass is present in NGC 4889, the brightest galaxy in the Coma cluster (at a distance of 103 megaparsecs). These two black holes are significantly more massive than predicted by linearly extrapolating the widely used correlations between black-hole mass and the stellar velocity dispersion or bulge luminosity of the host galaxy. Although these correlations remain useful for predicting black-hole masses in less massive elliptical galaxies, our measurements suggest that different evolutionary processes influence the growth of the largest galaxies and their black holes. PMID:22158244

  7. Bell Could Become the Copernicus of Probability

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
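
    A worked instance of the CHSH test mentioned above, using the singlet-state correlation E(a, b) = -cos(a - b) at the standard measurement angles; classical (Kolmogorovian, local) models are bounded by |S| <= 2.

      # CHSH value for the singlet state at the standard angle choice.
      import math

      def E(a, b):
          return -math.cos(a - b)   # quantum correlation for the singlet

      a1, a2 = 0.0, math.pi / 2
      b1, b2 = math.pi / 4, 3 * math.pi / 4
      S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
      print(abs(S), 2 * math.sqrt(2))   # 2*sqrt(2) > 2, violating the bound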

  8. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely a non-Boolean structure and non-positive-definite phase-space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Here the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  9. Experience Matters: Information Acquisition Optimizes Probability Gain

    PubMed Central

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915

  10. Experience matters: information acquisition optimizes probability gain.

    PubMed

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information - information gain, Kullback-Leibler distance, probability gain (error minimization), and impact - are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior. PMID:20525915
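
    The four measures named above have simple closed forms for a single query. The Python sketch below (not taken from the paper, and using one common textbook formulation of each measure) computes the expected information gain, Kullback-Leibler distance, probability gain, and impact for a hypothetical binary test; note that information gain and KL distance coincide in expectation, which is one reason only carefully optimized experiments can separate these models.

        import numpy as np

        def entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def value_of_query(prior, likelihoods):
            """Expected usefulness of asking one question, under four measures.

            prior       : P(c), shape (C,)
            likelihoods : P(answer a | c), shape (A, C)
            Common textbook formulations; the paper's exact definitions may differ in detail.
            """
            prior = np.asarray(prior, float)
            lik = np.asarray(likelihoods, float)
            p_a = lik @ prior                        # marginal P(a)
            post = lik * prior / p_a[:, None]        # posterior P(c | a)

            info_gain = entropy(prior) - sum(pa * entropy(pc) for pa, pc in zip(p_a, post))
            kl_dist = sum(pa * np.sum(pc[pc > 0] * np.log2(pc[pc > 0] / prior[pc > 0]))
                          for pa, pc in zip(p_a, post))
            prob_gain = sum(pa * pc.max() for pa, pc in zip(p_a, post)) - prior.max()
            impact = sum(pa * np.abs(pc - prior).sum() for pa, pc in zip(p_a, post))
            return info_gain, kl_dist, prob_gain, impact

        # Hypothetical example: a fairly diagnostic yes/no test against a 70/30 prior.
        print(value_of_query([0.7, 0.3], [[0.9, 0.2],   # P(yes | c)
                                          [0.1, 0.8]])) # P(no  | c)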

  11. Derivation of quantum probability from measurement

    NASA Astrophysics Data System (ADS)

    Herbut, Fedor

    2016-05-01

    To begin with, it is pointed out that the form of the quantum probability formula originates in the very initial state of the object system as seen when the state is expanded with the eigenprojectors of the measured observable. Making use of the probability reproducibility condition, which is a key concept in unitary measurement theory, one obtains the relevant coherent distribution of the complete-measurement results in the final unitary-measurement state in agreement with the mentioned probability formula. Treating the transition from the final unitary, or premeasurement, state, where all possible results are present, to one complete-measurement result sketchily in the usual way, the well-known probability formula is derived. In conclusion it is pointed out that the entire argument is only formal unless one makes it physical assuming that the quantum probability law is valid in the extreme case of probability-one (certain) events (projectors).
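
    For reference, the quantum probability formula discussed above is the Born rule; in the projector form used in measurement theory it reads (a standard textbook statement, added here for orientation rather than quoted from the paper):

        p(k) \;=\; \operatorname{Tr}\!\left(\rho\, P_k\right)
              \;=\; \langle \psi | P_k | \psi \rangle \quad \text{for a pure state } \rho = |\psi\rangle\langle\psi| ,
        \qquad \sum_k P_k = \mathbb{1}, \quad P_k P_l = \delta_{kl} P_k ,

    where the P_k are the eigenprojectors of the measured observable, the objects in terms of which the abstract expands the initial state.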

  12. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely a non-Boolean structure and non-positive-definite phase-space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Here the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  13. Dynamic probability estimator for machine learning.

    PubMed

    Starzyk, Janusz A; Wang, Feng

    2004-03-01

    An efficient algorithm for dynamic estimation of probabilities, without division, on an unlimited number of input data is presented. The method estimates probabilities of the sampled data from the raw sample count, while keeping the total count value constant. The accuracy of the estimate depends on the counter size, rather than on the total number of data points. The estimator follows variations of the incoming data probability within a fixed window size, without explicit implementation of the windowing technique. The total design area is very small and all probabilities are estimated concurrently. The dynamic probability estimator was implemented using a programmable gate array from Xilinx. The performance of this implementation is evaluated in terms of the area efficiency and execution time. This method is suitable for the highly integrated design of artificial neural networks where a large number of dynamic probability estimators can work concurrently. PMID:15384523
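
    One classic way to obtain the behaviour described above, division-free estimates that track a drifting source, is to keep per-symbol counters and halve all of them whenever the total reaches a fixed power-of-two cap, so that a true division can be replaced by a bit shift in hardware. The Python sketch below illustrates that counter scheme only; it is not a reconstruction of the authors' FPGA design, and the cap value is arbitrary.

        import random

        class DynamicProbEstimator:
            """Counter-based probability estimates without true division.

            When the total count reaches a power-of-two cap, all counters are halved,
            so recent samples dominate (an implicit forgetting window). Illustrative
            sketch only; not the authors' circuit.
            """

            def __init__(self, symbols, total_cap=256):
                assert total_cap & (total_cap - 1) == 0, "cap must be a power of two"
                self.cap = total_cap
                self.counts = {s: 0 for s in symbols}
                self.total = 0

            def update(self, symbol):
                self.counts[symbol] += 1
                self.total += 1
                if self.total >= self.cap:          # rescale: bounds the counters
                    for s in self.counts:
                        self.counts[s] >>= 1
                    self.total = sum(self.counts.values())

            def prob(self, symbol):
                # With a power-of-two total this is a bit shift in hardware;
                # plain division is used here for readability.
                return self.counts[symbol] / max(self.total, 1)

        est = DynamicProbEstimator("ab")
        for _ in range(10000):
            est.update("a" if random.random() < 0.3 else "b")
        print(round(est.prob("a"), 2))              # roughly 0.3, tracking recent data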

  14. Entropy analysis of systems exhibiting negative probabilities

    NASA Astrophysics Data System (ADS)

    Tenreiro Machado, J. A.

    2016-07-01

    This paper addresses the concept of negative probability and its impact upon entropy. An analogy between the probability generating functions, in the scope of quasiprobability distributions, and the Grünwald-Letnikov definition of fractional derivatives, is explored. Two distinct cases producing negative probabilities are formulated and their distinct meanings clarified. Numerical calculations using the Shannon entropy further characterize the two limiting cases.
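
    For orientation, the Grünwald-Letnikov fractional derivative that anchors the analogy above is the limit of a weighted backward difference (standard definition, not quoted from the paper):

        D^{\alpha} f(x) \;=\; \lim_{h \to 0^{+}} \frac{1}{h^{\alpha}}
            \sum_{k=0}^{\infty} (-1)^{k} \binom{\alpha}{k} f(x - kh) .

    For non-integer α the series does not terminate and the weights (-1)^k C(α, k) take both signs, which is the formal point of contact with quasiprobability distributions that assign negative weights.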

  15. Calculating the CEP (Circular Error Probable)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report compares the probability contained in the Circular Error Probable (CEP) associated with an Elliptical Error Probable (EEP) to that of the EEP at a given confidence level. The levels examined are 50 percent and 95 percent. The CEP is found to be both more conservative and less conservative than the associated EEP, depending on the eccentricity of the ellipse. The formulas used are derived in the appendix.
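
    A Monte Carlo sketch of the comparison described above is given below (Python, illustrative only). It uses the common approximation CEP ≈ 0.589(σx + σy) for the circle radius rather than the report's own construction from the EEP, and shows that the probability actually contained in such a circle drifts away from the nominal 50 percent as the error ellipse becomes more eccentric.

        import numpy as np

        rng = np.random.default_rng(0)

        def contained_probability(sx, sy, n=400_000):
            """Fraction of a zero-mean bivariate normal falling inside a CEP-style circle.

            The radius uses the common approximation CEP ~ 0.589*(sx + sy); the report's
            construction from the EEP may differ, so treat this as an illustration only.
            """
            x, y = rng.normal(0, sx, n), rng.normal(0, sy, n)
            r_cep = 0.589 * (sx + sy)
            return np.mean(np.hypot(x, y) <= r_cep)

        for sy in (1.0, 0.6, 0.3, 0.1):          # increasing eccentricity of the error ellipse
            print(f"sigma_y = {sy}: P(inside circle) = {contained_probability(1.0, sy):.3f}")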

  16. Predicting accurate probabilities with a ranking loss

    PubMed Central

    Menon, Aditya Krishna; Jiang, Xiaoqian J; Vembu, Shankar; Elkan, Charles; Ohno-Machado, Lucila

    2013-01-01

    In many real-world applications of machine learning classifiers, it is essential to predict the probability of an example belonging to a particular class. This paper proposes a simple technique for predicting probabilities based on optimizing a ranking loss, followed by isotonic regression. This semi-parametric technique offers both good ranking and regression performance, and models a richer set of probability distributions than statistical workhorses such as logistic regression. We provide experimental results that show the effectiveness of this technique on real-world applications of probability prediction. PMID:25285328
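
    A minimal sketch of the rank-then-calibrate recipe is shown below (Python with scikit-learn, assumed available). The ranker here is plain logistic regression used only for its scores; the paper optimizes a dedicated ranking loss, which this sketch does not reproduce, so it illustrates the isotonic calibration step rather than the full method.

        import numpy as np
        from sklearn.isotonic import IsotonicRegression
        from sklearn.linear_model import LogisticRegression   # stand-in scorer, not the paper's ranker
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(4000, 5))
        p_true = 1.0 / (1.0 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] ** 2 - 1.0)))
        y = rng.binomial(1, p_true)

        X_fit, X_cal, y_fit, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)
        scores = LogisticRegression().fit(X_fit, y_fit).decision_function(X_cal)

        # Isotonic regression learns a monotone map from raw scores to calibrated probabilities.
        iso = IsotonicRegression(out_of_bounds="clip").fit(scores, y_cal)
        print(iso.predict(scores[:5]).round(3))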

  17. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, w(0) = 0, w(1/e) = 1/e and w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
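
    Prelec's weighting function is easy to evaluate directly; the Python sketch below (α = 0.65 is an arbitrary illustrative value) also checks its fixed point at p = 1/e, which holds for every α.

        import numpy as np

        def prelec_w(p, alpha=0.65):
            # Prelec (1998) probability weighting function w(p) = exp(-(-ln p)^alpha), 0 < alpha < 1.
            p = np.asarray(p, dtype=float)
            return np.exp(-(-np.log(p)) ** alpha)

        p = np.array([0.01, 0.1, 1 / np.e, 0.5, 0.9, 0.99])
        print(np.round(prelec_w(p), 3))   # note w(1/e) = 1/e regardless of alpha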

  18. Air density 2.7 billion years ago limited to less than twice modern levels by fossil raindrop imprints.

    PubMed

    Som, Sanjoy M; Catling, David C; Harnmeijer, Jelte P; Polivka, Peter M; Buick, Roger

    2012-04-19

    According to the 'Faint Young Sun' paradox, during the late Archaean eon a Sun approximately 20% dimmer warmed the early Earth such that it had liquid water and a clement climate. Explanations for this phenomenon have invoked a denser atmosphere that provided warmth by nitrogen pressure broadening or enhanced greenhouse gas concentrations. Such solutions are allowed by geochemical studies and numerical investigations that place approximate concentration limits on Archaean atmospheric gases, including methane, carbon dioxide and oxygen. But no field data constraining ground-level air density and barometric pressure have been reported, leaving the plausibility of these various hypotheses in doubt. Here we show that raindrop imprints in tuffs of the Ventersdorp Supergroup, South Africa, constrain surface air density 2.7 billion years ago to less than twice modern levels. We interpret the raindrop fossils using experiments in which water droplets of known size fall at terminal velocity into fresh and weathered volcanic ash, thus defining a relationship between imprint size and raindrop impact momentum. Fragmentation following raindrop flattening limits raindrop size to a maximum value independent of air density, whereas raindrop terminal velocity varies as the inverse of the square root of air density. If the Archaean raindrops reached the modern maximum measured size, air density must have been less than 2.3 kg m(-3), compared to today's 1.2 kg m(-3), but because such drops rarely occur, air density was more probably below 1.3 kg m(-3). The upper estimate for air density renders the pressure broadening explanation possible, but it is improbable under the likely lower estimates. Our results also disallow the extreme CO(2) levels required for hot Archaean climates. PMID:22456703

  19. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the use of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  20. Correlation as Probability of Common Descent.

    ERIC Educational Resources Information Center

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…

  1. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  2. Teaching Probability: A Socio-Constructivist Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  3. Teaching Statistics and Probability: 1981 Yearbook.

    ERIC Educational Resources Information Center

    Shulte, Albert P., Ed.; Smart, James R., Ed.

    This 1981 yearbook of the National Council of Teachers of Mathematics (NCTM) offers classroom ideas for teaching statistics and probability, viewed as important topics in the school mathematics curriculum. Statistics and probability are seen as appropriate because they: (1) provide meaningful applications of mathematics at all levels; (2) provide…

  4. Phonotactic Probabilities in Young Children's Speech Production

    ERIC Educational Resources Information Center

    Zamuner, Tania S.; Gerken, Louann; Hammond, Michael

    2004-01-01

    This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…

  5. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  6. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  7. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  8. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  9. Average Transmission Probability of a Random Stack

    ERIC Educational Resources Information Center

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  10. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the use of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  11. Probability: A Matter of Life and Death

    ERIC Educational Resources Information Center

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  12. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  13. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a…

  14. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  15. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
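
    In symbols, the Sagan-Coleman formula cited above, and its natural extension to the three release mechanisms distinguished in the model, can be written as follows (the notation is added here for illustration and may differ from the report's own):

        P_c \;=\; \bar{N}\, P_g
        \qquad\longrightarrow\qquad
        P_c \;=\; \sum_{i} \bar{N}_i\, P_{g,i} ,

    where \bar{N}_i is the expected number of viable microbes released by mechanism i and P_{g,i} is the probability that an organism released by that mechanism grows in the Martian environment.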

  16. Quantum probability assignment limited by relativistic causality.

    PubMed

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions obtained by applying state reduction and the probability-assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if a different probability-assignment rule is applied. As a result, the amount of nonlocality in the quantum correlation changes. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement can be derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  17. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial parameters and the correct-classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
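
    The bias being corrected is easy to reproduce. The Python sketch below (parameter values are invented for illustration; it is not the authors' model code) draws per-unit correct-classification probabilities from a logit-normal distribution and shows that the naive pooled category proportions are pulled toward 0.5, which is why the hierarchical Bayesian treatment described above is needed.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_misclassified_counts(n_units=50, n_per_unit=200,
                                          true_pi=(0.6, 0.4), mu_logit=1.5, sd_logit=0.7):
            """Two-category sketch: unit-level P(correct label) is logit-normal,
            and observations are recorded with that per-unit error rate."""
            counts = []
            for _ in range(n_units):
                theta = 1.0 / (1.0 + np.exp(-rng.normal(mu_logit, sd_logit)))  # P(correct) for this unit
                true_cats = rng.choice(2, size=n_per_unit, p=true_pi)
                flip = rng.random(n_per_unit) > theta
                observed = np.where(flip, 1 - true_cats, true_cats)
                counts.append(np.bincount(observed, minlength=2))
            return np.array(counts)

        counts = simulate_misclassified_counts()
        print(counts.sum(axis=0) / counts.sum())   # naive estimate of (0.6, 0.4), biased toward 0.5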

  18. Quantum probability assignment limited by relativistic causality

    PubMed Central

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions obtained by applying state reduction and the probability-assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if a different probability-assignment rule is applied. As a result, the amount of nonlocality in the quantum correlation changes. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement can be derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  19. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    .... Because there is a shift of more than fifty percentage points in the ownership of L stock during the three... section. As a result, they remain part of the public group which owns L stock, and no owner shift results..., apply this paragraph (j)(2)— (1) On a corporation-wide basis, in which case the small...

  20. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... acquisition of stock. A principal element in determining if such an understanding exists is whether the... acquisition of stock and, therefore, the Group is an entity. Thus, the acquisition of more than five percent... to a formal or informal understanding among themselves to make a coordinated acquisition of...

  1. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout...

  2. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... (iii), as contained in 26 CFR part 1 revised as of April 1, 1994, for the application of paragraph (j... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout...

  3. 26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout...

  4. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... this section shall be construed as prohibiting the disclosure of any fiber present in a textile fiber... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Fibers present in amounts of less than...

  5. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... this section shall be construed as prohibiting the disclosure of any fiber present in a textile fiber... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Fibers present in amounts of less than...

  6. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... this section shall be construed as prohibiting the disclosure of any fiber present in a textile fiber... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Fibers present in amounts of less than...

  7. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... this section shall be construed as prohibiting the disclosure of any fiber present in a textile fiber product which has a clearly established and definite functional significance when present in the...

  8. Corrosion behavior of aluminum-alumina composites in aerated 3.5 percent chloride solution

    NASA Astrophysics Data System (ADS)

    Acevedo Hurtado, Paul Omar

    Aluminum-based metal matrix composites are finding many applications in engineering. Of these, Al-Al2O3 composites appear to have promise in a number of defense applications because of their mechanical properties. However, their corrosion behavior remains suspect, especially in marine environments. While efforts are being made to improve the corrosion resistance of Al-Al2O3 composites, the mechanism of corrosion is not well known. In this study, the corrosion behavior of powder metallurgy processed Al-Cu alloy reinforced with 10, 15, 20 and 25 vol. % Al2O3 particles (XT 1129, XT 2009, XT 2048, XT 2031) was evaluated in aerated 3.5% NaCl solution using microstructural and electrochemical measurements. AA1100-O and AA2024T4 monolithic alloys were also studied for comparison purposes. The composites and unreinforced alloys were subjected to potentiodynamic polarization and Electrochemical Impedance Spectroscopy (EIS) testing. Addition of 25 vol. % Al2O3 to the base alloys was found to increase their corrosion resistance considerably. Microstructural studies revealed the presence of intermetallic Al2Cu particles in these composites that appeared to play an important role in the observations. Pitting potential for these composites was near corrosion potential values, and repassivation potential was below the corresponding corrosion potential, indicating that these materials begin to corrode spontaneously as soon as they come in contact with the 3.5% NaCl solution. EIS measurements indicate the occurrence of adsorption/diffusion phenomena at the interface of the composites which ultimately initiate localized or pitting corrosion. Polarization resistance values were extracted from the EIS data for all the materials tested. Electrically equivalent circuits are proposed to describe and substantiate the corrosive processes occurring in these Al-Al2O3 composite materials.

  9. A 25.5 percent AM0 gallium arsenide grating solar cell

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Godlewski, M. P.

    1985-01-01

    Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AM0) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.

  10. A 25.5 percent AMO gallium arsenide grating solar cell

    NASA Technical Reports Server (NTRS)

    Weizer, V. G.; Godlewski, M. P.

    1985-01-01

    Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AMO) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.

  11. 16 CFR 303.3 - Fibers present in amounts of less than 5 percent.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... this section shall be construed as prohibiting the disclosure of any fiber present in a textile...

  12. 30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....22233 Section 57.22233 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND NONMETAL MINE SAFETY AND HEALTH SAFETY AND HEALTH STANDARDS-UNDERGROUND METAL AND NONMETAL... other work shall be permitted in affected areas....

  13. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core

    NASA Astrophysics Data System (ADS)

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J.; Greene, Jenny E.; Blakeslee, John P.; Janish, Ryan

    2016-04-01

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day ‘dormant’ descendants of this population of ‘active’ black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall—the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600—a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.
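
    The gravitational sphere of influence invoked above is conventionally the region within which the black hole dominates the stellar gravitational potential; a standard definition (stated here for orientation, not quoted from the paper) is

        r_{\mathrm{infl}} \;=\; \frac{G\, M_{\mathrm{BH}}}{\sigma^{2}} ,

    with M_BH the black-hole mass and σ the stellar velocity dispersion of the host galaxy; the reported finding is that the radius of the depleted stellar core in such galaxies tracks this scale.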

  14. Nanobubble collapse on a silica surface in water: billion-atom reactive molecular dynamics simulations.

    PubMed

    Shekhar, Adarsh; Nomura, Ken-ichi; Kalia, Rajiv K; Nakano, Aiichiro; Vashishta, Priya

    2013-11-01

    Cavitation bubbles occur in fluids subjected to rapid changes in pressure. We use billion-atom reactive molecular dynamics simulations on a 163,840-processor BlueGene/P supercomputer to investigate damage caused by shock-induced collapse of nanobubbles in water near an amorphous silica surface. Collapse of an empty bubble generates a high-speed nanojet, which causes pitting on the silica surface. We find pit radii are close to bubble radii, and experiments also indicate linear scaling between them. The gas-filled bubbles undergo partial collapse and, consequently, the damage on the silica surface is mitigated. PMID:24237524

  15. The First Billion Years: The Growth of Galaxies in the Reionization Epoch

    NASA Astrophysics Data System (ADS)

    Illingworth, Garth

    2015-08-01

    Detection and measurement of the earliest galaxies in the first billion years only became possible after the Hubble Space Telescope was updated in 2009 with the infrared WFC3/IR camera during Shuttle servicing mission SM4. The first billion years is a fascinating epoch, not just because of the earliest galaxies known from about 450 Myr after the Big Bang, but also because it encompasses the reionization epoch that peaked around z~9, as Planck has recently shown, and ended around redshift z~6 at 900 Myr. Before 2009 just a handful of galaxies were known in the reionization epoch at z>6. But within the last 5 years, with the first HUDF09 survey, the HUDF12, CANDELS and numerous other surveys on the GOODS and CANDELS fields, as well as detections from the cluster lensing programs like CLASH and the Frontier Fields, the number of galaxies at redshifts 7-10 has exploded, with some 700 galaxies being found and characterized. The first billion years was a period of extraordinary growth in the galaxy population with rapid growth in the star formation rate density and global mass density in galaxies. Spitzer observations in the infrared of these Hubble fields are establishing masses as well as giving insights into the nature and timescales of star formation from the very powerful emission lines being revealed by the Spitzer IRAC data. I will discuss what we understand about the growth of galaxies in this epoch from the insights gained from remarkable deep fields like the XDF, as well as the wide-area GOODS/CANDELS fields, the detection of unexpectedly luminous galaxies at redshifts 8-10, the impact of early galaxies on reionization, confirmation of a number of galaxies at z~7-8 from ground-based spectroscopic measurements, and the indications of a change in the growth of the star formation rate around 500 Myr. The first billion years was a time of dramatic growth and change in the early galaxy population.

  16. Nanobubble Collapse on a Silica Surface in Water: Billion-Atom Reactive Molecular Dynamics Simulations

    NASA Astrophysics Data System (ADS)

    Shekhar, Adarsh; Nomura, Ken-ichi; Kalia, Rajiv K.; Nakano, Aiichiro; Vashishta, Priya

    2013-11-01

    Cavitation bubbles occur in fluids subjected to rapid changes in pressure. We use billion-atom reactive molecular dynamics simulations on a 163,840-processor BlueGene/P supercomputer to investigate damage caused by shock-induced collapse of nanobubbles in water near an amorphous silica surface. Collapse of an empty bubble generates a high-speed nanojet, which causes pitting on the silica surface. We find pit radii are close to bubble radii, and experiments also indicate linear scaling between them. The gas-filled bubbles undergo partial collapse and, consequently, the damage on the silica surface is mitigated.

  17. Electron microscopy reveals unique microfossil preservation in 1 billion-year-old lakes

    NASA Astrophysics Data System (ADS)

    Saunders, M.; Kong, C.; Menon, S.; Wacey, D.

    2014-06-01

    Electron microscopy was applied to the study of 1 billion-year-old microfossils from northwest Scotland in order to investigate their 3D morphology and mode of fossilization. 3D-FIB-SEM revealed high quality preservation of organic cell walls with only minor amounts of post-mortem decomposition, followed by variable degrees of morphological alteration (folding and compression of cell walls) during sediment compaction. EFTEM mapping plus SAED revealed a diverse fossilizing mineral assemblage including K-rich clay, Fe-Mg-rich clay and calcium phosphate, with each mineral occupying specific microenvironments in proximity to carbonaceous microfossil cell walls.

  18. The $1. 5 billion question: Can the US Global Change Research Program deliver on its promises

    SciTech Connect

    Monastersky, R.

    1993-09-04

    President Clinton has continued the funding for scientific investigations of global climatic change, increasing funds to a total of $1.5 billion spread among 11 different agencies. However, a growing number of critics warn that the program appears to be heading toward failure. The main issue is relevancy. Almost everyone agrees that the research effort will support important scientific work over the next decade, but it will not necessarily provide the information policymakers need to address the threat of climatic change, ozone depletion, deforestation, desertification, and similar issues. This article summarizes the concerns and comments of critics, and the gap between the climate scientists and governmental policymakers.

  19. States' Spending on Colleges Rises 19 Pct. in 2 Years, Nears $31-Billion for'85-86.

    ERIC Educational Resources Information Center

    Evangelauf, Jean

    1985-01-01

    The rise in U.S. states' expenditures to nearly $31 billion in tax money marks a continuing recovery in support for higher education. Shaping this year's appropriations levels were concerns about tuition and efforts to promote economic development. (MLW)

  20. 77 FR 15052 - Dataset Workshop-U.S. Billion Dollar Disasters Dataset (1980-2011): Assessing Dataset Strengths...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-14

    ... Disasters (1980-2011) dataset and associated methods used to develop the data set. An important goal of the... data set addresses; What steps should be taken to enhance the robustness of the billion-dollar...

  1. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.
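
    The kind of alert probability discussed above can be summarized, very schematically, as a competition between the rate of true foreshocks and the rate of background events that merely look like foreshocks. The Python sketch below uses this generic rate formulation with invented inputs; it is not the USGS calculation and omits magnitude dependence, time windows, and the uncertainty analysis described in the abstract.

        def foreshock_alert_probability(p_mainshock_per_yr, frac_preceded, bg_rate_per_yr):
            """Generic rate-based alert probability for a candidate foreshock (illustrative only).

            p_mainshock_per_yr : long-term annual probability of the target mainshock
            frac_preceded      : fraction of such mainshocks preceded by a qualifying foreshock
            bg_rate_per_yr     : annual rate of similar background events that are not foreshocks
            """
            r_foreshock = p_mainshock_per_yr * frac_preceded
            return r_foreshock / (r_foreshock + bg_rate_per_yr)

        # Purely hypothetical inputs:
        print(round(foreshock_alert_probability(0.1, 0.5, 1.0), 3))   # about 0.048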

  2. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA)  =  0.25g, probabilities range from 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  3. Semigroups of tomographic probabilities and quantum correlations

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.

    2008-08-01

    Semigroups of stochastic and bistochastic matrices constructed by means of spin tomograms or tomographic probabilities and their relations to the problem of Bell's inequalities and entanglement are reviewed. The probability determining the quantum state of spins and the probability densities determining the quantum states of particles with continuous variables are considered. Entropies for semigroups of stochastic and bistochastic matrices are studied, in view of both the Shannon information entropy and generalizations such as the Rényi entropy. Qubit portraits of qudit states are discussed in connection with the problem of Bell's inequality violation for entangled states.

  4. Probability distributions for a surjective unimodal map

    NASA Astrophysics Data System (ADS)

    Sun, Hongyan; Wang, Long

    1996-04-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types (δ function, asymmetric, and symmetric) by identifying the binary structure of the initial values. Borel's normal number theorem is equivalent, or even prior, to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular a multifractal probability distribution can be constructed from the surjective tent map by selecting a number that is not normal in Borel's sense as the initial value.
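
    The symmetric case mentioned above is the easiest to see numerically: the full (surjective) tent map preserves the uniform density on [0, 1]. The Python sketch below iterates an ensemble of typical initial values a modest number of times and recovers a flat histogram; a binary-rational seed would instead collapse onto 0, the δ-function case. (Long single orbits in double precision also collapse for purely numerical reasons, which is why many short orbits are used here.)

        import numpy as np

        rng = np.random.default_rng(0)

        def tent(x):
            # Full tent map on [0, 1]: x -> 2x for x < 1/2, and 2 - 2x otherwise.
            return np.where(x < 0.5, 2.0 * x, 2.0 - 2.0 * x)

        x = rng.random(100_000)          # "typical" (non-binary-rational) initial values
        for _ in range(30):
            x = tent(x)

        hist, _ = np.histogram(x, bins=10, range=(0.0, 1.0), density=True)
        print(hist.round(2))             # close to 1.0 in every bin: the uniform, symmetric case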

  5. A massive galaxy in its core formation phase three billion years after the Big Bang.

    PubMed

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha Förster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison; Leja, Joel; Rix, Hans-Walter; Skelton, Rosalind; van der Wel, Arjen; Whitaker, Katherine; Wuyts, Stijn

    2014-09-18

    Most massive galaxies are thought to have formed their dense stellar cores in early cosmic epochs. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes, but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we report a candidate core in the process of formation 11 billion years ago, at redshift z = 2.3. This galaxy, GOODS-N-774, has a stellar mass of 100 billion solar masses, a half-light radius of 1.0 kiloparsecs and a star formation rate of solar masses per year. The star-forming gas has a velocity dispersion of 317 ± 30 kilometres per second. This is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774, which are compact quiescent galaxies at z ≈ 2 (refs 8-11) and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 seem to be rare; however, from the star formation rate and size of this galaxy we infer that many star-forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys. PMID:25162527

  6. A solar origin for the large lunar magnetic field at 4.0 billion yr ago

    NASA Technical Reports Server (NTRS)

    Banerjee, S. K.; Mellema, J. P.

    1976-01-01

    A new method (Shaw, 1974) for paleointensity determination has been applied to three subsamples of one polymict breccia, 72215 (of age 4.0 billion yr) to yield an average paleointensity of 0.41 Oe at the Taurus-Littrow region of the moon around the time of breccia formation. Of the present models for lunar magnetism, only the Sonett and Runcorn (1974) model of a central iron core dynamo can explain the presence of such a large field in early lunar history. However, because of the similarity in size of this field and that for the early solar system deduced from carbonaceous chondrites, we draw attention to an apparently little-considered possibility: that the large magnetic field in early lunar history was external and solar in origin, and emanated from a pre-main sequence T-Tauri stage sun. Therefore, there should be no record of such a large magnetic field in lunar rocks younger than approximately 4.0 billion yr.

  7. The evolution in the stellar mass of brightest cluster galaxies over the past 10 billion years

    NASA Astrophysics Data System (ADS)

    Bellstedt, Sabine; Lidman, Chris; Muzzin, Adam; Franx, Marijn; Guatelli, Susanna; Hill, Allison R.; Hoekstra, Henk; Kurinsky, Noah; Labbe, Ivo; Marchesini, Danilo; Marsan, Z. Cemile; Safavi-Naeini, Mitra; Sifón, Cristóbal; Stefanon, Mauro; van de Sande, Jesse; van Dokkum, Pieter; Weigel, Catherine

    2016-08-01

    Using a sample of 98 galaxy clusters recently imaged in the near-infrared with the European Southern Observatory (ESO) New Technology Telescope, WIYN telescope and William Herschel Telescope, supplemented with 33 clusters from the ESO archive, we measure how the stellar mass of the most massive galaxies in the universe, namely brightest cluster galaxies (BCGs), increases with time. Most of the BCGs in this new sample lie in the redshift range 0.2 < z < 0.6, which has been noted in recent works to mark an epoch over which the growth in the stellar mass of BCGs stalls. From this sample of 132 clusters, we create a subsample of 102 systems that includes only those clusters that have estimates of the cluster mass. We combine the BCGs in this subsample with BCGs from the literature, and find that the growth in stellar mass of BCGs from 10 billion years ago to the present epoch is broadly consistent with recent semi-analytic and semi-empirical models. As in other recent studies, tentative evidence indicates that the stellar mass growth rate of BCGs may be slowing in the past 3.5 billion years. Further work in collecting larger samples, and in better comparing observations with theory using mock images, is required if a more detailed comparison between the models and the data is to be made.

  8. The evolution in the stellar mass of brightest cluster galaxies over the past 10 billion years

    NASA Astrophysics Data System (ADS)

    Bellstedt, Sabine; Lidman, Chris; Muzzin, Adam; Franx, Marijn; Guatelli, Susanna; Hill, Allison R.; Hoekstra, Henk; Kurinsky, Noah; Labbe, Ivo; Marchesini, Danilo; Marsan, Z. Cemile; Safavi-Naeini, Mitra; Sifón, Cristóbal; Stefanon, Mauro; van de Sande, Jesse; van Dokkum, Pieter; Weigel, Catherine

    2016-08-01

    Using a sample of 98 galaxy clusters recently imaged in the near infra-red with the ESO NTT, WIYN and WHT telescopes, supplemented with 33 clusters from the ESO archive, we measure how the stellar mass of the most massive galaxies in the universe, namely Brightest Cluster Galaxies (BCG), increases with time. Most of the BCGs in this new sample lie in the redshift range 0.2 < z < 0.6, which has been noted in recent works to mark an epoch over which the growth in the stellar mass of BCGs stalls. From this sample of 132 clusters, we create a subsample of 102 systems that includes only those clusters that have estimates of the cluster mass. We combine the BCGs in this subsample with BCGs from the literature, and find that the growth in stellar mass of BCGs from 10 billion years ago to the present epoch is broadly consistent with recent semi-analytic and semi-empirical models. As in other recent studies, tentative evidence indicates that the stellar mass growth rate of BCGs may be slowing in the past 3.5 billion years. Further work in collecting larger samples, and in better comparing observations with theory using mock images is required if a more detailed comparison between the models and the data is to be made.

  9. Parametrization and Classification of 20 Billion LSST Objects: Lessons from SDSS

    SciTech Connect

    Ivezic, Z.; Axelrod, T.; Becker, A.C.; Becla, J.; Borne, K.; Burke, David L.; Claver, C.F.; Cook, K.H.; Connolly, A.; Gilmore, D.K.; Jones, R.L.; Juric, M.; Kahn, Steven M.; Lim, K-T.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Sesar, B.; Stubbs, Christopher W.; Tyson, J.Anthony

    2011-11-10

    The Large Synoptic Survey Telescope (LSST) will be a large, wide-field ground-based system designed to obtain, starting in 2015, multiple images of the sky that is visible from Cerro Pachon in Northern Chile. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times during the anticipated 10 years of operations (distributed over six bands, ugrizy). Each 30-second long visit will deliver 5σ depth for point sources of r ≈ 24.5 on average. The co-added map will be about 3 magnitudes deeper, and will include 10 billion galaxies and a similar number of stars. We discuss various measurements that will be automatically performed for these 20 billion sources, and how they can be used for classification and determination of source physical and other properties. We provide a few classification examples based on SDSS data, such as color classification of stars, color-spatial proximity search for wide-angle binary stars, orbital-color classification of asteroid families, and the recognition of main Galaxy components based on the distribution of stars in the position-metallicity-kinematics space. Guided by these examples, we anticipate that two grand classification challenges for LSST will be (1) rapid and robust classification of sources detected in difference images, and (2) simultaneous treatment of diverse astrometric and photometric time series measurements for an unprecedentedly large number of objects.

  10. The evolution in the stellar mass of Brightest Cluster Galaxies over the past 10 billion years

    NASA Astrophysics Data System (ADS)

    Bellstedt, Sabine; Lidman, Chris; Muzzin, Adam; Franx, Marijn; Guatelli, Susanna; Hill, Allison R.; Hoekstra, Henk; Kurinsky, Noah; Labbe, Ivo; Marchesini, Danilo; Marsan, Z. Cemile; Safavi-Naeini, Mitra; Sifón, Cristóbal; Stefanon, Mauro; van de Sande, Jesse; van Dokkum, Pieter; Weigel, Catherine

    2016-05-01

    Using a sample of 98 galaxy clusters recently imaged in the near infra-red with the ESO NTT, WIYN and WHT telescopes, supplemented with 33 clusters from the ESO archive, we measure how the stellar mass of the most massive galaxies in the universe, namely Brightest Cluster Galaxies (BCG), increases with time. Most of the BCGs in this new sample lie in the redshift range 0.2 < z < 0.6, which has been noted in recent works to mark an epoch over which the growth in the stellar mass of BCGs stalls. From this sample of 132 clusters, we create a subsample of 102 systems that includes only those clusters that have estimates of the cluster mass. We combine the BCGs in this subsample with BCGs from the literature, and find that the growth in stellar mass of BCGs from 10 billion years ago to the present epoch is broadly consistent with recent semi-analytic and semi-empirical models. As in other recent studies, tentative evidence indicates that the stellar mass growth rate of BCGs may be slowing in the past 3.5 billion years. Further work in collecting larger samples, and in better comparing observations with theory using mock images is required if a more detailed comparison between the models and the data is to be made.

  11. Greenhouse gas implications of a 32 billion gallon bioenergy landscape in the US

    NASA Astrophysics Data System (ADS)

    DeLucia, E. H.; Hudiburg, T. W.; Wang, W.; Khanna, M.; Long, S.; Dwivedi, P.; Parton, W. J.; Hartman, M. D.

    2015-12-01

    Sustainable bioenergy for transportation fuel and greenhouse gas (GHGs) reductions may require considerable changes in land use. Perennial grasses have been proposed because of their potential to yield substantial biomass on marginal lands without displacing food and reduce GHG emissions by storing soil carbon. Here, we implemented an integrated approach to planning bioenergy landscapes by combining spatially-explicit ecosystem and economic models to predict a least-cost land allocation for a 32 billion gallon (121 billion liter) renewable fuel mandate in the US. We find that 2022 GHG transportation emissions are decreased by 7% when 3.9 million hectares of eastern US land are converted to perennial grasses supplemented with corn residue to meet cellulosic ethanol requirements, largely because of gasoline displacement and soil carbon storage. If renewable fuel production is accompanied by a cellulosic biofuel tax credit, CO2 equivalent emissions could be reduced by 12%, because it induces more cellulosic biofuel and land under perennial grasses (10 million hectares) than under the mandate alone. While GHG reducing bioenergy landscapes that meet RFS requirements and do not displace food are possible, the reductions in GHG emissions are 50% less compared to previous estimates that did not account for economically feasible land allocation.

  12. The Value Of The Nonprofit Hospital Tax Exemption Was $24.6 Billion In 2011.

    PubMed

    Rosenbaum, Sara; Kindig, David A; Bao, Jie; Byrnes, Maureen K; O'Laughlin, Colin

    2015-07-01

    The federal government encourages public support for charitable activities by allowing people to deduct donations to tax-exempt organizations on their income tax returns. Tax-exempt hospitals are major beneficiaries of this policy because it encourages donations to the hospitals while shielding them from federal and state tax liability. In exchange, these hospitals must engage in community benefit activities, such as providing care to indigent patients and participating in Medicaid. The congressional Joint Committee on Taxation estimated the value of the nonprofit hospital tax exemption at $12.6 billion in 2002--a number that included forgone taxes, public contributions, and the value of tax-exempt bond financing. In this article we estimate that the size of the exemption reached $24.6 billion in 2011. The Affordable Care Act (ACA) brings a new focus on community benefit activities by requiring tax-exempt hospitals to engage in communitywide planning efforts to improve community health. The magnitude of the tax exemption, coupled with ACA reforms, underscores the public's interest not only in community benefit spending generally but also in the extent to which nonprofit hospitals allocate funds for community benefit expenditures that improve the overall health of their communities. PMID:26085486

  13. As its R&D budget nears $2 billion, Bayer rethinks priorities

    SciTech Connect

    Rotman, D.

    1993-03-17

    With a planned 1993 research and development budget of roughly DM3.2 billion ($1.95 billion), Bayer (Leverkusen) is the industry's biggest R&D spender. But while the German giant lays out a healthy 7% of sales on R&D, caution is clearly replacing the heady spending spurts of several years ago. And faced with an increasingly rigorous corporate business restructuring, Bayer - like others in the chemical industry - is rethinking its R&D strategies. While Bayer's R&D stress is clearly on life sciences, the company remains bullish on certain new materials, particularly inorganics. It has developed several engineering ceramics for use in automotive engines, with the most advanced - a silicon nitride - being road tested in Mercedes models. "We have the materials and know their properties," says Hauke Fuerstenwerth, Bayer's head of R&D coordination. The challenge now, he says, is to demonstrate a commercially attractive process for large-scale production. Bayer is also pursuing new silicon wafer technology. Already in small-scale production, the firm is testing an amorphous silicon that is intended to be far cheaper than existing crystalline silicon wafers, while maintaining suitable properties for applications such as solar collectors.

  14. Characteristic length of the knotting probability revisited

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-09-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate through simulation the probability that a generated SAP with N segments has a given knot K. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is the exponentially decaying part exp(-N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA.
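
    A minimal sketch of how a characteristic length such as N_K could be extracted from tabulated knotting probabilities by a log-linear fit of the exponentially decaying part; the data values and the assumed fitting form P_K(N) ≈ C · N^m · exp(-N/N_K) are illustrative, not taken from the paper.

      import numpy as np

      # Illustrative (made-up) knotting probabilities P_K(N) for one knot type K,
      # tabulated at several polygon sizes N; in practice these come from SAP sampling.
      N = np.array([200.0, 400.0, 800.0, 1600.0, 3200.0])
      P_K = np.array([0.081, 0.115, 0.118, 0.074, 0.027])

      # Assume P_K(N) ≈ C * N**m * exp(-N / N_K); taking logs gives a linear model
      # in (1, log N, N) whose coefficient on N is -1/N_K.
      A = np.column_stack([np.ones_like(N), np.log(N), N])
      coef, *_ = np.linalg.lstsq(A, np.log(P_K), rcond=None)
      N_K = -1.0 / coef[2]
      print(f"estimated characteristic length N_K ≈ {N_K:.0f}")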

  15. Inclusion probability with dropout: an operational formula.

    PubMed

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, and hence slightly modifying the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI versus likelihood-ratio approaches in the context of low-template amplifications. PMID:25559642
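
    For orientation, a minimal sketch of the classic no-dropout probability of inclusion at one locus and its cumulative multi-locus form, which is the quantity the paper generalizes by fixing the number of contributors and allowing dropout; the allele frequencies below are hypothetical.

      # Classic RMNE / probability of inclusion without dropout: at one locus, the
      # chance that a random diploid person carries only alleles visible in the
      # mixture is PI = (sum of visible-allele frequencies)**2; across independent
      # loci the cumulative PI is the product of the per-locus values.
      def pi_locus(visible_freqs):
          s = sum(visible_freqs)
          return s * s

      def cumulative_pi(loci):
          cpi = 1.0
          for freqs in loci:
              cpi *= pi_locus(freqs)
          return cpi

      # Hypothetical visible-allele frequencies at three loci.
      loci = [[0.12, 0.20, 0.31], [0.08, 0.15], [0.22, 0.18, 0.05, 0.10]]
      print(f"CPI = {cumulative_pi(loci):.3e}")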

  16. A Survey of Tables of Probability Distributions

    PubMed Central

    Kacker, Raghu; Olkin, Ingram

    2005-01-01

    This article is a survey of the tables of probability distributions published about or after the publication in 1964 of the Handbook of Mathematical Functions, edited by Abramowitz and Stegun. PMID:27308104

  17. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. PMID:26478959

  18. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.

  19. Determining Probabilities by Examining Underlying Structure.

    ERIC Educational Resources Information Center

    Norton, Robert M.

    2001-01-01

    Discusses how dice games pose fairness issues that appeal to students and examines a structure for three games involving two dice in a way that leads directly to the theoretical probabilities for all possible outcomes. (YDS)

  20. Neutron initiation probability in fast burst reactor

    SciTech Connect

    Liu, X.; Du, J.; Xie, Q.; Fan, X.

    2012-07-01

    Based on the probability balance of neutron random events in a multiplying system, the four random processes of neutrons in a prompt supercritical system are described and the equation for the neutron initiation probability W(r,E,Ω,t) is deduced. Under the assumptions of a static, slightly prompt supercritical system and the two-factorial approximation, the formula for the average probability of 'one' neutron is derived, which is the same as the result derived from the point model. The MC simulation using the point model is applied to Godiva-II and CFBR-II, and the simulated one-neutron initiation results are well consistent with the theory: the initiation probabilities of the Godiva-II and CFBR-II burst reactors are 0.00032 and 0.00027, respectively, for ordinary burst operation. (authors)

  1. Probability tree algorithm for general diffusion processes

    NASA Astrophysics Data System (ADS)

    Ingber, Lester; Chen, Colleen; Mondescu, Radu Paul; Muzzall, David; Renedo, Marco

    2001-11-01

    Motivated by path-integral numerical solutions of diffusion processes, PATHINT, we present a tree algorithm, PATHTREE, which permits extremely fast accurate computation of probability distributions of a large class of general nonlinear diffusion processes.
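
    As a rough illustration only (not the authors' PATHTREE), the sketch below builds a simple recombining probability tree for a one-dimensional diffusion dx = f(x) dt + g(x) dW and propagates the probabilities forward to approximate the terminal distribution; the drift and diffusion functions are placeholders.

      import math
      from collections import defaultdict

      def f(x):  # placeholder drift
          return -0.5 * x

      def g(x):  # placeholder diffusion amplitude
          return 1.0

      # At each step the state moves up or down by g(x)*sqrt(dt); the up-probability
      # is chosen so that the local mean displacement matches f(x)*dt.
      def tree_distribution(x0=0.0, dt=0.01, steps=200):
          probs = {round(x0, 10): 1.0}
          for _ in range(steps):
              nxt = defaultdict(float)
              for x, p in probs.items():
                  dx = g(x) * math.sqrt(dt)
                  p_up = min(max(0.5 * (1.0 + f(x) * math.sqrt(dt) / g(x)), 0.0), 1.0)
                  nxt[round(x + dx, 10)] += p * p_up
                  nxt[round(x - dx, 10)] += p * (1.0 - p_up)
              probs = dict(nxt)
          return probs

      dist = tree_distribution()
      mean = sum(x * p for x, p in dist.items())
      print(f"{len(dist)} terminal nodes, mean ≈ {mean:.3f}")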

  2. Transition Probability and the ESR Experiment

    ERIC Educational Resources Information Center

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  3. Non-Gaussian Photon Probability Distribution

    NASA Astrophysics Data System (ADS)

    Solomon, Benjamin T.

    2010-01-01

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma (mΓ) distribution (whose parameters are α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, the arc from the pinhole, and a line parallel to the photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function, and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub-wavelength confinement; thereby providing a strong case that

  4. Robust satisficing and the probability of survival

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2014-01-01

    Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.

  5. The spline probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Sithiravel, Rajiv; Tharmarasa, Ratnasingham; McDonald, Mike; Pelletier, Michel; Kirubarajan, Thiagalingam

    2012-06-01

    The Probability Hypothesis Density Filter (PHD) is a multitarget tracker for recursively estimating the number of targets and their state vectors from a set of observations. The PHD filter is capable of working well in scenarios with false alarms and missed detections. Two distinct PHD filter implementations are available in the literature: the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) and the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filters. The SMC-PHD filter uses particles to provide target state estimates, which can lead to a high computational load, whereas the GM-PHD filter does not use particles, but restricts to linear Gaussian mixture models. The SMC-PHD filter technique provides only weighted samples at discrete points in the state space instead of a continuous estimate of the probability density function of the system state and thus suffers from the well-known degeneracy problem. This paper proposes a B-Spline based Probability Hypothesis Density (S-PHD) filter, which has the capability to model any arbitrary probability density function. The resulting algorithm can handle linear, non-linear, Gaussian, and non-Gaussian models and the S-PHD filter can also provide continuous estimates of the probability density function of the system state. In addition, by moving the knots dynamically, the S-PHD filter ensures that the splines cover only the region where the probability of the system state is significant, hence the high efficiency of the S-PHD filter is maintained at all times. Also, unlike the SMC-PHD filter, the S-PHD filter is immune to the degeneracy problem due to its continuous nature. The S-PHD filter derivations and simulations are provided in this paper.

  6. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
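
    A minimal sketch of the zero-inflated binomial mixture likelihood described above, using a beta mixing distribution for p so that the integral over p becomes a beta-binomial mass function; the detection histories and parameter values are hypothetical, and this is not the paper's code.

      import numpy as np
      from scipy.stats import betabinom

      # Zero-inflated binomial mixture for site-occupancy data with heterogeneous
      # detection: y_i detections in J visits at site i, occupancy probability psi,
      # and p ~ Beta(a, b) across sites, so integrating over p gives a beta-binomial.
      def neg_log_lik(params, y, J):
          psi, a, b = params
          occ_part = psi * betabinom.pmf(y, J, a, b)
          zero_part = (1.0 - psi) * (y == 0)
          return -np.sum(np.log(occ_part + zero_part))

      # Hypothetical detection histories: counts out of J = 5 visits at 10 sites.
      y = np.array([0, 0, 2, 0, 1, 0, 3, 0, 0, 1])
      print(neg_log_lik((0.6, 1.5, 3.0), y, J=5))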

  7. The cumulative reaction probability as eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Manthe, Uwe; Miller, William H.

    1993-09-01

    It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = ∑_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation as probabilities, eigenreaction probabilities which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than that of previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems—transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0.

  8. Familiarity and preference for pitch probability profiles.

    PubMed

    Cui, Anja-Xiaoxing; Collett, Meghan J; Troje, Niko F; Cuddy, Lola L

    2015-05-01

    We investigated familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, we asked participants either to identify the more familiar or to identify the preferred tone sequences in a two-alternative forced-choice task. The task paired a tone sequence generated from the pitch probability profile they had been exposed to with a tone sequence generated from another pitch probability profile, at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile to which they had been exposed. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference as consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile to which they had been exposed suggests that statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge. PMID:25838257

  9. Segmentation and automated measurement of chronic wound images: probability map approach

    NASA Astrophysics Data System (ADS)

    Ahmad Fauzi, Mohammad Faizal; Khansa, Ibrahim; Catignani, Karen; Gordillo, Gayle; Sen, Chandan K.; Gurcan, Metin N.

    2014-03-01

    An estimated 6.5 million patients in the United States are affected by chronic wounds, with more than 25 billion US dollars and countless hours spent annually on all aspects of chronic wound care. There is a need to develop software tools to analyze wound images that characterize wound tissue composition, measure wound size, and monitor changes over time. This process, when done manually, is time-consuming and subject to intra- and inter-reader variability. In this paper, we propose a method that can characterize chronic wounds containing granulation, slough and eschar tissues. First, we generate a Red-Yellow-Black-White (RYKW) probability map, which then guides the region growing segmentation process. The red, yellow and black probability maps are designed to handle the granulation, slough and eschar tissues, respectively, found in wound tissues, while the white probability map is designed to detect the white label card for measurement calibration purposes. The innovative aspects of this work include: (1) definition of a wound-characteristics-specific probability map for segmentation; (2) computationally efficient region growing on a 4D map; (3) auto-calibration of measurements using the content of the image. The method was applied to 30 wound images provided by the Ohio State University Wexner Medical Center, with the ground truth independently generated by the consensus of two clinicians. While the inter-reader agreement between the readers is 85.5%, the computer achieves an accuracy of 80%.

  10. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
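
    A compact sketch of the conditional clade idea for readers who want to experiment: the posterior probability of a rooted tree is approximated as the product, over its internal nodes, of the sampled frequency of each clade split conditional on its parent clade. The tree encoding and the toy sample below are illustrative, not the author's software.

      from collections import Counter

      # A rooted binary tree is encoded as a nested tuple of taxon names,
      # e.g. ((("A", "B"), "C"), ("D", "E")).
      def splits(tree):
          """Return (clade, list of (parent_clade, unordered pair of child clades))."""
          if isinstance(tree, str):
              return frozenset([tree]), []
          left, right = tree
          lc, ls = splits(left)
          rc, rs = splits(right)
          clade = lc | rc
          return clade, [(clade, frozenset([lc, rc]))] + ls + rs

      def fit_ccd(sampled_trees):
          clade_counts, split_counts = Counter(), Counter()
          for t in sampled_trees:
              for parent, children in splits(t)[1]:
                  clade_counts[parent] += 1
                  split_counts[(parent, children)] += 1
          return clade_counts, split_counts

      def tree_probability(tree, clade_counts, split_counts):
          prob = 1.0
          for parent, children in splits(tree)[1]:
              if clade_counts[parent] == 0:
                  return 0.0
              prob *= split_counts[(parent, children)] / clade_counts[parent]
          return prob

      sample = [((("A", "B"), "C"), ("D", "E")),
                ((("A", "B"), "C"), ("D", "E")),
                ((("A", "C"), "B"), ("D", "E"))]
      cc, sc = fit_ccd(sample)
      print(tree_probability(((("A", "C"), "B"), ("D", "E")), cc, sc))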

  11. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in the transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non

  12. Star Formation in Galaxy Clusters Over the Past 10 Billion Years

    NASA Astrophysics Data System (ADS)

    Tran, Kim-Vy

    2012-01-01

    Galaxy clusters are the largest gravitationally bound systems in the universe and include the most massive galaxies in the universe; this makes galaxy clusters ideal laboratories for disentangling the nature versus nurture aspect of how galaxies evolve. Understanding how galaxies form and evolve in clusters continues to be a fundamental question in astronomy. The ages and assembly histories of galaxies in rich clusters test both stellar population models and hierarchical formation scenarios. Is star formation in cluster galaxies simply accelerated relative to their counterparts in the lower density field, or do cluster galaxies assemble their stars in a fundamentally different manner? To answer this question, I review multi-wavelength results on star formation in galaxy clusters from Coma to the most distant clusters yet discovered at look-back times of 10 billion years (z ~ 2).

  13. Billion-atom synchronous parallel kinetic Monte Carlo simulations of critical 3D Ising systems

    SciTech Connect

    Martinez, E.; Monasterio, P.R.; Marian, J.

    2011-02-20

    An extension of the synchronous parallel kinetic Monte Carlo (spkMC) algorithm developed by Martinez et al. [J. Comp. Phys. 227 (2008) 3804] to discrete lattices is presented. The method solves the master equation synchronously by recourse to null events that keep all processors' time clocks current in a global sense. Boundary conflicts are resolved by adopting a chessboard decomposition into non-interacting sublattices. We find that the bias introduced by the spatial correlations attendant to the sublattice decomposition is within the standard deviation of serial calculations, which confirms the statistical validity of our algorithm. We have analyzed the parallel efficiency of spkMC and find that it scales consistently with problem size and sublattice partition. We apply the method to the calculation of scale-dependent critical exponents in billion-atom 3D Ising systems, with very good agreement with state-of-the-art multispin simulations.

  14. A change in the geodynamics of continental growth 3 billion years ago.

    PubMed

    Dhuime, Bruno; Hawkesworth, Chris J; Cawood, Peter A; Storey, Craig D

    2012-03-16

    Models for the growth of continental crust rely on knowing the balance between the generation of new crust and the reworking of old crust throughout Earth's history. The oxygen isotopic composition of zircons, for which uranium-lead and hafnium isotopic data provide age constraints, is a key archive of crustal reworking. We identified systematic variations in hafnium and oxygen isotopes in zircons of different ages that reveal the relative proportions of reworked crust and of new crust through time. Growth of continental crust appears to have been a continuous process, albeit at variable rates. A marked decrease in the rate of crustal growth at ~3 billion years ago may be linked to the onset of subduction-driven plate tectonics. PMID:22422979

  15. Collision-free spatial hash functions for structural analysis of billion-vertex chemical bond networks

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Bansal, Bhupesh; Branicio, Paulo S.; Kalia, Rajiv K.; Nakano, Aiichiro; Sharma, Ashish; Vashishta, Priya

    2006-09-01

    State-of-the-art molecular dynamics (MD) simulations generate massive datasets involving billion-vertex chemical bond networks, which makes data mining based on graph algorithms such as K-ring analysis a challenge. This paper proposes an algorithm to improve the efficiency of ring analysis of large graphs, exploiting properties of K-rings and spatial correlations of vertices in the graph. The algorithm uses dual-tree expansion (DTE) and spatial hash-function tagging (SHAFT) to optimize computation and memory access. Numerical tests show nearly perfect linear scaling of the algorithm. Also a parallel implementation of the DTE + SHAFT algorithm achieves high scalability. The algorithm has been successfully employed to analyze large MD simulations involving up to 500 million atoms.

  16. Extraterrestrial demise of banded iron formations 1.85 billion years ago

    USGS Publications Warehouse

    Slack, J.F.; Cannon, W.F.

    2009-01-01

    In the Lake Superior region of North America, deposition of most banded iron formations (BIFs) ended abruptly 1.85 Ga ago, coincident with the oceanic impact of the giant Sudbury extraterrestrial bolide. We propose a new model in which this impact produced global mixing of shallow oxic and deep anoxic waters of the Paleoproterozoic ocean, creating a suboxic redox state for deep seawater. This suboxic state, characterized by only small concentrations of dissolved O2 (~1 μM), prevented transport of hydrothermally derived Fe(II) from the deep ocean to continental-margin settings, ending an ~1.1 billion-year-long period of episodic BIF mineralization. The model is supported by the nature of Precambrian deep-water exhalative chemical sediments, which changed from predominantly sulfide facies prior to ca. 1.85 Ga to mainly oxide facies thereafter. © 2009 Geological Society of America.

  17. Investigation of Radar Propagation in Buildings: A 10 Billion Element Cartesian-Mesh FETD Simulation

    SciTech Connect

    Stowell, M L; Fasenfest, B J; White, D A

    2008-01-14

    In this paper large scale full-wave simulations are performed to investigate radar wave propagation inside buildings. In principle, a radar system combined with sophisticated numerical methods for inverse problems can be used to determine the internal structure of a building. The composition of the walls (cinder block, re-bar) may affect the propagation of the radar waves in a complicated manner. In order to provide a benchmark solution of radar propagation in buildings, including the effects of typical cinder block and re-bar, we performed large scale full wave simulations using a Finite Element Time Domain (FETD) method. This particular FETD implementation is tuned for the special case of an orthogonal Cartesian mesh and hence resembles FDTD in accuracy and efficiency. The method was implemented on a general-purpose massively parallel computer. In this paper we briefly describe the radar propagation problem, the FETD implementation, and we present results of simulations that used over 10 billion elements.

  18. Geodynamo, solar wind, and magnetopause 3.4 to 3.45 billion years ago.

    PubMed

    Tarduno, John A; Cottrell, Rory D; Watkeys, Michael K; Hofmann, Axel; Doubrovine, Pavel V; Mamajek, Eric E; Liu, Dunji; Sibeck, David G; Neukirch, Levi P; Usui, Yoichi

    2010-03-01

    Stellar wind standoff by a planetary magnetic field prevents atmospheric erosion and water loss. Although the early Earth retained its water and atmosphere, and thus evolved as a habitable planet, little is known about Earth's magnetic field strength during that time. We report paleointensity results from single silicate crystals bearing magnetic inclusions that record a geodynamo 3.4 to 3.45 billion years ago. The measured field strength is approximately 50 to 70% that of the present-day field. When combined with a greater Paleoarchean solar wind pressure, the paleofield strength data suggest steady-state magnetopause standoff distances of ≤ 5 Earth radii, similar to values observed during recent coronal mass ejection events. The data also suggest lower-latitude aurora and increases in polar cap area, as well as heating, expansion, and volatile loss from the exosphere that would have affected long-term atmospheric composition. PMID:20203044

  19. Atmospheric carbon dioxide: a driver of photosynthetic eukaryote evolution for over a billion years?

    PubMed Central

    Beerling, David J.

    2012-01-01

    Exciting evidence from diverse fields, including physiology, evolutionary biology, palaeontology, geosciences and molecular genetics, is providing an increasingly secure basis for robustly formulating and evaluating hypotheses concerning the role of atmospheric carbon dioxide (CO2) in the evolution of photosynthetic eukaryotes. Such studies span over a billion years of evolutionary change, from the origins of eukaryotic algae through to the evolution of our present-day terrestrial floras, and have relevance for plant and ecosystem responses to future global CO2 increases. The papers in this issue reflect the breadth and depth of approaches being adopted to address this issue. They reveal new discoveries pointing to deep evidence for the role of CO2 in shaping evolutionary changes in plants and ecosystems, and establish an exciting cross-disciplinary research agenda for uncovering new insights into feedbacks between biology and the Earth system. PMID:22232760

  20. Billion-atom synchronous parallel kinetic Monte Carlo simulations of critical 3D Ising systems

    NASA Astrophysics Data System (ADS)

    Martínez, E.; Monasterio, P. R.; Marian, J.

    2011-02-01

    An extension of the synchronous parallel kinetic Monte Carlo (spkMC) algorithm developed by Martinez et al. [J. Comp. Phys. 227 (2008) 3804] to discrete lattices is presented. The method solves the master equation synchronously by recourse to null events that keep all processors' time clocks current in a global sense. Boundary conflicts are resolved by adopting a chessboard decomposition into non-interacting sublattices. We find that the bias introduced by the spatial correlations attendant to the sublattice decomposition is within the standard deviation of serial calculations, which confirms the statistical validity of our algorithm. We have analyzed the parallel efficiency of spkMC and find that it scales consistently with problem size and sublattice partition. We apply the method to the calculation of scale-dependent critical exponents in billion-atom 3D Ising systems, with very good agreement with state-of-the-art multispin simulations.

  1. Barium fluoride whispering-gallery-mode disk-resonator with one billion quality-factor.

    PubMed

    Lin, Guoping; Diallo, Souleymane; Henriet, Rémi; Jacquot, Maxime; Chembo, Yanne K

    2014-10-15

    We demonstrate a monolithic optical whispering-gallery-mode resonator fabricated with barium fluoride (BaF₂) with an ultra-high quality (Q) factor above 10⁹ at 1550 nm, measured with both the linewidth and cavity-ring-down methods. Vertical scanning optical profilometry shows that a root mean square surface roughness of 2 nm is achieved for our mm-size disk. To the best of our knowledge, we show for the first time that a one billion Q-factor is achievable by precision polishing in relatively soft crystals with a Mohs hardness of 3. We show that complex thermo-optical dynamics can take place in these resonators. Besides the usual applications in nonlinear optics and microwave photonics, high-energy particle scintillation detection utilizing monolithic BaF₂ resonators potentially becomes feasible. PMID:25361142

  2. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. PMID:22609725

  3. Effects of Neutrino Decay on Oscillation Probabilities

    NASA Astrophysics Data System (ADS)

    Leonard, Kayla; de Gouvêa, André

    2016-01-01

    It is now well accepted that neutrinos oscillate as a quantum mechanical result of a misalignment between their mass eigenstates and the flavor eigenstates. We study neutrino decay—the idea that there may be new, light states that the three Standard Model flavors may be able to decay into. We consider what effects this neutrino decay would have on the observed oscillation probabilities. The Hamiltonian governs how the states change with time, so we use it to calculate an oscillation amplitude, and from that, the oscillation probability. We simplify the theoretical probabilities using results from experimental data, such as the neutrino mixing angles and mass differences. By exploring what values of the decay parameters are physically allowable, we can begin to understand just how large the decay parameters can be. We compare the probabilities in the case of no neutrino decay and in the case of maximum neutrino decay to determine how much of an effect neutrino decay could have on observations, and discuss the ability of future experiments to detect these differences. We also examine neutrino decay in the realm of CP invariance, and find that it is a new source of CP violation. Our work indicates that there is a difference in the oscillation probabilities between particle transitions and their corresponding antiparticle transitions. If neutrino decay were proven true, it could be an important factor in understanding leptogenesis and the particle-antiparticle asymmetry present in our Universe.

  4. Laboratory-tutorial activities for teaching probability

    NASA Astrophysics Data System (ADS)

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-12-01

    We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  5. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
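
    A minimal sketch of the contrast described above, under toy assumptions: the failure event for a single standard-normal parameter is bounded by a tail set B whose probability is known analytically, and conditional sampling within B is compared against plain Monte Carlo. The failure threshold and bounding set are illustrative stand-ins, not the paper's example.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)

      # Toy failure region F = {x > 3.5} for a standard-normal parameter, bounded by
      # B = {x > 3.0}, whose probability P(B) is known analytically.
      p_B = norm.sf(3.0)

      def plain_mc(n):
          x = rng.standard_normal(n)
          return np.mean(x > 3.5)

      def conditional_mc(n):
          # Sample from the normal distribution conditioned on B via the inverse
          # survival function, estimate P(F | B), then rescale by P(B).
          u = rng.uniform(size=n)
          x = norm.isf(u * p_B)
          return p_B * np.mean(x > 3.5)

      print("plain MC    :", plain_mc(100_000))
      print("conditional :", conditional_mc(100_000))
      print("exact       :", norm.sf(3.5))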

  6. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
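
    A minimal sketch of the count-to-probability conversion described above, assuming a Weibull law with illustrative shape and scale parameters (not values fitted in the paper): the number of small events counted since the last large event is mapped to a conditional probability that a large event is due.

      import math

      def large_event_probability(n_small, shape=1.4, scale=120.0):
          # Weibull cumulative law applied to the small-event count since the last
          # large event; shape and scale are placeholders, not fitted values.
          return 1.0 - math.exp(-((n_small / scale) ** shape))

      for n in (30, 120, 300):
          print(n, round(large_event_probability(n), 3))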

  7. Reconstructing the prior probabilities of allelic phylogenies.

    PubMed Central

    Golding, G Brian

    2002-01-01

    In general when a phylogeny is reconstructed from DNA or protein sequence data, it makes use only of the probabilities of obtaining some phylogeny given a collection of data. It is also possible to determine the prior probabilities of different phylogenies. This information can be of use in analyzing the biological causes for the observed divergence of sampled taxa. Unusually "rare" topologies for a given data set may be indicative of different biological forces acting. A recursive algorithm is presented that calculates the prior probabilities of a phylogeny for different allelic samples and for different phylogenies. This method is a straightforward extension of Ewens' sample distribution. The probability of obtaining each possible sample according to Ewens' distribution is further subdivided into each of the possible phylogenetic topologies. These probabilities depend not only on the identity of the alleles and on 4Nμ (four times the effective population size times the neutral mutation rate) but also on the phylogenetic relationships among the alleles. Illustrations of the algorithm are given to demonstrate how different phylogenies are favored under different conditions. PMID:12072482
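
    For reference, a short sketch of Ewens' sampling formula itself, the distribution whose terms the paper's recursion subdivides across phylogenetic topologies; the sample configuration and the value of θ = 4Nμ below are hypothetical.

      from math import factorial

      # Ewens' sampling formula: probability of an allelic configuration in which
      # a_j allelic types are each represented exactly j times in a sample of size n,
      # given theta = 4*N*mu.
      def ewens_probability(a, theta):
          n = sum(j * a_j for j, a_j in a.items())
          rising = 1.0
          for k in range(n):
              rising *= theta + k      # theta_(n) = theta*(theta+1)*...*(theta+n-1)
          prob = factorial(n) / rising
          for j, a_j in a.items():
              prob *= theta ** a_j / (j ** a_j * factorial(a_j))
          return prob

      # Hypothetical sample of n = 6 genes: two singleton alleles plus one allele seen 4 times.
      print(ewens_probability({1: 2, 4: 1}, theta=1.2))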

  8. IRON AND α-ELEMENT PRODUCTION IN THE FIRST ONE BILLION YEARS AFTER THE BIG BANG

    SciTech Connect

    Becker, George D.; Carswell, Robert F.; Sargent, Wallace L. W.; Rauch, Michael

    2012-01-10

    We present measurements of carbon, oxygen, silicon, and iron in quasar absorption systems existing when the universe was roughly one billion years old. We measure column densities in nine low-ionization systems at 4.7 < z < 6.3 using Keck, Magellan, and Very Large Telescope optical and near-infrared spectra with moderate to high resolution. The column density ratios among C II, O I, Si II, and Fe II are nearly identical to sub-damped Lyα systems (sub-DLAs) and metal-poor ([M/H] ≤ -1) DLAs at lower redshifts, with no significant evolution over 2 ≲ z ≲ 6. The estimated intrinsic scatter in the ratio of any two elements is also small, with a typical rms deviation of ≲ 0.1 dex. These facts suggest that dust depletion and ionization effects are minimal in our z > 4.7 systems, as in the lower-redshift DLAs, and that the column density ratios are close to the intrinsic relative element abundances. The abundances in our z > 4.7 systems are therefore likely to represent the typical integrated yields from stellar populations within the first gigayear of cosmic history. Due to the time limit imposed by the age of the universe at these redshifts, our measurements thus place direct constraints on the metal production of massive stars, including iron yields of prompt supernovae. The lack of redshift evolution further suggests that the metal inventories of most metal-poor absorption systems at z ≳ 2 are also dominated by massive stars, with minimal contributions from delayed Type Ia supernovae or winds from asymptotic giant branch stars. The relative abundances in our systems broadly agree with those in very metal-poor, non-carbon-enhanced Galactic halo stars. This is consistent with the picture in which present-day metal-poor stars were potentially formed as early as one billion years after the big bang.

  9. An anoxic, Fe(II)-rich, U-poor ocean 3.46 billion years ago

    NASA Astrophysics Data System (ADS)

    Li, Weiqiang; Czaja, Andrew D.; Van Kranendonk, Martin J.; Beard, Brian L.; Roden, Eric E.; Johnson, Clark M.

    2013-11-01

    The oxidation state of the atmosphere and oceans on the early Earth remains controversial. Although it is accepted by many workers that the Archean atmosphere and ocean were anoxic, hematite in the 3.46 billion-year-old (Ga) Marble Bar Chert (MBC) from the Pilbara Craton, NW Australia, has figured prominently in arguments that the Paleoarchean atmosphere and ocean were fully oxygenated. In this study, we report the Fe isotope compositions and U concentrations of the MBC, and show that the samples have extreme heavy Fe isotope enrichment, where δ⁵⁶Fe values range between +1.5‰ and +2.6‰, the highest δ⁵⁶Fe values for bulk samples yet reported. The high δ⁵⁶Fe values of the MBC require very low levels of oxidation and, in addition, point to a Paleoarchean ocean that had high aqueous Fe(II) contents. A dispersion/reaction model indicates that O2 contents in the photic zone of the ocean were less than 10⁻³ μM, which suggests that the ocean was essentially anoxic. An independent test of anoxic conditions is provided by U-Th-Pb isotope systematics, which show that U contents in the Paleoarchean ocean were likely below 0.02 ppb, two orders of magnitude lower than in the modern ocean. Collectively, the Fe and U data indicate a reduced, Fe(II)-rich, U-poor environment in the Archean oceans at 3.46 billion years ago. Given the evidence for photosynthetic communities provided by broadly coeval stromatolites, these results suggest that an important photosynthetic pathway in the Paleoarchean oceans may have been anoxygenic photosynthetic Fe(II) oxidation.

  10. Searching for Organics Preserved in 4.5 Billion Year Old Salt

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael E.; Fries, M.; Steele, A.; Bodnar, R.

    2012-01-01

    Our understanding of early solar system fluids took a dramatic turn a decade ago with the discovery of fluid inclusion-bearing halite (NaCl) crystals in the matrix of two freshly fallen brecciated H chondrite falls, Monahans and Zag. Both meteorites are regolith breccias and contain xenolithic halite (and minor admixed sylvite, KCl) crystals in their regolith lithologies. The halites are purple to dark blue, due to the presence of color centers (electrons in anion vacancies) which slowly accumulated as ⁴⁰K (in sylvite) decayed over billions of years. The halites were dated by K-Ar, Rb-Sr and I-Xe systematics to be 4.5 billion years old. The "blue" halites were a fantastic discovery for the following reasons: (1) Halite+sylvite can be dated (K is in sylvite and will substitute for Na in halite, Rb substitutes for Na in halite, and I substitutes for Cl). (2) The blue color is lost if the halite dissolves on Earth and reprecipitates (because the newly-formed halite has no color centers), so the color serves as a "freshness" or pristinity indicator. (3) Halite frequently contains aqueous fluid inclusions. (4) Halite contains no structural oxygen, carbon or hydrogen, making it an ideal material in which to measure these isotopic systems in any fluid inclusions. (5) It is possible to directly measure fluid inclusion formation temperatures, and thus directly measure the temperature of the mineralizing aqueous fluid. In addition to these two ordinary chondrites, halite grains have been reliably reported in several ureilites, an additional ordinary chondrite (Jilin), and in the carbonaceous chondrite Murchison, although these reports were unfortunately not taken seriously. We have lately found additional fluid inclusions in carbonates in several additional carbonaceous chondrites. Meteoritic aqueous fluid inclusions are apparently relatively widespread in meteorites, though very small and thus difficult to analyze.

  11. Deposition of 1.88-billion-year-old iron formations as a consequence of rapid crustal growth.

    PubMed

    Rasmussen, Birger; Fletcher, Ian R; Bekker, Andrey; Muhling, Janet R; Gregory, Courtney J; Thorne, Alan M

    2012-04-26

    Iron formations are chemical sedimentary rocks comprising layers of iron-rich and silica-rich minerals whose deposition requires anoxic and iron-rich (ferruginous) sea water. Their demise after the rise in atmospheric oxygen by 2.32 billion years (Gyr) ago has been attributed to the removal of dissolved iron through progressive oxidation or sulphidation of the deep ocean. Therefore, a sudden return of voluminous iron formations nearly 500 million years later poses an apparent conundrum. Most late Palaeoproterozoic iron formations are about 1.88 Gyr old and occur in the Superior region of North America. Major iron formations are also preserved in Australia, but these were apparently deposited after the transition to a sulphidic ocean at 1.84 Gyr ago that should have terminated iron formation deposition, implying that they reflect local marine conditions. Here we date zircons in tuff layers to show that iron formations in the Frere Formation of Western Australia are about 1.88 Gyr old, indicating that the deposition of iron formations from two disparate cratons was coeval and probably reflects global ocean chemistry. The sudden reappearance of major iron formations at 1.88 Gyr ago--contemporaneous with peaks in global mafic-ultramafic magmatism, juvenile continental and oceanic crust formation, mantle depletion and volcanogenic massive sulphide formation--suggests deposition of iron formations as a consequence of major mantle activity and rapid crustal growth. Our findings support the idea that enhanced submarine volcanism and hydrothermal activity linked to a peak in mantle melting released large volumes of ferrous iron and other reductants that overwhelmed the sulphate and oxygen reservoirs of the ocean, decoupling atmospheric and seawater redox states, and causing the return of widespread ferruginous conditions. Iron formations formed on clastic-starved coastal shelves where dissolved iron upwelled and mixed with oxygenated surface water. The

  12. Beaufortian stratigraphic plays in the National Petroleum Reserve - Alaska (NPRA)

    USGS Publications Warehouse

    Houseknecht, David W.

    2003-01-01

    The Beaufortian megasequence in the National Petroleum Reserve in Alaska (NPRA) includes Jurassic through lower Cretaceous (Neocomian) strata of the Kingak Shale and the overlying pebble shale unit. These strata are part of a composite total petroleum system involving hydrocarbons expelled from source rocks in three stratigraphic intervals, the Lower Jurassic part of the Kingak Shale, the Triassic Shublik Formation, and the Lower Cretaceous gamma-ray zone (GRZ) and associated strata. The potential for undiscovered oil and gas resources in the Beaufortian megasequence in NPRA was assessed by defining eight plays (assessment units), two in lower Cretaceous (Neocomian) topset seismic facies, four in Upper Jurassic topset seismic facies, one in Lower Jurassic topset seismic facies, and one in Jurassic through lower Cretaceous (Neocomian) clinoform seismic facies. The Beaufortian Cretaceous Topset North Play is estimated to contain between 0 (95-percent probability) and 239 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 103 million barrels. The Beaufortian Cretaceous Topset North Play is estimated to contain between 0 (95-percent probability) and 1,162 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 405 billion cubic feet. The Beaufortian Cretaceous Topset South Play is estimated to contain between 635 (95-percent probability) and 4,004 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 2,130 billion cubic feet. No technically recoverable oil is assessed in the Beaufortian Cretaceous Topset South Play, as it lies at depths that are entirely in the gas window. The Beaufortian Upper Jurassic Topset Northeast Play is estimated to contain between 2,744 (95-percent probability) and 8,086 (5-percent probability) million barrels of technically recoverable oil

  13. Classical and Quantum Probability for Biologists - Introduction

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei.

    2010-01-01

    The aim of this review (oriented to biologists looking for applications of QM) is to provide a detailed comparative analysis of classical (Kolmogorovian) and quantum (Dirac-von Neumann) models. We will stress differences in the definition of conditional probability and, as a consequence, in the structures of matrices of transition probabilities, especially the condition of double stochasticity which arises naturally in QM. One of the most fundamental differences between the two models is the deformation of the classical formula of total probability (FTP), which plays an important role in statistics and decision making. An additional term appears in the QM version of the FTP, the so-called interference term. Finally, we discuss Bell's inequality and show that the common viewpoint that its violation induces either nonlocality or the "death of realism" has not been completely justified. For us it is merely a sign of the non-Kolmogorovianity of probabilistic data collected in a few experiments with incompatible setups of measurement devices.
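
    For concreteness, the deformed formula of total probability for dichotomous observables is usually written in the following quantum-like form (a hedged illustration in our notation, not quoted from the review):

      P(B{=}b) = \sum_{a} P(A{=}a)\,P(B{=}b \mid A{=}a)
                 + 2\cos\theta_b \sqrt{\prod_{a} P(A{=}a)\,P(B{=}b \mid A{=}a)}

    where the sum and product run over the two values of A and θ_b is an interference phase angle; setting cos θ_b = 0 recovers the classical formula of total probability.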

  14. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of the detection probability (pd) for fluctuating and non-fluctuating targets has been deduced. Also, a comparison of the pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is not smaller than 6, the detection performance of the EBPSK-MODEM system is better than that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided for illustrating the performance.

  15. Pointwise probability reinforcements for robust statistical inference.

    PubMed

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. PMID:24300550

  16. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x,y) ∈ Z^2 : x + y even, 0 ≤ y ≤ n, −y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and a site (0,n). The critical behavior of P_∞^l is clearly different from the global percolation probability P_∞^g, characterized by a critical exponent β_g. An analysis based on the Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.

  17. Sampling Quantum Nonlocal Correlations with High Probability

    NASA Astrophysics Data System (ADS)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩)_{i,j=1}^n, where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the previous vectors are sampled according to the Haar measure in the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.
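
    Sampling such correlation matrices is straightforward, as the sketch below shows; deciding whether a sampled γ is local or nonlocal is the hard part studied in the paper and is not attempted here (the values of n and m are arbitrary).

      import numpy as np

      rng = np.random.default_rng(0)

      def haar_unit_vectors(k, m):
          """k vectors drawn uniformly (Haar measure) from the unit sphere of R^m."""
          v = rng.normal(size=(k, m))
          return v / np.linalg.norm(v, axis=1, keepdims=True)

      n, m = 200, 40                    # alpha = m/n = 0.2
      u, v = haar_unit_vectors(n, m), haar_unit_vectors(n, m)
      gamma = u @ v.T                   # gamma_ij = <u_i, v_j>, a quantum correlation matrix
      print(gamma.shape, float(np.abs(gamma).max()))   # all entries lie in [-1, 1]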

  18. Monte Carlo simulation of scenario probability distributions

    SciTech Connect

    Glaser, R.

    1996-10-23

    Suppose a scenario of interest can be represented as a series of events. A final result R may be viewed then as the intersection of three events, A, B, and C. The probability of the result P(R) in this case is the product P(R) = P(A) P(B | A) P(C | A ∩ B). An expert may be reluctant to estimate P(R) as a whole yet agree to supply his notions of the component probabilities in the form of prior distributions. Each component prior distribution may be viewed as the stochastic characterization of the expert's uncertainty regarding the true value of the component probability. Mathematically, the component probabilities are treated as independent random variables and P(R) as their product; the induced prior distribution for P(R) is determined which characterizes the expert's uncertainty regarding P(R). It may be both convenient and adequate to approximate the desired distribution by Monte Carlo simulation. Software has been written for this task that allows a variety of component priors that experts with good engineering judgment might feel comfortable with. The priors are mostly based on so-called likelihood classes. The software permits an expert to choose for a given component event probability one of six types of prior distributions, and the expert specifies the parameter value(s) for that prior. Each prior is unimodal. The expert essentially decides where the mode is, how the probability is distributed in the vicinity of the mode, and how rapidly it attenuates away. Limiting and degenerate applications allow the expert to be vague or precise.
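
    A minimal sketch of this idea, with simple hypothetical component priors standing in for the likelihood-class priors that the software supports:

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000

      # Hypothetical priors expressing an expert's uncertainty about each component probability.
      p_a = rng.beta(8, 2, N)                   # P(A)
      p_b_given_a = rng.beta(5, 5, N)           # P(B | A)
      p_c_given_ab = rng.uniform(0.6, 0.9, N)   # P(C | A and B)

      p_r = p_a * p_b_given_a * p_c_given_ab    # induced samples of P(R)

      print("mean of P(R):", round(p_r.mean(), 4))
      print("5th and 95th percentiles:", np.percentile(p_r, [5, 95]).round(4))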

  19. Electric quadrupole transition probabilities for atomic lithium

    SciTech Connect

    Çelik, Gültekin; Gökçe, Yasin; Yıldız, Murat

    2014-05-15

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  20. Non-Gaussian Photon Probability Distribution

    SciTech Connect

    Solomon, Benjamin T.

    2010-01-28

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma (mGAMMA) distribution (whose parameters are α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints to quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact P_i, the probabilistic function, and the ability to interact A_i, the electromagnetic function. Splitting the probability function P_i from the electromagnetic function A_i enables the investigation of the photon behavior from a purely probabilistic P_i perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function P_i and the ability to interact A_i, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon P_i of the photon probability distribution. Sub wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub

  1. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. PMID:26621989

  2. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property, known for the two-qubit state in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering-detection inequalities are based on correlation functions, we introduce analogs of such functions for single-qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and in the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.

  3. Practical algorithmic probability: an image inpainting example

    NASA Astrophysics Data System (ADS)

    Potapov, Alexey; Scherbakov, Oleg; Zhdanov, Innokentii

    2013-12-01

    The possibility of practical application of algorithmic probability is analyzed using the example of the image inpainting problem, which corresponds precisely to the prediction problem. Such consideration is fruitful both for the theory of universal prediction and for practical image inpainting methods. Efficient application of algorithmic probability implies that its computation is essentially optimized for some specific data representation. In this paper, we considered one image representation, namely the spectral representation, for which an image inpainting algorithm is proposed based on the spectrum entropy criterion. This algorithm showed promising results in spite of the very simple representation. The same approach can be used for introducing an ALP-based criterion for more powerful image representations.

  4. Flood frequency: expected and unexpected probabilities

    USGS Publications Warehouse

    Thomas, D.M.

    1976-01-01

    Flood-frequency curves may be defined either with or without an 'expected probability' adjustment; and the two curves differ in the way that they attempt to average the time-sampling uncertainties. A curve with no adjustment is shown to estimate a median value of both discharge and frequency of occurrence, while an expected probability curve is shown to estimate a mean frequency of flood years. The attributes and constraints of the two types of curves for various uses are discussed.

  5. Brookian stratigraphic plays in the National Petroleum Reserve - Alaska (NPRA)

    USGS Publications Warehouse

    Houseknecht, David W.

    2003-01-01

    The Brookian megasequence in the National Petroleum Reserve in Alaska (NPRA) includes bottomset and clinoform seismic facies of the Torok Formation (mostly Albian age) and generally coeval, topset seismic facies of the uppermost Torok Formation and the Nanushuk Group. These strata are part of a composite total petroleum system involving hydrocarbons expelled from three stratigraphic intervals of source rocks, the Lower Cretaceous gamma-ray zone (GRZ), the Lower Jurassic Kingak Shale, and the Triassic Shublik Formation. The potential for undiscovered oil and gas resources in the Brookian megasequence in NPRA was assessed by defining five plays (assessment units), one in the topset seismic facies and four in the bottomset-clinoform seismic facies. The Brookian Topset Play is estimated to contain between 60 (95-percent probability) and 465 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 239 million barrels. The Brookian Topset Play is estimated to contain between 0 (95-percent probability) and 679 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 192 billion cubic feet. The Brookian Clinoform North Play, which extends across northern NPRA, is estimated to contain between 538 (95-percent probability) and 2,257 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 1,306 million barrels. The Brookian Clinoform North Play is estimated to contain between 0 (95-percent probability) and 1,969 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 674 billion cubic feet. The Brookian Clinoform Central Play, which extends across central NPRA, is estimated to contain between 299 (95-percent probability) and 1,849 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 973

  6. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    NASA Astrophysics Data System (ADS)

    Vourdas, A.

    2014-08-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2), to the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
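
    One simple additivity check in this spirit is easy to run numerically. The sketch below takes two non-commuting rank-1 projectors in C^2 and evaluates P(H1) + P(H2) − P(H1 ∧ H2) − P(H1 ∨ H2) for quantum probabilities P(H) = Tr[ρ P(H)]; for Kolmogorov probabilities this combination vanishes. The operator D defined in the paper is not reproduced here.

      import numpy as np

      e0 = np.array([1.0, 0.0])
      plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
      P1 = np.outer(e0, e0)          # projector onto H1 = span{|0>}
      P2 = np.outer(plus, plus)      # projector onto H2 = span{|+>}

      rho = np.outer(e0, e0)         # state |0><0|

      # For these two distinct 1-D subspaces: H1 AND H2 = {0}, H1 OR H2 = C^2.
      P_meet = np.zeros((2, 2))
      P_join = np.eye(2)

      prob = lambda P: float(np.trace(rho @ P).real)
      deviation = prob(P1) + prob(P2) - prob(P_meet) - prob(P_join)
      print("norm of the commutator [P1, P2]:", np.linalg.norm(P1 @ P2 - P2 @ P1))
      print("additivity deviation:", deviation)   # 0.5, i.e. Kolmogorov additivity fails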

  7. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    SciTech Connect

    Vourdas, A.

    2014-08-15

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2), to the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  8. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri

    USGS Publications Warehouse

    Southard, Rodney E.; Veilleux, Andrea G.

    2014-01-01

    similar and related to three primary physiographic provinces. The final regional regression analyses resulted in three sets of equations. For Regions 1 and 2, the basin characteristics of drainage area and basin shape factor were statistically significant. For Region 3, because of the small amount of data from streamgages, only drainage area was statistically significant. Average standard errors of prediction ranged from 28.7 to 38.4 percent for flood region 1, 24.1 to 43.5 percent for flood region 2, and 25.8 to 30.5 percent for region 3. The regional regression equations are only applicable to stream sites in Missouri with flows not significantly affected by regulation, channelization, backwater, diversion, or urbanization. Basins with about 5 percent or less impervious area were considered to be rural. Applicability of the equations is limited to the basin characteristic values that range from 0.11 to 8,212.38 square miles (mi2) and basin shape from 2.25 to 26.59 for Region 1, 0.17 to 4,008.92 mi2 and basin shape 2.04 to 26.89 for Region 2, and 2.12 to 2,177.58 mi2 for Region 3. Annual peak data from streamgages were used to qualitatively assess the largest floods recorded at streamgages in Missouri since the 1915 water year. Based on existing streamgage data, the 1983 flood event was the largest flood event on record since 1915. The next five largest flood events, in descending order, took place in 1993, 1973, 2008, 1994 and 1915. Since 1915, five of the six largest floods on record occurred from 1973 to 2012.
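
    The report's actual regression coefficients are not reproduced in this summary. The sketch below only illustrates the functional form of such a regional regression, with discharge expressed as a power function of drainage area and basin shape factor; every coefficient shown is a placeholder, not a value from the report.

      # Illustrative form only: Q_p = a * (drainage area)^b * (basin shape factor)^c,
      # with placeholder coefficients a, b, c (NOT the published Missouri values).
      def exceedance_discharge(drainage_area_mi2, shape_factor, a=100.0, b=0.7, c=-0.3):
          return a * drainage_area_mi2**b * shape_factor**c

      print(round(exceedance_discharge(250.0, 5.0)), "cubic feet per second (illustrative)")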

  9. Galaxy evolution. Evidence for mature bulges and an inside-out quenching phase 3 billion years after the Big Bang.

    PubMed

    Tacchella, S; Carollo, C M; Renzini, A; Förster Schreiber, N M; Lang, P; Wuyts, S; Cresci, G; Dekel, A; Genzel, R; Lilly, S J; Mancini, C; Newman, S; Onodera, M; Shapley, A; Tacconi, L; Woo, J; Zamorani, G

    2015-04-17

    Most present-day galaxies with stellar masses ≥10^11 solar masses show no ongoing star formation and are dense spheroids. Ten billion years ago, similarly massive galaxies were typically forming stars at rates of hundreds of solar masses per year. It is debated how star formation ceased, on which time scales, and how this "quenching" relates to the emergence of dense spheroids. We measured stellar mass and star-formation rate surface density distributions in star-forming galaxies at redshift 2.2 with ~1-kiloparsec resolution. We find that, in the most massive galaxies, star formation is quenched from the inside out, on time scales less than 1 billion years in the inner regions, up to a few billion years in the outer disks. These galaxies sustain high star-formation activity at large radii, while hosting fully grown and already quenched bulges in their cores. PMID:25883353

  10. Switching To Less-Expensive Blindness Drug Could Save Medicare Part B $18 Billion Over A Ten-Year Period

    PubMed Central

    Hutton, DW; Newman-Casey, PA; Tavag, M; Zacks, DN; Stein, JD

    2014-01-01

    The biologic drugs bevacizumab and ranibizumab have revolutionized treatment of diabetic macular edema and macular degeneration, leading causes of blindness. Ophthalmologic use of these drugs has increased, now accounting for roughly one-sixth of the Medicare Part B drug budget. Ranibizumab and bevacizumab have similar efficacy and potentially minor differences in adverse event rates, but at $2,023 per dose, ranibizumab costs forty times more than bevacizumab. Using modeling methods, we predict ten-year (2010–2020) population-level costs and health benefits of using bevacizumab and ranibizumab. Our results show that if all patients were treated with the less-expensive bevacizumab instead of current usage patterns, Medicare Part B, patients, and the health care system would save $18 billion, $4.6 billion, and $29 billion, respectively. Altering patterns of use with these therapies by encouraging bevacizumab use and hastening approval of biosimilar therapies would dramatically reduce spending without substantially affecting patient outcomes. PMID:24889941

  11. Technique for Evaluating Multiple Probability Occurrences /TEMPO/

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1970-01-01

    Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.

  12. Spatial Probability Cuing and Right Hemisphere Damage

    ERIC Educational Resources Information Center

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  13. Assessing Schematic Knowledge of Introductory Probability Theory

    ERIC Educational Resources Information Center

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  14. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  15. Phonotactic Probability Effects in Children Who Stutter

    ERIC Educational Resources Information Center

    Anderson, Julie D.; Byrd, Courtney T.

    2008-01-01

    Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…

  16. Estimating the Probability of Negative Events

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  17. Large Deviations: Advanced Probability for Undergrads

    ERIC Educational Resources Information Center

    Rolls, David A.

    2007-01-01

    In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…

  18. Probability & Perception: The Representativeness Heuristic in Action

    ERIC Educational Resources Information Center

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  19. Simplicity and Probability in Causal Explanation

    ERIC Educational Resources Information Center

    Lombrozo, Tania

    2007-01-01

    What makes some explanations better than others? This paper explores the roles of simplicity and probability in evaluating competing causal explanations. Four experiments investigate the hypothesis that simpler explanations are judged both better and more likely to be true. In all experiments, simplicity is quantified as the number of causes…

  20. Exploring Concepts in Probability: Using Graphics Calculators

    ERIC Educational Resources Information Center

    Ghosh, Jonaki

    2004-01-01

    This article describes a project in which certain key concepts in probability were explored using graphics calculators with year 10 students. The lessons were conducted in the regular classroom where students were provided with a Casio CFX 9850 GB PLUS graphics calculator with which they were familiar from year 9. The participants in the…

  1. The Smart Potential behind Probability Matching

    ERIC Educational Resources Information Center

    Gaissmaier, Wolfgang; Schooler, Lael J.

    2008-01-01

    Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…

  2. Monte Carlo, Probability, Algebra, and Pi.

    ERIC Educational Resources Information Center

    Hinders, Duane C.

    1981-01-01

    The uses of random number generators are illustrated in three ways: (1) the solution of a probability problem using a coin; (2) the solution of a system of simultaneous linear equations using a die; and (3) the approximation of pi using darts. (MP)
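
    A present-day version of the third activity, approximating pi by throwing random "darts" at the unit square, fits in a few lines of Python:

      import random

      random.seed(7)
      n = 100_000
      hits = sum(random.random()**2 + random.random()**2 <= 1.0 for _ in range(n))
      print("pi is approximately", 4 * hits / n)   # fraction of darts inside the quarter circle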

  3. On the bound of first excursion probability

    NASA Technical Reports Server (NTRS)

    Yang, J. N.

    1969-01-01

    Method has been developed to improve the lower bound of the first excursion probability; the method can be applied to problems with either constant or time-dependent barriers. The method requires knowledge of the joint density function of the random process at two arbitrary instants.

  4. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
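
    The "rote learning" baseline that the paper aims to improve on is conditional-frequency counting. A minimal sketch for a two-node network, using synthetic data in place of the car-insurance network and Laplace smoothing so unseen combinations do not receive probability zero:

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic data for a two-node network: parent 'rain', child 'wet' (grass is wet).
      rain = rng.random(1000) < 0.3
      wet = np.where(rain, rng.random(1000) < 0.9, rng.random(1000) < 0.1)

      # Rote (maximum-likelihood) estimate of the conditional probability table P(wet | rain).
      for r in (False, True):
          n_r = np.sum(rain == r)
          n_rw = np.sum((rain == r) & wet)
          print(f"P(wet | rain={r}) ~ {(n_rw + 1) / (n_r + 2):.3f}")   # Laplace-smoothed count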

  5. Quantum temporal probabilities in tunneling systems

    NASA Astrophysics Data System (ADS)

    Anastopoulos, Charis; Savvidou, Ntina

    2013-09-01

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines 'classical' time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems.

  6. Conceptual Variation and Coordination in Probability Reasoning

    ERIC Educational Resources Information Center

    Nilsson, Per

    2009-01-01

    This study investigates students' conceptual variation and coordination among theoretical and experimental interpretations of probability. In the analysis we follow how Swedish students (12-13 years old) interact with a dice game, specifically designed to offer the students opportunities to elaborate on the logic of sample space,…

  7. Teaching Mathematics with Technology: Probability Simulations.

    ERIC Educational Resources Information Center

    Bright, George W.

    1989-01-01

    Discussed are the use of probability simulations in a mathematics classroom. Computer simulations using regular dice and special dice are described. Sample programs used to generate 100 rolls of a pair of dice in BASIC and Logo languages are provided. (YP)
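
    For readers without the BASIC or Logo listings at hand, an equivalent simulation of 100 rolls of a pair of dice takes only a few lines of Python:

      import random

      random.seed(1)
      rolls = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(100)]
      totals = [a + b for a, b in rolls]
      for total in range(2, 13):
          print(total, totals.count(total))   # empirical frequency of each total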

  8. Probability in Action: The Red Traffic Light

    ERIC Educational Resources Information Center

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  9. Confusion between Odds and Probability, a Pandemic?

    ERIC Educational Resources Information Center

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  10. Posterior Probabilities for a Consensus Ordering.

    ERIC Educational Resources Information Center

    Fligner, Michael A.; Verducci, Joseph S.

    1990-01-01

    The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)

  11. Probability & Statistics: Modular Learning Exercises. Student Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  12. Learning a Probability Distribution Efficiently and Reliably

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
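
    The CDF-Inversion Algorithm itself is not reproduced in this abstract. The sketch below conveys the flavour of learning a distribution over a finite set to a specified accuracy and confidence, using the Dvoretzky-Kiefer-Wolfowitz bound to choose the sample size; it is an illustration of the (epsilon, delta) guarantee, not the Laird-Gamble algorithm.

      import math
      import numpy as np

      def sample_size(eps, delta):
          """Samples sufficient (by the DKW inequality) for the empirical CDF to be
          within eps of the true CDF everywhere, with probability at least 1 - delta."""
          return math.ceil(math.log(2.0 / delta) / (2.0 * eps**2))

      rng = np.random.default_rng(3)
      true_p = np.array([0.5, 0.2, 0.2, 0.1])       # unknown distribution over {0, 1, 2, 3}
      eps, delta = 0.02, 0.05
      n = sample_size(eps, delta)
      draws = rng.choice(len(true_p), size=n, p=true_p)
      learned_p = np.bincount(draws, minlength=len(true_p)) / n
      cdf_error = np.abs(np.cumsum(learned_p) - np.cumsum(true_p)).max()
      print(n, "samples, worst-case CDF error:", round(float(cdf_error), 4))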

  13. Five-Parameter Bivariate Probability Distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.

  14. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  15. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  16. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in the hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  17. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  18. U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    SciTech Connect

    Downing, Mark; Eaton, Laurence M; Graham, Robin Lambert; Langholtz, Matthew H; Perlack, Robert D; Turhollow Jr, Anthony F; Stokes, Bryce; Brandt, Craig C

    2011-08-01

    The report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of 'potential' biomass based on numerous assumptions about current and future inventory, production capacity, availability, and technology. The analysis was made to determine if conterminous U.S. agriculture and forestry resources had the capability to produce at least one billion dry tons of sustainable biomass annually to displace 30% or more of the nation's present petroleum consumption. An effort was made to use conservative estimates to assure confidence in having sufficient supply to reach the goal. The potential biomass was projected to be reasonably available around mid-century when large-scale biorefineries are likely to exist. The study emphasized primary sources of forest- and agriculture-derived biomass, such as logging residues, fuel treatment thinnings, crop residues, and perennially grown grasses and trees. These primary sources have the greatest potential to supply large, reliable, and sustainable quantities of biomass. While the primary sources were emphasized, estimates of secondary residue and tertiary waste resources of biomass were also provided. The original Billion-Ton Resource Assessment, published in 2005, was divided into two parts-forest-derived resources and agriculture-derived resources. The forest resources included residues produced during the harvesting of merchantable timber, forest residues, and small-diameter trees that could become available through initiatives to reduce fire hazards and improve forest health; forest residues from land conversion; fuelwood extracted from forests; residues generated at primary forest product processing mills; and urban wood wastes, municipal solid wastes (MSW), and construction and demolition (C&D) debris. For these forest resources, only residues, wastes, and small-diameter trees were

  19. The albedo effect on neutron transmission probability.

    PubMed

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco) and we have applied the statistical weight method which supposes that the neutron is born at the source with a unit statistical weight and after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
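
    A much-simplified analogue of this calculation, a homogeneous slab with scattering probability Ps, survival-biased statistical weights, and no vacuum channel or albedo boundary, can be sketched as below. The exponential biasing used in the paper for deep penetration is omitted, so this plain estimator is inefficient for a slab as thick as 20 mean free paths.

      import numpy as np

      rng = np.random.default_rng(1)
      THICKNESS = 20.0     # slab thickness in mean free paths (lambda)
      PS = 0.8             # scattering probability at each collision
      N = 200_000          # neutron histories

      transmitted = 0.0
      for _ in range(N):
          x, mu, w = 0.0, 1.0, 1.0            # position, direction cosine, statistical weight
          while True:
              x += mu * rng.exponential(1.0)  # free flight to the next collision
              if x >= THICKNESS:
                  transmitted += w            # leaks through the right face
                  break
              if x < 0.0:
                  break                       # leaks back through the left face
              w *= PS                         # survival biasing: carry the scattering probability
              if w < 1e-12:
                  break
              mu = rng.uniform(-1.0, 1.0)     # isotropic re-emission (direction cosine)

      print("estimated transmission probability:", transmitted / N)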

  20. Quantum temporal probabilities in tunneling systems

    SciTech Connect

    Anastopoulos, Charis Savvidou, Ntina

    2013-09-15

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines ‘classical’ time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems. -- Highlights: •Present a general methodology for deriving temporal probabilities in tunneling systems. •Treatment applies to relativistic particles interacting through quantum fields. •Derive a new expression for tunneling time. •Identify new time parameters relevant to tunneling. •Propose a resolution of the superluminality paradox in tunneling.

  1. Neural representation of probabilities for Bayesian inference.

    PubMed

    Rich, Dylan; Cazettes, Fanny; Wang, Yunyan; Peña, José Luis; Fischer, Brian J

    2015-04-01

    Bayesian models are often successful in describing perception and behavior, but the neural representation of probabilities remains in question. There are several distinct proposals for the neural representation of probabilities, but they have not been directly compared in an example system. Here we consider three models: a non-uniform population code where the stimulus-driven activity and distribution of preferred stimuli in the population represent a likelihood function and a prior, respectively; the sampling hypothesis which proposes that the stimulus-driven activity over time represents a posterior probability and that the spontaneous activity represents a prior; and the class of models which propose that a population of neurons represents a posterior probability in a distributed code. It has been shown that the non-uniform population code model matches the representation of auditory space generated in the owl's external nucleus of the inferior colliculus (ICx). However, the alternative models have not been tested, nor have the three models been directly compared in any system. Here we tested the three models in the owl's ICx. We found that spontaneous firing rate and the average stimulus-driven response of these neurons were not consistent with predictions of the sampling hypothesis. We also found that neural activity in ICx under varying levels of sensory noise did not reflect a posterior probability. On the other hand, the responses of ICx neurons were consistent with the non-uniform population code model. We further show that Bayesian inference can be implemented in the non-uniform population code model using one spike per neuron when the population is large and is thus able to support the rapid inference that is necessary for sound localization. PMID:25561333

  2. Cooling and exhumation of continents at billion-year time scales

    NASA Astrophysics Data System (ADS)

    Blackburn, T.; Bowring, S. A.; Perron, T.; Mahan, K. H.; Dudas, F. O.

    2011-12-01

    The oldest rocks on Earth are preserved within the continental lithosphere, where assembled fragments of ancient orogenic belts have survived erosion and destruction by plate tectonic and surface processes for billions of years. Though the rate of orogenic exhumation and erosion has been measured for segments of an orogenic history, it remains unclear how these exhumation rates have changed over the lifetime of any terrane. Because the exhumation of the lithospheric surface has a direct effect on the rate of heat loss within the lithosphere, a continuous record of lithosphere exhumation can be reconstructed through the use of thermochronology. Thermochronologic studies have typically employed systems sensitive to cooling at temperatures <300 °C, such as the (U-Th)/He and 40Ar/39Ar systems. This largely restricts their application to measuring cooling in rocks from the outer 10 km of the Earth's crust, resulting in a thermal history that is controlled by either upper crustal flexure and faulting and/or isotherm inflections related to surface topography. Combining these biases with the uplift, erosion and recycling of these shallow rocks results in a poor preservation potential of any long-term record. Here, an ancient and long-term record of lithosphere exhumation is constructed using U-Pb thermochronology, a geochronologic system sensitive to cooling at temperatures found at 20-50 km depth (400-650 °C). Lower crustal xenoliths provide material that resided at these depths for billions of years or more, recording a thermal history that is buried deep enough to remain insensitive to upper crustal deformation and instead is dominated by the vertical motions of the continents. We show how this temperature-sensitive system can produce a long-term integrated measure of continental exhumation and erosion. Preserved beneath Phanerozoic sedimentary rocks within Montana, USA, the Great Falls Tectonic Zone formed when two Archean cratons, the Wyoming Province and Medicine

  3. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    ERIC Educational Resources Information Center

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  4. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    ERIC Educational Resources Information Center

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  5. Killeen's Probability of Replication and Predictive Probabilities: How to Compute, Use, and Interpret Them

    ERIC Educational Resources Information Center

    Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques

    2010-01-01

    P. R. Killeen's (2005a) probability of replication ("p[subscript rep]") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p[subscript rep]" is now routinely reported in "Psychological Science" and has also begun to appear in other journals. However, there is…

  6. The Star Formation History of the Universe over the Past Eight Billion Years

    NASA Astrophysics Data System (ADS)

    Zhu, Guangtun

    How galaxies such as our own Milky Way formed and evolved remains a mystery. There are two general approaches in galaxy formation and evolution studies. One is to infer formation histories via archaeological investigations of galaxies at low redshift, in the local Universe. The other is to study galaxy formation and evolution in action by observing faint distant galaxies, the ancestors of local galaxies, in the more distant and younger Universe, at higher redshift. I employ the first approach to study the formation of elliptical galaxies, the most massive galaxies in the Universe. I investigate the stellar content of 1923 elliptical galaxies, the largest high-fidelity sample in the local Universe, as a function of stellar mass and environment. I infer their star formation histories, finding that isolated low-mass elliptical galaxies formed their stars slightly later than their counterparts in galaxy clusters. I measure the cosmic star formation rate (SFR) density at redshift z ˜ 1, when the Universe was eight billion years younger. The cosmic SFR density measures how many stars are being formed per unit volume of the Universe. I show that galaxies were more actively forming stars eight billion years ago than they are at present, by roughly an order of magnitude. The reason why galaxies are so much less active at present remains unknown, partly due to the small sample size of distant galaxies observed previously. To improve the sample size, we have completed a new galaxy survey, the Prism Multi-object Survey (PRIMUS). We have observed ˜ 120, 000 galaxies spanning distances from the local Universe to redshift z ˜ 1. We specifically targeted fields with existing multi-wavelength data in the X-ray, ultraviolet, optical, and infrared. The large sample and multi-wavelength data allow precise statistical studies of galaxy evolution since z ˜1. As a preliminary result from PRIMUS, I show that 15% of galaxies that appear to lack star formation in the optical actually

  7. A redox-stratified ocean 3.2 billion years ago

    NASA Astrophysics Data System (ADS)

    Satkoski, Aaron M.; Beukes, Nicolas J.; Li, Weiqiang; Beard, Brian L.; Johnson, Clark M.

    2015-11-01

    Before the Great Oxidation Event (GOE) 2.4-2.2 billion years ago it has been traditionally thought that oceanic water columns were uniformly anoxic due to a lack of oxygen-producing microorganisms. Recently, however, it has been proposed that transient oxygenation of shallow seawater occurred between 2.8 and 3.0 billion years ago. Here, we present a novel combination of stable Fe and radiogenic U-Th-Pb isotope data that demonstrate significant oxygen contents in the shallow oceans at 3.2 Ga, based on analysis of the Manzimnyama Banded Iron Formation (BIF), Fig Tree Group, South Africa. This unit is exceptional in that proximal, shallow-water and distal, deep-water facies are preserved. When compared to the distal, deep-water facies, the proximal samples show elevated U concentrations and moderately positive δ56Fe values, indicating vertical stratification in dissolved oxygen contents. Confirmation of oxidizing conditions using U abundances is robustly constrained using samples that have been closed to U and Pb mobility using U-Th-Pb geochronology. Although redox-sensitive elements have been commonly used in ancient rocks to infer redox conditions, post-depositional element mobility has been rarely tested, and U-Th-Pb geochronology can constrain open- or closed-system behavior. The U abundances and δ56Fe values of the Manzimnyama BIF suggest the proximal, shallow-water samples record precipitation under stronger oxidizing conditions compared to the distal deeper-water facies, which in turn indicates the existence of a discrete redox boundary between deep and shallow ocean waters at this time; this work, therefore, documents the oldest known preserved marine redox gradient in the rock record. The relative enrichment of O2 in the upper water column is likely due to the existence of oxygen-producing microorganisms such as cyanobacteria. These results provide a new approach for identifying free oxygen in Earth's ancient oceans, including confirming the age of redox

  8. The First Billion Years project: dark matter haloes going from contraction to expansion and back again

    NASA Astrophysics Data System (ADS)

    Davis, Andrew J.; Khochfar, Sadegh; Dalla Vecchia, Claudio

    2014-09-01

    We study the effect of baryons on the inner dark matter profile of the first galaxies using the First Billion Years simulation between z = 16 and 6 before secular evolution sets in. Using a large statistical sample from two simulations of the same volume and cosmological initial conditions, one with and one without baryons, we are able to directly compare haloes with their baryon-free counterparts, allowing a detailed study of the modifications to the dark matter density profile due to the presence of baryons during the first billion years of galaxy formation. For each of the ≈5000 haloes in our sample (3 × 107 M⊙ ≤ Mtot ≤ 5 × 109 M⊙), we quantify the impact of the baryons using η, defined as the ratio of dark matter mass enclosed in 100 pc in the baryonic run to its counterpart without baryons. During this epoch of rapid growth of galaxies, we find that many haloes of these first galaxies show an enhancement of dark matter in the halo centre compared to the baryon-free simulation, while many others show a deficit. We find that the mean value of η is close to unity, but there is a large dispersion, with a standard deviation of 0.677. The enhancement is cyclical in time and tracks the star formation cycle of the galaxy; as gas falls to the centre and forms stars, the dark matter moves in as well. Supernova (SN) feedback then removes the gas, and the dark matter again responds to the changing potential. We study three physical models relating the motion of baryons to that of the dark matter: adiabatic contraction, dynamical friction, and rapid outflows. We find that dynamical friction plays only a very minor role, while adiabatic contraction and the rapid outflows due to feedback describe well the enhancement (or decrement) of dark matter. For haloes which show significant decrements of dark matter in the core, we find that to remove the dark matter requires an energy input between 1051 and 1053 erg. For our SN feedback proscription, this requires as a

  9. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for ''closure'' of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive ''eruption'' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  10. From data to probability densities without histograms

    NASA Astrophysics Data System (ADS)

    Berg, Bernd A.; Harris, Robert C.

    2008-09-01

    When one deals with data drawn from continuous variables, a histogram is often inadequate to display their probability density. It deals inefficiently with statistical noise, and binsizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which overcomes this problem. Error bars on the estimated probability density are calculated using a jackknife method. We give several examples and provide computer code reproducing them. You may want to look at the corresponding figures 4 to 9 first. Program summary: Program title: cdf_to_pd; Catalogue identifier: AEBC_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBC_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 2758; No. of bytes in distributed program, including test data, etc.: 18 594; Distribution format: tar.gz; Programming language: Fortran 77; Computer: Any capable of compiling and executing Fortran code; Operating system: Any capable of compiling and executing Fortran code; Classification: 4.14, 9. Nature of problem: When one deals with data drawn from continuous variables, a histogram is often inadequate to display the probability density. It deals inefficiently with statistical noise, and binsizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Solution method: Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which
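
    A minimal sketch of the general idea (a probability-integral transform with a fitted Gaussian reference CDF, a sine-series fit of the CDF residual, and differentiation of that fit), omitting the Kolmogorov-test selection of the truncation order and the jackknife error bars described in the paper:

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      data = np.sort(rng.gamma(3.0, 1.0, 2000))     # continuous sample to be described

      # Probability-integral transform with a fitted Gaussian as reference CDF F0.
      mu, sigma = data.mean(), data.std(ddof=1)
      u = norm.cdf(data, mu, sigma)                 # approximately uniform if F0 were exact
      F_emp = (np.arange(1, data.size + 1) - 0.5) / data.size
      resid = F_emp - u                             # CDF residual, close to 0 at both ends

      # Sine-series fit of the residual on a uniform grid in u.
      m = 8                                         # truncation order (fixed here; the paper
      k = np.arange(1, m + 1)                       # selects it with Kolmogorov tests)
      grid = np.linspace(0.0, 1.0, 513)
      R = np.interp(grid, u, resid)
      b = 2.0 * np.trapz(np.sin(np.pi * np.outer(k, grid)) * R, grid, axis=1)

      def pdf(x):
          """Smooth density estimate f(x) = f0(x) * (1 + R'(F0(x)))."""
          ux = norm.cdf(np.atleast_1d(x), mu, sigma)
          slope = np.ones_like(ux)
          for kk, bk in zip(k, b):
              slope += bk * kk * np.pi * np.cos(kk * np.pi * ux)
          return norm.pdf(np.atleast_1d(x), mu, sigma) * slope

      print(pdf(np.linspace(0.5, 8.0, 4)))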

  11. Nuclear data uncertainties: I, Basic concepts of probability

    SciTech Connect

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
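
    As one worked illustration of the listed topics (Bayes' theorem in particular), the short Python sketch below updates a prior with a likelihood; the numbers are invented for illustration and are not taken from the report.

      def bayes_posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
          """Bayes' theorem for a binary hypothesis H given evidence E:
          P(H|E) = P(E|H) P(H) / (P(E|H) P(H) + P(E|~H) P(~H))."""
          numerator = p_evidence_given_h * prior
          return numerator / (numerator + p_evidence_given_not_h * (1.0 - prior))

      # Example: a screening measurement flags a rare contaminant (prior 1%)
      # with 95% sensitivity and a 10% false-positive rate.
      print(bayes_posterior(prior=0.01, p_evidence_given_h=0.95, p_evidence_given_not_h=0.10))
      # approximately 0.088, i.e. a positive flag still leaves only ~9% posterior probability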

  12. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems in contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g., Poisson, periodic) or, conversely, delicately designed (e.g., STEP, ETAS) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice being communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  13. Approximate probability distributions of the master equation

    NASA Astrophysics Data System (ADS)

    Thomas, Philipp; Grima, Ramon

    2015-07-01

    Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.

  14. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction and is less certain the farther in advance the prediction, however. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
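
    The sketch below illustrates the basic construction described above (summing the two covariances into a single relative-position covariance and evaluating the probability that the separation falls below a threshold); for brevity it uses a Monte Carlo estimate in the horizontal plane instead of the analytical solution of the paper, and all numbers and names are illustrative.

      import numpy as np

      def conflict_probability(rel_pos, cov1, cov2, sep_radius, n_samples=200_000, seed=1):
          """Probability that the predicted relative position, perturbed by the combined
          Gaussian prediction error, falls within the required separation radius."""
          cov = np.asarray(cov1) + np.asarray(cov2)       # combined relative covariance
          rng = np.random.default_rng(seed)
          errors = rng.multivariate_normal(np.zeros(2), cov, size=n_samples)
          miss = np.linalg.norm(np.asarray(rel_pos) + errors, axis=1)
          return float(np.mean(miss < sep_radius))

      # Example: 6 nmi predicted miss distance, 5 nmi required separation (units arbitrary)
      pc = conflict_probability(rel_pos=[6.0, 0.0],
                                cov1=[[4.0, 0.0], [0.0, 1.0]],
                                cov2=[[4.0, 0.0], [0.0, 1.0]],
                                sep_radius=5.0)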

  15. Continuum ionization transition probabilities of atomic oxygen

    NASA Technical Reports Server (NTRS)

    Samson, J. R.; Petrosky, V. E.

    1973-01-01

    The technique of photoelectron spectroscopy was used to obtain the relative continuum transition probabilities of atomic oxygen at 584 Å for transitions from the 3P ground state into the 4S, 2D, and 2P states of the ion. Transition probability ratios for the 2D and 2P states relative to the 4S state of the ion are 1.57 ± 0.14 and 0.82 ± 0.07, respectively. In addition, transitions from the excited O2(a 1Δg) state into O2(+)(2Φu and 2Δg) were observed. The adiabatic ionization potential of O2(+)(2Δg) was measured as 18.803 ± 0.006 eV.

  16. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
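
    A hedged sketch of the resampling idea is shown below: a two-dimensional Pc is evaluated by grid integration, and the inputs (covariance scale and hard-body radius) are redrawn from assumed uncertainty distributions to produce a spread of Pc values; the distributions, percentiles and example numbers are illustrative, not those of the paper.

      import numpy as np

      def pc_2d(miss, cov, radius, n=201):
          """Point estimate of the 2-D probability of collision: integrate the Gaussian
          density of the relative position over the hard-body disk."""
          xs = np.linspace(-radius, radius, n)
          X, Y = np.meshgrid(xs, xs)
          inside = X**2 + Y**2 <= radius**2
          d = np.stack([X - miss[0], Y - miss[1]], axis=-1)
          inv, det = np.linalg.inv(cov), np.linalg.det(cov)
          dens = np.exp(-0.5 * np.einsum('...i,ij,...j->...', d, inv, d)) / (2 * np.pi * np.sqrt(det))
          return float(np.sum(dens[inside]) * (xs[1] - xs[0])**2)

      def pc_with_uncertainty(miss, cov, radius, n_draws=500, seed=4):
          """Resample assumed input uncertainties to obtain a distribution of Pc values."""
          rng = np.random.default_rng(seed)
          draws = [pc_2d(miss,
                         np.asarray(cov) * rng.lognormal(0.0, 0.3),   # covariance scale uncertainty
                         radius * rng.uniform(0.8, 1.2))              # hard-body size uncertainty
                   for _ in range(n_draws)]
          return np.percentile(draws, [5, 50, 95])

      low, median, high = pc_with_uncertainty(miss=[0.2, 0.1],
                                              cov=[[0.04, 0.0], [0.0, 0.01]], radius=0.02)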

  17. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, ..., m_i10) denote the credit ratings of the ten companies in the i-th quarter. The vector m_i+1 in the next quarter is modelled as dependent on the vector m_i via a conditional distribution derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) of getting m_i+1,j = l given that m_i,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies gives an indication of the possible transition of the credit rating of the j-th company in the near future.
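
    For comparison with the model-based estimate described above, the sketch below computes the plain count-based (maximum-likelihood) transition matrix from observed rating sequences; it is not the power-normal mixture approach of the paper, and the toy ratings are invented.

      import numpy as np

      def empirical_transition_matrix(rating_series, n_states):
          """Count k -> l quarter-to-quarter transitions over all companies and
          normalize each row to obtain transition probabilities P[k, l]."""
          counts = np.zeros((n_states, n_states))
          for series in rating_series:
              for k, l in zip(series[:-1], series[1:]):
                  counts[k, l] += 1
          rows = counts.sum(axis=1, keepdims=True)
          return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

      # Example: three rating classes (0, 1, 2) for two hypothetical companies
      P = empirical_transition_matrix([[0, 0, 1, 1, 2], [1, 1, 0, 1, 2]], n_states=3)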

  18. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  19. Volcano shapes, entropies, and eruption probabilities

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
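
    The two entropy measures mentioned above are easy to state concretely; the sketch below computes the Gibbs-Shannon entropy of a binned (discrete) eruption-frequency distribution and the differential entropy of a normal profile with spread sigma, with the example numbers chosen purely for illustration.

      import numpy as np

      def shannon_entropy(p):
          """Gibbs-Shannon entropy (in nats) of a discrete distribution, S = -sum p ln p."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0] / p.sum()
          return float(-np.sum(p * np.log(p)))

      def normal_differential_entropy(sigma):
          """Differential entropy (in nats) of a normal curve: 0.5 * ln(2*pi*e*sigma^2)."""
          return 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)

      print(shannon_entropy(np.ones(10)))      # flat field, maximal uncertainty, ln(10) ~ 2.30
      print(shannon_entropy([1, 2, 6, 2, 1]))  # peaked edifice profile, smaller entropy
      print(normal_differential_entropy(1.5))  # broader profile gives larger differential entropy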

  20. Assessment of potential oil and gas resources in source rocks of the Alaska North Slope, 2012

    USGS Publications Warehouse

    Houseknecht, David W.; Rouse, William A.; Garrity, Christopher P.; Whidden, Katherine J.; Dumoulin, Julie A.; Schenk, Christopher J.; Charpentier, Ronald R.; Cook, Troy A.; Gaswirth, Stephanie B.; Kirschbaum, Mark A.; Pollastro, Richard M.

    2012-01-01

    The U.S. Geological Survey estimated potential, technically recoverable oil and gas resources for source rocks of the Alaska North Slope. Estimates (95-percent to 5-percent probability) range from zero to 2 billion barrels of oil and from zero to nearly 80 trillion cubic feet of gas.

  1. A Massive Galaxy in Its Core Formation Phase Three Billion Years After the Big Bang

    NASA Technical Reports Server (NTRS)

    Nelson, Erica; van Dokkum, Pieter; Franx, Marijn; Brammer, Gabriel; Momcheva, Ivelina; Schreiber, Natascha M. Forster; da Cunha, Elisabete; Tacconi, Linda; Bezanson, Rachel; Kirkpatrick, Allison; Leja, Joel; Rix, Hans-Walter; Skelton, Rosalind; van der Wel, Arjen; Whitaker, Katherine; Wuyts, Stijn

    2014-01-01

    Most massive galaxies are thought to have formed their dense stellar cores at early cosmic epochs. However, cores in their formation phase have not yet been observed. Previous studies have found galaxies with high gas velocity dispersions or small apparent sizes but so far no objects have been identified with both the stellar structure and the gas dynamics of a forming core. Here we present a candidate core in formation 11 billion years ago, at z = 2.3. GOODS-N-774 has a stellar mass of 1.0 × 10^11 solar masses, a half-light radius of 1.0 kpc, and a star formation rate of 90 (+45/-20) solar masses per year. The star forming gas has a velocity dispersion of 317 ± 30 km/s, amongst the highest ever measured. It is similar to the stellar velocity dispersions of the putative descendants of GOODS-N-774: compact quiescent galaxies at z ≈ 2 (refs 8-11) and giant elliptical galaxies in the nearby Universe. Galaxies such as GOODS-N-774 appear to be rare; however, from the star formation rate and size of the galaxy we infer that many star forming cores may be heavily obscured, and could be missed in optical and near-infrared surveys.

  2. Sharing global CO2 emission reductions among one billion high emitters

    PubMed Central

    Chakravarty, Shoibal; Chikkatur, Ananth; de Coninck, Heleen; Pacala, Stephen; Socolow, Robert; Tavoni, Massimo

    2009-01-01

    We present a framework for allocating a global carbon reduction target among nations, in which the concept of “common but differentiated responsibilities” refers to the emissions of individuals instead of nations. We use the income distribution of a country to estimate how its fossil fuel CO2 emissions are distributed among its citizens, from which we build up a global CO2 distribution. We then propose a simple rule to derive a universal cap on global individual emissions and find corresponding limits on national aggregate emissions from this cap. All of the world's high CO2-emitting individuals are treated the same, regardless of where they live. Any future global emission goal (target and time frame) can be converted into national reduction targets, which are determined by “Business as Usual” projections of national carbon emissions and in-country income distributions. For example, reducing projected global emissions in 2030 by 13 GtCO2 would require the engagement of 1.13 billion high emitters, roughly equally distributed in 4 regions: the U.S., the OECD minus the U.S., China, and the non-OECD minus China. We also modify our methodology to place a floor on emissions of the world's lowest CO2 emitters and demonstrate that climate mitigation and alleviation of extreme poverty are largely decoupled. PMID:19581586
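
    A minimal sketch of the capping rule described above: given per-capita emissions (however they were reconstructed from national totals and income distributions), a bisection search finds the universal individual cap whose excess emissions sum to the required global reduction. The lognormal example data and the function name are assumptions for illustration only.

      import numpy as np

      def universal_cap(per_capita_emissions, required_reduction, tol=1e-6):
          """Find the cap c such that trimming every individual emission above c
          removes exactly the required total reduction (bisection on c)."""
          e = np.asarray(per_capita_emissions, dtype=float)
          lo, hi = 0.0, float(e.max())
          while hi - lo > tol:
              c = 0.5 * (lo + hi)
              removed = np.sum(np.clip(e - c, 0.0, None))   # emissions above the cap
              if removed > required_reduction:
                  lo = c          # cap too low removes too much; raise it
              else:
                  hi = c          # cap too high removes too little; lower it
          return 0.5 * (lo + hi)

      # Example: one million synthetic individuals, trim 10% of total emissions
      emissions = np.random.default_rng(5).lognormal(mean=1.0, sigma=1.0, size=1_000_000)
      cap = universal_cap(emissions, required_reduction=0.1 * emissions.sum())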

  3. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone

    USGS Publications Warehouse

    Lowenstern, Jacob B.; Evans, William C.; Bergfeld, D.; Hunt, Andrew G.

    2014-01-01

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  4. Half a billion years of good weather: Gaia or good luck?

    NASA Astrophysics Data System (ADS)

    Waltham, Dave

    2007-06-01

    For the past 550 million years, Earth has had a relatively stable climate, with average global temperatures generally fluctuating by less than 10°C from the present value of around 15°C. In the preceding 4 billion years, temperature fluctuations were almost an order of magnitude greater. One explanation for climate stability is that the biosphere evolves to maintain optimum conditions for life (the Gaia hypothesis). But this stability could also result from luck and, without such good fortune, conditions on Earth would have been unsuitable for the evolution of complex life: anthropic selection, in other words. One element of such good luck concerns the climatic impact of the Moon; the properties of the Earth-Moon system only just allow a stable rotation axis for the Earth (considered a prerequisite for climate stability and the evolution of complex life). Axial stability also requires Jupiter and Saturn to be widely spaced, offering a test of the rarity or otherwise of the solar system arrangement among exoplanet systems. Gravitational microlensing surveys should allow this to be tested within a decade.

  5. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone.

    PubMed

    Lowenstern, J B; Evans, W C; Bergfeld, D; Hunt, A G

    2014-02-20

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions. PMID:24553240

  6. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    PubMed Central

    Bell, Elizabeth A.; Harrison, T. Mark; Mao, Wendy L.

    2015-01-01

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼3.5 billion years (Ga), the chemofossil record arguably to ∼3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of −24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼300 My earlier than has been previously proposed. PMID:26483481

  7. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    DOE PAGESBeta

    Bell, Elizabeth A.; Boehnke, Patrick; Harrison, T. Mark; Mao, Wendy L.

    2015-10-19

    Here, evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ~3.5 billion years (Ga), the chemofossil record arguably to ~3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of –24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ~300 My earlier than has been previously proposed.

  8. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon

    SciTech Connect

    Bell, Elizabeth A.; Boehnke, Patrick; Harrison, T. Mark; Mao, Wendy L.

    2015-10-19

    Here, evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ~3.5 billion years (Ga), the chemofossil record arguably to ~3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ13CPDB of –24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ~300 My earlier than has been previously proposed.

  9. If slow rate of health care spending growth persists, projections may be off by $770 billion.

    PubMed

    Cutler, David M; Sahni, Nikhil R

    2013-05-01

    Despite earlier forecasts to the contrary, US health care spending growth has slowed in the past four years, continuing a trend that began in the early 2000s. In this article we attempt to identify why US health care spending growth has slowed, and we explore the spending implications if the trend continues for the next decade. We find that the 2007-09 recession, a one-time event, accounted for 37 percent of the slowdown between 2003 and 2012. A decline in private insurance coverage and cuts to some Medicare payment rates accounted for another 8 percent of the slowdown, leaving 55 percent of the spending slowdown unexplained. We conclude that a host of fundamental changes--including less rapid development of imaging technology and new pharmaceuticals, increased patient cost sharing, and greater provider efficiency--were responsible for the majority of the slowdown in spending growth. If these trends continue during 2013-22, public-sector health care spending will be as much as $770 billion less than predicted. Such lower levels of spending would have an enormous impact on the US economy and on government and household finances. PMID:23650316

  10. How to make a billion-barrel oil field in offshore California commercial

    SciTech Connect

    Patterson, J.C.; Ballard, J.H.

    1988-01-01

    The major obstacles and challenges involved in exploration and development of a giant deep-water low-gravity oil field are exemplified in the undeveloped Sword field of offshore southern California. In 1979, Conoco Exploration identified a northeast-southwest-trending basement high in the 800 to 2,000-ft deep federal waters 12 mi southwest of Pt. Conception at the western end of the Santa Barbara Channel. The intended reservoir was fractured Miocene Monterey chert, silicic shales/siltstones, and dolomites that are draped over the axially faulted structure. Drilling of the initial well in OCS P-0322 in 1982 resulted in discovering the giant Sword field. A confirmation well drilled in OCS P-0320 indicates in-place reserves of well over 1 billion bbl. While the discovered potential is significant, the low gravity (8.5°-10.5° API) of the oils discovered to date, along with water depths in excess of 1,500 ft, currently pose economic challenges to successful field development.

  11. Rapid oxygenation of Earth’s atmosphere 2.33 billion years ago

    PubMed Central

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J.; Wang, David T.; Xie, Shucheng; Summons, Roger E.

    2016-01-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth’s biogeochemical cycles. Although “whiffs” of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition (that is, from being mass-independent to becoming mass-dependent) in multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly—within 1 to 10 million years—and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among GOE, “Snowball Earth” glaciation, and biogeochemical cycling will require further stratigraphic correlation supported with precise chronologies and paleolatitude reconstructions. PMID:27386544

  12. Constraints on the first billion years of the geodynamo from paleointensity studies of zircons

    NASA Astrophysics Data System (ADS)

    Tarduno, John; Cottrell, Rory; Davis, William

    2014-05-01

    Several lines of reasoning, including new ideas on core thermal conductivity, suggest that onset of a strong geomagnetic field might have been delayed by one billion years (or more) after the lunar forming event. Here we extend the Proterozoic/Archean to Paleoarchean record of the geomagnetic field constrained by single crystal paleointensity (SCP) analyses (Tarduno et al., Science, 2010) to older times using zircons containing minute magnetic inclusions. Specifically, we focus on samples from the Jack Hills (Yilgarn Craton, Western Australia). We employ a CO2 laser demagnetization system and a small bore (6.3 mm) 3-component DC SQUID magnetometer; the latter offers the highest currently available moment resolution. Sample age is analyzed using SHRIMP U-Pb geochronology. Preliminary data support the presence of a relatively strong Paleoarchean field produced by a core dynamo, extending the known record by at least 100 million years, to approximately 3.55 Ga. These data only serve to exacerbate the apparent problem posed by the presence of a Paleoarchean dynamo. Alternative dynamo driving mechanisms, or efficient core/lowermost mantle heat loss processes unique to the Paleoarchean (and older times) might have been at work. We will discuss these processes, and our efforts to study even older Eoarchean-Hadean zircons.

  13. Ballography: A Billion Nanosecond History of the Bee Bluff Impact Crater of South Texas

    NASA Astrophysics Data System (ADS)

    Graham, R. A.

    2006-07-01

    The Bee Bluff Structure of South Texas in Zavala County near Uvalde has been found to exhibit unusual features permitting study of impactites and meteorite impact processes from the standpoint of grain-level, nanosecond shock-compression science. The site is characterized by a thin cap of Carrizo Sandstone covering a thin hard Indio fm calcareous siltstone. A soft calcareous silt lies below the hard cap. Calculations based on the Earth Impact Effects web-based program indicate that the site is best described by a 60 m diameter iron meteorite striking the ground at 11 km/sec. Such an impact into sandstone is expected to produce a shock pressure of 250 GPa. A large release wave originates from the bottom of the hard target with upward moving melt-vaporization waves of solid, liquid and vapor products that become trapped at the impact interface. Numerous distinctive types of impactites result from this "bottom-up" release behavior. Evidence for hydrodynamic instabilities and resulting density gradients are abundant at the impact interface. An unusually valuable breccia sample called "The Uvalde Crater Rosetta Stone" contains at least seven types of impactites in a well defined arrangement that can be used to read the billion nanosecond history of the impact and identify scattered impactites relative to their place in that history.

  14. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes.

    PubMed

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D

    2014-01-01

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments. PMID:25068404

  15. A large neutral fraction of cosmic hydrogen a billion years after the Big Bang.

    PubMed

    Wyithe, J Stuart B; Loeb, Abraham

    2004-02-26

    The fraction of ionized hydrogen left over from the Big Bang provides evidence for the time of formation of the first stars and quasar black holes in the early Universe; such objects provide the high-energy photons necessary to ionize hydrogen. Spectra of the two most distant known quasars show nearly complete absorption of photons with wavelengths shorter than the Lyman alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift of z approximately 6.3, about one billion years after the Big Bang. Here we show that the IGM surrounding these quasars had a neutral hydrogen fraction of tens of per cent before the quasar activity started, much higher than the previous lower limits of approximately 0.1 per cent. Our results, when combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination therefore suggest the presence of a second peak in the mean ionization history of the Universe. PMID:14985754

  16. Providing safe drinking water to 1.2 billion unserved people

    SciTech Connect

    Gadgil, Ashok J.; Derby, Elisabeth A.

    2003-06-01

    Despite substantial advances in the past 100 years in public health, technology and medicine, 20% of the world population, mostly comprised of the poor population segments in developing countries (DCs), still does not have access to safe drinking water. To reach the United Nations (UN) Millennium Goal of halving the number of people without access to safe water by 2015, the global community will need to provide an additional one billion urban residents and 600 million rural residents with safe water within the next twelve years. This paper examines current water treatment measures and implementation methods for delivery of safe drinking water, and offers suggestions for making progress towards the goal of providing a timely and equitable solution for safe water provision. For water treatment, based on the serious limitations of boiling water and chlorination, we suggest an approach based on filtration coupled with ultraviolet (UV) disinfection, combined with public education. Additionally, owing to the capacity limitations for non-governmental organizations (NGOs) to take on this task primarily on their own, we suggest a strategy based on financially sustainable models that include the private sector as well as NGOs.

  17. Potentially biogenic carbon preserved in a 4.1 billion-year-old zircon.

    PubMed

    Bell, Elizabeth A; Boehnke, Patrick; Harrison, T Mark; Mao, Wendy L

    2015-11-24

    Evidence of life on Earth is manifestly preserved in the rock record. However, the microfossil record only extends to ∼ 3.5 billion years (Ga), the chemofossil record arguably to ∼ 3.8 Ga, and the rock record to 4.0 Ga. Detrital zircons from Jack Hills, Western Australia range in age up to nearly 4.4 Ga. From a population of over 10,000 Jack Hills zircons, we identified one >3.8-Ga zircon that contains primary graphite inclusions. Here, we report carbon isotopic measurements on these inclusions in a concordant, 4.10 ± 0.01-Ga zircon. We interpret these inclusions as primary due to their enclosure in a crack-free host as shown by transmission X-ray microscopy and their crystal habit. Their δ(13)CPDB of -24 ± 5‰ is consistent with a biogenic origin and may be evidence that a terrestrial biosphere had emerged by 4.1 Ga, or ∼ 300 My earlier than has been previously proposed. PMID:26483481

  18. Prodigious degassing of a billion years of accumulated radiogenic helium at Yellowstone

    NASA Astrophysics Data System (ADS)

    Lowenstern, J. B.; Evans, W. C.; Bergfeld, D.; Hunt, A. G.

    2014-02-01

    Helium is used as a critical tracer throughout the Earth sciences, where its relatively simple isotopic systematics is used to trace degassing from the mantle, to date groundwater and to time the rise of continents. The hydrothermal system at Yellowstone National Park is famous for its high helium-3/helium-4 isotope ratio, commonly cited as evidence for a deep mantle source for the Yellowstone hotspot. However, much of the helium emitted from this region is actually radiogenic helium-4 produced within the crust by α-decay of uranium and thorium. Here we show, by combining gas emission rates with chemistry and isotopic analyses, that crustal helium-4 emission rates from Yellowstone exceed (by orders of magnitude) any conceivable rate of generation within the crust. It seems that helium has accumulated for (at least) many hundreds of millions of years in Archaean (more than 2.5 billion years old) cratonic rocks beneath Yellowstone, only to be liberated over the past two million years by intense crustal metamorphism induced by the Yellowstone hotspot. Our results demonstrate the extremes in variability of crustal helium efflux on geologic timescales and imply crustal-scale open-system behaviour of helium in tectonically and magmatically active regions.

  19. Enhanced cellular preservation by clay minerals in 1 billion-year-old lakes

    NASA Astrophysics Data System (ADS)

    Wacey, David; Saunders, Martin; Roberts, Malcolm; Menon, Sarath; Green, Leonard; Kong, Charlie; Culwick, Timothy; Strother, Paul; Brasier, Martin D.

    2014-07-01

    Organic-walled microfossils provide the best insights into the composition and evolution of the biosphere through the first 80 percent of Earth history. The mechanism of microfossil preservation affects the quality of biological information retained and informs understanding of early Earth palaeo-environments. We here show that 1 billion-year-old microfossils from the non-marine Torridon Group are remarkably preserved by a combination of clay minerals and phosphate, with clay minerals providing the highest fidelity of preservation. Fe-rich clay mostly occurs in narrow zones in contact with cellular material and is interpreted as an early microbially-mediated phase enclosing and replacing the most labile biological material. K-rich clay occurs within and exterior to cell envelopes, forming where the supply of Fe had been exhausted. Clay minerals inter-finger with calcium phosphate that co-precipitated with the clays in the sub-oxic zone of the lake sediments. This type of preservation was favoured in sulfate-poor environments where Fe-silicate precipitation could outcompete Fe-sulfide formation. This work shows that clay minerals can provide an exceptionally high fidelity of microfossil preservation and extends the known geological range of this fossilization style by almost 500 Ma. It also suggests that the best-preserved microfossils of this time may be found in low-sulfate environments.

  20. Rapid oxygenation of Earth's atmosphere 2.33 billion years ago.

    PubMed

    Luo, Genming; Ono, Shuhei; Beukes, Nicolas J; Wang, David T; Xie, Shucheng; Summons, Roger E

    2016-05-01

    Molecular oxygen (O2) is, and has been, a primary driver of biological evolution and shapes the contemporary landscape of Earth's biogeochemical cycles. Although "whiffs" of oxygen have been documented in the Archean atmosphere, substantial O2 did not accumulate irreversibly until the Early Paleoproterozoic, during what has been termed the Great Oxygenation Event (GOE). The timing of the GOE and the rate at which this oxygenation took place have been poorly constrained until now. We report the transition (that is, from being mass-independent to becoming mass-dependent) in multiple sulfur isotope signals of diagenetic pyrite in a continuous sedimentary sequence in three coeval drill cores in the Transvaal Supergroup, South Africa. These data precisely constrain the GOE to 2.33 billion years ago. The new data suggest that the oxygenation occurred rapidly-within 1 to 10 million years-and was followed by a slower rise in the ocean sulfate inventory. Our data indicate that a climate perturbation predated the GOE, whereas the relationships among GOE, "Snowball Earth" glaciation, and biogeochemical cycling will require further stratigraphic correlation supported with precise chronologies and paleolatitude reconstructions. PMID:27386544

  1. Analysis of precious metals at parts-per-billion levels in industrial applications

    NASA Astrophysics Data System (ADS)

    Tickner, James; O'Dwyer, Joel; Roach, Greg; Smith, Michael; Van Haarlem, Yves

    2015-11-01

    Precious metals, including gold and the platinum group metals (notably Pt, Pd and Rh), are mined commercially at concentrations of a few parts-per-million and below. Mining and processing operations demand sensitive and rapid analysis at concentrations down to about 100 parts-per-billion (ppb). In this paper, we discuss two technologies being developed to meet this challenge: X-ray fluorescence (XRF) and gamma-activation analysis (GAA). We have designed on-stream XRF analysers capable of measuring targeted elements in slurries with precisions in the 35-70 ppb range. For the past two years, two on-stream analysers have been in continuous operation at a precious metals concentrator plant. The simultaneous measurement of feed and waste stream grades provides real-time information on metal recovery, allowing changes in operating conditions and plant upsets to be detected and corrected more rapidly. Separately, we have been developing GAA for the measurement of gold as a replacement for the traditional laboratory fire-assay process. High-energy Bremsstrahlung X-rays are used to excite gold via the 197Au(γ,γ′)197Au-M reaction, and the gamma-rays released in the decay of the meta-state are then counted. We report on work to significantly improve accuracy and detection limits.

  2. Full-sky weak-lensing simulation with 70 billion particles

    NASA Astrophysics Data System (ADS)

    Teyssier, R.; Pires, S.; Prunet, S.; Aubert, D.; Pichon, C.; Amara, A.; Benabed, K.; Colombi, S.; Refregier, A.; Starck, J.-L.

    2009-04-01

    We have performed a 70 billion dark-matter particle N-body simulation in a 2 h⁻¹ Gpc periodic box, using the concordance cosmological model as favored by the latest WMAP3 results. We have computed a full-sky convergence map with a resolution of Δθ ≃ 0.74 arcmin², spanning 4 orders of magnitude in angular dynamical range. Using various high-order statistics on a realistic cut sky, we have characterized the transition from the linear to the nonlinear regime at ℓ ≃ 1000 and shown that realistic galactic masking affects high-order moments only below ℓ < 200. Each domain (Gaussian and non-Gaussian) spans 2 decades in angular scale. This map is therefore an ideal tool for testing map-making algorithms on the sphere. As a first step in addressing the full map reconstruction problem, we have benchmarked in this paper two denoising methods: 1) Wiener filtering applied to the Spherical Harmonics decomposition of the map and 2) a new method, called MRLens, based on the modification of the Maximum Entropy Method on a Wavelet decomposition. While the latter is optimal on large spatial scales, where the signal is Gaussian, MRLens outperforms the Wiener method on small spatial scales, where the signal is highly non-Gaussian. The simulated full-sky convergence map is freely available to the community to help the development of new map-making algorithms dedicated to the next generation of weak-lensing surveys.

  3. Large data analysis: automatic visual personal identification in a demography of 1.2 billion persons

    NASA Astrophysics Data System (ADS)

    Daugman, John

    2014-05-01

    The largest biometric deployment in history is now underway in India, where the Government is enrolling the iris patterns (among other data) of all 1.2 billion citizens. The purpose of the Unique Identification Authority of India (UIDAI) is to ensure fair access to welfare benefits and entitlements, to reduce fraud, and enhance social inclusion. Only a minority of Indian citizens have bank accounts; only 4 percent possess passports; and less than half of all aid money reaches its intended recipients. A person who lacks any means of establishing their identity is excluded from entitlements and does not officially exist; thus the slogan of UIDAI is: "To give the poor an identity." This ambitious program enrolls a million people every day, across 36,000 stations run by 83 agencies, with a 3-year completion target for the entire national population. The halfway point was recently passed with more than 600 million persons now enrolled. In order to detect and prevent duplicate identities, every iris pattern that is enrolled is first compared against all others enrolled so far; thus the daily workflow now requires 600 trillion (or 600 million-million) iris cross-comparisons. Avoiding identity collisions (False Matches) requires high biometric entropy, and achieving the tremendous match speed requires phase bit coding. Both of these requirements are being delivered operationally by wavelet methods developed by the author for encoding and comparing iris patterns, which will be the focus of this "Large Data Award" presentation.
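
    The quoted workload follows from simple arithmetic, and the core comparison is a masked fractional Hamming distance between phase-bit codes; the sketch below reproduces both, with all array sizes and thresholds being illustrative assumptions rather than UIDAI parameters.

      import numpy as np

      # Back-of-envelope check of the quoted daily workload
      daily_enrollments = 1_000_000
      already_enrolled = 600_000_000
      print(daily_enrollments * already_enrolled)   # 6e14, i.e. 600 trillion comparisons per day

      def fractional_hamming_distance(code_a, code_b, mask_a, mask_b):
          """Fraction of jointly valid phase bits on which two iris codes disagree."""
          valid = mask_a & mask_b
          return float(((code_a ^ code_b) & valid).sum() / valid.sum())

      rng = np.random.default_rng(6)
      a = rng.integers(0, 2, 2048).astype(bool)     # illustrative 2048-bit iris codes
      b = rng.integers(0, 2, 2048).astype(bool)
      m = np.ones(2048, dtype=bool)
      print(fractional_hamming_distance(a, b, m, m))  # ~0.5 for unrelated codes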

  4. A billion years of environmental stability and the emergence of eukaryotes: new data from northern Australia.

    PubMed

    Brasier, M D; Lindsay, J F

    1998-06-01

    Carbon isotopes through 6 km of fully cored drill holes in 1.7 to 1.5 Ga carbonates of the Mount Isa and McArthur basins, Australia (which host the earliest known eukaryote biomarkers) provide the most comprehensive and best-dated δ13C stratigraphy yet obtained from such ancient rocks. Both basins reveal remarkably stable temporal δ13C trends (mean of -0.6% ± 2% PDB [Peedee belemnite]) and confirm the impression of δ13C stasis between 2.0 and 1.0 Ga, which, together with other evidence, suggest a prolonged period of stability in crustal dynamics, redox state of surface environments, and planetary climate. This δ13C stasis is consistent with great stability in the carbon cycle controlled, we suggest, by P limitation of primary productivity. Recent evidence shows that P depletion is a major factor in obligate associations between photosymbionts and host cells. We argue that a billion years of stability in the carbon and nutrient cycles may have been the driving force that propelled prokaryotes toward photosymbiosis and the emergence of the autotrophic eukaryote cell. PMID:11541449

  5. Probability and Statistics in Aerospace Engineering

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  6. Investigation of Flood Inundation Probability in Taiwan

    NASA Astrophysics Data System (ADS)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

    Taiwan is located at a special point in the path of typhoons from the northeast Pacific Ocean, and it is also situated in a tropical-subtropical transition zone. As a result, rainfall is abundant all year round, especially in summer and autumn. For flood inundation analysis in Taiwan, there are many uncertainties in hydrological, hydraulic and land-surface topography characteristics, which can change flood inundation characteristics. According to the 7th work item of Article 22 in the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from worsening, investigation and analysis of disaster potentials, hazard levels and situation simulation must be carried out with scientific approaches. However, the existing flood potential analysis uses a deterministic approach to delineate flood inundation without considering data uncertainties. This research incorporates the concept of data uncertainty into flood inundation maps to show the flood probability of each grid cell. Such maps can serve as a basis for emergency evacuation when typhoons approach and extremely torrential rain begins. The research selects the Hebauyu watershed of Chiayi County as the demonstration area. Owing to the uncertainties of the data used, a sensitivity analysis is first conducted using Latin Hypercube sampling (LHS). The LHS data sets are next input into an integrated numerical model, developed herein to assess flood inundation hazards in coastal lowlands, based on the extension of a 1-D river routing model and a 2-D inundation routing model. Finally, the probability of flood inundation is calculated, and flood inundation probability maps are obtained. Flood inundation probability maps can replace the older flood potential maps as a reference for building new hydraulic infrastructure in the future.
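
    The Latin Hypercube sampling step is sketched below in Python; the three uncertain inputs named in the example are illustrative placeholders for whatever hydrological, hydraulic and topographic parameters are perturbed, and the inundation-probability step (fraction of runs in which a grid cell floods) is described only in the comment.

      import numpy as np

      def latin_hypercube(n_samples, n_vars, seed=0):
          """Latin Hypercube sample on the unit hypercube: each variable's range is
          split into n_samples equal strata and every stratum is used exactly once."""
          rng = np.random.default_rng(seed)
          u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
          for j in range(n_vars):
              u[:, j] = rng.permutation(u[:, j])     # decouple the strata across variables
          return u

      # Example: 100 parameter sets for, say, (roughness factor, rainfall factor, DEM error);
      # each set is fed to the 1-D/2-D routing model, and the inundation probability of a
      # grid cell is the fraction of the 100 runs in which that cell is flooded.
      samples = latin_hypercube(100, 3)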

  7. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a javascript object notation format.
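
    The sampling step can be sketched independently of the mammography specifics: draw flattened bin indices from a normalized empirical histogram and unravel them back to multi-dimensional bin coordinates. The 4-D toy histogram below stands in for the measured lesion-location distribution and is not real data.

      import numpy as np

      def sample_from_histogram(hist, n, seed=0):
          """Draw n locations (as bin indices) from an empirical probability histogram."""
          rng = np.random.default_rng(seed)
          p = hist.ravel() / hist.sum()
          flat = rng.choice(p.size, size=n, p=p)
          return np.column_stack(np.unravel_index(flat, hist.shape))

      # Example: toy 4-D histogram over (x_CC, y_CC, x_MLO, y_MLO) bins
      toy_hist = np.random.default_rng(1).random((8, 8, 8, 8))
      locations = sample_from_histogram(toy_hist, n=5)   # 5 sampled lesion bin positions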

  8. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-04-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu's and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, it allows a straightforward generalization to more complicated systems.

  9. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-09-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu's and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, it allows a straightforward generalization to more complicated systems.

  10. Continuum ionization transition probabilities of atomic oxygen

    NASA Technical Reports Server (NTRS)

    Samson, J. A. R.; Petrosky, V. E.

    1974-01-01

    The technique of photoelectron spectroscopy was employed in the investigation. Atomic oxygen was produced in a microwave discharge operating at a power of 40 W and at a pressure of approximately 20 mtorr. The photoelectron spectrum of the oxygen with and without the discharge is shown. The atomic states can be clearly seen. In connection with the measurement of the probability for transitions into the various ionic states, the analyzer collection efficiency was determined as a function of electron energy.

  11. Computational methods for probability of instability calculations

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based upon the roots of the characteristic equation or Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
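
    A plain Monte Carlo version of the calculation (not the importance-sampling reliability methods developed in the paper) is sketched below for a single-degree-of-freedom system m x'' + c x' + k x = 0: parameter sets are sampled and the roots of the characteristic equation are checked for positive real parts. The parameter distributions are invented for illustration.

      import numpy as np

      def probability_of_instability(sample_params, n_samples=20_000, seed=2):
          """Fraction of sampled (m, c, k) sets whose characteristic roots of
          m s^2 + c s + k = 0 have a positive real part (i.e. unstable systems)."""
          rng = np.random.default_rng(seed)
          unstable = 0
          for _ in range(n_samples):
              m, c, k = sample_params(rng)
              roots = np.roots([m, c, k])
              unstable += bool(np.any(roots.real > 0.0))
          return unstable / n_samples

      # Example: damping coefficient normally distributed and occasionally negative
      p_unstable = probability_of_instability(lambda rng: (1.0, rng.normal(0.05, 0.05), 4.0))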

  12. SureTrak Probability of Impact Display

    NASA Technical Reports Server (NTRS)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  13. Classical probabilities for Majorana and Weyl spinors

    SciTech Connect

    Wetterich, C.

    2011-08-15

    Highlights: • Map of classical statistical Ising model to fermionic quantum field theory. • Lattice-regularized real Grassmann functional integral for single Weyl spinor. • Emerging complex structure characteristic for quantum physics. • A classical statistical ensemble describes a quantum theory. Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  14. Chemisorptive electron emission versus sticking probability

    NASA Astrophysics Data System (ADS)

    Böttcher, Artur; Niehus, Horst

    2001-07-01

    The chemisorption of N2O on thin Cs films has been studied by monitoring the time evolution of the sticking probability as well as the kinetics of the low-energy electron emission. By combining the data sets, two time domains become distinguishable: the initial chemisorption stage is characterized by a high sticking probability (0.1 < s < 1), whereas the late stage shows a sticking probability of less than 0.01. Such evident anticoincidence between the exoemission and the chemisorption excludes the model of surface harpooning as the elementary process responsible for the electron emission in the late chemisorption stage. A long-term emission decay has also been observed after turning off the flux of chemisorbing molecules. A model is proposed that attributes both the late chemisorptive and the nonchemisorptive electron emission to the relaxation of a narrow state originating from an oxygen vacancy in the Cs oxide layer terminating the surface. The presence of such a state has been confirmed by metastable de-excitation spectroscopy [MDS, He*(2¹S)].

  15. The Probability Distribution of Daily Streamflow

    NASA Astrophysics Data System (ADS)

    Blum, A.; Vogel, R. M.

    2015-12-01

    Flow duration curves (FDCs) are a graphical illustration of the cumulative distribution of streamflow. Daily streamflows often range over many orders of magnitude, making it extremely challenging to find a probability distribution function (pdf) which can mimic the steady state or period of record FDC (POR-FDC). Median annual FDCs (MA-FDCs) describe the pdf of daily streamflow in a typical year. For POR- and MA-FDCs, L-moment diagrams, visual assessments of FDCs and Quantile-Quantile probability plot correlation coefficients are used to evaluate goodness of fit (GOF) of candidate probability distributions. FDCs reveal that both four-parameter kappa (KAP) and three-parameter generalized Pareto (GP3) models result in very high GOF for the MA-FDC and a relatively lower GOF for POR-FDCs at over 500 rivers across the coterminous U.S. Physical basin characteristics, such as baseflow index, as well as hydroclimatic indices, such as the aridity index and the runoff ratio, are found to be correlated with one of the shape parameters (kappa) of the KAP and GP3 pdfs. Our work also reveals several important areas for future research, including improved parameter estimators for the KAP pdf, as well as increasing our understanding of the conditions which give rise to improved GOF of analytical pdfs to large samples of daily streamflows.
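
    A minimal Python sketch of the kind of analysis described above: build an empirical flow duration curve from a daily series and compare it with a fitted three-parameter generalized Pareto (GP3), one of the candidate pdfs named in the abstract. The lognormal synthetic flow series is purely an illustrative stand-in for a real gauge record.

      import numpy as np
      from scipy import stats

      # Synthetic stand-in for a period-of-record daily streamflow series (m^3/s).
      rng = np.random.default_rng(0)
      flows = stats.lognorm(s=1.5, scale=20.0).rvs(size=365 * 30, random_state=rng)

      # Empirical flow duration curve: exceedance probability of each sorted flow,
      # using the Weibull plotting position i / (n + 1).
      q = np.sort(flows)[::-1]                      # descending flows
      exceedance = np.arange(1, q.size + 1) / (q.size + 1.0)

      # Fit a GP3 model and compare a few quantiles against the empirical FDC.
      c, loc, scale = stats.genpareto.fit(flows)
      for p in (0.01, 0.10, 0.50, 0.90, 0.99):
          q_emp = np.interp(p, exceedance, q)
          q_fit = stats.genpareto.isf(p, c, loc=loc, scale=scale)  # exceedance quantile
          print(f"exceedance {p:4.2f}: empirical {q_emp:8.2f}  GP3 {q_fit:8.2f}")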

  16. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collisions between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell, and cells that do not undergo the threshold number of collisions are expected to survive. PMID:24681395
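
    The collision-threshold hypothesis above lends itself to a simple Poisson sketch: if the number of bacterium-micelle collisions per pass is roughly Poisson distributed with a mean proportional to filter thickness, survival is the probability of staying below a lethal number of collisions. The parameter values below are placeholders, not estimates from the paper.

      from scipy.stats import poisson

      # Placeholder parameters: mean collisions per bacterium per filter layer,
      # and the number of collisions needed to transfer a lethal biocide dose.
      collisions_per_layer = 3.0
      lethal_threshold = 10

      for layers in (1, 2, 3, 4):
          mean_collisions = collisions_per_layer * layers
          # Survival = probability of fewer than `lethal_threshold` collisions.
          p_survive = poisson.cdf(lethal_threshold - 1, mean_collisions)
          print(f"{layers} layer(s): survival probability = {p_survive:.3f}")

    Thicker filters raise the expected collision count, so the modeled survival probability falls, consistent with the layer-thickness observation reported above.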

  17. Detection probabilities in fuel cycle oriented safeguards

    SciTech Connect

    Canty, J.J.; Stein, G.; Avenhaus, R. )

    1987-01-01

    An intensified discussion of evaluation criteria for International Atomic Energy Agency (IAEA) safeguards effectiveness is currently under way. Considerations basic to the establishment of such criteria are derived from the model agreement INFCIRC/153 and include threshold amounts, strategic significance, conversion times, required assurances, cost-effectiveness, and nonintrusiveness. In addition to these aspects, the extent to which fuel cycle characteristics are taken into account in safeguards implementations (Article 81c of INFCIRC/153) will be reflected in the criteria. The effectiveness of safeguards implemented under given manpower constraints is evaluated. As the significant quantity and timeliness criteria have established themselves within the safeguards community, these are taken as fixed. Detection probabilities, on the other hand, still provide a certain degree of freedom in interpretation. The problem of randomization of inspection activities across a fuel cycle, or portions thereof, is formalized as a two-person zero-sum game, the payoff function of which is the detection probability achieved by the inspectorate. It is argued, from the point of view of risk of detection, that fuel cycle-independent, minimally accepted threshold criteria for such detection probabilities cannot and should not be applied.
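
    The randomization problem sketched above, a two-person zero-sum game whose payoff is the detection probability, can be solved by linear programming. A minimal Python sketch follows; the 3x3 payoff matrix is invented for illustration and does not come from the paper.

      import numpy as np
      from scipy.optimize import linprog

      # Illustrative payoff matrix A[i, j]: detection probability when the
      # inspectorate allocates effort according to option i and the operator
      # diverts via strategy j (values are made up for this example).
      A = np.array([
          [0.90, 0.20, 0.30],
          [0.30, 0.80, 0.25],
          [0.35, 0.30, 0.70],
      ])
      m, n = A.shape

      # Inspector maximizes the game value v subject to (A^T x)_j >= v for every
      # operator strategy j, with x a probability vector over inspection options.
      # Variables: [x_1, ..., x_m, v]; linprog minimizes, so minimize -v.
      c = np.zeros(m + 1)
      c[-1] = -1.0
      A_ub = np.hstack([-A.T, np.ones((n, 1))])   # v - (A^T x)_j <= 0
      b_ub = np.zeros(n)
      A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
      b_eq = np.array([1.0])
      bounds = [(0, None)] * m + [(None, None)]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      x, v = res.x[:m], res.x[-1]
      print("optimal randomization over inspection options:", np.round(x, 3))
      print("guaranteed detection probability (game value):", round(v, 3))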

  18. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  19. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    NASA Technical Reports Server (NTRS)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

    The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of +/- 30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  20. Industrial R&D Spending Reached $26.6 Billion in 1976. Science Resources Studies Highlights, May 5, 1978.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    This report presents data compiled as part of a comprehensive program to measure and analyze the nation's resources expended for research and development (R&D). Industry, which carries out 69% of the R&D in the United States, spent $26.6 billion on these activities in 1976, 10% above the 1975 level. In constant dollars, this represents an increase…

  1. Industrial R&D Expenditures Rise to $22 Billion in 1974. Science Resources Studies Highlights, January 14, 1976.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    Reported in this newsletter in narrative, graphical, and tabular form are data related to industrial research and development expenditures in 1974, showing a seven percent increase over 1973. It is noted that more than 80 percent of a total of $22.3 billion was spent by five industries; these included electrical equipment and communication,…

  2. $100 Billion: For Reform...or to Subsidize the Status Quo? Education Stimulus Watch. Special Report 1

    ERIC Educational Resources Information Center

    Smarick, Andy

    2009-01-01

    This is the first in a quarterly series of special reports on the K-12 education implications of the federal government's economic stimulus package, the American Recovery and Reinvestment Act (ARRA). That the ARRA, which was signed into law in February, will pump nearly $100 billion--an unprecedented sum of federal money--into K-12 education is…

  3. 77 FR 29458 - Supervisory Guidance on Stress Testing for Banking Organizations With More Than $10 Billion in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ...The Board, FDIC and OCC, (collectively, the ``agencies'') are issuing this guidance, which outlines high-level principles for stress testing practices, applicable to all Federal Reserve-supervised, FDIC- supervised, and OCC-supervised banking organizations with more than $10 billion in total consolidated assets. The guidance highlights the importance of stress testing as an ongoing risk......

  4. 76 FR 35072 - Proposed Guidance on Stress Testing for Banking Organizations With More Than $10 Billion in Total...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-15

    ... Stress Testing Guidance. The agency form number for the collection is FR 4202. The agency control number... Proposed Guidance on Stress Testing for Banking Organizations With More Than $10 Billion in Total..., Board, and the FDIC (collectively, the ``agencies'') request comment on proposed guidance on...

  5. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  6. Social Science and the Bayesian Probability Explanation Model

    NASA Astrophysics Data System (ADS)

    Yin, Jie; Zhao, Lei

    2014-03-01

    C. G. Hempel, one of the logical empiricists, built his probability explanation model on the empiricist view of probability; the model encountered many difficulties in scientific explanation against which Hempel found it hard to mount a reasonable defense. Based on Bayesian probability theory and the subjectivist view of probability, the Bayesian probability model instead provides an approach to explanation grounded in subjective probability. On the one hand, this probability model establishes the epistemological status of the subject in the social sciences; on the other hand, it provides a feasible explanation model for social scientific explanation, which has important methodological significance.

  7. A probable probability distribution of a series nonequilibrium states in a simple system out of equilibrium

    NASA Astrophysics Data System (ADS)

    Gao, Haixia; Li, Ting; Xiao, Changming

    2016-05-01

    When a simple system is in a nonequilibrium state, it relaxes toward its equilibrium state, passing through a series of nonequilibrium states in the process. With the assistance of Bayesian statistics and the hyperensemble, a probable probability distribution over these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The equilibrium state carries the largest probability, and the farther a nonequilibrium state lies from equilibrium, the smaller its probability; the same conclusion also holds in the multi-state space. Furthermore, if the probability is interpreted as the relative time the corresponding nonequilibrium state persists, then the velocity with which a nonequilibrium state returns to equilibrium can be determined from the reciprocal of the derivative of this probability. This tells us that the farther the state is from equilibrium, the faster the return velocity; as the system approaches its equilibrium state, the velocity becomes smaller and smaller, finally tending to 0 when equilibrium is reached.

  8. The First Billion Years project: the escape fraction of ionizing photons in the epoch of reionization

    NASA Astrophysics Data System (ADS)

    Paardekooper, Jan-Pieter; Khochfar, Sadegh; Dalla Vecchia, Claudio

    2015-08-01

    Protogalaxies forming in low-mass dark matter haloes are thought to provide the majority of ionizing photons needed to reionize the Universe, due to their high escape fractions of ionizing photons. We study how the escape fraction in high-redshift galaxies relates to the physical properties of the halo in which the galaxies form, by computing escape fractions in more than 75 000 haloes between redshifts 27 and 6 that were extracted from the First Billion Years project, high-resolution cosmological hydrodynamical simulations of galaxy formation. We find that the main constraint on the escape fraction is the gas column density in a radius of 10 pc around the stellar populations, causing a strong mass dependence of the escape fraction. The lower potential well in haloes with M200 ≲ 108 M⊙ results in low column densities that can be penetrated by radiation from young stars (age <5 Myr). In haloes with M200 ≳ 108 M⊙ supernova feedback is important, but only ˜30 per cent of the haloes in this mass range have an escape fraction higher than 1 per cent. We find a large range of escape fractions in haloes with similar properties, caused by different distributions of the dense gas in the halo. This makes it very hard to predict the escape fraction on the basis of halo properties and results in a highly anisotropic escape fraction. The strong mass dependence, the large spread and the large anisotropy of the escape fraction may strongly affect the topology of reionization and is something current models of cosmic reionization should strive to take into account.

  9. A sawtooth-like timeline for the first billion years of lunar bombardment

    NASA Astrophysics Data System (ADS)

    Morbidelli, A.; Marchi, S.; Bottke, W. F.; Kring, D. A.

    2012-11-01

    We revisit the early evolution of the Moon's bombardment. Our work combines modeling (based on plausible projectile sources and their dynamical decay rates) with constraints from the lunar crater record, radiometric ages of the youngest lunar basins, and the abundance of highly siderophile elements in the lunar crust and mantle. We deduce that the evolution of the impact flux did not decline exponentially over the first billion years of lunar history, but also there was no prominent and narrow impact spike ˜3.9Gy ago, unlike that typically envisioned in the lunar cataclysm scenario. Instead, we show the timeline of the lunar bombardment has a sawtooth-like profile, with an uptick in the impact flux near ˜4.1Gy ago. The impact flux at the beginning of this weaker cataclysm was 5-10 times higher than the immediately preceding period. The Nectaris basin should have been one of the first basins formed at the sawtooth. We predict the bombardment rate since ˜4.1Gy ago declined slowly and adhered relatively close to classic crater chronology models (Neukum and Ivanov, 1994). Overall we expect that the sawtooth event accounted for about one-fourth of the total bombardment suffered by the Moon since its formation. Consequently, considering that ˜12-14 basins formed during the sawtooth event, we expect that the net number of basins formed on the Moon was ˜45-50. From our expected bombardment timeline, we derived a new and improved lunar chronology suitable for use on pre-Nectarian surface units. According to this chronology, a significant portion of the oldest lunar cratered terrains has an age of 4.38-4.42 Gyr. Moreover, the largest lunar basin, South Pole Aitken, is older than 4.3 Gy, and therefore was not produced during the lunar cataclysm.

  10. Searching for the birthplaces of open clusters with ages of several billion years

    NASA Astrophysics Data System (ADS)

    Acharova, I. A.; Shevtsova, E. S.

    2016-01-01

    We discuss the possibility of finding the birthplaces of open clusters (OC) with ages of several billion years. The proposed method is based on the comparison of the results of the chemical evolution modeling of the Galactic disk with the parameters of the cluster. Five OCs older than 7 Gyr are known: NGC6791, BH176, Collinder 261, Berkeley 17, and Berkeley 39. The oxygen and iron abundances in NGC6791 and the oxygen abundance in BH176 are twice the solar level, while the heavy-element abundances in the other clusters are close to the corresponding solar values. According to chemical evolution models, at the time of the formation of the objects considered, the regions where the oxygen and iron abundances reached the corresponding levels extended out to 5 kpc from the Galactic center. At present the OCs considered are located several kpc from the Galactic center. Some of these clusters are located extremely high, about 1 kpc above the disk midplane, i.e., they have been subject to some mechanism that has carried them into orbits uncharacteristic of this type of object. It follows from a comparison with the results of chemical evolution that younger clusters with ages of 4-5 Gyr, e.g., NGC1193, M67, and others, may have formed in a broad range of Galactocentric distances. Their large heights above the disk midplane are sufficient to suggest that these clusters have moved away from their likely birthplaces. Clusters continue to be carried far from the Galactic disk up to the present time: about 40 clusters with ages from 0 to 2 Gyr are observed at heights ranging from 300 to 750 pc.

  11. No Photon Left Behind: How Billions of Spectral Lines are Transforming Planetary Sciences

    NASA Astrophysics Data System (ADS)

    Villanueva, Geronimo L.

    2014-06-01

    With the advent of realistic potential energy surface (PES) and dipole moment surface (DMS) descriptions, theoretically computed linelists can now synthesize accurate spectral parameters for billions of spectral lines sampling the untamed high-energy molecular domain. Although the initial driver for these databases was the characterization of stellar spectra, these theoretical databases, in combination with decades of precise experimental studies (nicely compiled in community databases such as HITRAN and GEISA), are leading to unprecedented precision in the characterization of planetary atmospheres. Cometary sciences are among the most affected by this spectroscopic revolution. Even though comets are relatively cold bodies (T˜100 K), their infrared molecular emission is mainly defined by non-LTE solar fluorescence induced by a high-energy source (Sun, T˜5600 K). In order to interpret high-resolution spectra of comets acquired with extremely powerful telescopes (e.g., Keck, VLT, NASA-IRTF), we have developed advanced non-LTE fluorescence models that integrate the high-energy dynamic range of ab-initio databases (e.g., BT2, VTT, HPT2, BYTe, TROVE) and the precision of laboratory and semi-empirical compilations (e.g., HITRAN, GEISA, CDMS, WKMC, SELP, IUPAC). These new models allow us to calculate realistic non-LTE pumps, cascades, branching-ratios, and emission rates for a broad range of excitation regimes for H2O, HDO, HCN, HNC and NH3. We have applied elements of these compilations to the study of Mars spectra, and we are now exploring their application to modeling non-LTE emission in exoplanets. In this presentation, we present applications of these advanced models to the interpretation of high-resolution spectra of comets, Mars and exoplanets.

  12. The formation of submillimetre-bright galaxies from gas infall over a billion years.

    PubMed

    Narayanan, Desika; Turk, Matthew; Feldmann, Robert; Robitaille, Thomas; Hopkins, Philip; Thompson, Robert; Hayward, Christopher; Ball, David; Faucher-Giguère, Claude-André; Kereš, Dušan

    2015-09-24

    Submillimetre-bright galaxies at high redshift are the most luminous, heavily star-forming galaxies in the Universe and are characterized by prodigious emission in the far-infrared, with a flux of at least five millijanskys at a wavelength of 850 micrometres. They reside in haloes with masses about 10^13 times that of the Sun, have low gas fractions compared to main-sequence disks at a comparable redshift, trace complex environments and are not easily observable at optical wavelengths. Their physical origin remains unclear. Simulations have been able to form galaxies with the requisite luminosities, but have otherwise been unable to simultaneously match the stellar masses, star formation rates, gas fractions and environments. Here we report a cosmological hydrodynamic galaxy formation simulation that is able to form a submillimetre galaxy that simultaneously satisfies the broad range of observed physical constraints. We find that groups of galaxies residing in massive dark matter haloes have increasing rates of star formation that peak at collective rates of about 500-1,000 solar masses per year at redshifts of two to three, by which time the interstellar medium is sufficiently enriched with metals that the region may be observed as a submillimetre-selected system. The intense star formation rates are fuelled in part by the infall of a reservoir gas supply enabled by stellar feedback at earlier times, not through major mergers. With a lifetime of nearly a billion years, our simulations show that the submillimetre-bright phase of high-redshift galaxies is prolonged and associated with significant mass buildup in early-Universe proto-clusters, and that many submillimetre-bright galaxies are composed of numerous unresolved components (for which there is some observational evidence). PMID:26399829

  13. A Highly Functional Synthetic Phage Display Library Containing over 40 Billion Human Antibody Clones

    PubMed Central

    Weber, Marcel; Bujak, Emil; Putelli, Alessia; Villa, Alessandra; Matasci, Mattia; Gualandi, Laura; Hemmerle, Teresa; Wulhfard, Sarah; Neri, Dario

    2014-01-01

    Several synthetic antibody phage display libraries have been created and used for the isolation of human monoclonal antibodies. The performance of antibody libraries, which is usually measured in terms of their ability to yield high-affinity binding specificities against target proteins of interest, depends both on technical aspects (such as library size and quality of cloning) and on design features (which influence the percentage of functional clones in the library and their ability to be used for practical applications). Here, we describe the design, construction and characterization of a combinatorial phage display library, comprising over 40 billion human antibody clones in single-chain fragment variable (scFv) format. The library was designed with the aim to obtain highly stable antibody clones, which can be affinity-purified on protein A supports, even when used in scFv format. The library was found to be highly functional, as >90% of randomly selected clones expressed the corresponding antibody. When selected against more than 15 antigens from various sources, the library always yielded specific and potent binders, at a higher frequency compared to previous antibody libraries. To demonstrate library performance in practical biomedical research projects, we isolated the human antibody G5, which reacts both against human and murine forms of the alternatively spliced BCD segment of tenascin-C, an extracellular matrix component frequently over-expressed in cancer and in chronic inflammation. The new library represents a useful source of binding specificities, both for academic research and for the development of antibody-based therapeutics. PMID:24950200

  14. Spin Glass Computations and Ruelle's Probability Cascades

    NASA Astrophysics Data System (ADS)

    Arguin, Louis-Pierre

    2007-03-01

    We study the Parisi functional, appearing in the Parisi formula for the pressure of the SK model, as a functional on Ruelle's Probability Cascades (RPC). Computation techniques for the RPC formulation of the functional are developed. They are used to derive continuity and monotonicity properties of the functional retrieving a theorem of Guerra. We also detail the connection between the Aizenman-Sims-Starr variational principle and the Parisi formula. As a final application of the techniques, we rederive the Almeida-Thouless line in the spirit of Toninelli but relying on the RPC structure.

  15. Snell Envelope with Small Probability Criteria

    SciTech Connect

    Del Moral, Pierre Hu, Peng; Oudjane, Nadia

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to be optimized is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  16. Mapping probability of shipping sound exposure level.

    PubMed

    Gervaise, Cédric; Aulanier, Florian; Simard, Yvan; Roy, Nathalie

    2015-06-01

    Mapping vessel noise is emerging as one method of identifying areas where sound exposure due to shipping noise could have negative impacts on aquatic ecosystems. The probability distribution function (pdf) of sound exposure levels (SEL) is an important metric for identifying areas of concern. In this paper a probabilistic shipping SEL modeling method is described to obtain the pdf of SEL using the sonar equation and statistical relations linking the pdfs of ship traffic density, source levels, and transmission losses to their products and sums. PMID:26093451
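
    A minimal Monte Carlo sketch of how pdfs combine through a simplified sonar equation (received level = source level - transmission loss). The normal distributions below are illustrative assumptions; the method in the paper additionally folds in the pdf of ship traffic density.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      # Illustrative distributions (not from the paper): broadband ship source
      # levels in dB re 1 uPa @ 1 m, and transmission losses in dB for one cell.
      source_level = rng.normal(loc=180.0, scale=5.0, size=n)
      transmission_loss = rng.normal(loc=70.0, scale=8.0, size=n)

      # Simplified passive sonar equation: received level per ship transit.
      received_level = source_level - transmission_loss

      # Empirical exceedance summary of the received-level pdf in the cell.
      for threshold in (100, 110, 120, 130):
          p = np.mean(received_level > threshold)
          print(f"P(level > {threshold} dB) = {p:.3f}")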

  17. Calculation of radiative transition probabilities and lifetimes

    NASA Technical Reports Server (NTRS)

    Zemke, W. T.; Verma, K. K.; Stwalley, W. C.

    1982-01-01

    Procedures for calculating bound-bound and bound-continuum (free) radiative transition probabilities and radiative lifetimes are summarized. Calculations include rotational dependence and R-dependent electronic transition moments (no Franck-Condon or R-centroid approximation). Detailed comparisons of theoretical results with experimental measurements are made for bound-bound transitions in the A-X systems of LiH and Na2. New bound-free results are presented for LiH. New bound-free results and comparisons with very recent fluorescence experiments are presented for Na2.

  18. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.

  19. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    NASA Astrophysics Data System (ADS)

    Tan, Elcin

    physically possible upper limits of precipitation due to climate change. The simulation results indicate that the meridional shift in atmospheric conditions is the optimum method to determine maximum precipitation in consideration of cost and efficiency. Finally, exceedance probability analyses of the model results of 42 historical extreme precipitation events demonstrate that the 72-hr basin averaged probable maximum precipitation is 21.72 inches for the exceedance probability of 0.5 percent. On the other hand, the current operational PMP estimation for the American River Watershed is 28.57 inches as published in the hydrometeorological report no. 59 and a previous PMP value was 31.48 inches as published in the hydrometeorological report no. 36. According to the exceedance probability analyses of this proposed method, the exceedance probabilities of these two estimations correspond to 0.036 percent and 0.011 percent, respectively.
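
    A hedged sketch of an exceedance-probability calculation of the kind reported above: fit an extreme-value distribution to 72-hr basin-averaged precipitation maxima and read off the exceedance probabilities of candidate PMP values. The GEV choice and the synthetic sample are assumptions made for illustration; the study itself analyzed 42 simulated historical extreme events.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)

      # Synthetic stand-in for 72-hr basin-averaged precipitation maxima (inches).
      events = stats.genextreme(c=-0.1, loc=8.0, scale=3.0).rvs(size=42, random_state=rng)

      # Fit a GEV distribution and evaluate exceedance probabilities of candidate
      # probable-maximum-precipitation estimates.
      c, loc, scale = stats.genextreme.fit(events)
      for pmp in (21.72, 28.57, 31.48):
          exceedance = stats.genextreme.sf(pmp, c, loc=loc, scale=scale)
          print(f"P(72-hr precip >= {pmp:5.2f} in) = {exceedance:.5f}")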

  20. Properties of atoms in molecules: Transition probabilities

    NASA Astrophysics Data System (ADS)

    Bader, R. F. W.; Bayles, D.; Heard, G. L.

    2000-06-01

    The transition probability for electric dipole transitions is a measurable property of a system and is therefore, partitionable into atomic contributions using the physics of a proper open system. The derivation of the dressed property density, whose averaging over an atomic basin yields the atomic contribution to a given oscillator strength, is achieved through the development of perturbation theory for an open system. A dressed density describes the local contribution resulting from the interaction of a single electron at some position r, as determined by the relevant observable, averaged over the motions of all of the remaining particles in the system. In the present work, the transition probability density expressed in terms of the relevant transition density, yields a local measure of the associated oscillator strength resulting from the interaction of the entire molecule with a radiation field. The definition of the atomic contributions to the oscillator strength enables one to determine the extent to which a given electronic or vibrational transition is spatially localized to a given atom or functional group. The concepts introduced in this article are applied to the Rydberg-type transitions observed in the electronic excitation of a nonbonding electron in formaldehyde and ammonia. The atomic partitioning of the molecular density distribution and of the molecular properties by surfaces of zero flux in the gradient vector field of the electron density, the boundary condition defining the physics of a proper open system, is found to apply to the density distributions of the excited, Rydberg states.

  1. Classical probability model for Bell inequality

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2014-04-01

    We show that by taking into account randomness of realization of experimental contexts it is possible to construct common Kolmogorov space for data collected for these contexts, although they can be incompatible. We call such a construction "Kolmogorovization" of contextuality. This construction of common probability space is applied to Bell's inequality. It is well known that its violation is a consequence of collecting statistical data in a few incompatible experiments. In experiments performed in quantum optics contexts are determined by selections of pairs of angles (θi,θ'j) fixing orientations of polarization beam splitters. Opposite to the common opinion, we show that statistical data corresponding to measurements of polarizations of photons in the singlet state, e.g., in the form of correlations, can be described in the classical probabilistic framework. The crucial point is that in constructing the common probability space one has to take into account not only randomness of the source (as Bell did), but also randomness of context-realizations (in particular, realizations of pairs of angles (θi, θ'j)). One may (but need not) say that randomness of "free will" has to be accounted for.

  2. On the probability of matching DNA fingerprints.

    PubMed

    Risch, N J; Devlin, B

    1992-02-01

    Forensic scientists commonly assume that DNA fingerprint patterns are infrequent in the general population and that genotypes are independent across loci. To test these assumptions, the number of matching DNA patterns in two large databases from the Federal Bureau of Investigation (FBI) and from Lifecodes was determined. No deviation from independence across loci in either database was apparent. For the Lifecodes database, the probability of a three-locus match ranges from 1 in 6,233 in Caucasians to 1 in 119,889 in Blacks. When considering all trios of five loci in the FBI database, there was only a single match observed out of more than 7.6 million comparisons. If independence is assumed, the probability of a five-locus match ranged from 1.32 x 10^-12 in Southeast Hispanics to 5.59 x 10^-14 in Blacks, implying that the minimum number of possible patterns for each ethnic group is several orders of magnitude greater than their corresponding population sizes in the United States. The most common five-locus pattern can have a frequency no greater than about 10^-6. Hence, individual five-locus DNA profiles are extremely uncommon, if not unique. PMID:1738844
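
    The independence assumption discussed above reduces the multi-locus match probability to a product of per-locus match probabilities, as in this small sketch (the per-locus values are placeholders, not the FBI or Lifecodes figures).

      import numpy as np

      # Hypothetical per-locus genotype match probabilities. Under independence
      # across loci, the multi-locus match probability is simply their product.
      per_locus_match = np.array([2.1e-2, 3.4e-3, 8.7e-3, 1.5e-2, 4.2e-3])

      three_locus = np.prod(per_locus_match[:3])
      five_locus = np.prod(per_locus_match)
      print(f"three-locus match probability: {three_locus:.2e}")
      print(f"five-locus match probability:  {five_locus:.2e}")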

  3. Estimating flood exceedance probabilities in estuarine regions

    NASA Astrophysics Data System (ADS)

    Westra, Seth; Leonard, Michael

    2016-04-01

    Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence of these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury Nepean River (New South Wales).

  4. Measures, Probability and Holography in Cosmology

    NASA Astrophysics Data System (ADS)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than rho_k = 40 rho_m are disfavored by more than 99.99%, with a peak value at rho_Lambda = 7.9 x 10^-123 and rho_k = 4.3 rho_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  5. Risks and probabilities of breast cancer: short-term versus lifetime probabilities.

    PubMed Central

    Bryant, H E; Brasher, P M

    1994-01-01

    OBJECTIVE: To calculate age-specific short-term and lifetime probabilities of breast cancer among a cohort of Canadian women. DESIGN: Double decrement life table. SETTING: Alberta. SUBJECTS: Women with first invasive breast cancers registered with the Alberta Cancer Registry between 1985 and 1987. MAIN OUTCOME MEASURES: Lifetime probability of breast cancer from birth and for women at various ages; short-term (up to 10 years) probability of breast cancer for women at various ages. RESULTS: The lifetime probability of breast cancer is 10.17% at birth and peaks at 10.34% at age 25 years, after which it decreases owing to a decline in the number of years over which breast cancer risk will be experienced. However, the probability of manifesting breast cancer in the next year increases steadily from the age of 30 onward, reaching 0.36% at 85 years. The probability of manifesting the disease within the next 10 years peaks at 2.97% at age 70 and decreases thereafter, again owing to declining probabilities of surviving the interval. CONCLUSIONS: Given that the incidence of breast cancer among Albertan women during the study period was similar to the national average, we conclude that currently more than 1 in 10 women in Canada can expect to have breast cancer at some point during their life. However, risk varies considerably over a woman's lifetime, with most risk concentrated after age 49. On the basis of the shorter-term age-specific risks that we present, the clinician can put breast cancer risk into perspective for younger women and heighten awareness among women aged 50 years or more. PMID:8287343
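
    A rough sketch of the double decrement life-table logic used above: women leave the at-risk pool either by developing breast cancer or by dying of other causes, and the lifetime probability is the accumulated incidence weighted by survival to each age. The hazard curves below are invented placeholders, not the Alberta registry rates.

      import numpy as np

      # Hypothetical annual hazards by single year of age (placeholders):
      # h_c = breast cancer incidence, h_d = death from other causes. The double
      # decrement life table removes women from the pool through either decrement.
      ages = np.arange(100)
      h_c = np.where(ages < 30, 1e-5, 1e-5 + 3.5e-5 * (ages - 30))
      h_d = 5e-5 * np.exp(0.09 * ages)

      alive_and_cancer_free = 1.0
      cumulative_cancer = 0.0
      for a in ages:
          cumulative_cancer += alive_and_cancer_free * h_c[a]
          alive_and_cancer_free *= 1.0 - h_c[a] - h_d[a]

      print(f"lifetime probability of breast cancer from birth: {cumulative_cancer:.1%}")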

  6. Estimated probability of arsenic in groundwater from bedrock aquifers in New Hampshire, 2011

    USGS Publications Warehouse

    Ayotte, Joseph D.; Cahillane, Matthew; Hayes, Laura; Robinson, Keith W.

    2012-01-01

    Probabilities of arsenic occurrence in groundwater from bedrock aquifers at concentrations of 1, 5, and 10 micrograms per liter (µg/L) were estimated during 2011 using multivariate logistic regression. These estimates were developed for use by the New Hampshire Environmental Public Health Tracking Program. About 39 percent of New Hampshire bedrock groundwater was identified as having at least a 50 percent chance of containing an arsenic concentration greater than or equal to 1 µg/L. This compares to about 7 percent of New Hampshire bedrock groundwater having at least a 50 percent chance of containing an arsenic concentration equaling or exceeding 5 µg/L and about 5 percent of the State having at least a 50 percent chance for its bedrock groundwater to contain concentrations at or above 10 µg/L. The southeastern counties of Merrimack, Strafford, Hillsborough, and Rockingham have the greatest potential for having arsenic concentrations above 5 and 10 µg/L in bedrock groundwater. Significant predictors of arsenic in groundwater from bedrock aquifers for all three thresholds analyzed included geologic, geochemical, land use, hydrologic, topographic, and demographic factors. Among the three thresholds evaluated, there were some differences in explanatory variables, but many variables were the same. More than 250 individual predictor variables were assembled for this study and tested as potential predictor variables for the models. More than 1,700 individual measurements of arsenic concentration from a combination of public and private water-supply wells served as the dependent (or predicted) variable in the models. The statewide maps generated by the probability models are not designed to predict arsenic concentration in any single well, but they are expected to provide useful information in areas of the State that currently contain little to no data on arsenic concentration. They also may aid in resource decision making, in determining potential risk for private
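
    A minimal sketch of the modeling approach described above: multivariate logistic regression returning the probability that a well exceeds an arsenic threshold. All predictors and coefficients below are synthetic placeholders for the geologic, geochemical, land-use, hydrologic, topographic, and demographic variables used in the study.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n = 1_700  # roughly the number of wells with arsenic measurements

      # Placeholder predictors standing in for the explanatory variables.
      X = rng.normal(size=(n, 6))
      # Synthetic "truth": exceedance of a 5 ug/L threshold driven by two predictors.
      logit = -2.0 + 1.2 * X[:, 0] - 0.8 * X[:, 3] + rng.normal(scale=0.5, size=n)
      y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      model = LogisticRegression().fit(X, y)

      # Probability that a new combination of predictors exceeds 5 ug/L.
      new_well = rng.normal(size=(1, 6))
      p_exceed = model.predict_proba(new_well)[0, 1]
      print(f"estimated probability of arsenic >= 5 ug/L: {p_exceed:.2f}")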

  7. Subsampled open-reference clustering creates consistent, comprehensive OTU definitions and scales to billions of sequences.

    PubMed

    Rideout, Jai Ram; He, Yan; Navas-Molina, Jose A; Walters, William A; Ursell, Luke K; Gibbons, Sean M; Chase, John; McDonald, Daniel; Gonzalez, Antonio; Robbins-Pianka, Adam; Clemente, Jose C; Gilbert, Jack A; Huse, Susan M; Zhou, Hong-Wei; Knight, Rob; Caporaso, J Gregory

    2014-01-01

    We present a performance-optimized algorithm, subsampled open-reference OTU picking, for assigning marker gene (e.g., 16S rRNA) sequences generated on next-generation sequencing platforms to operational taxonomic units (OTUs) for microbial community analysis. This algorithm provides benefits over de novo OTU picking (clustering can be performed largely in parallel, reducing runtime) and closed-reference OTU picking (all reads are clustered, not only those that match a reference database sequence with high similarity). Because more of our algorithm can be run in parallel relative to "classic" open-reference OTU picking, it makes open-reference OTU picking tractable on massive amplicon sequence data sets (though on smaller data sets, "classic" open-reference OTU clustering is often faster). We illustrate that here by applying it to the first 15,000 samples sequenced for the Earth Microbiome Project (1.3 billion V4 16S rRNA amplicons). To the best of our knowledge, this is the largest OTU picking run ever performed, and we estimate that our new algorithm runs in less than 1/5 the time than would be required of "classic" open reference OTU picking. We show that subsampled open-reference OTU picking yields results that are highly correlated with those generated by "classic" open-reference OTU picking through comparisons on three well-studied datasets. An implementation of this algorithm is provided in the popular QIIME software package, which uses uclust for read clustering. All analyses were performed using QIIME's uclust wrappers, though we provide details (aided by the open-source code in our GitHub repository) that will allow implementation of subsampled open-reference OTU picking independently of QIIME (e.g., in a compiled programming language, where runtimes should be further reduced). Our analyses should generalize to other implementations of these OTU picking algorithms. Finally, we present a comparison of parameter settings in QIIME's OTU picking workflows and

  8. The Other Inconvenient Truth: Feeding 9 Billion While Sustaining the Earth System

    NASA Astrophysics Data System (ADS)

    Foley, J. A.

    2010-12-01

    As the international community focuses on climate change as the great challenge of our era, we have been largely ignoring another looming problem — the global crisis in agriculture, food security and the environment. Our use of land, particularly for agriculture, is absolutely essential to the success of the human race: we depend on agriculture to supply us with food, feed, fiber, and, increasingly, biofuels. Without a highly efficient, productive, and resilient agricultural system, our society would collapse almost overnight. But we are demanding more and more from our global agricultural systems, pushing them to their very limits. Continued population growth (adding more than 70 million people to the world every year), changing dietary preferences (including more meat and dairy consumption), rising energy prices, and increasing needs for bioenergy sources are putting tremendous pressure on the world’s resources. And, if we want any hope of keeping up with these demands, we’ll need to double the agricultural production of the planet in the next 30 to 40 years. Meeting these huge new agricultural demands will be one of the greatest challenges of the 21st century. At present, it is completely unclear how (and if) we can do it. If this wasn’t enough, we must also address the massive environmental impacts of our current agricultural practices, which new evidence indicates rival the impacts of climate change. Simply put, providing for the basic needs of 9 billion-plus people, without ruining the biosphere in the process, will be one of the greatest challenges our species has ever faced. In this presentation, I will present a new framework for evaluating and assessing global patterns of agriculture, food / fiber / fuel production, and their relationship to the earth system, particularly in terms of changing stocks and flows of water, nutrients and carbon in our planetary environment. This framework aims to help us manage the challenges of increasing global food

  9. Trending in Probability of Collision Measurements

    NASA Technical Reports Server (NTRS)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of Probabilities of Collision (P_c) for conjunction events. The model attempts to predict the location and magnitude of the peak P_c value for an event by assuming the progression of P_c values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized, and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P_c and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
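
    A minimal sketch of the parabola idea: fit a downward-opening quadratic to log10(P_c) versus time and take its vertex as the predicted peak. The values below are invented, and the sketch omits the Bayesian use of prior conjunction data described in the abstract.

      import numpy as np

      # Synthetic progression of collision probabilities over days to time of
      # closest approach (illustrative values only).
      days_to_tca = np.array([-5.0, -4.0, -3.0, -2.0, -1.5])
      pc = np.array([3e-7, 2e-6, 8e-6, 1.5e-5, 1.4e-5])

      # Fit log10(Pc) with a quadratic; a downward-opening parabola has a < 0.
      a, b, c = np.polyfit(days_to_tca, np.log10(pc), deg=2)
      t_peak = -b / (2.0 * a)                  # vertex of the parabola
      pc_peak = 10 ** np.polyval([a, b, c], t_peak)

      print(f"predicted peak at {t_peak:.2f} days to TCA, Pc ~ {pc_peak:.2e}")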

  10. Quantum probabilities for inflation from holography

    NASA Astrophysics Data System (ADS)

    Hartle, James B.; Hawking, S. W.; Hertog, Thomas

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in hbar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  11. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field. It has also recently been used in biometric and multimedia information retrieval systems. This technology has grown out of successive research on audio feature extraction and analysis. The probability distribution function (PDF) is a statistical tool that is usually used as one of the processing steps in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed that uses the PDF alone as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of the sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
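
    A minimal sketch of PDF-as-feature extraction: frame the sampled signal and use each frame's normalized histogram as its feature vector. The two-tone test signal and the frame/bin sizes are illustrative choices, not those of the paper.

      import numpy as np

      def pdf_features(signal, frame_len=400, n_bins=32, lo=-1.0, hi=1.0):
          """Return one normalized-histogram (empirical pdf) feature vector per frame."""
          n_frames = len(signal) // frame_len
          feats = []
          for k in range(n_frames):
              frame = signal[k * frame_len:(k + 1) * frame_len]
              hist, _ = np.histogram(frame, bins=n_bins, range=(lo, hi), density=True)
              feats.append(hist)
          return np.array(feats)

      # Illustrative "voice" stand-in: a noisy sum of two tones, sampled at 8 kHz.
      rng = np.random.default_rng(4)
      t = np.arange(0, 1.0, 1 / 8000.0)
      signal = 0.5 * np.sin(2 * np.pi * 180 * t) + 0.3 * np.sin(2 * np.pi * 320 * t)
      signal += 0.05 * rng.normal(size=t.size)

      features = pdf_features(signal)
      print(features.shape)   # (frames, bins): one empirical pdf per frame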

  12. Probability of Brownian motion hitting an obstacle

    SciTech Connect

    Knessl, C.; Keller, J.B.

    2000-02-01

    The probability p(x) that Brownian motion with drift, starting at x, hits an obstacle is analyzed. The obstacle Ω is a compact subset of R^n. It is shown that p(x) is expressible in terms of the field U(x) scattered by Ω when it is hit by a plane wave. Therefore results for U(x), and methods for finding U(x), can be used to determine p(x). The authors illustrate this by obtaining exact and asymptotic results for p(x) when Ω is a slit in R^2, and asymptotic results when Ω is a disc in R^3.
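
    A Monte Carlo sketch of p(x) for drifted Brownian motion and a disc obstacle in R^2. The drift, starting point, and truncation time are illustrative assumptions; truncating paths at t_max slightly underestimates the true hitting probability.

      import numpy as np

      rng = np.random.default_rng(5)

      def hit_probability(x0, drift, radius=1.0, dt=0.05, t_max=50.0, n_paths=1000):
          """Monte Carlo estimate of the probability that Brownian motion with
          drift, started at x0, ever hits a disc of the given radius centred at
          the origin. Paths are truncated at t_max."""
          hits = 0
          n_steps = int(t_max / dt)
          for _ in range(n_paths):
              pos = np.array(x0, dtype=float)
              for _ in range(n_steps):
                  pos += drift * dt + np.sqrt(dt) * rng.normal(size=2)
                  if pos @ pos <= radius ** 2:
                      hits += 1
                      break
          return hits / n_paths

      # Illustrative case: start upstream of the disc relative to the drift.
      p = hit_probability(x0=(-5.0, 0.5), drift=np.array([1.0, 0.0]))
      print(f"estimated hitting probability: {p:.2f}")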

  13. Probability density function learning by unsupervised neurons.

    PubMed

    Fiori, S

    2001-10-01

    In a recent work, we introduced the concept of pseudo-polynomial adaptive activation function neuron (FAN) and presented an unsupervised information-theoretic learning theory for such structure. The learning model is based on entropy optimization and provides a way of learning probability distributions from incomplete data. The aim of the present paper is to illustrate some theoretical features of the FAN neuron, to extend its learning theory to asymmetrical density function approximation, and to provide an analytical and numerical comparison with other known density function estimation methods, with special emphasis to the universal approximation ability. The paper also provides a survey of PDF learning from incomplete data, as well as results of several experiments performed on real-world problems and signals. PMID:11709808

  14. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2004-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ONEs or ZEROs. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental natural laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
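
    A minimal sketch of the idea (not the NASA implementation): two carrier segments with different sample histograms, a sine (U-shaped pdf) for ZERO and a triangle wave (flat pdf) for ONE, and a receiver that classifies each symbol interval by matching its histogram to the templates. The waveforms, bin counts, and L1 distance metric are illustrative choices.

      import numpy as np
      from scipy.signal import sawtooth

      rng = np.random.default_rng(6)
      fs, f0, n_per_bit = 10_000, 100, 400     # sample rate, carrier Hz, samples/symbol
      t = np.arange(n_per_bit) / fs

      # Two waveform states with distinct sample pdfs (histograms).
      state = {0: np.sin(2 * np.pi * f0 * t),
               1: sawtooth(2 * np.pi * f0 * t, width=0.5)}

      def histogram(x, bins=24):
          h, _ = np.histogram(x, bins=bins, range=(-1.2, 1.2), density=True)
          return h

      templates = {b: histogram(w) for b, w in state.items()}

      # Transmit a random bit stream with a little additive noise.
      bits = rng.integers(0, 2, size=32)
      rx = np.concatenate([state[b] + 0.05 * rng.normal(size=n_per_bit) for b in bits])

      # Receiver: histogram each symbol interval and pick the closest template (L1).
      decoded = []
      for k in range(bits.size):
          seg = rx[k * n_per_bit:(k + 1) * n_per_bit]
          h = histogram(seg)
          decoded.append(min(templates, key=lambda b: np.abs(h - templates[b]).sum()))

      print("bit errors:", int(np.sum(np.array(decoded) != bits)))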

  15. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2006-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital one's or zero's. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.

  16. 5426 Sharp: A Probable Hungaria Binary

    NASA Astrophysics Data System (ADS)

    Warner, Brian D.; Benishek, Vladimir; Ferrero, Andrea

    2015-07-01

    Initial CCD photometry observations of the Hungaria asteroid 5426 Sharp in 2014 December and 2015 January at the Center of Solar System Studies-Palmer Divide Station in Landers, CA, showed attenuations from the general lightcurve, indicating the possibility of the asteroid being a binary system. The secondary period was almost exactly an Earth day, prompting a collaboration to be formed with observers in Europe, which eventually allowed establishing two periods: P1 = 4.5609 ± 0.0003 h, A1 = 0.18 ± 0.01 mag and P2 = 24.22 ± 0.02 h, A2 = 0.08 ± 0.01 mag. No mutual events, i.e., occultations and/or eclipses, were seen; therefore the asteroid is considered a probable, but not a confirmed, binary.

  17. Objective Lightning Probability Forecast Tool Phase II

    NASA Technical Reports Server (NTRS)

    Lambert, Winnie

    2007-01-01

    This presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.

  18. Quantum probabilities for inflation from holography

    SciTech Connect

    Hartle, James B.; Hawking, S.W.; Hertog, Thomas E-mail: S.W.Hawking@damtp.cam.ac.uk

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in h-bar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  19. On the probability of dinosaur fleas.

    PubMed

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-01

    Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We comment here on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data. PMID:26754250

  20. Model estimates hurricane wind speed probabilities

    NASA Astrophysics Data System (ADS)

    Murnane, Richard J.; Barton, Chris; Collins, Eric; Donnelly, Jeffrey; Elsner, James; Emanuel, Kerry; Ginis, Isaac; Howard, Susan; Landsea, Chris; Liu, Kam-biu; Malmquist, David; McKay, Megan; Michaels, Anthony; Nelson, Norm; O'Brien, James; Scott, David; Webb, Thompson, III

    In the United States, intense hurricanes (category 3, 4, and 5 on the Saffir/Simpson scale) with winds greater than 50 m s⁻¹ have caused more damage than any other natural disaster [Pielke and Pielke, 1997]. Accurate estimates of wind speed exceedance probabilities (WSEP) due to intense hurricanes are therefore of great interest to (re)insurers, emergency planners, government officials, and populations in vulnerable coastal areas. The historical record of U.S. hurricane landfall is relatively complete only from about 1900, and most model estimates of WSEP are derived from this record. During the 1899-1998 period, only two category-5 and 16 category-4 hurricanes made landfall in the United States. The historical record therefore provides only a limited sample of the most intense hurricanes.

  1. Homonymous Hemianopsia Associated with Probable Alzheimer's Disease.

    PubMed

    Ishiwata, Akiko; Kimura, Kazumi

    2016-01-01

    Posterior cortical atrophy (PCA) is a rare neurodegenerative disorder that involves cerebral atrophy in the parietal, occipital, or occipitotemporal cortices and is characterized by visuospatial and visuoperceptual impairments. Most cases are pathologically compatible with Alzheimer's disease (AD). We describe a case of PCA in which a combination of imaging methods, in conjunction with symptoms and neurological and neuropsychological examinations, led to its being diagnosed and to AD being identified as its probable cause. Treatment with donepezil for 6 months mildly improved alexia symptoms, but other symptoms remained unchanged. A 59-year-old Japanese woman with progressive alexia, visual deficit, and mild memory loss was referred to our neurologic clinic for the evaluation of right homonymous hemianopsia. Our neurological examination showed alexia, constructional apraxia, mild disorientation, short-term memory loss, and right homonymous hemianopsia. These findings resulted in a score of 23 (of 30) points on the Mini-Mental State Examination. Occipital atrophy was identified, with magnetic resonance imaging (MRI) showing left-side dominance. The MRI data were quantified with voxel-based morphometry, and PCA was diagnosed on the basis of these findings. Single photon emission computed tomography with (123)I-N-isopropyl-p-iodoamphetamine showed hypoperfusion in the corresponding voxel-based morphometry occipital lobes. Additionally, the finding of hypoperfusion in the posterior associate cortex, posterior cingulate gyrus, and precuneus was consistent with AD. Therefore, the PCA was considered to be a result of AD. We considered Lewy body dementia as a differential diagnosis because of the presence of hypoperfusion in the occipital lobes. However, the patient did not meet the criteria for Lewy body dementia during the course of the disease. We therefore consider including PCA in the differential diagnoses to be important for patients with visual deficit, cognitive

  2. Repetition probability effects for inverted faces.

    PubMed

    Grotheer, Mareike; Hermann, Petra; Vidnyánszky, Zoltán; Kovács, Gyula

    2014-11-15

    It has been shown that the repetition-related reduction of the blood-oxygen level dependent (BOLD) signal is modulated by the probability of repetitions (P(rep)) for faces (Summerfield et al., 2008), providing support for the predictive coding (PC) model of visual perception (Rao and Ballard, 1999). However, the stage of face processing where repetition suppression (RS) is modulated by P(rep) is still unclear. Face inversion is known to interrupt higher-level configural/holistic face processing steps, and if modulation of RS by P(rep) takes place at these stages of face processing, P(rep) effects are expected to be reduced for inverted when compared to upright faces. Therefore, here we aimed to investigate whether P(rep) effects on RS observed for face stimuli originate at the higher-level configural/holistic stages of face processing by comparing these effects for upright and inverted faces. Similarly to previous studies, we manipulated P(rep) for pairs of stimuli in individual blocks of fMRI recordings. This manipulation significantly influenced repetition suppression in the posterior FFA, the OFA and the LO, independently of stimulus orientation. Our results thus reveal that RS in the ventral visual stream is modulated by P(rep) even in the case of face inversion, and hence of strongly compromised configural/holistic face processing. An additional whole-brain analysis could not identify any areas where the modulatory effect of probability was orientation specific either. These findings imply that P(rep) effects on RS might originate from the earlier stages of face processing. PMID:25123974

  3. Direct Updating of an RNA Base-Pairing Probability Matrix with Marginal Probability Constraints

    PubMed Central

    2012-01-01

    Abstract A base-pairing probability matrix (BPPM) stores the probabilities for every possible base pair in an RNA sequence and has been used in many algorithms in RNA informatics (e.g., RNA secondary structure prediction and motif search). In this study, we propose a novel algorithm to perform iterative updates of a given BPPM, satisfying marginal probability constraints that are (approximately) given by recently developed biochemical experiments, such as SHAPE, PAR, and FragSeq. The method is easily implemented and is applicable to common models for RNA secondary structures, such as energy-based or machine-learning–based models. In this article, we focus mainly on the details of the algorithms, although preliminary computational experiments will also be presented. PMID:23210474
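
    As a rough illustration of the kind of update described here, the sketch below rescales a symmetric base-pairing probability matrix so that its per-base pairing marginals approach experimentally estimated values. It is only a minimal Sinkhorn-style rescaling on toy inputs, not the constrained update algorithm developed in the article.

```python
import numpy as np

def rescale_bppm(P, q, n_iter=200):
    """Iteratively rescale a symmetric base-pairing probability matrix P so that
    its per-base pairing marginals sum_j P[i, j] approach target probabilities q[i]
    (e.g., estimated from SHAPE-like experiments). Illustrative Sinkhorn-style
    rescaling only, not the paper's constrained update."""
    P = P.copy()
    for _ in range(n_iter):
        marg = P.sum(axis=1)                       # current pairing probability of each base
        scale = np.where(marg > 0, q / marg, 1.0)  # per-base correction factor
        P *= np.sqrt(np.outer(scale, scale))       # symmetric update keeps P symmetric
        np.clip(P, 0.0, 1.0, out=P)
    return P

# Toy example: 4-nt sequence, uniform initial pairing probabilities, target marginals q.
P0 = np.full((4, 4), 0.1)
np.fill_diagonal(P0, 0.0)
q = np.array([0.8, 0.2, 0.2, 0.8])
print(rescale_bppm(P0, q).sum(axis=1))   # marginals move close to q
```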

  4. The continuing cost of privatization: extra payments to Medicare Advantage plans jump to $11.4 billion in 2009.

    PubMed

    Biles, Brian; Pozen, Jonah; Guterman, Stuart

    2009-05-01

    The Medicare Modernization Act of 2003 explicitly increased Medicare payments to private Medicare Advantage (MA) plans. As a result, MA plans have, for the past six years, been paid more for their enrollees than they would be expected to cost in traditional fee-for-service Medicare. Payments to MA plans in 2009 are projected to be 13 percent greater than the corresponding costs in traditional Medicare--an average of $1,138 per MA plan enrollee, for a total of $11.4 billion. Although the extra payments are used to provide enrollees additional benefits, those benefits are not available to all beneficiaries-- but they are financed by general program funds. If payments to MA plans were instead equal to the spending level under traditional Medicare, the more than $150 billion in savings over 10 years could be used to finance improved benefits for the low-income elderly and disabled, or for expanding health-insurance coverage. PMID:19449498

  5. $17 billion needed for population programme to year 2000: Dr. Nafis Sadik launches State of World Population Report.

    PubMed

    1995-01-01

    Dr. Nafis Sadik, Executive Director of the United Nations Population Fund (UNFPA), in her address on July 11 to the Foreign Press Association in London on the occasion of the release of the "1995 State of the World Population Report," stated that governments needed to invest in people, and that the estimated amount needed to reduce population numbers in developing countries was $17 billion for the year 2000. Two-thirds of the cost would be supplied by the developing countries. She said that coordinating population policies globally through such documents as the Programme of Action from the Cairo Conference would aid in slowing population growth. World population, currently 5.7 billion, is projected to reach 7.1-7.83 billion in 2015 and 7.9-11.9 billion in 2050. She also noted that certain conditions faced by women bear upon unsustainable population growth. The cycle of poverty continues in developing countries because very young mothers, who face higher risks in pregnancy and childbirth than those who delay childbearing until after the age of 20, are less likely to continue their education, more likely to have lower-paying jobs, and have a higher rate of separation and divorce. The isolation of women from widespread political participation and the marginalization of women's concerns from mainstream topics has resulted in ineffective family planning programs, including prevention of illness or impairment related to pregnancy or childbirth. Women, in most societies, cannot fully participate in economic and public life, have limited access to positions of influence and power, have narrower occupational choices and lower earnings than men, and must struggle to reconcile activities outside the home with their traditional roles. Sustainable development can only be achieved when social development expands opportunities for individuals (men and women), and their families, empowering them in the attainment of their social, economic, political, and cultural aspirations. PMID

  6. National R&D Spending to Exceed $50 Billion in 1979. Science Resources Studies Highlights, May 1, 1978.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    This report presents data compiled as part of a comprehensive program to measure and analyze the nation's resources expended for research and development (R&D). R&D spending in the United States is expected to reach $51 billion in 1979, 9% over the 1978 level. The R&D expenditures are expected to account for 2.2% of the gross national product…

  7. Theoretical transition probabilities for the OH Meinel system

    NASA Technical Reports Server (NTRS)

    Langhoff, S. R.; Werner, H.-J.; Rosmus, P.

    1986-01-01

    An electric dipole moment function (EDMF) for the X 2Pi ground state of OH, based on the complete active-space self-consistent field plus a multireference singles-plus-double excitation configuration-interaction procedure (using an extended Slater basis), is reported. Two theoretical EDMFs are considered: the MCSCF (7)-SCEP EDMF of Werner et al. (1983) and a previously unpublished EDMF based on the MCSCF multireference CI(SD) procedure using a large Slater basis. The theoretical treatment follows that of Mies (1974), except that the Hill and Van Vleck (1928) approximation to intermediate coupling is used. This approximation is shown to be accurate to better than 5 percent for the six principal branches of the OH Meinel system.

  8. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    SciTech Connect

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
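
    A minimal sketch of how the two probability spaces can be combined in practice is given below: subjectively uncertain parameters are sampled in an outer loop, stochastic futures in an inner loop, and a complementary cumulative distribution function (CCDF) of a consequence measure is built for each outer sample. The consequence function and distributions are hypothetical placeholders, not WIPP models.

```python
import numpy as np

rng = np.random.default_rng(0)

def consequence(theta, futures):
    """Hypothetical consequence measure (e.g., a normalized release) for an array of
    stochastic futures, given one vector of subjectively uncertain parameters."""
    return theta * futures

n_subjective, n_stochastic = 50, 1000
levels = np.logspace(-3, 1, 40)          # consequence levels at which the CCDF is evaluated

ccdfs = []
for _ in range(n_subjective):                                   # sample from the subjective space
    theta = rng.lognormal(mean=-1.0, sigma=0.5)                 # subjectively uncertain parameter
    futures = rng.exponential(scale=1.0, size=n_stochastic)     # sample from the stochastic space
    c = consequence(theta, futures)
    ccdfs.append((c[:, None] > levels[None, :]).mean(axis=0))   # P(consequence > level)

ccdfs = np.array(ccdfs)                   # one CCDF per subjective sample
print(np.percentile(ccdfs, [5, 50, 95], axis=0)[:, :3])   # spread of CCDFs from subjective uncertainty
```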

  9. Correlation between the clinical pretest probability score and the lung ventilation and perfusion scan probability

    PubMed Central

    Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin

    2013-01-01

    Purpose: The aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with the lung ventilation and perfusion (VQ) scan. Materials and Methods: A retrospective analysis included 510 patients who had a lung VQ scan between 2008 and 2010. Of the 510 studies, the numbers of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. Results: A total of 103 patients underwent a computed tomography pulmonary angiography (CTPA) scan, of whom 21 (20%) had a positive scan, 81 (79%) had a negative scan, and one (1%) had an equivocal result. The rates of PE in the normal, low-probability, and high-probability scan categories were 2 (9.5%), 10 (47.5%), and 9 (43%), respectively. There was a very low correlation (Pearson correlation coefficient r = 0.20) between the clinical PTP score and the lung VQ scan. The area under the curve (AUC) of the clinical PTP score was 52% when compared with the CTPA results. However, the accuracy of the lung VQ scan was better (AUC = 74%) when compared with the CTPA scan. Conclusion: The clinical PTP score is unreliable on its own; however, it may still aid in the interpretation of the lung VQ scan. The accuracy of the lung VQ scan was better in the assessment of underlying pulmonary embolism (PE). PMID:24379532

  10. Evidence for Oxygenic Photosynthesis Half a Billion Years Before the Great Oxidation Event

    NASA Astrophysics Data System (ADS)

    Planavsky, Noah; Reinhard, Chris; Asael, Dan; Lyons, Tim; Hofmann, Axel; Rouxel, Olivier

    2014-05-01

    Despite detailed investigations over the past 50 years, there is still intense debate about when oxygenic photosynthesis evolved. Current estimates span over a billion years of Earth history, ranging from prior to 3.7 Ga, the age of the oldest sedimentary rocks, to 2.4-2.3 Ga, coincident with the rise of atmospheric oxygen ("The Great Oxidation Event" or GOE). As such, a new, independent perspective is needed. We will provide such a perspective herein by using molybdenum (Mo) isotopes in a novel way to track the onset of manganese(II) oxidation and thus biological oxygen production. The oxidation of Mn(II) in modern marine settings requires free dissolved oxygen. Mn is relatively unique in its environmental specificity for oxygen as an electron acceptor among the redox-sensitive transition metals, many of which, like Fe, can be oxidized under anoxic conditions either through a microbial pathway and/or with alternative oxidants such as nitrate. There are large Mo isotope fractionations associated with the sorption of Mo (as a polymolybdate complex) onto Mn-oxyhydroxides, with an approximately -2.7‰ fractionation in δ98Mo associated with Mo sorption onto Mn-oxyhydroxides (e.g., birnessite, vernadite). In contrast, sorption of Mo onto the Fe-oxyhydroxide (e.g., ferrihydrite) results in a fractionation of only -1.1‰ or less. Because of this difference in Mo isotope fractionation, Mo isotope values should become lighter with increasing Mn content, if Mn oxidation occurred during deposition and is an important vector of Mo transfer to the sediment. We find a strong positive correlation between δ98Mo values and Fe/Mn ratios in iron formations deposited before and after the Great Oxidation Event. Most strikingly, Mo isotope data and Fe/Mn ratios correlate over a 2.5‰ range in δ98Mo values in the Mn-rich (0.1 - 6%) iron formation of the 2.95 Ga Sinqeni Formation, South Africa. The large isotopic shifts occur over a relatively thin (3 meter thick) horizon, reflecting

  11. Atomic Transition Probabilities for Neutral Cerium

    NASA Astrophysics Data System (ADS)

    Lawler, J. E.; den Hartog, E. A.; Wood, M. P.; Nitz, D. E.; Chisholm, J.; Sobeck, J.

    2009-10-01

    The spectra of neutral cerium (Ce I) and singly ionized cerium (Ce II) are more complex than spectra of other rare earth species. The resulting high density of lines in the visible makes Ce ideal for use in metal halide (MH) High Intensity Discharge (HID) lamps. Inclusion of cerium-iodide in a lamp dose can improve both the Color Rendering Index and luminous efficacy of a MH-HID lamp. Basic spectroscopic data including absolute atomic transition probabilities for Ce I and Ce II are needed for diagnosing and modeling these MH-HID lamps. Recent work on Ce II [1] is now being augmented with similar work on Ce I. Radiative lifetimes from laser induced fluorescence measurements [2] on neutral Ce are being combined with emission branching fractions from spectra recorded using a Fourier transform spectrometer. A total of 14 high resolution spectra are being analyzed to determine branching fractions for 2000 to 3000 lines from 153 upper levels in neutral Ce. Representative data samples and progress to date will be presented. [1] J. E. Lawler, C. Sneden, J. J. Cowan, I. I. Ivans, and E. A. Den Hartog, Astrophys. J. Suppl. Ser. 182, 51-79 (2009). [2] E. A. Den Hartog, K. P. Buettner, and J. E. Lawler, J. Phys. B: Atomic, Molecular & Optical Physics 42, 085006 (7pp) (2009).

  12. Joint probability distributions and fluctuation theorems

    NASA Astrophysics Data System (ADS)

    García-García, Reinaldo; Lecomte, Vivien; Kolton, Alejandro B.; Domínguez, Daniel

    2012-02-01

    We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, for unveiling symmetries of correlation functions appearing in fluctuation-dissipation relations recently generalized to non-equilibrium steady states, and also for mapping averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we endow the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators.

  13. Do aftershock probabilities decay with time?

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
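
    The arithmetic in this thought experiment can be made explicit: with a rate K/t and a window whose half-width is a fraction f of t, the expected count is K ln((1+f)/(1-f)), independent of t. The short sketch below (with an arbitrary productivity constant K) shows the identical expected count for a day, a week, a month, a year, and a century.

```python
import math

K = 100.0   # Omori productivity constant (arbitrary illustrative value)

def expected_count(t, frac=0.10):
    """Expected aftershocks in the window [t*(1-frac), t*(1+frac)] for a rate K/t
    (p = 1, c neglected): the integral K*ln((1+frac)/(1-frac)) does not depend on t."""
    return K * math.log((1 + frac) / (1 - frac))

for t in (1, 7, 30, 365, 36500):   # a day, a week, a month, a year, a century (in days)
    print(f"t = {t:6d} days: {expected_count(t):.2f} expected aftershocks")
```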

  14. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
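
    In the spirit of the Gaussian baseline mentioned above, the sketch below fits a joint Gaussian to co-registered pixel pairs and scores each pair by how surprising it is jointly relative to its marginals (up to an additive constant). It is an illustrative toy detector on synthetic single-band images, not one of the parametric-family algorithms of the paper.

```python
import numpy as np

def gaussian_acd_scores(x, y):
    """Toy anomalous change detector for two co-registered single-band images
    (flattened to 1-D arrays): fit a joint Gaussian to (x, y) pixel pairs and score
    each pair by its joint Mahalanobis distance minus its marginal Mahalanobis
    distances, so pairs that are individually ordinary but jointly unusual score high."""
    data = np.column_stack([x, y]).astype(float)
    d = data - data.mean(axis=0)
    cov = np.cov(data, rowvar=False)
    inv = np.linalg.inv(cov)
    joint = np.einsum('ij,jk,ik->i', d, inv, d)             # joint Mahalanobis^2
    marginal = d[:, 0] ** 2 / cov[0, 0] + d[:, 1] ** 2 / cov[1, 1]
    return joint - marginal                                  # large => anomalous change

rng = np.random.default_rng(1)
x = rng.normal(size=10000)
y = 0.9 * x + 0.3 * rng.normal(size=10000)   # pervasive difference: images are correlated
y[:5] += 4.0                                 # implant a few genuinely anomalous changes
scores = gaussian_acd_scores(x, y)
print(np.argsort(scores)[-5:])               # the implanted pixels rank highest
```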

  15. Lectures on probability and statistics. Revision

    SciTech Connect

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.
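
    The forward and inverse directions can be illustrated with the dice example in one short sketch (not taken from the lectures themselves): a forward probability computed from a fair-die model, and a likelihood comparison of a fair die versus a hypothetically loaded one given observed rolls.

```python
from math import comb, log

# Forward problem: probability of exactly two sixes in five rolls of a fair die.
p_two_sixes = comb(5, 2) * (1 / 6) ** 2 * (5 / 6) ** 3
print(f"P(two sixes in five rolls) = {p_two_sixes:.4f}")

# Inverse problem: which hypothesis better explains the observed rolls?
rolls = [6, 6, 3, 6, 6, 1, 6, 6, 2, 6]       # suspiciously many sixes

def log_likelihood(p_six):
    """Log-likelihood of the rolls if a six has probability p_six and the
    other five faces share the remainder equally."""
    return sum(log(p_six) if r == 6 else log((1 - p_six) / 5) for r in rolls)

print("log L(fair die)   =", round(log_likelihood(1 / 6), 2))
print("log L(loaded die) =", round(log_likelihood(1 / 2), 2))   # hypothetical die loaded toward six
```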

  16. Enhanced awakening probability of repetitive impulse sounds.

    PubMed

    Vos, Joos; Houben, Mark M J

    2013-09-01

    In the present study relations between the level of impulse sounds and the observed proportion of behaviorally confirmed awakening reactions were determined. The sounds (shooting sounds, bangs produced by door slamming or by container transshipment, aircraft landings) were presented by means of loudspeakers in the bedrooms of 50 volunteers. The fragments for the impulse sounds consisted of single or multiple events. The sounds were presented during a 6-h period that started 75 min after the subjects wanted to sleep. In order to take account of habituation, each subject participated during 18 nights. At equal indoor A-weighted sound exposure levels, the proportion of awakening for the single impulse sounds was equal to that for the aircraft sounds. The proportion of awakening induced by the multiple impulse sounds, however, was significantly higher. For obtaining the same rate of awakening, the sound level of each of the successive impulses in a fragment had to be about 15-25 dB lower than the level of one single impulse. This level difference was largely independent of the degree of habituation. Various explanations for the enhanced awakening probability are discussed. PMID:23967934

  17. Essays on probability elicitation scoring rules

    NASA Astrophysics Data System (ADS)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that, besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and also to consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
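
    The contrast between the classical and proposed setups can be sketched as follows, with made-up distributions: a quadratic (Brier-type) score of f against the zero-one indicator of an observed value, versus a KL-divergence adherence of f to a generating distribution g when Θ itself is random. This is a minimal illustration, not the formulas derived in the essays.

```python
import numpy as np

def quadratic_score(f, d):
    """Classical quadratic (Brier-type) score of an elicited pmf f against the
    zero-one indicator d of the realized value; lower is better."""
    return np.sum((f - d) ** 2)

def kl_adherence(f, g, eps=1e-12):
    """Entropy-like adherence of f to g when Theta itself is random: KL(g || f),
    which is zero exactly when the elicited pmf matches the generating pmf."""
    return np.sum(g * np.log((g + eps) / (f + eps)))

f = np.array([0.10, 0.40, 0.40, 0.10])    # expert's elicited distribution over four outcomes
g = np.array([0.05, 0.45, 0.35, 0.15])    # distribution actually generating Theta

d = np.zeros(4)
d[2] = 1.0                                # a single realized outcome, theta* = outcome 2
print("quadratic score against the indicator:", quadratic_score(f, d))
print("KL adherence of f to g:               ", round(kl_adherence(f, g), 4))
```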

  18. Probability of rupture of multiple fault segments

    USGS Publications Warehouse

    Andrews, D.J.; Schwerer, E.

    2000-01-01

    Fault segments identified from geologic and historic evidence have sometimes been adopted as features limiting the likely extents of earthquake ruptures. There is no doubt that individual segments can sometimes join together to produce larger earthquakes. This work is a trial of an objective method to determine the probability of multisegment ruptures. The frequency of occurrence of events on all conjectured combinations of adjacent segments in northern California is found by fitting both to geologic slip rates and to an assumed distribution of event sizes for the region as a whole. Uncertainty in the shape of the distribution near the maximum magnitude has a large effect on the solution. Frequencies of individual events cannot be determined, but it is possible to find a set of frequencies to fit a model closely. A robust conclusion for the San Francisco Bay region is that large multisegment events occur on the San Andreas and San Gregorio faults, but single-segment events predominate on the extended Hayward and Calaveras strands of segments.

  19. Probability judgments under ambiguity and conflict

    PubMed Central

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081

  20. Levetiracetam: Probably Associated Diurnal Frequent Urination.

    PubMed

    Ju, Jun; Zou, Li-Ping; Shi, Xiu-Yu; Hu, Lin-Yan; Pang, Ling-Yu

    2016-01-01

    Diurnal frequent urination is a common condition in elementary school children who are especially at risk for associated somatic and behavioral problems. Levetiracetam (LEV) is a broad-spectrum antiepileptic drug that has been used in both partial and generalized seizures; its less common adverse effects include psychiatric and behavioral problems. Diurnal frequent urination is not a well-known adverse effect of LEV. Here, we report 2 pediatric cases with epilepsy that developed diurnal frequent urination after LEV administration. Case 1 was a 6-year-old male patient who presented with urinary frequency and urgency in the daytime from the third day after LEV was given as adjunctive therapy. Symptoms worsened as the dosage of LEV was raised. Laboratory tests and auxiliary examinations did not find evidence of organic disease. Diurnal frequent urination due to LEV was suspected, and the drug was discontinued. As expected, his frequency of urination returned to normal levels. Another patient, a 13-year-old female, developed similar clinical manifestations after oral LEV monotherapy, and the symptoms were aggravated under stress. Since the most common causes of frequent micturition had been ruled out, the patient was considered to have LEV-associated psychogenic frequent urination. The dosage of LEV was reduced to one-third, and the frequency of urination was reduced by 60%. Both patients had a Naranjo score of 6, which indicated that LEV was a "probable" cause of diurnal frequent urination. Although a definite causal link between LEV and diurnal urinary frequency in the 2 cases remains to be established, we argue that diurnal frequent urination associated with LEV deserves clinicians' attention. PMID:26938751

  1. Movement disorders of probable infectious origin

    PubMed Central

    Jhunjhunwala, Ketan; Netravathi, M.; Pal, Pramod Kumar

    2014-01-01

    Background: Movement disorders (MDs) associated with infections remain an important debilitating problem in Asian countries. Objectives: The objective of this study was to report the clinical and imaging profile of a large cohort of patients with MDs probably associated with infection. Materials and Methods: This was a chart review of 35 patients (F:M-15:20) presenting with MD in the Neurology services of the National Institute of Mental Health and Neurosciences, India. The demographic profile, type of infection, time from infection to MD, phenomenology of MD and magnetic resonance imaging (MRI) findings were reviewed. Results: The mean age at presentation was 22.6 ± 13.3 years (5-60), the age of onset of MD was 15.7 ± 15 years, and the duration of symptoms was 6.9 ± 8.1 years (42 days to 32 years). The mean latency of onset of MD after the infection was 5.9 ± 4.2 weeks. The phenomenology of the MDs was: (1) pure dystonia-28.6%, (2) dystonia with choreoathetosis-22.9%, (3) Parkinsonism-14.6%, (4) pure tremor, hemiballismus, myoclonus and chorea-2.9% each, and (5) mixed MD-22.9%. Most often the MD was generalized (60%), followed by right upper limb (31.4%) and left upper limb (8.6%). A viral encephalitic type of neuroinfection was the most common infection (85.7%) associated with MD. Abnormalities on brain MRI, seen in 79.2%, included signal changes in (1) thalamus-52.0%, (2) putamen and subcortical white matter-16% each, (3) pons-12%, (4) striatopallidum, striatum and grey matter-8% each, and (5) caudate, cerebellum, lentiform nucleus, midbrain and subthalamic nucleus-4.0% each. Conclusions: MDs associated with infection were most often post-encephalitic. Dystonia was the most common MD, and thalamus was the most common anatomical site involved. PMID:25221398

  2. Datamining approaches for modeling tumor control probability

    PubMed Central

    Naqa, Issam El; Deasy, Joseph O.; Mu, Yi; Huang, Ellen; Hope, Andrew J.; Lindsay, Patricia E.; Apte, Aditya; Alaly, James; Bradley, Jeffrey D.

    2016-01-01

    Background Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Material and methods Several datamining approaches are discussed that include dose-volume metrics, equivalent uniform dose, a mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Results Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs = 0.68 on leave-one-out testing compared to logistic regression (rs = 0.4), Poisson-based TCP (rs = 0.33), and cell kill equivalent uniform dose model (rs = 0.17). Conclusions The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
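
    A minimal sketch of the evaluation loop described here (leave-one-out testing scored by Spearman rank correlation, comparing a logistic model with an SVM kernel method) is given below on synthetic data; the two predictors merely stand in for variables such as GTV volume and V75, and nothing here reproduces the clinical dataset or the exact models of the paper.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic stand-in for the clinical data: two predictors (think "GTV volume" and
# "V75") and a binary tumor-control outcome generated from a logistic relationship.
n = 56
X = rng.normal(size=(n, 2))
y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1]))))

def loo_predictions(model, use_decision=False):
    """Leave-one-out predictions: refit the model with each patient held out in turn."""
    preds = np.empty(n)
    for train, test in LeaveOneOut().split(X):
        m = model.fit(X[train], y[train])
        preds[test] = (m.decision_function(X[test]) if use_decision
                       else m.predict_proba(X[test])[:, 1])
    return preds

logit_pred = loo_predictions(LogisticRegression())
svm_pred = loo_predictions(SVC(kernel="rbf", C=1.0), use_decision=True)

rs_logit, _ = spearmanr(logit_pred, y)
rs_svm, _ = spearmanr(svm_pred, y)
print(f"logistic regression rs = {rs_logit:.2f}, SVM (RBF kernel) rs = {rs_svm:.2f}")
```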

  3. Projecting Climate Change Impacts on Wildfire Probabilities

    NASA Astrophysics Data System (ADS)

    Westerling, A. L.; Bryant, B. P.; Preisler, H.

    2008-12-01

    We present preliminary results of the 2008 Climate Change Impact Assessment for wildfire in California, part of the second biennial science report to the California Climate Action Team organized via the California Climate Change Center by the California Energy Commission's Public Interest Energy Research Program pursuant to Executive Order S-03-05 of Governor Schwarzenegger. In order to support decision making by the State pertaining to mitigation of and adaptation to climate change and its impacts, we model wildfire occurrence monthly from 1950 to 2100 under a range of climate scenarios from the Intergovernmental Panel on Climate Change. We use six climate change models (GFDL CM2.1, NCAR PCM1, CNRM CM3, MPI ECHAM5, MIROC3.2 med, NCAR CCSM3) under two emissions scenarios--A2 (CO2 850 ppm max atmospheric concentration) and B1 (CO2 550 ppm max concentration). Climate model output has been downscaled to a 1/8 degree (~12 km) grid using two alternative methods: a Bias Correction and Spatial Downscaling (BCSD) and a Constructed Analogues (CA) downscaling. Hydrologic variables have been simulated from temperature, precipitation, wind and radiation forcing data using the Variable Infiltration Capacity (VIC) Macroscale Hydrologic Model. We model wildfire as a function of temperature, moisture deficit, and land surface characteristics using nonlinear logistic regression techniques. Previous work on wildfire climatology and seasonal forecasting has demonstrated that these variables account for much of the inter-annual and seasonal variation in wildfire. The results of this study are monthly gridded probabilities of wildfire occurrence by fire size class, and estimates of the number of structures potentially affected by fires. In this presentation we will explore the range of modeled outcomes for wildfire in California, considering the effects of emissions scenarios, climate model sensitivities, downscaling methods, hydrologic simulations, statistical model specifications for

  4. Failure-probability driven dose painting

    SciTech Connect

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena; Berthelsen, Anne K.; Bentzen, Søren M.

    2013-08-15

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: For patients treated at our center, five tumor subvolumes are delineated, from the center of the tumor (PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.

  5. On lacunary statistical convergence of order α in probability

    NASA Astrophysics Data System (ADS)

    Işık, Mahmut; Et, Kübra Elif

    2015-09-01

    In this study, we examine the concepts of lacunary statistical convergence of order α in probability and Nθ—convergence of order α in probability. We give some relations connected to these concepts.

  6. Gusev Crater Paleolake: Two-Billion Years of Martian Geologic, (and Biologic?) History

    NASA Technical Reports Server (NTRS)

    Cabrol, N. A.; Grin, E. A.; Landheim, R.; Greeley, R.; Kuzmin, R.; McKay, C. P.

    1998-01-01

    Ancient Martian lakes are sites where the climatological, chemical, and possibly biological history of the planet has been recorded. Their potential to preserve this global information in their sedimentary deposits, a potential shared only with the polar layered deposits, designates them as the most promising targets for the ongoing exploration of Mars in terms of science return and global knowledge about Mars evolution. Many of the science priority objectives of the Surveyor Program can be met by exploring ancient Martian lake beds. Among Martian paleolakes, lakes in impact craters are probably the most favorable sites to explore. Though highly destructive events when they occur, impacts may over time have provided a significant energy source for life by generating heat and, in contact with water and/or ice, deep hydrothermal systems, which are considered favorable environments for life. In addition, impact crater lakes are changing environments, from thermally driven systems at the very first stage of their formation, to cold ice-protected potential oases in the more recent Martian geological times. Thus, they are plausible sites to study the progression of diverse microbiologic communities.

  7. What is preexisting strength? Predicting free association probabilities, similarity ratings, and cued recall probabilities.

    PubMed

    Nelson, Douglas L; Dyrdal, Gunvor M; Goodmon, Leilani B

    2005-08-01

    Measuring lexical knowledge poses a challenge to the study of the influence of preexisting knowledge on the retrieval of new memories. Many tasks focus on word pairs, but words are embedded in associative networks, so how should preexisting pair strength be measured? It has been measured by free association, similarity ratings, and co-occurrence statistics. Researchers interpret free association response probabilities as unbiased estimates of forward cue-to-target strength. In Study 1, analyses of large free association and extralist cued recall databases indicate that this interpretation is incorrect. Competitor and backward strengths bias free association probabilities, and as with other recall tasks, preexisting strength is described by a ratio rule. In Study 2, associative similarity ratings are predicted by forward and backward, but not by competitor, strength. Preexisting strength is not a unitary construct, because its measurement varies with method. Furthermore, free association probabilities predict extralist cued recall better than do ratings and co-occurrence statistics. The measure that most closely matches the criterion task may provide the best estimate of the identity of preexisting strength. PMID:16447386

  8. Probability in Theories With Complex Dynamics and Hardy's Fifth Axiom

    NASA Astrophysics Data System (ADS)

    Burić, Nikola

    2010-08-01

    L. Hardy has formulated an axiomatization program of quantum mechanics and generalized probability theories that has been quite influential. In this paper, properties of typical Hamiltonian dynamical systems are used to argue that there are applications of probability in physical theories of systems with dynamical complexity that require continuous spaces of pure states. Hardy’s axiomatization program does not deal with such theories. In particular Hardy’s fifth axiom does not differentiate between such applications of classical probability and quantum probability.

  9. Asymptotic behavior of the supremum tail probability for anomalous diffusions

    NASA Astrophysics Data System (ADS)

    Michna, Zbigniew

    2008-01-01

    In this paper we investigate asymptotic behavior of the tail probability for subordinated self-similar processes with regularly varying tail probability. We show that the tail probability of the one-dimensional distributions and the supremum tail probability are regularly varying with the pre-factor depending on the moments of the subordinating process. We can apply our result to the so-called anomalous diffusion.

  10. Pretest probability assessment derived from attribute matching

    PubMed Central

    Kline, Jeffrey A; Johnson, Charles L; Pollack, Charles V; Diercks, Deborah B; Hollander, Judd E; Newgard, Craig D; Garvey, J Lee

    2005-01-01

    Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for ability to produce PTP estimation <2% in a validation set of 8,120 patients evaluated for possible ACS and did not have ST segment elevation on ECG. 1,061 patients were excluded prior to validation analysis because of ST-segment elevation (713), missing data (77) or being lost to follow-up (271). Results In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for the attribute matching curve and 0.68 (95% CI 0.62 to 0.77) for LRE. The attribute matching system categorized 1,670 (24%, 95% CI = 23–25%) patients as having a PTP < 2.0%; 28 developed ACS (1.7% 95% CI = 1.1–2.4%). The LRE categorized 244 (4%, 95% CI = 3–4%) with PTP < 2.0%; four developed ACS (1.6%, 95% CI = 0.4–4.1%). Conclusion Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE. PMID:16095534
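
    A toy version of attribute matching is sketched below: the pretest probability for a new patient is simply the observed ACS rate among reference patients whose eight attributes match exactly. The attribute names and the tiny reference table are hypothetical; a real system would query a database of many thousands of records.

```python
import pandas as pd

ATTRIBUTES = ["age_band", "sex", "chest_pain_typical", "diaphoresis",
              "prior_cad", "ecg_nonspecific", "hypertension", "diabetes"]

# Hypothetical reference database: each row is a previously evaluated ED patient.
reference = pd.DataFrame({
    "age_band":           ["40s", "40s", "60s", "60s", "40s", "60s"],
    "sex":                ["F",   "F",   "M",   "M",   "F",   "M"],
    "chest_pain_typical": [0, 0, 1, 1, 0, 1],
    "diaphoresis":        [0, 0, 1, 0, 0, 1],
    "prior_cad":          [0, 0, 1, 1, 0, 1],
    "ecg_nonspecific":    [1, 1, 0, 1, 1, 0],
    "hypertension":       [0, 1, 1, 1, 0, 1],
    "diabetes":           [0, 0, 1, 0, 0, 1],
    "acs":                [0, 0, 1, 0, 0, 1],
})

def attribute_matching_ptp(patient):
    """Pretest probability = observed ACS rate among reference patients whose eight
    attributes exactly match the new patient's profile; None if there is no match."""
    mask = pd.Series(True, index=reference.index)
    for a in ATTRIBUTES:
        mask &= reference[a] == patient[a]
    matches = reference[mask]
    return None if matches.empty else float(matches["acs"].mean())

new_patient = dict(age_band="40s", sex="F", chest_pain_typical=0, diaphoresis=0,
                   prior_cad=0, ecg_nonspecific=1, hypertension=0, diabetes=0)
print(attribute_matching_ptp(new_patient))   # 0.0 in this toy table -> very low PTP
```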

  11. Factors influencing reporting and harvest probabilities in North American geese

    USGS Publications Warehouse

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F., Jr.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI = 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.

  12. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle failure... probabilistically valid. For a launch vehicle with fewer than two flights, the failure probability estimate must... circumstances. For a launch vehicle with two or more flights, launch vehicle failure probability......

  13. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle...

  14. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle...

  15. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
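
    A minimal Dirichlet-multinomial update in this spirit is sketched below; the prior pseudo-counts and observed roll tallies are invented for illustration and are not the values from the game's instruction manual or from the article.

```python
import numpy as np

outcomes = ["sider", "razorback", "trotter", "snouter", "leaning jowler"]

alpha_prior = np.array([60.0, 25.0, 10.0, 4.0, 1.0])   # illustrative prior pseudo-counts
counts = np.array([140, 55, 28, 6, 1])                  # made-up tallies of observed rolls

alpha_post = alpha_prior + counts                # Dirichlet posterior parameters
post_mean = alpha_post / alpha_post.sum()        # posterior mean of each scoring probability
post_sd = np.sqrt(post_mean * (1 - post_mean) / (alpha_post.sum() + 1))

for name, m, s in zip(outcomes, post_mean, post_sd):
    print(f"{name:15s} posterior mean {m:.3f} (sd {s:.3f})")
```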

  16. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  17. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  18. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  19. Probability Constructs in Preschool Education and How they Are Taught

    ERIC Educational Resources Information Center

    Antonopoulos, Konstantinos; Zacharos, Konstantinos

    2013-01-01

    The teaching of Probability Theory constitutes a new trend in mathematics education internationally. The purpose of this research project was to explore the degree to which preschoolers understand key concepts of probabilistic thinking, such as sample space, the probability of an event and probability comparisons. At the same time, we evaluated an…

  20. NREL Helps Clean Cities Displace Billions of Gallons of Petroleum, One Vehicle at a Time (Fact Sheet)

    SciTech Connect

    Not Available

    2010-10-01

    With more than 15 years and nearly 3 billion gallons of displaced petroleum under its belt, the Clean Cities program relies on the support and expertise of the National Renewable Energy Laboratory (NREL). An initiative of the U.S. Department of Energy (DOE), Clean Cities creates public-private partnerships with a common mission: to reduce petroleum consumption in the transportation sector. Since the inception of Clean Cities in 1993, NREL has played a central role in supporting the program, an effort that stems from the laboratory's strategy to put scientific innovation into action in the marketplace.

  1. Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-04-01

    Ensemble forecasting has for a long time been used as a method in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters that differ in space and time but still give a spatially and temporally consistent output. However, their method is computationally complex for our large number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows rather than the forecast skill of lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to find a total probability, the post-processed mean and uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while ensuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. This can in some cases have a slight negative
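
    For concreteness, the sketch below shows an EMOS-style post-processing step in the sense of Gneiting et al. (2005), cited above: a Gaussian predictive distribution whose mean is linear in the ensemble mean and whose variance is linear in the ensemble variance, fitted by minimizing the mean CRPS over a training period. The spatially penalized, multi-ensemble calibration described in the abstract is not attempted here, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast against observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

def fit_emos(ens, obs):
    """Fit mu = a + b*ens_mean and sigma^2 = c + d*ens_var by minimizing mean CRPS
    over the training period (c and d kept positive via a log parameterization)."""
    ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1)

    def objective(params):
        a, b, log_c, log_d = params
        mu = a + b * ens_mean
        sigma = np.sqrt(np.exp(log_c) + np.exp(log_d) * ens_var)
        return crps_gaussian(mu, sigma, obs).mean()

    return minimize(objective, x0=[0.0, 1.0, 0.0, 0.0], method="Nelder-Mead").x

# Toy training data: 200 forecast days, an 11-member runoff ensemble that is biased
# and underdispersive relative to the synthetic "observed" runoff (m3/s).
rng = np.random.default_rng(3)
obs = rng.gamma(shape=2.0, scale=50.0, size=200)
ens = 0.8 * obs[:, None] + rng.normal(0.0, 15.0, size=(200, 11))
a, b, log_c, log_d = fit_emos(ens, obs)
print("fitted EMOS parameters:", round(a, 2), round(b, 2), round(np.exp(log_c), 2), round(np.exp(log_d), 2))
```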

  2. Pattern formation, logistics, and maximum path probability

    NASA Astrophysics Data System (ADS)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  3. Probability Distribution for Flowing Interval Spacing

    SciTech Connect

    S. Kuzio

    2004-09-22

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the "Saturated Zone Flow and Transport Model Abstraction" (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be

  4. Debris-flow hazard map units from gridded probabilities

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.

    1997-01-01

    The common statistical practice of dividing a range of probabilities into equal probability intervals may not result in useful landslide-hazard map units for areas populated by equal-area cells, each of which has a unique probability. Most hazard map areas contain very large numbers of cells having low probability of failure, and as probability increases, the number of cells decreases in a non-linear fashion. Exploration of this distribution suggests that the spatial frequency of expected failures may be used to identify probability intervals that define map units. From a spatial database of gridded probabilities, map units that address the different objectives of land-use planners and emergency response officials can be defined.
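
    A hedged sketch of the alternative the authors suggest (the grid values and function name below are illustrative, not from the paper): instead of equal-width probability intervals, class breaks can be chosen so that each map unit accounts for a comparable number of expected failures, i.e., a comparable sum of per-cell probabilities.

        import numpy as np

        def expected_failure_breaks(cell_probs, n_units=4):
            """Split gridded cell probabilities into map units so that each unit
            accounts for roughly the same number of expected failures (sum of
            per-cell probabilities), rather than an equal width of probability."""
            p = np.sort(np.asarray(cell_probs))          # low to high probability
            cum = np.cumsum(p)                           # cumulative expected failures
            targets = cum[-1] * np.arange(1, n_units) / n_units
            # Probability values at which the cumulative expected failures cross
            # each target become the class breaks between map units.
            break_idx = np.searchsorted(cum, targets)
            return p[break_idx]

        # Illustrative grid: many low-probability cells, few high-probability cells.
        rng = np.random.default_rng(0)
        cells = rng.beta(0.5, 8.0, size=100_000)
        print("class breaks:", expected_failure_breaks(cells, n_units=4))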

  5. Tensile and fatigue data for irradiated and unirradiated AISI 310 stainless steel and titanium - 5 percent aluminum - 2.5 percent tin: Application of the method of universal slopes

    NASA Technical Reports Server (NTRS)

    Debogdan, C. E.

    1973-01-01

    Irradiated and unirradiated tensile and fatigue specimens of AISI 310 stainless steel and Ti-5Al-2.5Sn were tested in the range of 100 to 10,000 cycles to failure to determine the applicability of the method of universal slopes to irradiated materials. Tensile data for both materials showed a decrease in ductility and increase in ultimate tensile strength due to irradiation. Irradiation caused a maximum change in fatigue life of only 15 to 20 percent for both materials. The method of universal slopes predicted all the fatigue data for the 310 SS (irradiated as well as unirradiated) within a life factor of 2. For the titanium alloy, 95 percent of the data was predicted within a life factor of 3.
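
    The method of universal slopes referenced above predicts the strain-life fatigue curve from tensile properties alone. For reference, the commonly quoted form of Manson's relation is given below (standard literature constants; the equation is not restated in this abstract), where Delta-epsilon_t is the total strain range, sigma_u the ultimate tensile strength, E the elastic modulus, RA the reduction in area, and N_f the cycles to failure:

        \Delta\varepsilon_t = 3.5\,\frac{\sigma_u}{E}\,N_f^{-0.12} + D^{0.6}\,N_f^{-0.6},
        \qquad D = \ln\!\left(\frac{1}{1-\mathrm{RA}}\right)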

  6. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes with magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan—the western, eastern, and northeastern Taiwan regions—using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, high probability values usually appear for clustered events, such as events with foreshocks and events that occur within a short time of one another. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values appear around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the maps are not random forecasts, but they also suggest that lowering the magnitude threshold of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and the probability maps, we observe that the probability obtained from the quiescence model increases before a large earthquake, whereas the probability obtained from the activation model increases as the large earthquakes occur. These results lead us to conclude that the quiescence model has better forecast potential than the activation model.
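
    The abstract does not give the exact probability formula used by the quiescence and activation models. As an illustration of the general idea of estimating the rate of large events from the rate of small, complete-catalog events, the sketch below assumes a Gutenberg-Richter extrapolation and Poisson occurrence; the b-value, rates, and windows are placeholders, not values from the study.

        import numpy as np

        def conditional_probability(n_small, years, mc, m_target, b=1.0, dt_years=1.0):
            """Probability of at least one M >= m_target event in the next dt_years,
            assuming (i) the observed rate of M >= mc events, (ii) a Gutenberg-Richter
            extrapolation with slope b, and (iii) Poisson occurrence in time."""
            rate_small = n_small / years                          # events/yr with M >= mc
            rate_large = rate_small * 10.0 ** (-b * (m_target - mc))
            return 1.0 - np.exp(-rate_large * dt_years)

        # Placeholder numbers: 4000 M>=3 events in 40 years; forecast M>=6 in one year.
        print(conditional_probability(4000, 40.0, mc=3.0, m_target=6.0))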

  7. An age difference of two billion years between a metal-rich and a metal-poor globular cluster.

    PubMed

    Hansen, B M S; Kalirai, J S; Anderson, J; Dotter, A; Richer, H B; Rich, R M; Shara, M M; Fahlman, G G; Hurley, J R; King, I R; Reitzel, D; Stetson, P B

    2013-08-01

    Globular clusters trace the formation history of the spheroidal components of our Galaxy and other galaxies, which represent the bulk of star formation over the history of the Universe. The clusters exhibit a range of metallicities (abundances of elements heavier than helium), with metal-poor clusters dominating the stellar halo of the Galaxy, and higher-metallicity clusters found within the inner Galaxy, associated with the stellar bulge, or the thick disk. Age differences between these clusters can indicate the sequence in which the components of the Galaxy formed, and in particular which clusters were formed outside the Galaxy and were later engulfed along with their original host galaxies, and which were formed within it. Here we report an absolute age of 9.9 ± 0.7 billion years (at 95 per cent confidence) for the metal-rich globular cluster 47 Tucanae, determined by modelling the properties of the cluster's white-dwarf cooling sequence. This is about two billion years younger than has been inferred for the metal-poor cluster NGC 6397 from the same models, and provides quantitative evidence that metal-rich clusters like 47 Tucanae formed later than metal-poor halo clusters like NGC 6397. PMID:23903747

  8. An ultraluminous quasar with a twelve-billion-solar-mass black hole at redshift 6.30.

    PubMed

    Wu, Xue-Bing; Wang, Feige; Fan, Xiaohui; Yi, Weimin; Zuo, Wenwen; Bian, Fuyan; Jiang, Linhua; McGreer, Ian D; Wang, Ran; Yang, Jinyi; Yang, Qian; Thompson, David; Beletsky, Yuri

    2015-02-26

    So far, roughly 40 quasars with redshifts greater than z = 6 have been discovered. Each quasar contains a black hole with a mass of about one billion solar masses (10^9 M⊙). The existence of such black holes when the Universe was less than one billion years old presents substantial challenges to theories of the formation and growth of black holes and the coevolution of black holes and galaxies. Here we report the discovery of an ultraluminous quasar, SDSS J010013.02+280225.8, at redshift z = 6.30. It has an optical and near-infrared luminosity a few times greater than those of previously known z > 6 quasars. On the basis of the deep absorption trough on the blue side of the Lyman-α emission line in the spectrum, we estimate the proper size of the ionized proximity zone associated with the quasar to be about 26 million light years, larger than those found for other z > 6.1 quasars with lower luminosities. We estimate (on the basis of a near-infrared spectrum) that the black hole has a mass of ∼1.2 × 10^10 M⊙, which is consistent with the 1.3 × 10^10 M⊙ derived by assuming an Eddington-limited accretion rate. PMID:25719667
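
    The consistency check against an Eddington-limited accretion rate rests on the standard Eddington luminosity relation (textbook value for ionized hydrogen, not restated in the abstract):

        L_{\mathrm{Edd}} \simeq 1.26\times10^{38}\,\frac{M_{\mathrm{BH}}}{M_\odot}\ \mathrm{erg\,s^{-1}}
        \quad\Longrightarrow\quad
        M_{\mathrm{BH}} \simeq \frac{L_{\mathrm{bol}}}{1.26\times10^{38}\ \mathrm{erg\,s^{-1}}}\,M_\odot
        \quad \text{(if the quasar radiates at the Eddington limit).}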

  9. A spin-down clock for cool stars from observations of a 2.5-billion-year-old cluster.

    PubMed

    Meibom, Søren; Barnes, Sydney A; Platais, Imants; Gilliland, Ronald L; Latham, David W; Mathieu, Robert D

    2015-01-29

    The ages of the most common stars--low-mass (cool) stars like the Sun, and smaller--are difficult to derive because traditional dating methods use stellar properties that either change little as the stars age or are hard to measure. The rotation rates of all cool stars decrease substantially with time as the stars steadily lose their angular momenta. If properly calibrated, rotation therefore can act as a reliable determinant of their ages based on the method of gyrochronology. To calibrate gyrochronology, the relationship between rotation period and age must be determined for cool stars of different masses, which is best accomplished with rotation period measurements for stars in clusters with well-known ages. Hitherto, such measurements have been possible only in clusters with ages of less than about one billion years, and gyrochronology ages for older stars have been inferred from model predictions. Here we report rotation period measurements for 30 cool stars in the 2.5-billion-year-old cluster NGC 6819. The periods reveal a well-defined relationship between rotation period and stellar mass at the cluster age, suggesting that ages with a precision of order 10 per cent can be derived for large numbers of cool Galactic field stars. PMID:25539085
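
    Gyrochronology ages follow from inverting an empirical rotation-age relation. A simplified, mass-independent Skumanich-type scaling (an illustration of the idea only; the calibration in this work also depends on stellar mass or colour) is:

        P_{\mathrm{rot}}(t) \propto t^{1/2}
        \quad\Longrightarrow\quad
        t \approx t_{\mathrm{ref}}\left(\frac{P_{\mathrm{rot}}}{P_{\mathrm{ref}}}\right)^{2},

    where P_ref is the rotation period of a star of the same mass in a cluster of known age t_ref, such as NGC 6819 here.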

  10. A spin-down clock for cool stars from observations of a 2.5-billion-year-old cluster

    NASA Astrophysics Data System (ADS)

    Meibom, Søren; Barnes, Sydney A.; Platais, Imants; Gilliland, Ronald L.; Latham, David W.; Mathieu, Robert D.

    2015-01-01

    The ages of the most common stars--low-mass (cool) stars like the Sun, and smaller--are difficult to derive because traditional dating methods use stellar properties that either change little as the stars age or are hard to measure. The rotation rates of all cool stars decrease substantially with time as the stars steadily lose their angular momenta. If properly calibrated, rotation therefore can act as a reliable determinant of their ages based on the method of gyrochronology. To calibrate gyrochronology, the relationship between rotation period and age must be determined for cool stars of different masses, which is best accomplished with rotation period measurements for stars in clusters with well-known ages. Hitherto, such measurements have been possible only in clusters with ages of less than about one billion years, and gyrochronology ages for older stars have been inferred from model predictions. Here we report rotation period measurements for 30 cool stars in the 2.5-billion-year-old cluster NGC 6819. The periods reveal a well-defined relationship between rotation period and stellar mass at the cluster age, suggesting that ages with a precision of order 10 per cent can be derived for large numbers of cool Galactic field stars.

  11. White Light Demonstration of One Hundred Parts per Billion Irradiance Suppression in Air by New Starshade Occulters

    NASA Technical Reports Server (NTRS)

    Levinton, Douglas B.; Cash, Webster C.; Gleason, Brian; Kaiser, Michael J.; Levine, Sara A.; Lo, Amy S.; Schindhelm, Eric; Shipley, Ann F.

    2007-01-01

    A new mission concept for the direct imaging of exo-solar planets called the New Worlds Observer (NWO) has been proposed. The concept involves flying a meter-class space telescope in formation with a newly-conceived, specially-shaped, deployable star-occulting shade several meters across at a separation of some tens of thousands of kilometers. The telescope would make its observations from behind the starshade in a volume of high suppression of incident irradiance from the star around which planets orbit. The required level of irradiance suppression created by the starshade for an efficacious mission is of order 0.1 to 10 parts per billion in broadband light. This paper discusses the experimental setup developed to accurately measure the suppression ratio of irradiance produced at the null position behind candidate starshade forms to these levels. It also presents results of broadband measurements which demonstrated suppression levels of just under 100 parts per billion in air using the Sun as a light source. Analytical modeling of spatial irradiance distributions surrounding the null are presented and compared with photographs of irradiance captured in situ behind candidate starshades.
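
    As a unit check (our reading of the abstract's figures, not a statement from the paper): the suppression quoted is the ratio of residual irradiance at the null behind the shade to the unobstructed stellar irradiance, so

        S = \frac{I_{\mathrm{null}}}{I_{\mathrm{unobstructed}}},\qquad
        100\ \mathrm{ppb} = 1\times10^{-7},\qquad
        0.1\text{--}10\ \mathrm{ppb} = 10^{-10}\text{--}10^{-8}.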

  12. Compound-specific carbon and hydrogen isotope analysis of sub-parts per billion level waterborne petroleum hydrocarbons

    USGS Publications Warehouse

    Wang, Y.; Huang, Y.; Huckins, J.N.; Petty, J.D.

    2004-01-01

    Compound-specific carbon and hydrogen isotope analysis (CSCIA and CSHIA) has been increasingly used to study the source, transport, and bioremediation of organic contaminants such as petroleum hydrocarbons. In natural aquatic systems, dissolved contaminants represent the bioavailable fraction that generally is of the greatest toxicological significance. However, determining the isotopic ratios of waterborne hydrophobic contaminants in natural waters is very challenging because of their extremely low concentrations (often at sub-parts per billion levels, or even lower). To acquire sufficient quantities of polycyclic aromatic hydrocarbons at a 10 ng/L concentration for CSHIA, more than 1000 L of water must be extracted. Conventional liquid/liquid or solid-phase extraction is not suitable for such large volume extractions. We have developed a new approach that is capable of efficiently sampling sub-parts per billion level waterborne petroleum hydrocarbons for CSIA. We use semipermeable membrane devices (SPMDs) to accumulate hydrophobic contaminants from polluted waters and then recover the compounds in the laboratory for CSIA. In this study, we demonstrate, under a variety of experimental conditions (different concentrations, temperatures, and turbulence levels), that SPMD-associated processes do not induce C and H isotopic fractionations. The applicability of SPMD-CSIA technology to natural systems is further demonstrated by determining the δ13C and δD values of petroleum hydrocarbons present in the Pawtuxet River, RI. Our results show that the combined SPMD-CSIA is an effective tool to investigate the source and fate of hydrophobic contaminants in aquatic environments.
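
    The δ13C and δD values reported are standard delta notation, expressed in per mil (‰) relative to international reference standards (VPDB for carbon, VSMOW for hydrogen):

        \delta^{13}\mathrm{C} = \left[\frac{({}^{13}\mathrm{C}/{}^{12}\mathrm{C})_{\mathrm{sample}}}{({}^{13}\mathrm{C}/{}^{12}\mathrm{C})_{\mathrm{standard}}} - 1\right]\times 1000,
        \qquad
        \delta\mathrm{D} = \left[\frac{(\mathrm{D}/\mathrm{H})_{\mathrm{sample}}}{(\mathrm{D}/\mathrm{H})_{\mathrm{standard}}} - 1\right]\times 1000.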

  13. Young Star Probably Ejected From Triple System

    NASA Astrophysics Data System (ADS)

    2003-01-01

    Astronomers analyzing nearly 20 years of data from the National Science Foundation's Very Large Array radio telescope have discovered that a small star in a multiple-star system in the constellation Taurus probably has been ejected from the system after a close encounter with one of the system's more-massive components, presumed to be a compact double star. This is the first time any such event has been observed. "Our analysis shows a drastic change in the orbit of this young star after it made a close approach to another object in the system," said Luis Rodriguez of the Institute of Astronomy of the National Autonomous University of Mexico (UNAM). "The young star was accelerated to a large velocity by the close approach, and certainly now is in a very different, more remote orbit, and may even completely escape its companions," said Laurent Loinard, leader of the research team that also included Monica Rodriguez in addition to Luis Rodriguez. The UNAM astronomers presented their findings at the American Astronomical Society's meeting in Seattle, WA. The discovery of this chaotic event will be important for advancing our understanding of classical dynamic astronomy and of how stars evolve, including possibly providing an explanation for the production of the mysterious "brown dwarfs," the astronomers said. The scientists analyzed VLA observations of T Tauri, a multiple system of young stars some 450 light-years from Earth. The observations were made from 1983 to 2001. The T Tauri system includes a "Northern" star, the famous star that gives its name to the class of young visible stars, and a "Southern" system of stars, all orbiting each other. The VLA data were used to track the orbit of the smaller Southern star around the larger Southern object, presumed to be a pair of stars orbiting each other closely. The astronomers' plot of the smaller star's orbit shows that it followed an apparently elliptical orbit around its twin companions

  14. A physical-space approach for the probability hypothesis density and cardinalized probability hypothesis density filters

    NASA Astrophysics Data System (ADS)

    Erdinc, Ozgur; Willett, Peter; Bar-Shalom, Yaakov

    2006-05-01

    The probability hypothesis density (PHD) filter, an automatically track-managed multi-target tracker, is attracting increasing but cautious attention. Its derivation is elegant and mathematical, and thus of course many engineers fear it; perhaps that is currently limiting the number of researchers working on the subject. In this paper, we explore a physical-space approach - a bin model - which leads us to arrive at the same filter equations as the PHD. Unlike the original derivation of the PHD filter, the concepts used are the familiar ones of conditional probability. The original PHD suffers from a "target-death" problem in which even a single missed detection can lead to the apparent disappearance of a target. To obviate this, PHD originator Mahler has recently developed a new "cardinalized" version of the PHD (CPHD). We are able to extend our physical-space derivation to the CPHD case as well. We stress that the original derivations are mathematically correct and need no embellishment from us; our contribution here is to offer an alternative derivation, one that we find appealing.
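
    A minimal discretized (binned) illustration of the standard PHD intensity update, built from the familiar conditional-probability ingredients the authors mention (detection probability, measurement likelihood, clutter intensity). The grid, models, and numbers below are placeholders of our own, not the authors' derivation.

        import numpy as np

        def phd_update(prior, grid, measurements, p_d=0.9, sigma=1.0, clutter=0.05):
            """Binned PHD intensity update on a 1-D grid.

            prior        : expected number of targets per bin
            measurements : list of scalar measurements
            clutter      : expected clutter intensity per unit measurement space
            """
            def likelihood(z):
                # Gaussian measurement likelihood evaluated at every bin centre.
                return np.exp(-0.5 * ((z - grid) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

            posterior = (1.0 - p_d) * prior                   # missed-detection term
            for z in measurements:
                g = likelihood(z)
                denom = clutter + np.sum(p_d * g * prior)     # normalizer for this measurement
                posterior += p_d * g * prior / denom
            return posterior

        grid = np.linspace(0.0, 100.0, 201)
        prior = np.full(grid.size, 2.0 / grid.size)           # ~2 expected targets, spread uniformly
        post = phd_update(prior, grid, measurements=[25.0, 70.0])
        print("expected number of targets after update:", post.sum())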

  15. ERP Correlates of Verbal and Numerical Probabilities in Risky Choices: A Two-Stage Probability Processing View

    PubMed Central

    Li, Shu; Du, Xue-Lei; Li, Qi; Xuan, Yan-Hua; Wang, Yun; Rao, Li-Lin

    2016-01-01

    Two kinds of probability expressions, verbal and numerical, have been used to characterize the uncertainty that people face. However, the question of whether verbal and numerical probabilities are cognitively processed in a similar manner remains unresolved. From a levels-of-processing perspective, verbal and numerical probabilities may be processed differently during early sensory processing but similarly in later semantic-associated operations. This event-related potential (ERP) study investigated the neural processing of verbal and numerical probabilities in risky choices. The results showed that verbal probability and numerical probability elicited different N1 amplitudes but that verbal and numerical probabilities elicited similar N2 and P3 waveforms in response to different levels of probability (high to low). These results were consistent with a levels-of-processing framework and suggest some internal consistency between the cognitive processing of verbal and numerical probabilities in risky choices. Our findings shed light on possible mechanisms underlying probability expression and may provide neural evidence to support the translation of verbal to numerical probabilities (or vice versa). PMID:26834612

  16. A resolution to express the sense of the Senate in support of reducing its budget by at least 5 percent.

    THOMAS, 112th Congress

    Sen. Wicker, Roger F. [R-MS

    2011-03-08

    03/16/2011 Resolution agreed to in Senate without amendment and with a preamble by Unanimous Consent. (text: CR S1768) Status: Passed Senate.

  17. Orbiter entry trajectory corridors: 32000 pound payload, 67.5 percent center of gravity. [glide path data compilation

    NASA Technical Reports Server (NTRS)

    Treybig, J. H.

    1975-01-01

    Thermal and equilibrium glide boundaries were used to analyze and/or design shuttle orbiter entry trajectories. Plots are presented of orbiter thermal and equilibrium glide boundaries in the drag/mass-relative velocity, dynamic pressure-relative velocity, and altitude-relative velocity planes for an orbiter having a 32,000 pound payload and a 67.5% center of gravity location. These boundaries were defined for control points 1 through 4 of the shuttle orbiter for 40 deg-30 deg and 38 deg-28 deg ramped angle of attack entry profiles and for 40 deg, 38 deg, 35 deg, 30 deg, 28 deg, and 25 deg constant angle of attack entry profiles, each at 20 deg, 15 deg, and 10 deg constant body flap settings.

  18. Forms of leg abnormality observed in male broilers fed on a diet containing 12.5 percent rapeseed meal.

    PubMed

    Timms, L M

    1983-09-01

    The incidence of leg abnormalities was studied in 216 male Ross I broilers, fed for 10 weeks on a diet containing 12.5 per cent extracted rapeseed. Regular serological examination showed that the birds remained free from Mycoplasma gallisepticum, Mycoplasma synoviae and avian reovirus throughout the period of investigation. Post mortem examination and radiographs were performed when birds were culled due to leg deformities or at the end of the experiment. Leg abnormalities were seen in 19.4 per cent of the birds which represents a very significant increase above that currently seen in commercial flocks. They consisted of a large range of skeletal deformities including valgus and varus deformities, dyschondroplasia, slipped gastrocnemius tendons, dislocated condyles, rotation and penetration of the distal tibiotarsus and fractured fibulas. Multiple forms of leg abnormality were often observed in individual birds and their association is briefly discussed. PMID:6635344

  19. Exploring non-signalling polytopes with negative probability

    NASA Astrophysics Data System (ADS)

    Oas, G.; Acacio de Barros, J.; Carvalhaes, C.

    2014-12-01

    Bipartite and tripartite EPR-Bell type systems are examined via joint quasi-probability distributions in which probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, is used to limit the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory.
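
    The minimized probability mass described here can be phrased, for a bipartite scenario with settings x, y and outcomes a, b, as a linear program over a joint quasi-distribution q(λ) on deterministic outcome assignments λ = (a_1, ..., a_m, b_1, ..., b_n). The notation below is ours, a sketch of the general form rather than the authors' exact formulation:

        \min_{q}\ \sum_{\lambda} |q(\lambda)|
        \quad \text{subject to} \quad
        \sum_{\lambda:\ a_x(\lambda)=a,\ b_y(\lambda)=b} q(\lambda) = p(a,b \mid x,y)
        \quad \forall\, a, b, x, y .

    Splitting q = q^+ - q^- with q^\pm \ge 0 turns the objective into a standard linear program, which is how such minimizations are typically computed in practice.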

  20. Path probability of stochastic motion: A functional approach

    NASA Astrophysics Data System (ADS)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of a functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock price in finance.
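
    As a concrete point of reference (a standard result for overdamped Langevin dynamics with dx = f(x) dt + \sqrt{2D} dW, not necessarily the exact functional derived in this paper), the path weight is often written with the Onsager-Machlup functional:

        P[x(\cdot)] \propto \exp\left\{-\int_{0}^{T}\left[\frac{\bigl(\dot{x}-f(x)\bigr)^{2}}{4D} + \frac{1}{2}\,\frac{\partial f}{\partial x}\right]dt\right\},

    using the midpoint (Stratonovich) discretization convention; the probability of staying inside a tube around a prescribed path then follows by integrating this weight over the paths within the band.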