Gibbs, Adrian J.; Wood, Jeffrey; Garcia-Arenal, Fernando; Ohshima, Kazusato; Armstrong, John S.
2015-01-01
A phylogeny has been calculated by maximum likelihood comparisons of the concatenated consensus protein sequences of 29 tobamoviruses shown to be non-recombinant. This phylogeny has statistically significant support throughout, including its basal branches. The viruses form eight lineages that are congruent with the taxonomy of the hosts from which each was first isolated and, with the exception of three of the twenty-nine species, all fall into three clusters that have either asterid or rosid or caryophyllid hosts (i.e. the major subdivisions of eudicotyledonous plants). A modified Mantel permutation test showed that the patristic distances of virus and host phylogenies are significantly correlated, especially when the three anomalously placed viruses are removed. When the internal branches of the virus phylogeny were collapsed the congruence decreased. The simplest explanation of this congruence of the virus and host phylogenies is that most tobamovirus lineages have co-diverged with their primary plant hosts for more than 110 million years, and only the brassica-infecting lineage originated from a major host switch from asterids to rosids. Their co-divergence seems to have been ‘fuzzy’ rather than ‘strict’, permitting viruses to switch hosts within major host clades. Our conclusions support those of a coalescence analysis of tobamovirus sequences that used proxy node dating, but not those of a similar analysis of nucleotide sequences from dated samples, which concluded that the tobamoviruses originated only 100 thousand years ago. PMID:27774289
30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...
30 CFR 57.22233 - Actions at 0.5 percent methane (I-C mines).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 0.5 percent methane (I-C mines). 57... MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22233 Actions at 0.5 percent methane (I-C mines). If methane reaches 0.5 percent in the mine atmosphere, ventilation...
Median CBO Salary Rises by 4.5 Percent, Annual Study Finds.
ERIC Educational Resources Information Center
Business Officer, 1997
1997-01-01
An annual national survey of college and university salaries found chief business officers' salaries rose 4.5 percent in 1996-97, less than the previous year. Salaries of women and minority CBOs continued to gain equity with that of men. Rates of increase varied by institution type. Salary gains for all administrative job types were less than in…
16 CFR 303.3 - Fibers present in amounts of less than 5 percent.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Fibers present in amounts of less than 5... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... Act, as amended, no fiber present in the amount of less than 5 percent of the total fiber weight...
16 CFR 303.3 - Fibers present in amounts of less than 5 percent.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Fibers present in amounts of less than 5... OF CONGRESS RULES AND REGULATIONS UNDER THE TEXTILE FIBER PRODUCTS IDENTIFICATION ACT § 303.3 Fibers... Act, as amended, no fiber present in the amount of less than 5 percent of the total fiber weight...
Evaluation of EA-934NA with 2.5 percent Cab-O-Sil
NASA Technical Reports Server (NTRS)
Caldwell, Gordon A.
1990-01-01
Currently, Hysol adhesive EA-934NA is used to bond the Field Joint Protection System on the Shuttle rocket motors at Kennedy Space Center. However, due to processing problems, an adhesive with a higher viscosity is needed to alleviate these difficulties. One possible solution is to add Cab-O-Sil to the current adhesive. The adhesive strength and bond strengths that can be obtained when 2.5 percent Cab-O-Sil is added to adhesive EA-934NA are examined and tested over a range of test temperatures from -20 to 300 F. Tensile adhesion button and lap shear specimens were bonded to D6AC steel and uniaxial tensile specimens (testing for strength, initial tangent modulus, elongation and Poisson's ratio) were prepared using Hysol adhesive EA-934NA with 2.5 percent Cab-O-Sil added. These specimens were tested at -20, 20, 75, 100, 125, 150, 200, 250, and 300 F, respectively. Additional tensile adhesion button specimens bonding Rust-Oleum primed and painted D6AC steel to itself and to cork using adhesive EA-934NA with 2.5 percent Cab-O-Sil added were tested at 20, 75, 125, 200, and 300 F, respectively. Results generally show decreasing strength values with increasing test temperatures. The bond strengths obtained using cork as a substrate were totally dependent on the cohesive strength of the cork.
Code of Federal Regulations, 2010 CFR
2010-10-01
... less than 5 percent in trust or restricted land? 30.183 Section 30.183 Public Lands: Interior Office of... may receive a renounced interest of less than 5 percent in trust or restricted land? You may renounce... less than 5 percent of the entire undivided ownership of a parcel of land only in favor of: (a)...
30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22232 Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines). If methane reaches...
30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Actions at 2.0 to 2.5 percent methane in...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches...
30 CFR 57.22237 - Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 2.0 to 2.5 percent methane in...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22237 Actions at 2.0 to 2.5 percent methane in bleeder systems (I-A and III mines). If methane reaches...
30 CFR 57.22232 - Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Actions at 0.5 percent methane (I-B, II-A, II-B...-UNDERGROUND METAL AND NONMETAL MINES Safety Standards for Methane in Metal and Nonmetal Mines Ventilation § 57.22232 Actions at 0.5 percent methane (I-B, II-A, II-B, IV, V-B, and VI mines). If methane reaches...
2003-11-01
Since its announcement in June 1997, the Million Solar Roofs Initiative has generated a major buzz in communities, states, and throughout the nation. With more than 300,000 installations, the buzz is getting louder. This brochure describes Million Solar Roofs activities and partnerships.
Rosselló-Móra, Ramon
2016-01-01
ABSTRACT An update on the census of species of Archaea and Bacteria published recently in mBio (P. D. Schloss, R. A. Girard, T. Martin, J. Edwards, and J. C. Thrash, mBio 7:e00201-16, 2016, http://dx.doi.org/10.1128/mBio.00201-16) showed again that, despite ever-increasing sequencing efforts, the PCR-based retrieval of 16S rRNA genes is approaching saturation. On average, 95% of the genes analyzed today are identical to those present in public databases, with rarefaction analysis indicating that about one-third of the bacterial and archaeal diversity has already been covered. Therefore, despite estimates of up to 10^12 microbial species, the option should be considered that the census of Archaea and Bacteria on planet Earth might yield only millions of species after all. PMID:27381294
NASA Technical Reports Server (NTRS)
2006-01-01
[figure removed for brevity, see original site] A Million Comet Pieces (poster version)
This infrared image from NASA's Spitzer Space Telescope shows the broken Comet 73P/Schwassman-Wachmann 3 skimming along a trail of debris left during its multiple trips around the sun. The flame-like objects are the comet's fragments and their tails, while the dusty comet trail is the line bridging the fragments.
Comet 73P/Schwassman-Wachmann 3 began to splinter apart in 1995 during one of its voyages around the sweltering sun. Since then, the comet has continued to disintegrate into dozens of fragments, at least 36 of which can be seen here. Astronomers believe the icy comet cracked due to the thermal stress from the sun.
The Spitzer image provides the best look yet at the trail of debris left in the comet's wake after its 1995 breakup. The observatory's infrared eyes were able to see the dusty comet bits and pieces, which are warmed by sunlight and glow at infrared wavelengths. This comet debris ranges in size from pebbles to large boulders. When Earth passes near this rocky trail every year, the comet rubble burns up in our atmosphere, lighting up the sky in meteor showers. In 2022, Earth is expected to cross close to the comet's trail, producing a noticeable meteor shower.
Astronomers are studying the Spitzer image for clues to the comet's composition and how it fell apart. Like NASA's Deep Impact experiment, in which a probe smashed into comet Tempel 1, the cracked Comet 73P/Schwassman-Wachmann 3 provides a perfect laboratory for studying the pristine interior of a comet.
This image was taken from May 4 to May 6 by Spitzer's multi-band imaging photometer, using its 24-micron wavelength channel.
$425 million for space station
NASA Astrophysics Data System (ADS)
Maggs, William Ward
The Space Station will be funded at only about half of the $767 million requested in the 1988 budget for the National Aeronautics and Space Administration (NASA), and overall the agency will receive $8.856 billion for the current fiscal year (FY) in the deficit-reduction package passed by Congress in late December. Despite an earlier complaint that reductions in the space station budget would kill the program and an apparent lack of support from the White House, NASA's official reaction was full of good cheer. NASA will be able to use the $425 million in two installments, $200 million now and $225 million in June. In October, NASA administrator James Fletcher stated in a letter to Senator Jake Garn (R-Utah) that if the space station received no more than $440 million, he would “recommend termination” of the program. But after the budget was approved, NASA said that the $425 million “reflected the strong commitment of the President and the Congress to proceed with the development of a space station.” A recent request to President Reagan from congressional proponents of the station for a letter of support for the multibillion dollar project was declined.
Mr Cameron's Three Million Apprenticeships
ERIC Educational Resources Information Center
Allen, Martin
2015-01-01
In the 2015 general election campaign David Cameron celebrated the success of apprenticeships during the Coalition and promised another 3 million. This article argues that the "reinvention" of apprenticeships has neither created real skills nor provided real alternatives for young people and that the UK schemes fall far short of those in…
NASA Astrophysics Data System (ADS)
Jaynes, E. T.; Bretthorst, G. Larry
2003-04-01
Foreword; Preface; Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'; Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory; References; Appendix A. Other approaches to probability theory; Appendix B. Mathematical formalities and style; Appendix C. Convolutions and cumulants.
Photographer: JPL P-21741 C Range: 2.6 million kilometers (1.6 million miles) This picture of Io,
NASA Technical Reports Server (NTRS)
1979-01-01
Photographer: JPL P-21741 C Range: 2.6 million kilometers (1.6 million miles) This picture of Io, taken by Voyager 1, shows the region of the Jovian moon which will be monitored for volcanic eruptions by Voyager 2 during the 'Io movie' sequence. The white and orange patches probably are deposits of sulphur compounds and other volcanic materials. The Voyager 2 pictures of this region will be much more detailed.
Photographer: JPL P-21741 BW Range: 2.6 million kilometers (1.6 million miles) This picture of Io,
NASA Technical Reports Server (NTRS)
1979-01-01
Photographer: JPL P-21741 BW Range: 2.6 million kilometers (1.6 million miles) This picture of Io, taken by Voyager 1, shows the region of the Jovian moon which will be monitored for volcanic eruptions by Voyager 2 during the 'Io movie' sequence. The white and orange patches probably are deposits of sulphur compounds and other volcanic materials. The Voyager 2 pictures of this region will be much more detailed.
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
Genotype Imputation with Millions of Reference Samples.
Browning, Brian L; Browning, Sharon R
2016-01-07
We present a genotype imputation method that scales to millions of reference samples. The imputation method, based on the Li and Stephens model and implemented in Beagle v.4.1, is parallelized and memory efficient, making it well suited to multi-core computer processors. It achieves fast, accurate, and memory-efficient genotype imputation by restricting the probability model to markers that are genotyped in the target samples and by performing linear interpolation to impute ungenotyped variants. We compare Beagle v.4.1 with Impute2 and Minimac3 by using 1000 Genomes Project data, UK10K Project data, and simulated data. All three methods have similar accuracy but different memory requirements and different computation times. When imputing 10 Mb of sequence data from 50,000 reference samples, Beagle's throughput was more than 100× greater than Impute2's throughput on our computer servers. When imputing 10 Mb of sequence data from 200,000 reference samples in VCF format, Minimac3 consumed 26× more memory per computational thread and 15× more CPU time than Beagle. We demonstrate that Beagle v.4.1 scales to much larger reference panels by performing imputation from a simulated reference panel having 5 million samples and a mean marker density of one marker per four base pairs.
Confidence Probability versus Detection Probability
Axelrod, M
2005-08-18
In a discovery sampling activity the auditor seeks to vet an inventory by measuring (or inspecting) a random sample of items from the inventory. When the auditor finds every sample item in compliance, he must then make a confidence statement about the whole inventory. For example, the auditor might say: ''We believe that this inventory of 100 items contains no more than 5 defectives with 95% confidence.'' Note this is a retrospective statement in that it asserts something about the inventory after the sample was selected and measured. Contrast this to the prospective statement: ''We will detect the existence of more than 5 defective items in this inventory with 95% probability.'' The former uses confidence probability while the latter uses detection probability. For a given sample size, the two probabilities need not be equal, indeed they could differ significantly. Both these probabilities critically depend on the auditor's prior belief about the number of defectives in the inventory and how he defines non-compliance. In other words, the answer strongly depends on how the question is framed.
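The distinction the abstract draws can be made concrete with a short sketch. Under the standard hypergeometric model (an assumption here, not something the abstract specifies), the detection probability is the chance that a random sample of n items from an inventory of N containing D defectives includes at least one defective; the function name below is hypothetical:

```python
from math import comb

def detection_probability(N: int, D: int, n: int) -> float:
    """Probability that a random sample of n items, drawn without
    replacement from an inventory of N items of which D are defective,
    contains at least one defective item."""
    if D == 0:
        return 0.0
    if n > N - D:
        return 1.0  # the sample must overlap the defectives
    # P(no defective in the sample) under the hypergeometric model
    p_none = comb(N - D, n) / comb(N, n)
    return 1.0 - p_none

# The abstract's example: an inventory of 100 items, where "more than
# 5 defectives" means D >= 6. Detection probability grows with sample size.
for n in (10, 30, 50):
    print(n, round(detection_probability(100, 6, n), 3))
```

Running the loop shows why, for a fixed sample size, a 95% confidence statement and a 95% detection probability need not coincide: the detection probability depends directly on the assumed number of defectives D, which the auditor does not observe.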
CDC Allocates $184 Million for Zika Protection
... fullstory_162694.html CDC Allocates $184 Million for Zika Protection Funds are earmarked for states, territories, local ... million has been earmarked to protect Americans against Zika virus infection, the U.S. Centers for Disease Control ...
(Updated) NCI Fiscal 2016 Bypass Budget Proposes $25 Million for Frederick National Lab | Poster
By Nancy Parrish, Staff Writer; image by Richard Frederickson, Staff Photographer The additional funding requested for Frederick National Laboratory for Cancer Research (FNLCR) in the Fiscal 2016 Bypass Budget was $25 million, or approximately 3.5 percent of the total additional funding request of $715 million. Officially called the Professional Judgment Budget, the Bypass Budget is a result of the National Cancer Act of 1971, which authorizes NCI to submit a budget directly to the president, to send to Congress. With a focus on NCI’s research priorities and areas of cancer research with potential for investment, the Bypass Budget specifies additional funding, over and above the current budget, that is needed to advance
NASA Technical Reports Server (NTRS)
Chu, Julio; Lawing, Pierce L.
1990-01-01
A high Reynolds number test of a 5 percent thick low aspect ratio semispan wing was conducted in the adaptive wall test section of the Langley 0.3 m Transonic Cryogenic Tunnel. The model tested had a planform and a NACA 64A-105 airfoil section that is similar to that of the pressure instrumented canard on the X-29 experimental aircraft. Chordwise pressure data for Mach numbers of 0.3, 0.7, and 0.9 were measured for an angle-of-attack range of -4 to 15 deg. The associated Reynolds numbers, based on the geometric mean chord, encompass most of the flight regime of the canard. This test was a free transition investigation. A summary of the wing pressures is presented without analysis, as well as adapted test section top and bottom wall pressure signatures. However, the presented graphical data indicate Reynolds number dependent complex leading edge separation phenomena. This data set supplements the existing high Reynolds number database and is useful for computational code comparison.
Children Adrift: Educating China's Millions of Migrants.
ERIC Educational Resources Information Center
Cao, Haili
1999-01-01
The population of migrants moving within China's borders has reached some 80 million, including 2-3 million school-aged children. As migrant workers flock to cities, their children strain urban school systems or receive no education. But independent schools for migrants are illegal and substandard. In some rural provinces, vocational training may…
Literacy--The 877 Million Left Behind.
ERIC Educational Resources Information Center
Muller, Anne, Ed.; Murtagh, Teresa, Ed.
2002-01-01
In 2000, approximately 877 million adults worldwide were illiterate and 113 million children did not attend school. More than two-thirds of those individuals lived in East and South Asia, and two-thirds were females. Functional illiteracy remains high in developed and developing nations alike. The reasons include weak training in how to teach…
Millions of Americans Bombarded by Loud Noises
... of the almost 35 million people who shot guns in the last year used hearing protection. And ... never used any protection. Seventy-seven percent of gun-related noise exposure occurred during recreational shooting, the ...
Ethylene capacity tops 77 million mty
Rhodes, A.K.; Knott, D.
1995-04-17
World ethylene production capacity is 77.8 million metric tons/year (mty). This total represents an increase of more than 6 million mty, or almost 9%, over last year's survey. The biggest reason for the large change is more information about plants in the CIS. Also responsible for the increase in capacity is the start-up of several large ethylene plants during the past year. The paper discusses construction of ethylene plants, feedstocks, prices, new capacity, price outlook, and problems in Europe's ethylene market.
Leading the Maricopa Millions OER Project
ERIC Educational Resources Information Center
Raneri, April; Young, Lisa
2016-01-01
With a reduced number of students purchasing required and necessary textbooks, higher education leaders must look to new opportunities to increase student success. While open educational resources have addressed this issue, they have not received widespread support from faculty, staff, and administrators. The Maricopa Millions OER Project: Scaling…
Saving Millions without Spending a Dime.
ERIC Educational Resources Information Center
Raman, Elizabeth
2003-01-01
Describes how the University of Hawaii at Hilo is using the $2.7 million it saved on utility bills during the past 5 years to repay campus energy improvements financed, installed, and maintained by an energy services company; the method is called energy savings performance contracting. (EV)
Clustering Millions of Faces by Identity.
Otto, Charles; Wang, Dayong; Jain, Anil
2017-03-07
Given a large collection of unlabeled face images, we address the problem of clustering faces into an unknown number of identities. This problem is of interest in social media, law enforcement, and other applications, where the number of faces can be of the order of hundreds of millions, while the number of identities (clusters) can range from a few thousand to millions. To address the challenges of run-time complexity and cluster quality, we present an approximate Rank-Order clustering algorithm that performs better than popular clustering algorithms (k-Means and Spectral). Our experiments include clustering up to 123 million face images into over 10 million clusters. Clustering results are analyzed in terms of external (known face labels) and internal (unknown face labels) quality measures, and run-time. Our algorithm achieves an F-measure of 0.87 on the LFW benchmark (13K faces of 5,749 individuals), which drops to 0.27 on the largest dataset considered (13K faces in LFW + 123M distractor images). Additionally, we show that frames in the YouTube benchmark can be clustered with an F-measure of 0.71. An internal per-cluster quality measure is developed to rank individual clusters for manual exploration of high quality clusters that are compact and isolated.
Learning Our Way to One Million
ERIC Educational Resources Information Center
Whitin, David J.
2008-01-01
David Schwartz's classic book "How Much Is a Million?" can be the catalyst for sparking many interesting mathematical investigations. This article describes five episodes in which children in grades 2-5 all heard this familiar story read aloud to them. At each grade level, they were encouraged to think of their own way to explore the concept of…
One Half Million Mile Solar Filament
NASA's Solar Dynamics Observatory (SDO) captures a very long, whip-like solar filament extending over half a million miles in a long arc above the sun's surface. Filaments are cooler clouds of ...
Evolution of global temperature over the past two million years.
Snyder, Carolyn W
2016-10-13
Reconstructions of Earth's past climate strongly influence our understanding of the dynamics and sensitivity of the climate system. Yet global temperature has been reconstructed for only a few isolated windows of time, and continuous reconstructions across glacial cycles remain elusive. Here I present a spatially weighted proxy reconstruction of global temperature over the past 2 million years estimated from a multi-proxy database of over 20,000 sea surface temperature point reconstructions. Global temperature gradually cooled until roughly 1.2 million years ago and cooling then stalled until the present. The cooling trend probably stalled before the beginning of the mid-Pleistocene transition, and pre-dated the increase in the maximum size of ice sheets around 0.9 million years ago. Thus, global cooling may have been a pre-condition for, but probably is not the sole causal mechanism of, the shift to quasi-100,000-year glacial cycles at the mid-Pleistocene transition. Over the past 800,000 years, polar amplification (the amplification of temperature change at the poles relative to global temperature change) has been stable over time, and global temperature and atmospheric greenhouse gas concentrations have been closely coupled across glacial cycles. A comparison of the new temperature reconstruction with radiative forcing from greenhouse gases estimates an Earth system sensitivity of 9 degrees Celsius (range 7 to 13 degrees Celsius, 95 per cent credible interval) change in global average surface temperature per doubling of atmospheric carbon dioxide over millennium timescales. This result suggests that stabilization at today's greenhouse gas levels may already commit Earth to an eventual total warming of 5 degrees Celsius (range 3 to 7 degrees Celsius, 95 per cent credible interval) over the next few millennia as ice sheets, vegetation and atmospheric dust continue to respond to global warming.
Evolution of global temperature over the past two million years
NASA Astrophysics Data System (ADS)
Snyder, Carolyn W.
2016-10-01
Reconstructions of Earth’s past climate strongly influence our understanding of the dynamics and sensitivity of the climate system. Yet global temperature has been reconstructed for only a few isolated windows of time, and continuous reconstructions across glacial cycles remain elusive. Here I present a spatially weighted proxy reconstruction of global temperature over the past 2 million years estimated from a multi-proxy database of over 20,000 sea surface temperature point reconstructions. Global temperature gradually cooled until roughly 1.2 million years ago and cooling then stalled until the present. The cooling trend probably stalled before the beginning of the mid-Pleistocene transition, and pre-dated the increase in the maximum size of ice sheets around 0.9 million years ago. Thus, global cooling may have been a pre-condition for, but probably is not the sole causal mechanism of, the shift to quasi-100,000-year glacial cycles at the mid-Pleistocene transition. Over the past 800,000 years, polar amplification (the amplification of temperature change at the poles relative to global temperature change) has been stable over time, and global temperature and atmospheric greenhouse gas concentrations have been closely coupled across glacial cycles. A comparison of the new temperature reconstruction with radiative forcing from greenhouse gases estimates an Earth system sensitivity of 9 degrees Celsius (range 7 to 13 degrees Celsius, 95 per cent credible interval) change in global average surface temperature per doubling of atmospheric carbon dioxide over millennium timescales. This result suggests that stabilization at today’s greenhouse gas levels may already commit Earth to an eventual total warming of 5 degrees Celsius (range 3 to 7 degrees Celsius, 95 per cent credible interval) over the next few millennia as ice sheets, vegetation and atmospheric dust continue to respond to global warming.
ERIC Educational Resources Information Center
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
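One classic problem of this kind (my example, not necessarily one of the three the article discusses) is the probability that a uniformly random permutation of n items has no fixed point, i.e. is a derangement. The exact value is the partial sum of the alternating series for 1/e, so it converges to 1/e rapidly:

```python
from math import exp, factorial

def no_fixed_point_probability(n: int) -> float:
    """Exact probability that a uniformly random permutation of n items
    is a derangement (has no fixed point): sum_{k=0}^{n} (-1)^k / k!."""
    return sum((-1) ** k / factorial(k) for k in range(n + 1))

# As n grows, the value converges to 1/e ~ 0.3679; the error after
# truncating at n is smaller than 1/(n+1)!.
print(no_fixed_point_probability(12))
```

Already at n = 12 the result agrees with 1/e to about ten decimal places, which is why 1/e shows up so often in problems of this shape.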
Emulating a million machines to investigate botnets.
Rudish, Donald W.
2010-06-01
Researchers at Sandia National Laboratories in Livermore, California are creating what is in effect a vast digital petri dish able to hold one million operating systems at once in an effort to study the behavior of rogue programs known as botnets. Botnets are used extensively by malicious computer hackers to steal computing power from Internet-connected computers. The hackers harness the stolen resources into a scattered but powerful computer that can be used to send spam, execute phishing scams, or steal digital information. These remote-controlled 'distributed computers' are difficult to observe and track. Botnets may take over parts of tens of thousands or in some cases even millions of computers, making them among the world's most powerful computers for some applications.
Probability and Relative Frequency
NASA Astrophysics Data System (ADS)
Drieschner, Michael
2016-01-01
The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
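The abstract's claim that the expectation of the relative frequency equals the predicted relative frequency is the standard one-line computation: for n independent trials with indicator variables X_i and P(X_i = 1) = p, linearity of expectation gives

```latex
E\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
  = \frac{1}{n}\sum_{i=1}^{n} E[X_i]
  = \frac{1}{n}\sum_{i=1}^{n} p
  = p .
```

This is a sketch of the usual argument, consistent with the abstract's statement; the paper's own derivation may proceed differently.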
Kumbh Mela 2013: Healthcare for the millions
Cariappa, M.P.; Singh, B.P.; Mahen, A.; Bansal, A.S.
2015-01-01
Mass gatherings pose challenges to healthcare systems anywhere in the world. The Kumbh Mela 2013 at Allahabad, India was the largest gathering of humanity in the history of mankind, and posed an exciting challenge to the provision of healthcare services. At the finale of the Mela, it was estimated that about 120 million pilgrims had visited the site. An equitable geospatial distribution of ad hoc health care facilities was created on a standardised template, with integrated planning of evacuation modalities. Innovative and low cost response measures for disaster mitigation were implemented. Emergency patient management kits were prepared and stocked across the health care facilities for crisis response. Dynamic resource allocation (in terms of manpower and supplies) based on patient volumes was done on a daily basis, in response to feedback. An ad hoc mega township was created on the banks of a perennial river (the Ganga) in the Indian subcontinent for accommodating millions of Hindu pilgrims. The conventional mindset of merely providing limited and static healthcare through ad hoc facilities was done away with. Innovative concepts such as riverine ambulances and disaster kits were introduced. Managing the medical aspects of a mass gathering mega event requires allocation of adequate funds, proactive and integrated medical planning, and preparedness. PMID:26288497
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
ERIC Educational Resources Information Center
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Saudis map $450 million gulf spill cleanup
Not Available
1991-11-18
This paper reports that Saudi Arabia has earmarked about $450 million to clean up Persian Gulf beaches polluted by history's worst oil spills, created during the Persian Gulf crisis. Details of the proposed cleanup measures were outlined by Saudi environmental officials at a seminar on the environment in Dubai, OPEC News Agency reported. The seminar was sponsored by the Gulf Area Oil Companies Mutual Aid Organization, an environmental cooperative agency set up by Persian Gulf governments. Meanwhile, a Saudi government report has outlined early efforts to contain the massive oil spills that hit the Saudi coast before oil could contaminate water intakes at the huge desalination plants serving Riyadh and cooling water facilities at Al Jubail.
Dynamical Simulation of Probabilities
NASA Technical Reports Server (NTRS)
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which the joint probability does not exist. Simulations of quantum probabilities are also discussed.
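The core idea, that a fully deterministic chaotic map can stand in for a random number generator, can be illustrated with a toy sketch. This is only a standard logistic-map illustration, not Zak's non-Lipschitz construction; the seed value and bit-extraction rule are arbitrary choices:

```python
def chaotic_bits(seed=0.1234567, n=100_000):
    """Generate pseudo-random bits from the fully chaotic logistic map.

    The map x -> 4x(1-x) is deterministic, yet reading off whether each
    iterate falls above or below 1/2 mimics a sequence of fair coin
    flips for almost every seed.
    """
    x, bits = seed, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

bits = chaotic_bits()
freq = sum(bits) / len(bits)  # should hover near 0.5
```

No source of randomness is consulted anywhere; the apparent coin-flip statistics emerge entirely from the chaotic dynamics.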
Disaster care for 15 million Californians.
ROBINSON, H G
1960-08-01
The urgency of the crisis following a nuclear attack staggers the imagination. We would have thousands or millions of survivors making a desperate struggle to survive. Safe water supplies and waste-disposal systems would be gone. In some areas, there would be little or no food or shelter. Yet California has already manned a medical arsenal that is second to none in the United States. We have stored 115 emergency hospitals at strategic points, and through the county medical associations we have appointed cadres including physicians, nurses and technicians. Plans have been made for workers who will assist in setting up the hospitals and first aid stations. In our future operations we will continue to place strong emphasis on the medical phase of our program of disaster care. The program would be just as essential in the event of major natural disaster as nuclear war. Our objective is a simple one. We are seeking to preserve the human resources which are necessary for recovery. California's medical profession, with the allied professions of nursing and technical skills, has a vital interest in continuing operations to the maximum extent even under the most trying conditions.
Leaf metallome preserved over 50 million years.
Edwards, N P; Manning, P L; Bergmann, U; Larson, P L; van Dongen, B E; Sellers, W I; Webb, S M; Sokaras, D; Alonso-Mori, R; Ignatyev, K; Barden, H E; van Veelen, A; Anné, J; Egerton, V M; Wogelius, R A
2014-04-01
Large-scale Synchrotron Rapid Scanning X-ray Fluorescence (SRS-XRF) elemental mapping and X-ray absorption spectroscopy are applied here to fossil leaf material from the 50 Mya Green River Formation (USA) in order to improve our understanding of the chemistry of fossilized plant remains. SRS-XRF of fossilized animals has previously shown that bioaccumulated trace metals and sulfur compounds may be preserved in their original distributions and these elements can also act as biomarkers for specific biosynthetic pathways. Similar spatially resolved chemical data for fossilized plants is sparsely represented in the literature despite the multitude of other chemical studies performed. Here, synchrotron data from multiple specimens consistently show that fossil leaves possess chemical inventories consisting of organometallic and organosulfur compounds that: (1) map discretely within the fossils, (2) resolve fine scale biological structures, and (3) are distinct from embedding sedimentary matrices. Additionally, the chemical distributions in fossil leaves are directly comparable to those of extant leaves. This evidence strongly suggests that a significant fraction of the chemical inventory of the examined fossil leaf material is derived from the living organisms and that original bioaccumulated elements have been preserved in situ for 50 million years. Chemical information of this kind has so far been unknown for fossilized plants and could for the first time allow the metallome of extinct flora to be studied.
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
NASA Astrophysics Data System (ADS)
Laktineh, Imad
2010-04-01
This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
Probability of satellite collision
NASA Technical Reports Server (NTRS)
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
40 Million Years of the Iceland Plume
NASA Astrophysics Data System (ADS)
Parnell-Turner, R. E.; White, N.; Henstock, T.; Maclennan, J.; Murton, B. J.; Jones, S. M.
2011-12-01
The V-shaped ridges, straddling the mid oceanic ridges to the North and South of Iceland, provide us with a linear record of transient mantle convective circulation. Surprisingly, we know little about the structure of these ridges: prior to this study, the most recent regional seismic reflection profiles were acquired in the 1960s. During the Summer of 2010, we acquired over 3,000 km of seismic reflection data across the oceanic basin South of Iceland. The cornerstones of this programme are two 1000 km flowlines, which traverse the basin from Greenland to the European margin. The geometry of young V-shaped ridges near to the oceanic spreading center has been imaged in fine detail; older ridges, otherwise obscured in gravity datasets by sediment cover, have been resolved for the first time. We have mapped the sediment-basement interface, transformed each profile onto an astronomical time scale, and removed the effects of long wavelength plate cooling. The resulting chronology of Icelandic plume activity provides an important temporal frame of reference for plume flux over the past 40 million years. The profiles also cross major contourite drift deposits, notably the Gardar, Bjorn and Eirik drifts. Fine-scale sedimentary features imaged here demonstrate distinct episodes of drift construction; by making simple assumptions about sedimentation rates, we can show that periods of drift formation correspond to periods of enhanced deep water circulation which is in turn moderated by plume activity. From a regional point of view, this transient behaviour manifests itself in several important ways. Within sedimentary basins fringing the North Atlantic, short lived regional uplift events periodically interrupt thermal subsidence from Eocene times to the present day. From a paleoceanographic perspective, there is good correlation between V-shaped ridge activity and changes in overflow of the ancient precursor to North Atlantic Deep Water. This complete history of the Iceland
WISE PHOTOMETRY FOR 400 MILLION SDSS SOURCES
Lang, Dustin; Hogg, David W.; Schlegel, David J.
2016-02-15
We present photometry of images from the Wide-Field Infrared Survey Explorer (WISE) of over 400 million sources detected by the Sloan Digital Sky Survey (SDSS). We use a “forced photometry” technique, using measured SDSS source positions, star–galaxy classification, and galaxy profiles to define the sources whose fluxes are to be measured in the WISE images. We perform photometry with The Tractor image modeling code, working on our “unWISE” coadds and taking account of the WISE point-spread function and a noise model. The result is a measurement of the flux of each SDSS source in each WISE band. Many sources have little flux in the WISE bands, so often the measurements we report are consistent with zero given our uncertainties. However, for many sources we get 3σ or 4σ measurements; these sources would not be reported by the “official” WISE pipeline and will not appear in the WISE catalog, yet they can be highly informative for some scientific questions. In addition, these small-signal measurements can be used in stacking analyses at the catalog level. The forced photometry approach has the advantage that we measure a consistent set of sources between SDSS and WISE, taking advantage of the resolution and depth of the SDSS images to interpret the WISE images; objects that are resolved in SDSS but blended together in WISE still have accurate measurements in our photometry. Our results, and the code used to produce them, are publicly available at http://unwise.me.
Tulelake, California: The last 3 million years
Adam, D.P.; Sarna-Wojcicki, A. M.; Rieck, H.J.; Bradbury, J.P.; Dean, W.E.; Forester, R.M.
1989-01-01
The Tulelake basin, formed by east-west extension and faulting during the past several million years, contains at least 550 m of lacustrine sediment. Interdisciplinary studies of a 334 m-long cored section from the town of Tulelake, California, near the center of the basin, document a 3-m.y. record of environmental changes. The core consists of a thick sequence of diatomaceous clayey, silty, and marly lacustrine sediments interbedded with numerous tephra layers. Paleomagnetic study puts the base of the core at about 3.0 Ma. Twelve widespread silicic tephra units provide correlations with other areas and complement age control provided by magnetostratigraphy; mafic and silicic tephra units erupted from local sources are also common in the core. Widespread tephra units include the Llao Rock pumice (=Tsoyawata, 7 ka), the Trego Hot Springs Bed (23 ka), and the Rockland (0.40 Ma), Lava Creek (0.62 Ma), and Rio Dell (1.5 Ma) ash beds, as well as several ash beds also found at Summer Lake, Oregon, and an ash bed originally recognized in DSDP hole 173 in the northeastern Pacific. Several tephra layers found in the core also occur in lacustrine beds exposed around the margins of the basin and elsewhere in the ancestral lacustrine system. Diatoms are present throughout the section. Pollen is present in most of the section, but some barren zones are found in the interval between 50 and 140 m; the greatest change in behavior of the pollen record takes place just above the top of the Olduvai Normal-Polarity Subchronozone. Ostracodes are present only in high-carbonate (>10% CaCO3) intervals. Evolutionary changes are found in the diatom and ostracode records. Bulk geochemical analyses show significant changes in elemental composition of the sediment through time. © 1989.
Enhancing the view of a million galaxies
NASA Astrophysics Data System (ADS)
2004-06-01
XMM-Newton X-ray spectral colour composite image of the Subaru/XMM-Newton Deep Field (Credits: ESA/Univ. of Leicester/I. Stewart and M. Watson). The view gives an X-ray pseudo-colour representation of all the sources, coded according to their X-ray energy: more energetic sources are shown in blue and less energetic ones in red. This mosaic image, composed of 7 partially overlapping pointings, maps the full extent of the SXDF and corresponds to an exposure time exceeding one hundred hours. These data form the largest contiguous area over which deep X-ray observations have been performed. A colour composite image obtained by combining data taken with the Subaru Telescope in blue, red and near-infrared light (Credits: NAOJ/Subaru Telescope), worth over two hundred hours of exposure time, covers an area of sky seven times larger than the full moon. The images in blue light show details several hundred million times fainter than what can be seen with the naked eye. In one detail of the SXDS field, the teardrop-shaped galaxy in the upper right portion of the frame is likely to have suffered a collision with another galaxy; in another, the prominent spiral galaxy near the centre may be interacting with a less-conspicuous dwarf galaxy to its lower right. One of the fundamental goals of modern astronomy is understanding the history of the Universe, and in particular learning about the processes that shape the formation and evolution of galaxies. To observe these processes as they unfold, astronomers must survey galaxies near and far, spanning a large enough volume of the Universe, so that local variations in the
ERIC Educational Resources Information Center
Barnes, Bernis, Ed.; And Others
This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…
Teachers' Understandings of Probability
ERIC Educational Resources Information Center
Liu, Yan; Thompson, Patrick
2007-01-01
Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…
Tobacco Use Costs World 6 Million Lives, $1 Trillion Annually
Tobacco use kills more than 6 million people a year, and costs the world more than $1 trillion a year in health …
NASA Technical Reports Server (NTRS)
Soneira, R. M.; Bahcall, J. N.
1981-01-01
Probabilities are calculated for acquiring suitable guide stars (GS) with the fine guidance system (FGS) of the space telescope. A number of the considerations and techniques described are also relevant for other space astronomy missions. The constraints of the FGS are reviewed. The available data on bright star densities are summarized and a previous error in the literature is corrected. Separate analytic and Monte Carlo calculations of the probabilities are described. A simulation of space telescope pointing is carried out using the Weistrop north galactic pole catalog of bright stars. Sufficient information is presented so that the probabilities of acquisition can be estimated as a function of position in the sky. The probability of acquiring suitable guide stars is greatly increased if the FGS can allow an appreciable difference between the (bright) primary GS limiting magnitude and the (fainter) secondary GS limiting magnitude.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to solving computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Rationalizing Hybrid Earthquake Probabilities
NASA Astrophysics Data System (ADS)
Gomberg, J.; Reasenberg, P.; Beeler, N.; Cocco, M.; Belardinelli, M.
2003-12-01
An approach to including stress transfer and frictional effects in estimates of the probability of failure of a single fault affected by a nearby earthquake has been suggested in Stein et al. (1997). This `hybrid' approach combines conditional probabilities, which depend on the time elapsed since the last earthquake on the affected fault, with Poissonian probabilities that account for friction and depend only on the time since the perturbing earthquake. The latter are based on the seismicity rate change model developed by Dieterich (1994) to explain the temporal behavior of aftershock sequences in terms of rate-state frictional processes. The model assumes an infinite population of nucleation sites that are near failure at the time of the perturbing earthquake. In the hybrid approach, assuming the Dieterich model can lead to significant transient increases in failure probability. We explore some of the implications of applying the Dieterich model to a single fault and its impact on the hybrid probabilities. We present two interpretations that we believe can rationalize the use of the hybrid approach. In the first, a statistical distribution representing uncertainties in elapsed and/or mean recurrence time on the fault serves as a proxy for Dieterich's population of nucleation sites. In the second, we imagine a population of nucleation patches distributed over the fault with a distribution of maturities. In both cases we find that the probability depends on the time since the last earthquake. In particular, the size of the transient probability increase may only be significant for faults already close to failure. Neglecting the maturity of a fault may lead to overestimated rate and probability increases.
Asteroidal collision probabilities
NASA Astrophysics Data System (ADS)
Bottke, W. F.; Greenberg, R.
1993-05-01
Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.
Probabilities in implicit learning.
Tseng, Philip; Hsu, Tzu-Yu; Tzeng, Ovid J L; Hung, Daisy L; Juan, Chi-Hung
2011-01-01
The visual system possesses a remarkable ability in learning regularities from the environment. In the case of contextual cuing, predictive visual contexts such as spatial configurations are implicitly learned, retained, and used to facilitate visual search-all without one's subjective awareness and conscious effort. Here we investigated whether implicit learning and its facilitatory effects are sensitive to the statistical property of such implicit knowledge. In other words, are highly probable events learned better than less probable ones even when such learning is implicit? We systematically varied the frequencies of context repetition to alter the degrees of learning. Our results showed that search efficiency increased consistently as contextual probabilities increased. Thus, the visual contexts, along with their probability of occurrences, were both picked up by the visual system. Furthermore, even when the total number of exposures was held constant between each probability, the highest probability still enjoyed a greater cuing effect, suggesting that the temporal aspect of implicit learning is also an important factor to consider in addition to the effect of mere frequency. Together, these findings suggest that implicit learning, although bypassing observers' conscious encoding and retrieval effort, behaves much like explicit learning in the sense that its facilitatory effect also varies as a function of its associative strengths.
NASA Technical Reports Server (NTRS)
Bollenbacher, Gary; Guptill, James D.
1999-01-01
This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
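The report derives a closed-form solution; the same quantity can also be estimated by Monte Carlo in the encounter plane, which makes the dependence on the covariances, object sizes, and miss distance easy to see. The sketch below is an illustration under simplifying assumptions (2-D combined Gaussian position uncertainty, a nominal miss vector at closest approach, a combined hard-body radius); the function name and the numbers are hypothetical, not taken from the report.

```python
import numpy as np

def collision_probability(miss, cov, hard_body_radius, n=200_000, seed=1):
    """Monte Carlo estimate of collision probability in the encounter plane.

    miss:             nominal 2-D separation vector at closest approach
    cov:              combined 2x2 position covariance of the two objects
    hard_body_radius: sum of the two objects' effective radii
    """
    rng = np.random.default_rng(seed)
    # sample relative positions from the combined Gaussian uncertainty
    samples = rng.multivariate_normal(miss, cov, size=n)
    # a collision occurs when the relative position falls inside the
    # combined hard-body circle centered on the origin
    inside = np.linalg.norm(samples, axis=1) < hard_body_radius
    return inside.mean()

# illustrative numbers: 100 m nominal miss, 50 m sigma per axis,
# 10 m combined hard-body radius
p = collision_probability(np.array([100.0, 0.0]),
                          np.diag([50.0**2, 50.0**2]), 10.0)
```

As in the report's closed-form result, the estimate is driven by the position uncertainties, the sizes of the two objects, and the nominal separation at the point of closest approach.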
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
Experimental Probability in Elementary School
ERIC Educational Resources Information Center
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
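To make the general idea concrete, here is a minimal sketch of tail extrapolation beyond the data based on the shape of the density in the tail, using a simple exponential fit to the exceedances over a threshold. This illustrates the flavor of the technique only; it is not one of the weighted estimators studied in the paper, and the threshold choice is arbitrary.

```python
import math
import random

def tail_prob_exponential(data, threshold, x):
    """Estimate P(X > x), possibly beyond the observed data, by fitting
    an exponential tail to the exceedances over `threshold`."""
    exceedances = [d - threshold for d in data if d > threshold]
    p_threshold = len(exceedances) / len(data)   # empirical P(X > threshold)
    rate = len(exceedances) / sum(exceedances)   # exponential MLE for the tail
    return p_threshold * math.exp(-rate * (x - threshold))

random.seed(0)
sample = [random.expovariate(1.0) for _ in range(10_000)]
# true P(X > 6) for an Exp(1) variable is e^-6, about 0.0025
est = tail_prob_exponential(sample, threshold=3.0, x=6.0)
```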
A Unifying Probability Example.
ERIC Educational Resources Information Center
Maruszewski, Richard F., Jr.
2002-01-01
Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
ERIC Educational Resources Information Center
Varga, Tamas
This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…
Univariate Probability Distributions
ERIC Educational Resources Information Center
Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.
2012-01-01
We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…
Approximating Integrals Using Probability
ERIC Educational Resources Information Center
Maruszewski, Richard F., Jr.; Caudle, Kyle A.
2005-01-01
As part of a discussion on Monte Carlo methods, this paper outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
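The paper's examples use Visual Basic; the same expectation-based technique can be sketched in a few lines of Python. For U uniform on [a, b], E[f(U)] equals the integral of f over [a, b] divided by (b - a), so a sample mean of f at uniform random points estimates the integral:

```python
import random

def mc_integral(f, a, b, n=100_000):
    """Approximate the definite integral of f over [a, b] via the
    expectation E[f(U)] * (b - a), with U uniform on [a, b]."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# example: the integral of x^2 over [0, 1] is 1/3
est = mc_integral(lambda x: x * x, 0.0, 1.0)
```

The error of the estimate shrinks like 1/sqrt(n), independent of the dimension of the integral, which is what makes the method attractive beyond one dimension.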
NASA Astrophysics Data System (ADS)
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Superpositions of probability distributions
NASA Astrophysics Data System (ADS)
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
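A standard example of such a superposition (a textbook case, not one of the paper's three examples): the Student-t distribution arises from smearing Gaussian variances v with an inverse-gamma distribution. The sketch below verifies this numerically by integrating the Gaussian scale mixture over v and comparing with the analytic t density:

```python
import math

def normal_pdf(x, v):
    # zero-mean Gaussian density with variance v
    return math.exp(-x * x / (2.0 * v)) / math.sqrt(2.0 * math.pi * v)

def inv_gamma_pdf(v, a, b):
    # inverse-gamma(a, b) density: the smearing distribution over variances
    return (b ** a / math.gamma(a)) * v ** (-a - 1.0) * math.exp(-b / v)

def student_t_pdf(x, nu):
    # analytic Student-t density with nu degrees of freedom
    c = math.gamma((nu + 1) / 2) / (math.gamma(nu / 2) * math.sqrt(nu * math.pi))
    return c * (1.0 + x * x / nu) ** (-(nu + 1) / 2)

def smeared_pdf(x, nu, n=20_000, vmax=200.0):
    # numerically integrate the Gaussian scale mixture over the variance v;
    # smearing N(0, v) with inverse-gamma(nu/2, nu/2) yields Student-t(nu)
    dv = vmax / n
    return sum(normal_pdf(x, i * dv) * inv_gamma_pdf(i * dv, nu / 2, nu / 2) * dv
               for i in range(1, n + 1))
```

With nu = 3, smeared_pdf(1.0, 3) matches student_t_pdf(1.0, 3) to within the quadrature error, illustrating how a non-Gaussian distribution with heavy tails emerges from a superposition of Gaussians.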
Efficient Probability Sequences
2014-08-18
Ungar (2014), to produce a distinct forecasting system. The system consists of the method for eliciting individual subjective forecasts together with...E. Stone, and L. H. Ungar (2014). Two reasons to make aggregated probability forecasts more extreme. Decision Analysis 11 (2), 133–145. Bickel, J. E...Letters 91 (3), 425–429. Mellers, B., L. Ungar, J. Baron, J. Ramos, B. Gurcay, K. Fincher, S. E. Scott, D. Moore, P. Atanasov, S. A. Swift, et al. (2014
1983-07-26
DeGroot, Morris H. Probability and Statistics. Addison-Wesley Publishing Company, Reading, Massachusetts, 1975. [Gillogly 78] Gillogly, J.J. Performance...distribution [DeGroot 75] has just begun. The beta distribution has several features that might make it a more reasonable choice. As with the normal-based...1982. [Cooley 65] Cooley, J.M. and Tukey, J.W. An algorithm for the machine calculation of complex Fourier series. Math. Comp. 19, 1965. [DeGroot 75
Troutman, B.M.; Karlinger, M.R.
2003-01-01
The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank-transformed data, and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on average every 4.5 years.
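The Monte Carlo idea can be illustrated with a minimal sketch (not the paper's implementation): an equicorrelated Gaussian dependence model stands in for the fitted spatial correlation function, and the rank-transform argument in the abstract means only the dependence structure, not the at-site marginal distribution, matters.

```python
import numpy as np
from statistics import NormalDist

def regional_flood_probability(n_sites, rho, T, n_years=200_000, seed=0):
    """Monte Carlo estimate of the regional flood probability (RFP): the
    chance that at least one of n_sites sees its T-year flood in a year.
    Illustrative sketch assuming a stationary, equicorrelated Gaussian
    dependence model with pairwise correlation rho (hypothetical; the
    paper fits spatial correlation models to data)."""
    rng = np.random.default_rng(seed)
    # Equicorrelated annual maxima: shared regional shock + site-specific noise.
    z = rng.standard_normal((n_years, 1))
    e = rng.standard_normal((n_years, n_sites))
    x = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * e
    # Only ranks matter, so work on the standard-normal scale directly:
    # the T-year threshold is the (1 - 1/T) quantile at each site.
    thr = NormalDist().inv_cdf(1.0 - 1.0 / T)
    # Fraction of simulated years in which at least one site exceeds it.
    return (x > thr).any(axis=1).mean()
```

For independent sites (rho = 0) the estimate approaches 1 - (1 - 1/T)^n; positive spatial correlation pulls the RFP down toward the single-site value 1/T, which is why the correlation structure matters.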
12 Scientists Will Share $120-Million from Saudis
ERIC Educational Resources Information Center
Guterman, Lila
2008-01-01
This spring 12 scientists found themselves in an unusual position--they have to figure out how to spend $2-million every year for the next five years. The money adds up to $10-million per researcher. In May the researchers made a pilgrimage to the source of the generous grants: King Abdullah University of Science and Technology, a graduate…
Strategies to choose from millions of imputed sequence variants
Technology Transfer Automated Retrieval System (TEKTRAN)
Millions of sequence variants are known, but subsets are needed for routine genomic predictions or to include on genotyping arrays. Variant selection and imputation strategies were tested using 26,984 simulated reference bulls, of which 1,000 had 30 million sequence variants, 773 had 600,000 markers...
Retrieve Tether Survival Probability
2007-11-02
cuts of the tether by meteorites and orbital debris, is calculated to be 99.934% for the planned experiment duration of six months or less. This is...due to the unlikely event of a strike by a large piece of orbital debris greater than 1 meter in size cutting all the lines of the tether at once. The...probability of the tether surviving multiple cuts by meteoroid and orbital debris impactors smaller than 5 cm in diameter is 99.9993% at six months
Monsanto Gives Washington U. $23.5 Million.
ERIC Educational Resources Information Center
Culliton, Barbara J.
1982-01-01
Reviews various provisions of a five-year, $23.5-million research agreement between Washington University and the Monsanto Company. The scientific focus of this venture will be on proteins and peptides which modify cellular behavior. (SK)
People's conditional probability judgments follow probability theory (plus noise).
Costello, Fintan; Watts, Paul
2016-09-01
A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model in which people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. It predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
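A minimal simulation of the noise mechanism described (with hypothetical parameter values; the misread rate `d`, sample size, and event probabilities are illustrative, not taken from the paper) shows why some identities cancel the noise while individual judgments stay biased toward 0.5:

```python
import numpy as np

def noisy_probability_estimate(p, d, n, rng):
    """One simulated judgment: query n memory samples of an event with
    true probability p; each sample is misread with probability d."""
    truth = rng.random(n) < p
    flipped = rng.random(n) < d
    return np.mean(truth ^ flipped)  # observed relative frequency

rng = np.random.default_rng(1)
p_a, p_b, p_ab = 0.8, 0.5, 0.4      # hypothetical true probabilities
p_aorb = p_a + p_b - p_ab           # addition law: P(A or B)
d, n, trials = 0.2, 500, 2000

est = np.array([
    [noisy_probability_estimate(p, d, n, rng) for p in (p_a, p_b, p_ab, p_aorb)]
    for _ in range(trials)
])
# Each term's expected estimate is (1 - 2d)p + d, so in the identity
# P(A) + P(B) - P(A and B) - P(A or B) = 0 the +d/-d offsets cancel...
identity = est[:, 0] + est[:, 1] - est[:, 2] - est[:, 3]
# ...while a single estimate carries a systematic bias of d(1 - 2p).
bias_a = est[:, 0].mean() - p_a
```

Averaged over trials, `identity` sits near zero even though every individual estimate is distorted, which is the signature the two experiments test for.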
Glaciation in southern Argentina more than two million years ago.
Mercer, J H
1969-05-16
In southern Argentina, till beds interbedded with lava flows were deposited by ice that extended at least 40 kilometers east of the present crest of the cordillera. The flow covering the oldest till bed is 3.2 +/- 1 million years old. The flow that constitutes the present surface and covers the youngest till bed is 1.7 +/- 0.5 million years old.
Fourier spectroscopy with a one-million-point transformation
NASA Technical Reports Server (NTRS)
Connes, J.; Delouis, H.; Connes, P.; Guelachvili, G.; Maillard, J.; Michel, G.
1972-01-01
A new type of interferometer for use in Fourier spectroscopy has been devised at the Aime Cotton Laboratory of the National Center for Scientific Research (CNRS), Orsay, France. With this interferometer and newly developed computational techniques, interferograms comprising as many as one million samples can now be transformed. The techniques are described, and examples of spectra of thorium and holmium, derived from one million-point interferograms, are presented.
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
Discovery of 505-million-year old chitin in the basal demosponge Vauxia gracilenta.
Ehrlich, H; Rigby, J Keith; Botting, J P; Tsurkan, M V; Werner, C; Schwille, P; Petrášek, Z; Pisera, A; Simon, P; Sivkov, V N; Vyalikh, D V; Molodtsov, S L; Kurek, D; Kammer, M; Hunoldt, S; Born, R; Stawski, D; Steinhof, A; Bazhenov, V V; Geisler, T
2013-12-13
Sponges are probably the earliest branching animals, and their fossil record dates back to the Precambrian. Identifying their skeletal structure and composition is thus a crucial step in improving our understanding of the early evolution of metazoans. Here, we present the discovery of 505-million-year-old chitin, found in exceptionally well preserved Vauxia gracilenta sponges from the Middle Cambrian Burgess Shale. Our new findings indicate that, given the right fossilization conditions, chitin is stable for much longer than previously suspected. The preservation of chitin in these fossils opens new avenues for research into other ancient fossil groups.
ERIC Educational Resources Information Center
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Coherent Assessment of Subjective Probability
1981-03-01
known results of de Finetti (1937, 1972, 1974), Smith (1961), and Savage (1971) and some recent results of Lindley (1980) concerning the use of...provides the motivation for de Finetti's definition of subjective probabilities as coherent bet prices. From the definition of the probability measure...subjective probability, the probability laws which are traditionally stated as axioms or definitions are obtained instead as theorems. (De Finetti F -7
Gates Foundation donates $25 million for AIDS vaccine.
1999-05-07
The International AIDS Vaccine Initiative (IAVI) received a $25 million five-year grant from Bill and Melinda Gates through the William H. Gates Foundation. This is the largest gift seen in the AIDS epidemic, and will allow IAVI to more than double vaccine development efforts. IAVI is currently developing two potential vaccines, hopes to study three others, and is working with the business community to insure that a successful vaccine is affordable in developing countries. With 16,000 new infections occurring daily, a vaccine is seen as the most effective way to stop the epidemic. The William H. Gates Foundation had donated $1.5 million to IAVI and $100 million for programs to speed the delivery of vaccines to children in poor countries. Internet addresses are included for both IAVI and the William H. Gates Foundation.
Probabilities of transversions and transitions.
Vol'kenshtein, M V
1976-01-01
The values of the mean relative probabilities of transversions and transitions have been refined on the basis of the data collected by Jukes and found to be equal to 0.34 and 0.66, respectively. Evolutionary factors increase the probability of transversions to 0.44. The relative probabilities of individual substitutions have been determined, and a detailed classification of the nonsense mutations has been given. Such mutations are especially probable in the UGG (Trp) codon. The highest probability of A→G and G→A transitions correlates with the lowest mean change in the hydrophobic nature of the amino acids coded.
A magnified young galaxy from about 500 million years after the Big Bang.
Zheng, Wei; Postman, Marc; Zitrin, Adi; Moustakas, John; Shu, Xinwen; Jouvel, Stephanie; Høst, Ole; Molino, Alberto; Bradley, Larry; Coe, Dan; Moustakas, Leonidas A; Carrasco, Mauricio; Ford, Holland; Benítez, Narciso; Lauer, Tod R; Seitz, Stella; Bouwens, Rychard; Koekemoer, Anton; Medezinski, Elinor; Bartelmann, Matthias; Broadhurst, Tom; Donahue, Megan; Grillo, Claudio; Infante, Leopoldo; Jha, Saurabh W; Kelson, Daniel D; Lahav, Ofer; Lemze, Doron; Melchior, Peter; Meneghetti, Massimo; Merten, Julian; Nonino, Mario; Ogaz, Sara; Rosati, Piero; Umetsu, Keiichi; van der Wel, Arjen
2012-09-20
Re-ionization of the intergalactic medium occurred in the early Universe at redshift z ≈ 6-11, following the formation of the first generation of stars. Those young galaxies (where the bulk of stars formed) at a cosmic age of less than about 500 million years (z ≲ 10) remain largely unexplored because they are at or beyond the sensitivity limits of existing large telescopes. Understanding the properties of these galaxies is critical to identifying the source of the radiation that re-ionized the intergalactic medium. Gravitational lensing by galaxy clusters allows the detection of high-redshift galaxies fainter than what otherwise could be found in the deepest images of the sky. Here we report multiband observations of the cluster MACS J1149+2223 that have revealed (with high probability) a gravitationally magnified galaxy from the early Universe, at a redshift of z = 9.6 ± 0.2 (that is, a cosmic age of 490 ± 15 million years, or 3.6 per cent of the age of the Universe). We estimate that it formed less than 200 million years after the Big Bang (at the 95 per cent confidence level), implying a formation redshift of ≲14. Given the small sky area that our observations cover, faint galaxies seem to be abundant at such a young cosmic age, suggesting that they may be the dominant source for the early re-ionization of the intergalactic medium.
Million Hearts: Key to Collaboration to Reduce Heart Disease
ERIC Educational Resources Information Center
Brinkman, Patricia
2016-01-01
Extension has taught successful classes to address heart disease, yet heart disease remains the number one killer in the United States. The U.S. government's Million Hearts initiative seeks collaboration among colleges, local and state health departments, Extension and other organizations, and medical providers in imparting a consistent message…
The MET Project: The Wrong 45 Million Dollar Question
ERIC Educational Resources Information Center
Gabriel, Rachael; Allington, Richard
2012-01-01
In 2009, the Bill and Melinda Gates Foundation funded the investigation of a $45 million question: How can we identify and develop effective teaching? Now that the findings from their Measures of Effective Teaching (MET) project have been released, it's clear they asked a simpler question, namely, What other measures match up well with value-added…
The Million-Dollar President, Soon to Be Commonplace?
ERIC Educational Resources Information Center
Chronicle of Higher Education, 2006
2006-01-01
This article reports the results of a survey conducted by "The Chronicle" that examined college presidents' compensation. The survey found a 53-percent increase in presidents' compensation. While the salaries do not have an eye-popping quotient as those of corporate CEOs'--whose median compensation was just over $6 million among the 350 largest US…
Harvard Will Seek $30-Million for Program on Business Ethics.
ERIC Educational Resources Information Center
Desruisseaux, Paul
1987-01-01
The Harvard University Business School will establish a new program on ethics, leadership, and competitiveness in business, to be financed with $30 million in private gifts. The major contributor is the chairman of the United States Securities and Exchange Commission, an alumnus and former ambassador. (MSE)
The Million Dollar Bowl. OSHA in the Office.
ERIC Educational Resources Information Center
Swartz, Carl
Accidents to office workers add up to 40,000 injuries and more than 200 deaths a year, amounting to $100 million in expenses from medical assistance and lost productivity. Leading types of accidents are falls caused by slipping on slick or wet floors, tripping over file drawers, slipping on debris on stairs, injuries from poor lighting,…
Millions Learning: Scaling up Quality Education in Developing Countries
ERIC Educational Resources Information Center
Robinson, Jenny Perlman; Winthrop, Rebecca
2016-01-01
"Millions Learning: Scaling up Quality Education in Developing Countries" tells the story of where and how quality education has scaled in low- and middle-income countries. The story emerges from wide-ranging research on scaling and learning, including 14 in-depth case studies from around the globe. Ultimately, "Millions…
Uncovered: Social Security, Retirement Uncertainty, and 1 Million Teachers
ERIC Educational Resources Information Center
Kan, Leslie; Aldeman, Chad
2014-01-01
Retirement savings are often described as a three-legged stool: Social Security, employer retirement plans, and personal savings. For many American workers, Social Security is the most consistent portion of the three-legged model, providing a solid plank of retirement savings. But nationwide, more than 1 million teachers--about 40 percent of all…
Universities' Royalty Income Increased 33% in 1997, Reaching $446-Million.
ERIC Educational Resources Information Center
Basinger, Julianne
1999-01-01
According to an annual survey, 132 U.S. research universities earned over $446 million in royalties from inventions in fiscal 1997, and received 2,239 patents. The University of California was the top earner. Data provided on the top-earning institutions includes dollar amount of adjusted gross royalties received, number of licenses generating…
Universities Collected $642-Million in Royalties on Inventions in 1999.
ERIC Educational Resources Information Center
Blumenstyk, Goldie
2000-01-01
U.S. universities collected more than $641 million from royalties on their inventions in the 1999 fiscal year, and they filed for 7,612 patents. Findings from a survey by the Association of University Technology Managers show licensing revenues, patent activity, and income from technology developments of U.S. higher education institutions. (SLD)
50 million dollar washing machine on-line at Galatia
Wright, A.
1985-02-01
The coal preparation plant at Kerr-McGee's Galatia mine in Illinois is designed to process 2 million ton/year. Details of the coal from the two-seam mine are given and a flow-sheet of the cleaning process is presented.
EPA Provides Puerto Rico $27 Million for Clean Water Projects
(New York, N.Y.) The U.S. Environmental Protection Agency has allotted $27 million to Puerto Rico to help finance improvements to water projects that are essential to protecting public health and the environment. The funds will be used to finance water qua
EPA Provides New Jersey $74 Million for Clean Water Projects
(New York, N.Y.) The U.S. Environmental Protection Agency has allotted $74 million to New Jersey to help finance improvements to water projects that are essential to protecting public health and the environment. The funds will be used to finance water qual
New Program Aims $300-Million at Young Biomedical Researchers
ERIC Educational Resources Information Center
Goodall, Hurley
2008-01-01
Medical scientists just starting at universities have been, more and more often, left empty-handed when the federal government awards grants. To offset this, the Howard Hughes Medical Institute, a nonprofit organization dedicated to medical research, announced a new program that will award $300-million to as many as 70 young scientists. The Early…
Capital Campaigns to Raise $100-Million or More.
ERIC Educational Resources Information Center
Chronicle of Higher Education, 1987
1987-01-01
A table lists the colleges and universities that have initiated capital campaigns to raise $100-million or more. Names of the universities, their goals, public announcement dates, completion dates, and gifts and pledges as of June 30, 1987 are given. (MLW)
ONE MILLION GALLON WATER TANK, PUMP HEADER PIPE (AT LEFT), ...
ONE MILLION GALLON WATER TANK, PUMP HEADER PIPE (AT LEFT), HEADER BYPASS PIPE (AT RIGHT), AND PUMPHOUSE FOUNDATIONS. Looking northeast - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Flame Deflector Water System, Test Area 1-120, north end of Jupiter Boulevard, Boron, Kern County, CA
Lender Allowed to Keep Federal Overpayment of $278-Million
ERIC Educational Resources Information Center
Field, Kelly
2007-01-01
This article reports that the US Education Department has announced that it will not require the National Education Loan Network (Nelnet), a major for-profit student-loan provider based in Nebraska, to return hundreds of millions of dollars in government subsidies, but it will cut off the overpayments going forward. The department will also stop…
Once in a Million Years: Teaching Geologic Time
ERIC Educational Resources Information Center
Lewis, Susan E.; Lampe, Kristen A.; Lloyd, Andrew J.
2005-01-01
The authors advocate that students frequently lack fundamental numerical literacy on the order of millions or billions, and that this comprehension is critical to grasping key evolutionary concepts related to the geologic time scale, the origin and diversification of life on earth, and other concepts such as the national debt, human population…
Probability workshop to be better in probability topic
NASA Astrophysics Data System (ADS)
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, higher statistics anxiety does not lead to lower scores in probability performance. The study also revealed that students who were motivated by the probability workshop showed a positive improvement in probability performance compared with before the workshop. In addition, there is a significant difference in performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Photographer : JPL Range : 4.2 million km. ( 2.6 million miles ) Jupiter's moon Europa, the size of
NASA Technical Reports Server (NTRS)
1979-01-01
Photographer: JPL. Range: 4.2 million km (2.6 million miles). Jupiter's moon Europa, the size of earth's moon, is apparently covered by water ice, as indicated by ground spectrometers and its brightness. In this view, global-scale dark streaks discovered by Voyager 1 that criss-cross the satellite are becoming visible. Bright rayed impact craters, which are abundant on Ganymede and Callisto, would be easily visible at this range, suggesting that Europa's surface is young and that the streaks are reflections of currently active internal dynamic processes.
Propensity, Probability, and Quantum Theory
NASA Astrophysics Data System (ADS)
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Dose Reconstruction for the Million Worker Study: Status and Guidelines
Bouville, André; Toohey, Richard E.; Boice, John D.; Beck, Harold L.; Dauer, Larry T.; Eckerman, Keith F.; Hagemeyer, Derek; Leggett, Richard W.; Mumma, Michael T.; Napier, Bruce; Pryor, Kathy H.; Rosenstein, Marvin; Schauer, David A.; Sherbini, Sami; Stram, Daniel O.; Thompson, James L.; Till, John E.; Yoder, Craig; Zeitlin, Cary
2015-02-01
The primary aim of the epidemiologic study of one million U.S. radiation workers and veterans (the Million-Worker study) is to provide scientifically valid information on the level of radiation risk when exposures are received gradually over time, and not acutely as was the case for Japanese atomic bomb survivors. The primary outcome of the epidemiological study is cancer mortality but other causes of death such as cardiovascular disease and cerebrovascular disease will be evaluated. The success of the study is tied to the validity of the dose reconstruction approaches to provide unbiased estimates of organ-specific radiation absorbed doses and their accompanying uncertainties. The dosimetry aspects for the Million-Worker study are challenging in that they address diverse exposure scenarios for diverse occupational groups being studied over a period of up to 70 years. The dosimetric issues differ among the varied exposed populations that are considered: atomic veterans, DOE workers exposed to both penetrating radiation and intakes of radionuclides, nuclear power plant workers, medical radiation workers, and industrial radiographers. While a major source of radiation exposure to the study population comes from external gamma-ray or x-ray sources, for certain of the study groups there is a meaningful component of radionuclide intakes that require internal radiation dosimetry measures. Scientific Committee 6-9 has been established by NCRP to produce a report on the comprehensive organ dose assessment (including uncertainty analysis) for the Million-Worker study. The Committee’s report will cover the specifics of practical dose reconstruction for the ongoing epidemiologic studies with uncertainty analysis discussions and will be a specific application of the guidance provided in NCRP Reports 158, 163, 164, and 171. The main role of the Committee is to provide guidelines to the various groups of dosimetrists involved in the various components of the Million
The Probabilities of Unique Events
2012-08-30
probabilities into quantum mechanics, and some psychologists have argued that they have a role to play in accounting for errors in judgment [30]. But, in...Discussion The mechanisms underlying naive estimates of the probabilities of unique events are largely inaccessible to consciousness, but they...Can quantum probability provide a new direction for cognitive modeling? Behavioral and Brain Sciences (in press). 31. Paolacci G, Chandler J
Probability Surveys, Conditional Probability, and Ecological Risk Assessment
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Information Processing Using Quantum Probability
NASA Astrophysics Data System (ADS)
Behera, Laxmidhar
2006-11-01
This paper presents an information processing paradigm that introduces collective response of multiple agents (computational units) while the level of intelligence associated with the information processing has been increased manifold. It is shown that if the potential field of the Schroedinger wave equation is modulated using a self-organized learning scheme, then the probability density function associated with the stochastic data is transferred to the probability amplitude function which is the response of the Schroedinger wave equation. This approach illustrates that information processing of data with stochastic behavior can be efficiently done using quantum probability instead of classical probability. The proposed scheme has been demonstrated through two applications: denoising and adaptive control.
The relationship between species detection probability and local extinction probability
Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.
2004-01-01
In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
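The partition-and-weight idea described above can be sketched with simulated data. This is only an illustration of the structure, not the authors' estimator: the median cutoff, the simulated detection model, and the group-size weights are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n_species, n_visits = 100, 10

# Simulated detection counts per species over repeated visits, with
# heterogeneous detection probabilities drawn from a Beta distribution.
det_prob = rng.beta(2, 2, size=n_species)
detections = rng.binomial(n_visits, det_prob)

# Simulated local-extinction indicators, negatively tied to detectability
# (mirroring the hypothesis tested in the paper).
extinct = rng.random(n_species) < (0.5 - 0.4 * det_prob)

# Partition species into low/high detection groups at the median frequency.
low = detections <= np.median(detections)

# Group-wise extinction estimates and group-size weights.
p = np.array([extinct[low].mean(), extinct[~low].mean()])
w = np.array([low.mean(), (~low).mean()])

p_weighted = float(w @ p)  # weighted community-level extinction estimate
```

With group-size weights the combined estimate reduces to the pooled mean; the paper's weighting instead adjusts for heterogeneity between the groups, which this sketch does not attempt to reproduce.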
Million-degree plasma pervading the extended Orion Nebula.
Güdel, Manuel; Briggs, Kevin R; Montmerle, Thierry; Audard, Marc; Rebull, Luisa; Skinner, Stephen L
2008-01-18
Most stars form as members of large associations within dense, very cold (10 to 100 kelvin) molecular clouds. The nearby giant molecular cloud in Orion hosts several thousand stars of ages less than a few million years, many of which are located in or around the famous Orion Nebula, a prominent gas structure illuminated and ionized by a small group of massive stars (the Trapezium). We present x-ray observations obtained with the X-ray Multi-Mirror satellite XMM-Newton, revealing that a hot plasma with a temperature of 1.7 to 2.1 million kelvin pervades the southwest extension of the nebula. The plasma flows into the adjacent interstellar medium. This x-ray outflow phenomenon must be widespread throughout our Galaxy.
Learning-assisted theorem proving with millions of lemmas☆
Kaliszyk, Cezary; Urban, Josef
2015-01-01
Large formal mathematical libraries consist of millions of atomic inference steps that give rise to a corresponding number of proved statements (lemmas). Analogously to the informal mathematical practice, only a tiny fraction of such statements is named and re-used in later proofs by formal mathematicians. In this work, we suggest and implement criteria defining the estimated usefulness of the HOL Light lemmas for proving further theorems. We use these criteria to mine the large inference graph of the lemmas in the HOL Light and Flyspeck libraries, adding up to millions of the best lemmas to the pool of statements that can be re-used in later proofs. We show that in combination with learning-based relevance filtering, such methods significantly strengthen automated theorem proving of new conjectures over large formal mathematical libraries such as Flyspeck. PMID:26525678
Flood basalt volcanism during the past 250 million years.
Rampino, M R; Stothers, R B
1988-08-05
A chronology of the initiation dates of major continental flood basalt volcanism is established from published potassium-argon (K-Ar) and argon-argon (Ar-Ar) ages of basaltic rocks and related basic intrusions. The dating is therefore independent of the biostratigraphic and paleomagnetic time scales. Estimated errors of the initiation dates of the volcanic episodes determined from the distributions of the radiometric ages are, approximately, plus or minus 4 percent. There were 11 distinct episodes during the past 250 million years. Sometimes appearing in pairs, the episodes have occurred quasi-periodically with a mean cycle time of 32 +/- 1 (estimated error of the mean) million years. The initiation dates of the episodes are close to the estimated dates of mass extinctions of marine organisms. Showers of impacting comets may be the cause.
The 13 million year Cenozoic pulse of the Earth
NASA Astrophysics Data System (ADS)
Chen, Jiasheng; Kravchinsky, Vadim A.; Liu, Xiuming
2015-12-01
The geomagnetic polarity reversal rate changes radically from very low to extremely high. Such process indicates fundamental changes in the Earth's core reorganization and core-mantle boundary heat flow fluctuations. However, we still do not know how critical such changes are to surface geology and climate processes. Our analysis of the geomagnetic reversal frequency, oxygen isotope record, and tectonic plate subduction rate, which are indicators of the changes in the heat flux at the core mantle boundary, climate and plate tectonic activity, shows that all these changes indicate similar rhythms on million years' timescale in the Cenozoic Era occurring with the common fundamental periodicity of ∼13 Myr during most of the time. The periodicity is disrupted only during the last 20 Myr. Such periodic behavior suggests that large scale climate and tectonic changes at the Earth's surface are closely connected with the million year timescale cyclical reorganization of the Earth's interior.
Improving Multi-Million Virtual Rank MPI Execution in
Perumalla, Kalyan S; Park, Alfred J
2011-01-01
(MUPI) is a parallel discrete event simulator designed for enabling software-based experimentation via simulated execution across a range of synthetic to unmodified parallel programs using the Message Passing Interface (MPI) with millions of tasks. Here, we report work in progress in improving the efficiency of MUPI. Among the issues uncovered are the scaling problems with implementing barriers and inter-task message ordering. Preliminary performance shows the possibility of supporting hundreds of virtual MPI ranks per real processor core. Performance improvements of at least 2x are observed, and enable execution of benchmark MPI runs with over 16 million virtual ranks synchronized in a discrete event fashion on as few as 16,128 real cores of a Cray XT5.
Berkeley Lab scientists develop criteria for $20 million energy challenge
Walker, Iain
2016-07-12
Berkeley Lab's Iain Walker and his colleagues in environmental energy research helped the Siebel Foundation develop the criteria for its Energy Free Home Challenge, which comes with a $20 million global incentive prize. The Challenge is a competition to create a new generation of systems and technologies for practical homes that realize a net-zero, non-renewable energy footprint without increasing the cost of ownership. It is open to everyone everywhere, from university teams to handymen and hobbyists.
Sky Brightness Analysis using a Million GEODSS Observations
NASA Astrophysics Data System (ADS)
Mandeville, W. Jody; McLaughlin, Tim; Six, Steve; Hollm, Rick
2012-09-01
Brightness of the sky background due to lunar phase and location can dramatically affect the limiting magnitude of astronomical detectors. Previous theoretical models have been tested against only limited data sets, with 10-20% differences between model and observation. This paper compares and contrasts previous investigations with over a million data points collected from various GEODSS sites located around the world, and attempts to refine predictive modeling of sky brightness for use in scheduling as well as in modeling and simulation tools.
Capture probabilities for secondary resonances
NASA Technical Reports Server (NTRS)
Malhotra, Renu
1990-01-01
A perturbed pendulum model is used to analyze secondary resonances, and it is shown that a self-similarity between secondary and primary resonances exists. Henrard's (1982) theory is used to obtain formulas for the capture probability into secondary resonances. The tidal evolution of Miranda and Umbriel is considered as an example, and significant probabilities of capture into secondary resonances are found.
Definition of the Neutrosophic Probability
NASA Astrophysics Data System (ADS)
Smarandache, Florentin
2014-03-01
Neutrosophic probability (or likelihood) [1995] is a particular case of the neutrosophic measure. It is an estimation of an event (different from indeterminacy) to occur, together with an estimation that some indeterminacy may occur, and the estimation that the event does not occur. The classical probability deals with fair dice, coins, roulettes, spinners, decks of cards, and random walks, while neutrosophic probability deals with such unfair, imperfect objects and processes. For example, if we toss a regular die on an irregular surface which has cracks, then it is possible to get the die stuck on one of its edges or vertices in a crack (indeterminate outcome). The sample space is in this case: {1, 2, 3, 4, 5, 6, indeterminacy}. So the probability of getting, for example, 1 is less than 1/6, since there are seven outcomes. The neutrosophic probability is a generalization of the classical probability because, when the chance of indeterminacy of a stochastic process is zero, these two probabilities coincide. The neutrosophic probability that an event A occurs is NP(A) = (ch(A), ch(indet A), ch(not A)) = (T, I, F), where T, I, F are subsets of [0,1]; T is the chance that A occurs, denoted ch(A); I is the indeterminate chance related to A, ch(indet A); and F is the chance that A does not occur, ch(not A). So NP is a generalization of the imprecise probability as well. If T, I, and F are crisp numbers then -0 <= T + I + F <= 3+. We use the same notations (T, I, F) as in neutrosophic logic and set theory.
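The cracked-die example above can be written out directly. The numerical chance assigned to the indeterminate outcome below is an assumed value chosen only for illustration:

```python
ch_indet = 0.04                  # assumed chance the die sticks in a crack
ch_face = (1 - ch_indet) / 6     # chance of any single face, now below 1/6

def NP(ch_event, ch_indet):
    """Neutrosophic probability NP(A) = (T, I, F): the chance A occurs,
    the indeterminate chance, and the chance A does not occur."""
    return (ch_event, ch_indet, 1 - ch_event - ch_indet)

T, I, F = NP(ch_face, ch_indet)
# In this crisp case T + I + F = 1; when ch_indet = 0 the triple
# collapses to the classical (p, 0, 1 - p), as the abstract notes.
```

This reproduces the abstract's point that each face's chance drops below 1/6 once some probability mass is assigned to the indeterminate seventh outcome.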
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
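A small Monte Carlo sketch of the core claim, under an assumed log-normal risk factor and a plug-in quantile threshold. The paper derives exact results for location-scale families; this is only a numerical illustration with assumed sample sizes and parameters:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)
std_norm = NormalDist()

eps = 0.01            # nominal required failure probability
n = 30                # small sample, so parameter error is material
z = std_norm.inv_cdf(1 - eps)

realized = []
for _ in range(2000):
    # The decision-maker estimates location and scale of the log risk
    # factor from a small sample (true values: mu = 0, sigma = 1)...
    s = rng.normal(0.0, 1.0, size=n)
    mu_hat, sigma_hat = s.mean(), s.std(ddof=1)
    # ...and sets the control threshold at the estimated (1 - eps) quantile.
    threshold = mu_hat + sigma_hat * z
    # True probability that the risk factor exceeds that threshold:
    realized.append(1 - std_norm.cdf(threshold))

# Averaged over estimation error, the realized failure frequency
# exceeds the nominal level eps.
mean_failure = sum(realized) / len(realized)
```

The gap between `mean_failure` and `eps` shrinks as `n` grows, which matches the paper's remedy (1): tightening the nominal requirement as a function of the available data.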
Cluster membership probability: polarimetric approach
NASA Astrophysics Data System (ADS)
Medhi, Biman J.; Tamura, Motohide
2013-04-01
Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q (per cent) and u (per cent) for the proper-motion member stars depends on the interstellar and intracluster differential reddening in the open cluster. It is found that this method could be used to estimate the cluster membership probability if we have additional polarimetric and photometric information for a star to identify it as a probable member/non-member of a particular cluster, such as the maximum wavelength value (λmax), the unit weight error of the fit (σ1), the dispersion in the polarimetric position angles (ε̄), reddening (E(B - V)) or the differential intracluster reddening (ΔE(B - V)). This method could also be used to estimate the membership probability of known member stars having no membership probability as well as to resolve disagreements about membership among different proper-motion surveys.
Minnich, Ronald G.; Rudish, Donald W.
2009-01-01
In this paper we describe Megatux, a set of tools we are developing for rapid provisioning of millions of virtual machines and controlling and monitoring them, as well as what we've learned from booting one million Linux virtual machines on the Thunderbird (4660 nodes) and 550,000 Linux virtual machines on the Hyperion (1024 nodes) clusters. As might be expected, our tools use hierarchical structures. In contrast to existing HPC systems, our tools do not require perfect hardware; that all systems be booted at the same time; or static configuration files that define the role of each node. While we believe these tools will be useful for future HPC systems, we are using them today to construct botnets. Botnets have been in the news recently, as discoveries of their scale (millions of infected machines for even a single botnet), their reach (global), and their impact on organizations (devastating in financial costs and time lost to recovery) have become more apparent. A distinguishing feature of botnets is their emergent behavior: fairly simple operational rule sets can result in behavior that cannot be predicted. In general, there is no reducible understanding of how a large network will behave ahead of 'running it'. 'Running it' means observing the actual network in operation or simulating/emulating it. Unfortunately, this behavior is only seen at scale, i.e. when at minimum 10s of thousands of machines are infected. To add to the problem, botnets typically change at least 11% of the machines they are using in any given week, and this changing population is an integral part of their behavior. The use of virtual machines to assist in the forensics of malware is not new to the cyber security world. Reverse engineering techniques often use virtual machines in combination with code debuggers. Nevertheless, this task largely remains a manual process to get past code obfuscation and is inherently slow. As part of our cyber security work at Sandia National Laboratories
Holographic Probabilities in Eternal Inflation
NASA Astrophysics Data System (ADS)
Bousso, Raphael
2006-11-01
In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.
Millions of extra deaths a result of shortfalls in ODA.
1997-01-01
This news brief summarizes the conclusions of a newly released report by the UN Population Fund for its executive board. The report is "Meeting the Goals of the 1994 International Conference on Population and Development (ICPD): Consequences of Resource Shortfalls up to the Year 2000." Donor spending, including World Bank loans, increased by 40% in 1994, increased by 40% in 1995, and stabilized. The ICPD indicates a need for an additional $5.7 billion in donor assistance, an increase of 23% each year until the year 2000. Donors must meet their 33% share of the US$17 billion required in the year 2000 and the $21.7 billion by 2015. This level of assistance is needed in order to meet the goals of the ICPD agreed upon by attending countries. Developing countries have continued to increase their allocations for reproductive health in accordance with the ICPD recommendations. Most donor countries were not close to meeting the official development assistance target of 0.7% of gross national product. France, Italy, and Belgium have been slow in responding to population program assistance. However, Norway and the Netherlands have passed laws requiring that 4% of their official development assistance be applied to population programs. Denmark is cooperative. Japan is a leading donor, followed by the UK. It is estimated in the UN executive board report that during 1995-2000 shortfalls in assistance will result in at least 120 million additional unwanted pregnancies, 49 million abortions, 5 million infant and child deaths, and 65,000 maternal deaths.
[The Six Million Dollar Man: from fiction to reality].
Langeveld, C H Kees
2013-01-01
The term 'bionic' has been in existence since 1958, but only gained general recognition from the television series 'The Six Million Dollar Man'. Following a crash, the central figure in this series - test pilot Steve Austin - has an eye, an arm and both legs replaced by prostheses which make him stronger and faster than a normal person. This story is based on the science fiction book 'Cyborg' by Martin Caidin. In the world of comic books and films there are a number of examples of people who are given superhuman powers by having technological gadgets built in. Although the latter is not yet possible, the bionic human has now become reality.
MULTI - MILLION - TURN BEAM POSITION MONITORS FOR RHIC.
Satogata, T.; Cameron, P.; Cerniglia, P.; Cupolo, J.; Dawson, C.; Degen, C.; Mead, J.; Vetter, K.
2003-05-12
During the RHIC 2003 run, two beam position monitors (BPMs) in each transverse plane in the RHIC blue ring were upgraded with high-capacity mezzanine cards. This upgrade provided these planes with the capability to digitize up to 128 million consecutive turns of RHIC beam, or almost 30 minutes of continuous beam centroid phase space evolution for a single RHIC bunch. This paper describes necessary hardware and software changes and initial system performance. We discuss early uses and results for diagnosis of coherent beam oscillations, turn-by-turn (TBT) acquisition through a RHIC acceleration ramp, and ac-dipole nonlinear dynamics studies.
15 million preterm births annually: what has changed this year?
2012-01-01
Each year, more than 1 in 10 of the world’s babies are born preterm, resulting in 15 million babies born too soon. World Prematurity Day, November 17, is a global effort to raise awareness about prematurity. This past year, there has been increased awareness of the problem, through new data and evidence, global partnership and country champions. Actions to improve care would save hundreds of thousands of babies born too soon from death and disability. Accelerated prevention requires urgent research breakthroughs. PMID:23148557
A Million-Second Chandra View of Cassiopeia A
NASA Technical Reports Server (NTRS)
Hwang, Una; Laming, J. Martin; Badenes, Carles; Berendse, Fred; Blondin, John; Cioffi, Denis; DeLaney, Tracey; Dewey, Daniel; Fesen, Robert; Flanagan, Kathryn A.
2004-01-01
We introduce a million-second observation of the supernova remnant Cassiopeia A with the Chandra X-ray Observatory. The bipolar structure of the Si-rich ejecta (NE jet and SW counterpart) is clearly evident in the new images, and their chemical similarity is confirmed by their spectra. These are most likely due to jets of ejecta as opposed to cavities in the circumstellar medium, since we can reject simple models for the latter. The properties of these jets and the Fe-rich ejecta will provide clues to the explosion of Cas A.
Over 30 million psychedelic users in the United States.
Krebs, Teri S; Johansen, Pål-Ørjan
2013-01-01
We estimated lifetime prevalence of psychedelic use (lysergic acid diethylamide (LSD), psilocybin (magic mushrooms), mescaline, and peyote) by age category using data from a 2010 US population survey of 57,873 individuals aged 12 years and older. There were approximately 32 million lifetime psychedelic users in the US in 2010; including 17% of people aged 21 to 64 years (22% of males and 12% of females). Rate of lifetime psychedelic use was greatest among people aged 30 to 34 (total 20%, including 26% of males and 15% of females).
The Anguilla spp. migration problem: 40 million years of evolution and two millennia of speculation.
Righton, D; Aarestrup, K; Jellyman, D; Sébert, P; van den Thillart, G; Tsukamoto, K
2012-07-01
Anguillid eels Anguilla spp. evolved between 20 and 40 million years ago and possess a number of remarkable migratory traits that have fascinated scientists for millennia. Despite centuries of effort, the spawning areas and migrations are known only for a few species. Even for these species, information on migratory behaviour is remarkably sketchy. The latest knowledge on the requirements for successful migration and field data on the migrations of adults and larvae are presented, how experiments on swimming efficiency have progressed the understanding of migration are highlighted and the challenges of swimming at depth considered. The decline of Anguilla spp. across the world is an ongoing concern for fisheries and environmental managers. New developments in the knowledge of eel migration will, in addition to solving a centuries old mystery, probably help to identify how this decline might be halted or even reversed.
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.
Dinosaurs, Dinosaur Eggs, and Probability.
ERIC Educational Resources Information Center
Teppo, Anne R.; Hodgson, Ted
2001-01-01
Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)
Scher, Howie D; Whittaker, Joanne M; Williams, Simon E; Latimer, Jennifer C; Kordesch, Wendy E C; Delaney, Margaret L
2015-07-30
Earth's mightiest ocean current, the Antarctic Circumpolar Current (ACC), regulates the exchange of heat and carbon between the ocean and the atmosphere, and influences vertical ocean structure, deep-water production and the global distribution of nutrients and chemical tracers. The eastward-flowing ACC occupies a unique circumglobal pathway in the Southern Ocean that was enabled by the tectonic opening of key oceanic gateways during the break-up of Gondwana (for example, by the opening of the Tasmanian Gateway, which connects the Indian and Pacific oceans). Although the ACC is a key component of Earth's present and past climate system, the timing of the appearance of diagnostic features of the ACC (for example, low zonal gradients in water-mass tracer fields) is poorly known and represents a fundamental gap in our understanding of Earth history. Here we show, using geophysically determined positions of continent-ocean boundaries, that the deep Tasmanian Gateway opened 33.5 ± 1.5 million years ago (the errors indicate uncertainty in the boundary positions). Following this opening, sediments from Indian and Pacific cores recorded Pacific-type neodymium isotope ratios, revealing deep westward flow equivalent to the present-day Antarctic Slope Current. We observe onset of the ACC at around 30 million years ago, when Southern Ocean neodymium isotopes record a permanent shift to modern Indian-Atlantic ratios. Our reconstructions of ocean circulation show that massive reorganization and homogenization of Southern Ocean water masses coincided with migration of the northern margin of the Tasmanian Gateway into the mid-latitude westerly wind band, which we reconstruct at 64° S, near to the northern margin. Onset of the ACC about 30 million years ago coincided with major changes in global ocean circulation and probably contributed to the lower atmospheric carbon dioxide levels that appear after this time.
Joint probabilities and quantum cognition
NASA Astrophysics Data System (ADS)
de Barros, J. Acacio
2012-12-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
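The paper's claim that contextual random variables need not admit a joint probability distribution can be illustrated with a standard textbook-style example (our own, not necessarily the paper's oscillator construction): three ±1 variables whose pairwise correlations are all -1 cannot coexist under any joint distribution.

```python
from itertools import product

# Hypothetical illustration, not the paper's model: three +/-1 random
# variables X, Y, Z with pairwise correlations E[XY] = E[YZ] = E[XZ] = -1
# admit no joint distribution.  Every joint pmf is a mixture of the 8
# deterministic assignments, and for each assignment the product
# (xy)(yz)(xz) = (xyz)^2 = 1, so the three pair products can never all
# equal -1 at once.
def min_correlation_sum():
    # The minimum of E[XY] + E[YZ] + E[XZ] over all joint distributions is
    # attained at a vertex of the simplex, i.e. a deterministic assignment.
    return min(x * y + y * z + x * z for x, y, z in product((-1, 1), repeat=3))

print(min_correlation_sum())  # -1, so the target sum of -3 is unattainable
```

Pairwise marginals with all three correlations equal to -1 are individually realizable, yet no joint distribution reproduces them simultaneously, which is the hallmark of contextuality the abstract refers to.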
Isolation of a 250 million-year-old halotolerant bacterium from a primary salt crystal
NASA Astrophysics Data System (ADS)
Vreeland, Russell H.; Rosenzweig, William D.; Powers, Dennis W.
2000-10-01
Bacteria have been found associated with a variety of ancient samples; however, few studies are generally accepted, due to questions about sample quality and contamination. When Cano and Borucki isolated a strain of Bacillus sphaericus from an extinct bee trapped in 25-30 million-year-old amber, careful sample selection and stringent sterilization techniques were the keys to acceptance. Here we report the isolation and growth of a previously unrecognized spore-forming bacterium (Bacillus species, designated 2-9-3) from a brine inclusion within a 250 million-year-old salt crystal from the Permian Salado Formation. Complete gene sequences of the 16S ribosomal DNA show that the organism is part of the lineage of Bacillus marismortui and Virgibacillus pantothenticus. Delicate crystal structures and sedimentary features indicate the salt has not recrystallized since formation. Samples were rejected if brine inclusions showed physical signs of possible contamination. Surfaces of salt crystal samples were sterilized with strong alkali and acid before extracting brines from inclusions. Sterilization procedures reduce the probability of contamination to less than 1 in 10^9.
Earliest Porotic Hyperostosis on a 1.5-Million-Year-Old Hominin, Olduvai Gorge, Tanzania
Domínguez-Rodrigo, Manuel; Pickering, Travis Rayne; Diez-Martín, Fernando; Mabulla, Audax; Musiba, Charles; Trancho, Gonzalo; Baquedano, Enrique; Bunn, Henry T.; Barboni, Doris; Santonja, Manuel; Uribelarrea, David; Ashley, Gail M.; Martínez-Ávila, María del Sol; Barba, Rebeca; Gidna, Agness; Yravedra, José; Arriaza, Carmen
2012-01-01
Meat-eating was an important factor affecting early hominin brain expansion, social organization and geographic movement. Stone tool butchery marks on ungulate fossils in several African archaeological assemblages demonstrate a significant level of carnivory by Pleistocene hominins, but the discovery at Olduvai Gorge of a child's pathological cranial fragments indicates that some hominins probably experienced scarcity of animal foods during various stages of their life histories. The child's parietal fragments, excavated from 1.5-million-year-old sediments, show porotic hyperostosis, a pathology associated with anemia. Nutritional deficiencies, including anemia, are most common at weaning, when children lose passive immunity received through their mothers' milk. Our results suggest, alternatively, that (1) the developmentally disruptive potential of weaning reached far beyond sedentary Holocene food-producing societies and into the early Pleistocene, or that (2) a hominin mother's meat-deficient diet negatively altered the nutritional content of her breast milk to the extent that her nursing child ultimately died from malnourishment. Either way, this discovery highlights that by at least 1.5 million years ago early human physiology was already adapted to a diet that included the regular consumption of meat. PMID:23056303
Osteopathology in Rhinocerotidae from 50 Million Years to the Present
Stilson, Kelsey T.; Hopkins, Samantha S. B.; Davis, Edward Byrd
2016-01-01
Individual elements of many extinct and extant North American rhinocerotids display osteopathologies, particularly exostoses, abnormal textures, and joint margin porosity, that are commonly associated with localized bone trauma. When we evaluated six extinct rhinocerotid species spanning 50 million years (Ma), we found the incidence of osteopathology increases from 28% of all elements of Eocene Hyrachyus eximius to 65–80% of all elements in more derived species. The only extant species in this study, Diceros bicornis, displayed fewer osteopathologies (50%) than the more derived extinct taxa. To get a finer-grained picture, we scored each fossil for seven pathological indicators on a scale of 1–4. We estimated the average mass of each taxon using M1-3 length and compared mass to average pathological score for each category. We found that osteopathology also increases significantly with increasing mass. We then ran a phylogenetically-controlled regression analysis using a time-calibrated phylogeny of our study taxa. Mass estimates were found to significantly covary with abnormal foramen shape and abnormal bone textures. This pattern in osteopathological expression may reflect part of the complex system of adaptations in the Rhinocerotidae over millions of years, in which increased mass, cursoriality, and/or increased life span are selected for, to the detriment of long-term bone health. This work has important implications for the future health of hoofed animals and humans alike. PMID:26840633
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: they do if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
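To see why the simultaneous calibration the authors propose matters, here is a small Monte Carlo sketch (sample size, repetition count, and seed are arbitrary choices of ours, not the paper's): pointwise 1-α intervals for each ordered point cover all n points at once with probability well below 1-α, which is exactly the gap that simultaneous intervals are built to close.

```python
import random

def quantile(xs, p):
    # Nearest-rank empirical quantile.
    s = sorted(xs)
    return s[min(len(s) - 1, int(p * len(s)))]

def coverage_demo(n=20, alpha=0.05, reps=3000, seed=7):
    # Draw `reps` standard-normal samples of size n and sort each, so
    # column i holds the simulated distribution of the i-th order statistic.
    rng = random.Random(seed)
    sims = [sorted(rng.gauss(0, 1) for _ in range(n)) for _ in range(reps)]
    cols = list(zip(*sims))
    # Pointwise 1 - alpha interval for each ordered point.
    lo = [quantile(c, alpha / 2) for c in cols]
    hi = [quantile(c, 1 - alpha / 2) for c in cols]
    # Fraction of samples whose n points all fall inside simultaneously.
    inside = sum(all(l <= x <= h for x, l, h in zip(s, lo, hi)) for s in sims)
    return inside / reps

print(coverage_demo())  # noticeably below the nominal 0.95
```

Widening each interval until the joint coverage reaches 1-α is one way to obtain the simultaneous intervals the abstract describes.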
A Million Turnover Molecular Anode for Catalytic Water Oxidation.
Creus, Jordi; Matheu, Roc; Peñafiel, Itziar; Moonshiram, Dooshaye; Blondeau, Pascal; Benet-Buchholz, Jordi; García-Antón, Jordi; Sala, Xavier; Godard, Cyril; Llobet, Antoni
2016-12-05
Molecular ruthenium-based water-oxidation catalyst precursors of general formula [Ru(tda)(L^i)2] (tda^2- = [2,2':6',2''-terpyridine]-6,6''-dicarboxylato; L^1 = 4-(pyren-1-yl)-N-(pyridin-4-ylmethyl)butanamide, 1b; L^2 = 4-(pyren-1-yl)pyridine, 1c) have been prepared and thoroughly characterized. Both complexes contain a pyrene group that allows ready and efficient anchoring via π interactions onto multi-walled carbon nanotubes (MWCNTs). These hybrid solid-state materials are exceptionally stable molecular water-oxidation anodes capable of carrying out more than a million turnover numbers (TNs) at pH 7 at E_app = 1.45 V vs. NHE without any sign of degradation. X-ray absorption spectroscopy (XAS) analysis before, during, and after catalysis, together with electrochemical techniques, allows their unprecedented oxidative ruggedness to be monitored and verified.
A million peptide motifs for the molecular biologist.
Tompa, Peter; Davey, Norman E; Gibson, Toby J; Babu, M Madan
2014-07-17
A molecular description of functional modules in the cell is the focus of many high-throughput studies in the postgenomic era. A large portion of biomolecular interactions in virtually all cellular processes is mediated by compact interaction modules, referred to as peptide motifs. Such motifs are typically less than ten residues in length, occur within intrinsically disordered regions, and are recognized and/or posttranslationally modified by structured domains of the interacting partner. In this review, we suggest that there might be over a million instances of peptide motifs in the human proteome. While this staggering number suggests that peptide motifs are numerous and the most understudied functional module in the cell, it also holds great opportunities for new discoveries.
Bilaterian burrows and grazing behavior at >585 million years ago.
Pecoits, Ernesto; Konhauser, Kurt O; Aubet, Natalie R; Heaman, Larry M; Veroslavsky, Gerardo; Stern, Richard A; Gingras, Murray K
2012-06-29
Based on molecular clocks and biomarker studies, it is possible that bilaterian life emerged early in the Ediacaran, but at present, no fossils or trace fossils from this time have been reported. Here we report the discovery of the oldest bilaterian burrows in shallow-water glaciomarine sediments from the Tacuarí Formation, Uruguay. Uranium-lead dating of zircons in cross-cutting granite dykes constrains the age of these burrows to be at least 585 million years old. Their features indicate infaunal grazing activity by early eumetazoans. Active backfill within the burrow, an ability to wander upward and downward to exploit shallowly situated sedimentary laminae, and sinuous meandering suggest advanced behavioral adaptations. These findings unite the paleontological and molecular data pertaining to the evolution of bilaterians, and link bilaterian origins to the environmental changes that took place during the Neoproterozoic glaciations.
One million served: Rhode Island's recycling facility
Malloy, M.G.
1997-11-01
Rhode Island's landfill and adjacent materials recovery facility (MRF) in Johnston, both owned by the quasi-public Rhode Island Resource Recovery Corp. (RIRRC, Johnston), serve the entire state. The $12-million recycling facility was built in 1989 next to the state's sole landfill, the Central Landfill, which accepts only in-state trash. The MRF is operated for RIRRC by New England CRInc. (Hampton, N.H.), a unit of Waste Management, Inc. (WMI, Oak Brook, Ill.). It handles a wide variety of materials, from the usual newspaper, cardboard, and mixed containers to new streams such as wood waste, scrap metal, aseptic packaging (milk and juice boxes), and even textiles. State municipalities are in the process of adding many of these new recyclable streams into their curbside collection programs, all of which feed the facility.
Millions and billions - The META and BETA searches at Harvard
NASA Astrophysics Data System (ADS)
Leigh, Darren; Horowitz, Paul
1997-01-01
The META and BETA searches for microwave carriers have now completed a decade of observations of the northern sky, using the 26-m radiotelescope at the Agassiz Station. META's results set limits on the prevalence of advanced civilizations transmitting carrier beacons at the H I wavelength. BETA, the next-generation search system, combines greater bandwidth with quick lookback of candidate signals and robust rejection of signals of terrestrial origin. It covers the 'waterhole' of 1.4-1.7 GHz, using a 250-million-channel spectrometer, in conjunction with a three-beam antenna system, producing a 250 Mbytes/s output stream. The philosophy, implementation, and results of these systems are described.
Predicting loss exceedance probabilities for US hurricane landfalls
NASA Astrophysics Data System (ADS)
Murnane, R.
2003-04-01
The extreme winds, rains, and floods produced by landfalling hurricanes kill and injure many people and cause severe economic losses. Many business, planning, and emergency management decisions are based on the probability of hurricane landfall and associated emergency management considerations; however, the expected total economic and insured losses also are important. Insured losses generally are assumed to be half the total economic loss from hurricanes in the United States. Here I describe a simple model that can be used to estimate deterministic and probabilistic exceedance probabilities for insured losses associated with landfalling hurricanes along the US coastline. The model combines wind speed exceedance probabilities with loss records from historical hurricanes striking land. The wind speed exceedance probabilities are based on the HURDAT best track data and use the storm’s maximum sustained wind just prior to landfall. The loss records are normalized to present-day values using a risk model and account for historical changes in inflation, population, housing stock, and other factors. Analysis of the correlation between normalized losses and a number of storm-related parameters suggests that the most relevant, statistically-significant predictor for insured loss is the storm’s maximum sustained wind at landfall. Insured loss exceedance probabilities thus are estimated using a linear relationship between the log of the maximum sustained winds and normalized insured loss. Model estimates for insured losses from Hurricanes Isidore (US$45 million) and Lili (US$275 million) compare well with loss estimates from more sophisticated risk models and recorded losses. The model can also be used to estimate how exceedance probabilities for insured loss vary as a function of the North Atlantic Oscillation and the El Niño-Southern Oscillation.
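The model as described maps a wind-speed exceedance curve through a log-linear wind-loss relation. A minimal sketch with entirely made-up coefficients and a hypothetical wind exceedance table (not HURDAT values or the author's fitted parameters):

```python
import math

# Hypothetical log-linear relation: log10(loss in $ millions) = A + B * wind_kt.
A, B = -1.0, 0.035

def loss_from_wind(wind_kt):
    return 10 ** (A + B * wind_kt)

def loss_exceedance(wind_exceedance, loss_threshold):
    """P(loss >= threshold) = P(wind >= wind implied by inverting the relation)."""
    w_needed = (math.log10(loss_threshold) - A) / B
    # Linearly interpolate the (wind, annual exceedance probability) table.
    pts = sorted(wind_exceedance.items())
    for (w1, p1), (w2, p2) in zip(pts, pts[1:]):
        if w1 <= w_needed <= w2:
            t = (w_needed - w1) / (w2 - w1)
            return p1 + t * (p2 - p1)
    return pts[-1][1] if w_needed > pts[-1][0] else pts[0][1]

# Hypothetical annual wind exceedance probabilities (knots -> probability).
winds = {60: 0.9, 80: 0.5, 100: 0.2, 120: 0.05, 140: 0.01}
print(round(loss_exceedance(winds, 100.0), 3))  # 0.414
```

Because the wind-loss relation is monotone, the loss exceedance curve inherits its shape directly from the wind exceedance curve, which is the core of the simple model the abstract describes.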
Talking dirty: how to save a million lives.
Curtis, V
2003-06-01
Infectious diseases are still the number one threat to public health in developing countries. Diarrhoeal diseases alone are responsible for the deaths of at least 2 million children yearly - hygiene is paramount to resolving this problem. The function of hygienic behaviour is to prevent the transmission of the agents of infection. The most effective way of stopping infection is to stop faecal material getting into the child's environment by safe disposal of faeces and washing hands with soap once faecal material has contaminated them in the home. A review of the literature on handwashing puts it top in a list of possible interventions to prevent diarrhoea. Handwashing with soap has been calculated to save a million lives. However, few people do wash their hands with soap at these critical times. Obtaining a massive increase in handwashing worldwide requires a sea-change in thinking. Initial results from a new programme led by the World Bank, with many partner organisations, suggest that health is low on people's list of motives, rather, hands are washed to remove dirt, to rinse food off after eating, to make hands look and smell good, and as an act of motherly caring. Professional consumer and market research agencies are being used to work with the soap industry to design professional communications programmes to reach whole populations in Ghana and India. Tools and techniques for marketing handwashing and for measuring the actual impact on behaviour will be applied in new public-private handwashing programmes, which are to start up soon in Nepal, China, Peru and Senegal.
Atmospheric Oxygen Variation Over the Last 100 Million Years
NASA Astrophysics Data System (ADS)
Watson, A. J.; Mills, B.; Daines, S. J.; Lenton, T. M.; Belcher, C.
2014-12-01
There is no agreement over how atmospheric oxygen has varied over recent Earth history. Our knowledge of past O2 concentrations relies on biogeochemical modelling, constrained by geochemical data and proxies. There are, however, few direct indicators of oxygen concentrations, though the presence of fossil charcoal indicates that levels have not strayed outside the "fire window", say below 16% or above 35%, during the last hundred million years. Different model predictions, however, encompass both decreasing and increasing trends over this period. These predictions are sensitive to weathering of continental rocks, which provides a sink for O2 but also a supply of phosphorus and sediment to the ocean, both of which increase carbon burial and thereby provide an oxygen source. Here we update our COPSE model with a more detailed treatment than hitherto, incorporating new input data, seafloor weathering processes, and different compositions and weatherability of granites and basalts. Our model suggests a broadly declining O2 trend over the late Mesozoic to present. An alternative forcing uses the phosphorus deposition curve of Follmi (1995), which is constructed from P measurements in ocean cores and indicates P fluxes to the oceans that have varied over time by two orders of magnitude. Used to drive the model, this also results in a declining long-term trend for atmospheric O2 over the last hundred million years, but with dramatic shorter-term variations superposed on the trend. These, however, stay (just) within the "fire window" for oxygen concentrations, and can be tentatively related to the evolution of fire adaptations in plants.
Detonation probabilities of high explosives
Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.
1995-07-01
The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
Interference of probabilities in dynamics
Zak, Michail
2014-08-15
A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.
Knowledge typology for imprecise probabilities.
Wilson, G. D.; Zucker, L. J.
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Fiore, Michael C.; Croyle, Robert T.; Curry, Susan J.; Cutler, Charles M.; Davis, Ronald M.; Gordon, Catherine; Healton, Cheryl; Koh, Howard K.; Orleans, C. Tracy; Richling, Dennis; Satcher, David; Seffrin, John; Williams, Christine; Williams, Larry N.; Keller, Paula A.; Baker, Timothy B.
2004-01-01
In August 2002, the Subcommittee on Cessation of the Interagency Committee on Smoking and Health (ICSH) was charged with developing recommendations to substantially increase rates of tobacco cessation in the United States. The subcommittee’s report, A National Action Plan for Tobacco Cessation, outlines 10 recommendations for reducing premature morbidity and mortality by helping millions of Americans stop using tobacco. The plan includes both evidence-based, population-wide strategies designed to promote cessation (e.g., a national quitline network) and a Smokers’ Health Fund to finance the programs (through a $2 per pack excise tax increase). The subcommittee report was presented to the ICSH (February 11, 2003), which unanimously endorsed sending it to Secretary Thompson for his consideration. In this article, we summarize the national action plan. PMID:14759928
Stretching Probability Explorations with Geoboards
ERIC Educational Resources Information Center
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
On probability-possibility transformations
NASA Technical Reports Server (NTRS)
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
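One classic transformation that comparisons of this kind typically include (we are not asserting it is among the exact ones tested in this paper) is the Dubois-Prade style construction: sort the probabilities in descending order and assign each outcome the tail sum from its own rank onward, which yields the most specific possibility distribution consistent with the probabilities.

```python
def prob_to_poss(p):
    # Dubois-Prade style transformation: with probabilities sorted in
    # descending order p_1 >= p_2 >= ... >= p_n, set pi_i = sum_{j >= i} p_j.
    # (Ties among equal probabilities would need extra care; ignored here.)
    order = sorted(range(len(p)), key=lambda i: -p[i])
    poss = [0.0] * len(p)
    tail = 1.0
    for i in order:
        poss[i] = tail
        tail -= p[i]
    return poss

print([round(x, 6) for x in prob_to_poss([0.5, 0.3, 0.2])])  # [1.0, 0.5, 0.2]
```

Note that the transformation is order-preserving (more probable outcomes are more possible) and normal (the most probable outcome gets possibility 1), two of the consistency properties such comparative studies examine.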
Children's Understanding of Posterior Probability
ERIC Educational Resources Information Center
Girotto, Vittorio; Gonzalez, Michael
2008-01-01
Do young children have a basic intuition of posterior probability? Do they update their decisions and judgments in the light of new evidence? We hypothesized that they can do so extensionally, by considering and counting the various ways in which an event may or may not occur. The results reported in this paper showed that from the age of five,…
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.
Probability Simulation in Middle School.
ERIC Educational Resources Information Center
Lappan, Glenda; Winter, M. J.
1980-01-01
Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-on-one foul shot in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)
GPS: Geometry, Probability, and Statistics
ERIC Educational Resources Information Center
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
DOSE RECONSTRUCTION FOR THE MILLION WORKER STUDY: STATUS AND GUIDELINES
Bouville, André; Toohey, Richard E.; Boice, John D.; Beck, Harold L.; Dauer, Larry T.; Eckerman, Keith F.; Hagemeyer, Derek; Leggett, Richard W.; Mumma, Michael T.; Napier, Bruce; Pryor, Kathy H.; Rosenstein, Marvin; Schauer, David A.; Sherbini, Sami; Stram, Daniel O.; Thompson, James L.; Till, John E.; Yoder, Craig; Zeitlin, Cary
2016-01-01
The primary aim of the epidemiologic study of one million U.S. radiation workers and veterans [the Million Worker Study (MWS)] is to provide scientifically valid information on the level of radiation risk when exposures are received gradually over time, and not within seconds as was the case for Japanese atomic-bomb survivors. The primary outcome of the epidemiologic study is cancer mortality but other causes of death such as cardiovascular disease and cerebrovascular disease will be evaluated. The success of the study is tied to the validity of the dose reconstruction approaches to provide realistic estimates of organ-specific radiation absorbed doses that are as accurate and precise as possible and to properly evaluate their accompanying uncertainties. The dosimetry aspects for the MWS are challenging in that they address diverse exposure scenarios for diverse occupational groups being studied over a period of up to 70 y. The dosimetric issues differ among the varied exposed populations that are considered: atomic veterans, U.S. Department of Energy workers exposed to both penetrating radiation and intakes of radionuclides, nuclear power plant workers, medical radiation workers, and industrial radiographers. While a major source of radiation exposure to the study population comes from external gamma- or x-ray sources, for some of the study groups there is a meaningful component of radionuclide intakes that require internal radiation dosimetry assessments. Scientific Committee 6–9 has been established by the National Council on Radiation Protection and Measurements (NCRP) to produce a report on the comprehensive organ dose assessment (including uncertainty analysis) for the MWS. The NCRP dosimetry report will cover the specifics of practical dose reconstruction for the ongoing epidemiologic studies with uncertainty analysis discussions and will be a specific application of the guidance provided in NCRP Report Nos. 158, 163, 164, and 171. The main role of the
Time-dependent earthquake probabilities
Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.
2005-01-01
We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in Stein et al. (1997) and Hardebeck (2004). We have developed a framework based on a simple, generalized rate-change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault. In the first application, the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (probability density function, or PDF) that describes a population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
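The conditional (time-dependent) failure probability referred to above follows directly from any recurrence-time PDF: given elapsed time t since the last event, P(failure in next dt) = (F(t+dt) - F(t)) / (1 - F(t)). A minimal sketch, assuming a lognormal recurrence model with invented parameters (median 150 years; not values from this study):

```python
import math

def lognormal_cdf(t, mu, sigma):
    # CDF of a lognormal recurrence-time distribution, via the error function.
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_probability(t, dt, mu=math.log(150.0), sigma=0.5):
    """P(next earthquake within dt years | t years elapsed since the last)."""
    F = lambda x: lognormal_cdf(x, mu, sigma)
    return (F(t + dt) - F(t)) / (1.0 - F(t))

# Unlike a memoryless (Poisson) model, the conditional probability of the
# next 30 years grows as elapsed time approaches the median recurrence time.
early = conditional_probability(50.0, 30.0)
late = conditional_probability(150.0, 30.0)
print(early < late)  # True
```

A static stress step can then be represented as an effective shift in t or in the loading rate, which is how rate-change formulations fold stress perturbations into such conditional probabilities.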
Simulation of LHC events on a million threads
NASA Astrophysics Data System (ADS)
Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.
2015-12-01
Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. By these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.
Building an automated 100 million+ variable star catalogue for Gaia
NASA Astrophysics Data System (ADS)
Holl, Berry; Eyer, Laurent; Mowlavi, Nami; Evans, Dafydd W.; Clementini, Gisella; Cuypers, Jan; Lanzafame, Alessandro; De Ridder, Joris; Sarro, Luis; Ordoñez-Blanco, Diego; Nienartowicz, Krzysztof; Charnas, Jonathan; Guy, Leanne; Jévardat de Fombelle, Grégory; Lecoeur-Taïbi, Isabelle; Rimoldini, Lorenzo; Süveges, Maria; Bouchy, François
2015-08-01
Gaia is currently monitoring over a billion sources in and around our Galaxy, of which on the order of a hundred million are expected to be variable stars. This unmatched sample will revolutionise research on stars and stellar physics, not only because of its sheer size, but also because of the availability of simultaneous photometric, astrometric and, for the brighter stars, radial velocity measurements. The public release of the Gaia data will be accompanied by many catalogues produced by the Gaia Data Processing and Analysis Consortium, amongst which the variable star catalogue provided by Coordination Unit 7 (CU7). This catalogue will be the starting point for many stellar studies following the data release and therefore has to be of very high quality. In this presentation we give an initial overview of the information that can be expected to be part of this variable star catalogue. We also discuss the important aspects of the CU7 automated pipeline that will lead to the production of this catalogue: i) the motivation of its design, ii) the modelling of periodic sources, iii) the synergy of various classifiers, and iv) variable-type-specific modelling. Finally, the advantages of combining photometric, spectroscopic and astrometric measurements will be highlighted.
Lessons learned at 208K: Towards Debugging Millions of Cores
Lee, G L; Ahn, D H; Arnold, D C; de Supinski, B R; Legendre, M; Miller, B P; Schulz, M J; Liblit, B
2008-04-14
Petascale systems will present several new challenges to performance and correctness tools. Such machines may contain millions of cores, requiring that tools use scalable data structures and analysis algorithms to collect and to process application data. In addition, at such scales, each tool itself will become a large parallel application--already, debugging the full Blue-Gene/L (BG/L) installation at the Lawrence Livermore National Laboratory requires employing 1664 tool daemons. To reach such sizes and beyond, tools must use a scalable communication infrastructure and manage their own tool processes efficiently. Some system resources, such as the file system, may also become tool bottlenecks. In this paper, we present challenges to petascale tool development, using the Stack Trace Analysis Tool (STAT) as a case study. STAT is a lightweight tool that gathers and merges stack traces from a parallel application to identify process equivalence classes. We use results gathered at thousands of tasks on an Infiniband cluster and results up to 208K processes on BG/L to identify current scalability issues as well as challenges that will be faced at the petascale. We then present implemented solutions to these challenges and show the resulting performance improvements. We also discuss future plans to meet the debugging demands of petascale machines.
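The central reduction the abstract attributes to STAT — merging per-process stack traces and grouping ranks into equivalence classes so a debugger inspects one representative per class — can be illustrated in a few lines. This is a toy sketch using a flat dictionary rather than STAT's scalable tree-based merge infrastructure; the rank count, function names, and traces are invented for illustration.

```python
from collections import defaultdict

def equivalence_classes(traces):
    """Group MPI ranks whose call stacks are identical.

    A debugger then only needs to inspect one representative
    rank per class instead of every process.
    """
    classes = defaultdict(list)
    for rank, stack in traces.items():
        classes[tuple(stack)].append(rank)
    return dict(classes)

# Hypothetical snapshot of a hung 4-rank job
traces = {
    0: ("main", "solve", "MPI_Waitall"),
    1: ("main", "solve", "MPI_Waitall"),
    2: ("main", "solve", "MPI_Waitall"),
    3: ("main", "write_checkpoint"),
}
classes = equivalence_classes(traces)
# Two classes: ranks 0-2 stuck in MPI_Waitall, rank 3 writing a checkpoint
```

At scale the payoff is that the number of classes is typically tiny compared to the number of processes, which is why the approach remains useful at hundreds of thousands of cores.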
"Dry Eye" Is the Wrong Diagnosis for Millions.
Korb, Donald R; Blackie, Caroline A
2015-09-01
The clinical perspective that dry eye is, at best, an incomplete diagnosis and the benefit of an etiology-based approach to dry eye are presented. To provide context for this perspective, the historical and current definition of dry eye is reviewed. The paradigm shift introduced by the Meibomian Gland Dysfunction (MGD) Workshop, that MGD is likely the leading cause of dry eye, is discussed in combination with the advancements in the diagnosis and treatment of MGD. To facilitate discussion on the benefit of an etiology-based approach, a retrospective observational analysis was performed on deidentified data from eligible, fully consented, refractory dry eye patients, where conventional sequelae-based dry eye treatment had failed. In this refractory population, the diagnosis of MGD, which directed treatment to evacuating gland obstructions and rehabilitating gland function, was successful. The clinical perspective that "dry eye" is the wrong diagnosis for millions is provocative. However, the MGD-first approach has the potential to revolutionize the timing of diagnosis and the choice of frontline therapy in most patients with dry eye. Additionally, the ability to screen for MGD in its earliest stages, during routine care, expands the scope of clinical practice to include early intervention. For most patients, we are no longer constrained to delay diagnosis until the tear film has decompensated and the cascade of inflammation has ensued. We do not have to wait for our patients to tell us there is a problem.
The Geological Grading Scale: Every Million Points Counts!
NASA Astrophysics Data System (ADS)
Stegman, D. R.; Cooper, C. M.
2006-12-01
The concept of geological time, ranging from thousands to billions of years, is naturally quite difficult for students to grasp initially, as it is much longer than the timescales over which they experience everyday life. Moreover, universities operate on a few key timescales (hourly lectures, weekly assignments, mid-term examinations) on which students' maximum attention is focused, largely driven by graded assessment. The geological grading scale exploits the overwhelming interest students have in grades as an opportunity to instill familiarity with geological time. With the geological grading scale, the number of possible points/marks/grades available in the course is scaled to 4.5 billion points --- collapsing the entirety of Earth history into one semester. Alternatively, geological time can be compressed into each assignment, with scores for weekly homeworks not worth 100 points each, but 4.5 billion! Homeworks left incomplete with questions unanswered lose 100's of millions of points - equivalent to missing the Paleozoic era. The expected quality of presentation for problem sets can be established with great impact in the first week by docking assignments an insignificant amount of points for handing in messy work, though likely more points than they've lost in their entire schooling history combined. Use this grading scale and your students will gradually begin to appreciate exactly how much time represents a geological blink of the eye.
Possible shell disease in 100 million-year-old crabs.
Klompmaker, Adiël A; Chistoserdov, Andrei Y; Felder, Darryl L
2016-05-03
Modern organisms exhibit evidence of many diseases, but recognizing such evidence in fossils remains difficult, thus hampering the study of the evolution of disease. We report on 2 molts of the goniodromitid crabs Distefania incerta and Goniodromites laevis from the mid-Cretaceous (late Albian) of Spain, with both species exhibiting damage to the dorsal carapace in otherwise well-preserved specimens. The subcircular to quadrate holes, found in <0.2% of the specimens, resemble damage caused by bacterial infections on the cuticle of modern decapods in terms of size and shape. Abiotic damage, predation, and encrustation followed by damage to the shell provide less satisfactory explanations, although these agents cannot be completely excluded from a role in shell disease etiology. We hypothesize that the observed fossil lesions are caused primarily by bacterial disease that started prior to molting, with or without other agents of initiation. If correct, this is the only known example of such bacterial infections in decapod crustaceans from the fossil record thus far, pushing back the evolutionary history of this type of shell disease by ~100 million years.
Hominins on Flores, Indonesia, by one million years ago.
Brumm, Adam; Jensen, Gitte M; van den Bergh, Gert D; Morwood, Michael J; Kurniawan, Iwan; Aziz, Fachroel; Storey, Michael
2010-04-01
Previous excavations at Mata Menge and Boa Lesa in the Soa Basin of Flores, Indonesia, recovered stone artefacts in association with fossilized remains of the large-bodied Stegodon florensis florensis. Zircon fission-track ages from these sites indicated that hominins had colonized the island by 0.88 +/- 0.07 million years (Myr) ago. Here we describe the contents, context and age of Wolo Sege, a recently discovered archaeological site in the Soa Basin that has in situ stone artefacts and that lies stratigraphically below Mata Menge and immediately above the basement breccias of the basin. We show using (40)Ar/(39)Ar dating that an ignimbrite overlying the artefact layers at Wolo Sege was erupted 1.02 +/- 0.02 Myr ago, providing a new minimum age for hominins on Flores. This predates the disappearance from the Soa Basin of 'pygmy' Stegodon sondaari and Geochelone spp. (giant tortoise), as evident at the nearby site of Tangi Talo, which has been dated to 0.90 +/- 0.07 Myr ago. It now seems that this extirpation or possible extinction event and the associated faunal turnover were the result of natural processes rather than the arrival of hominins. It also appears that the volcanic and fluvio-lacustrine deposits infilling the Soa Basin may not be old enough to register the initial arrival of hominins on the island.
$1.5 million female condom order awarded.
1997-12-01
The Female Health Co. of Chicago, Illinois, has reported receiving an order for 1.5 million female condoms from South Africa's Department of Health. Shipments are scheduled to begin immediately and are expected to be completed by early 1998. Earlier, South Africa ordered 90,000 female condoms in order to test the device. This order is part of the company's multi-year agreement with the Joint UN Program on AIDS (UNAIDS), which provides a special price based upon global public sector demand. The launch of the female condom in South Africa is just one of a series planned in Africa and other areas of the developing world. The globalization of the female condom, albeit in its early stages, affords the Female Health Co. the opportunity to explore other options for the future development of its business. The company has engaged Vector Securities International, Inc. to help identify, develop, and evaluate those options. The female condom is currently marketed in the US, the UK, Canada, South Korea, Taiwan, and Holland, and will soon be launched in Brazil. Female Health Co. is also engaged in discussions with potential partners for Europe, the US, India, China, and other countries. The female condom was also recently launched in Zimbabwe as part of the Joint UNAIDS program, and an application has been submitted to Koseisho for marketing approval in Japan.
Determining conserved metabolic biomarkers from a million database queries
Kurczy, Michael E.; Ivanisevic, Julijana; Johnson, Caroline H.; Uritboonthai, Winnie; Hoang, Linh; Fang, Mingliang; Hicks, Matthew; Aldebot, Anthony; Rinehart, Duane; Mellander, Lisa J.; Tautenhahn, Ralf; Patti, Gary J.; Spilker, Mary E.; Benton, H. Paul; Siuzdak, Gary
2015-01-01
Motivation: Metabolite databases provide a unique window into metabolome research, allowing the most commonly searched biomarkers to be catalogued. Omic-scale metabolite profiling, or metabolomics, is finding increased utility in biomarker discovery, largely driven by improvements in analytical technologies and the concurrent developments in bioinformatics. However, the successful translation of biomarkers into clinical or biologically relevant indicators is limited. Results: With the aim of improving the discovery of translatable metabolite biomarkers, we present search analytics for over one million METLIN metabolite database queries. The most common metabolites found in METLIN were cross-correlated against XCMS Online, the widely used cloud-based data processing and pathway analysis platform. Analysis of the METLIN and XCMS common metabolite data has two primary implications: these metabolites might indicate a conserved metabolic response to stressors, and the data may be used to gauge the relative uniqueness of potential biomarkers. Availability and implementation: METLIN can be accessed at https://metlin.scripps.edu. Contact: siuzdak@scripps.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26275895
Anthropogenic carbon release rate unprecedented during past 66 million years
NASA Astrophysics Data System (ADS)
Zeebe, R. E.; Ridgwell, A.; Zachos, J. C.
2015-12-01
Carbon release rates from anthropogenic sources have reached a record high of about 10 Pg C/y in 2013. However, due to uncertainties in the strength of climate system feedbacks, the full impact of the rapid carbon release on the Earth system is difficult to predict with confidence. Geologic analogues from past transient climate changes could provide invaluable constraints but only if the associated carbon release rates can be reliably reconstructed. We present a new technique - based on combined data-model analysis - to extract rates of change from the geological record, without the need for a stratigraphic age model. Given currently available records, we then show that the present anthropogenic carbon release rate is unprecedented during the Cenozoic (past 66 million years) by at least an order of magnitude. Our results have important implications for our ability to use past analogues to predict future changes, including constraints on climate sensitivity, ocean acidification, and impacts on marine and terrestrial ecosystems. For example, the fact that we have effectively entered an era of 'no analogue' state presents fundamental challenges to constraining forward modeling. Furthermore, future ecosystem disruptions will likely exceed the relatively limited extinctions observed during climate aberrations throughout the Cenozoic.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
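The key observation the abstract describes — every bootstrap sample lives in the same n-dimensional subspace as the original data, so bootstrap PCA can be done on n-dimensional coordinates instead of p-dimensional vectors — can be sketched directly. This is a minimal illustration of the subspace idea, not the authors' implementation; the dimensions and the random data are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5000, 40                           # p measurements per subject, n subjects
X = rng.standard_normal((p, n))
Xc = X - X.mean(axis=1, keepdims=True)    # center across subjects

# One thin SVD of the original data: Xc = U @ diag(d) @ Vt.
# The columns of U span the n-dimensional subspace that contains
# every bootstrap sample of the (centered) subjects.
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
S = np.diag(d) @ Vt                       # n x n subspace coordinates of the data

n_boot = 200
low_dim_pcs = np.empty((n_boot, n))
for b in range(n_boot):
    idx = rng.integers(0, n, size=n)      # resample subjects with replacement
    Sb = S[:, idx]
    Sb = Sb - Sb.mean(axis=1, keepdims=True)
    # PCA of the bootstrap sample, computed entirely in n dimensions:
    Ub, db, _ = np.linalg.svd(Sb, full_matrices=False)
    low_dim_pcs[b] = Ub[:, 0]             # leading PC in subspace coordinates

# A full p-dimensional bootstrap PC is recovered only when needed:
pc1_full = U @ low_dim_pcs[0]
```

Each bootstrap iteration factors an n x n matrix rather than a p x n one, which is the source of the speedup when p is in the millions; uncertainty summaries can be computed from `low_dim_pcs` without ever materialising the p-dimensional components.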
Understanding Y haplotype matching probability.
Brenner, Charles H
2014-01-01
The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
Probability summation--a critique.
Laming, Donald
2013-03-01
This Discussion Paper seeks to kill off probability summation, specifically the high-threshold assumption, as an explanatory idea in visual science. In combination with a Weibull function with an exponent of about 4, probability summation can accommodate, to within the limits of experimental error, the shape of the detectability function for contrast, the reduction in threshold that results from the combination of widely separated grating components, summation with respect to duration at threshold, and some instances, but not all, of spatial summation. But it has repeated difficulty with stimuli below threshold, because it denies the availability of input from such stimuli. All the phenomena listed above, and many more, can be accommodated equally accurately by signal-detection theory combined with an accelerated nonlinear transform of small, near-threshold, contrasts. This is illustrated with a transform that is the fourth power for the smallest contrasts, but tends to linear above threshold. Moreover, this particular transform can be derived from elementary properties of sensory neurons. Probability summation cannot be regarded as a special case of a more general theory, because it depends essentially on the 19th-century notion of a high fixed threshold. It is simply an obstruction to further progress.
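The model under critique — independent high-threshold channels, each with a Weibull psychometric function, combined by probability summation — is easy to state numerically. A sketch using the exponent of about 4 mentioned in the abstract; the scale parameter, target probability, and channel independence are illustrative assumptions, not values from the paper.

```python
import numpy as np

def p_detect(c, alpha=1.0, beta=4.0):
    """High-threshold Weibull psychometric function for one channel."""
    return 1.0 - np.exp(-(c / alpha) ** beta)

def p_summation(contrasts, alpha=1.0, beta=4.0):
    """Probability summation over independent channels:
    the stimulus is detected if any channel detects it."""
    contrasts = np.asarray(contrasts, dtype=float)
    return 1.0 - np.exp(-np.sum((contrasts / alpha) ** beta))

def threshold(p_target, n_channels, alpha=1.0, beta=4.0):
    """Contrast at which n equal channels jointly reach p_target."""
    return alpha * (-np.log(1.0 - p_target) / n_channels) ** (1.0 / beta)

t1 = threshold(0.5, 1)   # single grating component
t2 = threshold(0.5, 2)   # two widely separated components
# With beta = 4, two components lower the threshold by a factor 2**(-1/4),
# i.e. roughly 16% -- the kind of summation effect the model accommodates.
```

Note how the model's difficulty with subthreshold stimuli follows from its form: a channel contributes nothing unless its own detection event fires, which is exactly the high-threshold assumption the paper argues against.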
Multi-million atom electronic structure calculations for quantum dots
NASA Astrophysics Data System (ADS)
Usman, Muhammad
Quantum dots grown by the self-assembly process are typically constructed of 50,000 to 5,000,000 structural atoms which confine a small, countable number of extra electrons or holes in a space that is comparable in size to the electron wavelength. Under such conditions quantum dots can be interpreted as artificial atoms with the potential to be custom tailored to new functionality. In the past decade or so, these nanostructures have attracted significant experimental and theoretical attention in the field of nanoscience. The new and tunable optical and electrical properties of these artificial atoms have been proposed for use in a variety of different fields, for example in communication and computing systems, and in medical and quantum computing applications. Predictive and quantitative modeling and simulation of these structures can help to narrow down the vast design space to a range that is experimentally affordable and move this part of nanoscience to nanotechnology. Modeling of such quantum dots poses a formidable challenge to theoretical physicists because: (1) strain originating from the lattice mismatch of the materials penetrates deep inside the buffer surrounding the quantum dots and requires large-scale (multi-million-atom) simulations to correctly capture its effect on the electronic structure; (2) the interface roughness, the alloy randomness, and the atomistic granularity require the calculation of electronic structure at the atomistic scale. Most current and past theoretical calculations are based on continuum approaches, such as the effective mass approximation or k.p modeling, capturing at most one of the above-mentioned effects and thus missing some of the essential physics. The objectives of this thesis are: (1) to model and simulate the experimental quantum dot topologies at the atomistic scale; (2) to theoretically explore the essential physics, i.e. long-range strain, linear and quadratic piezoelectricity, interband optical transition strengths, quantum confined
Blending of Radioactive Salt Solutions in Million Gallon Tanks - 13002
Leishear, Robert A.; Lee, Si Y.; Fowley, Mark D.; Poirier, Michael R.
2013-07-01
Research was completed at Savannah River National Laboratory (SRNL) to investigate processes related to the blending of radioactive, liquid waste, salt solutions in 4920 cubic meter, 25.9 meter diameter storage tanks. One process was the blending of large salt solution batches (up to 1135 - 3028 cubic meters), using submerged centrifugal pumps. A second process was the disturbance of a settled layer of solids, or sludge, on the tank bottom. And a third investigated process was the settling rate of sludge solids if suspended into slurries by the blending pump. To investigate these processes, experiments, CFD models (computational fluid dynamics), and theory were applied. Experiments were performed using simulated, non-radioactive, salt solutions referred to as supernates, and a layer of settled solids referred to as sludge. Blending experiments were performed in a 2.44 meter diameter pilot scale tank, and flow rate measurements and settling tests were performed at both pilot scale and full scale. A summary of the research is presented here to demonstrate the adage that, 'One good experiment fixes a lot of good theory'. Experimental testing was required to benchmark CFD models, or the models would have been incorrectly used. In fact, CFD safety factors were established by this research to predict full-scale blending performance. CFD models were used to determine pump design requirements, predict blending times, and cut costs several million dollars by reducing the number of required blending pumps. This research contributed to DOE missions to permanently close the remaining 47 of 51 SRS waste storage tanks. (authors)
Blending Of Radioactive Salt Solutions In Million Gallon Tanks
Leishear, Robert A.; Lee, Si Y.; Fowley, Mark D.; Poirier, Michael R.
2012-12-10
Research was completed at Savannah River National Laboratory (SRNL) to investigate processes related to the blending of radioactive, liquid waste, salt solutions in 4920 cubic meter, 25.9 meter diameter storage tanks. One process was the blending of large salt solution batches (up to 1135 - 3028 cubic meters), using submerged centrifugal pumps. A second process was the disturbance of a settled layer of solids, or sludge, on the tank bottom. And a third investigated process was the settling rate of sludge solids if suspended into slurries by the blending pump. To investigate these processes, experiments, CFD models (computational fluid dynamics), and theory were applied. Experiments were performed using simulated, non-radioactive, salt solutions referred to as supernates, and a layer of settled solids referred to as sludge. Blending experiments were performed in a 2.44 meter diameter pilot scale tank, and flow rate measurements and settling tests were performed at both pilot scale and full scale. A summary of the research is presented here to demonstrate the adage that, 'One good experiment fixes a lot of good theory'. Experimental testing was required to benchmark CFD models, or the models would have been incorrectly used. In fact, CFD safety factors were established by this research to predict full-scale blending performance. CFD models were used to determine pump design requirements, predict blending times, and cut costs several million dollars by reducing the number of required blending pumps. This research contributed to DOE missions to permanently close the remaining 47 of 51 SRS waste storage tanks.
STBase: one million species trees for comparative biology.
McMahon, Michelle M; Deepak, Akshay; Fernández-Baca, David; Boss, Darren; Sanderson, Michael J
2015-01-01
Comprehensively sampled phylogenetic trees provide the most compelling foundations for strong inferences in comparative evolutionary biology. Mismatches are common, however, between the taxa for which comparative data are available and the taxa sampled by published phylogenetic analyses. Moreover, many published phylogenies are gene trees, which cannot always be adapted immediately for species level comparisons because of discordance, gene duplication, and other confounding biological processes. A new database, STBase, lets comparative biologists quickly retrieve species level phylogenetic hypotheses in response to a query list of species names. The database consists of 1 million single- and multi-locus data sets, each with a confidence set of 1000 putative species trees, computed from GenBank sequence data for 413,000 eukaryotic taxa. Two bodies of theoretical work are leveraged to aid in the assembly of multi-locus concatenated data sets for species tree construction. First, multiply labeled gene trees are pruned to conflict-free singly-labeled species-level trees that can be combined between loci. Second, impacts of missing data in multi-locus data sets are ameliorated by assembling only decisive data sets. Data sets overlapping with the user's query are ranked using a scheme that depends on user-provided weights for tree quality and for taxonomic overlap of the tree with the query. Retrieval times are independent of the size of the database, typically a few seconds. Tree quality is assessed by a real-time evaluation of bootstrap support on just the overlapping subtree. Associated sequence alignments, tree files and metadata can be downloaded for subsequent analysis. STBase provides a tool for comparative biologists interested in exploiting the most relevant sequence data available for the taxa of interest. It may also serve as a prototype for future species tree oriented databases and as a resource for assembly of larger species phylogenies from precomputed
Objective Probability and Quantum Fuzziness
NASA Astrophysics Data System (ADS)
Mohrhoff, U.
2009-02-01
This paper offers a critique of the Bayesian interpretation of quantum mechanics with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the “objective preparations view” or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes. Most important among these are the objective fuzziness of all relative positions and momenta and the consequent incomplete spatiotemporal differentiation of the physical world. The latter makes it possible to draw a clear distinction between the macroscopic and the microscopic. This in turn makes it possible to understand the special status of measurements in all standard formulations of the theory. Whereas Bayesians have written contemptuously about the “folly” of conjoining “objective” to “probability,” there are various reasons why quantum-mechanical probabilities can be considered objective, not least the fact that they are needed to quantify an objective fuzziness. But this cannot be appreciated without giving thought to the makeup of the world, which
Cobb Hotspot Volcanism Prior to 7 Million Years ago
NASA Astrophysics Data System (ADS)
Keller, R.; Fisk, M.; Duncan, R.; Rowe, M.; Russo, C.; Dziak, R.
2003-12-01
From where the Cobb hotspot currently resides beneath Axial Seamount on the Juan de Fuca Ridge, a discontinuous trail of seamounts of increasing age extends 1800 km to the northwest, all the way to the Alaskan Trench off of the southern tip of Kodiak Island. These seamounts record the evolution of mantle melting and volcanism at the Cobb hotspot over the past 30+ million years, including how the approach of the Juan de Fuca Ridge from the east affected the hotspot. We conducted multibeam mapping and stratigraphically-controlled rock sampling of several of the seamounts created by the Cobb hotspot up until 7 Ma. Using the Alvin submersible to do depth transects for geological observations and rock sampling allowed us to establish the volcanic style and setting represented by each sample, and to avoid the thick ferro-manganese oxide coatings and abundant ice-rafted debris common in the Gulf of Alaska. Our goal is to understand the volcanic histories and morphologies of these seamounts with an eye to how volcanism at the hotspot was affected by the approaching ridge. Our targeted seamounts included, from SE to NW, Warwick ( ˜7 Ma on 9 Ma crust), Murray ( ˜28 Ma on 39 Ma crust), Patton ( ˜30 Ma on 42 Ma crust), and Marchand (30+? Ma on 43 Ma crust). Marchand Seamount, though small compared to the others, appears to be the oldest unsubducted volcanic product of the Cobb hotspot. So far, we have XRF data for our samples, and argon dating and trace element analyses are underway. Warwick Seamount yielded only tholeiitic basalts, while most of the samples from the other seamounts are evolved alkalic rocks. Murray samples are entirely alkalic, being dominantly trachytes and trachydacites, with a few mugearites. Rocks from Patton are mainly hawaiites and mugearites, with rare tholeiitic to transitional basalts and a single trachyte. Marchand samples are trachydacites and trachytes similar to the differentiated Patton and Murray samples. Basement drilling at ODP Hole 887D
Probability for Weather and Climate
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design in support of
The Black Hole Formation Probability
NASA Astrophysics Data System (ADS)
Clausen, Drew R.; Piro, Anthony; Ott, Christian D.
2015-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we investigate the probability that a star will make a BH as a function of its ZAMS mass. Although the shape of the black hole formation probability function is poorly constrained by current measurements, we believe that this framework is an important new step toward better understanding BH formation. We also consider some of the implications of this probability distribution, from its impact on the chemical enrichment from massive stars, to its connection with the structure of the core at the time of collapse, to the birth kicks that black holes receive. A probabilistic description of BH formation will be a useful input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
A hot Jupiter orbiting a 2-million-year-old solar-mass T Tauri star.
Donati, J F; Moutou, C; Malo, L; Baruteau, C; Yu, L; Hébrard, E; Hussain, G; Alencar, S; Ménard, F; Bouvier, J; Petit, P; Takami, M; Doyon, R; Collier Cameron, A
2016-06-30
Hot Jupiters are giant Jupiter-like exoplanets that orbit their host stars 100 times more closely than Jupiter orbits the Sun. These planets presumably form in the outer part of the primordial disk from which both the central star and surrounding planets are born, then migrate inwards and yet avoid falling into their host star. It is, however, unclear whether this occurs early in the lives of hot Jupiters, when they are still embedded within protoplanetary disks, or later, once multiple planets are formed and interact. Although numerous hot Jupiters have been detected around mature Sun-like stars, their existence has not yet been firmly demonstrated for young stars, whose magnetic activity is so intense that it overshadows the radial velocity signal that close-in giant planets can induce. Here we report that the radial velocities of the young star V830 Tau exhibit a sine wave of period 4.93 days and semi-amplitude 75 metres per second, detected with a false-alarm probability of less than 0.03 per cent, after filtering out the magnetic activity plaguing the spectra. We find that this signal is unrelated to the 2.741-day rotation period of V830 Tau and we attribute it to the presence of a planet of mass 0.77 times that of Jupiter, orbiting at a distance of 0.057 astronomical units from the host star. Our result demonstrates that hot Jupiters can migrate inwards in less than two million years, probably as a result of planet–disk interactions.
A hot Jupiter orbiting a 2-million-year-old solar-mass T Tauri star
NASA Astrophysics Data System (ADS)
Donati, J. F.; Moutou, C.; Malo, L.; Baruteau, C.; Yu, L.; Hébrard, E.; Hussain, G.; Alencar, S.; Ménard, F.; Bouvier, J.; Petit, P.; Takami, M.; Doyon, R.; Cameron, A. Collier
2016-06-01
Hot Jupiters are giant Jupiter-like exoplanets that orbit their host stars 100 times more closely than Jupiter orbits the Sun. These planets presumably form in the outer part of the primordial disk from which both the central star and surrounding planets are born, then migrate inwards and yet avoid falling into their host star. It is, however, unclear whether this occurs early in the lives of hot Jupiters, when they are still embedded within protoplanetary disks, or later, once multiple planets are formed and interact. Although numerous hot Jupiters have been detected around mature Sun-like stars, their existence has not yet been firmly demonstrated for young stars, whose magnetic activity is so intense that it overshadows the radial velocity signal that close-in giant planets can induce. Here we report that the radial velocities of the young star V830 Tau exhibit a sine wave of period 4.93 days and semi-amplitude 75 metres per second, detected with a false-alarm probability of less than 0.03 per cent, after filtering out the magnetic activity plaguing the spectra. We find that this signal is unrelated to the 2.741-day rotation period of V830 Tau and we attribute it to the presence of a planet of mass 0.77 times that of Jupiter, orbiting at a distance of 0.057 astronomical units from the host star. Our result demonstrates that hot Jupiters can migrate inwards in less than two million years, probably as a result of planet-disk interactions.
Lectures on probability and statistics
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
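The a priori dice calculation described above can be sketched directly by enumerating the equally likely outcomes; `prob_sum` is an illustrative helper for this summary, not code from the lectures:

```python
from fractions import Fraction
from itertools import product

def prob_sum(n_dice, sides, target):
    """A priori probability that n fair dice sum to `target`,
    computed by counting equally likely outcomes."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    favorable = sum(1 for o in outcomes if sum(o) == target)
    return Fraction(favorable, len(outcomes))

print(prob_sum(2, 6, 7))  # -> 1/6
```

The statistical inverse problem runs the other way: given observed roll frequencies, one infers which die model most plausibly produced them.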
Modality, probability, and mental models.
Hinterecker, Thomas; Knauff, Markus; Johnson-Laird, P N
2016-10-01
We report 3 experiments investigating novel sorts of inference, such as: "A or B or both. Therefore, possibly (A and B)." The contents were sensible assertions, for example: "Space tourism will achieve widespread popularity in the next 50 years or advances in material science will lead to the development of antigravity materials in the next 50 years, or both." Most participants accepted the inferences as valid, though they are invalid in modal logic and in probabilistic logic too. However, the theory of mental models predicts that individuals should accept them. In contrast, inferences of the sort "A or B but not both. Therefore, A or B or both" are both logically valid and probabilistically valid. Yet, as the model theory also predicts, most reasoners rejected them. The participants' estimates of probabilities showed that their inferences tended not to be based on probabilistic validity, but that they did rate acceptable conclusions as more probable than unacceptable conclusions. We discuss the implications of the results for current theories of reasoning.
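The possibility and validity judgments above can be checked mechanically by enumerating truth assignments. This sketch (an illustration, not the authors' materials) reads "possibly" in the mental-model sense of "true in at least one model of the premise":

```python
from itertools import product

def models(formula):
    """All truth assignments (A, B) that satisfy `formula`."""
    return [(a, b) for a, b in product([True, False], repeat=2) if formula(a, b)]

inclusive = lambda a, b: a or b                       # "A or B or both"
exclusive = lambda a, b: (a or b) and not (a and b)   # "A or B but not both"
conj      = lambda a, b: a and b                      # "A and B"

# "Possibly (A and B)" holds in at least one model of the inclusive premise...
possible = any(conj(a, b) for a, b in models(inclusive))
# ...but the conclusion is not valid: it fails in some models of the premise.
valid = all(conj(a, b) for a, b in models(inclusive))
print(possible, valid)  # True False
```

By the same enumeration, every model of the exclusive disjunction satisfies the inclusive one, which is why "A or B but not both. Therefore, A or B or both" is logically valid.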
Computational Metabolomics: A Framework for the Million Metabolome
Uppal, Karan; Walker, Douglas I.; Liu, Ken; Li, Shuzhao; Go, Young-Mi; Jones, Dean P.
2017-01-01
“Sola dosis facit venenum.” These words of Paracelsus, “the dose makes the poison”, can lead to a cavalier attitude concerning potential toxicities of the vast array of low abundance environmental chemicals to which humans are exposed. Exposome research teaches that 80–85% of human disease is linked to environmental exposures. The human exposome is estimated to include >400,000 environmental chemicals, most of which are uncharacterized with regard to human health. In fact, mass spectrometry measures >200,000 m/z features (ions) in microliter volumes derived from human samples; most are unidentified. This crystallizes a grand challenge for chemical research in toxicology: to develop reliable and affordable analytical methods to understand health impacts of the extensive human chemical experience. To this end, there appears to be no choice but to abandon the limitations of measuring one chemical at a time. The present review looks at progress in computational metabolomics to provide probability based annotation linking ions to known chemicals and serve as a foundation for unambiguous designation of unidentified ions for toxicologic study. We review methods to characterize ions in terms of accurate mass m/z, chromatographic retention time, correlation of adduct, isotopic and fragment forms, association with metabolic pathways and measurement of collision-induced dissociation products, collision cross section, and chirality. Such information can support a largely unambiguous system for documenting unidentified ions in environmental surveillance and human biomonitoring. Assembly of this data would provide a resource to characterize and understand health risks of the array of low-abundance chemicals to which humans are exposed. PMID:27629808
Iridium profile for 10 million years across the Cretaceous-Tertiary boundary at Gubbio (Italy)
NASA Technical Reports Server (NTRS)
Alvarez, Walter; Asaro, Frank; Montanari, Alessandro
1990-01-01
The iridium anomaly at the Cretaceous-Tertiary (KT) boundary was discovered in the pelagic limestone sequence at Gubbio on the basis of 12 samples analyzed by neutron activation analysis (NAA) and was interpreted as indicating impact of a large extraterrestrial object at exactly the time of the KT mass extinction. Continuing controversy over the shape of the Ir profile at the Gubbio KT boundary and its interpretation called for a more detailed follow-up study. Analysis of a 57-meter-thick, 10-million-year-old part of the Gubbio sequence using improved NAA techniques revealed that there is only one Ir anomaly at the KT boundary, but this anomaly shows an intricate fine structure, the origin of which cannot yet be entirely explained. The KT Ir anomaly peaks in a 1-centimeter-thick clay layer, where the average Ir concentration is 3000 parts per trillion (ppt); this peak is flanked by tails with Ir concentrations of 20 to 80 ppt that rise above a background of 12 to 13 ppt. The fine structure of the tails is probably due in part to lateral reworking, diffusion, burrowing, and perhaps Milankovitch cyclicity.
Iridium profile for 10 million years across the Cretaceous-Tertiary boundary at Gubbio (Italy).
Alvarez, W; Asaro, F; Montanari, A
1990-12-21
The iridium anomaly at the Cretaceous-Tertiary (KT) boundary was discovered in the pelagic limestone sequence at Gubbio on the basis of 12 samples analyzed by neutron activation analysis (NAA) and was interpreted as indicating impact of a large extraterrestrial object at exactly the time of the KT mass extinction. Continuing controversy over the shape of the Ir profile at the Gubbio KT boundary and its interpretation called for a more detailed follow-up study. Analysis of a 57-meter-thick, 10-million-year-old part of the Gubbio sequence using improved NAA techniques revealed that there is only one Ir anomaly at the KT boundary, but this anomaly shows an intricate fine structure, the origin of which cannot yet be entirely explained. The KT Ir anomaly peaks in a 1-centimeter-thick clay layer, where average Ir concentration is 3000 parts per trillion (ppt); this peak is flanked by tails with Ir concentrations of 20 to 80 ppt that rise above a background of 12 to 13 ppt. The fine structure of the tails is probably due in part to lateral reworking, diffusion, burrowing, and perhaps Milankovitch cyclicity.
Warner, Kelly L.; Arnold, Terri L.
2010-01-01
Nitrate in private wells in the glacial aquifer system is a concern for an estimated 17 million people using private wells because of the proximity of many private wells to nitrogen sources. Yet, less than 5 percent of private wells sampled in this study contained nitrate in concentrations that exceeded the U.S. Environmental Protection Agency (USEPA) Maximum Contaminant Level (MCL) of 10 mg/L (milligrams per liter) as N (nitrogen). However, this small group with nitrate concentrations above the USEPA MCL includes some of the highest nitrate concentrations detected in groundwater from private wells (77 mg/L). Median nitrate concentration measured in groundwater from private wells in the glacial aquifer system (0.11 mg/L as N) is lower than that in water from other unconsolidated aquifers and is not strongly related to surface sources of nitrate. Background concentration of nitrate is less than 1 mg/L as N. Although overall nitrate concentration in private wells was low relative to the MCL, concentrations were highly variable over short distances and at various depths below land surface. Groundwater from wells in the glacial aquifer system at all depths was a mixture of old and young water. Oxidation and reduction potential changes with depth and groundwater age were important influences on nitrate concentrations in private wells. A series of 10 logistic regression models was developed to estimate the probability of nitrate concentration above various thresholds. The threshold concentration (1 to 10 mg/L) affected the number of variables in the model. Fewer explanatory variables are needed to predict nitrate at higher threshold concentrations. The variables that were identified as significant predictors for nitrate concentration above 4 mg/L as N included well characteristics such as open-interval diameter, open-interval length, and depth to top of open interval. Environmental variables in the models were mean percent silt in soil, soil type, and mean depth to
MSPI False Indication Probability Simulations
Dana Kelly; Kurt Vedros; Robert Youngblood
2011-03-01
This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and "false" would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = 1 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history's ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
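The counting procedure described above (dividing false positive or negative counts by the number of simulated time histories) can be sketched with a deliberately simplified model. The failure model, the Jeffreys-style point estimate, and every parameter value below are illustrative stand-ins, not the MSPI or NUREG-1753 models:

```python
import random

def false_positive_rate(p_true, n_demands, p_threshold, n_hist=100_000, seed=1):
    """Monte Carlo sketch of a false indication probability (hypothetical
    model): each history draws demand failures at true unreliability
    p_true; a history whose estimated unreliability exceeds p_threshold
    while the true state is baseline counts as a false positive."""
    rng = random.Random(seed)
    false_pos = 0
    for _ in range(n_hist):
        failures = sum(rng.random() < p_true for _ in range(n_demands))
        p_hat = (failures + 0.5) / (n_demands + 1)  # Jeffreys-style posterior mean
        if p_hat > p_threshold:
            false_pos += 1
    return false_pos / n_hist

print(false_positive_rate(0.01, 100, 0.05, n_hist=20_000))
```

In the paper's setting the quantity being thresholded is ΔCDF rather than raw unreliability, and the estimate comes from a full Bayesian update under each candidate prior; the counting logic is the same.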
Methods for estimating drought streamflow probabilities for Virginia streams
Austin, Samuel H.
2014-01-01
Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million streamflow daily values collected over the period of record (January 1, 1900 through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded the 46,704 equations with statistically significant fit statistics and parameter ranges published in two tables in this report. These model equations produce summer month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.
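A sketch of how such a fitted logistic regression equation converts a winter streamflow into a summer drought-flow probability; the coefficients here are illustrative placeholders, not values from the report's tables:

```python
import math

def drought_flow_probability(winter_flow, b0, b1):
    """Probability that summer flow falls below a drought threshold,
    from a fitted logistic model: p = 1 / (1 + exp(-(b0 + b1 * x)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * winter_flow)))

print(drought_flow_probability(5.0, 1.0, -0.3))  # about 0.38
```

A negative slope coefficient, as in this example, encodes the expected physical relationship: higher winter streamflow lowers the probability of summer drought flow.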
WITPO (What Is the Probability Of).
ERIC Educational Resources Information Center
Ericksen, Donna Bird; And Others
1991-01-01
Included in this probability board game are the requirements, the rules, the board, and 44 sample questions. This game can be used as a probability unit review for practice on basic skills and algorithms, such as computing compound probability and using Pascal's triangle to solve binomial probability problems. (JJK)
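Computing a binomial probability by way of Pascal's triangle, one of the skills the game reviews, can be sketched as follows (an illustrative example, not part of the game materials):

```python
from fractions import Fraction

def pascal_row(n):
    """Row n of Pascal's triangle, i.e. the coefficients C(n, k)."""
    row = [1]
    for k in range(n):
        row.append(row[-1] * (n - k) // (k + 1))
    return row

def binomial_prob(n, k, p):
    """P(exactly k successes in n trials) with success probability p."""
    return pascal_row(n)[k] * p**k * (1 - p)**(n - k)

print(binomial_prob(4, 2, Fraction(1, 2)))  # -> 3/8
```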
Associativity and normative credal probability.
Snow, P
2002-01-01
Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.
Fusion probability in heavy nuclei
NASA Astrophysics Data System (ADS)
Banerjee, Tathagata; Nath, S.; Pal, Santanu
2015-03-01
Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability,
Trajectory versus probability density entropy.
Bologna, M; Grigolini, P; Karagiorgis, M; Rosa, A
2001-07-01
We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.
A 25.5 percent AM0 gallium arsenide grating solar cell
NASA Technical Reports Server (NTRS)
Weizer, V. G.; Godlewski, M. P.
1985-01-01
Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AM0) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.
A 25.5 percent AM0 gallium arsenide grating solar cell
NASA Technical Reports Server (NTRS)
Weizer, V. G.; Godlewski, M. P.
1985-01-01
Recent calculations have shown that significant open circuit voltage gains are possible with a dot grating junction geometry. The feasibility of applying the dot geometry to the GaAs cell was investigated. This geometry is shown to result in voltages approaching 1.120 V and efficiencies well over 25 percent (AM0) if good collection efficiency can be maintained. The latter is shown to be possible if one chooses the proper base resistivity and cell thickness. The above advances in efficiency are shown to be possible in the P-base cell with only minor improvements in existing technology.
5 Percent Ares I Scale Model Acoustic Test: Overpressure Characterization and Analysis
NASA Technical Reports Server (NTRS)
Alvord, David; Casiano, Matthew; McDaniels, Dave
2011-01-01
During the ignition of a ducted solid rocket motor (SRM), rapid expansion of injected hot gases from the motor into a confined volume causes the development of a steep fronted wave. This low frequency transient wave propagates outward from the exhaust duct, impinging the vehicle and ground structures. An unsuppressed overpressure wave can potentially cause modal excitation in the structures and vehicle, subsequently leading to damage. This presentation details the ignition transient findings from the 5% Ares I Scale Model Acoustic Test (ASMAT). The primary events of the ignition transient environment induced by the SRM are the ignition overpressure (IOP), duct overpressure (DOP), and source overpressure (SOP). The resulting observations include successful knockdown of the IOP environment through use of a Space Shuttle derived IOP suppression system, a potential load applied to the vehicle stemming from instantaneous asymmetrical IOP and DOP wave impingement, and launch complex geometric influences on the environment. The results are scaled to a full-scale Ares I equivalent and compared with heritage data including Ares I-X and both suppressed and unsuppressed Space Shuttle IOP environments.
26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.
Code of Federal Regulations, 2012 CFR
2012-04-01
... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... best efforts underwriting) for a primary or secondary offering of L stock. (iv) Assume that the...
26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.
Code of Federal Regulations, 2011 CFR
2011-04-01
... members. However, the participation by creditors in formulating a plan for an insolvency workout or a... receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or reorganization do... advisor is also the underwriter (without regard to whether it is a firm commitment or best...
26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.
Code of Federal Regulations, 2014 CFR
2014-04-01
... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... best efforts underwriting) for a primary or secondary offering of L stock. (iv) Assume that the...
26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.
Code of Federal Regulations, 2010 CFR
2010-04-01
... members. However, the participation by creditors in formulating a plan for an insolvency workout or a... receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or reorganization do... advisor is also the underwriter (without regard to whether it is a firm commitment or best...
26 CFR 1.382-3 - Definitions and rules relating to a 5-percent shareholder.
Code of Federal Regulations, 2013 CFR
2013-04-01
... workout or a reorganization in a title 11 or similar case (whether as members of a creditors' committee or otherwise) and the receipt of stock by creditors in satisfaction of indebtedness pursuant to the workout or... best efforts underwriting) for a primary or secondary offering of L stock. (iv) Assume that the...
THE BLACK HOLE FORMATION PROBABILITY
Clausen, Drew; Piro, Anthony L.; Ott, Christian D.
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
The Black Hole Formation Probability
NASA Astrophysics Data System (ADS)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D.
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
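A probabilistic remnant prescription of this kind would enter a population synthesis code as a random draw per star. The sigmoid form and its parameters below are toy placeholders for illustration, not a fit from the paper:

```python
import math
import random

def p_bh(m_zams, m_half=25.0, width=5.0):
    """Toy sigmoid stand-in for P_BH(M_ZAMS): the BH probability rises
    smoothly around m_half solar masses (illustrative values only)."""
    return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

def remnant(m_zams, rng):
    """Draw a remnant type probabilistically, as population synthesis would."""
    return "BH" if rng.random() < p_bh(m_zams) else "NS"

rng = random.Random(0)
frac = sum(remnant(40.0, rng) == "BH" for _ in range(10_000)) / 10_000
print(frac)
```

The point of the probabilistic framing is visible in the sampled fraction: at a fixed ZAMS mass, repeated draws yield a mixture of NSs and BHs rather than a single deterministic outcome.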
Acceptance Probability (P_a) Analysis for Process Validation Lifecycle Stages.
Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep
2016-04-01
This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with the worst case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with C_pk > 1.33 as performing well within statistical control and those with C_pk < 1.0 as "incapable" (1). A C_pk > 1.33 is associated with a centered process that will statistically produce less than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
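The quoted link between capability and defect rate can be checked numerically: for a centered normal process, C_pk = 4/3 puts the specification limits 4 sigma from the mean, and the two-tailed out-of-spec fraction is about 63 parts per million. This sketch assumes normality, as the abstract does:

```python
import math

def defect_ppm(cpk):
    """Defective parts per million for a centered normal process:
    spec limits sit 3*Cpk sigma from the mean, so the out-of-spec
    fraction (both tails) is 2 * Phi(-3 * Cpk)."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return 2.0 * phi(-3.0 * cpk) * 1e6

print(round(defect_ppm(4 / 3)))  # -> 63
```

The corresponding acceptance probability is 1 minus this fraction, i.e. better than 99.99% per unit for C_pk above 1.33.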
Erratum: The linear polarization of Southern bright stars measured at the parts-per-million level
NASA Astrophysics Data System (ADS)
Cotton, Daniel V.; Bailey, Jeremy; Kedziora-Chudczer, Lucyna; Bott, Kimberly; Lucas, P. W.; Hough, J. H.; Marshall, Jonathan P.
2016-07-01
Our recent article, The linear polarization of Southern bright stars measured at the parts-per-million level (Cotton et al. 2016a), contains two errors that we correct here. The first error is a formulaic error in the propagation of errors. Although the errors in q, u and p given throughout the paper are properly the 1σ error, the stated error in the polarization angle is the 2σ error. To correct this, the reader has simply to divide the given polarization angle error by 2. The correction of this error does not alter any of the conclusions drawn in the paper. We note here that this error also affects the polarization angle errors for the telescope polarization given in one of our earlier works (Bailey et al. 2015). The errors there are very small, and so this has little consequence. The second error is a transcription error resulting in an erroneous value for the measured polarization of α Phe (HIP 2081, BS 99) being reported in Table 5. The correct measurement for this object is as follows: q = -10.7 ± 7.2, u = 15.6 ± 6.1, or p = 18.9 ± 6.7, θ = 62.3 ± 10.5. α Phe is identified as an outlier in Figs 5 and 6 of the paper, and marked accordingly; its corrected (debiased) p/d value of 0.73 ppm/pc is unremarkable. Consequently, its identification in Section 4.101 as a late giant probably intrinsically polarized is recanted. This makes κ Lyr (BS 6872, K2III) the earliest late giant we can identify as probably intrinsically polarized. The incorrect polarization magnitude for α Phe was also used in Fig. 2, however the scale used there would render a correction largely invisible. The above errors were identified before the publication of three recent papers (Bott et al. 2016; Cotton et al. 2016b; Marshall et al. 2016) that reference the results, and none of them are affected.
Linking Inverse Square Law with Quantum Mechanical Probabilities
NASA Astrophysics Data System (ADS)
Goradia, Shantilal
2008-03-01
(© 2007 by S. Goradia) I modify the Newtonian inverse square law with a postulate that the probability of interaction between two elementary particles varies inversely as the statistical number of Planck lengths separating them. For two nucleons a million Planck lengths apart, the probability of an interaction is a trillionth (almost never), seemingly contradicting gravity. Likewise, statistical expression of the size of the universe implicitly addresses the issue of dark energy by linking the fine-structure constant α = 1/137 with the cosmological constant λ = 1/R^2 (abstract submitted 11/11/07 for APS APR2008 meeting). Since light travels one Planck length per Planck time, the radius R of the spherical shape of the universe is 10^60 Planck lengths, linking the cosmological constant λ = 1/10^120 (see equation 14 in Einstein's 1917 paper) with α by the relationship 1/α ≈ ln(1/√λ). Intuitive answers to the questions raised suggest that the elementary particles interact via Planck-scale mouths [1], with higher probabilities at smaller distances. This intuition may be supported by genetics, explaining issues such as DNA-nucleosome interaction [2][3]. [1] http://www.arxiv.org/pdf/physics/0210040 [v. 3] [2] www.gravityresearchinstitute.org [3] Segal E. et al, A genomic code for nucleosome positioning. Nature 442, pp. 772-778, 2006.
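The orders of magnitude quoted in this abstract can be checked with a few lines of arithmetic: for R = 10^60 Planck lengths and λ = 1/R^2 = 10^-120, the quantity ln(1/√λ) equals ln R ≈ 138, close to 1/α ≈ 137. This check simply takes the abstract's numbers at face value:

```python
import math

# Numbers as stated in the abstract: universe radius in Planck lengths
# and the cosmological constant in Planck units.
R = 10.0**60
lam = 1.0 / R**2  # = 10^-120

print(math.log(R))             # about 138, compared with 1/alpha = 137.036
print(0.5 * math.log(1 / lam))  # ln(1/sqrt(lambda)) is the same quantity
```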
Using Playing Cards to Differentiate Probability Interpretations
ERIC Educational Resources Information Center
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Teaching Probabilities and Statistics to Preschool Children
ERIC Educational Resources Information Center
Pange, Jenny
2003-01-01
This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…
The Cognitive Substrate of Subjective Probability
ERIC Educational Resources Information Center
Nilsson, Hakan; Olsson, Henrik; Juslin, Peter
2005-01-01
The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…
Corning Inc.: Proposed Changes at Glass Plant Indicate $26 Million in Potential Savings
2004-01-01
In 2000, the Corning glass plant in Greenville, Ohio, consumed almost 114 million kWh of electricity and nearly 308,000 MMBtu of natural gas in its glassmaking processes for a total cost of approximately $6.4 million. A plant-wide assessment indicated that improvement projects could save nearly $26 million and reduce natural gas use by 122,900 MMBtu per year, reduce electrical use by 72,300,000 kWh per year, and reduce CO2 emissions by 180 million pounds per year.
IDENTIFICATION OF 1.4 MILLION ACTIVE GALACTIC NUCLEI IN THE MID-INFRARED USING WISE DATA
Secrest, N. J.; Dudik, R. P.; Dorland, B. N.; Zacharias, N.; Makarov, V.; Fey, A.; Frouard, J.; Finch, C.
2015-11-15
We present an all-sky sample of ≈1.4 million active galactic nuclei (AGNs) meeting a two-color infrared photometric selection criterion for AGNs as applied to sources from the Wide-field Infrared Survey Explorer final catalog release (AllWISE). We assess the spatial distribution and optical properties of our sample and find that the results are consistent with expectations for AGNs. These sources have a mean density of ≈38 AGNs per square degree on the sky, and their apparent magnitude distribution peaks at g ≈ 20, extending to objects as faint as g ≈ 26. We test the AGN selection criteria against a large sample of optically identified stars and determine the “leakage” (that is, the probability that a star detected in an optical survey will be misidentified as a quasi-stellar object (QSO) in our sample) rate to be ≤4.0 × 10⁻⁵. We conclude that our sample contains almost no optically identified stars (≤0.041%), making this sample highly promising for future celestial reference frame work as it significantly increases the number of all-sky, compact extragalactic objects. We further compare our sample to catalogs of known AGNs/QSOs and find a completeness value of ≳84% (that is, the probability of correctly identifying a known AGN/QSO is at least 84%) for AGNs brighter than a limiting magnitude of R ≲ 19. Our sample includes approximately 1.1 million previously uncataloged AGNs.
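The quoted leakage and completeness figures are simple rates; a minimal sketch, with hypothetical counts chosen only to land on the abstract's bounds (not the catalog's actual tallies):

```python
def leakage_rate(misidentified_stars, total_stars):
    """Probability that an optically identified star is misclassified
    as a QSO by the two-color selection."""
    return misidentified_stars / total_stars

def completeness(recovered_known_agn, total_known_agn):
    """Probability that a known AGN/QSO is correctly recovered."""
    return recovered_known_agn / total_known_agn

# Hypothetical counts, not the survey's actual numbers:
print(leakage_rate(4, 100_000))   # 4e-05, i.e. the quoted <= 4.0e-5
print(completeness(84, 100))      # 0.84, i.e. the quoted >= 84%
```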
Evidence for life on Earth more than 3850 million years ago.
Holland, H D
1997-01-03
A recent study by Mojzsis et al. (Nature 384, 55, 1996) found evidence of life in rocks in Greenland estimated by new isotopic data to be more than 3800 million years old. The author examines this study in relation to studies conducted on rocks between 3250 and 3800 million years old and presents reasons to agree and disagree with the interpretation of data.
2.5 Million U.S. Women Have Condition That Can Cause Infertility
https://medlineplus.gov/news/fullstory_163399.html (FRIDAY, Feb. 3, 2017, HealthDay News) -- About 2.5 million American women have had pelvic inflammatory ...
Teaching about the Big Three-O (300 Million) Using the Internet
ERIC Educational Resources Information Center
Risinger, C. Frederick
2006-01-01
Most researchers and the Census Bureau expect the U.S. population to hit the 300 million mark sometime in October. This will make the United States the world's third most populous nation--behind China and India. In this article, the author found several websites dealing with the specific 300 million target, population growth in general, and…
Combined Heat and Power System Achieves Millions in Cost Savings at Large University - Case Study
2013-05-29
Texas A&M University is operating a high-efficiency combined heat and power (CHP) system at its district energy campus in College Station, Texas. Texas A&M received $10 million in U.S. Department of Energy funding from the American Recovery and Reinvestment Act (ARRA) of 2009 for this project. Private-sector cost share totaled $40 million.
EPA awards $2.5 million to Arizona to improve surface water quality
SAN FRANCISCO - The Environmental Protection Agency awarded $2.5 million to the State of Arizona for projects to help restore water quality in the state's polluted water bodies. With an additional $1.6 million leveraged by the state for these activi
UT Biomedical Informatics Lab (BMIL) probability wheel
NASA Astrophysics Data System (ADS)
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
UT Biomedical Informatics Lab (BMIL) Probability Wheel.
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K
2016-01-01
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
Derivation of quantum probability from measurement
NASA Astrophysics Data System (ADS)
Herbut, Fedor
2016-05-01
To begin with, it is pointed out that the form of the quantum probability formula originates in the very initial state of the object system as seen when the state is expanded with the eigenprojectors of the measured observable. Making use of the probability reproducibility condition, which is a key concept in unitary measurement theory, one obtains the relevant coherent distribution of the complete-measurement results in the final unitary-measurement state in agreement with the mentioned probability formula. Treating the transition from the final unitary, or premeasurement, state, where all possible results are present, to one complete-measurement result sketchily in the usual way, the well-known probability formula is derived. In conclusion it is pointed out that the entire argument is only formal unless one makes it physical assuming that the quantum probability law is valid in the extreme case of probability-one (certain) events (projectors).
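The "quantum probability formula" this argument turns on is the standard Born rule; as a sketch of the notation involved, with \(P_k\) the eigenprojectors of the measured observable:

```latex
% Expansion of the initial state by the eigenprojectors of the measured
% observable, and the resulting probability of obtaining result k (Born rule):
\[
  |\psi\rangle = \sum_k P_k |\psi\rangle, \qquad
  p(k) = \langle \psi | P_k | \psi \rangle = \| P_k |\psi\rangle \|^2 .
\]
```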
Error probability performance of unbalanced QPSK receivers
NASA Technical Reports Server (NTRS)
Simon, M. K.
1978-01-01
A simple technique for calculating the error probability performance and associated noisy reference loss of practical unbalanced QPSK receivers is presented. The approach is based on expanding the error probability conditioned on the loop phase error in a power series in the loop phase error and then, keeping only the first few terms of this series, averaging this conditional error probability over the probability density function of the loop phase error. Doing so results in an expression for the average error probability which is in the form of a leading term representing the ideal (perfect synchronization references) performance plus a term proportional to the mean-squared crosstalk. Thus, the additional error probability due to noisy synchronization references occurs as an additive term proportional to the mean-squared phase jitter directly associated with the receiver's tracking loop. Similar arguments are advanced to give closed-form results for the noisy reference loss itself.
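The averaging technique described can be illustrated numerically. This is a hedged sketch only: the conditional error probability below is a simple BPSK-like stand-in, Q(sqrt(2·SNR)·cos φ), not the paper's full unbalanced-QPSK expression, and the SNR and jitter values are hypothetical.

```python
import math

def qfunc(x):
    # Gaussian tail probability Q(x)
    return 0.5 * math.erfc(x / math.sqrt(2))

def cond_error(phi, snr=4.0):
    # Illustrative error probability conditioned on loop phase error phi
    # (a BPSK-like stand-in, not the paper's unbalanced-QPSK expression).
    return qfunc(math.sqrt(2 * snr) * math.cos(phi))

def gaussian_pdf(phi, sigma):
    return math.exp(-phi ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Average the conditional error probability over the phase-error density:
sigma = 0.05  # hypothetical RMS loop phase jitter (radians)
n, lim = 4001, 6 * sigma
step = 2 * lim / (n - 1)
avg = sum(cond_error(-lim + i * step) * gaussian_pdf(-lim + i * step, sigma)
          for i in range(n)) * step

# Keep only the first terms of the power series: ideal performance plus
# a term proportional to the mean-squared phase jitter.
h = 1e-4
d2 = (cond_error(h) - 2 * cond_error(0.0) + cond_error(-h)) / h ** 2
approx = cond_error(0.0) + 0.5 * d2 * sigma ** 2
print(avg, approx)  # the two agree closely for small jitter
```

The second printed value is exactly the structure the abstract describes: a leading ideal-performance term plus an additive term proportional to the mean-squared phase jitter.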
UT Biomedical Informatics Lab (BMIL) Probability Wheel
Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
2016-01-01
A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant,” about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences. PMID:28105462
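The core interaction described in these records, turning slice sizes into probabilities, is a one-liner; a minimal sketch (the function name is ours, not the app's API):

```python
def wheel_probabilities(slice_angles_deg):
    """Normalize the wheel's colored-slice angles into probabilities."""
    total = sum(slice_angles_deg)
    return [angle / total for angle in slice_angles_deg]

# A participant dragging one slice to three quarters of the wheel:
print(wheel_probabilities([270, 90]))  # [0.75, 0.25]
```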
Probability and Quantum Paradigms: the Interplay
Kracklauer, A. F.
2007-12-03
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.
Location probability learning requires focal attention.
Kabata, Takashi; Yokoyama, Takemasa; Noguchi, Yasuki; Kita, Shinichi
2014-01-01
Target identification is related to the frequency with which targets appear at a given location, with greater frequency enhancing identification. This phenomenon suggests that location probability learned through repeated experience with the target modulates cognitive processing. However, it remains unclear whether attentive processing of the target is required to learn location probability. Here, we used a dual-task paradigm to test the location probability effect of attended and unattended stimuli. Observers performed an attentionally demanding central-letter task and a peripheral-bar discrimination task in which location probability was manipulated. Thus, we were able to compare performance on the peripheral task when attention was fully engaged to the target (single-task condition) versus when attentional resources were drawn away by the central task (dual-task condition). The location probability effect occurred only in the single-task condition, when attention resources were fully available. This suggests that location probability learning requires attention to the target stimuli.
Experience matters: information acquisition optimizes probability gain.
Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J
2010-07-01
Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information -- information gain, Kullback-Leibler distance, probability gain (error minimization), and impact -- are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.
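Of the four theories named, probability gain has the simplest closed form: the expected increase in the probability of a correct guess after observing a query's answer. A sketch for a binary query (the environment's priors and likelihoods below are hypothetical, not the experiments' stimuli):

```python
def probability_gain(prior, likelihoods):
    """Expected improvement in P(correct guess) from a binary query.

    prior: P(category) for each category.
    likelihoods: P(answer = yes | category) for each category.
    """
    p_yes = sum(p * l for p, l in zip(prior, likelihoods))
    p_no = 1.0 - p_yes
    post_yes = [p * l / p_yes for p, l in zip(prior, likelihoods)]
    post_no = [p * (1 - l) / p_no for p, l in zip(prior, likelihoods)]
    # Best-guess accuracy after the answer, averaged over answers,
    # minus best-guess accuracy before asking:
    expected_max_posterior = p_yes * max(post_yes) + p_no * max(post_no)
    return expected_max_posterior - max(prior)

# Two categories with a moderately informative feature query:
print(probability_gain([0.7, 0.3], [0.9, 0.2]))
```

An uninformative query (identical likelihoods in every category) yields a gain of zero, matching the error-minimization reading of the measure.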
Total variation denoising of probability measures using iterated function systems with probabilities
NASA Astrophysics Data System (ADS)
La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.
2017-01-01
In this paper we present a total variation denoising problem for probability measures using the set of fixed-point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.
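A hedged sketch of the underlying object: the fixed-point (invariant) measure of an IFS with probabilities can be sampled by the classic chaos game. The maps and weights below are illustrative, and this shows only the sampling, not the paper's denoising optimization.

```python
import random

def chaos_game(maps, probs, n=100_000, seed=0):
    """Sample the fixed-point (invariant) measure of an IFS with
    probabilities by iterating randomly chosen contraction maps."""
    rng = random.Random(seed)
    x, samples = 0.5, []
    for _ in range(n):
        f = rng.choices(maps, weights=probs)[0]  # pick a map by its probability
        x = f(x)
        samples.append(x)
    return samples

# Two affine contractions of [0, 1] with equal probabilities; the
# invariant measure is supported on a Cantor-type set.
maps = [lambda x: x / 3.0, lambda x: x / 3.0 + 2.0 / 3.0]
pts = chaos_game(maps, [0.5, 0.5])
print(min(pts) >= 0.0, max(pts) <= 1.0)  # True True
```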
Tombari, C.
2005-09-01
The U.S. Department of Energy's Million Solar Roofs Initiative (MSR) is a unique public-private partnership aimed at overcoming market barriers for photovoltaics (PV), solar water heating, transpired solar collectors, solar space heating and cooling, and pool heating. This report contains annual progress reports from 866 partners across the United States.
Classifying proteins into functional groups based on all-versus-all BLAST of 10 million proteins.
Kolker, Natali; Higdon, Roger; Broomall, William; Stanberry, Larissa; Welch, Dean; Lu, Wei; Haynes, Winston; Barga, Roger; Kolker, Eugene
2011-01-01
To address the monumental challenge of assigning function to millions of sequenced proteins, we completed first-of-a-kind all-versus-all sequence alignments using BLAST for 9.9 million proteins in the UniRef100 database. Microsoft Windows Azure produced over 3 billion filtered records in 6 days using 475 eight-core virtual machines. Protein classification into functional groups was then performed using Hive and custom jars implemented on top of Apache Hadoop utilizing the MapReduce paradigm. First, using the Clusters of Orthologous Genes (COG) database, a length-normalized bit score (LNBS) was determined to be the best similarity measure for classification of proteins. LNBS achieved sensitivity and specificity of 98% each. Second, out of 5.1 million bacterial proteins, about two-thirds were assigned to significantly extended COG groups, encompassing 30 times more assigned proteins. Third, the remaining proteins were classified into protein functional groups using an innovative implementation of a single-linkage algorithm on an in-house Hadoop compute cluster. This implementation significantly reduces the run time for nonindexed queries and optimizes efficient clustering on a large scale. The performance was also verified on Amazon Elastic MapReduce. This clustering assigned nearly 2 million proteins to approximately half a million different functional groups. A similar approach was applied to classify 2.8 million eukaryotic sequences, resulting in over 1 million proteins being assigned to existing KOG groups and the remainder clustered into 100,000 functional groups.
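The classification pipeline reduces to two ingredients: a length-normalized bit score (LNBS) as the similarity measure, and single-linkage clustering for the proteins left unassigned. A toy sketch follows; note that normalizing by the mean sequence length and the cutoff value are our assumptions, since the abstract states only that the bit score is length-normalized.

```python
class UnionFind:
    """Minimal union-find, the standard backbone of single-linkage clustering."""
    def __init__(self):
        self.parent = {}
    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def lnbs(bit_score, query_len, subject_len):
    # Length-normalized bit score; dividing by the mean sequence length
    # is an assumption, not the paper's stated normalization.
    return bit_score / ((query_len + subject_len) / 2.0)

# Hypothetical BLAST hits: (query, subject, bit score, query len, subject len)
hits = [("p1", "p2", 450, 300, 310),
        ("p2", "p3", 500, 310, 290),
        ("p4", "p5", 90, 400, 380)]

uf = UnionFind()
threshold = 1.0  # hypothetical LNBS cutoff for "same functional group"
for q, s, score, ql, sl in hits:
    if lnbs(score, ql, sl) >= threshold:
        uf.union(q, s)  # single linkage: one strong hit merges the groups

groups = {p: uf.find(p) for p in ["p1", "p2", "p3", "p4", "p5"]}
print(groups["p1"] == groups["p3"], groups["p1"] == groups["p4"])  # True False
```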
Heightened odds of large earthquakes near Istanbul: An interaction-based probability calculation
Parsons; Toda; Stein; Barka; Dieterich
2000-04-28
We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.
Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation
Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.
2000-01-01
We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.
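As a much-simplified illustration of how a catalog rate maps to multi-decade probabilities: a constant-rate Poisson sketch only. The paper's time-dependent calculation additionally folds in stress transferred by the 1999 Izmit earthquake, and the rate below is hypothetical.

```python
import math

def poisson_probability(rate_per_year, years):
    """P(at least one event in the window) for a constant-rate process."""
    return 1.0 - math.exp(-rate_per_year * years)

# Hypothetical background rate of strong-shaking events near Istanbul;
# the paper's 62% / 32% figures also include transferred stress and its decay.
background_rate = 0.013  # events per year (illustrative)
print(round(poisson_probability(background_rate, 30), 3))  # 0.323
print(round(poisson_probability(background_rate, 10), 3))  # 0.122
```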
New four-million-year-old hominid species from Kanapoi and Allia Bay, Kenya.
Leakey, M G; Feibel, C S; McDougall, I; Walker, A
1995-08-17
Nine hominid dental, cranial and postcranial specimens from Kanapoi, Kenya, and 12 specimens from Allia Bay, Kenya, are described here as a new species of Australopithecus dating from between about 3.9 million and 4.2 million years ago. The mosaic of primitive and derived features shows this species to be a possible ancestor to Australopithecus afarensis and suggests that Ardipithecus ramidus is a sister species to this and all later hominids. A tibia establishes that hominids were bipedal at least half a million years before the previous earliest evidence showed.
Estimating the cost of new drug development: is it really 802 million dollars?
Adams, Christopher P; Brantner, Van V
2006-01-01
This paper replicates the drug development cost estimates of Joseph DiMasi and colleagues ("The Price of Innovation"), using their published cost estimates along with information on success rates and durations from a publicly available data set. For drugs entering human clinical trials for the first time between 1989 and 2002, the paper estimated the cost per new drug to be 868 million dollars. However, our estimates vary from around 500 million dollars to more than 2,000 million dollars, depending on the therapy or the developing firm.
One million years of cultural evolution in a stable environment at Atapuerca (Burgos, Spain)
NASA Astrophysics Data System (ADS)
Rodríguez, J.; Burjachs, F.; Cuenca-Bescós, G.; García, N.; Van der Made, J.; Pérez González, A.; Blain, H.-A.; Expósito, I.; López-García, J. M.; García Antón, M.; Allué, E.; Cáceres, I.; Huguet, R.; Mosquera, M.; Ollé, A.; Rosell, J.; Parés, J. M.; Rodríguez, X. P.; Díez, C.; Rofes, J.; Sala, R.; Saladié, P.; Vallverdú, J.; Bennasar, M. L.; Blasco, R.; Bermúdez de Castro, J. M.; Carbonell, E.
2011-06-01
The present paper analyses the evidence provided by three sites (Sima del Elefante, Gran Dolina, and Galería) located in the Trinchera del Ferrocarril of the Sierra de Atapuerca. These three sites are cave infillings that contain sediments deposited from approximately 1.2 Ma to 200 kyr. Pollen, herpetofauna, and small and large mammal remains are used as proxies to obtain a general picture of the environmental changes that occurred at the Sierra de Atapuerca throughout the one million-year period represented at these sites. Similarly, cultural changes are tracked analyzing the evidence of human behavior obtained from the study of several bone and lithic assemblages from these three sites. At least three periods with different cultural features, involving technology, subsistence and behavior, are determined from the available evidence. The first two periods correspond to the Mode 1 technology and Homo antecessor: the first is dated around 1.2 to 1.0 Ma and reflects opportunistic behavior both in technology and subsistence. The second period is around 800 kyr BP. Mode 1 technology is still maintained, but subsistence strategies include systematic hunting and the use of base camps. The third period is dated between 500 ka and 200 ka and corresponds to the Mode 2 technology and the acquisition of directional hunting and other organizational strategies by Homo heidelbergensis. A transition from Mode 2 to Mode 3 seems to appear at the end of this time-range, and may reflect the early phases of a fourth cultural change. With regard to the environment, our main conclusion is that there was an absence of extremely harsh conditions at Atapuerca throughout this time period. The presence of Mediterranean taxa was constant and the dominant landscape was a savannah-like open environment, probably with small forest patches. An alternation of Mediterranean and mesic species as the dominant component of the tree storey was induced by the climatic cycles, and steppes spread across
Simulations of Probabilities for Quantum Computing
NASA Technical Reports Server (NTRS)
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
47 CFR 1.1623 - Probability calculation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...
Correlation as Probability of Common Descent.
ERIC Educational Resources Information Center
Falk, Ruma; Well, Arnold D.
1996-01-01
One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the…
Probability: A Matter of Life and Death
ERIC Educational Resources Information Center
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
Phonotactic Probabilities in Young Children's Speech Production
ERIC Educational Resources Information Center
Zamuner, Tania S.; Gerken, Louann; Hammond, Michael
2004-01-01
This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…
47 CFR 1.1623 - Probability calculation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than...
Teaching Statistics and Probability: 1981 Yearbook.
ERIC Educational Resources Information Center
Shulte, Albert P., Ed.; Smart, James R., Ed.
This 1981 yearbook of the National Council of Teachers of Mathematics (NCTM) offers classroom ideas for teaching statistics and probability, viewed as important topics in the school mathematics curriculum. Statistics and probability are seen as appropriate because they: (1) provide meaningful applications of mathematics at all levels; (2) provide…
Teaching Probability: A Socio-Constructivist Perspective
ERIC Educational Resources Information Center
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Stimulus Probability Effects in Absolute Identification
ERIC Educational Resources Information Center
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
WPE: A Mathematical Microworld for Learning Probability
ERIC Educational Resources Information Center
Kiew, Su Ding; Sam, Hong Kian
2006-01-01
In this study, the researchers developed the Web-based Probability Explorer (WPE), a mathematical microworld and investigated the effectiveness of the microworld's constructivist learning environment in enhancing the learning of probability and improving students' attitudes toward mathematics. This study also determined the students' satisfaction…
Malawian Students' Meanings for Probability Vocabulary
ERIC Educational Resources Information Center
Kazima, Mercy
2007-01-01
The paper discusses findings of a study that investigated Malawian students' meanings for some probability vocabulary. The study explores the meanings that, prior to instruction, students assign to some words that are commonly used in teaching probability. The aim is to have some insight into the meanings that students bring to the classroom. The…
Probability Simulations by Non-Lipschitz Chaos
NASA Technical Reports Server (NTRS)
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Laboratory-Tutorial Activities for Teaching Probability
ERIC Educational Resources Information Center
Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.
2006-01-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We…
Probability Issues in without Replacement Sampling
ERIC Educational Resources Information Center
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
Average Transmission Probability of a Random Stack
ERIC Educational Resources Information Center
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
Assessment of the probability of contaminating Mars
NASA Technical Reports Server (NTRS)
Judd, B. R.; North, D. W.; Pezier, J. P.
1974-01-01
New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
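The Sagan-Coleman structure referred to, contamination probability as expected microbial release times growth probability, summed over release mechanisms, can be sketched as follows. The mechanism counts and growth probabilities are hypothetical, not the study's values.

```python
import math

def contamination_probability(release_mechanisms):
    """P(contamination) from (expected viable microbes released,
    growth probability) pairs, one pair per release mechanism,
    in the Sagan-Coleman spirit of release x growth."""
    expected_growing = sum(n * p for n, p in release_mechanisms)
    # Convert the (small) expectation into a probability via the
    # Poisson approximation P = 1 - exp(-expectation):
    return 1.0 - math.exp(-expected_growing)

# Three hypothetical release mechanisms (e.g. hard impact, lander
# erosion, aeolian transport) -- illustrative numbers only:
mechanisms = [(1e3, 1e-7), (1e2, 1e-6), (1e1, 1e-5)]
print(contamination_probability(mechanisms))  # ~3e-4
```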
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.
Optimizing Probability of Detection Point Estimate Demonstration
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software describe the most common methods of POD analysis. NDE methods are intended to detect real flaws such as cracks and crack-like defects, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing POD demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable probability of passing the demonstration (PPD) and an acceptable probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
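The binomial logic behind the 29-flaw point-estimate demonstration can be sketched directly. This is an illustration of the standard zero-miss case only (the paper's optimization covers more general acceptance criteria): if all n flaws must be detected, the probability of passing is simply p**n for a true per-flaw POD of p.

```python
# Sketch of the binomial point-estimate logic behind a 29-of-29 POD
# demonstration. With zero allowed misses, the probability of passing
# the demonstration (PPD) is p**n for true per-flaw POD p.

def prob_pass_demo(p: float, n: int = 29) -> float:
    """PPD for an n-flaw, zero-miss binomial demonstration."""
    return p ** n

# A 29-flaw, zero-miss set corresponds to the classic 90/95 criterion:
# a system with only 0.90 POD passes the demonstration rarely (~4.7%),
# so passing demonstrates 90% POD at 95% confidence.
ppd_marginal = prob_pass_demo(0.90)   # ~0.047
ppd_good = prob_pass_demo(0.99)       # a 0.99-POD system passes ~75% of the time
```

This makes the optimization trade-off in the abstract concrete: smaller flaws lower the true p and hence the PPD, so the experiment designer balances flaw size against an acceptable chance of failing with a good system.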
Time-dependent landslide probability mapping
Campbell, Russell H.; Bernknopf, Richard L.
1993-01-01
Case studies where time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. ?? 2010 Springer Science+Business Media, LLC.
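The key modeling device in the abstract, treating correct-classification probabilities as logit-normal random variables, is easy to illustrate. The sketch below only shows how heterogeneous probabilities are generated on the logit scale; it is not the paper's MCMC estimation procedure, and the parameter values are invented for illustration.

```python
# Hedged sketch: heterogeneous correct-classification probabilities drawn
# as logit-normal random variables, the device used in the elaborated
# Royle-Link model. mu and sigma below are illustrative, not estimates.
import math
import random

def logit_normal_sample(mu: float, sigma: float, rng: random.Random) -> float:
    """Draw p = inverse-logit(x) with x ~ Normal(mu, sigma), so p lies in (0, 1)
    but varies among sampling units rather than being a fixed constant."""
    x = rng.gauss(mu, sigma)
    return 1.0 / (1.0 + math.exp(-x))

rng = random.Random(0)
ps = [logit_normal_sample(1.0, 0.5, rng) for _ in range(10_000)]
mean_p = sum(ps) / len(ps)  # near inverse-logit(1.0) = 0.731, with spread
```

Because each sampling unit gets its own draw, averaging over units no longer behaves like a single fixed classification probability, which is exactly why the original constant-probability estimator becomes biased.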
An introductory analysis of satellite collision probabilities
NASA Astrophysics Data System (ADS)
Carlton-Wippern, Kitt C.
This paper addresses a probabilistic approach to assessing the probability of a satellite collision, using relative trajectory analyses and probability density functions representing the satellites' position/momentum vectors. The paper is divided into two parts: static and dynamic collision probabilities. In the static collision probability section, the basic problem under study is: given the mean positions and associated position probability density functions for two objects, calculate the probability that the two objects collide (defined as being within some distance of each other). The paper presents the classic Laplace problem of the probability of arrival, using standard uniform distribution functions. This problem is then extrapolated to show how 'arrival' can be classified as 'collision', how the arrival space geometries map to collision space geometries, and how arbitrary position density functions can then be included and integrated into the analysis. In the dynamic collision probability section, the nature of collisions based upon both trajectory and energy considerations is discussed; it is shown that energy states alone cannot be used to completely describe whether or not a collision occurs. This fact invalidates some earlier work on the subject and demonstrates why Liouville's theorem cannot be used in general to describe the constant density of the position/momentum space in which a collision may occur. Future position probability density functions are then shown to be the convolution of the current position and momentum density functions (linear analysis), and the paper further demonstrates the dependency of the future position density functions on time. Strategies for assessing the collision probabilities for two point masses with uncertainties in position and momentum at some given time, and these integrated with some arbitrary impact volume schema, are then discussed. The presentation concludes with the formulation of a high level design
EPA Provides $8.5 Million to Protect Air Quality in a Changing Climate
WASHINGTON - Today, the U.S. Environmental Protection Agency (EPA) announced $8.5 million in research funding to 12 universities to protect air quality from the current and future challenges associated with the impacts of climate change.
15-OPA124 CHICAGO -- The U.S. Environmental Protection Agency today announced the award of 15 Great Lakes Restoration Initiative grants totaling more than $8 million for projects to combat invasive species in the Great Lakes basin. These Great
Vermont Receives over $2 million EPA Brownfields Funding and Announces State BERA Program winners
EPA is awarding a total of $2 million in Brownfield Assessment and Cleanup Grant dollars to municipalities and organizations across the state of Vermont. Additionally, three communities have been selected by the State of Vermont.
EPA Provides State of Rhode Island $18.2 Million for Water Infrastructure Projects
The U.S. Environmental Protection Agency has awarded $18.2 million to the State of Rhode Island to help finance improvements to water projects that are essential to protecting public health and the environment.
EPA Provides State of New Hampshire $22.7 Million for Water Infrastructure Projects
The U.S. Environmental Protection Agency has awarded $22.7 million to the State of New Hampshire to help finance improvements to water projects that are essential to protecting public health and the environment.
EPA Provides State of Massachusetts $63.7 Million for Water Infrastructure Projects
The U.S. Environmental Protection Agency has awarded $63.7 million to the Commonwealth of Massachusetts to help finance improvements to water projects that are essential to protecting public health and the environment.
EPA Provides State of Maine $19.6 Million for Water Infrastructure Projects
The U.S. Environmental Protection Agency has awarded $19.6 million to the State of Maine to help finance improvements to water projects that are essential to protecting public health and the environment.
EPA Provides State of Connecticut $26 Million for Water Infrastructure Projects
The U.S. Environmental Protection Agency has awarded $26 million to the State of Connecticut to help finance improvements to water projects that are essential to protecting public health and the environment.
EPA Provides State of Vermont $15.6 Million for Water Infrastructure Projects
The U.S. Environmental Protection Agency has awarded $15.6 million to the State of Vermont to help finance improvements to water projects that are essential to protecting public health and the environment.
4. VIEW SOUTHWEST OF 15-MILLION GALLON UNDERGROUND CLEARWELL (foreground), HEAD ...
4. VIEW SOUTHWEST OF 15-MILLION GALLON UNDERGROUND CLEARWELL (foreground), HEAD HOUSE (left), OLD PUMP STATION (center), AND EAST FILTER BUILDING (background) - Dalecarlia Water Treatment Plant, 5900 MacArthur Boulevard, Northwest, Washington, District of Columbia, DC
EPA Announces more than $15 Million for Environmental Improvements on Tribal Lands in Arizona
SAN FRANCISCO - The U.S. Environmental Protection Agency announced over $15 million in funding to invest in Arizona tribes for environmental programs, water and wastewater infrastructure development, community education and capacity building. The an
EPA Awards $4 Million in Grants to Research the Impact of Drought on Water Quality
Washington -Today, the U.S. Environmental Protection Agency (EPA) announced $4 million to four institutions to conduct research to combat the effects of drought and extreme events on water quality in watersheds and at drinking water utilities.
Plant-wide assessment summary: $1.6 million in savings identified in Augusta Newsprint assessment
None, None
2003-08-01
Augusta Newsprint and its partners conducted a systematic plant-wide assessment (PWA) to identify energy- and cost-saving opportunities at the company's plant in Augusta, Georgia. The assessment team identified $1.6 million in potential annual savings.
EPA Announces more than $4 Million for Environmental Improvements on Tribal Lands in Nevada
SAN FRANCISCO - The U.S. Environmental Protection Agency announced over $4 million in funding to invest in Nevada tribes for environmental programs, community education and capacity building. The announcement was made at the 23rd Annual Regional Tri
Gulf Coast Ecosystem Restoration Council Seeks Public Comment on Priorities for $139.6 million
DALLAS - (Aug. 19, 2015) The Gulf Coast Ecosystem Restoration Council (Council) recently released a draft Initial Funded Priorities List that would fund approximately $139.6 million in restoration activities. The funds are derived from the recent se
EPA Announces Availability of $26 Million to Clean Up Diesel Engines Nationwide
(Washington, D.C.) - The U.S. Environmental Protection Agency (EPA) today announced the availability of $26 million in grant funding to establish clean diesel projects aimed at reducing emissions from the nation's existing fleet of diesel engines. Di
Formation of the Grand Canyon 5 to 6 million years ago through integration of older palaeocanyons
NASA Astrophysics Data System (ADS)
Karlstrom, Karl E.; Lee, John P.; Kelley, Shari A.; Crow, Ryan S.; Crossey, Laura J.; Young, Richard A.; Lazear, Greg; Beard, L. Sue; Ricketts, Jason W.; Fox, Matthew; Shuster, David L.
2014-03-01
The timing of formation of the Grand Canyon, USA, is vigorously debated. In one view, most of the canyon was carved by the Colorado River relatively recently, in the past 5-6 million years. Alternatively, the Grand Canyon could have been cut by precursor rivers in the same location and to within about 200 m of its modern depth as early as 70-55 million years ago. Here we investigate the time of formation of four out of five segments of the Grand Canyon, using apatite fission-track dating, track-length measurements and apatite helium dating: if any segment is young, the old canyon hypothesis is falsified. We reconstruct the thermal histories of samples taken from the modern canyon base and the adjacent canyon rim 1,500 m above, to constrain when the rocks cooled as a result of canyon incision. We find that two of the three middle segments, the Hurricane segment and the Eastern Grand Canyon, formed between 70 and 50 million years ago and between 25 and 15 million years ago, respectively. However, the two end segments, the Marble Canyon and the Westernmost Grand Canyon, are both young and were carved in the past 5-6 million years. Thus, although parts of the canyon are old, we conclude that the integration of the Colorado River through older palaeocanyons carved the Grand Canyon, beginning 5-6 million years ago.
Liquefaction probability curves for surficial geologic deposits
Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.
2011-01-01
Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities reach 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, and 0.63-1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.
Seismicity alert probabilities at Parkfield, California, revisited
Michael, A.J.; Jones, L.M.
1998-01-01
For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.
The probability distribution of intense daily precipitation
NASA Astrophysics Data System (ADS)
Cavanaugh, Nicholas R.; Gershunov, Alexander; Panorska, Anna K.; Kozubowski, Tomasz J.
2015-03-01
The probability tail structure of over 22,000 weather stations globally is examined in order to identify the physically and mathematically consistent distribution type for modeling the probability of intense daily precipitation and extremes. Results indicate that when aggregating data annually, most locations are to be considered heavy-tailed with statistical significance. When aggregating data by season, it becomes evident that the thickness of the probability tail is related to the variability in precipitation-causing events, and thus that the fundamental cause of precipitation volatility is weather diversity. These results have both theoretical and practical implications for the modeling of high-frequency climate variability worldwide.
Class probability estimation for medical studies.
Simon, Richard
2014-07-01
I provide a commentary on two papers "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. Those papers provide an up-to-date review of some popular machine learning methods for class probability estimation and compare those methods to logistic regression modeling in real and simulated datasets.
Objective and subjective probability in gene expression.
Velasco, Joel D
2012-09-01
In this paper I address the question of whether the probabilities that appear in models of stochastic gene expression are objective or subjective. I argue that while our best models of the phenomena in question are stochastic models, this fact should not lead us to automatically assume that the processes are inherently stochastic. After distinguishing between models and reality, I give a brief introduction to the philosophical problem of the interpretation of probability statements. I argue that the objective vs. subjective distinction is a false dichotomy and is an unhelpful distinction in this case. Instead, the probabilities in our models of gene expression exhibit standard features of both objectivity and subjectivity.
Characteristic length of the knotting probability revisited
NASA Astrophysics Data System (ADS)
Uehara, Erica; Deguchi, Tetsuo
2015-09-01
We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/NK), where the estimates of parameter NK are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius rex, i.e. the screening length of double-stranded DNA.
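The exponential form discussed in this abstract lends itself to a simple fitting sketch: if the knotting probability behaves like C·exp(-N/N_K) for large N, then log P is linear in N and the characteristic length N_K is the negative reciprocal of the slope. The data points below are synthetic, generated from an assumed N_K purely to illustrate the fit; they are not SAP simulation results.

```python
# Sketch: recover the characteristic length N_K from knotting probabilities
# that decay as C * exp(-N / N_K). Data here are synthetic (N_K = 300, C = 0.8
# chosen arbitrarily), standing in for simulated SAP knotting probabilities.
import math

N_K_TRUE, C = 300.0, 0.8
data = [(n, C * math.exp(-n / N_K_TRUE)) for n in range(100, 1001, 100)]

# Least-squares slope of log P versus N gives -1/N_K.
xs = [n for n, _ in data]
ys = [math.log(p) for _, p in data]
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
N_K_est = -1.0 / slope  # recovers 300 on this noise-free data
```

With real simulation estimates the points carry statistical noise, so the same regression would return N_K with an uncertainty rather than exactly.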
Transition Probability and the ESR Experiment
ERIC Educational Resources Information Center
McBrierty, Vincent J.
1974-01-01
Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)
Inclusion probability with dropout: an operational formula.
Milot, E; Courteau, J; Crispino, F; Mailly, F
2015-05-01
In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
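The quantity this paper generalizes, the classic no-dropout inclusion probability, has a compact closed form worth stating. Under Hardy-Weinberg equilibrium, a random individual is "not excluded" when both of their alleles fall in the visible set, so the one-locus PI is the squared sum of visible-allele frequencies, and independent loci multiply. The sketch below shows only this baseline case, not the paper's dropout-aware formula; the frequencies are invented.

```python
# Baseline (no-dropout) RMNE / inclusion probability sketch.
# One locus: PI = (sum of visible-allele frequencies) ** 2 under
# Hardy-Weinberg; the cumulative PI over independent loci is the product.
# Allele frequencies below are hypothetical.

def inclusion_probability(visible_freqs_per_locus) -> float:
    pi = 1.0
    for freqs in visible_freqs_per_locus:
        s = sum(freqs)          # probability one random allele is in the set
        pi *= s * s             # both alleles must be in the set
    return pi

# Two hypothetical loci with visible-allele frequency lists:
pi = inclusion_probability([[0.1, 0.2, 0.15], [0.05, 0.3]])
```

The paper's contribution is precisely that this simple product breaks down when dropout can hide alleles, which is why their formula must condition on the number of contributors.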
The low synaptic release probability in vivo.
Borst, J Gerard G
2010-06-01
The release probability, the average probability that an active zone of a presynaptic terminal releases one or more vesicles following an action potential, is tightly regulated. Measurements in cultured neurons or in slices indicate that this probability can vary greatly between synapses, but on average it is estimated to be as high as 0.5. In vivo, however, the size of synaptic potentials is relatively independent of recent history, suggesting that release probability is much lower. Possible causes for this discrepancy include maturational differences, a higher spontaneous activity, a lower extracellular calcium concentration and more prominent tonic inhibition by ambient neurotransmitters during in vivo recordings. Existing evidence thus suggests that under physiological conditions in vivo, presynaptic action potentials trigger the release of neurotransmitter much less frequently than what is observed in in vitro preparations.
Classical and Quantum Spreading of Position Probability
ERIC Educational Resources Information Center
Farina, J. E. G.
1977-01-01
Demonstrates that the standard deviation of the position probability of a particle moving freely in one dimension is a function of the standard deviation of its velocity distribution and time in classical or quantum mechanics. (SL)
On Convergent Probability of a Random Walk
ERIC Educational Resources Information Center
Lee, Y.-F.; Ching, W.-K.
2006-01-01
This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.
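The method of recurrence relations mentioned in this note can be illustrated with a standard absorbing random walk (a gambler's-ruin style example, not the specific card game the note describes). Each interior probability satisfies p_i = ½·p_{i-1} + ½·p_{i+1} with boundary conditions p_0 = 1 and p_n = 0, and the system can be solved by simple iteration.

```python
# Sketch of the recurrence-relation approach for a fair absorbing walk on
# {0, ..., n}: p_i = probability of absorption at 0 starting from i.
# Recurrence: p_i = 0.5 * p_{i-1} + 0.5 * p_{i+1}, with p_0 = 1, p_n = 0,
# solved by repeated in-place (Gauss-Seidel) sweeps.

def absorption_probs(n: int, sweeps: int = 2000):
    p = [1.0] + [0.5] * (n - 1) + [0.0]   # boundary values fixed at ends
    for _ in range(sweeps):
        for i in range(1, n):
            p[i] = 0.5 * (p[i - 1] + p[i + 1])
    return p

p = absorption_probs(10)
# For the fair walk the closed form is p_i = 1 - i/n, e.g. p[3] -> 0.7.
```

The same iterative scheme works for biased steps by replacing the 0.5 weights, which is how recurrence methods handle the different initial positions and step rules discussed in the note.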
Robust satisficing and the probability of survival
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2014-01-01
Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.
Probability, clinical decision making and hypothesis testing
Banerjee, A.; Jadhav, S. L.; Bhawalkar, J. S.
2009-01-01
Few clinicians grasp the true concept of probability expressed in the ‘P value.’ For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing. PMID:21234167
Grounding quantum probability in psychological mechanism.
Love, Bradley C
2013-06-01
Pothos & Busemeyer (P&B) provide a compelling case that quantum probability (QP) theory is a better match to human judgment than is classical probability (CP) theory. However, any theory (QP, CP, or other) phrased solely at the computational level runs the risk of being underconstrained. One suggestion is to ground QP accounts in mechanism, to leverage a wide range of process-level data.
A Manual for Encoding Probability Distributions.
1978-09-01
...summary of the most significant information contained in the report. If the report contains a significant bibliography or literature survey, mention it...probability distribution. Some terms in the literature are used synonymously with encoding: assessment, assignment (used for single events in this...sessions conducted as parts of practical decision analyses as well as on experimental evidence in the literature. Probability encoding can be applied
Imprecise Probability Methods for Weapons UQ
Picard, Richard Roy; Vander Wiel, Scott Alan
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Probability distribution of the vacuum energy density
Duplancic, Goran; Stefancic, Hrvoje; Glavan, Drazen
2010-12-15
As the vacuum state of a quantum field is not an eigenstate of the Hamiltonian density, the vacuum energy density can be represented as a random variable. We present an analytical calculation of the probability distribution of the vacuum energy density for real and complex massless scalar fields in Minkowski space. The obtained probability distributions are broad and the vacuum expectation value of the Hamiltonian density is not fully representative of the vacuum energy density.
When probability trees don't work
NASA Astrophysics Data System (ADS)
Chan, K. C.; Lenard, C. T.; Mills, T. M.
2016-08-01
Tree diagrams arise naturally in courses on probability at high school or university, even at an elementary level. Often they are used to depict outcomes and associated probabilities from a sequence of games. A subtle issue is whether or not the Markov condition holds in the sequence of games. We present two examples that illustrate the importance of this issue. Suggestions as to how these examples may be used in a classroom are offered.
Site occupancy models with heterogeneous detection probabilities
Royle, J. Andrew
2006-01-01
Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions
Larget, Bret
2013-01-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
Tsunami probability in the Caribbean Region
Parsons, T.; Geist, E.L.
2008-01-01
We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30% regionally. © Birkhäuser 2008.
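The Poissonian model underlying a hazard map like this reduces to one line: given a mean runup rate per year in a coastal cell, the probability of at least one runup exceeding the threshold during an exposure window is 1 − exp(−rate·t). The rate used below is hypothetical, not a value from the study.

```python
# Poisson exceedance sketch for a coastal cell: with mean annual runup rate
# lam (events/year, hypothetical here), P(at least one event in t years)
# = 1 - exp(-lam * t), the standard Poisson-process hazard formula.
import math

def poisson_exceedance(rate_per_year: float, years: float) -> float:
    return 1.0 - math.exp(-rate_per_year * years)

# A hypothetical 1-in-250-year cell over a 30-year window:
p30 = poisson_exceedance(1.0 / 250.0, 30.0)  # ~0.11
```

For rare events the answer is close to rate·t, but the exponential form is what keeps per-cell probabilities bounded by 1 as rates or windows grow, which matters when mapping the full 0-30% regional range.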
Probability detection mechanisms and motor learning.
Lungu, O V; Wächter, T; Liu, T; Willingham, D T; Ashe, J
2004-11-01
The automatic detection of patterns or regularities in the environment is central to certain forms of motor learning, which are largely procedural and implicit. The rules underlying the detection and use of probabilistic information in the perceptual-motor domain are largely unknown. We conducted two experiments involving a motor learning task with direct and crossed mapping of motor responses in which probabilities were present at the stimulus set level, the response set level, and at the level of stimulus-response (S-R) mapping. We manipulated only one level at a time, while controlling for the other two. The results show that probabilities were detected only when present at the S-R mapping and motor levels, but not at the perceptual one (experiment 1), unless the perceptual features have a dimensional overlap with the S-R mapping rule (experiment 2). The effects of probability detection were mostly facilitatory at the S-R mapping, both facilitatory and inhibitory at the perceptual level, and predominantly inhibitory at the response-set level. The facilitatory effects were based on learning the absolute frequencies first and transitional probabilities later (for the S-R mapping rule) or both types of information at the same time (for perceptual level), whereas the inhibitory effects were based on learning first the transitional probabilities. Our data suggest that both absolute frequencies and transitional probabilities are used in motor learning, but in different temporal orders, according to the probabilistic properties of the environment. The results support the idea that separate neural circuits may be involved in detecting absolute frequencies as compared to transitional probabilities.
The Frequency and Predicted Consequences of Cosmic Impacts in the Last 65 Million Years
NASA Astrophysics Data System (ADS)
Paine, Michael; Peiser, Benny
2004-06-01
Sixty-five million years ago a huge asteroid collided with the Earth and ended the long reign of the dinosaurs. In the aftermath of this catastrophic event, the mammals arose and eventually mankind came to dominate the surface of the planet. The Earth, however, has not been free from severe impacts since the time of the dinosaur killer. We examine the likely frequency of major impact events over the past 65 million years, the evidence for these impacts and the predicted consequences of various types of impacts. It is evident that the mammals had to survive frequent severe disruptions to the global climate, and it is likely that over the past 5 million years hominids were faced with several catastrophic global events. Smaller but strategically located impact events could bring down our civilisation if they occurred today. Mankind has recently developed the expertise to predict and mitigate future impacts, but political and financial support are lacking.
A dynamic marine calcium cycle during the past 28 million years
Griffith, E.M.; Paytan, A.; Caldeira, K.; Bullen, T.D.; Thomas, E.
2008-01-01
Multiple lines of evidence have shown that the isotopic composition and concentration of calcium in seawater have changed over the past 28 million years. A high-resolution, continuous seawater calcium isotope ratio curve from marine (pelagic) barite reveals distinct features in the evolution of the seawater calcium isotopic ratio suggesting changes in seawater calcium concentrations. The most pronounced increase in the δ44/40Ca value of seawater (of 0.3 per mil) occurred over roughly 4 million years following a period of low values around 13 million years ago. The major change in marine calcium corresponds to a climatic transition and global change in the carbon cycle and suggests a reorganization of the global biogeochemical system.
Woody cover and hominin environments in the past 6 million years
NASA Astrophysics Data System (ADS)
Cerling, Thure E.; Wynn, Jonathan G.; Andanje, Samuel A.; Bird, Michael I.; Korir, David Kimutai; Levin, Naomi E.; Mace, William; Macharia, Anthony N.; Quade, Jay; Remien, Christopher H.
2011-08-01
The role of African savannahs in the evolution of early hominins has been debated for nearly a century. Resolution of this issue has been hindered by difficulty in quantifying the fraction of woody cover in the fossil record. Here we show that the fraction of woody cover in tropical ecosystems can be quantified using stable carbon isotopes in soils. Furthermore, we use fossil soils from hominin sites in the Awash and Omo-Turkana basins in eastern Africa to reconstruct the fraction of woody cover since the Late Miocene epoch (about 7 million years ago). 13C/12C ratio data from 1,300 palaeosols at or adjacent to hominin sites dating to at least 6 million years ago show that woody cover was predominantly less than ~40% at most sites. These data point to the prevalence of open environments at the majority of hominin fossil sites in eastern Africa over the past 6 million years.
Role of seasonality in the evolution of climate during the last 100 million years
NASA Technical Reports Server (NTRS)
Crowley, T. J.; Short, D. A.; North, G. R.; Mengel, J. G.
1986-01-01
A simple climate model has been used to calculate the effect of past changes in the land-sea distribution on the seasonal cycle of temperatures during the last 100 million years. Modeled summer temperature decreased over Greenland by more than 10 C and over Antarctica by 5 to 8 C. For the last 80 million years, this thermal response is comparable in magnitude to estimated atmospheric carbon dioxide effects. Analysis of paleontological data provides some support for the proposed hypothesis that large changes due to seasonality may have sometimes resulted in an ice-free state due to high summer temperatures rather than year-round warmth. Such 'cool' nonglacials may have prevailed for as much as one-third of the last 100 million years.
[The future population of Mexico. 123 million by the year 2010].
Madrigal Hinojosa, R
1988-01-01
Recent data on fertility in Mexico have allowed identification of the most likely of 2 alternative population projections through the year 2010. The projection assumes life expectancy rising from 64.08 years for men and 70.47 years for women in 1980-85 to 77.00 years in 2005-10. The migration assumption is that there will be a net loss of 529,274 Mexicans every 5 years. The total fertility rate is expected to decline to 2.7. The total population was projected at 82.8 million in 1988, 104.0 million at the end of the century, and 123.2 million in 2010. The 0-14 age group will decline from 44.23% of the population in 1980 and 40.33% in 1985 to 31.41% in 2000 and 29.50% in 2010. The proportion aged 15-64 will increase from 52.45% in 1980 and 56.22% in 1985 to 63.96% in 2000 and 64.75% in 2010. The proportion of the population in localities with under 2500 inhabitants is expected to remain stable at about 24.3 million persons. Mexico City, Guadalajara, Monterrey, and Puebla will have a combined population of 35 million by the year 2000. In 2010, the Federal District and the State of Mexico, which together encompass Mexico City, are expected to contain 29.3% of the total population. The projected population increase over the next 22 years is 40.4 million, 16% greater than the national population in 1960. The implications for providing food and consumer goods, and especially for improving the quality of life, are serious. The relative demand for primary and secondary education and for maternal-child health care will decline, but the demand for jobs and for family planning services will increase as the proportion of the population in the economically active age groups increases.
Minimal entropy probability paths between genome families.
Ahlbrandt, Calvin; Benson, Gary; Casey, William
2004-05-01
We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: Newton's method is iterated on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non
Occupation and cancer - follow-up of 15 million people in five Nordic countries.
Pukkala, Eero; Martinsen, Jan Ivar; Lynge, Elsebeth; Gunnarsdottir, Holmfridur Kolbrun; Sparén, Pär; Tryggvadottir, Laufey; Weiderpass, Elisabete; Kjaerheim, Kristina
2009-01-01
We present up to 45 years of cancer incidence data by occupational category for the Nordic populations. The study covers the 15 million people aged 30-64 years in the 1960, 1970, 1980/1981 and/or 1990 censuses in Denmark, Finland, Iceland, Norway and Sweden, and the 2.8 million incident cancer cases diagnosed in these people in a follow-up until about 2005. The study was undertaken as a cohort study with linkage of individual records based on the personal identity codes used in all the Nordic countries. In the censuses, information on occupation for each person was provided through free text in self-administered questionnaires. The data were centrally coded and computerised in the statistical offices. For the present study, the original occupational codes were reclassified into 53 occupational categories and one group of economically inactive persons. All Nordic countries have a nation-wide registration of incident cancer cases during the entire study period. For the present study the incident cancer cases were classified into 49 primary diagnostic categories. Some categories have been further divided according to sub-site or morphological type. The observed number of cancer cases in each group of persons defined by country, sex, age, period and occupation was compared with the expected number calculated from the stratum specific person years and the incidence rates for the national population. The result was presented as a standardised incidence ratio, SIR, defined as the observed number of cases divided by the expected number. For all cancers combined (excluding non-melanoma skin cancer), the study showed a wide variation among men from an SIR of 0.79 (95% confidence interval 0.66-0.95) in domestic assistants to 1.48 (1.43-1.54) in waiters. The occupations with the highest SIRs also included workers producing beverage and tobacco, seamen and chimney sweeps. Among women, the SIRs varied from 0.58 (0.37-0.87) in seafarers to 1.27 (1.19-1.35) in tobacco workers. Low
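The SIR defined above is simply observed over expected counts; a minimal sketch with an approximate 95% confidence interval, using Byar's approximation for the Poisson-distributed observed count (the counts below are hypothetical, not taken from the study):

```python
import math

def sir(observed, expected, z=1.96):
    """Standardised incidence ratio (observed / expected) with an
    approximate 95% CI via Byar's approximation for a Poisson count."""
    ratio = observed / expected
    lower = observed * (1 - 1 / (9 * observed)
                        - z / (3 * math.sqrt(observed))) ** 3 / expected
    upper = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                              + z / (3 * math.sqrt(observed + 1))) ** 3 / expected
    return ratio, lower, upper

# Hypothetical occupational group: 100 cases observed, 80 expected.
ratio, lower, upper = sir(100, 80.0)
```

An SIR whose interval excludes 1.0, as in the waiters' 1.48 (1.43-1.54) quoted above, indicates an excess over the national rate.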
Approximation of Failure Probability Using Conditional Sampling
NASA Technical Reports Server (NTRS)
Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.
2008-01-01
In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
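A toy version of this idea, assuming a uniform uncertain parameter and a failure-bounding box whose probability is known analytically (the failure set, bounding box, and sample count are all hypothetical illustrations, not the authors' examples):

```python
import random

def failure_prob_conditional(n_samples, seed=0):
    """Estimate P(x0 + x1 > 1.9) for x uniform on the unit square by
    sampling only inside the bounding box B = [0.9, 1]^2, whose
    probability (0.01) is known analytically:
    P(fail) = P(B) * P(fail | B).  The true answer is 0.005."""
    rng = random.Random(seed)
    p_bound = 0.01
    hits = 0
    for _ in range(n_samples):
        x0 = 0.9 + 0.1 * rng.random()
        x1 = 0.9 + 0.1 * rng.random()
        if x0 + x1 > 1.9:
            hits += 1
    return p_bound * hits / n_samples

p_fail = failure_prob_conditional(100_000)
```

Every sample lands in the informative region, so far fewer points are needed than with naive Monte Carlo over the whole square, which wastes 99% of its samples outside B.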
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.
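The Simpson's paradox that the BK-Plot visualizes is easy to reproduce numerically with a binary confounder; the counts below are the classic illustrative kidney-stone-style figures, used here purely as a hypothetical example:

```python
# (successes, total) per stratum of a binary confounder ("mild"/"severe"):
treated = {"mild": (81, 87), "severe": (192, 263)}
control = {"mild": (234, 270), "severe": (55, 80)}

def rate(successes, total):
    return successes / total

# Within each stratum the treated group does better...
mild_better = rate(*treated["mild"]) > rate(*control["mild"])
severe_better = rate(*treated["severe"]) > rate(*control["severe"])

# ...yet pooled over strata the treated group does worse, because
# treatment is disproportionately given to the severe stratum.
pooled_treated = rate(81 + 192, 87 + 263)
pooled_control = rate(234 + 55, 270 + 80)
paradox = mild_better and severe_better and pooled_treated < pooled_control
```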
Computing Earthquake Probabilities on Global Scales
NASA Astrophysics Data System (ADS)
Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.
2016-03-01
Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
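The count-to-probability conversion described above can be sketched with a Weibull law; the scale and shape parameters below are hypothetical placeholders, not values fitted in the study:

```python
import math

def large_event_probability(count, scale, shape):
    """Convert the number of small events observed since the last large
    event into a probability for the next large event via a Weibull law.
    The scale and shape here are illustrative, not fitted parameters."""
    return 1.0 - math.exp(-((count / scale) ** shape))

# Hypothetical: 120 small events since the last large one.
p = large_event_probability(120, scale=100.0, shape=1.4)
```

The probability rises monotonically with the small-event count, which is the intuition behind the method: the longer the quiet spell measured in small events, the closer the next large event.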
The role of probabilities in physics.
Le Bellac, Michel
2012-09-01
Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.
Are there one million nerve fibres in the human medullary pyramid?
Wada, A; Goto, J; Goto, N; Kawamura, N; Matsumoto, K
2001-03-01
It has been the accepted opinion that there are one million nerve fibres in the human medullary pyramid. This seemed to be confirmed in several old reports, but we cannot agree with this opinion. We made nitrocellulose-embedded sections from three normal male brains and stained them by our modification of the Masson-Goldner method. With this method, myelinated axons appeared in blue, whereas the glial processes were coloured in red, which allowed easy discrimination between the two. After morphometric evaluation of the pyramidal axons under the microscope, it appeared, without the slightest doubt, that the number of axons does not exceed one-tenth of one million.
Discoidal impressions and trace-like fossils more than 1200 million years old.
Rasmussen, Birger; Bengtson, Stefan; Fletcher, Ian R; McNaughton, Neal J
2002-05-10
The Stirling Range Formation of southwestern Australia contains discoidal impressions and trace-like fossils in tidal sandstones. The various disks have previously been linked to the Ediacaran biota, younger than 600 million years old. From this unit, we report U-Th-Pb geochronology of detrital zircon and monazite, as well as low-grade metamorphic monazite, constraining the depositional age to between 2016 +/- 6 and 1215 +/- 20 million years old. Although nonbiological origins for the discoidal impressions cannot be completely discounted, the structures resembling trace fossils clearly have a biological origin and suggest the presence of vermiform, mucus-producing, motile organisms.
Probability, arrow of time and decoherence
NASA Astrophysics Data System (ADS)
Bacciagaluppi, Guido
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
Match probabilities in racially admixed populations.
Lange, K
1993-01-01
The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors. PMID:8430693
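The product rule that linkage equilibrium justifies simply multiplies per-locus genotype frequencies; a minimal sketch for heterozygous loci under Hardy-Weinberg (the allele frequencies below are hypothetical):

```python
def genotype_frequency(p, q):
    """Hardy-Weinberg frequency of a heterozygous genotype with
    allele frequencies p and q at one locus."""
    return 2.0 * p * q

def match_probability(loci):
    """Product rule: multiply per-locus genotype frequencies across
    loci assumed to be in linkage equilibrium."""
    prob = 1.0
    for p, q in loci:
        prob *= genotype_frequency(p, q)
    return prob

# Hypothetical allele frequencies at three heterozygous loci:
prob = match_probability([(0.1, 0.2), (0.05, 0.3), (0.15, 0.25)])
```

The paper's contribution is precisely to bound such products when the Hardy-Weinberg and linkage-equilibrium assumptions hold only in ancestral, not admixed, populations.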
Local Directed Percolation Probability in Two Dimensions
NASA Astrophysics Data System (ADS)
Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi
1998-01-01
Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z^2 : x + y even, 0 ≤ y ≤ n, -y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and the site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β_g. An analysis based on Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_2n^l can be expressed as a function of P_n^g.
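The local probability P_n^l can also be estimated by direct Monte Carlo on the directed lattice, propagating the set of wet sites row by row; the bond probability and lattice sizes below are illustrative choices, not the paper's series-expansion parameters:

```python
import random

def local_percolation_prob(n, p, trials=20_000, seed=3):
    """Monte Carlo estimate of the local directed percolation
    probability: the chance that the origin connects to site (0, n)
    on the directed square lattice with bond probability p."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        wet = {0}                      # reachable x-coordinates in row y
        for _y in range(n):
            nxt = set()
            for x in wet:
                if rng.random() < p:   # bond to (x - 1, y + 1)
                    nxt.add(x - 1)
                if rng.random() < p:   # bond to (x + 1, y + 1)
                    nxt.add(x + 1)
            wet = nxt
            if not wet:
                break
        if 0 in wet:
            hits += 1
    return hits / trials

p_low = local_percolation_prob(4, 0.4)
p_high = local_percolation_prob(4, 0.9)
```

The estimate is monotone in the bond probability, as expected for percolation.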
Low-probability flood risk modeling for New York City.
Aerts, Jeroen C J H; Lin, Ning; Botzen, Wouter; Emanuel, Kerry; de Moel, Hans
2013-05-01
The devastating impact of Hurricane Sandy (2012) again showed that New York City (NYC) is one of the most vulnerable cities to coastal flooding around the globe. The low-lying areas in NYC can be flooded by nor'easter storms and North Atlantic hurricanes. The few studies that have estimated potential flood damage for NYC base their damage estimates on only a single, or a few, possible flood events. The objective of this study is to assess the full distribution of hurricane flood risk in NYC. This is done by calculating potential flood damage with a flood damage model that uses many possible storms and surge heights as input. These storms are representative of the low-probability/high-impact flood hazard faced by the city. Exceedance probability-loss curves are constructed under different assumptions about the severity of flood damage. The estimated flood damage to buildings for NYC is between US$59 million and US$129 million per year. The damage caused by a 1/100-year storm surge is within a range of US$2 bn to US$5 bn, while this is between US$5 bn and US$11 bn for a 1/500-year storm surge. An analysis of flood risk in each of the five boroughs of NYC finds that Brooklyn and Queens are the most vulnerable to flooding. This study examines several uncertainties in the various steps of the risk analysis, which resulted in variations in flood damage estimations. These uncertainties include: the interpolation of flood depths; the use of different flood damage curves; and the influence of the spectra of characteristics of the simulated hurricanes.
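An exceedance probability-loss curve of the kind constructed in this study can be integrated (here by the trapezoidal rule) into an expected annual loss; the curve points below are hypothetical, only loosely inspired by the surge figures quoted above:

```python
def expected_annual_loss(curve):
    """Trapezoidal integration of an exceedance probability-loss curve.
    `curve` is a list of (annual exceedance probability, loss) points
    ordered from frequent/small to rare/large events."""
    total = 0.0
    for (p1, l1), (p2, l2) in zip(curve, curve[1:]):
        total += 0.5 * (l1 + l2) * (p1 - p2)
    return total

# Hypothetical curve in billions of US$: a 1/10-yr event causing minor
# damage up to a 1/500-yr event causing US$8 bn.
curve = [(0.1, 0.1), (0.01, 3.5), (0.002, 8.0)]
eal = expected_annual_loss(curve)  # billions of US$ per year
```

The area under the curve is dominated by the rare, large events, which is why single-scenario damage studies understate annualized risk.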
Explosion probability of unexploded ordnance: expert beliefs.
MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G
2008-08-01
This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies
Exact probability distribution functions for Parrondo's games
NASA Astrophysics Data System (ADS)
Zadourian, Rubina; Saakian, David B.; Klümper, Andreas
2016-12-01
We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
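The exact Fourier-transform distributions are beyond a short sketch, but the capital-dependent games themselves are easy to simulate; the win probabilities below are the textbook parameter choices for Parrondo's games, and the random mixture of A and B exhibits the paradoxical positive drift even though each game is losing on its own:

```python
import random

def play_parrondo(rounds, eps=0.005, seed=1):
    """Play a random mixture of Parrondo's games A and B; return the
    final capital. Game A: win with probability 1/2 - eps. Game B: win
    with probability 1/10 - eps if capital is divisible by 3, else
    3/4 - eps (the standard textbook parameters)."""
    rng = random.Random(seed)
    capital = 0
    for _ in range(rounds):
        if rng.random() < 0.5:          # play game A
            win_p = 0.5 - eps
        elif capital % 3 == 0:          # game B, "bad" coin
            win_p = 0.1 - eps
        else:                           # game B, "good" coin
            win_p = 0.75 - eps
        capital += 1 if rng.random() < win_p else -1
    return capital

final = [play_parrondo(1000, seed=s) for s in range(200)]
mean_gain = sum(final) / len(final)
```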
Intrinsic Probability of a Multifractal Set
NASA Astrophysics Data System (ADS)
Hosokawa, Iwao
1991-12-01
It is shown that a self-similar measure isotropically distributed in a d-dimensional set should have its own intermittency exponents equivalent to its own generalized dimensions (in the sense of Hentschel and Procaccia), and that the intermittency exponents are completely designated by an intrinsic probability which governs the spatial distribution of the measure. Based on this, it is proven that the intrinsic probability uniquely determines the spatial distribution of the scaling index α of the measure as well as the so-called f-α spectrum of the multifractal set.
Atomic transition probabilities of Nd I
NASA Astrophysics Data System (ADS)
Stockett, M. H.; Wood, M. P.; Den Hartog, E. A.; Lawler, J. E.
2011-12-01
Fourier transform spectra are used to determine emission branching fractions for 236 lines of the first spectrum of neodymium (Nd I). These branching fractions are converted to absolute atomic transition probabilities using radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 225001). The wavelength range of the data set is from 390 to 950 nm. These transition probabilities from emission and laser measurements are compared to relative absorption measurements in order to assess the importance of unobserved infrared branches from selected upper levels.
Probabilities for separating sets of order statistics.
Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E
2010-04-01
Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that among the smallest statistics a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
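The second probability described above, how many of the k smallest pooled order statistics come from the first population, reduces to a hypergeometric count when the two distributions coincide, which makes a Monte Carlo sketch easy to check (sample sizes below are hypothetical, and both samples are drawn from Uniform(0,1) rather than the paper's general distributions):

```python
import random

def prob_first_pop_in_smallest(n1, n2, k, j, trials=50_000, seed=2):
    """Monte Carlo estimate of the probability that exactly j of the
    k smallest pooled order statistics come from the first population.
    With both samples from the same continuous distribution the exact
    answer is hypergeometric: C(n1,j)*C(n2,k-j)/C(n1+n2,k)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pooled = [(rng.random(), 0) for _ in range(n1)]
        pooled += [(rng.random(), 1) for _ in range(n2)]
        pooled.sort()
        if sum(1 for _, pop in pooled[:k] if pop == 0) == j:
            hits += 1
    return hits / trials

# Exact value: C(5,2)*C(5,2)/C(10,4) = 100/210 ~ 0.476
p = prob_first_pop_in_smallest(n1=5, n2=5, k=4, j=2)
```

When the two distribution functions differ, the same simulation applies with the appropriate inverse-CDF draws, which is the general case the paper treats analytically.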
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.
Steering in spin tomographic probability representation
NASA Astrophysics Data System (ADS)
Man'ko, V. I.; Markovich, L. A.
2016-09-01
The steering property known for the two-qubit state in terms of specific inequalities for the correlation function is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.
Determining system maintainability as a probability
Wright, R.E.; Atwood, C.L.
1988-01-01
Maintainability has often been defined in principle as the probability that a system or component can be repaired in a specific time given that it is in a failed state, but presented in practice in terms of mean-time-to-repair. In this paper, formulas are developed for maintainability as a probability, analogous to those for reliability and availability. This formulation is expressed in terms of cut sets, and leads to a natural definition of unmaintainability importance for cut sets and basic events. 6 refs.
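The paper's point, that maintainability can be written as a probability rather than a bare mean-time-to-repair, can be sketched in the simplest case of exponentially distributed repair times (an assumption made here for illustration; the paper develops more general cut-set formulas):

```python
import math

def maintainability(t, mttr):
    """Probability that repair completes within time t, assuming
    exponentially distributed repair times with mean mttr."""
    return 1.0 - math.exp(-t / mttr)

# Chance of completing a repair within 8 hours when MTTR is 4 hours:
m = maintainability(t=8.0, mttr=4.0)
```

The same MTTR of 4 hours thus translates into a concrete statement, roughly an 86% chance of repair within 8 hours, which is more informative for availability analysis than the mean alone.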
Probability in biology: overview of a comprehensive theory of probability in living systems.
Nakajima, Toshiyuki
2013-09-01
Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems.
NASA Astrophysics Data System (ADS)
Kotthoff, U.; Greenwood, D. R.; McCarthy, F. M. G.; Müller-Navarra, K.; Hesselbo, S. P.
2013-12-01
We have investigated the palynology of sediment cores from Sites M0027 and M0029 of IODP Expedition 313 on the New Jersey shallow shelf, east coast of North America, spanning an age range of 33 to 13 million years before present. Additionally, a pollen assemblage from the Pleistocene was examined. The palynological results were statistically analyzed and complemented with pollen-based quantitative climate reconstructions. Transport-related bias of the pollen assemblages was identified via analysis of the ratio of terrestrial to marine palynomorphs, and considered when interpreting palaeovegetation and palaeoclimate from the pollen data. Results indicate that from the early Oligocene to the middle Miocene, the hinterland vegetation of the New Jersey shelf was characterized by oak-hickory forests in the lowlands and conifer-dominated vegetation in the highlands. The Oligocene witnessed several expansions of conifer forest, probably related to cooling events. The pollen-based climate data imply an increase in annual temperatures from ~12 °C to more than 15 °C during the Oligocene. The Mi-1 cooling event at the onset of the Miocene is reflected by an expansion of conifers and an annual temperature decrease by almost 3 °C, from 15 °C to 12.5 °C around 23 million years before present. Particularly low annual temperatures are also recorded for an interval around ~20 million years before present, which probably reflects the Mi-1aa cooling event. Generally, the Miocene ecosystem and climate conditions were very similar to those of the Oligocene in the hinterland of the New Jersey shelf. Miocene grasslands, as known from other areas in the USA during that time period, are not evident for the hinterland of the New Jersey shelf. Surprisingly, the palaeovegetation data for the hinterland of the New Jersey shelf do not show extraordinary changes during the Mid-Miocene climatic optimum at ~15 million years before present, except for a minor increase in deciduous
Thompson, Jeffrey R.; Petsios, Elizabeth; Davidson, Eric H.; Erkenbrack, Eric M.; Gao, Feng; Bottjer, David J.
2015-01-01
Echinoids, or sea urchins, are rare in the Palaeozoic fossil record, and thus the details regarding the early diversification of crown group echinoids are unclear. Here we report on the earliest probable crown group echinoid from the fossil record, recovered from Permian (Roadian-Capitanian) rocks of west Texas, which has important implications for the timing of the divergence of crown group echinoids. The presence of apophyses and rigidly sutured interambulacral areas with two columns of plates indicates this species is a cidaroid echinoid. The species, Eotiaris guadalupensis n. sp., is therefore the earliest stem group cidaroid. The occurrence of this species in Roadian strata pushes back the divergence of cidaroids and euechinoids, the clades that comprise all living echinoids, to at least 268.8 Ma, ten million years older than the previously oldest known cidaroid. Furthermore, the genomic regulation of development in echinoids is amongst the best known, and this new species informs the timing of large-scale reorganization in echinoid gene regulatory networks that occurred at the cidaroid-euechinoid divergence, indicating that these changes took place by the Roadian stage of the Permian. PMID:26486232
$35-Million Helps Cornell University Recruit Faculty and Ward off Poachers
ERIC Educational Resources Information Center
June, Audrey Williams
2008-01-01
When it comes to building a top-notch faculty, racing to land prominent scholars is only half the battle for colleges. The other half: Fighting off poachers intent on swiping the college's existing talented mid-career professors. At Cornell University, a $35-million gift announced by officials in late September will give the institution an edge in…
U.S. EPA Awards $1 Million Grant to Research Impact of Drought on Water Quality
SAN FRANCISCO - The U.S. Environmental Protection Agency announced a $1 million grant to the Public Policy Institute of California (PPIC) to conduct research on the effects of drought and extreme weather on the state's water resources. The study will exami
Laying the Foundation for a Solar America: The Million Solar Roofs Initiative
Strahs, G.; Tombari, C.
2006-10-01
As the U.S. Department of Energy's Solar Energy Technology Program embarks on the next phase of its technology acceptance efforts under the Solar America Initiative, there is merit to examining the program's previous market transformation effort, the Million Solar Roofs Initiative. Its goal was to transform markets for distributed solar technologies by facilitating the installation of solar systems.
We Want Our 27 Million Dollars back: Retention as a Revenue Resource
ERIC Educational Resources Information Center
Smith, Raymond T.; Liguori, Denise; O'Connor, Dianna; Postle, Monica
2009-01-01
Community colleges lose millions of dollars in potential revenue due to lackluster retention and graduation rates. It is time for change! Bergen Community College (BCC) has made a unique commitment to concentrate solely on this issue. Learn how the establishment of the Department of Retention Services, with its cutting edge initiatives, is…
A Million New Teachers Are Coming: Will They Be Ready to Teach?
ERIC Educational Resources Information Center
DeMonte, Jenny
2015-01-01
Research shows that the most powerful, in-school influence on learning is the quality of instruction that teachers bring to their students. In the next decade, more than 1.5 million new teachers will be hired for our schools; unfortunately, teacher preparation programs may not be up to the task of delivering the teacher workforce we need, and…
EPA Provides New York State $197 Million for Clean Water Projects
(New York, N.Y.) The U.S. Environmental Protection Agency has allotted $197 million to New York State to help finance improvements to water projects that are essential to protecting public health and the environment. The funds will be used to finance water
Barcodes in a Medical Office Computer System: Experience with Eight Million Data Entry Operations
Willard, Oliver T.
1985-01-01
A medical office management software package has been developed which utilizes barcodes to enhance data entry. The system has been in use in our practice since 1982. Currently, there are over twenty-five installations of this system with a combined experience of some eight million data entry operations using barcodes. The barcode system design and our experience with it are described.
$156 million budget increase to fight HIV/AIDS in African American & other minority communities.
1998-12-01
A series of initiatives was established to invest $156 million in efforts to fight HIV/AIDS in minority populations, where AIDS is still a leading cause of death among men and women between 25 and 44 years of age. The initiatives will target specific communities and include technical assistance and increased access to care. Specific goals of the initiatives are discussed.
EPA Awards $12.7 Million to Assist Small Drinking Water and Wastewater Systems
WASHINGTON - The U.S. Environmental Protection Agency (EPA) is announcing the award of $12.7 million in grants to help small drinking and wastewater systems and private well owners located in urban and rural communities throughout the U.S. and its t
Zooniverse - Real science online with more than a million people. (Invited)
NASA Astrophysics Data System (ADS)
Smith, A.; Lynn, S.; Lintott, C.; Whyte, L.; Borden, K. A.
2013-12-01
The Zooniverse (zooniverse.org) began in 2007 with the launch of Galaxy Zoo, a project in which more than 175,000 people provided shape analyses of more than 1 million galaxy images sourced from the Sloan Digital Sky Survey. These galaxy 'classifications', some 60 million in total, have since been used to produce more than 50 peer-reviewed publications based not only on the original research goals of the project but also on serendipitous discoveries made by the volunteer community. Building on the success of Galaxy Zoo, the team has gone on to develop more than 25 web-based citizen science projects, all with a strong research focus, in a range of subjects from astronomy to zoology in which human-based analysis still exceeds that of machine intelligence. Over the past 6 years, Zooniverse projects have collected more than 300 million data analyses from over 1 million volunteers, providing fantastically rich datasets not only for the individuals working to produce research from their projects but also for the machine learning and computer vision research communities. This talk will focus on the core 'method' by which Zooniverse projects are developed and on lessons learned by the Zooniverse team in developing citizen science projects across a range of disciplines.
One Million Bones: Measuring the Effect of Human Rights Participation in the Social Work Classroom
ERIC Educational Resources Information Center
McPherson, Jane; Cheatham, Leah P.
2015-01-01
This article describes the integration of human rights content and a national arts-activism initiative--One Million Bones--into a bachelor's-level macro practice class as a human rights teaching strategy. Two previously validated scales, the Human Rights Exposure (HRX) in Social Work and the Human Rights Engagement (HRE) in Social Work (McPherson…
Plant-wide assessment summary: $4.1 million in savings identified in Paramount Petroleum assessment
None, None
2003-08-01
The Paramount Petroleum Corporation (PPC) and its partners conducted a systematic plant-wide assessment (PWA) to identify energy- and cost-saving opportunities at the company's plant in Paramount, California. The assessment team identified $4.1 million in potential annual savings.
U.S. EPA to Announce Millions to Improve Local Water Infrastructure, Water Quality Statewide
LOS ANGELES - Tomorrow, U.S. EPA Regional Administrator Jared Blumenfeld will be joined by City of Carlsbad Mayor Matt Hall to announce millions of dollars in funding to the state that will improve local water infrastructure and control water pollution st
EPA Announces $1 Million Clean Diesel Grant to Improve Air Quality in Detroit
(CHICAGO-December 3, 2015) U.S. Environmental Protection Agency Region 5 today announced a $1 million Clean Diesel grant that Southwest Detroit Environmental Vision will use to reduce emissions from diesel trucks to improve air quality in Detroit. T
EPA Awards $5 million in Clean Diesel Grants to Protect Health of Communities near Ports
WASHINGTON -- The U.S. Environmental Protection Agency (EPA) today awarded $5 million in grant funding for clean diesel projects at U.S. ports. The selected projects in California, Oregon, New Jersey and Texas will improve the air quality for people
EPA Announces Availability of $26 Million to Clean Up Diesel Engines Nationwide
SAN FRANCISCO - The U.S. Environmental Protection Agency today announced the availability of $4.4 million in grant funding to establish clean diesel projects aimed at reducing emissions from the existing fleet of diesel engines in Arizona, California, Haw
Probability learning and Piagetian probability conceptions in children 5 to 12 years old.
Kreitler, S; Zigler, E; Kreitler, H
1989-11-01
This study focused on the relations between performance on a three-choice probability-learning task and conceptions of probability as outlined by Piaget concerning mixture, normal distribution, random selection, odds estimation, and permutations. The probability-learning task and four Piagetian tasks were administered randomly to 100 male and 100 female, middle SES, average IQ children in three age groups (5 to 6, 8 to 9, and 11 to 12 years old) from different schools. Half the children were from Middle Eastern backgrounds, and half were from European or American backgrounds. As predicted, developmental level of probability thinking was related to performance on the probability-learning task. The more advanced the child's probability thinking, the higher his or her level of maximization and hypothesis formulation and testing and the lower his or her level of systematically patterned responses. The results suggest that the probability-learning and Piagetian tasks assess similar cognitive skills and that performance on the probability-learning task reflects a variety of probability concepts.
Small domes on Venus: probable analogs of Icelandic lava shields
Garvin, James B.; Williams, Richard S.
1990-01-01
On the basis of observed shapes and volumetric estimates, we interpret small, dome-like features on radar images of Venus to be analogs of Icelandic lava-shield volcanoes. Using morphometric data for venusian domes in Aubele and Slyuta (in press), as well as our own measurements of representative dome volumes and areas from Tethus Regio, we demonstrate that the characteristic aspect ratios and flank slopes of these features are consistent with a subclass of low Icelandic lava-shield volcanoes (LILS). LILS are slightly convex in cross-section with typical flank slopes of ∼3°. Plausible lava-shield-production rates for the venusian plains suggest formation of ∼53 million shields over the past 0.25 Ga. The cumulative global volume of lava that would be associated with this predicted number of lava shields is only a factor of 3–4 times that of a single oceanic composite shield volcano such as Mauna Loa. The global volume of all venusian lava shields in the 0.5–20-km size range would only contribute a meter of resurfacing over geologically significant time scales. Thus, venusian analogs to LILS may represent the most abundant landform on the globally dominant plains of Venus, but would be insignificant with regard to the global volume of lava extruded. As in Iceland, associated lavas from fissure eruptions probably dominate plains volcanism and should be evident on the higher resolution Magellan radar images.
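The abstract's "meter of resurfacing" claim can be sanity-checked with back-of-envelope arithmetic. The Mauna Loa edifice volume and Venus surface area below are assumed round figures, not values from the paper; only the 3–4× factor comes from the abstract.

```python
# Hedged order-of-magnitude check of the resurfacing claim above.
mauna_loa_km3 = 8.0e4                   # assumed Mauna Loa edifice volume, km^3
venus_area_km2 = 4.6e8                  # surface area of Venus, km^2
total_shield_km3 = 3.5 * mauna_loa_km3  # mid-range of the abstract's 3-4x factor
layer_m = total_shield_km3 / venus_area_km2 * 1000.0  # km -> m
print(f"global resurfacing layer ~ {layer_m:.2f} m")  # on the order of 1 m
```

Under these assumptions the cumulative shield volume spread globally is indeed well under a meter, consistent with the abstract's conclusion.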
Southern Ocean dust-climate coupling over the past four million years.
Martínez-Garcia, Alfredo; Rosell-Melé, Antoni; Jaccard, Samuel L; Geibert, Walter; Sigman, Daniel M; Haug, Gerald H
2011-08-03
Dust has the potential to modify global climate by influencing the radiative balance of the atmosphere and by supplying iron and other essential limiting micronutrients to the ocean. Indeed, dust supply to the Southern Ocean increases during ice ages, and 'iron fertilization' of the subantarctic zone may have contributed up to 40 parts per million by volume (p.p.m.v.) of the decrease (80-100 p.p.m.v.) in atmospheric carbon dioxide observed during late Pleistocene glacial cycles. So far, however, the magnitude of Southern Ocean dust deposition in earlier times and its role in the development and evolution of Pleistocene glacial cycles have remained unclear. Here we report a high-resolution record of dust and iron supply to the Southern Ocean over the past four million years, derived from the analysis of marine sediments from ODP Site 1090, located in the Atlantic sector of the subantarctic zone. The close correspondence of our dust and iron deposition records with Antarctic ice core reconstructions of dust flux covering the past 800,000 years (refs 8, 9) indicates that both of these archives record large-scale deposition changes that should apply to most of the Southern Ocean, validating previous interpretations of the ice core data. The extension of the record beyond the interval covered by the Antarctic ice cores reveals that, in contrast to the relatively gradual intensification of glacial cycles over the past three million years, Southern Ocean dust and iron flux rose sharply at the Mid-Pleistocene climatic transition around 1.25 million years ago. This finding complements previous observations over late Pleistocene glacial cycles, providing new evidence of a tight connection between high dust input to the Southern Ocean and the emergence of the deep glaciations that characterize the past one million years of Earth history.
Investigating Probability with the NBA Draft Lottery.
ERIC Educational Resources Information Center
Quinn, Robert J.
1997-01-01
Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…
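The combinatorics behind the lottery can be sketched briefly. In the modern weighted system, each nonplayoff team is assigned some of the C(14, 4) = 1001 four-ball combinations (one combination is left unassigned); the 140-combination allocation below is illustrative, not taken from the article, which predates the current weighting.

```python
from math import comb

# Counting the four-ball combinations used in the NBA draft lottery.
total = comb(14, 4)      # 1001 possible combinations of 4 balls from 14
assigned = total - 1     # one combination is conventionally left unassigned
worst_team = 140         # assumed illustrative allocation for one team
p_first_pick = worst_team / assigned
print(total, p_first_pick)
```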
Confusion between Odds and Probability, a Pandemic?
ERIC Educational Resources Information Center
Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer
2012-01-01
This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…
Probability distribution functions of the Grincevicjus series
NASA Astrophysics Data System (ADS)
Kapica, Rafal; Morawiec, Janusz
2008-06-01
Given a sequence (ξ_n, η_n) of independent, identically distributed random vectors, we consider the Grincevicjus series and a functional-integral equation connected with it. We prove that the equation characterizes all probability distribution functions of the Grincevicjus series. Moreover, an application of this characterization to a continuous refinement equation is presented.
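A series of this type is commonly written X = Σ_n η_n ξ_1⋯ξ_{n-1}, the distributional fixed point of the random affine map x ↦ ξx + η. The sketch below simulates draws of such a series under illustrative assumed laws (ξ uniform on [0, 0.9], η standard normal), which are not taken from the paper.

```python
import random

# Monte Carlo sketch of a series X = sum_n eta_n * xi_1*...*xi_{n-1}.
# The distributions of xi and eta here are illustrative assumptions.
random.seed(1)

def sample_series(n_terms=200):
    total, prod = 0.0, 1.0
    for _ in range(n_terms):
        total += random.gauss(0.0, 1.0) * prod  # add eta_n times the running product
        prod *= random.uniform(0.0, 0.9)        # |xi| < 1 ensures convergence
    return total

draws = [sample_series() for _ in range(5000)]
mean = sum(draws) / len(draws)   # E[eta] = 0, so the mean should be near 0
print(round(mean, 2))
```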
Time Required to Compute A Posteriori Probabilities,
The paper discusses the time required to compute a posteriori probabilities using Bayes' Theorem. In a two-hypothesis example it is shown that, to... Bayes' Theorem as the group operation. Winograd's results concerning the lower bound on the time required to perform a group operation on a finite group using logical circuitry are therefore applicable.
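The two-hypothesis setting the abstract refers to can be illustrated with a minimal sequential Bayes update (an assumed illustration, not the paper's circuit-level analysis): each observation multiplies each hypothesis's probability by its likelihood, followed by renormalization.

```python
# Minimal two-hypothesis sequential application of Bayes' theorem.
def posterior(prior_h1, likelihoods_h1, likelihoods_h2):
    """Update P(H1) observation by observation; H2 has probability 1 - P(H1)."""
    p1, p2 = prior_h1, 1.0 - prior_h1
    for l1, l2 in zip(likelihoods_h1, likelihoods_h2):
        p1, p2 = p1 * l1, p2 * l2
        norm = p1 + p2              # renormalize after each observation
        p1, p2 = p1 / norm, p2 / norm
    return p1

# Two observations, each more likely under H1 (0.8) than under H2 (0.3).
p = posterior(0.5, [0.8, 0.8], [0.3, 0.3])
print(round(p, 3))
```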
Interstitial lung disease probably caused by imipramine.
Deshpande, Prasanna R; Ravi, Ranjani; Gouda, Sinddalingana; Stanley, Weena; Hande, Manjunath H
2014-01-01
Drugs are rarely associated with causing interstitial lung disease (ILD). We report a case of a 75-year-old woman who developed ILD after exposure to imipramine. To our knowledge, this is one of the rare cases of ILD probably caused due to imipramine. There is need to report such rare adverse effects related to ILD and drugs for better management of ILD.
The Smart Potential behind Probability Matching
ERIC Educational Resources Information Center
Gaissmaier, Wolfgang; Schooler, Lael J.
2008-01-01
Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…
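Why probability matching is an anomaly is a one-line calculation (a standard textbook comparison, not data from the article): if one option pays off with probability p, always choosing it yields accuracy p, while matching choice rates to outcome rates yields only p² + (1 − p)².

```python
# Expected accuracy of maximizing vs. probability matching for p = 0.7.
p = 0.7
maximizing = p                         # always pick the more likely option
matching = p * p + (1 - p) * (1 - p)   # pick each option at its own rate
print(maximizing, matching)            # matching is strictly worse for p != 0.5
```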
Probability of boundary conditions in quantum cosmology
NASA Astrophysics Data System (ADS)
Suenobu, Hiroshi; Nambu, Yasusada
2017-02-01
One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe which can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify the exact solutions by introducing two real parameters to discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.
Idempotent probability measures on ultrametric spaces
NASA Astrophysics Data System (ADS)
Hubal, Oleksandra; Zarichnyi, Mykhailo
2008-07-01
Following the construction due to Hartog and Vink we introduce a metric on the set of idempotent probability measures (Maslov measures) defined on an ultrametric space. This construction determines a functor on the category of ultrametric spaces and nonexpanding maps. We prove that this functor is the functorial part of a monad on this category. This monad turns out to contain the hyperspace monad.
Five-Parameter Bivariate Probability Distribution
NASA Technical Reports Server (NTRS)
Tubbs, J.; Brewer, D.; Smith, O. W.
1986-01-01
NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.
Independent Events in Elementary Probability Theory
ERIC Educational Resources Information Center
Csenki, Attila
2011-01-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1,…
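The gap between pairwise and joint satisfaction of the multiplication rule can be shown by enumeration. Bernstein's classic example (an assumed illustration, not taken from the article): with two fair coin flips, let E1 = "first is heads", E2 = "second is heads", E3 = "the flips agree". Every pair satisfies the multiplication rule, but the triple does not.

```python
from itertools import product
from fractions import Fraction

# Enumerate the 4 equally likely outcomes of two fair coin flips.
omega = list(product("HT", repeat=2))
P = lambda ev: Fraction(sum(ev(w) for w in omega), len(omega))

E1 = lambda w: w[0] == "H"       # first flip heads
E2 = lambda w: w[1] == "H"       # second flip heads
E3 = lambda w: w[0] == w[1]      # flips agree

pairwise = all(
    P(lambda w, a=a, b=b: a(w) and b(w)) == P(a) * P(b)
    for a, b in [(E1, E2), (E1, E3), (E2, E3)]
)
joint = P(lambda w: E1(w) and E2(w) and E3(w)) == P(E1) * P(E2) * P(E3)
print(pairwise, joint)   # pairwise holds, joint fails (1/4 vs. 1/8)
```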
Geometric Probability and the Areas of Leaves
ERIC Educational Resources Information Center
Hoiberg, Karen Bush; Sharp, Janet; Hodgson, Ted; Colbert, Jim
2005-01-01
This article describes how a group of fifth-grade mathematics students measured irregularly shaped objects using geometric probability theory. After learning how to apply a ratio procedure to find the areas of familiar shapes, students extended the strategy for use with irregularly shaped objects, in this case, leaves. (Contains 2 tables and 8…
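The ratio procedure can be sketched in miniature (an assumed illustration, not the classroom materials themselves): scatter random points over a known bounding rectangle and estimate an irregular region's area from the fraction of points landing inside. Here the "leaf" is a unit circle so the estimate can be checked against π.

```python
import random

# Geometric-probability area estimate: area ~ bounding area * hit ratio.
random.seed(0)
inside = lambda x, y: x * x + y * y <= 1.0   # the "leaf": a unit circle
n = 100_000
hits = sum(inside(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n))
area = 4.0 * hits / n        # bounding square has area 4
print(round(area, 2))        # close to pi
```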
Assessing Schematic Knowledge of Introductory Probability Theory
ERIC Educational Resources Information Center
Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley
2005-01-01
The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…
Automatic Item Generation of Probability Word Problems
ERIC Educational Resources Information Center
Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina
2009-01-01
Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…
Probability from a Socio-Cultural Perspective
ERIC Educational Resources Information Center
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Probability & Perception: The Representativeness Heuristic in Action
ERIC Educational Resources Information Center
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
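The die-rolling activity translates directly to a simulation (an assumed sketch of the hands-on exercise, not the article's own materials): a "patterned" sequence like (6, 6) and a "random-looking" one like (3, 5) occur equally often, each with probability 1/36, which is the point the representativeness heuristic obscures.

```python
import random

# Roll a die twice, many times; compare frequencies of two specific sequences.
random.seed(42)
trials = 200_000
pairs = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(trials)]
f_66 = pairs.count((6, 6)) / trials   # "patterned" sequence
f_35 = pairs.count((3, 5)) / trials   # "random-looking" sequence
print(round(f_66, 3), round(f_35, 3))  # both near 1/36 ~ 0.028
```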
Posterior Probabilities for a Consensus Ordering.
ERIC Educational Resources Information Center
Fligner, Michael A.; Verducci, Joseph S.
1990-01-01
The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)
Phonotactic Probability Effects in Children Who Stutter
ERIC Educational Resources Information Center
Anderson, Julie D.; Byrd, Courtney T.
2008-01-01
Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…
Rethinking the learning of belief network probabilities
Musick, R.
1996-03-01
Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
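The "rote learning" baseline the paper sets out to improve can be sketched for a single network node: conditional probabilities are estimated from counts, here with Laplace smoothing. The tiny dataset and variable names are illustrative assumptions, not from the paper or its car-insurance network.

```python
from collections import Counter

# Conditional probability table (CPT) for one child node, learned by counting.
data = [  # (parent_value, child_value) observations -- assumed toy data
    ("rain", "wet"), ("rain", "wet"), ("rain", "dry"),
    ("sun", "dry"), ("sun", "dry"), ("sun", "wet"),
]
child_states = ["wet", "dry"]

def cpt(parent_value, alpha=1.0):
    """P(child | parent) via smoothed maximum-likelihood counts."""
    counts = Counter(c for p, c in data if p == parent_value)
    total = sum(counts.values()) + alpha * len(child_states)
    return {s: (counts[s] + alpha) / total for s in child_states}

print(cpt("rain"))
```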
Probability distribution functions in turbulent convection
NASA Technical Reports Server (NTRS)
Balachandar, S.; Sirovich, L.
1991-01-01
Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.
Probability & Statistics: Modular Learning Exercises. Student Edition
ERIC Educational Resources Information Center
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…
Spatial Probability Cuing and Right Hemisphere Damage
ERIC Educational Resources Information Center
Shaqiri, Albulena; Anderson, Britt
2012-01-01
In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…
Learning a Probability Distribution Efficiently and Reliably
NASA Technical Reports Server (NTRS)
Laird, Philip; Gamble, Evan
1988-01-01
A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
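The idea the algorithm's name points at can be sketched as follows; this is a generic illustration of learning an empirical CDF over a finite set and inverting it to sample, not a reconstruction of the paper's algorithm or its accuracy guarantees.

```python
import random
from bisect import bisect_left

# Learn a distribution over a finite set from samples, then invert its CDF.
random.seed(7)
true_dist = {"a": 0.5, "b": 0.3, "c": 0.2}   # assumed target distribution
items = list(true_dist)

# 1. Learn: estimate probabilities from observed samples.
obs = random.choices(items, weights=list(true_dist.values()), k=50_000)
est = {x: obs.count(x) / len(obs) for x in items}

# 2. Invert: cumulative sums map a uniform draw in [0, 1) to an item.
cum, running = [], 0.0
for x in items:
    running += est[x]
    cum.append(running)
draw = lambda u: items[bisect_left(cum, u)]
print(est["a"], draw(0.99))
```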
Probability & Statistics: Modular Learning Exercises. Teacher Edition
ERIC Educational Resources Information Center
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…
Overcoming Challenges in Learning Probability Vocabulary
ERIC Educational Resources Information Center
Groth, Randall E.; Butler, Jaime; Nelson, Delmar
2016-01-01
Students can struggle to understand and use terms that describe probabilities. Such struggles lead to difficulties comprehending classroom conversations. In this article, we describe some specific misunderstandings a group of students (ages 11-12) held in regard to vocabulary such as "certain", "likely" and…
Activities in Elementary Probability, Monograph No. 9.
ERIC Educational Resources Information Center
Fouch, Daniel J.
This monograph on elementary probability for middle school, junior high, or high school consumer mathematics students is divided into two parts. Part I emphasizes lessons which cover the fundamental counting principle, permutations, and combinations. The 5 lessons of Part I indicate the objectives, examples, methods, application, and problems…
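The three counting tools the lessons cover fit in a few lines; the specific numbers (shirts, runners, cards) are illustrative choices, not examples from the monograph.

```python
from math import comb, perm

# Fundamental counting principle: 4 shirts x 3 pants.
outfits = 4 * 3
# Permutations: ordered podium (1st, 2nd, 3rd) from 10 runners.
podiums = perm(10, 3)    # 10 * 9 * 8
# Combinations: unordered 5-card hands from a 52-card deck.
hands = comb(52, 5)
print(outfits, podiums, hands)
```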
Probability in Action: The Red Traffic Light
ERIC Educational Resources Information Center
Shanks, John A.
2007-01-01
Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…
Technique for Evaluating Multiple Probability Occurrences /TEMPO/
NASA Technical Reports Server (NTRS)
Mezzacappa, M. A.
1970-01-01
Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.
Monte Carlo methods to calculate impact probabilities
NASA Astrophysics Data System (ADS)
Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.
2014-09-01
Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward
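The "super-sizing" trick can be illustrated with a toy geometry (all numbers assumed, not from the paper): sample crossing points over a patch, count hits on an enlarged target to keep the hit rate measurable, then rescale by the area ratio to recover the probability for the true target size.

```python
import random
from math import pi

# Toy Monte Carlo impact-probability estimate with an enlarged target.
random.seed(3)
patch = 100.0    # half-width of the sampled square patch (arbitrary units)
r_big = 10.0     # enlarged ("super-sized") target radius
n = 200_000
hits = sum(
    random.uniform(-patch, patch) ** 2 + random.uniform(-patch, patch) ** 2
    <= r_big ** 2
    for _ in range(n)
)
p_big = hits / n                 # hit probability on the enlarged target
p_impact = p_big * (1.0 / r_big) ** 2   # rescale to a unit-radius target
print(p_impact)                  # analytic value: pi / (4 * patch**2)
```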
Southard, Rodney E.; Veilleux, Andrea G.
2014-01-01
similar and related to three primary physiographic provinces. The final regional regression analyses resulted in three sets of equations. For Regions 1 and 2, the basin characteristics of drainage area and basin shape factor were statistically significant. For Region 3, because of the small amount of data from streamgages, only drainage area was statistically significant. Average standard errors of prediction ranged from 28.7 to 38.4 percent for flood region 1, 24.1 to 43.5 percent for flood region 2, and 25.8 to 30.5 percent for region 3. The regional regression equations are only applicable to stream sites in Missouri with flows not significantly affected by regulation, channelization, backwater, diversion, or urbanization. Basins with about 5 percent or less impervious area were considered to be rural. Applicability of the equations is limited to basin characteristic values that range from 0.11 to 8,212.38 square miles (mi2) and basin shape from 2.25 to 26.59 for Region 1, 0.17 to 4,008.92 mi2 and basin shape 2.04 to 26.89 for Region 2, and 2.12 to 2,177.58 mi2 for Region 3. Annual peak data from streamgages were used to qualitatively assess the largest floods recorded at streamgages in Missouri since the 1915 water year. Based on existing streamgage data, the 1983 flood event was the largest flood event on record since 1915. The next five largest flood events, in descending order, took place in 1993, 1973, 2008, 1994 and 1915. Since 1915, five of six of the largest floods on record occurred from 1973 to 2012.
ERIC Educational Resources Information Center
Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa
2011-01-01
This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods
ERIC Educational Resources Information Center
Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.
2012-01-01
Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…
ERIC Educational Resources Information Center
Karelitz, Tzur M.; Budescu, David V.
2004-01-01
When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…
ERIC Educational Resources Information Center
Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques
2010-01-01
P. R. Killeen's (2005a) probability of replication ("p[subscript rep]") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p[subscript rep]" is now routinely reported in "Psychological Science" and has also begun to appear in…
ERIC Educational Resources Information Center
Satake, Eiki; Amato, Philip P.
2008-01-01
This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
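The conditional-probability derivations described in this abstract ultimately rest on Bayes' rule. As a minimal sketch of that rule (the numbers below are made up for illustration and are not taken from the paper):

```python
def bayes_posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) from the prior P(H) and the likelihoods P(E|H), P(E|not-H)."""
    numerator = prior * p_e_given_h
    evidence = numerator + (1 - prior) * p_e_given_not_h
    return numerator / evidence

# A rare hypothesis (prior 1%) updated by a fairly diagnostic observation:
print(bayes_posterior(0.01, 0.95, 0.05))  # ≈ 0.161
```

Even a strong likelihood ratio leaves the posterior modest when the prior is small, which is the kind of result the truth-table approach makes explicit.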
Tyler, C W; Chen, C C
2000-01-01
Neural implementation of classical High-Threshold Theory reveals fundamental flaws in its applicability to realistic neural systems and to the two-alternative forced-choice (2AFC) paradigm. For 2AFC, Signal Detection Theory provides a basis for accurate analysis of the observer's attentional strategy and effective degree of probability summation over attended neural channels. The resulting theory provides substantially different predictions from those of previous approximation analyses. In additive noise, attentional probability summation depends on the attentional model assumed. (1) For an ideal attentional strategy in additive noise, summation proceeds at a diminishing rate from an initial level of fourth-root summation for the first few channels. The maximum improvement asymptotes to about a factor of 4 by a million channels. (2) For a fixed attention field in additive noise, detection is highly inefficient at first and approximates fourth-root summation through the summation range. (3) In physiologically plausible root-multiplicative noise, on the other hand, attentional probability summation mimics a linear improvement in sensitivity up to about ten channels, approaching a factor of 1000 by a million channels. (4) Some noise sources, such as noise from eye movements, are fully multiplicative and would prevent threshold determination within their range of effectiveness. Such results may require reappraisal of previous interpretations of detection behavior in the 2AFC paradigm.
VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES
G.A. Valentine; F.V. Perry; S. Dartevelle
2005-08-26
Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for ''closure'' of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive ''eruption'' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision
One in a Million: The Orthogonality of Anti-crepuscular Rays and Rainbows.
ERIC Educational Resources Information Center
Bartlett, Albert A.
1996-01-01
Reviews the concepts behind atmospheric optical phenomena such as rainbows and anticrepuscular rays. Describes the experience of observing these two phenomena simultaneously and calculates the probability of that observation. (JRH)
Cheating Probabilities on Multiple Choice Tests
NASA Astrophysics Data System (ADS)
Rizzuto, Gaspard T.; Walters, Fred
1997-10-01
This paper is strictly based on mathematical statistics and, as such, does not depend on prior performance; it assumes the probability of each choice to be identical. In a real-life situation, the probability of two students having identical responses becomes larger the better the students are. However, the mathematical model is developed for all responses, both correct and incorrect, and provides a baseline for evaluation. David Harpp and coworkers (2, 3) at McGill University have evaluated ratios of exact errors in common (EEIC) to errors in common (EIC) and differences (D). In pairings where the ratio EEIC/EIC was greater than 0.75, the pair had unusually high odds against their answer pattern being random. EEIC/D ratios at values >1.0 indicate that pairs of students were seated adjacent to one another and copied from one another. The original papers should be examined for details.
Approaches to Evaluating Probability of Collision Uncertainty
NASA Technical Reports Server (NTRS)
Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
A probability distribution model for rain rate
NASA Technical Reports Server (NTRS)
Kedem, Benjamin; Pavlopoulos, Harry; Guan, Xiaodong; Short, David A.
1994-01-01
A systematic approach is suggested for modeling the probability distribution of rain rate. Rain rate, conditional on rain and averaged over a region, is modeled as a temporally homogeneous diffusion process with appropriate boundary conditions. The approach requires a drift coefficient (the conditional average instantaneous rate of change of rain intensity) as well as a diffusion coefficient (the conditional average magnitude of the rate of growth and decay of rain rate about its drift). Under certain assumptions on the drift and diffusion coefficients compatible with rain rate, a new parametric family, containing the lognormal distribution, is obtained for the continuous part of the stationary limit probability distribution. The family is fitted to tropical rainfall from Darwin and Florida, and it is found that the lognormal distribution provides adequate fits as compared with other members of the family and also with the gamma distribution.
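As a rough illustration of the lognormal fitting the abstract mentions (this is a simple moment-matching sketch on simulated data, not the authors' diffusion-based procedure):

```python
import math
import random

def fit_lognormal(samples):
    """Moment-match a lognormal to positive samples: return the mean and
    standard deviation of the log-values (the mu and sigma parameters)."""
    logs = [math.log(x) for x in samples]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / len(logs)
    return mu, math.sqrt(var)

# Simulated positive "rain rates" drawn from a known lognormal:
rng = random.Random(0)
data = [math.exp(rng.gauss(1.0, 0.5)) for _ in range(20_000)]
mu, sigma = fit_lognormal(data)  # recovers roughly (1.0, 0.5)
```

With enough samples the estimated parameters converge on the generating ones, which is the basic check one would apply before comparing the lognormal against other family members or the gamma distribution.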
Earthquake probabilities: theoretical assessments and reality
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.
2013-12-01
It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated at about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance
Complex analysis methods in noncommutative probability
NASA Astrophysics Data System (ADS)
Teodor Belinschi, Serban
2006-02-01
In this thesis we study convolutions that arise from noncommutative probability theory. We prove several regularity results for free convolutions, and for measures in partially defined one-parameter free convolution semigroups. We discuss connections between Boolean and free convolutions and, in the last chapter, we prove that any infinitely divisible probability measure with respect to monotonic additive or multiplicative convolution belongs to a one-parameter semigroup with respect to the corresponding convolution. Earlier versions of some of the results in this thesis have already been published, while some others have been submitted for publication. We have preserved almost entirely the specific format for PhD theses required by Indiana University. This adds several unnecessary pages to the document, but we wanted to preserve the specificity of the document as a PhD thesis at Indiana University.
A quantum probability perspective on borderline vagueness.
Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter
2013-10-01
The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental finding and extends Alxatib and Pelletier's theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
Approximate probability distributions of the master equation.
Thomas, Philipp; Grima, Ramon
2015-07-01
Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.
Transit probabilities for debris around white dwarfs
NASA Astrophysics Data System (ADS)
Lewis, John Arban; Johnson, John A.
2017-01-01
The discovery of WD 1145+017 (Vanderburg et al. 2015), a metal-polluted white dwarf with an infrared excess and transits, confirmed the long-held theory that at least some metal-polluted white dwarfs are actively accreting material from crushed-up planetesimals. A statistical understanding of WD 1145-like systems would inform us on the various pathways for metal pollution and the end states of planetary systems around medium- to high-mass stars. However, we have only one example, and there are presently no published studies of transit detection/discovery probabilities for white dwarfs within this interesting regime. We present a preliminary look at the transit probabilities for metal-polluted white dwarfs and their projected space density in the Solar Neighborhood, which will inform future searches for analogs to WD 1145+017.
Volcano shapes, entropies, and eruption probabilities
NASA Astrophysics Data System (ADS)
Gudmundsson, Agust; Mohajeri, Nahid
2014-05-01
We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to the time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
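The Gibbs-Shannon entropy invoked in this abstract is straightforward to evaluate for a binned (histogram) edifice profile. A minimal sketch, with illustrative numbers rather than real volcano data:

```python
import math

def shannon_entropy(probs):
    """Gibbs-Shannon entropy S = -sum(p_i * ln p_i) of a discrete distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution (a "flat volcanic field") has maximal entropy,
# i.e. the greatest uncertainty about the next eruption; a sharply
# peaked edifice profile has lower entropy.
flat = [0.25, 0.25, 0.25, 0.25]
peaked = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(flat))    # ln 4 ≈ 1.386
print(shannon_entropy(peaked))  # ≈ 0.940
```

The contrast between the flat and peaked cases is exactly the uniform-distribution-has-largest-entropy point made in the abstract.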
Probability of identity by descent in metapopulations.
Kaj, I; Lascoux, M
1999-01-01
Equilibrium probabilities of identity by descent (IBD), for pairs of genes within individuals, for genes between individuals within subpopulations, and for genes between subpopulations are calculated in metapopulation models with fixed or varying colony sizes. A continuous-time analog to the Moran model was used in either case. For fixed-colony size both propagule and migrant pool models were considered. The varying population size model is based on a birth-death-immigration (BDI) process, to which migration between colonies is added. Wright's F statistics are calculated and compared to previous results. Adding between-island migration to the BDI model can have an important effect on the equilibrium probabilities of IBD and on Wright's index. PMID:10388835
Conflict Probability Estimation for Free Flight
NASA Technical Reports Server (NTRS)
Paielli, Russell A.; Erzberger, Heinz
1996-01-01
The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
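The kind of Monte Carlo validation described above can be sketched directly: sample the Gaussian relative-position error and count how often the pair comes closer than a separation threshold. This sketch assumes independent per-axis errors (a diagonal combined covariance) and stands in for, rather than reproduces, the paper's analytical solution:

```python
import math
import random

def conflict_probability(rel_mean, rel_var, threshold, n=100_000, seed=1):
    """Monte Carlo estimate of P(miss distance < threshold) for the relative
    position of an aircraft pair, assuming independent Gaussian errors per
    axis (a simplification of the paper's combined-covariance treatment)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        dx = rng.gauss(rel_mean[0], math.sqrt(rel_var[0]))
        dy = rng.gauss(rel_mean[1], math.sqrt(rel_var[1]))
        if math.hypot(dx, dy) < threshold:
            hits += 1
    return hits / n

# A pair predicted to pass 5 units apart, unit variance per axis,
# against a 3-unit separation threshold:
p = conflict_probability((5.0, 0.0), (1.0, 1.0), 3.0)
```

An analytical result like the paper's is preferred operationally; a sampler of this form is how such a result would be cross-checked.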
Computing association probabilities using parallel Boltzmann machines.
Iltis, R A; Ting, P Y
1993-01-01
A new computational method is presented for solving the data association problem using parallel Boltzmann machines. It is shown that the association probabilities can be computed with arbitrarily small errors if a sufficient number of parallel Boltzmann machines are available. The probability beta(i)(j) that the ith measurement emanated from the jth target can be obtained simply by observing the relative frequency with which neuron v(i,j) in a two-dimensional network is on throughout the layers. Some simple tracking examples comparing the performance of the Boltzmann algorithm with the exact data association solution and with the performance of an alternative parallel method using the Hopfield neural network are also presented.
Nuclear data uncertainties: I, Basic concepts of probability
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
The Origin of Probability and Entropy
NASA Astrophysics Data System (ADS)
Knuth, Kevin H.
2008-11-01
Measuring is the quantification of ordering. Thus the process of ordering elements of a set is a more fundamental activity than measuring. Order theory, also known as lattice theory, provides a firm foundation on which to build measure theory. The result is a set of new insights that cast probability theory and information theory in a new light, while simultaneously opening the door to a better understanding of measures as a whole.
Calculating Cumulative Binomial-Distribution Probabilities
NASA Technical Reports Server (NTRS)
Scheuer, Ernest M.; Bowerman, Paul N.
1989-01-01
Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
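CUMBIN itself is written in C; the cumulative binomial quantity it computes for k-out-of-n reliability analysis can be sketched in Python (this is an illustrative stand-in, not the NASA code):

```python
from math import comb

def cumulative_binomial(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the probability that at least k of
    n independent components succeed, i.e. k-out-of-n system reliability."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Reliability of a 2-out-of-3 system whose components each work with p = 0.9:
print(cumulative_binomial(3, 2, 0.9))  # ≈ 0.972
```

Direct summation is adequate for small n; for large n a production code would use an incomplete-beta or recurrence formulation to avoid overflow and loss of precision.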
Sampling probability distributions of lesions in mammograms
NASA Astrophysics Data System (ADS)
Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.
2015-03-01
One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four-dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four-dimensional probability distribution function was validated by comparing it to the two-dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four-dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web service on a server using the Python Django framework. The server performs the sampling, performs the mapping, and returns the results in JavaScript Object Notation (JSON) format.
SureTrak Probability of Impact Display
NASA Technical Reports Server (NTRS)
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
Non-signalling Theories and Generalized Probability
NASA Astrophysics Data System (ADS)
Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek
2016-09-01
We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, the framework allows a straightforward generalization to more complicated systems.
Probability and Statistics in Aerospace Engineering
NASA Technical Reports Server (NTRS)
Rheinfurth, M. H.; Howell, L. W.
1998-01-01
This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.
Extreme ecosystem instability suppressed tropical dinosaur dominance for 30 million years.
Whiteside, Jessica H; Lindström, Sofie; Irmis, Randall B; Glasspool, Ian J; Schaller, Morgan F; Dunlavey, Maria; Nesbitt, Sterling J; Smith, Nathan D; Turner, Alan H
2015-06-30
A major unresolved aspect of the rise of dinosaurs is why early dinosaurs and their relatives were rare and species-poor at low paleolatitudes throughout the Late Triassic Period, a pattern persisting 30 million years after their origin and 10-15 million years after they became abundant and speciose at higher latitudes. New palynological, wildfire, organic carbon isotope, and atmospheric pCO2 data from early dinosaur-bearing strata of low paleolatitudes in western North America show that large, high-frequency, tightly correlated variations in δ(13)Corg and palynomorph ecotypes occurred within a context of elevated and increasing pCO2 and pervasive wildfires. Whereas pseudosuchian archosaur-dominated communities were able to persist in these same regions under rapidly fluctuating extreme climatic conditions until the end-Triassic, large-bodied, fast-growing tachymetabolic dinosaurian herbivores requiring greater resources were unable to adapt to unstable high CO2 environmental conditions of the Late Triassic.
Millions can be saved through better energy management in federal hospitals
Not Available
1982-09-01
A comparison of the energy savings achieved by hospitals of the Navy, Veterans Administration, and Indian Health Service - all of which have energy conservation programs - with that of five non-federal hospitals having aggressive energy management programs indicated that these agencies could save between $16 million and $55 million more each year if additional energy-saving measures were adopted. The investment required to achieve these savings would be quickly recouped. GAO believes that additional energy-saving opportunities also exist at the Army and Air Force hospitals. Two important program elements - technical audits to identify cost-effective energy conservation measures and accountability to ensure that the measures are implemented - are generally missing or incomplete in federal hospitals' energy conservation efforts. By increasing emphasis on these elements, federal agencies could achieve many of the yet unrealized energy savings.
Latitudinal species diversity gradient of marine zooplankton for the last three million years
Yasuhara, Moriaki; Hunt, Gene; Dowsett, Harry J.; Robinson, Marci M.; Stoll, Danielle K.
2012-01-01
High tropical and low polar biodiversity is one of the most fundamental patterns characterising marine ecosystems, and the influence of temperature on such marine latitudinal diversity gradients is increasingly well documented. However, the temporal stability of quantitative relationships among diversity, latitude and temperature is largely unknown. Herein we document marine zooplankton species diversity patterns at four time slices [modern, Last Glacial Maximum (18 000 years ago), last interglacial (120 000 years ago), and Pliocene (~3.3–3.0 million years ago)] and show that, although the diversity-latitude relationship has been dynamic, diversity-temperature relationships are remarkably constant over the past three million years. These results suggest that species diversity is rapidly reorganised as species' ranges respond to temperature change on ecological time scales, and that the ecological impact of future human-induced temperature change may be partly predictable from fossil and paleoclimatological records.
Experience of active tuberculosis case finding in nearly 5 million households in India.
Prasad, B M; Satyanarayana, S; Chadha, S S; Das, A; Thapa, B; Mohanty, S; Pandurangan, S; Babu, E R; Tonsing, J; Sachdeva, K S
2016-03-21
In India, to increase tuberculosis (TB) case detection under the National Tuberculosis Programme, active case finding (ACF) was implemented by the Global Fund-supported Project Axshya, among high-risk groups in 300 districts. Between April 2013 and December 2014, 4.9 million households covering ~20 million people were visited. Of 350 047 presumptive pulmonary TB cases (cough of ⩾2 weeks) identified, 187 586 (54%) underwent sputum smear examination and 14 447 (8%) were found to be smear-positive. ACF resulted in the detection of a large number of persons with presumptive pulmonary TB and smear-positive TB. Ensuring sputum examination of all those with presumptive TB was a major challenge.
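The case-finding cascade above is plain arithmetic; a short check (figures copied from the abstract) reproduces the quoted percentages:

```python
# Care-cascade proportions from the Project Axshya ACF abstract.
presumptive = 350_047      # presumptive pulmonary TB (cough of >= 2 weeks)
examined = 187_586         # underwent sputum smear examination
smear_positive = 14_447    # found smear-positive

examined_pct = round(100 * examined / presumptive)     # -> 54
positive_pct = round(100 * smear_positive / examined)  # -> 8
```

Rounded to whole percentages, these match the 54% and 8% reported.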
Boice, John D.
2015-02-27
A pilot study was completed demonstrating the feasibility of conducting an epidemiologic study assessing cancer and other disease mortality among nearly one million US veterans and workers exposed to ionizing radiation, a population 10 times larger than the atomic bomb survivor study, with high statistical power to evaluate low dose rate effects. Among the groups enumerated and/or studied were: (1) 194,000 Department of Energy Uranium Workers; (2) 6,700 Rocketdyne Radiation Workers; (3) 7,000 Mound Radiation Workers; (4) 156,000 DOE Plutonium Workers; (5) 212,000 Nuclear Power Plant Workers; (6) 130,000 Industrial Radiography Workers; (7) 1.7 million Medical Workers; and (8) 135,000 Atomic Veterans.
Exceptionally preserved 450-million-year-old ordovician ostracods with brood care.
Siveter, David J; Tanaka, Gengo; Farrell, Una C; Martin, Markus J; Siveter, Derek J; Briggs, Derek E G
2014-03-31
Ostracod crustaceans are the most abundant fossil arthropods and are characterized by a long stratigraphic range. However, their soft parts are very rarely preserved, and the presence of ostracods in rocks older than the Silurian period [1-5] was hitherto based on the occurrence of their supposed shells. Pyritized ostracods that preserve limbs and in situ embryos, including an egg within an ovary and possible hatched individuals, are here described from rocks of the Upper Ordovician Katian Stage Lorraine Group of New York State, including examples from the famous Beecher's Trilobite Bed [6, 7]. This discovery extends our knowledge of the paleobiology of ostracods by some 25 million years and provides the first unequivocal demonstration of ostracods in the Ordovician period, including the oldest known myodocope, Luprisca incuba gen. et sp. nov. It also provides conclusive evidence of a developmental brood-care strategy conserved within Ostracoda for at least 450 million years.
Galileo view of Moon orbiting the Earth taken from 3.9 million miles
NASA Technical Reports Server (NTRS)
1992-01-01
Eight days after its encounter with the Earth, the Galileo spacecraft was able to look back and capture this remarkable view of the Moon in orbit about the Earth, taken from a distance of about 6.2 million kilometers (3.9 million miles). The picture was constructed from images taken through the violet, red, and 1.0-micron infrared filters. The Moon is in the foreground, moving from left to right. The brightly-colored Earth contrasts strongly with the Moon, which reflects only about one-third as much sunlight as the Earth. Contrast and color have been computer-enhanced for both objects to improve visibility. Antarctica is visible through clouds (bottom). The Moon's far side is seen; the shadowy indentation in the dawn terminator is the South Pole-Aitken Basin, one of the largest and oldest lunar impact features. Alternate Jet Propulsion Laboratory (JPL) number is P-41508.
Merolla, Paul A; Arthur, John V; Alvarez-Icaza, Rodrigo; Cassidy, Andrew S; Sawada, Jun; Akopyan, Filipp; Jackson, Bryan L; Imam, Nabil; Guo, Chen; Nakamura, Yutaka; Brezzo, Bernard; Vo, Ivan; Esser, Steven K; Appuswamy, Rathinakumar; Taba, Brian; Amir, Arnon; Flickner, Myron D; Risk, William P; Manohar, Rajit; Modha, Dharmendra S
2014-08-08
Inspired by the brain's structure, we have developed an efficient, scalable, and flexible non-von Neumann architecture that leverages contemporary silicon technology. To demonstrate, we built a 5.4-billion-transistor chip with 4096 neurosynaptic cores interconnected via an intrachip network that integrates 1 million programmable spiking neurons and 256 million configurable synapses. Chips can be tiled in two dimensions via an interchip communication interface, seamlessly scaling the architecture to a cortexlike sheet of arbitrary size. The architecture is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification. With 400-pixel-by-240-pixel video input at 30 frames per second, the chip consumes 63 milliwatts.
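The abstract does not define the chip's neuron model; as an illustrative sketch of what a programmable spiking neuron computes, here is a minimal leaky integrate-and-fire unit (parameter values are arbitrary assumptions, not TrueNorth's):

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch -- an illustration of
# the class of spiking-neuron model a neurosynaptic core might implement;
# leak and threshold values here are arbitrary, not the chip's specification.
def lif_run(input_current, leak=0.9, threshold=1.0):
    """Return a spike train (0/1 per step) for a sequence of input currents."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leaky integration of synaptic input
        if v >= threshold:        # fire and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes
```

A constant sub-threshold input produces a spike only once enough charge has accumulated, e.g. `lif_run([0.5] * 5)` fires on the third step.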
A Quantum Probability Model of Causal Reasoning
Trueblood, Jennifer S.; Busemeyer, Jerome R.
2012-01-01
People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model, based on the axiomatic principles of quantum probability theory, that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment. PMID:22593747
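The model's key ingredient, incompatibility, can be illustrated with two non-commuting projectors: judging two questions in different orders yields different probabilities. The vectors below are toy values, not the paper's fitted parameters:

```python
import numpy as np

# Order effects from incompatible (non-commuting) projectors -- an
# illustrative sketch of the quantum-probability mechanism, with invented
# state and question bases rather than the paper's fitted model.
psi = np.array([1.0, 0.0])                   # initial belief state
Pa = np.array([[1.0, 0.0], [0.0, 0.0]])      # "yes" to question A (standard basis)
v = np.array([1.0, 1.0]) / np.sqrt(2)
Pb = np.outer(v, v)                          # "yes" to question B (rotated basis)

def seq_prob(state, first, second):
    """Probability of answering 'yes' to both questions, asked in this order."""
    return np.linalg.norm(second @ first @ state) ** 2

p_ab = seq_prob(psi, Pa, Pb)   # A asked first, then B
p_ba = seq_prob(psi, Pb, Pa)   # B asked first, then A
```

Because Pa and Pb do not commute, `p_ab` and `p_ba` differ (0.5 versus 0.25 here), which is exactly the kind of order effect the model exploits.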
Theoretical Analysis of Rain Attenuation Probability
NASA Astrophysics Data System (ADS)
Roy, Surendra Kr.; Jha, Santosh Kr.; Jha, Lallan
2007-07-01
Satellite communication technologies are now highly developed, and high quality, distance-independent services have expanded over a very wide area. The system design of the Hokkaido integrated telecommunications (HIT) network must first overcome outages of satellite links due to rain attenuation in the Ka frequency bands. In this paper a theoretical analysis of rain attenuation probability on a slant path has been made. The formula proposed is based on the Weibull distribution and incorporates recent ITU-R recommendations concerning the necessary rain rate and rain height inputs. The error behaviour of the model was tested against the rain attenuation prediction model recommended by ITU-R for a large number of experiments at different probability levels. The novel slant-path rain attenuation prediction model exhibits behaviour similar to the ITU-R model at low time percentages and a better root-mean-square error performance for probability levels above 0.02%. The set of presented models has the advantage of low implementation complexity and is considered useful for educational and back-of-the-envelope computations.
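The Weibull form makes the exceedance probability of a given attenuation a one-liner; the scale and shape values below are illustrative stand-ins, not the paper's fitted ITU-R-based parameters:

```python
import math

# Weibull-type exceedance model for slant-path rain attenuation:
# P(A > a) = exp(-(a / scale)^shape).  Scale (dB) and shape values are
# assumptions for illustration, not fitted link-budget parameters.
def exceedance_probability(a_dB, scale=3.0, shape=0.7):
    """Probability that attenuation A exceeds a_dB (a_dB >= 0)."""
    return math.exp(-(a_dB / scale) ** shape)
```

At `a_dB = 0` the probability is 1 by construction, and it decays monotonically with attenuation, which is the qualitative behaviour any such outage model must have.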
The Probability Distribution of Daily Streamflow
NASA Astrophysics Data System (ADS)
Blum, A.; Vogel, R. M.
2015-12-01
Flow duration curves (FDCs) are a graphical illustration of the cumulative distribution of streamflow. Daily streamflows often range over many orders of magnitude, making it extremely challenging to find a probability distribution function (pdf) which can mimic the steady state or period-of-record FDC (POR-FDC). Median annual FDCs (MA-FDCs) describe the pdf of daily streamflow in a typical year. For POR- and MA-FDCs, L-moment diagrams, visual assessments of FDCs and quantile-quantile probability plot correlation coefficients are used to evaluate goodness of fit (GOF) of candidate probability distributions. FDCs reveal that both four-parameter kappa (KAP) and three-parameter generalized Pareto (GP3) models result in very high GOF for the MA-FDC and a relatively lower GOF for POR-FDCs at over 500 rivers across the coterminous U.S. Physical basin characteristics, such as the baseflow index, as well as hydroclimatic indices such as the aridity index and the runoff ratio, are found to be correlated with one of the shape parameters (kappa) of the KAP and GP3 pdfs. Our work also reveals several important areas for future research, including improved parameter estimators for the KAP pdf, as well as increasing our understanding of the conditions which give rise to improved GOF of analytical pdfs to large samples of daily streamflows.
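An FDC built from a fitted distribution is just its quantile function evaluated across probabilities; as a sketch, here is the quantile function of the three-parameter generalized Pareto (GP3), one of the candidate pdfs named above, with purely illustrative parameter values:

```python
# Quantile function of the three-parameter generalized Pareto (GP3)
# distribution: x(F) = location + (scale / kappa) * (1 - (1 - F)^kappa),
# for kappa != 0.  Parameter values in the demo are illustrative only.
def gp3_quantile(F, location, scale, kappa):
    """Streamflow with non-exceedance probability F (0 < F < 1)."""
    return location + (scale / kappa) * (1.0 - (1.0 - F) ** kappa)

# A flow duration curve plots these quantiles against the exceedance
# probability 1 - F over the full range of F.
fdc = [gp3_quantile(F, location=0.0, scale=1.0, kappa=0.5)
       for F in (0.1, 0.5, 0.9)]
```

The quantiles rise monotonically with F, so plotting them against 1 - F gives the familiar downward-sloping FDC.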
Probability of metastable states in Yukawa clusters
NASA Astrophysics Data System (ADS)
Ludwig, Patrick; Kaehlert, Hanno; Baumgartner, Henning; Bonitz, Michael
2008-11-01
Finite strongly coupled systems of charged particles in external traps are of high interest in many fields. Here we analyze the occurrence probabilities of ground- and metastable states of spherical, three-dimensional Yukawa clusters by means of molecular dynamics and Monte Carlo simulations and an analytical method. We find that metastable states can occur with a higher probability than the ground state, thus confirming recent dusty plasma experiments with so-called Yukawa balls [1]. The analytical method [2], based on the harmonic approximation of the potential energy, allows for a very intuitive explanation of the probabilities when combined with the simulation results [3].
[1] D. Block, S. Käding, A. Melzer, A. Piel, H. Baumgartner, and M. Bonitz, Physics of Plasmas 15, 040701 (2008)
[2] F. Baletto and R. Ferrando, Reviews of Modern Physics 77, 371 (2005)
[3] H. Kählert, P. Ludwig, H. Baumgartner, M. Bonitz, D. Block, S. Käding, A. Melzer, and A. Piel, submitted for publication (2008)
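In the harmonic approximation, each state's occurrence probability is a Boltzmann factor penalized by the product of its normal-mode frequencies, so a higher-energy state with softer modes can dominate. The energies and frequencies below are invented for illustration:

```python
import math

# Harmonic-approximation estimate of occurrence probabilities of competing
# cluster states: weight = exp(-E / kT) / product(normal-mode frequencies).
# Energies and frequencies are illustrative, not simulation results.
def occurrence_probabilities(states, kT):
    """states: list of (energy, [normal-mode frequencies]) tuples."""
    weights = [math.exp(-E / kT) / math.prod(freqs) for E, freqs in states]
    total = sum(weights)
    return [w / total for w in weights]

# A metastable state (higher energy but softer modes) can be more probable
# than the ground state:
probs = occurrence_probabilities(
    [(0.0, [2.0, 2.0, 2.0]),    # ground state, stiff modes
     (0.5, [1.0, 1.0, 1.0])],   # metastable state, soft modes
    kT=1.0)
```

With these numbers the metastable state carries most of the probability, which is the qualitative effect the abstract reports.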
Atomic Transition Probabilities for Rare Earths
NASA Astrophysics Data System (ADS)
Curry, J. J.; Anderson, Heidi M.; den Hartog, E. A.; Wickliffe, M. E.; Lawler, J. E.
1996-10-01
Accurate absolute atomic transition probabilities for selected neutral and singly ionized rare earth elements including Tm, Dy, and Ho are being measured. The increasing use of rare earths in high intensity discharge lamps provides motivation; the data are needed for diagnosing and modeling the lamps. Radiative lifetimes, measured using time resolved laser induced fluorescence (LIF), are combined with branching fractions, measured using a large Fourier transform spectrometer (FTS), to determine accurate absolute atomic transition probabilities. More than 15,000 LIF decay curves from Tm and Dy atoms and ions in slow beams have been recorded and analyzed. Radiative lifetimes for 298 levels of TmI and TmII and for 450 levels of DyI and DyII are determined. Branching fractions are extracted from spectra recorded using the 1.0 m FTS at the National Solar Observatory. Branching fractions and absolute transition probabilities for 500 of the strongest TmI and TmII lines are complete. Representative lifetime and branching fraction data will be presented and discussed. Supported by Osram Sylvania Inc. and the NSF.
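The combination step described above is simple: each line's absolute transition probability is its branching fraction divided by the upper level's radiative lifetime. The lifetime and branching fractions below are illustrative, not measured values:

```python
# How a radiative lifetime and emission branching fractions combine into
# absolute transition probabilities: A_i = BF_i / tau for each line
# depopulating a common upper level.  Numbers are illustrative only.
def transition_probabilities(lifetime_s, branching_fractions):
    assert abs(sum(branching_fractions) - 1.0) < 1e-6, "BFs must sum to 1"
    return [bf / lifetime_s for bf in branching_fractions]

# e.g. a 300 ns upper level decaying through three observed lines:
A_values = transition_probabilities(300e-9, [0.6, 0.3, 0.1])  # in s^-1
```

By construction the A-values sum to the total decay rate 1/tau, which is a useful consistency check on any such data set.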
Bacteria survival probability in bactericidal filter paper.
Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M
2014-05-01
Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action, which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or by the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased the bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collisions between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must undergo a certain number of collisions to take up enough biocide to be killed, and cells that do not reach that threshold number of collisions are expected to survive.
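The collision-threshold hypothesis has a natural counting-statistics sketch: if collisions accumulate in proportion to filter thickness, a bacterium survives when its collision count falls below the lethal threshold. Poisson statistics and all parameter values here are assumptions for illustration, not fitted to the paper's data:

```python
import math

# Collision-threshold survival sketch: a bacterium survives if it undergoes
# fewer than `threshold` collisions with biocide-loaded micelles, with the
# collision count modeled as Poisson with mean proportional to the number
# of filter layers.  All parameters are illustrative assumptions.
def survival_probability(layers, collisions_per_layer, threshold):
    lam = layers * collisions_per_layer
    # P(N < threshold) for N ~ Poisson(lam)
    return sum(math.exp(-lam) * lam ** k / math.factorial(k)
               for k in range(threshold))

p1 = survival_probability(1, 2.0, 5)   # single-layer filter
p3 = survival_probability(3, 2.0, 5)   # thicker filter -> fewer survivors
```

The model reproduces the paper's qualitative observation: adding layers raises the expected collision count and drives survival probability down.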
Determination of thorium in the parts per million range in rocks
Levine, H.; Grimaldi, F.S.
1958-01-01
A procedure is presented for the determination of thorium in the concentration range of 0.2 to 10 parts per million ThO2 in felsic or mafic rocks. Thorium is extracted by mesityl oxide and purified by iodate precipitation from nitric acid medium containing tartaric acid and hydrogen peroxide. The thorium is determined spectrophotometrically with thoron from meso-tartaric acid medium.
Wear of sequentially enhanced 9-Mrad polyethylene in 10 million cycle knee simulation study.
Tsukamoto, Riichiro; Williams, Paul Allen; Shoji, Hiromu; Hirakawa, Kazuo; Yamamoto, Kengo; Tsukamoto, Mikiko; Clarke, Ian C
2008-07-01
Highly crosslinked polyethylene (HXPE) has been shown to be effective in reducing wear in total hip replacements. HXPE has not found widespread use in TKR because crosslinking inevitably reduces critical properties such as toughness and fatigue strength. Sequentially enhanced crosslinking (SXPE) has been suggested to improve wear resistance of tibial inserts while maintaining mechanical properties, with anticipated oxidation resistance superior to conventional crosslinked polyethylene (XLPE). We compared the wear of SXPE (9 Mrad) and XLPE (3 Mrad) inserts to 10 million cycles. Triathlon femoral condyles were identical in both. This is the first wear study of SXPE inserts. According to the power law relating irradiation dose to wear of XLPE inserts, the wear of 9 Mrad inserts should be reduced by 70% compared with 3 Mrad controls. The wear rates of the SXPE inserts were reduced by 86% at 10 million cycles, somewhat more than predicted. The one prior investigation, by the manufacturer, reported a 79% wear reduction for SXPE compared with controls in a 5 million cycle knee simulator study. There were important differences between the two studies. Nevertheless, there clearly appeared to be a major benefit from sequentially enhanced polyethylene in tibial inserts. This combined wear reduction of 80-85%, with improved oxidation resistance and retention of mechanical properties, may prove beneficial for active patients who might otherwise risk high wear rates over many years of use.
Lunar radionuclide records of average solar-cosmic-ray fluxes over the last ten million years
NASA Technical Reports Server (NTRS)
Reedy, R. C.
1980-01-01
The use of cosmogenic radionuclides in lunar materials as indicators of solar cosmic ray fluxes and thus solar activity over the past 10 million years is discussed. The nature of solar and galactic cosmic ray particles and their interactions with matter are reviewed, with particular emphasis on nuclide production by cosmic-ray-induced nuclear reactions. Evidence of galactic cosmic ray flux variations from measurements of radionuclide activities in meteorites is considered which has indicated changes of less than about 25-50% over the last few million years. Measurements of radionuclide activities in lunar materials which are used to determine solar cosmic ray fluxes are then examined together with direct proton measurements indicating variations in solar fluxes with different solar cycles. It is noted that whereas average solar proton fluxes determined for the last 1-10 million years from Al-26 and Mn-53 data show little variation and are similar to recent values, lunar C-14 and Kr-81 activities indicate average solar proton fluxes several times greater over the past 10,000 to 100,000 years.
Shell/Esso will install $700-million underwater manifold center in North Sea
Not Available
1982-02-01
Sometime in the summer of 1982, engineers and technicians for Shell and Esso will make the first commercial installation of a sophisticated Underwater Manifold Center (UMC), in 490 ft of water in the Cormorant field of the UK North Sea sector. The subsea production system has been under development since 1974. If its promise is fulfilled, it could mean millions of additional barrels of oil can be produced commercially from small reservoirs beyond the reach of existing platforms and from marginal fields lying in deep water, sometimes in combination with floating production systems. The UMC will be capable of working in several thousand feet of water safely without diver intervention and with minimum maintenance. The underwater test project is scheduled to begin production in 1983 and continue its tests until 1986. Total cost is estimated at $700 million. During its anticipated 25-year lifetime, the UMC is expected to recover about 110 million bbl from the central Cormorant area - 20% of the field's total anticipated production.
The probability and severity of decompression sickness
Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.
2017-01-01
Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p << 0.01) improvement in trinomial model fit over the binomial (2-state) model. With the Type I/II definition, we found that the predicted probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928
The probability and severity of decompression sickness.
Howle, Laurens E; Weber, Paul W; Hada, Ethan A; Vann, Richard D; Denoble, Petar J
2017-01-01
Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild-Type I (manifestations 4-6)-and serious-Type II (manifestations 1-3). Additionally, we considered an alternative grouping of mild-Type A (manifestations 3-6)-and serious-Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p < 0.01) improvement in trinomial model fit over the binomial (2-state) model. With the Type I/II definition, we found that the predicted probability of 'mild' DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed.
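The trinomial structure described above can be sketched as a three-state likelihood in which each dive's probabilities of no DCS, mild DCS and serious DCS sum to one. The probabilities used here are stand-ins, not the fitted model's dive-exposure predictions:

```python
import math

# Skeleton of a trinomial (3-state) DCS likelihood: every dive outcome is
# scored none/mild/serious and contributes the log of its per-dive
# probability.  The probabilities below are illustrative stand-ins, not the
# hierarchical model's predictions for a given dive exposure.
def trinomial_log_likelihood(outcomes, p_mild, p_serious):
    p = {"none": 1.0 - p_mild - p_serious,
         "mild": p_mild,
         "serious": p_serious}
    return sum(math.log(p[o]) for o in outcomes)

ll = trinomial_log_likelihood(["none", "none", "mild", "serious"],
                              p_mild=0.02, p_serious=0.001)
```

Maximizing this quantity over the model's parameters, dive by dive, is what lets mild and serious DCS probabilities be estimated simultaneously rather than lumped into a single binomial outcome.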
The Karskiy craters are the probable records of catastrophe at the Cretaceous-Tertiary boundary
NASA Technical Reports Server (NTRS)
Kolesnikov, E. M.; Nazarov, M. A.; Badjukov, D. D.; Shukolyukov, Yu. A.
1988-01-01
In order to corroborate the hypothesis of Alvarez and others about the connection between mass mortality and a meteorite or cometary impact at the Cretaceous-Tertiary boundary, it is necessary to find a meteorite crater formed at the same time. Masaitis suggested that the Karskiy craters (USSR) are suitable candidates, but previous K/Ar dates from other laboratories differ widely (from 47 to 82 million years). Impact glasses were gathered from the Karskiy and Ust-Karskiy craters and K/Ar age analyses were performed. The glasses cooled very rapidly and gave the youngest model ages, from 65.8 to 67.6 million years. The slower-cooling crypto-crystalline aggregates had older model ages, from 70.5 to 73.9 million years, as did tagamite, because they captured excess argon during crystallization. Least squares analysis showed that, with a probability of 99 percent, the data from crypto-crystalline aggregates, tagamite and quartz glasses from the Karskiy and Ust-Karskiy craters lie on an isochron with an age of 65.8 ± 1.1 million years and a nonzero content of excess argon. For two glasses of identical composition but different quantities of secondary non-potassium minerals, an independent method determined the content of excess argon. Taking these data into account, a more exact slope of the first isochron, 66.4 ± 1.0 million years, was obtained, and a second glass isochron with an age of 66.5 ± 1.1 million years was constructed.
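The isochron argument rests on ordinary least squares: radiogenic argon plotted against potassium content falls on a line whose slope fixes the age and whose intercept measures the excess (inherited) argon. A minimal sketch with invented data points:

```python
# Isochron sketch: fit a line to (K content, Ar content) pairs; the slope
# converts to an age and the intercept is the excess argon.  Data points
# and units here are invented for illustration, not the Karskiy data.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, excess_ar = fit_line([1.0, 2.0, 3.0], [1.5, 2.5, 3.5])
```

A nonzero intercept is precisely the "content of excess argon" the abstract refers to; samples that captured inherited argon shift the line upward without changing its slope.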
CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS
Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
Probability sampling in legal cases: Kansas cellphone users
NASA Astrophysics Data System (ADS)
Kadane, Joseph B.
2012-10-01
Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.
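The core idea of probability sampling is that every unit's selection probability is known in advance, so sampled values can be weighted up to unbiased population estimates. A minimal Horvitz-Thompson sketch with invented numbers:

```python
# Horvitz-Thompson estimator sketch: divide each sampled value by its known
# inclusion probability and sum, giving an unbiased estimate of the
# population total.  The values and probabilities are invented for
# illustration, not taken from the Kansas case.
def horvitz_thompson_total(sampled_values, inclusion_probs):
    return sum(y / p for y, p in zip(sampled_values, inclusion_probs))

# A unit sampled with probability 0.1 "stands in" for 10 units, one sampled
# with probability 0.5 for 2 units:
est = horvitz_thompson_total([10.0, 40.0], [0.1, 0.5])   # 100 + 80
```

This weighting is what distinguishes probability sampling from convenience sampling: the known inclusion probabilities make the estimator's properties defensible, which is exactly what matters in a legal setting.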
Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation
NASA Astrophysics Data System (ADS)
Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin
2016-12-01
If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km2 and 15,280 km2) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10-7 and 10-6, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
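The SST method's use of the total probability theorem can be sketched directly: the exceedance probability is a sum over storms of each storm's arrival probability times the conditional probability that a transposition of it drives catchment rainfall above the threshold. The storm list is invented for illustration:

```python
# Total-probability skeleton of stochastic storm transposition (SST): sum,
# over candidate storms, of P(storm arrives) * P(transposed storm exceeds
# the rainfall threshold over the catchment).  The storm list is invented
# for illustration, not derived from the study catchments.
def rainfall_aep(storms):
    """storms: list of (p_arrival, p_exceed_given_transposition) pairs."""
    return sum(p_arr * p_exc for p_arr, p_exc in storms)

aep = rainfall_aep([(0.01, 0.2), (0.005, 0.5), (0.001, 1.0)])
```

Each term partitions the exceedance event by which storm produced it, which is why the total probability theorem applies; refining either factor (better arrival frequencies, better transposition modeling) tightens the AEP estimate.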
Elemental mercury poisoning probably causes cortical myoclonus.
Ragothaman, Mona; Kulkarni, Girish; Ashraf, Valappil V; Pal, Pramod K; Chickabasavaiah, Yasha; Shankar, Susarla K; Govindappa, Srikanth S; Satishchandra, Parthasarthy; Muthane, Uday B
2007-10-15
Mercury toxicity causes postural tremors, commonly referred to as "mercurial tremors," and cerebellar dysfunction. A 23-year-old woman developed disabling generalized myoclonus and ataxia 2 years after injecting herself with elemental mercury. Electrophysiological studies confirmed that the myoclonus was probably of cortical origin. Her deficits progressed over 2 years and improved after subcutaneous mercury deposits at the injection site were surgically cleared. Myoclonus of cortical origin has never been described in mercury poisoning. It is important to ask patients presenting with jerks about exposure to elemental mercury, even in a progressive illness, as the condition is potentially reversible, as in our patient.
The Prediction of Spatial Aftershock Probabilities (PRESAP)
NASA Astrophysics Data System (ADS)
McCloskey, J.
2003-12-01
It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the great number of local, telemeter seismic networks, the rapid acquisition of data from satellites coupled with the speed of modern telecommunications and data transfer all mean that it may be possible that these new techniques could be applied in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real-time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days after the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real-time, as data of better quality become available over the following day to tens of days. Specifically, the project aim is to assess the
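At the heart of the Coulomb stress technique (CST) is the Coulomb failure stress change imparted by the mainshock on surrounding faults; a minimal sketch (illustrative stress values, effective friction coefficient assumed to be 0.4):

```python
# Coulomb failure stress change, the quantity CST maps to estimate where
# aftershock probability rises: dCFS = d(shear stress) + mu_eff * d(normal
# stress), with unclamping (tension) positive.  The stress values and the
# effective friction coefficient are illustrative assumptions.
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    return d_shear + mu_eff * d_normal

promoted = coulomb_stress_change(0.3, 0.5) > 0     # loaded toward failure
relaxed = coulomb_stress_change(-0.2, -0.5) > 0    # stress shadow
```

Regions with positive dCFS are brought closer to failure and show elevated aftershock rates, while negative-dCFS "stress shadows" are relatively quiet; mapping this quantity in near-real-time is what the proposed forward application would automate.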
Probable interaction between trazodone and carbamazepine.
Sánchez-Romero, A; Mayordomo-Aranda, A; García-Delgado, R; Durán-Quintana, J A
2011-06-01
The need to maintain long-term treatment of chronic pathologies makes the appearance of interactions possible when such therapies incorporate other drugs to deal with the aggravation of the same or other intercurrent pathologies. A case is presented in which the addition of trazodone to a chronic treatment with carbamazepine (CBZ) is associated with symptoms typical of intoxication by this antiepileptic, accompanied by a raised serum concentration. When the trazodone was suspended, these symptoms lessened and the concentration of CBZ decreased progressively, suggesting a probable interaction between the two drugs.
Atomic transition probabilities of Gd I
NASA Astrophysics Data System (ADS)
Lawler, J. E.; Bilty, K. A.; Den Hartog, E. A.
2011-05-01
Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd I). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of the first and second spectra lines is also described and demonstrated in this work.
Atomic transition probabilities of Er I
NASA Astrophysics Data System (ADS)
Lawler, J. E.; Wyart, J.-F.; Den Hartog, E. A.
2010-12-01
Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.
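In both of these studies the conversion from a branching fraction to an absolute transition probability is the one-line relation A_ul = BF_ul / τ_u, where τ_u is the radiative lifetime of the upper level. A minimal sketch (the level and numbers below are illustrative, not values from either paper):

```python
def transition_probability(branching_fraction, lifetime_s):
    """Absolute transition probability A_ul (s^-1) of one emission line:
    the line's branching fraction divided by the radiative lifetime
    of its upper level."""
    return branching_fraction / lifetime_s

# Hypothetical upper level: 250 ns lifetime, line carrying 40% of decays.
A_ul = transition_probability(0.40, 250e-9)   # 1.6e6 s^-1
```

Because the branching fractions of all lines from a given upper level sum to one, the transition probabilities from that level sum to the inverse of its lifetime.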
Modulation Based on Probability Density Functions
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
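The observation underlying the method can be sketched directly: sampling a sinusoid over at least half a cycle and binning the samples by frequency of occurrence yields a characteristic U-shaped (arcsine) histogram. The sketch below illustrates only the PDF construction, not the full modulation scheme:

```python
import numpy as np

def waveform_pdf(samples, bins=16):
    """Bin samples by frequency of occurrence into a histogram,
    normalized so it integrates to 1 (an empirical PDF)."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    return hist, edges

# One full cycle of a unit sinusoid, densely sampled.
t = np.linspace(0.0, 2.0 * np.pi, 10_000, endpoint=False)
pdf, edges = waveform_pdf(np.sin(t))
# A sinusoid spends most of its time near its extremes, so the
# empirical PDF is U-shaped: large near +/-1, small near 0.
```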
Probability density functions in turbulent channel flow
NASA Technical Reports Server (NTRS)
Dinavahi, Surya P. G.
1992-01-01
The probability density functions (pdf's) of the fluctuating velocity components, as well as their first and second derivatives, are calculated using data from the direct numerical simulations (DNS) of fully developed turbulent channel flow. It is observed that, beyond the buffer region, the pdf of each of these quantities is independent of the distance from the channel wall. It is further observed that, beyond the buffer region, the pdf's for all the first derivatives collapse onto a single universal curve and those of the second derivatives also collapse onto another universal curve, irrespective of the distance from the wall. The kinetic-energy dissipation rate exhibits log normal behavior.
Nagy, Lois Anne; Zumberge, John E.
1976-01-01
Microfossils, probably representing members of Precambrian photosynthetic communities of bacteria and blue-green algae, have been found in the approximately 2800-2500 million-year-old Bulawayan stromatolites from Rhodesia. Several populations of coccoid and elongate microfossils have been observed in the dark, carbon-rich stromatolite laminae. Some of these elongate forms are morphologically similar to modern bacterial spores. These microfossils were studied in petrographic thin sections and identified by combined scanning electron microscopy-electron microprobe and by analyses of energy dispersive spectra of individual microfossils. The microfossils contain 1-20% organic carbon; some morphotypes contain traces of sulfur and one other, traces of phosphorus. The polymeric nature of the organic carbon was established by analyzing aggregates of microfossils at elevated temperatures in the solid inlet system of an organic mass spectrometer. The coccoid microfossils range in size from 1.2 to 4.3 μm; the elongate microfossils are from 2.4 to 9.8 μm. They are mineralized with dolomite, embedded in a calcite matrix, and are shown to be both indigenous and syngenous with the rock. Identical microfossils also containing organic carbon but mineralized with quartz have been observed in the stromatolites from Belingwe which are part of the Bulawayan Group from Rhodesia. Caution must be used in the interpretation of what these forms are because of their great age and relatively simple morphologies. However, based on morphology and chemical analyses, they represent fossilized bacteria, blue-green algae, or, most likely, both. PMID:16592348
NASA Astrophysics Data System (ADS)
Tan, Elcin
physically possible upper limits of precipitation due to climate change. The simulation results indicate that the meridional shift in atmospheric conditions is the optimum method to determine maximum precipitation in consideration of cost and efficiency. Finally, exceedance probability analyses of the model results of 42 historical extreme precipitation events demonstrate that the 72-hr basin-averaged probable maximum precipitation is 21.72 inches for the exceedance probability of 0.5 percent. On the other hand, the current operational PMP estimate for the American River Watershed is 28.57 inches, as published in hydrometeorological report no. 59, and a previous PMP value was 31.48 inches, as published in hydrometeorological report no. 36. According to the exceedance probability analyses of this proposed method, the exceedance probabilities of these two estimates correspond to 0.036 percent and 0.011 percent, respectively.
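Exceedance probabilities like those quoted above can be estimated from an event sample with a plotting-position formula. A minimal sketch using the Weibull plotting position and synthetic depths (the sample and its distribution are illustrative, not the study's data):

```python
import numpy as np

def exceedance_probability(sample, threshold):
    """Empirical P(X > threshold) via the Weibull plotting position,
    rank / (n + 1)."""
    sample = np.asarray(sample, dtype=float)
    n_exceeding = int(np.sum(sample > threshold))
    return n_exceeding / (len(sample) + 1)

# 42 synthetic 72-hr basin-average depths (inches), Gumbel-like extremes.
rng = np.random.default_rng(7)
depths = rng.gumbel(loc=8.0, scale=2.5, size=42)
p = exceedance_probability(depths, 15.0)
```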
Estimating flood exceedance probabilities in estuarine regions
NASA Astrophysics Data System (ADS)
Westra, Seth; Leonard, Michael
2016-04-01
Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence of these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury Nepean River (New South Wales).
An all-timescales rainfall probability distribution
NASA Astrophysics Data System (ADS)
Papalexiou, S. M.; Koutsoyiannis, D.
2009-04-01
The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance as typically the hydraulic design strongly depends on the rainfall model choice. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable of adequately describing rainfall at all timescales, is a difficult task. A search of the hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, three- and four-parameter Kappa distribution, and many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions, and is capable of describing rainfall over a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution on various rainfall samples from different areas and for timescales varying from sub-hourly to annual.
Computation-distributed probability hypothesis density filter
NASA Astrophysics Data System (ADS)
Wang, Junjie; Zhao, Lingling; Su, Xiaohong; Shi, Chunmei; Ma, JiQuan
2016-12-01
Particle probability hypothesis density filtering has become a promising approach for multi-target tracking due to its capability of handling an unknown and time-varying number of targets in a nonlinear, non-Gaussian system. However, its computational complexity increases linearly with the number of obtained observations and the number of particles, which can be very time consuming, particularly when numerous targets and clutter exist in the surveillance region. To address this issue, we present a distributed-computation particle probability hypothesis density (PHD) filter for target tracking. It runs several local decomposed particle PHD filters in parallel on separate processing elements. Each processing element takes responsibility for a portion of the particles but all measurements, and provides local estimates. A central unit controls particle exchange among the processing elements and specifies a fusion rule to match and fuse the estimates from the different local filters. The proposed framework is suitable for parallel implementation. Simulations verify that the proposed method significantly accelerates tracking while maintaining accuracy comparable to that of the standard particle PHD filter.
Measures, Probability and Holography in Cosmology
NASA Astrophysics Data System (ADS)
Phillips, Daniel
This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than rho_k = 40 rho_m are disfavored by more than 99.99%, with a peak value at rho_Lambda = 7.9 x 10^-123 and rho_k = 4.3 rho_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We
Innovation for the 'bottom 100 million': eliminating neglected tropical diseases in the Americas.
Hotez, Peter J; Dumonteil, Eric; Heffernan, Michael J; Bottazzi, Maria E
2013-01-01
An estimated 100 million people in the Latin American and Caribbean (LAC) region live on less than US$2 per day, while another 46 million people in the US live below that nation's poverty line. Almost all of the 'bottom 100 million' people suffer from at least one neglected tropical disease (NTD), including one-half of the poorest people in the region infected with hookworms, 10% with Chagas disease, and up to 1-2% with dengue, schistosomiasis, and/or leishmaniasis. In the US, NTDs such as Chagas disease, cysticercosis, toxocariasis, and trichomoniasis are also common among poor populations. These NTDs trap the poorest people in the region in poverty, because of their impact on maternal and child health, and occupational productivity. Through mass drug administration (MDA), several NTDs are on the verge of elimination in the Americas, including lymphatic filariasis, onchocerciasis, trachoma, and possibly leprosy. In addition, schistosomiasis may soon be eliminated in the Caribbean. However, for other NTDs including hookworm infection, Chagas disease, dengue, schistosomiasis, and leishmaniasis, a new generation of 'anti-poverty vaccines' will be required. Several vaccines for dengue are under development by multinational pharmaceutical companies, whereas others are being pursued through non-profit product development partnerships (PDPs), in collaboration with developing country manufacturers in Brazil and Mexico. The Sabin Vaccine Institute PDP is developing a primarily preventive bivalent recombinant human hookworm vaccine, which is about to enter phase 1 clinical testing in Brazil, as well as a new therapeutic Chagas disease vaccine in collaboration with several Mexican institutions. The Chagas disease vaccine would be administered to seropositive patients to delay or prevent the onset of Chagasic cardiomyopathy (secondary prevention). Together, MDA and the development of new anti-poverty vaccines afford an opportunity to implement effective control and
Gunnell, Gregg F; Smith, Richard; Smith, Thierry
2017-01-01
The bat genus Myotis is represented by 120+ living species and 40+ extinct species and is found on every continent except Antarctica. The time of divergence of Myotis has been contentious as has the time and place of origin of its encompassing group the Vespertilionidae, the most diverse (450+ species) and widely distributed extant bat family. Fossil Myotis species are common, especially in Europe, beginning in the Miocene but earlier records are poor. Recent study of new specimens from the Belgian early Oligocene locality of Boutersem reveals the presence of a relatively large vespertilionid. Morphological comparison and phylogenetic analysis confirms that the new, large form can be confidently assigned to the genus Myotis, making this record the earliest known for that taxon and extending the temporal range of this extant genus to over 33 million years. This suggests that previously published molecular divergence dates for crown myotines (Myotis) are too young by at least 7 million years. Additionally, examination of first fossil appearance data of 1,011 extant placental mammal genera indicates that only 13 first occurred in the middle to late Paleogene (48 to 33 million years ago) and of these, six represent bats, including Myotis. Paleogene members of both major suborders of Chiroptera (Yangochiroptera and Yinpterochiroptera) include extant genera indicating early establishment of successful and long-term adaptive strategies as bats underwent an explosive radiation near the beginning of the Early Eocene Climatic Optimum in the Old World. A second bat adaptive radiation in the New World began coincident with the Mid-Miocene Climatic Optimum.
Automatically Augmenting Lifelog Events Using Pervasively Generated Content from Millions of People
Doherty, Aiden R.; Smeaton, Alan F.
2010-01-01
In sensor research we take advantage of additional contextual sensor information to disambiguate potentially erroneous sensor readings or to make better informed decisions on a single sensor’s output. This use of additional information reinforces, validates, semantically enriches, and augments sensed data. Lifelog data is challenging to augment, as it tracks one’s life with many images including the places they go, making it non-trivial to find associated sources of information. We investigate realising the goal of pervasive user-generated content based on sensors, by augmenting passive visual lifelogs with “Web 2.0” content collected by millions of other individuals. PMID:22294880
Revival and Identification of Bacterial Spores in 25- to 40-Million-Year-Old Dominican Amber
NASA Astrophysics Data System (ADS)
Cano, Raul J.; Borucki, Monica K.
1995-05-01
A bacterial spore was revived, cultured, and identified from the abdominal contents of extinct bees preserved for 25 to 40 million years in buried Dominican amber. Rigorous surface decontamination of the amber and aseptic procedures were used during the recovery of the bacterium. Several lines of evidence indicated that the isolated bacterium was of ancient origin and not an extant contaminant. The characteristic enzymatic, biochemical, and 16S ribosomal DNA profiles indicated that the ancient bacterium is most closely related to extant Bacillus sphaericus.
SETI@home: A Million CPU Years and Still No ETs
Anderson, David P.
2001-04-11
SETI@home records data at the Arecibo radio observatory, distributes it through the Internet, and analyzes it using a screensaver program, searching for signs of extraterrestrial life. In our first year of operation we analyzed 15 Terabytes of data using 400,000 years of computer time. Over 2.5 million people in 226 countries have participated. SETI@home is the largest computation ever performed, is the first major scientific experiment with large-scale public participation, and serves as a prototype for future distributed-computing projects.
Determination of niobium in the parts per million range in rocks
Grimaldi, F.S.
1960-01-01
A modified niobium thiocyanate spectrophotometric procedure relatively insensitive to titanium interference is presented. Elements such as tungsten, molybdenum, vanadium, and rhenium, which seriously interfere in the spectrophotometric determination of niobium, are separated by simple sodium hydroxide fusion and leach; iron and magnesium are used as carriers for the niobium. Tolerance limits are given for 28 elements in the spectrophotometric method. Specific application is made to the determination of niobium in the parts per million range in rocks. The granite G-1 contains 0.0022% niobium and the diabase W-1 0.00096% niobium.
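The quoted weight percentages convert to parts per million by a factor of 10^4 (1 wt% = 10,000 ppm), which places both standard rocks squarely in the ppm range the title refers to:

```python
def percent_to_ppm(weight_percent):
    """Mass fraction in weight percent -> parts per million (by mass)."""
    return weight_percent * 1.0e4

g1_nb = percent_to_ppm(0.0022)    # granite G-1:  22 ppm Nb
w1_nb = percent_to_ppm(0.00096)   # diabase W-1: 9.6 ppm Nb
```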
Fossilized nuclei and chromosomes reveal 180 million years of genomic stasis in royal ferns.
Bomfleur, Benjamin; McLoughlin, Stephen; Vajda, Vivi
2014-03-21
Rapidly permineralized fossils can provide exceptional insights into the evolution of life over geological time. Here, we present an exquisitely preserved, calcified stem of a royal fern (Osmundaceae) from Early Jurassic lahar deposits of Sweden in which authigenic mineral precipitation from hydrothermal brines occurred so rapidly that it preserved cytoplasm, cytosol granules, nuclei, and even chromosomes in various stages of cell division. Morphometric parameters of interphase nuclei match those of extant Osmundaceae, indicating that the genome size of these reputed "living fossils" has remained unchanged over at least 180 million years, a paramount example of evolutionary stasis.
A new ascarid species in cynodont coprolite dated of 240 million years.
Silva, Priscilla A da; Borba, Victor H; Dutra, Juliana M F; Leles, Daniela; da-Rosa, Atila A S; Ferreira, Luiz F; Araujo, Adauto
2014-03-01
Cynodonts represent the transition from reptiles to mammals. They are classified as synapsids, or tetrapod animals with mammalian characteristics. We present here the finding of helminth eggs in a coprolite identified as of cynodont origin and dated at nearly 240 million years. Microscopy revealed the presence of very well preserved intestinal parasite eggs. So far we have identified an ascarid egg by its morphological characteristics. Based on the previous description of the genus Ascarites (Poinar Jr and Boucot 2006) in coprolites of iguanodons from Belgium, we propose a new species, Ascarites rufferi n. sp., in cynodonts, a host that inhabited the southern region of Brazil in the Triassic period.
NASA Technical Reports Server (NTRS)
Kasting, J. F.
1984-01-01
A self-consistent method of determining initial conditions for the model presented by Berner, Lasaga, and Garrels (1983) (henceforth, the BLAG model) is derived, based on the assumption that the CO2 geochemical cycle was in steady state at t = -100 my (million years). This initialization procedure leads to a dissolved magnesium concentration higher than that calculated by Berner, Lasaga, and Garrels and to a low ratio of dissolved calcium to bicarbonate prior to 60 my ago. The latter prediction conflicts with the geologic record of evaporite deposits, which requires that this ratio remain greater than 0.5. The contradiction is probably caused by oversimplifications in the BLAG model, such as the neglect of the cycles of organic carbon and sulfur.
Significance of "high probability/low damage" versus "low probability/high damage" flood events
NASA Astrophysics Data System (ADS)
Merz, B.; Elmer, F.; Thieken, A. H.
2009-06-01
The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage. Traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus the contribution of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than is expressed by the expected annual damage. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced in the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making. Different flood mitigation decisions are probable when risk aversion is taken into account.
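The expected annual damage analysed above is the integral of damage over annual exceedance probability, in practice approximated by the trapezoidal rule over a few design floods. A sketch with illustrative numbers (not the case-study values); the per-interval terms show how much each probability range contributes:

```python
import numpy as np

def expected_annual_damage(exceedance_prob, damage):
    """EAD: trapezoidal integral of damage D(p) over annual
    exceedance probability p."""
    p = np.asarray(exceedance_prob, dtype=float)
    d = np.asarray(damage, dtype=float)
    order = np.argsort(p)                 # integrate with p ascending
    p, d = p[order], d[order]
    return float(np.sum((p[1:] - p[:-1]) * (d[1:] + d[:-1]) / 2.0))

# Frequent/low-damage to rare/high-damage events (damage in M EUR):
p = [0.5, 0.1, 0.02, 0.002]
d = [0.1, 1.0, 5.0, 30.0]
ead = expected_annual_damage(p, d)        # ~0.775 M EUR/yr
```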
Not All Probabilities Are Equivalent: Evidence From Orientation Versus Spatial Probability Learning.
Jabar, Syaheed B; Anderson, Britt
2017-02-23
Frequently targets are detected faster, probable locations searched earlier, and likely orientations estimated more precisely. Are these all consequences of a single, domain-general "attentional" effect? To examine this issue, participants were shown brief instances of spatial gratings, and were tasked to draw their location and orientation. Unknown to participants, either the location or orientation probability of these gratings was manipulated. While orientation probability affected the precision of orientation reports, spatial probability did not. Further, utilising lowered stimulus contrast (via a staircase procedure) and a combination of behavioral precision and confidence self-report, we separated trials with perceived stimuli from trials where the target was not detected: spatial probability only modulated the likelihood of stimulus detection, but did not modulate perceptual precision. Even when no physical attentional cues are present, acquired probabilistic information on space versus orientation leads to separable 'attention-like' effects on behaviour. We discuss how this could be linked to distinct underlying neural mechanisms.
Segmentation and automated measurement of chronic wound images: probability map approach
NASA Astrophysics Data System (ADS)
Ahmad Fauzi, Mohammad Faizal; Khansa, Ibrahim; Catignani, Karen; Gordillo, Gayle; Sen, Chandan K.; Gurcan, Metin N.
2014-03-01
An estimated 6.5 million patients in the United States are affected by chronic wounds, with more than 25 billion US dollars and countless hours spent annually for all aspects of chronic wound care. There is a need to develop software tools to analyze wound images that characterize wound tissue composition, measure their size, and monitor changes over time. This process, when done manually, is time-consuming and subject to intra- and inter-reader variability. In this paper, we propose a method that can characterize chronic wounds containing granulation, slough and eschar tissues. First, we generate a Red-Yellow-Black-White (RYKW) probability map, which then guides the region growing segmentation process. The red, yellow and black probability maps are designed to handle the granulation, slough and eschar tissues, respectively, found in wound tissues, while the white probability map is designed to detect the white label card for measurement calibration purposes. The innovative aspects of this work include: 1) definition of a wound-characteristics-specific probability map for segmentation; 2) computationally efficient region growing on a 4D map; and 3) auto-calibration of measurements with the content of the image. The method was applied to 30 wound images provided by the Ohio State University Wexner Medical Center, with the ground truth independently generated by the consensus of two clinicians. While the inter-reader agreement is 85.5%, the computer achieves an accuracy of 80%.
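The probability-map idea can be illustrated with a toy soft assignment of each pixel to the four colour classes. The reference RGB values and the inverse-distance weighting below are assumptions for illustration only, not the paper's actual map construction:

```python
import numpy as np

# Illustrative reference colours: granulation (red), slough (yellow),
# eschar (black), calibration card (white) -- assumed values.
REFS = np.array([[200.0,  40.0,  40.0],
                 [210.0, 190.0,  60.0],
                 [ 30.0,  30.0,  30.0],
                 [245.0, 245.0, 245.0]])

def rykw_probabilities(pixel):
    """Soft class membership of one RGB pixel, from inverse distance
    to each reference colour; one entry of a 4-channel probability map."""
    d = np.linalg.norm(REFS - np.asarray(pixel, dtype=float), axis=1)
    w = 1.0 / (d + 1e-9)
    return w / w.sum()

probs = rykw_probabilities([205, 45, 38])   # close to the "red" reference
```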
NASA Astrophysics Data System (ADS)
Couthures, Jean
1989-11-01
A reconstruction, based on direct observations in the form of borings and paleontological digs, indicates repetitive catastrophes in the palaeo-lake of the Sénèze depression long after formation of the maar. This event is correlated with extensive slumping associated with erosion of the crater rim induced by a critical ring fault. K-Ar dating gives a lower limit of around 2.48 million years for the initial formation, but not the lapse of time between eruption and filling. According to drilling, sedimentology, and palynological analysis, the lake was functional between 2.3 and 1.3 million years ago, i.e., throughout the Tiglian and at the beginning of the Eburonian. The destruction of all animal life could only have been due to asphyxia, probably resulting from a release of CO2. All classes of animals, mainly mammals, were affected and subsequently died. They are found in the same levels in several fossiliferous beds, at the top of the lacustrine deposits and also in reworked debris from the crater rim.
Naive Probability: A Mental Model Theory of Extensional Reasoning.
ERIC Educational Resources Information Center
Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul
1999-01-01
Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…
Economic choices reveal probability distortion in macaque monkeys.
Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram
2015-02-18
Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing.
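The abstract above reports "classic probability weighting functions with inverted-S shape" without naming a parametric form. A commonly used one-parameter form with that shape is Tversky and Kahneman's weighting function; the sketch below illustrates the inverted-S property (overweighting low, underweighting high probabilities) and is an assumption for illustration, not necessarily the form fitted in the study.

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman one-parameter probability weighting function.
    For gamma < 1 the curve has the inverted-S shape: low probabilities
    are overweighted and high probabilities underweighted."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.05, 0.5, 0.95):
    print(p, round(tk_weight(p), 3))
```

With gamma = 1 the function reduces to the identity, i.e. undistorted (linear) probability weighting.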
40-million-year lake record of early Mesozoic orbital climatic forcing
Olsen, P.E.
1986-11-14
Sediments of the early Mesozoic Newark Supergroup of eastern North America consist largely of sedimentary cycles produced by the rise and fall of very large lakes that responded to periodic climate changes controlled by variations in the earth's orbit. Fourier analysis of long sections of the Late Triassic Lockatong and Passaic formations of the Newark Basin show periods in thickness of 5.9, 10.5, 25.2, 32.0, and 96.0 meters corresponding to periodicities in time of roughly 25,000, 44,000, 100,000, 133,000 and 400,000 years, as judged by radiometric time scales and varve-calibrated sedimentation rates. The ratios of the shortest cycle with longer cycles correspond closely to the ratios of the present periods of the main orbital terms that appear to influence climate. Similar long sequences of sedimentary cycles occur through most of the rest of the Newark Supergroup spanning a period of more than 40 million years. This is strong evidence of orbital forcing of climate in the ice-free early Mesozoic and indicates that the main periods of the orbital cycles were not very different 200 million years ago from those today.
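The claim that the thickness ratios track the ratios of the inferred periodicities can be checked directly from the numbers quoted in the abstract. A quick sketch, using only the values given above:

```python
# Cycle thicknesses (metres) and inferred periodicities (years)
# as reported for the Lockatong and Passaic formations.
thickness = [5.9, 10.5, 25.2, 32.0, 96.0]
periods = [25_000, 44_000, 100_000, 133_000, 400_000]

# Ratios of each cycle to the shortest one.
t_ratios = [t / thickness[0] for t in thickness]
p_ratios = [p / periods[0] for p in periods]

for tr, pr in zip(t_ratios, p_ratios):
    print(f"thickness ratio {tr:5.2f}  vs  period ratio {pr:5.2f}")
```

The two sets of ratios agree to within about 7%, consistent with the abstract's statement that they "correspond closely".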
Twenty-million-year relationship between mammalian diversity and primary productivity.
Fritz, Susanne A; Eronen, Jussi T; Schnitzler, Jan; Hof, Christian; Janis, Christine M; Mulch, Andreas; Böhning-Gaese, Katrin; Graham, Catherine H
2016-09-27
At global and regional scales, primary productivity strongly correlates with richness patterns of extant animals across space, suggesting that resource availability and climatic conditions drive patterns of diversity. However, the existence and consistency of such diversity-productivity relationships through geological history is unclear. Here we provide a comprehensive quantitative test of the diversity-productivity relationship for terrestrial large mammals through time across broad temporal and spatial scales. We combine >14,000 occurrences for 690 fossil genera through the Neogene (23-1.8 Mya) with regional estimates of primary productivity from fossil plant communities in North America and Europe. We show a significant positive diversity-productivity relationship through the 20-million-year record, providing evidence on unprecedented spatial and temporal scales that this relationship is a general pattern in the ecology and paleo-ecology of our planet. Further, we discover that genus richness today does not match the fossil relationship, suggesting that a combination of human impacts and Pleistocene climate variability has modified the 20-million-year ecological relationship by strongly reducing primary productivity and driving many mammalian species into decline or to extinction.
Gueriau, Pierre; Rabet, Nicolas; Clément, Gaël; Lagebro, Linda; Vannier, Jean; Briggs, Derek E G; Charbonnier, Sylvain; Olive, Sébastien; Béthoux, Olivier
2016-02-08
Branchiopod crustaceans are represented by fairy, tadpole, and clam shrimps (Anostraca, Notostraca, Laevicaudata, Spinicaudata), which typically inhabit temporary freshwater bodies, and water fleas (Cladoceromorpha), which live in all kinds of freshwater and occasionally marine environments [1, 2]. The earliest branchiopods occur in the Cambrian, where they are represented by complete body fossils from Sweden such as Rehbachiella kinnekullensis [3] and isolated mandibles preserved as small carbonaceous fossils [4-6] from Canada. The earliest known continental branchiopods are associated with hot spring environments [7] represented by the Early Devonian Rhynie Chert of Scotland (410 million years ago) and include possible stem-group or crown-group Anostraca, Notostraca, and clam shrimps or Cladoceromorpha [8-10], which differ morphologically from their modern counterparts [1, 2, 11]. Here we report the discovery of an ephemeral pool branchiopod community from the 365-million-year-old Strud locality of Belgium. It is characterized by new anostracans and spinicaudatans, closely resembling extant species, and the earliest notostracan, Strudops goldenbergi [12]. These branchiopods released resting eggs into the sediment in a manner similar to their modern representatives [1, 2]. We infer that this reproductive strategy was critical to overcoming environmental constraints such as seasonal desiccation imposed by living on land. The pioneer colonization of ephemeral freshwater pools by branchiopods in the Devonian was followed by remarkable ecological and morphological stasis that persists to the present day.
Physical mapping of the elephant X chromosome: conservation of gene order over 105 million years.
Delgado, Claudia Leticia Rodríguez; Waters, Paul D; Gilbert, Clément; Robinson, Terence J; Graves, Jennifer A Marshall
2009-01-01
All therian mammals (eutherians and marsupials) have an XX female/XY male sex chromosome system or some variant of it. The X and Y evolved from a homologous pair of autosomes over the 166 million years since therian mammals diverged from monotremes. Comparing the sex chromosomes of eutherians and marsupials defined an ancient X conserved region that is shared between species of these mammalian clades. However, the eutherian X (and the Y) was augmented by a recent addition (XAR) that is autosomal in marsupials. XAR is part of the X in primates, rodents, and artiodactyls (which belong to the eutherian clade Boreoeutheria), but it is uncertain whether XAR is part of the X chromosome in more distantly related eutherian mammals. Here we report on the gene content and order on the X of the elephant (Loxodonta africana), a representative of Afrotheria, a basal endemic clade of African mammals, and compare these findings to those of other documented eutherian species. A total of 17 genes were mapped to the elephant X chromosome. Our results support the hypothesis that the eutherian X and Y chromosomes were augmented by the addition of autosomal material prior to eutherian radiation. Not only does the elephant X bear the same suite of genes as other eutherian X chromosomes, but gene order appears to have been maintained across 105 million years of evolution, perhaps reflecting strong constraints posed by the eutherian X inactivation system.
A 61-million-person experiment in social influence and political mobilization.
Bond, Robert M; Fariss, Christopher J; Jones, Jason J; Kramer, Adam D I; Marlow, Cameron; Settle, Jaime E; Fowler, James H
2012-09-13
Human behaviour is thought to spread through face-to-face social networks, but it is difficult to identify social influence effects in observational studies, and it is unknown whether online social networks operate in the same way. Here we report results from a randomized controlled trial of political mobilization messages delivered to 61 million Facebook users during the 2010 US congressional elections. The results show that the messages directly influenced political self-expression, information seeking and real-world voting behaviour of millions of people. Furthermore, the messages not only influenced the users who received them but also the users' friends, and friends of friends. The effect of social transmission on real-world voting was greater than the direct effect of the messages themselves, and nearly all the transmission occurred between 'close friends' who were more likely to have a face-to-face relationship. These results suggest that strong ties are instrumental for spreading both online and real-world behaviour in human social networks.
From the Primitive Soup to Cyanobacteria: It May have Taken Less Than 10 Million Years
NASA Technical Reports Server (NTRS)
Miller, Stanley L.; Lazcano, Antonio
1996-01-01
Most scientific discussions on the likelihood of extraterrestrial life have been constrained by the characteristics of life on our planet and the environmental conditions under which it may have emerged. Although it has been generally assumed that this process must have been extremely slow, involving hundreds of millions or even billions of years, a number of recent discoveries have led to a considerable compression of the time believed necessary for life to appear. It is now recognized that during its early history the Earth and other bodies of the inner Solar System went through a stage of intense collisions. Some of these impacts by large asteroids or comets may have raised the terrestrial surface to sterilizing temperatures and may have evaporated the oceans and killed off life as late as 3.8 × 10^9 years ago. However, there is also ample paleontological evidence derived from the 3.5 × 10^9-year-old Warrawoona sediments showing that only 300 million years after the period of intense impacts ended, our planet was populated by phototactic, stromatolite-forming microorganisms. Although these discoveries are now generally interpreted to imply that the origin and early evolution of life were rapid, no attempts have been made to estimate the actual time required for these processes to occur.
Gene flow persists millions of years after speciation in Heliconius butterflies
2008-01-01
Background Hybridization, or the interbreeding of two species, is now recognized as an important process in the evolution of many organisms. However, the extent to which hybridization results in the transfer of genetic material across the species boundary (introgression) remains unknown in many systems, as does the length of time after initial divergence that the species boundary remains porous to such gene flow. Results Here I use genome-wide genotypic and DNA sequence data to show that there is introgression and admixture between the melpomene/cydno and silvaniform clades of the butterfly genus Heliconius, groups that separated from one another as many as 30 million generations ago. Estimates of historical migration based on 523 DNA sequences from 14 genes suggest unidirectional gene flow from the melpomene/cydno clade into the silvaniform clade. Furthermore, genetic clustering based on 520 amplified fragment length polymorphisms (AFLPs) identified multiple individuals of mixed ancestry showing that introgression is on-going. Conclusion These results demonstrate that genomes can remain porous to gene flow very long after initial divergence. This, in turn, greatly expands the evolutionary potential afforded by introgression. Phenotypic and species diversity in a wide variety of organisms, including Heliconius, have likely arisen from introgressive hybridization. Evidence for continuous gene flow over millions of years points to introgression as a potentially important source of genetic variation to fuel the evolution of novel forms. PMID:18371203
Liebeskind, David S
2016-01-01
Crowdsourcing, an unorthodox approach in medicine, creates an unusual paradigm to study precision cerebrovascular health, eliminating the relative isolation and non-standardized nature of current imaging data infrastructure, while shifting emphasis to the astounding capacity of big data in the cloud. This perspective envisions the use of imaging data of the brain and vessels to orient and seed A Million Brains Initiative™ that may leapfrog incremental advances in stroke and rapidly provide useful data to the sizable population around the globe prone to the devastating effects of stroke and vascular substrates of dementia. Despite such variability in the type of data available and other limitations, the data hierarchy logically starts with imaging and can be enriched with almost endless types and amounts of other clinical and biological data. Crowdsourcing allows an individual to contribute to aggregated data on a population, while preserving their right to specific information about their own brain health. The cloud now offers endless storage, computing prowess, and neuroimaging applications for postprocessing that is searchable and scalable. Collective expertise is a windfall of the crowd in the cloud and particularly valuable in an area such as cerebrovascular health. The rise of precision medicine, rapidly evolving technological capabilities of cloud computing and the global imperative to limit the public health impact of cerebrovascular disease converge in the imaging of A Million Brains Initiative™. Crowdsourcing secure data on brain health may provide ultimate generalizability, enable focused analyses, facilitate clinical practice, and accelerate research efforts.
Toward Millions of File System IOPS on Low-Cost, Commodity Hardware.
Zheng, Da; Burns, Randal; Szalay, Alexander S
2013-01-01
We describe a storage system that removes I/O bottlenecks to achieve more than one million IOPS based on a user-space file abstraction for arrays of commodity SSDs. The file abstraction refactors I/O scheduling and placement for extreme parallelism and non-uniform memory and I/O. The system includes a set-associative, parallel page cache in the user space. We redesign page caching to eliminate CPU overhead and lock-contention in non-uniform memory architecture machines. We evaluate our design on a 32 core NUMA machine with four, eight-core processors. Experiments show that our design delivers 1.23 million 512-byte read IOPS. The page cache realizes the scalable IOPS of Linux asynchronous I/O (AIO) and increases user-perceived I/O performance linearly with cache hit rates. The parallel, set-associative cache matches the cache hit rates of the global Linux page cache under real workloads.
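The set-associative page cache described above can be sketched in miniature. This is a hypothetical toy model, not the authors' system: pages hash to one of a small number of sets, each with its own LRU order, so that lookup and eviction (and, in a real NUMA implementation, locking) are confined to a single small set rather than contending on one global structure.

```python
from collections import OrderedDict

class SetAssociativeCache:
    """Toy set-associative page cache: each page maps to one set,
    and each set maintains an independent LRU order."""
    def __init__(self, num_sets=8, ways=4):
        self.num_sets = num_sets
        self.ways = ways  # entries per set
        self.sets = [OrderedDict() for _ in range(num_sets)]

    def get(self, page_no):
        s = self.sets[page_no % self.num_sets]
        if page_no in s:
            s.move_to_end(page_no)  # refresh LRU position
            return s[page_no]
        return None  # miss

    def put(self, page_no, data):
        s = self.sets[page_no % self.num_sets]
        if page_no in s:
            s.move_to_end(page_no)
        elif len(s) >= self.ways:
            s.popitem(last=False)  # evict LRU page of this set only
        s[page_no] = data

cache = SetAssociativeCache(num_sets=4, ways=2)
for p in range(10):
    cache.put(p, f"page-{p}")
hits = sum(cache.get(p) is not None for p in range(10))
print(hits)
```

Because eviction decisions never cross set boundaries, independent sets can be protected by independent locks, which is the property the paper exploits to avoid lock contention on many-core NUMA machines.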
Twenty-million-year relationship between mammalian diversity and primary productivity
Fritz, Susanne A.; Eronen, Jussi T.; Schnitzler, Jan; Hof, Christian; Janis, Christine M.; Mulch, Andreas; Böhning-Gaese, Katrin; Graham, Catherine H.
2016-01-01
At global and regional scales, primary productivity strongly correlates with richness patterns of extant animals across space, suggesting that resource availability and climatic conditions drive patterns of diversity. However, the existence and consistency of such diversity–productivity relationships through geological history is unclear. Here we provide a comprehensive quantitative test of the diversity–productivity relationship for terrestrial large mammals through time across broad temporal and spatial scales. We combine >14,000 occurrences for 690 fossil genera through the Neogene (23–1.8 Mya) with regional estimates of primary productivity from fossil plant communities in North America and Europe. We show a significant positive diversity–productivity relationship through the 20-million-year record, providing evidence on unprecedented spatial and temporal scales that this relationship is a general pattern in the ecology and paleo-ecology of our planet. Further, we discover that genus richness today does not match the fossil relationship, suggesting that a combination of human impacts and Pleistocene climate variability has modified the 20-million-year ecological relationship by strongly reducing primary productivity and driving many mammalian species into decline or to extinction. PMID:27621451
Functional classification of 15 million SNPs detected from diverse chicken populations
Gheyas, Almas A.; Boschiero, Clarissa; Eory, Lel; Ralph, Hannah; Kuo, Richard; Woolliams, John A.; Burt, David W.
2015-01-01
Next-generation sequencing has prompted a surge of discovery of millions of genetic variants from vertebrate genomes. Besides applications in genetic association and linkage studies, a fraction of these variants will have functional consequences. This study describes detection and characterization of 15 million SNPs from chicken genome with the goal to predict variants with potential functional implications (pfVars) from both coding and non-coding regions. The study reports: 183K amino acid-altering SNPs of which 48% predicted as evolutionary intolerant, 13K splicing variants, 51K likely to alter RNA secondary structures, 500K within most conserved elements and 3K from non-coding RNAs. Regions of local fixation within commercial broiler and layer lines were investigated as potential selective sweeps using genome-wide SNP data. Relationships with phenotypes, if any, of the pfVars were explored by overlaying the sweep regions with known QTLs. Based on this, the candidate genes and/or causal mutations for a number of important traits are discussed. Although the fixed variants within sweep regions were enriched with non-coding SNPs, some non-synonymous-intolerant mutations reached fixation, suggesting their possible adaptive advantage. The results presented in this study are expected to have important implications for future genomic research to identify candidate causal mutations and in poultry breeding. PMID:25926514
Liebeskind, David S.
2016-01-01
Crowdsourcing, an unorthodox approach in medicine, creates an unusual paradigm to study precision cerebrovascular health, eliminating the relative isolation and non-standardized nature of current imaging data infrastructure, while shifting emphasis to the astounding capacity of big data in the cloud. This perspective envisions the use of imaging data of the brain and vessels to orient and seed A Million Brains Initiative™ that may leapfrog incremental advances in stroke and rapidly provide useful data to the sizable population around the globe prone to the devastating effects of stroke and vascular substrates of dementia. Despite such variability in the type of data available and other limitations, the data hierarchy logically starts with imaging and can be enriched with almost endless types and amounts of other clinical and biological data. Crowdsourcing allows an individual to contribute to aggregated data on a population, while preserving their right to specific information about their own brain health. The cloud now offers endless storage, computing prowess, and neuroimaging applications for postprocessing that is searchable and scalable. Collective expertise is a windfall of the crowd in the cloud and particularly valuable in an area such as cerebrovascular health. The rise of precision medicine, rapidly evolving technological capabilities of cloud computing and the global imperative to limit the public health impact of cerebrovascular disease converge in the imaging of A Million Brains Initiative™. Crowdsourcing secure data on brain health may provide ultimate generalizability, enable focused analyses, facilitate clinical practice, and accelerate research efforts. PMID:27921034
A progressively wetter climate in southern East Africa over the past 1.3 million years.
Johnson, T C; Werne, J P; Brown, E T; Abbott, A; Berke, M; Steinman, B A; Halbur, J; Contreras, S; Grosshuesch, S; Deino, A; Scholz, C A; Lyons, R P; Schouten, S; Damsté, J S Sinninghe
2016-09-08
African climate is generally considered to have evolved towards progressively drier conditions over the past few million years, with increased variability as glacial-interglacial change intensified worldwide. Palaeoclimate records derived mainly from northern Africa exhibit a 100,000-year (eccentricity) cycle overprinted on a pronounced 20,000-year (precession) beat, driven by orbital forcing of summer insolation, global ice volume and long-lived atmospheric greenhouse gases. Here we present a 1.3-million-year-long climate history from the Lake Malawi basin (10°-14° S in eastern Africa), which displays strong 100,000-year (eccentricity) cycles of temperature and rainfall following the Mid-Pleistocene Transition around 900,000 years ago. Interglacial periods were relatively warm and moist, while ice ages were cool and dry. The Malawi record shows limited evidence for precessional variability, which we attribute to the opposing effects of austral summer insolation and the temporal/spatial pattern of sea surface temperature in the Indian Ocean. The temperature history of the Malawi basin, at least for the past 500,000 years, strongly resembles past changes in atmospheric carbon dioxide and terrigenous dust flux in the tropical Pacific Ocean, but not in global ice volume. Climate in this sector of eastern Africa (unlike northern Africa) evolved from a predominantly arid environment with high-frequency variability to generally wetter conditions with more prolonged wet and dry intervals.
Lunar surface processes and cosmic ray histories over the past several million years
NASA Technical Reports Server (NTRS)
Fruchter, J. S.; Rancitelli, L. A.; Evans, J. C.; Perkins, R. W.
1978-01-01
Measurements of Al-26 and Mn-53 in interior portions of lunar rocks have shown that lunar surface processes which move a significant fraction of kilogram-size rocks on the lunar surface occur on time scales of a few million years. These measurements, together with noble gas age dating, have made it possible to define the history of nine rock samples selected from whole-rock counting data because of anomalously low Al-26 relative to Na-22. Six of the rocks from the Apollo 15 and 16 missions showed evidence of movement during the past five million years. Of these six, only two are of an age consistent with an origin in the South Ray Crater event. In addition, our measurements of Na-22 and Al-26 in Apollo 17 double drive tube 74001-74002 suggest that one to two cm of soil is missing from the top of this core tube. Even with this loss, at least two cm of gardening is indicated in the top portion of 74002.
Twenty-million-year relationship between mammalian diversity and primary productivity
NASA Astrophysics Data System (ADS)
Fritz, Susanne A.; Eronen, Jussi T.; Schnitzler, Jan; Hof, Christian; Janis, Christine M.; Mulch, Andreas; Böhning-Gaese, Katrin; Graham, Catherine H.
2016-09-01
At global and regional scales, primary productivity strongly correlates with richness patterns of extant animals across space, suggesting that resource availability and climatic conditions drive patterns of diversity. However, the existence and consistency of such diversity-productivity relationships through geological history is unclear. Here we provide a comprehensive quantitative test of the diversity-productivity relationship for terrestrial large mammals through time across broad temporal and spatial scales. We combine >14,000 occurrences for 690 fossil genera through the Neogene (23-1.8 Mya) with regional estimates of primary productivity from fossil plant communities in North America and Europe. We show a significant positive diversity-productivity relationship through the 20-million-year record, providing evidence on unprecedented spatial and temporal scales that this relationship is a general pattern in the ecology and paleo-ecology of our planet. Further, we discover that genus richness today does not match the fossil relationship, suggesting that a combination of human impacts and Pleistocene climate variability has modified the 20-million-year ecological relationship by strongly reducing primary productivity and driving many mammalian species into decline or to extinction.
THE FIRST KINEMATIC DETERMINATION OF MILLION-YEAR PRECESSION PERIOD OF ACTIVE GALACTIC NUCLEI
Gong, B. P.; Li, Y. P.; Zhang, H. C.
2011-06-20
Short precession periods like the 164 day period of SS433 can be well determined by observations of timescales longer or much longer than the precession period. However, this does not work for sources with precession periods of millions of years. This Letter utilizes the particular morphologies of X-shaped sources, so that the three-dimensional kinematics of lobes can be obtained. Thus, for the first time, the million-year precession period of X-shaped sources by an observer on the Earth can be determined elegantly: 6.1 ± 1.5 Myr, 1.8 ± 0.5 Myr, and 3.2 ± 1.2 Myr for 3C52, 3C223.1, and 4C12.03, respectively. The result naturally explains the asymmetry displayed in the morphology of these sources, and the effect of propagation time on the diversity of morphologies is well demonstrated. The precession period may originate from long-term effects of a binary supermassive black hole system, which is a potential source of gravitational wave radiation.
A progressively wetter climate in southern East Africa over the past 1.3 million years
NASA Astrophysics Data System (ADS)
Johnson, T. C.; Werne, J. P.; Brown, E. T.; Abbott, A.; Berke, M.; Steinman, B. A.; Halbur, J.; Contreras, S.; Grosshuesch, S.; Deino, A.; Scholz, C. A.; Lyons, R. P.; Schouten, S.; Damsté, J. S. Sinninghe
2016-09-01
African climate is generally considered to have evolved towards progressively drier conditions over the past few million years, with increased variability as glacial-interglacial change intensified worldwide. Palaeoclimate records derived mainly from northern Africa exhibit a 100,000-year (eccentricity) cycle overprinted on a pronounced 20,000-year (precession) beat, driven by orbital forcing of summer insolation, global ice volume and long-lived atmospheric greenhouse gases. Here we present a 1.3-million-year-long climate history from the Lake Malawi basin (10°-14° S in eastern Africa), which displays strong 100,000-year (eccentricity) cycles of temperature and rainfall following the Mid-Pleistocene Transition around 900,000 years ago. Interglacial periods were relatively warm and moist, while ice ages were cool and dry. The Malawi record shows limited evidence for precessional variability, which we attribute to the opposing effects of austral summer insolation and the temporal/spatial pattern of sea surface temperature in the Indian Ocean. The temperature history of the Malawi basin, at least for the past 500,000 years, strongly resembles past changes in atmospheric carbon dioxide and terrigenous dust flux in the tropical Pacific Ocean, but not in global ice volume. Climate in this sector of eastern Africa (unlike northern Africa) evolved from a predominantly arid environment with high-frequency variability to generally wetter conditions with more prolonged wet and dry intervals.
Expanded perlite insulation selected for process piping in $80 million boric acid plant
Nannini, L.; Gaines, A.
1982-03-01
U.S. Borax's new $80 million chemical facility in Boron, California utilizes the most modern technology to produce 200,000 tons per year of boric acid that is used in textile fiber glass, various types of heat-resistant glasses, metallurgy, drugs and cosmetics. The boric acid plant contains thousands of feet of pipe to convey liquors to mixing tanks, clarifiers, crystallizers, centrifuges and other equipment for the refining process. Steel pipe lined with polyvinylidene fluoride (PVDF) was used for a major portion of the piping system to avoid corrosion problems and assure products free of contaminants. The process lines were insulated with a lightweight, asbestos-free product made of expanded perlite containing millions of air cells for low thermal conductivity, bonded together by special binders and reinforcing fibers for good compressive strength. The rigid, molded insulation can withstand continuous and cycling temperatures to 1500°F with minimal shrinkage, and contains less than 150 ppm chlorides to avoid stress corrosion cracking of austenitic stainless steels. The boric acid plant, which is one of the world's largest, began operations in August 1980, and the performance of the expanded perlite pipe insulation in maintaining process temperatures is considered very satisfactory. Any line leakage that occurred during start-up or normal operation has not affected the heat barrier efficiency or structural integrity of the insulation. The combined strength of the insulation and PVC jacket has prevented any serious damage to the pipe covering when struck or scraped.
A 61-million-person experiment in social influence and political mobilization
Bond, Robert M.; Fariss, Christopher J.; Jones, Jason J.; Kramer, Adam D. I.; Marlow, Cameron; Settle, Jaime E.; Fowler, James H.
2013-01-01
Human behaviour is thought to spread through face-to-face social networks, but it is difficult to identify social influence effects in observational studies, and it is unknown whether online social networks operate in the same way. Here we report results from a randomized controlled trial of political mobilization messages delivered to 61 million Facebook users during the 2010 US congressional elections. The results show that the messages directly influenced political self-expression, information seeking and real-world voting behaviour of millions of people. Furthermore, the messages not only influenced the users who received them but also the users' friends, and friends of friends. The effect of social transmission on real-world voting was greater than the direct effect of the messages themselves, and nearly all the transmission occurred between 'close friends' who were more likely to have a face-to-face relationship. These results suggest that strong ties are instrumental for spreading both online and real-world behaviour in human social networks. PMID:22972300
The complete genome of a viable archaeum isolated from 123-million-year-old rock salt.
Jaakkola, Salla T; Pfeiffer, Friedhelm; Ravantti, Janne J; Guo, Qinggong; Liu, Ying; Chen, Xiangdong; Ma, Hongling; Yang, Chunhe; Oksanen, Hanna M; Bamford, Dennis H
2016-02-01
Live microbes have been isolated from rock salt up to Permian age. Only obligatory cellular functions can be performed in halite-buried cells. Consequently, their genomic sequences are likely to remain virtually unchanged. However, the available sequence information from these organisms is scarce and consists of mainly ribosomal 16S sequences. Here, live archaea were isolated from early Cretaceous (∼ 123 million years old) halite from the depth of 2000 m in Qianjiang Depression, Hubei Province, China. The sample was radiologically dated and subjected to rigorous surface sterilization before microbe isolation. The isolates represented a single novel species of Halobacterium, for which we suggest the name Halobacterium hubeiense, type strain Hbt. hubeiense JI20-1. The species was closely related to a Permian (225-280 million years old) isolate, Halobacterium noricense, originating from Alpine rock salt. This study is the first one to publish the complete genome of an organism originating from surface-sterilized ancient halite. In the future, genomic data from halite-buried microbes can become a key factor in understanding the mechanisms by which these organisms are able to survive in harsh conditions deep underground or possibly on other celestial bodies.
A 40-million-year lake record of early mesozoic orbital climatic forcing.
Olsen, P E
1986-11-14
Sediments of the early Mesozoic Newark Supergroup of eastern North America consist largely of sedimentary cycles produced by the rise and fall of very large lakes that responded to periodic climate changes controlled by variations in the earth's orbit. Fourier analysis of long sections of the Late Triassic Lockatong and Passaic formations of the Newark Basin shows periods in thickness of 5.9, 10.5, 25.2, 32.0, and 96.0 meters, corresponding to periodicities in time of roughly 25,000, 44,000, 100,000, 133,000, and 400,000 years, as judged by radiometric time scales and varve-calibrated sedimentation rates. The ratios of the shortest cycle to the longer cycles correspond closely to the ratios of the present periods of the main orbital terms that appear to influence climate. Similar long sequences of sedimentary cycles occur through most of the rest of the Newark Supergroup, spanning a period of more than 40 million years. This is strong evidence of orbital forcing of climate in the ice-free early Mesozoic and indicates that the main periods of the orbital cycles were not very different 200 million years ago from those today.
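The spectral approach named in this abstract can be illustrated with a short sketch. The data here are synthetic (not the Newark Basin measurements), and the sampling interval and section length are my own assumptions: a thickness series is built from the five reported cycle thicknesses and the periodogram recovers them.

```python
import numpy as np

# Illustrative sketch, not the paper's analysis: synthesize a "sediment
# thickness" series containing the cycle thicknesses reported for the
# Lockatong/Passaic sections, then recover them with an FFT periodogram.
cycles_m = [5.9, 10.5, 25.2, 32.0, 96.0]   # cycle thicknesses in metres
dz = 0.1                                    # sampling interval in m (assumed)
z = np.arange(0.0, 960.0, dz)               # 960 m synthetic section (assumed)
signal = sum(np.sin(2 * np.pi * z / lam) for lam in cycles_m)

# Periodogram via FFT; frequencies are in cycles per metre.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(z), d=dz)

# Locate the five strongest peaks and convert back to thickness (1/frequency).
peak_idx = np.argsort(spectrum)[-5:]
recovered = sorted(1.0 / freqs[peak_idx])
print([round(r, 1) for r in recovered])    # peaks near the input thicknesses
```

The recovered peaks match the input cycle thicknesses to within the frequency resolution of the record, which is the same logic by which the paper reads orbital periods off thickness spectra.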
Anthropogenic carbon release rate unprecedented during the past 66 million years
NASA Astrophysics Data System (ADS)
Zeebe, Richard E.; Ridgwell, Andy; Zachos, James C.
2016-04-01
Carbon release rates from anthropogenic sources reached a record high of ~10 Pg C yr-1 in 2014. Geologic analogues from past transient climate changes could provide invaluable constraints on the response of the climate system to such perturbations, but only if the associated carbon release rates can be reliably reconstructed. The Palaeocene-Eocene Thermal Maximum (PETM) is known at present to have the highest carbon release rates of the past 66 million years, but robust estimates of the initial rate and onset duration are hindered by uncertainties in age models. Here we introduce a new method to extract rates of change from a sedimentary record based on the relative timing of climate and carbon cycle changes, without the need for an age model. We apply this method to stable carbon and oxygen isotope records from the New Jersey shelf using time-series analysis and carbon cycle-climate modelling. We calculate that the initial carbon release during the onset of the PETM occurred over at least 4,000 years. This constrains the maximum sustained PETM carbon release rate to less than 1.1 Pg C yr-1. We conclude that, given currently available records, the present anthropogenic carbon release rate is unprecedented during the past 66 million years. We suggest that such a ‘no-analogue’ state represents a fundamental challenge in constraining future climate projections. Also, future ecosystem disruptions are likely to exceed the relatively limited extinctions observed at the PETM.
Born too soon: accelerating actions for prevention and care of 15 million newborns born too soon.
Lawn, Joy E; Kinney, Mary V; Belizan, José M; Mason, Elizabeth Mary; McDougall, Lori; Larson, Jim; Lackritz, Eve; Friberg, Ingrid K; Howson, Christopher P
2013-01-01
Preterm birth complications are the leading cause of neonatal death, resulting in over one million deaths each year among the 15 million babies born preterm. To accelerate change, we provide an overview of the comprehensive strategy required, the tools available for context-specific health system implementation now, and the priorities for research and innovation. There is an urgent need for action on a dual track: (1) strategic research to advance the prevention of preterm birth and (2) improved implementation and innovation for care of the premature neonate. We highlight evidence-based interventions along the continuum of care, noting gaps in coverage, quality, and equity, and implications for integration and scale-up. Improved metrics are critical both for measuring burden and for tracking programmatic change. Linked to the United Nations' Every Woman Every Child strategy, a target was set of a 50% reduction in preterm deaths by 2025. Three analyses informed this target: historical change in high-income countries, recent progress in the best-performing countries, and modelling of mortality reduction with high coverage of existing interventions. If universal coverage of selected interventions were achieved, 84%, or more than 921,000, preterm neonatal deaths could be prevented annually, with antenatal corticosteroids and Kangaroo Mother Care having the highest impact. Everyone has a role to play in reaching this target, including government leaders, professionals, the private sector, and of course the families who are affected the most and whose voices have been critical for change in many of the countries with the most progress.
Consideration of probability of bacterial growth for Jovian planets and their satellites
NASA Technical Reports Server (NTRS)
Taylor, D. M.; Berkman, R. M.; Divine, N.
1975-01-01
Environmental parameters affecting growth of bacteria (e.g., moisture, temperature, pH, and chemical composition) were compared with current atmospheric models for Jupiter and Saturn, and with the available physical data for their satellites. Different zones of relative probability of growth were identified for Jupiter and Saturn, with the highest in pressure regions of 1-10 million N/sq m (10 to 100 atmospheres) and 3-30 million N/sq m (30 to 300 atmospheres), respectively. Of the more than two dozen satellites, only the largest (Io, Europa, Ganymede, Callisto, and Titan) were found to be interesting biologically. Titan's atmosphere may produce a substantial greenhouse effect providing increased surface temperatures. Models predicting a dense atmosphere are compatible with microbial growth for a range of pressures at Titan's surface. For Titan's surface the probability of growth would be enhanced if (1) the surface is entirely or partially liquid (water), (2) volcanism (in an ice-water-steam system) is present, or (3) access to internal heat sources is significant.
On the probability of dinosaur fleas.
Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F
2016-01-11
Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding.We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data.
Quantum probabilities for inflation from holography
Hartle, James B.; Hawking, S. W.; Hertog, Thomas
2014-01-01
The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading-order-in-ℏ quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.
Carrier Modulation Via Waveform Probability Density Function
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2006-01-01
Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function (PDF) histogram. Separate waveform states are easily created by varying the PDF of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
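A hedged sketch of this idea in the digital domain (the symbol length, the particular distributions, and the kurtosis detector are my own choices, not taken from the paper): encode bits as equal-power noise segments whose amplitude PDFs differ, and recover them from sample statistics alone.

```python
import numpy as np

# Sketch of PDF-based modulation (all parameters assumed, not the paper's):
# bit 0 -> uniform-distributed samples, bit 1 -> Gaussian samples, both with
# unit variance, so amplitude/power carries no information. The receiver
# classifies each segment by its excess kurtosis (uniform: -1.2, Gaussian: 0).
rng = np.random.default_rng(seed=1)
N = 4000                       # samples per symbol (assumed)

def modulate(bits):
    segs = []
    for b in bits:
        if b == 0:
            segs.append(rng.uniform(-np.sqrt(3), np.sqrt(3), N))  # var = 1
        else:
            segs.append(rng.normal(0.0, 1.0, N))                  # var = 1
    return np.concatenate(segs)

def demodulate(signal):
    bits = []
    for seg in signal.reshape(-1, N):
        m = seg.mean()
        kurt = np.mean((seg - m) ** 4) / np.var(seg) ** 2 - 3.0
        bits.append(0 if kurt < -0.6 else 1)   # threshold splits -1.2 and 0
    return bits

tx = [0, 1, 1, 0, 1, 0, 0, 1]
rx = demodulate(modulate(tx))
print(rx)
```

Because the two states match in mean and power, a conventional AM/FM receiver sees nothing, which is also why the paper suggests such schemes for steganographic encoding.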
Probability-one homotopies in computational science
NASA Astrophysics Data System (ADS)
Watson, Layne T.
2002-03-01
Probability-one homotopy algorithms are a class of methods for solving nonlinear systems of equations that, under mild assumptions, are globally convergent for a wide range of problems in science and engineering. Convergence theory, robust numerical algorithms, and production quality mathematical software exist for general nonlinear systems of equations, and special cases such as Brouwer fixed point problems, polynomial systems, and nonlinear constrained optimization. Using a sample of challenging scientific problems as motivation, some pertinent homotopy theory and algorithms are presented. The problems considered are analog circuit simulation (for nonlinear systems), reconfigurable space trusses (for polynomial systems), and fuel-optimal orbital rendezvous (for nonlinear constrained optimization). The mathematical software packages HOMPACK90 and POLSYS_PLP are also briefly described.
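A minimal sketch of the underlying continuation idea, using a plain convex homotopy rather than the randomized probability-one constructions of HOMPACK90 (the toy system, start point, and step counts are my own assumptions):

```python
import numpy as np

# Convex-homotopy sketch (not the HOMPACK90 algorithms): to solve F(x) = 0,
# track the zero of H(x, t) = (1 - t)*(x - x0) + t*F(x) from the trivial
# root x0 at t = 0 to a root of F at t = 1, with Newton correction at each t.
def F(x):                      # toy system: x^2 + y^2 - 4 = 0, x - y = 0
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]])

def J_F(x):                    # Jacobian of F
    return np.array([[2 * x[0], 2 * x[1]], [1.0, -1.0]])

x0 = np.array([1.0, 0.5])      # arbitrary start point (assumed)
x = x0.copy()
for t in np.linspace(0.0, 1.0, 51)[1:]:
    for _ in range(10):        # Newton corrector on H(., t)
        H = (1 - t) * (x - x0) + t * F(x)
        J = (1 - t) * np.eye(2) + t * J_F(x)
        x = x - np.linalg.solve(J, H)

# x lands on one of the two roots (+/-sqrt(2), +/-sqrt(2)); residual ~ 0
print(x, F(x))
```

The probability-one theory strengthens this picture: with an appropriately randomized homotopy, the tracked zero curve is smooth and reaches a solution for almost every start point, which is what makes the approach globally convergent in the sense the abstract describes.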
Audio feature extraction using probability distribution function
NASA Astrophysics Data System (ADS)
Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.
2015-05-01
Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently in biometric and multimedia information retrieval systems. This technology is attained from successive research on audio feature extraction analysis. The probability distribution function (PDF) is a statistical method which is usually used as one of the processes within complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses only the PDF as the feature extraction method itself, for speech analysis purposes. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of the sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
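The general recipe can be sketched as follows (frame length, bin count, and the synthetic "voices" are my own assumptions, not the paper's setup): frame the sampled signal, estimate a per-frame empirical PDF with a fixed-bin histogram, and use the bin probabilities as the feature vector.

```python
import numpy as np

# Sketch of PDF-as-feature extraction (parameters assumed): each frame's
# normalized amplitude histogram serves directly as its feature vector.
def pdf_features(signal, frame_len=400, n_bins=16):
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    feats = []
    for frame in frames:
        hist, _ = np.histogram(frame, bins=n_bins, range=(-1.0, 1.0))
        feats.append(hist / frame_len)       # normalize counts to probabilities
    return np.array(feats)                    # shape (n_frames, n_bins)

# Two synthetic "voices" with different amplitude distributions yield
# visibly different feature vectors, analogous to the paper's plotted PDFs.
t = np.linspace(0, 1, 8000)
voice_a = 0.8 * np.sin(2 * np.pi * 220 * t)            # sine: U-shaped PDF
rng = np.random.default_rng(0)
voice_b = np.clip(rng.normal(0, 0.25, 8000), -1, 1)    # noise: bell-shaped PDF
fa, fb = pdf_features(voice_a), pdf_features(voice_b)
print(fa.shape, fb.shape)
```

Each row sums to one, so a frame's feature vector is a proper discrete probability distribution that can be compared across speakers by shape.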
Microtechnique for most-probable-number analysis.
Rowe, R; Todd, R; Waide, J
1977-03-01
A microtechnique based on the most-probable-number (MPN) method has been developed for the enumeration of the ammonium-oxidizing population in soil samples. An MPN table for a research design (8 by 12, i.e., 12 dilutions with 8 replicates per dilution) is presented. A correlation of 0.68 was found between MPNs determined by the microtechnique and the standard tube technique. Higher MPNs were obtained with the microtechnique, possibly because of increased accuracy in endpoint determinations. Considerable savings of time, space, equipment, and reagents are achieved with this method. The microtechnique described may be adapted to other microbial populations using various types of media and endpoint determinations.
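The maximum-likelihood estimate that underlies any MPN table (including the 8-by-12 design mentioned here, which is not reproduced) can be sketched directly. For tube volumes v_i with n_i replicates and p_i positives, the MPN is the density lam maximizing prod_i (1 - exp(-lam*v_i))^p_i * exp(-lam*v_i)^(n_i - p_i); the function and example below are illustrative, not from the paper.

```python
import numpy as np

# Sketch of the MPN maximum-likelihood estimate: bisect on the score
# (derivative of the log-likelihood), which is strictly decreasing in lam.
def mpn(volumes, replicates, positives):
    def score(lam):
        s = 0.0
        for v, n, p in zip(volumes, replicates, positives):
            s += p * v * np.exp(-lam * v) / (1 - np.exp(-lam * v)) - (n - p) * v
        return s
    lo, hi = 1e-9, 1e9
    for _ in range(200):
        mid = np.sqrt(lo * hi)          # geometric midpoint (log-scale search)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return np.sqrt(lo * hi)

# Classic 3-tube, 3-dilution example: 10, 1 and 0.1 ml portions with
# 3, 2 and 1 positive tubes; published 3-tube tables give roughly 1.5/ml.
est = mpn([10.0, 1.0, 0.1], [3, 3, 3], [3, 2, 1])
print(round(est, 2))
```

The same estimator applies unchanged to the 8-replicate microplate layout; only the volume and replicate vectors differ.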
Continuity of percolation probability on hyperbolic graphs
NASA Astrophysics Data System (ADS)
Wu, C. Chris
1997-05-01
Let T_k be a rooted tree of degree k in which each vertex other than the origin has k children and one parent, and the origin has k children but no parent (k ≥ 2). Define G to be the graph obtained by adding to T_k nearest-neighbor bonds connecting the vertices which are in the same generation. G is regarded as a discretization of the hyperbolic plane H^2 in the same sense that Z^d is a discretization of R^d. Independent percolation on G has been proved to have multiple phase transitions. We prove that the percolation probability θ(p) is continuous on [0,1] as a function of p.
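For the plain tree T_k alone (without the same-generation bonds that make G hyperbolic, whose analysis is the paper's subject), the percolation probability has a classical branching-process characterization, sketched here as an illustration:

```python
# Sketch for the plain tree T_k only: the open cluster of the origin is a
# Galton-Watson process with Binomial(k, p) offspring, so the percolation
# probability is theta(p) = 1 - eta, where eta is the smallest root of
# eta = (1 - p + p*eta)^k, and theta(p) > 0 exactly when p > p_c = 1/k.
def theta(p, k, iters=10000):
    eta = 0.0                          # iterate from below -> smallest root
    for _ in range(iters):
        eta = (1 - p + p * eta) ** k
    return 1.0 - eta

k = 2
print(theta(0.4, k))                   # below p_c = 1/2: essentially 0
print(theta(0.6, k))                   # above p_c: strictly positive
```

On T_k this theta(p) is already continuous in p; the paper's contribution is establishing the analogous continuity on the augmented graph G, where the extra bonds create multiple phase transitions.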
Probability and delay discounting of erotic stimuli.
Lawyer, Steven R
2008-09-01
Adult undergraduate men (n=38) and women (n=33) were categorized as erotica "users" (n=34) and "non-users" (n=37) based on their responses to screening questions, and completed computerized delay and probability discounting tasks concerning hypothetical money and erotica. Erotica users discounted the value of erotica similarly to money on three of the four erotica tasks; erotica non-users discounted the value of money consistently with erotica users, but not the value of erotica. Erotica users were disproportionately male, scored higher on several psychometric measures of sexuality-related constructs, and exhibited more impulsive choice patterns on the delay discounting for money task than erotica non-users did. These findings suggest that discounting processes generalize to erotic outcomes for some individuals.
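The discounting analysis behind such tasks can be sketched with the standard hyperbolic model V = A / (1 + h*D); the indifference points below are made up for illustration and are not the study's data, and the grid-search fit is a simplification of the usual nonlinear regression.

```python
import numpy as np

# Illustrative hyperbolic-discounting fit (all values assumed): a larger
# best-fit h means steeper discounting, i.e., more impulsive choice.
A = 100.0                                  # delayed amount
delays = np.array([1, 7, 30, 90, 365])     # delays in days
indiff = np.array([95.0, 80.0, 50.0, 30.0, 12.0])   # hypothetical values

hs = np.linspace(0.001, 1.0, 2000)         # grid search over h
sse = [np.sum((indiff - A / (1 + h * delays)) ** 2) for h in hs]
best_h = hs[int(np.argmin(sse))]
print(round(best_h, 3))
```

Comparing best-fit h (or an area-under-the-curve measure) between groups is how choice patterns like the users/non-users contrast reported above are quantified.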