Paint Removers: Methylene Chloride and n-Methylpyrrolidone (NMP)
This webinar described the risks associated with NMP, provided an overview of TSCA section 6, and solicited input on ways EPA should address identified risks, with a particular emphasis on vulnerable populations.
Genome-Wide Mapping and Interrogation of the Nmp4 Antianabolic Bone Axis.
Childress, Paul; Stayrook, Keith R; Alvarez, Marta B; Wang, Zhiping; Shao, Yu; Hernandez-Buquer, Selene; Mack, Justin K; Grese, Zachary R; He, Yongzheng; Horan, Daniel; Pavalko, Fredrick M; Warden, Stuart J; Robling, Alexander G; Yang, Feng-Chun; Allen, Matthew R; Krishnan, Venkatesh; Liu, Yunlong; Bidwell, Joseph P
2015-09-01
PTH is an osteoanabolic for treating osteoporosis, but its potency wanes. Disabling the transcription factor nuclear matrix protein 4 (Nmp4) in healthy, ovary-intact mice enhances bone response to PTH and bone morphogenetic protein 2 and protects from unloading-induced osteopenia. These Nmp4(-/-) mice exhibit expanded bone marrow populations of osteoprogenitors and supporting CD8(+) T cells. To determine whether the Nmp4(-/-) phenotype persists in an osteoporosis model, we compared PTH response in ovariectomized (ovx) wild-type (WT) and Nmp4(-/-) mice. To identify potential Nmp4 target genes, we performed bioinformatic/pathway profiling on Nmp4 chromatin immunoprecipitation sequencing (ChIP-seq) data. Mice (12 w) were ovx or sham-operated 4 weeks before the initiation of PTH therapy. Skeletal phenotype analysis included microcomputed tomography, histomorphometry, serum profiles, fluorescence-activated cell sorting, and the growth/mineralization of cultured WT and Nmp4(-/-) bone marrow mesenchymal stem progenitor cells (MSPCs). ChIP-seq data were derived using MC3T3-E1 preosteoblasts, murine embryonic stem cells, and 2 blood cell lines. Ovx Nmp4(-/-) mice exhibited an improved response to PTH coupled with elevated numbers of osteoprogenitors and CD8(+) T cells, but were not protected from ovx-induced bone loss. Cultured Nmp4(-/-) MSPCs displayed enhanced proliferation and accelerated mineralization. ChIP-seq/gene ontology analyses identified target genes likely under Nmp4 control as enriched for negative regulators of biosynthetic processes. Interrogation of mRNA transcripts in nondifferentiating and osteogenically differentiating WT and Nmp4(-/-) MSPCs was performed on 90 Nmp4 target genes and differentiation markers. These data suggest that Nmp4 suppresses bone anabolism, in part, by regulating IGF-binding protein expression. Changes in Nmp4 status may lead to improvements in osteoprogenitor response to therapeutic cues.
Responses to Comments for N-Methylpyrrolidone (NMP) Work Plan Risk Assessment
This document summarizes the public and external peer review comments that the EPA’s Office of Pollution Prevention and Toxics (OPPT) received for the draft work plan risk assessment for n-Methylpyrrolidone (NMP).
2015-10-01
1-Methyl-2-pyrrolidone (NMP) is used as a solvent in many technical applications. The general population may be exposed to NMP through its use as an ingredient in paint and graffiti removers and, indoors, from its use in paints and carpeting. Because of its developmental toxicity, the use of NMP in consumer products in the EU is regulated. The German HBM Commission considers the developmental effects, accompanied by weak maternal toxicity in animal experiments, to be the critical effects. Based on these effects, HBM-I values of 10 mg/l urine for children and 15 mg/l for adults were derived for the metabolites 5-hydroxy-NMP and 2-hydroxy-N-methylsuccinimide. HBM-II values were set at 30 mg/l urine for children and 50 mg/l for adults. Because the structural analogue 1-ethyl-2-pyrrolidone (NEP) produces similar effects, possible mixed exposure to both compounds has to be taken into account when evaluating the total burden.
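A minimal sketch of how these guidance values could be applied when screening a measurement (the threshold numbers are those quoted above; the function, its labels, and the alert/action interpretation are illustrative assumptions, not part of the Commission's derivation):

```python
# HBM guidance values (mg/l urine) for the summed metabolites
# 5-hydroxy-NMP and 2-hydroxy-N-methylsuccinimide, as quoted above.
HBM_VALUES = {
    "child": {"HBM-I": 10.0, "HBM-II": 30.0},
    "adult": {"HBM-I": 15.0, "HBM-II": 50.0},
}

def classify(concentration_mg_per_l: float, group: str) -> str:
    """Compare a urinary metabolite level against the HBM guidance values."""
    limits = HBM_VALUES[group]
    if concentration_mg_per_l < limits["HBM-I"]:
        return "below HBM-I"
    if concentration_mg_per_l < limits["HBM-II"]:
        return "between HBM-I and HBM-II"
    return "above HBM-II"

# The same measured level is judged differently by age group:
print(classify(12.0, "child"))  # → "between HBM-I and HBM-II"
print(classify(12.0, "adult"))  # → "below HBM-I"
```

Note how a single concentration (12 mg/l here) falls into different bands for children and adults, which is the practical consequence of the age-specific values.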
Bader, Michael; Wrbitzky, Renate; Blaszkewicz, Meinolf; Schäper, Michael; van Thriel, Christoph
2008-01-01
N-Methyl-2-pyrrolidone (NMP) is a versatile organic solvent frequently used for surface cleaning such as paint stripping or graffiti removal. Liquid NMP is rapidly absorbed through the skin, but dermal vapour-phase absorption might also play an important role in the uptake of the solvent. This particular aspect was investigated in an experimental study with 16 volunteers exposed to 80 mg/m³ NMP for 8 h under either whole-body (i.e. inhalational plus dermal) or dermal-only conditions. Additionally, the influence of moderate physical workload on the uptake of NMP was studied. The urinary concentrations of NMP and its metabolites 5-hydroxy-N-methyl-2-pyrrolidone (5-HNMP) and 2-hydroxy-N-methylsuccinimide (2-HMSI) were followed for 48 h and analysed by gas chromatography-mass spectrometry (GC-MS). Percutaneous uptake delayed the elimination peak times and lengthened the apparent biological half-lives of NMP and 5-HNMP. Under resting conditions, dermal-only exposure resulted in the elimination of 71 ± 8 mg NMP equivalents, as compared to 169 ± 15 mg for whole-body exposure. Moderate workload yielded 79 ± 8 mg NMP (dermal-only) and 238 ± 18 mg (whole-body). Thus, dermal absorption from the vapour phase may contribute significantly to the total uptake of NMP, e.g. from workplace atmospheres. As the concentration of airborne NMP does not reflect the body dose, biomonitoring should be carried out for surveillance purposes.
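From the mean eliminations quoted above, the dermal share of total uptake follows from a few lines of arithmetic (a back-of-the-envelope sketch using only the mean values; the variable names are invented for illustration):

```python
# Mean NMP equivalents eliminated over 48 h (mg), taken from the study means above.
# Dermal-only exposure isolates percutaneous vapour-phase uptake;
# whole-body exposure adds the inhalational route.
uptake = {
    "rest":     {"dermal_only": 71.0, "whole_body": 169.0},
    "workload": {"dermal_only": 79.0, "whole_body": 238.0},
}

for condition, u in uptake.items():
    dermal_fraction = u["dermal_only"] / u["whole_body"]
    print(f"{condition}: dermal share of total uptake ≈ {dermal_fraction:.0%}")
# → rest ≈ 42%, workload ≈ 33%
```

Even as a rough estimate, roughly a third to two-fifths of the total body dose arriving through the skin from the vapour phase supports the abstract's conclusion that airborne concentration alone is a poor proxy for body dose.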
cNMP-AMs mimic and dissect bacterial nucleotidyl cyclase toxin effects.
Beckert, Ulrike; Grundmann, Manuel; Wolter, Sabine; Schwede, Frank; Rehmann, Holger; Kaever, Volkhard; Kostenis, Evi; Seifert, Roland
2014-09-05
In addition to the well-known second messengers cAMP and cGMP, mammalian cells contain the cyclic pyrimidine nucleotides cCMP and cUMP. The Pseudomonas aeruginosa toxin ExoY massively increases cGMP and cUMP in cells, whereas the Bordetella pertussis toxin CyaA increases cAMP and, to a lesser extent, cCMP. To mimic and dissect toxin effects, we synthesized cNMP-acetoxymethylesters as prodrugs. cNMP-AMs rapidly and effectively released the corresponding cNMP in cells. The combination of cGMP-AM plus cUMP-AM mimicked cytotoxicity of ExoY. cUMP-AM and cGMP-AM differentially activated gene expression. Certain cCMP and cUMP effects were independent of the known cNMP effectors protein kinases A and G and guanine nucleotide exchange factor Epac. In conclusion, cNMP-AMs are useful tools to mimic and dissect bacterial nucleotidyl cyclase toxin effects.
Mice Deficient in CIZ/NMP4 Develop an Attenuated Form of K/BxN-Serum Induced Arthritis.
Nakamoto, Tetsuya; Izu, Yayoi; Kawasaki, Makiri; Notomi, Takuya; Hayata, Tadayoshi; Noda, Masaki; Ezura, Yoichi
2016-04-01
CIZ/NMP4 (Cas-interacting zinc finger protein, Nmp4, Zfp384) is a transcription factor known to regulate matrix-related proteins. To explore the possible pathophysiological role of CIZ/NMP4 in arthritis, we examined CIZ/NMP4 expression in articular cartilage in an arthritis model. CIZ/NMP4 was expressed at low levels in the articular chondrocytes of mice, while its expression was enhanced when arthritis was induced. Arthritis induction increased the clinical score in wild-type mice. In contrast, CIZ/NMP4 deficiency suppressed this rise in arthritis score and the swelling of soft tissue. CIZ/NMP4 deficiency also reduced the invasion of inflammatory cells into joint tissue. Quantitative PCR analyses of mRNA from joints revealed that the arthritis-induced increase in IL-1β expression was suppressed by CIZ/NMP4 deficiency. CIZ/NMP4 bound to the IL-1β promoter and activated its transcription. The increase in CIZ/NMP4 in arthritis was also associated with enhanced bone resorption and cartilage matrix degradation. In fact, RANKL, a signaling molecule prerequisite for osteoclastogenesis, and MMP-3, a clinical marker of arthritis, were increased in joints upon arthritis induction. In contrast, CIZ/NMP4 deficiency suppressed the arthritis-induced increases in bone resorption and in RANKL and MMP-3 mRNA expression. Thus, CIZ/NMP4 plays a role in the development of arthritis, at least in part through regulation of key arthritis-related molecules.
Nmp4/CIZ: road block at the intersection of PTH and load.
Childress, Paul; Robling, Alexander G; Bidwell, Joseph P
2010-02-01
Teriparatide (parathyroid hormone, [PTH]) is the only FDA-approved drug that replaces bone lost to osteoporosis. Enhancing PTH efficacy will improve cost-effectiveness and ameliorate contraindications. Combining this hormone with load-bearing exercise may enhance therapeutic potential, consistent with a growing body of evidence that these agonists are synergistic and share common signaling pathways. Additionally, neutralizing molecules that naturally suppress the anabolic response to PTH may also improve the efficacy of treatment with this hormone. Nmp4/CIZ (nuclear matrix protein 4/Cas interacting zinc finger)-null mice have enhanced responses to intermittent PTH with respect to increasing trabecular bone mass and are also immune to disuse-induced bone loss, likely owing to the removal of Nmp4/CIZ suppressive action on osteoblast function. Nmp4/CIZ activity may be sensitive to changes in the mechanical environment of the bone cell brought about by hormone- or mechanical load-induced changes in cell shape and adhesion. Nmp4 was identified in a screen for PTH-responsive nuclear matrix architectural transcription factors (ATFs) that we proposed translate hormone-induced changes in cell shape and adhesion into changes in target gene DNA conformation. CIZ was independently identified as a nucleocytoplasmic shuttling transcription factor associating with the mechano-sensitive focal adhesion proteins p130Cas and zyxin. The p130Cas/zyxin/Nmp4/CIZ pathway resembles the beta-catenin/TCF/LEF1 mechanotransduction response limb, and both share features with the HMGB1 (high mobility group box 1)/RAGE (receptor for advanced glycation end products) signaling axis. Here we describe Nmp4/CIZ within the context of the PTH-induced anabolic response and consider the place of this molecule in the hierarchy of the PTH-load response network.
Karfeld-Sulzer, Lindsay S; Ghayor, Chafik; Siegenthaler, Barbara; Gjoksi, Bebeka; Pohjonen, Timo H; Weber, Franz E
2017-02-01
Guided bone regeneration (GBR) has been utilized for several decades for the healing of cranio-maxillofacial bone defects and, particularly in the dental field, by creating space with a barrier membrane to exclude soft tissue and encourage bone growth in the membrane-protected volume. Although the first membranes were non-resorbable, a new generation of GBR membranes aims to biodegrade and provide bioactivity for better overall results. The Inion GTR™ poly(lactide-co-glycolide) (PLGA) membrane is not only resorbable but also bioactive, since it includes N-methylpyrrolidone (NMP), which has been shown to promote bone regeneration. In this study, the effects of loading different amounts of NMP onto the membrane through chemical vapour deposition or dipping have been explored. In vitro release demonstrated that lower levels of NMP led to lower NMP concentrations and slower release, based on total NMP loaded in the membrane. The dipped membrane released almost all of the NMP within 15 min, leading to a high NMP concentration. For the in vivo studies in rabbits, 6 mm calvarial defects were created and left untreated or covered with an ePTFE membrane or PLGA membranes dipped in, or preloaded with, NMP. Evaluation of the bony regeneration revealed that the barrier membranes improved bony healing and that a decrease in NMP content improved the performance. Overall, we have demonstrated the potential of these PLGA membranes with a more favourable NMP release profile and the significance of exploring the effect of NMP on these PLGA membranes with regard to bone ingrowth.
Synthesis of NMP, a Fluoxetine (Prozac) Precursor, in the Introductory Organic Laboratory
NASA Astrophysics Data System (ADS)
Perrine, Daniel M.; Sabanayagam, Nathan R.; Reynolds, Kristy J.
1998-10-01
A synthesis of the immediate precursor of the widely used antidepressant fluoxetine (Prozac) is described. The procedure is short, safe, and simple enough to serve as a laboratory exercise for undergraduate students in the second semester of introductory organic chemistry and is one which will be particularly interesting to those planning a career in the health sciences. The compound synthesized is (±)-N,N-dimethyl-3-(p-trifluoromethylphenoxy)-3-phenylpropylamine, or "N-methyl Prozac" (NMP). The synthesis of NMP requires one two-hour period and a second three-hour period. In the first period, a common Mannich base, 3-dimethylaminopropiophenone, is reduced with sodium borohydride to form (±)-3-dimethylamino-1-phenylpropanol. In the second period, potassium t-butoxide is used to couple (±)-3-dimethylamino-1-phenylpropanol with p-chlorotrifluoromethylbenzene to form NMP, which is isolated as its oxalate salt. All processes use equipment and materials that are inexpensive and readily available in most undergraduate laboratories. Detailed physical data are given on NMP, including high-field DEPT 13C NMR.
These assessments will focus on the use of DCM and NMP in paint stripping. NMP exposure scenarios for this use result in inhalation and dermal exposure for consumers and workers. The low concern for environmental effects of NMP will be discussed in the assessment. In the case of ...
Childress, Paul; Philip, Binu K.; Robling, Alexander G.; Bruzzaniti, Angela; Kacena, Melissa A.; Bivi, Nicoletta; Plotkin, Lilian I.; Heller, Aaron; Bidwell, Joseph P.
2011-01-01
How parathyroid hormone (PTH) increases bone mass is unclear, but understanding this phenomenon is significant for the improvement of osteoporosis therapy. Nmp4/CIZ is a nucleocytoplasmic shuttling transcriptional repressor that suppresses PTH-induced osteoblast gene expression and hormone-stimulated gains in murine femoral trabecular bone. To further characterize Nmp4/CIZ suppression of hormone-mediated bone growth, we treated 10 wk-old Nmp4-knockout (KO) and wild-type (WT) mice with intermittent human PTH (1-34) at 30 μg/kg/day or vehicle, 7 days/wk, for 2, 3, or 7 wks. Null mice treated with hormone (7 wks) gained more vertebral and tibial cancellous bone than WT animals, paralleling the exaggerated response in the femur. Interestingly, Nmp4/CIZ suppression of this hormone-stimulated bone formation was not apparent during the first 2 wks of treatment. Consistent with the null mice's enhanced PTH-stimulated addition of trabecular bone, these animals exhibited an augmented hormone-induced increase in serum osteocalcin 3 wks into treatment. Unexpectedly, the Nmp4-KO mice displayed an osteoclast phenotype. Serum C-terminal telopeptide, a marker of bone resorption, was elevated in the null mice, irrespective of treatment. Nmp4-KO bone marrow cultures produced more osteoclasts, which exhibited elevated resorbing activity, compared to WT cultures. The expression of several genes critical to the development of both osteoblasts and osteoclasts was elevated in Nmp4-KO mice at 2 wks but not 3 wks of hormone exposure. We propose that Nmp4/CIZ dampens PTH-induced improvement of trabecular bone throughout the skeleton by transiently suppressing hormone-stimulated increases in the expression of proteins key to the required enhanced activity/number of both osteoblasts and osteoclasts. PMID:21607813
Low cost environmental sensors for Spaceflight : NMP Space Environmental Monitor (SEM) requirements
NASA Technical Reports Server (NTRS)
Garrett, Henry B.; Buehler, Martin G.; Brinza, D.; Patel, J. U.
2005-01-01
An outstanding problem in spaceflight is the lack of adequate sensors for monitoring the space environment and its effects on engineering systems. By adequate, we mean low cost in terms of mission impact (e.g., low price, low mass/size, low power, low data rate, and low design impact). The New Millennium Program (NMP) is investigating the development of such a low-cost Space Environmental Monitor (SEM) package for inclusion on its technology validation flights. This effort follows from the need by NMP to characterize the space environment during testing so that potential users can extrapolate the test results to end-use conditions. The immediate objective of this effort is to develop a small diagnostic sensor package that could be obtained from commercial sources. Environments being considered are: contamination, atomic oxygen, ionizing radiation, cosmic radiation, EMI, and temperature. This talk describes the requirements and rationale for selecting these environments and reviews a preliminary design that includes a micro-controller data logger with data storage and interfaces to the sensors and spacecraft. If successful, such a sensor package could be the basis of a unique, long-term program for monitoring the effects of the space environment on spacecraft systems.
Calorimetric investigation of polyaniline and 1-methyl 2-pyrrolidone (NMP) mixtures
NASA Astrophysics Data System (ADS)
Delmas, Geneviève
2001-03-01
Mixing polyaniline powder (Pani) with polar solvents such as 1-methyl-2-pyrrolidone (NMP) generates a strong interaction at room temperature, as indicated by the immediate development of colour and heat on contact. Stirring increases colour only if it is vigorous; it does not lead to complete solubility. The question arises of a dispersion versus a dissolution of Pani in NMP. It was found that an oxygen-free mixture undergoes a phase change in a slow temperature ramp (0.1 K/min) between 72 and 120 °C. The values of ΔH are large, exothermic, and critically dependent on the mixture history and nominal concentration. For example, under similar stirring conditions at room temperature, ΔH increases in magnitude from -2,000 to -20,000 J/g when c (nominal) decreases from 1.3 to 0.3 mg/cm³. Stirring is achieved with a magnetic stirrer, by sonication, or by gentle shaking. The UV-visible spectra of the mixtures are measured at room temperature as a function of stirring conditions and c (nominal). The values of ΔH obtained under different conditions show that the formation of a dispersion is not a prerequisite for the phase change. Reversibility of the phase change is being investigated.
The objective of this paper is to demonstrate that NMP is a viable pollution-prevention alternative to methylene chloride. Marine Corps Logistics Base (MCLB), Albany, GA, USA, was the host site for the demonstration. MCLB's primary function is maintenance of military ground supp...
NASA Astrophysics Data System (ADS)
Kojima, Hiroshi; Oka, Masahiro; Akisawa, Atsushi; Kashiwagi, Takao
The absorption heat pump using water-LiBr solution as the working fluid has been widely used for air-conditioning systems in Japan. However, it is difficult to apply the system to other uses, such as the utilization of low-temperature heat sources, refrigeration, and air-source heat pumps, because of the properties of the working fluid. 2,2,2-Trifluoroethanol (TFE)/N-methyl-2-pyrrolidone (NMP) is expected to be one of the most useful working fluids for absorption heat pumps of this kind. While a number of investigations are available on the heat transfer performance of LiBr solution, no work has been carried out to determine the heat transfer coefficient of TFE/NMP mixtures, although knowing the heat transfer performance of TFE/NMP mixtures is important for designing each element of the heat pump. In this study, nucleate pool boiling heat transfer coefficients are measured for TFE/NMP mixtures in order to evaluate the heat transfer performance in the generator, which is one of the elements of the heat pump.
Chiu, Tsan-Yu; Lao, Jeemeng; Manalansan, Bianca; Loqué, Dominique; Roux, Stanley J; Heazlewood, Joshua L
2015-11-15
Plant apyrases are nucleoside triphosphate (NTP) diphosphohydrolases (NTPDases) and have been implicated in an array of functions within the plant, including the regulation of extracellular ATP. Arabidopsis encodes a family of seven membrane-bound apyrases (AtAPY1-7) that comprise three distinct clades, all of which contain the five conserved apyrase domains. With the exception of AtAPY1 and AtAPY2, the biochemical and sub-cellular characterization of the other members is currently unavailable. In this research, we have shown that all seven Arabidopsis apyrases localize to internal membranes comprising the cis-Golgi, endoplasmic reticulum (ER) and endosome, indicating an endo-apyrase classification for the entire family. In addition, all members, with the exception of AtAPY7, can function as endo-apyrases by complementing a yeast double mutant (Δynd1Δgda1) which lacks apyrase activity. Interestingly, complementation of the mutant yeast using well-characterized human apyrases could only be accomplished by using a functional ER endo-apyrase (NTPDase6), but not the ecto-apyrase (NTPDase1). Furthermore, the substrate specificity analysis for the Arabidopsis apyrases AtAPY1-6 indicated that each member has a distinct set of preferred substrates covering various NDPs (nucleoside diphosphates) and NTPs. Combining the biochemical analysis and sub-cellular localization of the Arabidopsis apyrase family, the data suggest their possible roles in regulating endomembrane NDP/NMP (nucleoside monophosphate) homoeostasis.
Rudolph, M. G.; Veit, T. J.; Reinstein, J.
1999-01-01
Direct thermodynamic and kinetic investigations of the binding of nucleotides to the nucleoside monophosphate (NMP) site of NMP kinases have not been possible so far because a spectroscopic probe was not available. By coupling a fluorescent N-methylanthraniloyl- (mant) group to the beta-phosphate of CDP via a butyl linker, a CDP analogue [(Pbeta)MABA-CDP] was obtained that still binds specifically to the NMP site of UmpKdicty, because the base and the ribose moieties, which are involved in specific interactions, are not modified. This allows the direct determination of binding constants for its substrates in competition experiments. PMID:10631985
Liu, Chao; Lu, Qingquan; Huang, Zhiyuan; Zhang, Jian; Liao, Fan; Peng, Pan; Lei, Aiwen
2015-12-18
A novel strategy was developed to trigger ·CF3 by using in situ generated peroxide in NMP under O2 or air as the radical initiator. Radical trifluoromethylation of alkenes was achieved toward tertiary β-trifluoromethyl alcohols. Various tertiary β-trifluoromethyl alcohols can be synthesized in good yields without extra oxidants or transition metal catalysts. Preliminary mechanistic investigation revealed that O2 diffusion can influence the reaction rate.
DMAC and NMP as Electrolyte Additives for Li-Ion Cells
NASA Technical Reports Server (NTRS)
Smart, Marshall; Bugga, Ratnakumar; Lucht, Brett
2008-01-01
Dimethyl acetamide (DMAC) and N-methyl pyrrolidinone (NMP) have been found to be useful as high-temperature-resilience-enhancing additives to a baseline electrolyte used in rechargeable lithium-ion electrochemical cells. The baseline electrolyte, which was previously formulated to improve low-temperature performance, comprises LiPF6 dissolved at a concentration of 1.0 M in a mixture comprising equal volume proportions of ethylene carbonate, diethyl carbonate, and dimethyl carbonate. This and other electrolytes comprising lithium salts dissolved in mixtures of esters (including alkyl carbonates) have been studied in continuing research directed toward extending the lower limits of operating temperatures and, more recently, enhancing the high-temperature resilience of such cells. This research at earlier stages, and the underlying physical and chemical principles, were reported in numerous previous NASA Tech Briefs articles. Although these electrolytes provide excellent performance at low temperatures (typically as low as -40 C), when the affected Li-ion cells are subjected to high temperatures during storage and cycling, there occur irreversible losses of capacity accompanied by power fade and deterioration of low-temperature performance. The term "high-temperature resilience" signifies, loosely, the ability of a cell to resist such deterioration, retaining as much as possible of its initial charge/discharge capacity during operation or during storage in the fully charged condition at high temperature. For the purposes of the present development, a temperature is considered to be high if it equals or exceeds the upper limit (typically, 30 C) of the operating-temperature range for which the cells in question are generally designed.
Flick, Burkhard; Talsness, Chris E; Jäckh, Rudolf; Buesen, Roland; Klug, Stephan
2009-06-01
N-methyl-2-pyrrolidone (NMP), which undergoes extensive biotransformation, has been shown in vivo to cause developmental toxicity and, especially after oral treatment, malformations in rats and rabbits. Data are lacking as to whether the original compound or one of its main metabolites is responsible for the toxic effects observed. Therefore, the relative embryotoxicity of the parent compound and its metabolites was evaluated using rat whole embryo culture (WEC) and the balb/c 3T3 cytotoxicity test. The resulting data were evaluated using two strategies; namely, one based on using all endpoints determined in the WEC and the other including endpoints from both the WEC and the cytotoxicity test. On basis of the first analysis, the substance with the highest embryotoxic potential is NMP, followed by 5-hydroxy-N-methyl-pyrrolidone (5-HNMP), 2-hydroxy-N-methylsuccinimide (2-HMSI) and N-methylsuccinimide (MSI). Specific dysmorphogeneses induced by NMP and 5-HNMP were aberrations in the head region of the embryos, abnormal development of the second visceral arches and open neural pores. The second evaluation strategy used only two endpoints of the WEC, i.e. the no observed adverse effect concentration (NOAEC(WEC)) and the lowest concentration leading to dysmorphogenesis in 100% of the cultured embryos (IC(Max WEC)). In addition to these WEC endpoints the IC(50 3T3) from the cytotoxicity test (balb/c 3T3 fibroblasts) was included in the evaluation scheme. These three endpoints were applied to a prediction model developed during a validation study of the European Centre for the Validation of Alternative Methods (ECVAM) allowing the classification of the embryotoxic potential of each compound into three classes (non-, weakly- and strongly embryotoxic). Consistent results from both evaluation strategies were observed, whereby NMP and its metabolites revealed a direct embryotoxic potential. Hereby, only NMP and 5-HNMP induced specific embryotoxic effects and were classified as
NASA Astrophysics Data System (ADS)
Jaynes, E. T.; Bretthorst, G. Larry
2003-04-01
Foreword; Preface; Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'; Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory; References; Appendix A. Other approaches to probability theory; Appendix B. Mathematical formalities and style; Appendix C. Convolutions and cumulants.
Interaction partners for human ZNF384/CIZ/NMP4-zyxin as a mediator for p130CAS signaling?
Janssen, Hilde; Marynen, Peter. E-mail: Peter.Marynen@med.kuleuven.be
2006-04-15
Transcription factor ZNF384/CIZ/NMP4 was first cloned in rat as a p130Cas-binding protein and has a role in bone metabolism and spermatogenesis. It is recurrently involved in translocations in acute lymphoblastic leukemia. Translocations t(12;17) and t(12;22) fuse ZNF384 to RNA-binding proteins TAF15 and EWSR1, while a translocation t(12;19) generates an E2A/ZNF384 fusion. We screened for ZNF384 interacting proteins using yeast two-hybrid technology. In contrast to its rat homolog, human ZNF384 does not interact with p130CAS. Zyxin, PCBP1, and vimentin, however, were identified as ZNF384-binding partners. Given the interaction between human zyxin and p130CAS, these results suggest that zyxin indirectly enables the interaction of ZNF384 with p130CAS which is described in rat.
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
the following conditions: CP1. µ(U|U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2|U) = µ(V1|U) + µ(V2|U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V|U) = µ(V|X) × µ(X|U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(·|U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
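For a finite space, the three conditions CP1 to CP3 can be checked mechanically. A minimal Python sketch under illustrative assumptions (the space W, the weights, and all function names below are ours, not from the paper):

```python
from itertools import chain, combinations

# A small finite space W with a strictly positive probability measure.
W = frozenset({1, 2, 3, 4})
weights = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

def mu(V):
    """Unconditional probability of an event V ⊆ W."""
    return sum(weights[w] for w in V)

def cond(V, U):
    """Conditional probability µ(V | U), defined when µ(U) > 0."""
    return mu(V & U) / mu(U)

def subsets(S):
    s = list(S)
    return [frozenset(c)
            for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

events = subsets(W)
nonnull = [U for U in events if mu(U) > 0]

# CP1: µ(U | U) = 1 for every non-null conditioning event.
assert all(abs(cond(U, U) - 1) < 1e-12 for U in nonnull)

# CP2: finite additivity in the first argument for disjoint V1, V2.
for U in nonnull:
    for V1 in events:
        for V2 in events:
            if not (V1 & V2):
                assert abs(cond(V1 | V2, U) - (cond(V1, U) + cond(V2, U))) < 1e-12

# CP3: the chain rule µ(V | U) = µ(V | X) · µ(X | U) for V ⊆ X ⊆ U.
for U in nonnull:
    for X in nonnull:
        if X <= U:
            for V in events:
                if V <= X:
                    assert abs(cond(V, U) - cond(V, X) * cond(X, U)) < 1e-12
```

With a strictly positive measure, CP3 holds non-vacuously; the paper's point is that when F′ is restricted, CP3 can hold vacuously, as in the truncated passage above.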
Jana, Biman; Adkar, Bharat V; Biswas, Rajib; Bagchi, Biman
2011-01-21
The catalytic conversion of adenosine triphosphate (ATP) and adenosine monophosphate (AMP) to adenosine diphosphate (ADP) by adenylate kinase (ADK) involves large-amplitude, ligand-induced motions of the ATP-binding (LID) and AMP-binding (NMP) domains, which open and close during the repeated catalytic cycle. We discover and analyze an interesting dynamical coupling between the motions of the two domains during opening, using large-scale atomistic molecular dynamics trajectory analysis, covariance analysis, and multidimensional free energy calculations with explicit water. Initially, the LID domain must open by a certain amount before the NMP domain can begin to open. The dynamical correlation map shows an interesting cross-peak between the LID and NMP domains, suggesting the presence of correlated motion between them. This is also reflected in our calculated two-dimensional free energy surface contour diagram, which has an interesting elliptic shape, revealing a strong correlation between the opening of the LID domain and that of the NMP domain. The free energy surface of the LID domain motion is rugged due to interaction with water, and the signature of this ruggedness is evident in the observed root mean square deviation variation and its fluctuation time correlation functions. We develop a correlated dynamical-disorder-type theoretical model to explain the observed dynamic coupling between the motions of the two domains in ADK. Our model correctly reproduces several features of the cross-correlation observed in simulations.
Confidence Probability versus Detection Probability
Axelrod, M
2005-08-18
In a discovery sampling activity the auditor seeks to vet an inventory by measuring (or inspecting) a random sample of items from the inventory. When the auditor finds every sample item in compliance, he must then make a confidence statement about the whole inventory. For example, the auditor might say: "We believe that this inventory of 100 items contains no more than 5 defectives with 95% confidence." Note this is a retrospective statement in that it asserts something about the inventory after the sample was selected and measured. Contrast this to the prospective statement: "We will detect the existence of more than 5 defective items in this inventory with 95% probability." The former uses confidence probability while the latter uses detection probability. For a given sample size, the two probabilities need not be equal; indeed, they could differ significantly. Both these probabilities critically depend on the auditor's prior belief about the number of defectives in the inventory and how he defines non-compliance. In other words, the answer strongly depends on how the question is framed.
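The detection probability in the abstract's example follows directly from the hypergeometric distribution. A minimal sketch under one framing of that example (the function name and the choice of "worst case is exactly 6 defectives" are our assumptions; the abstract's point is precisely that the answer depends on such framing):

```python
from math import comb

def detection_probability(N, d, n):
    # Probability that a simple random sample of n items from an inventory
    # of N contains at least one of d defectives: one minus the
    # hypergeometric chance that the sample misses all d of them.
    return 1 - comb(N - d, n) / comb(N, n)

# To "detect the existence of more than 5 defective items" among N = 100
# with 95% probability, the hardest case is exactly d = 6 defectives.
needed = min(n for n in range(1, 101)
             if detection_probability(100, 6, n) >= 0.95)
```

Under this framing a sample of 39 items suffices; a differently framed question (a different prior over the number of defectives, or a different non-compliance threshold) changes the required sample size, which is the abstract's central point.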
Ruan, Lifang; Pleitner, Aaron; Gänzle, Michael G.; McMullen, Lynn M.
2011-01-01
This study aimed to elucidate determinants of heat resistance in Escherichia coli by comparing the composition of membrane lipids, as well as gene expression, in heat-resistant E. coli AW1.7 and heat-sensitive E. coli GGG10 with or without heat shock. The survival of E. coli AW1.7 at late exponential phase was 100-fold higher than that of E. coli GGG10 after incubation at 60°C for 15 min. The cytoplasmic membrane of E. coli AW1.7 contained a higher proportion of saturated and cyclopropane fatty acids than that of E. coli GGG10. Microarray hybridization of cDNA libraries obtained from exponentially growing or heat-shocked cultures was performed to compare gene expression in these two strains. Expression of selected genes from different functional groups was quantified by quantitative PCR. DnaK and 30S and 50S ribosomal subunits were overexpressed in E. coli GGG10 relative to E. coli AW1.7 upon heat shock at 50°C, indicating improved ribosome stability. The outer membrane porin NmpC and several transport proteins were overexpressed in exponentially growing E. coli AW1.7. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis of membrane properties confirmed that NmpC is present in the outer membrane of E. coli AW1.7 but not in that of E. coli GGG10. Expression of NmpC in E. coli GGG10 increased survival at 60°C 50- to 1,000-fold. In conclusion, the outer membrane porin NmpC contributes to heat resistance in E. coli AW1.7, but the heat resistance of this strain is dependent on additional factors, which likely include the composition of membrane lipids, as well as solute transport proteins. PMID:21398480
Jia, Hongying; Gao, Picheng; Ma, Hongmin; Wu, Dan; Du, Bin; Wei, Qin
2015-02-01
A novel label-free amperometric immunosensor for sensitive detection of nuclear matrix protein 22 (NMP22) was developed based on Au-Pt bimetallic nanostructures, which were prepared by combining top-down with bottom-up strategies. Nanoporous gold (NPG) was prepared by "top-down" dealloying of commercial Au/Ag alloy film. After deposition of NPG on an electrode, Pt nanoparticles (PtNPs) were further decorated on NPG by "bottom-up" electrodeposition. The prepared bimetallic nanostructures combine the merits of both NPG and PtNPs, and show a high electrocatalytic activity towards the reduction of H2O2. The label-free immunosensor was constructed by directly immobilizing antibody of NMP22 (anti-NMP22) on the surface of the bimetallic nanostructures. The immunoreaction-induced amperometric response could be detected and was negatively correlated to the concentration of NMP22. Bimetallic nanostructure morphologies and detection conditions were investigated to obtain the best sensing performance. Under the optimal conditions, a linear range from 0.01 ng/mL to 10 ng/mL and a detection limit of 3.33 pg/mL were obtained. The proposed immunosensor showed high sensitivity, good selectivity, stability, reproducibility, and regenerability for the detection of NMP22, and it was evaluated in urine samples with satisfactory results.
ERIC Educational Resources Information Center
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
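A classic instance (our choice of example, not necessarily one of the article's three) is the derangement problem: the probability that a random permutation of n items leaves no item in its original place tends to 1/e, and the inclusion-exclusion sum converges so fast that the limit is visible at small n:

```python
from math import e, factorial

def derangement_probability(n):
    # P(no fixed point in a random permutation of n items)
    # = sum_{k=0}^{n} (-1)^k / k!   (inclusion-exclusion)
    return sum((-1) ** k / factorial(k) for k in range(n + 1))

# The alternating-series truncation error is below 1/(n+1)!,
# so n = 10 already agrees with 1/e to about eight decimal places.
gap = abs(derangement_probability(10) - 1 / e)
assert gap < 1e-7
```

For n = 4 the exact value is 9/24 = 0.375, already within 2% of 1/e ≈ 0.3679.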
Probability and Relative Frequency
NASA Astrophysics Data System (ADS)
Drieschner, Michael
2016-01-01
The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
ERIC Educational Resources Information Center
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
TAF15 and the leukemia-associated fusion protein TAF15-CIZ/NMP4 are cleaved by caspases-3 and -7
Alves, Juliano; Wurdak, Heiko; Garay-Malpartida, Humberto M.; Harris, Jennifer L.; Occhiucci, Joao M.; Belizario, Jose E.; Li, Jun
2009-07-10
Caspases are central players in proteolytic pathways that regulate cellular processes such as apoptosis and differentiation. To accelerate the discovery of novel caspase substrates we developed a method combining in silico screening and in vitro validation. With this approach, we identified TAF15 as a novel caspase substrate in a trial study. We find that TAF15 was specifically cleaved by caspases-3 and -7. Site-directed mutagenesis revealed the consensus sequence {sup 106}DQPD/Y{sup 110} as the only site recognized by these caspases. Surprisingly, TAF15 was cleaved at more than one site in staurosporine-treated Jurkat cells. In addition, we generated two oncogenic TAF15-CIZ/NMP4-fused proteins which have been found in acute myeloid leukemia and demonstrate that caspases-3 and -7 cleave the fusion proteins at one single site. Broad application of this combination approach should expedite identification of novel caspase-interacting proteins and provide new insights into the regulation of caspase pathways leading to cell death in normal and cancer cells.
Dynamical Simulation of Probabilities
NASA Technical Reports Server (NTRS)
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which the joint probability does not exist. Simulations of quantum probabilities are also discussed.
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
NASA Astrophysics Data System (ADS)
Laktineh, Imad
2010-04-01
This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
Probability of satellite collision
NASA Technical Reports Server (NTRS)
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
(*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION
ERIC Educational Resources Information Center
Barnes, Bernis, Ed.; And Others
This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…
Teachers' Understandings of Probability
ERIC Educational Resources Information Center
Liu, Yan; Thompson, Patrick
2007-01-01
Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…
NASA Technical Reports Server (NTRS)
Soneira, R. M.; Bahcall, J. N.
1981-01-01
Probabilities are calculated for acquiring suitable guide stars (GS) with the fine guidance system (FGS) of the space telescope. A number of the considerations and techniques described are also relevant for other space astronomy missions. The constraints of the FGS are reviewed. The available data on bright star densities are summarized and a previous error in the literature is corrected. Separate analytic and Monte Carlo calculations of the probabilities are described. A simulation of space telescope pointing is carried out using the Weistrop north galactic pole catalog of bright stars. Sufficient information is presented so that the probabilities of acquisition can be estimated as a function of position in the sky. The probability of acquiring suitable guide stars is greatly increased if the FGS can allow an appreciable difference between the (bright) primary GS limiting magnitude and the (fainter) secondary GS limiting magnitude.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Rationalizing Hybrid Earthquake Probabilities
NASA Astrophysics Data System (ADS)
Gomberg, J.; Reasenberg, P.; Beeler, N.; Cocco, M.; Belardinelli, M.
2003-12-01
An approach to including stress transfer and frictional effects in estimates of the probability of failure of a single fault affected by a nearby earthquake has been suggested in Stein et al. (1997). This `hybrid' approach combines conditional probabilities, which depend on the time elapsed since the last earthquake on the affected fault, with Poissonian probabilities that account for friction and depend only on the time since the perturbing earthquake. The latter are based on the seismicity rate change model developed by Dieterich (1994) to explain the temporal behavior of aftershock sequences in terms of rate-state frictional processes. The model assumes an infinite population of nucleation sites that are near failure at the time of the perturbing earthquake. In the hybrid approach, assuming the Dieterich model can lead to significant transient increases in failure probability. We explore some of the implications of applying the Dieterich model to a single fault and its impact on the hybrid probabilities. We present two interpretations that we believe can rationalize the use of the hybrid approach. In the first, a statistical distribution representing uncertainties in elapsed and/or mean recurrence time on the fault serves as a proxy for Dieterich's population of nucleation sites. In the second, we imagine a population of nucleation patches distributed over the fault with a distribution of maturities. In both cases we find that the probability depends on the time since the last earthquake. In particular, the size of the transient probability increase may only be significant for faults already close to failure. Neglecting the maturity of a fault may lead to overestimated rate and probability increases.
Asteroidal collision probabilities
NASA Astrophysics Data System (ADS)
Bottke, W. F.; Greenberg, R.
1993-05-01
Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.
Probabilities in implicit learning.
Tseng, Philip; Hsu, Tzu-Yu; Tzeng, Ovid J L; Hung, Daisy L; Juan, Chi-Hung
2011-01-01
The visual system possesses a remarkable ability in learning regularities from the environment. In the case of contextual cuing, predictive visual contexts such as spatial configurations are implicitly learned, retained, and used to facilitate visual search-all without one's subjective awareness and conscious effort. Here we investigated whether implicit learning and its facilitatory effects are sensitive to the statistical property of such implicit knowledge. In other words, are highly probable events learned better than less probable ones even when such learning is implicit? We systematically varied the frequencies of context repetition to alter the degrees of learning. Our results showed that search efficiency increased consistently as contextual probabilities increased. Thus, the visual contexts, along with their probability of occurrences, were both picked up by the visual system. Furthermore, even when the total number of exposures was held constant between each probability, the highest probability still enjoyed a greater cuing effect, suggesting that the temporal aspect of implicit learning is also an important factor to consider in addition to the effect of mere frequency. Together, these findings suggest that implicit learning, although bypassing observers' conscious encoding and retrieval effort, behaves much like explicit learning in the sense that its facilitatory effect also varies as a function of its associative strengths.
NASA Technical Reports Server (NTRS)
Bollenbacher, Gary; Guptill, James D.
1999-01-01
This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
Experimental Probability in Elementary School
ERIC Educational Resources Information Center
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
A Unifying Probability Example.
ERIC Educational Resources Information Center
Maruszewski, Richard F., Jr.
2002-01-01
Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
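The same chain of topics can be illustrated outside a spreadsheet. A Python sketch (standing in for the article's Excel illustration; the variable names and parameter values are ours) computes the binomial mean and variance directly from the pmf and checks them against the sum-of-Bernoullis formulas np and np(1 − p):

```python
from math import comb

def binomial_pmf(n, p):
    # Full probability mass function of Binomial(n, p).
    return [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

n, p = 20, 0.3
pmf = binomial_pmf(n, p)
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf))

# A Binomial(n, p) variable is a sum of n independent Bernoulli(p)
# variables, so its mean and variance are the sums of the Bernoulli ones.
assert abs(mean - n * p) < 1e-9           # n·p = 6.0
assert abs(var - n * p * (1 - p)) < 1e-9  # n·p·(1 − p) = 4.2
```

The central limit theorem then says this distribution is approximately Normal(6.0, 4.2) for large n, tying together all the topics the article lists.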
ERIC Educational Resources Information Center
Varga, Tamas
This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…
Univariate Probability Distributions
ERIC Educational Resources Information Center
Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.
2012-01-01
We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…
Approximating Integrals Using Probability
ERIC Educational Resources Information Center
Maruszewski, Richard F., Jr.; Caudle, Kyle A.
2005-01-01
As part of a discussion on Monte Carlo methods, this article outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
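The technique itself is compact: if U is uniform on [0, 1], then E[f(U)] = ∫₀¹ f(x) dx, so a sample mean of f over uniform draws estimates the integral. A Python sketch of the same idea (the article uses Visual Basic; the function name is ours):

```python
import random

def mc_integral(f, n, seed=0):
    # Estimate the integral of f over [0, 1] as the sample mean of f(U)
    # across n independent uniform draws U.
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

# Example: the integral of x^2 over [0, 1] is exactly 1/3; with
# n = 100,000 draws the standard error is roughly 0.001.
est = mc_integral(lambda x: x * x, 100_000)
```

The error shrinks like 1/√n regardless of the dimension of the integral, which is what makes the method attractive beyond one-dimensional examples.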
NASA Astrophysics Data System (ADS)
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Superpositions of probability distributions
NASA Astrophysics Data System (ADS)
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
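The heavy-tail effect of such superpositions can be seen from moments alone. For an equal-weight mixture of two centered Gaussians, E[x²] = (v₁ + v₂)/2 and E[x⁴] = 3(v₁² + v₂²)/2, so the kurtosis exceeds the Gaussian value 3 whenever the variances differ. A short check (the variance values are illustrative, not from the paper):

```python
# Moments of an equal-weight superposition of two centered Gaussians
# with variances v1 and v2.
v1, v2 = 1.0, 4.0
m2 = (v1 + v2) / 2              # second moment of the mixture
m4 = 3 * (v1 ** 2 + v2 ** 2) / 2  # fourth moment: E[x^4] = 3 v^2 per component
kurtosis = m4 / m2 ** 2

# A single Gaussian has kurtosis exactly 3; any genuine superposition of
# distinct variances exceeds it, i.e. the mixture is heavy-tailed, which
# is one reason such distributions appear in financial-market models.
assert kurtosis > 3
```

Here the kurtosis works out to 25.5 / 6.25 = 4.08, strictly above the Gaussian value.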
Efficient Probability Sequences
2014-08-18
Troutman, B.M.; Karlinger, M.R.
2003-01-01
The T-year annual maximum flood at a site is defined to be that streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on the average every 4.5 years.
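The dependence of the RFP on the number of sites and on spatial correlation can be sketched with a simple equicorrelated Gaussian Monte Carlo (an illustration of the general idea only, not the paper's procedure; the one-factor model and all names are our assumptions). Site i sees a T-year flood in a year when its standardized annual maximum exceeds the (1 − 1/T) normal quantile:

```python
import random
from statistics import NormalDist

def regional_flood_prob(k, T, rho, n_sims=100_000, seed=0):
    """Monte Carlo estimate of the RFP: the probability that at least one
    of k sites experiences a T-year flood in a given year, with annual
    maxima modeled as equicorrelated (one-factor) Gaussian variables."""
    rng = random.Random(seed)
    z = NormalDist().inv_cdf(1 - 1 / T)     # T-year exceedance threshold
    a, b = rho ** 0.5, (1 - rho) ** 0.5     # factor loadings
    hits = 0
    for _ in range(n_sims):
        common = rng.gauss(0, 1)            # shared regional component
        if any(a * common + b * rng.gauss(0, 1) > z for _ in range(k)):
            hits += 1
    return hits / n_sims

indep = regional_flood_prob(k=10, T=100, rho=0.0)
corr = regional_flood_prob(k=10, T=100, rho=0.8)
```

With independent sites the RFP is exactly 1 − (1 − 1/T)^k ≈ 0.096 for k = 10 and T = 100; positive spatial correlation pulls the estimate down toward the single-site value 1/T, which is the qualitative dependence the paper quantifies with real peak-flow correlations.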
Retrieve Tether Survival Probability
2007-11-02
cuts of the tether by meteorites and orbital debris, is calculated to be 99.934% for the planned experiment duration of six months or less. This is...due to the unlikely event of a strike by a large piece of orbital debris greater than 1 meter in size cutting all the lines of the tether at once. The...probability of the tether surviving multiple cuts by meteoroid and orbital debris impactors smaller than 5 cm in diameter is 99.9993% at six months
People's conditional probability judgments follow probability theory (plus noise).
Costello, Fintan; Watts, Paul
2016-09-01
A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
ERIC Educational Resources Information Center
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Coherent Assessment of Subjective Probability
1981-03-01
known results of de Finetti (1937, 1972, 1974), Smith (1961), and Savage (1971) and some recent results of Lindley (1980) concerning the use of...provides the motivation for de Finetti's definition of subjective probabilities as coherent bet prices. From the definition of the probability measure...subjective probability, the probability laws which are traditionally stated as axioms or definitions are obtained instead as theorems.
Probabilities of transversions and transitions.
Vol'kenshtein, M V
1976-01-01
The values of the mean relative probabilities of transversions and transitions have been refined on the basis of the data collected by Jukes and found to be equal to 0.34 and 0.66, respectively. Evolutionary factors increase the probability of transversions to 0.44. The relative probabilities of individual substitutions have been determined, and a detailed classification of the nonsense mutations has been given. Such mutations are especially probable in the UGG (Trp) codon. The highest probability of AG, GA transitions correlates with the lowest mean change in the hydrophobic nature of the amino acids coded.
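The transition/transversion classification behind these figures is mechanical: purine↔purine or pyrimidine↔pyrimidine substitutions are transitions, cross-class substitutions are transversions. The short sketch below (illustrative names) enumerates the 12 possible single-base substitutions, showing that a uniform per-substitution model would put probability 8/12 ≈ 0.67 on transversions, nearly the reverse of the refined empirical values (0.34 transversions, 0.66 transitions).

```python
PURINES, PYRIMIDINES = {"A", "G"}, {"C", "U"}  # RNA alphabet, as in the UGG example

def substitution_type(x, y):
    """Classify a single-base substitution x -> y."""
    if x == y:
        raise ValueError("not a substitution")
    same_class = ({x, y} <= PURINES) or ({x, y} <= PYRIMIDINES)
    return "transition" if same_class else "transversion"

bases = sorted(PURINES | PYRIMIDINES)
counts = {"transition": 0, "transversion": 0}
for x in bases:
    for y in bases:
        if x != y:
            counts[substitution_type(x, y)] += 1
print(counts)  # {'transition': 4, 'transversion': 8}
```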
Probability workshop to be better in probability topic
NASA Astrophysics Data System (ADS)
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not lead to a lower score in probability performance. The study also revealed that motivated students benefited from the probability workshop: their performance in the probability topic showed a positive improvement compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Propensity, Probability, and Quantum Theory
NASA Astrophysics Data System (ADS)
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
The Probabilities of Unique Events
2012-08-30
probabilities into quantum mechanics, and some psychologists have argued that they have a role to play in accounting for errors in judgment [30]. But, in...Discussion The mechanisms underlying naive estimates of the probabilities of unique events are largely inaccessible to consciousness, but they...Can quantum probability provide a new direction for cognitive modeling? Behavioral and Brain Sciences (in press). 31. Paolacci G, Chandler J
Probability Surveys, Conditional Probability, and Ecological Risk Assessment
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...
Information Processing Using Quantum Probability
NASA Astrophysics Data System (ADS)
Behera, Laxmidhar
2006-11-01
This paper presents an information processing paradigm that introduces collective response of multiple agents (computational units) while the level of intelligence associated with the information processing has been increased manifold. It is shown that if the potential field of the Schroedinger wave equation is modulated using a self-organized learning scheme, then the probability density function associated with the stochastic data is transferred to the probability amplitude function which is the response of the Schroedinger wave equation. This approach illustrates that information processing of data with stochastic behavior can be efficiently done using quantum probability instead of classical probability. The proposed scheme has been demonstrated through two applications: denoising and adaptive control.
The relationship between species detection probability and local extinction probability
Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.
2004-01-01
In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
Capture probabilities for secondary resonances
NASA Technical Reports Server (NTRS)
Malhotra, Renu
1990-01-01
A perturbed pendulum model is used to analyze secondary resonances, and it is shown that a self-similarity between secondary and primary resonances exists. Henrard's (1982) theory is used to obtain formulas for the capture probability into secondary resonances. The tidal evolution of Miranda and Umbriel is considered as an example, and significant probabilities of capture into secondary resonances are found.
Definition of the Neutrosophic Probability
NASA Astrophysics Data System (ADS)
Smarandache, Florentin
2014-03-01
Neutrosophic probability (or likelihood) [1995] is a particular case of the neutrosophic measure. It is an estimation that an event (different from indeterminacy) occurs, together with an estimation that some indeterminacy may occur, and an estimation that the event does not occur. Classical probability deals with fair dice, coins, roulettes, spinners, decks of cards, random walks, while neutrosophic probability deals with unfair or imperfect versions of such objects and processes. For example, if we toss a regular die on an irregular surface which has cracks, then it is possible to get the die stuck on one of its edges or vertices in a crack (an indeterminate outcome). The sample space is in this case {1, 2, 3, 4, 5, 6, indeterminacy}. So the probability of getting, for example, 1 is less than 1/6, since there are seven outcomes. Neutrosophic probability is a generalization of classical probability because, when the chance of indeterminacy of a stochastic process is zero, the two probabilities coincide. The neutrosophic probability that an event A occurs is NP(A) = (ch(A), ch(indetA), ch(notA)) = (T, I, F), where T, I, F are subsets of [0,1]: T is the chance that A occurs, denoted ch(A); I is the indeterminate chance related to A, ch(indetA); and F is the chance that A does not occur, ch(notA). So NP is a generalization of imprecise probability as well. If T, I, and F are crisp numbers then -0 <= T + I + F <= 3+ (the neutrosophic nonstandard bounds). We used the same notations (T, I, F) as in neutrosophic logic and set theory.
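The die-in-a-crack example admits a one-line crisp computation. The sketch below (hypothetical function name) covers only the special case where T, I, F are crisp numbers and the description is complete, so they sum to 1; the general neutrosophic case allows subsets and looser bounds.

```python
from fractions import Fraction

def neutrosophic_chance_of_face(face_share, indeterminacy):
    """(T, I, F) for one die face when the toss sticks in a crack
    (indeterminate outcome) with chance `indeterminacy`."""
    i = Fraction(indeterminacy)
    t = (1 - i) * Fraction(face_share)  # chance the face actually occurs
    f = 1 - i - t                       # chance it determinately does not occur
    return t, i, f

# Fair die, 1/10 chance of sticking in a crack.
T, I, F = neutrosophic_chance_of_face(Fraction(1, 6), Fraction(1, 10))
print(T, I, F)                              # 3/20 1/10 3/4
print(T < Fraction(1, 6), T + I + F == 1)   # True True
```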
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
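The location-scale result can be checked numerically. The sketch below (illustrative names; log-normal losses with a plug-in threshold fitted from n past observations) shows the realized failure frequency exceeding the nominal level, and agreeing across different true parameters when the same random draws are reused, because the failure event is invariant under location-scale changes.

```python
import random
import statistics

def realized_failure_prob(p_nominal, n, mu, sigma, trials=20_000, seed=3):
    """Frequency with which a fresh log-normal loss exceeds a control
    threshold fitted from n past observations at nominal level p_nominal."""
    rng = random.Random(seed)
    z = statistics.NormalDist().inv_cdf(1 - p_nominal)
    failures = 0
    for _ in range(trials):
        logs = [rng.gauss(mu, sigma) for _ in range(n)]
        m, s = statistics.fmean(logs), statistics.stdev(logs)
        if rng.gauss(mu, sigma) > m + z * s:  # next loss vs plug-in threshold
            failures += 1
    return failures / trials

# Realized failure probability exceeds the nominal 5% (parameter
# uncertainty raises failure frequency); with the same seed the two calls
# agree because the failure event does not depend on the true (mu, sigma).
print(realized_failure_prob(0.05, 20, mu=0.0, sigma=1.0))
print(realized_failure_prob(0.05, 20, mu=3.0, sigma=0.5))
```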
Cluster membership probability: polarimetric approach
NASA Astrophysics Data System (ADS)
Medhi, Biman J.; Tamura, Motohide
2013-04-01
Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q (per cent) and u (per cent) for the proper-motion member stars depends on the interstellar and intracluster differential reddening in the open cluster. It is found that this method could be used to estimate the cluster membership probability if we have additional polarimetric and photometric information for a star to identify it as a probable member/non-member of a particular cluster, such as the maximum wavelength value (λmax), the unit weight error of the fit (σ1), the dispersion in the polarimetric position angles (ε̄), reddening (E(B - V)) or the differential intracluster reddening (ΔE(B - V)). This method could also be used to estimate the membership probability of known member stars having no membership probability as well as to resolve disagreements about membership among different proper-motion surveys.
Holographic Probabilities in Eternal Inflation
NASA Astrophysics Data System (ADS)
Bousso, Raphael
2006-11-01
In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.
Dinosaurs, Dinosaur Eggs, and Probability.
ERIC Educational Resources Information Center
Teppo, Anne R.; Hodgson, Ted
2001-01-01
Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)
Joint probabilities and quantum cognition
NASA Astrophysics Data System (ADS)
de Barros, J. Acacio
2012-12-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
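One way to build such intervals by simulation (a sketch under assumed settings, not necessarily the paper's construction): estimate per-order-statistic intervals from Monte Carlo replicates, then measure simultaneous coverage, which falls short of the pointwise level and so must be calibrated to reach 1-α.

```python
import random

def order_stat_bands(n, gamma, B=4000, seed=5):
    """Per-order-statistic (1 - gamma) empirical intervals for sorted
    standard-normal samples of size n, from B Monte Carlo replicates."""
    rng = random.Random(seed)
    cols = [[] for _ in range(n)]
    for _ in range(B):
        for i, v in enumerate(sorted(rng.gauss(0, 1) for _ in range(n))):
            cols[i].append(v)
    lo, hi = [], []
    for col in cols:
        col.sort()
        lo.append(col[int(B * gamma / 2)])
        hi.append(col[int(B * (1 - gamma / 2)) - 1])
    return lo, hi

def simultaneous_coverage(lo, hi, n, B=4000, seed=6):
    """Fraction of fresh normal samples whose sorted values all fall
    inside the bands -- the plot-based test's acceptance probability."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(B):
        xs = sorted(rng.gauss(0, 1) for _ in range(n))
        ok += all(l <= x <= h for x, l, h in zip(xs, lo, hi))
    return ok / B

lo, hi = order_stat_bands(n=20, gamma=0.05)
cov = simultaneous_coverage(lo, hi, n=20)
print(cov)  # below 0.95: pointwise bands must be widened (gamma shrunk)
            # until the simultaneous coverage reaches 1 - alpha
```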
Detonation probabilities of high explosives
Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.
1995-07-01
The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
Interference of probabilities in dynamics
Zak, Michail
2014-08-15
A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.
Knowledge typology for imprecise probabilities.
Wilson, G. D.; Zucker, L. J.
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Stretching Probability Explorations with Geoboards
ERIC Educational Resources Information Center
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
On probability-possibility transformations
NASA Technical Reports Server (NTRS)
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
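Two standard probability-to-possibility transformations (shown for context; not necessarily the ones compared in the paper) can be stated in a few lines. Both are consistent (each possibility dominates the corresponding probability) and order-preserving, but they distribute possibility mass differently.

```python
def poss_dubois_prade(p):
    """Dubois-Prade transform: pi_i is the sum of all p_j <= p_i
    (for a distribution sorted in decreasing order, the tail sum)."""
    return [sum(q for q in p if q <= pi) for pi in p]

def poss_ratio(p):
    """Maximum-normalization transform: pi_i = p_i / max(p)."""
    m = max(p)
    return [pi / m for pi in p]

p = [0.5, 0.3, 0.2]
for name, pi in [("Dubois-Prade", poss_dubois_prade(p)), ("ratio", poss_ratio(p))]:
    # Both satisfy pi_i >= p_i and preserve the ordering of p.
    print(name, [round(x, 2) for x in pi])
```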
Children's Understanding of Posterior Probability
ERIC Educational Resources Information Center
Girotto, Vittorio; Gonzalez, Michael
2008-01-01
Do young children have a basic intuition of posterior probability? Do they update their decisions and judgments in the light of new evidence? We hypothesized that they can do so extensionally, by considering and counting the various ways in which an event may or may not occur. The results reported in this paper showed that from the age of five,…
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.
Probability Simulation in Middle School.
ERIC Educational Resources Information Center
Lappan, Glenda; Winter, M. J.
1980-01-01
Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-and-one foul-shot situation in basketball; the second deals with collecting a set of six cereal-box prizes by buying boxes containing one toy each. (MP)
GPS: Geometry, Probability, and Statistics
ERIC Educational Resources Information Center
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Time-dependent earthquake probabilities
Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.
2005-01-01
We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have cast these in a framework based on a simple, generalized rate change formulation and applied it to the two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault; in the former, the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (density function, or PDF) that describes some population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
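The single-fault conditional probability in application (2), with a static stress step folded in via the common clock-advance approximation, can be sketched as follows. The recurrence distribution (log-normal), its parameters, and the loading rate are all hypothetical illustration values, not the paper's.

```python
import math
import statistics

def cond_failure_prob(t_elapsed, dt, median, cov):
    """P(failure in (t, t+dt] | quiet through t) for a log-normal
    recurrence-time PDF with the given median and coefficient of variation."""
    sigma = math.sqrt(math.log(1 + cov ** 2))  # log-normal shape from CV
    dist = statistics.NormalDist(math.log(median), sigma)
    F = lambda t: dist.cdf(math.log(t))        # recurrence-time CDF
    return (F(t_elapsed + dt) - F(t_elapsed)) / (1 - F(t_elapsed))

# Hypothetical numbers: 100-yr median recurrence, 120 yr elapsed, 30-yr window.
p0 = cond_failure_prob(120, 30, median=100, cov=0.5)
# Clock-advance approximation: a static stress step d_tau is equivalent to
# advancing the elapsed time by d_tau / (tectonic stressing rate).
p1 = cond_failure_prob(120 + 0.5 / 0.01, 30, median=100, cov=0.5)
print(round(p0, 3), round(p1, 3))  # the stress increase raises the probability
```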
Understanding Y haplotype matching probability.
Brenner, Charles H
2014-01-01
The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
Fact Sheet: N-Methylpyrrolidone (NMP)
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemicals substances in commercial use.
Probability summation--a critique.
Laming, Donald
2013-03-01
This Discussion Paper seeks to kill off probability summation, specifically the high-threshold assumption, as an explanatory idea in visual science. In combination with a Weibull function with an exponent of about 4, probability summation can accommodate, to within the limits of experimental error, the shape of the detectability function for contrast, the reduction in threshold that results from the combination of widely separated grating components, summation with respect to duration at threshold, and some instances, but not all, of spatial summation. But it has repeated difficulty with stimuli below threshold, because it denies the availability of input from such stimuli. All the phenomena listed above, and many more, can be accommodated equally accurately by signal-detection theory combined with an accelerated nonlinear transform of small, near-threshold, contrasts. This is illustrated with a transform that is the fourth power for the smallest contrasts, but tends to linear above threshold. Moreover, this particular transform can be derived from elementary properties of sensory neurons. Probability summation cannot be regarded as a special case of a more general theory, because it depends essentially on the 19th-century notion of a high fixed threshold. It is simply an obstruction to further progress.
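The probability-summation rule the abstract argues against can be written down in a few lines. A minimal sketch, assuming a Weibull psychometric function with an exponent of about 4 and independent detection channels; all parameter values here are illustrative, not taken from the paper:

```python
import math

def weibull_detect(c, alpha=1.0, beta=4.0):
    """Weibull psychometric function: detection probability for contrast c.
    alpha is the threshold contrast, beta the exponent (illustrative values)."""
    return 1.0 - math.exp(-((c / alpha) ** beta))

def probability_summation(contrasts, alpha=1.0, beta=4.0):
    """High-threshold probability summation over independent channels:
    the compound stimulus is detected if any one channel detects its component."""
    p_miss_all = 1.0
    for c in contrasts:
        p_miss_all *= 1.0 - weibull_detect(c, alpha, beta)
    return 1.0 - p_miss_all

# Two widely separated grating components, each at 84% of threshold contrast:
p_single = weibull_detect(0.84)
p_combined = probability_summation([0.84, 0.84])
```

Combining the two components raises the predicted detection probability above that of either component alone, which is the summation effect the high-threshold account trades on.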
Objective Probability and Quantum Fuzziness
NASA Astrophysics Data System (ADS)
Mohrhoff, U.
2009-02-01
This paper offers a critique of the Bayesian interpretation of quantum mechanics with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the “objective preparations view” or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes. Most important among these are the objective fuzziness of all relative positions and momenta and the consequent incomplete spatiotemporal differentiation of the physical world. The latter makes it possible to draw a clear distinction between the macroscopic and the microscopic. This in turn makes it possible to understand the special status of measurements in all standard formulations of the theory. Whereas Bayesians have written contemptuously about the “folly” of conjoining “objective” to “probability,” there are various reasons why quantum-mechanical probabilities can be considered objective, not least the fact that they are needed to quantify an objective fuzziness. But this cannot be appreciated without giving thought to the makeup of the world, which
Probability for Weather and Climate
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
NASA Astrophysics Data System (ADS)
Woltjer, L.
1987-06-01
At the meeting held in December of last year I informed the Council of my wish to end my contract as Director General of ESO once the VLT project had been approved, which is expected to happen towards the end of this year. When my appointment was renewed three years ago, the Council knew of my intention not to complete the five years of the contract, because of my wish to have more time available for other activities. Now that the preparatory phase for the VLT is finished, the project having been formally presented to the Council on 31 March, and with its approval very probably coming before the end of this year, 1 January 1988 seems to me an excellent date for a change in the management of ESO.
The Black Hole Formation Probability
NASA Astrophysics Data System (ADS)
Clausen, Drew R.; Piro, Anthony; Ott, Christian D.
2015-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we investigate the probability that a star will make a BH as a function of its ZAMS mass. Although the shape of the black hole formation probability function is poorly constrained by current measurements, we believe that this framework is an important new step toward better understanding BH formation. We also consider some of the implications of this probability distribution, from its impact on the chemical enrichment from massive stars, to its connection with the structure of the core at the time of collapse, to the birth kicks that black holes receive. A probabilistic description of BH formation will be a useful input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
Lectures on probability and statistics
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
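The a priori dice calculation described above can be made concrete by enumerating the equally likely ordered outcomes of a set of fair dice. A small sketch (the dice example is from the abstract; the function is ours):

```python
from fractions import Fraction
from itertools import product

def prob_sum(target, n_dice=2, sides=6):
    """A priori probability that n fair dice sum to `target`,
    counting equally likely ordered outcomes, as an exact fraction."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    hits = sum(1 for o in outcomes if sum(o) == target)
    return Fraction(hits, len(outcomes))

# 6 of the 36 ordered outcomes of two fair dice sum to 7:
p7 = prob_sum(7)
```

The inverse, statistical problem the lectures close with would start instead from observed rolls and infer the dice's fairness; no enumeration of this kind is available there.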
Modality, probability, and mental models.
Hinterecker, Thomas; Knauff, Markus; Johnson-Laird, P N
2016-10-01
We report 3 experiments investigating novel sorts of inference, such as: A or B or both. Therefore, possibly (A and B). Where the contents were sensible assertions, for example, Space tourism will achieve widespread popularity in the next 50 years or advances in material science will lead to the development of antigravity materials in the next 50 years, or both. Most participants accepted the inferences as valid, though they are invalid in modal logic and in probabilistic logic too. But, the theory of mental models predicts that individuals should accept them. In contrast, inferences of this sort—A or B but not both. Therefore, A or B or both—are both logically valid and probabilistically valid. Yet, as the model theory also predicts, most reasoners rejected them. The participants’ estimates of probabilities showed that their inferences tended not to be based on probabilistic validity, but that they did rate acceptable conclusions as more probable than unacceptable conclusions. We discuss the implications of the results for current theories of reasoning.
MSPI False Indication Probability Simulations
Dana Kelly; Kurt Vedros; Robert Youngblood
2011-03-01
This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = 1 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history’s ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
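The counting scheme described above (false indications estimated as counts divided by the number of simulated time histories) can be illustrated with a deliberately simplified Monte Carlo sketch. This is not the MSPI methodology or the NUREG-1753 approach; the parameter values and the point estimator are illustrative assumptions:

```python
import random

def false_positive_probability(p_true=0.01, threshold=0.02,
                               demands=200, histories=5000, seed=1):
    """Toy Monte Carlo estimate of a false indication probability:
    the true unreliability p_true is below `threshold` (true indication
    'green'), but sampling variation in each simulated time history can
    push the estimated unreliability above the threshold (false positive)."""
    rng = random.Random(seed)
    false_pos = 0
    for _ in range(histories):
        failures = sum(rng.random() < p_true for _ in range(demands))
        # simple Jeffreys-style point estimate (an illustrative choice,
        # standing in for the Bayesian-corrected MSPI contribution)
        p_hat = (failures + 0.5) / (demands + 1.0)
        if p_hat > threshold:
            false_pos += 1
    return false_pos / histories

fp = false_positive_probability()
```

Even with the true parameter well in the green region, a nonzero fraction of simulated histories indicates white or above, which is the sensitivity the paper quantifies across the different priors.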
WITPO (What Is the Probability Of).
ERIC Educational Resources Information Center
Ericksen, Donna Bird; And Others
1991-01-01
Included in this probability board game are the requirements, the rules, the board, and 44 sample questions. This game can be used as a probability unit review for practice on basic skills and algorithms, such as computing compound probability and using Pascal's triangle to solve binomial probability problems. (JJK)
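As a sketch of the skills the game reviews, here is how a row of Pascal's triangle yields binomial (compound) probabilities; the functions and numbers are illustrative, not taken from the game itself:

```python
from fractions import Fraction

def pascal_row(n):
    """Row n of Pascal's triangle, i.e. the binomial coefficients C(n, k)."""
    row = [1]
    for k in range(n):
        row.append(row[-1] * (n - k) // (k + 1))
    return row

def binomial_probability(n, k, p=Fraction(1, 2)):
    """Probability of exactly k successes in n independent trials,
    each succeeding with probability p."""
    return pascal_row(n)[k] * p**k * (1 - p)**(n - k)

# Probability of exactly 2 heads in 4 fair coin flips: C(4,2)/16 = 3/8
p_two_heads = binomial_probability(4, 2)
```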
Associativity and normative credal probability.
Snow, P
2002-01-01
Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.
Fusion probability in heavy nuclei
NASA Astrophysics Data System (ADS)
Banerjee, Tathagata; Nath, S.; Pal, Santanu
2015-03-01
Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability,
Trajectory versus probability density entropy.
Bologna, M; Grigolini, P; Karagiorgis, M; Rosa, A
2001-07-01
We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.
THE BLACK HOLE FORMATION PROBABILITY
Clausen, Drew; Piro, Anthony L.; Ott, Christian D.
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
Using Playing Cards to Differentiate Probability Interpretations
ERIC Educational Resources Information Center
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Teaching Probabilities and Statistics to Preschool Children
ERIC Educational Resources Information Center
Pange, Jenny
2003-01-01
This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…
The Cognitive Substrate of Subjective Probability
ERIC Educational Resources Information Center
Nilsson, Hakan; Olsson, Henrik; Juslin, Peter
2005-01-01
The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…
UT Biomedical Informatics Lab (BMIL) probability wheel
NASA Astrophysics Data System (ADS)
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
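A minimal sketch of the wheel's core bookkeeping, assuming only what the abstract states: a probability is rendered as two complementary colored slices whose boundary the user drags. The function names are hypothetical, not the app's actual API:

```python
def wheel_slices(p):
    """Map a probability p in [0, 1] to the angular sizes (in degrees)
    of the two colored slices of a probability wheel."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    return p * 360.0, (1.0 - p) * 360.0

def adjust(p, delta):
    """Nudge the slice boundary by delta, clamping the probability
    to [0, 1] so the two slices always cover the whole wheel."""
    return min(1.0, max(0.0, p + delta))

# A participant who judges an outcome 25% likely sees a 90-degree slice:
slices = wheel_slices(0.25)
```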
UT Biomedical Informatics Lab (BMIL) Probability Wheel.
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K
2016-01-01
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
Derivation of quantum probability from measurement
NASA Astrophysics Data System (ADS)
Herbut, Fedor
2016-05-01
To begin with, it is pointed out that the form of the quantum probability formula originates in the very initial state of the object system as seen when the state is expanded with the eigenprojectors of the measured observable. Making use of the probability reproducibility condition, which is a key concept in unitary measurement theory, one obtains the relevant coherent distribution of the complete-measurement results in the final unitary-measurement state in agreement with the mentioned probability formula. Treating the transition from the final unitary, or premeasurement, state, where all possible results are present, to one complete-measurement result sketchily in the usual way, the well-known probability formula is derived. In conclusion it is pointed out that the entire argument is only formal unless one makes it physical assuming that the quantum probability law is valid in the extreme case of probability-one (certain) events (projectors).
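The probability formula referred to above is the Born rule applied to the expansion of the initial state in the eigenprojectors of the measured observable. A minimal numerical sketch (the state vector is an arbitrary illustrative example):

```python
def born_probabilities(amplitudes):
    """Probabilities assigned to measurement outcomes by expanding the
    initial object-system state in the eigenbasis of the measured
    observable: p_i = |c_i|^2 for normalized expansion coefficients c_i."""
    norm = sum(abs(c) ** 2 for c in amplitudes)
    return [abs(c) ** 2 / norm for c in amplitudes]

# Equal superposition of two eigenstates with a relative phase; the
# phase drops out of the outcome probabilities:
probs = born_probabilities([1 / 2**0.5, 1j / 2**0.5])
```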
Error probability performance of unbalanced QPSK receivers
NASA Technical Reports Server (NTRS)
Simon, M. K.
1978-01-01
A simple technique for calculating the error probability performance and associated noisy reference loss of practical unbalanced QPSK receivers is presented. The approach is based on expanding the error probability conditioned on the loop phase error in a power series in the loop phase error and then, keeping only the first few terms of this series, averaging this conditional error probability over the probability density function of the loop phase error. Doing so results in an expression for the average error probability which is in the form of a leading term representing the ideal (perfect synchronization references) performance plus a term proportional to the mean-squared crosstalk. Thus, the additional error probability due to noisy synchronization references occurs as an additive term proportional to the mean-squared phase jitter directly associated with the receiver's tracking loop. Similar arguments are advanced to give closed-form results for the noisy reference loss itself.
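The averaging step described above can be sketched numerically. This toy version uses a single-quadrature conditional error probability and a Gaussian phase-jitter density as stand-ins for the receiver-specific models in the paper; all values are illustrative:

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def cond_error_prob(snr, phi):
    """Error probability conditioned on the loop phase error phi,
    for one quadrature of a coherent receiver (illustrative model)."""
    return q_func(math.sqrt(2.0 * snr) * math.cos(phi))

def avg_error_prob(snr, sigma_phi, n=2001):
    """Average the conditional error probability over a Gaussian
    phase-jitter density by simple numerical integration."""
    lo, hi = -5.0 * sigma_phi, 5.0 * sigma_phi
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        phi = lo + i * h
        pdf = math.exp(-0.5 * (phi / sigma_phi) ** 2) / (
            sigma_phi * math.sqrt(2.0 * math.pi))
        total += cond_error_prob(snr, phi) * pdf * h
    return total

ideal = cond_error_prob(4.0, 0.0)       # perfect synchronization references
noisy = avg_error_prob(4.0, 0.1)        # mean-squared phase jitter 0.01 rad^2
```

The difference between the averaged and ideal values is the noisy-reference degradation; in the paper's power-series treatment it appears as an additive term proportional to the mean-squared phase jitter.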
Probability and Quantum Paradigms: the Interplay
Kracklauer, A. F.
2007-12-03
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a variant interpretation of wave functions based on photodetection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.
Location probability learning requires focal attention.
Kabata, Takashi; Yokoyama, Takemasa; Noguchi, Yasuki; Kita, Shinichi
2014-01-01
Target identification is related to the frequency with which targets appear at a given location, with greater frequency enhancing identification. This phenomenon suggests that location probability learned through repeated experience with the target modulates cognitive processing. However, it remains unclear whether attentive processing of the target is required to learn location probability. Here, we used a dual-task paradigm to test the location probability effect of attended and unattended stimuli. Observers performed an attentionally demanding central-letter task and a peripheral-bar discrimination task in which location probability was manipulated. Thus, we were able to compare performance on the peripheral task when attention was fully engaged to the target (single-task condition) versus when attentional resources were drawn away by the central task (dual-task condition). The location probability effect occurred only in the single-task condition, when attention resources were fully available. This suggests that location probability learning requires attention to the target stimuli.
Experience matters: information acquisition optimizes probability gain.
Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J
2010-07-01
Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information-information gain, Kullback-Liebler distance, probability gain (error minimization), and impact-are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.
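Probability gain, as defined above (the expected increase in the probability of a correct guess, i.e. error minimization), is straightforward to compute. A sketch with hypothetical category priors and feature likelihoods:

```python
def posterior(prior, likelihoods, outcome):
    """Bayes update: P(category | outcome) for each category."""
    joint = [p * lk[outcome] for p, lk in zip(prior, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

def probability_gain(prior, likelihoods, n_outcomes=2):
    """Expected improvement in the probability of a correct guess from
    making the query, relative to guessing from the prior alone."""
    expected_max = 0.0
    for o in range(n_outcomes):
        p_outcome = sum(p * lk[o] for p, lk in zip(prior, likelihoods))
        if p_outcome > 0:
            expected_max += p_outcome * max(posterior(prior, likelihoods, o))
    return expected_max - max(prior)

# Two equiprobable categories; a feature present with probability 0.9
# in one category and 0.3 in the other (hypothetical numbers):
gain = probability_gain([0.5, 0.5], [[0.9, 0.1], [0.3, 0.7]])
```

The competing measures tested in the experiments (information gain, Kullback-Leibler distance, impact) would replace the `max(...)` accuracy payoff with their own valuation of the posterior.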
Total variation denoising of probability measures using iterated function systems with probabilities
NASA Astrophysics Data System (ADS)
La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.
2017-01-01
In this paper we present a total variation denoising problem for probability measures using the set of fixed-point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.
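The invariant (fixed-point) measure of an IFS with probabilities, central to the construction above, can be approximated by the classical chaos game. A sketch using two affine contractions on the unit interval; the specific maps and probabilities are illustrative, not from the paper:

```python
import random

def chaos_game(maps, probs, n=50000, seed=0):
    """Approximate the invariant probability measure of an IFSP by the
    chaos game: repeatedly apply an affine map w_i(x) = a_i * x + b_i,
    chosen at random with probability p_i, and record the orbit."""
    rng = random.Random(seed)
    x, points = rng.random(), []
    for _ in range(n):
        a, b = rng.choices(maps, weights=probs)[0]
        x = a * x + b
        points.append(x)
    return points

# Two contractions whose attractor is the middle-thirds Cantor set;
# unequal probabilities skew the invariant measure toward one branch.
pts = chaos_game([(1 / 3, 0.0), (1 / 3, 2 / 3)], [0.7, 0.3])
left_mass = sum(p < 0.5 for p in pts) / len(pts)
```

Varying the probabilities while keeping the maps fixed changes the invariant measure, which is exactly the degree of freedom the collage-based upper bound optimizes over.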
Simulations of Probabilities for Quantum Computing
NASA Technical Reports Server (NTRS)
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
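The idea of producing probabilistic behavior from deterministic chaos, with no random number generator, can be sketched with the fully chaotic logistic map; the map choice and threshold here are our illustrative stand-ins, not the paper's construction:

```python
def logistic_bits(n, x0=0.3):
    """Generate coin-flip-like bits from the deterministic logistic map
    x -> 4x(1-x), thresholding each iterate at 1/2. No random number
    generator is involved; the unpredictability comes from chaos."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

# The invariant density of the map is symmetric about 1/2, so the
# empirical frequency of 1s approaches 1/2:
freq = sum(logistic_bits(100000)) / 100000
```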
47 CFR 1.1623 - Probability calculation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...
Correlation as Probability of Common Descent.
ERIC Educational Resources Information Center
Falk, Ruma; Well, Arnold D.
1996-01-01
One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the…
Probability: A Matter of Life and Death
ERIC Educational Resources Information Center
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
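The life-table computation alluded to above is a short exercise. A sketch with a hypothetical, heavily truncated table of one-year death probabilities q_x (real life tables run to age 100+ and use refined end-of-life assumptions):

```python
def life_expectancy(qx):
    """Period life expectancy at birth from a list of one-year death
    probabilities q_x. Assumes deaths occur on average mid-year and
    that everyone alive at the last listed age dies within that year."""
    alive, total_years = 1.0, 0.0
    for q in qx:
        dying = alive * q
        # survivors contribute a full year; the dying, half a year
        total_years += (alive - dying) + 0.5 * dying
        alive -= dying
    return total_years

# Toy three-age table (hypothetical numbers):
e0 = life_expectancy([0.1, 0.5, 1.0])
```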
Phonotactic Probabilities in Young Children's Speech Production
ERIC Educational Resources Information Center
Zamuner, Tania S.; Gerken, Louann; Hammond, Michael
2004-01-01
This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…
47 CFR 1.1623 - Probability calculation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than...
Teaching Statistics and Probability: 1981 Yearbook.
ERIC Educational Resources Information Center
Shulte, Albert P., Ed.; Smart, James R., Ed.
This 1981 yearbook of the National Council of Teachers of Mathematics (NCTM) offers classroom ideas for teaching statistics and probability, viewed as important topics in the school mathematics curriculum. Statistics and probability are seen as appropriate because they: (1) provide meaningful applications of mathematics at all levels; (2) provide…
Teaching Probability: A Socio-Constructivist Perspective
ERIC Educational Resources Information Center
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a teaching sequence, grounded in a socio-constructivist perspective, for teaching probability.
Stimulus Probability Effects in Absolute Identification
ERIC Educational Resources Information Center
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
WPE: A Mathematical Microworld for Learning Probability
ERIC Educational Resources Information Center
Kiew, Su Ding; Sam, Hong Kian
2006-01-01
In this study, the researchers developed the Web-based Probability Explorer (WPE), a mathematical microworld and investigated the effectiveness of the microworld's constructivist learning environment in enhancing the learning of probability and improving students' attitudes toward mathematics. This study also determined the students' satisfaction…
Malawian Students' Meanings for Probability Vocabulary
ERIC Educational Resources Information Center
Kazima, Mercy
2007-01-01
The paper discusses findings of a study that investigated Malawian students' meanings for some probability vocabulary. The study explores the meanings that, prior to instruction, students assign to some words that are commonly used in teaching probability. The aim is to have some insight into the meanings that students bring to the classroom. The…
Probability Simulations by Non-Lipschitz Chaos
NASA Technical Reports Server (NTRS)
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Laboratory-Tutorial Activities for Teaching Probability
ERIC Educational Resources Information Center
Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.
2006-01-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We…
Probability Issues in without Replacement Sampling
ERIC Educational Resources Information Center
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
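The chaining of conditional probabilities that such elementary courses teach can be sketched directly. The urn composition below is chosen purely for illustration; exact rational arithmetic avoids rounding questions.

```python
from fractions import Fraction

def prob_all_red(red, blue, draws):
    """P(every draw is red) when sampling without replacement,
    computed by chaining conditional probabilities:
    P = (red/total) * (red-1)/(total-1) * ..."""
    total = red + blue
    p = Fraction(1)
    for k in range(draws):
        p *= Fraction(red - k, total - k)
    return p

# 5 red and 3 blue balls, 2 draws: (5/8) * (4/7) = 5/14
print(prob_all_red(5, 3, 2))
```

The same chaining generalizes to any event expressible as a sequence of conditional draws, which is the representation the article reviews.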
Average Transmission Probability of a Random Stack
ERIC Educational Resources Information Center
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
Assessment of the probability of contaminating Mars
NASA Technical Reports Server (NTRS)
Judd, B. R.; North, D. W.; Pezier, J. P.
1974-01-01
New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, on the lethality of release and transport mechanisms, and on other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
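A minimal numerical sketch of the Sagan-Coleman structure follows. Treating each mechanism's expected number of growing microbes as a Poisson mean is an illustrative assumption here, not the paper's full model, and the numbers are hypothetical.

```python
import math

def contamination_probability(release_mechanisms):
    """Sagan-Coleman-style estimate. `release_mechanisms` is a list of
    (expected_viable_microbes_released, probability_of_growth) pairs.
    Each mechanism's expected count of growing microbes is treated as a
    Poisson mean; contamination means at least one microbe grows."""
    p_none = 1.0
    for expected_release, p_growth in release_mechanisms:
        p_none *= math.exp(-expected_release * p_growth)
    return 1.0 - p_none

# three hypothetical release mechanisms for one landing scenario
print(contamination_probability([(1e4, 1e-8), (1e2, 1e-6), (1.0, 1e-4)]))
```

For small means this reduces to the classic product-and-sum form, since 1 - exp(-m) is approximately m.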
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.
Optimizing Probability of Detection Point Estimate Demonstration
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for its probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) while achieving an acceptable value for the probability of false calls (POF) and keeping the flaw sizes in the set as small as possible.
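The binomial point-estimate demonstration can be illustrated with a short calculation of the probability of passing the demonstration (PPD) for a given true POD. The pass rules and numbers below are illustrative, not NASA's acceptance criteria.

```python
from math import comb

def prob_pass_demo(n_flaws, max_misses, true_pod):
    """Probability of passing a binomial POD demonstration: at most
    `max_misses` missed detections in `n_flaws` independent trials,
    each flaw detected with probability `true_pod`."""
    q = 1.0 - true_pod
    return sum(comb(n_flaws, k) * q**k * true_pod**(n_flaws - k)
               for k in range(max_misses + 1))

# classic 29-flaw, zero-miss demonstration
print(prob_pass_demo(29, 0, 0.95))   # PPD if the true POD is 0.95
print(prob_pass_demo(29, 0, 0.90))   # PPD if the true POD is exactly 0.90
```

Passing 29 of 29 corresponds to demonstrating 90% POD at 95% confidence, since 0.90^29 is approximately 0.047, below 0.05.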
Time-dependent landslide probability mapping
Campbell, Russell H.; Bernknopf, Richard L.
1993-01-01
Case studies where time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. ?? 2010 Springer Science+Business Media, LLC.
An introductory analysis of satellite collision probabilities
NASA Astrophysics Data System (ADS)
Carlton-Wippern, Kitt C.
This paper addresses a probabilistic approach to assessing the probability of a satellite collision occurring, based on relative trajectory analyses and probability density functions representing the satellites' position/momentum vectors. The paper is divided into two parts: Static and Dynamic Collision Probabilities. In the Static Collision Probability section, the basic phenomenon under study is: given the mean positions and associated position probability density functions for the two objects, calculate the probability that the two objects collide (defined as being within some distance of each other). The paper presents the classic Laplace problem of the probability of arrival, using standard uniform distribution functions. This problem is then extrapolated to show how 'arrival' can be classified as 'collision', how the arrival space geometries map to collision space geometries, and how arbitrary position density functions can be included and integrated into the analysis. In the Dynamic Collision Probability section, the nature of collisions based upon both trajectory and energy considerations is discussed; it is shown that energy states alone cannot be used to completely describe whether or not a collision occurs. This fact invalidates some earlier work on the subject and demonstrates why Liouville's theorem cannot be used in general to describe the constant density of the position/momentum space in which a collision may occur. Future position probability density functions are then shown to be the convolution of the current position and momentum density functions (linear analysis), and the paper further demonstrates the dependency of the future position density functions on time. Strategies for assessing the collision probabilities for two point masses with uncertainties in position and momentum at some given time, then integrated with some arbitrary impact volume schema, are then discussed. The presentation concludes with the formulation of a high-level design
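The static problem described above can be sketched with a Monte Carlo estimate. Independent isotropic Gaussian position errors and a spherical impact volume are illustrative assumptions, simpler than the arbitrary density functions the paper allows.

```python
import random

def static_collision_probability(mu1, mu2, sigma1, sigma2, radius,
                                 n=100_000, seed=1):
    """Monte Carlo sketch of the static collision probability: two
    objects with independent isotropic Gaussian position errors about
    mean positions mu1 and mu2; a 'collision' is a separation smaller
    than `radius`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        d2 = sum((rng.gauss(a, sigma1) - rng.gauss(b, sigma2)) ** 2
                 for a, b in zip(mu1, mu2))
        if d2 < radius * radius:
            hits += 1
    return hits / n

# nearly coincident mean positions vs widely separated ones
print(static_collision_probability((0, 0, 0), (0, 0, 0), 0.1, 0.1, 1.0))
print(static_collision_probability((0, 0, 0), (100, 0, 0), 1.0, 1.0, 1.0))
```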
Liquefaction probability curves for surficial geologic deposits
Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.
2011-01-01
Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities range from 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.
Seismicity alert probabilities at Parkfield, California, revisited
Michael, A.J.; Jones, L.M.
1998-01-01
For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.
The probability distribution of intense daily precipitation
NASA Astrophysics Data System (ADS)
Cavanaugh, Nicholas R.; Gershunov, Alexander; Panorska, Anna K.; Kozubowski, Tomasz J.
2015-03-01
The probability tail structure of over 22,000 weather stations globally is examined in order to identify a physically and mathematically consistent distribution type for modeling the probability of intense daily precipitation and extremes. Results indicate that when aggregating data annually, most locations are to be considered heavy-tailed with statistical significance. When aggregating data by season, it becomes evident that the thickness of the probability tail is related to the variability in precipitation-causing events, and thus that the fundamental cause of precipitation volatility is weather diversity. These results have both theoretical and practical implications for the modeling of high-frequency climate variability worldwide.
Class probability estimation for medical studies.
Simon, Richard
2014-07-01
I provide a commentary on two papers "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. Those papers provide an up-to-date review of some popular machine learning methods for class probability estimation and compare those methods to logistic regression modeling in real and simulated datasets.
Objective and subjective probability in gene expression.
Velasco, Joel D
2012-09-01
In this paper I address the question of whether the probabilities that appear in models of stochastic gene expression are objective or subjective. I argue that while our best models of the phenomena in question are stochastic models, this fact should not lead us to automatically assume that the processes are inherently stochastic. After distinguishing between models and reality, I give a brief introduction to the philosophical problem of the interpretation of probability statements. I argue that the objective vs. subjective distinction is a false dichotomy and is an unhelpful distinction in this case. Instead, the probabilities in our models of gene expression exhibit standard features of both objectivity and subjectivity.
Characteristic length of the knotting probability revisited
NASA Astrophysics Data System (ADS)
Uehara, Erica; Deguchi, Tetsuo
2015-09-01
We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA.
Transition Probability and the ESR Experiment
ERIC Educational Resources Information Center
McBrierty, Vincent J.
1974-01-01
Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)
Inclusion probability with dropout: an operational formula.
Milot, E; Courteau, J; Crispino, F; Mailly, F
2015-05-01
In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
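The paper's general dropout formula is not reproduced in the abstract, so the sketch below shows only the classic no-dropout case it generalizes: the one-locus PI is the squared sum of visible allele frequencies (a random person is "not excluded" only if both of their alleles are visible), combined multiplicatively across loci. The allele frequencies are hypothetical.

```python
def probability_of_inclusion(visible_allele_freqs_by_locus):
    """Classic probability of inclusion (RMNE) without dropout.
    At each locus, P(random person not excluded) = (sum of visible
    allele frequencies)^2; loci are combined by multiplication for
    the cumulative probability of inclusion."""
    cpi = 1.0
    for freqs in visible_allele_freqs_by_locus:
        s = sum(freqs)
        cpi *= s * s
    return cpi

# two loci; visible alleles cover 30% and 50% of alleles in the population
print(probability_of_inclusion([[0.1, 0.2], [0.25, 0.25]]))  # 0.09 * 0.25 = 0.0225
```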
The low synaptic release probability in vivo.
Borst, J Gerard G
2010-06-01
The release probability, the average probability that an active zone of a presynaptic terminal releases one or more vesicles following an action potential, is tightly regulated. Measurements in cultured neurons or in slices indicate that this probability can vary greatly between synapses, but on average it is estimated to be as high as 0.5. In vivo, however, the size of synaptic potentials is relatively independent of recent history, suggesting that release probability is much lower. Possible causes for this discrepancy include maturational differences, a higher spontaneous activity, a lower extracellular calcium concentration and more prominent tonic inhibition by ambient neurotransmitters during in vivo recordings. Existing evidence thus suggests that under physiological conditions in vivo, presynaptic action potentials trigger the release of neurotransmitter much less frequently than what is observed in in vitro preparations.
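Under a standard binomial release model (an assumption here; the abstract reports estimates rather than a formula), the gap between in vitro and in vivo estimates of p translates directly into the chance that an action potential triggers any release at all:

```python
def prob_any_release(p_site, n_sites):
    """Probability that a presynaptic action potential releases at
    least one vesicle, for n independent release sites with per-site
    release probability p (binomial release model)."""
    return 1.0 - (1.0 - p_site) ** n_sites

# an in-vitro-like p ~ 0.5 vs a lower in-vivo-like p ~ 0.1, 4 sites each
print(prob_any_release(0.5, 4))  # 0.9375
print(prob_any_release(0.1, 4))
```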
Classical and Quantum Spreading of Position Probability
ERIC Educational Resources Information Center
Farina, J. E. G.
1977-01-01
Demonstrates that the standard deviation of the position probability of a particle moving freely in one dimension is a function of the standard deviation of its velocity distribution and time in classical or quantum mechanics. (SL)
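The classical result can be checked by direct simulation: for a free particle with independent initial position and velocity, sigma_x(t)^2 = sigma_x(0)^2 + sigma_v^2 t^2. The Gaussian initial distributions below are an illustrative assumption; the relation itself only requires independence.

```python
import math, random, statistics

def position_spread(sigma_x0, sigma_v, t, n=100_000, seed=2):
    """Empirical standard deviation of x(t) = x0 + v*t for a free
    particle with independent Gaussian initial position and velocity."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, sigma_x0) + rng.gauss(0.0, sigma_v) * t
          for _ in range(n)]
    return statistics.pstdev(xs)

# compare with the predicted sqrt(sigma_x0^2 + sigma_v^2 * t^2)
print(position_spread(1.0, 0.5, 4.0), math.hypot(1.0, 0.5 * 4.0))
```

The quantum free wave packet obeys the same functional form, which is the point of the comparison in the article.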
On Convergent Probability of a Random Walk
ERIC Educational Resources Information Center
Lee, Y.-F.; Ching, W.-K.
2006-01-01
This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.
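The note's card-based walk is not reproduced here, but the same method of recurrence relations applies to the standard absorbing walk on {0, ..., n}: u(i) = p*u(i+1) + (1-p)*u(i-1) with u(0) = 0 and u(n) = 1, whose closed-form solution is sketched below.

```python
def reach_n_before_0(start, n, p=0.5):
    """Probability that a walk on {0,...,n} started at `start` (step +1
    with probability p, step -1 otherwise) is absorbed at n before 0.
    Closed form obtained by solving the recurrence
    u(i) = p*u(i+1) + (1-p)*u(i-1), u(0) = 0, u(n) = 1."""
    q = 1.0 - p
    if abs(p - q) < 1e-12:          # symmetric walk: u(i) = i/n
        return start / n
    r = q / p
    return (1.0 - r**start) / (1.0 - r**n)

print(reach_n_before_0(3, 10))        # fair walk from 3: 3/10
print(reach_n_before_0(1, 2, p=0.6))  # biased walk
```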
Robust satisficing and the probability of survival
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2014-01-01
Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.
Probability, clinical decision making and hypothesis testing
Banerjee, A.; Jadhav, S. L.; Bhawalkar, J. S.
2009-01-01
Few clinicians grasp the true concept of probability expressed in the ‘P value.’ For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing. PMID:21234167
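As a concrete illustration of what the P value does measure, the probability under the null hypothesis of data at least as extreme as that observed, here is an exact two-sided binomial test; the minimum-likelihood definition of "as extreme" used below is one common convention among several.

```python
from math import comb

def binomial_two_sided_p(k, n, p0=0.5):
    """Exact two-sided binomial p-value: total null probability of all
    outcomes no more likely than the observed count k (the
    minimum-likelihood convention for 'at least as extreme')."""
    pmf = [comb(n, i) * p0**i * (1 - p0) ** (n - i) for i in range(n + 1)]
    return sum(q for q in pmf if q <= pmf[k] + 1e-12)

# 8 successes in 10 trials under a fair null: p = 112/1024, about 0.109
print(binomial_two_sided_p(8, 10))
```

A p of 0.109 does not mean the null is probably true; it means such data would arise about 11% of the time if it were.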
Grounding quantum probability in psychological mechanism.
Love, Bradley C
2013-06-01
Pothos & Busemeyer (P&B) provide a compelling case that quantum probability (QP) theory is a better match to human judgment than is classical probability (CP) theory. However, any theory (QP, CP, or other) phrased solely at the computational level runs the risk of being underconstrained. One suggestion is to ground QP accounts in mechanism, to leverage a wide range of process-level data.
A Manual for Encoding Probability Distributions.
1978-09-01
...probability distribution. Some terms in the literature are used synonymously with encoding: assessment, assignment (used for single events in this...). ...sessions conducted as parts of practical decision analyses as well as on experimental evidence in the literature. Probability encoding can be applied
Imprecise Probability Methods for Weapons UQ
Picard, Richard Roy; Vander Wiel, Scott Alan
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Probability distribution of the vacuum energy density
Duplancic, Goran; Stefancic, Hrvoje; Glavan, Drazen
2010-12-15
As the vacuum state of a quantum field is not an eigenstate of the Hamiltonian density, the vacuum energy density can be represented as a random variable. We present an analytical calculation of the probability distribution of the vacuum energy density for real and complex massless scalar fields in Minkowski space. The obtained probability distributions are broad and the vacuum expectation value of the Hamiltonian density is not fully representative of the vacuum energy density.
When probability trees don't work
NASA Astrophysics Data System (ADS)
Chan, K. C.; Lenard, C. T.; Mills, T. M.
2016-08-01
Tree diagrams arise naturally in courses on probability at high school or university, even at an elementary level. Often they are used to depict outcomes and associated probabilities from a sequence of games. A subtle issue is whether or not the Markov condition holds in the sequence of games. We present two examples that illustrate the importance of this issue. Suggestions as to how these examples may be used in a classroom are offered.
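A tree calculation that does not assume the Markov condition can be written by letting each branch probability depend on the full history of earlier games. The enumerator below is an illustrative sketch, not the paper's examples.

```python
def tree_leaves(win_prob, depth, history=()):
    """Enumerate the leaves of a probability tree for `depth` games.
    `win_prob(history)` gives P(win next game | full history), so
    non-Markov dependence on earlier games is allowed.
    Yields (outcome tuple, path probability)."""
    if depth == 0:
        yield history, 1.0
        return
    p = win_prob(history)
    for outcome, branch_p in ((1, p), (0, 1.0 - p)):
        for leaf, rest_p in tree_leaves(win_prob, depth - 1,
                                        history + (outcome,)):
            yield leaf, branch_p * rest_p

# a hypothetical non-Markov rule: each win raises the next win probability
streaky = lambda h: 0.5 + 0.05 * sum(h)
leaves = dict(tree_leaves(streaky, 3))
print(sum(leaves.values()))  # path probabilities sum to 1
```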
Site occupancy models with heterogeneous detection probabilities
Royle, J. Andrew
2006-01-01
Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
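In its simplest constant-p version (no heterogeneity mixture), the zero-inflated binomial likelihood can be sketched in a few lines: psi is occupancy probability, p detection probability, and each site with y detections in J visits contributes psi*Binom(y; J, p) + (1 - psi)*1[y = 0].

```python
from math import comb

def occupancy_likelihood(psi, p, detections):
    """Zero-inflated binomial likelihood for a site-occupancy model
    with constant detection probability p. `detections` is a list of
    (y, J) pairs: y detections in J visits at each site."""
    like = 1.0
    for y, J in detections:
        binom = comb(J, y) * p**y * (1 - p) ** (J - y)
        like *= psi * binom + ((1.0 - psi) if y == 0 else 0.0)
    return like

# one site, never detected in 2 visits: 0.5*0.25 + 0.5 = 0.625
print(occupancy_likelihood(0.5, 0.5, [(0, 2)]))
```

The article's mixture models replace the fixed p with a draw from a mixing distribution inside each site's term.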
The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions
Larget, Bret
2013-01-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
Tsunami probability in the Caribbean Region
Parsons, T.; Geist, E.L.
2008-01-01
We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30% regionally. © Birkhäuser 2008.
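Under the Poissonian model described above, a cell's runup probability over a time window follows directly from its mean rate. The rate below is hypothetical, chosen only to echo the 500-year record length.

```python
import math

def poisson_exceedance_probability(mean_rate_per_year, window_years):
    """Probability of at least one qualifying tsunami runup in a time
    window, under a Poisson model with the cell's mean annual rate."""
    return 1.0 - math.exp(-mean_rate_per_year * window_years)

# a cell with one qualifying runup per 500 years, 30-year window
print(poisson_exceedance_probability(1 / 500, 30))
```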
Probability detection mechanisms and motor learning.
Lungu, O V; Wächter, T; Liu, T; Willingham, D T; Ashe, J
2004-11-01
The automatic detection of patterns or regularities in the environment is central to certain forms of motor learning, which are largely procedural and implicit. The rules underlying the detection and use of probabilistic information in the perceptual-motor domain are largely unknown. We conducted two experiments involving a motor learning task with direct and crossed mapping of motor responses in which probabilities were present at the stimulus set level, the response set level, and at the level of stimulus-response (S-R) mapping. We manipulated only one level at a time, while controlling for the other two. The results show that probabilities were detected only when present at the S-R mapping and motor levels, but not at the perceptual one (experiment 1), unless the perceptual features have a dimensional overlap with the S-R mapping rule (experiment 2). The effects of probability detection were mostly facilitatory at the S-R mapping, both facilitatory and inhibitory at the perceptual level, and predominantly inhibitory at the response-set level. The facilitatory effects were based on learning the absolute frequencies first and transitional probabilities later (for the S-R mapping rule) or both types of information at the same time (for perceptual level), whereas the inhibitory effects were based on learning first the transitional probabilities. Our data suggest that both absolute frequencies and transitional probabilities are used in motor learning, but in different temporal orders, according to the probabilistic properties of the environment. The results support the idea that separate neural circuits may be involved in detecting absolute frequencies as compared to transitional probabilities.
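The two quantities the study distinguishes, absolute frequencies and first-order transitional probabilities, can be estimated from an observed stimulus sequence in a few lines. This is a generic sketch, not the experiments' actual stimulus sets.

```python
from collections import Counter

def frequencies_and_transitions(seq):
    """Absolute frequencies of each item and first-order transitional
    probabilities P(next | current) estimated from a sequence."""
    freq = Counter(seq)
    has_successor = Counter(seq[:-1])   # times each item is followed by something
    pairs = Counter(zip(seq, seq[1:]))
    trans = {pair: count / has_successor[pair[0]]
             for pair, count in pairs.items()}
    return freq, trans

freq, trans = frequencies_and_transitions("AABAB")
print(freq)               # A appears 3 times, B twice
print(trans[("A", "B")])  # 2 of the 3 followed A's lead to B
```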
Minimal entropy probability paths between genome families.
Ahlbrandt, Calvin; Benson, Gary; Casey, William
2004-05-01
We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in the transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non
Approximation of Failure Probability Using Conditional Sampling
NASA Technical Reports Server (NTRS)
Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.
2008-01-01
In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
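The core identity behind conditional sampling is P(F) = P(B) · P(F|B): if the failure set F is contained in a bounding set B whose probability is known analytically, only the cheap conditional factor needs Monte Carlo. A toy sketch under that assumption (the bounding set, failure set, and sampler below are illustrative, not from the paper):

```python
import random

def estimate_failure_probability(in_failure_set, sample_in_bound, p_bound,
                                 n_samples=10000, seed=0):
    """Estimate P(F) = P(B) * P(F|B): sample only inside the bounding set B,
    whose probability p_bound is known analytically, and estimate the
    conditional probability P(F|B) by Monte Carlo."""
    rng = random.Random(seed)
    hits = sum(in_failure_set(sample_in_bound(rng)) for _ in range(n_samples))
    return p_bound * hits / n_samples

# Toy example: uniform parameter on [0, 1], failure set F = [0.9, 0.95],
# bounding set B = [0.9, 1.0] with analytic probability P(B) = 0.1.
p_fail = estimate_failure_probability(
    in_failure_set=lambda x: 0.9 <= x <= 0.95,
    sample_in_bound=lambda rng: rng.uniform(0.9, 1.0),
    p_bound=0.1,
)
print(f"estimated P(failure) ~ {p_fail:.4f} (exact 0.05)")
```

Because every sample lands in B, none is wasted on the safe region, which is where the reduction in computational burden comes from when P(F) is small.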
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.
Computing Earthquake Probabilities on Global Scales
NASA Astrophysics Data System (ADS)
Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.
2016-03-01
Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
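The method described counts small events since the last large event and converts that count into a probability through a Weibull law. A hedged sketch of that conversion, P = 1 - exp(-(n/scale)^shape), with illustrative parameters chosen for readability rather than taken from the paper:

```python
import math

def large_event_probability(small_event_count: int, scale: float, shape: float) -> float:
    """Convert the count of small events since the last large event into a
    conditional probability for the next large event via a Weibull law:
    P = 1 - exp(-(n / scale)^shape). The scale and shape values used here
    are illustrative, not fitted to any catalog."""
    if small_event_count <= 0:
        return 0.0
    return 1.0 - math.exp(-((small_event_count / scale) ** shape))

# Probability grows monotonically with the small-event count:
for n in (0, 50, 200, 500):
    print(n, round(large_event_probability(n, scale=300.0, shape=1.4), 3))
```

A correlation-length extension, as in the article, would weight counted events by distance before applying the same law.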
The role of probabilities in physics.
Le Bellac, Michel
2012-09-01
Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.
Probability, arrow of time and decoherence
NASA Astrophysics Data System (ADS)
Bacciagaluppi, Guido
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation.
Match probabilities in racially admixed populations.
Lange, K
1993-01-01
The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors. PMID:8430693
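The product rule mentioned here multiplies per-locus genotype frequencies, each computed under Hardy-Weinberg equilibrium, across loci assumed to be in linkage equilibrium. A minimal sketch with hypothetical allele frequencies (the numbers are illustrative, not forensic reference values):

```python
from math import prod

def hw_genotype_frequency(p_a: float, p_b: float = None) -> float:
    """Hardy-Weinberg genotype frequency at a two-allele locus: p^2 for a
    homozygote with allele frequency p, or 2*p*q for a heterozygote with
    allele frequencies p and q."""
    return p_a ** 2 if p_b is None else 2 * p_a * p_b

# Hypothetical frequencies at three independent loci (linkage equilibrium):
per_locus = [
    hw_genotype_frequency(0.1),         # homozygote, p = 0.1
    hw_genotype_frequency(0.05, 0.2),   # heterozygote, p = 0.05, q = 0.2
    hw_genotype_frequency(0.08),        # homozygote, p = 0.08
]
match_probability = prod(per_locus)     # product rule across loci
print(match_probability)
```

The paper's contribution is to bound this quantity when the equilibrium assumptions hold only in ancestral populations, not in the admixed contemporary population.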
Local Directed Percolation Probability in Two Dimensions
NASA Astrophysics Data System (ADS)
Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi
1998-01-01
Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z^2 : x + y even, 0 ≤ y ≤ n, -y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and the site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β_g. An analysis based on Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.
Explosion probability of unexploded ordnance: expert beliefs.
MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G
2008-08-01
This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies
Exact probability distribution functions for Parrondo's games
NASA Astrophysics Data System (ADS)
Zadourian, Rubina; Saakian, David B.; Klümper, Andreas
2016-12-01
We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
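The paper derives the exact distributions via Fourier transform; a Monte Carlo approximation of the capital-dependent game's distribution is easy to sketch and already shows the even/odd structure of the capital after a fixed number of rounds. The game parameters below are the standard textbook Parrondo values, assumed here rather than taken from the paper:

```python
import random
from collections import Counter

def parrondo_capital_distribution(rounds, trials=20000, eps=0.005, seed=1):
    """Empirical distribution of capital in the capital-dependent Parrondo
    game (alternating games A and B), approximating the exact distribution
    by Monte Carlo. Game A wins with probability 1/2 - eps; game B wins with
    1/10 - eps when capital is divisible by 3, else 3/4 - eps."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        capital = 0
        for t in range(rounds):
            if t % 2 == 0:                 # game A on even rounds
                p_win = 0.5 - eps
            elif capital % 3 == 0:         # game B, capital divisible by 3
                p_win = 0.1 - eps
            else:                          # game B, otherwise
                p_win = 0.75 - eps
            capital += 1 if rng.random() < p_win else -1
        counts[capital] += 1
    return {c: n / trials for c, n in sorted(counts.items())}

dist = parrondo_capital_distribution(rounds=10)
print(dist)
```

After an even number of rounds only even capitals occur, which is the parity effect behind the two limiting distributions for odd and even round numbers noted in the abstract.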
Intrinsic Probability of a Multifractal Set
NASA Astrophysics Data System (ADS)
Hosokawa, Iwao
1991-12-01
It is shown that a self-similar measure isotropically distributed in a d-dimensional set should have its own intermittency exponents equivalent to its own generalized dimensions (in the sense of Hentschel and Procaccia), and that the intermittency exponents are completely designated by an intrinsic probability which governs the spatial distribution of the measure. Based on this, it is proven that the intrinsic probability uniquely determines the spatial distribution of the scaling index α of the measure as well as the so-called f-α spectrum of the multifractal set.
Atomic transition probabilities of Nd I
NASA Astrophysics Data System (ADS)
Stockett, M. H.; Wood, M. P.; Den Hartog, E. A.; Lawler, J. E.
2011-12-01
Fourier transform spectra are used to determine emission branching fractions for 236 lines of the first spectrum of neodymium (Nd i). These branching fractions are converted to absolute atomic transition probabilities using radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 225001). The wavelength range of the data set is from 390 to 950 nm. These transition probabilities from emission and laser measurements are compared to relative absorption measurements in order to assess the importance of unobserved infrared branches from selected upper levels.
Probabilities for separating sets of order statistics.
Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E
2010-04-01
Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
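The rejection count whose joint probability the paper computes comes from the Benjamini-Hochberg step-up procedure. A minimal sketch of that procedure (not the paper's order-statistics result itself): reject the m smallest p-values, where m is the largest k with p_(k) ≤ kα/n.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: with sorted p-values p_(1) <= ...
    <= p_(n), find the largest k such that p_(k) <= k * alpha / n and reject
    the k smallest p-values. Returns the number of rejections."""
    n = len(p_values)
    m = 0
    for k, p in enumerate(sorted(p_values), start=1):
        if p <= k * alpha / n:
            m = k
    return m

# Six hypothetical p-values at alpha = 0.05:
print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.60]))
```

The number of false rejections among these m is random, and the paper's interval probabilities for mixed-population order statistics make its distribution computable.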
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.
Steering in spin tomographic probability representation
NASA Astrophysics Data System (ADS)
Man'ko, V. I.; Markovich, L. A.
2016-09-01
The steering property, known for the two-qubit state in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single qudit systems. The tomographic probability representation for qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in integral form, with an intertwining kernel calculated explicitly in tomographic probability terms.
Determining system maintainability as a probability
Wright, R.E.; Atwood, C.L.
1988-01-01
Maintainability has often been defined in principle as the probability that a system or component can be repaired in a specific time, given that it is in a failed state, but is presented in practice in terms of mean time to repair. In this paper, formulas are developed for maintainability as a probability, analogous to those for reliability and availability. This formulation is expressed in terms of cut sets, and leads to a natural definition of unmaintainability importance for cut sets and basic events. 6 refs.
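The probabilistic formulation is easiest to see for a single component: under an exponentially distributed repair time (an assumption made here for illustration, not a claim about the paper's cut-set formulas), maintainability M(t) is the direct analogue of reliability R(t) = exp(-t/MTBF).

```python
import math

def maintainability(t_hours: float, mttr_hours: float) -> float:
    """Probability that a failed component is restored within time t, assuming
    an exponentially distributed repair time with mean MTTR:
    M(t) = 1 - exp(-t / MTTR). This is the probabilistic analogue of the
    reliability function R(t) = exp(-t / MTBF)."""
    return 1.0 - math.exp(-t_hours / mttr_hours)

# With MTTR = 4 h, the chance of completing repair within 8 h:
print(round(maintainability(8.0, 4.0), 3))  # 1 - e^-2, about 0.865
```

Note how the single number MTTR = 4 h understates the spread: a quarter of repairs would still be unfinished after one MTTR, which is exactly the information the probabilistic form preserves.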
Probability in biology: overview of a comprehensive theory of probability in living systems.
Nakajima, Toshiyuki
2013-09-01
Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of it. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems.
Probability learning and Piagetian probability conceptions in children 5 to 12 years old.
Kreitler, S; Zigler, E; Kreitler, H
1989-11-01
This study focused on the relations between performance on a three-choice probability-learning task and conceptions of probability as outlined by Piaget concerning mixture, normal distribution, random selection, odds estimation, and permutations. The probability-learning task and four Piagetian tasks were administered randomly to 100 male and 100 female, middle SES, average IQ children in three age groups (5 to 6, 8 to 9, and 11 to 12 years old) from different schools. Half the children were from Middle Eastern backgrounds, and half were from European or American backgrounds. As predicted, developmental level of probability thinking was related to performance on the probability-learning task. The more advanced the child's probability thinking, the higher his or her level of maximization and hypothesis formulation and testing and the lower his or her level of systematically patterned responses. The results suggest that the probability-learning and Piagetian tasks assess similar cognitive skills and that performance on the probability-learning task reflects a variety of probability concepts.
Investigating Probability with the NBA Draft Lottery.
ERIC Educational Resources Information Center
Quinn, Robert J.
1997-01-01
Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…
Confusion between Odds and Probability, a Pandemic?
ERIC Educational Resources Information Center
Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer
2012-01-01
This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…
Probability distribution functions of the Grincevicjus series
NASA Astrophysics Data System (ADS)
Kapica, Rafal; Morawiec, Janusz
2008-06-01
Given a sequence (ξ_n, η_n) of independent identically distributed vectors of random variables, we consider the Grincevicjus series and a functional-integral equation connected with it. We prove that the equation characterizes all probability distribution functions of the Grincevicjus series. Moreover, an application of this characterization to a continuous refinement equation is presented.
Time Required to Compute A Posteriori Probabilities,
The paper discusses the time required to compute a posteriori probabilities using Bayes' Theorem. In a two-hypothesis example it is shown that, to... Bayes' Theorem as the group operation. Winograd's results concerning the lower bound on the time required to perform a group operation on a finite group using logical circuitry are therefore applicable. (Author)
Interstitial lung disease probably caused by imipramine.
Deshpande, Prasanna R; Ravi, Ranjani; Gouda, Sinddalingana; Stanley, Weena; Hande, Manjunath H
2014-01-01
Drugs are rarely associated with causing interstitial lung disease (ILD). We report a case of a 75-year-old woman who developed ILD after exposure to imipramine. To our knowledge, this is one of the rare cases of ILD probably caused by imipramine. There is a need to report such rare adverse effects relating ILD and drugs for better management of ILD.
The Smart Potential behind Probability Matching
ERIC Educational Resources Information Center
Gaissmaier, Wolfgang; Schooler, Lael J.
2008-01-01
Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…
Probability of boundary conditions in quantum cosmology
NASA Astrophysics Data System (ADS)
Suenobu, Hiroshi; Nambu, Yasusada
2017-02-01
One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe that can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions on the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well-known boundary condition proposals: the no-boundary proposal and the tunneling proposal. We specify the exact solutions by introducing two real parameters that discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.
Idempotent probability measures on ultrametric spaces
NASA Astrophysics Data System (ADS)
Hubal, Oleksandra; Zarichnyi, Mykhailo
2008-07-01
Following the construction due to Hartog and Vink we introduce a metric on the set of idempotent probability measures (Maslov measures) defined on an ultrametric space. This construction determines a functor on the category of ultrametric spaces and nonexpanding maps. We prove that this functor is the functorial part of a monad on this category. This monad turns out to contain the hyperspace monad.
Five-Parameter Bivariate Probability Distribution
NASA Technical Reports Server (NTRS)
Tubbs, J.; Brewer, D.; Smith, O. W.
1986-01-01
NASA technical memorandum presents four papers about the five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.
Independent Events in Elementary Probability Theory
ERIC Educational Resources Information Center
Csenki, Attila
2011-01-01
In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…
Geometric Probability and the Areas of Leaves
ERIC Educational Resources Information Center
Hoiberg, Karen Bush; Sharp, Janet; Hodgson, Ted; Colbert, Jim
2005-01-01
This article describes how a group of fifth-grade mathematics students measured irregularly shaped objects using geometric probability theory. After learning how to apply a ratio procedure to find the areas of familiar shapes, students extended the strategy for use with irregularly shaped objects, in this case, leaves. (Contains 2 tables and 8…
Assessing Schematic Knowledge of Introductory Probability Theory
ERIC Educational Resources Information Center
Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley
2005-01-01
The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…
Automatic Item Generation of Probability Word Problems
ERIC Educational Resources Information Center
Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina
2009-01-01
Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…
Probability from a Socio-Cultural Perspective
ERIC Educational Resources Information Center
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Probability & Perception: The Representativeness Heuristic in Action
ERIC Educational Resources Information Center
Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.
2014-01-01
If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
Posterior Probabilities for a Consensus Ordering.
ERIC Educational Resources Information Center
Fligner, Michael A.; Verducci, Joseph S.
1990-01-01
The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)
Phonotactic Probability Effects in Children Who Stutter
ERIC Educational Resources Information Center
Anderson, Julie D.; Byrd, Courtney T.
2008-01-01
Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…
Rethinking the learning of belief network probabilities
Musick, R.
1996-03-01
Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium-sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
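The "rote learning" baseline the paper improves upon is maximum-likelihood estimation of each conditional probability table by counting co-occurrences. A minimal sketch of that baseline, with a made-up two-variable insurance-flavored dataset (the variable names are illustrative, not from the paper's network):

```python
from collections import Counter

def learn_cpt(records, child, parents):
    """'Rote' maximum-likelihood learning of a conditional probability table:
    P(child | parents) is estimated by the ratio of co-occurrence counts to
    parent-configuration counts in the data."""
    joint = Counter()
    marginal = Counter()
    for r in records:
        key = tuple(r[p] for p in parents)
        joint[(key, r[child])] += 1
        marginal[key] += 1
    return {(key, val): n / marginal[key] for (key, val), n in joint.items()}

# Hypothetical records for one node with a single parent:
data = [
    {"age": "young", "claim": "yes"},
    {"age": "young", "claim": "yes"},
    {"age": "young", "claim": "no"},
    {"age": "old",   "claim": "no"},
]
cpt = learn_cpt(data, child="claim", parents=["age"])
print(cpt)
```

The paper's point is that any supervised learner (e.g. a neural network) can replace this counting step when estimating P(child | parents), often generalizing better from sparse parent configurations.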
Probability distribution functions in turbulent convection
NASA Technical Reports Server (NTRS)
Balachandar, S.; Sirovich, L.
1991-01-01
Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in the hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.
Probability & Statistics: Modular Learning Exercises. Student Edition
ERIC Educational Resources Information Center
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…
Spatial Probability Cuing and Right Hemisphere Damage
ERIC Educational Resources Information Center
Shaqiri, Albulena; Anderson, Britt
2012-01-01
In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…
Learning a Probability Distribution Efficiently and Reliably
NASA Technical Reports Server (NTRS)
Laird, Philip; Gamble, Evan
1988-01-01
A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
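The abstract does not spell out the CDF-Inversion Algorithm itself, but "specified accuracy and confidence" is the PAC-style guarantee sketched below: a generic Hoeffding/union-bound sample size for learning a distribution over a finite set (all names and the toy distribution are hypothetical, not from the paper):

```python
import math
import random

def sample_size(epsilon, delta, k):
    """Hoeffding/union-bound sample count: with probability >= 1 - delta,
    every one of the k estimated cell probabilities is within epsilon
    of its true value."""
    return math.ceil(math.log(2 * k / delta) / (2 * epsilon ** 2))

def learn_distribution(sampler, support, epsilon=0.05, delta=0.01):
    """Estimate a distribution over a finite support by i.i.d. sampling."""
    n = sample_size(epsilon, delta, len(support))
    counts = {x: 0 for x in support}
    for _ in range(n):
        counts[sampler()] += 1
    return {x: c / n for x, c in counts.items()}

random.seed(0)
true_p = {'a': 0.5, 'b': 0.3, 'c': 0.2}
def draw():
    return random.choices(list(true_p), weights=list(true_p.values()))[0]

est = learn_distribution(draw, list(true_p))
print({x: round(v, 2) for x, v in est.items()})
```

The paper's extension to joint distributions over a vector space corresponds to treating each vector value as one cell of the support.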
Probability & Statistics: Modular Learning Exercises. Teacher Edition
ERIC Educational Resources Information Center
Actuarial Foundation, 2012
2012-01-01
The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…
Overcoming Challenges in Learning Probability Vocabulary
ERIC Educational Resources Information Center
Groth, Randall E.; Butler, Jaime; Nelson, Delmar
2016-01-01
Students can struggle to understand and use terms that describe probabilities. Such struggles lead to difficulties comprehending classroom conversations. In this article, we describe some specific misunderstandings a group of students (ages 11-12) held in regard to vocabulary such as "certain", "likely" and…
Activities in Elementary Probability, Monograph No. 9.
ERIC Educational Resources Information Center
Fouch, Daniel J.
This monograph on elementary probability for middle school, junior high, or high school consumer mathematics students is divided into two parts. Part one emphasizes lessons which cover the fundamental counting principle, permutations, and combinations. The 5 lessons of part I indicate the objectives, examples, methods, application, and problems…
Probability in Action: The Red Traffic Light
ERIC Educational Resources Information Center
Shanks, John A.
2007-01-01
Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…
Technique for Evaluating Multiple Probability Occurrences /TEMPO/
NASA Technical Reports Server (NTRS)
Mezzacappa, M. A.
1970-01-01
Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.
Monte Carlo methods to calculate impact probabilities
NASA Astrophysics Data System (ADS)
Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.
2014-09-01
Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find excellent agreement between all methods in the general case, while large differences appear in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward
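As a toy illustration of the Monte Carlo "super-sizing" idea (sample encounters over an enlarged target sphere and count hits), here is a heavily simplified sketch with straight-line trajectories and no gravitational focusing. It is not the paper's method, only the counting principle, and all numbers are invented:

```python
import math
import random

def hit_fraction(r_target, r_hill, n=200_000, seed=1):
    """Fraction of projectiles crossing the Hill sphere that strike the
    target, for straight-line paths with impact parameter distributed
    uniformly over the Hill-sphere cross-section (no gravitational focusing)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Uniform over a disk of radius r_hill: b = r_hill * sqrt(u).
        b = r_hill * math.sqrt(rng.random())
        hits += b < r_target
    return hits / n

est = hit_fraction(r_target=1.0, r_hill=10.0)
print(est)  # analytically (r_target / r_hill) ** 2 = 0.01
```

Rescaling the hit count from the enlarged sphere back to the physical target is the essence of super-sizing: the big sphere keeps the hit rate high enough to estimate with few samples.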
Xia, Shuangluo; Beckman, Jeff; Wang, Jimin; Konigsberg, William H.
2012-10-10
Residues in the nascent base pair binding pocket (NBP) of bacteriophage RB69 DNA polymerase (RB69pol) are responsible for base discrimination. Replacing Tyr567 with Ala leads to greater flexibility in the NBP, increasing the probability of misincorporation. We used the fluorescent cytosine analogue, 1,3-diaza-2-oxophenoxazine (tC{sup o}), to identify preinsertion step(s) altered by NBP flexibility. When tC{sup o} is the templating base in a wild-type (wt) RB69pol ternary complex, its fluorescence is quenched only in the presence of dGTP. However, with the RB69pol Y567A mutant, the fluorescence of tC{sup o} is also quenched in the presence of dATP. We determined the crystal structure of the dATP/tC{sup o}-containing ternary complex of the RB69pol Y567A mutant at 1.9 {angstrom} resolution and found that the incoming dATP formed two hydrogen bonds with an imino-tautomerized form of tC{sup o}. Stabilization of the dATP/tC{sup o} base pair involved movement of the tC{sup o} backbone sugar into the DNA minor groove and required tilting of the tC{sup o} tricyclic ring to prevent a steric clash with L561. This structure, together with the pre-steady-state kinetic parameters and dNTP binding affinity, estimated from equilibrium fluorescence titrations, suggested that the flexibility of the NBP, provided by the Y567 to Ala substitution, led to a more favorable forward isomerization step resulting in an increase in dNTP binding affinity.
ERIC Educational Resources Information Center
Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa
2011-01-01
This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods
ERIC Educational Resources Information Center
Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.
2012-01-01
Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…
ERIC Educational Resources Information Center
Karelitz, Tzur M.; Budescu, David V.
2004-01-01
When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…
ERIC Educational Resources Information Center
Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques
2010-01-01
P. R. Killeen's (2005a) probability of replication ("p[subscript rep]") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p[subscript rep]" is now routinely reported in "Psychological Science" and has also begun to appear in…
ERIC Educational Resources Information Center
Satake, Eiki; Amato, Philip P.
2008-01-01
This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES
G.A. Valentine; F.V. Perry; S. Dartevelle
2005-08-26
Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision
Cheating Probabilities on Multiple Choice Tests
NASA Astrophysics Data System (ADS)
Rizzuto, Gaspard T.; Walters, Fred
1997-10-01
This paper is strictly based on mathematical statistics and as such does not depend on prior performance; it assumes the probability of each choice to be identical. In a real-life situation, the probability of two students giving identical responses becomes larger the better the students are. However, the mathematical model is developed for all responses, both correct and incorrect, and provides a baseline for evaluation. David Harpp and coworkers (2, 3) at McGill University have evaluated ratios of exact errors in common (EEIC) to errors in common (EIC) and differences (D). In pairings where the ratio EEIC/EIC was greater than 0.75, the pair had unusually high odds against their answer pattern being random. EEIC/D ratios at values >1.0 indicate that pairs of students were seated adjacent to one another and copied from one another. The original papers should be examined for details.
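The EEIC, EIC, and D statistics described above can be computed directly from two students' answer strings and the answer key; a minimal sketch (the toy answer strings are invented):

```python
def copying_statistics(ans_a, ans_b, key):
    """Compare two answer strings against the key.

    EIC  = questions both students answered incorrectly
    EEIC = questions where both gave the *same* incorrect answer
    D    = questions where the two students' answers differ
    """
    eic = sum(1 for a, b, k in zip(ans_a, ans_b, key) if a != k and b != k)
    eeic = sum(1 for a, b, k in zip(ans_a, ans_b, key) if a == b and a != k)
    d = sum(1 for a, b in zip(ans_a, ans_b) if a != b)
    return eeic, eic, d

eeic, eic, d = copying_statistics("ABCDA", "ABCDB", "AACDC")
print(eeic, eic, d)  # 1 2 1
```

Here EEIC/EIC = 0.5, below the 0.75 flag threshold cited above, so this toy pair would not be flagged.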
Approaches to Evaluating Probability of Collision Uncertainty
NASA Technical Reports Server (NTRS)
Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented, and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
A probability distribution model for rain rate
NASA Technical Reports Server (NTRS)
Kedem, Benjamin; Pavlopoulos, Harry; Guan, Xiaodong; Short, David A.
1994-01-01
A systematic approach is suggested for modeling the probability distribution of rain rate. Rain rate, conditional on rain and averaged over a region, is modeled as a temporally homogeneous diffusion process with appropriate boundary conditions. The approach requires a drift coefficient (the conditional average instantaneous rate of change of rain intensity) as well as a diffusion coefficient (the conditional average magnitude of the rate of growth and decay of rain rate about its drift). Under certain assumptions on the drift and diffusion coefficients compatible with rain rate, a new parametric family, containing the lognormal distribution, is obtained for the continuous part of the stationary limit probability distribution. The family is fitted to tropical rainfall from Darwin and Florida, and it is found that the lognormal distribution provides adequate fits compared with other members of the family and also with the gamma distribution.
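For the lognormal member of the family, fitting reduces to estimating the mean and standard deviation of the log rain rates; a plain maximum-likelihood sketch on invented data (not the paper's Darwin or Florida datasets):

```python
import math

def fit_lognormal(rates):
    """Maximum-likelihood lognormal fit to positive rain rates:
    mu and sigma are the mean and (population) standard deviation
    of log(rate)."""
    logs = [math.log(r) for r in rates]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / n)
    return mu, sigma

# Hypothetical conditional rain rates (mm/h), illustrative only.
rates = [0.5, 1.2, 2.0, 3.5, 8.0, 15.0]
mu, sigma = fit_lognormal(rates)
print(round(mu, 3), round(sigma, 3))
```

Goodness of fit against other family members (or the gamma distribution, as in the paper) would then be compared on held-out data.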
Earthquake probabilities: theoretical assessments and reality
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.
2013-12-01
It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems in contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g., Poisson, periodic) or, conversely, delicately designed (e.g., STEP, ETAS) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on a choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance
Complex analysis methods in noncommutative probability
NASA Astrophysics Data System (ADS)
Teodor Belinschi, Serban
2006-02-01
In this thesis we study convolutions that arise from noncommutative probability theory. We prove several regularity results for free convolutions, and for measures in partially defined one-parameter free convolution semigroups. We discuss connections between Boolean and free convolutions and, in the last chapter, we prove that any infinitely divisible probability measure with respect to monotonic additive or multiplicative convolution belongs to a one-parameter semigroup with respect to the corresponding convolution. Earlier versions of some of the results in this thesis have already been published, while some others have been submitted for publication. We have preserved almost entirely the specific format for PhD theses required by Indiana University. This adds several unnecessary pages to the document, but we wanted to preserve the specificity of the document as a PhD thesis at Indiana University.
A quantum probability perspective on borderline vagueness.
Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter
2013-10-01
The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier's theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic counterpart is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
Approximate probability distributions of the master equation.
Thomas, Philipp; Grima, Ramon
2015-07-01
Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.
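The paper's orthogonal-polynomial approximation is not reproduced here, but the simplest master equation, a linear birth-death process, has an exact stationary solution by detailed balance that serves as a baseline for such approximations; a sketch (the rate values are illustrative):

```python
import math

def birth_death_stationary(k, gamma, n_max=50):
    """Stationary distribution of the birth-death master equation
    dP_n/dt = k P_{n-1} + gamma (n+1) P_{n+1} - (k + gamma n) P_n,
    built from detailed balance: P_{n+1} = k / (gamma (n+1)) * P_n."""
    p = [1.0]
    for n in range(n_max):
        p.append(p[-1] * k / (gamma * (n + 1)))
    z = sum(p)
    return [x / z for x in p]

p = birth_death_stationary(k=4.0, gamma=2.0)  # mean copy number k/gamma = 2
# For this linear case the stationary law is exactly Poisson(k/gamma).
poisson = [math.exp(-2.0) * 2.0 ** n / math.factorial(n) for n in range(len(p))]
print(max(abs(a - b) for a, b in zip(p, poisson)) < 1e-10)  # True
```

Nonlinear propensities (as in the metabolic and gene expression systems mentioned above) generally break this closed form, which is where series approximations like the paper's become useful.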
Transit probabilities for debris around white dwarfs
NASA Astrophysics Data System (ADS)
Lewis, John Arban; Johnson, John A.
2017-01-01
The discovery of WD 1145+017 (Vanderburg et al. 2015), a metal-polluted white dwarf with an infrared excess and transits, confirmed the long-held theory that at least some metal-polluted white dwarfs are actively accreting material from crushed-up planetesimals. A statistical understanding of WD 1145-like systems would inform us about the various pathways for metal pollution and the end states of planetary systems around medium- to high-mass stars. However, we have only one example, and there are presently no published studies of transit detection/discovery probabilities for white dwarfs within this interesting regime. We present a preliminary look at the transit probabilities for metal-polluted white dwarfs and their projected space density in the Solar Neighborhood, which will inform future searches for analogs to WD 1145+017.
Volcano shapes, entropies, and eruption probabilities
NASA Astrophysics Data System (ADS)
Gudmundsson, Agust; Mohajeri, Nahid
2014-05-01
We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
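The Gibbs-Shannon entropy mentioned above is straightforward to compute for a binned volcano profile; the sketch below confirms the abstract's point that a uniform (flat-field) distribution maximizes uncertainty (the two toy distributions are invented):

```python
import math

def shannon_entropy(p):
    """Gibbs-Shannon entropy of a discrete probability distribution, in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

uniform = [0.25] * 4            # flat volcanic field: maximal uncertainty
peaked = [0.7, 0.1, 0.1, 0.1]   # sharply peaked edifice: lower uncertainty
print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(peaked) < shannon_entropy(uniform))  # True
```

For continuous profiles the abstract's differential entropy replaces the sum with an integral over the density.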
Probability of identity by descent in metapopulations.
Kaj, I; Lascoux, M
1999-01-01
Equilibrium probabilities of identity by descent (IBD), for pairs of genes within individuals, for genes between individuals within subpopulations, and for genes between subpopulations are calculated in metapopulation models with fixed or varying colony sizes. A continuous-time analog to the Moran model was used in either case. For fixed-colony size both propagule and migrant pool models were considered. The varying population size model is based on a birth-death-immigration (BDI) process, to which migration between colonies is added. Wright's F statistics are calculated and compared to previous results. Adding between-island migration to the BDI model can have an important effect on the equilibrium probabilities of IBD and on Wright's index. PMID:10388835
Conflict Probability Estimation for Free Flight
NASA Technical Reports Server (NTRS)
Paielli, Russell A.; Erzberger, Heinz
1996-01-01
The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction and is less certain the farther in advance the prediction, however. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
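The covariance-combination step described above, adding the two aircraft's trajectory-error covariances into a single relative-position covariance, can be illustrated with a Monte Carlo estimate of conflict probability. The paper itself derives an analytical solution after a coordinate transformation; this sketch assumes diagonal covariances and invented numbers:

```python
import math
import random

def conflict_probability(rel_mean, cov_a, cov_b, radius, n=100_000, seed=7):
    """Monte Carlo estimate of P(|relative position| < radius), where the
    relative position error is normal with covariance cov_a + cov_b.
    Diagonal 2-D covariances (variance tuples) are assumed for simplicity."""
    rng = random.Random(seed)
    sx = math.sqrt(cov_a[0] + cov_b[0])  # combined along-track std dev
    sy = math.sqrt(cov_a[1] + cov_b[1])  # combined cross-track std dev
    hits = 0
    for _ in range(n):
        dx = rel_mean[0] + rng.gauss(0, sx)
        dy = rel_mean[1] + rng.gauss(0, sy)
        hits += math.hypot(dx, dy) < radius
    return hits / n

pc = conflict_probability(rel_mean=(3.0, 0.0), cov_a=(1.0, 1.0),
                          cov_b=(1.0, 1.0), radius=2.0)
print(round(pc, 2))
```

The analytical route in the paper avoids the sampling noise here, which is why it is preferred for real-time advisories.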
Approximate probability distributions of the master equation
NASA Astrophysics Data System (ADS)
Thomas, Philipp; Grima, Ramon
2015-07-01
Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.
Computing association probabilities using parallel Boltzmann machines.
Iltis, R A; Ting, P Y
1993-01-01
A new computational method is presented for solving the data association problem using parallel Boltzmann machines. It is shown that the association probabilities can be computed with arbitrarily small errors if a sufficient number of parallel Boltzmann machines are available. The probability β(i,j) that the ith measurement emanated from the jth target can be obtained simply by observing the relative frequency with which neuron v(i,j) in a two-dimensional network is on throughout the layers. Some simple tracking examples comparing the performance of the Boltzmann algorithm to the exact data association solution and with the performance of an alternative parallel method using the Hopfield neural network are also presented.
Nuclear data uncertainties: I, Basic concepts of probability
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
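Bayes' theorem, one of the topics listed, lends itself to the kind of simple numerical example the report emphasizes; the defect-detector numbers below are hypothetical:

```python
def bayes(prior, likelihood, evidence_given_not):
    """Posterior P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    num = likelihood * prior
    return num / (num + evidence_given_not * (1 - prior))

# Hypothetical detector: 1% defect rate, 95% true-positive rate,
# 10% false-positive rate. What fraction of flagged items are defective?
posterior = bayes(prior=0.01, likelihood=0.95, evidence_given_not=0.10)
print(round(posterior, 4))  # 0.0876
```

The low posterior despite a "95% accurate" test is the classic base-rate effect, exactly the sort of pitfall such a foundations report guards against.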
The Origin of Probability and Entropy
NASA Astrophysics Data System (ADS)
Knuth, Kevin H.
2008-11-01
Measuring is the quantification of ordering. Thus the process of ordering elements of a set is a more fundamental activity than measuring. Order theory, also known as lattice theory, provides a firm foundation on which to build measure theory. The result is a set of new insights that cast probability theory and information theory in a new light, while simultaneously opening the door to a better understanding of measures as a whole.
Calculating Cumulative Binomial-Distribution Probabilities
NASA Technical Reports Server (NTRS)
Scheuer, Ernest M.; Bowerman, Paul N.
1989-01-01
Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
Sampling probability distributions of lesions in mammograms
NASA Astrophysics Data System (ADS)
Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.
2015-03-01
One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a JavaScript Object Notation (JSON) format.
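Sampling lesion locations in proportion to a measured distribution, the heart of the method above, reduces to weighted sampling over histogram bins; a two-dimensional toy sketch (the histogram values are invented, and the real system uses a four-dimensional distribution plus piecewise affine mapping onto each breast outline):

```python
import random

def sample_from_histogram(hist, rng):
    """Sample a (row, col) bin from a 2-D histogram of observed lesion
    locations, with probability proportional to each bin's count."""
    cells = [(r, c) for r, row in enumerate(hist) for c, _ in enumerate(row)]
    weights = [hist[r][c] for r, c in cells]
    return rng.choices(cells, weights=weights)[0]

rng = random.Random(0)
hist = [[5, 0], [1, 4]]  # toy 2x2 location histogram (counts)
draws = [sample_from_histogram(hist, rng) for _ in range(10_000)]
frac = sum(1 for d in draws if d == (0, 0)) / len(draws)
print(abs(frac - 0.5) < 0.03)  # bin (0, 0) holds 5/10 of the mass
```

Mapping a sampled bin back to pixel coordinates in a specific mammogram is then the job of the affine transform described in the abstract.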
SureTrak Probability of Impact Display
NASA Technical Reports Server (NTRS)
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
Non-signalling Theories and Generalized Probability
NASA Astrophysics Data System (ADS)
Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek
2016-09-01
We provide mathematically rigorous justification of using the term probability in connection with the so-called non-signalling theories, also known as Popescu and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, our approach allows a straightforward generalization to more complicated systems.
Probability and Statistics in Aerospace Engineering
NASA Technical Reports Server (NTRS)
Rheinfurth, M. H.; Howell, L. W.
1998-01-01
This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.
A Quantum Probability Model of Causal Reasoning
Trueblood, Jennifer S.; Busemeyer, Jerome R.
2012-01-01
People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747
Theoretical Analysis of Rain Attenuation Probability
NASA Astrophysics Data System (ADS)
Roy, Surendra Kr.; Jha, Santosh Kr.; Jha, Lallan
2007-07-01
Satellite communication technologies are now highly developed, and high-quality, distance-independent services have expanded over a very wide area. For the system design of the Hokkaido integrated telecommunications (HIT) network, outages of satellite links due to rain attenuation in Ka frequency bands must first be overcome. In this paper a theoretical analysis of rain attenuation probability on a slant path has been made. The proposed formula is based on the Weibull distribution and incorporates recent ITU-R recommendations concerning the necessary rain rate and rain height inputs. The error behaviour of the model was tested against the rain attenuation prediction model recommended by ITU-R for a large number of experiments at different probability levels. The novel slant-path rain attenuation prediction model exhibits behaviour similar to the ITU-R model at low time percentages and a better root-mean-square error performance for probability levels above 0.02%. The set of presented models has the advantage of low implementation complexity and is considered useful for educational and back-of-the-envelope computations.
The Probability Distribution of Daily Streamflow
NASA Astrophysics Data System (ADS)
Blum, A.; Vogel, R. M.
2015-12-01
Flow duration curves (FDCs) are a graphical illustration of the cumulative distribution of streamflow. Daily streamflows often range over many orders of magnitude, making it extremely challenging to find a probability distribution function (pdf) which can mimic the steady state or period of record FDC (POR-FDC). Median annual FDCs (MA-FDCs) describe the pdf of daily streamflow in a typical year. For POR- and MA-FDCs, L-moment diagrams, visual assessments of FDCs and quantile-quantile probability plot correlation coefficients are used to evaluate goodness of fit (GOF) of candidate probability distributions. FDCs reveal that both four-parameter kappa (KAP) and three-parameter generalized Pareto (GP3) models result in very high GOF for the MA-FDC and a relatively lower GOF for POR-FDCs at over 500 rivers across the conterminous U.S. Physical basin characteristics, such as baseflow index, as well as hydroclimatic indices such as the aridity index and the runoff ratio are found to be correlated with one of the shape parameters (kappa) of the KAP and GP3 pdfs. Our work also reveals several important areas for future research, including improved parameter estimators for the KAP pdf, as well as increasing our understanding of the conditions which give rise to improved GOF of analytical pdfs to large samples of daily streamflows.
Probability of metastable states in Yukawa clusters
NASA Astrophysics Data System (ADS)
Ludwig, Patrick; Kaehlert, Hanno; Baumgartner, Henning; Bonitz, Michael
2008-11-01
Finite strongly coupled systems of charged particles in external traps are of high interest in many fields. Here we analyze the occurrence probabilities of ground and metastable states of spherical, three-dimensional Yukawa clusters by means of molecular dynamics and Monte Carlo simulations and an analytical method. We find that metastable states can occur with a higher probability than the ground state, thus confirming recent dusty plasma experiments with so-called Yukawa balls [1]. The analytical method [2], based on the harmonic approximation of the potential energy, allows for a very intuitive explanation of the probabilities when combined with the simulation results [3]. [1] D. Block, S. Käding, A. Melzer, A. Piel, H. Baumgartner, and M. Bonitz, Physics of Plasmas 15, 040701 (2008). [2] F. Baletto and R. Ferrando, Reviews of Modern Physics 77, 371 (2005). [3] H. Kählert, P. Ludwig, H. Baumgartner, M. Bonitz, D. Block, S. Käding, A. Melzer, and A. Piel, submitted for publication (2008).
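In the harmonic approximation mentioned above, a state's occurrence weight combines a Boltzmann factor for its energy with a prefactor from its normal-mode frequencies: a higher-energy metastable state with softer modes (more accessible phase space) can outweigh the ground state. The sketch below illustrates this with invented energies, frequencies, and temperature; it is not the paper's calculation.

```python
import math

def harmonic_weight(energy, frequencies, kT):
    """Relative occurrence probability of a (meta)stable state in the
    harmonic approximation: Boltzmann factor for the state's energy
    divided by the product of its normal-mode frequencies. Lower
    frequencies (a flatter potential well) give a larger weight."""
    prefactor = 1.0
    for w in frequencies:
        prefactor /= w
    return prefactor * math.exp(-energy / kT)

# hypothetical two-state cluster: the metastable state lies higher in
# energy but has softer modes, so it can still dominate
ground = harmonic_weight(energy=0.0, frequencies=[1.0, 1.0, 1.0], kT=0.2)
meta = harmonic_weight(energy=0.1, frequencies=[0.6, 0.6, 0.6], kT=0.2)
p_meta = meta / (ground + meta)
```

With these (invented) numbers the metastable state is the more probable one, mirroring the qualitative finding reported in the abstract.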
Atomic Transition Probabilities for Rare Earths
NASA Astrophysics Data System (ADS)
Curry, J. J.; Anderson, Heidi M.; den Hartog, E. A.; Wickliffe, M. E.; Lawler, J. E.
1996-10-01
Accurate absolute atomic transition probabilities for selected neutral and singly ionized rare earth elements including Tm, Dy, and Ho are being measured. The increasing use of rare earths in high intensity discharge lamps provides motivation; the data are needed for diagnosing and modeling the lamps. Radiative lifetimes, measured using time resolved laser induced fluorescence (LIF), are combined with branching fractions, measured using a large Fourier transform spectrometer (FTS), to determine accurate absolute atomic transition probabilities. More than 15,000 LIF decay curves from Tm and Dy atoms and ions in slow beams have been recorded and analyzed. Radiative lifetimes for 298 levels of TmI and TmII and for 450 levels of DyI and DyII are determined. Branching fractions are extracted from spectra recorded using the 1.0 m FTS at the National Solar Observatory. Branching fractions and absolute transition probabilities for 500 of the strongest TmI and TmII lines are complete. Representative lifetime and branching fraction data will be presented and discussed. Supported by Osram Sylvania Inc. and the NSF.
Bacteria survival probability in bactericidal filter paper.
Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M
2014-05-01
Bactericidal filter papers offer the simplicity of gravity filtration while simultaneously eradicating microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action, which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through a process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or by the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased the bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collisions between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were transferred directly from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Each bacterium must therefore undergo a certain number of collisions to take up enough biocide to be killed, and cells that do not reach this threshold number of collisions are expected to survive.
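The threshold-collision hypothesis above has a natural quantitative reading: if collisions with biocide-loaded micelles are random and independent, the survival probability is the probability of accumulating fewer than the threshold number of collisions. The sketch below assumes Poisson-distributed collisions, which is a modelling assumption of ours, not the paper's fitted model; the mean counts and threshold are invented.

```python
import math

def survival_probability(mean_collisions, threshold):
    """P(bacterium survives) = P(fewer than `threshold` collisions),
    with collision counts assumed Poisson distributed."""
    return sum(
        math.exp(-mean_collisions) * mean_collisions**k / math.factorial(k)
        for k in range(threshold)
    )

# doubling filter thickness roughly doubles the expected collision count,
# which should sharply reduce survival, as observed experimentally
thin = survival_probability(mean_collisions=3.0, threshold=5)
thick = survival_probability(mean_collisions=6.0, threshold=5)
```

This reproduces the qualitative experimental trend in the abstract: thicker (or multi-layer) biocidal filters yield a lower survival probability.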
The probability and severity of decompression sickness
Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.
2017-01-01
Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p << 0.01) improvement in trinomial model fit over the binomial (2-state) model. With the Type I/II definition, we found that the predicted probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928
CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS
Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
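The basic computation behind conditional probability analysis can be shown with counts: the probability of an effect given a stressor condition is the share of observations meeting the condition that also show the effect. The survey numbers and function name below are invented for illustration and are not from CPROB.

```python
def conditional_probability(n_both, n_condition):
    """P(effect | condition) estimated from counts: observations where
    both the conditioning event (e.g. a stressor exceeding a threshold)
    and the effect occurred, over all observations where the
    conditioning event occurred."""
    if n_condition == 0:
        raise ValueError("conditioning event never observed")
    return n_both / n_condition

# hypothetical survey: 40 sites exceed the contaminant threshold,
# 28 of those also show the ecological effect
p = conditional_probability(28, 40)
```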
Probability sampling in legal cases: Kansas cellphone users
NASA Astrophysics Data System (ADS)
Kadane, Joseph B.
2012-10-01
Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.
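The defining feature of probability sampling is that every unit has a known, nonzero chance of selection, which lets sample statistics carry quantified uncertainty. A minimal sketch using simple random sampling (the frame size, attribute share, and function name are invented; this is not the sampling design from the case):

```python
import math
import random

def estimate_proportion(population, sample_size, predicate, seed=0):
    """Estimate the share of a population with some attribute from a
    simple random sample (every subset of `sample_size` units equally
    likely), returning the point estimate and its approximate
    standard error."""
    rng = random.Random(seed)
    sample = rng.sample(population, sample_size)
    p_hat = sum(predicate(x) for x in sample) / sample_size
    se = math.sqrt(p_hat * (1 - p_hat) / sample_size)
    return p_hat, se

# hypothetical frame: 10,000 subscribers, 30% with the attribute at issue
population = [i < 3000 for i in range(10000)]
p_hat, se = estimate_proportion(population, 400, lambda x: x)
```

The standard error is what makes such an estimate defensible in a legal setting: it bounds how far the sample proportion is likely to sit from the population value.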
Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation
NASA Astrophysics Data System (ADS)
Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin
2016-12-01
If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
Elemental mercury poisoning probably causes cortical myoclonus.
Ragothaman, Mona; Kulkarni, Girish; Ashraf, Valappil V; Pal, Pramod K; Chickabasavaiah, Yasha; Shankar, Susarla K; Govindappa, Srikanth S; Satishchandra, Parthasarthy; Muthane, Uday B
2007-10-15
Mercury toxicity causes postural tremors, commonly referred to as "mercurial tremors," and cerebellar dysfunction. A 23-year-old woman developed disabling generalized myoclonus and ataxia 2 years after injecting herself with elemental mercury. Electrophysiological studies confirmed that the myoclonus was probably of cortical origin. Her deficits progressed over 2 years and improved after subcutaneous mercury deposits at the injection site were surgically cleared. Myoclonus of cortical origin has never been described in mercury poisoning. It is important to ask patients presenting with jerks about exposure to elemental mercury even if they have a progressive illness, as it is a potentially reversible condition, as in our patient.
The Prediction of Spatial Aftershock Probabilities (PRESAP)
NASA Astrophysics Data System (ADS)
McCloskey, J.
2003-12-01
It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the great number of local telemetered seismic networks, and the rapid acquisition of data from satellites, coupled with the speed of modern telecommunications and data transfer, all mean that these new techniques could be applied in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real-time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days after the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame, so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real-time as data of better quality become available over the following day to tens of days. Specifically, the project aim is to assess the
Probable interaction between trazodone and carbamazepine.
Sánchez-Romero, A; Mayordomo-Aranda, A; García-Delgado, R; Durán-Quintana, J A
2011-06-01
The need to maintain long-term treatment of chronic pathologies makes interactions possible when other drugs are added to such therapies to deal with aggravation of the same pathology or with other intercurrent pathologies. A case is presented in which the addition of trazodone to chronic treatment with carbamazepine (CBZ) was associated with symptoms typical of intoxication by this antiepileptic, accompanied by a raised serum concentration. When the trazodone was suspended, these symptoms lessened and the concentration of CBZ decreased progressively, suggesting a probable interaction between the 2 drugs.
Atomic transition probabilities of Gd I
NASA Astrophysics Data System (ADS)
Lawler, J. E.; Bilty, K. A.; Den Hartog, E. A.
2011-05-01
Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd I). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of first- and second-spectrum lines is also described and demonstrated in this work.
Atomic transition probabilities of Er I
NASA Astrophysics Data System (ADS)
Lawler, J. E.; Wyart, J.-F.; Den Hartog, E. A.
2010-12-01
Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.
Modulation Based on Probability Density Functions
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
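The observation in the abstract can be reproduced numerically: sampling a sinusoid over at least a half cycle and sorting the samples by frequency of occurrence yields the carrier's characteristic arcsine-shaped PDF, with density piling up near the waveform's extremes. The sketch below is an illustration of that observation only, not the proposed modulation scheme; bin count and function name are assumptions.

```python
import numpy as np

def waveform_pdf(samples, bins=16):
    """Histogram of waveform sample values, normalised to a
    probability density over the amplitude range [-1, 1]."""
    hist, edges = np.histogram(samples, bins=bins, range=(-1, 1), density=True)
    return hist, edges

# one half cycle of a unit sinusoid, densely and uniformly sampled in time
t = np.linspace(0.0, np.pi, 10001)
hist, edges = waveform_pdf(np.cos(t))
```

The density is highest in the outermost bins (near ±1), where the sinusoid spends the most time per unit amplitude, and lowest near zero crossing, which is the shape a demodulator would look for in the histogram.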
Probability density functions in turbulent channel flow
NASA Technical Reports Server (NTRS)
Dinavahi, Surya P. G.
1992-01-01
The probability density functions (pdf's) of the fluctuating velocity components, as well as their first and second derivatives, are calculated using data from the direct numerical simulations (DNS) of fully developed turbulent channel flow. It is observed that, beyond the buffer region, the pdf of each of these quantities is independent of the distance from the channel wall. It is further observed that, beyond the buffer region, the pdf's for all the first derivatives collapse onto a single universal curve and those of the second derivatives also collapse onto another universal curve, irrespective of the distance from the wall. The kinetic-energy dissipation rate exhibits log-normal behavior.
Estimating flood exceedance probabilities in estuarine regions
NASA Astrophysics Data System (ADS)
Westra, Seth; Leonard, Michael
2016-04-01
Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence of these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic-scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, the fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury Nepean River (New South Wales).
An all-timescales rainfall probability distribution
NASA Astrophysics Data System (ADS)
Papalexiou, S. M.; Koutsoyiannis, D.
2009-04-01
The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance, as the hydraulic design typically depends strongly on the choice of rainfall model. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable of adequately describing rainfall at all timescales, is a difficult task. A search of the hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, three- and four-parameter Kappa distributions, and many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions, and is capable of describing rainfall over a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution in various rainfall samples from different areas and for timescales varying from sub-hourly to annual.
Computation-distributed probability hypothesis density filter
NASA Astrophysics Data System (ADS)
Wang, Junjie; Zhao, Lingling; Su, Xiaohong; Shi, Chunmei; Ma, JiQuan
2016-12-01
Particle probability hypothesis density filtering has become a promising approach for multi-target tracking due to its capability of handling an unknown and time-varying number of targets in a nonlinear, non-Gaussian system. However, its computational complexity increases linearly with the number of obtained observations and the number of particles, which can be very time consuming, particularly when numerous targets and clutter exist in the surveillance region. To address this issue, we present a distributed-computation particle probability hypothesis density (PHD) filter for target tracking. It runs several local decomposed particle PHD filters in parallel on separate processing elements. Each processing element takes responsibility for a portion of the particles but all measurements, and provides local estimates. A central unit controls particle exchange among the processing elements and specifies a fusion rule to match and fuse the estimates from the different local filters. The proposed framework is suitable for parallel implementation. Simulations verify that the proposed method significantly accelerates computation while maintaining accuracy comparable to the standard particle PHD filter.
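The partitioning idea above (each processing element owns a share of the particles but sees all measurements, with a central unit fusing local results) can be sketched with a deliberately simplified weight update. The Gaussian likelihood stands in for the full PHD update, whose detection and clutter terms are omitted here; all names and numbers are assumptions for illustration.

```python
import numpy as np

def local_update(particles, weights, measurements, sigma=1.0):
    """Simplified PHD-style weight update for a *portion* of the
    particle set: a local filter sees all measurements but only its
    own particles (detection and clutter terms omitted)."""
    likelihood = np.zeros_like(weights)
    for z in measurements:
        likelihood += np.exp(-0.5 * ((particles - z) / sigma) ** 2)
    return weights * likelihood

def fused_estimate(partitions):
    """Central unit: merge local results; the total weight (integral
    of the PHD) estimates the expected number of targets."""
    return sum(w.sum() for _, w in partitions)

rng = np.random.default_rng(1)
particles = rng.uniform(0.0, 10.0, 1000)   # 1-D state for simplicity
weights = np.full(1000, 2.0 / 1000)        # prior intensity: ~2 targets
measurements = [2.5, 7.5]

# split the particle set across two processing elements
halves = [(particles[:500], weights[:500]), (particles[500:], weights[500:])]
parts = [(p, local_update(p, w, measurements)) for p, w in halves]
```

Because this update is elementwise per particle, summing the local totals reproduces exactly what a single monolithic filter would compute, which is the property that makes the partitioning safe to parallelise.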
Measures, Probability and Holography in Cosmology
NASA Astrophysics Data System (ADS)
Phillips, Daniel
This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second, related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10⁻¹²³ and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature, or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We
Significance of "high probability/low damage" versus "low probability/high damage" flood events
NASA Astrophysics Data System (ADS)
Merz, B.; Elmer, F.; Thieken, A. H.
2009-06-01
The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage; traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus that of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than the expected annual damage expresses. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced into the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making: different flood mitigation decisions are probable when risk aversion is taken into account.
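The expected annual damage discussed above is the area under the damage versus annual-exceedance-probability curve, so its value can be decomposed by probability range to see which events dominate. The sketch below uses invented damage-probability pairs and a trapezoidal integration; it is an illustration of the quantity, not the case-study data.

```python
import numpy as np

def expected_annual_damage(aep, damage):
    """Expected annual damage (EAD): area under the damage vs.
    annual-exceedance-probability curve via the trapezoidal rule.
    `aep` must be ascending, with `damage` aligned to it."""
    widths = np.diff(aep)
    mean_damage = (damage[:-1] + damage[1:]) / 2.0
    return float(np.sum(widths * mean_damage))

# hypothetical damage-probability pairs for one community (EUR)
aep = np.array([0.001, 0.01, 0.1, 0.5])
damage = np.array([50e6, 10e6, 3e6, 1e6])
ead = expected_annual_damage(aep, damage)

# contribution of the frequent, low-damage range (AEP 0.1 to 0.5) alone
frequent = expected_annual_damage(aep[2:], damage[2:])
```

With these (invented) numbers the frequent, low-damage range contributes the largest single share of the EAD even though the rare events are far more damaging per occurrence, which is the pattern the abstract reports for the German case studies.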
Not All Probabilities Are Equivalent: Evidence From Orientation Versus Spatial Probability Learning.
Jabar, Syaheed B; Anderson, Britt
2017-02-23
Frequently, targets are detected faster, probable locations searched earlier, and likely orientations estimated more precisely. Are these all consequences of a single, domain-general "attentional" effect? To examine this issue, participants were shown brief instances of spatial gratings and were tasked to draw their location and orientation. Unknown to participants, either the location or the orientation probability of these gratings was manipulated. While orientation probability affected the precision of orientation reports, spatial probability did not. Further, utilising lowered stimulus contrast (via a staircase procedure) and a combination of behavioural precision and confidence self-report, we separated trials with perceived stimuli from trials where the target was not detected: spatial probability modulated the likelihood of stimulus detection, but did not modulate perceptual precision. Even when no physical attentional cues are present, acquired probabilistic information on space versus orientation leads to separable 'attention-like' effects on behaviour. We discuss how this could be linked to distinct underlying neural mechanisms.
Naive Probability: A Mental Model Theory of Extensional Reasoning.
ERIC Educational Resources Information Center
Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul
1999-01-01
Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…
Economic choices reveal probability distortion in macaque monkeys.
Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram
2015-02-18
Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing.
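The inverted-S weighting functions recovered from the monkeys' choices can be illustrated with the one-parameter Tversky-Kahneman form; note this is a standard form from the human literature, not necessarily the exact parametrization fitted in the study:

```python
# Sketch of an inverted-S probability weighting function
# (one-parameter Tversky-Kahneman form).

def weight(p, gamma=0.61):
    """w(p) = p^g / (p^g + (1-p)^g)^(1/g); gamma < 1 yields the classic
    inverted-S: low probabilities overweighted, high ones underweighted."""
    num = p ** gamma
    den = (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    return num / den

low, high = weight(0.1), weight(0.9)
print(low, high)   # low > 0.1 (overweighting), high < 0.9 (underweighting)
```

With gamma = 1 the function reduces to the identity w(p) = p, i.e. no distortion, which is the natural null model against which the fitted curves are compared.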
On the probability of dinosaur fleas.
Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F
2016-01-11
Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data.
Quantum probabilities for inflation from holography
Hartle, James B.; Hawking, S. W.; Hertog, Thomas
2014-01-01
The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in h-bar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.
Carrier Modulation Via Waveform Probability Density Function
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2006-01-01
Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
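A toy sketch of the idea, under assumed details the abstract leaves open: two waveform states with equal average power but different amplitude pdfs (uniform vs. Gaussian), and a receiver that classifies each symbol by a histogram-shape statistic (here, sample kurtosis):

```python
import random

# Sketch of pdf-shape keying: both symbols occupy the same band and power,
# but differ in amplitude distribution; the receiver discriminates by a
# distribution-shape statistic. Purely illustrative, not the paper's design.

random.seed(1)

def tx(bit, n=5000):
    # bit 0 -> uniform amplitudes; bit 1 -> Gaussian amplitudes (equal variance)
    if bit == 0:
        return [random.uniform(-1, 1) for _ in range(n)]
    return [random.gauss(0, 0.577) for _ in range(n)]

def kurtosis(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return sum((x - m) ** 4 for x in xs) / len(xs) / v ** 2

def rx(xs):
    # uniform kurtosis ~ 1.8, Gaussian ~ 3.0; threshold in between
    return 0 if kurtosis(xs) < 2.4 else 1

bits = [0, 1, 1, 0, 1]
decoded = [rx(tx(b)) for b in bits]
print(decoded)
```

Any statistic sensitive to histogram shape (or a full histogram match, as the abstract suggests) would serve; kurtosis is just a compact choice for the sketch.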
Probability-one homotopies in computational science
NASA Astrophysics Data System (ADS)
Watson, Layne T.
2002-03-01
Probability-one homotopy algorithms are a class of methods for solving nonlinear systems of equations that, under mild assumptions, are globally convergent for a wide range of problems in science and engineering. Convergence theory, robust numerical algorithms, and production quality mathematical software exist for general nonlinear systems of equations, and special cases such as Brouwer fixed point problems, polynomial systems, and nonlinear constrained optimization. Using a sample of challenging scientific problems as motivation, some pertinent homotopy theory and algorithms are presented. The problems considered are analog circuit simulation (for nonlinear systems), reconfigurable space trusses (for polynomial systems), and fuel-optimal orbital rendezvous (for nonlinear constrained optimization). The mathematical software packages HOMPACK90 and POLSYS_PLP are also briefly described.
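The core construction can be illustrated on a scalar equation: embed F(x) = 0 in the homotopy H(x, t) = t*F(x) + (1 - t)*(x - a) and track the zero curve from the trivial solution x = a at t = 0 to t = 1. This toy tracker is not HOMPACK90 or POLSYS_PLP; the test equation, starting point, and fixed-step scheme are assumptions for illustration:

```python
# Minimal sketch of a homotopy continuation for a scalar nonlinear equation,
# H(x, t) = t*F(x) + (1 - t)*(x - a): step t from 0 to 1, correcting with
# Newton's method at each t from the previous solution.

def F(x):
    return x ** 3 - 2 * x - 5          # a classic test equation

def dF(x):
    return 3 * x ** 2 - 2

def track(a=0.0, steps=100):
    x = a                              # trivial solution at t = 0
    for k in range(1, steps + 1):
        t = k / steps
        for _ in range(20):            # Newton correction at fixed t
            h = t * F(x) + (1 - t) * (x - a)
            dh = t * dF(x) + (1 - t)
            x -= h / dh
    return x

root = track()
print(root, F(root))                   # F(root) should be ~0
```

The production codes the abstract cites track the zero curve in arc length with adaptive steps and handle many dimensions; this fixed-step scalar version only conveys the shape of the method.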
Audio feature extraction using probability distribution function
NASA Astrophysics Data System (ADS)
Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.
2015-05-01
Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently in biometric and multimedia information retrieval systems. This technology derives from successive research on audio feature extraction analysis. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses only the PDF as a feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
Microtechnique for most-probable-number analysis.
Rowe, R; Todd, R; Waide, J
1977-03-01
A microtechnique based on the most-probable-number (MPN) method has been developed for the enumeration of the ammonium-oxidizing population in soil samples. An MPN table for a research design ([8 by 12] i.e., 12 dilutions, 8 replicates per dilution) is presented. A correlation of 0.68 was found between MPNs determined by the microtechnique and the standard tube technique. Higher MPNs were obtained with the microtechnique with increased accuracy in endpoint determinations being a possible cause. Considerable savings of time, space, equipment, and reagents are observed using this method. The microtechnique described may be adapted to other microbial populations using various types of media and endpoint determinations.
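Behind any MPN table is a maximum-likelihood calculation: with n_i replicate tubes at inoculum volume v_i and x_i of them positive, the density m maximizes a product of binomial likelihoods with p_i = 1 - exp(-m*v_i). A rough sketch, using a hypothetical 8-replicate dilution series (the volumes and positive counts below are made up, not from the study):

```python
import math

# Sketch of the ML estimate underlying an MPN table: maximize
#   L(m) = prod_i  p_i^x_i * (1 - p_i)^(n_i - x_i),  p_i = 1 - exp(-m*v_i),
# here by a coarse log-spaced grid followed by local hill-climbing refinement.

def log_lik(m, tubes):
    ll = 0.0
    for v, n, x in tubes:
        p = 1 - math.exp(-m * v)
        if x > 0:
            ll += x * math.log(p)
        if n - x > 0:
            ll += (n - x) * (-m * v)   # log(1 - p) = -m*v
        # binomial coefficient omitted: constant in m
    return ll

def mpn(tubes):
    # coarse scan over m = 1e-4 .. 1e4, then shrinking-step refinement
    best = max((log_lik(10 ** e, tubes), 10 ** e)
               for e in [i / 50 for i in range(-200, 201)])
    m = best[1]
    for step in [m / 2 ** k for k in range(1, 40)]:
        if log_lik(m + step, tubes) > log_lik(m, tubes):
            m += step
        elif log_lik(m - step, tubes) > log_lik(m, tubes):
            m -= step
    return m

# Hypothetical series: volumes in mL, 8 replicates per dilution
tubes = [(0.1, 8, 8), (0.01, 8, 5), (0.001, 8, 1)]
est = mpn(tubes)
print(est)    # estimated organisms per mL
```

An 8-by-12 design as in the abstract is handled the same way, just with 12 (v, n, x) triples; published MPN tables tabulate exactly these maximizers.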
Continuity of percolation probability on hyperbolic graphs
NASA Astrophysics Data System (ADS)
Wu, C. Chris
1997-05-01
Let T_k be a rooted tree of degree k in which each vertex other than the origin has k children and one parent, and the origin has k children but no parent (k ≥ 2). Define G to be the graph obtained by adding to T_k nearest-neighbor bonds connecting the vertices which are in the same generation. G is regarded as a discretization of the hyperbolic plane H^2 in the same sense that Z^d is a discretization of R^d. Independent percolation on G has been proved to have multiple phase transitions. We prove that the percolation probability θ(p) is continuous on [0,1] as a function of p.
Probability and delay discounting of erotic stimuli.
Lawyer, Steven R
2008-09-01
Adult undergraduate men (n=38) and women (n=33) were categorized as erotica "users" (n=34) and "non-users" (n=37) based on their responses to screening questions and completed computerized delay and probability discounting tasks concerning hypothetical money and erotica. Erotica users discounted the value of erotica similarly to money on three of the four erotica tasks; erotica non-users discounted the value of money consistent with erotica users, but not the value of erotica. Erotica users were disproportionately male, scored higher on several psychometric measures of sexuality-related constructs, and exhibited more impulsive choice patterns on the delay discounting for money task than erotica non-users did. These findings suggest that discounting processes generalize to erotic outcomes for some individuals.
Trending in Probability of Collision Measurements
NASA Technical Reports Server (NTRS)
Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.
2015-01-01
A simple model is proposed to predict the behavior of probabilities of collision (P_c) for conjunction events. The model attempts to predict the location and magnitude of the peak P_c value for an event by assuming the progression of P_c values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized, and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P_c and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
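The deterministic core of the model, fitting a downward-opening parabola to the progression of log10(Pc) values and reading off the vertex, can be sketched as follows; the three observations are hypothetical, and the Bayesian prior machinery of the paper is omitted:

```python
# Sketch: fit the unique parabola through three (time, log10 Pc) points and
# return its vertex, i.e. the predicted time and magnitude of the peak.

def peak_from_three(pts):
    """Vertex of the parabola y = a t^2 + b t + c through three (t, y) points
    (Lagrange coefficients)."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    d = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / d
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / d
    c = (x2 * x3 * (x2 - x3) * y1 + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / d
    t_peak = -b / (2 * a)
    return t_peak, a * t_peak**2 + b * t_peak + c

# Hypothetical log10(Pc) values observed 5, 4, and 3 days before closest approach
t_peak, y_peak = peak_from_three([(-5.0, -6.0), (-4.0, -5.0), (-3.0, -4.4)])
print(t_peak, y_peak)   # predicted time (days to TCA) and log10(Pc) at the peak
```

In the paper the parabola's coefficients are treated as uncertain, with priors learned from past conjunctions; this three-point interpolation is only the first-order model being regularized.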
Carrier Modulation Via Waveform Probability Density Function
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2004-01-01
Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ONEs or ZEROs. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental natural laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
Microtechnique for Most-Probable-Number Analysis
Rowe, R.; Todd, R.; Waide, J.
1977-01-01
A microtechnique based on the most-probable-number (MPN) method has been developed for the enumeration of the ammonium-oxidizing population in soil samples. An MPN table for a research design ([8 by 12] i.e., 12 dilutions, 8 replicates per dilution) is presented. A correlation of 0.68 was found between MPNs determined by the microtechnique and the standard tube technique. Higher MPNs were obtained with the microtechnique with increased accuracy in endpoint determinations being a possible cause. Considerable savings of time, space, equipment, and reagents are observed using this method. The microtechnique described may be adapted to other microbial populations using various types of media and endpoint determinations. PMID:16345226
Parabolic Ejecta Features on Titan? Probably Not
NASA Astrophysics Data System (ADS)
Lorenz, R. D.; Melosh, H. J.
1996-03-01
Radar mapping of Venus by Magellan indicated a number of dark parabolic features, associated with impact craters. A suggested mechanism for generating such features is that ejecta from the impact event is 'winnowed' by the zonal wind field, with smaller ejecta particles falling out of the atmosphere more slowly, and hence drifting further. What discriminates such features from simple wind streaks is the 'stingray' or parabolic shape. This is due to the ejecta's spatial distribution prior to being winnowed during fallout, and this distribution is generated by the explosion plume of the impact piercing the atmosphere, allowing the ejecta to disperse pseudoballistically before re-entering the atmosphere, decelerating to terminal velocity and then being winnowed. Here we apply this model to Titan, which has a zonal wind field similar to that of Venus. We find that Cassini will probably not find parabolic features, as the winds stretch the deposition so far that ejecta will form streaks or bands instead.
Bayesian probability approach to ADHD appraisal.
Robeva, Raina; Penberthy, Jennifer Kim
2009-01-01
Accurate diagnosis of attentional disorders such as attention-deficit hyperactivity disorder (ADHD) is imperative because there are multiple negative psychosocial sequelae related to undiagnosed and untreated ADHD. Early and accurate detection can lead to effective intervention and prevention of negative sequelae. Unfortunately, diagnosing ADHD presents a challenge to traditional assessment paradigms because there is no single test that definitively establishes its presence. Even though ADHD is a physiologically based disorder with a multifactorial etiology, the diagnosis has been traditionally based on a subjective history of symptoms. In this chapter we outline a stochastic method that utilizes a Bayesian interface for quantifying and assessing ADHD. It can be used to combine a variety of psychometric tests and physiological markers into a single standardized instrument that, on each step, refines a probability for ADHD for each individual based on information provided by the individual assessments. The method is illustrated with data from a small study of six college female students with ADHD and six matched controls, in which the method achieves correct classification for all participants, where none of the individual assessments was capable of achieving perfect classification. Further, we provide a framework for applying this Bayesian method for performing meta-analysis of data obtained from disparate studies and using disparate tests for ADHD, based on calibration of the data into a unified probability scale. We use this method to combine data from five studies that examine the diagnostic abilities of different behavioral rating scales and EEG assessments of ADHD, enrolling a total of 56 ADHD and 55 control subjects of different age groups and gender.
Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas
2016-06-01
Probability distributions are useful for modeling, simulation, analysis, and inference on a variety of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation and augmenting the classical curriculum with interactive webapps.
Helton, J.C.
1996-03-01
A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
Transit probabilities around hypervelocity and runaway stars
NASA Astrophysics Data System (ADS)
Fragione, G.; Ginsburg, I.
2017-04-01
In the blooming field of exoplanetary science, NASA's Kepler Space Telescope has revolutionized our understanding of exoplanets. Kepler's very precise and long-duration photometry is ideal for detecting planetary transits around Sun-like stars. The forthcoming Transiting Exoplanet Survey Satellite (TESS) is expected to continue Kepler's legacy. Along with transits, the Doppler technique remains an invaluable tool for discovering planets. The next generation of spectrographs, such as G-CLEF, promise precision radial velocity measurements. In this paper, we explore the possibility of detecting planets around hypervelocity and runaway stars, which should host very compact systems as a consequence of their turbulent origin. We find that the probability of a multiplanetary transit is 10^-3 ≲ P ≲ 10^-1. We therefore need to observe ∼10-1000 high-velocity stars to spot a transit. However, even if transits are rare around runaway and hypervelocity stars, the chances of detecting such planets using radial velocity surveys are high. We predict that the European Gaia satellite, along with TESS and the new-generation spectrographs G-CLEF and ESPRESSO, will spot planetary systems orbiting high-velocity stars.
Essays on probability elicitation scoring rules
NASA Astrophysics Data System (ADS)
Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.
2012-10-01
In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and also to consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
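The contrast between the two perspectives can be made concrete on a discrete support: a classical SR (here the Brier score) compares f to the indicator d of the realized θ*, while the proposed variant compares f to the generating distribution g. The distributions below are invented for illustration, and KL divergence stands in for the paper's entropy-like adherence measures:

```python
import math

# Sketch contrasting the two scoring perspectives on a discrete support.

support = [0, 1, 2, 3]
f = [0.1, 0.6, 0.2, 0.1]       # expert's elicited distribution (hypothetical)
g = [0.05, 0.7, 0.2, 0.05]     # aleatory distribution of Theta (hypothetical)
observed = 1                   # realized value theta*

def brier(f, observed):
    # classical SR: discrepancy between f and the indicator d of theta*
    d = [1.0 if k == observed else 0.0 for k in support]
    return sum((fi - di) ** 2 for fi, di in zip(f, d))

def kl(g, f):
    # entropy-like adherence of f to g (replaces d by g); lower is better
    return sum(gi * math.log(gi / fi) for gi, fi in zip(g, f) if gi > 0)

print(brier(f, observed), kl(g, f))
```

The indicator-based score depends on the single draw θ*, whereas the adherence score depends only on how well f matches g, which is the point of the proposed extension to aleatory Θ.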
Probability judgments under ambiguity and conflict.
Smithson, Michael
2015-01-01
Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at "best" probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of "best" estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity.
Atomic Transition Probabilities for Neutral Cerium
NASA Astrophysics Data System (ADS)
Lawler, J. E.; den Hartog, E. A.; Wood, M. P.; Nitz, D. E.; Chisholm, J.; Sobeck, J.
2009-10-01
The spectra of neutral cerium (Ce I) and singly ionized cerium (Ce II) are more complex than spectra of other rare earth species. The resulting high density of lines in the visible makes Ce ideal for use in metal halide (MH) High Intensity Discharge (HID) lamps. Inclusion of cerium-iodide in a lamp dose can improve both the Color Rendering Index and luminous efficacy of a MH-HID lamp. Basic spectroscopic data including absolute atomic transition probabilities for Ce I and Ce II are needed for diagnosing and modeling these MH-HID lamps. Recent work on Ce II [1] is now being augmented with similar work on Ce I. Radiative lifetimes from laser induced fluorescence measurements [2] on neutral Ce are being combined with emission branching fractions from spectra recorded using a Fourier transform spectrometer. A total of 14 high resolution spectra are being analyzed to determine branching fractions for 2000 to 3000 lines from 153 upper levels in neutral Ce. Representative data samples and progress to date will be presented. [1] J. E. Lawler, C. Sneden, J. J. Cowan, I. I. Ivans, and E. A. Den Hartog, Astrophys. J. Suppl. Ser. 182, 51-79 (2009). [2] E. A. Den Hartog, K. P. Buettner, and J. E. Lawler, J. Phys. B: Atomic, Molecular & Optical Physics 42, 085006 (7pp) (2009).
Atomic Transition Probabilities for Neutral Cerium
NASA Astrophysics Data System (ADS)
Chisholm, John; Nitz, D.; Sobeck, J.; Den Hartog, E. A.; Wood, M. P.; Lawler, J. E.
2010-01-01
Among the rare earth species, the spectra of neutral cerium (Ce I) and singly ionized cerium (Ce II) are some of the most complex. Like other rare earth species, Ce has many lines in the visible which are suitable for elemental abundance studies. Recent work on Ce II transition probabilities [1] is now being augmented with similar work on Ce I for future studies using such lines from astrophysical sources. Radiative lifetimes from laser induced fluorescence measurements [2] on neutral Ce are being combined with emission branching fractions from spectra recorded using a Fourier transform spectrometer. A total of 14 high resolution spectra are being analyzed to determine branching fractions for 2500 to 3000 lines from 153 upper levels in neutral Ce. Representative data samples and progress to date will be presented. This work was supported by the National Science Foundation's REU program and the Department of Defense's ASSURE program through NSF Award AST-0453442 and NSF Grant CTS0613277. [1] J. E. Lawler, C. Sneden, J. J. Cowan, I. I. Ivans, and E. A. Den Hartog, Astrophys. J. Suppl. Ser. 182, 51-79 (2009). [2] E. A. Den Hartog, K. P. Buettner, and J. E. Lawler, J. Phys. B: Atomic, Molecular & Optical Physics 42, 085006 (7pp) (2009).
Lectures on probability and statistics. Revision
Yost, G.P.
1985-06-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution, "best" according to every criterion.
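The lectures' a priori dice reasoning is easy to make concrete: with fair dice every ordered outcome is equally likely, so any event's probability is just a count of favorable outcomes over the total (36 for two dice):

```python
from itertools import product
from fractions import Fraction

# The a priori calculation from the lectures' dice example: with fair dice,
# each ordered outcome is equally likely, so
# P(event) = (favorable outcomes) / (total outcomes).

def prob_sum(target, dice=2, sides=6):
    outcomes = list(product(range(1, sides + 1), repeat=dice))
    hits = sum(1 for o in outcomes if sum(o) == target)
    return Fraction(hits, len(outcomes))

print(prob_sum(7))    # 1/6, the most probable two-dice sum
print(prob_sum(2))    # 1/36
```

The inverse (statistical) problem the notes end with would start instead from observed rolls and ask which die probabilities best explain them, which is where the "no cut-and-dried best solution" caveat applies.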
Insights into decision making using choice probability.
Crapse, Trinity B; Basso, Michele A
2015-12-01
A long-standing question in systems neuroscience is how the activity of single neurons gives rise to our perceptions and actions. Critical insights into this question occurred in the last part of the 20th century when scientists began linking modulations of neuronal activity directly to perceptual behavior. A significant conceptual advance was the application of signal detection theory to both neuronal activity and behavior, providing a quantitative assessment of the relationship between brain and behavior. One metric that emerged from these efforts was choice probability (CP), which provides information about how well an ideal observer can predict the choice an animal makes from a neuron's discharge rate distribution. In this review, we describe where CP has been studied, locational trends in the values found, and why CP values are typically so low. We discuss its dependence on correlated activity among neurons of a population, assess whether it arises from feedforward or feedback mechanisms, and investigate what CP tells us about how many neurons are required for a decision and how they are pooled to do so.
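Computationally, CP is the area under the ROC curve comparing a neuron's discharge-rate distributions conditioned on the two choices, equivalently the probability that a rate drawn from one distribution exceeds a rate drawn from the other, with ties counted half. The spike rates below are fabricated for illustration:

```python
# Sketch of the choice probability (CP) computation: the ideal observer's
# accuracy in predicting the animal's choice from single-trial firing rates,
# computed as the Mann-Whitney / area-under-ROC statistic.

def choice_probability(rates_a, rates_b):
    wins = sum(1.0 if a > b else 0.5 if a == b else 0.0
               for a in rates_a for b in rates_b)
    return wins / (len(rates_a) * len(rates_b))

rates_a = [12, 15, 11, 14, 13, 16]   # spikes/s on choice-A trials (made up)
rates_b = [11, 12, 10, 13, 12, 11]   # spikes/s on choice-B trials (made up)
cp = choice_probability(rates_a, rates_b)
print(cp)   # 0.5 would mean the discharge rate carries no choice information
```

Identical distributions give CP = 0.5 exactly, which is why the review's discussion centers on how far above 0.5 measured values sit and what correlated population activity that implies.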
Probability judgments under ambiguity and conflict
Smithson, Michael
2015-01-01
Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081
Parametric probability distributions for anomalous change detection
Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C
2010-01-01
The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
Statistical Physics of Pairwise Probability Models
Roudi, Yasser; Aurell, Erik; Hertz, John A.
2009-01-01
Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the mean values and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models. PMID:19949460
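For instance, naive mean-field inversion, one of the approximate fitting methods this literature compares, recovers couplings from just the means and pairwise correlations. A sketch for ±1 data (the function name and test data are invented, and the naive mean-field formulas are only an approximation to the exact fit):

```python
import numpy as np

def naive_mean_field_fit(s):
    """Infer pairwise-model (Ising) parameters from binary +/-1 data via naive
    mean-field inversion: couplings J_ij = -(C^{-1})_ij for i != j, and
    fields h_i = atanh(m_i) - sum_j J_ij m_j.
    s: (n_samples, n_units) array of +/-1 states."""
    m = s.mean(axis=0)                 # magnetizations <s_i>
    C = np.cov(s, rowvar=False)        # connected correlations
    J = -np.linalg.inv(C)
    np.fill_diagonal(J, 0.0)           # no self-couplings
    h = np.arctanh(m) - J @ m
    return h, J

rng = np.random.default_rng(1)
s = np.where(rng.random((2000, 5)) < 0.5, 1.0, -1.0)  # independent spins
h, J = naive_mean_field_fit(s)
```

For independent spins the inferred couplings should be near zero up to sampling noise, which is one quick sanity check on such an inversion.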
Grain Exchange Probabilities Within a Gravel Bed
NASA Astrophysics Data System (ADS)
Haschenburger, J.
2008-12-01
Sediment transfers in gravel-bed rivers involve the vertical exchange of sediments during floods. These exchanges regulate the virtual velocity of sediment and bed material texture. This study describes general tendencies in the vertical exchange of gravels within the substrate that result from multiple floods. Empirical observations come from Carnation Creek, a small gravel-bed river with large woody debris located on the west coast of Vancouver Island, British Columbia. Frequent floods and the relatively limited armor layer facilitate streambed activity and relatively high bedload transport rates, typically under partial sediment transport conditions. Over 2000 magnetically tagged stones, ranging in size from 16 to 180 mm, were deployed on the bed surface between 1991 and 1992. These tracers have been recovered 10 times over 12 flood seasons to quantify their vertical position in the streambed. For analysis, the bed is divided into layers based on armor layer thickness. Once tracers are well mixed within the streambed, grains in the surface layer are most likely to be mixed into the subsurface, while subsurface grains are most likely to persist within the subsurface. Fractional exchange probabilities approach size independence when the most active depth of the substrate is considered. Overall these results highlight vertical mixing as an important process in the dispersion of gravels.
Topology of spaces of probability measures
NASA Astrophysics Data System (ADS)
Banakh, T. O.; Radul, T. N.
1997-08-01
We study the space \widehat{P}(X) of Radon probability measures on a metric space X and its subspaces P_c(X), P_d(X) and P_\omega(X) of continuous measures, discrete measures, and finitely supported measures, respectively. It is proved that for any completely metrizable space X, the space \widehat{P}(X) is homeomorphic to a Hilbert space. A topological classification is obtained for the pairs (\widehat{P}(K),\widehat{P}(X)), (\widehat{P}(K),P_d(Y)) and (\widehat{P}(K),P_c(Z)), where K is a metric compactum, X an everywhere dense Borel subset of K, Y an everywhere dense F_{\sigma\delta}-set of K, and Z an everywhere uncountable everywhere dense Borel subset of K of sufficiently high Borel class. Conditions on the pair (X,Y) are found that are necessary and sufficient for the pair (\widehat{P}(X),P_\omega(Y)) to be homeomorphic to (l^2(A),l^2_f(A)).
Do aftershock probabilities decay with time?
Michael, Andrew J.
2012-01-01
So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
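The cancellation in this thought experiment is easy to verify numerically: with rate K/t and a window whose width grows in proportion to t, the expected count is K ln((1+f)/(1-f)) regardless of t. The sketch below assumes the simplified rate from the text (c ignored, p = 1) and an arbitrary productivity K:

```python
import math

K = 100.0  # productivity constant in the simplified Omori law (rate = K / t)

def expected_aftershocks(t, frac=0.10):
    """Expected number of aftershocks in the window [t*(1-frac), t*(1+frac)],
    obtained by integrating the rate K/t over the window."""
    return K * (math.log(t * (1 + frac)) - math.log(t * (1 - frac)))

# A day, a week, a month, a year, a century from now (in days): same count.
counts = [expected_aftershocks(t) for t in (1.0, 7.0, 30.0, 365.0, 36500.0)]
```

The t inside the two logarithms cancels, which is exactly why the expected number of aftershocks "a day from now" and "a century from now" is the same under these assumptions.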
Cook, Richard D.
2016-05-25
The ParaDIS_lib software is a project that is funded by the DOE ASC Program. Its purpose is to provide visualization and analysis capabilities for the existing ParaDIS parallel dislocation dynamics simulation code.
Projecting Climate Change Impacts on Wildfire Probabilities
NASA Astrophysics Data System (ADS)
Westerling, A. L.; Bryant, B. P.; Preisler, H.
2008-12-01
We present preliminary results of the 2008 Climate Change Impact Assessment for wildfire in California, part of the second biennial science report to the California Climate Action Team organized via the California Climate Change Center by the California Energy Commission's Public Interest Energy Research Program pursuant to Executive Order S-03-05 of Governor Schwarzenegger. In order to support decision making by the State pertaining to mitigation of and adaptation to climate change and its impacts, we model wildfire occurrence monthly from 1950 to 2100 under a range of climate scenarios from the Intergovernmental Panel on Climate Change. We use six climate change models (GFDL CM2.1, NCAR PCM1, CNRM CM3, MPI ECHAM5, MIROC3.2 med, NCAR CCSM3) under two emissions scenarios--A2 (CO2 850 ppm max atmospheric concentration) and B1 (CO2 550 ppm max concentration). Climate model output has been downscaled to a 1/8 degree (~12 km) grid using two alternative methods: a Bias Correction and Spatial Downscaling (BCSD) and a Constructed Analogues (CA) downscaling. Hydrologic variables have been simulated from temperature, precipitation, wind and radiation forcing data using the Variable Infiltration Capacity (VIC) Macroscale Hydrologic Model. We model wildfire as a function of temperature, moisture deficit, and land surface characteristics using nonlinear logistic regression techniques. Previous work on wildfire climatology and seasonal forecasting has demonstrated that these variables account for much of the inter-annual and seasonal variation in wildfire. The results of this study are monthly gridded probabilities of wildfire occurrence by fire size class, and estimates of the number of structures potentially affected by fires. In this presentation we will explore the range of modeled outcomes for wildfire in California, considering the effects of emissions scenarios, climate model sensitivities, downscaling methods, hydrologic simulations, statistical model specifications for
Cheng, Peter C-H
2011-07-01
The representational epistemic approach to the design of visual displays and notation systems advocates encoding the fundamental conceptual structure of a knowledge domain directly in the structure of a representational system. It is claimed that representations so designed will benefit from greater semantic transparency, which enhances comprehension and ease of learning, and plastic generativity, which makes the meaningful manipulation of the representation easier and less error prone. Epistemic principles for encoding fundamental conceptual structures directly in representational schemes are described. The diagrammatic recodification of probability theory is undertaken to demonstrate how the fundamental conceptual structure of a knowledge domain can be analyzed, how the identified conceptual structure may be encoded in a representational system, and the cognitive benefits that follow. An experiment shows the new probability space diagrams are superior to the conventional approach for learning this conceptually challenging topic.
28 CFR 2.101 - Probable cause hearing and determination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... who have given information upon which revocation may be based) at a postponed probable cause hearing... attendance, unless good cause is found for not allowing confrontation. Whenever a probable cause hearing...
Prospect evaluation as a function of numeracy and probability denominator.
Millroth, Philip; Juslin, Peter
2015-05-01
This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.
Choice strategies in multiple-cue probability learning.
White, Chris M; Koehler, Derek J
2007-07-01
Choice strategies for selecting among outcomes in multiple-cue probability learning were investigated using a simulated medical diagnosis task. Expected choice probabilities (the proportion of times each outcome was selected given each cue pattern) under alternative choice strategies were constructed from corresponding observed judged probabilities (of each outcome given each cue pattern) and compared with observed choice probabilities. Most of the participants were inferred to have responded by using a deterministic strategy, in which the outcome with the higher judged probability is consistently chosen, rather than a probabilistic strategy, in which an outcome is chosen with a probability equal to its judged probability. Extended practice in the learning environment did not affect choice strategy selection, contrary to reports from previous studies, results of which may instead be attributable to changes with practice in the variability and extremity of the perceived probabilities on which the choices were based.
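The two candidate strategies make sharply different predictions from the same judged probabilities, which is what makes them empirically distinguishable; a minimal sketch with invented judged values:

```python
def expected_choice_probs(judged, strategy):
    """Expected probability of choosing outcome A for each cue pattern, given
    judged P(A) values, under two candidate choice strategies."""
    if strategy == "deterministic":    # always pick the outcome judged likelier
        return [1.0 if p > 0.5 else 0.0 if p < 0.5 else 0.5 for p in judged]
    if strategy == "matching":         # choose A with probability = judged P(A)
        return list(judged)
    raise ValueError(strategy)

judged = [0.9, 0.6, 0.3]               # hypothetical judged P(A) per cue pattern
det = expected_choice_probs(judged, "deterministic")
match = expected_choice_probs(judged, "matching")
```

Comparing observed choice proportions to both sets of predictions, as the study does, reveals which strategy a participant's responses most resemble.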
On differentiating the probability of error in multipopulation feature selection
NASA Technical Reports Server (NTRS)
Peters, B. C.
1974-01-01
A method of linear feature selection for n-dimensional observation vectors which belong to one of m populations is presented. Each population has a known a priori probability and is described by a known multivariate normal density function. Specifically we consider the problem of finding a k x n matrix B of rank k (k ≤ n) for which the transformed probability of misclassification is minimized. Provided that the transformed a posteriori probabilities are distinct, theoretical results are obtained which, for the case k = 1, give rise to a numerically tractable formula for the derivative of the probability of misclassification. It is shown that for the two-population problem this condition is also necessary. The dependence of the minimum probability of error on the a priori probabilities is investigated. The minimum probability of error satisfies a uniform Lipschitz condition with respect to the a priori probabilities.
Deduction and Inference Using Conditional Logic and Probability
1991-05-01
Carnap and R. C. Jeffrey [30] & [31], H. Gaifman [32], D. Scott and P. Kraus [33], E. W. Adams [19], and T. Hailperin [8] have all defined the probability...Theories of Logic and Probabilities, Open Court. 30. Carnap, R. (1960; 1st ed., 1950) Logical Foundations of Probability, 2nd ed., Univ. of Chicago...Press. 31. Carnap, R. & Jeffrey, R. C. (1971) Studies in Inductive Logic and Probability, Univ. of California Press. 32. Gaifman, H. (1964) Concerning
A short note on probability in clinical medicine.
Upshur, Ross E G
2013-06-01
Probability claims are ubiquitous in clinical medicine, yet exactly how clinical events relate to interpretations of probability has not been well explored. This brief essay examines the major interpretations of probability and how these interpretations may account for the probabilistic nature of clinical events. It is argued that there are significant problems with the unquestioned application of interpretations of probability to clinical events. The essay concludes by suggesting other avenues for understanding uncertainty in clinical medicine.
Pretest probability assessment derived from attribute matching
Kline, Jeffrey A; Johnson, Charles L; Pollack, Charles V; Diercks, Deborah B; Hollander, Judd E; Newgard, Craig D; Garvey, J Lee
2005-01-01
Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for ability to produce PTP estimation <2% in a validation set of 8,120 patients evaluated for possible ACS and did not have ST segment elevation on ECG. 1,061 patients were excluded prior to validation analysis because of ST-segment elevation (713), missing data (77) or being lost to follow-up (271). Results In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for the attribute matching curve and 0.68 (95% CI 0.62 to 0.77) for LRE. The attribute matching system categorized 1,670 (24%, 95% CI = 23–25%) patients as having a PTP < 2.0%; 28 developed ACS (1.7% 95% CI = 1.1–2.4%). The LRE categorized 244 (4%, 95% CI = 3–4%) with PTP < 2.0%; four developed ACS (1.6%, 95% CI = 0.4–4.1%). Conclusion Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE. PMID:16095534
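The matching step itself is an exact-profile lookup: find every prior patient with the identical attribute profile and report the fraction who had the outcome. A minimal sketch with an invented toy database (the real system uses eight specific clinical attributes over 14,796 records):

```python
def attribute_match_ptp(database, profile):
    """Pretest probability by attribute matching: the fraction of prior
    patients with this exact attribute profile who had the outcome (ACS).
    database: list of (attributes_tuple, had_acs) records.
    Returns None when no identical profile is on record."""
    matches = [had_acs for attrs, had_acs in database if attrs == profile]
    if not matches:
        return None
    return sum(matches) / len(matches)

# Entirely hypothetical attribute profiles and outcomes.
db = [
    (("m", "45-54", "no", "yes", "no", "no", "yes", "no"), True),
    (("m", "45-54", "no", "yes", "no", "no", "yes", "no"), False),
    (("m", "45-54", "no", "yes", "no", "no", "yes", "no"), False),
    (("f", "25-34", "no", "no", "no", "no", "no", "no"), False),
]
ptp = attribute_match_ptp(db, ("m", "45-54", "no", "yes", "no", "no", "yes", "no"))
```

Because each distinct profile yields its own empirical fraction, the method can produce many more unique PTP values than a regression equation with a handful of coefficients, which matches the 267-versus-96 comparison in the abstract.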
Representation of Odds in Terms of Frequencies Reduces Probability Discounting
ERIC Educational Resources Information Center
Yi, Richard; Bickel, Warren K.
2005-01-01
In studies of probability discounting, the reduction in the value of an outcome as a result of its degree of uncertainty is calculated. Decision making studies suggest two issues with probability that may play a role in data obtained in probability discounting studies. The first issue involves the reduction of risk aversion via subdivision of…
Probability Constructs in Preschool Education and How they Are Taught
ERIC Educational Resources Information Center
Antonopoulos, Konstantinos; Zacharos, Konstantinos
2013-01-01
The teaching of Probability Theory constitutes a new trend in mathematics education internationally. The purpose of this research project was to explore the degree to which preschoolers understand key concepts of probabilistic thinking, such as sample space, the probability of an event and probability comparisons. At the same time, we evaluated an…
Typical Versus Atypical Unpacking and Superadditive Probability Judgment
ERIC Educational Resources Information Center
Sloman, Steven; Rottenstreich, Yuval; Wisniewski, Edward; Hadjichristidis, Constantinos; Fox, Craig R.
2004-01-01
Probability judgments for packed descriptions of events (e.g., the probability that a businessman does business with a European country) are compared with judgments for unpacked descriptions of the same events (e.g., the probability that a businessman does business with England, France, or some other European country). The prediction that…
Factors influencing reporting and harvest probabilities in North American geese
Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.
2009-01-01
We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
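The correction the authors describe is a simple division: a band on a harvested bird is reported only with the reporting probability, so the raw recovery probability understates harvest by that factor. A sketch with assumed numbers (only the 0.73 mean reporting probability comes from the abstract; the recovery value is invented):

```python
def harvest_probability(recovery_prob, reporting_prob):
    """Unbiased harvest probability: recovery probability scaled up by the
    probability that a band from a harvested bird is actually reported."""
    if not 0.0 < reporting_prob <= 1.0:
        raise ValueError("reporting probability must be in (0, 1]")
    return recovery_prob / reporting_prob

h = harvest_probability(recovery_prob=0.05, reporting_prob=0.73)
```

With incomplete reporting the corrected harvest probability always exceeds the raw recovery probability, which is why estimates "from recovery probabilities alone" are biased low.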
Pig Data and Bayesian Inference on Multinomial Probabilities
ERIC Educational Resources Information Center
Kern, John C.
2006-01-01
Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
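The conjugate update behind this kind of analysis is short enough to sketch: with a Dirichlet prior, the posterior parameters are just the prior pseudo-counts plus the observed counts. The six outcome categories and all numbers below are invented, not taken from the game's instruction manual:

```python
def dirichlet_posterior(prior_alpha, counts):
    """Conjugate update for multinomial probabilities: posterior Dirichlet
    parameters are the prior pseudo-counts plus the observed counts."""
    return [a + n for a, n in zip(prior_alpha, counts)]

def posterior_mean(alpha):
    """Posterior mean of each category probability under Dirichlet(alpha)."""
    total = sum(alpha)
    return [a / total for a in alpha]

prior = [35.0, 30.0, 22.0, 9.0, 3.0, 1.0]   # assumed prior pseudo-counts
counts = [40, 25, 20, 10, 4, 1]             # assumed observed roll outcomes
post = dirichlet_posterior(prior, counts)
mean = posterior_mean(post)
```

The posterior means shrink the empirical frequencies toward the prior rates, with the amount of shrinkage controlled by the total prior pseudo-count.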
Total probabilities of ensemble runoff forecasts
NASA Astrophysics Data System (ADS)
Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2016-04-01
Ensemble forecasting has for a long time been used as a method in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters which are different in space and time, but still can give a spatially and temporally consistent output. However, their method is computationally complex for our large number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS, http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than the forecast skill of lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to find a total probability, the post-processed mean and uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while ensuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. This can in some cases have a slight negative
Pattern formation, logistics, and maximum path probability
NASA Astrophysics Data System (ADS)
Kirkaldy, J. S.
1985-05-01
The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are
Probability Distribution for Flowing Interval Spacing
S. Kuzio
2004-09-22
Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be
Surprisingly rational: probability theory plus noise explains biases in judgment.
Costello, Fintan; Watts, Paul
2014-07-01
The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, in which people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise.
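One simple way to formalize "probability theory plus noise" is random misclassification of sampled instances at rate d, which pulls the expected estimate toward 0.5 and so produces conservatism. The flip-noise formulation below is an assumption in the spirit of the account, not necessarily the authors' exact model:

```python
import random

def noisy_estimate(p, d, n=200_000, rng=random.Random(2)):
    """Estimate a probability by sampling: each of n instances is 'positive'
    with probability p, but is misread with probability d (the noise).
    The expected estimate is (1 - 2d) * p + d, regressed toward 0.5."""
    hits = 0
    for _ in range(n):
        val = rng.random() < p
        if rng.random() < d:
            val = not val      # random misclassification of this instance
        hits += val
    return hits / n

est = noisy_estimate(p=0.9, d=0.1)   # expectation: 0.8 * 0.9 + 0.1 = 0.82
```

High true probabilities are thus systematically underestimated and low ones overestimated, even though the underlying reasoning follows probability theory exactly.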
Probability expression for changeable and changeless uncertainties: an implicit test
Wang, Yun; Du, Xue-Lei; Rao, Li-Lin; Li, Shu
2014-01-01
“Everything changes and nothing remains still.” We designed three implicit studies to understand how people react or adapt to a rapidly changing world by testing whether verbal probability is better in expressing changeable uncertainty while numerical probability is better in expressing unchangeable uncertainty. We found that the “verbal-changeable” combination in implicit tasks was more compatible than the “numerical-changeable” combination. Furthermore, the “numerical-changeless” combination was more compatible than the “verbal-changeless” combination. Thus, a novel feature called “changeability” was proposed to describe the changeable nature of verbal probability. However, numerical probability is a better carrier of changeless uncertainty than verbal probability. These results extend the domain of probability predictions and enrich our general understanding of communication with verbal and numerical probabilities. Given that the world around us is constantly changing, this “changeability” feature may play a major role in preparing for uncertainty. PMID:25431566
Partial avoidance contingencies: Absolute omission and punishment probabilities
Flye, Barbara L.; Gibbon, John
1979-01-01
Avoidance contingencies were defined by the absolute probability of the conjunction of responding or not responding with shock or no shock. The “omission” probability (ρ00) is the probability of no response and no shock. The “punishment” probability (ρ11) is the probability of both a response and a shock. The traditional avoidance contingency never omits shock on nonresponse trials (ρ00=0) and never presents shock on response trials (ρ11=0). Rats were trained on a discrete-trial paradigm with no intertrial interval. The first lever response changed an auditory stimulus for the remainder of the trial. Shocks were delivered only at the end of each trial cycle. After initial training under the traditional avoidance contingency, one group of rats experienced changes in omission probability (ρ00>0), holding punishment probability at zero. The second group of rats were studied under different punishment probability values (ρ11>0), holding omission probability at zero. Data from subjects in the omission group looked similar, showing graded decrements in responding with increasing probability of omission. These subjects approximately “matched” their nonresponse frequencies to the programmed probability of shock omission on nonresponse trials, producing a very low and approximately constant conditional probability of shock given no response. Subjects in the punishment group showed different sensitivity to increasing absolute punishment probability. Some subjects decreased responding to low values as punishment probability increased, while others continued to respond at substantial levels even when shock was inevitable on all trials (noncontingent shock schedule). These results confirm an asymmetry between two dimensions of partial avoidance contingencies. When the consequences of not responding included occasional omission of shock, all subjects showed graded sensitivity to changes in omission frequency. When the consequences of responding included
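The contingency structure can be summarized by deriving conditional shock probabilities from the four absolute (joint) cell probabilities; the helper below is a sketch using invented cell values, with the traditional contingency (ρ00 = ρ11 = 0) as the worked case:

```python
def conditional_shock_probs(p_resp_shock, p_resp_noshock,
                            p_noresp_shock, p_noresp_noshock):
    """Conditional shock probabilities from the four absolute (joint) cell
    probabilities of a partial avoidance contingency."""
    total = p_resp_shock + p_resp_noshock + p_noresp_shock + p_noresp_noshock
    if abs(total - 1.0) > 1e-9:
        raise ValueError("cell probabilities must sum to 1")
    p_resp = p_resp_shock + p_resp_noshock
    p_noresp = p_noresp_shock + p_noresp_noshock
    return {
        "P(shock | response)": p_resp_shock / p_resp if p_resp else 0.0,
        "P(shock | no response)": p_noresp_shock / p_noresp if p_noresp else 0.0,
    }

# Traditional avoidance: responses are never shocked (rho_11 = 0) and
# non-responses are always shocked (rho_00 = 0). Cell values are invented.
trad = conditional_shock_probs(0.0, 0.7, 0.3, 0.0)
```

Raising ρ00 lowers P(shock | no response) below 1, and raising ρ11 lifts P(shock | response) above 0, which is exactly the asymmetry the two groups in the study probed.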
Conditional Probabilities for Large Events Estimated by Small Earthquake Rate
NASA Astrophysics Data System (ADS)
Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi
2016-01-01
We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan—the western, eastern, and northeastern Taiwan regions—using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, a high probability value is usually yielded in cluster events such as events with foreshocks and events that all occur in a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake and the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.
Li, Shu; Du, Xue-Lei; Li, Qi; Xuan, Yan-Hua; Wang, Yun; Rao, Li-Lin
2016-01-01
Two kinds of probability expressions, verbal and numerical, have been used to characterize the uncertainty that people face. However, the question of whether verbal and numerical probabilities are cognitively processed in a similar manner remains unresolved. From a levels-of-processing perspective, verbal and numerical probabilities may be processed differently during early sensory processing but similarly in later semantic-associated operations. This event-related potential (ERP) study investigated the neural processing of verbal and numerical probabilities in risky choices. The results showed that verbal probability and numerical probability elicited different N1 amplitudes but that verbal and numerical probabilities elicited similar N2 and P3 waveforms in response to different levels of probability (high to low). These results were consistent with a levels-of-processing framework and suggest some internal consistency between the cognitive processing of verbal and numerical probabilities in risky choices. Our findings shed light on the possible mechanisms underlying probability expression and may provide neural evidence to support the translation of verbal to numerical probabilities (or vice versa). PMID:26834612
GNSS integer ambiguity validation based on posterior probability
NASA Astrophysics Data System (ADS)
Wu, Zemin; Bian, Shaofeng
2015-10-01
GNSS integer ambiguity validation has been considered a challenging task for decades. Several kinds of validation tests have been developed and are widely used, but a weak theoretical basis is their shortcoming. Ambiguity validation is, at heart, a hypothesis-testing problem, and in the framework of Bayesian hypothesis testing the posterior probability is the canonical standard on which a statistical decision should be based. In this contribution, (i) we derive the posterior probability of the fixed ambiguity from the Bayesian principle and adapt it for practical ambiguity validation; (ii) we prove the optimality of the posterior probability test using an extended Neyman-Pearson lemma; and, since the failure rate is the issue users are most concerned about, (iii) we derive an upper bound on the failure rate of the posterior probability test, so that the test can be applied either with a fixed posterior probability or with a fixed failure rate. Simulated as well as real observed data are used for experimental validation. The results show that (i) the posterior probability test is the most effective among the R-ratio test, the difference test, the ellipsoidal integer aperture test, and the posterior probability test; (ii) the posterior probability test is computationally efficient; and (iii) the failure rate estimate for the posterior probability test is useful.
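A minimal sketch of the posterior-probability idea, assuming the usual Gaussian model for the float ambiguities and a flat prior over a small integer candidate set. The numbers, the candidate set, and the 0.9 acceptance threshold are all illustrative, not taken from the paper:

```python
import math

def posterior_probabilities(a_float, candidates, q_inv):
    """Posterior probability of each integer candidate z, assuming the float
    ambiguity a_float is Gaussian around the true integer vector:
        P(z | a_float) proportional to exp(-0.5 * (a_float - z)^T Q^-1 (a_float - z)),
    normalized over the candidate set (flat prior over candidates)."""
    def quad(z):
        d = [a - zi for a, zi in zip(a_float, z)]
        return sum(d[i] * q_inv[i][j] * d[j]
                   for i in range(len(d)) for j in range(len(d)))
    w = [math.exp(-0.5 * quad(z)) for z in candidates]
    s = sum(w)
    return [wi / s for wi in w]

# Toy 2-D example with a diagonal weight matrix (hypothetical numbers).
a_hat = [3.1, -1.8]
cands = [(3, -2), (3, -1), (4, -2), (2, -2)]
q_inv = [[4.0, 0.0], [0.0, 4.0]]
probs = posterior_probabilities(a_hat, cands, q_inv)
best = max(range(len(cands)), key=probs.__getitem__)
# Fixed-posterior-probability mode: accept the fixed solution only if its
# posterior probability clears a user-chosen threshold.
accept = probs[best] > 0.9
```

Here the nearest integer vector gets the largest posterior probability but does not clear the 0.9 threshold, so the fixed solution would be rejected in this mode.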
Exponential convergence rates for weighted sums in noncommutative probability space
NASA Astrophysics Data System (ADS)
Choi, Byoung Jin; Ji, Un Cig
2016-11-01
We study exponential convergence rates for weighted sums of successive independent random variables in a noncommutative probability space of which the weights are in a von Neumann algebra. Then we prove a noncommutative extension of the result for the exponential convergence rate by Baum, Katz and Read. As applications, we first study a large deviation type inequality for weighted sums in a noncommutative probability space, and secondly we study exponential convergence rates for weighted free additive convolution sums of probability measures.
Posterior Probability Matching and Human Perceptual Decision Making.
Murray, Richard F; Patel, Khushbu; Yee, Alan
2015-06-01
Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods provide new tools
More Diagnosis of Solar Flare Probability from Chromosphere Image Sequences
2012-09-28
AFRL-RV-PS-TR-2012-0194. Reporting period: 1 Oct 2011 to 07 Sep 2012. We continued our investigation of the utility of optical observations of the solar chromosphere in the diagnosis of flare probability.
Path probability of stochastic motion: A functional approach
NASA Astrophysics Data System (ADS)
Hattori, Masayuki; Abe, Sumiyoshi
2016-06-01
The path probability of a particle undergoing stochastic motion is studied by the use of a functional technique, and a general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. The formalism developed here is then applied to the stochastic dynamics of stock price in finance.
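The tube probability can be checked numerically in the simplest setting. The sketch below estimates, by Monte Carlo, the probability that a discretized Brownian path stays within a band of half-width eps around the reference path x(t) = 0; all parameters are illustrative, and the paper evaluates this quantity analytically rather than by simulation.

```python
import random

def tube_probability(n_steps=50, dt=0.02, sigma=1.0, eps=0.8,
                     n_paths=20000, seed=7):
    """Monte Carlo estimate of the probability that a Brownian path stays
    inside the band |x(t)| <= eps at every discrete step (a numerical
    stand-in for the analytic tube probability)."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_paths):
        x, ok = 0.0, True
        for _ in range(n_steps):
            x += rng.gauss(0.0, sigma * dt ** 0.5)   # Euler step of Brownian motion
            if abs(x) > eps:
                ok = False
                break
        inside += ok
    return inside / n_paths

p = tube_probability()
```

Widening the band must increase the probability, which gives a cheap sanity check on the estimator.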
Implicit learning of fifth- and sixth-order sequential probabilities.
Remillard, Gilbert
2010-10-01
Serial reaction time (SRT) task studies have established that people can implicitly learn sequential contingencies as complex as fourth-order probabilities. The present study examined people's ability to learn fifth-order (Experiment 1) and sixth-order (Experiment 2) probabilities. Remarkably, people learned fifth- and sixth-order probabilities. This suggests that the implicit sequence learning mechanism can operate over a range of at least seven sequence elements.
New probability table treatment in MCNP for unresolved resonances
Carter, L.L.; Little, R.C.; Hendricks, J.S.; MacFarlane, R.E.
1998-04-01
An upgrade for MCNP has been implemented to sample the neutron cross sections in the unresolved resonance range using probability tables. These probability tables are generated with the cross section processor code NJOY, by using the evaluated statistical information about the resonances to calculate cumulative probability distribution functions for the microscopic total cross section. The elastic, fission, and radiative capture cross sections are also tabulated as the average values of each of these partials conditional upon the value of the total. This paper summarizes how the probability tables are utilized in this MCNP upgrade and compares this treatment with the approximate smooth treatment for some example problems.
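The table lookup can be illustrated with a toy probability table; the numbers below are invented (NJOY produces the real tables), and fission is omitted so that the elastic and capture partials sum to the total by construction. Sampling the total cross section is an inverse-CDF lookup, and the partials are read from the same band:

```python
import bisect
import random

# A hypothetical probability table for one incident energy: cumulative
# probabilities for the total cross section, with partial cross sections
# tabulated as averages conditional on each total-cross-section band.
cdf     = [0.2, 0.5, 0.8, 1.0]          # cumulative probabilities
total   = [1.0, 2.0, 4.0, 8.0]          # total cross section (barns)
elastic = [0.6, 1.1, 2.0, 3.5]          # conditional average partials
capture = [0.4, 0.9, 2.0, 4.5]

def sample_cross_sections(rng):
    """Sample the total cross section by inverse-CDF lookup and return the
    partials tabulated for the selected band."""
    i = bisect.bisect_left(cdf, rng.random())
    return total[i], elastic[i], capture[i]

rng = random.Random(12345)
draws = [sample_cross_sections(rng)[0] for _ in range(100000)]
# The empirical frequency of each band should approach its table probability.
freq = {t: draws.count(t) / len(draws) for t in total}
```

With 100,000 samples each band frequency lands within about a percent of its table probability (0.2, 0.3, 0.3, 0.2).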
The relationship among probability of failure, landslide susceptibility and rainfall
NASA Astrophysics Data System (ADS)
Huang, Chuen Ming; Lee, Chyi-Tyi
2016-04-01
Landslide hazard comprises spatial probability, temporal probability, and size probability. Many studies evaluate spatial probability through landslide susceptibility, but few address temporal or size probability, because doing so requires landslide inventories that cover the entire study area over a long time range. In seismology, using a Poisson model to calculate temporal probability is a well-known approach, but it requires long-term, complete records. In Taiwan, remote sensing technology makes it easy to establish multiple landslide inventories, but the time series they span are still short. Thus, landslide susceptibility computed for triggering factors at different return periods has often been taken as a proxy for landslide hazard. Compared with compiling a landslide inventory, collecting long-term rain gauge records is easy. However, landslide susceptibility is a relative spatial probability: susceptibilities derived from different events or different areas are not directly comparable, so it is difficult to decide which model is representative. This study adopts histogram matching to construct a baseline landslide susceptibility for the region, from which the relationship among landslide susceptibility, probability of failure, and rainfall across multiple events can be derived.
Forward and backward location probabilities for sorbing solutes in groundwater
NASA Astrophysics Data System (ADS)
Neupauer, R. M.; Wilson, J. L.
Location probability can be used to describe the likely position of a solute particle as it travels through an aquifer. Forward location probability describes the likely future positions of a particle, and can be used to predict the movement of a contaminant plume. Backward location probability describes the likely prior positions of a solute particle, and can be used to identify sources of contamination. For sorbing solutes, the probability distributions must also account for the phase (aqueous or sorbed) of a solute particle. We present new phase-dependent forward and backward location probabilities to describe transport of a solute undergoing linear non-equilibrium sorption. The effects of sorption are incorporated directly into the governing equations that are used to calculate the probability distributions. The shape and magnitude of the distributions depend on the phase of the contamination at both the source (or prior location) and receptor (or future location). These probabilities are related to adjoint states of concentration. Using adjoint theory, Bayes' theorem, and a clever transformation, we develop a model to efficiently calculate backward location probabilities for one or a few receptors. We illustrate important features of backward location probabilities for a sorbing solute with a hypothetical, one-dimensional confined aquifer, and we demonstrate their use in identifying sources of contamination for a trichloroethylene plume at the Massachusetts Military Reservation using a three-dimensional numerical model.
On the Role of Prior Probability in Adiabatic Quantum Algorithms
NASA Astrophysics Data System (ADS)
Sun, Jie; Lu, Songfeng; Yang, Liping
2016-03-01
In this paper, we study the role of prior probability in the efficiency of the local adiabatic quantum search algorithm. We find the following: first, only the prior probabilities of the marked states affect the running time of the adiabatic evolution; second, the prior probability can be used to improve the efficiency of the adiabatic algorithm; third, as in the usual quantum adiabatic evolution, when there are multiple solution states and the number of marked elements is much smaller than the size of the assigned set that contains them, the running time can be significantly longer than when the assigned set contains only the marked states.
Anytime synthetic projection: Maximizing the probability of goal satisfaction
NASA Technical Reports Server (NTRS)
Drummond, Mark; Bresina, John L.
1990-01-01
A projection algorithm is presented for incremental control rule synthesis. The algorithm synthesizes an initial set of goal achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic. This set of control rules has a certain probability of satisfying the given goal. The probability is incrementally increased by synthesizing additional control rules to handle 'error' situations the execution system is likely to encounter when following the initial control rules. By using situation probabilities, the algorithm achieves a computationally effective balance between the limited robustness of triangle tables and the absolute robustness of universal plans.
Measuring local context as context-word probabilities.
Hahn, Lance W
2012-06-01
Context enables readers to quickly recognize a related word but disturbs recognition of unrelated words. The relatedness of a final word to a sentence context has been estimated as the probability (cloze probability) that a participant will complete a sentence with a word. In four studies, I show that it is possible to estimate local context-word relatedness based on common language usage. Conditional probabilities were calculated for sentences with published cloze probabilities. Four-word contexts produced conditional probabilities significantly correlated with cloze probabilities, but usage statistics were unavailable for some sentence contexts. The present studies demonstrate that a composite context measure based on conditional probabilities for one- to four-word contexts and the presence of a final period represents all of the sentences and maintains significant correlations (.25, .52, .53) with cloze probabilities. Finally, the article provides evidence for the effectiveness of this measure by showing that local context varies in ways that are similar to the N400 effect and that are consistent with a role for local context in reading. The Supplemental materials include local context measures for three cloze probability data sets.
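A toy version of the conditional-probability estimate, using raw counts from a tiny invented corpus in place of the published usage statistics; the function, corpus, and smoothing-free counting scheme are illustrative only, not the author's pipeline:

```python
def context_word_probability(tokens, context, word, max_n=4):
    """Estimate P(word | last-k words of context) for k = 1..max_n from raw
    corpus counts. Returns one conditional probability per context length,
    with None when that context was never observed in the corpus."""
    probs = []
    for k in range(1, max_n + 1):
        if len(context) < k:
            probs.append(None)          # context shorter than k words
            continue
        ctx = tuple(context[-k:])
        ctx_count = hit_count = 0
        for i in range(len(tokens) - k):
            if tuple(tokens[i:i + k]) == ctx:
                ctx_count += 1
                if tokens[i + k] == word:
                    hit_count += 1
        probs.append(hit_count / ctx_count if ctx_count else None)
    return probs

corpus = ("the dog chased the cat . the dog ate the bone . "
          "the cat chased the mouse .").split()
# P(next word = 'dog' | context 'the'): 'the' occurs 6 times, followed by
# 'dog' twice, so the one-word conditional probability is 1/3.
probs = context_word_probability(corpus, ["the"], "dog", max_n=2)
```

A composite measure in the spirit of the article would then combine the non-None entries across context lengths.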
Oscillations in probability distributions for stochastic gene expression
Petrosyan, K. G.; Hu, Chin-Kun
2014-05-28
The phenomenon of oscillations in probability distribution functions of the number of components is found for a model of stochastic gene expression. It takes place in cases of low levels of molecules or strong intracellular noise. The oscillations distinguish between more probable even and less probable odd numbers of particles. The even-odd symmetry is restored as the number of molecules increases, with the probability distribution function tending to a Poisson distribution. We discuss the possibility of observing the phenomenon in gene, protein, and mRNA expression experiments.
Probabilistic Cloning of Three Real States with Optimal Success Probabilities
NASA Astrophysics Data System (ADS)
Rui, Pin-shu
2017-03-01
We investigate the probabilistic quantum cloning (PQC) of three real states with average probability distribution. To get the analytic forms of the optimal success probabilities, we assume that the three states have only two pairwise inner products. Based on the optimal success probabilities, we derive the explicit form of 1→2 PQC for cloning three real states. The unitary operation needed in the PQC process is worked out too. The optimal success probabilities are also generalized to the M→N PQC case.
Ordering genes: controlling the decision-error probabilities.
Rogatko, A; Zacks, S
1993-01-01
Determination of the relative gene order on chromosomes is of critical importance in the construction of human gene maps. In this paper we develop a sequential algorithm for gene ordering. We start by comparing three sequential procedures to order three genes, based on Bayesian posterior probabilities, the maximum-likelihood ratio, and the minimal recombinant class. In the second part of the paper we extend the sequential procedure based on posterior probabilities to the general case of g genes. We present a theorem stating that the predicted average probability of committing a decision error, associated with a Bayesian sequential procedure that accepts the hypothesis of a gene-order configuration with posterior probability equal to or greater than π*, is smaller than 1 - π*. This theorem holds irrespective of the number of genes, the genetic model, and the source of genetic information. The theorem is an extension of a classical result of Wald concerning the sum of the actual and the nominal error probabilities in the sequential probability ratio test of two hypotheses. A stepwise strategy for ordering a large number of genes, with control over the decision-error probabilities, is discussed. An asymptotic approximation is provided that facilitates the calculation, with existing gene-mapping software, of the posterior probabilities of an order and the error probabilities. We illustrate with some simulations that stepwise ordering is an efficient procedure. PMID:8488844
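The sequential stopping rule with a posterior threshold π* can be sketched as follows. The three-hypothesis likelihood table and the observation stream are invented stand-ins (the real procedure updates posteriors from pedigree likelihoods, not this toy observation model), but the stopping logic and the error bound 1 - π* are as stated in the abstract:

```python
def sequential_order_test(likelihood, observations, pi_star=0.95):
    """Bayesian sequential procedure: update the posterior over candidate
    gene orders after each observation and stop as soon as one order's
    posterior reaches pi_star. The average probability of a decision error
    is then below 1 - pi_star."""
    n = len(likelihood)
    post = [1.0 / n] * n                      # uniform prior over orders
    for step, obs in enumerate(observations, 1):
        post = [p * likelihood[h][obs] for h, p in enumerate(post)]
        s = sum(post)
        post = [p / s for p in post]          # renormalize
        best = max(range(n), key=post.__getitem__)
        if post[best] >= pi_star:
            return best, post[best], step     # accepted order, posterior, step
    return None, max(post), len(observations) # no decision reached

# Three candidate orders; each observation type favours one order.
L = [[0.6, 0.2, 0.2],      # P(obs | order 0)
     [0.2, 0.6, 0.2],      # P(obs | order 1)
     [0.2, 0.2, 0.6]]      # P(obs | order 2)
data = [0, 0, 1, 0, 0, 0]  # mostly supports order 0
order, prob, steps = sequential_order_test(L, data, pi_star=0.9)
```

With this data the posterior for order 0 first clears 0.9 on the fifth observation, so the procedure stops there rather than consuming all six.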
Probability Quantization for Multiplication-Free Binary Arithmetic Coding
NASA Technical Reports Server (NTRS)
Cheung, K. -M.
1995-01-01
A method has been developed to improve on Witten's binary arithmetic coding procedure of tracking a high value and a low value. The new method approximates the probability of the less probable symbol, which improves the worst-case coding efficiency.
Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment.
ERIC Educational Resources Information Center
Tversky, Amos; Kahneman, Daniel
1983-01-01
Judgments under uncertainty are often mediated by intuitive heuristics that are not bound by the conjunction rule of probability. Representativeness and availability heuristics can make a conjunction appear more probable than one of its constituents. Alternative interpretations of this conjunction fallacy are discussed and attempts to combat it…
Probability Modeling and Thinking: What Can We Learn from Practice?
ERIC Educational Resources Information Center
Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze
2016-01-01
Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…
A "Virtual Spin" on the Teaching of Probability
ERIC Educational Resources Information Center
Beck, Shari A.; Huse, Vanessa E.
2007-01-01
This article, which describes integrating virtual manipulatives with the teaching of probability at the elementary level, puts a "virtual spin" on the teaching of probability to provide more opportunities for students to experience successful learning. The traditional use of concrete manipulatives is enhanced with virtual coins and spinners from…
A New Way to Evaluate the Probability and Fresnel Integrals
ERIC Educational Resources Information Center
Khalili, Parviz
2007-01-01
In this article, we show how "Laplace Transform" may be used to evaluate variety of nontrivial improper integrals, including "Probability" and "Fresnel" integrals. The algorithm we have developed here to evaluate "Probability, Fresnel" and other similar integrals seems to be new. This method transforms the evaluation of certain improper integrals…
Misconceptions in Rational Numbers, Probability, Algebra, and Geometry
ERIC Educational Resources Information Center
Rakes, Christopher R.
2010-01-01
In this study, the author examined the relationship of probability misconceptions to algebra, geometry, and rational number misconceptions and investigated the potential of probability instruction as an intervention to address misconceptions in all 4 content areas. Through a review of literature, 5 fundamental concepts were identified that, if…
Probability Theory, Not the Very Guide of Life
ERIC Educational Resources Information Center
Juslin, Peter; Nilsson, Hakan; Winman, Anders
2009-01-01
Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…
21 CFR 1316.10 - Administrative probable cause.
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 21 (Food and Drugs); Drug Enforcement Administration, Department of Justice; Administrative Functions, Practices, and Procedures; Administrative Inspections. § 1316.10 Administrative probable cause. If the...
Fisher classifier and its probability of error estimation
NASA Technical Reports Server (NTRS)
Chittineni, C. B.
1979-01-01
Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
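A small sketch of a two-class Fisher classifier with a leave-one-out error estimate. The data are invented, the example is restricted to two dimensions so the scatter-matrix inverse can be written by hand, and the threshold is simply the midpoint of the projected class means rather than the optimal threshold derived in the paper (whose closed-form leave-one-out expressions this brute-force refit also does not reproduce):

```python
def mean(xs):
    n = len(xs)
    return [sum(x[i] for x in xs) / n for i in range(len(xs[0]))]

def scatter(xs, m):
    """Within-class scatter: sum of outer products (x - m)(x - m)^T."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for x in xs:
        d = [x[0] - m[0], x[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_direction(c0, c1):
    """w = Sw^(-1) (m1 - m0) for two 2-D classes, plus a midpoint threshold."""
    m0, m1 = mean(c0), mean(c1)
    s0, s1 = scatter(c0, m0), scatter(c1, m1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    proj = lambda x: w[0] * x[0] + w[1] * x[1]
    thresh = (proj(m0) + proj(m1)) / 2
    return w, thresh

def loo_error(c0, c1):
    """Leave-one-out estimate: refit without each pattern, then classify it."""
    errors, n = 0, len(c0) + len(c1)
    for k, x in enumerate(c0):
        w, t = fisher_direction(c0[:k] + c0[k + 1:], c1)
        errors += (w[0] * x[0] + w[1] * x[1]) >= t
    for k, x in enumerate(c1):
        w, t = fisher_direction(c0, c1[:k] + c1[k + 1:])
        errors += (w[0] * x[0] + w[1] * x[1]) < t
    return errors / n

class0 = [[0.0, 0.0], [1.0, 0.2], [0.2, 1.0], [0.5, 0.4]]
class1 = [[3.0, 3.0], [2.8, 3.4], [3.5, 2.8], [3.1, 3.3]]
err = loo_error(class0, class1)
```

For these well-separated toy classes the leave-one-out error estimate is zero.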
The Probability Approach to English If-Conditional Sentences
ERIC Educational Resources Information Center
Wu, Mei
2012-01-01
Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…
Probability Prediction and Classification. Research Report. RR-04-19
ERIC Educational Resources Information Center
Haberman, Shelby J.
2004-01-01
Criteria for prediction of multinomial responses are examined in terms of estimation bias. Logarithmic penalty and least squares are quite similar in behavior but quite different from maximum probability. The differences ultimately reflect deficiencies in the behavior of the criterion of maximum probability.
Stochastic inequality probabilities for adaptively randomized clinical trials.
Cook, John D; Nadarajah, Saralees
2006-06-01
We examine stochastic inequality probabilities of the form P(X > Y) and P(X > max(Y, Z)), where X, Y, and Z are random variables with beta, gamma, or inverse gamma distributions. We discuss the applications of such inequality probabilities to adaptively randomized clinical trials as well as methods for calculating their values.
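When a closed form is not needed, such inequality probabilities can be checked by Monte Carlo. The sketch below uses i.i.d. Beta(a, b) draws, for which symmetry gives P(X > Y) = 1/2 and P(X > max(Y, Z)) = 1/3 as sanity values; in an adaptively randomized trial the three arms would have different posterior distributions, and the same estimator applies unchanged.

```python
import random

def stochastic_inequality_probs(a, b, n=200000, seed=42):
    """Monte Carlo estimates of P(X > Y) and P(X > max(Y, Z)) for
    independent Beta(a, b) variables (illustrative i.i.d. case)."""
    rng = random.Random(seed)
    gt = gt_max = 0
    for _ in range(n):
        x = rng.betavariate(a, b)
        y = rng.betavariate(a, b)
        z = rng.betavariate(a, b)
        gt += x > y
        gt_max += x > max(y, z)
    return gt / n, gt_max / n

p_xy, p_xmax = stochastic_inequality_probs(2.0, 3.0)
```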
Teaching Basic Probability in Undergraduate Statistics or Management Science Courses
ERIC Educational Resources Information Center
Naidu, Jaideep T.; Sanford, John F.
2017-01-01
Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…
A Quantum Theoretical Explanation for Probability Judgment Errors
ERIC Educational Resources Information Center
Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.
2011-01-01
A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…
A new method for estimating extreme rainfall probabilities
Harper, G.A.; O'Hara, T.F.; Morris, D.I.
1994-02-01
As part of an EPRI-funded research program, the Yankee Atomic Electric Company developed a new method for estimating probabilities of extreme rainfall. It can be used, along with other techniques, to improve the estimation of probable maximum precipitation values for specific basins or regions.
Effectiveness of Incorporating Adversary Probability Perception Modeling in Security Games
2015-01-30
security game (SSG) algorithms. Given recent work on human decision-making, we adjust the existing subjective utility function to account for... data from previous security game experiments with human subjects. Our results show the incorporation of probability perceptions into the SUQR can... provide improvements in the ability to predict probabilities of attack in certain games.
On the Provenance of Judgments of Conditional Probability
ERIC Educational Resources Information Center
Zhao, Jiaying; Shah, Anuj; Osherson, Daniel
2009-01-01
In standard treatments of probability, Pr(A|B) is defined as the ratio of Pr(A∩B) to Pr(B), provided that Pr(B) > 0. This account of conditional probability suggests a psychological question, namely, whether estimates of Pr(A|B) arise in the mind via implicit calculation of…
The Influence of Phonotactic Probability on Word Recognition in Toddlers
ERIC Educational Resources Information Center
MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara
2014-01-01
This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response to…
14 CFR 417.224 - Probability of failure analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 14 (Aeronautics and Space); Department of Transportation; Licensing; Launch Safety; Flight Safety Analysis. § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of...
Choice Strategies in Multiple-Cue Probability Learning
ERIC Educational Resources Information Center
White, Chris M.; Koehler, Derek J.
2007-01-01
Choice strategies for selecting among outcomes in multiple-cue probability learning were investigated using a simulated medical diagnosis task. Expected choice probabilities (the proportion of times each outcome was selected given each cue pattern) under alternative choice strategies were constructed from corresponding observed judged…
How Can Histograms Be Useful for Introducing Continuous Probability Distributions?
ERIC Educational Resources Information Center
Derouet, Charlotte; Parzysz, Bernard
2016-01-01
The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…
A Probability Model of Accuracy in Deception Detection Experiments.
ERIC Educational Resources Information Center
Park, Hee Sun; Levine, Timothy R.
2001-01-01
Extends the recent work on the veracity effect in deception detection. Explains the probabilistic nature of a receiver's accuracy in detecting deception and analyzes a receiver's detection of deception in terms of set theory and conditional probability. Finds that accuracy is shown to be a function of the relevant conditional probability and the…
Prizes in Cereal Boxes: An Application of Probability.
ERIC Educational Resources Information Center
Litwiller, Bonnie H.; Duncan, David R.
1992-01-01
Presents four cases of real-world probabilistic situations to promote more effective teaching of probability. Calculates the probability of obtaining six of six different prizes successively in six, seven, eight, and nine boxes of cereal, generalizes the problem to n boxes of cereal, and offers suggestions to extend the problem. (MDH)
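The cereal-box calculation is the classic coupon-collector problem, solvable by inclusion-exclusion over the set of missing prizes; a sketch (the six-prizes-in-six-boxes case reduces to 6!/6^6):

```python
from math import comb

def prob_all_prizes(k, n):
    """Probability that n cereal boxes yield all k distinct prizes, assuming
    each box independently contains one of the k prizes uniformly at random.
    Inclusion-exclusion over the number j of prizes that never appear."""
    return sum((-1) ** j * comb(k, j) * ((k - j) / k) ** n
               for j in range(k + 1))

p6 = prob_all_prizes(6, 6)   # all six prizes in exactly six boxes: 6!/6^6
p9 = prob_all_prizes(6, 9)   # extra boxes can only raise the probability
```

Generalizing to n boxes, as the article suggests, is just a change of the second argument.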
Probabilities of Natural Events Occurring at Savannah River Plant
Huang, J.C.
2001-07-17
This report documents the comprehensive evaluation of probability models of natural events which are applicable to Savannah River Plant. The probability curves selected for these natural events are recommended to be used by all SRP/SRL safety analysts. This will ensure a consistency in analysis methodology for postulated SAR incidents involving natural phenomena.
A Mathematical Microworld for Students to Learn Introductory Probability.
ERIC Educational Resources Information Center
Jiang, Zhonghong; Potter, Walter D.
1993-01-01
Describes the Microworld Chance, a simulation-oriented computer environment that allows students to explore probability concepts in five subenvironments: coins, dice, spinners, thumbtacks, and marbles. Results of a teaching experiment to examine the effectiveness of the microworld in changing students' misconceptions about probability are…
Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel
ERIC Educational Resources Information Center
Gordon, Sheldon P.; Gordon, Florence S.
2009-01-01
The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…
A discussion on the origin of quantum probabilities
Holik, Federico; Sáenz, Manuel; Plastino, Angel
2014-01-15
We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox's method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases.
Oil spill contamination probability in the southeastern Levantine basin.
Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam
2015-02-15
Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed.
Estimation of State Transition Probabilities: A Neural Network Model
NASA Astrophysics Data System (ADS)
Saito, Hiroshi; Takiyama, Ken; Okada, Masato
2015-12-01
Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
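The abstract describes learning state-transition probabilities from the state prediction error. A simple delta-rule sketch in Python along those lines (the learning rate and the exact update form are illustrative assumptions, not the authors' algorithm): each row of the estimated matrix is nudged toward the observed successor state by the prediction error, and because the per-row updates sum to zero, each row remains a probability distribution.

```python
import random

def learn_transitions(seq, n_states, eta=0.02):
    """Online estimate of a state-transition matrix from a state sequence,
    using a prediction-error (delta-rule) update on each observed step."""
    P = [[1.0 / n_states] * n_states for _ in range(n_states)]
    for s, s_next in zip(seq, seq[1:]):
        for j in range(n_states):
            target = 1.0 if j == s_next else 0.0
            P[s][j] += eta * (target - P[s][j])  # prediction-error update
    return P

# Generate a sequence from a known symmetric 2-state chain and recover it.
rng = random.Random(0)
stay = 0.8  # true probability of staying in the current state
seq, s = [0], 0
for _ in range(20_000):
    s = s if rng.random() < stay else 1 - s
    seq.append(s)

P = learn_transitions(seq, 2)
print(P[0][0])  # near the true stay-probability 0.8
```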
Probability theory, not the very guide of life.
Juslin, Peter; Nilsson, Håkan; Winman, Anders
2009-10-01
Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.
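The contrast between multiplicative and linear additive integration can be made concrete. In the sketch below (illustrative, not the authors' simulation code), the additive rule yields a conjunction estimate that exceeds one of its components, the classic conjunction-error pattern the abstract mentions:

```python
def conjunction_normative(p_a: float, p_b: float) -> float:
    """Probability theory: multiplicative integration (independent events)."""
    return p_a * p_b

def conjunction_additive(p_a: float, p_b: float, w: float = 0.5) -> float:
    """Linear additive integration: a weighted average of the components,
    the heuristic the authors argue people are inclined toward."""
    return w * p_a + (1 - w) * p_b

pa, pb = 0.9, 0.3
print(conjunction_normative(pa, pb))  # ~0.27
print(conjunction_additive(pa, pb))   # ~0.6 -- exceeds P(B): a conjunction error
```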
Probability in the Many-Worlds Interpretation of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Vaidman, Lev
It is argued that, although in the Many-Worlds Interpretation of quantum mechanics there is no "probability" for an outcome of a quantum experiment in the usual sense, we can understand why we have an illusion of probability. The explanation involves: (a) A "sleeping pill" gedanken experiment which establishes a correspondence between an illegitimate question: "What is the probability of an outcome of a quantum measurement?" and a legitimate question: "What is the probability that `I' am in the world corresponding to that outcome?"; (b) A gedanken experiment which splits the world into several worlds which are identical according to some symmetry condition; and (c) Relativistic causality, which together with (b) explains the Born rule of standard quantum mechanics. The Quantum Sleeping Beauty controversy and the "caring measure" replacing the probability measure are discussed.
2007-12-01
Implementing Risk Management on Software Intensive Projects. IEEE Software, 14(3):83-89. Fairley, R. (1994). Risk Management for Software Projects...conditional probability and the Bayesian effect is preceded by an introduction to some basic concepts of probability. Though this discussion draws from R... Engineering Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213, USA. Charette, R. N. (1991). The Risks with Risk Analysis
ERIC Educational Resources Information Center
Maher, Nicole; Muir, Tracey
2014-01-01
This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…
The quantitative estimation of IT-related risk probabilities.
Herrmann, Andrea
2013-08-01
How well can people estimate IT-related risk? Although estimating risk is a fundamental activity in software management and risk is the basis for many decisions, little is known about how well IT-related risk can be estimated at all. We therefore conducted a risk estimation experiment with 36 participants. They estimated the probabilities of IT-related risks, and we investigated the effect of the following factors on the quality of the risk estimation: the estimator's age, work experience in computing, (self-reported) safety awareness and previous experience with the risk, the absolute value of the risk's probability, and the effect of knowing the estimates of the other participants (cf. the Delphi method). Our main findings are: risk probabilities are difficult to estimate. Younger and inexperienced estimators were not significantly worse than older and more experienced estimators, but the older and more experienced subjects made better use of the other participants' estimates. Persons with higher safety awareness tend to overestimate risk probabilities, but can better estimate the ordinal ranks of risk probabilities. Previous personal experience with a risk leads to an overestimation of its probability (unlike in fields such as medicine or disasters, where experience with a disease leads to more realistic probability estimates and lack of experience to underestimation).
Electrofishing capture probability of smallmouth bass in streams
Dauwalter, D.C.; Fisher, W.L.
2007-01-01
Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
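The adjustment the authors describe, dividing the sampled count by the predicted capture probability, is a one-liner; a hedged Python sketch (the function name and example numbers are ours, not from the paper):

```python
def abundance_estimate(n_captured: int, capture_prob: float) -> float:
    """Adjust a raw electrofishing count by the modeled capture
    probability: N-hat = C / p."""
    if not 0.0 < capture_prob <= 1.0:
        raise ValueError("capture probability must be in (0, 1]")
    return n_captured / capture_prob

# e.g. 42 fish sampled with a predicted cumulative capture probability of 0.6
print(abundance_estimate(42, 0.6))  # estimated abundance of 70
```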
Fixation of strategies driven by switching probabilities in evolutionary games
NASA Astrophysics Data System (ADS)
Xu, Zimin; Zhang, Jianlei; Zhang, Chunyan; Chen, Zengqiang
2016-12-01
We study the evolutionary dynamics of strategies in finite, homogeneous, well-mixed populations by means of the pairwise comparison process, the core of which is the proposed switching probability. Previous studies of this subject are usually based on known payoff comparisons between the players involved, which is an idealized assumption. In real social systems, acquiring the accurate payoffs of partners at each round of interaction may not be easy. We therefore bypass the need for explicit knowledge of payoffs and encode an individual's willingness to shift from her current strategy to the competing one in switching probabilities that are wholly independent of payoffs. In this way, strategy updating can be performed when the game model is fixed but payoffs are unclear, extending an idealized assumption toward a more realistic one. We explore the impact of the switching probability on the fixation probability and derive a simple formula that determines the fixation probability. Moreover, we find that cooperation dominates defection if the probability of cooperation replacing defection is always larger than the probability of defection replacing cooperation in finite populations. Finally, we investigate the influence of the model parameters on the fixation of strategies in the framework of three concrete game models: the prisoner's dilemma, the snowdrift game, and the stag-hunt game, which effectively portray the characteristics of cooperative dilemmas in real social systems.
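For constant, payoff-independent switching probabilities, the fixation probability of a single mutant follows from the standard birth-death chain formula. A Python sketch under that assumption (this is the textbook result for a constant transition ratio, not necessarily the paper's exact formula):

```python
def fixation_probability(q_cd: float, q_dc: float, N: int) -> float:
    """Fixation probability of a single cooperator in a well-mixed
    population of size N when, in each C-D encounter, a defector adopts
    cooperation with probability q_cd and a cooperator adopts defection
    with probability q_dc, independently of payoffs.
    Birth-death chain with constant ratio g = q_dc/q_cd:
    rho = (1 - g) / (1 - g**N), with the neutral limit 1/N at g = 1."""
    g = q_dc / q_cd
    if abs(g - 1.0) < 1e-12:
        return 1.0 / N  # neutral drift
    return (1.0 - g) / (1.0 - g ** N)

N = 100
print(fixation_probability(0.6, 0.4, N))  # above the neutral 1/N: cooperation favored
print(fixation_probability(0.4, 0.6, N))  # below 1/N: defection favored
```

This matches the abstract's claim: cooperation is favored exactly when the probability of cooperation replacing defection exceeds the reverse probability.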
Dopamine D₁ receptors and nonlinear probability weighting in risky choice.
Takahashi, Hidehiko; Matsui, Hiroshi; Camerer, Colin; Takano, Harumasa; Kodaka, Fumitoshi; Ideno, Takashi; Okubo, Shigetaka; Takemura, Kazuhisa; Arakawa, Ryosuke; Eguchi, Yoko; Murai, Toshiya; Okubo, Yoshiro; Kato, Motoichiro; Ito, Hiroshi; Suhara, Tetsuya
2010-12-08
Misestimating risk could lead to disadvantaged choices such as initiation of drug use (or gambling) and transition to regular drug use (or gambling). Although the normative theory in decision-making under risks assumes that people typically take the probability-weighted expectation over possible utilities, experimental studies of choices among risks suggest that outcome probabilities are transformed nonlinearly into subjective decision weights by a nonlinear weighting function that overweights low probabilities and underweights high probabilities. Recent studies have revealed the neurocognitive mechanism of decision-making under risk. However, the role of modulatory neurotransmission in this process remains unclear. Using positron emission tomography, we directly investigated whether dopamine D₁ and D₂ receptors in the brain are associated with transformation of probabilities into decision weights in healthy volunteers. The binding of striatal D₁ receptors is negatively correlated with the degree of nonlinearity of weighting function. Individuals with lower striatal D₁ receptor density showed more pronounced overestimation of low probabilities and underestimation of high probabilities. This finding should contribute to a better understanding of the molecular mechanism of risky choice, and extreme or impaired decision-making observed in drug and gambling addiction.
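The nonlinear weighting function the study refers to is commonly written in the one-parameter Tversky-Kahneman form; a Python sketch (this specific functional form is one common choice in the literature, not necessarily the one fitted in this study):

```python
def weight(p: float, gamma: float) -> float:
    """One-parameter probability weighting function:
    w(p) = p**g / (p**g + (1-p)**g)**(1/g).
    For gamma < 1 it overweights small p and underweights large p;
    gamma = 1 recovers linear (objective) weighting."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.5, 0.99):
    print(p, round(weight(p, 0.6), 3))  # gamma = 0.6: pronounced nonlinearity
```

Lower striatal D1 receptor density in the study corresponds, in this parameterization, to a smaller gamma, i.e. a more strongly curved weighting function.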
Off-site ignition probability of flammable gases.
Rew, P J; Spencer, H; Daycock, J
2000-01-07
A key step in the assessment of risk for installations where flammable liquids or gases are stored is the estimation of ignition probability. A review of current modelling and data confirmed that ignition probability values used in risk analyses tend to be based on extrapolation of limited incident data or, in many cases, on the judgement of those conducting the safety assessment. Existing models tend to assume that ignition probability is a function of release rate (or flammable gas cloud size) alone and they do not consider location, density or type of ignition source. An alternative mathematical framework for calculating ignition probability is outlined in which the approach used is to model the distribution of likely ignition sources and to calculate ignition probability by considering whether the flammable gas cloud will reach these sources. Data are collated on the properties of ignition sources within three generic land-use types: industrial, urban and rural. These data are then incorporated into a working model for ignition probability in a form capable of being implemented within risk analysis models. The sensitivity of the model results to assumptions made in deriving the ignition source properties is discussed and the model is compared with other available ignition probability methods.
Conditional Probability Analyses of the Spike Activity of Single Neurons
Gray, Peter R.
1967-01-01
With the objective of separating stimulus-related effects from refractory effects in neuronal spike data, various conditional probability analyses have been developed. These analyses are introduced and illustrated with examples based on electrophysiological data from auditory nerve fibers. The conditional probability analyses considered here involve the estimation of the conditional probability of a firing in a specified time interval (defined relative to the time of the stimulus presentation), given that the last firing occurred during an earlier specified time interval. This calculation enables study of the stimulus-related effects in the spike data with the time-since-the-last-firing as a controlled variable. These calculations indicate that auditory nerve fibers “recover” from the refractory effects that follow a firing in the following sense: after a “recovery time” of approximately 20 msec, the firing probabilities no longer depend on the time-since-the-last-firing. Probabilities conditional on this minimum time since the last firing are called “recovered probabilities.” The recovered probabilities presented in this paper are contrasted with the corresponding poststimulus time histograms, and the differences are related to the refractory properties of the nerve fibers. PMID:19210997
Estimating Prior Model Probabilities Using an Entropy Principle
NASA Astrophysics Data System (ADS)
Ye, M.; Meyer, P. D.; Neuman, S. P.; Pohlmann, K.
2004-12-01
Considering conceptual model uncertainty is an important process in environmental uncertainty/risk analyses. Bayesian Model Averaging (BMA) (Hoeting et al., 1999) and its Maximum Likelihood version, MLBMA, (Neuman, 2003) jointly assess predictive uncertainty of competing alternative models to avoid bias and underestimation of uncertainty caused by relying on one single model. These methods provide posterior distribution (or, equivalently, leading moments) of quantities of interests for decision-making. One important step of these methods is to specify prior probabilities of alternative models for the calculation of posterior model probabilities. This problem, however, has not been satisfactorily resolved and equally likely prior model probabilities are usually accepted as a neutral choice. Ye et al. (2004) have shown that whereas using equally likely prior model probabilities has led to acceptable geostatistical estimates of log air permeability data from fractured unsaturated tuff at the Apache Leap Research Site (ALRS) in Arizona, identifying more accurate prior probabilities can improve these estimates. In this paper we present a new methodology to evaluate prior model probabilities by maximizing Shannon's entropy with restrictions postulated a priori based on model plausibility relationships. It yields optimum prior model probabilities conditional on prior information used to postulate the restrictions. The restrictions and corresponding prior probabilities can be modified as more information becomes available. The proposed method is relatively easy to use in practice as it is generally less difficult for experts to postulate relationships between models than to specify numerical prior model probability values. Log score, mean square prediction error (MSPE) and mean absolute predictive error (MAPE) criteria consistently show that applying our new method to the ALRS data reduces geostatistical estimation errors provided relationships between models are
Naive Probability: Model-Based Estimates of Unique Events.
Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N
2015-08-01
We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning.
Two-slit experiment: quantum and classical probabilities
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2015-06-01
Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum-classical inter-relation is more complicated (cf, in particular, with the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have the non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, then we show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane).
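The violation of the law of total probability in the two-slit experiment can be seen numerically: amplitudes add, so an interference term appears on top of the classical sum of single-slit intensities. A minimal Python illustration (the single-slit intensities and phase are made-up example values):

```python
import math

def classical(p1: float, p2: float) -> float:
    """Law of total probability: with both slits open, single-slit
    detection probabilities simply add."""
    return p1 + p2

def quantum(p1: float, p2: float, phase: float) -> float:
    """Quantum rule: amplitudes add, producing the interference term
    2*sqrt(p1*p2)*cos(phase) that breaks the classical law."""
    return p1 + p2 + 2 * math.sqrt(p1 * p2) * math.cos(phase)

p1, p2 = 0.2, 0.2
print(classical(p1, p2))          # classical sum
print(quantum(p1, p2, 0.0))       # constructive interference: larger
print(quantum(p1, p2, math.pi))   # destructive interference: smaller
```

The classical value is recovered only at phases where the cosine vanishes, which is exactly the non-Kolmogorovian structure the letter discusses.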
Estimating the probability of failure when testing reveals no failures
NASA Technical Reports Server (NTRS)
Miller, Keith W.; Morell, Larry J.; Noonan, Robert E.; Park, Stephen K.; Nicol, David M.; Murrill, Branson W.; Voas, Jeffrey M.
1992-01-01
Formulas for estimating the probability of failure when testing reveals no errors are introduced. These formulas incorporate random testing results, information about the input distribution, and prior assumptions about the probability of failure of the software. The formulas are not restricted to equally likely input distributions, and the probability of failure estimate can be adjusted when assumptions about the input distribution change. The formulas are based on a discrete sample space statistical model of software and include Bayesian prior assumptions. Reusable software and software in life-critical applications are particularly appropriate candidates for this type of analysis.
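One standard Bayesian form consistent with this approach puts a Beta prior on the per-input failure probability; after t failure-free tests the posterior mean is a/(a+b+t), which gives 1/(t+2) under a uniform prior. A Python sketch (this is the generic conjugate-prior result, not necessarily the paper's exact formula set):

```python
def failure_prob_estimate(t_successes: int, a: float = 1.0, b: float = 1.0) -> float:
    """Posterior-mean failure probability after t tests with no failures,
    under a Beta(a, b) prior on the per-input failure probability.
    a = b = 1 is the uniform prior, giving 1/(t + 2)."""
    return a / (a + b + t_successes)

print(failure_prob_estimate(0))     # 0.5 before any testing (uniform prior)
print(failure_prob_estimate(1000))  # roughly 1e-3 after 1000 clean tests
```

Non-uniform priors (a, b) are how prior assumptions about the software, e.g. for reusable or life-critical components, enter the estimate.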
Hydrogeologic unit flow characterization using transition probability geostatistics.
Jones, Norman L; Walker, Justin R; Carle, Steven F
2005-01-01
This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
A new technique for predicting geosynchronous satellite collision probability
NASA Technical Reports Server (NTRS)
Mccormick, B.
1986-01-01
A new technique has been developed to predict the probability of an expired geosynchronous satellite colliding with an active satellite. This new technique employs deterministic methods for modeling the motion of satellites and applies statistical techniques to estimate the collision probability. The collision probability is used to estimate the expected time between collisions based on realistic distributions of expired and active satellites. The primary advantage of this new technique is that realistic distributions can be used in the prediction process instead of uniform distributions as has been used in previous techniques. The expected time between collisions based on a current NORAD database is estimated to be in the hundreds of years.
Fitness Probability Distribution of Bit-Flip Mutation.
Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique
2015-01-01
Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
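For Onemax, the exact post-mutation fitness distribution is easy to compute directly: ones are lost as Bin(k, p) and gained as Bin(n-k, p), and each probability mass is indeed a polynomial in p. A brute-force Python check (our own convolution sketch, not the Krawtchouk-polynomial derivation developed in the paper):

```python
from math import comb

def onemax_mutation_pmf(n: int, k: int, p: float) -> list[float]:
    """Exact distribution of the Onemax fitness (number of ones) after
    uniform bit-flip mutation of an n-bit string that has k ones.
    Ones drop by D ~ Bin(k, p); zeros flip up by U ~ Bin(n-k, p);
    the resulting fitness is k - D + U, computed here by convolution."""
    def binom_pmf(m: int, q: float) -> list[float]:
        return [comb(m, j) * q ** j * (1 - q) ** (m - j) for j in range(m + 1)]
    down, up = binom_pmf(k, p), binom_pmf(n - k, p)
    pmf = [0.0] * (n + 1)
    for d, pd in enumerate(down):
        for u, pu in enumerate(up):
            pmf[k - d + u] += pd * pu
    return pmf

pmf = onemax_mutation_pmf(n=10, k=7, p=0.1)
print(sum(pmf))                             # masses sum to 1
print(max(range(11), key=pmf.__getitem__))  # most likely post-mutation fitness
```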
Healthy life behaviors and suicide probability in university students.
Engin, Esra; Cuhadar, Dondu; Ozturk, Emel
2012-02-01
This study aims to determine the sociodemographic factors and healthy life behaviors affecting suicide and suicide probability of university students. The research was designed as a complementary study and conducted with 334 students from several faculties and colleges at Ege University, Turkey. The study findings indicated that suicide probability could be affected by the students' age, their problems at school, their troubled relations with friends, and a psychiatric disorder history within the last year. Moreover, it was concluded that the students with healthy life behaviors had significantly lower scores on the Suicide Probability Scale and its subscales.
Origin of probabilities and their application to the multiverse
NASA Astrophysics Data System (ADS)
Albrecht, Andreas; Phillips, Daniel
2014-12-01
We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of purely classical probabilities to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation.
10. NORTHWEST END OF WHITSETT PLANT SHOWING PIPELINES PROBABLY FOR ...
10. NORTHWEST END OF WHITSETT PLANT SHOWING PIPELINES PROBABLY FOR FIRE WATER STORAGE, LOOKING SOUTHWEST. - Whitsett Pump Plant, West side of Colorado River, north of Parker Dam, Parker Dam, San Bernardino County, CA
28 CFR 2.214 - Probable cause hearing and determination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... adverse witnesses (i.e., witnesses who have given information upon which revocation may be based) at a... confrontation. Whenever a probable cause hearing is postponed to secure the appearance of adverse witnesses...
Review of Literature for Model Assisted Probability of Detection
Meyer, Ryan M.; Crawford, Susan L.; Lareau, John P.; Anderson, Michael T.
2014-09-30
This is a draft technical letter report, prepared for an NRC client, documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components, with the aim of improving field NDE performance estimates.
Sculpture, general view looking to the seated lions, probably from ...
Sculpture, general view looking to the seated lions, probably from the American Bungalow - National Park Seminary, Bounded by Capitol Beltway (I-495), Linden Lane, Woodstove Avenue, & Smith Drive, Silver Spring, Montgomery County, MD
Probability of Intrinsic Time-Arrow from Information Loss
NASA Astrophysics Data System (ADS)
Diósi, Lajos
Time-arrow s=±, intrinsic to a concrete physical system, is associated with the direction of information loss I displayed by the random evolution of the given system. When the information loss tends to zero, the intrinsic time-arrow becomes uncertain. We propose a heuristic relationship for the probability of the intrinsic time-arrow, and the main part of the present work attempts to confirm it. The probability of the intrinsic time-arrow is defined by Bayesian inference from the observed random process. For irreversible thermodynamic systems, the proposed heuristic probabilities follow via the Gallavotti-Cohen relations between time-reversed random processes. To explore the underlying microscopic mechanism, a trivial microscopic process is analyzed and an obvious discrepancy is identified. It can be resolved by quantum theory: the corresponding trivial quantum process exactly confirms the proposed heuristic time-arrow probability.
Local estimation of posterior class probabilities to minimize classification errors.
Guerrero-Curieses, Alicia; Cid-Sueiro, Jesús; Alaiz-Rodríguez, Rocío; Figueiras-Vidal, Aníbal R
2004-03-01
Decision theory shows that the optimal decision is a function of the posterior class probabilities. More specifically, in binary classification, the optimal decision is based on the comparison of the posterior probabilities with some threshold. Therefore, the most accurate estimates of the posterior probabilities are required near these decision thresholds. This paper discusses the design of objective functions that provide more accurate estimates of the probability values, taking into account the characteristics of each decision problem. We propose learning algorithms based on the stochastic gradient minimization of these loss functions. We show that the performance of the classifier is improved when these algorithms behave like sample selectors: samples near the decision boundary are the most relevant during learning.
Bayes estimate of the probability of exceedance of annual floods
NASA Astrophysics Data System (ADS)
Lye, L. M.
1990-03-01
In this paper Lindley's Bayesian approximation procedure is used to obtain the Bayes estimate of the probability of exceedance of a flood discharge. The Bayes estimate of the probability of exceedance has been shown by S.K. Sinha to be equivalent to the estimate obtained from the predictive (Bayesian) distribution of a future flood discharge. Lindley's procedure avoids the evaluation of the complex ratios of multiple integrals common in Bayesian analysis. The Bayes estimates are compared to those obtained by the method of maximum likelihood and the method of moments. The results show that the Bayes estimates of the probability of exceedance are larger, as expected, but have smaller posterior standard deviations.
A comparison of tail probability estimators for flood frequency analysis
NASA Astrophysics Data System (ADS)
Moon, Young-Il; Lall, Upmanu; Bosworth, Ken
1993-11-01
Selected techniques for estimating exceedance frequencies of annual maximum flood events at a gaged site are compared in this paper. Four tail probability estimators proposed by Hill (PT1), by Hosking and Wallis (PT2), and by Breiman and Stone (ET and QT), and a variable kernel distribution function estimator (VK-C-AC) were compared for three situations: Gaussian data, skewed data (three-parameter gamma), and Gaussian mixture data. The performance of these estimators was compared with method-of-moments estimates of tail probabilities using the Gaussian, Pearson Type III, and extreme value distributions. Since the results of the tail probability estimators (PT1, PT2, ET, QT) varied with the situation, it is not easy to say which is best. However, the performance of the variable kernel estimator was relatively consistent across the estimation situations considered, in terms of both bias and r.m.s.e.
42. VIEW EAST OF PLASTIC STACK (PROBABLY PVC) WHICH VENTED ...
42. VIEW EAST OF PLASTIC STACK (PROBABLY PVC) WHICH VENTED FUMES FROM THE DIPPING OPERATIONS IN BUILDING 49A; BUILDING 49 IS AT THE LEFT OF THE PHOTOGRAPH - Scovill Brass Works, 59 Mill Street, Waterbury, New Haven County, CT
2. BARN. VIEW LOOKING NORTHWEST. THE ROLLING DOOR PROBABLY REPLACES ...
2. BARN. VIEW LOOKING NORTHWEST. THE ROLLING DOOR PROBABLY REPLACES AN ORIGINAL 4/4 DOUBLE-HUNG WINDOW. - Tonto Ranger Station, Barn, Forest Service Road 65 at Tonto Wash, Skull Valley, Yavapai County, AZ
Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Markley, F Landis
2014-01-01
This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
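The stopping rule behind a Wald sequential probability ratio test can be sketched in a few lines. The per-epoch likelihood ratios and the alpha/beta tolerances below are hypothetical illustrations, not values from the paper:

```python
import math

def sprt_decision(likelihood_ratios, alpha=0.05, beta=0.05):
    """Wald SPRT sketch. alpha: tolerated false-alarm rate; beta:
    tolerated missed-detection rate. Accumulates the log likelihood
    ratio of 'collision' vs 'no collision' and stops at Wald's
    thresholds."""
    upper = math.log((1 - beta) / alpha)   # accept "collision risk"
    lower = math.log(beta / (1 - alpha))   # accept "no collision risk"
    log_lr = 0.0
    for i, lr in enumerate(likelihood_ratios, start=1):
        log_lr += math.log(lr)
        if log_lr >= upper:
            return "mitigate", i
        if log_lr <= lower:
            return "dismiss", i
    return "continue", len(likelihood_ratios)

# Hypothetical stream of per-epoch likelihood ratios (>1 favors collision)
decision, n_used = sprt_decision([1.5, 2.0, 1.8, 2.5, 3.0, 2.2])
```

Note the test terminates as soon as the accumulated evidence crosses either threshold, which is what makes it usable within a tight conjunction-assessment timeline.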
Ideas of Chance and Probability in Children and Adolescents.
ERIC Educational Resources Information Center
Bliss, Joan
1978-01-01
Describes children's reactions to chance and probability in a variety of experimental situations, using four different experiments from Piaget's work, to give a clearer picture of how children approach these ideas. (GA)
A Probability Problem from Real Life: The Tire Exploded.
ERIC Educational Resources Information Center
Bartlett, Albert A.
1993-01-01
Discusses the probability of seeing a tire explode or disintegrate while traveling down the highway. Suggests that a person observing 10 hours a day would see a failure on the average of once every 300 years. (MVL)
Liouville equation and Markov chains: epistemological and ontological probabilities
NASA Astrophysics Data System (ADS)
Costantini, D.; Garibaldi, U.
2006-06-01
The greatest difficulty of a probabilistic approach to the foundations of Statistical Mechanics lies in the fact that for a system ruled by classical or quantum mechanics a basic description exists whose evolution is deterministic. For such a system any kind of irreversibility is impossible in principle. The probability used in this approach is epistemological. On the contrary, for irreducible aperiodic Markov chains the invariant measure is reached with probability one whatever the initial conditions. As time goes by, the uniform distributions on which the equilibrium treatment of quantum and classical perfect gases is based are almost surely reached. The transition probability for binary collisions, deduced from the Ehrenfest-Brillouin model, defines an irreducible aperiodic Markov chain and thus an equilibrium distribution. This means that we are describing the temporal probabilistic evolution of the system. The probability involved in this evolution is ontological.
17. Photocopy of photograph (source uncertain; probably the Bucks County ...
17. Photocopy of photograph (source uncertain; probably the Bucks County Historical Society) ca. 1913, photographer unknown SHOWING 1904 SOCIETY HEADQUARTERS AND THE EARLY PHASES OF CONSTRUCTION ON MERCER'S ADDITION - Mercer Museum, Pine & Ashland Streets, Doylestown, Bucks County, PA
Estimation of transition probabilities of credit ratings for several companies
NASA Astrophysics Data System (ADS)
Peng, Gan Chew; Hin, Pooi Ah
2016-10-01
This paper attempts to estimate the transition probabilities of credit ratings for a number of companies whose ratings have a dependence structure. Binary codes are used to represent the index of a company together with its ratings in the present and next quarters. We initially fit the data on the vector of binary codes with a multivariate power-normal distribution. We next compute the multivariate conditional distribution for the binary codes of rating in the next quarter when the index of the company and binary codes of the company in the present quarter are given. From the conditional distribution, we compute the transition probabilities of the company's credit ratings in two consecutive quarters. The resulting transition probabilities tally fairly well with the maximum likelihood estimates for the time-independent transition probabilities.
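The time-independent maximum likelihood benchmark mentioned at the end is straightforward: count transitions between consecutive quarters and normalise each row. A sketch with hypothetical ratings for three companies:

```python
from collections import Counter

def transition_matrix(sequences, states):
    """Maximum likelihood estimate of a time-independent transition
    matrix: count transitions between consecutive periods and
    normalise each row by the number of departures from that state."""
    counts = Counter()
    totals = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
            totals[a] += 1
    return {a: {b: (counts[(a, b)] / totals[a]) if totals[a] else 0.0
                for b in states}
            for a in states}

# Hypothetical quarterly ratings for three companies
ratings = [["A", "A", "B", "A"], ["B", "B", "A", "A"], ["A", "B", "B", "B"]]
P = transition_matrix(ratings, ["A", "B"])
```

The paper's multivariate power-normal approach replaces these raw counts with a fitted conditional distribution, which is what lets it capture dependence between companies.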
21. HISTORIC VIEW OF EARLY MIRAK DESIGN. PROBABLY AT THE ...
21. HISTORIC VIEW OF EARLY MIRAK DESIGN. PROBABLY AT THE FARM OF KLAUS RIEDEL'S GRANDPARENTS IN BERNSTADT, SAXONY, 1930. - Marshall Space Flight Center, Redstone Rocket (Missile) Test Stand, Dodd Road, Huntsville, Madison County, AL
Maximum Probability Reaction Sequences in Stochastic Chemical Kinetic Systems
Salehi, Maryam; Perkins, Theodore J.
2010-01-01
The detailed behavior of many molecular processes in the cell, such as protein folding, protein complex assembly, and gene regulation, transcription and translation, can often be accurately captured by stochastic chemical kinetic models. We investigate a novel computational problem involving these models – that of finding the most-probable sequence of reactions that connects two or more states of the system observed at different times. We describe an efficient method for computing the probability of a given reaction sequence, but argue that computing most-probable reaction sequences is EXPSPACE-hard. We develop exact (exhaustive) and approximate algorithms for finding most-probable reaction sequences. We evaluate these methods on test problems relating to a recently-proposed stochastic model of folding of the Trp-cage peptide. Our results provide new computational tools for analyzing stochastic chemical models, and demonstrate their utility in illuminating the behavior of real-world systems. PMID:21629860
Gauge transformation of quantum states in probability representation
NASA Astrophysics Data System (ADS)
Korennoy, Ya A.; Man’ko, V. I.
2017-04-01
The gauge invariance of the evolution equations of tomographic probability distribution functions of quantum particles in an electromagnetic field is illustrated. Explicit expressions for the transformations of ordinary tomograms of states under a gauge transformation of the electromagnetic field potentials are obtained. Gauge-independent optical and symplectic tomographic quasi-distributions and tomographic probability distributions of states of a quantum system are introduced, and their evolution equations, which have the Liouville equation in the corresponding representations as their classical limits, are found.
A short history of probability theory and its applications
NASA Astrophysics Data System (ADS)
Debnath, Lokenath; Basu, Kanadpriya
2015-01-01
Subjective Probability Distribution Elicitation in Cost Risk Analysis: A Review
2007-01-01
where reasonable, to counteract known biases in elicitation). For the triangle distribution, the probability is set to zero outside the endpoints, while between the endpoints the density rises linearly from the lower value to the most-likely value... Wheeler, T. A., S. C. Hora, W. R. Cramond, and S. D. Unwin, Analysis of Core Damage Frequency from Internal Events: Expert Judgment Elicitation
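The triangle density described in the snippet, zero outside the elicited endpoints and rising linearly to the most-likely value, can be written directly. The cost figures below are hypothetical:

```python
def triangle_pdf(x, low, mode, high):
    """Triangle density used in cost-risk elicitation: zero outside
    [low, high], rising linearly up to the mode and falling linearly
    back to zero after it (assumes low < mode < high)."""
    if x < low or x > high:
        return 0.0
    peak = 2.0 / (high - low)  # height at the mode, so the total area is 1
    if x <= mode:
        return peak * (x - low) / (mode - low)
    return peak * (high - x) / (high - mode)

# Hypothetical elicited cost in $M: optimistic 10, most likely 14, pessimistic 22
density_at_mode = triangle_pdf(14, 10, 14, 22)  # peak = 2 / (22 - 10)
```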
[A case of twins with probable superfetation (author's transl)].
Bertrams, J; Preuss, H
1980-01-01
A twin case of disputed paternity with probable superfetation is reported. The putative father could be excluded as the father of Twin F by HLA, GLO, and Ss typing results, but could not be excluded as the father of Twin S; the probability of paternity for this twin was 99.995%. A birth weight difference of 450 g and the evidence of additional acts of intercourse by the mother suggest the very rare event of a superfetation.
Radiometric measurements of gap probability in conifer tree canopies
NASA Technical Reports Server (NTRS)
Albers, Bryan J.; Strahler, Alan H.; Li, Xiaowen; Liang, Shunlin; Clarke, Keith C.
1990-01-01
Measurements of gap probability were made for some moderate-sized, open-grown conifers of varying species. Results of the radiometric analysis show that the gap probability, which is taken as the mean of the binomial, is well fitted by a negative exponential function of path length. The conifer shadow, then, is an object of almost uniform darkness with some bright holes or gaps that are found near the shadow's edge and rapidly disappear toward the shadow's center.
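A negative exponential dependence on path length, P_gap(s) = exp(-k s), invites the usual log-linear fit through the origin. A sketch on synthetic, noise-free readings (the decay constant 0.8 is arbitrary, not from the study):

```python
import math

def fit_gap_decay(path_lengths, gap_probs):
    """Least-squares fit of P_gap(s) = exp(-k * s) on a log scale,
    i.e. a straight line through the origin: -log(P) = k * s."""
    num = sum(s * (-math.log(p)) for s, p in zip(path_lengths, gap_probs))
    den = sum(s * s for s in path_lengths)
    return num / den

# Hypothetical radiometric readings: gap probability vs path length (m)
s = [0.5, 1.0, 1.5, 2.0]
p = [math.exp(-0.8 * x) for x in s]   # synthetic data with k = 0.8
k_hat = fit_gap_decay(s, p)
```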
Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution
Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati
2014-06-19
This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined as fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are given: the fuzzy transformation via a ranking function, and the stochastic transformation, in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
NASA Technical Reports Server (NTRS)
Havens, K. A.; Minster, T. C.; Thadani, S. G.
1976-01-01
The probability of error or, alternatively, the probability of correct classification (PCC) is an important criterion in analyzing the performance of a classifier. Labeled samples (those with ground truth) are usually employed to evaluate the performance of a classifier. Occasionally, the number of labeled samples is inadequate, or no labeled samples are available to evaluate a classifier's performance; for example, when crop signatures from one area, for which ground truth is available, are used to classify another area for which no ground truth is available. This paper reports the results of an experiment to estimate the probability of error using unlabeled test samples (i.e., without the aid of ground truth).
Maximum entropy principle and partial probability weighted moments
NASA Astrophysics Data System (ADS)
Deng, Jian; Pandey, M. D.; Xie, W. C.
2012-05-01
The maximum entropy principle (MaxEnt) is usually used for estimating the probability density function under specified moment constraints. The density function is then integrated to obtain the cumulative distribution function, which needs to be inverted to obtain the quantile corresponding to some specified probability. In such analysis, consideration of higher-order moments is important for accurate modelling of the distribution tail. There are three drawbacks to this conventional methodology: (1) estimates of higher-order (>2) moments from a small sample of data tend to be highly biased; (2) it can only cope with complete or noncensored samples; (3) only probability weighted moments of integer orders have been utilized. These difficulties inevitably induce bias and inaccuracy in the resultant quantile estimates and have therefore been the main impediments to the application of the MaxEnt principle in extreme quantile estimation. This paper attempts to overcome these problems and presents a distribution-free method for estimating the quantile function of a non-negative random variable using the principle of maximum partial entropy subject to constraints on the partial probability weighted moments estimated from a censored sample. The main contributions include: (1) new concepts, i.e., partial entropy, fractional partial probability weighted moments, and the partial Kullback-Leibler measure, are defined; (2) the maximum entropy principle is re-formulated to be constrained by fractional partial probability weighted moments; (3) new distribution-free quantile functions are derived. Numerical analyses are performed to assess the accuracy of extreme value estimates computed from censored samples.
An improved probability mapping approach to assess genome mosaicism
Zhaxybayeva, Olga; Gogarten, J Peter
2003-01-01
Background Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling, four-taxon analyses suffer from sensitivity to the long branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows further analyses to focus on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion Posterior probability is a non-conservative measure of support, and posterior probability mapping only provides a quick estimation of the phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events. PMID:12974984
A quantum theoretical explanation for probability judgment errors.
Busemeyer, Jerome R; Pothos, Emmanuel M; Franco, Riccardo; Trueblood, Jennifer S
2011-04-01
A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector spaces defined by features and similarities between vectors to determine probability judgments. On the other hand, quantum probability theory is a generalization of Bayesian probability theory because it is based on a set of (von Neumann) axioms that relax some of the classic (Kolmogorov) axioms. The quantum model is compared and contrasted with other competing explanations for these judgment errors, including the anchoring and adjustment model for probability judgments. In the quantum model, a new fundamental concept in cognition is advanced--the compatibility versus incompatibility of questions and the effect this can have on the sequential order of judgments. We conclude that quantum information-processing principles provide a viable and promising new way to understand human judgment and reasoning.
Anticipating abrupt shifts in temporal evolution of probability of eruption
NASA Astrophysics Data System (ADS)
Rohmer, Jeremy; Loschetter, Annick
2016-04-01
Estimating the probability of eruption by jointly accounting for different sources of monitoring parameters over time is a key component of volcano risk management. In the present study, we are interested in the transition from a state of low-to-moderate probability to one of high probability: the latter generally supports the call for evacuation. Using the data of the MESIMEX exercise at the Vesuvius volcano, we investigated the potential of time-varying indicators, related to the correlation structure or to the variability of the probability time series, to detect this critical transition in advance. We found that changes in the power spectra and in the standard deviation estimated over a rolling time window both present an abrupt increase, which marks the approaching shift. Our numerical experiments revealed that the transition from an eruption probability of 10-15% to >70% could be identified up to 4 hours in advance, ~2.5 days before the evacuation call (decided for an eruption probability >80% during the MESIMEX exercise). This additional lead time could be useful for placing different key services (e.g., emergency services for vulnerable groups, commandeering additional transportation means, etc.) on a higher level of alert before the actual call for evacuation.
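The variability indicator used here, the standard deviation over a rolling window, is easy to sketch on a synthetic probability series; the numbers below are illustrative, not MESIMEX data:

```python
import statistics

def rolling_std(series, window):
    """Standard deviation over a rolling time window; an abrupt rise
    can flag an approaching shift in the probability series."""
    return [statistics.stdev(series[i - window:i])
            for i in range(window, len(series) + 1)]

# Synthetic probability-of-eruption series: quiet phase, then volatile phase
quiet = [0.10, 0.11, 0.10, 0.12, 0.11, 0.10]
volatile = [0.15, 0.35, 0.20, 0.55, 0.30, 0.75]
indicator = rolling_std(quiet + volatile, window=4)
# The indicator rises sharply once the volatile phase enters the window
```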
Probability analysis of position errors using uncooled IR stereo camera
NASA Astrophysics Data System (ADS)
Oh, Jun Ho; Lee, Sang Hwa; Lee, Boo Hwan; Park, Jong-Il
2016-05-01
This paper analyzes the random behavior of 3D positions when tracking moving objects with an infrared (IR) stereo camera, and proposes a probability model of 3D positions. The proposed probability model integrates two random error phenomena. One is the pixel quantization error, which is caused by the discrete sampling pixels used in estimating the disparity values of the stereo camera. The other is the timing jitter, which results from the irregular acquisition timing of uncooled IR cameras. This paper derives a probability distribution function by combining the jitter model with the pixel quantization error. To verify the proposed probability function of 3D positions, experiments on tracking fast moving objects are performed using an IR stereo camera system. The 3D depths of the moving object are estimated by stereo matching and compared with the ground truth obtained by a laser scanner system. According to the experiments, the 3D depths of the moving object are estimated within the statistically reliable range derived from the proposed probability distribution. It is expected that the proposed probability model of 3D positions can be applied to various IR stereo camera systems that deal with fast moving objects.
Fixation probability for lytic viruses: the attachment-lysis model.
Patwa, Z; Wahl, L M
2008-09-01
The fixation probability of a beneficial mutation is extremely sensitive to assumptions regarding the organism's life history. In this article we compute the fixation probability using a life-history model for lytic viruses, a key model organism in experimental studies of adaptation. The model assumes that attachment times are exponentially distributed, but that the lysis time, the time between attachment and host cell lysis, is constant. We assume that the growth of the wild-type viral population is controlled by periodic sampling (population bottlenecks) and also include the possibility that clearance may occur at a constant rate, for example, through washout in a chemostat. We then compute the fixation probability for mutations that increase the attachment rate, decrease the lysis time, increase the burst size, or reduce the probability of clearance. The fixation probability of these four types of beneficial mutations can be vastly different and depends critically on the time between population bottlenecks. We also explore mutations that affect lysis time, assuming that the burst size is constrained by the lysis time, for experimental protocols that sample either free phage or free phage and artificially lysed infected cells. In all cases we predict that the fixation probability of beneficial alleles is remarkably sensitive to the time between population bottlenecks.
Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.
Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark
2016-03-01
Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
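The accuracy gap between matching and maximizing follows from a one-line expected-value calculation: for an outcome probability of 0.7, matching yields 0.7 × 0.7 + 0.3 × 0.3 = 0.58, versus 0.70 for always predicting the likelier event.

```python
def expected_accuracy(p_outcome, p_predict):
    """Expected accuracy when predicting a binary outcome that occurs
    with probability p_outcome, while predicting it with probability
    p_predict on each trial (predictions independent of outcomes)."""
    return p_outcome * p_predict + (1 - p_outcome) * (1 - p_predict)

p = 0.7
matching = expected_accuracy(p, p)       # probability matching
maximizing = expected_accuracy(p, 1.0)   # always predict the likely event
```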
Classifier calibration using splined empirical probabilities in clinical risk prediction.
Gaudoin, René; Montana, Giovanni; Jones, Simon; Aylin, Paul; Bottle, Alex
2015-06-01
The aims of supervised machine learning (ML) applications fall into three broad categories: classification, ranking, and calibration/probability estimation. Many ML methods and evaluation techniques relate to the first two. Nevertheless, there are many applications where having an accurate probability estimate is of great importance. Deriving accurate probabilities from the output of a ML method is therefore an active area of research, resulting in several methods to turn a ranking into class probability estimates. In this manuscript we present a method, splined empirical probabilities, based on the receiver operating characteristic (ROC) to complement existing algorithms such as isotonic regression. Unlike most other methods it works with a cumulative quantity, the ROC curve, and as such can be tagged onto an ROC analysis with minor effort. On a diverse set of measures of the quality of probability estimates (Hosmer-Lemeshow, Kullback-Leibler divergence, differences in the cumulative distribution function) using simulated and real health care data, our approach compares favourably with the standard calibration method, the pool adjacent violators algorithm used to perform isotonic regression.
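The isotonic-regression baseline the authors compare against, the pool adjacent violators algorithm, can be sketched in a few lines; the scores and labels below are made up:

```python
def pav_calibrate(scores_and_labels):
    """Pool adjacent violators sketch: map classifier scores to
    calibrated probabilities by fitting a non-decreasing sequence to
    the labels sorted by score."""
    pairs = sorted(scores_and_labels)              # sort by score
    merged = []                                    # blocks: [label_sum, count]
    for _, label in pairs:
        merged.append([label, 1])
        # merge while running block means violate monotonicity
        while (len(merged) > 1 and
               merged[-2][0] / merged[-2][1] > merged[-1][0] / merged[-1][1]):
            s, n = merged.pop()
            merged[-1][0] += s
            merged[-1][1] += n
    probs = []
    for s, n in merged:
        probs.extend([s / n] * n)                  # one value per sample
    return [score for score, _ in pairs], probs

# Hypothetical (score, binary label) pairs
scores, cal = pav_calibrate(
    [(0.1, 0), (0.4, 1), (0.35, 0), (0.8, 1), (0.7, 0), (0.9, 1)])
```

The output is a monotone step function of the score, which is exactly the constraint the splined-ROC approach relaxes into a smooth curve.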
The preference of probability over negative values in action selection.
Neyedli, Heather F; Welsh, Timothy N
2015-01-01
It has previously been found that when participants are presented with a pair of motor prospects, they can select the prospect with the largest maximum expected gain (MEG). Many of those decisions, however, were trivial because of large differences in MEG between the prospects. The purpose of the present study was to explore participants' preferences when making non-trivial decisions between two motor prospects. Participants were presented with pairs of prospects that: 1) differed in MEG with either only the values or only the probabilities differing between the prospects; and 2) had similar MEG with one prospect having a larger probability of hitting the target and a higher penalty value and the other prospect a smaller probability of hitting the target but a lower penalty value. In different experiments, participants either had 400 ms or 2000 ms to decide between the prospects. It was found that participants chose the configuration with the larger MEG more often when the probability varied between prospects than when the value varied. In pairs with similar MEGs, participants preferred a larger probability of hitting the target over a smaller penalty value. These results indicate that participants prefer probability information over negative value information in a motor selection task.
Non-adiabatic transition probability dependence on conical intersection topography
NASA Astrophysics Data System (ADS)
Malhado, João Pedro; Hynes, James T.
2016-11-01
We derive a closed form analytical expression for the non-adiabatic transition probability for a distribution of trajectories passing through a generic conical intersection (CI), based on the Landau-Zener equation for the non-adiabatic transition probability for a single straight-line trajectory in the CI's vicinity. We investigate the non-adiabatic transition probability's variation with topographical features and find, for the same crossing velocity, no intrinsic difference in efficiency at promoting non-adiabatic decay between peaked and sloped CIs, a result in contrast to the commonly held view. Any increased efficiency of peaked over sloped CIs is thus due to dynamical effects rather than to any increased transition probability of topographical origin. It is also shown that the transition probability depends in general on the direction of approach to the CI, and that the coordinates' reduced mass can affect the transition probability via its influence on the CI topography in mass-scaled coordinates. The resulting predictions compare well with surface hopping simulation results.
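The single-trajectory Landau-Zener expression that the derivation builds on can be sketched directly. Note that the prefactor convention varies in the literature, and the parameter values below are arbitrary illustrations:

```python
import math

def landau_zener_hop_probability(coupling, slope_diff, velocity, hbar=1.0):
    """Landau-Zener probability of a non-adiabatic transition for a
    straight-line passage through the crossing region (one common
    convention): P = exp(-2*pi*coupling**2 / (hbar*velocity*|slope_diff|))."""
    return math.exp(-2 * math.pi * coupling ** 2
                    / (hbar * velocity * abs(slope_diff)))

# Faster passage through the crossing -> larger non-adiabatic probability
p_slow = landau_zener_hop_probability(coupling=0.1, slope_diff=1.0, velocity=0.5)
p_fast = landau_zener_hop_probability(coupling=0.1, slope_diff=1.0, velocity=5.0)
```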
Nahorniak, Matthew; Larsen, David P; Volk, Carol; Jordan, Chris E
2015-01-01
In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design-based statistical analysis tools are appropriate for seamless integration of the sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model-based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model-based tools may require, to ensure unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite numerous method-specific tools available to properly account for sampling design, too often in the analysis of ecological data the sample design is ignored and the consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available for researchers to properly account for sampling design in model-based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model-based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model-based analysis tools: linear regression, quantile regression, and boosted regression tree analysis. In all models, using both simulated and actual ecological data, we found inferences to be
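The IPB idea, re-weighting each sampled unit by the inverse of its inclusion probability before bootstrapping, can be sketched as follows; the strata and inclusion probabilities are hypothetical:

```python
import random

def ipb_resample(data, inclusion_probs, n, seed=42):
    """Inverse probability bootstrap (sketch): draw an equal-probability
    re-sample from an unequal-probability sample by weighting each unit
    by 1 / (its inclusion probability)."""
    rng = random.Random(seed)
    weights = [1.0 / p for p in inclusion_probs]
    return rng.choices(data, weights=weights, k=n)

# Hypothetical stratified design: stratum A oversampled (inclusion p = 0.8)
data = ["A1", "A2", "A3", "B1", "B2"]
probs = [0.8, 0.8, 0.8, 0.2, 0.2]
resample = ipb_resample(data, probs, n=2000)
# After re-weighting, stratum B should dominate: weight 10 of 13.75 (~73%)
frac_b = sum(x.startswith("B") for x in resample) / len(resample)
```

Model-based tools (regression, quantile regression, regression trees) can then be run on each such re-sample as if it were a simple random sample.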
On the probability of cure for heavy-ion radiotherapy.
Hanin, Leonid; Zaider, Marco
2014-07-21
The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells, as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
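The kind of bound described here follows from the standard Poisson tumour-control model, in which cure means no clonogen survives. A sketch with arbitrary illustrative numbers (the survival fraction and cure rate are not from the paper):

```python
import math

def cure_probability(n_clonogens, survival_fraction):
    """Poisson tumour-control model: the cure probability is the
    probability that zero clonogenic cells survive, exp(-N * S(D))."""
    return math.exp(-n_clonogens * survival_fraction)

def clonogen_upper_bound(observed_cure_prob, survival_fraction):
    """Invert the model: the pre-treatment clonogen number consistent
    with an empirically observed cure rate at a given cell survival."""
    return -math.log(observed_cure_prob) / survival_fraction

# Hypothetical: 60% observed cure rate with per-cell survival S(D) = 1e-9
n_max = clonogen_upper_bound(0.6, 1e-9)
```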
TRANSIT PROBABILITIES FOR STARS WITH STELLAR INCLINATION CONSTRAINTS
Beatty, Thomas G.; Seager, Sara
2010-04-01
The probability that an exoplanet transits its host star is high for planets in close orbits, but drops off rapidly for increasing semimajor axes. This makes transit surveys for planets with large semimajor axes orbiting bright stars impractical, since one would need to continuously observe hundreds of stars that are spread out over the entire sky. One way to make such a survey tractable is to constrain the inclination of the stellar rotation axes in advance, and thereby enhance the transit probabilities. We derive transit probabilities for stars with stellar inclination constraints, considering a reasonable range of planetary system inclinations. We find that stellar inclination constraints can improve the transit probability by almost an order of magnitude for habitable-zone planets. When applied to an ensemble of stars, such constraints dramatically lower the number of stars that need to be observed in a targeted transit survey. We also consider multiplanet systems where only one planet has an identified transit and derive the transit probabilities for the second planet assuming a range of mutual planetary inclinations.
CProb: a computational tool for conducting conditional probability analysis.
Hollister, Jeffrey W; Walker, Henry A; Paul, John F
2008-01-01
Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb computes conditional probabilities and produces scatterplots, empirical cumulative distribution function plots, and conditional probability plots. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
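The core CPA computation is simple: the fraction of observations with a degraded response among those exceeding a stressor cutoff. A minimal sketch with hypothetical stressor-response data (the variable names and thresholds are illustrative, not CProb's API):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical paired observations: a stressor (e.g. contaminant level)
# and a response (e.g. an ecological condition index, lower = worse)
stressor = rng.uniform(0, 10, 500)
response = 100 - 6 * stressor + rng.normal(0, 10, 500)

def conditional_probability(stressor, response, x_cut, y_cut):
    """P(response < y_cut | stressor >= x_cut), the core CPA quantity."""
    exceed = stressor >= x_cut
    if not exceed.any():
        return float("nan")
    return float((response[exceed] < y_cut).mean())

# Probability of a degraded response (< 50) for increasing stressor cutoffs
for x_cut in (2.0, 5.0, 8.0):
    p = conditional_probability(stressor, response, x_cut, 50.0)
    print(f"P(response < 50 | stressor >= {x_cut}) = {p:.2f}")
```

Sweeping the cutoff and plotting the resulting probabilities gives the conditional probability curve that CProb produces alongside the scatterplot and empirical CDFs.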
Multiclass Posterior Probability Twin SVM for Motor Imagery EEG Classification.
She, Qingshan; Ma, Yuliang; Meng, Ming; Luo, Zhizeng
2015-01-01
Motor imagery electroencephalography is widely used in brain-computer interface systems. Due to the inherent characteristics of electroencephalography signals, accurate and real-time multiclass classification is always challenging. To address this problem, this paper proposes a multiclass posterior probability solution for the twin SVM based on ranking continuous outputs and pairwise coupling. First, a two-class posterior probability model is constructed to approximate the posterior probability using ranking continuous output techniques and Platt's estimating method. Second, a solution for multiclass probabilistic outputs for the twin SVM is provided by combining every pair of class probabilities according to the method of pairwise coupling. Finally, the proposed method is compared with multiclass SVM and twin SVM via voting, and with multiclass posterior probability SVM using different coupling approaches. The efficacy of the proposed method, in terms of classification accuracy and time complexity, is demonstrated on both UCI benchmark datasets and real-world EEG data from BCI Competition IV Dataset 2a.
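Pairwise coupling combines the K(K-1)/2 two-class posteriors into K multiclass posteriors. The abstract does not specify which coupling rule the authors use, so the sketch below uses the simple closed-form rule of Price et al. (1995) as one illustrative instance:

```python
import numpy as np

def pairwise_coupling(R):
    """Combine pairwise posteriors r_ij = P(class i | class i or j) into
    multiclass posteriors via the closed-form rule of Price et al.:
    p_i = 1 / (sum_{j != i} 1/r_ij - (K - 2)), then normalize."""
    K = R.shape[0]
    p = np.empty(K)
    for i in range(K):
        s = sum(1.0 / R[i, j] for j in range(K) if j != i)
        p[i] = 1.0 / (s - (K - 2))
    return p / p.sum()

# Consistent pairwise probabilities built from a known posterior
p_true = np.array([0.5, 0.3, 0.2])
K = len(p_true)
R = np.eye(K) * 0.5
for i in range(K):
    for j in range(K):
        if i != j:
            R[i, j] = p_true[i] / (p_true[i] + p_true[j])

print(pairwise_coupling(R))  # recovers [0.5, 0.3, 0.2]
```

When the pairwise estimates are mutually consistent, as constructed here, the rule recovers the underlying posterior exactly; with noisy estimates it gives a least-harm compromise.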
Anticipating abrupt shifts in temporal evolution of probability of eruption
NASA Astrophysics Data System (ADS)
Rohmer, J.; Loschetter, A.
2016-04-01
Estimating the probability of eruption by jointly accounting for different sources of monitoring parameters over time is a key component of volcano risk management. In the present study, we are interested in the transition from a state of low-to-moderate probability to a state of high probability. Using data from the MESIMEX exercise at the Vesuvius volcano, we investigated the potential of time-varying indicators, related to the correlation structure or to the variability of the probability time series, for detecting this critical transition in advance. We found that changes in the power spectra and in the standard deviation estimated over a rolling time window both present an abrupt increase, which marks the approaching shift. Our numerical experiments revealed that the transition from an eruption probability of 10-15% to >70% could be identified up to 1-3 h in advance. This additional lead time could be useful for placing key services (e.g., emergency services for vulnerable groups, commandeering additional transportation means, etc.) on a higher level of alert before the actual call for evacuation.
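The variability-based indicator can be sketched as a rolling-window standard deviation of the probability series; the synthetic series below (a sigmoid shift from roughly 12% to 70% plus growing noise) is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical eruption-probability series: quiet level, then a critical
# transition centred near t = 800
t = np.arange(1000)
base = 0.12 + 0.6 / (1.0 + np.exp(-(t - 800) / 15.0))   # ~12% -> ~70%
noise = rng.normal(0, 0.005 + 0.00004 * t, t.size)       # growing variability
prob = base + noise

def rolling_std(x, window):
    """Standard deviation over a rolling window: one early-warning indicator."""
    return np.array([x[i - window:i].std() for i in range(window, x.size)])

ind = rolling_std(prob, 100)
# The indicator rises well above its early baseline as the shift approaches
print(round(float(ind[:100].mean()), 4), round(float(ind[-150:-100].mean()), 4))
```

An alert rule would flag the moment the indicator exceeds some multiple of its baseline, buying lead time before the probability itself crosses a decision threshold.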
Rejecting probability summation for radial frequency patterns, not so Quick!
Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F
2016-05-01
Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than would be possible if the parts were detected independently and performance improved with the number of modulated cycles only through probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT), rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross-validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF pattern.
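The HTT probability summation baseline that the study re-examines has a simple closed form: if each of n modulated cycles is independently detected with probability p, the pattern is detected whenever any cycle is. A minimal sketch:

```python
# High-threshold probability summation: detection probability for a pattern
# of n parts, each detected independently with probability p
def probability_summation(p, n):
    return 1 - (1 - p) ** n

for n in (1, 2, 4, 8):
    print(n, round(probability_summation(0.3, n), 3))
```

Under SDT this independence calculation no longer holds, which is why the paper refits summation models against rating-scale ROC data instead.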
The factor of 10 in forensic DNA match probabilities.
Gittelson, Simone; Moretti, Tamyra R; Onorato, Anthony J; Budowle, Bruce; Weir, Bruce S; Buckleton, John
2017-02-16
An update was performed of the classic experiments that led to the view that profile probability assignments are usually within a factor of 10 of each other. The data used in this study consist of 15 Identifiler loci collected from a wide range of forensic populations. Following Budowle et al. [1], the terms cognate and non-cognate are used. The cognate database is the database from which the profiles are simulated. The profile probability assignment was usually larger in the cognate database. In 44%-65% of the cases, the profile probability for 15 loci in the non-cognate database was within a factor of 10 of the profile probability in the cognate database. This proportion was between 60% and 80% when the FBI and NIST data were used as the non-cognate databases. A second experiment compared the match probability assignment using a generalised database and recommendation 4.2 from NRC II (the 4.2 assignment) with a proxy for the matching proportion developed using subpopulation allele frequencies and the product rule. The findings indicate that the 4.2 assignment has a large conservative bias. These results are in agreement with previous research.
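The profile probability assignments being compared come from the product rule: multiplying per-locus genotype probabilities across independent loci. A toy comparison with made-up allele frequencies for three loci; the "factor of 10" criterion is then a bound on the log10 ratio between databases:

```python
import math

# Product-rule profile probability: per-locus heterozygote probability
# 2*p*q, multiplied across independent loci (allele frequencies made up)
def profile_probability(locus_freqs):
    prob = 1.0
    for p, q in locus_freqs:
        prob *= 2.0 * p * q
    return prob

cognate = [(0.10, 0.20), (0.05, 0.30), (0.15, 0.25)]      # simulating database
noncognate = [(0.12, 0.18), (0.06, 0.24), (0.13, 0.28)]   # other database

p_cog = profile_probability(cognate)
p_non = profile_probability(noncognate)
# Within a factor of 10  <=>  |log10 ratio| < 1
print(p_cog, p_non, abs(math.log10(p_cog / p_non)) < 1.0)
```

With 15 real loci the per-locus discrepancies compound, which is why the within-a-factor-of-10 proportion in the study falls to 44%-65% for some database pairs.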
Visual search and location probability learning from variable perspectives.
Jiang, Yuhong V; Swallow, Khena M; Capistrano, Christian G
2013-05-28
Do moving observers code attended locations relative to the external world or relative to themselves? To address this question we asked participants to conduct visual search on a tabletop. The search target was more likely to occur in some locations than others. Participants walked to different sides of the table from trial to trial, changing their perspective. The high-probability locations were stable on the tabletop but variable relative to the viewer. When participants were informed of the high-probability locations, search was faster when the target was in those locations, demonstrating probability cuing. However, in the absence of explicit instructions and awareness, participants failed to acquire an attentional bias toward the high-probability locations even when the search items were displayed over an invariant natural scene. Additional experiments showed that locomotion did not interfere with incidental learning, but the lack of a consistent perspective prevented participants from acquiring probability cuing incidentally. We conclude that spatial biases toward target-rich locations are directed by two mechanisms: incidental learning and goal-driven attention. Incidental learning codes attended locations in a viewer-centered reference frame and is not updated with viewer movement. Goal-driven attention can be deployed to prioritize an environment-rich region.
Probability in reasoning: a developmental test on conditionals.
Barrouillet, Pierre; Gauffroy, Caroline
2015-04-01
Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation: the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, such that P(if p then q)=P(q|p). Using the probabilistic truth-table task, in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements), and contents. The results reveal that the Equation is a late developmental achievement, endorsed only by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings from the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories.
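The Equation itself is just a ratio of truth-table cell probabilities. A worked example with hypothetical cell values:

```python
# Probabilistic truth-table task: each cell of the truth table for
# "If p then q" is assigned a probability (hypothetical values)
P = {("p", "q"): 0.30, ("p", "not-q"): 0.10,
     ("not-p", "q"): 0.25, ("not-p", "not-q"): 0.35}

# The Equation: P(if p then q) = P(q | p) = P(p & q) / P(p)
p_and_q = P[("p", "q")]
p_alone = p_and_q + P[("p", "not-q")]
conditional = p_and_q / p_alone
print(conditional)  # about 0.75
```

A participant who "endorses the Equation" answers 0.75 here; truth-functional readings (e.g. the material conditional, 0.90 in this table) give different answers, which is what the developmental task exploits.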
Probability as Possibility Spaces: Communicating Uncertainty to Policymakers
NASA Astrophysics Data System (ADS)
Stiso, C.
2015-12-01
One problem facing the sciences is communicating results and recommendations to policymakers. This is perhaps particularly difficult in the geosciences where results are often based on probabilistic models, as probability is often and unduly equated with a specific kind of uncertainty or unreliability in the results. This leads to a great deal of miscommunication and misguided policy decisions. It is, then, valid to ask how scientists should talk about probability, uncertainty, and models in a way that correctly conveys what the users of these models intend. What I propose is a new way to think and, importantly, talk about probability which will hopefully make this much more transparent to both users and policy makers. Rather than using a frequentist (prior percentages) or Bayesian (observer uncertainty) framework, we should talk about probability as a tool for defining a possibility space for measurements. This model is conceptually simple and makes probability a tool of refinement rather than a source of inaccuracy. A similar possibility-space model has proven useful in the climate sciences and there is good reason to believe it will have similar applications in hydrology.
α-particle preformation in heavy nuclei and penetration probability
Zhang, H. F.; Royer, G.
2008-05-15
The α-particle preformation in even-even nuclei from ¹⁰⁸Te to ²⁹⁴118 and the penetration probability have been studied. The isotopes from Pb to U were investigated first, since the experimental data allow the microscopic features of each element to be extracted. The assault frequency has been estimated using classical methods, and the penetration probability from tunneling through the Generalized Liquid Drop Model (GLDM) potential barrier. The preformation factor has been extracted from experimental α-decay energies and half-lives. Shell closure effects play the key role in α preformation: the closer the nucleon number is to a magic number, the more difficult the formation of an α cluster inside the mother nucleus. The penetration probabilities reflect that 126 is a neutron magic number. The range of the penetration probability is very large compared to that of the preformation factor. The penetration probability mainly determines the α-decay half-life, while the preformation factor provides information on the nuclear structure. The study has been extended to the newly observed heaviest nuclei.
Detection probability of vocalizing dugongs during playback of conspecific calls.
Ichikawa, Kotaro; Akamatsu, Tomonari; Shinke, Tomio; Sasamori, Kotoe; Miyauchi, Yukio; Abe, Yuki; Adulyanukosol, Kanjana; Arai, Nobuaki
2009-10-01
Dugongs (Dugong dugon) were monitored using simultaneous passive acoustic methods and visual observations in Thai waters during January 2008. Chirp and trill calls were detected by a towed stereo hydrophone array system. Two teams of experienced observers conducted standard visual observations on the same boat, and the detection probabilities of acoustic monitoring and of the two independent visual observers were compared. Acoustic and visual detection probabilities were 15.1% and 15.7%, respectively, employing a 300 s matching time interval. When conspecific chirp calls were broadcast from an underwater speaker deployed on the side of the observation boat, the detection probability of acoustic monitoring rose to 19.2%, while the visual detection probability was 12.5%. Vocal hot spots characterized by frequent acoustic detection of calls were suggested by dispersion analysis, whereas dugongs were observed visually at a constant rate throughout the focal area (p<0.001). Passive acoustic monitoring assisted the survey, showing detection performance similar to that of experienced visual observers. Playback of conspecific chirps appeared to increase the detection probability, which could be beneficial for future field surveys using passive acoustics to confirm the presence of dugongs in the focal area.
Looking inside Mount Vesuvius by potential fields integrated probability tomographies
NASA Astrophysics Data System (ADS)
Iuliano, Teresa; Mauriello, Paolo; Patella, Domenico
2002-03-01
First, we outline the theory of three-dimensional (3D) probability tomography for any generic vector or scalar geophysical field and define an approach to the integrated tomography of any pair of geophysical data sets collected in the same area. Then, we discuss the results of the application of 3D probability tomography to the Mount Vesuvius volcanic complex, considering gravity, magnetic, and self-potential survey data. The most important feature resulting from the integrated tomography concerns the plumbing system of Mt. Vesuvius. A unique central conduit is outlined at the intersection of a W-E-trending and a N-S-trending vertical boundary plane. The top terminal part of this conduit appears completely filled with magnetized and less dense volcanic material. This new information, combined with previous indications of the probable existence of a magma reservoir at 8-10 km depth, strengthens the hypothesis that Mount Vesuvius is still to be considered a highly hazardous volcano.
Testing the Value of Probability Forecasts for Calibrated Combining
Lahiri, Kajal; Peng, Huaming; Zhao, Yongchen
2014-01-01
We combine the probability forecasts of a real GDP decline from the U.S. Survey of Professional Forecasters, after trimming the forecasts that do not have “value”, as measured by the Kuiper Skill Score and in the sense of Merton (1981). For this purpose, we use a simple test to evaluate the probability forecasts. The proposed test does not require the probabilities to be converted to binary forecasts before testing, and it accommodates serial correlation and skewness in the forecasts. We find that the number of forecasters making valuable forecasts decreases sharply as the horizon increases. The beta-transformed linear pool combination scheme, based on the valuable individual forecasts, is shown to outperform the simple average for all horizons on a number of performance measures, including calibration and sharpness. The test helps to identify the good forecasters ex ante, and therefore contributes to the accuracy of the combined forecasts. PMID:25530646
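The Kuiper Skill Score used to screen for "value" is the hit rate minus the false-alarm rate once probabilities are thresholded into binary forecasts (the test proposed in the paper avoids this conversion; the sketch below shows only the classical score, on made-up data):

```python
# Kuiper Skill Score (hit rate minus false-alarm rate) for binary events,
# computed from illustrative probability forecasts of a GDP decline
forecast = [0.8, 0.2, 0.7, 0.1, 0.6, 0.3]   # P(decline), hypothetical
outcome  = [1,   0,   1,   0,   0,   0]     # 1 = decline occurred
yes = [f >= 0.5 for f in forecast]          # thresholded binary forecast

hits = sum(y and o for y, o in zip(yes, outcome)) / sum(outcome)
false_alarms = (sum(y and not o for y, o in zip(yes, outcome))
                / (len(outcome) - sum(outcome)))
kss = hits - false_alarms
print(kss)  # 0.75; KSS > 0 means the forecasts have value
```

A forecaster with KSS <= 0 at a given horizon would be trimmed before the beta-transformed linear pool combination.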
Evaluation of wake detection probability of underwater vehicle by IR
NASA Astrophysics Data System (ADS)
Kou, Wei; Chen, Xuan; Yang, Li; Jin, Fang-yuan
2016-10-01
The thermal or cold wake of an underwater vehicle forms at the sea surface over different regions during sailing, making the vehicle easily detectable by airborne or spaceborne infrared detectors and thereby threatening its concealment. A model was established relating the detection probability to the noise equivalent temperature difference (NETD) of the detector, the temperature difference between the wake and the sea surface, and other factors. It enables evaluation of the detection probability at different discrimination levels and under different parameters such as time, location, atmosphere, sea state, detector performance, and wake temperature, and a software package named Wake Detection of Underwater Vehicle by Infrared (WDPUV-IR) was developed. The results showed that the detection probability was relatively high for wakes observed with high detector performance, a large temperature difference, a short detection distance, or a low discrimination level, but that targets with small temperature differences and sizes were difficult to detect when the atmospheric transmittance was low.
Assault frequency and preformation probability of the α-emission process
Zhang, H. F.; Royer, G.; Li, J. Q.
2011-08-15
A study of the assault frequency and preformation factor in the α-decay description is performed from the experimental α-decay constants and the penetration probabilities calculated from the generalized liquid drop model (GLDM) potential barriers. To determine the assault frequency, a quantum-mechanical method using a harmonic oscillator is introduced; it leads to values of around 10²¹ s⁻¹, similar to those calculated with the classical method. The preformation probability is around 10⁻¹-10⁻². The results for even-even Po isotopes are discussed for illustration. While the assault frequency presents only a shallow minimum in the vicinity of the magic neutron number 126, the preformation factor and, mainly, the penetration probability diminish strongly around N=126.
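The extraction step rests on the decomposition of the decay constant into preformation factor, assault frequency, and penetrability: lambda = P0 * nu * P_pen. The numbers below are illustrative; only the ~10^21 s^-1 order of the assault frequency is taken from the abstract:

```python
import math

# Decomposition lambda = P0 * nu * P_pen used in alpha-decay systematics:
# decay constant = preformation factor x assault frequency x penetrability.
half_life = 1.6e-4   # s, hypothetical alpha emitter
nu = 1.0e21          # assault frequency, s^-1 (order reported in the study)
p_pen = 1.0e-16      # barrier penetration probability (illustrative)

lam = math.log(2) / half_life   # decay constant, s^-1
p0 = lam / (nu * p_pen)         # extracted preformation factor
print(f"{p0:.3f}")              # ~0.04, inside the quoted 1e-1 to 1e-2 range
```

With measured half-lives and GLDM penetrabilities, the same division isolates the preformation factor, the quantity carrying the nuclear-structure information.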
Probability theory versus simulation of petroleum potential in play analysis
Crovelli, R.A.
1987-01-01
An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
Quantum Probability Cancellation Due to a Single-Photon State
NASA Technical Reports Server (NTRS)
Ou, Z. Y.
1996-01-01
When an N-photon state enters a lossless symmetric beamsplitter through one input port, the photon distribution over the two output ports is binomial, with the highest probability at equal partition (N/2 at one output port and N/2 at the other). However, injecting a single-photon state at the other input port can dramatically change the photon distribution at the outputs, resulting in zero probability at equal partition. Such a strong deviation from classical particle theory stems from quantum probability amplitude cancellation. The effect persists even if the N-photon state is replaced by an arbitrary state of light. A special case is the coherent state, which corresponds to homodyne detection of a single-photon state and can lead to the measurement of the wave function of a single-photon state.
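The amplitude cancellation can be checked directly by expanding the beamsplitter transformation on Fock states: for inputs |N> and |1>, the output amplitude for m photons in one port is proportional to C(N, m-1) - C(N, m), which vanishes at equal partition. A short sketch (standard 50:50 beamsplitter convention assumed):

```python
from math import comb, factorial

def output_distribution(N):
    """Photon-number distribution at the two outputs of a lossless 50:50
    beamsplitter for the input |N>|1>. Creation operators transform as
    a -> (c + d)/sqrt(2), b -> (c - d)/sqrt(2)."""
    norm = factorial(N) * 2 ** (N + 1)
    probs = []
    for m in range(N + 2):  # m photons in output port c
        # coefficient of c^m d^(N+1-m) in the expansion of (c + d)^N (c - d)
        coef = (comb(N, m - 1) if m >= 1 else 0) - comb(N, m)
        probs.append(coef ** 2 * factorial(m) * factorial(N + 1 - m) / norm)
    return probs

probs = output_distribution(5)  # six photons in total
print(probs[3])                 # equal partition (3, 3): exactly 0
print(sum(probs))               # distribution is normalized
```

For N = 1 this reduces to the Hong-Ou-Mandel effect: the (1, 1) output never occurs.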
Conservative Analytical Collision Probability for Design of Orbital Formations
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2004-01-01
The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
Flux continuity and probability conservation in complexified Bohmian mechanics
Poirier, Bill
2008-02-15
Recent years have seen increased interest in complexified Bohmian mechanical trajectory calculations for quantum systems as both a pedagogical and computational tool. In the latter context, it is essential that trajectories satisfy probability conservation to ensure they are always guided to where they are most needed. We consider probability conservation for complexified Bohmian trajectories. The analysis relies on time-reversal symmetry considerations, leading to a generalized expression for the conjugation of wave functions of complexified variables. This in turn enables meaningful discussion of complexified flux continuity, which turns out not to be satisfied in general, though a related property is found to be true. The main conclusion, though, is that even under a weak interpretation, probability is not conserved along complex Bohmian trajectories.
Probability effects on stimulus evaluation and response processes
NASA Technical Reports Server (NTRS)
Gehring, W. J.; Gratton, G.; Coles, M. G.; Donchin, E.
1992-01-01
This study investigated the effects of probability information on response preparation and stimulus evaluation. Eight subjects responded with one hand to the target letter H and with the other to the target letter S. The target letter was surrounded by noise letters that were either the same as or different from the target letter. In 2 conditions, the targets were preceded by a warning stimulus unrelated to the target letter. In 2 other conditions, a warning letter predicted that the same letter or the opposite letter would appear as the imperative stimulus with .80 probability. Correct reaction times were faster and error rates were lower when imperative stimuli confirmed the predictions of the warning stimulus. Probability information affected (a) the preparation of motor responses during the foreperiod, (b) the development of expectancies for a particular target letter, and (c) a process sensitive to the identities of letter stimuli but not to their locations.
Haplotype Probabilities for Multiple-Strain Recombinant Inbred Lines
Teuscher, Friedrich; Broman, Karl W.
2007-01-01
Recombinant inbred lines (RIL) derived from multiple inbred strains can serve as a powerful resource for the genetic dissection of complex traits. The use of such multiple-strain RIL requires a detailed knowledge of the haplotype structure in such lines. Broman (2005) derived the two- and three-point haplotype probabilities for 2^n-way RIL; the former required hefty computation to infer the symbolic results, and the latter were strictly numerical. We describe a simpler approach for the calculation of these probabilities, which allowed us to derive the symbolic form of the three-point haplotype probabilities. We also extend the two-point results for the case of additional generations of intermating, including the case of 2^n-way intermated recombinant inbred populations (IRIP). PMID:17151250
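For the classical two-strain case, the two-point haplotype probabilities reduce to the Haldane-Waddington map-expansion formulas, shown below as background for the multiple-strain generalizations the paper derives (the multi-strain symbolic results themselves are considerably more involved):

```python
# Haldane-Waddington (1931) two-point formulas for two-way RIL:
# probability R that a fixed line carries a recombinant two-locus
# haplotype, given the per-meiosis recombination fraction r
def ril_recombination(r, mating="selfing"):
    if mating == "selfing":
        return 2 * r / (1 + 2 * r)
    return 4 * r / (1 + 6 * r)   # sib mating

for r in (0.01, 0.10, 0.50):
    print(r, round(ril_recombination(r, "selfing"), 4),
          round(ril_recombination(r, "sib"), 4))
```

The map expansion (R > r for small r) is the reason RIL panels give finer mapping resolution than a single meiosis.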
Scene text detection based on probability map and hierarchical model
NASA Astrophysics Data System (ADS)
Zhou, Gang; Liu, Yuehu
2012-06-01
Scene text detection is an important step for the text-based information extraction system. This problem is challenging due to the variations of size, unknown colors, and background complexity. We present a novel algorithm to robustly detect text in scene images. To segment text candidate connected components (CC) from images, a text probability map consisting of the text position and scale information is estimated by a text region detector. To filter out the non-text CCs, a hierarchical model consisting of two classifiers in cascade is utilized. The first stage of the model estimates text probabilities with unary component features. The second stage classifier is trained with both probability features and similarity features. Since the proposed method is learning-based, there are very few manual parameters required. Experimental results on the public benchmark ICDAR dataset show that our algorithm outperforms other state-of-the-art methods.
Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics
Jones, N L; Walker, J R; Carle, S F
2003-11-21
This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods, including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining-upwards sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.
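At its core, the transition probability approach models facies sequences as a Markov chain whose rows give the probability of moving from one hydrogeologic unit to the next. A one-dimensional sketch with a hypothetical three-facies matrix (the full method works with transition rates in continuous space and conditions on borehole data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical vertical transition-probability matrix for three facies
# (sand, silt, clay): row i gives P(next cell = j | current cell = i)
T = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.70, 0.20],
              [0.05, 0.25, 0.70]])

def simulate_column(T, n, start=0):
    """Draw a 1D facies column cell by cell from the Markov transition
    model, the building block of transition-probability geostatistics."""
    seq = [start]
    for _ in range(n - 1):
        seq.append(rng.choice(3, p=T[seq[-1]]))
    return np.array(seq)

col = simulate_column(T, 5000)
# Empirical transition frequencies approach the model matrix
emp = np.array([[np.mean(col[1:][col[:-1] == i] == j) for j in range(3)]
                for i in range(3)])
print(np.round(emp, 2))
```

The diagonal entries control mean unit thickness, and asymmetric off-diagonal entries encode juxtapositional tendencies such as fining-upwards sequences.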
On estimating the fracture probability of nuclear graphite components
NASA Astrophysics Data System (ADS)
Srinivasan, Makuteswara
2008-10-01
The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
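The two-parameter Weibull survival probability underlying the reliability calculation is P_s(sigma) = exp(-(sigma/sigma_0)^m). The parameter values below are illustrative, not the ASTM specification values:

```python
import math

# Two-parameter Weibull reliability model for a brittle component
sigma0 = 20.0   # MPa, characteristic strength (illustrative)
m = 10.0        # Weibull modulus (illustrative)

def survival(stress_mpa):
    """P(no fracture) at a given service tensile stress."""
    return math.exp(-((stress_mpa / sigma0) ** m))

for s in (5.0, 10.0, 15.0):
    print(f"{s:>5.1f} MPa: reliability = {survival(s):.4f}, "
          f"fracture probability = {1 - survival(s):.4f}")
```

A high Weibull modulus makes the fracture probability rise steeply near the characteristic strength, which is why modest stress margins translate into large reliability gains.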
Defining Predictive Probability Functions for Species Sampling Models.
Lee, Jaeyong; Quintana, Fernando A; Müller, Peter; Trippa, Lorenzo
2013-01-01
We review the class of species sampling models (SSM). In particular, we investigate the relation between the exchangeable partition probability function (EPPF) and the predictive probability function (PPF). It is straightforward to define a PPF from an EPPF, but the converse is not necessarily true. In this paper we introduce the notion of putative PPFs and show novel conditions for a putative PPF to define an EPPF. We show that all possible PPFs in a certain class have to define (unnormalized) probabilities for cluster membership that are linear in cluster size. We give a new necessary and sufficient condition for arbitrary putative PPFs to define an EPPF. Finally, we show posterior inference for a large class of SSMs with a PPF that is not linear in cluster size and discuss a numerical method to derive its PPF.
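The canonical example of a PPF whose unnormalized cluster-membership probabilities are linear in cluster size is the Chinese restaurant process of the Dirichlet process; a minimal sketch (concentration parameter alpha chosen for illustration):

```python
def crp_ppf(cluster_sizes, alpha):
    """Predictive probability function of the Dirichlet process (the
    Chinese restaurant process): unnormalized weight n_k for each
    existing cluster and alpha for a new cluster -- linear in cluster
    size, the form characterized in the paper."""
    n = sum(cluster_sizes)
    probs = [nk / (n + alpha) for nk in cluster_sizes]
    p_new = alpha / (n + alpha)
    return probs, p_new

# Six observations in clusters of sizes 3, 1, 2; alpha = 1:
probs, p_new = crp_ppf([3, 1, 2], alpha=1.0)
# probs are 3/7, 1/7, 2/7; a new cluster opens with probability 1/7
```

Any putative PPF that deviates from this linear form must satisfy the paper's extra conditions to define a valid EPPF.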
Experimental evidence for the reducibility of multifragment emission probabilities
Wozniak, G.J.; Tso, K.; Phair, L.
1995-01-01
Multifragmentation has been studied for {sup 36}Ar-induced reactions on a {sup 197}Au target at E/A = 80 and 110 MeV and for {sup 129}Xe-induced reactions on several targets ({sup nat}Cu, {sup 89}Y, {sup 165}Ho, {sup 197}Au) at E/A = 40, 50 and 60 MeV. The probability of emitting n intermediate-mass fragments is shown to be binomial at each transverse energy and reducible to an elementary binary probability p. For each target and at each bombarding energy, this probability p shows a thermal nature by giving linear Arrhenius plots. For the {sup 129}Xe-induced reactions, a nearly universal linear Arrhenius plot is observed at each bombarding energy, indicating a large degree of target independence.
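The binomial reducibility claimed here is just the statement that the n-fragment probability factorizes through a single elementary probability p; a sketch with illustrative values of m (number of tries) and p, not values fitted to the data:

```python
from math import comb

def emission_probability(n, m, p):
    """Binomial probability of emitting n intermediate-mass fragments
    in m independent tries, reducible to the elementary binary
    probability p extracted at each transverse-energy bin."""
    return comb(m, n) * p ** n * (1 - p) ** (m - n)

# Illustrative distribution over fragment multiplicity for m = 10, p = 0.15:
dist = [emission_probability(n, 10, 0.15) for n in range(11)]
```

The thermal signature in the paper comes from plotting the extracted p against transverse energy (the linear Arrhenius plots), which this sketch does not attempt to reproduce.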
Some properties of probability inversion algorithms to elicit expert opinion.
NASA Astrophysics Data System (ADS)
Lark, Murray
2015-04-01
Probability inversion methods have been developed to infer underlying expert utility functions from rankings that experts offer of subsets of scenarios. The method assumes that the expert ranking reflects an underlying utility, which can be modelled as a function of predictive covariates. This is potentially useful as a method for the extraction of expert opinions for prediction in new scenarios. Two particular algorithms are considered here, the IPF algorithm and the PURE algorithm. The former always converges for consistent sets of rankings and finds a solution which minimizes the mutual information of the estimated utilities and an initial random sample of proposed utilities drawn in the algorithm. In this poster I report some empirical studies on the probability inversion procedure, investigating the effects of the size of the expert panel, the consistency and quality of the expert panel and the validity of the predictive covariates. These results have practical implications for the design of elicitation by probability inversion methods.
How to simulate a universal quantum computer using negative probabilities
NASA Astrophysics Data System (ADS)
Hofmann, Holger F.
2009-07-01
The concept of negative probabilities can be used to decompose the interaction of two qubits mediated by a quantum controlled-NOT into three operations that require only classical interactions (that is, local operations and classical communication) between the qubits. For a single gate, the probabilities of the three operations are 1, 1 and -1. This decomposition can be applied in a probabilistic simulation of quantum computation by randomly choosing one of the three operations for each gate and assigning a negative statistical weight to the outcomes of sequences with an odd number of negative probability operations. The maximal exponential speed-up of a quantum computer can then be evaluated in terms of the increase in the number of sequences needed to simulate a single operation of the quantum circuit.
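The sampling scheme described (random operation choice, negative statistical weight for odd numbers of negative-probability operations) is an instance of signed-weight Monte Carlo. A generic sketch, with toy outcome values standing in for the actual classical operations, which are not specified here:

```python
import random

def signed_sample_estimate(op_values, num_gates, trials=100_000, seed=1):
    """Quasiprobability sampling sketch: each gate is replaced by one of
    three operations with quasi-weights (1, 1, -1). Operations are
    sampled uniformly, the product of signs is carried along, and the
    result is rescaled by the total absolute weight 3**num_gates.
    op_values maps each operation index to a toy classical outcome
    value (illustrative, not the actual LOCC maps of the paper)."""
    rng = random.Random(seed)
    norm = 3 ** num_gates
    total = 0.0
    for _ in range(trials):
        sign, value = 1.0, 1.0
        for _ in range(num_gates):
            op = rng.randrange(3)
            sign *= -1.0 if op == 2 else 1.0
            value *= op_values[op]
        total += sign * value
    return norm * total / trials
```

The rescaling factor 3**num_gates is exactly the exponential growth in sampling cost that bounds the simulated speed-up.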
Gravity and count probabilities in an expanding universe
NASA Technical Reports Server (NTRS)
Bouchet, Francois R.; Hernquist, Lars
1992-01-01
The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.
Defining Baconian Probability for Use in Assurance Argumentation
NASA Technical Reports Server (NTRS)
Graydon, Patrick J.
2016-01-01
The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated (which they call Baconian probability) and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings, which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.
How Life History Can Sway the Fixation Probability of Mutants.
Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne
2016-07-01
In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected.
Probability Learning: Changes in Behavior Across Time and Development.
Plate, Rista C; Fulvio, Jacqueline M; Shutts, Kristin; Green, C Shawn; Pollak, Seth D
2017-01-25
Individuals track probabilities, such as associations between events in their environments, but less is known about the degree to which experience (within a learning session and over development) influences people's use of incoming probabilistic information to guide behavior in real time. In two experiments, children (4-11 years) and adults searched for rewards hidden in locations with predetermined probabilities. In Experiment 1, children (n = 42) and adults (n = 32) changed strategies to maximize reward receipt over time. However, adults demonstrated greater strategy change efficiency. Making the predetermined probabilities more difficult to learn (Experiment 2) delayed effective strategy change for children (n = 39) and adults (n = 33). Taken together, these data characterize how children and adults alike react flexibly and change behavior according to incoming information.
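The strategy change these experiments measure, from probability matching toward maximizing, has simple expected-value arithmetic behind it. The hiding probabilities below are illustrative, not the experiment's values:

```python
def expected_reward(strategy, location_probs):
    """Expected per-trial reward when locations are searched with the
    given strategy probabilities and rewards hide with location_probs."""
    return sum(s * p for s, p in zip(strategy, location_probs))

hide = [0.8, 0.2]                                 # hypothetical hiding odds
matching = expected_reward(hide, hide)            # match the odds: 0.68
maximizing = expected_reward([1.0, 0.0], hide)    # always pick the best: 0.80
```

Maximizing strictly dominates matching whenever the locations are unequal, which is why efficient strategy change shows up as a shift toward the best location.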
Predicting loss exceedance probabilities for US hurricane landfalls
NASA Astrophysics Data System (ADS)
Murnane, R.
2003-04-01
The extreme winds, rains, and floods produced by landfalling hurricanes kill and injure many people and cause severe economic losses. Many business, planning, and emergency management decisions are based on the probability of hurricane landfall and associated emergency management considerations; however, the expected total economic and insured losses also are important. Insured losses generally are assumed to be half the total economic loss from hurricanes in the United States. Here I describe a simple model that can be used to estimate deterministic and probabilistic exceedance probabilities for insured losses associated with landfalling hurricanes along the US coastline. The model combines wind speed exceedance probabilities with loss records from historical hurricanes striking land. The wind speed exceedance probabilities are based on the HURDAT best track data and use the storm’s maximum sustained wind just prior to landfall. The loss records are normalized to present-day values using a risk model and account for historical changes in inflation, population, housing stock, and other factors. Analysis of the correlation between normalized losses and a number of storm-related parameters suggests that the most relevant, statistically-significant predictor for insured loss is the storm’s maximum sustained wind at landfall. Insured loss exceedance probabilities thus are estimated using a linear relationship between the log of the maximum sustained winds and normalized insured loss. Model estimates for insured losses from Hurricanes Isidore (US$45 million) and Lili (US$275 million) compare well with loss estimates from more sophisticated risk models and recorded losses. The model can also be used to estimate how exceedance probabilities for insured loss vary as a function of the North Atlantic Oscillation and the El Niño-Southern Oscillation.
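The core of such a model is a least-squares fit relating log wind speed to loss. A dependency-free sketch on hypothetical wind/loss pairs (illustrative stand-ins, not HURDAT-derived values), fitting log-loss against log-wind:

```python
import math

# Hypothetical (wind speed in kt, normalized insured loss in $bn) pairs.
data = [(75, 0.4), (90, 1.5), (105, 5.0), (120, 16.0), (135, 50.0)]

# Ordinary least squares of log10(loss) on log10(wind), done by hand
# to keep the sketch self-contained.
xs = [math.log10(w) for w, _ in data]
ys = [math.log10(l) for _, l in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

def predicted_loss(wind_kt):
    """Insured-loss point estimate from the fitted log-log line."""
    return 10 ** (intercept + slope * math.log10(wind_kt))
```

An exceedance probability then follows by composing this loss curve with the wind speed exceedance probabilities, which this sketch omits.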
Modeling the effect of reward amount on probability discounting.
Myerson, Joel; Green, Leonard; Morris, Joshua
2011-03-01
The present study with college students examined the effect of amount on the discounting of probabilistic monetary rewards. A hyperboloid function accurately described the discounting of hypothetical rewards ranging in amount from $20 to $10,000,000. The degree of discounting increased continuously with amount of probabilistic reward. This effect of amount was not due to changes in the rate parameter of the discounting function, but rather was due to increases in the exponent. These results stand in contrast to those observed with the discounting of delayed monetary rewards, in which the degree of discounting decreases with reward amount due to amount-dependent decreases in the rate parameter. Taken together, this pattern of results suggests that delay and probability discounting reflect different underlying mechanisms. That is, the fact that the exponent in the delay discounting function is independent of amount is consistent with a psychophysical scaling interpretation, whereas the finding that the exponent of the probability-discounting function is amount-dependent is inconsistent with such an interpretation. Instead, the present results are consistent with the idea that the probability-discounting function is itself the product of a value function and a weighting function. This idea was first suggested by Kahneman and Tversky (1979), although their prospect theory does not predict amount effects like those observed. The effect of amount on probability discounting was parsimoniously incorporated into our hyperboloid discounting function by assuming that the exponent was proportional to the amount raised to a power. The amount-dependent exponent of the probability-discounting function may be viewed as reflecting the effect of amount on the weighting of the probability with which the reward will be received.
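The amount-dependent exponent described here can be made concrete with a hyperboloid discounting function over odds against receipt. The functional form follows the abstract's description; the parameter values (h, c, b) are illustrative, not fitted values from the study:

```python
def discounted_value(amount, p, h=1.0, c=0.9, b=0.05):
    """Hyperboloid probability-discounting sketch:
    V = A / (1 + h*theta)**s, with theta = odds against = (1 - p)/p and
    the exponent s assumed proportional to a power of amount
    (s = c * amount**b), as the abstract proposes. h, c, b are
    illustrative parameters, not values estimated in the paper."""
    theta = (1.0 - p) / p
    s = c * amount ** b
    return amount / (1.0 + h * theta) ** s

# Larger probabilistic amounts are discounted more steeply, so their
# relative (proportional) value is lower:
rel_small = discounted_value(20, 0.5) / 20
rel_large = discounted_value(10_000_000, 0.5) / 10_000_000
```

With an amount-dependent exponent, the relative value of the large amount falls below that of the small one at the same probability, reproducing the direction of the amount effect reported.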
Using Correlation to Compute Better Probability Estimates in Plan Graphs
NASA Technical Reports Server (NTRS)
Bryce, Daniel; Smith, David E.
2006-01-01
Plan graphs are commonly used in planning to help compute heuristic "distance" estimates between states and goals. A few authors have also attempted to use plan graphs in probabilistic planning to compute estimates of the probability that propositions can be achieved and actions can be performed. This is done by propagating probability information forward through the plan graph from the initial conditions through each possible action to the action effects, and hence to the propositions at the next layer of the plan graph. The problem with these calculations is that they make very strong independence assumptions - in particular, they usually assume that the preconditions for each action are independent of each other. This can lead to gross overestimates in probability when the plans for those preconditions interfere with each other. It can also lead to gross underestimates of probability when there is synergy between the plans for two or more preconditions. In this paper we introduce a notion of the binary correlation between two propositions and actions within a plan graph, show how to propagate this information within a plan graph, and show how this improves probability estimates for planning. This notion of correlation can be thought of as a continuous generalization of the notion of mutual exclusion (mutex) often used in plan graphs. At one extreme (correlation = 0) two propositions or actions are completely mutex. With correlation = 1, two propositions or actions are independent, and with correlation > 1, two propositions or actions are synergistic. Intermediate values can and do occur, indicating different degrees to which propositions and actions interfere or are synergistic. We compare this approach with another recent approach by Bryce that computes probability estimates using Monte Carlo simulation of possible worlds in plan graphs.
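A minimal reading of the correlation definition in this abstract: the joint probability of two preconditions is their independent product scaled by the correlation factor. The numbers below are illustrative:

```python
def joint_probability(p_a, p_b, corr):
    """Probability that propositions a and b hold together under the
    plan-graph correlation factor: corr = 1 recovers independence,
    corr = 0 is mutual exclusion (mutex), corr > 1 is synergy.
    Sketch of the idea only; the paper's propagation rules through
    action layers are not reproduced here."""
    return min(1.0, p_a * p_b * corr)

independent = joint_probability(0.6, 0.5, 1.0)   # 0.30
interfering = joint_probability(0.6, 0.5, 0.5)   # 0.15: independence overestimates
synergistic = joint_probability(0.6, 0.5, 1.5)   # 0.45: independence underestimates
```

The two failure modes named in the abstract fall out directly: assuming corr = 1 overestimates when plans interfere (corr < 1) and underestimates when they are synergistic (corr > 1).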
Time Dependence of Collision Probabilities During Satellite Conjunctions
NASA Technical Reports Server (NTRS)
Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.
2017-01-01
The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P (sub c)) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D P (sub c)” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D P (sub c)” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R (sub c). For close-proximity satellites, such as those orbiting in formations or clusters, R (sub c) variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R (sub c) analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D P (sub c)” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P (sub c) less than 10 (sup -10)), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P (sub c) greater than or equal to 10 (sup -5)). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P (sub c) screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
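For contrast with the 3D method, the conventional 2D P (sub c) reduces to integrating a bivariate Gaussian over the combined hard-body circle in the encounter plane. A Monte Carlo sketch with illustrative geometry, not an operational CARA computation:

```python
import math, random

def pc_2d(miss_x, miss_y, sx, sy, hbr, samples=200_000, seed=2):
    """Monte Carlo estimate of the 2D collision probability: integrate
    an uncorrelated bivariate Gaussian (relative-position uncertainty
    with sigmas sx, sy) over the combined hard-body circle of radius
    hbr centered at the miss vector, in the encounter plane. All
    values are illustrative placeholders."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x = rng.gauss(0.0, sx)
        y = rng.gauss(0.0, sy)
        if (x - miss_x) ** 2 + (y - miss_y) ** 2 <= hbr ** 2:
            hits += 1
    return hits / samples
```

The 3D method replaces this single encounter-plane snapshot with a time integral of the probability rate R (sub c), which is what exposes the multi-peak behavior described for close-proximity satellites.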
Probability of detection of nests and implications for survey design
Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.
2009-01-01
Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
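The visit counts quoted in this abstract follow from the standard independent-visits formula for cumulative detection:

```python
def cumulative_detection(p_single, visits):
    """Probability a persistent nest is found at least once in `visits`
    independent searches with single-visit detection probability p."""
    return 1.0 - (1.0 - p_single) ** visits

# Figures from the abstract: the mean species (p = 0.46) exceeds 0.85
# by the fourth visit; the Western Sandpiper (p = 0.64) does so in three.
print(round(cumulative_detection(0.46, 4), 3))  # 0.915
print(round(cumulative_detection(0.64, 3), 3))  # 0.953
```

For the White-rumped Sandpiper (p = 0.21) the same formula needs eight or nine visits to pass 0.85, consistent with the six-to-nine range reported once survey type is accounted for.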
The probability of preservation of a newly arisen gene duplicate.
Lynch, M; O'Hely, M; Walsh, B; Force, A
2001-01-01
Newly emerging data from genome sequencing projects suggest that gene duplication, often accompanied by genetic map changes, is a common and ongoing feature of all genomes. This raises the possibility that differential expansion/contraction of various genomic sequences may be just as important a mechanism of phenotypic evolution as changes at the nucleotide level. However, the population-genetic mechanisms responsible for the success vs. failure of newly arisen gene duplicates are poorly understood. We examine the influence of various aspects of gene structure, mutation rates, degree of linkage, and population size (N) on the joint fate of a newly arisen duplicate gene and its ancestral locus. Unless there is active selection against duplicate genes, the probability of permanent establishment of such genes is usually no less than 1/(4N) (half of the neutral expectation), and it can be orders of magnitude greater if neofunctionalizing mutations are common. The probability of a map change (reassignment of a key function of an ancestral locus to a new chromosomal location) induced by a newly arisen duplicate is also generally >1/(4N) for unlinked duplicates, suggesting that recurrent gene duplication and alternative silencing may be a common mechanism for generating microchromosomal rearrangements responsible for postreproductive isolating barriers among species. Relative to subfunctionalization, neofunctionalization is expected to become a progressively more important mechanism of duplicate-gene preservation in populations with increasing size. However, even in large populations, the probability of neofunctionalization scales only with the square of the selective advantage. Tight linkage also influences the probability of duplicate-gene preservation, increasing the probability of subfunctionalization but decreasing the probability of neofunctionalization. PMID:11779815
Effects of grapheme-to-phoneme probability on writing durations.
Afonso, Olivia; Álvarez, Carlos J; Kandel, Sonia
2015-05-01
The relative involvement of the lexical and sublexical routes across different writing tasks remains a controversial topic in the field of handwriting production research. The present article reports two experiments examining whether or not the probability of a grapheme-to-phoneme (G-P) mapping affected production during copy of polyvalent graphemes embedded in French (Exps. 1a and 1b) and Spanish (Exp. 2) known words. The relative probabilities of two different G-P mappings associated with the same polyvalent grapheme were manipulated (higher vs. lower probability). In Experiment 1a, we used the polyvalent French grapheme E. Writing durations revealed that the interletter intervals (ILIs) located before and after this letter were shorter and that the letter itself was executed faster in the condition of higher probability of the G-P mapping (e.g., S E RVICE, "service") than in the lower-probability condition (e.g., S E MAINE, "week"). In Experiment 1b, we used the sequence TI (e.g., VIC TI ME-MAR TI EN, "victim-Martian"), which is less frequent. In this case, we failed to observe significant differences between the conditions. In Experiment 2, effects similar to those obtained in Experiment 1a were found with Spanish words using different pronunciations of the letter C (e.g., DES C ANSO-DES C ENSO, "rest-descent"). Altogether, these results reveal that the link between a grapheme and a phoneme is weighted according to its probability in the language. Moreover, they suggest that a two-phase route linking graphemes to phonemes and phonemes to graphemes is functional during copy.
Does probability of occurrence relate to population dynamics?
Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.
2014-01-01
Hutchinson defined species’ realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species’ niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species’ competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence probability
Ask Marilyn in the mathematics classroom: probability questions
NASA Astrophysics Data System (ADS)
Vasko, Francis J.
2012-06-01
Since 1986, Marilyn Vos Savant, who is listed in the Guinness Book of World Records Hall of Fame for the highest IQ, has had a weekly column that is published in Parade Magazine. In this column, she answers readers' questions on a wide variety of subjects including mathematics and particularly probability. Many of the mathematically oriented questions are directly relevant to high school and undergraduate college level mathematics courses. For nearly 20 years, I have incorporated many of these questions into a variety of mathematics courses that I teach. In this note, I will discuss some of the questions that I use dealing with probability.
Transition Probabilities for Spectral Lines in Co I
NASA Astrophysics Data System (ADS)
Nitz, D. E.; Wilson, K. L.; Lentz, L. R.
1996-05-01
We are in the process of determining transition probabilities for visible and uv lines in Co I from Fourier transform spectra recorded at Kitt Peak and made available to us by Prof. W. Whaling. Normalization of relative transition probabilities obtained from these spectra is achieved using recently-measured Co I lifetimes.(D. E. Nitz, S. D. Bergeson, and J. E. Lawler, J. Opt. Soc. Am. B 12, 377 (1995).) To date we have obtained preliminary results for 240 lines having branching fractions > 1
Transmission Probability for Charged Dilatonic Black Holes in Various Dimensions
NASA Astrophysics Data System (ADS)
Ngampitipan, Tritos; Boonserm, Petarpa
A dilaton is a theoretical particle that arises when the Planck mass is promoted to a dynamical field. In this paper, rigorous bounds on the transmission probabilities for charged black holes coupled to a dilaton field in various dimensions are calculated. The results show that in the absence of the cosmological constant, the black holes in (2 + 1) dimensions have only one event horizon. Moreover, the charges of the black holes can increase the transmission probabilities. However, for the black holes in (3 + 1) dimensions, the charges of the black holes can filter Hawking radiation.
Atomic transition probabilities of Ce I from Fourier transform spectra
NASA Astrophysics Data System (ADS)
Nitz, D. E.; Lawler, J. E.; Chisholm, J.; Wood, M. P.; Sobeck, J.; den Hartog, E. A.
2010-03-01
We report transition probabilities for 2874 lines of Ce I in the wavelength range 360-1500 nm. These are derived from new branching fraction measurements on Fourier transform spectra normalized with recently-reported radiative lifetimes (Den Hartog et al., J. Phys. B 42, 085006 (2009)). We have analyzed the decay branches for 153 upper levels in 14 different spectra recorded under a variety of discharge lamp conditions. Comparison of results with previous less extensive investigations shows good agreement for lines studied in common. Accurate Ce I transition probabilities are needed for applications in astrophysics and in lighting research, particularly for the development of improved metal halide high-intensity discharge lamps.
Atomic transition probabilities of Ce I from Fourier transform spectra
NASA Astrophysics Data System (ADS)
Lawler, J. E.; Chisholm, J.; Nitz, D. E.; Wood, M. P.; Sobeck, J.; Den Hartog, E. A.
2010-04-01
Atomic transition probabilities for 2874 lines of the first spectrum of cerium (Ce I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2009 J. Phys. B: At. Mol. Opt. Phys. 42 085006). The wavelength range of the data set is from 360 to 1500 nm. Comparisons are made to previous investigations which are less extensive. Accurate Ce I transition probabilities are needed for lighting research and development on metal halide high-intensity discharge lamps.
A quantum probability explanation for violations of 'rational' decision theory.
Pothos, Emmanuel M; Busemeyer, Jerome R
2009-06-22
Two experimental tasks in psychology, the two-stage gambling game and the Prisoner's Dilemma game, show that people violate the sure thing principle of decision theory. These paradoxical findings have resisted explanation by classical decision theory for over a decade. A quantum probability model, based on a Hilbert space representation and Schrödinger's equation, provides a simple and elegant explanation for this behaviour. The quantum model is compared with an equivalent Markov model and it is shown that the latter is unable to account for violations of the sure thing principle. Accordingly, it is argued that quantum probability provides a better framework for modelling human decision-making.
A probability generating function method for stochastic reaction networks
NASA Astrophysics Data System (ADS)
Kim, Pilwon; Lee, Chang Hyeong
2012-06-01
In this paper we present a probability generating function (PGF) approach for analyzing stochastic reaction networks. The master equation of the network can be converted to a partial differential equation for PGF. Using power series expansion of PGF and Padé approximation, we develop numerical schemes for finding probability distributions as well as first and second moments. We show numerical accuracy of the method by simulating chemical reaction examples such as a binding-unbinding reaction, an enzyme-substrate model, Goldbeter-Koshland ultrasensitive switch model, and G2/M transition model.
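The PGF machinery the paper builds on can be illustrated with the simplest case: once a PGF is in hand, the mean and variance follow from its derivatives at z = 1. The Poisson PGF below is a stand-in steady state of a simple birth-death network, not one of the paper's models:

```python
import math

def pgf_poisson(z, lam):
    """PGF of a Poisson(lam) count: G(z) = E[z^N] = exp(lam*(z - 1)),
    the stationary solution of a constant-birth/linear-death master
    equation (used here purely as an illustration)."""
    return math.exp(lam * (z - 1.0))

def mean_and_variance_from_pgf(G, h=1e-4):
    """First two moments from central differences of the PGF at z = 1:
    mean = G'(1), var = G''(1) + G'(1) - G'(1)**2."""
    g1 = (G(1 + h) - G(1 - h)) / (2 * h)
    g2 = (G(1 + h) - 2 * G(1.0) + G(1 - h)) / h ** 2
    return g1, g2 + g1 - g1 ** 2

mean, var = mean_and_variance_from_pgf(lambda z: pgf_poisson(z, 3.0))
```

For networks without a closed-form PGF, the paper's power series and Padé approximations play the role that the explicit exponential plays here.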
LFSPMC: Linear feature selection program using the probability of misclassification
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Marion, B. P.
1975-01-01
The computational procedure and associated computer program for a linear feature selection technique are presented. The technique assumes that: a finite number, m, of classes exists; each class is described by an n-dimensional multivariate normal density function of its measurement vectors; the mean vector and covariance matrix for each density function are known (or can be estimated); and the a priori probability for each class is known. The technique produces a single linear combination of the original measurements which minimizes the one-dimensional probability of misclassification defined by the transformed densities.
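The criterion the abstract describes can be sketched numerically: projecting each class onto a direction b turns it into a 1-D normal N(bᵀμᵢ, bᵀΣᵢb), and the probability of misclassification is the integral of the pointwise minimum of the prior-weighted transformed densities; a grid search over directions then stands in for the program's optimization. Hypothetical two-class data, not the LFSPMC code:

```python
import numpy as np

def misclass_1d(b, mus, covs, priors):
    # Project each class onto direction b; the transformed densities are
    # 1-D normals. The Bayes misclassification probability is the integral
    # of the pointwise minimum of the prior-weighted densities.
    b = b / np.linalg.norm(b)
    m = [float(b @ mu) for mu in mus]
    s = [float(np.sqrt(b @ C @ b)) for C in covs]
    x = np.linspace(min(m) - 6 * max(s), max(m) + 6 * max(s), 4001)
    dens = [p / (si * np.sqrt(2 * np.pi)) * np.exp(-(x - mi) ** 2 / (2 * si ** 2))
            for p, mi, si in zip(priors, m, s)]
    dx = x[1] - x[0]
    return float(np.minimum(dens[0], dens[1]).sum() * dx)

# Two hypothetical classes; the best direction lies along the mean difference.
mus = [np.array([0.0, 0.0]), np.array([3.0, 0.0])]
covs = [np.eye(2), np.eye(2)]
priors = [0.5, 0.5]
err, theta = min(
    (misclass_1d(np.array([np.cos(t), np.sin(t)]), mus, covs, priors), t)
    for t in np.linspace(0.0, np.pi, 181))
# for these classes err approaches Phi(-1.5), about 0.0668
```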
Gendist: An R Package for Generated Probability Distribution Models
Abu Bakar, Shaiful Anuar; Nadarajah, Saralees; ABSL Kamarul Adzhar, Zahrul Azmir; Mohamed, Ibrahim
2016-01-01
In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.
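The gendist function signatures are not reproduced here; as an illustration of one of the generated families it covers, the "folded model", here is the general construction (the density of |X| for an arbitrary base density) sketched in Python rather than via the package's own R API:

```python
import math

def folded_pdf(x, base_pdf):
    # Density of |X| when X has density base_pdf: the negative half-line
    # is folded onto the positive one. This mirrors the "folded model"
    # family named in the abstract; gendist's actual interface may differ.
    if x < 0:
        return 0.0
    return base_pdf(x) + base_pdf(-x)

def std_normal(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Folding a standard normal gives the half-normal distribution,
# whose density at 0 is twice the standard normal density at 0.
```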
Evaluation of DNA match probability in criminal case.
Lee, J W; Lee, H S; Park, M; Hwang, J J
2001-02-15
The new emphasis on quantification of evidence has led to perplexing courtroom decisions, and it has been difficult for forensic scientists to pursue logical arguments. In particular, when evaluating DNA evidence, both the genetic relationship between the two compared persons and the examined locus system should be considered, yet this point has drawn little attention. In this paper, we suggest calculating the match probability using the coancestry coefficient when a family relationship must be considered, and we compare the performance of the resulting identification values, which depend on how the match probability is calculated, under various situations.
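The abstract does not give the formulas; the standard θ-corrected (coancestry-adjusted) single-locus match probabilities of Balding and Nichols, as recommended in NRC II, would look like the sketch below, with θ = 0 recovering the plain Hardy-Weinberg products:

```python
def match_prob_hom(p, theta):
    # P(second person is AA | first person is AA) with coancestry
    # coefficient theta (Balding-Nichols / NRC II form);
    # theta = 0 reduces to p**2
    return ((2 * theta + (1 - theta) * p) * (3 * theta + (1 - theta) * p)
            / ((1 + theta) * (1 + 2 * theta)))

def match_prob_het(p, q, theta):
    # heterozygote AB analogue; theta = 0 reduces to 2*p*q
    return (2 * (theta + (1 - theta) * p) * (theta + (1 - theta) * q)
            / ((1 + theta) * (1 + 2 * theta)))
```

With θ > 0 the homozygote match probability rises above p², making the reported evidence less extreme, which is the conservative direction in court.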
Gap probability - Measurements and models of a pecan orchard
NASA Technical Reports Server (NTRS)
Strahler, Alan H.; Li, Xiaowen; Moody, Aaron; Liu, YI
1992-01-01
Measurements and models are compared for gap probability in a pecan orchard. Measurements are based on panoramic photographs of 50° by 135° view angle made under the canopy looking upwards at regular positions along transects between orchard trees. The gap probability model is driven by geometric parameters at two levels-crown and leaf. Crown level parameters include the shape of the crown envelope and spacing of crowns; leaf level parameters include leaf size and shape, leaf area index, and leaf angle, all as functions of canopy position.
High Probabilities of Planet Detection during Microlensing Events.
NASA Astrophysics Data System (ADS)
Peale, S. J.
2000-10-01
The averaged probability of detecting a planetary companion of a lensing star during a gravitational microlensing event toward the Galactic center when the planet-lens mass ratio is 0.001 is shown to have a maximum exceeding 20% for a distribution of source-lens impact parameters that is determined by the efficiency of event detection, and a maximum exceeding 10% for a uniform distribution of impact parameters. The probability varies as the square root of the planet-lens mass ratio. A planet is assumed detectable if the perturbation of the light curve exceeds 2/(S/N) for a significant number of data points, where S/N is the signal-to-noise ratio for the photometry of the source. The probability peaks at a planetary semimajor axis a that is close to the mean Einstein ring radius of the lenses of about 2 AU along the line of sight, and remains significant for 0.6 ≤ a ≤ 10 AU. The low value of the mean Einstein ring radius results from the dominance of M stars in the mass function of the lenses. The probability is averaged over the distribution of the projected position of the planet onto the lens plane, over the lens mass function, over the distribution of impact parameters, over the distribution of lenses along the line of sight to the source star, over the I band luminosity function of the sources adjusted for the source distance, and over the source distribution along the line of sight. If two or more parameters of the lensing event are known, such as the I magnitude of the source and the impact parameter, the averages over these parameters can be omitted and the probability of detection determined for a particular event. The calculated probabilities behave as expected with variations in the line of sight, the mass function of the lenses, the extinction and distance to and magnitude of the source, and with a more demanding detection criterion. The relatively high values of the probabilities are robust to plausible variations in the assumptions.
Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2010-01-01
When facing a conjunction between space objects, decision makers must choose whether or not to maneuver for collision avoidance. We apply a well-known decision procedure, the sequential probability ratio test, to this problem. We propose two approaches to the problem solution, one based on a frequentist method, and the other on a Bayesian method. The frequentist method does not require any prior knowledge concerning the conjunction, while the Bayesian method assumes knowledge of prior probability densities. Our results show that both methods achieve desired missed detection rates, but the frequentist method's false alarm performance is inferior to the Bayesian method's.
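The report's frequentist and Bayesian variants are not reproduced here, but the underlying Wald test is simple: accumulate the log-likelihood ratio of the observations and stop as soon as it crosses thresholds set by the desired false-alarm rate α and missed-detection rate β. A sketch for a Gaussian mean with hypothetical numbers, not the conjunction-assessment statistic itself:

```python
import math

def sprt(xs, mu0, mu1, alpha=0.05, beta=0.05):
    # Wald's sequential probability ratio test for the mean of
    # unit-variance Gaussian observations: H0: mu = mu0 vs H1: mu = mu1.
    upper = math.log((1 - beta) / alpha)   # cross above -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross below -> accept H0
    llr = 0.0
    for n, x in enumerate(xs, 1):
        # log-likelihood-ratio increment for a unit-variance Gaussian
        llr += (mu1 - mu0) * x - (mu1 ** 2 - mu0 ** 2) / 2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(xs)

decision, n = sprt([1.0] * 20, mu0=0.0, mu1=1.0)  # -> ("H1", 6)
```

Each strongly H1-favoring observation adds 0.5 to the log-likelihood ratio, so the upper threshold ln(19) ≈ 2.94 is first crossed at the sixth sample.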
An application of recurrent nets to phone probability estimation.
Robinson, A J
1994-01-01
This paper presents an application of recurrent networks for phone probability estimation in large vocabulary speech recognition. The need for efficient exploitation of context information is discussed; a role for which the recurrent net appears suitable. An overview of early developments of recurrent nets for phone recognition is given along with the more recent improvements that include their integration with Markov models. Recognition results are presented for the DARPA TIMIT and Resource Management tasks, and it is concluded that recurrent nets are competitive with traditional means for performing phone probability estimation.
Probability of detection of defects in coatings with electronic shearography
NASA Technical Reports Server (NTRS)
Maddux, Gary A.; Horton, Charles M.; Lansing, Matthew D.; Gnacek, William J.; Newton, Patrick L.
1994-01-01
The goal of this research was to utilize statistical methods to evaluate the probability of detection (POD) of defects in coatings using electronic shearography. The coating system used in the POD studies was to be the paint system currently applied to the external casings of the NASA Space Transportation System (STS) Revised Solid Rocket Motor (RSRM) boosters. The population of samples was to be large enough to determine the minimum defect size detectable with 90 percent probability of detection at 95 percent confidence (90/95 POD) on these coatings. Also, the best methods to excite coatings on aerospace components to induce deformations for measurement by electronic shearography were to be determined.
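The "90 percent POD at 95 percent confidence" criterion has a classic binomial reading: n detections in n trials demonstrates POD ≥ 0.90 at 95 percent confidence once 0.9ⁿ falls below 0.05, which first happens at n = 29 (the "29 out of 29" rule). A quick check, offered as background rather than as the study's actual statistical method:

```python
def min_successes_for_pod(pod=0.90, conf=0.95):
    # Smallest n such that n detections in n trials rules out a true POD
    # below `pod` at confidence `conf`: we need pod**n <= 1 - conf.
    n = 1
    while pod ** n > 1 - conf:
        n += 1
    return n

min_successes_for_pod()  # -> 29, the classic 29/29 demonstration
```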
Confidence as Bayesian Probability: From Neural Origins to Behavior.
Meyniel, Florent; Sigman, Mariano; Mainen, Zachary F
2015-10-07
Research on confidence spreads across several sub-fields of psychology and neuroscience. Here, we explore how a definition of confidence as Bayesian probability can unify these viewpoints. This computational view entails that there are distinct forms in which confidence is represented and used in the brain, including distributional confidence, pertaining to neural representations of probability distributions, and summary confidence, pertaining to scalar summaries of those distributions. Summary confidence is, normatively, derived or "read out" from distributional confidence. Neural implementations of readout will trade off optimality versus flexibility of routing across brain systems, allowing confidence to serve diverse cognitive functions.
Ask Marilyn in the Mathematics Classroom: Probability Questions
ERIC Educational Resources Information Center
Vasko, Francis J.
2012-01-01
Since 1986, Marilyn Vos Savant, who is listed in the "Guinness Book of World Records Hall of Fame" for the highest IQ, has had a weekly column that is published in "Parade Magazine." In this column, she answers readers' questions on a wide variety of subjects including mathematics and particularly probability. Many of the mathematically oriented…
Methods for estimating drought streamflow probabilities for Virginia streams
Austin, Samuel H.
2014-01-01
Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million streamflow daily values collected over the period of record (January 1, 1900 through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded the 46,704 equations with statistically significant fit statistics and parameter ranges published in two tables in this report. These model equations produce summer month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.
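Each of the report's fitted equations has the familiar logistic form: the probability that summer streamflow falls below a drought threshold is 1/(1 + e^-(b0 + b1·Q_winter)). A sketch with made-up coefficients; the real b0 and b1 are basin- and threshold-specific and are tabulated in the report:

```python
import math

def drought_prob(winter_flow, b0=2.0, b1=-0.8):
    # Hypothetical coefficients for illustration only: the negative b1
    # encodes that higher winter streamflow lowers the odds of the
    # summer flow dropping below the drought threshold.
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * winter_flow)))
```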
Differentiating Phonotactic Probability and Neighborhood Density in Adult Word Learning
ERIC Educational Resources Information Center
Storkel, Holly L.; Armbruster, Jonna; Hogan, Tiffany P.
2006-01-01
Purpose: The purpose of this study was to differentiate effects of phonotactic probability, the likelihood of occurrence of a sound sequence, and neighborhood density, the number of words that sound similar to a given word, on adult word learning. A second purpose was to determine what aspect of word learning (viz., triggering learning, formation…
A proposed physical analog for a quantum probability amplitude
NASA Astrophysics Data System (ADS)
Boyd, Jeffrey
What is the physical analog of a probability amplitude? All quantum mathematics, including quantum information, is built on amplitudes. Every other science uses probabilities; QM alone uses their square root. Why? This question has been asked for a century, but no one previously has proposed an answer. We will present cylindrical helices moving toward a particle source, which particles follow backwards. Consider Feynman's book QED. He speaks of amplitudes moving through space like the hand of a spinning clock. His hand is a complex vector. It traces a cylindrical helix in Cartesian space. The Theory of Elementary Waves changes direction so Feynman's clock faces move toward the particle source. Particles follow amplitudes (quantum waves) backwards. This contradicts wave-particle duality. We will present empirical evidence that wave-particle duality is wrong about the direction of particles versus waves. This involves a paradigm shift, and paradigm shifts are always controversial. We believe that our model is the only proposal ever made for the physical foundations of probability amplitudes. We will show that our "probability amplitudes" in physical nature form a Hilbert vector space with adjoints and an inner product, and support both linear algebra and Dirac notation.
Delay, Probability, and Social Discounting in a Public Goods Game
ERIC Educational Resources Information Center
Jones, Bryan A.; Rachlin, Howard
2009-01-01
A human social discount function measures the value to a person of a reward to another person at a given social distance. Just as delay discounting is a hyperbolic function of delay, and probability discounting is a hyperbolic function of odds-against, social discounting is a hyperbolic function of social distance. Experiment 1 obtained individual…
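The hyperbolic form mentioned in the abstract is the same for all three kinds of discounting: v = V/(1 + kX), where X is the delay, odds-against, or social distance and k is a fitted discount rate. As a sketch:

```python
def discounted_value(V, k, X):
    # Hyperbolic discounting: V is the undiscounted reward, X is the
    # delay, odds-against, or social distance, and k is the
    # (empirically fitted) discount rate.
    return V / (1.0 + k * X)
```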
View south of hydraulic hammer in boilermakers shop (probably the ...
View south of hydraulic hammer in boilermakers shop (probably the oldest piece of equipment in the yard, originally powered by steam) nameplate: United Engineers and FDRY. Co. Pittsburgh, Pa, USA Davy Brothers LTD. Patents - Aug 1, 1905, Feb, 1901, Sept 8, 1908 - 10000 lbs. - Naval Base Philadelphia-Philadelphia Naval Shipyard, Structure Shop, League Island, Philadelphia, Philadelphia County, PA
Support Theory: A Nonextensional Representation of Subjective Probability.
ERIC Educational Resources Information Center
Tversky, Amos; Koehler, Derek J.
1994-01-01
A new theory of subjective probability is presented. According to this theory, different descriptions of the same event can give rise to different judgments. Experimental evidence supporting this theory is summarized, demonstrating that the theory provides a unified treatment of a wide range of empirical findings. (SLD)
The Notions of Chance and Probabilities in Preschoolers
ERIC Educational Resources Information Center
Nikiforidou, Zoi; Pange, Jenny
2010-01-01
Chance, randomness and probability constitute statistical notions that are interrelated and characterize the logicomathematical thinking of children. Traditional theories support that probabilistic thinking evolves after the age of 7. However, recent research has underlined that children, as young as 4, may possess and develop basic notions,…
The Use of Monte Carlo Techniques to Teach Probability.
ERIC Educational Resources Information Center
Newell, G. J.; MacFarlane, J. D.
1985-01-01
Presents sports-oriented examples (cricket and football) in which Monte Carlo methods are used on microcomputers to teach probability concepts. Both examples include computer programs (with listings) which utilize the microcomputer's random number generator. Instructional strategies, with further challenges to help students understand the role of…
Sets, Probability and Statistics: The Mathematics of Life Insurance.
ERIC Educational Resources Information Center
Clifford, Paul C.; And Others
The practical use of such concepts as sets, probability and statistics are considered by many to be vital and necessary to our everyday life. This student manual is intended to familiarize students with these concepts and to provide practice using real life examples. It also attempts to illustrate how the insurance industry uses such mathematic…
Delay and Probability Discounting in Humans: An Overview
ERIC Educational Resources Information Center
McKerchar, Todd L.; Renda, C. Renee
2012-01-01
The purpose of this review is to introduce the reader to the concepts of delay and probability discounting as well as the major empirical findings to emerge from research with humans on these concepts. First, we review a seminal discounting study by Rachlin, Raineri, and Cross (1991) as well as an influential extension of this study by Madden,…
Flipping Out: Calculating Probability with a Coin Game
ERIC Educational Resources Information Center
Degner, Kate
2015-01-01
In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…
Facilitating normative judgments of conditional probability: frequency or nested sets?
Yamagishi, Kimihiko
2003-01-01
Recent probability judgment research contrasts two opposing views. Some theorists have emphasized the role of frequency representations in facilitating probabilistic correctness; opponents have noted that visualizing the probabilistic structure of the task sufficiently facilitates normative reasoning. In the current experiment, the following conditional probability task, an isomorph of the "Problem of Three Prisoners," was tested. "A factory manufactures artificial gemstones. Each gemstone has a 1/3 chance of being blurred, a 1/3 chance of being cracked, and a 1/3 chance of being clear. An inspection machine removes all cracked gemstones, and retains all clear gemstones. However, the machine removes 1/2 of the blurred gemstones. What is the chance that a gemstone is blurred after the inspection?" A 2 x 2 design was administered. The first variable was the use of frequency instruction. The second manipulation was the use of a roulette-wheel diagram that illustrated a "nested-sets" relationship between the prior and the posterior probabilities. Results from two experiments showed that frequency alone had modest effects, while the nested-sets instruction achieved a superior facilitation of normative reasoning. The third experiment compared the roulette-wheel diagram to tree diagrams that also showed the nested-sets relationship. The roulette-wheel diagram outperformed the tree diagrams in facilitation of probabilistic reasoning. Implications for understanding the nature of intuitive probability judgments are discussed.
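The gemstone task's normative answer falls out of one line of Bayes' rule: the surviving blurred mass is (1/3)(1/2) = 1/6, the total retained mass is 1/6 + 1/3 = 1/2, so the posterior probability of "blurred" is 1/3:

```python
# Priors over gemstone states and the machine's retention probabilities,
# exactly as stated in the task.
priors = {"blurred": 1/3, "cracked": 1/3, "clear": 1/3}
retain = {"blurred": 1/2, "cracked": 0.0, "clear": 1.0}

joint = {s: priors[s] * retain[s] for s in priors}   # P(state and retained)
p_retained = sum(joint.values())                     # = 1/2
posterior_blurred = joint["blurred"] / p_retained    # = 1/3
```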
Nonprobability and probability-based sampling strategies in sexual science.
Catania, Joseph A; Dolcini, M Margaret; Orellana, Roberto; Narayanan, Vasudah
2015-01-01
With few exceptions, much of sexual science builds upon data from opportunistic nonprobability samples of limited generalizability. Although probability-based studies are considered the gold standard in terms of generalizability, they are costly to apply to many of the hard-to-reach populations of interest to sexologists. The present article discusses recent conclusions by sampling experts that have relevance to sexual science that advocates for nonprobability methods. In this regard, we provide an overview of Internet sampling as a useful, cost-efficient, nonprobability sampling method of value to sex researchers conducting modeling work or clinical trials. We also argue that probability-based sampling methods may be more readily applied in sex research with hard-to-reach populations than is typically thought. In this context, we provide three case studies that utilize qualitative and quantitative techniques directed at reducing limitations in applying probability-based sampling to hard-to-reach populations: indigenous Peruvians, African American youth, and urban men who have sex with men (MSM). Recommendations are made with regard to presampling studies, adaptive and disproportionate sampling methods, and strategies that may be utilized in evaluating nonprobability and probability-based sampling methods.
Pinochle Poker: An Activity for Counting and Probability
ERIC Educational Resources Information Center
Wroughton, Jacqueline; Nolan, Joseph
2012-01-01
Understanding counting rules is challenging for students; in particular, they struggle with determining when and how to implement combinations, permutations, and the multiplication rule as tools for counting large sets and computing probability. We present an activity--using ideas from the games of poker and pinochle--designed to help students…
Approximating the Probability of Mortality Due to Protracted Radiation Exposures
2016-06-01
for prompt doses to determine the probability of mortality for the protracted exposure. MARCELL is a physiologically based, cell-kinetics model of...characteristic time constants associated with the physiological processes modeled in MARCELL. These characteristic times are associated with cell...describe experimental toxicity data when a suitable physiologically based model of response (either human or animal) is not available. Because
Generating quantum-measurement probabilities from an optimality principle
NASA Astrophysics Data System (ADS)
Suykens, Johan A. K.
2013-05-01
An alternative formulation to the (generalized) Born rule is presented. It involves estimating an unknown model from a finite set of measurement operators on the state. An optimality principle is given that relates to achieving bounded solutions by regularizing the unknown parameters in the model. The objective function maximizes a lower bound on the quadratic Renyi classical entropy. The unknowns of the model in the primal are interpreted as transition witnesses. An interpretation of the Born rule in terms of fidelity is given with respect to transition witnesses for the pure state and the case of positive operator-valued measures (POVMs). The models for generating quantum-measurement probabilities apply to orthogonal projective measurements and POVM measurements, and to isolated and open systems with Kraus maps. A straightforward and constructive method is proposed for deriving the probability rule, which is based on Lagrange duality. An analogy is made with a kernel-based method for probability mass function estimation, for which similarities and differences are discussed. These combined insights from quantum mechanics, statistical modeling, and machine learning provide an alternative way of generating quantum-measurement probabilities.
Ladar range image denoising by a nonlocal probability statistics algorithm
NASA Astrophysics Data System (ADS)
Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi
2013-01-01
Based on the characteristics of coherent ladar range images and on nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM denoises using the mean of the conditional probability distribution function (PDF), while NLPS uses the maximum of the marginal PDF. In the algorithm, similar blocks are found by block matching and form a group. Pixels in the group are analyzed statistically, and the gray value with maximum probability is used as the estimate of the current pixel. Simulated coherent ladar range images with different carrier-to-noise ratios and a real coherent ladar range image with 8 gray levels are denoised by this algorithm, and the results are compared with those of the median filter, the multitemplate order mean filter, NLM, the median nonlocal mean filter and its variant incorporating anatomical side information, and the unsupervised information-theoretic adaptive filter. Both range-anomaly noise and Gaussian noise in coherent ladar range images are effectively suppressed by NLPS.
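The paper's exact algorithm is not given in the abstract, but its stated difference from NLM (mode of the group's marginal PDF rather than a weighted mean) can be sketched for a single pixel. Simplified: hard block-matching threshold, hypothetical parameter values, not the authors' implementation:

```python
import numpy as np

def nlps_denoise_pixel(img, i, j, p=1, w=3, tau=5000.0):
    # Estimate pixel (i, j) of an integer-valued range image as the modal
    # gray value among centre pixels of blocks similar to the reference
    # block (the NLPS idea: maximum of the marginal PDF), instead of the
    # weighted mean used by NLM. Hard-threshold block matching; the
    # parameter values here are hypothetical.
    H, W = img.shape
    ref = img[i - p:i + p + 1, j - p:j + p + 1].astype(float)
    centres = []
    for a in range(max(p, i - w), min(H - p, i + w + 1)):
        for b in range(max(p, j - w), min(W - p, j + w + 1)):
            blk = img[a - p:a + p + 1, b - p:b + p + 1].astype(float)
            if np.mean((blk - ref) ** 2) <= tau:      # block matching
                centres.append(int(img[a, b]))
    vals, counts = np.unique(centres, return_counts=True)
    return int(vals[np.argmax(counts)])               # mode of the group

# A flat 9x9 range image with one range-anomaly outlier at the centre:
img = np.full((9, 9), 100, dtype=int)
img[4, 4] = 255
```

Because the mode ignores isolated extreme values in the group, the estimate at the outlier pixel reverts to the dominant gray value, which is why this style of estimator suppresses range-anomaly noise better than a mean.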