Science.gov

Sample records for probable nmp para

  1. Genome-Wide Mapping and Interrogation of the Nmp4 Antianabolic Bone Axis.

    PubMed

    Childress, Paul; Stayrook, Keith R; Alvarez, Marta B; Wang, Zhiping; Shao, Yu; Hernandez-Buquer, Selene; Mack, Justin K; Grese, Zachary R; He, Yongzheng; Horan, Daniel; Pavalko, Fredrick M; Warden, Stuart J; Robling, Alexander G; Yang, Feng-Chun; Allen, Matthew R; Krishnan, Venkatesh; Liu, Yunlong; Bidwell, Joseph P

    2015-09-01

    PTH is an osteoanabolic for treating osteoporosis but its potency wanes. Disabling the transcription factor nuclear matrix protein 4 (Nmp4) in healthy, ovary-intact mice enhances bone response to PTH and bone morphogenetic protein 2 and protects from unloading-induced osteopenia. These Nmp4−/− mice exhibit expanded bone marrow populations of osteoprogenitors and supporting CD8+ T cells. To determine whether the Nmp4−/− phenotype persists in an osteoporosis model, we compared PTH response in ovariectomized (ovx) wild-type (WT) and Nmp4−/− mice. To identify potential Nmp4 target genes, we performed bioinformatic/pathway profiling on Nmp4 chromatin immunoprecipitation sequencing (ChIP-seq) data. Mice (12 w) were ovx or sham-operated 4 weeks before the initiation of PTH therapy. Skeletal phenotype analysis included microcomputed tomography, histomorphometry, serum profiles, fluorescence-activated cell sorting, and the growth/mineralization of cultured WT and Nmp4−/− bone marrow mesenchymal stem progenitor cells (MSPCs). ChIP-seq data were derived using MC3T3-E1 preosteoblasts, murine embryonic stem cells, and 2 blood cell lines. Ovx Nmp4−/− mice exhibited an improved response to PTH coupled with elevated numbers of osteoprogenitors and CD8+ T cells, but were not protected from ovx-induced bone loss. Cultured Nmp4−/− MSPCs displayed enhanced proliferation and accelerated mineralization. ChIP-seq/gene ontology analyses identified target genes likely under Nmp4 control as enriched for negative regulators of biosynthetic processes. Interrogation of mRNA transcripts in nondifferentiating and osteogenic differentiating WT and Nmp4−/− MSPCs was performed on 90 Nmp4 target genes and differentiation markers. These data suggest that Nmp4 suppresses bone anabolism, in part, by regulating IGF-binding protein expression. Changes in Nmp4 status may lead to improvements in osteoprogenitor response to therapeutic cues.

  2. Genome-Wide Mapping and Interrogation of the Nmp4 Antianabolic Bone Axis

    PubMed Central

    Childress, Paul; Stayrook, Keith R.; Alvarez, Marta B.; Wang, Zhiping; Shao, Yu; Hernandez-Buquer, Selene; Mack, Justin K.; Grese, Zachary R.; He, Yongzheng; Horan, Daniel; Pavalko, Fredrick M.; Warden, Stuart J.; Robling, Alexander G.; Yang, Feng-Chun; Allen, Matthew R.; Krishnan, Venkatesh; Liu, Yunlong

    2015-01-01

    PTH is an osteoanabolic for treating osteoporosis but its potency wanes. Disabling the transcription factor nuclear matrix protein 4 (Nmp4) in healthy, ovary-intact mice enhances bone response to PTH and bone morphogenetic protein 2 and protects from unloading-induced osteopenia. These Nmp4−/− mice exhibit expanded bone marrow populations of osteoprogenitors and supporting CD8+ T cells. To determine whether the Nmp4−/− phenotype persists in an osteoporosis model, we compared PTH response in ovariectomized (ovx) wild-type (WT) and Nmp4−/− mice. To identify potential Nmp4 target genes, we performed bioinformatic/pathway profiling on Nmp4 chromatin immunoprecipitation sequencing (ChIP-seq) data. Mice (12 w) were ovx or sham-operated 4 weeks before the initiation of PTH therapy. Skeletal phenotype analysis included microcomputed tomography, histomorphometry, serum profiles, fluorescence-activated cell sorting, and the growth/mineralization of cultured WT and Nmp4−/− bone marrow mesenchymal stem progenitor cells (MSPCs). ChIP-seq data were derived using MC3T3-E1 preosteoblasts, murine embryonic stem cells, and 2 blood cell lines. Ovx Nmp4−/− mice exhibited an improved response to PTH coupled with elevated numbers of osteoprogenitors and CD8+ T cells, but were not protected from ovx-induced bone loss. Cultured Nmp4−/− MSPCs displayed enhanced proliferation and accelerated mineralization. ChIP-seq/gene ontology analyses identified target genes likely under Nmp4 control as enriched for negative regulators of biosynthetic processes. Interrogation of mRNA transcripts in nondifferentiating and osteogenic differentiating WT and Nmp4−/− MSPCs was performed on 90 Nmp4 target genes and differentiation markers. These data suggest that Nmp4 suppresses bone anabolism, in part, by regulating IGF-binding protein expression. Changes in Nmp4 status may lead to improvements in osteoprogenitor response to therapeutic cues. PMID:26244796

  3. [Monograph for N-Methyl-pyrrolidone (NMP) and human biomonitoring values for the metabolites 5-Hydroxy-NMP and 2-Hydroxy-N-methylsuccinimide].

    PubMed

    2015-10-01

    1-Methyl-pyrrolidone (NMP) is used as a solvent in many technical applications. The general population may be exposed to NMP through its use as an ingredient in paint and graffiti removers, and indoors also through its use in paints and carpeting. Because of its developmentally toxic effects, the use of NMP in consumer products is regulated in the EU. The developmental effects, accompanied by weak maternally toxic effects in animal experiments, are considered the critical effects by the German HBM Commission. Based on these effects, HBM-I values of 10 mg/l urine for children and 15 mg/l for adults were derived for the metabolites 5-Hydroxy-NMP and 2-Hydroxy-N-methylsuccinimide. HBM-II values were set to 30 mg/l urine for children and 50 mg/l for adults. Because the structural analogue 1-ethyl-2-pyrrolidone (NEP) has similar effects, possible mixed exposure to both compounds has to be taken into account when evaluating the total burden.
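
    A compact way to read the HBM values above is as a two-threshold decision rule per age group. The sketch below (Python; the function and dictionary names are illustrative, and only the threshold numbers are taken from the abstract) shows one way such a rule could be coded:

        # Classify a urinary 5-Hydroxy-NMP / 2-HMSI concentration (mg/l)
        # against the HBM values quoted in the abstract. Names are
        # illustrative, not from the monograph itself.
        HBM_VALUES_MG_PER_L = {
            # age group: (HBM-I, HBM-II)
            "child": (10.0, 30.0),
            "adult": (15.0, 50.0),
        }

        def classify_hbm(concentration_mg_per_l: float, group: str) -> str:
            """Return the HBM band a measured concentration falls into."""
            hbm1, hbm2 = HBM_VALUES_MG_PER_L[group]
            if concentration_mg_per_l < hbm1:
                return "below HBM-I"
            if concentration_mg_per_l < hbm2:
                return "between HBM-I and HBM-II"
            return "above HBM-II"

        print(classify_hbm(12.0, "child"))  # between HBM-I and HBM-II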

  4. Human volunteer study on the inhalational and dermal absorption of N-methyl-2-pyrrolidone (NMP) from the vapour phase.

    PubMed

    Bader, Michael; Wrbitzky, Renate; Blaszkewicz, Meinolf; Schäper, Michael; van Thriel, Christoph

    2008-01-01

    N-Methyl-2-pyrrolidone (NMP) is a versatile organic solvent frequently used for surface cleaning such as paint stripping or graffiti removal. Liquid NMP is rapidly absorbed through the skin, but dermal vapour-phase absorption might also play an important role in the uptake of the solvent. This particular aspect was investigated in an experimental study with 16 volunteers exposed to 80 mg/m³ NMP for 8 h under either whole-body, i.e. inhalational plus dermal, or dermal-only conditions. Additionally, the influence of moderate physical workload on the uptake of NMP was studied. The urinary concentrations of NMP and its metabolites 5-hydroxy-N-methyl-2-pyrrolidone (5-HNMP) and 2-hydroxy-N-methylsuccinimide (2-HMSI) were followed for 48 h and analysed by gas chromatography-mass spectrometry (GC-MS). Percutaneous uptake delayed the elimination peak times and prolonged the apparent biological half-lives of NMP and 5-HNMP. Under resting conditions, dermal-only exposure resulted in the elimination of 71 ± 8 mg NMP equivalents as compared to 169 ± 15 mg for whole-body exposure. Moderate workload yielded 79 ± 8 mg NMP (dermal-only) and 238 ± 18 mg (whole-body). Thus, dermal absorption from the vapour phase may contribute significantly to the total uptake of NMP, e.g. from workplace atmospheres. As the concentration of airborne NMP does not reflect the body dose, biomonitoring should be carried out for surveillance purposes.
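
    From the mean eliminated amounts reported above, the relative dermal contribution can be estimated directly. A back-of-the-envelope calculation (Python; the numbers are the means quoted in the abstract, and measurement uncertainty is ignored):

        # Dermal-only vs whole-body NMP uptake (mg NMP equivalents,
        # mean values from the abstract).
        rest_dermal, rest_whole = 71.0, 169.0   # resting conditions
        work_dermal, work_whole = 79.0, 238.0   # moderate workload

        for label, dermal, whole in [("rest", rest_dermal, rest_whole),
                                     ("workload", work_dermal, work_whole)]:
            print(f"{label}: dermal-only is {dermal / whole:.0%} of whole-body uptake")
        # rest: dermal-only is 42% of whole-body uptake
        # workload: dermal-only is 33% of whole-body uptake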

  5. Nmp4/CIZ inhibits mechanically-induced β-catenin signaling activity in osteoblasts

    PubMed Central

    Yang, Zhouqi; Bidwell, Joseph P.; Young, Suzanne R.; Gerard-O’Riley, Rita; Wang, Haifang; Pavalko, Fredrick M.

    2010-01-01

    Cellular mechanotransduction, the process of converting mechanical signals into biochemical responses within cells, is a critical aspect of bone health. While the effects of mechanical loading on bone are well recognized, elucidating the specific molecular pathways involved in the processing of mechanical signals by bone cells represents a challenge and an opportunity to identify therapeutic strategies to combat bone loss. In this study we have for the first time examined the relationship between the nucleocytoplasmic shuttling transcription factor nuclear matrix protein-4/cas interacting zinc finger protein (Nmp4/CIZ) and β-catenin signaling in response to a physiologic mechanical stimulation (oscillatory fluid shear stress, OFSS) in osteoblasts. Using calvaria-derived osteoblasts from Nmp4-deficient and wild-type mice, we found that the normal translocation of β-catenin to the nucleus in osteoblasts that is induced by OFSS is enhanced when Nmp4/CIZ is absent. Furthermore, we found that other aspects of OFSS-induced mechanotransduction generally associated with the β-catenin signaling pathway, including ERK, Akt and GSK3β activity, as well as expression of the β-catenin-responsive protein cyclin D1, are also enhanced in cells lacking Nmp4/CIZ. Finally, we found that in the absence of Nmp4/CIZ, OFSS-induced cytoskeletal reorganization and the formation of focal adhesions between osteoblasts and the extracellular substrate are qualitatively enhanced, suggesting that Nmp4/CIZ may reduce the sensitivity of bone cells to mechanical stimuli. Together these results provide experimental support for the concept that Nmp4/CIZ plays an inhibitory role in the response of bone cells to mechanical stimulation induced by OFSS. PMID:20112285

  6. The neutral metallopeptidase NMP1 of Trichoderma guizhouense is required for mycotrophy and self-defence.

    PubMed

    Zhang, Jian; Bayram Akcapinar, Gunseli; Atanasova, Lea; Rahimi, Mohammad Javad; Przylucka, Agnieszka; Yang, Dongqing; Kubicek, Christian P; Zhang, Ruifu; Shen, Qirong; Druzhinina, Irina S

    2016-02-01

    Trichoderma guizhouense NJAU 4742 (Harzianum clade) can suppress the causative agent of banana wilt disease, Fusarium oxysporum f. sp. cubense 4 (Foc4). To identify genes involved in this trait, we used T-DNA insertional mutagenesis and isolated one mutant that was unable to overgrow Foc4 and had reduced antifungal ability. Using high-efficiency thermal asymmetric interlaced PCR, the T-DNA was located in the terminator of a neutral metalloprotease gene (encoding a MEROPS family M35 protease), which was named nmp1. The antifungal activity of the mutant was recovered by retransformation with the wild-type nmp1 gene. The purified NMP1 (overexpressed in Pichia pastoris) did not inhibit the growth and germination of other fungi in vitro. Its addition, however, partly recovered the antifungal activity of the mutant strain against some fungi. The expression of nmp1 is induced by the presence of fungi and by dead fungal biomass, but the time course of transcript accumulation following physical contact depends on the mode of interaction: it increases in cases of long-lasting parasitism and decreases if the prey fungus is dead shortly after or even before the contact (predation). We thus conclude that the NMP1 protein of T. guizhouense has major importance for mycotrophic interactions and defence against other fungi. PMID:26118314

  7. cNMP-AMs mimic and dissect bacterial nucleotidyl cyclase toxin effects.

    PubMed

    Beckert, Ulrike; Grundmann, Manuel; Wolter, Sabine; Schwede, Frank; Rehmann, Holger; Kaever, Volkhard; Kostenis, Evi; Seifert, Roland

    2014-09-01

    In addition to the well-known second messengers cAMP and cGMP, mammalian cells contain the cyclic pyrimidine nucleotides cCMP and cUMP. The Pseudomonas aeruginosa toxin ExoY massively increases cGMP and cUMP in cells, whereas the Bordetella pertussis toxin CyaA increases cAMP and, to a lesser extent, cCMP. To mimic and dissect toxin effects, we synthesized cNMP acetoxymethyl esters (cNMP-AMs) as prodrugs. cNMP-AMs rapidly and effectively released the corresponding cNMP in cells. The combination of cGMP-AM plus cUMP-AM mimicked the cytotoxicity of ExoY. cUMP-AM and cGMP-AM differentially activated gene expression. Certain cCMP and cUMP effects were independent of the known cNMP effectors protein kinases A and G and the guanine nucleotide exchange factor Epac. In conclusion, cNMP-AMs are useful tools to mimic and dissect bacterial nucleotidyl cyclase toxin effects. PMID:25108158

  8. Synthesis of NMP, a Fluoxetine (Prozac) Precursor, in the Introductory Organic Laboratory

    NASA Astrophysics Data System (ADS)

    Perrine, Daniel M.; Sabanayagam, Nathan R.; Reynolds, Kristy J.

    1998-10-01

    A synthesis of the immediate precursor of the widely used antidepressant fluoxetine (Prozac) is described. The procedure is short, safe, and simple enough to serve as a laboratory exercise for undergraduate students in the second semester of introductory organic chemistry and is one which will be particularly interesting to those planning a career in the health sciences. The compound synthesized is (±)-N,N-dimethyl-3-(p-trifluoromethylphenoxy)-3-phenylpropylamine, or "N-methyl Prozac" (NMP). The synthesis of NMP requires one two-hour period and a second three-hour period. In the first period, a common Mannich base, 3-dimethylaminopropiophenone, is reduced with sodium borohydride to form (±)-3-dimethylamino-1-phenylpropanol. In the second period, potassium t-butoxide is used to couple (±)-3-dimethylamino-1-phenylpropanol with p-chlorotrifluoromethylbenzene to form NMP, which is isolated as its oxalate salt. All processes use equipment and materials that are inexpensive and readily available in most undergraduate laboratories. Detailed physical data are given on NMP, including high-field DEPT ¹³C NMR.

  9. Synthesis and First Principles Investigation of HMX/NMP Cocrystal Explosive

    NASA Astrophysics Data System (ADS)

    Lin, He; Zhu, Shun-Guan; Zhang, Lin; Peng, Xin-Hua; Li, Hong-Zhen

    2013-10-01

    1,3,5,7-Tetranitro-1,3,5,7-tetrazocine (HMX)/N-methyl-2-pyrrolidone (NMP) cocrystal explosive was prepared by a solution evaporation method. This cocrystal explosive crystallized in the trigonal system, with cell parameters a = 16.605(8) Å and c = 31.496(4) Å. Theoretical investigations of the formation mechanism of the HMX/NMP cocrystal were carried out in the Cambridge serial total energy package (CASTEP) based on dispersion-corrected density functional theory (DFT-D) with a plane-wave scheme. The exchange-correlation potential was treated with the Perdew-Burke-Ernzerhof functional of the generalized gradient approximation, and the dispersion force was corrected using Grimme's method. The band structure, density of states, projected density of states, and Mulliken populations were calculated at the generalized gradient approximation level. The results showed that the main host-guest interactions in the HMX/NMP cocrystal were hydrogen bonds and stacking interactions, the same as those identified by the X-ray diffraction analysis. Theoretical investigations of the HMX/NMP cocrystal explosive may provide a basis for the preparation of cocrystal explosives composed of HMX and other energetic materials.

  10. OPPT workplan risk assessment Methylene Chloride (dichloromethane; CASRN 75-09-2) and N-Methylpyrrolidone (NMP; CASRN 872-50-4)

    EPA Science Inventory

    These assessments will focus on the use of DCM and NMP in paint stripping. NMP exposure scenarios for this use result in inhalation and dermal exposure to consumers and workers. The low concern for environmental effects of NMP will be discussed in the assessment. In the case of ...

  11. Functional classification of cNMP-binding proteins and nucleotide cyclases with implications for novel regulatory pathways in Mycobacterium tuberculosis.

    PubMed

    McCue, L A; McDonough, K A; Lawrence, C E

    2000-02-01

    We have analyzed the cyclic nucleotide (cNMP)-binding protein and nucleotide cyclase superfamilies using Bayesian computational methods of protein family identification and classification. In addition to the known cNMP-binding proteins (cNMP-dependent kinases, cNMP-gated channels, cAMP-guanine nucleotide exchange factors, and bacterial cAMP-dependent transcription factors), new functional groups of cNMP-binding proteins were identified, including putative ABC-transporter subunits, translocases, and esterases. Classification of the nucleotide cyclases revealed subtle differences in sequence conservation of the active site that distinguish the five classes of cyclases: the multicellular eukaryotic adenylyl cyclases, the eukaryotic receptor-type guanylyl cyclases, the eukaryotic soluble guanylyl cyclases, the unicellular eukaryotic and prokaryotic adenylyl cyclases, and the putative prokaryotic guanylyl cyclases. Phylogenetic distribution of the cNMP-binding proteins and cyclases was analyzed, with particular attention to the 22 complete archaeal and eubacterial genome sequences. Mycobacterium tuberculosis H37Rv and Synechocystis PCC6803 were each found to encode several more putative cNMP-binding proteins than other prokaryotes; many of these proteins are of unknown function. M. tuberculosis also encodes several more putative nucleotide cyclases than other prokaryotic species. PMID:10673278

  12. CIZ/NMP4 is expressed in B16 melanoma and forms a positive feedback loop with RANKL to promote migration of the melanoma cells.

    PubMed

    Sakuma, Tomomi; Nakamoto, Tetsuya; Hemmi, Hiroaki; Kitazawa, Sohei; Kitazawa, Riko; Notomi, Takuya; Hayata, Tadayoshi; Ezura, Yoichi; Amagasa, Teruo; Noda, Masaki

    2012-07-01

    Tumor metastasis to bone is a serious pathological situation that causes severe pain and deterioration in locomotor function. However, the mechanisms underlying tumor metastasis are still incompletely understood. CIZ/NMP4 is a nucleocytoplasmic shuttling protein, and its roles in tumor cells have not been known. We therefore examined the role of CIZ/NMP4 in B16 melanoma cells, which metastasize to bone. CIZ/NMP4 is expressed in B16 cells. The CIZ/NMP4 expression levels are correlated with the metastatic activity in divergent types of melanoma cells. Overexpression of CIZ/NMP4 increased B16 cell migration in a Transwell assay. Conversely, siRNA-based knockdown of CIZ/NMP4 suppressed the migratory activity of these cells. As RANKL promotes metastasis of tumor cells in bone, we tested its effect on CIZ/NMP4 in melanoma cells. RANKL treatment enhanced CIZ/NMP4 expression, and this increase promoted migration. Conversely, we identified a CIZ/NMP4 binding site in the promoter of RANKL. Furthermore, luciferase assays indicated that CIZ/NMP4 overexpression enhanced RANKL promoter activity, revealing a positive feedback loop between CIZ/NMP4 and RANKL in melanoma. These observations indicate that CIZ/NMP4 is a critical regulator of metastasis of melanoma cells. PMID:22307584

  13. N-methyl pyrrolidone (NMP) ameliorates the hypoxia-reduced osteoblast differentiation via inhibiting the NF-κB signaling.

    PubMed

    Li, Qiang; Liu, Rui; Zhao, Jianmin; Lu, Quanli

    2016-01-01

    The ischemic-hypoxic conditions experienced by local osteoblasts and bone mesenchymal stem cells during bone fracture inhibit bone repair. N-methyl pyrrolidone (NMP) has been approved as a safe and biologically inactive small chemical molecule, and might be useful for bone fracture repair. In the present study, we investigated the effect of NMP on the hypoxia-reduced cellular viability and on the expression of differentiation-associated markers, such as bone morphogenetic protein 2 (BMP-2), N-terminal propeptide of type I procollagen (PINP), alkaline phosphatase (ALP) and runt-related transcription factor 2 (Runx2), in osteoblasts, and then examined the molecular mechanism underlying this effect in human osteoblastic hFOB 1.19 cells. Our results demonstrated that NMP significantly blocked the hypoxia-induced reduction in cell viability and the hypoxia-induced downregulation of BMP-2, PINP, ALP and Runx2 in hFOB 1.19 cells. We then confirmed the involvement of the nuclear factor κB (NF-κB) pathway in the regulation by NMP of the hypoxia-mediated reduction of osteoblast differentiation. The hypoxia-induced upregulation of NF-κB expression and transcriptional activity, and the downregulation of inhibitory κB expression, were reversed by treatment with 10 mM NMP. In conclusion, our study found a protective role of NMP in osteoblast differentiation in response to hypoxia, and this protection acted through inhibition of NF-κB signaling. This suggests that NMP might be a protective agent in bone fracture repair. PMID:27665779

  14. Low Cost Environmental Sensors for Spaceflight: NMP Space Environmental Monitor (SEM) Requirements

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Buehler, Martin G.; Brinza, D.; Patel, J. U.

    2005-01-01

    An outstanding problem in spaceflight is the lack of adequate sensors for monitoring the space environment and its effects on engineering systems. By adequate, we mean low cost in terms of mission impact (e.g., low price, low mass/size, low power, low data rate, and low design impact). The New Millennium Program (NMP) is investigating the development of such a low-cost Space Environmental Monitor (SEM) package for inclusion on its technology validation flights. This effort follows from the need by NMP to characterize the space environment during testing so that potential users can extrapolate the test results to end-use conditions. The immediate objective of this effort is to develop a small diagnostic sensor package that could be obtained from commercial sources. Environments being considered are: contamination, atomic oxygen, ionizing radiation, cosmic radiation, EMI, and temperature. This talk describes the requirements and rationale for selecting these environments and reviews a preliminary design that includes a micro-controller data logger with data storage and interfaces to the sensors and spacecraft. If successful, such a sensor package could be the basis of a unique, long-term program for monitoring the effects of the space environment on spacecraft systems.

  15. Low cost environmental sensors for Spaceflight: NMP Space Environmental Monitor (SEM) requirements

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Buehler, Martin G.; Brinza, D.; Patel, J. U.

    2005-01-01

    An outstanding problem in spaceflight is the lack of adequate sensors for monitoring the space environment and its effects on engineering systems. By adequate, we mean low cost in terms of mission impact (e.g., low price, low mass/size, low power, low data rate, and low design impact). The New Millennium Program (NMP) is investigating the development of such a low-cost Space Environmental Monitor (SEM) package for inclusion on its technology validation flights. This effort follows from the need by NMP to characterize the space environment during testing so that potential users can extrapolate the test results to end-use conditions. The immediate objective of this effort is to develop a small diagnostic sensor package that could be obtained from commercial sources. Environments being considered are: contamination, atomic oxygen, ionizing radiation, cosmic radiation, EMI, and temperature. This talk describes the requirements and rationale for selecting these environments and reviews a preliminary design that includes a micro-controller data logger with data storage and interfaces to the sensors and spacecraft. If successful, such a sensor package could be the basis of a unique, long-term program for monitoring the effects of the space environment on spacecraft systems.

  16. DEMONSTRATION OF N-METHYL PYRROLIDONE (NMP) AS A POLLUTION PREVENTION ALTERNATIVE TO PAINT STRIPPING WITH METHYLENE CHLORIDE

    EPA Science Inventory

    The objective of this paper is to demonstrate that NMP is a viable pollution prevention alternative to methylene chloride. Marine Corps Logistics Base (MCLB), Albany, GA, USA was the host site for the demonstration. MCLB's primary function is maintenance of military ground supp...

  17. Folding Properties of Cytosine Monophosphate Kinase from E. coli Indicate Stabilization through an Additional Insert in the NMP Binding Domain

    PubMed Central

    Beitlich, Thorsten; Lorenz, Thorsten; Reinstein, Jochen

    2013-01-01

    The globular 25 kDa protein cytosine monophosphate kinase (CMPK, EC 2.7.4.14) from E. coli belongs to the family of nucleoside monophosphate (NMP) kinases (NMPKs). Many proteins of this family share medium to high sequence and high structure similarity, including the frequently found α/β topology. A unique feature of CMPK in the family of NMPKs is the positioning of a single cis-proline residue in the CORE-domain (cis-Pro124) in conjunction with a large insert in the NMP binding domain. This insert is not found in other well-studied NMPKs such as AMPK or UMP/CMPK. We have analyzed the folding pathway of CMPK using time-resolved tryptophan and FRET fluorescence as well as CD. Our results indicate that unfolding at high urea concentrations is governed by a single process, whereas refolding in low urea concentrations follows at least a three-step process which we interpret as follows: Pro124 in the CORE-domain is in cis in the native state (Nc) and equilibrates with its trans-isomer in the unfolded state (Uc ⇌ Ut). Under refolding conditions, at least the Ut species and possibly also the Uc species undergo a fast initial collapse to form intermediates with significant amounts of secondary structure, from which the trans-Pro124 fraction folds to the native state with a 100-fold lower rate constant than the cis-Pro124 species. CMPK thus differs from homologous NMP kinases like UMP/CMP kinase or AMP kinase, where folding intermediates show much lower content of secondary structure and, importantly, unfolding is up to 100-fold faster than for CMPK. We therefore propose that the stabilizing effect of the long NMP-domain insert, in conjunction with a subtle twist in the positioning of a single cis-Pro residue, allows for substantial stabilization compared to other NMP kinases with α/β topology. PMID:24205218
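
    The refolding scheme above (a fast-folding cis-Pro124 fraction and a trans fraction folding with a roughly 100-fold lower rate constant) implies a biphasic, double-exponential observable. A minimal simulation (Python; the absolute rate constants and the unfolded-state cis fraction are invented for illustration, and only the 100-fold ratio comes from the abstract):

        import math

        k_fast = 1.0            # 1/s, cis-Pro124 fraction (assumed value)
        k_slow = k_fast / 100   # 1/s, trans fraction: ratio from the abstract
        frac_cis = 0.2          # unfolded-state cis fraction (assumed value)

        def unfolded_fraction(t: float) -> float:
            """Remaining unfolded protein at time t for the two-population model."""
            return (frac_cis * math.exp(-k_fast * t)
                    + (1 - frac_cis) * math.exp(-k_slow * t))

        for t in (0.0, 1.0, 10.0, 100.0, 500.0):
            print(f"t = {t:6.1f} s   unfolded fraction = {unfolded_fraction(t):.3f}")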

  18. The novel fluorescent CDP-analogue (Pβ)MABA-CDP is a specific probe for the NMP binding site of UMP/CMP kinase.

    PubMed Central

    Rudolph, M. G.; Veit, T. J.; Reinstein, J.

    1999-01-01

    Direct thermodynamic and kinetic investigations of the binding of nucleotides to the nucleoside monophosphate (NMP) site of NMP kinases have not been possible so far because a spectroscopic probe was not available. By coupling a fluorescent N-methylanthraniloyl (mant) group to the β-phosphate of CDP via a butyl linker, a CDP analogue [(Pβ)MABA-CDP] was obtained that still binds specifically to the NMP site of UmpKdicty, because the base and the ribose moieties, which are involved in specific interactions, are not modified. This allows the direct determination of binding constants for its substrates in competition experiments. PMID:10631985

  19. Biochemical characterization of Arabidopsis APYRASE family reveals their roles in regulating endomembrane NDP/NMP homoeostasis.

    PubMed

    Chiu, Tsan-Yu; Lao, Jeemeng; Manalansan, Bianca; Loqué, Dominique; Roux, Stanley J; Heazlewood, Joshua L

    2015-11-15

    Plant apyrases are nucleoside triphosphate (NTP) diphosphohydrolases (NTPDases) and have been implicated in an array of functions within the plant, including the regulation of extracellular ATP. Arabidopsis encodes a family of seven membrane-bound apyrases (AtAPY1-7) that comprise three distinct clades, all of which contain the five conserved apyrase domains. With the exception of AtAPY1 and AtAPY2, the biochemical and subcellular characterization of the other members has been unavailable. In this research, we have shown all seven Arabidopsis apyrases localize to internal membranes comprising the cis-Golgi, endoplasmic reticulum (ER) and endosome, indicating an endo-apyrase classification for the entire family. In addition, all members, with the exception of AtAPY7, can function as endo-apyrases by complementing a yeast double mutant (Δynd1Δgda1) which lacks apyrase activity. Interestingly, complementation of the mutant yeast using well-characterized human apyrases could only be accomplished by using a functional ER endo-apyrase (NTPDase6), but not the ecto-apyrase (NTPDase1). Furthermore, the substrate specificity analysis for the Arabidopsis apyrases AtAPY1-6 indicated that each member has a distinct set of preferred substrates covering various NDPs (nucleoside diphosphates) and NTPs. Combining the biochemical analysis and subcellular localization of the Arabidopsis apyrase family, the data suggest their possible roles in regulating endomembrane NDP/NMP (nucleoside monophosphate) homoeostasis.

  20. Symbiotic fungi that are essential for plant nutrient uptake investigated with NMP

    NASA Astrophysics Data System (ADS)

    Pallon, J.; Wallander, H.; Hammer, E.; Arteaga Marrero, N.; Auzelyte, V.; Elfman, M.; Kristiansson, P.; Nilsson, C.; Olsson, P. A.; Wegdén, M.

    2007-07-01

    The nuclear microprobe (NMP) technique, using PIXE for elemental analysis and on/off-axis STIM for parallel mass density normalization, has proven successful for investigating possible interactions between minerals and ectomycorrhizal (EM) mycelia that form symbiotic associations with forest trees. The ability of the EM fungi to make elements biologically available from minerals and soil was compared in field studies and in laboratory experiments, and molecular analysis (PCR-RFLP) was used to identify ectomycorrhizal species from the field samplings. EM rhizomorphs associated with apatite in laboratory systems and in mesh bags incubated in forest ecosystems contained larger amounts of Ca than similar rhizomorphs connected to acid-washed sand. EM mycelium produced in mesh bags had a capacity to mobilize P from apatite-amended sand, and a high concentration of K in some rhizomorphs suggests that these fungi are good accumulators of K and may have a significant role in transporting K to trees. Spores formed by arbuscular mycorrhizal (AM) fungi in laboratory cultures were compared with spores formed in saline soils in Tunisia in Northern Africa. We found lower concentrations of P and higher concentrations of Cl in the spores collected from the field than in the spores collected from laboratory cultures. In the laboratory cultures, the distributions of elements such as P and K were found to be clearly correlated.

  1. Embryotoxic potential of N-methyl-pyrrolidone (NMP) and three of its metabolites using the rat whole embryo culture system

    SciTech Connect

    Flick, Burkhard; Talsness, Chris E.; Jaeckh, Rudolf; Buesen, Roland; Klug, Stephan

    2009-06-01

    N-methyl-2-pyrrolidone (NMP), which undergoes extensive biotransformation, has been shown in vivo to cause developmental toxicity and, especially after oral treatment, malformations in rats and rabbits. Data are lacking as to whether the original compound or one of its main metabolites is responsible for the toxic effects observed. Therefore, the relative embryotoxicity of the parent compound and its metabolites was evaluated using rat whole embryo culture (WEC) and the BALB/c 3T3 cytotoxicity test. The resulting data were evaluated using two strategies: one based on all endpoints determined in the WEC, and the other including endpoints from both the WEC and the cytotoxicity test. On the basis of the first analysis, the substance with the highest embryotoxic potential is NMP, followed by 5-hydroxy-N-methyl-pyrrolidone (5-HNMP), 2-hydroxy-N-methylsuccinimide (2-HMSI) and N-methylsuccinimide (MSI). Specific dysmorphogeneses induced by NMP and 5-HNMP were aberrations in the head region of the embryos, abnormal development of the second visceral arches and open neural pores. The second evaluation strategy used only two endpoints of the WEC, i.e. the no observed adverse effect concentration (NOAEC(WEC)) and the lowest concentration leading to dysmorphogenesis in 100% of the cultured embryos (ICmax(WEC)). In addition to these WEC endpoints, the IC50(3T3) from the cytotoxicity test (BALB/c 3T3 fibroblasts) was included in the evaluation scheme. These three endpoints were applied to a prediction model developed during a validation study of the European Centre for the Validation of Alternative Methods (ECVAM), allowing the classification of the embryotoxic potential of each compound into three classes (non-, weakly and strongly embryotoxic). Consistent results from both evaluation strategies were observed, whereby NMP and its metabolites revealed a direct embryotoxic potential. Hereby, only NMP and 5-HNMP induced specific embryotoxic effects and were
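
    The second evaluation strategy amounts to feeding three concentration endpoints into a three-class prediction model. The sketch below (Python) only illustrates the shape of such a classifier; the actual ECVAM prediction model uses published discriminant functions of these endpoints, and the decision rule here is a placeholder, not the validated model:

        # Toy three-class rule over the WEC/3T3 endpoints (concentrations in
        # common units, e.g. mM). The boundaries are placeholders, NOT the
        # published ECVAM discriminant functions.
        def classify_embryotoxicity(noaec_wec: float, ic_max_wec: float,
                                    ic50_3t3: float) -> str:
            if noaec_wec >= ic50_3t3:
                # dysmorphogenesis appears only at generally cytotoxic levels
                return "non-embryotoxic (toy rule)"
            if ic50_3t3 / ic_max_wec > 10:
                # embryos are affected far below general cytotoxicity
                return "strongly embryotoxic (toy rule)"
            return "weakly embryotoxic (toy rule)"

        print(classify_embryotoxicity(noaec_wec=0.5, ic_max_wec=1.0, ic50_3t3=20.0))
        # strongly embryotoxic (toy rule)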

  2. DMAC and NMP as Electrolyte Additives for Li-Ion Cells

    NASA Technical Reports Server (NTRS)

    Smart, Marshall; Bugga, Ratnakumar; Lucht, Brett

    2008-01-01

    Dimethyl acetamide (DMAC) and N-methyl pyrrolidinone (NMP) have been found to be useful as high-temperature-resilience-enhancing additives to a baseline electrolyte used in rechargeable lithium-ion electrochemical cells. The baseline electrolyte, which was previously formulated to improve low-temperature performance, comprises LiPF6 dissolved at a concentration of 1.0 M in a mixture comprising equal volume proportions of ethylene carbonate, diethyl carbonate, and dimethyl carbonate. This and other electrolytes comprising lithium salts dissolved in mixtures of esters (including alkyl carbonates) have been studied in continuing research directed toward extending the lower limits of operating temperatures and, more recently, enhancing the high-temperature resilience of such cells. This research at earlier stages, and the underlying physical and chemical principles, were reported in numerous previous NASA Tech Briefs articles. Although these electrolytes provide excellent performance at low temperatures (typically as low as −40 °C), when the affected Li-ion cells are subjected to high temperatures during storage and cycling, there occur irreversible losses of capacity accompanied by power fade and deterioration of low-temperature performance. The term "high-temperature resilience" signifies, loosely, the ability of a cell to resist such deterioration, retaining as much as possible of its initial charge/discharge capacity during operation or during storage in the fully charged condition at high temperature. For the purposes of the present development, a temperature is considered to be high if it equals or exceeds the upper limit (typically, 30 °C) of the operating-temperature range for which the cells in question are generally designed.

  3. Induced Probabilities.

    ERIC Educational Resources Information Center

    Neel, John H.

    Induced probabilities have been largely ignored by educational researchers. Simply stated, if a new random variable is defined in terms of a first random variable, then induced probability is the probability or density of the new random variable that can be found by summation or integration over the appropriate domains of the original random…
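
    In symbols, the construction is the standard change-of-variables fact (a textbook statement, not quoted from the record above): for Y = g(X),

        P_Y(B) \;=\; P_X\bigl(g^{-1}(B)\bigr) \;=\; P\{x : g(x) \in B\},

    and, when g is differentiable and invertible and X has density f_X,

        f_Y(y) \;=\; f_X\bigl(g^{-1}(y)\bigr)\,\Bigl|\tfrac{d}{dy}\,g^{-1}(y)\Bigr|.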

  4. Probability Theory

    NASA Astrophysics Data System (ADS)

    Jaynes, E. T.; Bretthorst, G. Larry

    2003-04-01

    Foreword; Preface; Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'; Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory; References; Appendix A. Other approaches to probability theory; Appendix B. Mathematical formalities and style; Appendix C. Convolutions and cumulants.

  5. Dynamic coupling between the LID and NMP domain motions in the catalytic conversion of ATP and AMP to ADP by adenylate kinase

    NASA Astrophysics Data System (ADS)

    Jana, Biman; Adkar, Bharat V.; Biswas, Rajib; Bagchi, Biman

    2011-01-01

    The catalytic conversion of adenosine triphosphate (ATP) and adenosine monophosphate (AMP) to adenosine diphosphate (ADP) by adenylate kinase (ADK) involves large-amplitude, ligand-induced domain motions: the opening and closing of the ATP-binding domain (LID) and the AMP-binding domain (NMP) during the repeated catalytic cycle. We discover and analyze an interesting dynamical coupling between the motions of the two domains during the opening, using large-scale atomistic molecular dynamics trajectory analysis, covariance analysis, and multidimensional free energy calculations with explicit water. Initially, the LID domain must open by a certain amount before the NMP domain can begin to open. The dynamical correlation map shows an interesting cross-peak between the LID and NMP domains, which suggests the presence of correlated motion between them. This is also reflected in our calculated two-dimensional free energy surface contour diagram, which has an interesting elliptic shape, revealing a strong correlation between the opening of the LID domain and that of the NMP domain. Our free energy surface of the LID domain motion is rugged due to interaction with water, and the signature of ruggedness is evident in the observed root mean square deviation variation and its fluctuation time correlation functions. We develop a correlated dynamical disorder-type theoretical model to explain the observed dynamic coupling between the motions of the two domains in ADK. Our model correctly reproduces several features of the cross-correlation observed in simulations.

  6. Dynamic coupling between the LID and NMP domain motions in the catalytic conversion of ATP and AMP to ADP by adenylate kinase.

    PubMed

    Jana, Biman; Adkar, Bharat V; Biswas, Rajib; Bagchi, Biman

    2011-01-21

    The catalytic conversion of adenosine triphosphate (ATP) and adenosine monophosphate (AMP) to adenosine diphosphate (ADP) by adenylate kinase (ADK) involves large-amplitude, ligand-induced domain motions: the opening and closing of the ATP-binding domain (LID) and the AMP-binding domain (NMP) during the repeated catalytic cycle. We discover and analyze an interesting dynamical coupling between the motions of the two domains during the opening, using large-scale atomistic molecular dynamics trajectory analysis, covariance analysis, and multidimensional free energy calculations with explicit water. Initially, the LID domain must open by a certain amount before the NMP domain can begin to open. The dynamical correlation map shows an interesting cross-peak between the LID and NMP domains, which suggests the presence of correlated motion between them. This is also reflected in our calculated two-dimensional free energy surface contour diagram, which has an interesting elliptic shape, revealing a strong correlation between the opening of the LID domain and that of the NMP domain. Our free energy surface of the LID domain motion is rugged due to interaction with water, and the signature of ruggedness is evident in the observed root mean square deviation variation and its fluctuation time correlation functions. We develop a correlated dynamical disorder-type theoretical model to explain the observed dynamic coupling between the motions of the two domains in ADK. Our model correctly reproduces several features of the cross-correlation observed in simulations.
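
    The covariance analysis behind such a correlation map reduces, for any pair of collective coordinates, to a normalized cross-correlation of their fluctuations. A minimal version (Python/NumPy; the two synthetic series below merely stand in for per-frame LID and NMP opening coordinates from a real trajectory):

        import numpy as np

        rng = np.random.default_rng(0)
        lid = np.cumsum(rng.normal(size=5000))               # stand-in LID coordinate
        nmp = 0.6 * lid + np.cumsum(rng.normal(size=5000))   # partially coupled NMP

        def normalized_cross_correlation(a: np.ndarray, b: np.ndarray) -> float:
            """Pearson correlation of two fluctuation time series."""
            da, db = a - a.mean(), b - b.mean()
            return float(da @ db / (np.linalg.norm(da) * np.linalg.norm(db)))

        print(f"LID-NMP correlation: {normalized_cross_correlation(lid, nmp):.2f}")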

  7. Immortalization and characterization of osteoblast cell lines generated from wild-type and Nmp4-null mouse bone marrow stromal cells using murine telomerase reverse transcriptase (mTERT).

    PubMed

    Alvarez, Marta B; Childress, Paul; Philip, Binu K; Gerard-O'Riley, Rita; Hanlon, Michael; Herbert, Brittney-Shea; Robling, Alexander G; Pavalko, Fredrick M; Bidwell, Joseph P

    2012-05-01

    Intermittent parathyroid hormone (PTH) adds new bone to the osteoporotic skeleton; the transcription factor Nmp4/CIZ represses PTH-induced bone formation in mice and as a consequence is a potential drug target for improving hormone clinical efficacy. To explore the impact of Nmp4/CIZ on osteoblast phenotype, we immortalized bone marrow stromal cells from wild-type (WT) and Nmp4-knockout (KO) mice using murine telomerase reverse transcriptase. Clonal lines were initially chosen based on their positive staining for alkaline phosphatase and capacity for mineralization. Disabling Nmp4/CIZ had no gross impact on osteoblast phenotype development. WT and KO clones exhibited identical sustained growth, reduced population doubling times, extended maintenance of the mature osteoblast phenotype, and competency for differentiating toward the osteoblast and adipocyte lineages. Additional screening of the immortalized cells for PTH-responsiveness permitted further studies with single WT and KO clones. We recently demonstrated that PTH-induced c-fos femoral mRNA expression is enhanced in Nmp4-KO mice, and in the present study we observed that the hormone stimulated either an equivalent or modestly enhanced increase in c-fos mRNA expression in both primary null and KO clone cells depending on PTH concentration. The null primary osteoblasts and KO clone cells exhibited a transiently enhanced response to bone morphogenetic protein 2 (BMP2). The clones exhibited lower and higher expression of the PTH receptor (Pthr1) and the BMP2 receptor (Bmpr1a, Alk3), respectively, as compared to primary cells. These immortalized cell lines will provide a valuable tool for disentangling the complex functional roles underlying Nmp4/CIZ regulation of bone anabolism.

  8. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
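
    A classic example of this phenomenon (not necessarily one of the article's three problems) is the derangement problem: the probability that a uniformly random permutation of n items leaves no item in its original place tends to 1/e as n grows. A quick Monte Carlo check (Python):

        import math
        import random

        def is_derangement(n: int) -> bool:
            """Sample one random permutation of range(n); True if no fixed point."""
            perm = list(range(n))
            random.shuffle(perm)
            return all(perm[i] != i for i in range(n))

        trials, n = 100_000, 20
        hits = sum(is_derangement(n) for _ in range(trials))
        print(f"estimate: {hits / trials:.4f}   1/e = {1 / math.e:.4f}")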

  9. On Probability Domains III

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2015-12-01

    Domains of generalized probability have been introduced in order to provide a general construction of random events, observables and states. It is based on the notion of a cogenerator and the properties of product. We continue our previous study and show how some other quantum structures fit our categorical approach. We discuss how various epireflections implicitly used in the classical probability theory are related to the transition to fuzzy probability theory and describe the latter probability theory as a genuine categorical extension of the former. We show that the IF-probability can be studied via the fuzzy probability theory. We outline a "tensor modification" of the fuzzy probability theory.

  10. Probability on a Budget.

    ERIC Educational Resources Information Center

    Ewbank, William A.; Ginther, John L.

    2002-01-01

    Describes how to use common dice numbered 1-6 for simple mathematical situations including probability. Presents a lesson using regular dice and specially marked dice to explore some of the concepts of probability. (KHR)

  11. Is quantum probability rational?

    PubMed

    Houston, Alasdair I; Wiesner, Karoline

    2013-06-01

    We concentrate on two aspects of the article by Pothos & Busemeyer (P&B): the relationship between classical and quantum probability and quantum probability as a basis for rational decisions. We argue that the mathematical relationship between classical and quantum probability is not quite what the authors claim. Furthermore, it might be premature to regard quantum probability as the best practical rational scheme for decision making.

  12. Predicted probabilities' relationship to inclusion probabilities.

    PubMed

    Fang, Di; Chong, Jenny; Wilson, Jeffrey R

    2015-05-01

    It has been shown that under a general multiplicative intercept model for risk, case-control (retrospective) data can be analyzed by maximum likelihood as if they had arisen prospectively, up to an unknown multiplicative constant, which depends on the relative sampling fraction. (1) With suitable auxiliary information, retrospective data can also be used to estimate response probabilities. (2) In other words, predictive probabilities obtained without adjustments from retrospective data will likely be different from those obtained from prospective data. We highlighted this using binary data from Medicare to determine the probability of readmission into the hospital within 30 days of discharge, which is particularly timely because Medicare has begun penalizing hospitals for certain readmissions. (3).
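
    For logistic models, the "unknown multiplicative constant" surfaces as an intercept offset: fitting case-control data as if they were prospective recovers the slopes but shifts the intercept by the log ratio of the sampling fractions. As a formula (a standard result consistent with the abstract, not quoted from it):

        \beta_0^{\mathrm{retro}} \;=\; \beta_0 \;+\; \log\frac{\pi_1}{\pi_0},

    where \pi_1 and \pi_0 are the sampling fractions of cases and controls; predicted probabilities from the retrospective fit must therefore be corrected by subtracting \log(\pi_1/\pi_0) from the linear predictor before applying the logistic link.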

  13. Racing To Understand Probability.

    ERIC Educational Resources Information Center

    Van Zoest, Laura R.; Walker, Rebecca K.

    1997-01-01

    Describes a series of lessons designed to supplement textbook instruction of probability by addressing the ideas of "equally likely,""not equally likely," and "fairness," as well as to introduce the difference between theoretical and experimental probability. Presents four lessons using The Wind Racer games to study probability. (ASK)

  14. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  15. Searching with probabilities

    SciTech Connect

    Palay, A.J.

    1985-01-01

    This book examines how probability distributions can be used as a knowledge representation technique. It presents a mechanism that can be used to guide a selective search algorithm to solve a variety of tactical chess problems. Topics covered include probabilities and searching, the B* algorithm, and chess probabilities in practice, with examples, results, and future work.

  16. Effect of the inhibitors phenylalanine arginyl β-naphthylamide (PAβN) and 1-(1-naphthylmethyl)-piperazine (NMP) on expression of genes in multidrug efflux systems of Escherichia coli isolates from bovine mastitis.

    PubMed

    Barrero, M A Ospina; Pietralonga, P A G; Schwarz, D G G; Silva, A; Paula, S O; Moreira, M A S

    2014-10-01

    The multidrug efflux system in bacteria can reduce antibiotic concentration inside the cell, leading to failure in the treatment of bacterial diseases. This study evaluated the influence of two efflux pump inhibitors (EPIs), phenylalanine arginyl β-naphthylamide (PAβN) and 1-(1-naphthylmethyl)-piperazine (NMP), on the gene expression of three multidrug efflux systems, AcrAB, AcrEF and EmrAB, in Escherichia coli bovine mastitis isolates resistant to both ampicillin and sulfamethoxazole/trimethoprim. Each isolate had at least three multidrug efflux system genes. The acrA and acrB genes had the lowest expression levels in all treatments, while emrA or emrB showed the highest expression levels in the presence of ampicillin, sulfamethoxazole/trimethoprim, PAβN and NMP. The EPIs also contributed to a decrease in acrF expression when used in combination with ampicillin treatment. Since PAβN showed stronger effects than NMP, it may serve as an alternative to assist in the antimicrobial therapy of mastitis.

  17. In All Probability, Probability is not All

    ERIC Educational Resources Information Center

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  18. Probability of satellite collision

    NASA Technical Reports Server (NTRS)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
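
    Closed-form treatments of this problem typically reduce, in the encounter plane, to integrating a two-dimensional Gaussian over the combined hard-body circle (a generic formulation, not necessarily the exact one used in this report):

        P_c \;=\; \frac{1}{2\pi\sqrt{\det C}} \iint_{\|\mathbf{r}-\mathbf{r}_m\|\le R} \exp\Bigl(-\tfrac12\,\mathbf{r}^{\top} C^{-1}\mathbf{r}\Bigr)\,\mathrm{d}A,

    where C is the combined position covariance projected onto the encounter plane, \mathbf{r}_m is the nominal miss vector at closest approach, and R is the sum of the effective radii of the two objects.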

  19. Abstract Models of Probability

    NASA Astrophysics Data System (ADS)

    Maximov, V. M.

    2001-12-01

    Probability theory presents a mathematical formalization of intuitive ideas of independent events and of probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements were proposed for such notions as events, independence, random value, etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. From another side, physicists tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently, the necessity of formalizing quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore, a natural question arises: how to describe algebraic structures whose elements can be used as a measure of randomness. As a consequence, a necessity arises to define the types of randomness corresponding to every such algebraic structure. Possibly, this leads to another concept of randomness that has a nature different from the combinatorial-metric conception of Kolmogorov. Apparently, a discrepancy between the real type of randomness corresponding to some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.

  20. Probability with Roulette

    ERIC Educational Resources Information Center

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.

  1. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  2. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed, and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.

  3. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  4. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  5. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, this article outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
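
    The identity behind the technique: for U uniform on [a, b], E[f(U)] = (1/(b-a)) ∫_a^b f(x) dx, so the sample mean of f over uniform draws, scaled by (b - a), estimates the integral. The article's examples use Visual Basic; a minimal equivalent sketch in Python:

        import random

        def mc_integral(f, a: float, b: float, n: int = 100_000) -> float:
            """Monte Carlo estimate of the definite integral of f over [a, b]."""
            total = sum(f(random.uniform(a, b)) for _ in range(n))
            return (b - a) * total / n

        # Example: integral of x**2 over [0, 1] is 1/3.
        print(mc_integral(lambda x: x * x, 0.0, 1.0))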

  6. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…

  7. On Probability Domains

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1, ν1) ≤ (μ2, ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : x_1 + ⋯ + x_n ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  8. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  9. Fractal probability laws.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  10. Waste Package Misload Probability

    SciTech Connect

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is then calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  11. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, B.M.; Karlinger, M.R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on average every 4.5 years.
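
    A toy version of such a Monte Carlo computation (illustrative only: it assumes a stationary, exchangeable correlation of strength rho and a Gaussian copula, rather than the nonstationary correlation functions the authors study): draw correlated standard normals for the sites and count the fraction of simulated years in which at least one site exceeds its marginal T-year quantile.

      import numpy as np
      from scipy.stats import norm

      def regional_flood_probability(n_sites, T, rho, years=20_000, seed=0):
          """Toy RFP: chance that at least one site sees a T-year event in a
          year, under an exchangeable Gaussian copula with correlation rho."""
          rng = np.random.default_rng(seed)
          # Exchangeable correlation: Z_i = sqrt(rho)*W + sqrt(1-rho)*E_i.
          W = rng.standard_normal(years)
          E = rng.standard_normal((years, n_sites))
          Z = np.sqrt(rho) * W[:, None] + np.sqrt(1 - rho) * E
          threshold = norm.ppf(1 - 1 / T)   # marginal T-year quantile
          return float((Z > threshold).any(axis=1).mean())

      # Independence would give RFP = 1 - (1 - 1/100)**193 ~ 0.86 for 193 sites;
      # positive spatial correlation pulls the estimate down toward the roughly
      # one-in-4.5-year recurrence reported for the Puget Sound sites.
      print(regional_flood_probability(n_sites=193, T=100, rho=0.3))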

  12. Emptiness Formation Probability

    NASA Astrophysics Data System (ADS)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(−cL^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Tóth, Aizenman-Nachtergaele and Ueltschi.

  13. People's conditional probability judgments follow probability theory (plus noise).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
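
    A sketch of the model's logic (the probabilities, error rate d, and sample size are assumed for illustration; this is not the authors' code): each judgment is a sample proportion read from memory with a small random misclassification rate d, so the d-terms cancel in expectation in combinations such as P(A) + P(B) − P(A∧B) − P(A∨B), which probability theory fixes at zero.

      import random

      def noisy_estimate(p_true, d, m=300):
          """Frequency-based estimate: read m memory samples, each one
          misclassified with probability d, and report the proportion."""
          hits = 0
          for _ in range(m):
              occurred = random.random() < p_true
              if random.random() < d:      # random read error flips the count
                  occurred = not occurred
              hits += occurred
          return hits / m

      random.seed(1)
      pA, pB, pAB = 0.4, 0.5, 0.2
      pAorB = pA + pB - pAB
      d = 0.15

      # pA + pB - pAB - pAorB is identically 0 in probability theory; in the
      # noise model the d-terms cancel, so noisy judgments should agree with
      # the identity on average even though each estimate is biased.
      vals = [noisy_estimate(pA, d) + noisy_estimate(pB, d)
              - noisy_estimate(pAB, d) - noisy_estimate(pAorB, d)
              for _ in range(5_000)]
      print(sum(vals) / len(vals))   # close to 0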

  15. Probability distributions for magnetotellurics

    SciTech Connect

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
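
    The ratio-of-complex-normals setup is easy to explore by simulation (a sketch with assumed means and error levels, not the paper's analytic derivation): draw a noisy numerator and denominator, form the ratio, and inspect how close the logarithm of the squared magnitude comes to normality.

      import numpy as np

      rng = np.random.default_rng(0)

      def complex_normal(mean, rel_err, size):
          """Complex normal with a given mean and relative error per component."""
          s = rel_err * abs(mean) / np.sqrt(2)
          return mean + s * (rng.standard_normal(size) + 1j * rng.standard_normal(size))

      # Transfer-function estimate as a ratio of two noisy complex quantities.
      num = complex_normal(1.0 + 1.0j, 0.10, 200_000)   # 10% error in numerator
      den = complex_normal(2.0 - 0.5j, 0.30, 200_000)   # 30% error in denominator
      log_sq_mag = np.log(np.abs(num / den) ** 2)

      # A skewness near 0 is consistent with the paper's finding that the log
      # of the squared magnitude is well approximated by a normal distribution
      # over a wide range of errors.
      z = (log_sq_mag - log_sq_mag.mean()) / log_sq_mag.std()
      print("skewness:", (z ** 3).mean())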

  16. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  17. The Probability of Causal Conditionals

    ERIC Educational Resources Information Center

    Over, David E.; Hadjichristidis, Constantinos; Evans, Jonathan St. B. T.; Handley, Simon J.; Sloman, Steven A.

    2007-01-01

    Conditionals in natural language are central to reasoning and decision making. A theoretical proposal called the Ramsey test implies the conditional probability hypothesis: that the subjective probability of a natural language conditional, P(if p then q), is the conditional subjective probability, P(q|p). We report three experiments on…

  18. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students completed statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance on the probability topic is not related to anxiety level; a higher level of statistics anxiety does not lead to a lower score on the probability topic. The study also revealed that motivated students gained from the probability workshop: their performance on the probability topic improved compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to improve student learning outcomes in higher learning institutions.

  19. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  20. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  1. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  2. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  3. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.

  4. Probability Interpretation of Quantum Mechanics.

    ERIC Educational Resources Information Center

    Newton, Roger G.

    1980-01-01

    This paper draws attention to the frequency meaning of the probability concept and its implications for quantum mechanics. It emphasizes that the very meaning of probability implies the ensemble interpretation of both pure and mixed states. As a result some of the "paradoxical" aspects of quantum mechanics lose their counterintuitive character.…

  5. The Probabilities of Conditionals Revisited

    ERIC Educational Resources Information Center

    Douven, Igor; Verbrugge, Sara

    2013-01-01

    According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of…

  6. Minimizing the probable maximum flood

    SciTech Connect

    Woodbury, M.S.; Pansic, N. ); Eberlein, D.T. )

    1994-06-01

    This article examines Wisconsin Electric Power Company's efforts to determine an economical way to comply with Federal Energy Regulatory Commission requirements at two hydroelectric developments on the Michigamme River. Their efforts included refinement of the area's probable maximum flood model based, in part, on a newly developed probable maximum precipitation estimate.

  7. Tiempo para un cambio

    NASA Astrophysics Data System (ADS)

    Woltjer, L.

    1987-06-01

    At the meeting held in December of last year I informed the Council of my wish to end my contract as Director General of ESO once the VLT project was approved, which is expected to happen towards the end of this year. When my appointment was renewed three years ago, the Council was aware of my intention not to complete the five years of the contract, owing to my wish to have more time available for other activities. Now, with the preparatory phase for the VLT concluded, the project having been formally presented to the Council on 31 March, and its very probable approval expected before the end of this year, the first of January 1988 seems to me an excellent date for a change in the administration of ESO to take place.

  8. Holographic probabilities in eternal inflation.

    PubMed

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  9. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  10. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  11. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  12. Joint probabilities and quantum cognition

    SciTech Connect

    Acacio de Barros, J.

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  14. Joint probability distributions for projection probabilities of random orthonormal states

    NASA Astrophysics Data System (ADS)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
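
    A quick empirical check of this picture (a sketch only; the dimension and sample counts are arbitrary): generate columns of Haar-random unitaries via the QR decomposition of complex Gaussian matrices and inspect the resulting projection probabilities, whose single-component marginal density in dimension n is (n − 1)(1 − p)^{n−2}.

      import numpy as np

      rng = np.random.default_rng(0)

      def haar_state(n):
          """Column of a Haar-random unitary via QR of a complex Gaussian matrix."""
          z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
          q, r = np.linalg.qr(z)
          d = np.diagonal(r)
          q = q * (d / np.abs(d))   # fix the phase convention for Haar measure
          return q[:, 0]

      # Projection probabilities |<k|psi>|^2 for many random states.
      n = 8
      probs = np.array([np.abs(haar_state(n)) ** 2 for _ in range(5_000)])
      print("mean of each probability ~ 1/n:", probs.mean(), 1 / n)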

  15. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.

  16. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several graphical tests based on normal probability plots and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
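
    The idea can be prototyped with a Monte Carlo envelope (a rough sketch only: these are pointwise intervals, whereas the paper constructs simultaneous 1-α intervals, so the joint coverage below is lower than 1-α):

      import numpy as np

      rng = np.random.default_rng(0)
      n, sims, alpha = 25, 10_000, 0.05

      # Monte Carlo envelope for order statistics of standardized normal samples.
      order_stats = np.sort(rng.standard_normal((sims, n)), axis=1)
      lo = np.quantile(order_stats, alpha / 2, axis=0)
      hi = np.quantile(order_stats, 1 - alpha / 2, axis=0)

      # Standardize one observed sample and compare its order statistics to
      # the envelope; points falling outside suggest non-normality.
      x = rng.standard_normal(n)
      z = np.sort((x - x.mean()) / x.std(ddof=1))
      print("points outside envelope:", int(((z < lo) | (z > hi)).sum()))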

  17. Children's understanding of posterior probability.

    PubMed

    Girotto, Vittorio; Gonzalez, Michel

    2008-01-01

    Do young children have a basic intuition of posterior probability? Do they update their decisions and judgments in the light of new evidence? We hypothesized that they can do so extensionally, by considering and counting the various ways in which an event may or may not occur. The results reported in this paper showed that from the age of five, children's decisions under uncertainty (Study 1) and judgments about random outcomes (Study 2) are correctly affected by posterior information. From the same age, children correctly revise their decisions in situations in which they face a single, uncertain event, produced by an intentional agent (Study 3). The finding that young children have some understanding of posterior probability supports the theory of naive extensional reasoning, and contravenes some pessimistic views of probabilistic reasoning, in particular the evolutionary claim that the human mind cannot deal with single-case probability. PMID:17391661

  18. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  19. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  20. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  1. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  2. Some Surprising Probabilities from Bingo.

    ERIC Educational Resources Information Center

    Mercer, Joseph O.

    1993-01-01

    Investigates the probability of winning the largest prize at Bingo through a series of five simpler problems. Investigations are conducted with the aid of either BASIC computer programs, spreadsheets, or a computer algebra system such as Mathematica. Provides sample data tables to illustrate findings. (MDH)

  3. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-and-one foul shot situation in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)
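
    The foul shot simulation is a few lines in Python (a sketch of the activity's logic, not the authors' classroom materials): a second free throw is attempted only if the first is made, which is what makes the score distribution surprising to students.

      import random

      def one_and_one(p, trials=100_000):
          """Simulate the one-and-one: a bonus free throw only if the first is
          made. Returns estimated probabilities of scoring 0, 1, and 2 points."""
          counts = [0, 0, 0]
          for _ in range(trials):
              points = 0
              if random.random() < p:          # first shot made
                  points = 1
                  if random.random() < p:      # bonus shot made
                      points = 2
              counts[points] += 1
          return [c / trials for c in counts]

      # A 60% shooter most often scores 0: P(0)=0.40, P(1)=0.24, P(2)=0.36.
      print(one_and_one(0.6))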

  4. Comments on quantum probability theory.

    PubMed

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  5. Understanding Y haplotype matching probability.

    PubMed

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  6. Knot probabilities in random diagrams

    NASA Astrophysics Data System (ADS)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.

  7. Probability distributions for multimeric systems.

    PubMed

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous, and the probability density functions (pdfs) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left and the right hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
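
    For orientation, the Gillespie algorithm used above as the benchmark can be sketched on a minimal birth-death system (a generic illustration with assumed rates, not the paper's multimeric network or its MSND machinery):

      import random

      def gillespie_birth_death(k_birth, k_death, x0, t_end, seed=2):
          """Exact stochastic simulation of X -> X+1 (rate k_birth) and
          X -> X-1 (rate k_death * X)."""
          random.seed(seed)
          t, x, path = 0.0, x0, []
          while t < t_end:
              rates = [k_birth, k_death * x]
              total = sum(rates)
              t += random.expovariate(total)          # time to next reaction
              if random.random() < rates[0] / total:  # which reaction fires?
                  x += 1
              else:
                  x -= 1
              path.append((t, x))
          return path

      # The stationary distribution is Poisson(k_birth / k_death), mean ~20.
      path = gillespie_birth_death(k_birth=10.0, k_death=0.5, x0=0, t_end=200.0)
      print("final copy number:", path[-1][1])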

  8. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to carry out a basic introduction to the analysis and applications of probabilistic concepts to the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the purpose of making these ideas easier to understand and to apply.

  9. Objective Probability and Quantum Fuzziness

    NASA Astrophysics Data System (ADS)

    Mohrhoff, U.

    2009-02-01

    This paper offers a critique of the Bayesian interpretation of quantum mechanics with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the “objective preparations view” or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes. Most important among these are the objective fuzziness of all relative positions and momenta and the consequent incomplete spatiotemporal differentiation of the physical world. The latter makes it possible to draw a clear distinction between the macroscopic and the microscopic. This in turn makes it possible to understand the special status of measurements in all standard formulations of the theory. Whereas Bayesians have written contemptuously about the “folly” of conjoining “objective” to “probability,” there are various reasons why quantum-mechanical probabilities can be considered objective, not least the fact that they are needed to quantify an objective fuzziness. But this cannot be appreciated without giving thought to the makeup of the world, which

  10. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  11. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew R.; Piro, Anthony; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we investigate the probability that a star will make a BH as a function of its ZAMS mass. Although the shape of the black hole formation probability function is poorly constrained by current measurements, we believe that this framework is an important new step toward better understanding BH formation. We also consider some of the implications of this probability distribution, from its impact on the chemical enrichment from massive stars, to its connection with the structure of the core at the time of collapse, to the birth kicks that black holes receive. A probabilistic description of BH formation will be a useful input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  12. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  13. Persistence probabilities for stream populations.

    PubMed

    Samia, Yasmine; Lutscher, Frithjof

    2012-07-01

    Individuals in streams and rivers are constantly at risk of being washed downstream and thereby lost to their population. The possibility of diffusion-mediated persistence of populations in advective environments has been the focus of a multitude of recent modeling efforts. Most of these recent models are deterministic, and they predict the existence of a critical advection velocity, above which a population cannot persist. In this work, we present a stochastic approach to the persistence problem in streams and rivers. We use the dominant eigenvalue of the advection-diffusion operator to transition from a spatially explicit description to a spatially implicit birth-death process, in which individual washout from the domain appears as an additional death term. We find that the deterministic persistence threshold is replaced by a smooth transition from almost sure persistence to extinction as advection velocity increases. More interestingly, we explore how temporal variation in flow rate and other parameters affect the persistence probability. In line with general expectations, we find that temporal variation often decreases the persistence probability, and we focus on a few examples of how variation can increase population persistence.
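
    The passage from a spatially explicit model to a birth-death process with washout folded into mortality can be sketched as follows (rates and initial population are illustrative, not the paper's parameterization): for per-capita birth rate b and combined per-capita loss rate d + w, branching-process theory gives persistence probability 1 − ((d + w)/b)^{x0} when b > d + w, and the simulated probability varies smoothly rather than dropping at a sharp threshold.

      import random

      def persistence_probability(b, d, w, x0, t_end=200.0, runs=2_000, seed=3):
          """Birth-death process with washout as an extra death term:
          birth rate b*x, death rate d*x, washout rate w*x."""
          random.seed(seed)
          persisted = 0
          for _ in range(runs):
              t, x = 0.0, x0
              while 0 < x and t < t_end:
                  total = (b + d + w) * x
                  t += random.expovariate(total)
                  x += 1 if random.random() < b / (b + d + w) else -1
              persisted += x > 0
          return persisted / runs

      # Theory: persistence ~ 1 - ((d + w) / b)**x0 when b > d + w.
      print(persistence_probability(b=1.0, d=0.4, w=0.3, x0=5))  # ~0.83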

  14. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  15. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = √10 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history’s ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values.

  16. Lévy laws in free probability

    PubMed Central

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes. PMID:12473744

  17. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  18. Imprecise probability for non-commuting observables

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.

    2015-08-01

    It is known that non-commuting observables in quantum mechanics do not have a joint probability. This statement refers to the precise (additive) probability model. I show that the joint distribution of any non-commuting pair of variables can be quantified via upper and lower probabilities, i.e. the joint probability is described by an interval instead of a number (imprecise probability). I propose transparent axioms from which the upper and lower probability operators follow. The imprecise probability depends on the non-commuting observables, is linear over the state (density matrix) and reverts to the usual expression for commuting observables.

  19. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross

  20. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  1. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  2. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  3. Lévy processes in free probability

    PubMed Central

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability. PMID:12473745

  4. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  5. Pre-Service Teachers' Conceptions of Probability

    ERIC Educational Resources Information Center

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  6. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  7. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  8. Illustrating Basic Probability Calculations Using "Craps"

    ERIC Educational Resources Information Center

    Johnson, Roger W.

    2006-01-01

    Instructors may use the gambling game of craps to illustrate the use of a number of fundamental probability identities. For the "pass-line" bet we focus on the chance of winning and the expected game length. To compute these, probabilities of unions of disjoint events, probabilities of intersections of independent events, conditional probabilities…
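
    The pass-line calculation the abstract refers to can be done exactly with these identities; a short sketch using standard craps rules and exact rational arithmetic:

        from fractions import Fraction

        def roll_prob(total):
            # Probability that two fair dice sum to the given total.
            ways = sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == total)
            return Fraction(ways, 36)

        p_win = roll_prob(7) + roll_prob(11)             # immediate win on the come-out roll
        for point in (4, 5, 6, 8, 9, 10):
            p_point = roll_prob(point)
            # Given a point, the game ends on the first roll equal to the point or to 7:
            # P(make the point) = P(point) / (P(point) + P(7)).
            p_win += p_point * p_point / (p_point + roll_prob(7))

        print(p_win, float(p_win))                       # 244/495 ≈ 0.4929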

  9. Subjective and objective probabilities in quantum mechanics

    SciTech Connect

    Srednicki, Mark

    2005-05-15

    We discuss how the apparently objective probabilities predicted by quantum mechanics can be treated in the framework of Bayesian probability theory, in which all probabilities are subjective. Our results are in accord with earlier work by Caves, Fuchs, and Schack, but our approach and emphasis are different. We also discuss the problem of choosing a noninformative prior for a density matrix.

  10. Calibrating Subjective Probabilities Using Hierarchical Bayesian Models

    NASA Astrophysics Data System (ADS)

    Merkle, Edgar C.

    A body of psychological research has examined the correspondence between a judge's subjective probability of an event's outcome and the event's actual outcome. The research generally shows that subjective probabilities are noisy and do not match the "true" probabilities. However, subjective probabilities are still useful for forecasting purposes if they bear some relationship to true probabilities. The purpose of the current research is to exploit relationships between subjective probabilities and outcomes to create improved, model-based probabilities for forecasting. Once the model has been trained in situations where the outcome is known, it can then be used in forecasting situations where the outcome is unknown. These concepts are demonstrated using experimental psychology data, and potential applications are discussed.

  11. The uncertainty in earthquake conditional probabilities

    USGS Publications Warehouse

    Savage, J.C.

    1992-01-01

    The Working Group on California Earthquake Probabilities (WGCEP) questioned the relevance of uncertainty intervals assigned to earthquake conditional probabilities on the basis that the uncertainty in the probability estimate seemed to be greater the smaller the intrinsic breadth of the recurrence-interval distribution. It is shown here that this paradox depends upon a faulty measure of uncertainty in the conditional probability and that with a proper measure of uncertainty no paradox exists. The assertion that the WGCEP probability assessment in 1988 correctly forecast the 1989 Loma Prieta earthquake is also challenged by showing that posterior probability of rupture inferred after the occurrence of the earthquake from the prior WGCEP probability distribution reverts to a nearly informationless distribution. -Author
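
    For readers unfamiliar with the quantity under discussion, a generic earthquake conditional probability for a lognormal recurrence model looks like the following; the mean recurrence and shape values are illustrative assumptions, not WGCEP's.

        import numpy as np
        from scipy.stats import lognorm

        sigma = 0.5                                      # lognormal shape parameter, assumed
        mean_recurrence = 150.0                          # mean recurrence interval (years), assumed
        scale = mean_recurrence / np.exp(sigma**2 / 2)   # chosen so the distribution mean is 150 yr
        dist = lognorm(s=sigma, scale=scale)

        t, dt = 120.0, 30.0                              # years since last event; forecast window
        # P(T <= t + dt | T > t): rupture within the window given survival so far.
        p_cond = (dist.cdf(t + dt) - dist.cdf(t)) / (1.0 - dist.cdf(t))
        print(f"30-yr conditional probability: {p_cond:.3f}")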

  12. Integrated statistical modelling of spatial landslide probability

    NASA Astrophysics Data System (ADS)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
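
    The combination rule of step (v) is compact enough to state as a per-pixel sketch; the array values are invented, and the variable names are mine rather than the authors'.

        import numpy as np

        p_release = np.array([[0.02, 0.10], [0.00, 0.25]])   # pixel release probability, assumed
        p_impact  = np.array([[0.60, 0.30], [0.80, 0.05]])   # angle-of-reach impact probability, assumed
        p_zonal   = np.array([[0.40, 0.40], [0.70, 0.70]])   # zonal release probability, assumed

        # Integrated probability: max(release, impact * zonal release) per pixel.
        p_integrated = np.maximum(p_release, p_impact * p_zonal)
        print(p_integrated)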

  13. Bell Could Become the Copernicus of Probability

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.

  14. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  15. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  16. Entropy analysis of systems exhibiting negative probabilities

    NASA Astrophysics Data System (ADS)

    Tenreiro Machado, J. A.

    2016-07-01

    This paper addresses the concept of negative probability and its impact upon entropy. An analogy between the probability generating functions, in the scope of quasiprobability distributions, and the Grünwald-Letnikov definition of fractional derivatives, is explored. Two distinct cases producing negative probabilities are formulated and their distinct meanings clarified. Numerical calculations using the Shannon entropy further characterize the two limit cases.

  17. Calculating the CEP (Circular Error Probable)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report compares the probability contained in the Circular Error Probable (CEP) associated with an Elliptical Error Probable (EEP) to that of the EEP at a given confidence level. The levels examined are 50 percent and 95 percent. The CEP is found to be both more conservative and less conservative than the associated EEP, depending on the eccentricity of the ellipse. The formulas used are derived in the appendix.
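
    A quick way to see how the CEP and EEP relate for a given eccentricity is by simulation; the standard deviations below are an arbitrary example, and the 50% ellipse is taken from the chi-square radius for two dimensions.

        import numpy as np

        rng = np.random.default_rng(1)
        sx, sy = 3.0, 1.0                                # assumed error standard deviations
        x = rng.normal(0.0, sx, 1_000_000)
        y = rng.normal(0.0, sy, 1_000_000)
        r = np.hypot(x, y)

        cep = np.quantile(r, 0.5)                        # radius of the true 50% circle
        # 50% ellipse via the Mahalanobis radius: chi2 quantile for 2 dof at 0.5 is 2 ln 2.
        k50 = 2.0 * np.log(2.0)
        # Circle with the same area as the 50% ellipse (one common CEP surrogate):
        r_eq = np.sqrt(sx * sy * k50)

        print(f"true CEP radius: {cep:.3f}, equal-area radius: {r_eq:.3f}")
        print(f"probability inside equal-area circle: {(r <= r_eq).mean():.3f}")  # differs from 0.50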

  18. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
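
    The reconstructed weighting function is easy to evaluate; a small check (alpha = 0.65 is an arbitrary example value) confirms the fixed point at 1/e noted above.

        import numpy as np

        def prelec_w(p, alpha=0.65):
            # Prelec (1998): w(p) = exp(-(-ln p)^alpha), for 0 < p <= 1.
            p = np.asarray(p, dtype=float)
            return np.exp(-((-np.log(p)) ** alpha))

        ps = np.array([0.01, 0.1, 1 / np.e, 0.5, 0.9, 0.99])
        print(np.round(prelec_w(ps), 4))                 # w(1/e) = 1/e for every alpha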

  19. Is probability of frequency too narrow?

    SciTech Connect

    Martz, H.F.

    1993-10-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed.

  20. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  1. Probability: A Matter of Life and Death

    ERIC Educational Resources Information Center

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  2. Average Transmission Probability of a Random Stack

    ERIC Educational Resources Information Center

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  3. Teaching Probability: A Socio-Constructivist Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  4. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  5. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computed to no less than three significant digits. Probabilities will be truncated to the number of significant digits used in a particular lottery. (b) Divide the total number of applicants into 1.00 to... than .40, then multiply each such intermediate probability by the ratio of .40 to such sum. Divide...

  6. Correlation as Probability of Common Descent.

    ERIC Educational Resources Information Center

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…

  7. Phonotactic Probabilities in Young Children's Speech Production

    ERIC Educational Resources Information Center

    Zamuner, Tania S.; Gerken, Louann; Hammond, Michael

    2004-01-01

    This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…

  8. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  9. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  10. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  11. Teenagers' Perceived and Actual Probabilities of Pregnancy.

    ERIC Educational Resources Information Center

    Namerow, Pearila Brickner; And Others

    1987-01-01

    Explored adolescent females' (N=425) actual and perceived probabilities of pregnancy. Subjects estimated their likelihood of becoming pregnant the last time they had intercourse, and indicated the dates of last intercourse and last menstrual period. Found that the distributions of perceived probability of pregnancy were nearly identical for both…

  12. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  13. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  14. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a…

  15. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  16. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.

    1993-01-01

    Case studies where time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.

  17. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  18. Quantum probability assignment limited by relativistic causality

    PubMed Central

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein, but were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state has entanglement, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether such a change in the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  19. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  20. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
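
    In its simplest form the Sagan-Coleman style calculation multiplies expected releases by growth probabilities and sums over mechanisms; the three release mechanisms below mirror the abstract, but every number is an invented placeholder.

        import math

        n_release = {"hard_landing": 1e4, "erosion": 1e2, "nominal_venting": 1e1}     # expected microbes released, assumed
        p_growth  = {"hard_landing": 1e-8, "erosion": 1e-7, "nominal_venting": 1e-7}  # growth probability per microbe, assumed

        # Treating growth events as Poisson: P(contamination) = 1 - exp(-sum_i N_i * Pg_i),
        # which reduces to sum_i N_i * Pg_i when the product is small.
        lam = sum(n_release[m] * p_growth[m] for m in n_release)
        print(f"P(contamination) ~ {1 - math.exp(-lam):.2e}")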

  1. Survival probability in patients with liver trauma.

    PubMed

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.

  2. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities reach 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  3. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  5. The Animism Controversy Revisited: A Probability Analysis

    ERIC Educational Resources Information Center

    Smeets, Paul M.

    1973-01-01

    Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)

  6. Classical and Quantum Spreading of Position Probability

    ERIC Educational Resources Information Center

    Farina, J. E. G.

    1977-01-01

    Demonstrates that the standard deviation of the position probability of a particle moving freely in one dimension is a function of the standard deviation of its velocity distribution and time in classical or quantum mechanics. (SL)

  7. Inclusion probability with dropout: an operational formula.

    PubMed

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
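
    The classic no-dropout version of the calculation, which the paper generalizes, is short enough to sketch; the allele frequencies here are invented, and the dropout-aware formula itself is not reproduced.

        # Per locus, a random person is "not excluded" if both alleles are in the
        # visible set: PI_locus = (sum of visible-allele frequencies)^2; loci multiply.
        visible = {
            "D3S1358": {"14": 0.10, "15": 0.26, "16": 0.25},   # hypothetical frequencies
            "vWA":     {"16": 0.20, "17": 0.28},
        }

        pi_total = 1.0
        for locus, freqs in visible.items():
            pi_locus = sum(freqs.values()) ** 2
            pi_total *= pi_locus
            print(f"{locus}: PI = {pi_locus:.4f}")
        print(f"combined PI = {pi_total:.4f}")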

  8. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed.

  9. The cognitive substrate of subjective probability.

    PubMed

    Nilsson, Håkan; Olsson, Henrik; Juslin, Peter

    2005-07-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as prototype similarity, relative likelihood, or evidential support accumulation (ESAM; D. J. Koehler, C. M. White, & R. Grondin, 2003); cue-based relative frequency; and exemplar memory, implemented by probabilities from exemplars (PROBEX; P. Juslin & M. Persson, 2002). Three experiments with different task structures consistently demonstrate that exemplar memory is the best account of the data whereas the results are inconsistent with extant formulations of the representativeness heuristic and cue-based relative frequency. PMID:16060768

  10. Rare Gases Transition Probabilities for Plasma Diagnostics

    SciTech Connect

    Katsonis, K.; Siskos, A.; Ndiaye, A.; Clark, R. E. H.; Cornille, M.; Abdallah, J. Jr.

    2006-01-15

    Evaluation of Ar and Xe transition probabilities to be used in Collisional-Radiative models for plasma diagnostics is addressed. Partial results are given for the typical case of the 4p <- 4d Ar III multiplet.

  11. Teaching Elementary Probability Through its History.

    ERIC Educational Resources Information Center

    Kunoff, Sharon; Pines, Sylvia

    1986-01-01

    Historical problems are presented which can readily be solved by students once some elementary probability concepts are developed. The Duke of Tuscany's Problem; the problem of points; and the question of proportions, divination, and Bertrand's Paradox are included. (MNS)

  12. Determining Probabilities by Examining Underlying Structure.

    ERIC Educational Resources Information Center

    Norton, Robert M.

    2001-01-01

    Discusses how dice games pose fairness issues that appeal to students and examines a structure for three games involving two dice in a way that leads directly to the theoretical probabilities for all possible outcomes. (YDS)

  13. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.
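
    The note's card-based walk is not reproduced here, but the "method of recurrent relations" it uses can be illustrated on a standard absorbing walk, where solving the linear system recovers the known closed form 1 - i/N.

        import numpy as np

        # p[i] = probability of absorption at 0 starting from i, for a symmetric
        # walk on 0..N: p[i] = (p[i-1] + p[i+1]) / 2 with p[0] = 1, p[N] = 0.
        N = 10
        A = np.zeros((N + 1, N + 1))
        b = np.zeros(N + 1)
        A[0, 0], b[0] = 1.0, 1.0
        A[N, N], b[N] = 1.0, 0.0
        for i in range(1, N):
            A[i, i] = 1.0
            A[i, i - 1] = A[i, i + 1] = -0.5

        p = np.linalg.solve(A, b)
        print(np.round(p, 3))                            # matches 1 - i/N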

  14. Non-Gaussian Photon Probability Distribution

    NASA Astrophysics Data System (ADS)

    Solomon, Benjamin T.

    2010-01-01

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma (mΓ) distribution (whose parameters are α = r, β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints to quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of the photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub wavelength confinement; thereby providing a strong case that

  15. Probability distribution of the vacuum energy density

    SciTech Connect

    Duplancic, Goran; Stefancic, Hrvoje; Glavan, Drazen

    2010-12-15

    As the vacuum state of a quantum field is not an eigenstate of the Hamiltonian density, the vacuum energy density can be represented as a random variable. We present an analytical calculation of the probability distribution of the vacuum energy density for real and complex massless scalar fields in Minkowski space. The obtained probability distributions are broad and the vacuum expectation value of the Hamiltonian density is not fully representative of the vacuum energy density.

  16. Robust satisficing and the probability of survival

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2014-01-01

    Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.

  17. When probability trees don't work

    NASA Astrophysics Data System (ADS)

    Chan, K. C.; Lenard, C. T.; Mills, T. M.

    2016-08-01

    Tree diagrams arise naturally in courses on probability at high school or university, even at an elementary level. Often they are used to depict outcomes and associated probabilities from a sequence of games. A subtle issue is whether or not the Markov condition holds in the sequence of games. We present two examples that illustrate the importance of this issue. Suggestions as to how these examples may be used in a classroom are offered.

  18. The spline probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Sithiravel, Rajiv; Tharmarasa, Ratnasingham; McDonald, Mike; Pelletier, Michel; Kirubarajan, Thiagalingam

    2012-06-01

    The Probability Hypothesis Density Filter (PHD) is a multitarget tracker for recursively estimating the number of targets and their state vectors from a set of observations. The PHD filter is capable of working well in scenarios with false alarms and missed detections. Two distinct PHD filter implementations are available in the literature: the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) and the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filters. The SMC-PHD filter uses particles to provide target state estimates, which can lead to a high computational load, whereas the GM-PHD filter does not use particles, but restricts to linear Gaussian mixture models. The SMC-PHD filter technique provides only weighted samples at discrete points in the state space instead of a continuous estimate of the probability density function of the system state and thus suffers from the well-known degeneracy problem. This paper proposes a B-Spline based Probability Hypothesis Density (S-PHD) filter, which has the capability to model any arbitrary probability density function. The resulting algorithm can handle linear, non-linear, Gaussian, and non-Gaussian models and the S-PHD filter can also provide continuous estimates of the probability density function of the system state. In addition, by moving the knots dynamically, the S-PHD filter ensures that the splines cover only the region where the probability of the system state is significant, hence the high efficiency of the S-PHD filter is maintained at all times. Also, unlike the SMC-PHD filter, the S-PHD filter is immune to the degeneracy problem due to its continuous nature. The S-PHD filter derivations and simulations are provided in this paper.

  19. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
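
    A one-site sketch of the likelihood structure described above: occupancy probability psi, detection probability mixed over a Beta distribution (one of the mixture classes considered), integrated in closed form as a beta-binomial. The parameter values are arbitrary examples.

        import numpy as np
        from math import comb
        from scipy.special import betaln

        def beta_binom_pmf(y, J, a, b):
            # P(y detections in J visits) with p ~ Beta(a, b) integrated out.
            return comb(J, y) * np.exp(betaln(y + a, J - y + b) - betaln(a, b))

        def site_likelihood(y, J, psi, a, b):
            like = psi * beta_binom_pmf(y, J, a, b)
            if y == 0:                                   # zero inflation: the site may be unoccupied
                like += 1.0 - psi
            return like

        print(site_likelihood(y=2, J=5, psi=0.6, a=2.0, b=3.0))
        print(site_likelihood(y=0, J=5, psi=0.6, a=2.0, b=3.0))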

  20. Familiarity and preference for pitch probability profiles.

    PubMed

    Cui, Anja-Xiaoxing; Collett, Meghan J; Troje, Niko F; Cuddy, Lola L

    2015-05-01

    We investigated familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, we asked participants either to identify the more familiar or to identify the preferred tone sequence in a two-alternative forced-choice task. The task paired a tone sequence generated from the pitch probability profile they had been exposed to and a tone sequence generated from another pitch probability profile at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile which they had been exposed to. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference to be consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile which they had been exposed to suggests that statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge. PMID:25838257

  1. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potential high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. So constructing a method for estimating how the collision probability will evolve improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process where the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a
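
    A minimal Monte Carlo collision-probability estimate of the kind that could feed such a forecast: sample the relative position at closest approach from the combined uncertainty and count trials inside the combined hard-body radius. Every number below is an invented placeholder.

        import numpy as np

        rng = np.random.default_rng(3)
        miss = np.array([120.0, 40.0, 0.0])              # nominal miss vector (m), assumed
        cov = np.diag([80.0, 40.0, 20.0]) ** 2           # combined position covariance (m^2), assumed
        hard_body_radius = 20.0                          # combined object radius (m), assumed

        samples = rng.multivariate_normal(miss, cov, size=1_000_000)
        hits = np.linalg.norm(samples, axis=1) < hard_body_radius
        print(f"collision probability ~ {hits.mean():.2e}")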

  2. Tsunami probability in the Caribbean Region

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2008-01-01

    We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30% regionally. © Birkhäuser 2008.
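
    Under the Poissonian model used for these maps, the per-cell probability follows from the mean runup rate alone; the rate below is an invented example.

        import math

        lam = 1.0 / 250.0                                # mean rate of >0.5 m runups per year, assumed
        T = 30.0                                         # exposure window (years)
        # Probability of at least one qualifying runup in T years:
        print(f"30-yr runup probability: {1 - math.exp(-lam * T):.3f}")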

  3. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
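
    One plausible reading of the counting method, sketched with invented parameters: treat the count of small events since the last large one as "natural time" and apply a Weibull law to it; the conditional probability for the next window then follows from the Weibull CDF.

        import math

        def weibull_cdf(n, k, lam):
            return 1.0 - math.exp(-((n / lam) ** k))

        k, lam = 1.4, 300.0                              # Weibull shape and scale (in small-event counts), assumed
        n, dn = 250, 50                                  # small events observed so far; look-ahead window

        p = (weibull_cdf(n + dn, k, lam) - weibull_cdf(n, k, lam)) / (1.0 - weibull_cdf(n, k, lam))
        print(f"P(large event within next {dn} small events): {p:.3f}")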

  4. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.

  5. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
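
    A sketch of the bounding-set idea under stated assumptions: if a simple set B is known to contain the failure set F, then P(F) = P(B) P(F | B), with P(B) analytic and P(F | B) estimated by sampling only inside B. The box limits and the toy failure criterion are invented, and the parameters are taken to be independent standard normals.

        import numpy as np
        from scipy.stats import norm, truncnorm

        rng = np.random.default_rng(2)
        lo = np.array([2.0, 2.0])                        # bounding box in parameter space, assumed
        hi = np.array([6.0, 6.0])
        p_box = float(np.prod(norm.cdf(hi) - norm.cdf(lo)))   # analytic P(B)

        # Conditional sampling: draw each coordinate from the normal truncated to B.
        n = 200_000
        samples = np.column_stack([
            truncnorm.rvs(lo[i], hi[i], loc=0.0, scale=1.0, size=n, random_state=rng)
            for i in range(2)
        ])
        fails = samples.sum(axis=1) > 5.5                # toy failure set, assumed to lie inside B
        print(f"estimated P(failure): {p_box * fails.mean():.2e}")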

  6. Classical and Quantum Probability for Biologists - Introduction

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2010-01-01

    The aim of this review (oriented to biologists looking for applications of QM) is to provide a detailed comparative analysis of classical (Kolmogorovian) and quantum (Dirac-von Neumann) models. We will stress differences in the definition of conditional probability and as a consequence in the structures of matrices of transition probabilities, especially the condition of double stochasticity which arises naturally in QM. One of the most fundamental differences between two models is deformation of the classical formula of total probability (FTP) which plays an important role in statistics and decision making. An additional term appears in the QM-version of FTP - so called interference term. Finally, we discuss Bell's inequality and show that the common viewpoint that its violation induces either nonlocality or "death of realism" has not been completely justified. For us it is merely a sign of non-Kolmogorovianity of probabilistic data collected in a few experiments with incompatible setups of measurement devices.
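    For a dichotomous variable A, the deformed FTP takes the following commonly cited form (our rendering of the standard formula; the angle θ_β parameterizes the interference term, and cos θ_β = 0 recovers the classical Kolmogorovian formula):

      P(B=\beta) \;=\; \sum_{\alpha} P(A=\alpha)\,P(B=\beta \mid A=\alpha)
      \;+\; 2\cos\theta_{\beta}\,\sqrt{\prod_{\alpha} P(A=\alpha)\,P(B=\beta \mid A=\alpha)}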

  7. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into amplitude peaks, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (pd) for fluctuating and non-fluctuating targets is deduced. A comparison of the pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is at least 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.
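    As a point of reference for the comparison with a pulse radar receiver, the classical single-pulse threshold detection calculation (a generic textbook computation, not the paper's EBPSK-specific derivation) can be written as:

      import numpy as np
      from scipy.stats import chi2, ncx2

      def detection_probability(snr_linear, pfa):
          # Square-law detector, nonfluctuating target in Gaussian noise:
          # the threshold is set by the false-alarm rate, and the detection
          # statistic under signal-plus-noise is noncentral chi-square.
          threshold = chi2.ppf(1.0 - pfa, df=2)
          return 1.0 - ncx2.cdf(threshold, df=2, nc=2.0 * snr_linear)

      snr_db = 10.0
      print(detection_probability(10 ** (snr_db / 10.0), pfa=1e-6))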

  8. Transition Probabilities for Hydrogen-Like Atoms

    NASA Astrophysics Data System (ADS)

    Jitrik, Oliverio; Bunge, Carlos F.

    2004-12-01

    E1, M1, E2, M2, E3, and M3 transition probabilities for hydrogen-like atoms are calculated with point-nucleus Dirac eigenfunctions for Z = 1-118 and up to large quantum numbers l = 25 and n = 26, increasing existing data more than a thousandfold. A critical evaluation of the accuracy shows higher reliability than previous works. Tables for hydrogen containing a subset of the results are given explicitly, listing the states involved in each transition, wavelength, term energies, statistical weights, transition probabilities, oscillator strengths, and line strengths. The complete results, including 1 863 574 distinct transition probabilities, lifetimes, and branching fractions, are available at http://www.fisica.unam.mx/research/tables/spectra/1el

  9. Independent events in elementary probability theory

    NASA Astrophysics Data System (ADS)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1, E_2, …, E_n are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E_1, E_2, …, E_n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
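    A quick numerical check of the statement (our illustration with hypothetical marginal probabilities, not from the article): with E_1, E_2, E_3 jointly independent, A = E_1 ∪ E_2 and B = complement of E_3 are built from disjoint subsets, so P(A ∩ B) should equal P(A)P(B):

      import random

      random.seed(1)
      p = {1: 0.3, 2: 0.5, 3: 0.6}   # hypothetical marginals for E1, E2, E3
      N = 200_000
      n_A = n_B = n_AB = 0
      for _ in range(N):
          e = {i: random.random() < p[i] for i in p}  # independent draws
          A = e[1] or e[2]        # A = E1 union E2
          B = not e[3]            # B = complement of E3
          n_A += A
          n_B += B
          n_AB += A and B
      # Both printed values should be close to 0.65 * 0.4 = 0.26.
      print(n_AB / N, (n_A / N) * (n_B / N))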

  10. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice $V_n^0 = \{(x,y) \in \mathbb{Z}^2 : x+y \text{ even},\ 0 \le y \le n,\ -y \le x \le y\}$. We calculate the local percolation probability $P_n^l$, defined as the connection probability between the origin and a site $(0,n)$. The critical behavior of $P_\infty^l$ is clearly different from that of the global percolation probability $P_\infty^g$, characterized by a critical exponent $\beta^g$. An analysis based on the Padé approximants shows $\beta^l = 2\beta^g$. In addition, we find that the series expansion of $P_{2n}^l$ can be expressed as a function of $P_n^g$.

  11. Sampling Quantum Nonlocal Correlations with High Probability

    NASA Astrophysics Data System (ADS)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form $\gamma = (\langle u_i, v_j \rangle)_{i,j=1}^n$, where the vectors $u_i$ and $v_j$ are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of $\alpha = m/n$, where the vectors are sampled according to the Haar measure on the unit sphere of $\mathbb{R}^m$. In particular, we prove the existence of an $\alpha_0 > 0$ such that if $\alpha \le \alpha_0$, $\gamma$ is nonlocal with probability tending to 1 as $n \to \infty$, while for $\alpha > 2$, $\gamma$ is local with probability tending to 1 as $n \to \infty$.

  12. Match probabilities in racially admixed populations.

    PubMed

    Lange, K

    1993-02-01

    The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors.

  13. Genotypic probabilities for pairs of inbred relatives.

    PubMed

    Liu, Wenlei; Weir, B S

    2005-07-29

    Expressions for the joint genotypic probabilities of two related individuals are used in many population and quantitative genetic analyses. These expressions, resting on a set of 15 probabilities of patterns of identity by descent among the four alleles at a locus carried by the relatives, are generally well known. There has been recent interest in special cases where the two individuals are both related and inbred, although there have been differences among published results. Here, we return to the original 15-probability treatment and show appropriate reductions for relatives when they are drawn from a population that itself is inbred or when the relatives have parents who are related. These results have application in affected-relative tests for linkage, and in methods for interpreting forensic genetic profiles.

  14. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property, known for two-qubit states in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering-detection inequalities are based on correlation functions, we introduce analogs of such functions for single-qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and in the single qudit is presented in an integral form, with an intertwining kernel calculated explicitly in tomographic probability terms.

  15. Conditional Probabilities and Collapse in Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as would be obtained using the postulate of collapse.

  16. Survival probability for the stadium billiard

    NASA Astrophysics Data System (ADS)

    Dettmann, Carl P.; Georgiou, Orestis

    2009-12-01

    We consider the open stadium billiard, consisting of two semicircles joined by parallel straight sides, with one hole situated somewhere on one of the sides. Due to the hyperbolic nature of the stadium billiard, the initial decay of trajectories, due to loss through the hole, appears exponential. However, some trajectories (bouncing-ball orbits) persist and survive for long times and therefore form the main contribution to the survival probability function at long times. Using both numerical and analytical methods, we concur with previous studies that the long-time survival probability for a reasonably small hole drops like Constant × (1/t); here we obtain an explicit expression for the Constant.

  17. Does Probability Interference Exist In Social Science?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.; Haven, Emmanuel

    2007-02-01

    In this paper we discuss the rationale why sub(super)-additive probabilities in a psychological setting could be explained via the use of quantum probability interference. We propose to measure the complementarity of two variables: (i) the time taken by experiment participants to process (non-moving) images, and (ii) the ability of experiment participants to recognize deformations of (non-moving) pictures. We argue why we cannot find this complementarity using the Heisenberg Uncertainty Principle. The paper provides the details of the experimental setup to test this complementarity.

  18. Random walks with similar transition probabilities

    NASA Astrophysics Data System (ADS)

    Schiefermayr, Klaus

    2003-04-01

    We consider random walks on the nonnegative integers with a possible absorbing state at -1. A random walk is called $\alpha$-similar to a random walk if there exist constants $C_{ij}$ such that the corresponding n-step transition probabilities satisfy $\tilde{P}^{(n)}_{ij} = \alpha^{-n} C_{ij} P^{(n)}_{ij}$, $i,j \ge 0$. We give necessary and sufficient conditions for the $\alpha$-similarity of two random walks, both in terms of the parameters and in terms of the corresponding spectral measures which appear in the spectral representation of the n-step transition probabilities developed by Karlin and McGregor.

  19. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  20. Nonstationary envelope process and first excursion probability.

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1972-01-01

    The definition of stationary random envelope proposed by Cramer and Leadbetter is extended to the envelope of a nonstationary random process possessing an evolutionary power spectral density. The density function, the joint density function, the moment function, and the level-crossing rate of the nonstationary envelope process are derived. Based on the envelope statistics, approximate solutions to the first excursion probability of nonstationary random processes are obtained. In particular, applications of the first excursion probability to earthquake engineering problems are demonstrated in detail.

  1. Probabilities for separating sets of order statistics.

    PubMed

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different, distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.

  2. Electric quadrupole transition probabilities for atomic lithium

    SciTech Connect

    Çelik, Gültekin; Gökçe, Yasin; Yıldız, Murat

    2014-05-15

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  3. Non-Gaussian Photon Probability Distribution

    SciTech Connect

    Solomon, Benjamin T.

    2010-01-28

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma (mΓ) distribution (whose parameters are α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, the arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact P_i, the probabilistic function, and the ability to interact A_i, the electromagnetic function. Splitting the probability function P_i from the electromagnetic function A_i enables the investigation of photon behavior from a purely probabilistic P_i perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function P_i and the ability to interact A_i, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon P_i of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al.'s (2006) microwave cloaking, and Oulton et al.'s (2008) sub…

  4. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    SciTech Connect

    Vourdas, A.

    2014-08-15

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  5. Using a Fluorescent Cytosine Analogue tC° To Probe the Effect of the Y567 to Ala Substitution on the Preinsertion Steps of dNMP Incorporation by RB69 DNA Polymerase

    SciTech Connect

    Xia, Shuangluo; Beckman, Jeff; Wang, Jimin; Konigsberg, William H.

    2012-10-10

    Residues in the nascent base pair binding pocket (NBP) of bacteriophage RB69 DNA polymerase (RB69pol) are responsible for base discrimination. Replacing Tyr567 with Ala leads to greater flexibility in the NBP, increasing the probability of misincorporation. We used the fluorescent cytosine analogue, 1,3-diaza-2-oxophenoxazine (tC°), to identify preinsertion step(s) altered by NBP flexibility. When tC° is the templating base in a wild-type (wt) RB69pol ternary complex, its fluorescence is quenched only in the presence of dGTP. However, with the RB69pol Y567A mutant, the fluorescence of tC° is also quenched in the presence of dATP. We determined the crystal structure of the dATP/tC°-containing ternary complex of the RB69pol Y567A mutant at 1.9 Å resolution and found that the incoming dATP formed two hydrogen bonds with an imino-tautomerized form of tC°. Stabilization of the dATP/tC° base pair involved movement of the tC° backbone sugar into the DNA minor groove and required tilting of the tC° tricyclic ring to prevent a steric clash with L561. This structure, together with the pre-steady-state kinetic parameters and dNTP binding affinity, estimated from equilibrium fluorescence titrations, suggested that the flexibility of the NBP, provided by the Y567 to Ala substitution, led to a more favorable forward isomerization step, resulting in an increase in dNTP binding affinity.

  6. Technique for Evaluating Multiple Probability Occurrences /TEMPO/

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1970-01-01

    Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.

  7. The Smart Potential behind Probability Matching

    ERIC Educational Resources Information Center

    Gaissmaier, Wolfgang; Schooler, Lael J.

    2008-01-01

    Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…

  8. Assessing Schematic Knowledge of Introductory Probability Theory

    ERIC Educational Resources Information Center

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  9. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  10. Probability & Perception: The Representativeness Heuristic in Action

    ERIC Educational Resources Information Center

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  11. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  12. Probable Bright Supernovae discovered by PSST

    NASA Astrophysics Data System (ADS)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-01-01

    Three bright transients, which are probable supernovae, have been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  13. Probable Bright Supernova discovered by PSST

    NASA Astrophysics Data System (ADS)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-09-01

    A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  14. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  15. Confusion between Odds and Probability, a Pandemic?

    ERIC Educational Resources Information Center

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  16. Posterior Probabilities for a Consensus Ordering.

    ERIC Educational Resources Information Center

    Fligner, Michael A.; Verducci, Joseph S.

    1990-01-01

    The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)

  17. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
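    The "rote learning" baseline being improved upon amounts to counting; the following sketch (hypothetical two-node data of our own, not the car-insurance network from the paper) estimates one conditional probability table with add-one smoothing:

      from collections import Counter

      # (parent_value, child_value) observations for one node and its parent
      data = [("young", "yes"), ("young", "yes"), ("young", "no"),
              ("old", "no"), ("old", "no"), ("old", "yes")]
      child_values = ["yes", "no"]
      joint = Counter(data)
      parent = Counter(p for p, _ in data)

      def cpt(parent_value, child_value, alpha=1.0):
          # P(child | parent) with Laplace smoothing so unseen
          # combinations still receive nonzero probability
          return (joint[(parent_value, child_value)] + alpha) / \
                 (parent[parent_value] + alpha * len(child_values))

      print(cpt("young", "yes"))  # (2 + 1) / (3 + 2) = 0.6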

  18. Probability & Statistics: Modular Learning Exercises. Student Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  19. Quantum temporal probabilities in tunneling systems

    NASA Astrophysics Data System (ADS)

    Anastopoulos, Charis; Savvidou, Ntina

    2013-09-01

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines 'classical' time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems.

  20. Teaching Mathematics with Technology: Probability Simulations.

    ERIC Educational Resources Information Center

    Bright, George W.

    1989-01-01

    Discussed is the use of probability simulations in a mathematics classroom. Computer simulations using regular dice and special dice are described. Sample programs used to generate 100 rolls of a pair of dice in the BASIC and Logo languages are provided. (YP)
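    The article's BASIC and Logo listings are not reproduced here; a present-day equivalent of the 100-roll exercise might look like this:

      import random
      from collections import Counter

      random.seed(42)
      # Simulate 100 rolls of a pair of dice and tally the sums.
      sums = Counter(random.randint(1, 6) + random.randint(1, 6) for _ in range(100))
      for total in range(2, 13):
          print(f"{total:2d} {'#' * sums[total]}")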

  1. Conceptual Variation and Coordination in Probability Reasoning

    ERIC Educational Resources Information Center

    Nilsson, Per

    2009-01-01

    This study investigates students' conceptual variation and coordination among theoretical and experimental interpretations of probability. In the analysis we follow how Swedish students (12-13 years old) interact with a dice game, specifically designed to offer the students opportunities to elaborate on the logic of sample space,…

  2. Probability in Action: The Red Traffic Light

    ERIC Educational Resources Information Center

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  3. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1,…

  4. Large Deviations: Advanced Probability for Undergrads

    ERIC Educational Resources Information Center

    Rolls, David A.

    2007-01-01

    In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…
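    The centerpiece result such a course builds toward is Cramér's theorem, quoted here in its standard form: for i.i.d. random variables X_1, X_2, … with partial sums S_n, moment generating function M(t) = E e^{tX_1}, and a above the mean,

      P\!\left(\tfrac{S_n}{n} \ge a\right) = e^{-n I(a) + o(n)}, \qquad
      I(a) = \sup_{t \ge 0}\,\bigl(ta - \log M(t)\bigr).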

  5. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  6. Quantum temporal probabilities in tunneling systems

    SciTech Connect

    Anastopoulos, Charis; Savvidou, Ntina

    2013-09-15

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines ‘classical’ time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems. -- Highlights: •Present a general methodology for deriving temporal probabilities in tunneling systems. •Treatment applies to relativistic particles interacting through quantum fields. •Derive a new expression for tunneling time. •Identify new time parameters relevant to tunneling. •Propose a resolution of the superluminality paradox in tunneling.

  7. The albedo effect on neutron transmission probability.

    PubMed

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and voids in shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5), using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight and that after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel, we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
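    A stripped-down analog version of such a slab calculation (no vacuum channel, and none of the exponential biasing or statistical-weight corrections the study uses) can be sketched as follows:

      import random

      random.seed(0)
      THICKNESS, PS, N = 20.0, 0.9, 200_000  # depth in mfp, scattering prob., histories

      transmitted = 0
      for _ in range(N):
          x, mu = 0.0, 1.0  # depth in mean free paths, direction cosine
          while True:
              x += mu * random.expovariate(1.0)  # free flight ~ Exp(1) mfp
              if x >= THICKNESS:
                  transmitted += 1               # escaped through the right face
                  break
              if x < 0.0 or random.random() > PS:
                  break                          # leaked back out, or absorbed
              mu = random.uniform(-1.0, 1.0)     # isotropic scatter
      print(transmitted / N)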

  8. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    ERIC Educational Resources Information Center

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
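    A tiny numerical companion (hypothetical numbers) showing the computation the paper's truth-table derivations target:

      # Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)
      p_h = 0.01                    # prior probability of hypothesis H
      p_e_given_h = 0.95            # likelihood of evidence E under H
      p_e_given_not_h = 0.05        # false-positive rate
      p_e = p_e_given_h * p_h + p_e_given_not_h * (1.0 - p_h)  # total probability
      print(p_e_given_h * p_h / p_e)  # posterior, ~0.161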

  9. Killeen's Probability of Replication and Predictive Probabilities: How to Compute, Use, and Interpret Them

    ERIC Educational Resources Information Center

    Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques

    2010-01-01

    P. R. Killeen's (2005a) probability of replication ("p_rep") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p_rep" is now routinely reported in "Psychological Science" and has also begun to appear in other journals. However, there is…

  10. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    ERIC Educational Resources Information Center

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  11. You Say "Probable" and I Say "Likely": Improving Interpersonal Communication With Verbal Probability Phrases

    ERIC Educational Resources Information Center

    Karelitz, Tzur M.; Budescu, David V.

    2004-01-01

    When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…

  12. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    ERIC Educational Resources Information Center

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  13. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields, such as performance assessment for hazardous and/or radioactive waste disposal sites, that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on the consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  14. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented, and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
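    The resampling idea reduces to wrapping an inner Pc evaluation in an outer loop over perturbed inputs; a toy sketch follows (hypothetical encounter geometry, and a lognormal covariance-scale model that is purely an illustrative assumption, not the paper's method):

      import numpy as np

      rng = np.random.default_rng(7)
      miss = np.array([150.0, 80.0])      # nominal miss vector in the encounter plane (m)
      cov = np.diag([60.0**2, 40.0**2])   # nominal relative-position covariance (m^2)
      R = 20.0                            # combined hard-body radius (m)

      def pc(covariance, n=100_000):
          # Monte Carlo estimate of the 2D probability of collision
          pts = rng.multivariate_normal(miss, covariance, size=n)
          return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < R)

      # Outer loop: propagate covariance-scale uncertainty into a Pc density.
      scales = rng.lognormal(mean=0.0, sigma=0.3, size=200)
      pcs = np.array([pc(s * cov) for s in scales])
      print(pcs.mean(), np.percentile(pcs, [5, 95]))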

  15. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_{i1}, m_{i2}, …, m_{i10}) denote the credit ratings of the ten companies in the i-th quarter. The vector m_{i+1} in the next quarter is modelled as dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_{kl}(i,j) of getting m_{i+1,j} = l given that m_{i,j} = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_{kl}(i,j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
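    A generic empirical alternative (simple counting on synthetic ratings, rather than the paper's power-normal mixture model) illustrates what such a transition-probability estimate is:

      import numpy as np

      rng = np.random.default_rng(3)
      n_states = 5
      ratings = rng.integers(0, n_states, size=61)  # synthetic quarterly ratings, one company

      counts = np.zeros((n_states, n_states))
      for k, l in zip(ratings[:-1], ratings[1:]):
          counts[k, l] += 1                  # count quarter-to-quarter moves
      rows = counts.sum(axis=1, keepdims=True)
      P = np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
      print(np.round(P, 2))                  # row k: P(next = l | current = k)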

  16. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.

  17. A quantum probability perspective on borderline vagueness.

    PubMed

    Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter

    2013-10-01

    The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental finding and extends Alxatib's and Pelletier's () theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can be explained then as a quantum interference phenomenon. PMID:24039093

  18. Approximate probability distributions of the master equation

    NASA Astrophysics Data System (ADS)

    Thomas, Philipp; Grima, Ramon

    2015-07-01

    Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.

  19. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distribution exhibits power law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone the frequency of similar size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regretfully, most of the state-of-the-art theoretical approaches to assess the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately-designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on a choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  20. Cheating Probabilities on Multiple Choice Tests

    NASA Astrophysics Data System (ADS)

    Rizzuto, Gaspard T.; Walters, Fred

    1997-10-01

    This paper is strictly based on mathematical statistics and as such does not depend on prior performance; it assumes the probability of each choice to be identical. In a real-life situation, the probability of two students having identical responses becomes larger the better the students are. However, the mathematical model is developed for all responses, both correct and incorrect, and provides a baseline for evaluation. David Harpp and coworkers (2, 3) at McGill University have evaluated ratios of exact errors in common (EEIC) to errors in common (EIC) and differences (D). In pairings where the ratio EEIC/EIC was greater than 0.75, the pair had unusually high odds against their answer pattern being random. EEIC/D ratios at values >1.0 indicate that pairs of students were seated adjacent to one another and copied from one another. The original papers should be examined for details.
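    Under the stated assumptions (every choice equally likely, students answering independently), two students match on any single item with probability 1/c, so the chance of m or more identical responses out of n items is a binomial tail; the numbers below are hypothetical:

      from scipy.stats import binom

      n_items, n_choices = 40, 5
      m_matches = 25
      p_match = 1.0 / n_choices          # per-item match probability under the null model
      print(binom.sf(m_matches - 1, n_items, p_match))  # P(matches >= 25) by chance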

  1. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  2. Nuclear data uncertainties: I, Basic concepts of probability

    SciTech Connect

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  3. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-09-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, also known as Popescu and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, it allows a straightforward generalization to more complicated systems.

  4. Probability of photoassociation from a quasicontinuum approach

    NASA Astrophysics Data System (ADS)

    Javanainen, Juha; Mackie, Matt

    1998-08-01

    We examine photoassociation by using a quasicontinuum to describe the colliding atoms. The quasicontinuum system is analyzed using methods adapted from the theory of laser spectroscopy and quantum optics, and a continuum limit is then taken. In a degenerate gas the equilibrium probability of photoassociation may be close to unity. In the continuum limit, for a thermal atomic sample, the stimulated Raman adiabatic passage (STIRAP) mechanism cannot be employed to eliminate unwanted spontaneous transitions.

  5. Computational methods for probability of instability calculations

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the roots of the characteristic equation or Routh-Hurwitz test functions are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.

  8. Probability and Statistics in Aerospace Engineering

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  9. Neural coding of uncertainty and probability.

    PubMed

    Ma, Wei Ji; Jazayeri, Mehrdad

    2014-01-01

    Organisms must act in the face of sensory, motor, and reward uncertainty stemming from a pandemonium of stochasticity and missing information. In many tasks, organisms can make better decisions if they have at their disposal a representation of the uncertainty associated with task-relevant variables. We formalize this problem using Bayesian decision theory and review recent behavioral and neural evidence that the brain may use knowledge of uncertainty, confidence, and probability. PMID:25032495

  10. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four-dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four-dimensional probability distribution function was validated by comparing it to the two-dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the locations of the lesions sampled from the four-dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in JavaScript Object Notation (JSON) format.

  11. Understanding Deutsch's probability in a deterministic multiverse

    NASA Astrophysics Data System (ADS)

    Greaves, H.

    2004-09-01

    Difficulties over probability have often been considered fatal to the Everett interpretation of quantum mechanics. Here I argue that the Everettian can have everything she needs from 'probability' without recourse to indeterminism, ignorance, primitive identity over time or subjective uncertainty: all she needs is a particular rationality principle. The decision-theoretic approach recently developed by Deutsch and Wallace claims to provide just such a principle. But, according to Wallace, decision theory is itself applicable only if the correct attitude to a future Everettian measurement outcome is subjective uncertainty. I argue that subjective uncertainty is not available to the Everettian, but I offer an alternative: we can justify the Everettian application of decision theory on the basis that an Everettian should care about all her future branches. The probabilities appearing in the decision-theoretic representation theorem can then be interpreted as the degrees to which the rational agent cares about each future branch. This reinterpretation, however, reduces the intuitive plausibility of one of the Deutsch-Wallace axioms (measurement neutrality).

  12. The Probability Distribution of Daily Streamflow

    NASA Astrophysics Data System (ADS)

    Blum, A.; Vogel, R. M.

    2015-12-01

Flow duration curves (FDCs) are a graphical illustration of the cumulative distribution of streamflow. Daily streamflows often range over many orders of magnitude, making it extremely challenging to find a probability distribution function (pdf) which can mimic the steady state or period of record FDC (POR-FDC). Median annual FDCs (MA-FDCs) describe the pdf of daily streamflow in a typical year. For POR- and MA-FDCs, L-moment diagrams, visual assessments of FDCs and quantile-quantile probability plot correlation coefficients are used to evaluate goodness of fit (GOF) of candidate probability distributions. FDCs reveal that both the four-parameter kappa (KAP) and three-parameter generalized Pareto (GP3) models result in very high GOF for the MA-FDC and a relatively lower GOF for POR-FDCs at over 500 rivers across the conterminous U.S. Physical basin characteristics, such as baseflow index, as well as hydroclimatic indices such as the aridity index and the runoff ratio are found to be correlated with one of the shape parameters (kappa) of the KAP and GP3 pdfs. Our work also reveals several important areas for future research including improved parameter estimators for the KAP pdf, as well as increasing our understanding of the conditions which give rise to improved GOF of analytical pdfs to large samples of daily streamflows.
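
    A small sketch of the basic objects in the abstract: an empirical flow duration curve and a three-parameter generalized Pareto (GP3) fit, here via scipy's genpareto. The synthetic flow record stands in for real gauge data.

```python
import numpy as np
from scipy import stats

# Synthetic "daily flows" standing in for a period-of-record gauge series.
rng = np.random.default_rng(1)
flows = stats.lognorm.rvs(s=1.2, scale=50.0, size=3650, random_state=rng)

# Empirical FDC: exceedance probability of each sorted daily flow.
q = np.sort(flows)[::-1]                                 # descending flows
p_exceed = np.arange(1, q.size + 1) / (q.size + 1.0)     # Weibull plotting position
i10 = np.searchsorted(p_exceed, 0.10)
print("empirical flow exceeded 10% of the time:", q[i10])

# GP3 fit: scipy's genpareto has shape, location and scale parameters.
shape, loc, scale = stats.genpareto.fit(flows)
print("GP3 parameters:", shape, loc, scale)

# Crude goodness-of-fit check: compare empirical and fitted quantiles.
for prob in (0.5, 0.9, 0.99):
    print(prob, np.quantile(flows, prob),
          stats.genpareto.ppf(prob, shape, loc, scale))
```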

  13. Detection probabilities in fuel cycle oriented safeguards

    SciTech Connect

Canty, J.J.; Stein, G.; Avenhaus, R.

    1987-01-01

    An intensified discussion of evaluation criteria for International Atomic Energy Agency (IAEA) safeguards effectiveness is currently under way. Considerations basic to the establishment of such criteria are derived from the model agreement INFCIRC/153 and include threshold amounts, strategic significance, conversion times, required assurances, cost-effectiveness, and nonintrusiveness. In addition to these aspects, the extent to which fuel cycle characteristics are taken into account in safeguards implementations (Article 81c of INFCIRC/153) will be reflected in the criteria. The effectiveness of safeguards implemented under given manpower constraints is evaluated. As the significant quantity and timeliness criteria have established themselves within the safeguards community, these are taken as fixed. Detection probabilities, on the other hand, still provide a certain degree of freedom in interpretation. The problem of randomization of inspection activities across a fuel cycle, or portions thereof, is formalized as a two-person zero-sum game, the payoff function of which is the detection probability achieved by the inspectorate. It is argued, from the point of view of risk of detection, that fuel cycle-independent, minimally accepted threshold criteria for such detection probabilities cannot and should not be applied.

  14. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  15. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    NASA Technical Reports Server (NTRS)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of ±30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  16. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collisions between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell, and cells that do not undergo the threshold number of collisions are expected to survive.
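
    The collision-threshold hypothesis lends itself to a simple quantitative sketch: if collisions per pass are Poisson-distributed with a mean proportional to filter thickness, survival is the probability of staying below the lethal collision count. The Poisson assumption and all parameter values below are illustrative, not the authors' fitted model.

```python
from scipy import stats

# Assumed model: collisions per pass ~ Poisson(lambda), lambda proportional
# to the number of filter layers; a cell dies once it accumulates the lethal
# threshold of collisions. Both constants are invented for illustration.
collisions_per_layer = 3.0
lethal_threshold = 10   # collisions needed to transfer a lethal biocide dose

for layers in (1, 2, 3, 4):
    lam = collisions_per_layer * layers
    p_survive = stats.poisson.cdf(lethal_threshold - 1, lam)
    print(f"{layers} layer(s): P(survive) = {p_survive:.4f}")
```

    Consistent with the abstract, survival in this toy model falls steeply as layers are added, because each layer adds expected collisions.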

  17. Instability of Wave Trains and Wave Probabilities

    NASA Astrophysics Data System (ADS)

    Babanin, Alexander

    2013-04-01

Design criteria in ocean engineering, whether for a one-in-50-years or a one-in-5000-years event, are hardly ever based on measurements, but rather on statistical distributions of relevant metocean properties. Of utmost interest is the tail of the distribution, that is, rare events such as the highest waves with low probability. Engineers have long since realised that the superposition of linear waves with a narrow-banded spectrum, as depicted by the Rayleigh distribution, underestimates the probability of extreme wave heights and crests, which is a critical shortcoming as far as engineering design is concerned. Ongoing theoretical and experimental efforts have been under way for decades to address this issue. The typical approach is to treat all possible waves in the ocean or at a particular location as a single ensemble for which some comprehensive solution can be obtained. The oceanographic knowledge, however, now indicates that no single and united comprehensive solution is available. We would expect the probability distributions of wave height to depend on a) whether the waves are at the spectral peak or at the tail; b) on the wave spectrum and mean steepness in the wave field; c) on the directional distribution of the peak waves; d) on whether the waves are in deep water, in intermediate depth or in shallow water; e) on wave breaking; f) on the wind, particularly if it is very strong, and on the currents if they have suitable horizontal gradients. Probability distributions in the different circumstances according to these groups of conditions should be different, and by combining them together the inevitable scatter is introduced. The scatter and the accuracy will not improve by increasing the bulk data quality and quantity, and it hides the actual distribution of extremes. The groups have to be separated and their probability
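
    For reference, the narrow-band linear baseline the abstract starts from fits in two lines: Rayleigh-distributed wave heights with exceedance probability exp(-2(h/Hs)^2), where Hs is the significant wave height. Measured extremes exceed these values, which is the shortcoming discussed; the Hs value below is an illustrative assumption.

```python
import numpy as np

# Rayleigh exceedance probability for individual wave heights in a
# narrow-banded linear sea state: P(H > h) = exp(-2 * (h / Hs)^2).
Hs = 4.0                        # significant wave height, m (illustrative)
for h in (4.0, 6.0, 8.0):       # 8 m here is a "freak" wave of 2*Hs
    p = np.exp(-2.0 * (h / Hs) ** 2)
    print(f"P(H > {h} m) = {p:.2e}")
```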

  18. Probability sampling in legal cases: Kansas cellphone users

    NASA Astrophysics Data System (ADS)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  19. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  20. Para-aortic lymphocyst.

    PubMed

    Helmkamp, B F; Krebs, H B; Isikoff, M B; Poliakoff, S R; Averette, H E

    1980-10-15

Although numerous articles regarding the etiology, incidence, complications, and management of pelvic lymphocysts have been published in the American literature since 1958, there has been no mention of para-aortic lymphocyst as a complication of para-aortic node dissection. Two recent cases of symptomatic para-aortic lymphocyst have prompted a review of our para-aortic node dissection technique when this procedure is not combined with a more extensive pelvic lymphadenectomy. Our modification in technique is to use retroperitoneal para-aortic drainage by constant pressure-controlled suction following closure of the posterior parietal peritoneum, and the results in our first 15 patients are presented. There were no complications related to the drainage technique. Abdominal ultrasound and intravenous urography have proved to be excellent diagnostic tools in the initial evaluation and subsequent follow-up of para-aortic lymphocysts.

  1. [Subjective probability of reward receipt and the magnitude effect in probability discounting].

    PubMed

    Isomura, Mieko; Aoyama, Kenjiro

    2008-06-01

    Previous research suggested that larger probabilistic rewards were discounted more steeply than smaller probabilistic rewards (the magnitude effect). This research tests the hypothesis that the magnitude effect reflects the extent to which individuals distrust the stated probability of receiving different amounts of rewards. The participants were 105 college students. Probability discounting of two different amounts of rewards (5 000 yen and 100 000 yen) and the subjective probability of reward receipt of the different amounts (5 000 yen, 100 000 yen and 1 000 000 yen) were measured. The probabilistic 100 000 yen was discounted more steeply than the probabilistic 5 000 yen. The subjective probability of reward receipt was higher in the 5 000 yen than in the 100 000 yen condition. The proportion of subjective probability of receiving 5 000 yen to that of receiving 100 000 yen was significantly correlated with the proportion of degree of probability discounting for 5 000 yen to that for 100 000 yen. These results were consistent with the hypothesis stated above.
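
    A sketch of the standard hyperbolic model of probability discounting that frames such studies: subjective value V = A/(1 + h·θ) with odds against receipt θ = (1 - p)/p and steepness h. The h values below merely illustrate the magnitude effect (steeper discounting of the larger amount) and are not the paper's estimates.

```python
# Hyperbolic discounting of a probabilistic reward. The notation (theta, h)
# and the parameter values are illustrative assumptions.
def discounted_value(amount: float, p: float, h: float) -> float:
    theta = (1.0 - p) / p          # odds against receiving the reward
    return amount / (1.0 + h * theta)

# Larger amount gets a larger h, i.e. steeper discounting (magnitude effect).
for amount, h in ((5_000, 1.0), (100_000, 2.5)):
    v = discounted_value(amount, p=0.5, h=h)
    print(f"{amount} yen at p=0.5: V = {v:.0f} ({v / amount:.0%} of face value)")
```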

  2. On the universality of knot probability ratios

    NASA Astrophysics Data System (ADS)

    Janse van Rensburg, E. J.; Rechnitzer, A.

    2011-04-01

Let pn denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let pn(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is pn(K)/pn and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of pn(K), but there is substantial numerical evidence (Orlandini et al 1998 J. Phys. A: Math. Gen. 31 5953-67, Marcone et al 2007 Phys. Rev. E 75 41105, Rawdon et al 2008 Macromolecules 41 4444-51, Janse van Rensburg and Rechnitzer 2008 J. Phys. A: Math. Theor. 41 105002) that pn(K) grows as $p_n(K) \simeq C_K \mu_\emptyset^n n^{\alpha-3+N_K}$ as $n \rightarrow \infty$, where $N_K$ is the number of prime components of the knot type K. It is believed that the entropic exponent, $\alpha$, is universal, while the exponential growth rate, $\mu_\emptyset$, is independent of the knot type but varies with the lattice. The amplitude, $C_K$, depends on both the lattice and the knot type. The above asymptotic form implies that the relative probability of a random polygon of length n having prime knot type K over prime knot type L is $\frac{p_n(K)/p_n}{p_n(L)/p_n} = \frac{p_n(K)}{p_n(L)} \simeq \frac{C_K}{C_L}$. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than be a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot

  3. Snell Envelope with Small Probability Criteria

    SciTech Connect

Del Moral, Pierre; Hu, Peng; Oudjane, Nadia

    2012-12-15

We present a new algorithm to compute the Snell envelope in the specific case where the criterion to optimize is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  4. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.

  5. Symmetry, probability, and recognition in face space.

    PubMed

    Sirovich, Lawrence; Meytlis, Marsha

    2009-04-28

    The essential midline symmetry of human faces is shown to play a key role in facial coding and recognition. This also has deep and important connections with recent explorations of the organization of primate cortex, as well as human psychophysical experiments. Evidence is presented that the dimension of face recognition space for human faces is dramatically lower than previous estimates. One result of the present development is the construction of a probability distribution in face space that produces an interesting and realistic range of (synthetic) faces. Another is a recognition algorithm that by reasonable criteria is nearly 100% accurate.

  6. The Prediction of Spatial Aftershock Probabilities (PRESAP)

    NASA Astrophysics Data System (ADS)

    McCloskey, J.

    2003-12-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the great number of local, telemeter seismic networks, the rapid acquisition of data from satellites coupled with the speed of modern telecommunications and data transfer all mean that it may be possible that these new techniques could be applied in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real-time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days after the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real-time, as data of better quality become available over the following day to tens of days. Specifically, the project aim is to assess the

  7. Mapping probability of shipping sound exposure level.

    PubMed

    Gervaise, Cédric; Aulanier, Florian; Simard, Yvan; Roy, Nathalie

    2015-06-01

    Mapping vessel noise is emerging as one method of identifying areas where sound exposure due to shipping noise could have negative impacts on aquatic ecosystems. The probability distribution function (pdf) of sound exposure levels (SEL) is an important metric for identifying areas of concern. In this paper a probabilistic shipping SEL modeling method is described to obtain the pdf of SEL using the sonar equation and statistical relations linking the pdfs of ship traffic density, source levels, and transmission losses to their products and sums.
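
    A toy Monte Carlo version of the sonar-equation step described here: sample source levels and transmission losses from assumed pdfs and read percentiles off the resulting received-level distribution. All distributions are illustrative; the full method also folds in ship traffic density and sums exposures into SEL.

```python
import numpy as np

# One-transit received level RL = SL - TL (sonar equation), with Gaussian
# pdfs assumed for source level and transmission loss for illustration.
rng = np.random.default_rng(4)
n = 100_000
SL = rng.normal(185.0, 5.0, n)   # ship source level, dB re 1 uPa @ 1 m (assumed)
TL = rng.normal(65.0, 8.0, n)    # transmission loss to the receiver, dB (assumed)

RL = SL - TL                     # received-level samples, dB
for q in (0.05, 0.5, 0.95):
    print(f"{int(q * 100)}th percentile RL: {np.quantile(RL, q):.1f} dB")
```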

  8. Probability of detection calculations using MATLAB

    NASA Astrophysics Data System (ADS)

    Wei, Yung-Chung

    1993-06-01

A set of highly efficient computer programs based on Marcum's and Swerling's analyses of radar detection has been written in MATLAB to evaluate the probability of detection. The programs are based on accurate methods, unlike the detectability method, which is based on approximation. This thesis also outlines radar detection theory and target models as background. The goal of this effort is to provide a set of efficient computer programs for student use and as a teaching aid. The programs are designed to be user friendly and to run on personal computers.
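
    A sketch of the kind of exact calculation involved, expressed in Python rather than MATLAB: the single-pulse detection probability for a steady (Swerling 0) target, using the identity that Marcum's Q1(a, b) equals the survival function of a noncentral chi-squared variable with 2 degrees of freedom and noncentrality a² evaluated at b².

```python
import numpy as np
from scipy import stats

def pd_swerling0(snr_db: float, pfa: float) -> float:
    """Exact single-pulse Pd for a nonfluctuating target (Marcum case)."""
    snr = 10.0 ** (snr_db / 10.0)
    a = np.sqrt(2.0 * snr)             # signal parameter
    b = np.sqrt(-2.0 * np.log(pfa))    # detection threshold from Pfa
    # Marcum Q1(a, b) = P(X > b^2) for X ~ noncentral chi-squared(df=2, nc=a^2)
    return stats.ncx2.sf(b ** 2, df=2, nc=a ** 2)

for snr_db in (8, 10, 12, 14):
    print(snr_db, "dB ->", round(pd_swerling0(snr_db, pfa=1e-6), 4))
```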

  9. Uncertainty analysis for Probable Maximum Precipitation estimates

    NASA Astrophysics Data System (ADS)

    Micovic, Zoran; Schaefer, Melvin G.; Taylor, George H.

    2015-02-01

    An analysis of uncertainty associated with Probable Maximum Precipitation (PMP) estimates is presented. The focus of the study is firmly on PMP estimates derived through meteorological analyses and not on statistically derived PMPs. Theoretical PMP cannot be computed directly and operational PMP estimates are developed through a stepwise procedure using a significant degree of subjective professional judgment. This paper presents a methodology for portraying the uncertain nature of PMP estimation by analyzing individual steps within the PMP derivation procedure whereby for each parameter requiring judgment, a set of possible values is specified and accompanied by expected probabilities. The resulting range of possible PMP values can be compared with the previously derived operational single-value PMP, providing measures of the conservatism and variability of the original estimate. To our knowledge, this is the first uncertainty analysis conducted for a PMP derived through meteorological analyses. The methodology was tested on the La Joie Dam watershed in British Columbia. The results indicate that the commonly used single-value PMP estimate could be more than 40% higher when possible changes in various meteorological variables used to derive the PMP are considered. The findings of this study imply that PMP estimates should always be characterized as a range of values recognizing the significant uncertainties involved in PMP estimation. In fact, we do not know at this time whether precipitation is actually upper-bounded, and if precipitation is upper-bounded, how closely PMP estimates approach the theoretical limit.

  10. On the probability of matching DNA fingerprints.

    PubMed

    Risch, N J; Devlin, B

    1992-02-01

Forensic scientists commonly assume that DNA fingerprint patterns are infrequent in the general population and that genotypes are independent across loci. To test these assumptions, the number of matching DNA patterns in two large databases from the Federal Bureau of Investigation (FBI) and from Lifecodes was determined. No deviation from independence across loci in either database was apparent. For the Lifecodes database, the probability of a three-locus match ranges from 1 in 6,233 in Caucasians to 1 in 119,889 in Blacks. When considering all trios of five loci in the FBI database, there was only a single match observed out of more than 7.6 million comparisons. If independence is assumed, the probability of a five-locus match ranged from 1.32 × 10^-12 in Southeast Hispanics to 5.59 × 10^-14 in Blacks, implying that the minimum number of possible patterns for each ethnic group is several orders of magnitude greater than their corresponding population sizes in the United States. The most common five-locus pattern can have a frequency no greater than about 10^-6. Hence, individual five-locus DNA profiles are extremely uncommon, if not unique. PMID:1738844
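
    The independence calculation at the heart of the abstract is a one-line product; the per-locus match probabilities below are illustrative placeholders, not the FBI or Lifecodes figures.

```python
import numpy as np

# Under independence across loci, the multi-locus match probability is the
# product of the single-locus match probabilities. Hypothetical 5 loci:
per_locus_match = [0.05, 0.08, 0.04, 0.06, 0.07]

p_match = np.prod(per_locus_match)
print(f"five-locus match probability: {p_match:.2e}")
print(f"expected matches in 7.6 million comparisons: {p_match * 7.6e6:.2f}")
```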

  11. Estimating flood exceedance probabilities in estuarine regions

    NASA Astrophysics Data System (ADS)

    Westra, Seth; Leonard, Michael

    2016-04-01

Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence of these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises due to synoptic-scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury-Nepean River (New South Wales).

  12. Probable maximum flood of the Sava River

    NASA Astrophysics Data System (ADS)

Brilly, Mitja; Vidmar, Andrej; Šraj, Mojca

    2010-05-01

The Nuclear Power Plant Krško (NEK) is situated on the left bank of the Sava River close to the border with Croatia. The Probable Maximum Flood (PMF) at the location of the NEK could result from a combination of probable maximum precipitation (PMP), a sequential storm before the PMP, or snowmelt on the Sava River watershed. The Mediterranean climate is characterised by very high precipitation and a temporarily high snow pack. The HBV-96 model, as implemented in the Integrated Hydrological Modelling System (IHMS), was used for the modelling. The model was first calibrated and verified at a daily time step for the period 1990-2006. Calibration and verification at an hourly time step were done for the period 1998-1999. The stream routing parameters were calibrated for the flood events of 1998 and 2007 and then verified for the flood event of 1990. Analysis of the discharge routing data showed that possible inundation of the Ljubljana and Savinja valleys was not properly estimated: the flood areas are protected with levees, and water did not spread over the flooded areas in the events used for calibration. Because the inundated areas in the Ljubljana and Savinja valleys are protected by levees, the model could not properly simulate the inundation of the PMF, so we recalibrated the parameters controlling inundation in those areas for the worst-case scenario. The calculated PMF values dropped tremendously after recalibration.

  13. Measures, Probability and Holography in Cosmology

    NASA Astrophysics Data System (ADS)

    Phillips, Daniel

This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  14. Significance of "high probability/low damage" versus "low probability/high damage" flood events

    NASA Astrophysics Data System (ADS)

    Merz, B.; Elmer, F.; Thieken, A. H.

    2009-06-01

The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage. Traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus the contribution of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than is expressed by the expected annual damage. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced in the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making. Different flood mitigation decisions are probable when risk aversion is taken into account.
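
    The expected annual damage is the integral of damage over annual exceedance probability, so the dominance claim can be illustrated in a few lines; the probability-damage pairs below are invented for illustration.

```python
import numpy as np

# Illustrative risk curve: annual exceedance probability vs. flood damage.
p = np.array([0.5, 0.2, 0.1, 0.02, 0.01, 0.001])   # exceedance probability
d = np.array([0.0, 2.0, 10.0, 40.0, 60.0, 120.0])  # damage, million EUR

def ead(p, d):
    # Trapezoidal integration of damage over exceedance probability.
    return np.sum(0.5 * (d[:-1] + d[1:]) * (p[:-1] - p[1:]))

total = ead(p, d)
frequent = ead(p[:4], d[:4])   # events at least as frequent as the 50-year flood
print(f"expected annual damage: {total:.2f} million EUR/yr")
print(f"share from 'high probability/low damage' events: {frequent / total:.0%}")
```

    With these illustrative numbers the frequent, cheap events contribute about two thirds of the expected annual damage, mirroring the paper's finding.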

  15. Economic choices reveal probability distortion in macaque monkeys.

    PubMed

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing.
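
    For concreteness, a sketch of the classic one-parameter inverted-S weighting function (Tversky-Kahneman form) of the kind fitted to such choices; γ = 0.6 is an illustrative value, not the authors' estimate.

```python
# Inverted-S probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g).
# gamma < 1 overweights low and underweights high probabilities.
def weight(p: float, gamma: float = 0.6) -> float:
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1.0 / gamma)

for p in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(f"p = {p:.2f} -> w(p) = {weight(p):.3f}")
```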

  16. Economic Choices Reveal Probability Distortion in Macaque Monkeys

    PubMed Central

    Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-01-01

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. PMID:25698750

  17. Model estimates hurricane wind speed probabilities

    NASA Astrophysics Data System (ADS)

Murnane, Richard J.; Barton, Chris; Collins, Eric; Donnelly, Jeffrey; Elsner, James; Emanuel, Kerry; Ginis, Isaac; Howard, Susan; Landsea, Chris; Liu, Kam-biu; Malmquist, David; McKay, Megan; Michaels, Anthony; Nelson, Norm; O'Brien, James; Scott, David; Webb, Thompson, III

In the United States, intense hurricanes (category 3, 4, and 5 on the Saffir/Simpson scale) with winds greater than 50 m s⁻¹ have caused more damage than any other natural disaster [Pielke and Pielke, 1997]. Accurate estimates of wind speed exceedance probabilities (WSEP) due to intense hurricanes are therefore of great interest to (re)insurers, emergency planners, government officials, and populations in vulnerable coastal areas. The historical record of U.S. hurricane landfall is relatively complete only from about 1900, and most model estimates of WSEP are derived from this record. During the 1899-1998 period, only two category-5 and 16 category-4 hurricanes made landfall in the United States. The historical record therefore provides only a limited sample of the most intense hurricanes.

  18. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

Voice recognition has been one of the popular applications in the robotics field. It has also recently been used for biometric and multimedia information retrieval systems. This technology stems from successive research on audio feature extraction analysis. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses the PDF by itself as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of sampled voice signals, obtained from a certain number of individuals, are plotted. The experimental results show visually from the plotted data that each individual's voice has comparable PDF values and shapes.

  19. Probability density function learning by unsupervised neurons.

    PubMed

    Fiori, S

    2001-10-01

In a recent work, we introduced the concept of the pseudo-polynomial adaptive activation function neuron (FAN) and presented an unsupervised information-theoretic learning theory for such a structure. The learning model is based on entropy optimization and provides a way of learning probability distributions from incomplete data. The aim of the present paper is to illustrate some theoretical features of the FAN neuron, to extend its learning theory to asymmetrical density function approximation, and to provide an analytical and numerical comparison with other known density function estimation methods, with special emphasis on the universal approximation ability. The paper also provides a survey of PDF learning from incomplete data, as well as results of several experiments performed on real-world problems and signals. PMID:11709808

  20. Probability of Brownian motion hitting an obstacle

    SciTech Connect

    Knessl, C.; Keller, J.B.

    2000-02-01

The probability p(x) that Brownian motion with drift, starting at x, hits an obstacle is analyzed. The obstacle Ω is a compact subset of R^n. It is shown that p(x) is expressible in terms of the field U(x) scattered by Ω when it is hit by a plane wave. Therefore results for U(x), and methods for finding U(x), can be used to determine p(x). The authors illustrate this by obtaining exact and asymptotic results for p(x) when Ω is a slit in R^2, and asymptotic results when Ω is a disc in R^3.
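
    A crude Monte Carlo check of the quantity being analyzed: planar Brownian motion with drift and a disc obstacle at the origin (the paper treats a slit in R^2 and a disc in R^3 analytically). Step size, path count, and the downstream cutoff are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def hit_probability(x0, drift=(1.0, 0.0), radius=1.0,
                    n_paths=5000, dt=0.01, n_steps=4000):
    """Fraction of drifted Brownian paths from x0 that enter the disc."""
    pos = np.tile(np.asarray(x0, float), (n_paths, 1))
    alive = np.ones(n_paths, dtype=bool)   # neither hit nor escaped yet
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        # Euler-Maruyama step: deterministic drift plus Gaussian increment.
        step = np.asarray(drift) * dt + rng.normal(0.0, np.sqrt(dt), (n_paths, 2))
        pos[alive] += step[alive]
        r2 = np.einsum('ij,ij->i', pos, pos)
        hit |= alive & (r2 < radius ** 2)      # entered the obstacle
        alive &= ~hit & (pos[:, 0] < 20.0)     # stop paths swept far downstream
    return hit.mean()

print("p(x) for x = (-5, 0.5):", hit_probability((-5.0, 0.5)))
```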

  1. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2004-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ONEs or ZEROs. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental natural laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
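
    A toy digital-domain version of the idea: two waveform states that differ only in their sample histograms, detected by matching the received histogram against stored templates. The particular state pdfs (uniform vs. arcsine) are illustrative choices, not the report's.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 4096   # samples per symbol

def state_zero():   # samples with a flat (uniform) pdf
    return rng.uniform(-1.0, 1.0, N)

def state_one():    # sinusoid sampled at random phase: arcsine-shaped pdf
    return np.sin(rng.uniform(0.0, 2.0 * np.pi, N))

bins = np.linspace(-1.0, 1.0, 33)
def hist(x):        # normalized histogram = empirical pdf of the waveform
    h, _ = np.histogram(x, bins=bins, density=True)
    return h

# Receiver templates, one per waveform state.
templates = {0: hist(state_zero()), 1: hist(state_one())}

def detect(x):      # nearest-template classification of the received pdf
    h = hist(x)
    return min(templates, key=lambda k: np.sum((h - templates[k]) ** 2))

message = [0, 1, 1, 0, 1]
received = [state_zero() if bit == 0 else state_one() for bit in message]
print("decoded:", [detect(x) for x in received])
```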

  2. On the probability of dinosaur fleas.

    PubMed

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-11

Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data.

  3. Parabolic Ejecta Features on Titan? Probably Not

    NASA Astrophysics Data System (ADS)

    Lorenz, R. D.; Melosh, H. J.

    1996-03-01

    Radar mapping of Venus by Magellan indicated a number of dark parabolic features, associated with impact craters. A suggested mechanism for generating such features is that ejecta from the impact event is 'winnowed' by the zonal wind field, with smaller ejecta particles falling out of the atmosphere more slowly, and hence drifting further. What discriminates such features from simple wind streaks is the 'stingray' or parabolic shape. This is due to the ejecta's spatial distribution prior to being winnowed during fallout, and this distribution is generated by the explosion plume of the impact piercing the atmosphere, allowing the ejecta to disperse pseudoballistically before re-entering the atmosphere, decelerating to terminal velocity and then being winnowed. Here we apply this model to Titan, which has a zonal wind field similar to that of Venus. We find that Cassini will probably not find parabolic features, as the winds stretch the deposition so far that ejecta will form streaks or bands instead.

  4. Trending in Probability of Collision Measurements

    NASA Technical Reports Server (NTRS)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

A simple model is proposed to predict the behavior of probabilities of collision (Pc) for conjunction events. The model attempts to predict the location and magnitude of the peak Pc value for an event by assuming the progression of Pc values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak Pc and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
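
    The first-order deterministic piece of the model (before the Bayesian prior machinery) is just a quadratic fit and its vertex; the sample Pc history below is invented for illustration.

```python
import numpy as np

# Invented Pc history: times in days before closest approach (decreasing)
# and the corresponding reported collision probabilities.
t = np.array([5.0, 4.0, 3.0, 2.5, 2.0, 1.5])
pc = np.array([1e-6, 8e-6, 3e-5, 5e-5, 6e-5, 5.5e-5])

# Fit pc ~ a*t^2 + b*t + c; a downward-opening parabola has a < 0.
a, b, c = np.polyfit(t, pc, deg=2)
t_peak = -b / (2.0 * a)                 # vertex of the parabola
pc_peak = np.polyval([a, b, c], t_peak)
print(f"predicted peak Pc = {pc_peak:.2e} at t = {t_peak:.2f} days before TCA")
```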

  5. Quantum probabilities for inflation from holography

    SciTech Connect

Hartle, James B.; Hawking, S.W.; Hertog, Thomas

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in h-bar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  6. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2006-01-01

Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.

  7. 5426 Sharp: A Probable Hungaria Binary

    NASA Astrophysics Data System (ADS)

    Warner, Brian D.; Benishek, Vladimir; Ferrero, Andrea

    2015-07-01

Initial CCD photometry observations of the Hungaria asteroid 5426 Sharp in 2014 December and 2015 January at the Center for Solar System Studies-Palmer Divide Station in Landers, CA, showed attenuations from the general lightcurve, indicating the possibility of the asteroid being a binary system. The secondary period was almost exactly one Earth day, prompting a collaboration with observers in Europe, which eventually allowed two periods to be established: P1 = 4.5609 ± 0.0003 h, A1 = 0.18 ± 0.01 mag, and P2 = 24.22 ± 0.02 h, A2 = 0.08 ± 0.01 mag. No mutual events, i.e., occultations and/or eclipses, were seen; the asteroid is therefore considered a probable rather than confirmed binary.

  8. On the probability of dinosaur fleas.

    PubMed

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-01

Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data. PMID:26754250

  9. Evolution probabilities and phylogenetic distance of dinucleotides.

    PubMed

    Michel, Christian J

    2007-11-21

We develop here an analytical evolution model based on a 16 × 16 dinucleotide mutation matrix with six substitution parameters associated with the three types of substitutions in the two dinucleotide sites. It generalizes the previous models based on 4 × 4 nucleotide mutation matrices. It determines, at some time t, the exact occurrence probabilities of dinucleotides mutating randomly according to these six substitution parameters. Furthermore, several properties and two applications of this model allow the derivation of 16 evolutionary analytical solutions for dinucleotides and also a dinucleotide phylogenetic distance. Finally, based on this mathematical model, the SED (Stochastic Evolution of Dinucleotides) web server has been developed for deriving evolutionary analytical solutions of dinucleotides.
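
    A sketch of this model class under the usual independent-sites reading: two 4 × 4 site matrices, each with three substitution-type parameters (Kimura-3ST-like), combined into the 16 × 16 dinucleotide generator by a Kronecker sum, with occurrence probabilities from the matrix exponential. The construction and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.linalg import expm

BASES = "ACGT"

def site_rate_matrix(a, b, c):
    # a: transitions (A<->G, C<->T); b, c: the two transversion classes.
    pairs = {('A', 'G'): a, ('C', 'T'): a,
             ('A', 'C'): b, ('G', 'T'): b,
             ('A', 'T'): c, ('C', 'G'): c}
    Q = np.zeros((4, 4))
    for (u, v), r in pairs.items():
        i, j = BASES.index(u), BASES.index(v)
        Q[i, j] = Q[j, i] = r
    np.fill_diagonal(Q, -Q.sum(axis=1))   # rows sum to zero (generator)
    return Q

Q1 = site_rate_matrix(0.6, 0.2, 0.1)      # site 1: three parameters
Q2 = site_rate_matrix(0.4, 0.3, 0.1)      # site 2: three parameters
# Kronecker sum: independent evolution of the two sites on 16 states.
Q = np.kron(Q1, np.eye(4)) + np.kron(np.eye(4), Q2)

# Occurrence probabilities at time t = 2, starting from pure 'AA'.
p0 = np.zeros(16); p0[0] = 1.0
p_t = p0 @ expm(Q * 2.0)
print("P(AA -> AA) at t=2:", p_t[0])
print("probabilities sum to 1:", np.isclose(p_t.sum(), 1.0))
```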

  10. Homonymous Hemianopsia Associated with Probable Alzheimer's Disease.

    PubMed

    Ishiwata, Akiko; Kimura, Kazumi

    2016-01-01

Posterior cortical atrophy (PCA) is a rare neurodegenerative disorder that involves cerebral atrophy in the parietal, occipital, or occipitotemporal cortices and is characterized by visuospatial and visuoperceptual impairments. Most cases are pathologically compatible with Alzheimer's disease (AD). We describe a case of PCA in which a combination of imaging methods, in conjunction with symptoms and neurological and neuropsychological examinations, led to its being diagnosed and to AD being identified as its probable cause. Treatment with donepezil for 6 months mildly improved the alexia symptoms, but other symptoms remained unchanged. A 59-year-old Japanese woman with progressive alexia, visual deficit, and mild memory loss was referred to our neurologic clinic for the evaluation of right homonymous hemianopsia. Our neurological examination showed alexia, constructional apraxia, mild disorientation, short-term memory loss, and right homonymous hemianopsia. These findings resulted in a score of 23 (of 30) points on the Mini-Mental State Examination. Occipital atrophy was identified, with magnetic resonance imaging (MRI) showing left-side dominance. The MRI data were quantified with voxel-based morphometry, and PCA was diagnosed on the basis of these findings. Single photon emission computed tomography with ¹²³I-N-isopropyl-p-iodoamphetamine showed hypoperfusion in the occipital lobes, corresponding to the voxel-based morphometry findings. Additionally, the finding of hypoperfusion in the posterior association cortex, posterior cingulate gyrus, and precuneus was consistent with AD. Therefore, the PCA was considered to be a result of AD. We considered Lewy body dementia as a differential diagnosis because of the presence of hypoperfusion in the occipital lobes. However, the patient did not meet the criteria for Lewy body dementia during the course of the disease. We therefore consider including PCA in the differential diagnoses to be important for patients with visual deficit, cognitive

  11. Repetition probability effects for inverted faces.

    PubMed

    Grotheer, Mareike; Hermann, Petra; Vidnyánszky, Zoltán; Kovács, Gyula

    2014-11-15

It has been shown that the repetition-related reduction of the blood-oxygen level dependent (BOLD) signal is modulated by the probability of repetitions (P(rep)) for faces (Summerfield et al., 2008), providing support for the predictive coding (PC) model of visual perception (Rao and Ballard, 1999). However, the stage of face processing at which repetition suppression (RS) is modulated by P(rep) is still unclear. Face inversion is known to interrupt higher-level configural/holistic face processing steps, and if modulation of RS by P(rep) takes place at these stages of face processing, P(rep) effects are expected to be reduced for inverted when compared to upright faces. Therefore, here we aimed at investigating whether P(rep) effects on RS observed for face stimuli originate at the higher-level configural/holistic stages of face processing by comparing these effects for upright and inverted faces. Similarly to previous studies, we manipulated P(rep) for pairs of stimuli in individual blocks of fMRI recordings. This manipulation significantly influenced repetition suppression in the posterior FFA, the OFA and the LO, independently of stimulus orientation. Our results thus reveal that RS in the ventral visual stream is modulated by P(rep) even in the case of face inversion and hence strongly compromised configural/holistic face processing. An additional whole-brain analysis could not identify any areas where the modulatory effect of probability was orientation specific either. These findings imply that P(rep) effects on RS might originate from the earlier stages of face processing.

  12. Practical implementation of joint multitarget probabilities

    NASA Astrophysics Data System (ADS)

    Musick, Stanton; Kastella, Keith D.; Mahler, Ronald P. S.

    1998-07-01

A Joint Multitarget Probability (JMP) is a posterior probability density p_T(x_1, ..., x_T | Z) that there are T targets (T an unknown number) with unknown locations specified by the multitarget state X = (x_1, ..., x_T)^T, conditioned on a set of observations Z. This paper presents a numerical approximation for implementing JMP in detection, tracking and sensor management applications. A problem with direct implementation of JMP is that, if each x_t, t = 1, ..., T, is discretized on a grid of N elements, N^T variables are required to represent JMP on the T-target sector. This produces a large computational requirement even for small values of N and T. However, when the sensor easily separates targets, the resulting JMP factorizes and can be approximated by a product representation requiring only O(T^2 N) variables. Implementation of JMP for multitarget tracking requires a Bayes' rule step for the measurement update and a Markov transition step for the time update. If the measuring sensor is only influenced by the cell it observes, the JMP product representation is preserved under the measurement update. However, the product form is not quite preserved by the Markov time update, but can be restored using a minimum discrimination approach. All steps of the approximation can be performed with O(N) effort. This notion is developed and demonstrated in numerical examples with at most two targets in a 1-dimensional surveillance region. In this case, numerical results for detection and tracking for the product approximation and the full JMP are very similar.

  13. Probability matching involves rule-generating ability: a neuropsychological mechanism dealing with probabilities.

    PubMed

    Unturbe, Jesús; Corominas, Josep

    2007-09-01

    Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically dispersed manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
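
    For readers unfamiliar with the strategy itself, a minimal simulation contrasting probability matching with maximizing on a two-choice task is sketched below; the 0.7 reinforcement contingency and the function names are illustrative, not taken from the SPT.

    import random

    def simulate(strategy, p_reinforce=0.7, n_trials=10000, seed=1):
        """Return the success rate of a choice strategy on a two-choice task."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_trials):
            correct = 0 if rng.random() < p_reinforce else 1  # which option pays off
            hits += strategy(rng) == correct
        return hits / n_trials

    matcher = lambda rng: 0 if rng.random() < 0.7 else 1   # probability matching
    maximizer = lambda rng: 0                              # always pick the majority option

    # Matching earns about 0.7*0.7 + 0.3*0.3 = 0.58; maximizing earns about 0.70.
    print(simulate(matcher), simulate(maximizer))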

  14. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    SciTech Connect

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
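
    A minimal sketch of the two-level probabilistic structure described above, assuming a toy consequence function: the outer loop samples the subjective (epistemic) probability space, the inner loop samples the stochastic probability space, and each outer sample yields one complementary cumulative distribution function (CCDF).

    import random

    def release(theta, future):
        """Hypothetical consequence function defined on the product space."""
        return theta * future  # stand-in for a real PA model

    rng = random.Random(42)
    thresholds = [0.1 * k for k in range(1, 31)]
    ccdfs = []
    for _ in range(50):                      # samples of subjective uncertainty
        theta = rng.uniform(0.5, 1.5)        # epistemic parameter value (illustrative)
        outcomes = [release(theta, rng.expovariate(1.0))   # stochastic futures
                    for _ in range(1000)]
        # One CCDF per epistemic sample: P(consequence > c | theta)
        ccdfs.append([sum(o > c for o in outcomes) / len(outcomes) for c in thresholds])

    # The family of CCDFs displays the subjective uncertainty; e.g., the mean CCDF:
    mean_ccdf = [sum(col) / len(col) for col in zip(*ccdfs)]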

  15. Exercise in probability and statistics, or the probability of winning at tennis

    NASA Astrophysics Data System (ADS)

    Fischer, Gaston

    1980-01-01

    The relationships between the probabilities p, x, s, and M, of winning, respectively, a point, a game, a set, or a match have been derived. The calculations are carried out under the assumption that these probabilities are averages. For example, x represents an average probability of winning a game when serving and receiving, and the same value of x is assumed to hold also for tie-break games. The formulas derived are for sets played with a tie-break game at the level of 6-6, as well as for the traditional rule requiring an advantage of two games to win a set. Matches to the best of three and five sets are considered. As is to be expected, a small advantage in the probability p of winning a point leads to advantages which are amplified by large factors: 2.5 for games, 7.1 for sets with tie-break at 6-6, 10.6 for matches to the best of three sets, and 13.3 for matches to the best of five sets. When sets are decided according to the traditional rule, the last three factors become, respectively, 7.4, 11.1, and 13.8. The theoretical calculations are compared with real and synthetic tennis scores and good agreement is found. The scatter of the data is seen to obey the predictions of a normal distribution. Some classroom problems are suggested at the end.
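
    The game-level amplification can be reproduced with a short dynamic program; the deuce state is handled analytically, consistent with the averaging assumption above. This is a sketch of the standard derivation, not the article's own code.

    from functools import lru_cache

    def game_win_prob(p):
        """Probability x of winning a game given point-win probability p."""
        q = 1.0 - p
        deuce = p * p / (1.0 - 2.0 * p * q)  # win two points in a row from deuce

        @lru_cache(maxsize=None)
        def f(a, b):                          # a, b = points won so far
            if a == 3 and b == 3:
                return deuce
            if a == 4:
                return 1.0
            if b == 4:
                return 0.0
            return p * f(a + 1, b) + q * f(a, b + 1)

        return f(0, 0)

    # A 0.01 edge in the point probability is amplified at the game level:
    print(game_win_prob(0.51) - 0.5)   # about 0.025, the factor of 2.5 quoted above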

  16. Enhanced awakening probability of repetitive impulse sounds.

    PubMed

    Vos, Joos; Houben, Mark M J

    2013-09-01

    In the present study, relations between the level of impulse sounds and the observed proportion of behaviorally confirmed awakening reactions were determined. The sounds (shooting sounds, bangs produced by door slamming or by container transshipment, aircraft landings) were presented by means of loudspeakers in the bedrooms of 50 volunteers. The fragments for the impulse sounds consisted of single or multiple events. The sounds were presented during a 6-h period that started 75 min after the subjects wanted to sleep. In order to take account of habituation, each subject participated for 18 nights. At equal indoor A-weighted sound exposure levels, the proportion of awakening for the single impulse sounds was equal to that for the aircraft sounds. The proportion of awakening induced by the multiple impulse sounds, however, was significantly higher. For obtaining the same rate of awakening, the sound level of each of the successive impulses in a fragment had to be about 15-25 dB lower than the level of one single impulse. This level difference was largely independent of the degree of habituation. Various explanations for the enhanced awakening probability are discussed. PMID:23967934

  17. Lectures on probability and statistics. Revision

    SciTech Connect

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.
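
    The a priori direction described here is easy to make concrete: enumerate the equally likely outcomes of a roll of two fair dice and read off the probability of a specified outcome.

    from itertools import product
    from fractions import Fraction

    # All 36 equally likely outcomes of two fair dice, known a priori.
    outcomes = list(product(range(1, 7), repeat=2))
    p_sum_7 = Fraction(sum(a + b == 7 for a, b in outcomes), len(outcomes))
    print(p_sum_7)   # 1/6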

  18. The probability of finding suitable directed donors.

    PubMed

    Kanter, M; Selvin, S; Myhre, B A

    1989-02-01

    A series of tables based on mathematical calculations is given as guidelines for the number of directed donors needed by members of various ethnic/racial groups to provide a desired number of units of blood with a selected probability of achieving this result. From these tables, certain conclusions can be drawn. Unrelated donors who do not know their blood type are an inefficient source of directed donors. Rh-negative patients are unlikely to obtain enough directed-donor units from either related or unrelated donors with confidence unless these donors know their blood type. In general, siblings, parents, and offspring are the most efficient directed donors from the standpoint of compatibility. Cousins, uncles, aunts, nieces, and nephews are not much more likely to be compatible than unrelated donors are. It is easier to obtain suitable directed-donor units among Hispanics than among whites, blacks, or Asians, due to their skewed blood group frequencies. In general, using O-negative directed donors for Rh-positive recipients does not significantly increase the likelihood of finding suitable donors.
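
    Tables of this kind follow from binomial calculations such as the sketch below; the compatibility probability and panel sizes are illustrative placeholders, not the paper's tabulated values.

    from math import comb

    def prob_enough_units(n_donors, p_compatible, units_needed):
        """P(at least `units_needed` of `n_donors` independent donors are compatible)."""
        return sum(comb(n_donors, k) * p_compatible**k * (1 - p_compatible)**(n_donors - k)
                   for k in range(units_needed, n_donors + 1))

    # Illustrative only: find the smallest panel size giving a 90% chance of
    # obtaining 4 suitable units when each donor is compatible with probability 0.4.
    for n in range(4, 40):
        if prob_enough_units(n, 0.4, 4) >= 0.9:
            print(n)
            break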

  19. Probability of rupture of multiple fault segments

    USGS Publications Warehouse

    Andrews, D.J.; Schwerer, E.

    2000-01-01

    Fault segments identified from geologic and historic evidence have sometimes been adopted as features limiting the likely extent of earthquake ruptures. There is no doubt that individual segments can sometimes join together to produce larger earthquakes. This work is a trial of an objective method to determine the probability of multisegment ruptures. The frequency of occurrence of events on all conjectured combinations of adjacent segments in northern California is found by fitting to both geologic slip rates and to an assumed distribution of event sizes for the region as a whole. Uncertainty in the shape of the distribution near the maximum magnitude has a large effect on the solution. Frequencies of individual events cannot be determined, but it is possible to find a set of frequencies to fit a model closely. A robust conclusion for the San Francisco Bay region is that large multisegment events occur on the San Andreas and San Gregorio faults, but single-segment events predominate on the extended Hayward and Calaveras strands of segments.

  20. Essays on probability elicitation scoring rules

    NASA Astrophysics Data System (ADS)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and also to consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
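
    A minimal sketch of the proposed shift: the indicator d(θ) is replaced by a distribution g(θ), and an entropy-like (Kullback-Leibler) metric scores the adherence of the expert's f(θ) to g(θ) on a discretized support. The distributions shown are illustrative.

    from math import log

    def kl_divergence(g, f):
        """Entropy-like adherence score: KL(g || f) over a common discrete support."""
        return sum(gi * log(gi / fi) for gi, fi in zip(g, f) if gi > 0)

    # A classical SR contrasts f with a zero-one indicator d of the observed value;
    # here d is replaced by g, the distribution modelling the randomness of Theta.
    g = [0.1, 0.2, 0.4, 0.2, 0.1]        # "true" aleatory distribution (illustrative)
    expert_f = [0.05, 0.25, 0.40, 0.25, 0.05]
    print(kl_divergence(g, expert_f))    # smaller = better adherence of f to g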

  1. Atomic Transition Probabilities for Neutral Cerium

    NASA Astrophysics Data System (ADS)

    Lawler, J. E.; den Hartog, E. A.; Wood, M. P.; Nitz, D. E.; Chisholm, J.; Sobeck, J.

    2009-10-01

    The spectra of neutral cerium (Ce I) and singly ionized cerium (Ce II) are more complex than spectra of other rare earth species. The resulting high density of lines in the visible makes Ce ideal for use in metal halide (MH) High Intensity Discharge (HID) lamps. Inclusion of cerium-iodide in a lamp dose can improve both the Color Rendering Index and luminous efficacy of a MH-HID lamp. Basic spectroscopic data including absolute atomic transition probabilities for Ce I and Ce II are needed for diagnosing and modeling these MH-HID lamps. Recent work on Ce II [1] is now being augmented with similar work on Ce I. Radiative lifetimes from laser induced fluorescence measurements [2] on neutral Ce are being combined with emission branching fractions from spectra recorded using a Fourier transform spectrometer. A total of 14 high resolution spectra are being analyzed to determine branching fractions for 2000 to 3000 lines from 153 upper levels in neutral Ce. Representative data samples and progress to date will be presented. [4pt] [1] J. E. Lawler, C. Sneden, J. J. Cowan, I. I. Ivans, and E. A. Den Hartog, Astrophys. J. Suppl. Ser. 182, 51-79 (2009). [0pt] [2] E. A. Den Hartog, K. P. Buettner, and J. E. Lawler, J. Phys. B: Atomic, Molecular & Optical Physics 42, 085006 (7pp) (2009).
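
    The combination step rests on the standard relation A_ul = BF_ul / τ_u: an emission branching fraction divided by the radiative lifetime of the upper level gives an absolute transition probability. A sketch with purely illustrative numbers, not measured Ce I values:

    def transition_probabilities(lifetime_s, branching_fractions):
        """A_ul = BF_ul / tau_u: combine a radiative lifetime with emission
        branching fractions to obtain absolute transition probabilities (s^-1)."""
        assert abs(sum(branching_fractions.values()) - 1.0) < 1e-6
        return {line: bf / lifetime_s for line, bf in branching_fractions.items()}

    # Illustrative values only (hypothetical level with an 8 ns lifetime):
    print(transition_probabilities(8.0e-9, {"line_a": 0.6, "line_b": 0.3, "line_c": 0.1}))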

  2. Do aftershock probabilities decay with time?

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
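
    The window argument can be checked numerically: with rate K/t and a window of ±10% of t, the expected count is the integral of K/t over the window, which equals K·ln(11/9) at every horizon. A sketch, with K illustrative:

    from math import log

    K = 100.0                          # aftershock productivity (illustrative)
    for t in [1, 7, 30, 365, 36500]:   # day, week, month, year, century (in days)
        lo, hi = 0.9 * t, 1.1 * t
        expected = K * log(hi / lo)    # integral of K/t over the window
        print(t, round(expected, 6))   # identical at every horizon: K*ln(11/9)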

  3. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.

  4. Probability judgments under ambiguity and conflict

    PubMed Central

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081

  5. Levetiracetam: Probably Associated Diurnal Frequent Urination.

    PubMed

    Ju, Jun; Zou, Li-Ping; Shi, Xiu-Yu; Hu, Lin-Yan; Pang, Ling-Yu

    2016-01-01

    Diurnal frequent urination is a common condition in elementary school children who are especially at risk for associated somatic and behavioral problems. Levetiracetam (LEV) is a broad-spectrum antiepileptic drug that has been used in both partial and generalized seizures; its less common adverse effects include psychiatric and behavioral problems. Diurnal frequent urination is not a well-known adverse effect of LEV. Here, we report 2 pediatric cases with epilepsy that developed diurnal frequent urination after LEV administration. Case 1 was a 6-year-old male patient who presented with urinary frequency and urgency in the daytime from the third day after LEV was given as adjunctive therapy. Symptoms increased as the dosage of LEV was raised. Laboratory tests and auxiliary examinations found no evidence of organic disease. Diurnal frequent urination due to LEV was suspected, and the drug was discontinued. As expected, his frequency of urination returned to normal levels. A 13-year-old female patient developed similar clinical manifestations after oral LEV monotherapy, and her symptoms were aggravated under stress. Since the most common causes of frequent micturition had been ruled out, the patient was diagnosed with LEV-associated psychogenic frequent urination. The dosage of LEV was reduced to one-third, and the frequency of urination was reduced by 60%. Both patients had a Naranjo score of 6, which indicates that LEV was a "probable" cause of the diurnal frequent urination. Although a definite causal link between LEV and diurnal urinary frequency in these 2 cases remains to be established, we argue that diurnal frequent urination associated with LEV deserves clinicians' attention. PMID:26938751

  6. Probably good diagrams for learning: representational epistemic recodification of probability theory.

    PubMed

    Cheng, Peter C-H

    2011-07-01

    The representational epistemic approach to the design of visual displays and notation systems advocates encoding the fundamental conceptual structure of a knowledge domain directly in the structure of a representational system. It is claimed that representations so designed will benefit from greater semantic transparency, which enhances comprehension and ease of learning, and plastic generativity, which makes the meaningful manipulation of the representation easier and less error prone. Epistemic principles for encoding fundamental conceptual structures directly in representational schemes are described. The diagrammatic recodification of probability theory is undertaken to demonstrate how the fundamental conceptual structure of a knowledge domain can be analyzed, how the identified conceptual structure may be encoded in a representational system, and the cognitive benefits that follow. An experiment shows the new probability space diagrams are superior to the conventional approach for learning this conceptually challenging topic.

  7. What is preexisting strength? Predicting free association probabilities, similarity ratings, and cued recall probabilities.

    PubMed

    Nelson, Douglas L; Dyrdal, Gunvor M; Goodmon, Leilani B

    2005-08-01

    Measuring lexical knowledge poses a challenge to the study of the influence of preexisting knowledge on the retrieval of new memories. Many tasks focus on word pairs, but words are embedded in associative networks, so how should preexisting pair strength be measured? It has been measured by free association, similarity ratings, and co-occurrence statistics. Researchers interpret free association response probabilities as unbiased estimates of forward cue-to-target strength. In Study 1, analyses of large free association and extralist cued recall databases indicate that this interpretation is incorrect. Competitor and backward strengths bias free association probabilities, and as with other recall tasks, preexisting strength is described by a ratio rule. In Study 2, associative similarity ratings are predicted by forward and backward, but not by competitor, strength. Preexisting strength is not a unitary construct, because its measurement varies with method. Furthermore, free association probabilities predict extralist cued recall better than do ratings and co-occurrence statistics. The measure that most closely matches the criterion task may provide the best estimate of the identity of preexisting strength. PMID:16447386

  8. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.

  10. On lacunary statistical convergence of order α in probability

    NASA Astrophysics Data System (ADS)

    Işık, Mahmut; Et, Kübra Elif

    2015-09-01

    In this study, we examine the concepts of lacunary statistical convergence of order α in probability and Nθ-convergence of order α in probability. We give some relations connected to these concepts.

  11. Probability-summation model of multiple laser-exposure effects.

    PubMed

    Menendez, A R; Cheney, F E; Zuclich, J A; Crump, P

    1993-11-01

    A probability-summation model is introduced to provide quantitative criteria for discriminating independent from interactive effects of multiple laser exposures on biological tissue. Data that differ statistically from predictions of the probability-summation model indicate the action of sensitizing (synergistic/positive) or desensitizing (hardening/negative) biophysical interactions. Interactions are indicated when response probabilities vary with changes in the spatial or temporal separation of exposures. In the absence of interactions, probability-summation parsimoniously accounts for "cumulative" effects. Data analyzed using the probability-summation model show instances of both sensitization and desensitization of retinal tissue by laser exposures. Other results are shown to be consistent with probability-summation. The relevance of the probability-summation model to previous laser-bioeffects studies, models, and safety standards is discussed and an appeal is made for improved empirical estimates of response probabilities for single exposures.
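
    The independence prediction at the core of the model is the familiar probability-summation formula; a minimal sketch:

    def prob_summation(p_singles):
        """Predicted response probability for multiple exposures if effects are
        independent: P = 1 - prod(1 - p_i). Deviations from this prediction
        indicate sensitizing (observed > predicted) or desensitizing
        (observed < predicted) interactions."""
        prod = 1.0
        for p in p_singles:
            prod *= (1.0 - p)
        return 1.0 - prod

    print(prob_summation([0.10, 0.10, 0.10]))   # 0.271 for three identical exposures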

  12. Probability in Theories With Complex Dynamics and Hardy's Fifth Axiom

    NASA Astrophysics Data System (ADS)

    Burić, Nikola

    2010-08-01

    L. Hardy has formulated an axiomatization program of quantum mechanics and generalized probability theories that has been quite influential. In this paper, properties of typical Hamiltonian dynamical systems are used to argue that there are applications of probability in physical theories of systems with dynamical complexity that require continuous spaces of pure states. Hardy’s axiomatization program does not deal with such theories. In particular Hardy’s fifth axiom does not differentiate between such applications of classical probability and quantum probability.

  13. Pretest probability assessment derived from attribute matching

    PubMed Central

    Kline, Jeffrey A; Johnson, Charles L; Pollack, Charles V; Diercks, Deborah B; Hollander, Judd E; Newgard, Craig D; Garvey, J Lee

    2005-01-01

    Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report describes a novel attribute-matching method for generating a PTP for acute coronary syndrome (ACS), and compares the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for its ability to produce a PTP estimate <2% in a validation set of 8,120 patients who were evaluated for possible ACS and did not have ST-segment elevation on ECG. 1,061 patients were excluded prior to validation analysis because of ST-segment elevation (713), missing data (77) or being lost to follow-up (271). Results In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for the attribute matching curve and 0.68 (95% CI 0.62 to 0.77) for LRE. The attribute matching system categorized 1,670 (24%, 95% CI = 23–25%) patients as having a PTP < 2.0%; 28 developed ACS (1.7% 95% CI = 1.1–2.4%). The LRE categorized 244 (4%, 95% CI = 3–4%) with PTP < 2.0%; four developed ACS (1.6%, 95% CI = 0.4–4.1%). Conclusion Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE. PMID:16095534
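
    A minimal sketch of attribute matching as described: select the reference patients whose attributes exactly match the queried profile and report their observed outcome rate as the PTP. The records and field names below are illustrative, and only three of the eight attributes are shown.

    def attribute_match_ptp(database, profile):
        """PTP = observed outcome rate among reference patients whose attributes
        exactly match the query profile (sketch of the method as described;
        field names are illustrative)."""
        matches = [pt for pt in database
                   if all(pt[attr] == value for attr, value in profile.items())]
        if not matches:
            return None                      # no exact profile match in the database
        return sum(pt["acs"] for pt in matches) / len(matches)

    database = [
        {"age_lt_40": 1, "chest_pain_typical": 0, "diaphoresis": 0, "acs": 0},
        {"age_lt_40": 1, "chest_pain_typical": 0, "diaphoresis": 0, "acs": 1},
        {"age_lt_40": 0, "chest_pain_typical": 1, "diaphoresis": 1, "acs": 1},
    ]
    print(attribute_match_ptp(database, {"age_lt_40": 1, "chest_pain_typical": 0,
                                         "diaphoresis": 0}))   # 0.5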

  14. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs®. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
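
    The conjugate update underlying such an analysis is short enough to sketch; the prior pseudo-counts and roll counts below are illustrative, not the manual's or the article's data.

    def dirichlet_posterior(prior_alpha, counts):
        """Conjugate update: Dirichlet(alpha) prior + multinomial counts
        -> Dirichlet(alpha + counts)."""
        return [a + n for a, n in zip(prior_alpha, counts)]

    outcomes = ["side", "razorback", "trotter", "snouter", "leaning jowler"]
    prior = [6.0, 3.0, 2.0, 0.5, 0.1]     # illustrative pseudo-counts
    rolls = [65, 22, 10, 2, 1]            # hypothetical observed pig rolls
    post = dirichlet_posterior(prior, rolls)
    total = sum(post)
    for name, a in zip(outcomes, post):
        print(name, round(a / total, 3))  # posterior mean probability of each outcome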

  15. Factors influencing reporting and harvest probabilities in North American geese

    USGS Publications Warehouse

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI = 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.

  16. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  17. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  18. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  19. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  20. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  1. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 9 2013-04-01 2013-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  2. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  3. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  4. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  5. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  6. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  7. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  8. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  9. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  10. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  11. Pattern formation, logistics, and maximum path probability

    NASA Astrophysics Data System (ADS)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  12. Probability Distribution for Flowing Interval Spacing

    SciTech Connect

    S. Kuzio

    2004-09-22

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the "Saturated Zone Flow and Transport Model Abstraction" (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be

  13. Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-04-01

    Ensemble forecasting has long been used in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters that differ in space and time, but still give spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than for lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to find a total probability, the post-processed mean and uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while assuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. This can in some cases have a slight negative
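
    As one concrete post-processing step of the kind discussed, a minimal EMOS-style sketch is given below: a Normal predictive distribution whose mean is affine in the ensemble mean and whose variance is affine in the ensemble variance, fitted here by maximum likelihood on synthetic data. The paper's actual calibration, including the spatial penalty, is not reproduced.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def emos_nll(params, ens_mean, ens_var, obs):
        """Negative log-likelihood of N(a + b*mean, c + d*var) forecasts."""
        a, b, c, d = params
        mu = a + b * ens_mean
        sigma2 = np.maximum(c + d * ens_var, 1e-6)   # keep the variance positive
        return -np.sum(norm.logpdf(obs, loc=mu, scale=np.sqrt(sigma2)))

    rng = np.random.default_rng(0)
    truth = rng.gamma(2.0, 50.0, size=500)                      # synthetic runoff
    ens = truth[:, None] * 1.1 + rng.normal(0, 20, (500, 10))   # biased, noisy ensemble
    ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1)

    fit = minimize(emos_nll, x0=[0.0, 1.0, 1.0, 1.0],
                   args=(ens_mean, ens_var, truth), method="Nelder-Mead")
    a, b, c, d = fit.x
    # Post-processed mean and 90% band for the first case:
    mu = a + b * ens_mean[0]
    sd = np.sqrt(max(c + d * ens_var[0], 1e-6))
    print(mu, norm.ppf([0.05, 0.95], loc=mu, scale=sd))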

  14. Surprisingly rational: probability theory plus noise explains biases in judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. PMID:25090427
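
    The account's central mechanism can be sketched directly: if each memory sample is read with a small probability d of being flipped, the expected judgment is (1-2d)p + d, which already produces conservatism. A simulation under these assumptions (the flip-rate formulation is my reading of the model, and the numbers are illustrative):

    import random

    def judged_probability(p_true, d, n_samples=100, seed=None):
        """One simulated judgment: each memory sample is read correctly except
        that with probability d the read is flipped, so E[judged] = (1-2d)p + d."""
        rng = random.Random(seed)
        count = 0
        for _ in range(n_samples):
            event = rng.random() < p_true
            if rng.random() < d:         # random read error
                event = not event
            count += event
        return count / n_samples

    # Conservatism falls out directly: mean estimates regress toward 0.5.
    for p in (0.1, 0.5, 0.9):
        mean_est = sum(judged_probability(p, d=0.2) for _ in range(300)) / 300
        print(p, round(mean_est, 3))     # roughly 0.26, 0.50, 0.74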

  15. Probability shapes perceptual precision: A study in orientation estimation.

    PubMed

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention."

  16. On the shape of the probability weighting function.

    PubMed

    Gonzalez, R; Wu, G

    1999-02-01

    Empirical studies have shown that decision makers do not usually treat probabilities linearly. Instead, people tend to overweight small probabilities and underweight large probabilities. One way to model such distortions in decision making under risk is through a probability weighting function. We present a nonparametric estimation procedure for assessing the probability weighting function and value function at the level of the individual subject. The evidence in the domain of gains supports a two-parameter weighting function, where each parameter is given a psychological interpretation: one parameter measures how the decision maker discriminates probabilities, and the other parameter measures how attractive the decision maker views gambling. These findings are consistent with a growing body of empirical and theoretical work attempting to establish a psychological rationale for the probability weighting function. PMID:10090801
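
    A sketch of a two-parameter weighting function of the linear-in-log-odds form commonly associated with this line of work, where γ plays the discriminability role and δ the elevation (attractiveness) role; the parameter values are illustrative, not estimates from the paper.

    def weight(p, gamma, delta):
        """Linear-in-log-odds probability weighting function:
        w(p) = delta*p^gamma / (delta*p^gamma + (1-p)^gamma).
        gamma < 1 flattens the curve (less discrimination of mid-range
        probabilities); delta shifts its elevation."""
        num = delta * p**gamma
        return num / (num + (1.0 - p)**gamma)

    # Typical inverse-S pattern: small probabilities overweighted, large underweighted.
    for p in (0.01, 0.1, 0.5, 0.9, 0.99):
        print(p, round(weight(p, gamma=0.6, delta=0.8), 3))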

  17. Nonlinear Neurobiological Probability Weighting Functions For Aversive Outcomes

    PubMed Central

    Berns, Gregory S.; Capra, C. Monica; Chappelow, Jonathan; Moore, Sara; Noussair, Charles

    2008-01-01

    While mainstream economic models assume that individuals treat probabilities objectively, many people tend to overestimate the likelihood of improbable events and underestimate the likelihood of probable events. However, a biological account for why probabilities would be treated this way does not yet exist. While undergoing fMRI, we presented individuals with a series of lotteries, defined by the voltage of an impending cutaneous electric shock and the probability with which the shock would be received. During the prospect phase, neural activity that tracked the probability of the expected outcome was observed in a circumscribed network of brain regions that included the anterior cingulate, visual, parietal, and temporal cortices. Most of these regions displayed responses to probabilities consistent with nonlinear probability weighting. The neural responses to passive lotteries predicted 79% of subsequent decisions when individuals were offered choices between different lotteries, and exceeded that predicted by behavior alone near the indifference point. PMID:18060809

  18. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan (the western, eastern, and northeastern regions) using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, high probability values usually arise for clustered events, such as sequences with foreshocks or several events occurring within a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake and the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.

  19. Young Star Probably Ejected From Triple System

    NASA Astrophysics Data System (ADS)

    2003-01-01

    Astronomers analyzing nearly 20 years of data from the National Science Foundation's Very Large Array radio telescope have discovered that a small star in a multiple-star system in the constellation Taurus probably has been ejected from the system after a close encounter with one of the system's more-massive components, presumed to be a compact double star. This is the first time any such event has been observed. [Figure: Path of Small Star, 1983-2001] "Our analysis shows a drastic change in the orbit of this young star after it made a close approach to another object in the system," said Luis Rodriguez of the Institute of Astronomy of the National Autonomous University of Mexico (UNAM). "The young star was accelerated to a large velocity by the close approach, and certainly now is in a very different, more remote orbit, and may even completely escape its companions," said Laurent Loinard, leader of the research team that also included Monica Rodriguez in addition to Luis Rodriguez. The UNAM astronomers presented their findings at the American Astronomical Society's meeting in Seattle, WA. The discovery of this chaotic event will be important for advancing our understanding of classical dynamic astronomy and of how stars evolve, including possibly providing an explanation for the production of the mysterious "brown dwarfs," the astronomers said. The scientists analyzed VLA observations of T Tauri, a multiple system of young stars some 450 light-years from Earth. The observations were made from 1983 to 2001. The T Tauri system includes a "Northern" star, the famous star that gives its name to the class of young visible stars, and a "Southern" system of stars, all orbiting each other. The VLA data were used to track the orbit of the smaller Southern star around the larger Southern object, presumed to be a pair of stars orbiting each other closely. The astronomers' plot of the smaller star's orbit shows that it followed an apparently elliptical orbit around its twin companions

  20. A physical-space approach for the probability hypothesis density and cardinalized probability hypothesis density filters

    NASA Astrophysics Data System (ADS)

    Erdinc, Ozgur; Willett, Peter; Bar-Shalom, Yaakov

    2006-05-01

    The probability hypothesis density (PHD) filter, an automatically track-managed multi-target tracker, is attracting increasing but cautious attention. Its derivation is elegant and mathematical, and thus of course many engineers fear it; perhaps that is currently limiting the number of researchers working on the subject. In this paper, we explore a physical-space approach - a bin model - which leads us to the same filter equations as the PHD. Unlike the original derivation of the PHD filter, the concepts used are the familiar ones of conditional probability. The original PHD suffers from a "target-death" problem in which even a single missed detection can lead to the apparent disappearance of a target. To obviate this, PHD originator Mahler has recently developed a new "cardinalized" version of PHD (CPHD). We are able to extend our physical-space derivation to the CPHD case as well. We stress that the original derivations are mathematically correct, and need no embellishment from us; our contribution here is to offer an alternative derivation, one that we find appealing.

  1. GNSS integer ambiguity validation based on posterior probability

    NASA Astrophysics Data System (ADS)

    Wu, Zemin; Bian, Shaofeng

    2015-10-01

    GNSS integer ambiguity validation has been considered a challenging task for decades. Several kinds of validation tests have been developed and are widely used, but a weak theoretical basis is their shortcoming. Ambiguity validation is, theoretically, an issue of hypothesis testing. In the framework of Bayesian hypothesis testing, posterior probability is the canonical standard on which statistical decisions should be based. In this contribution, (i) we derive the posterior probability of the fixed ambiguity based on the Bayesian principle and modify it for practical ambiguity validation. (ii) The optimal property of the posterior probability test is proved based on an extended Neyman-Pearson lemma. Since the validation failure rate is the issue users are most concerned about, (iii) we derive the upper bound of the failure rate of the posterior probability test, so the user can apply the posterior probability test either in the fixed-posterior-probability way or in the fixed-failure-rate way. Simulated as well as real observed data are used for experimental validation. The results show that (i) the posterior probability test is the most effective among the R-ratio test, difference test, ellipsoidal integer aperture test and posterior probability test, (ii) the posterior probability test is computationally efficient and (iii) the failure rate estimation for the posterior probability test is useful.
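
    A minimal sketch of the posterior probability computation under the usual Gaussian float-solution assumption: each integer candidate z is weighted by exp(-0.5*(a-z)' Q^-1 (a-z)) and the weights are normalized over a finite candidate set, which approximates the full sum over all integer vectors. All numbers are illustrative.

    import numpy as np

    def posterior_probabilities(a_float, Q, candidates):
        """Posterior probability of each integer candidate z given a Gaussian
        float solution a_float with covariance Q (flat prior over candidates):
        P(z) is proportional to exp(-0.5 * (a-z)' Q^-1 (a-z))."""
        Qinv = np.linalg.inv(Q)
        w = np.array([np.exp(-0.5 * (a_float - z) @ Qinv @ (a_float - z))
                      for z in candidates])
        return w / w.sum()

    a_float = np.array([3.1, -1.9])
    Q = np.array([[0.01, 0.002], [0.002, 0.02]])
    candidates = [np.array(z) for z in [(3, -2), (3, -1), (4, -2), (2, -2)]]
    post = posterior_probabilities(a_float, Q, candidates)
    best = int(np.argmax(post))
    # Fixed-posterior-probability mode: accept the fix only if P(best) is high enough.
    print(candidates[best], post[best], post[best] > 0.99)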

  2. Dynamics of opinion formation with strengthen selection probability

    NASA Astrophysics Data System (ADS)

    Zhang, Haifeng; Jin, Zhen; Wang, Binghong

    2014-04-01

    The local majority rule is widely accepted as a paradigmatic model of opinion formation. In this paper, we study a model of opinion formation in which the opinion update rule is based not on the majority rule or a linear selection probability but on a strengthened selection probability controlled by an adjustable parameter β. In particular, our proposed probability function can closely approximate the two extreme cases (the linear probability function and the majority rule), as well as cases in between, for different values of β. By studying this model on different kinds of networks, including regular networks and complex networks, we find that there exists an optimal value of β giving the most efficient convergence to consensus, regardless of the topology of the network. This work reveals that, compared with the majority rule and linear selection probability, the strengthened selection probability might be a more appropriate model for understanding the formation of opinions in society.
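
    The paper's exact probability function is not reproduced here, but a common way to realize a strengthened selection probability that interpolates between the linear rule (β = 1) and the majority rule (β → ∞) is k^β / (k^β + (n-k)^β), where k of n neighbours hold the opinion. A sketch on a ring network under that assumption:

    import random

    def adopt_probability(k, n, beta):
        """Strengthened selection: probability of adopting opinion 1 when k of n
        neighbours hold it. beta = 1 recovers the linear (proportional) rule;
        large beta approaches the deterministic majority rule. (Illustrative
        functional form, not necessarily the paper's exact one.)"""
        return k**beta / (k**beta + (n - k)**beta)

    def sweep(opinions, neighbours, beta, rng):
        """One asynchronous update sweep over all agents in random order."""
        for i in rng.sample(range(len(opinions)), len(opinions)):
            k = sum(opinions[j] for j in neighbours[i])
            p = adopt_probability(k, len(neighbours[i]), beta)
            opinions[i] = 1 if rng.random() < p else 0

    rng = random.Random(3)
    N = 200
    opinions = [rng.randint(0, 1) for _ in range(N)]
    neighbours = {i: [(i - 1) % N, (i + 1) % N, (i - 2) % N, (i + 2) % N]
                  for i in range(N)}   # ring with 4 neighbours per agent
    for _ in range(200):
        sweep(opinions, neighbours, beta=3.0, rng=rng)
    print(sum(opinions), "of", N, "agents hold opinion 1")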

  3. Measurement outcomes and probability in Everettian quantum mechanics

    NASA Astrophysics Data System (ADS)

    Baker, David J.

    The decision-theoretic account of probability in the Everett or many-worlds interpretation, advanced by David Deutsch and David Wallace, is shown to be circular. Talk of probability in Everett presumes the existence of a preferred basis to identify measurement outcomes for the probabilities to range over. But the existence of a preferred basis can only be established by the process of decoherence, which is itself probabilistic.

  4. Path probability of stochastic motion: A functional approach

    NASA Astrophysics Data System (ADS)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by means of a functional technique, and a general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band whose center is stipulated by a given path is evaluated analytically, in a way analogous to continuous measurements in quantum mechanics. The formalism developed here is then applied to the stochastic dynamics of stock prices in finance.
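
    The paper's general formula is not reproduced here; as a rough illustration of what such a functional looks like, the familiar Onsager-Machlup form for overdamped dynamics dx/dt = F(x) + ξ, with Gaussian white noise of strength 2D, assigns a path x(·) the weight

      \mathcal{P}[x(\cdot)] \;\propto\;
      \exp\!\left\{ -\int_0^T \left[ \frac{\bigl(\dot{x} - F(x)\bigr)^{2}}{4D}
        + \frac{1}{2}\, F'(x) \right] dt \right\} .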

  5. Code System to Calculate Pressure Vessel Failure Probabilities.

    2001-03-27

    Version 00 OCTAVIA (Operationally Caused Transients And Vessel Integrity Analysis) calculates the probability of pressure vessel failure from operationally-caused pressure transients which can occur in a pressurized water reactor (PWR). For specified vessel and operating environment characteristics the program computes the failure pressure at which the vessel will fail for different-sized flaws existing in the beltline and the probability of vessel failure per reactor year due to each flaw. The probabilities are summed over the various flaw sizes to obtain the total vessel failure probability. Sensitivity studies can be performed to investigate different vessel or operating characteristics in the same computer run.
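
    A minimal sketch of the summation the abstract describes (every number and the transient-frequency model below are hypothetical placeholders, not values or formulas from OCTAVIA):

      # P(flaw of given depth, in inches, exists) per vessel -- assumed values
      flaw_prob = {0.25: 1e-3, 0.50: 1e-4, 1.00: 1e-5}
      # failure pressure (psi) computed for each flaw size -- assumed values
      failure_pressure = {0.25: 2800.0, 0.50: 2400.0, 1.00: 1900.0}

      def p_transient_exceeds(p_fail_psi):
          # Placeholder model: chance per reactor-year that an operational
          # transient exceeds the flaw's failure pressure.
          return 1e-2 * max(0.0, (3000.0 - p_fail_psi) / 3000.0)

      p_total = sum(flaw_prob[d] * p_transient_exceeds(failure_pressure[d])
                    for d in flaw_prob)          # sum over flaw sizes
      print(f"total vessel failure probability per reactor-year: {p_total:.2e}")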

  6. Probability forecast of the suspended sediment concentration using copula

    NASA Astrophysics Data System (ADS)

    Yu, Kun-xia; Li, Peng; Li, Zhanbin

    2016-04-01

    An approach for probability forecasting of suspended sediment loads is presented. The probability forecast model is established on the joint probability distribution of water discharge and suspended sediment concentration. The conditional distribution function of suspended sediment concentration given water discharge is evaluated once the joint distribution is constructed, and the probability forecast of suspended sediment concentration is implemented in terms of the conditional probability function. The approach is exemplified using annual data sets from ten watersheds in the middle Yellow River, a region characterized by heavy sediment loads. The three-parameter Gamma distribution is employed to fit the marginal distributions of annual water discharge and annual suspended sediment concentration, and the Gumbel copula describes well the dependence structure between them. Annual suspended sediment concentrations estimated from the conditional distribution function at a forecast probability of 50 percent agree better with the observed values than does the traditional sediment rating curve method given the same water discharge values. The overwhelming majority of observed suspended sediment concentration points lie between the 5 percent and 95 percent forecast probabilities, which can be taken as the lower and upper uncertainty bounds of the predicted observation. The results indicate that probability forecasting based on the conditional distribution function is a potential alternative for estimating suspended sediment and other hydrological variables.
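
    A compact sketch of the conditional-forecast step, assuming a Gumbel copula and three-parameter gamma marginals as in the abstract; the parameter values are made up for illustration, and the conditional CDF is the standard partial derivative dC(u, v)/du:

      import numpy as np
      from scipy import stats
      from scipy.optimize import brentq

      def gumbel_conditional_cdf(u, v, theta):
          # P(V <= v | U = u) for a Gumbel copula, via dC(u, v)/du.
          lu, lv = -np.log(u), -np.log(v)
          A = lu**theta + lv**theta
          C = np.exp(-A**(1.0 / theta))
          return C * A**(1.0 / theta - 1.0) * lu**(theta - 1.0) / u

      Fq = stats.gamma(a=2.0, loc=5.0, scale=30.0)   # discharge marginal (assumed)
      Fs = stats.gamma(a=1.5, loc=0.0, scale=40.0)   # concentration marginal (assumed)
      theta = 2.5                                    # assumed Gumbel dependence
      u = Fq.cdf(120.0)                              # observed discharge -> u
      v50 = brentq(lambda v: gumbel_conditional_cdf(u, v, theta) - 0.5,
                   1e-9, 1.0 - 1e-9)                 # conditional median, copula space
      print("50 percent forecast of concentration:", Fs.ppf(v50))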

  7. Anytime synthetic projection: Maximizing the probability of goal satisfaction

    NASA Technical Reports Server (NTRS)

    Drummond, Mark; Bresina, John L.

    1990-01-01

    A projection algorithm is presented for incremental control rule synthesis. The algorithm synthesizes an initial set of goal achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic. This set of control rules has a certain probability of satisfying the given goal. The probability is incrementally increased by synthesizing additional control rules to handle 'error' situations the execution system is likely to encounter when following the initial control rules. By using situation probabilities, the algorithm achieves a computationally effective balance between the limited robustness of triangle tables and the absolute robustness of universal plans.

  8. Oscillations in probability distributions for stochastic gene expression

    SciTech Connect

    Petrosyan, K. G.; Hu, Chin-Kun

    2014-05-28

    The phenomenon of oscillations in the probability distribution functions of the number of components is found for a model of stochastic gene expression. It takes place at low molecule numbers or under strong intracellular noise. The oscillations distinguish between more probable even and less probable odd numbers of particles. The even-odd symmetry is restored as the number of molecules increases, with the probability distribution function tending to a Poisson distribution. We discuss the possibility of observing the phenomenon in gene, protein, and mRNA expression experiments.

  9. Challenges for Enriching the Curriculum: Statistics and Probability.

    ERIC Educational Resources Information Center

    Swift, Jim

    1983-01-01

    Three probability problems designed to challenge students are presented: Liars and Diamonds, Heads Wins, and Random Walks. Other statistic problems are suggested that could involve computer simulations. (MNS)

  10. Methods for fitting a parametric probability distribution to most probable number data.

    PubMed

    Williams, Michael S; Ebel, Eric D

    2012-07-01

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
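
    The building block of both proposed methods is the likelihood of a single MPN experiment, in which a tube receiving volume V of sample is positive with probability 1 - exp(-λV) for concentration λ. A minimal maximum-likelihood sketch under an assumed dilution design (the volumes and tube counts below are illustrative):

      import numpy as np
      from scipy.optimize import minimize_scalar

      volumes   = np.array([0.1, 0.01, 0.001])   # mL of sample per tube (assumed)
      n_tubes   = np.array([3, 3, 3])            # tubes per dilution (assumed)
      positives = np.array([3, 1, 0])            # observed positive tubes

      def neg_log_lik(log_lam):
          lam = np.exp(log_lam)                  # concentration, organisms/mL
          p = 1.0 - np.exp(-lam * volumes)       # P(tube positive) per dilution
          p = np.clip(p, 1e-12, 1.0 - 1e-12)
          return -np.sum(positives * np.log(p)
                         + (n_tubes - positives) * np.log(1.0 - p))

      res = minimize_scalar(neg_log_lik, bounds=(np.log(1e-3), np.log(1e6)),
                            method="bounded")
      print("MPN estimate (organisms/mL):", np.exp(res.x))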

  11. [Arson and (para-) suicide].

    PubMed

    Lange, E; Kirsch, M

    1988-11-01

    Forensic-psychiatric observations made between 1963 and 1983 concerning offenders' responsibility in cases of deliberate arson show that 12 of 147 suits were closely related to (para-)suicide. According to the variety of relations, we distinguish between fire as a pure means of suicide, fire used to take along the living space or other people, suicide committed in consequence of arson, arson as a symbolic suicide, and finally alternating acts of both arson and parasuicide.

  12. A Mathematical Microworld for Students to Learn Introductory Probability.

    ERIC Educational Resources Information Center

    Jiang, Zhonghong; Potter, Walter D.

    1993-01-01

    Describes the Microworld Chance, a simulation-oriented computer environment that allows students to explore probability concepts in five subenvironments: coins, dice, spinners, thumbtacks, and marbles. Results of a teaching experiment to examine the effectiveness of the microworld in changing students' misconceptions about probability are…

  13. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    ERIC Educational Resources Information Center

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  14. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  15. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  16. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  17. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  18. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of flight... probability estimate must use accurate data, scientific principles, and a method that is statistically...

  19. Multiple-event probability in general-relativistic quantum mechanics

    SciTech Connect

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-04-15

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties with some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times for an ensemble of simultaneous commuting measurements on the joint system + apparatus. This observation permits a formulation of quantum theory based only on single-event probability, in which the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse.

  20. Preservice Elementary Teachers and the Fundamentals of Probability

    ERIC Educational Resources Information Center

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  1. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  2. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    ERIC Educational Resources Information Center

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response…

  3. Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling

    ERIC Educational Resources Information Center

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based, probability cluster…

  4. A New Way to Evaluate the Probability and Fresnel Integrals

    ERIC Educational Resources Information Center

    Khalili, Parviz

    2007-01-01

    In this article, we show how the Laplace transform may be used to evaluate a variety of nontrivial improper integrals, including "Probability" and "Fresnel" integrals. The algorithm we have developed here to evaluate "Probability", "Fresnel", and other similar integrals seems to be new. This method transforms the evaluation of certain improper integrals…

  5. Misconceptions in Rational Numbers, Probability, Algebra, and Geometry

    ERIC Educational Resources Information Center

    Rakes, Christopher R.

    2010-01-01

    In this study, the author examined the relationship of probability misconceptions to algebra, geometry, and rational number misconceptions and investigated the potential of probability instruction as an intervention to address misconceptions in all 4 content areas. Through a review of literature, 5 fundamental concepts were identified that, if…

  6. A "Virtual Spin" on the Teaching of Probability

    ERIC Educational Resources Information Center

    Beck, Shari A.; Huse, Vanessa E.

    2007-01-01

    This article, which describes integrating virtual manipulatives with the teaching of probability at the elementary level, puts a "virtual spin" on the teaching of probability to provide more opportunities for students to experience successful learning. The traditional use of concrete manipulatives is enhanced with virtual coins and spinners from…

  7. Generalized emptiness formation probability in the six-vertex model

    NASA Astrophysics Data System (ADS)

    Colomo, F.; Pronko, A. G.; Sportiello, A.

    2016-10-01

    In the six-vertex model with domain wall boundary conditions, the emptiness formation probability is the probability that a rectangular region in the top left corner of the lattice is frozen. We generalize this notion to the case where the frozen region has the shape of a generic Young diagram. We derive here a multiple integral representation for this correlation function.

  8. Prizes in Cereal Boxes: An Application of Probability.

    ERIC Educational Resources Information Center

    Litwiller, Bonnie H.; Duncan, David R.

    1992-01-01

    Presents four cases of real-world probabilistic situations to promote more effective teaching of probability. Calculates the probability of obtaining six of six different prizes successively in six, seven, eight, and nine boxes of cereal, generalizes the problem to n boxes of cereal, and offers suggestions to extend the problem. (MDH)
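
    The probability in question follows from inclusion-exclusion; a short sketch, assuming six equally likely and independent prizes:

      from math import comb

      def p_all_prizes(n_boxes, n_prizes=6):
          # P(all n_prizes distinct prizes appear in n_boxes boxes),
          # by inclusion-exclusion over the prizes that are missed.
          return sum((-1)**k * comb(n_prizes, k)
                     * ((n_prizes - k) / n_prizes)**n_boxes
                     for k in range(n_prizes + 1))

      for n in (6, 7, 8, 9):
          print(n, round(p_all_prizes(n), 4))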

  9. On the Provenance of Judgments of Conditional Probability

    ERIC Educational Resources Information Center

    Zhao, Jiaying; Shah, Anuj; Osherson, Daniel

    2009-01-01

    In standard treatments of probability, Pr(A|B) is defined as the ratio of Pr(A∩B) to Pr(B), provided that Pr(B) > 0. This account of conditional probability suggests a psychological question, namely, whether estimates of Pr(A|B) arise in the mind via implicit calculation of…

  10. Probabilities of Natural Events Occurring at Savannah River Plant

    SciTech Connect

    Huang, J.C.

    2001-07-17

    This report documents the comprehensive evaluation of probability models of natural events which are applicable to Savannah River Plant. The probability curves selected for these natural events are recommended to be used by all SRP/SRL safety analysts. This will ensure a consistency in analysis methodology for postulated SAR incidents involving natural phenomena.

  11. A Probability Model of Accuracy in Deception Detection Experiments.

    ERIC Educational Resources Information Center

    Park, Hee Sun; Levine, Timothy R.

    2001-01-01

    Extends the recent work on the veracity effect in deception detection. Explains the probabilistic nature of a receiver's accuracy in detecting deception and analyzes a receiver's detection of deception in terms of set theory and conditional probability. Finds that accuracy is shown to be a function of the relevant conditional probability and the…

  12. Probability Theory, Not the Very Guide of Life

    ERIC Educational Resources Information Center

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  13. Use of External Visual Representations in Probability Problem Solving

    ERIC Educational Resources Information Center

    Corter, James E.; Zahner, Doris C.

    2007-01-01

    We investigate the use of external visual representations in probability problem solving. Twenty-six students enrolled in an introductory statistics course for social sciences graduate students (post-baccalaureate) solved eight probability problems in a structured interview format. Results show that students spontaneously use self-generated…

  14. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her...

  15. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her...

  16. Decision-Making Processes: Sensitivity to Sequentially Experienced Outcome Probabilities

    ERIC Educational Resources Information Center

    Boyer, Ty W.

    2007-01-01

    A computerized sequential event sampling decision-making task was administered to 187 5- to 10-year-olds and adults. Participants made a series of choices between alternatives that differed in win probability (Study 1) or win and loss probability (Study 2). Intuitive and more explicit measures were used. Study 1 revealed that, across ages,…

  17. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her...

  18. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her...

  19. A Quantum Theoretical Explanation for Probability Judgment Errors

    ERIC Educational Resources Information Center

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  20. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  1. Fisher classifier and its probability of error estimation

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
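
    A brute-force sketch of the leave-one-out estimate for a two-class Fisher classifier; this is the naive refit-per-sample baseline that the paper's computationally efficient expressions avoid, and the midpoint threshold is our simplification, not the paper's optimal threshold:

      import numpy as np

      def fisher_direction(X1, X2):
          # Fisher's discriminant direction w = Sw^{-1} (mu1 - mu2).
          m1, m2 = X1.mean(0), X2.mean(0)
          Sw = np.cov(X1.T) * (len(X1) - 1) + np.cov(X2.T) * (len(X2) - 1)
          return np.linalg.solve(Sw, m1 - m2)

      def loo_error(X, y):
          errs = 0
          for i in range(len(X)):
              mask = np.arange(len(X)) != i
              Xtr, ytr = X[mask], y[mask]
              w = fisher_direction(Xtr[ytr == 0], Xtr[ytr == 1])
              t = 0.5 * (Xtr[ytr == 0].mean(0) + Xtr[ytr == 1].mean(0)) @ w
              errs += (0 if X[i] @ w > t else 1) != y[i]
          return errs / len(X)

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(2, 1, (30, 2))])
      y = np.array([0] * 30 + [1] * 30)
      print("leave-one-out error estimate:", loo_error(X, y))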

  2. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her...

  3. Objective probability-like things with and without objective indeterminism

    NASA Astrophysics Data System (ADS)

    Szabó, László E.

    I shall argue that there is no such property of an event as its "probability". This is why standard interpretations cannot give a sound definition in empirical terms of what "probability" is, and this is why empirical sciences like physics can manage without such a definition. "Probability" is a collective term whose meaning varies from context to context: it means different dimensionless [0, 1]-valued physical quantities characterising the different particular situations. In other words, probability is a reducible concept, supervening on physical quantities characterising the state of affairs corresponding to the event in question. On the other hand, these "probability-like" physical quantities correspond to objective features of the physical world, and are objectively related to measurable quantities like relative frequencies of physical events based on finite samples, no matter whether the world is objectively deterministic or indeterministic.

  4. Oil spill contamination probability in the southeastern Levantine basin.

    PubMed

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed.

  5. A discussion on the origin of quantum probabilities

    SciTech Connect

    Holik, Federico; Sáenz, Manuel; Plastino, Angel

    2014-01-15

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). Highlights: several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities; we apply Cox's method to the lattice of subspaces of the Hilbert space; we obtain a derivation of quantum probabilities which includes mixed states; the method presented in this work is susceptible to generalization; it includes quantum mechanics and classical mechanics as particular cases.

  6. Probability theory, not the very guide of life.

    PubMed

    Juslin, Peter; Nilsson, Håkan; Winman, Anders

    2009-10-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.

  7. Decision-making processes: sensitivity to sequentially experienced outcome probabilities.

    PubMed

    Boyer, Ty W

    2007-05-01

    A computerized sequential event sampling decision-making task was administered to 187 5- to 10-year-olds and adults. Participants made a series of choices between alternatives that differed in win probability (Study 1) or win and loss probability (Study 2). Intuitive and more explicit measures were used. Study 1 revealed that, across ages, participants demonstrated intuitive sensitivity to probability; however, adult participants evidenced greater sensitivity than did children, and younger children failed to demonstrate more explicit understanding of probability. Study 2 also revealed that children were intuitively sensitive to probability; however, the inclusion of loss had limited impact on decision processes. These findings and their relevance to cognitive developmental theory are discussed.

  8. Probability in the Many-Worlds Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Vaidman, Lev

    It is argued that, although in the Many-Worlds Interpretation of quantum mechanics there is no "probability" for an outcome of a quantum experiment in the usual sense, we can understand why we have an illusion of probability. The explanation involves: (a) a "sleeping pill" gedanken experiment which establishes a correspondence between an illegitimate question, "What is the probability of an outcome of a quantum measurement?", and a legitimate one, "What is the probability that 'I' am in the world corresponding to that outcome?"; (b) a gedanken experiment which splits the world into several worlds which are identical according to some symmetry condition; and (c) relativistic causality, which together with (b) explains the Born rule of standard quantum mechanics. The Quantum Sleeping Beauty controversy and the "caring measure" replacing the probability measure are discussed.

  9. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.

  10. Electrofishing capture probability of smallmouth bass in streams

    USGS Publications Warehouse

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth bass abundance accurately.
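
    A sketch of the "count divided by capture probability" adjustment, written in Horvitz-Thompson form with a hypothetical logistic capture model (illustration only; the paper fits repeated-measures logistic regression to mark-recapture data):

      import numpy as np

      def adjusted_abundance(lengths_mm, p_model):
          # Each sampled fish contributes 1 / p, where p is its modeled
          # cumulative capture probability.
          return float(np.sum(1.0 / p_model(lengths_mm)))

      # hypothetical logistic fit: capture probability rises with length
      p_model = lambda L: 1.0 / (1.0 + np.exp(-(-2.0 + 0.02 * L)))
      print(adjusted_abundance(np.array([150.0, 220.0, 300.0, 180.0]), p_model))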

  11. Two-slit experiment: quantum and classical probabilities

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2015-06-01

    The inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of the selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane).
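
    The law of total probability at issue, and the interference term by which two-slit statistics depart from it, can be stated in the standard textbook form (given here for orientation; θ is the relative phase between the two paths):

      P_{\mathrm{cl}}(B) = P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2),
      \qquad
      P_{\mathrm{qm}}(B) = P_{\mathrm{cl}}(B)
        + 2\sqrt{P(A_1)P(B \mid A_1)\,P(A_2)P(B \mid A_2)}\;\cos\theta .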

  12. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning.

  13. Visualizing RNA base-pairing probabilities with RNAbow diagrams.

    PubMed

    Aalberts, Daniel P; Jannen, William K

    2013-04-01

    There are many effective ways to represent a minimum free energy RNA secondary structure that make it easy to locate its helices and loops. It is a greater challenge to visualize the thermal average probabilities of all folds in a partition function sum; dot plot representations are often puzzling. Therefore, we introduce the RNAbows visualization tool for RNA base pair probabilities. RNAbows represent base pair probabilities with line thickness and shading, yielding intuitive diagrams. RNAbows aid in disentangling incompatible structures, allow comparisons between clusters of folds, highlight differences between wild-type and mutant folds, and are also rather beautiful.

  14. [Probable flora: a means of expressing ecological gradients in France].

    PubMed

    Garbolino, Emmanuel; De Ruffray, Patrice; Brisse, Henry; Grandjouan, Gilles

    2013-02-01

    The criterion of the fidelity of plants to plants is applied, over four million botanical observations in France, to characterize the ecology of 215,000 phytosociological surveys. Among those discriminant plants, some are missing from the surveys but can still have a certain probability of occurrence: these are called "probable plants", and together they represent the "probable flora" of a territory. The study of their geographical distribution reveals the ecological gradients of the flora across France better than the botanical observations alone. In effect, this method mitigates the discontinuities in taxa observations whose absence may be due to historical and/or anthropogenic factors.

  15. Calculation of inclusive probabilities from single-particle amplitudes

    NASA Astrophysics Data System (ADS)

    Kürpick, Peter; Lüdde, Hans Jürgen

    1993-04-01

    On the basis of the independent particle model, used to describe collisions between ions and atoms involving many electrons, the formalism of inclusive probabilities allows the computation of many-electron transition probabilities from single-particle amplitudes. The method presented can answer practically any experimental question formulated in terms of a certain number of vacancies and occupancies as can be measured in a typical ion-atom collision experiment. It is specialised to calculate many-particle probabilities with respect to a minimum number of vacancies or occupancies in one or more subshells as obtained e.g. in KLL- or KLM-Auger spectra.

  16. Origin of probabilities and their application to the multiverse

    NASA Astrophysics Data System (ADS)

    Albrecht, Andreas; Phillips, Daniel

    2014-12-01

    We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of purely classical probabilities to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation.

  17. Quantum correlations in terms of neutrino oscillation probabilities

    NASA Astrophysics Data System (ADS)

    Alok, Ashutosh Kumar; Banerjee, Subhashish; Uma Sankar, S.

    2016-08-01

    Neutrino oscillations provide evidence for the mode entanglement of neutrino mass eigenstates in a given flavour eigenstate. Given this mode entanglement, it is pertinent to consider the relation between the oscillation probabilities and other quantum correlations. In this work, we show that all the well-known quantum correlations, such as Bell's inequality, are directly related to the neutrino oscillation probabilities. The results of the neutrino oscillation experiments, which measure the neutrino survival probability to be less than unity, imply Bell's inequality violation.
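
    For reference, the standard two-flavour survival probability that such experiments measure to be below unity is (textbook formula, natural units):

      P(\nu_\alpha \to \nu_\alpha) = 1 - \sin^2 2\theta\,
        \sin^2\!\left( \frac{\Delta m^2 L}{4E} \right) .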

  18. Exact integration of height probabilities in the Abelian Sandpile model

    NASA Astrophysics Data System (ADS)

    Caracciolo, Sergio; Sportiello, Andrea

    2012-09-01

    The height probabilities for the recurrent configurations in the Abelian Sandpile model on the square lattice have analytic expressions, in terms of multidimensional quadratures. At first, these quantities were evaluated numerically with high accuracy and conjectured to be certain cubic rational-coefficient polynomials in 1/π. Later their values were determined by different methods. We revert to the direct derivation of these probabilities, by computing analytically the corresponding integrals. Once again, we confirm the predictions on the probabilities, and thus, as a corollary, the conjecture on the average height, <ρ> = 17/8.

  1. Probability Tables: a generic tool for representing and propagating uncertainties

    NASA Astrophysics Data System (ADS)

    Coste-Delclaux, M.; Diop, C. M.; Lahaye, S.

    2014-06-01

    Probability tables are a generic tool for representing any random variable whose probability density function is known. In the field of Nuclear Reactor Physics, this tool is currently used to represent the variation of cross-sections versus energy (TRIPOLI4®, MCNP, APOLLO2, ECCO/ERANOS, …). In the present article we show how uncertainties can be propagated, thanks to a probability table representation, through two very simple mathematical problems: an eigenvalue problem (neutron multiplication factor, …) and a depletion problem.
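
    A toy sketch of the idea (our construction; the codes named above build far more elaborate tables): discretize a known density into (value, weight) pairs and push the table through a response function standing in for the eigenvalue or depletion solver.

      import numpy as np

      def probability_table(pdf, lo, hi, n_bins=20):
          # Discretize a density into (value, weight) pairs: bin midpoints
          # weighted by each bin's probability mass, normalized at the end.
          edges = np.linspace(lo, hi, n_bins + 1)
          mids = 0.5 * (edges[:-1] + edges[1:])
          w = np.empty(n_bins)
          for i, (a, b) in enumerate(zip(edges[:-1], edges[1:])):
              xs = np.linspace(a, b, 50)
              w[i] = np.trapz(pdf(xs), xs)
          return mids, w / w.sum()

      pdf = lambda x: np.exp(-0.5 * ((x - 2.0) / 0.2) ** 2)  # unnormalized Gaussian
      sigma, w = probability_table(pdf, 1.0, 3.0)
      k_eff = 1.0 / (1.0 + 0.1 * sigma)                      # hypothetical response
      mean = np.sum(w * k_eff)
      print("mean:", mean, "std:", np.sqrt(np.sum(w * (k_eff - mean) ** 2)))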

  2. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
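
    For the Onemax case the exact distribution is easy to tabulate directly; a small sketch of the stated decomposition (new fitness = old fitness minus flipped ones plus flipped zeros, the two binomials being independent):

      import numpy as np
      from scipy import stats

      def onemax_fitness_pmf(n, k, p):
          # New fitness = k - X + Y, with X ~ Bin(k, p) ones flipped off
          # and Y ~ Bin(n - k, p) zeros flipped on; convolve the two.
          px = stats.binom.pmf(np.arange(k + 1), k, p)
          py = stats.binom.pmf(np.arange(n - k + 1), n - k, p)
          pmf = np.zeros(n + 1)
          for x, wx in enumerate(px):
              for y, wy in enumerate(py):
                  pmf[k - x + y] += wx * wy
          return pmf   # pmf[f] = P(new fitness == f), a polynomial in p

      pmf = onemax_fitness_pmf(n=10, k=7, p=0.1)
      print(pmf.sum(), int(pmf.argmax()))   # ~1.0, most probable new fitness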

  3. 10. NORTHWEST END OF WHITSETT PLANT SHOWING PIPELINES PROBABLY FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. NORTHWEST END OF WHITSETT PLANT SHOWING PIPELINES PROBABLY FOR FIRE WATER STORAGE, LOOKING SOUTHWEST. - Whitsett Pump Plant, West side of Colorado River, north of Parker Dam, Parker Dam, San Bernardino County, CA

  4. 2. BARN. VIEW LOOKING NORTHWEST. THE ROLLING DOOR PROBABLY REPLACES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. BARN. VIEW LOOKING NORTHWEST. THE ROLLING DOOR PROBABLY REPLACES AN ORIGINAL 4/4 DOUBLE-HUNG WINDOW. - Tonto Ranger Station, Barn, Forest Service Road 65 at Tonto Wash, Skull Valley, Yavapai County, AZ

  5. 42. VIEW EAST OF PLASTIC STACK (PROBABLY PVC) WHICH VENTED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    42. VIEW EAST OF PLASTIC STACK (PROBABLY PVC) WHICH VENTED FUMES FROM THE DIPPING OPERATIONS IN BUILDING 49A; BUILDING 49 IS AT THE LEFT OF THE PHOTOGRAPH - Scovill Brass Works, 59 Mill Street, Waterbury, New Haven County, CT

  6. Children's Conceptions of Probability--A Psychological and Pedagogical Review.

    ERIC Educational Resources Information Center

    Hawkins, Anne S.; Kapadia, Ramesh

    1984-01-01

    Identifies key questions concerning children's intuitions and conceptions of probabilistic notions. Research on the theoretical framework for probability studies, misconceptions and strategies, and pupil attainment is reviewed, and the methodology is evaluated. Finally, implications for classroom practice are discussed. (MNS)

  7. Adaptive MFR parameter control: fixed vs. variable probabilities of detection

    NASA Astrophysics Data System (ADS)

    Boers, Yvo; Driessen, Hans; Zwaga, Jitse

    2005-09-01

    In this paper an efficient adaptive parameter control scheme for Multi-Function Radar (MFR) is used. This scheme was introduced in Ref. 5. It is designed to meet constraints on specific quantities relevant for target tracking while minimizing the energy spent. It is shown here that this optimal scheme leads to a considerable variation of the realized detection probability, even within a single scenario. We also show that constraining or fixing the probability of detection to a certain predefined value leads to a considerable increase in the energy spent on the target. This holds even when the fixed probability of detection is optimized. The bottom-line message is that the detection probability is not a design parameter by itself, but merely the product of an optimal schedule.

  8. Subjective probability intervals: how to reduce overconfidence by interval evaluation.

    PubMed

    Winman, Anders; Hansson, Patrik; Juslin, Peter

    2004-11-01

    Format dependence implies that assessment of the same subjective probability distribution produces different conclusions about over- or underconfidence depending on the assessment format. In 2 experiments, the authors demonstrate that the overconfidence bias that occurs when participants produce intervals for an uncertain quantity is almost abolished when they evaluate the probability that the same intervals include the quantity. The authors successfully apply a method for adaptive adjustment of probability intervals as a debiasing tool and discuss a tentative explanation in terms of a naive sampling model. According to this view, people report their experiences accurately, but they are naive in that they treat both sample proportion and sample dispersion as unbiased estimators, yielding small bias in probability evaluation but strong bias in interval production.

  9. The role of probability of reinforcement in models of choice.

    PubMed

    Williams, B A

    1994-10-01

    A general account of choice behavior in animals, the cumulative effects model, has been proposed by Davis, Staddon, Machado, and Palmer (1993). Its basic assumptions are that choice occurs in an all-or-none fashion for the response alternative with the highest probability of reinforcement and that the probability of reinforcement for each response alternative is calculated from the entire history of training (total number of reinforced responses/total number of reinforced and nonreinforced responses). The model's reliance on probability of reinforcement as the fundamental variable controlling choice behavior subjects the cumulative effects model to the same criticisms as have been directed toward other related models of choice, notably melioration theory. Several different data sets show that the relative value of a response alternative is not predicted by the obtained probability of reinforcement associated with that alternative. Alternative approaches to choice theory are considered.
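
    The model as described is compact enough to state in a few lines; a sketch under our reading of its two assumptions (value = lifetime reinforcement ratio; choice is all-or-none for the maximum, with ties broken at random):

      import random

      def choose(history):
          # history: alternative -> (reinforced_responses, total_responses).
          value = {a: (r / t if t > 0 else 0.5)
                   for a, (r, t) in history.items()}
          best = max(value.values())
          return random.choice([a for a, v in value.items() if v == best])

      print(choose({"left": (30, 100), "right": (20, 50)}))   # picks "right"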

  10. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
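
    A generic Wald SPRT skeleton showing how targeted false-alarm (α) and missed-detection (β) rates set the decision thresholds; the synthetic log-likelihood-ratio increments are illustrative, whereas the paper's test accumulates collision-probability evidence from successive tracking updates:

      import numpy as np

      def wald_sprt(llr_increments, alpha=1e-3, beta=1e-2):
          # Accumulate log-likelihood-ratio increments until one of
          # Wald's thresholds is crossed.
          upper = np.log((1.0 - beta) / alpha)    # decide H1 (e.g., maneuver)
          lower = np.log(beta / (1.0 - alpha))    # decide H0 (no action)
          llr = 0.0
          for n, step in enumerate(llr_increments, start=1):
              llr += step
              if llr >= upper:
                  return "accept H1", n
              if llr <= lower:
                  return "accept H0", n
          return "undecided", len(llr_increments)

      rng = np.random.default_rng(1)
      print(wald_sprt(rng.normal(0.2, 1.0, size=200)))   # synthetic increments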

  11. Quantum Probability Theory and the Foundations of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Fröhlich, Jürg; Schubnel, Baptiste

    By and large, people are better at coining expressions than at filling them with interesting, concrete contents. Thus, it may not be very surprising that there are many professional probabilists who may have heard the expression but do not appear to be aware of the need to develop "quantum probability theory" into a thriving, rich, useful field featured at meetings and conferences on probability theory. Although our aim, in this essay, is not to contribute new results on quantum probability theory, we hope to be able to let the reader feel the enormous potential and richness of this field. What we intend to do, in the following, is to contribute some novel points of view to the "foundations of quantum mechanics", using mathematical tools from "quantum probability theory" (such as the theory of operator algebras).

  12. Sculpture, general view looking to the seated lions, probably from ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Sculpture, general view looking to the seated lions, probably from the American Bungalow - National Park Seminary, Bounded by Capitol Beltway (I-495), Linden Lane, Woodstove Avenue, & Smith Drive, Silver Spring, Montgomery County, MD

  13. 17. Photocopy of photograph (source uncertain; probably the Bucks County ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    17. Photocopy of photograph (source uncertain; probably the Bucks County Historical Society) ca. 1913, photographer unknown SHOWING 1904 SOCIETY HEADQUARTERS AND THE EARLY PHASES OF CONSTRUCTION ON MERCER'S ADDITION - Mercer Museum, Pine & Ashland Streets, Doylestown, Bucks County, PA

  14. Review of Literature for Model Assisted Probability of Detection

    SciTech Connect

    Meyer, Ryan M.; Crawford, Susan L.; Lareau, John P.; Anderson, Michael T.

    2014-09-30

    This is a draft technical letter report for an NRC client documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components, with the aim of improving field NDE performance estimation.

  15. Probable maximum flood at Lake Chippewa near Winter, Wisconsin

    USGS Publications Warehouse

    Krug, William R.

    1976-01-01

    The probable maximum flood was computed for Lake Chippewa, Wisconsin, and routed through the lake to determine maximum lake stage. The peak discharge of the probable maximum flood at Lake Chippewa was computed to be about 75,000 cubic feet per second, primarily caused by rainfall on the lake. A secondary peak of about 41,000 cubic feet per second was due to streamflow entering Lake Chippewa. The 14-day volume of this flood was 450,000 acre-feet. Using an assumed operating procedure for Winter Dam, the maximum lake stage for the probable maximum flood was computed to be about 1,318 feet above mean sea level--about 3 feet below the dam crest and 6 feet above the proposed normal summer operating level. The probability of this flood occurring in any year is less than 1 in 10,000. (Woodard-USGS)

  16. Use of Spreadsheets in Introductory Statistics and Probability.

    ERIC Educational Resources Information Center

    Mitchell, Brian S.

    1997-01-01

    Provides details of a course that introduces chemical engineering students to a variety of spreadsheet applications. The course is taken concurrently with stoichiometry and introduces statistical analysis, probability, reliability, and quality control. (DDR)

  17. Exploring Probability through an Evens-Odds Dice Game.

    ERIC Educational Resources Information Center

    Quinn, Robert J.; Wiest, Lynda R.

    1999-01-01

    Presents a dice game that students can use as a basis for exploring mathematical probabilities and making decisions while they also exercise skills in multiplication, pattern identification, proportional thinking, and communication. (ASK)

  18. The use of sequential probabilities in the segmentation of speech.

    PubMed

    van der Lugt, A H

    2001-07-01

    The present investigation addresses the possible utility of sequential probabilities in the segmentation of spoken language. In a series of five word-spotting and two control lexical decision experiments, high- versus low-probability consonant-vowel (Experiments 1, 2, 5, and 7) and vowel-consonant (Experiments 1, 3, 4, and 6) strings were presented either in the nonsense contexts of target words (Experiments 1-3) or within the target words themselves (Experiments 4-7). The results suggest that listeners, at least for sequences in the onset position, indeed use sequential probabilities as cues for segmentation. The probability of a sound sequence influenced segmentation more when the sequence occurred within the target words (Experiments 4-7 vs. Experiments 1-3). Furthermore, the effects were reliable only when the sequences occurred in the onset position (Experiments 1, 2, 5, and 7 vs. Experiments 1, 3, 4, and 6). PMID:11521849

  19. 15. STONE BURR MILL PROBABLY USED FOR GRINDING FEED. ONLY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. STONE BURR MILL PROBABLY USED FOR GRINDING FEED. ONLY REMAINING MILL MACHINERY William E. Barrett, photographer, 1973 - Thomas Shepherd's Grist Mill, High Street Vicinity, Shepherdstown, Jefferson County, WV

  20. Coincidence probabilities for spacecraft gravitational wave experiments - Massive coalescing binaries

    NASA Technical Reports Server (NTRS)

    Tinto, Massimo; Armstrong, J. W.

    1991-01-01

    Massive coalescing binary systems are candidate sources of gravitational radiation in the millihertz frequency band accessible to spacecraft Doppler tracking experiments. This paper discusses signal processing and detection probability for waves from coalescing binaries in the regime where the signal frequency increases linearly with time, i.e., 'chirp' signals. Using known noise statistics, thresholds with given false alarm probabilities are established for one- and two-spacecraft experiments. Given the threshold, the detection probability is calculated as a function of gravitational wave amplitude for both one- and two-spacecraft experiments, assuming random polarization states and under various assumptions about wave directions. This allows quantitative statements about the detection efficiency of these experiments and the utility of coincidence experiments. In particular, coincidence probabilities for two-spacecraft experiments are insensitive to the angle between the directions to the two spacecraft, indicating that near-optimal experiments can be done without constraints on spacecraft trajectories.
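
    As a rough illustration of the threshold-then-detect logic this abstract describes, the sketch below fixes a detection threshold from a target false-alarm probability and evaluates detection probability as a function of SNR, assuming a unit-variance Gaussian detection statistic. The statistic, the parameter values, and the simple product rule for two-spacecraft coincidences are illustrative assumptions, not the paper's actual Doppler-noise model.

```python
# Sketch: Neyman-Pearson thresholding for a single detector, plus a
# naive two-spacecraft coincidence, assuming a detection statistic that
# is standard Gaussian under noise and mean-shifted by the SNR under
# signal (an illustrative simplification, not the paper's noise model).
from scipy.stats import norm

def threshold_for_false_alarm(p_fa):
    """Threshold giving false-alarm probability p_fa under pure noise."""
    return norm.isf(p_fa)  # inverse survival function of N(0, 1)

def detection_probability(snr, p_fa):
    """P(statistic > threshold) when a signal of the given SNR is present."""
    return norm.sf(threshold_for_false_alarm(p_fa) - snr)

p_fa = 1e-3
for snr in (1.0, 3.0, 5.0):
    p_one = detection_probability(snr, p_fa)
    # Coincidence between two independent spacecraft: both must cross
    # their own thresholds, so detection probabilities multiply while
    # the joint false-alarm probability drops to p_fa**2.
    print(f"SNR={snr}: single P_det={p_one:.3f}, coincidence P_det={p_one**2:.3f}")
```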

  1. Applications of the Dirichlet distribution to forensic match probabilities.

    PubMed

    Lange, K

    1995-01-01

    The Dirichlet distribution provides a convenient conjugate prior for Bayesian analyses involving multinomial proportions. In particular, allele frequency estimation can be carried out with a Dirichlet prior. If data from several distinct populations are available, then the parameters characterizing the Dirichlet prior can be estimated by maximum likelihood and then used for allele frequency estimation in each of the separate populations. This empirical Bayes procedure tends to moderate extreme multinomial estimates based on sample proportions. The Dirichlet distribution can also be employed to model the contributions from different ancestral populations in computing forensic match probabilities. If the ancestral populations are in genetic equilibrium, then the product rule for computing match probabilities is valid conditional on the ancestral contributions to a typical person of the reference population. This fact facilitates computation of match probabilities and tight upper bounds to match probabilities.
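
    A minimal sketch of the empirical-Bayes smoothing step: with a Dirichlet(α) prior on allele frequencies and multinomial counts n, the posterior mean frequency of allele i is (n_i + α_i)/(N + Σ_j α_j). The counts and prior parameters below are invented for illustration; in the procedure described above, the α would be fitted by maximum likelihood across several populations.

```python
# Sketch: Dirichlet-multinomial smoothing of allele frequency estimates.
# Counts and prior parameters are invented; in the empirical Bayes
# procedure the alphas would be fitted by maximum likelihood using data
# from several distinct populations.
import numpy as np

alpha = np.array([2.0, 5.0, 3.0])   # Dirichlet prior parameters (assumed)
counts = np.array([1, 40, 9])       # allele counts in one population (assumed)

raw = counts / counts.sum()                                       # sample proportions
posterior_mean = (counts + alpha) / (counts.sum() + alpha.sum())  # conjugate update

print("raw estimates:  ", np.round(raw, 3))
print("posterior means:", np.round(posterior_mean, 3))
# The posterior means are pulled toward the prior mean alpha/alpha.sum(),
# moderating the extreme raw estimate for the rare first allele.
```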

  2. A Note on the Sequential Probability Ratio Test

    ERIC Educational Resources Information Center

    Thomas, Ewart A. C.

    1975-01-01

    Given reflection symmetry, the moment generating function symmetry is necessary and sufficient for the random walk model to be equivalent to a sequential probability ratio test. For a related article, see TM 501 715. (Author/RC)
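
    For readers who want the test itself rather than the symmetry result, the following is a generic Wald-style SPRT for a Bernoulli parameter, with stopping boundaries from Wald's classical approximations. All parameters are illustrative; this is not the specific random walk model analyzed in the note.

```python
# Sketch: Wald's sequential probability ratio test for a Bernoulli
# parameter, testing H0: p = p0 against H1: p = p1. Illustrative only.
import math
import random

def sprt(samples, p0=0.5, p1=0.7, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr, n = 0.0, 0
    for x in samples:
        n += 1
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "undecided", n

random.seed(1)
data = (random.random() < 0.7 for _ in range(1000))  # true p = 0.7
print(sprt(data))  # typically accepts H1 after a few dozen observations
```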

  3. Strength of evidence, judged probability, and choice under uncertainty.

    PubMed

    Fox, C R

    1999-02-01

    This paper traces, within subjects, the relationship between assessed strength of evidence, judgments of probability, and decisions under uncertainty. The investigation relies on the theoretical framework provided by support theory (Tversky & Koehler, 1994; Rottenstreich & Tversky, 1997), a nonextensional model of judgment under uncertainty. Fans of professional basketball (N = 50) judged the probability that each of eight teams, four divisions, and two conferences would win the National Basketball Association championship. Additionally, participants rated the relative strength of each team, judged the probability that a given team would win the championship assuming a particular pairing in the finals, priced prospects contingent on the winner of the championship, and made choices between chance prospects. The data conformed to the major tenets of support theory, and the predicted relationships between assessed strength of evidence, hypothetical support, judged probabilities, and choices under uncertainty also held quite well.

  4. Topology of optimally controlled quantum mechanical transition probability landscapes

    SciTech Connect

    Rabitz, H.; Ho, T.-S.; Hsieh, M.; Kosut, R.; Demiralp, M.

    2006-07-15

    An optimally controlled quantum system possesses a search landscape defined by the physical objective as a functional of the control field. This paper particularly explores the topological structure of quantum mechanical transition probability landscapes. The quantum system is assumed to be controllable and the analysis is based on the Euler-Lagrange variational equations derived from a cost function only requiring extremization of the transition probability. It is shown that the latter variational equations are automatically satisfied as a mathematical identity for control fields that produce transition probabilities of either zero or unit value. Similarly, the variational equations are shown to be inconsistent (i.e., they have no solution) for any control field that produces a transition probability different from either of these two extreme values. An upper bound is shown to exist on the norm of the functional derivative of the transition probability with respect to the control field anywhere over the landscape. The trace of the Hessian, evaluated for a control field producing a transition probability of unit value, is shown to be bounded from below. Furthermore, the Hessian at a transition probability of unit value is shown to have an extensive null space and only a finite number of negative eigenvalues. Collectively, these findings show that (a) the transition probability landscape extrema consist of values corresponding to no control or full control, (b) approaching full control involves climbing a gentle slope with no false traps in the control space, and (c) an inherent degree of robustness exists around any full control solution. Although full controllability may not exist in some applications, the analysis provides a basis to understand the evident ease of finding controls that produce excellent yields in simulations and in the laboratory.

  5. Incorporating Skew into RMS Surface Roughness Probability Distribution

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.

  6. Probability density function modeling for sub-powered interconnects

    NASA Astrophysics Data System (ADS)

    Pater, Flavius; Amaricǎi, Alexandru

    2016-06-01

    This paper proposes three mathematical models for the reliability probability density function of interconnects supplied at sub-threshold voltages: spline curve approximations, Gaussian models, and sine interpolation. The proposed analysis aims at determining the most appropriate fit for the switching delay versus probability of correct switching of sub-powered interconnects. We compare the three mathematical models with Monte Carlo simulations of interconnects for 45 nm CMOS technology supplied at 0.25 V.

  7. Weak measurements measure probability amplitudes (and very little else)

    NASA Astrophysics Data System (ADS)

    Sokolovski, D.

    2016-04-01

    Conventional quantum mechanics describes a pre- and post-selected system in terms of virtual (Feynman) paths via which the final state can be reached. In the absence of probabilities, a weak measurement (WM) determines the probability amplitudes for the paths involved. The weak values (WV) can be identified with these amplitudes, or their linear combinations. This allows us to explain the "unusual" properties of the WV, and avoid the "paradoxes" often associated with the WM.

  8. Generalized Sequential Probability Ratio Test for Separate Families of Hypotheses

    PubMed Central

    Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang

    2014-01-01

    In this paper, we consider the problem of testing two separate families of hypotheses via a generalization of the sequential probability ratio test. In particular, the generalized likelihood ratio statistic is considered and the stopping rule is the first boundary crossing of the generalized likelihood ratio statistic. We show that this sequential test is asymptotically optimal in the sense that it achieves asymptotically the shortest expected sample size as the maximal type I and type II error probabilities tend to zero. PMID:27418716

  9. Universality probability of a prefix-free machine.

    PubMed

    Barmpalias, George; Dowe, David L

    2012-07-28

    We study the notion of universality probability of a universal prefix-free machine, as introduced by C. S. Wallace. We show that it is random relative to the third iterate of the halting problem and determine its Turing degree and its place in the arithmetical hierarchy of complexity. Furthermore, we give a computational characterization of the real numbers that are universality probabilities of universal prefix-free machines. PMID:22711870

  10. An Alternative Approach to the Total Probability Formula. Classroom Notes

    ERIC Educational Resources Information Center

    Wu, Dane W.; Bangerter, Laura M.

    2004-01-01

    Given a set of urns, each filled with a mix of black chips and white chips, what is the probability of drawing a black chip from the last urn after some sequential random shifts of chips among the urns? The Total Probability Formula (TPF) is the common tool to solve such a problem. However, when the number of urns is more than two and the number…
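
    Although this abstract is truncated, the setup is straightforward to compute: the color of each transferred chip is marginalized out by the total probability formula, so a single recursive update handles any number of urns. A sketch with assumed urn contents:

```python
# Sketch: probability of drawing a black chip from the last urn after one
# chip is drawn at random from each urn and moved into the next urn.
# Urn contents are invented. Key step: if the incoming chip is black with
# probability p, total probability over its colour gives
# P(black drawn from this urn) = (b + p) / (n + 1).
def black_probability(urns):
    """urns: list of (black, white) counts; returns P(black from last urn)."""
    b, w = urns[0]
    p = b / (b + w)                # P(chip leaving urn 1 is black)
    for b, w in urns[1:]:
        p = (b + p) / (b + w + 1)  # urn holds one extra, colour-uncertain chip
    return p

print(black_probability([(3, 2), (1, 4), (2, 2)]))  # ≈ 0.453
```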

  11. Universality probability of a prefix-free machine.

    PubMed

    Barmpalias, George; Dowe, David L

    2012-07-28

    We study the notion of universality probability of a universal prefix-free machine, as introduced by C. S. Wallace. We show that it is random relative to the third iterate of the halting problem and determine its Turing degree and its place in the arithmetical hierarchy of complexity. Furthermore, we give a computational characterization of the real numbers that are universality probabilities of universal prefix-free machines.

  12. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    NASA Astrophysics Data System (ADS)

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    2014-06-01

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented along with two solution strategies: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  13. Error probabilities in optical PPM receivers with Gaussian mixture densities

    NASA Technical Reports Server (NTRS)

    Gagliardi, R. M.

    1982-01-01

    A Gaussian mixture density arises when a discrete variable (e.g., a photodetector count variable) is added to a continuous Gaussian variable (e.g., thermal noise). Making use of some properties of photomultiplier Gaussian mixture distributions, approximate error probability formulas can be derived. These appear as averages of M-ary orthogonal Gaussian error probabilities. The use of a pure Gaussian assumption is also considered and, when properly defined, it provides an accurate upper bound on the error probability.
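
    A numerical sketch of the averaging step, under an assumed model in which the matched PPM slot output is Gaussian with mean equal to the (Poisson-distributed) photo-count and every slot carries the same thermal-noise variance; all parameters are invented:

```python
# Sketch: average M-ary orthogonal Gaussian error probabilities over a
# Poisson photo-count distribution (a toy Gaussian mixture model for a
# PPM receiver; the signal model and parameters are illustrative).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, poisson

def p_error(M=16, mean_count=20.0, sigma=3.0, kmax=80):
    def p_correct_given_k(k):
        # Signal slot ~ N(k, sigma^2); the other M-1 slots ~ N(0, sigma^2).
        integrand = lambda x: (norm.pdf(x, loc=k, scale=sigma)
                               * norm.cdf(x, scale=sigma) ** (M - 1))
        val, _ = quad(integrand, k - 10 * sigma, k + 10 * sigma)
        return val
    ks = np.arange(kmax + 1)
    weights = poisson.pmf(ks, mean_count)   # photo-count distribution
    return 1.0 - sum(w * p_correct_given_k(k) for k, w in zip(ks, weights))

print(f"P(symbol error) ≈ {p_error():.2e}")
```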

  14. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    SciTech Connect

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    2014-06-19

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented along with two solution strategies: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  15. Persistence probabilities of the German DAX and Shanghai Index

    NASA Astrophysics Data System (ADS)

    Ren, F.; Zheng, B.; Lin, H.; Wen, L. Y.; Trimper, S.

    2005-05-01

    We present a relatively detailed analysis of the persistence probability distributions in financial dynamics. Compared with the auto-correlation function, the persistence probability distributions describe dynamic correlations nonlocal in time. Universal and non-universal behaviors of the German DAX and Shanghai Index are analyzed, and numerical simulations of some microscopic models are also performed. Around the fixed point z0=0, the interacting herding model produces the scaling behavior of the real markets.

  16. Estimation of the probability of error without ground truth and known a priori probabilities. [remote sensor performance

    NASA Technical Reports Server (NTRS)

    Havens, K. A.; Minster, T. C.; Thadani, S. G.

    1976-01-01

    The probability of error or, alternatively, the probability of correct classification (PCC) is an important criterion in analyzing the performance of a classifier. Labeled samples (those with ground truth) are usually employed to evaluate the performance of a classifier. Occasionally, the numbers of labeled samples are inadequate, or no labeled samples are available to evaluate a classifier's performance; for example, when crop signatures from one area from which ground truth is available are used to classify another area from which no ground truth is available. This paper reports the results of an experiment to estimate the probability of error using unlabeled test samples (i.e., without the aid of ground truth).

  17. Anticipating abrupt shifts in temporal evolution of probability of eruption

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Loschetter, Annick

    2016-04-01

    Estimating the probability of eruption by jointly accounting for different sources of monitoring parameters over time is a key component of volcano risk management. In the present study, we are interested in the transition from a state of low-to-moderate probability to one of high probability: the latter generally supports the call for evacuation. Using the data of the MESIMEX exercise at the Vesuvius volcano, we investigated the potential of time-varying indicators related to the correlation structure or to the variability of the probability time series for detecting this critical transition in advance. We found that changes in the power spectra and in the standard deviation estimated over a rolling time window both present an abrupt increase, which marks the approaching shift. Our numerical experiments revealed that the transition from an eruption probability of 10-15% to >70% could be identified up to 4 hours in advance, ~2.5 days before the evacuation call (decided for an eruption probability >80% during the MESIMEX exercise). This additional lead time could be useful to place different key services (e.g., emergency services for vulnerable groups, commandeering additional transportation means, etc.) on a higher level of alert before the actual call for evacuation.
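
    A toy sketch of the rolling-window indicator on a synthetic probability series (the real input would be the elicited MESIMEX eruption-probability time series):

```python
# Sketch: rolling standard deviation as an early-warning indicator of an
# approaching shift in a probability time series. Series is synthetic:
# slow drift with growing fluctuations, then an abrupt jump at t = 800.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1000)
p = 0.12 + 0.0001 * t + rng.normal(0.0, 0.005 * (1 + t / 400))
p[800:] += 0.6

window = 50
rolling_std = np.array([p[max(0, i - window):i + 1].std() for i in t])

# Simple alarm rule: flag when the indicator doubles its early baseline.
baseline = rolling_std[window:200].mean()
alarm = int(np.argmax(rolling_std > 2 * baseline))
print(f"rolling std first exceeds 2x its baseline at t = {alarm}")
```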

  18. Reinforcement Probability Modulates Temporal Memory Selection and Integration Processes

    PubMed Central

    Matell, Matthew S.; Kurti, Allison N.

    2013-01-01

    We have previously shown that rats trained in a mixed-interval peak procedure (tone = 4s, light = 12s) respond in a scalar manner at a time in between the trained peak times when presented with the stimulus compound (Swanton & Matell, 2011). In our previous work, the two component cues were reinforced with different probabilities (short = 20%, long = 80%) to equate response rates, and we found that the compound peak time was biased toward the cue with the higher reinforcement probability. Here, we examined the influence that different reinforcement probabilities have on the temporal location and shape of the compound response function. We found that the time of peak responding shifted as a function of the relative reinforcement probability of the component cues, becoming earlier as the relative likelihood of reinforcement associated with the short cue increased. However, as the relative probabilities of the component cues grew dissimilar, the compound peak became non-scalar, suggesting that the temporal control of behavior shifted from a process of integration to one of selection. As our previous work has utilized durations and reinforcement probabilities more discrepant than those used here, these data suggest that the processes underlying the integration/selection decision for time are based on cue value. PMID:23896560

  19. The preference of probability over negative values in action selection.

    PubMed

    Neyedli, Heather F; Welsh, Timothy N

    2015-01-01

    It has previously been found that when participants are presented with a pair of motor prospects, they can select the prospect with the largest maximum expected gain (MEG). Many of those decisions, however, were trivial because of large differences in MEG between the prospects. The purpose of the present study was to explore participants' preferences when making non-trivial decisions between two motor prospects. Participants were presented with pairs of prospects that: 1) differed in MEG with either only the values or only the probabilities differing between the prospects; and 2) had similar MEG with one prospect having a larger probability of hitting the target and a higher penalty value and the other prospect a smaller probability of hitting the target but a lower penalty value. In different experiments, participants either had 400 ms or 2000 ms to decide between the prospects. It was found that participants chose the configuration with the larger MEG more often when the probability varied between prospects than when the value varied. In pairs with similar MEGs, participants preferred a larger probability of hitting the target over a smaller penalty value. These results indicate that participants prefer probability information over negative value information in a motor selection task.

  20. Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.

    PubMed

    Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark

    2016-03-01

    Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
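
    The expected-accuracy gap behind this anomaly is easy to make concrete: if the more frequent of two outcomes occurs with probability p, matching is correct with probability p² + (1-p)², while maximizing is correct with probability p. A sketch:

```python
# Sketch: expected prediction accuracy of probability matching versus
# maximizing, for a binary outcome occurring with probability p.
for p in (0.6, 0.75, 0.9):
    matching = p * p + (1 - p) * (1 - p)  # predict each event at its own rate
    maximizing = p                        # always predict the frequent event
    print(f"p={p}: matching={matching:.3f}, maximizing={maximizing:.3f}")
# e.g. p=0.75: matching is correct 62.5% of the time, maximizing 75%,
# which is why matching lowers expected accuracy.
```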

  1. The preference of probability over negative values in action selection.

    PubMed

    Neyedli, Heather F; Welsh, Timothy N

    2015-01-01

    It has previously been found that when participants are presented with a pair of motor prospects, they can select the prospect with the largest maximum expected gain (MEG). Many of those decisions, however, were trivial because of large differences in MEG between the prospects. The purpose of the present study was to explore participants' preferences when making non-trivial decisions between two motor prospects. Participants were presented with pairs of prospects that: 1) differed in MEG with either only the values or only the probabilities differing between the prospects; and 2) had similar MEG with one prospect having a larger probability of hitting the target and a higher penalty value and the other prospect a smaller probability of hitting the target but a lower penalty value. In different experiments, participants either had 400 ms or 2000 ms to decide between the prospects. It was found that participants chose the configuration with the larger MEG more often when the probability varied between prospects than when the value varied. In pairs with similar MEGs, participants preferred a larger probability of hitting the target over a smaller penalty value. These results indicate that participants prefer probability information over negative value information in a motor selection task. PMID:25004846

  2. Revising probability estimates: Why increasing likelihood means increasing impact.

    PubMed

    Maglio, Sam J; Polman, Evan

    2016-08-01

    Forecasted probabilities rarely stay the same for long. Instead, they are subject to constant revision; as they move upward or downward, uncertain events become more or less likely. Yet little is known about how people interpret probability estimates beyond static snapshots, like a 30% chance of rain. Here, we consider the cognitive, affective, and behavioral consequences of revisions to probability forecasts. Stemming from a lay belief that revisions signal the emergence of a trend, we find in 10 studies (comprising uncertain events such as weather, climate change, sex, sports, and wine) that upward changes to event-probability (e.g., increasing from 20% to 30%) cause events to feel less remote than downward changes (e.g., decreasing from 40% to 30%), and subsequently change people's behavior regarding those events despite the revised event-probabilities being the same. Our research sheds light on how revising the probabilities for future events changes how people manage those uncertain events. PMID:27281350

  3. Establishment probability in fluctuating environments: a branching process model.

    PubMed

    Haccou, P; Iwasa, Y

    1996-12-01

    We study the establishment probability of invaders in stochastically fluctuating environments and the related issue of the extinction probability of small populations in such environments, by means of an inhomogeneous branching process model. In the model it is assumed that individuals reproduce asexually during discrete reproduction periods. Within each period, individuals have (independent) Poisson distributed numbers of offspring. The expected numbers of offspring per individual are independently identically distributed over the periods. It is shown that the establishment probability of an invader varies over the reproduction periods according to a stable distribution. We give a method for simulating the establishment probabilities and approximations for the expected establishment probability. Furthermore, we show that, due to the stochasticity of establishment success over different periods, the expected success of sequential invasions is larger than that of simultaneous invasions. We also study the effects of environmental fluctuations on the extinction probability of small populations and metapopulations. The results can easily be generalized to offspring distributions other than the Poisson. PMID:9000490
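
    A minimal sketch of the underlying calculation, under an assumed lognormal law for the per-period offspring means: for Poisson(m_t) offspring the extinction probabilities obey the backward recursion q_t = exp(m_t(q_{t+1} - 1)), and the establishment probability is the complement.

```python
# Sketch: establishment probability of a single invader whose offspring
# counts are Poisson(m_t), with m_t drawn independently each period.
# The environment law (lognormal) and its parameters are assumptions.
import numpy as np

rng = np.random.default_rng(42)

def establishment_probability(n_periods=2000):
    ms = rng.lognormal(mean=0.1, sigma=0.5, size=n_periods)
    q = 0.0  # survival to the far horizon is treated as establishment
    for m in ms[::-1]:
        q = np.exp(m * (q - 1.0))  # Poisson offspring pgf evaluated at q
    return 1.0 - q

samples = [establishment_probability() for _ in range(20)]
# The value varies from one environment sequence to the next, echoing the
# paper's point that establishment probability is itself random.
print(f"establishment probability: mean ≈ {np.mean(samples):.3f}, "
      f"range {np.min(samples):.3f}-{np.max(samples):.3f}")
```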

  4. Probability analysis of position errors using uncooled IR stereo camera

    NASA Astrophysics Data System (ADS)

    Oh, Jun Ho; Lee, Sang Hwa; Lee, Boo Hwan; Park, Jong-Il

    2016-05-01

    This paper analyzes the random phenomenon of 3D positions when tracking moving objects using the infrared (IR) stereo camera, and proposes a probability model of 3D positions. The proposed probability model integrates two random error phenomena. One is the pixel quantization error which is caused by discrete sampling pixels in estimating disparity values of stereo camera. The other is the timing jitter which results from the irregular acquisition-timing in the uncooled IR cameras. This paper derives a probability distribution function by combining jitter model with pixel quantization error. To verify the proposed probability function of 3D positions, the experiments on tracking fast moving objects are performed using IR stereo camera system. The 3D depths of moving object are estimated by stereo matching, and be compared with the ground truth obtained by laser scanner system. According to the experiments, the 3D depths of moving object are estimated within the statistically reliable range which is well derived by the proposed probability distribution. It is expected that the proposed probability model of 3D positions can be applied to various IR stereo camera systems that deal with fast moving objects.
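
    A back-of-envelope sketch of the pixel-quantization component alone: with depth Z = fB/d for disparity d, a quantization step Δd maps to a depth error of roughly Z²Δd/(fB), so the error grows quadratically with range. The camera parameters below are invented, not those of the paper's IR rig.

```python
# Sketch: depth error from disparity quantization in a stereo camera.
# Z = f*B/d, so |dZ/dd| = Z^2 / (f*B); all parameters are illustrative.
f_px = 800.0   # focal length in pixels (assumed)
B = 0.3        # stereo baseline in metres (assumed)
delta_d = 0.5  # disparity quantization step in pixels (assumed)

for Z in (5.0, 10.0, 20.0):
    d = f_px * B / Z
    dZ = Z ** 2 * delta_d / (f_px * B)
    print(f"Z={Z:5.1f} m  disparity={d:6.1f} px  quantization error ≈ {dZ:.3f} m")
```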

  5. Probability fields revisited in the context of ensemble Kalman filtering

    NASA Astrophysics Data System (ADS)

    Xu, Teng; Gómez-Hernández, J. Jaime

    2015-12-01

    Hu et al. (2013) proposed an approach to update complex geological facies models generated by multiple-point geostatistical simulation while keeping geological and statistical consistency. Their approach is based on mapping the facies realization onto the spatially uncorrelated uniform random numbers used by the sequential multiple-point simulation to generate the facies realization itself. The ensemble Kalman filter was then used to update the uniform random number realizations, which were then used to generate a new facies realization by multiple-point simulation. This approach does not perform well, which we attribute to the fact that, since the probabilities are random and spatially uncorrelated, their correlation with the state variable (piezometric heads) is very weak and the Kalman gain is always small. The approach is reminiscent of probability field simulation, which also maps the conductivity realizations onto a field of uniform random numbers, although the mapping there is done using the local conditional distribution functions built from a prior statistical model and the conditioning data. Contrary to the Hu et al. (2013) approach, this field of uniform random numbers, termed a probability field, displays spatial patterns related to the conductivity spatial patterns, and, therefore, the correlation between probabilities and state variable is as strong as the correlation between conductivities and state variable could be. Similarly to Hu et al. (2013), we propose to use the ensemble Kalman filter to update the probability fields, and show that the existence of this correlation between probability values and state variables provides better results.

  6. Prediction of identity by descent probabilities from marker-haplotypes.

    PubMed

    Meuwissen, T H; Goddard, M E

    2001-01-01

    The prediction of identity by descent (IBD) probabilities is essential for all methods that map quantitative trait loci (QTL). The IBD probabilities may be predicted from marker genotypes and/or pedigree information. Here, a method is presented that predicts IBD probabilities at a given chromosomal location given data on a haplotype of markers spanning that position. The method is based on a simplification of the coalescence process, and assumes that the number of generations since the base population and the effective population size are known, although the effective size may be estimated from the data. The probability that two gametes are IBD at a particular locus increases as the number of markers surrounding the locus with identical alleles increases. This effect is more pronounced when effective population size is high. Hence, as effective population size increases, the IBD probabilities become more sensitive to the marker data, which should favour finer-scale mapping of the QTL. The IBD probability prediction method was developed for the situation where the pedigree of the animals is unknown (i.e., all information comes from the marker genotypes), and the situation where, say, T generations of unknown pedigree are followed by some generations where pedigree and marker genotypes are known.

  7. On the probability of cure for heavy-ion radiotherapy.

    PubMed

    Hanin, Leonid; Zaider, Marco

    2014-07-21

    The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells) that largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
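
    A minimal sketch of the Poissonian cure-probability relation on which such bounds rest: if n0 clonogenic cells each independently survive treatment with probability s, the number of survivors is approximately Poisson(n0·s) and the cure probability is exp(-n0·s), so bounds on n0 translate directly into bounds on the cure probability. The numbers below are invented:

```python
# Sketch: Poisson tumour-control (cure) probability, exp(-n0 * s), for a
# grid of assumed clonogen numbers n0 and per-cell survival fractions s.
import math

for n0 in (1e6, 1e8):
    for s in (1e-7, 1e-8):
        tcp = math.exp(-n0 * s)
        print(f"n0={n0:.0e}, per-cell survival={s:.0e}: P(cure) ≈ {tcp:.3f}")
# An upper bound on n0 yields a lower bound on P(cure), and vice versa.
```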

  8. On the probability of cure for heavy-ion radiotherapy

    NASA Astrophysics Data System (ADS)

    Hanin, Leonid; Zaider, Marco

    2014-07-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells) that largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.

  9. Anticipating abrupt shifts in temporal evolution of probability of eruption

    NASA Astrophysics Data System (ADS)

    Rohmer, J.; Loschetter, A.

    2016-04-01

    Estimating the probability of eruption by jointly accounting for different sources of monitoring parameters over time is a key component for volcano risk management. In the present study, we are interested in the transition from a state of low-to-moderate probability value to a state of high probability value. By using the data of MESIMEX exercise at the Vesuvius volcano, we investigated the potential for time-varying indicators related to the correlation structure or to the variability of the probability time series for detecting in advance this critical transition. We found that changes in the power spectra and in the standard deviation estimated over a rolling time window both present an abrupt increase, which marks the approaching shift. Our numerical experiments revealed that the transition from an eruption probability of 10-15% to > 70% could be identified up to 1-3 h in advance. This additional lead time could be useful to place different key services (e.g., emergency services for vulnerable groups, commandeering additional transportation means, etc.) on a higher level of alert before the actual call for evacuation.

  10. Bounds on probability of transformations between multipartite pure states

    SciTech Connect

    Cui Wei; Helwig, Wolfram; Lo, Hoi-Kwong

    2010-01-15

    For a tripartite pure state of three qubits, it is well known that there are two inequivalent classes of genuine tripartite entanglement, namely the Greenberger-Horne-Zeilinger (GHZ) class and the W class. Any two states within the same class can be transformed into each other with stochastic local operations and classical communication with a nonzero probability. The optimal conversion probability, however, is only known for special cases. Here, lower and upper bounds are derived for the optimal probability of transformation from a GHZ state to other states of the GHZ class. A key idea in the derivation of the upper bounds is to consider the action of the local operations and classical communication (LOCC) protocol on a different input state, namely (1/√2)(|000⟩ - |111⟩), and to demand that the probability of an outcome remains bounded by 1. We also find an upper bound for more general cases by using the constraints of the so-called interference term and 3-tangle. Moreover, some of the results are generalized to the case in which each party holds a higher dimensional system. In particular, the GHZ state generalized to three qutrits, that is, |GHZ₃⟩ = (1/√3)(|000⟩ + |111⟩ + |222⟩), shared among three parties can be transformed to any tripartite three-qubit pure state with probability 1 via LOCC. Some of our results can also be generalized to the case of a multipartite state shared by more than three parties.

  11. Kr II transition probability measurements for the UV spectral region

    NASA Astrophysics Data System (ADS)

    Belmonte, M. T.; Gavanski, L.; Peláez, R. J.; Aparicio, J. A.; Djurović, S.; Mar, S.

    2016-02-01

    The determination of radiative transition probabilities or oscillator strengths is of common interest in astrophysics. High-resolution stellar spectra can now be analyzed to estimate stellar abundances. In this paper, 93 experimentally obtained transition probability values (A_ki) for singly ionized krypton spectral lines belonging to the ultraviolet (UV) wavelength region 208-360 nm are presented. These data, expressed in absolute units, were derived from measurements of relative spectral line intensities and transition probability data taken from the literature. The results obtained considerably extend the transition probability database. As a light source, a plasma from a low-pressure pulsed arc was used. Its electron density was in the range (1.5-3.4) × 10²² m⁻³, while the temperature was between 28,000 and 35,000 K. A detailed analysis of the results is also given. Only a few relative and a few absolute transition probabilities from other authors, for the mentioned spectral region, are available in the literature.

  12. The probability of genetic parallelism and convergence in natural populations.

    PubMed

    Conte, Gina L; Arnegard, Matthew E; Peichel, Catherine L; Schluter, Dolph

    2012-12-22

    Genomic and genetic methods allow investigation of how frequently the same genes are used by different populations during adaptive evolution, yielding insights into the predictability of evolution at the genetic level. We estimated the probability of gene reuse in parallel and convergent phenotypic evolution in nature using data from published studies. The estimates are surprisingly high, with mean probabilities of 0.32 for genetic mapping studies and 0.55 for candidate gene studies. The probability declines with increasing age of the common ancestor of compared taxa, from about 0.8 for young nodes to 0.1-0.4 for the oldest nodes in our study. Probability of gene reuse is higher when populations begin from the same ancestor (genetic parallelism) than when they begin from divergent ancestors (genetic convergence). Our estimates are broadly consistent with genomic estimates of gene reuse during repeated adaptation to similar environments, but most genomic studies lack data on phenotypic traits affected. Frequent reuse of the same genes during repeated phenotypic evolution suggests that strong biases and constraints affect adaptive evolution, resulting in changes at a relatively small subset of available genes. Declines in the probability of gene reuse with increasing age suggest that these biases diverge with time.

  13. Integrated analysis of incidence, progression, regression and disappearance probabilities

    PubMed Central

    Huang, Guan-Hua

    2008-01-01

    Background Age-related maculopathy (ARM) is a leading cause of vision loss in people aged 65 or older. ARM is distinctive in that it is a disease which can transition through incidence, progression, regression and disappearance. The purpose of this study is to develop methodologies for studying the relationship of risk factors with different transition probabilities. Methods Our framework for studying this relationship includes two different analytical approaches. In the first approach, one can define, model and estimate the relationship between each transition probability and risk factors separately. This approach is similar to constraining a population to a certain disease status at the baseline, and then analyzing the probability of the constrained population to develop a different status. While this approach is intuitive, one risks losing available information while at the same time running into the problem of insufficient sample size. The second approach specifies a transition model for analyzing such a disease. This model provides the conditional probability of a current disease status based upon a previous status, and can therefore jointly analyze all transition probabilities. Throughout the paper, an analysis to determine the birth cohort effect on ARM is used as an illustration. Results and conclusion This study has found parallel separate and joint analyses to be more enlightening than any analysis in isolation. By implementing both approaches, one can obtain more reliable and more efficient results. PMID:18577235
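
    As a minimal sketch of the second (joint transition-model) approach, the chain below propagates a cohort through a hypothetical three-state Markov model; the states and probabilities are invented, not estimates from the ARM data:

```python
# Sketch: a discrete-state transition model for a disease that can be
# incident, progress, and regress. All probabilities are hypothetical.
import numpy as np

states = ["none", "early", "late"]
# Row = current status, column = status at the next visit (rows sum to 1).
P = np.array([
    [0.92, 0.08, 0.00],   # none  -> incidence with probability 0.08
    [0.10, 0.75, 0.15],   # early -> regression 0.10, progression 0.15
    [0.00, 0.05, 0.95],   # late  -> regression 0.05
])

dist = np.array([1.0, 0.0, 0.0])   # cohort starts disease-free
for visit in range(1, 6):
    dist = dist @ P
    print(f"visit {visit}: " +
          ", ".join(f"{s}={p:.3f}" for s, p in zip(states, dist)))
```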

  14. Multiclass Posterior Probability Twin SVM for Motor Imagery EEG Classification.

    PubMed

    She, Qingshan; Ma, Yuliang; Meng, Ming; Luo, Zhizeng

    2015-01-01

    Motor imagery electroencephalography is widely used in brain-computer interface systems. Due to the inherent characteristics of electroencephalography signals, accurate and real-time multiclass classification is always challenging. In order to solve this problem, a multiclass posterior probability solution for the twin SVM is proposed in this paper, based on ranking continuous outputs and pairwise coupling. First, a two-class posterior probability model is constructed to approximate the posterior probability using the ranking continuous output technique and Platt's estimation method. Second, a solution for multiclass probabilistic outputs for the twin SVM is provided by combining every pair of class probabilities according to the method of pairwise coupling. Finally, the proposed method is compared with multiclass SVM and twin SVM via voting, and with multiclass posterior probability SVM using different coupling approaches. The efficacy of the proposed method, in terms of classification accuracy and time complexity, is demonstrated on both the UCI benchmark datasets and real-world EEG data from BCI Competition IV Dataset 2a. PMID:26798330

  15. Probability as Possibility Spaces: Communicating Uncertainty to Policymakers

    NASA Astrophysics Data System (ADS)

    Stiso, C.

    2015-12-01

    One problem facing the sciences is communicating results and recommendations to policymakers. This is perhaps particularly difficult in the geosciences where results are often based on probabilistic models, as probability is often and unduly equated with a specific kind of uncertainty or unreliability in the results. This leads to a great deal of miscommunication and misguided policy decisions. It is, then, valid to ask how scientists should talk about probability, uncertainty, and models in a way that correctly conveys what the users of these models intend. What I propose is a new way to think and, importantly, talk about probability which will hopefully make this much more transparent to both users and policy makers. Rather than using a frequentist (prior percentages) or Bayesian (observer uncertainty) framework, we should talk about probability as a tool for defining a possibility space for measurements. This model is conceptually simple and makes probability a tool of refinement rather than a source of inaccuracy. A similar possibility-space model has proven useful in the climate sciences and there is good reason to believe it will have similar applications in hydrology.

  16. Duration discrimination: effects of probability of stimulus presentation.

    PubMed

    Elsmore, T F

    1972-11-01

    Monkeys initiated a stimulus by pressing the center one of three levers, and the stimulus terminated independently of behavior 60, 80, 90, or 100 sec later. Presses on the right lever were reinforced with food following the three briefer durations, and presses on the left lever following the 100-sec duration. Incorrect responses produced a 10-sec timeout. The probability of presenting the 100-sec duration was manipulated in the range from 0.25 to 0.75, with the probabilities of the briefer durations remaining equal and summing to one minus the probability of the 100-sec duration. The percentage of responses on either side lever was functionally related both to the probability of presenting the 100-sec stimulus and to stimulus duration. An analysis of the data based on the theory of signal detection resulted in operating characteristics that were linear when plotted on normal-normal coordinates. The percentage of responses on either lever approximated the optimal values for maximizing reinforcement probability in each condition of the experiment.

  17. Rejecting probability summation for radial frequency patterns, not so Quick!

    PubMed

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than would be possible if the parts were detected independently and performance improved with an increasing number of cycles only through probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT) rather than Signal Detection Theory (SDT). We conducted rating-scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments, finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF. PMID:26975501
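
    A toy sketch of the two summation rules being contrasted, under simplifying assumptions (one independent channel per modulated cycle, equal signal per channel): both predictions improve as cycles are added, which is why the choice of model matters when interpreting summation slopes.

```python
# Sketch: probability summation under High Threshold Theory versus a
# Signal Detection Theory max rule (2AFC), for n independent channels.
# The per-cycle detection probability p and per-cycle d' are assumptions.
import numpy as np

def htt_summation(p, n):
    # HTT: the pattern is seen if any of the n cycles is detected.
    return 1 - (1 - p) ** n

def sdt_max_rule(dprime, n, trials=200_000, seed=3):
    # SDT: the observer picks the interval with the larger maximum
    # response; in the signal interval every channel is boosted by d'.
    rng = np.random.default_rng(seed)
    sig = rng.normal(size=(trials, n)) + dprime
    noise = rng.normal(size=(trials, n))
    return (sig.max(axis=1) > noise.max(axis=1)).mean()

for n in (1, 2, 4, 8):
    print(f"n={n}: HTT P(detect)={htt_summation(0.3, n):.3f}  "
          f"SDT max-rule P(correct)={sdt_max_rule(1.0, n):.3f}")
```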

  18. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories.

  19. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
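
    At its core, CPA is a thresholded conditional probability. A self-contained sketch on synthetic stand-in data (CProb wraps this kind of calculation, plus plotting, in Excel/R):

```python
# Sketch: P(poor condition | stressor > threshold) over a range of
# thresholds, on synthetic data standing in for paired stressor/response
# observations.
import numpy as np

rng = np.random.default_rng(7)
stressor = rng.lognormal(0.0, 0.8, size=500)
# Condition worsens with stress via an assumed logistic relationship.
poor = rng.random(500) < 1.0 / (1.0 + np.exp(2.0 - stressor))

for thr in np.quantile(stressor, [0.25, 0.5, 0.75, 0.9]):
    exceed = stressor > thr
    p_cond = poor[exceed].mean()   # empirical conditional probability
    print(f"P(poor | stressor > {thr:5.2f}) = {p_cond:.2f}  (n={exceed.sum()})")
```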

  20. TRANSIT PROBABILITIES FOR STARS WITH STELLAR INCLINATION CONSTRAINTS

    SciTech Connect

    Beatty, Thomas G.; Seager, Sara

    2010-04-01

    The probability that an exoplanet transits its host star is high for planets in close orbits, but drops off rapidly for increasing semimajor axes. This makes transit surveys for planets with large semimajor axes orbiting bright stars impractical, since one would need to continuously observe hundreds of stars that are spread out over the entire sky. One way to make such a survey tractable is to constrain the inclination of the stellar rotation axes in advance, and thereby enhance the transit probabilities. We derive transit probabilities for stars with stellar inclination constraints, considering a reasonable range of planetary system inclinations. We find that stellar inclination constraints can improve the transit probability by almost an order of magnitude for habitable-zone planets. When applied to an ensemble of stars, such constraints dramatically lower the number of stars that need to be observed in a targeted transit survey. We also consider multiplanet systems where only one planet has an identified transit and derive the transit probabilities for the second planet assuming a range of mutual planetary inclinations.
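
    A Monte Carlo sketch of the geometric enhancement, assuming an edge-on stellar-inclination constraint and a small star-planet misalignment; all numbers are invented and the geometry ignores planet radius and eccentricity:

```python
# Sketch: transit probability with and without a stellar inclination
# constraint. A transit requires |cos i| < R*/a (point planet, circular
# orbit). Constraint widths and R*/a are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(11)
N = 1_000_000
r_over_a = 0.005                       # R*/a for a wide orbit (assumed)

# Unconstrained: isotropic orbit orientations, cos(i) uniform on [0, 1].
p_iso = (rng.uniform(0.0, 1.0, N) < r_over_a).mean()

# Constrained: star known to be edge-on to within ~5 deg; planet assumed
# aligned with the stellar equator to within ~2 deg (1 sigma).
i_star = np.radians(90.0 + rng.uniform(-5.0, 5.0, N))
i_planet = i_star + np.radians(rng.normal(0.0, 2.0, N))
p_con = (np.abs(np.cos(i_planet)) < r_over_a).mean()

print(f"isotropic: {p_iso:.4f} (analytic {r_over_a})  "
      f"constrained: {p_con:.4f}  enhancement ≈ {p_con / p_iso:.0f}x")
```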

  1. Domestic wells have high probability of pumping septic tank leachate

    NASA Astrophysics Data System (ADS)

    Horn, J. E.; Harter, T.

    2011-06-01

    Onsite wastewater treatment systems such as septic systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic system and a private drinking water well. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. Particularly in areas with small lots, and thus a high septic system density, these typically shallow wells are prone to contamination by septic system leachate. Typically, mass balance approaches are used to determine a maximum septic system density that would prevent contamination of the aquifer. In this study, we estimate the probability of a well pumping partially septic system leachate. A detailed groundwater and transport model is used to calculate the capture zone of a typical drinking water well. A spatial probability analysis is performed to assess the probability that a capture zone overlaps with a septic system drainfield depending on aquifer properties, lot and drainfield size. We show that a high septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We conclude that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances which experience limited attenuation, and for those that are harmful even at low concentrations.

  2. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. PMID:25590946

  3. Sufficient Statistics for Divergence and the Probability of Misclassification

    NASA Technical Reports Server (NTRS)

    Quirein, J.

    1972-01-01

    One particular aspect of the feature selection problem is considered, namely that which results from the transformation x = Bz, where B is a k by n matrix of rank k with k ≤ n. It is shown that, in general, such a transformation results in a loss of information. In terms of the divergence, this is equivalent to the fact that the average divergence computed using the variable x is less than or equal to the average divergence computed using the variable z. A loss of information in terms of the probability of misclassification is shown to be equivalent to the fact that the probability of misclassification computed using variable x is greater than or equal to the probability of misclassification computed using variable z. First, the necessary facts relating k-dimensional and n-dimensional integrals are derived. Then the stated results about the divergence and probability of misclassification are derived. Finally, it is shown that if no information is lost (in x = Bz) as measured by the divergence, then no information is lost as measured by the probability of misclassification.

  4. Assault frequency and preformation probability of the {alpha} emission process

    SciTech Connect

    Zhang, H. F.; Royer, G.; Li, J. Q.

    2011-08-15

    A study of the assault frequency and preformation factor of the α-decay description is performed from the experimental α-decay constant and the penetration probabilities calculated from the generalized liquid-drop model (GLDM) potential barriers. To determine the assault frequency, a quantum-mechanical method using a harmonic oscillator is introduced and leads to values of around 10^21 s^-1, similar to the ones calculated within the classical method. The preformation probability is around 10^-1 to 10^-2. The results for even-even Po isotopes are discussed for illustration. While the assault frequency presents only a shallow minimum in the vicinity of the magic neutron number 126, the preformation factor and mainly the penetration probability diminish strongly around N = 126.

  5. Testing the Value of Probability Forecasts for Calibrated Combining

    PubMed Central

    Lahiri, Kajal; Peng, Huaming; Zhao, Yongchen

    2014-01-01

    We combine the probability forecasts of a real GDP decline from the U.S. Survey of Professional Forecasters, after trimming the forecasts that do not have “value”, as measured by the Kuiper Skill Score and in the sense of Merton (1981). For this purpose, we use a simple test to evaluate the probability forecasts. The proposed test does not require the probabilities to be converted to binary forecasts before testing, and it accommodates serial correlation and skewness in the forecasts. We find that the number of forecasters making valuable forecasts decreases sharply as the horizon increases. The beta-transformed linear pool combination scheme, based on the valuable individual forecasts, is shown to outperform the simple average for all horizons on a number of performance measures, including calibration and sharpness. The test helps to identify the good forecasters ex ante, and therefore contributes to the accuracy of the combined forecasts. PMID:25530646

  6. Imprecise probability assessment of tipping points in the climate system

    PubMed Central

    Kriegler, Elmar; Hall, Jim W.; Held, Hermann; Dawson, Richard; Schellnhuber, Hans Joachim

    2009-01-01

    Major restructuring of the Atlantic meridional overturning circulation, the Greenland and West Antarctic ice sheets, the Amazon rainforest and ENSO, are a source of concern for climate policy. We have elicited subjective probability intervals for the occurrence of such major changes under global warming from 43 scientists. Although the expert estimates highlight large uncertainty, they allocate significant probability to some of the events listed above. We deduce conservative lower bounds for the probability of triggering at least 1 of those events of 0.16 for medium (2–4 °C), and 0.56 for high global mean temperature change (above 4 °C) relative to year 2000 levels. PMID:19289827

  7. Quantum Probability Cancellation Due to a Single-Photon State

    NASA Technical Reports Server (NTRS)

    Ou, Z. Y.

    1996-01-01

    When an N-photon state enters a lossless symmetric beamsplitter from one input port, the photon distribution for the two output ports has the form of a Bernoulli binomial, with highest probability at equal partition (N/2 at one output port and N/2 at the other). However, injection of a single-photon state at the other input port can dramatically change the photon distribution at the outputs, resulting in zero probability at equal partition. Such a strong deviation from classical particle theory stems from quantum probability amplitude cancellation. The effect persists even if the N-photon state is replaced by an arbitrary state of light. A special case is the coherent state, which corresponds to homodyne detection of a single-photon state and can lead to the measurement of the wave function of a single-photon state.
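    The classical part of this setup is easy to reproduce: the sketch below computes the binomial output distribution for an N-photon state alone, which peaks at equal partition. The quantum amplitude cancellation caused by the extra single photon is the paper's result and is only noted in a comment, not derived here.

```python
from math import comb

def splitter_distribution(N):
    """Photon-count distribution at the outputs of a lossless 50/50
    beamsplitter with an N-photon state in one input port: binomial,
    peaked at the equal partition N/2 : N/2."""
    return [comb(N, k) / 2**N for k in range(N + 1)]

probs = splitter_distribution(8)
for k, p in enumerate(probs):
    print(f"{k}:{8 - k}  P = {p:.4f}")
# Highest probability at 4:4.  The paper shows that injecting a single
# photon into the other input port dramatically reshapes this
# distribution, producing zero probability at equal partition.
```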

  8. Simulation and Estimation of Extreme Quantiles and Extreme Probabilities

    SciTech Connect

    Guyader, Arnaud; Hengartner, Nicolas; Matzner-Lober, Eric

    2011-10-15

    Let X be a random vector with distribution μ on R^d and Φ be a mapping from R^d to R. That mapping acts as a black box, e.g., the result from some computer experiments for which no analytical expression is available. This paper presents an efficient algorithm to estimate a tail probability given a quantile or a quantile given a tail probability. The algorithm improves upon existing multilevel splitting methods and can be analyzed using Poisson process tools that lead to an exact description of the distribution of the estimated probabilities and quantiles. The performance of the algorithm is demonstrated in a problem related to digital watermarking.

  9. Probability bounds analysis for nonlinear population ecology models.

    PubMed

    Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A

    2015-09-01

    Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals.

  10. Uniform distribution of initial states: The physical basis of probability

    NASA Astrophysics Data System (ADS)

    Zhang Kechen

    1990-02-01

    For repetitive experiments performed on a deterministic system with initial states restricted to a certain region in phase space, the relative frequency of an event has a definite value insensitive to the preparation of the experiments only if the initial states leading to that event are distributed uniformly in the prescribed region. Mechanical models of coin tossing and roulette spinning and equal a priori probability hypothesis in statistical mechanics are considered in the light of this principle. Probabilities that have arisen from uniform distributions of initial states do not necessarily submit to Kolmogorov's axioms of probability. In the finite-dimensional case, a uniform distribution in phase space either in the coarse-grained sense or in the limit sense can be formulated in a unified way.

  11. ProbOnto: ontology and knowledge base of probability distributions

    PubMed Central

    Swat, Maciej J.; Grenon, Pierre; Wimalaratne, Sarala

    2016-01-01

    Motivation: Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists, nor any database allowing programmatic access. Results: ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability and Implementation: http://probonto.org Contact: mjswat@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153608

  12. Defining Predictive Probability Functions for Species Sampling Models

    PubMed Central

    Lee, Jaeyong; Quintana, Fernando A.; Müller, Peter; Trippa, Lorenzo

    2013-01-01

    We review the class of species sampling models (SSM). In particular, we investigate the relation between the exchangeable partition probability function (EPPF) and the predictive probability function (PPF). It is straightforward to define a PPF from an EPPF, but the converse is not necessarily true. In this paper we introduce the notion of putative PPFs and show novel conditions for a putative PPF to define an EPPF. We show that all possible PPFs in a certain class have to define (unnormalized) probabilities for cluster membership that are linear in cluster size. We give a new necessary and sufficient condition for arbitrary putative PPFs to define an EPPF. Finally, we show posterior inference for a large class of SSMs with a PPF that is not linear in cluster size and discuss a numerical method to derive its PPF. PMID:24368874
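    A concrete PPF that is linear in cluster size is that of the Dirichlet process, i.e. the Chinese restaurant process; the sketch below simulates it, with the concentration parameter alpha chosen arbitrarily for illustration.

```python
import random

def crp_assign(n_customers, alpha=1.0, seed=42):
    """Chinese restaurant process: a species sampling model whose PPF is
    linear in cluster size -- P(join cluster j) ~ n_j, P(new) ~ alpha."""
    rng = random.Random(seed)
    clusters = []  # current cluster sizes
    for _ in range(n_customers):
        weights = clusters + [alpha]          # the PPF weights
        r = rng.uniform(0, sum(weights))
        acc = 0.0
        for j, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        if j == len(clusters):
            clusters.append(1)                # open a new cluster
        else:
            clusters[j] += 1                  # join existing cluster j
    return clusters

print(crp_assign(100))  # e.g. a few large clusters and several singletons
```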

  13. Probability effects on stimulus evaluation and response processes

    NASA Technical Reports Server (NTRS)

    Gehring, W. J.; Gratton, G.; Coles, M. G.; Donchin, E.

    1992-01-01

    This study investigated the effects of probability information on response preparation and stimulus evaluation. Eight subjects responded with one hand to the target letter H and with the other to the target letter S. The target letter was surrounded by noise letters that were either the same as or different from the target letter. In 2 conditions, the targets were preceded by a warning stimulus unrelated to the target letter. In 2 other conditions, a warning letter predicted that the same letter or the opposite letter would appear as the imperative stimulus with .80 probability. Correct reaction times were faster and error rates were lower when imperative stimuli confirmed the predictions of the warning stimulus. Probability information affected (a) the preparation of motor responses during the foreperiod, (b) the development of expectancies for a particular target letter, and (c) a process sensitive to the identities of letter stimuli but not to their locations.

  14. Gravity and count probabilities in an expanding universe

    NASA Technical Reports Server (NTRS)

    Bouchet, Francois R.; Hernquist, Lars

    1992-01-01

    The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.

  15. [Pre-test and post-test probabilities. Who cares?].

    PubMed

    Steurer, Johann

    2009-01-01

    The accuracy of a diagnostic test, e.g. abdominal ultrasound in patients with suspected acute appendicitis, is described in terms of sensitivity and specificity. According to eminent textbooks, physicians should use the values of the sensitivity and specificity of a test in their diagnostic reasoning: estimate, after taking the history, the pretest probability of the suspected illness, order one or more tests, and then calculate the respective posttest probability. In practice, physicians almost never follow this line of thinking. The main reasons are that estimating concrete illness probabilities is difficult, that the values of the sensitivity and specificity of a test are most often not known to physicians, and that calculations during daily practice are intricate. Helpful for busy physicians are trustworthy expert recommendations on which test to apply in which clinical situation.
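    The textbook calculation the abstract refers to is a direct application of Bayes' theorem via likelihood ratios; the sketch below implements it, with the pretest probability and the ultrasound sensitivity/specificity values invented for illustration.

```python
def posttest_probability(pretest, sensitivity, specificity, positive=True):
    """Update a pretest probability after a test result using Bayes'
    theorem, expressed through likelihood ratios."""
    odds = pretest / (1 - pretest)
    lr = sensitivity / (1 - specificity) if positive \
         else (1 - sensitivity) / specificity
    post_odds = odds * lr
    return post_odds / (1 + post_odds)

# Suspected acute appendicitis, illustrative numbers only:
# pretest 0.30, ultrasound sensitivity 0.86, specificity 0.81.
print(posttest_probability(0.30, 0.86, 0.81, positive=True))   # ~0.66
print(posttest_probability(0.30, 0.86, 0.81, positive=False))  # ~0.07
```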

  16. Quantum Particles from Classical Probabilities in Phase Space

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2012-10-01

    Quantum particles in a potential are described by classical statistical probabilities. We formulate a basic time evolution law for the probability distribution of classical position and momentum such that all known quantum phenomena follow, including interference or tunneling. The appropriate quantum observables for position and momentum contain a statistical part which reflects the roughness of the probability distribution. "Zwitters" realize a continuous interpolation between quantum and classical particles. Such objects may provide an effective one-particle description of classical or quantum collective states such as droplets of a liquid, macromolecules or a Bose-Einstein condensate. They may also be used for quantitative fundamental tests of quantum mechanics. We show that the ground state for zwitters no longer has a sharp energy. This feature makes it possible to put quantitative experimental bounds on a small parameter for possible deviations from quantum mechanics.

  17. Conservative Analytical Collision Probability for Design of Orbital Formations

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell

    2004-01-01

    The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
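    For orientation, the quantity being approximated can be written as the mass of a bivariate normal (the relative-position uncertainty in the encounter plane) lying inside the combined hard-body radius. The Monte Carlo sketch below evaluates that integral by brute force; the means, covariances, and radius are illustrative, and this is not the analytical approximation the paper proposes.

```python
import numpy as np

def collision_probability(mean, cov, hbr, n=1_000_000, seed=0):
    """Monte Carlo estimate of the 2-D collision probability: chance that
    the relative miss vector in the encounter plane falls inside the
    combined hard-body radius (hbr)."""
    rng = np.random.default_rng(seed)
    pts = rng.multivariate_normal(mean, cov, size=n)
    return float(np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr))

mean = [50.0, 20.0]                       # predicted miss distance (m)
cov = [[200.0**2, 0.0], [0.0, 80.0**2]]   # position uncertainty (m^2)
print(collision_probability(mean, cov, hbr=10.0))
```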

  18. Negative probabilities and information gain in weak measurements

    NASA Astrophysics Data System (ADS)

    Zhu, Xuanmin; Wei, Qun; Liu, Quanhui; Wu, Shengjun

    2013-11-01

    We study the outcomes in a general measurement with postselection, and derive upper bounds for the pointer readings in weak measurement. The probabilities inferred from weak measurements change along with the coupling strength; the true probabilities can be obtained when the coupling is strong enough. By calculating the information gain of the measuring device about which path the particles pass through, we show that the “negative probabilities” only emerge in cases where the information gain is small due to very weak coupling between the measuring device and the particles. When the coupling strength increases, we can unambiguously determine whether a particle passes through a given path every time; hence the average shifts always represent true probabilities, and the strange “negative probabilities” disappear.

  19. Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics

    SciTech Connect

    Jones, N L; Walker, J R; Carle, S F

    2003-11-21

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods, including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining-upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.
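    The juxtapositional idea is easiest to see in one dimension: the sketch below simulates a vertical facies column from an upward transition probability matrix whose entries favour fining-upward sequences. The facies names and matrix values are invented for illustration and are unrelated to the Aberjona application.

```python
import random

FACIES = ["gravel", "sand", "silt", "clay"]
# Upward transition probabilities; each row sums to 1.  The structure
# below favours fining-upward sequences (illustrative values only).
T = [[0.50, 0.35, 0.10, 0.05],
     [0.10, 0.50, 0.30, 0.10],
     [0.05, 0.15, 0.55, 0.25],
     [0.15, 0.15, 0.20, 0.50]]

def simulate_column(n_cells, start=0, seed=7):
    """Simulate a vertical sequence of hydrofacies as a Markov chain."""
    rng = random.Random(seed)
    state, column = start, []
    for _ in range(n_cells):
        column.append(FACIES[state])
        state = rng.choices(range(len(FACIES)), weights=T[state])[0]
    return column

print(simulate_column(20))
```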

  20. Flux continuity and probability conservation in complexified Bohmian mechanics

    SciTech Connect

    Poirier, Bill

    2008-02-15

    Recent years have seen increased interest in complexified Bohmian mechanical trajectory calculations for quantum systems as both a pedagogical and computational tool. In the latter context, it is essential that trajectories satisfy probability conservation to ensure they are always guided to where they are most needed. We consider probability conservation for complexified Bohmian trajectories. The analysis relies on time-reversal symmetry considerations, leading to a generalized expression for the conjugation of wave functions of complexified variables. This in turn enables meaningful discussion of complexified flux continuity, which turns out not to be satisfied in general, though a related property is found to be true. The main conclusion, though, is that even under a weak interpretation, probability is not conserved along complex Bohmian trajectories.

  1. Critical Probabilities and Convergence Time of Percolation Probabilistic Cellular Automata

    NASA Astrophysics Data System (ADS)

    Taggi, Lorenzo

    2015-05-01

    This paper considers a class of probabilistic cellular automata undergoing a phase transition with an absorbing state. Denoting by U(x) the neighbourhood of site x, the transition probability is p if at least one site in U(x) is occupied, or 0 otherwise. For any neighbourhood U there exists a non-trivial critical probability p_c(U) that separates a phase with an absorbing state from a fluctuating phase. This paper studies how the neighbourhood affects the value of p_c(U) and provides lower bounds for p_c(U). Furthermore, by using dynamic renormalization techniques, we prove that the expected convergence time of the processes on a finite space with periodic boundaries grows exponentially (resp. logarithmically) with the system size if p > p_c (resp. p < p_c). This provides a partial answer to an open problem in Toom et al. (Stochastic Cellular Systems: Ergodicity, Memory, Morphogenesis, pp. 1-182. Manchester University Press, Manchester, 1990; Topics in Contemporary Probability and its Applications, pp. 117-157. CRC Press, Boca Raton, 1995).

  2. A review of representation functions for probability of detection

    NASA Astrophysics Data System (ADS)

    Rivers, W.; Bucknam, J. N.; Khoury, E. N.; Blase, R. E.

    1980-01-01

    Simple representation functions that interrelate the primary signal detection variables for the receiver structure of Marcum and Swerling are reviewed with regard to accuracy, complexity and inversion. Functions reviewed include those of Brooks, Neuvy, and Khoury-Bucknam. The first two of these are simple algorithms for computing the minimum ratio of signal energy to noise power density as a function of the number of samples integrated, the target distribution case, and the required probabilities of detection and false alarm. The Khoury-Bucknam functions are analytic and invertible, and they relate probability of detection and signal-to-noise ratio using parameters chosen uniquely for each case, number of samples, and probability of false alarm. The procedures and software that support determination of the coefficients of the Khoury-Bucknam function are documented.

  3. Probability theory versus simulation of petroleum potential in play analysis

    USGS Publications Warehouse

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.

  4. Multidimensional stationary probability distribution for interacting active particles

    PubMed Central

    Maggi, Claudio; Marconi, Umberto Marini Bettolo; Gnan, Nicoletta; Di Leonardo, Roberto

    2015-01-01

    We derive the stationary probability distribution for a non-equilibrium system composed of an arbitrary number of degrees of freedom that are subject to Gaussian colored noise and a conservative potential. This is based on a multidimensional version of the Unified Colored Noise Approximation. By comparing theory with numerical simulations we demonstrate that the theoretical probability density quantitatively describes the accumulation of active particles around repulsive obstacles. In particular, for two particles with repulsive interactions, the probability of close contact decreases when one of the two particles is pinned. Moreover, in the case of isotropic confining potentials, the radial density profile shows a non-trivial scaling with radius. Finally, we show that the theory well approximates the “pressure” generated by the active particles, allowing us to derive an equation of state for a system of non-interacting colored noise-driven particles. PMID:26021260

  5. Brick tunnel randomization and the momentum of the probability mass.

    PubMed

    Kuznetsova, Olga M

    2015-12-30

    The allocation space of an unequal-allocation permuted block randomization can be quite wide. The development of unequal-allocation procedures with a narrower allocation space, however, is complicated by the need to preserve the unconditional allocation ratio at every step (the allocation ratio preserving (ARP) property). When the allocation paths are depicted on the K-dimensional unitary grid, where allocation to the l-th treatment is represented by a step along the l-th axis, l = 1 to K, the ARP property can be expressed in terms of the center of the probability mass after i allocations. Specifically, for an ARP allocation procedure that randomizes subjects to K treatment groups in w1:⋯:wK ratio, w1+⋯+wK = 1, the coordinates of the center of the mass are (w1·i, …, wK·i). In this paper, the momentum with respect to the center of the probability mass (expected imbalance in treatment assignments) is used to compare ARP procedures in how closely they approximate the target allocation ratio. It is shown that the two-arm and three-arm brick tunnel randomizations (BTR) are the ARP allocation procedures with the tightest allocation space among all allocation procedures with the same allocation ratio; the two-arm BTR is the minimum-momentum two-arm ARP allocation procedure. Resident probabilities of two-arm and three-arm BTR are analytically derived from the coordinates of the center of the probability mass; the existence of the respective transition probabilities is proven. The probability of deterministic assignments with BTR is found generally acceptable. Copyright © 2015 John Wiley & Sons, Ltd.

  6. Quantifying Extinction Probabilities from Sighting Records: Inference and Uncertainties

    PubMed Central

    Caley, Peter; Barry, Simon C.

    2014-01-01

    Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes) into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes. PMID:24788945

  7. Modeling the effect of reward amount on probability discounting.

    PubMed

    Myerson, Joel; Green, Leonard; Morris, Joshua

    2011-03-01

    The present study with college students examined the effect of amount on the discounting of probabilistic monetary rewards. A hyperboloid function accurately described the discounting of hypothetical rewards ranging in amount from $20 to $10,000,000. The degree of discounting increased continuously with amount of probabilistic reward. This effect of amount was not due to changes in the rate parameter of the discounting function, but rather was due to increases in the exponent. These results stand in contrast to those observed with the discounting of delayed monetary rewards, in which the degree of discounting decreases with reward amount due to amount-dependent decreases in the rate parameter. Taken together, this pattern of results suggests that delay and probability discounting reflect different underlying mechanisms. That is, the fact that the exponent in the delay discounting function is independent of amount is consistent with a psychophysical scaling interpretation, whereas the finding that the exponent of the probability-discounting function is amount-dependent is inconsistent with such an interpretation. Instead, the present results are consistent with the idea that the probability-discounting function is itself the product of a value function and a weighting function. This idea was first suggested by Kahneman and Tversky (1979), although their prospect theory does not predict amount effects like those observed. The effect of amount on probability discounting was parsimoniously incorporated into our hyperboloid discounting function by assuming that the exponent was proportional to the amount raised to a power. The amount-dependent exponent of the probability-discounting function may be viewed as reflecting the effect of amount on the weighting of the probability with which the reward will be received.
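    A minimal sketch of the functional form described, assuming the usual odds-against parameterization and, per the abstract, an exponent proportional to a power of the amount; all parameter values (h, c, b) are invented for illustration, not fitted values from the study.

```python
def discounted_value(amount, p, h=1.0, c=0.6, b=0.05):
    """Hyperboloid probability-discounting function with an
    amount-dependent exponent, as the abstract suggests:
    V = A / (1 + h*theta)**s, theta = odds against, s = c * A**b."""
    theta = (1 - p) / p           # odds against receiving the reward
    s = c * amount ** b           # exponent grows with a power of amount
    return amount / (1 + h * theta) ** s

for amount in (20, 10_000, 10_000_000):
    v = discounted_value(amount, p=0.5)
    print(f"${amount:>10,}: subjective value {v:,.2f} "
          f"({v / amount:.2%} of nominal)")
# The discounted fraction of nominal value falls as amount rises,
# i.e. the degree of discounting increases with amount.
```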

  8. Does probability of occurrence relate to population dynamics?

    PubMed Central

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species’ realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species’ niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species’ competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence probability

  9. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
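    The cumulative detection figures quoted follow from the complement rule for independent searches; a minimal check, using the mean single-visit probability from the abstract:

```python
def cumulative_detection(p_single, visits):
    """Probability a persistent nest is found at least once in
    `visits` independent searches: 1 - (1 - p)**visits."""
    return 1 - (1 - p_single) ** visits

# Mean single-visit detection probability across species (from abstract).
p = 0.46
for v in range(1, 7):
    print(v, round(cumulative_detection(p, v), 3))
# After 4 visits: 1 - 0.54**4 ~= 0.915, exceeding the 0.85 threshold,
# consistent with the abstract.  (The paper's species- and survey-specific
# figures account for additional design details this simple model omits.)
```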

  10. Quantifying extinction probabilities from sighting records: inference and uncertainties.

    PubMed

    Caley, Peter; Barry, Simon C

    2014-01-01

    Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes) into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes. PMID:24788945

  11. Using Correlation to Compute Better Probability Estimates in Plan Graphs

    NASA Technical Reports Server (NTRS)

    Bryce, Daniel; Smith, David E.

    2006-01-01

    Plan graphs are commonly used in planning to help compute heuristic "distance" estimates between states and goals. A few authors have also attempted to use plan graphs in probabilistic planning to compute estimates of the probability that propositions can be achieved and actions can be performed. This is done by propagating probability information forward through the plan graph from the initial conditions through each possible action to the action effects, and hence to the propositions at the next layer of the plan graph. The problem with these calculations is that they make very strong independence assumptions - in particular, they usually assume that the preconditions for each action are independent of each other. This can lead to gross overestimates in probability when the plans for those preconditions interfere with each other. It can also lead to gross underestimates of probability when there is synergy between the plans for two or more preconditions. In this paper we introduce a notion of the binary correlation between two propositions and actions within a plan graph, show how to propagate this information within a plan graph, and show how this improves probability estimates for planning. This notion of correlation can be thought of as a continuous generalization of the notion of mutual exclusion (mutex) often used in plan graphs. At one extreme (correlation = 0) two propositions or actions are completely mutex. With correlation = 1, two propositions or actions are independent, and with correlation > 1, two propositions or actions are synergistic. Intermediate values can and do occur, indicating different degrees to which propositions and actions interfere or are synergistic. We compare this approach with another recent approach by Bryce that computes probability estimates using Monte Carlo simulation of possible worlds in plan graphs.
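    A minimal sketch of how such a correlation factor would enter the propagation, assuming the natural definition cr(a, b) = P(a and b) / (P(a) P(b)); the marginal probabilities are invented for illustration.

```python
def joint_probability(p_a, p_b, correlation):
    """Estimate P(a and b) from marginals and a binary correlation factor:
    correlation = 0 -> mutex, 1 -> independent, > 1 -> synergistic."""
    return min(1.0, correlation * p_a * p_b)

p_a, p_b = 0.6, 0.5
for label, c in [("mutex", 0.0), ("independent", 1.0), ("synergy", 1.5)]:
    print(f"{label:<12} P(a, b) = {joint_probability(p_a, p_b, c):.2f}")
# Naive plan-graph propagation always uses c = 1, which overestimates
# achievement probability for interfering subplans (c < 1) and
# underestimates it for synergistic ones (c > 1).
```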

  12. Direct propagation of probability density functions in hydrological equations

    NASA Astrophysics Data System (ADS)

    Kunstmann, Harald; Kastens, Marko

    2006-06-01

    Sustainable decisions in hydrological risk management require detailed information on the probability density function (pdf) of the model output. Only then can probabilities for the failure of a specific management option or the exceedance of critical thresholds (e.g. of pollutants) be derived. A new approach of uncertainty propagation in hydrological equations is developed that directly propagates the probability density functions of uncertain model input parameters into the corresponding probability density functions of model output. The basics of the methodology are presented and central applications to different disciplines in hydrology are shown. This work focuses on the following basic hydrological equations: (1) pumping test analysis (Theis equation, propagation of uncertainties in recharge and transmissivity), (2) one-dimensional groundwater contaminant transport equation (Gauss equation, propagation of uncertainties in decay constant and dispersivity), (3) evapotranspiration estimation (Penman-Monteith equation, propagation of uncertainty in roughness length). The direct propagation of probability densities is restricted to functions that are monotonically increasing or decreasing or that can be separated into corresponding monotonic branches so that inverse functions can be derived. In cases where no analytic solutions for inverse functions could be derived, semi-analytical approximations were used. It is shown that the results of direct probability density function propagation are in perfect agreement with results obtained from corresponding Monte Carlo derived frequency distributions. Direct pdf propagation, however, has the advantage that it yields exact solutions for the resulting hydrological pdfs rather than approximating discontinuous frequency distributions. It is additionally shown that the type of the resulting pdf depends on the specific values (order of magnitude, respectively) of the standard deviation of the input pdf. The dependency of skewness and kurtosis
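    The core mechanism is the change-of-variables formula for a monotone function: f_Y(y) = f_X(g^{-1}(y)) |d g^{-1}/dy|. The sketch below applies it to a toy case (a standard normal pushed through g(x) = exp(x)) and confirms the result against a Monte Carlo histogram, mirroring the validation strategy described; the toy function is an assumption, not one of the paper's hydrological equations.

```python
import numpy as np

def propagate_pdf(f_x, g_inv, g_inv_prime, y):
    """Direct pdf propagation through a monotone function y = g(x):
    f_Y(y) = f_X(g^{-1}(y)) * |d g^{-1}/dy|."""
    return f_x(g_inv(y)) * np.abs(g_inv_prime(y))

# Example: x ~ N(0,1) propagated through the monotone map g(x) = exp(x).
f_x = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
edges = np.linspace(0.1, 5.0, 51)
centers = 0.5 * (edges[:-1] + edges[1:])
direct = propagate_pdf(f_x, np.log, lambda t: 1.0 / t, centers)

# Monte Carlo check: a histogram of exp(samples) should match `direct`.
rng = np.random.default_rng(0)
samples = np.exp(rng.standard_normal(1_000_000))
hist, _ = np.histogram(samples, bins=edges, density=True)
print(float(np.max(np.abs(direct - hist))))  # small discretization error
```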

  13. Advances in the Measurement of Atomic Transition Probabilities

    NASA Astrophysics Data System (ADS)

    O'Brian, Thomas Raymond

    The technology for measuring absolute atomic transition probabilities is extended. Radiative lifetimes are measured by time-resolved laser-induced fluorescence on a slow atomic beam generated by a versatile hollow cathode discharge source. The radiative lifetimes are free from systematic error at the five percent level. Combined with branching fractions measured with emission or absorption sources, the lifetimes result in absolute transition probabilities usually accurate to 5-10 %. Three new developments in the lifetime and branching fraction technique are reported. Radiative lifetimes for 186 levels in neutral iron are measured, with the energy of the upper levels densely spanning the entire excitation range of neutral iron. Combined with branching fractions measured in emission with Fourier transform spectrophotometry, the level lifetimes directly yield absolute transition probabilities for 1174 transitions. An additional 640 transition probabilities are determined by interpolating level populations in an emission source. The dense energy spacing of the levels with directly measured lifetimes permits accurate population interpolation despite departures from local thermodynamic equilibrium. This technique has the potential to permit accurate absolute transition probability measurements for essentially every classified line in a spectrum. Radiative lifetime measurements are extended into the vacuum ultraviolet with a continuously tunable vacuum ultraviolet laser based on stimulated anti-Stokes Raman scattering. When used with the hollow cathode atomic beam source, accurate lifetimes are measured for 47 levels in neutral silicon and 8 levels in neutral boron, primarily in the vacuum ultraviolet spectral region. Transition probabilities are reported for many lines connected to these upper levels, using previously measured or calculated branching fractions. The hollow cathode beam source is developed for use with refractory non-metals. Intense atomic beams of boron

  14. On the decoder error probability of linear codes

    NASA Technical Reports Server (NTRS)

    Cheung, K.-M.

    1989-01-01

    By using coding and combinatorial techniques, an approximate formula for the weight distribution of decodable words of most linear block codes is evaluated. This formula is then used to give an approximate expression for the decoder error probability P_E(u) of linear block codes, given that an error pattern of weight u has occurred. It is shown that P_E(u) approaches the constant Q as u gets large, where Q is the probability that a completely random error pattern will cause decoder error.
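    For intuition about Q, consider a bounded-distance decoder that corrects up to t errors: a uniformly random word is decoded (to some codeword, possibly the wrong one) exactly when it falls in one of the 2^k decoding spheres. The sketch below evaluates that fraction; treating it as an approximation of the limiting P_E(u) is a simplification offered only for orientation, not the paper's derivation.

```python
from math import comb

def q_limit(n, k, t):
    """Probability that a uniformly random binary word of length n lies
    within distance t of one of the 2**k codewords (bounded-distance
    decoding, spheres assumed disjoint)."""
    sphere = sum(comb(n, i) for i in range(t + 1))
    return 2 ** (k - n) * sphere

print(q_limit(7, 4, 1))    # Hamming(7,4): perfect code, Q = 1.0
print(q_limit(23, 12, 3))  # Golay(23,12): also perfect, Q = 1.0
print(q_limit(15, 7, 2))   # BCH(15,7): Q = 121/256 ~= 0.473
```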

  15. An application of recurrent nets to phone probability estimation.

    PubMed

    Robinson, A J

    1994-01-01

    This paper presents an application of recurrent networks for phone probability estimation in large vocabulary speech recognition. The need for efficient exploitation of context information is discussed; a role for which the recurrent net appears suitable. An overview of early developments of recurrent nets for phone recognition is given along with the more recent improvements that include their integration with Markov models. Recognition results are presented for the DARPA TIMIT and Resource Management tasks, and it is concluded that recurrent nets are competitive with traditional means for performing phone probability estimation.

  16. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2010-01-01

    When facing a conjunction between space objects, decision makers must choose whether or not to maneuver for collision avoidance. We apply a well-known decision procedure, the sequential probability ratio test, to this problem. We propose two approaches to the problem solution, one based on a frequentist method, and the other on a Bayesian method. The frequentist method does not require any prior knowledge concerning the conjunction, while the Bayesian method assumes knowledge of prior probability densities. Our results show that both methods achieve desired missed detection rates, but the frequentist method's false alarm performance is inferior to the Bayesian method's.
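    A generic form of Wald's test, assuming the standard thresholds log((1-β)/α) and log(β/(1-α)) for desired false alarm rate α and missed detection rate β; the Gaussian toy hypotheses are illustrative and unrelated to the conjunction dynamics in the paper.

```python
import math
import random

def sprt(samples, llr, alpha=0.01, beta=0.001):
    """Wald's sequential probability ratio test.  `llr` maps one
    observation to its log-likelihood ratio log[p1(x)/p0(x)].
    Returns (decision, number of samples used)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 (e.g. maneuver)
    lower = math.log(beta / (1 - alpha))   # accept H0 (no maneuver)
    s = 0.0
    for n, x in enumerate(samples, 1):
        s += llr(x)
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "undecided", n

# Toy example: H0: x ~ N(0,1) vs H1: x ~ N(1,1); true state is H1.
rng = random.Random(3)
obs = (rng.gauss(1.0, 1.0) for _ in range(10_000))
llr = lambda x: x - 0.5   # log[N(x;1,1)/N(x;0,1)] simplifies to x - 1/2
print(sprt(obs, llr))
```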

  17. Bayesian Estimator of Protein-Protein Association Probabilities

    2008-05-28

    The Bayesian Estimator of Protein-Protein Association Probabilities (BEPro3) is a software tool for estimating probabilities of protein-protein association between bait and prey protein pairs using data from multiple-bait, multiple-replicate, protein LC-MS/MS affinity isolation experiments. BEPro3 is public domain software, has been tested on Windows XP and Mac OS X version 10.4 or newer, and is freely available. A user guide, an example dataset with analysis, and additional documentation are included with the BEPro3 download.

  18. Gendist: An R Package for Generated Probability Distribution Models

    PubMed Central

    Abu Bakar, Shaiful Anuar; Nadarajah, Saralees; ABSL Kamarul Adzhar, Zahrul Azmir; Mohamed, Ibrahim

    2016-01-01

    In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements. PMID:27272043

  19. Probability of detection of defects in coatings with electronic shearography

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Horton, Charles M.; Lansing, Matthew D.; Gnacek, William J.; Newton, Patrick L.

    1994-01-01

    The goal of this research was to utilize statistical methods to evaluate the probability of detection (POD) of defects in coatings using electronic shearography. The coating system utilized in the POD studies was to be the paint system currently utilized on the external casings of the NASA Space Transportation System (STS) Revised Solid Rocket Motor (RSRM) boosters. The population of samples was to be large enough to determine the minimum defect size for 90 percent probability of detection at 95 percent confidence (90/95 POD) on these coatings. Also, the best methods to excite coatings on aerospace components to induce deformations for measurement by electronic shearography were to be determined.
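    The sample-size requirement implicit in a 90/95 POD demonstration is a standard binomial calculation: if all of n identical flaws are detected, demonstrating 90 percent POD at 95 percent confidence requires 0.90**n <= 0.05, which yields the classic "29 of 29" rule. A minimal check:

```python
def zero_miss_sample_size(pod=0.90, confidence=0.95):
    """Smallest n such that n detections in n trials demonstrates the
    given POD at the given confidence: min n with pod**n <= 1 - confidence."""
    n = 1
    while pod ** n > 1 - confidence:
        n += 1
    return n

print(zero_miss_sample_size())  # 29 -> the classic "29 of 29" rule
print(0.90 ** 29)               # 0.0471... < 0.05
```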

  20. Transition Probabilities for Spectral Lines in Co I

    NASA Astrophysics Data System (ADS)

    Nitz, D. E.; Wilson, K. L.; Lentz, L. R.

    1996-05-01

    We are in the process of determining transition probabilities for visible and uv lines in Co I from Fourier transform spectra recorded at Kitt Peak and made available to us by Prof. W. Whaling. Normalization of relative transition probabilities obtained from these spectra is achieved using recently-measured Co I lifetimes.(D. E. Nitz, S. D. Bergeson, and J. E. Lawler, J. Opt. Soc. Am. B 12, 377 (1995).) To date we have obtained preliminary results for 240 lines having branch fractions > 1

  1. How to Calculate the Accident Probability of Dangerous Substance Transport

    NASA Astrophysics Data System (ADS)

    Fuchs, Pavel; Saska, Tomas; Sousek, Radovan; Valis, David

    2012-09-01

    Currently, the risk assessment of dangerous substance handling and transportation is a frequent topic in scientific groups. Many themes from this area are discussed at conferences and scientific events. One of these topics is the calculation of the accident probability of dangerous substance transport. The following paper describes the procedure for calculating the accident probability of dangerous substance road transportation. A further aim of this paper is to show what uncertainties may be contained in such a calculation procedure and, finally, which parameters should be collected for a comprehensive accident risk assessment of dangerous substance road transport.

  2. A probability generating function method for stochastic reaction networks

    NASA Astrophysics Data System (ADS)

    Kim, Pilwon; Lee, Chang Hyeong

    2012-06-01

    In this paper we present a probability generating function (PGF) approach for analyzing stochastic reaction networks. The master equation of the network can be converted to a partial differential equation for the PGF. Using a power series expansion of the PGF and Padé approximation, we develop numerical schemes for finding probability distributions as well as first and second moments. We show the numerical accuracy of the method by simulating chemical reaction examples such as a binding-unbinding reaction, an enzyme-substrate model, the Goldbeter-Koshland ultrasensitive switch model, and a G2/M transition model.
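    To illustrate the PGF idea itself (not the paper's PDE/Padé scheme), the sketch below uses a count distribution whose PGF is known in closed form: probabilities are the Taylor coefficients of G(s) at s = 0, and moments come from derivatives at s = 1.

```python
import math

lam = 2.5
G = lambda s: math.exp(lam * (s - 1))   # PGF of a Poisson(lam) count

# Probabilities are Taylor coefficients of G at s = 0:
#   P(N = n) = G^(n)(0) / n!   (recovered here by central finite differences)
h = 1e-3
for n in range(4):
    deriv = sum((-1) ** (n - j) * math.comb(n, j) * G((j - n / 2) * h)
                for j in range(n + 1)) / h ** n
    exact = math.exp(-lam) * lam ** n / math.factorial(n)
    print(n, deriv / math.factorial(n), exact)

# The mean is G'(1):
mean = (G(1 + h) - G(1 - h)) / (2 * h)
print("mean ~", mean)  # ~ lam
```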

  3. A quantum probability explanation for violations of 'rational' decision theory.

    PubMed

    Pothos, Emmanuel M; Busemeyer, Jerome R

    2009-06-22

    Two experimental tasks in psychology, the two-stage gambling game and the Prisoner's Dilemma game, show that people violate the sure thing principle of decision theory. These paradoxical findings have resisted explanation by classical decision theory for over a decade. A quantum probability model, based on a Hilbert space representation and Schrödinger's equation, provides a simple and elegant explanation for this behaviour. The quantum model is compared with an equivalent Markov model and it is shown that the latter is unable to account for violations of the sure thing principle. Accordingly, it is argued that quantum probability provides a better framework for modelling human decision-making.

  4. Ask Marilyn in the mathematics classroom: probability questions

    NASA Astrophysics Data System (ADS)

    Vasko, Francis J.

    2012-06-01

    Since 1986, Marilyn Vos Savant, who is listed in the Guinness Book of World Records Hall of Fame for the highest IQ, has had a weekly column that is published in Parade Magazine. In this column, she answers readers' questions on a wide variety of subjects including mathematics and particularly probability. Many of the mathematically oriented questions are directly relevant to high school and undergraduate college level mathematics courses. For nearly 20 years, I have incorporated many of these questions into a variety of mathematics courses that I teach. In this note, I will discuss some of the questions that I use dealing with probability.

  5. The neurologic examination in patients with probable Alzheimer's disease.

    PubMed

    Huff, F J; Boller, F; Lucchelli, F; Querriera, R; Beyer, J; Belle, S

    1987-09-01

    Abnormal findings on a standardized neurologic examination were compared between patients with a clinical diagnosis of probable Alzheimer's disease (AD) and healthy control subjects. Aside from mental status findings, the most useful examination findings for differentiating AD from control subjects were the presence of release signs, olfactory deficit, impaired stereognosis or graphesthesia, gait disorder, tremor, and abnormalities on cerebellar testing. These abnormalities probably reflect the different areas of the central nervous system that are affected pathologically in AD. In the clinical diagnosis of AD, particular attention should be given to these aspects of the neurologic examination.

  6. LFSPMC: Linear feature selection program using the probability of misclassification

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Marion, B. P.

    1975-01-01

    The computational procedure and associated computer program for a linear feature selection technique are presented. The technique assumes that: a finite number, m, of classes exists; each class is described by an n-dimensional multivariate normal density function of its measurement vectors; the mean vector and covariance matrix for each density function are known (or can be estimated); and the a priori probability for each class is known. The technique produces a single linear combination of the original measurements which minimizes the one-dimensional probability of misclassification defined by the transformed densities.
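    A small numerical sketch of the same objective in two dimensions: for each candidate direction w, the classes project to one-dimensional normals, whose Bayes error is integrated on a grid and minimized over directions. The class parameters are invented for illustration, and this grid search stands in for the program's actual optimization procedure.

```python
import numpy as np

def bayes_error_1d(m1, s1, m2, s2, priors=(0.5, 0.5)):
    """One-dimensional probability of misclassification for two normal
    class densities: integral of min(prior_i * f_i) over the line."""
    lo = min(m1 - 6 * s1, m2 - 6 * s2)
    hi = max(m1 + 6 * s1, m2 + 6 * s2)
    x = np.linspace(lo, hi, 4001)
    pdf = lambda m, s: np.exp(-(x - m) ** 2 / (2 * s * s)) / (s * np.sqrt(2 * np.pi))
    f = np.minimum(priors[0] * pdf(m1, s1), priors[1] * pdf(m2, s2))
    return float(np.sum((f[1:] + f[:-1]) / 2 * np.diff(x)))  # trapezoid rule

# Two 2-D normal classes (illustrative parameters).
mu1, mu2 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
S1 = np.array([[1.0, 0.3], [0.3, 1.0]])
S2 = np.array([[1.5, -0.2], [-0.2, 0.8]])

best_err, best_theta = min(
    (bayes_error_1d(w @ mu1, np.sqrt(w @ S1 @ w), w @ mu2, np.sqrt(w @ S2 @ w)),
     theta)
    for theta in np.linspace(0.0, np.pi, 361)
    for w in [np.array([np.cos(theta), np.sin(theta)])]
)
print(f"min P(error) = {best_err:.4f} at angle {np.degrees(best_theta):.1f} deg")
```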

  7. Confidence as Bayesian Probability: From Neural Origins to Behavior.

    PubMed

    Meyniel, Florent; Sigman, Mariano; Mainen, Zachary F

    2015-10-01

    Research on confidence spreads across several sub-fields of psychology and neuroscience. Here, we explore how a definition of confidence as Bayesian probability can unify these viewpoints. This computational view entails that there are distinct forms in which confidence is represented and used in the brain, including distributional confidence, pertaining to neural representations of probability distributions, and summary confidence, pertaining to scalar summaries of those distributions. Summary confidence is, normatively, derived or "read out" from distributional confidence. Neural implementations of readout will trade off optimality versus flexibility of routing across brain systems, allowing confidence to serve diverse cognitive functions. PMID:26447574

  8. Strategies in Yahtzee: An Exercise in Elementary Probability.

    ERIC Educational Resources Information Center

    Milton, J. Susan; Corbet, James J.

    1982-01-01

    The game of Yahtzee is analyzed to some extent through methods of elementary probability. Several different problems are posed, and these serve to illustrate both mathematical concerns and game play strategy. Solutions are provided to all questions at the conclusion and suggestions for further questions are presented. (MP)
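    One of the simplest Yahtzee questions makes a nice worked example: the probability of rolling five of a kind in a single roll is 6/6^5 = 1/1296, which a quick simulation confirms.

```python
import random

# Exact probability of a Yahtzee (five of a kind) in one roll:
# 6 favourable outcomes out of 6**5 equally likely rolls.
p_exact = 6 / 6 ** 5
print(p_exact)  # 1/1296 ~= 0.000772

# Simulation check
rng = random.Random(0)
trials = 1_000_000
hits = sum(len(set(rng.choices(range(1, 7), k=5))) == 1 for _ in range(trials))
print(hits / trials)
```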

  9. The Sequential Probability Ratio Test and Binary Item Response Models

    ERIC Educational Resources Information Center

    Nydick, Steven W.

    2014-01-01

    The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…

  10. Exaggerated Risk: Prospect Theory and Probability Weighting in Risky Choice

    ERIC Educational Resources Information Center

    Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick

    2009-01-01

    In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and…

  11. Probability techniques for reliability analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Wetherhold, Robert C.; Ucci, Anthony M.

    1994-01-01

    Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures, since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first-order, second-moment FPI methods; second-order, second-moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and the conservatism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
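    A minimal sketch of the simple Monte Carlo variant mentioned, assuming a toy limit-state function g = strength - load with normal random variables; all distribution parameters are invented for illustration.

```python
import numpy as np

def reliability_mc(n=1_000_000, seed=0):
    """Simple Monte Carlo reliability estimate: probability that random
    applied stress stays below random material strength (g = S - L > 0)."""
    rng = np.random.default_rng(seed)
    strength = rng.normal(600.0, 40.0, n)   # MPa, illustrative lamina strength
    load = rng.normal(450.0, 60.0, n)       # MPa, illustrative applied stress
    g = strength - load                     # limit-state function
    return float(np.mean(g > 0))

print(reliability_mc())
# Exact value for comparison: P(N(150, sqrt(40**2 + 60**2)) > 0)
# = Phi(150 / 72.11) = Phi(2.080) ~= 0.981
```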

  12. Flipping Out: Calculating Probability with a Coin Game

    ERIC Educational Resources Information Center

    Degner, Kate

    2015-01-01

    In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…

  13. Spatial Probability Dynamically Modulates Visual Target Detection in Chickens

    PubMed Central

    Sridharan, Devarajan; Ramamurthy, Deepa L.; Knudsen, Eric I.

    2013-01-01

    The natural world contains a rich and ever-changing landscape of sensory information. To survive, an organism must be able to flexibly and rapidly locate the most relevant sources of information at any time. Humans and non-human primates exploit regularities in the spatial distribution of relevant stimuli (targets) to improve detection at locations of high target probability. Is the ability to flexibly modify behavior based on visual experience unique to primates? Chickens (Gallus domesticus) were trained on a multiple alternative Go/NoGo task to detect a small, briefly-flashed dot (target) in each of the quadrants of the visual field. When targets were presented with equal probability (25%) in each quadrant, chickens exhibited a distinct advantage for detecting targets at lower, relative to upper, hemifield locations. Increasing the probability of presentation in the upper hemifield locations (to 80%) dramatically improved detection performance at these locations to be on par with lower hemifield performance. Finally, detection performance in the upper hemifield changed on a rapid timescale, improving with successive target detections, and declining with successive detections at the diagonally opposite location in the lower hemifield. These data indicate the action of a process that in chickens, as in primates, flexibly and dynamically modulates detection performance based on the spatial probabilities of sensory stimuli as well as on recent performance history. PMID:23734188

  14. Multiple-Category Classification Using a Sequential Probability Ratio Test.

    ERIC Educational Resources Information Center

    Spray, Judith A.

    Sequential probability ratio testing (SPRT), which is usually applied in situations requiring a decision between two simple hypotheses or a single decision point, is extended to include situations involving k decision points and [(k + 1)-choose-2] sets of simultaneous, simple hypotheses, where k > 1. The multiple-decision point or multiple-category…
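
    For readers unfamiliar with the base procedure being extended, a minimal sketch of the classical two-hypothesis Wald SPRT follows; the Bernoulli setting, error rates, and parameter values are illustrative assumptions, not taken from the article.

      import math
      import random

      def sprt_bernoulli(p0, p1, alpha=0.05, beta=0.05, seed=7, true_p=0.7):
          """Decide H0: p = p0 vs H1: p = p1 from a stream of Bernoulli trials."""
          upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
          lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
          rng = random.Random(seed)
          llr, n = 0.0, 0
          while lower < llr < upper:
              x = 1 if rng.random() < true_p else 0  # simulated observation
              n += 1
              # Accumulate the log-likelihood ratio log P(x|H1)/P(x|H0).
              llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
          return ("H1" if llr >= upper else "H0"), n

      # With true_p = 0.7, H1 is usually accepted after a few dozen trials.
      print(sprt_bernoulli(p0=0.5, p1=0.7))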

  15. Computer Simulations and Problem-Solving in Probability.

    ERIC Educational Resources Information Center

    Camp, John S.

    1978-01-01

    The purpose of this paper is to present problems (and solutions) from the areas of marketing, population planning, system reliability, and mathematics to show how a computer simulation can be used as a problem-solving strategy in probability. Examples using BASIC and two methods of generating random numbers are given. (Author/MP)
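
    The paper's examples are written in BASIC and are not reproduced in this record; the sketch below is a comparable modern exercise in Python, using a hypothetical series-parallel system rather than one of the paper's actual problems, in the spirit of its system-reliability examples.

      import random

      def system_works(rng, p_a=0.9, p_b=0.9, p_c=0.95):
          # Hypothetical layout: components a and b in parallel,
          # in series with component c.
          a = rng.random() < p_a
          b = rng.random() < p_b
          c = rng.random() < p_c
          return (a or b) and c

      def estimate(trials=100_000, seed=3):
          rng = random.Random(seed)
          return sum(system_works(rng) for _ in range(trials)) / trials

      # Exact answer for comparison: (1 - 0.1*0.1) * 0.95 = 0.9405
      print(estimate())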

  16. Addressing Challenging Behavior: Considering the Logic of Probability

    ERIC Educational Resources Information Center

    Scott, Terrance M.; Hirn, Regina G.

    2014-01-01

    When dealing with children who exhibit challenging behaviors, there are no known interventions that work for all students or at all times. Thus, intervention for these students is often implemented in a trial-and-error manner. This article provides a logic for considering probability as a factor in selecting strategies. Understanding that some…

  17. Delay, Probability, and Social Discounting in a Public Goods Game

    ERIC Educational Resources Information Center

    Jones, Bryan A.; Rachlin, Howard

    2009-01-01

    A human social discount function measures the value to a person of a reward to another person at a given social distance. Just as delay discounting is a hyperbolic function of delay, and probability discounting is a hyperbolic function of odds-against, social discounting is a hyperbolic function of social distance. Experiment 1 obtained individual…
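
    The hyperbolic form referenced in the abstract is conventionally written v = V / (1 + kN), where N is the delay, odds-against, or social distance and k is a fitted discount rate. A minimal sketch (the parameter values below are illustrative, not fitted data from the experiments):

      def discounted_value(V, k, N):
          """Value of reward V at delay, odds-against, or social distance N."""
          return V / (1 + k * N)

      # The same functional form covers all three kinds of "distance":
      print(discounted_value(V=100, k=0.05, N=10))   # delay of 10 units
      print(discounted_value(V=100, k=0.05, N=100))  # social distance of 100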

  18. A Brief Look at the History of Probability and Statistics.

    ERIC Educational Resources Information Center

    Lightner, James E.

    1991-01-01

    The historical development of probability theory is traced from its early origins in games of chance through its mathematical foundations in the work of Pascal and Fermat. The roots of statistics are also presented beginning with early actuarial developments through the work of Laplace, Gauss, and others. (MDH)

  19. Pedigrees, Prizes, and Prisoners: The Misuse of Conditional Probability

    ERIC Educational Resources Information Center

    Carlton, Matthew A.

    2005-01-01

    We present and discuss three examples of misapplication of the notion of conditional probability. In each example, we present the problem along with a published and/or well-known incorrect--but seemingly plausible--solution. We then give a careful treatment of the correct solution, in large part to show how careful application of basic probability…
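
    The record does not spell out the three examples, but the "prizes" of the title suggests the classic game-show (Monty Hall) problem, a staple of this genre. The simulation below is an illustration of that problem, not material drawn from the article; it shows why the intuitive 1/2 answer is a misapplication of conditional probability.

      import random

      def play(switch, rng):
          doors = [0, 0, 1]          # one prize behind three doors
          rng.shuffle(doors)
          pick = rng.randrange(3)
          # Host opens a non-picked, non-prize door.
          opened = next(d for d in range(3) if d != pick and doors[d] == 0)
          if switch:
              pick = next(d for d in range(3) if d not in (pick, opened))
          return doors[pick]

      rng = random.Random(0)
      n = 100_000
      print("stay:  ", sum(play(False, rng) for _ in range(n)) / n)  # ~1/3
      print("switch:", sum(play(True, rng) for _ in range(n)) / n)   # ~2/3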

  20. A proposed physical analog for a quantum probability amplitude

    NASA Astrophysics Data System (ADS)

    Boyd, Jeffrey

    What is the physical analog of a probability amplitude? All quantum mathematics, including quantum information, is built on amplitudes. Every other science uses probabilities; QM alone uses their square root. Why? This question has been asked for a century, but no one previously has proposed an answer. We will present cylindrical helices moving toward a particle source, which particles follow backwards. Consider Feynman's book QED. He speaks of amplitudes moving through space like the hand of a spinning clock. His hand is a complex vector. It traces a cylindrical helix in Cartesian space. The Theory of Elementary Waves reverses this direction, so that Feynman's clock faces move toward the particle source. Particles follow amplitudes (quantum waves) backwards. This contradicts wave-particle duality. We will present empirical evidence that wave-particle duality is wrong about the direction of particles versus waves. This involves a paradigm shift, and paradigm shifts are always controversial. We believe that our model is the only proposal ever made for the physical foundations of probability amplitudes. We will show that our "probability amplitudes" in physical nature form a Hilbert vector space with adjoints and an inner product, supporting both linear algebra and Dirac notation.
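
    The "spinning clock hand" image has a concrete geometric reading: a constant-modulus complex amplitude exp(ikx), plotted along x with its real and imaginary parts as the two cross-axes, traces a cylindrical helix. A minimal sketch of that geometry (the wavenumber and sample points below are arbitrary, not values from the abstract):

      import math

      k = 2 * math.pi  # one full turn of the "clock hand" per unit length
      for i in range(9):
          x = i / 8
          amp_re = math.cos(k * x)   # Re[exp(ikx)]
          amp_im = math.sin(k * x)   # Im[exp(ikx)]
          # Each printed point (x, Re, Im) lies on a unit-radius helix.
          print(f"x={x:.3f}  ({amp_re:+.3f}, {amp_im:+.3f})")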