Science.gov

Sample records for probable nmp para

  1. Sample contamination with NMP-oxidation products and byproduct-free NMP removal from sample solutions

    SciTech Connect

    Cesar Berrueco; Patricia Alvarez; Silvia Venditti; Trevor J. Morgan; Alan A. Herod; Marcos Millan; Rafael Kandiyoti

    2009-05-15

1-Methyl-2-pyrrolidinone (NMP) is widely used as a solvent for coal-derived products and as an eluent in size exclusion chromatography. It was observed that sample contamination may take place, through reactions of NMP, during extraction under refluxing conditions and during the evaporation of NMP to concentrate or isolate samples. In this work, product distributions from experiments carried out in contact with air and under a blanket of oxygen-free nitrogen have been compared. Gas chromatography/mass spectrometry (GC-MS) clearly shows that oxidation products form when NMP is heated in the presence of air. Upon further heating, these oxidation products appear to polymerize, forming material with large molecular masses. Potentially severe levels of interference have been encountered in the size exclusion chromatography (SEC) of actual samples. Laser desorption mass spectrometry and SEC agree in showing an upper mass limit of nearly 7000 u for a residue left after distilling 'pure' NMP in contact with air. Furthermore, experiments have shown that these effects can be completely avoided by strict exclusion of air during the refluxing and evaporation of NMP to dryness. 45 refs., 13 figs.

  2. Genome-Wide Mapping and Interrogation of the Nmp4 Antianabolic Bone Axis

    PubMed Central

    Childress, Paul; Stayrook, Keith R.; Alvarez, Marta B.; Wang, Zhiping; Shao, Yu; Hernandez-Buquer, Selene; Mack, Justin K.; Grese, Zachary R.; He, Yongzheng; Horan, Daniel; Pavalko, Fredrick M.; Warden, Stuart J.; Robling, Alexander G.; Yang, Feng-Chun; Allen, Matthew R.; Krishnan, Venkatesh; Liu, Yunlong

    2015-01-01

    PTH is an osteoanabolic for treating osteoporosis but its potency wanes. Disabling the transcription factor nuclear matrix protein 4 (Nmp4) in healthy, ovary-intact mice enhances bone response to PTH and bone morphogenetic protein 2 and protects from unloading-induced osteopenia. These Nmp4−/− mice exhibit expanded bone marrow populations of osteoprogenitors and supporting CD8+ T cells. To determine whether the Nmp4−/− phenotype persists in an osteoporosis model we compared PTH response in ovariectomized (ovx) wild-type (WT) and Nmp4−/− mice. To identify potential Nmp4 target genes, we performed bioinformatic/pathway profiling on Nmp4 chromatin immunoprecipitation sequencing (ChIP-seq) data. Mice (12 w) were ovx or sham operated 4 weeks before the initiation of PTH therapy. Skeletal phenotype analysis included microcomputed tomography, histomorphometry, serum profiles, fluorescence-activated cell sorting and the growth/mineralization of cultured WT and Nmp4−/− bone marrow mesenchymal stem progenitor cells (MSPCs). ChIP-seq data were derived using MC3T3-E1 preosteoblasts, murine embryonic stem cells, and 2 blood cell lines. Ovx Nmp4−/− mice exhibited an improved response to PTH coupled with elevated numbers of osteoprogenitors and CD8+ T cells, but were not protected from ovx-induced bone loss. Cultured Nmp4−/− MSPCs displayed enhanced proliferation and accelerated mineralization. ChIP-seq/gene ontology analyses identified target genes likely under Nmp4 control as enriched for negative regulators of biosynthetic processes. Interrogation of mRNA transcripts in nondifferentiating and osteogenic differentiating WT and Nmp4−/− MSPCs was performed on 90 Nmp4 target genes and differentiation markers. These data suggest that Nmp4 suppresses bone anabolism, in part, by regulating IGF-binding protein expression. Changes in Nmp4 status may lead to improvements in osteoprogenitor response to therapeutic cues. PMID:26244796

  3. Genome-Wide Mapping and Interrogation of the Nmp4 Antianabolic Bone Axis.

    PubMed

    Childress, Paul; Stayrook, Keith R; Alvarez, Marta B; Wang, Zhiping; Shao, Yu; Hernandez-Buquer, Selene; Mack, Justin K; Grese, Zachary R; He, Yongzheng; Horan, Daniel; Pavalko, Fredrick M; Warden, Stuart J; Robling, Alexander G; Yang, Feng-Chun; Allen, Matthew R; Krishnan, Venkatesh; Liu, Yunlong; Bidwell, Joseph P

    2015-09-01

    PTH is an osteoanabolic for treating osteoporosis but its potency wanes. Disabling the transcription factor nuclear matrix protein 4 (Nmp4) in healthy, ovary-intact mice enhances bone response to PTH and bone morphogenetic protein 2 and protects from unloading-induced osteopenia. These Nmp4(-/-) mice exhibit expanded bone marrow populations of osteoprogenitors and supporting CD8(+) T cells. To determine whether the Nmp4(-/-) phenotype persists in an osteoporosis model we compared PTH response in ovariectomized (ovx) wild-type (WT) and Nmp4(-/-) mice. To identify potential Nmp4 target genes, we performed bioinformatic/pathway profiling on Nmp4 chromatin immunoprecipitation sequencing (ChIP-seq) data. Mice (12 w) were ovx or sham operated 4 weeks before the initiation of PTH therapy. Skeletal phenotype analysis included microcomputed tomography, histomorphometry, serum profiles, fluorescence-activated cell sorting and the growth/mineralization of cultured WT and Nmp4(-/-) bone marrow mesenchymal stem progenitor cells (MSPCs). ChIP-seq data were derived using MC3T3-E1 preosteoblasts, murine embryonic stem cells, and 2 blood cell lines. Ovx Nmp4(-/-) mice exhibited an improved response to PTH coupled with elevated numbers of osteoprogenitors and CD8(+) T cells, but were not protected from ovx-induced bone loss. Cultured Nmp4(-/-) MSPCs displayed enhanced proliferation and accelerated mineralization. ChIP-seq/gene ontology analyses identified target genes likely under Nmp4 control as enriched for negative regulators of biosynthetic processes. Interrogation of mRNA transcripts in nondifferentiating and osteogenic differentiating WT and Nmp4(-/-) MSPCs was performed on 90 Nmp4 target genes and differentiation markers. These data suggest that Nmp4 suppresses bone anabolism, in part, by regulating IGF-binding protein expression. Changes in Nmp4 status may lead to improvements in osteoprogenitor response to therapeutic cues. PMID:26244796
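The ChIP-seq/gene-ontology step in the two records above (asking whether the Nmp4 target set is enriched for a functional category such as negative regulators of biosynthetic processes) reduces, at its core, to a hypergeometric overrepresentation test. A minimal Python sketch of that test; the gene counts below are invented placeholders for illustration, not values from the study:

```python
from math import comb

def hypergeom_enrichment_p(k, n, K, N):
    """Upper-tail P(X >= k) for the overlap between an n-gene target set
    and a K-gene annotation category, drawn from an N-gene genome."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Invented illustrative numbers (90 target genes, as in the interrogation
# panel above; category size and overlap are hypothetical):
p = hypergeom_enrichment_p(k=15, n=90, K=400, N=20000)
print(f"enrichment p-value: {p:.2e}")
```

An overlap far above the expected value (here 90 × 400 / 20000 = 1.8 genes) yields a very small p-value, which is the sense in which a category is called "enriched".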

  4. Removing H2S from gas with recycled NMP extraction solvent

    SciTech Connect

    Blume, J.H.; Bushnell, J.D.; Leighton, M.D.

    1980-06-17

An improved process is claimed for removing H2S from a hydrofiner tail gas. The tail gas is passed into a scrubbing zone, where it is contacted with liquid NMP to remove most of the H2S and form an H2S-rich NMP solution. The H2S-rich NMP solution is heated and passed into a stripping zone, where most of the H2S is removed from the NMP to form an H2S-lean NMP solution. In parallel, a hydrocarbon oil is extracted with an NMP solution; hot liquid NMP recovered from the extracted oil is combined with the H2S-lean NMP solution, and the combined solution comprises at least a portion of the NMP solution used to extract the oil. The improvement comprises heating the H2S-rich NMP solution to the required stripping temperature by indirect heat exchange with at least a portion of the combined NMP solution. This process is especially useful for scrubbing H2S from hydrofiner tail gas for use as once-through stripping gas in the solvent recovery section of a lube oil extraction process employing NMP as the extraction solvent; in a preferred embodiment, the spent stripping gas from the lube oil solvent recovery is used as the stripping gas in the stripping zone of this invention.

  5. The epigenetically active small chemical N-methyl pyrrolidone (NMP) prevents estrogen depletion induced osteoporosis.

    PubMed

    Gjoksi, Bebeka; Ghayor, Chafik; Siegenthaler, Barbara; Ruangsawasdi, Nisarat; Zenobi-Wong, Marcy; Weber, Franz E

    2015-09-01

Currently, there are several treatments for osteoporosis; however, they all display some limitation and/or side effects, making the need for new treatments imperative. We have previously demonstrated that NMP is a bioactive drug which enhances bone regeneration in vivo and acts as an enhancer of bone morphogenetic protein (BMP) in vitro. NMP also inhibits osteoclast differentiation and attenuates bone resorption. In the present study, we tested NMP as a bromodomain inhibitor and for osteoporosis prevention in ovariectomized (OVX) rats treated systemically with NMP. Female Sprague-Dawley rats were ovariectomized, and weekly NMP treatment was administered from 1 week after surgery for 15 weeks. Bone parameters and related serum biomarkers were analyzed. Fifteen weeks of NMP treatment decreased ovariectomy-induced weight gain by an average of 43% and improved bone mineral density (BMD) and bone volume over total volume (BV/TV) in rat femur by an average of 25% and 41%, respectively. Moreover, the mineral apposition rate and biomarkers of bone turnover in the treatment group were at levels similar to those of the Sham group. Given the function of NMP as a low-affinity bromodomain inhibitor and its mechanism of action, which involves the osteoblast/osteoclast balance and an inhibitory effect on inflammatory cytokines, NMP is a promising therapeutic compound for the prevention of osteoporosis. PMID:25959414

  6. [Monograph for N-Methyl-pyrrolidone (NMP) and human biomonitoring values for the metabolites 5-Hydroxy-NMP and 2-Hydroxy-N-methylsuccinimide].

    PubMed

    2015-10-01

N-Methyl-pyrrolidone (NMP) is used as a solvent in many technical applications. The general population may be exposed to NMP through its use as an ingredient in paint and graffiti removers, and indoors also from its use in paints and carpeting. Because of its developmentally toxic effects, the use of NMP in consumer products in the EU is regulated. The developmental effects, accompanied by weak maternally toxic effects in animal experiments, are considered the critical effects by the German HBM Commission. Based on these effects, HBM-I values of 10 mg/l urine for children and 15 mg/l for adults, respectively, were derived for the metabolites 5-Hydroxy-NMP and 2-Hydroxy-N-methylsuccinimide. HBM-II values were set at 30 mg/l urine for children and 50 mg/l for adults, respectively. Because the structural analogue 1-ethyl-2-pyrrolidone (NEP) has similar effects, possible mixed exposure to both compounds has to be taken into account when evaluating the total burden. PMID:26324095

  7. Human volunteer study on the inhalational and dermal absorption of N-methyl-2-pyrrolidone (NMP) from the vapour phase.

    PubMed

    Bader, Michael; Wrbitzky, Renate; Blaszkewicz, Meinolf; Schäper, Michael; van Thriel, Christoph

    2008-01-01

    N-Methyl-2-pyrrolidone (NMP) is a versatile organic solvent frequently used for surface cleaning such as paint stripping or graffiti removal. Liquid NMP is rapidly absorbed through the skin but dermal vapour phase absorption might also play an important role for the uptake of the solvent. This particular aspect was investigated in an experimental study with 16 volunteers exposed to 80 mg/m(3) NMP for 8 h under either whole-body, i.e. inhalational plus dermal, or dermal-only conditions. Additionally, the influence of moderate physical workload on the uptake of NMP was studied. The urinary concentrations of NMP and its metabolites 5-hydroxy-N-methyl-2-pyrrolidone (5-HNMP) and 2-hydroxy-N-methylsuccinimide (2-HMSI) were followed for 48 h and analysed by gas chromatography-mass spectrometry (GC-MS). Percutaneous uptake delayed the elimination peak times and the apparent biological half-lives of NMP and 5-HNMP. Under resting conditions, dermal-only exposure resulted in the elimination of 71 +/- 8 mg NMP equivalents as compared to 169 +/- 15 mg for whole-body exposure. Moderate workload yielded 79 +/- 8 mg NMP (dermal-only) and 238 +/- 18 mg (whole-body). Thus, dermal absorption from the vapour phase may contribute significantly to the total uptake of NMP, e.g. from workplace atmospheres. As the concentration of airborne NMP does not reflect the body dose, biomonitoring should be carried out for surveillance purposes. PMID:17721780
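The elimination figures in this record allow a quick back-of-the-envelope estimate of how much of the total NMP dose came through the skin: the dermal-only elimination divided by the whole-body elimination, assuming the two exposure conditions are directly comparable. A small Python sketch using the reported mean values (means only; the study reports them with error margins):

```python
def dermal_fraction(dermal_only_mg: float, whole_body_mg: float) -> float:
    """Fraction of whole-body NMP uptake attributable to the dermal route,
    estimated from urinary elimination (mg NMP equivalents over 48 h)."""
    return dermal_only_mg / whole_body_mg

rest = dermal_fraction(71, 169)   # resting conditions
work = dermal_fraction(79, 238)   # moderate physical workload

print(f"dermal share at rest:     {rest:.0%}")
print(f"dermal share at workload: {work:.0%}")
```

On these mean values the dermal route accounts for roughly a third to two-fifths of the total uptake, which is the quantitative basis of the abstract's conclusion that vapour-phase skin absorption contributes significantly.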

  8. The neutral metallopeptidase NMP1 of Trichoderma guizhouense is required for mycotrophy and self-defence.

    PubMed

    Zhang, Jian; Bayram Akcapinar, Gunseli; Atanasova, Lea; Rahimi, Mohammad Javad; Przylucka, Agnieszka; Yang, Dongqing; Kubicek, Christian P; Zhang, Ruifu; Shen, Qirong; Druzhinina, Irina S

    2016-02-01

    Trichoderma guizhouense NJAU 4742 (Harzianum clade) can suppress the causative agent of banana wild disease Fusarium oxysporum f. sp. cubense 4 (Foc4). To identify genes involved in this trait, we used T-DNA insertional mutagenesis and isolated one mutant that was unable to overgrow Foc4 and had reduced antifungal ability. Using the high-efficiency thermal asymmetric interlaced-PCR, the T-DNA was located in the terminator of a neutral metalloprotease gene (encoding a MEROPS family M35 protease), which was named nmp1. The antifungal activity of the mutant was recovered by retransformation with wild-type nmp1 gene. The purified NMP1 (overexpressed in Pichia pastoris) did not inhibit the growth and germination of other fungi in vitro. Its addition, however, partly recovered the antifungal activity of the mutant strain against some fungi. The expression of nmp1 is induced by the presence of fungi and by dead fungal biomass, but the time-course of transcript accumulation following the physical contact depends on mode of interaction: it increases in cases of long-lasting parasitism and decreases if the prey fungus is dead shortly after or even before the contact (predation). We thus conclude that NMP1 protein of T. guizhouense has major importance for mycotrophic interactions and defence against other fungi. PMID:26118314

  9. Chemosensory effects during acute exposure to N-methyl-2-pyrrolidone (NMP).

    PubMed

    van Thriel, Christoph; Blaszkewicz, Meinolf; Schäper, Michael; Juran, Stephanie A; Kleinbeck, Stefan; Kiesswetter, Ernst; Wrbitzky, Renate; Stache, Jürgen; Golka, Klaus; Bader, Michael

    2007-12-10

Organic solvents are still essential in many industrial applications. To improve safety and health in the working environment, lower occupational threshold limits have been established and less toxic substitutes introduced. N-Methyl-2-pyrrolidone (NMP) is a versatile solvent that is used as a substitute for dichloromethane in paint strippers. Owing to conflicting results, there is a debate about whether NMP causes irritation of the upper airways and eyes. In a human experimental study we examined the chemosensory effects of NMP under controlled conditions. Fifteen healthy males were investigated in a cross-over study. NMP vapor concentrations were 10, 40 and 80 mg/m(3) for 2 x 4 h with an exposure-free lunch break of 30 min. To maximize chemosensory effects, a peak exposure scenario (25 mg/m(3) baseline, 160 mg/m(3) peaks 4 x 15 min, time-weighted average: 72 mg/m(3)) was tested. The four different conditions were conducted with and without moderate physical workload. Chemosensory effects were measured physiologically by anterior rhinomanometry, eye blink rate and breathing frequency. Subjectively, ratings of acute health symptoms and of the intensity of olfactory and trigeminal sensations were collected repeatedly throughout the exposures. All physiological variables were unaffected by the different NMP concentrations, and even the peak exposures had no effect on these measures. Olfactory-mediated health symptoms increased dose-dependently and showed strong adaptation, especially during the first 4 h of the exposures. Other acute symptoms were not significantly affected. Comparable to the symptoms, only olfactory sensations increased dose-dependently. Trigeminal sensations (e.g. eye and nose irritation) were rated as barely detectable during the different exposures; only during the 160 mg/m(3) exposure peaks were weak and transient eye irritations reported. The results clearly suggest that NMP concentrations of up to 160 mg/m(3) caused no

  10. Nmp4/CIZ: Road Block at the Intersection of PTH and Load

    PubMed Central

    Childress, Paul; Robling, Alexander G; Bidwell, Joseph P

    2009-01-01

Teriparatide (parathyroid hormone [PTH]) is the only FDA-approved drug that replaces bone lost to osteoporosis. Enhancing PTH efficacy will improve cost-effectiveness and ameliorate contraindications. Combining this hormone with load-bearing exercise may enhance therapeutic potential, consistent with a growing body of evidence that these agonists are synergistic and share common signaling pathways. Additionally, neutralizing molecules that naturally suppress the anabolic response to PTH may also improve the efficacy of treatment with this hormone. Nmp4/CIZ (nuclear matrix protein 4/cas interacting zinc finger)-null mice have enhanced responses to intermittent PTH with respect to increasing trabecular bone mass and are also immune to disuse-induced bone loss, likely through the removal of Nmp4/CIZ suppressive action on osteoblast function. Nmp4/CIZ activity may be sensitive to changes in the mechanical environment of the bone cell brought about by hormone- or mechanical load-induced changes in cell shape and adhesion. Nmp4 was identified in a screen for PTH-responsive nuclear matrix architectural transcription factors (ATFs) that we proposed translate hormone-induced changes in cell shape and adhesion into changes in target gene DNA conformation. CIZ was independently identified as a nucleocytoplasmic shuttling transcription factor associating with the mechano-sensitive focal adhesion proteins p130Cas and zyxin. The p130Cas/zyxin/Nmp4/CIZ pathway resembles the β-catenin/TCF/LEF1 mechanotransduction response limb, and both share features with the HMGB1 (high mobility group box 1)/RAGE (receptor for advanced glycation end products) signaling axis. Here we describe Nmp4/CIZ within the context of the PTH-induced anabolic response and consider the place of this molecule in the hierarchy of the PTH-load response network. PMID:19766748

  11. Synthesis and First Principles Investigation of HMX/NMP Cocrystal Explosive

    NASA Astrophysics Data System (ADS)

    Lin, He; Zhu, Shun-Guan; Zhang, Lin; Peng, Xin-Hua; LI, Hong-Zhen

    2013-10-01

1,3,5,7-Tetranitro-1,3,5,7-tetrazocine (HMX)/N-methyl-2-pyrrolidone (NMP) cocrystal explosive was prepared by a solution evaporation method. This cocrystal explosive crystallized in the trigonal system (space group ?), with cell parameters a = 16.605(8) Å and c = 31.496(4) Å. Theoretical investigations of the formation mechanism of the HMX/NMP cocrystal were carried out in the Cambridge Serial Total Energy Package (CASTEP) based on dispersion-corrected density functional theory (DFT-D) with a plane wave scheme. The exchange-correlation potential was treated with the Perdew-Burke-Ernzerhof functional of the generalized gradient approximation, and the dispersion force was corrected using Grimme's method. The band structure, density of states, projected density of states, and Mulliken populations were calculated at the generalized gradient approximation level. The results showed that the main host-guest interactions in the HMX/NMP cocrystal were hydrogen bonds and stacking interactions, the same as those identified by X-ray diffraction. Theoretical investigations of the HMX/NMP cocrystal explosive may provide a basis for the preparation of cocrystal explosives composed of HMX and energetic materials.

  12. Synthesis of NMP, a Fluoxetine (Prozac) Precursor, in the Introductory Organic Laboratory

    NASA Astrophysics Data System (ADS)

    Perrine, Daniel M.; Sabanayagam, Nathan R.; Reynolds, Kristy J.

    1998-10-01

A synthesis of the immediate precursor of the widely used antidepressant fluoxetine (Prozac) is described. The procedure is short, safe, and simple enough to serve as a laboratory exercise for undergraduate students in the second semester of introductory organic chemistry and is one which will be particularly interesting to those planning a career in the health sciences. The compound synthesized is (±)-N,N-dimethyl-3-(p-trifluoromethylphenoxy)-3-phenylpropylamine, or "N-methyl Prozac" (NMP). The synthesis of NMP requires one two-hour period and a second three-hour period. In the first period, a common Mannich base, 3-dimethylaminopropiophenone, is reduced with sodium borohydride to form (±)-3-dimethylamino-1-phenylpropanol. In the second period, potassium t-butoxide is used to couple (±)-3-dimethylamino-1-phenylpropanol with p-chlorotrifluoromethylbenzene to form NMP, which is isolated as its oxalate salt. All processes use equipment and materials that are inexpensive and readily available in most undergraduate laboratories. Detailed physical data are given on NMP, including high-field DEPT 13C NMR.

  13. OPPT workplan risk assessment Methylene Chloride (dichloromethane; CASRN 75-09-2) and N-Methylpyrrolidone (NMP; CASRN 872-50-4)

    EPA Science Inventory

    These assessments will focus on the use of DCM and NMP in paint stripping. NMP exposures scenarios for this use result in inhalation and dermal exposure to consumers and workers. The low concern for environmental effects of NMP will be discussed in the assessment. In the case of ...

  14. Computational and spectroscopic studies on luminescence of [Ag(PPh3)2(NMP)]NO3

    NASA Astrophysics Data System (ADS)

    Wei, Yong-Qin; Wu, Ke-Chen; Zhuang, Bo-Tao; Zhou, Zhang-Feng

    2005-09-01

Silver compound [Ag(PPh3)2(NMP)]NO3 (1) (NMP = 2-(4-dimethylaminophenyl)imidazo[4,5-f][1,10]phenanthroline) has been synthesized and characterized by elemental analysis, IR and X-ray diffraction techniques. Two absorption bands (294 and 342 nm) are observed in the UV/vis absorption spectrum of 1. Compound 1 is luminescent in the solid state at room temperature, showing a broad emission with a maximum at 599 nm, red-shifted relative to the free ligand NMP (λmax = 522 nm). The short lifetime (2.2 ns) indicates that the emission of 1 does not come from a triplet excited state and that the intense spin-orbit coupling of Ag(I) has little effect on the lowest singlet excited state. The excited state of 1 was simulated using time-dependent density functional theory, which gave a sound explanation of the absorption spectrum and indicated that the emission of 1 originates from metal-perturbed intraligand charge transfer.

  15. CIZ/NMP4 is expressed in B16 melanoma and forms a positive feedback loop with RANKL to promote migration of the melanoma cells.

    PubMed

    Sakuma, Tomomi; Nakamoto, Tetsuya; Hemmi, Hiroaki; Kitazawa, Sohei; Kitazawa, Riko; Notomi, Takuya; Hayata, Tadayoshi; Ezura, Yoichi; Amagasa, Teruo; Noda, Masaki

    2012-07-01

Tumor metastasis to bone is a serious pathological situation that causes severe pain and deterioration in locomotor function. However, the mechanisms underlying tumor metastasis are still incompletely understood. CIZ/NMP4 is a nucleocytoplasmic shuttling protein whose roles in tumor cells have not been known. We therefore hypothesized a role for CIZ/NMP4 in B16 melanoma cells that metastasize to bone. CIZ/NMP4 is expressed in B16 cells, and its expression levels correlate with metastatic activity in divergent types of melanoma cells. Overexpression of CIZ/NMP4 increased B16 cell migration in a Transwell assay; conversely, siRNA-based knockdown of CIZ/NMP4 suppressed the migratory activity of these cells. As RANKL promotes metastasis of tumor cells to bone, we tested its effect on CIZ/NMP4 in melanoma cells. RANKL treatment enhanced CIZ/NMP4 expression, and this increase promoted migration. Conversely, we identified a CIZ/NMP4 binding site in the promoter of RANKL, and luciferase assays indicated that CIZ/NMP4 overexpression enhanced RANKL promoter activity, revealing a positive feedback loop between CIZ/NMP4 and RANKL in melanoma. These observations indicate that CIZ/NMP4 is a critical regulator of melanoma cell metastasis. PMID:22307584

  16. Nmp4/CIZ suppresses the response of bone to anabolic parathyroid hormone by regulating both osteoblasts and osteoclasts

    PubMed Central

    Childress, Paul; Philip, Binu K.; Robling, Alexander G.; Bruzzaniti, Angela; Kacena, Melissa A.; Bivi, Nicoletta; Plotkin, Lilian I.; Heller, Aaron; Bidwell, Joseph P.

    2011-01-01

How parathyroid hormone (PTH) increases bone mass is unclear, but understanding this phenomenon is significant to the improvement of osteoporosis therapy. Nmp4/CIZ is a nucleocytoplasmic shuttling transcriptional repressor that suppresses PTH-induced osteoblast gene expression and hormone-stimulated gains in murine femoral trabecular bone. To further characterize Nmp4/CIZ suppression of hormone-mediated bone growth, we treated 10-wk-old Nmp4-knockout (KO) and wild-type (WT) mice with intermittent human PTH(1-34) at 30 μg/kg/day or vehicle, 7 days/wk, for 2, 3, or 7 wks. Null mice treated with hormone (7 wks) gained more vertebral and tibial cancellous bone than WT animals, paralleling the exaggerated response in the femur. Interestingly, Nmp4/CIZ suppression of this hormone-stimulated bone formation was not apparent during the first 2 wks of treatment. Consistent with the null mice's enhanced PTH-stimulated addition of trabecular bone, these animals exhibited an augmented hormone-induced increase in serum osteocalcin 3 wks into treatment. Unexpectedly, the Nmp4-KO mice displayed an osteoclast phenotype. Serum C-terminal telopeptide, a marker for bone resorption, was elevated in the null mice, irrespective of treatment. Nmp4-KO bone marrow cultures produced more osteoclasts, which exhibited an elevated resorbing activity, compared to WT cultures. The expression of several genes critical to the development of both osteoblasts and osteoclasts was elevated in Nmp4-KO mice at 2 wks but not 3 wks of hormone exposure. We propose that Nmp4/CIZ dampens PTH-induced improvement of trabecular bone throughout the skeleton by transiently suppressing hormone-stimulated increases in the expression of proteins key to the required enhanced activity/number of both osteoblasts and osteoclasts. PMID:21607813

  17. Nmp4/CIZ suppresses the response of bone to anabolic parathyroid hormone by regulating both osteoblasts and osteoclasts.

    PubMed

    Childress, Paul; Philip, Binu K; Robling, Alexander G; Bruzzaniti, Angela; Kacena, Melissa A; Bivi, Nicoletta; Plotkin, Lilian I; Heller, Aaron; Bidwell, Joseph P

    2011-07-01

How parathyroid hormone (PTH) increases bone mass is unclear, but understanding this phenomenon is significant to the improvement of osteoporosis therapy. Nmp4/CIZ is a nucleocytoplasmic shuttling transcriptional repressor that suppresses PTH-induced osteoblast gene expression and hormone-stimulated gains in murine femoral trabecular bone. To further characterize Nmp4/CIZ suppression of hormone-mediated bone growth, we treated 10-week-old Nmp4-knockout (KO) and wild-type (WT) mice with intermittent human PTH(1-34) at 30 μg/kg daily or vehicle, 7 days/week, for 2, 3, or 7 weeks. Null mice treated with hormone (7 weeks) gained more vertebral and tibial cancellous bone than WT animals, paralleling the exaggerated response in the femur. Interestingly, Nmp4/CIZ suppression of this hormone-stimulated bone formation was not apparent during the first 2 weeks of treatment. Consistent with the null mice's enhanced PTH-stimulated addition of trabecular bone, these animals exhibited an augmented hormone-induced increase in serum osteocalcin 3 weeks into treatment. Unexpectedly, the Nmp4-KO mice displayed an osteoclast phenotype. Serum C-terminal telopeptide, a marker for bone resorption, was elevated in the null mice, irrespective of treatment. Nmp4-KO bone marrow cultures produced more osteoclasts, which exhibited elevated resorbing activity, compared to WT cultures. The expression of several genes critical to the development of both osteoblasts and osteoclasts was elevated in Nmp4-KO mice at 2 weeks, but not 3 weeks, of hormone exposure. We propose that Nmp4/CIZ dampens PTH-induced improvement of trabecular bone throughout the skeleton by transiently suppressing hormone-stimulated increases in the expression of proteins key to the required enhanced activity and number of both osteoblasts and osteoclasts. PMID:21607813

  18. Low cost environmental sensors for Spaceflight : NMP Space Environmental Monitor (SEM) requirements

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Buelher, Martin G.; Brinza, D.; Patel, J. U.

    2005-01-01

An outstanding problem in spaceflight is the lack of adequate sensors for monitoring the space environment and its effects on engineering systems. By adequate, we mean low cost in terms of mission impact (e.g., low price, low mass/size, low power, low data rate, and low design impact). The New Millennium Program (NMP) is investigating the development of such a low-cost Space Environmental Monitor (SEM) package for inclusion on its technology validation flights. This effort follows from the need by NMP to characterize the space environment during testing so that potential users can extrapolate the test results to end-use conditions. The immediate objective of this effort is to develop a small diagnostic sensor package that could be obtained from commercial sources. Environments being considered are: contamination, atomic oxygen, ionizing radiation, cosmic radiation, EMI, and temperature. This talk describes the requirements and rationale for selecting these environments and reviews a preliminary design that includes a micro-controller data logger with data storage and interfaces to the sensors and spacecraft. If successful, such a sensor package could be the basis of a unique, long-term program for monitoring the effects of the space environment on spacecraft systems.
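In software terms, the SEM package described above comes down to a small data-logger loop: poll each environmental channel, timestamp the reading, and append it to storage. A hypothetical Python sketch of that loop; the channel names mirror the environments listed in the abstract, but the read function and record format are invented placeholders, not flight code:

```python
import json
import random
import time

# Hypothetical channel list mirroring the environments named in the abstract.
CHANNELS = ["contamination", "atomic_oxygen", "ionizing_radiation",
            "cosmic_radiation", "emi", "temperature"]

def read_channel(name: str) -> float:
    """Placeholder for a real sensor driver; returns a dummy reading in [0, 1]."""
    return random.uniform(0.0, 1.0)

def log_once(storage: list) -> dict:
    """One pass of the data-logger loop: sample every channel, timestamp the
    record, and append it to the on-board storage list."""
    record = {"t": time.time(), **{ch: read_channel(ch) for ch in CHANNELS}}
    storage.append(record)
    return record

storage = []
log_once(storage)
print(json.dumps(storage[0], indent=2))
```

A real implementation would replace `read_channel` with per-sensor drivers and the list with nonvolatile storage plus a spacecraft telemetry interface, but the control structure is the same.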

  19. Low Cost Environmental Sensors for Spaceflight: NMP Space Environmental Monitor (SEM) Requirements

    NASA Technical Reports Server (NTRS)

    Garrett, Henry B.; Buehler, Martin G.; Brinza, D.; Patel, J. U.

    2005-01-01

An outstanding problem in spaceflight is the lack of adequate sensors for monitoring the space environment and its effects on engineering systems. By adequate, we mean low cost in terms of mission impact (e.g., low price, low mass/size, low power, low data rate, and low design impact). The New Millennium Program (NMP) is investigating the development of such a low-cost Space Environmental Monitor (SEM) package for inclusion on its technology validation flights. This effort follows from the need by NMP to characterize the space environment during testing so that potential users can extrapolate the test results to end-use conditions. The immediate objective of this effort is to develop a small diagnostic sensor package that could be obtained from commercial sources. Environments being considered are: contamination, atomic oxygen, ionizing radiation, cosmic radiation, EMI, and temperature. This talk describes the requirements and rationale for selecting these environments and reviews a preliminary design that includes a micro-controller data logger with data storage and interfaces to the sensors and spacecraft. If successful, such a sensor package could be the basis of a unique, long-term program for monitoring the effects of the space environment on spacecraft systems.

  20. Charge transfer and 2kF vs. 4kF instabilities in the NMP-TCNQ molecular metal and (NMP)x(Phen)1-xTCNQ solid solutions

    NASA Astrophysics Data System (ADS)

    Alemany, Pere; Canadell, Enric; Pouget, Jean-Paul

    2016-01-01

A first-principles DFT study of the electronic structure of the two-chain molecular conductor NMP-TCNQ is reported. It is shown that the charge transfer occurring in this salt is not 1 but 2/3, finally settling the debate concerning the real charge transfer in this molecular metal. These calculations also lead to a simple rationalization of the three different regimes of 2kF and 4kF CDW instabilities occurring in the solid solutions (NMP)x(Phen)1-xTCNQ.

  1. SYSTEM DESIGN FOR NMP AND NPDES PERMIT APPLICATION FOR THE U.S. MEAT ANIMAL RESEARCH CENTER

    Technology Transfer Automated Retrieval System (TEKTRAN)

    All concentrated animal feeding operations that have the potential to discharge are required to apply for a National Pollutant Discharge Elimination System (NPDES) permit. The process also requires development of a nutrient management plan (NMP). Recent actions by the U.S. Environmental Protection...

  2. DEMONSTRATION OF N-METHYL PYRROLIDONE (NMP) AS A POLLUTION PREVENTION ALTERNATIVE TO PAINT STRIPPING WITH METHYLENE CHLORIDE

    EPA Science Inventory

    The objective of this paper is to demonstrate that NMP is a viable pollution prevention alternative to methylene chloride. Marine Corps Logistics Base (MCLB), Albany, GA, USA was the host site for the demonstration. MCLB's primary function is maintenance of military ground supp...

  3. Performance Analysis of the Absorption Refrigeration Cycle using TFE/NMP as a Working Fluid

    NASA Astrophysics Data System (ADS)

    Kato, Masashi; Tsujimori, Atsushi; Nakaguchi, Kentaro; Yabune, Hiroyuki; Akutsu, Toshinosuke; Nakao, Kazusige

    Performance analysis was carried out for the generator of an absorption refrigeration cycle using TFE/NMP as the working fluid. In this study a dynamic model was constructed. The model includes the heat and mass transfer characteristics in the generator and can predict the outlet concentration and the flow rate of the generated refrigerant vapor as the operating conditions of the absorption refrigeration cycle change. Heat transfer in the generator was specified by a heat transfer coefficient acting on the temperature difference between the heat transfer wall of the generator and the solution, and mass transfer was specified by an overall mass transfer coefficient between the bulk solution flow and the generated refrigerant bubbles. The calculations mainly followed the changes in the concentration and flow rate of the generated refrigerant vapor as the strong-solution flow rate, the generator wall temperature, and the generation pressure were increased in incremental steps; for system start-up and shutdown, the effects of the generative heat transfer coefficient, the overall mass transfer coefficient, and the strong-solution flow rate were investigated.

  4. Biochemical characterization of Arabidopsis APYRASE family reveals their roles in regulating endomembrane NDP/NMP homoeostasis.

    PubMed

    Chiu, Tsan-Yu; Lao, Jeemeng; Manalansan, Bianca; Loqué, Dominique; Roux, Stanley J; Heazlewood, Joshua L

    2015-11-15

    Plant apyrases are nucleoside triphosphate (NTP) diphosphohydrolases (NTPDases) and have been implicated in an array of functions within the plant including the regulation of extracellular ATP. Arabidopsis encodes a family of seven membrane bound apyrases (AtAPY1-7) that comprise three distinct clades, all of which contain the five conserved apyrase domains. With the exception of AtAPY1 and AtAPY2, the biochemical and the sub-cellular characterization of the other members are currently unavailable. In this research, we have shown that all seven Arabidopsis apyrases localize to internal membranes comprising the cis-Golgi, endoplasmic reticulum (ER) and endosome, indicating an endo-apyrase classification for the entire family. In addition, all members, with the exception of AtAPY7, can function as endo-apyrases by complementing a yeast double mutant (Δynd1Δgda1) which lacks apyrase activity. Interestingly, complementation of the mutant yeast using well characterized human apyrases could only be accomplished by using a functional ER endo-apyrase (NTPDase6), but not the ecto-apyrase (NTPDase1). Furthermore, the substrate specificity analysis for the Arabidopsis apyrases AtAPY1-6 indicated that each member has a distinct set of preferred substrates covering various NDPs (nucleoside diphosphates) and NTPs. Combining the biochemical analysis and sub-cellular localization of the Arabidopsis apyrase family, the data suggest their possible roles in regulating endomembrane NDP/NMP (nucleoside monophosphate) homoeostasis. PMID:26338998

  5. Symbiotic fungi that are essential for plant nutrient uptake investigated with NMP

    NASA Astrophysics Data System (ADS)

    Pallon, J.; Wallander, H.; Hammer, E.; Arteaga Marrero, N.; Auzelyte, V.; Elfman, M.; Kristiansson, P.; Nilsson, C.; Olsson, P. A.; Wegdén, M.

    2007-07-01

    The nuclear microprobe (NMP) technique using PIXE for elemental analysis and STIM on/off axis for parallel mass density normalization has proven successful to investigate possible interactions between minerals and ectomycorrhizal (EM) mycelia that form symbiotic associations with forest trees. The ability of the EM to make elements biologically available from minerals and soil was compared in field studies and in laboratory experiments, and molecular analysis (PCR-RFLP) was used to identify ectomycorrhizal species from the field samplings. EM rhizomorphs associated with apatite in laboratory systems and in mesh bags incubated in forest ecosystems contained larger amounts of Ca than similar rhizomorphs connected to acid-washed sand. EM mycelium produced in mesh bags had a capacity to mobilize P from apatite-amended sand, and a high concentration of K in some rhizomorphs suggests that these fungi are good accumulators of K and may have a significant role in transporting K to trees. Spores formed by arbuscular mycorrhizal (AM) fungi in laboratory cultures were compared with spores formed in saline soils in Tunisia in Northern Africa. We found lower concentrations of P and higher concentrations of Cl in the spores collected from the field than in the spores collected from laboratory cultures. For the case of laboratory cultures, the distribution of e.g. P and K was found to be clearly correlated.

  6. Evidence that the outer membrane protein gene nmpC of Escherichia coli K-12 lies within the defective qsr' prophage.

    PubMed Central

    Highton, P J; Chang, Y; Marcotte, W R; Schnaitman, C A

    1985-01-01

    Recombinants between phage lambda and the defective qsr' prophage of Escherichia coli K-12 were made in an nmpC (p+) mutant strain and in the nmpC+ parent. The outer membrane of strains lysogenic for recombinant qsr' phage derived from the nmpC (p+) strain contained a new protein identical in electrophoretic mobility to the NmpC porin and to the Lc porin encoded by phage PA-2. Lysogens of qsr' recombinants from the nmpC+ strain and lysogens of lambda p4, which carries the qsr' region, did not produce this protein. When observed by electron microscopy, the DNA acquired from the qsr' prophage showed homology with the region of the DNA molecule of phage PA-2 which contains the lc gene. Relative to that of the recombinant from the nmpC (p+) mutant, the DNA molecule of the recombinant from the nmpC+ parent contained an insertion near the lc gene. These results were supported by blot hybridization analysis of the E. coli chromosome with probes derived from the lc gene of phage PA-2. A sequence homologous to the lc gene was found at the nmpC locus, and the parental strains contained an insertion, tentatively identified as IS5B, located near the 3' end of the porin coding sequence. We conclude that the structural gene for the NmpC porin protein is located within the defective qsr' prophage at 12.5 min on the E. coli K-12 map and that this gene can be activated by loss of an insertion element. PMID:2984173

  7. NMP and O2 as Radical Initiator: Trifluoromethylation of Alkenes to Tertiary β-Trifluoromethyl Alcohols at Room Temperature.

    PubMed

    Liu, Chao; Lu, Qingquan; Huang, Zhiyuan; Zhang, Jian; Liao, Fan; Peng, Pan; Lei, Aiwen

    2015-12-18

    A novel strategy was developed to trigger ·CF3 by using in situ generated peroxide in NMP under O2 or air as the radical initiator. Radical trifluoromethylation of alkenes was achieved toward tertiary β-trifluoromethyl alcohols. Various tertiary β-trifluoromethyl alcohols can be synthesized in good yields without extra oxidants or transition metal catalysts. Preliminary mechanistic investigation revealed that O2 diffusion can influence the reaction rate. PMID:26649920

  8. DMAC and NMP as Electrolyte Additives for Li-Ion Cells

    NASA Technical Reports Server (NTRS)

    Smart, Marshall; Bugga, Ratnakumar; Lucht, Brett

    2008-01-01

    Dimethyl acetamide (DMAC) and N-methyl pyrrolidinone (NMP) have been found to be useful as high-temperature-resilience-enhancing additives to a baseline electrolyte used in rechargeable lithium-ion electrochemical cells. The baseline electrolyte, which was previously formulated to improve low-temperature performance, comprises LiPF6 dissolved at a concentration of 1.0 M in a mixture comprising equal volume proportions of ethylene carbonate, diethyl carbonate, and dimethyl carbonate. This and other electrolytes comprising lithium salts dissolved in mixtures of esters (including alkyl carbonates) have been studied in continuing research directed toward extending the lower limits of operating temperatures and, more recently, enhancing the high-temperature resilience of such cells. This research at earlier stages, and the underlying physical and chemical principles, were reported in numerous previous NASA Tech Briefs articles. Although these electrolytes provide excellent performance at low temperatures (typically as low as -40 C), when the affected Li-ion cells are subjected to high temperatures during storage and cycling, there occur irreversible losses of capacity accompanied by power fade and deterioration of low-temperature performance. The term "high-temperature resilience" signifies, loosely, the ability of a cell to resist such deterioration, retaining as much as possible of its initial charge/discharge capacity during operation or during storage in the fully charged condition at high temperature. For the purposes of the present development, a temperature is considered to be high if it equals or exceeds the upper limit (typically, 30 C) of the operating-temperature range for which the cells in question are generally designed.

  9. Embryotoxic potential of N-methyl-pyrrolidone (NMP) and three of its metabolites using the rat whole embryo culture system

    SciTech Connect

    Flick, Burkhard Talsness, Chris E.; Jaeckh, Rudolf; Buesen, Roland; Klug, Stephan

    2009-06-01

    N-methyl-2-pyrrolidone (NMP), which undergoes extensive biotransformation, has been shown in vivo to cause developmental toxicity and, especially after oral treatment, malformations in rats and rabbits. Data are lacking as to whether the original compound or one of its main metabolites is responsible for the toxic effects observed. Therefore, the relative embryotoxicity of the parent compound and its metabolites was evaluated using rat whole embryo culture (WEC) and the balb/c 3T3 cytotoxicity test. The resulting data were evaluated using two strategies; namely, one based on using all endpoints determined in the WEC and the other including endpoints from both the WEC and the cytotoxicity test. On basis of the first analysis, the substance with the highest embryotoxic potential is NMP, followed by 5-hydroxy-N-methyl-pyrrolidone (5-HNMP), 2-hydroxy-N-methylsuccinimide (2-HMSI) and N-methylsuccinimide (MSI). Specific dysmorphogeneses induced by NMP and 5-HNMP were aberrations in the head region of the embryos, abnormal development of the second visceral arches and open neural pores. The second evaluation strategy used only two endpoints of the WEC, i.e. the no observed adverse effect concentration (NOAEC{sub WEC}) and the lowest concentration leading to dysmorphogenesis in 100% of the cultured embryos (IC{sub MaxWEC}). In addition to these WEC endpoints the IC{sub 503T3} from the cytotoxicity test (balb/c 3T3 fibroblasts) was included in the evaluation scheme. These three endpoints were applied to a prediction model developed during a validation study of the European Centre for the Validation of Alternative Methods (ECVAM) allowing the classification of the embryotoxic potential of each compound into three classes (non-, weakly- and strongly embryotoxic). Consistent results from both evaluation strategies were observed, whereby NMP and its metabolites revealed a direct embryotoxic potential. Hereby, only NMP and 5-HNMP induced specific embryotoxic effects and were

  10. Why Probability?

    ERIC Educational Resources Information Center

    Weatherly, Myra S.

    1984-01-01

    Instruction in mathematical probability to enhance higher levels of critical and creative thinking with gifted students is described. Among thinking skills developed by such an approach are analysis, synthesis, evaluation, fluency, and complexity. (CL)

  11. Interaction partners for human ZNF384/CIZ/NMP4-zyxin as a mediator for p130CAS signaling?

    SciTech Connect

    Janssen, Hilde; Marynen, Peter . E-mail: Peter.Marynen@med.kuleuven.be

    2006-04-15

    Transcription factor ZNF384/CIZ/NMP4 was first cloned in rat as a p130Cas-binding protein and has a role in bone metabolism and spermatogenesis. It is recurrently involved in translocations in acute lymphoblastic leukemia. Translocations t(12;17) and t(12;22) fuse ZNF384 to RNA-binding proteins TAF15 and EWSR1, while a translocation t(12;19) generates an E2A/ZNF384 fusion. We screened for ZNF384 interacting proteins using yeast two-hybrid technology. In contrast to its rat homolog, human ZNF384 does not interact with p130CAS. Zyxin, PCBP1, and vimentin, however, were identified as ZNF384-binding partners. Given the interaction between human zyxin and p130CAS, these results suggest that zyxin indirectly enables the interaction of ZNF384 with p130CAS which is described in rat.

  12. Solute Transport Proteins and the Outer Membrane Protein NmpC Contribute to Heat Resistance of Escherichia coli AW1.7▿

    PubMed Central

    Ruan, Lifang; Pleitner, Aaron; Gänzle, Michael G.; McMullen, Lynn M.

    2011-01-01

    This study aimed to elucidate determinants of heat resistance in Escherichia coli by comparing the composition of membrane lipids, as well as gene expression, in heat-resistant E. coli AW1.7 and heat-sensitive E. coli GGG10 with or without heat shock. The survival of E. coli AW1.7 at late exponential phase was 100-fold higher than that of E. coli GGG10 after incubation at 60°C for 15 min. The cytoplasmic membrane of E. coli AW1.7 contained a higher proportion of saturated and cyclopropane fatty acids than that of E. coli GGG10. Microarray hybridization of cDNA libraries obtained from exponentially growing or heat-shocked cultures was performed to compare gene expression in these two strains. Expression of selected genes from different functional groups was quantified by quantitative PCR. DnaK and 30S and 50S ribosomal subunits were overexpressed in E. coli GGG10 relative to E. coli AW1.7 upon heat shock at 50°C, indicating improved ribosome stability. The outer membrane porin NmpC and several transport proteins were overexpressed in exponentially growing E. coli AW1.7. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis of membrane properties confirmed that NmpC is present in the outer membrane of E. coli AW1.7 but not in that of E. coli GGG10. Expression of NmpC in E. coli GGG10 increased survival at 60°C 50- to 1,000-fold. In conclusion, the outer membrane porin NmpC contributes to heat resistance in E. coli AW1.7, but the heat resistance of this strain is dependent on additional factors, which likely include the composition of membrane lipids, as well as solute transport proteins. PMID:21398480

  13. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
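    As an illustration (a sketch added here, not taken from the article): one classic event with probability 1/e is that a uniformly random permutation of n items has no fixed point (a derangement); as n grows, this probability approaches 1/e ≈ 0.3679, which a small simulation confirms.

    ```python
    import math
    import random

    def no_fixed_point(n: int, rng: random.Random) -> bool:
        """Return True if a uniformly random permutation of range(n) is a derangement."""
        perm = list(range(n))
        rng.shuffle(perm)
        return all(perm[i] != i for i in range(n))

    def estimate_derangement_probability(n: int = 20, trials: int = 200_000, seed: int = 1) -> float:
        """Monte Carlo estimate of the probability that a random permutation has no fixed point."""
        rng = random.Random(seed)
        hits = sum(no_fixed_point(n, rng) for _ in range(trials))
        return hits / trials

    if __name__ == "__main__":
        est = estimate_derangement_probability()
        print(f"estimated {est:.4f}  vs  1/e = {1 / math.e:.4f}")
    ```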

  14. On Probability Domains III

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2015-12-01

    Domains of generalized probability have been introduced in order to provide a general construction of random events, observables and states. It is based on the notion of a cogenerator and the properties of product. We continue our previous study and show how some other quantum structures fit our categorical approach. We discuss how various epireflections implicitly used in the classical probability theory are related to the transition to fuzzy probability theory and describe the latter probability theory as a genuine categorical extension of the former. We show that the IF-probability can be studied via the fuzzy probability theory. We outline a "tensor modification" of the fuzzy probability theory.

  15. Probability and Relative Frequency

    NASA Astrophysics Data System (ADS)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.

  16. BIODEGRADATION PROBABILITY PROGRAM (BIODEG)

    EPA Science Inventory

    The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...

  17. Evolution and Probability.

    ERIC Educational Resources Information Center

    Bailey, David H.

    2000-01-01

    Some of the most impressive-sounding criticisms of the conventional theory of biological evolution involve probability. Presents a few examples of how probability should and should not be used in discussing evolution. (ASK)

  18. Probability on a Budget.

    ERIC Educational Resources Information Center

    Ewbank, William A.; Ginther, John L.

    2002-01-01

    Describes how to use common dice numbered 1-6 for simple mathematical situations including probability. Presents a lesson using regular dice and specially marked dice to explore some of the concepts of probability. (KHR)
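    The dice lesson above can be mirrored in software (a sketch of my own, not part of the article): compare the exact distribution of the sum of two fair dice with the relative frequencies observed over many simulated rolls.

    ```python
    import random
    from collections import Counter

    def theoretical_sum_probs() -> dict:
        """Exact distribution of the sum of two fair six-sided dice (36 equally likely outcomes)."""
        counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
        return {s: c / 36 for s, c in counts.items()}

    def experimental_sum_probs(rolls: int, seed: int = 0) -> dict:
        """Relative frequencies of the sum over `rolls` simulated throws."""
        rng = random.Random(seed)
        counts = Counter(rng.randint(1, 6) + rng.randint(1, 6) for _ in range(rolls))
        return {s: c / rolls for s, c in counts.items()}

    if __name__ == "__main__":
        exact = theoretical_sum_probs()
        est = experimental_sum_probs(100_000)
        for s in range(2, 13):
            print(f"sum {s:2d}: theory {exact[s]:.3f}  experiment {est.get(s, 0):.3f}")
    ```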

  19. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  20. Searching with probabilities

    SciTech Connect

    Palay, A.J.

    1985-01-01

    This book examines how probability distributions can be used as a knowledge representation technique. It presents a mechanism that can be used to guide a selective search algorithm to solve a variety of tactical chess problems. Topics covered include probabilities and searching, the B algorithm, chess probabilities in practice, examples, results, and future work.

  1. In All Probability, Probability is not All

    ERIC Educational Resources Information Center

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  2. A Posteriori Transit Probabilities

    NASA Astrophysics Data System (ADS)

    Stevens, Daniel J.; Gaudi, B. Scott

    2013-08-01

    Given the radial velocity (RV) detection of an unseen companion, it is often of interest to estimate the probability that the companion also transits the primary star. Typically, one assumes a uniform distribution for the cosine of the inclination angle i of the companion's orbit. This yields the familiar estimate for the prior transit probability of ~R*/a, given the primary radius R* and orbital semimajor axis a, and assuming small companions and a circular orbit. However, the posterior transit probability depends not only on the prior probability distribution of i but also on the prior probability distribution of the companion mass Mc, given a measurement of the product of the two (the minimum mass Mc sin i) from an RV signal. In general, the posterior can be larger or smaller than the prior transit probability. We derive analytic expressions for the posterior transit probability assuming a power-law form for the distribution of true masses, dΓ/dMc ∝ Mc^α, for integer values -3 <= α <= 3. We show that for low transit probabilities, these probabilities reduce to a constant multiplicative factor f_α of the corresponding prior transit probability, where f_α in general depends on α and an assumed upper limit on the true mass. The prior and posterior probabilities are equal for α = -1. The posterior transit probability is ~1.5 times larger than the prior for α = -3 and is ~4/π times larger for α = -2, but is less than the prior for α >= 0, and can be arbitrarily small for α > 1. We also calculate the posterior transit probability in different mass regimes for two physically-motivated mass distributions of companions around Sun-like stars. We find that for Jupiter-mass planets, the posterior transit probability is roughly equal to the prior probability, whereas the posterior is likely higher for Super-Earths and Neptunes (10 M⊕ - 30 M⊕) and Super-Jupiters (3 MJup - 10 MJup), owing to the predicted steep rise in the mass function toward smaller
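    The prior estimate and the multiplicative correction quoted in the abstract can be sketched numerically. The stellar radius and the 0.05 AU orbit below are illustrative assumptions of mine; the f_α values (1.5 for α = -3, 4/π for α = -2, 1 for α = -1) are the ones stated in the abstract, valid for low transit probabilities.

    ```python
    # Illustrative numbers (a Sun-like star and a 0.05 AU orbit are assumptions,
    # not values from the abstract).
    R_SUN_M = 6.957e8   # solar radius in metres
    AU_M = 1.496e11     # astronomical unit in metres

    def prior_transit_probability(r_star_m: float, a_m: float) -> float:
        """Prior transit probability ~R*/a for a small companion on a circular orbit."""
        return r_star_m / a_m

    # Multiplicative factors f_alpha quoted in the abstract for low transit
    # probabilities: posterior ≈ f_alpha * prior.
    F_ALPHA = {-3: 1.5, -2: 4 / 3.141592653589793, -1: 1.0}

    if __name__ == "__main__":
        prior = prior_transit_probability(R_SUN_M, 0.05 * AU_M)
        for alpha, f in sorted(F_ALPHA.items()):
            print(f"alpha={alpha:+d}: prior {prior:.3f} -> posterior ≈ {f * prior:.3f}")
    ```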

  3. Single-case probabilities

    NASA Astrophysics Data System (ADS)

    Miller, David

    1991-12-01

    The propensity interpretation of probability, bred by Popper in 1957 (K. R. Popper, in Observation and Interpretation in the Philosophy of Physics, S. Körner, ed. (Butterworth, London, 1957, and Dover, New York, 1962), p. 65; reprinted in Popper Selections, D. W. Miller, ed. (Princeton University Press, Princeton, 1985), p. 199) from pure frequency stock, is the only extant objectivist account that provides any proper understanding of single-case probabilities as well as of probabilities in ensembles and in the long run. In Sec. 1 of this paper I recall salient points of the frequency interpretations of von Mises and of Popper himself, and in Sec. 2 I filter out from Popper's numerous expositions of the propensity interpretation its most interesting and fertile strain. I then go on to assess it. First I defend it, in Sec. 3, against recent criticisms (P. Humphreys, Philos. Rev. 94, 557 (1985); P. Milne, Erkenntnis 25, 129 (1986)) to the effect that conditional [or relative] probabilities, unlike absolute probabilities, can only rarely be made sense of as propensities. I then challenge its predominance, in Sec. 4, by outlining a rival theory: an irreproachably objectivist theory of probability, fully applicable to the single case, that interprets physical probabilities as instantaneous frequencies.

  4. Probability with Roulette

    ERIC Educational Resources Information Center

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
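    The expected-value computation the article teaches can be sketched as follows (my own minimal example, assuming an American wheel with 38 pockets): every standard bet has the same expected loss of 2/38 ≈ 5.26% per unit staked.

    ```python
    from fractions import Fraction

    def expected_value(win_pockets: int, payout_to_one: int, total_pockets: int = 38) -> Fraction:
        """Expected profit per unit staked on an American roulette wheel (38 pockets)."""
        p_win = Fraction(win_pockets, total_pockets)
        return p_win * payout_to_one - (1 - p_win) * 1

    if __name__ == "__main__":
        bets = {"straight (35:1)": (1, 35), "red (1:1)": (18, 1), "dozen (2:1)": (12, 2)}
        for name, (pockets, payout) in bets.items():
            ev = expected_value(pockets, payout)
            print(f"{name}: EV = {ev} ≈ {float(ev):.4f} per unit staked")
    ```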

  5. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
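    The dependence on position uncertainties, object sizes, and nominal miss distance can be illustrated by a Monte Carlo sketch (mine, not the closed-form solution the report derives). It assumes the two objects' independent Gaussian position errors have already been summed into a single covariance, diagonal in the 2-D encounter plane, with the nominal miss distance along the x axis; all numbers are hypothetical.

    ```python
    import math
    import random

    def collision_probability_mc(miss_distance: float, combined_radius: float,
                                 sigma_x: float, sigma_y: float,
                                 trials: int = 200_000, seed: int = 7) -> float:
        """Monte Carlo estimate of collision probability in the 2-D encounter plane.

        Simplifying assumptions (not from the report): independent Gaussian errors
        with combined one-sigma values (sigma_x, sigma_y), nominal miss along x,
        collision when the sampled separation is below the combined object radius."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            dx = miss_distance + rng.gauss(0.0, sigma_x)
            dy = rng.gauss(0.0, sigma_y)
            if math.hypot(dx, dy) < combined_radius:
                hits += 1
        return hits / trials

    if __name__ == "__main__":
        # Hypothetical numbers: 200 m nominal miss, 10 m combined radius,
        # 100 m and 50 m one-sigma position uncertainties.
        print(collision_probability_mc(200.0, 10.0, 100.0, 50.0))
    ```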

  6. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  7. Acceptance, values, and probability.

    PubMed

    Steel, Daniel

    2015-10-01

    This essay makes a case for regarding personal probabilities used in Bayesian analyses of confirmation as objects of acceptance and rejection. That in turn entails that personal probabilities are subject to the argument from inductive risk, which aims to show non-epistemic values can legitimately influence scientific decisions about which hypotheses to accept. In a Bayesian context, the argument from inductive risk suggests that value judgments can influence decisions about which probability models to accept for likelihoods and priors. As a consequence, if the argument from inductive risk is sound, then non-epistemic values can affect not only the level of evidence deemed necessary to accept a hypothesis but also degrees of confirmation themselves. PMID:26386533

  8. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, the authors outline how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
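    The technique can be sketched in a few lines (my own example in Python rather than the Visual Basic used in the article): the integral of f over [a, b] equals (b - a) times the expectation of f(U) for U uniform on [a, b], so the sample mean of f at random points converges to the integral.

    ```python
    import math
    import random

    def mc_integral(f, a: float, b: float, samples: int = 100_000, seed: int = 3) -> float:
        """Approximate the definite integral of f over [a, b] as (b - a) * E[f(U)],
        with U uniform on [a, b], using the sample mean over `samples` draws."""
        rng = random.Random(seed)
        total = sum(f(rng.uniform(a, b)) for _ in range(samples))
        return (b - a) * total / samples

    if __name__ == "__main__":
        # Example: integral of exp(-x^2) from 0 to 1 (true value ≈ 0.74682).
        approx = mc_integral(lambda x: math.exp(-x * x), 0.0, 1.0)
        print(f"Monte Carlo estimate: {approx:.4f}")
    ```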

  9. Varga: On Probability.

    ERIC Educational Resources Information Center

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  10. Application of Quantum Probability

    NASA Astrophysics Data System (ADS)

    Bohdalová, Mária; Kalina, Martin; Nánásiová, Ol'ga

    2009-03-01

    This is the first attempt to smooth time series using estimators that apply quantum probability with causality (non-commutative s-maps on an orthomodular lattice). In this context it means that we use a non-symmetric covariance matrix to construct our estimator.

  11. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  12. Waste Package Misload Probability

    SciTech Connect

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  13. Probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
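    The post-processing step described above, turning a stack of equally likely simulations into an exceedance-probability map, can be sketched as follows (a toy example of mine; the Gaussian cell values stand in for real geostatistical realizations).

    ```python
    import random

    def exceedance_probability(realizations, threshold: float):
        """Per-cell probability that contamination exceeds `threshold`, estimated as
        the fraction of equally likely simulated maps in which it is exceeded."""
        n_sims = len(realizations)
        n_cells = len(realizations[0])
        return [sum(sim[c] > threshold for sim in realizations) / n_sims
                for c in range(n_cells)]

    if __name__ == "__main__":
        # Toy stand-in for geostatistical simulations: 500 equally likely maps
        # of 4 cells, with cell means 5, 20, 35, 50 (hypothetical units).
        rng = random.Random(11)
        sims = [[rng.gauss(mean, 10.0) for mean in (5, 20, 35, 50)] for _ in range(500)]
        print(exceedance_probability(sims, threshold=30.0))
    ```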

  14. Measurement Uncertainty and Probability

    NASA Astrophysics Data System (ADS)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  15. Emptiness Formation Probability

    NASA Astrophysics Data System (ADS)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-cL^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele, and Ueltschi.

  16. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  17. The Probability of Causal Conditionals

    ERIC Educational Resources Information Center

    Over, David E.; Hadjichristidis, Constantinos; Evans, Jonathan St. B. T.; Handley, Simon J.; Sloman, Steven A.

    2007-01-01

    Conditionals in natural language are central to reasoning and decision making. A theoretical proposal called the Ramsey test implies the conditional probability hypothesis: that the subjective probability of a natural language conditional, P(if p then q), is the conditional subjective probability, P(q [such that] p). We report three experiments on…

  18. Quantum probability and many worlds

    NASA Astrophysics Data System (ADS)

    Hemmo, Meir; Pitowsky, Itamar

    We discuss the meaning of probabilities in the many worlds interpretation of quantum mechanics. We start by presenting very briefly the many worlds theory, how the problem of probability arises, and some unsuccessful attempts to solve it in the past. Then we criticize a recent attempt by Deutsch to derive the quantum mechanical probabilities from the non-probabilistic parts of quantum mechanics and classical decision theory. We further argue that the Born probability does not make sense even as an additional probability rule in the many worlds theory. Our conclusion is that the many worlds theory fails to account for the probabilistic statements of standard (collapse) quantum mechanics.

  19. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. Sixty-two fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not lead to a lower score in probability performance. The study also revealed that motivated students who attended the probability workshop showed a positive improvement in performance compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  20. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  1. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  2. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

We show that probability-based environmental resource monitoring programs, such as U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  3. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
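The weighted estimator can be sketched as follows (the grouping weights here, proportional to estimated species counts per detection-frequency group, are an illustrative assumption rather than the authors' exact formula):

```python
def combined_extinction_estimate(est_by_group, n_by_group):
    """Weighted combination of group-specific local extinction estimates.

    est_by_group -- extinction probability estimated within each
                    detection-frequency group
    n_by_group   -- estimated number of species in each group
    Weights proportional to group sizes; an ad hoc scheme in the spirit
    of the paper, not the authors' exact estimator.
    """
    total = sum(n_by_group)
    return sum(e * n / total for e, n in zip(est_by_group, n_by_group))

# Hypothetical values: low-detectability species (first group) show a
# higher local extinction estimate, as the paper's hypothesis predicts.
est = combined_extinction_estimate([0.25, 0.08], [40, 110])
```

The weighting lets the heterogeneous groups contribute in proportion to their estimated share of the community.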

  4. The Probabilities of Conditionals Revisited

    ERIC Educational Resources Information Center

    Douven, Igor; Verbrugge, Sara

    2013-01-01

According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the conditional probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of…

  5. Minimizing the probable maximum flood

    SciTech Connect

Woodbury, M.S.; Pansic, N.; Eberlein, D.T.

    1994-06-01

    This article examines Wisconsin Electric Power Company's efforts to determine an economical way to comply with Federal Energy Regulatory Commission requirements at two hydroelectric developments on the Michigamme River. Their efforts included refinement of the area's probable maximum flood model based, in part, on a newly developed probable maximum precipitation estimate.

  6. Computation of Most Probable Numbers

    PubMed Central

    Russek, Estelle; Colwell, Rita R.

    1983-01-01

    A rapid computational method for maximum likelihood estimation of most-probable-number values, incorporating a modified Newton-Raphson method, is presented. The method offers a much greater reliability for the most-probable-number estimate of total viable bacteria, i.e., those capable of growth in laboratory media. PMID:6870242
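The modified Newton-Raphson iteration the abstract refers to is not spelled out; the sketch below solves the standard MPN likelihood equation by plain Newton-Raphson (the dilution series, tube counts, and starting guess are illustrative assumptions):

```python
import math

def mpn_mle(volumes, n_tubes, n_positive, lam0=1.0, tol=1e-9, max_iter=100):
    """Maximum-likelihood MPN (organisms per unit volume) by Newton-Raphson.

    volumes[i]    -- inoculum volume at dilution i
    n_tubes[i]    -- tubes inoculated at dilution i
    n_positive[i] -- tubes showing growth at dilution i
    Assumes at least one positive and one negative tube overall,
    so the likelihood has a finite maximum.
    """
    lam = lam0
    for _ in range(max_iter):
        score = curv = 0.0
        for v, n, x in zip(volumes, n_tubes, n_positive):
            u = math.exp(-lam * v)                        # P(tube sterile)
            score += x * v * u / (1.0 - u) - (n - x) * v  # d(log L)/d(lam)
            curv -= x * v * v * u / (1.0 - u) ** 2        # d2(log L)/d(lam)2
        step = score / curv
        lam -= step
        if abs(step) < tol:
            break
    return lam

# A 5-tube, 3-dilution series (10, 1, 0.1 ml) with 5, 3, 0 positive tubes;
# standard MPN tables list roughly 79 organisms per 100 ml for this pattern.
mpn = mpn_mle([10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 0])
```

Because the log-likelihood is smooth and concave in this regime, the iteration converges in a handful of steps.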

  7. Probability of sea level rise

    SciTech Connect

    Titus, J.G.; Narayanan, V.K.

    1995-10-01

    The report develops probability-based projections that can be added to local tide-gage trends to estimate future sea level at particular locations. It uses the same models employed by previous assessments of sea level rise. The key coefficients in those models are based on subjective probability distributions supplied by a cross-section of climatologists, oceanographers, and glaciologists.

  8. Decision analysis with approximate probabilities

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas

    1992-01-01

This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied, because some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decision making using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second-order maximum entropy principle, performed best overall.
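For the constrained level of information, one concrete way to force tenth-rounded probabilities to sum to 1.0 is largest-remainder apportionment over tenths (the paper does not specify its rounding rule, so this scheme is an assumption):

```python
def round_probs_to_tenths(probs):
    """Round probabilities (summing to 1) to tenths so the rounded
    values are forced to sum exactly to 1.0."""
    units = [10 * p for p in probs]   # work in tenths
    floors = [int(u) for u in units]
    remainder = 10 - sum(floors)      # tenths still to distribute
    # hand the leftover tenths to the largest fractional parts
    order = sorted(range(len(probs)),
                   key=lambda i: units[i] - floors[i], reverse=True)
    for i in order[:remainder]:
        floors[i] += 1
    return [f / 10 for f in floors]

# Independent rounding of these would give 0.3 + 0.4 + 0.4 = 1.1;
# the constrained version keeps the total at 1.0.
rounded = round_probs_to_tenths([0.27, 0.36, 0.37])
assert abs(sum(rounded) - 1.0) < 1e-9
```

This illustrates exactly the 0.9/1.0/1.1 discrepancy the abstract describes for unconstrained rounding.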

  9. VESPA: False positive probabilities calculator

    NASA Astrophysics Data System (ADS)

    Morton, Timothy D.

    2015-03-01

    Validation of Exoplanet Signals using a Probabilistic Algorithm (VESPA) calculates false positive probabilities and statistically validates transiting exoplanets. Written in Python, it uses isochrones [ascl:1503.010] and the package simpledist.

  10. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  11. The probabilities of unique events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224
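The split-the-difference prediction is easy to state computationally; the sketch below (function names are ours, not the authors') shows why a midpoint estimate violates the probability calculus whenever the conjuncts differ:

```python
def split_difference(p_a, p_b):
    """Intuitive conjunction estimate the theory predicts: the midpoint
    of the two conjunct probabilities."""
    return (p_a + p_b) / 2

def violates_calculus(p_a, p_b):
    """A conjunction estimate above min(P(A), P(B)) cannot be coherent,
    since A-and-B entails both A and B."""
    return split_difference(p_a, p_b) > min(p_a, p_b)

# Whenever the conjuncts differ, the midpoint exceeds the smaller one.
assert violates_calculus(0.9, 0.3)       # midpoint well above 0.3
assert not violates_calculus(0.5, 0.5)   # equal conjuncts: midpoint = min
```

Any averaging of unequal conjunct probabilities therefore commits the conjunction violation the experiments report.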

  12. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  13. Transition probabilities of Br II

    NASA Technical Reports Server (NTRS)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  14. Joint probabilities and quantum cognition

    SciTech Connect

    Acacio de Barros, J.

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  15. Evaluation of microbial release probabilities

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Work undertaken to improve the estimation of the probability of release of microorganisms from unmanned Martian landing spacecraft is summarized. An analytical model is described for the development of numerical values for release parameters and release mechanisms applicable to flight missions are defined. Laboratory test data are used to evolve parameter values for use by flight projects in estimating numerical values for release probabilities. The analysis treats microbial burden located on spacecraft surfaces, between mated surfaces, and encapsulated within materials.

  16. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  17. Tiempo para un cambio

    NASA Astrophysics Data System (ADS)

    Woltjer, L.

    1987-06-01

At the meeting held in December of last year, I informed the Council of my wish to end my contract as Director General of ESO once the VLT project had been approved, which is expected to happen towards the end of this year. When my appointment was renewed three years ago, the Council was aware of my intention not to complete the five years of the contract, owing to my wish to have more time for other activities. Now that the preparatory phase for the VLT has been completed, with the project formally presented to the Council on 31 March and its very probable approval expected before the end of this year, it seems to me that 1 January 1988 is an excellent date for a change in the administration of ESO.

  18. Joint probability distributions for projection probabilities of random orthonormal states

    NASA Astrophysics Data System (ADS)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.

  19. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.

  20. Measure and probability in cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua S.; Wald, Robert M.

    2012-07-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. In ordinary statistical physics, the Liouville measure is used to compute probabilities of macrostates, and it would seem natural to use the similar measure arising in general relativity to compute probabilities in cosmology, such as the probability that the Universe underwent an era of inflation. Indeed, a number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)—namely, the Gibbons-Hawking-Stewart measure—to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account (we illustrate how) even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines “nearly homogeneous.” (4) In a Universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the Universe to retrodict the likelihood of past conditions.

  1. Flood hazard probability mapping method

    NASA Astrophysics Data System (ADS)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying flood probability for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to subtract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high-probability flood regions were made, regardless of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
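The weighted flood-probability index can be sketched per grid cell as a linear combination of normalized PCD layers (the factor names and weights below are illustrative assumptions, not the study's fitted values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical physical catchment descriptors (PCDs) on a grid,
# each already normalized to [0, 1].
ny, nx = 40, 40
factors = {
    "topographic_wetness": rng.random((ny, nx)),
    "land_use_runoff":     rng.random((ny, nx)),
    "soil_infiltration":   rng.random((ny, nx)),
}
weights = {"topographic_wetness": 0.5,
           "land_use_runoff": 0.3,
           "soil_infiltration": 0.2}   # assumed to sum to 1

def flood_probability_index(factors, weights):
    """Weighted linear index per cell, in [0, 1] given the assumptions."""
    return sum(weights[k] * layer for k, layer in factors.items())

index = flood_probability_index(factors, weights)
assert index.shape == (ny, nx)
```

With the weights summing to one, the index stays in [0, 1] and can be mapped directly as a relative flood-probability surface.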

  2. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  3. Probability as a Physical Motive

    NASA Astrophysics Data System (ADS)

    Martin, Peter

    2007-06-01

Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  4. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  5. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-and-one foul-shot situation in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)

  6. Some Surprising Probabilities from Bingo.

    ERIC Educational Resources Information Center

    Mercer, Joseph O.

    1993-01-01

    Investigates the probability of winning the largest prize at Bingo through a series of five simpler problems. Investigations are conducted with the aid of either BASIC computer programs, spreadsheets, or a computer algebra system such as Mathematica. Provides sample data tables to illustrate findings. (MDH)

  7. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  8. Conditional Independence in Applied Probability.

    ERIC Educational Resources Information Center

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  9. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  10. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    ERIC Educational Resources Information Center

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  11. Dynamic SEP event probability forecasts

    NASA Astrophysics Data System (ADS)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
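The abstract does not give the algorithm's closed form; below is a minimal Bayesian sketch of a decaying forecast, assuming a survival function S(t) for historical onset delays (the exponential form and the 12 h scale are illustrative, not from the paper):

```python
import math

def dynamic_sep_probability(p0, survival, t_hours):
    """Bayes-updated probability that a SEP event is still coming,
    given that none has been observed t hours after the X-ray peak.

    p0       -- initial forecast probability right after the flare
    survival -- S(t): fraction of historical SEP onsets delayed > t hours
                (assumed available from an event catalog)
    """
    s = survival(t_hours)
    return p0 * s / (p0 * s + (1.0 - p0))

# Hypothetical exponential survival with a ~12 h mean onset delay.
surv = lambda t: math.exp(-t / 12.0)

p = [dynamic_sep_probability(0.5, surv, t) for t in (0, 6, 12, 24)]
assert p[0] == 0.5                 # no decay at the moment of the flare
assert p[0] > p[1] > p[2] > p[3]   # forecast decays as no event appears
```

The update is just Bayes' rule with "no onset yet by time t" as the evidence, which captures the qualitative behavior the abstract describes.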

  12. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e., to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information, and statistical description with regard to basic notions of the statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  13. Probability densities in strong turbulence

    NASA Astrophysics Data System (ADS)

    Yakhot, Victor

    2006-03-01

    In this work, using Mellin's transform combined with the Gaussian large-scale boundary condition, we calculate probability densities (PDFs) of velocity increments P(δu,r), velocity derivatives P(u,r), and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the log-normal PDF P(δu,r) often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from log-normality. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.

  14. Probability for primordial black holes

    NASA Astrophysics Data System (ADS)

    Bousso, R.; Hawking, S. W.

    1995-11-01

    We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.

  15. Relative transition probabilities of cobalt

    NASA Technical Reports Server (NTRS)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    We report neutral- and singly-ionized-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, with estimated reliabilities ranging from 8 to 50%.

  16. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  17. Probability of Detection Demonstration Transferability

    NASA Technical Reports Server (NTRS)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  18. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters. PMID:22407706
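    The maximum-likelihood and Bayesian principles the chapter introduces can be contrasted on the simplest possible model, a Bernoulli (coin-flip) parameter; the Beta prior and these two tiny estimators are a generic textbook sketch, not code from the chapter:

```python
def mle_heads(k, n):
    # maximum-likelihood estimate of a coin's heads probability
    # from k heads in n flips
    return k / n

def bayes_posterior_mean(k, n, a=1.0, b=1.0):
    # posterior mean under a Beta(a, b) prior (a = b = 1 is uniform);
    # conjugacy gives Beta(a + k, b + n - k) as the posterior
    return (a + k) / (a + b + n)
```

With 7 heads in 10 flips the MLE is 0.7, while the uniform-prior Bayesian estimate is 8/12, pulled toward 1/2 by the prior; the two converge as n grows.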

  19. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
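    The a priori dice calculation described above amounts to enumerating the equally likely outcomes and counting the favorable ones; a minimal sketch (the helper name is ours, not from the lectures):

```python
from fractions import Fraction
from itertools import product

def probability_of_sum(total, n_dice=2, sides=6):
    """A priori probability that n fair dice sum to `total`,
    by enumerating the equally likely outcomes."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    favorable = sum(1 for o in outcomes if sum(o) == total)
    return Fraction(favorable, len(outcomes))
```

For two fair dice this gives, e.g., probability 1/6 for a sum of 7 and 1/36 for a sum of 2. The inverse, statistical problem the notes end with starts instead from observed rolls and infers the dice's fairness.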

  20. Measure and Probability in Cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua; Wald, Robert

    2012-03-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. A number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)---namely, the Gibbons-Hawking-Stewart measure---to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines ``nearly homogeneous''. (4) In a universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the universe to ``retrodict'' the likelihood of past conditions.

  1. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = 1 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history’s ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
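    The counting scheme described above can be sketched as a small Monte Carlo loop. The lognormal error model and its width below are placeholder assumptions standing in for the paper's plant-data simulation; only the count-and-divide logic mirrors the abstract:

```python
import random

def false_indication_probability(true_dcdf, threshold=1e-6,
                                 rel_sigma=0.5, n_histories=100_000, seed=0):
    """Monte Carlo estimate of a false indication probability.

    Each simulated time history yields an estimated delta-CDF scattered
    around the true value (hypothetical lognormal noise).  If the true
    indication is green (true_dcdf below threshold), an estimate above
    threshold counts as a false positive; if the true indication is white
    (above threshold), an estimate below it counts as a false negative.
    """
    rng = random.Random(seed)
    truly_white = true_dcdf > threshold
    false_count = 0
    for _ in range(n_histories):
        estimate = true_dcdf * rng.lognormvariate(0.0, rel_sigma)
        if (estimate > threshold) != truly_white:
            false_count += 1
    return false_count / n_histories
```

The estimate is simply the false count divided by the number of time histories, as in the abstract; with a true value far from the threshold the false indication probability is near zero.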

  2. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  3. Imprecise probability for non-commuting observables

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.

    2015-08-01

    It is known that non-commuting observables in quantum mechanics do not have a joint probability. This statement refers to the precise (additive) probability model. I show that the joint distribution of any non-commuting pair of variables can be quantified via upper and lower probabilities, i.e. the joint probability is described by an interval instead of a number (imprecise probability). I propose transparent axioms from which the upper and lower probability operators follow. The imprecise probabilities depend on the non-commuting observables, are linear over the state (density matrix), and revert to the usual expression for commuting observables.

  4. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross

  5. Exploring the Overestimation of Conjunctive Probabilities

    PubMed Central

    Nilsson, Håkan; Rieskamp, Jörg; Jenny, Mirjam A.

    2013-01-01

    People often overestimate probabilities of conjunctive events. The authors explored whether the accuracy of conjunctive probability estimates can be improved by increased experience with relevant constituent events and by using memory aids. The first experiment showed that increased experience with constituent events increased the correlation between the estimated and the objective conjunctive probabilities, but that it did not reduce overestimation of conjunctive probabilities. The second experiment showed that reducing cognitive load with memory aids for the constituent probabilities led to improved estimates of the conjunctive probabilities and to decreased overestimation of conjunctive probabilities. To explain the cognitive process underlying people’s probability estimates, the configural weighted average model was tested against the normative multiplicative model. The configural weighted average model generates conjunctive probabilities that systematically overestimate objective probabilities although the generated probabilities still correlate strongly with the objective probabilities. For the majority of participants this model was better than the multiplicative model in predicting the probability estimates. However, when memory aids were provided, the predictive accuracy of the multiplicative model increased. In sum, memory tools can improve people’s conjunctive probability estimates. PMID:23460026
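    The two models compared in the abstract can be sketched side by side. The normative model multiplies the constituent probabilities (for independent events); the descriptive configural weighted average model averages them, weighting the lower constituent more heavily. The specific weight below is an illustrative assumption, not a fitted value from the paper:

```python
def multiplicative(p_a, p_b):
    # normative conjunction probability for independent events
    return p_a * p_b

def configural_weighted_average(p_a, p_b, w_low=0.6):
    """Descriptive model: a weighted average of the constituents, with
    the lower probability weighted more heavily (w_low is an assumed
    weight > 0.5).  An average always exceeds the product for
    probabilities in (0, 1), reproducing systematic overestimation."""
    lo, hi = min(p_a, p_b), max(p_a, p_b)
    return w_low * lo + (1.0 - w_low) * hi
```

For instance, with constituents 0.4 and 0.8 the weighted average gives 0.56 against the normative 0.32, overestimating the conjunction while still increasing with the objective probability, as the abstract describes.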

  6. Direct probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.

    1993-09-17

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.

  7. Trajectory versus probability density entropy.

    PubMed

    Bologna, M; Grigolini, P; Karagiorgis, M; Rosa, A

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy. PMID:11461383

  8. Trajectory versus probability density entropy

    NASA Astrophysics Data System (ADS)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  9. Probability distributions of turbulent energy.

    PubMed

    Momeni, Mahdi; Müller, Wolf-Christian

    2008-05-01

    Probability density functions (PDFs) of scale-dependent energy fluctuations, P[δE(l)], are studied in high-resolution direct numerical simulations of Navier-Stokes and incompressible magnetohydrodynamic (MHD) turbulence. MHD flows with and without a strong mean magnetic field are considered. For all three systems it is found that the PDFs of inertial-range energy fluctuations exhibit self-similarity and monoscaling in agreement with recent solar-wind measurements [Hnat, Geophys. Res. Lett. 29, 86 (2002)]. Furthermore, the energy PDFs exhibit similarity over all scales of the turbulent system, showing no substantial qualitative change of shape as the scale of the fluctuations varies. This is in contrast to the well-known behavior of PDFs of turbulent velocity fluctuations. In all three cases under consideration the P[δE(l)] resemble Lévy-type gamma distributions ∼ Δ^(-1) exp(-|δE|/Δ) |δE|^(-γ). The observed gamma distributions exhibit a scale-dependent width Δ(l) and a system-dependent γ. The monoscaling property reflects the inertial-range scaling of the Elsässer-field fluctuations due to lacking Galilei invariance of δE. The appearance of Lévy distributions is made plausible by a simple model of energy transfer. PMID:18643170

  10. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P BH(M ZAMS). Although we find that it is difficult to derive a unique P BH(M ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P BH(M ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P BH(M ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  11. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  12. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  13. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  14. Illustrating Basic Probability Calculations Using "Craps"

    ERIC Educational Resources Information Center

    Johnson, Roger W.

    2006-01-01

    Instructors may use the gambling game of craps to illustrate the use of a number of fundamental probability identities. For the "pass-line" bet we focus on the chance of winning and the expected game length. To compute these, probabilities of unions of disjoint events, probabilities of intersections of independent events, conditional probabilities…
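    The pass-line chance of winning mentioned above follows from exactly the identities listed: the come-out naturals are disjoint events, and each point is won with the conditional probability of re-rolling it before a 7. A short exact calculation (the function name is ours):

```python
from fractions import Fraction

def pass_line_win_probability():
    # distribution of the sum of two fair dice
    ways = {s: 0 for s in range(2, 13)}
    for d1 in range(1, 7):
        for d2 in range(1, 7):
            ways[d1 + d2] += 1
    p = {s: Fraction(n, 36) for s, n in ways.items()}

    win = p[7] + p[11]                    # come-out naturals: disjoint events
    for point in (4, 5, 6, 8, 9, 10):
        # P(roll the point, then re-roll it before a 7):
        # conditional probability of the point given {point, 7}
        win += p[point] * p[point] / (p[point] + p[7])
    return win
```

This evaluates to the well-known 244/495 ≈ 0.4929, slightly under one half, which also sets up the expected-game-length discussion.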

  15. Pre-Service Teachers' Conceptions of Probability

    ERIC Educational Resources Information Center

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  16. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  17. Subjective and objective probabilities in quantum mechanics

    SciTech Connect

    Srednicki, Mark

    2005-05-15

    We discuss how the apparently objective probabilities predicted by quantum mechanics can be treated in the framework of Bayesian probability theory, in which all probabilities are subjective. Our results are in accord with earlier work by Caves, Fuchs, and Schack, but our approach and emphasis are different. We also discuss the problem of choosing a noninformative prior for a density matrix.

  18. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  19. Bell Could Become the Copernicus of Probability

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.

  20. Experience matters: information acquisition optimizes probability gain.

    PubMed

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information (information gain, Kullback-Leibler distance, probability gain (error minimization), and impact) are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior. PMID:20525915
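    Probability gain, the winning measure in the abstract, is the expected improvement in the probability of a correct guess from asking a query. A minimal sketch of the standard definition (the data layout and function name are ours):

```python
def probability_gain(prior, likelihoods):
    """Expected improvement in the probability of guessing the true
    category correctly after observing the answer to a query.

    prior       : list of P(category)
    likelihoods : likelihoods[a][c] = P(answer a | category c)
    """
    best_before = max(prior)
    expected_after = 0.0
    for lik in likelihoods:
        joint = [l * p for l, p in zip(lik, prior)]
        p_answer = sum(joint)
        if p_answer > 0:
            # P(answer) * max_c P(category | answer) simplifies to the
            # maximum joint probability for that answer
            expected_after += max(joint)
    return expected_after - best_before
```

With prior (0.7, 0.3), a perfectly diagnostic query yields a gain of 0.3 (accuracy rises from 0.7 to 1.0), while an uninformative query yields a gain of 0; error minimization is the mirror image of this quantity.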

  1. Derivation of quantum probability from measurement

    NASA Astrophysics Data System (ADS)

    Herbut, Fedor

    2016-05-01

    To begin with, it is pointed out that the form of the quantum probability formula originates in the very initial state of the object system as seen when the state is expanded with the eigenprojectors of the measured observable. Making use of the probability reproducibility condition, which is a key concept in unitary measurement theory, one obtains the relevant coherent distribution of the complete-measurement results in the final unitary-measurement state in agreement with the mentioned probability formula. Treating the transition from the final unitary, or premeasurement, state, where all possible results are present, to one complete-measurement result sketchily in the usual way, the well-known probability formula is derived. In conclusion it is pointed out that the entire argument is only formal unless one makes it physical assuming that the quantum probability law is valid in the extreme case of probability-one (certain) events (projectors).

  2. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase-space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  3. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase-space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  4. Experience Matters: Information Acquisition Optimizes Probability Gain

    PubMed Central

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915

  5. Dynamic probability estimator for machine learning.

    PubMed

    Starzyk, Janusz A; Wang, Feng

    2004-03-01

    An efficient algorithm for dynamic estimation of probabilities, without division, on an unlimited number of input data is presented. The method estimates probabilities of the sampled data from the raw sample count, while keeping the total count value constant. Accuracy of the estimate depends on the counter size, rather than on the total number of data points. The estimator follows variations of the incoming data probability within a fixed window size, without explicit implementation of the windowing technique. The total design area is very small and all probabilities are estimated concurrently. The dynamic probability estimator was implemented using a programmable gate array from Xilinx. The performance of this implementation is evaluated in terms of area efficiency and execution time. This method is suitable for the highly integrated design of artificial neural networks, where a large number of dynamic probability estimators can work concurrently. PMID:15384523
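
    As a concrete illustration of the counter-based scheme described above, here is a minimal software sketch. It assumes one plausible design (per-symbol counters that are all halved whenever the total reaches a fixed cap, which bounds the counts and yields an implicit sliding window); the paper's exact hardware algorithm may differ, and the `DynamicProbEstimator` class and its `cap` parameter are illustrative names.

    ```python
    # Sketch of a bounded-counter probability estimator (assumed scheme, see
    # lead-in): accuracy is set by the counter cap, not by how much data has
    # been seen, and halving gives exponential forgetting of old data.

    class DynamicProbEstimator:
        def __init__(self, symbols, cap=256):
            self.counts = {s: 0 for s in symbols}
            self.cap = cap  # counter size determines estimate accuracy

        def update(self, symbol):
            self.counts[symbol] += 1
            if sum(self.counts.values()) >= self.cap:
                # Halve all counters: the total stays bounded and relative
                # frequencies are preserved to within rounding.
                for s in self.counts:
                    self.counts[s] >>= 1

        def probability(self, symbol):
            # Division appears here only for readout; the running estimate
            # itself is maintained purely with counts.
            total = sum(self.counts.values())
            return self.counts[symbol] / total if total else 0.0
    ```

    Halving discounts old observations exponentially, which is how a fixed-size counter can track a drifting source without an explicit window buffer.
    
    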

  6. Entropy analysis of systems exhibiting negative probabilities

    NASA Astrophysics Data System (ADS)

    Tenreiro Machado, J. A.

    2016-07-01

    This paper addresses the concept of negative probability and its impact upon entropy. An analogy between probability generating functions, in the scope of quasiprobability distributions, and the Grünwald-Letnikov definition of fractional derivatives is explored. Two distinct cases producing negative probabilities are formulated and their distinct meanings clarified. Numerical calculations using the Shannon entropy further characterize the two limit cases.

  7. Calculating the CEP (Circular Error Probable)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report compares the probability contained in the Circular Error Probable (CEP) associated with an Elliptical Error Probable (EEP) to that of the EEP at a given confidence level. The levels examined are 50 percent and 95 percent. The CEP is found to be both more conservative and less conservative than the associated EEP, depending on the eccentricity of the ellipse. The formulas used are derived in the appendix.
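
    The eccentricity dependence described above can be checked numerically. The sketch below uses the common approximation CEP50 ≈ 0.589(σx + σy) for the 50% circle radius (an assumption; the report derives its own formulas in an appendix not reproduced here) and estimates by Monte Carlo how much probability that circle actually contains for a given error ellipse.

    ```python
    # Monte Carlo coverage of a CEP circle over an elliptical (bivariate
    # normal) error distribution. As the ellipse grows more eccentric, the
    # approximate CEP circle's coverage drifts away from the nominal 50%.
    import random

    def cep_coverage(sigma_x, sigma_y, n=200_000, seed=1):
        rng = random.Random(seed)
        cep = 0.589 * (sigma_x + sigma_y)  # approximate 50% CEP radius
        hits = sum(
            rng.gauss(0, sigma_x) ** 2 + rng.gauss(0, sigma_y) ** 2 <= cep ** 2
            for _ in range(n)
        )
        return hits / n
    ```

    For a circular distribution (sigma_x == sigma_y) the coverage sits very close to 0.5; for strongly eccentric ellipses the same formula over- or under-covers, which is the report's point.
    
    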

  8. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), with w(0) = 0, w(1/e) = 1/e, and w(1) = 1, which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
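
    Prelec's function is simple to evaluate directly. A minimal sketch (the `alpha=0.65` default is an illustrative value, not one from the study):

    ```python
    # Prelec's probability weighting function w(p) = exp(-(-ln p)**alpha),
    # 0 < alpha < 1: it overweights small probabilities, underweights large
    # ones, and has fixed points at p = 0, p = 1/e, and p = 1.
    import math

    def prelec_w(p, alpha=0.65):
        if p <= 0.0:
            return 0.0          # limit as p -> 0+
        if p >= 1.0:
            return 1.0
        return math.exp(-((-math.log(p)) ** alpha))
    ```

    The characteristic inverse-S shape follows immediately: prelec_w(0.01) exceeds 0.01 while prelec_w(0.9) falls below 0.9.
    
    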

  9. Predicting accurate probabilities with a ranking loss

    PubMed Central

    Menon, Aditya Krishna; Jiang, Xiaoqian J; Vembu, Shankar; Elkan, Charles; Ohno-Machado, Lucila

    2013-01-01

    In many real-world applications of machine learning classifiers, it is essential to predict the probability of an example belonging to a particular class. This paper proposes a simple technique for predicting probabilities based on optimizing a ranking loss, followed by isotonic regression. This semi-parametric technique offers both good ranking and regression performance, and models a richer set of probability distributions than statistical workhorses such as logistic regression. We provide experimental results that show the effectiveness of this technique on real-world applications of probability prediction. PMID:25285328
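
    The second stage of the technique described above, isotonic regression, can be computed with the classic pool-adjacent-violators (PAV) algorithm. The sketch below is a minimal pure-Python PAV, not the authors' code: given 0/1 outcomes sorted by increasing ranking score, it returns the best nondecreasing fit, which serves as the calibrated probability for each score.

    ```python
    # Pool-Adjacent-Violators (PAV) isotonic regression: merges adjacent
    # blocks whenever their means violate monotonicity, then emits each
    # block's mean as the calibrated probability.

    def isotonic_fit(labels):
        """labels: 0/1 outcomes ordered by increasing ranking score.
        Returns one nondecreasing calibrated probability per input."""
        blocks = []  # each block is [sum_of_labels, count]
        for y in labels:
            blocks.append([float(y), 1])
            while len(blocks) > 1 and (
                blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]
            ):
                s, c = blocks.pop()
                blocks[-1][0] += s
                blocks[-1][1] += c
        out = []
        for s, c in blocks:
            out.extend([s / c] * c)
        return out
    ```

    For example, the label sequence [0, 1, 0, 1, 1] pools the middle violation into a block of mean 0.5, yielding [0.0, 0.5, 0.5, 1.0, 1.0].
    
    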

  10. Probability: A Matter of Life and Death

    ERIC Educational Resources Information Center

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  11. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  12. Teaching Probability: A Socio-Constructivist Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  13. Teaching Statistics and Probability: 1981 Yearbook.

    ERIC Educational Resources Information Center

    Shulte, Albert P., Ed.; Smart, James R., Ed.

    This 1981 yearbook of the National Council of Teachers of Mathematics (NCTM) offers classroom ideas for teaching statistics and probability, viewed as important topics in the school mathematics curriculum. Statistics and probability are seen as appropriate because they: (1) provide meaningful applications of mathematics at all levels; (2) provide…

  14. Phonotactic Probabilities in Young Children's Speech Production

    ERIC Educational Resources Information Center

    Zamuner, Tania S.; Gerken, Louann; Hammond, Michael

    2004-01-01

    This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…

  15. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  16. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  17. Correlation as Probability of Common Descent.

    ERIC Educational Resources Information Center

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…

  18. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  19. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  20. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a…

  1. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
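
    The calculation methods compared in such texts can be cross-checked against each other. The sketch below uses hypothetical urn numbers (5 red, 3 blue, two draws) and verifies that the sequential conditional product agrees with exhaustive enumeration of ordered draws:

    ```python
    # Sampling without replacement: P(both draws red) computed two ways.
    from fractions import Fraction
    from itertools import permutations

    balls = ["R"] * 5 + ["B"] * 3

    # Method 1: conditional product P(R1) * P(R2 | R1)
    p_product = Fraction(5, 8) * Fraction(4, 7)

    # Method 2: enumeration over all ordered pairs of distinct balls
    pairs = list(permutations(range(len(balls)), 2))
    favorable = sum(balls[i] == "R" and balls[j] == "R" for i, j in pairs)
    p_enum = Fraction(favorable, len(pairs))

    assert p_product == p_enum == Fraction(5, 14)
    ```

    Exact rational arithmetic (`Fraction`) avoids the rounding questions that muddy such comparisons in elementary courses.
    
    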

  2. Average Transmission Probability of a Random Stack

    ERIC Educational Resources Information Center

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  3. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  4. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  5. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  6. Quantum probability assignment limited by relativistic causality.

    PubMed

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein, but these were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment called the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule; as a result, the amount of nonlocality in the quantum correlation will change. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  7. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
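
    The per-mechanism structure described above can be sketched in a few lines. Each release mechanism contributes (expected viable microbes released) times (probability of growth); for rare events the expected number of growing microbes approximates the contamination probability. All numbers and mechanism names below are illustrative placeholders, not values from the study.

    ```python
    # Sketch of the extended Sagan-Coleman estimate: contamination
    # probability summed over release mechanisms (hypothetical inputs).

    mechanisms = {
        # name: (expected_microbes_released, probability_of_growth)
        "surface_erosion": (1e2, 1e-8),
        "hard_impact":     (1e5, 1e-7),
        "aborted_landing": (1e3, 1e-6),
    }

    # Small-probability approximation: sum of per-mechanism expected
    # numbers of growing microbes.
    p_contamination = sum(n * pg for n, pg in mechanisms.values())
    ```

    With these placeholder inputs the estimate is dominated by the mechanism with the largest release-times-growth product, which is the kind of sensitivity the detailed methodology is meant to expose.
    
    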

  8. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  9. Quantum probability assignment limited by relativistic causality

    PubMed Central

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein, but these were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment called the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule; as a result, the amount of nonlocality in the quantum correlation will change. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  10. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA)  =  0.25g, probabilities range from 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  11. Survival probability in patients with liver trauma.

    PubMed

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma. PMID:27477933
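
    To make the logistic-model readout concrete, here is a minimal sketch of how a fitted model turns patient covariates into a survival probability. The coefficients and covariate encoding below are hypothetical placeholders, not the paper's estimates.

    ```python
    # Logistic model readout: survival probability = 1 / (1 + exp(-z)),
    # where z is the linear predictor from intercept and coefficients.
    import math

    def survival_probability(coeffs, intercept, x):
        z = intercept + sum(b * v for b, v in zip(coeffs, x))
        return 1.0 / (1.0 + math.exp(-z))

    # Hypothetical covariates: [injury_grade, other_organs_injured,
    #                           days_hospitalized, conservative_treatment]
    beta = [-0.9, -0.6, 0.05, 0.4]  # hypothetical coefficients
    p = survival_probability(beta, 2.0, [3, 1, 10, 1])
    ```

    The sign and size of each coefficient encode exactly the kind of statement the abstract makes: which conditions and treatment choices raise or lower a patient's survival chances.
    
    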

  12. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  13. Semigroups of tomographic probabilities and quantum correlations

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.

    2008-08-01

    Semigroups of stochastic and bistochastic matrices constructed by means of spin tomograms or tomographic probabilities and their relations to the problem of Bell's inequalities and entanglement are reviewed. The probability determining the quantum state of spins and the probability densities determining the quantum states of particles with continuous variables are considered. Entropies for semigroups of stochastic and bisctochastic matrices are studied, in view of both the Shannon information entropy and its generalization like Rényi entropy. Qubit portraits of qudit states are discussed in the connection with the problem of Bell's inequality violation for entangled states.

  14. Probability distributions for a surjective unimodal map

    NASA Astrophysics Data System (ADS)

    Sun, Hongyan; Wang, Long

    1996-04-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types (δ-function, asymmetric, and symmetric) by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent, or prior, to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel normal number as the initial value.
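
    The symmetric and δ-function cases are easy to see numerically with the surjective tent map x → 1 - |1 - 2x|, whose invariant density on [0, 1] is uniform. The sketch below pushes a uniform ensemble through a few iterations (single long orbits are deliberately avoided: in floating point the map's bit-shifting eventually drives an orbit to 0) and also shows a dyadic-rational initial value collapsing to a δ-type distribution. This is an illustration of the classification above, not the paper's analysis.

    ```python
    # Surjective tent map: uniform ensembles stay (approximately) uniform,
    # while dyadic-rational initial values collapse to a point mass at 0.
    import random

    def tent(x):
        return 1.0 - abs(1.0 - 2.0 * x)

    # Symmetric type: push-forward of a uniform ensemble
    rng = random.Random(42)
    ensemble = [rng.random() for _ in range(50_000)]
    for _ in range(8):  # a few iterations only; see lead-in
        ensemble = [tent(x) for x in ensemble]
    mean = sum(ensemble) / len(ensemble)  # ~0.5 for a uniform density

    # Delta-function type: a dyadic initial value 0.5 -> 1.0 -> 0.0 -> 0.0
    orbit = [0.5]
    for _ in range(3):
        orbit.append(tent(orbit[-1]))
    ```

    The binary structure of the initial value decides the outcome, exactly as the classification in the abstract states.
    
    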

  15. Neutron initiation probability in fast burst reactor

    SciTech Connect

    Liu, X.; Du, J.; Xie, Q.; Fan, X.

    2012-07-01

    Based on the probability balance of neutron random events in a multiplying system, the four random processes of neutrons in a prompt supercritical system are described, and the equation for the neutron initiation probability W(r,E,Ω,t) is deduced. Under the assumptions of a static, slightly prompt supercritical system and the two-factorial approximation, the formula for the average initiation probability of one neutron is derived, which is the same as the result derived from the point model. MC simulation using the point model is applied to Godiva-II and CFBR-II, and the simulated one-neutron initiation probability is well consistent with the theory: the initiation probabilities of the Godiva-II and CFBR-II burst reactors are 0.00032 and 0.00027, respectively, in ordinary burst operation. (authors)

  16. A Survey of Tables of Probability Distributions

    PubMed Central

    Kacker, Raghu; Olkin, Ingram

    2005-01-01

    This article is a survey of the tables of probability distributions published about or after the publication in 1964 of the Handbook of Mathematical Functions, edited by Abramowitz and Stegun PMID:27308104

  17. Characteristic length of the knotting probability revisited

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-09-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/NK), where the estimates of parameter NK are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius rex, i.e. the screening length of double-stranded DNA.

  18. Inclusion probability with dropout: an operational formula.

    PubMed

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications. PMID:25559642
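
    For orientation, the classic no-dropout version of the computation reads as follows: at each locus the probability of inclusion is the squared sum of the frequencies of the alleles visible in the mixture, and loci combine by multiplication into the cumulative PI. The dropout correction derived in the paper is not reproduced here, and the allele frequencies below are hypothetical.

    ```python
    # Classic RMNE / probability-of-inclusion computation (no dropout):
    # per-locus PI = (sum of visible allele frequencies)^2, multiplied
    # across loci to give the cumulative PI.

    def locus_pi(visible_allele_freqs):
        p = sum(visible_allele_freqs)
        return p * p

    def cumulative_pi(loci):
        total = 1.0
        for freqs in loci:
            total *= locus_pi(freqs)
        return total

    loci = [[0.10, 0.20, 0.05], [0.15, 0.30]]  # hypothetical frequencies
    pi = cumulative_pi(loci)
    ```

    Each additional locus multiplies in a factor below 1, so the cumulative PI shrinks, which is why multi-locus profiles are so much more discriminating.
    
    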

  19. Determining Probabilities by Examining Underlying Structure.

    ERIC Educational Resources Information Center

    Norton, Robert M.

    2001-01-01

    Discusses how dice games pose fairness issues that appeal to students and examines a structure for three games involving two dice in a way that leads directly to the theoretical probabilities for all possible outcomes. (YDS)

  20. Probability tree algorithm for general diffusion processes

    NASA Astrophysics Data System (ADS)

    Ingber, Lester; Chen, Colleen; Mondescu, Radu Paul; Muzzall, David; Renedo, Marco

    2001-11-01

    Motivated by path-integral numerical solutions of diffusion processes, PATHINT, we present a tree algorithm, PATHTREE, which permits extremely fast accurate computation of probability distributions of a large class of general nonlinear diffusion processes.

  1. Transition Probability and the ESR Experiment

    ERIC Educational Resources Information Center

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  2. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrent relations is used to obtain the convergent probability of the random walk with different initial positions.
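
    The method of recurrence relations mentioned above can be sketched for the simplest case, a walk on {0, ..., N} absorbed at both ends: the probability h(i) of reaching N before 0 from position i satisfies h(i) = p·h(i+1) + q·h(i-1) with h(0) = 0 and h(N) = 1. The sketch below solves the recurrence by fixed-point iteration (the note's card-based walk is more elaborate; parameters here are illustrative).

    ```python
    # Convergent (absorption) probabilities of a simple random walk by the
    # method of recurrence relations, solved via fixed-point sweeps.

    def reach_top_probability(N, p, iters=5000):
        q = 1.0 - p
        h = [i / N for i in range(N + 1)]  # h[0] = 0 and h[N] = 1 are fixed
        for _ in range(iters):
            for i in range(1, N):
                h[i] = p * h[i + 1] + q * h[i - 1]
        return h
    ```

    For p = q = 1/2 the solution is linear, h(i) = i/N; for biased walks it matches the closed form h(i) = (1 - (q/p)^i) / (1 - (q/p)^N).
    
    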

  3. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. PMID:26478959

  4. Non-Gaussian Photon Probability Distribution

    NASA Astrophysics Data System (ADS)

    Solomon, Benjamin T.

    2010-01-01

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma mΓ distribution (whose parameters are α = r, β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function, and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of the photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub-wavelength confinement; thereby providing a strong case that

  5. Robust satisficing and the probability of survival

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2014-01-01

    Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.

  6. The spline probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Sithiravel, Rajiv; Tharmarasa, Ratnasingham; McDonald, Mike; Pelletier, Michel; Kirubarajan, Thiagalingam

    2012-06-01

    The Probability Hypothesis Density Filter (PHD) is a multitarget tracker for recursively estimating the number of targets and their state vectors from a set of observations. The PHD filter is capable of working well in scenarios with false alarms and missed detections. Two distinct PHD filter implementations are available in the literature: the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) and the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filters. The SMC-PHD filter uses particles to provide target state estimates, which can lead to a high computational load, whereas the GM-PHD filter does not use particles, but restricts to linear Gaussian mixture models. The SMC-PHD filter technique provides only weighted samples at discrete points in the state space instead of a continuous estimate of the probability density function of the system state and thus suffers from the well-known degeneracy problem. This paper proposes a B-Spline based Probability Hypothesis Density (S-PHD) filter, which has the capability to model any arbitrary probability density function. The resulting algorithm can handle linear, non-linear, Gaussian, and non-Gaussian models and the S-PHD filter can also provide continuous estimates of the probability density function of the system state. In addition, by moving the knots dynamically, the S-PHD filter ensures that the splines cover only the region where the probability of the system state is significant, hence the high efficiency of the S-PHD filter is maintained at all times. Also, unlike the SMC-PHD filter, the S-PHD filter is immune to the degeneracy problem due to its continuous nature. The S-PHD filter derivations and simulations are provided in this paper.

  7. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
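
As a sketch of the likelihood's general form, the following computes one site's zero-inflated binomial mixture contribution, assuming a Beta mixing distribution for p (Royle considers several mixture classes; the Beta choice and parameter names here are illustrative):

```python
import math

def site_likelihood(y, J, psi, a, b, grid=2000):
    """Likelihood contribution of one site with y detections in J visits,
    under a zero-inflated binomial mixture: psi is the occupancy probability
    and the detection probability p is mixed over Beta(a, b)."""
    comb = math.comb(J, y)
    beta_norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    total = 0.0
    for i in range(grid):                        # midpoint-rule integration over p
        p = (i + 0.5) / grid
        dens = p ** (a - 1) * (1.0 - p) ** (b - 1) / beta_norm
        total += comb * p ** y * (1.0 - p) ** (J - y) * dens / grid
    # zero-inflation: an unoccupied site (prob 1 - psi) yields y = 0 for certain
    return psi * total + (1.0 - psi) * (1.0 if y == 0 else 0.0)
```

Multiplying such terms across sites gives the integrated likelihood on which inference proceeds.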

  8. The cumulative reaction probability as eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Manthe, Uwe; Miller, William H.

    1993-09-01

    It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = ∑_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation as probabilities, eigenreaction probabilities which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than that of previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems—transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0.

  9. Familiarity and preference for pitch probability profiles.

    PubMed

    Cui, Anja-Xiaoxing; Collett, Meghan J; Troje, Niko F; Cuddy, Lola L

    2015-05-01

    We investigated familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, we asked participants to identify either the more familiar or the preferred tone sequence in a two-alternative forced-choice task. The task paired a tone sequence generated from the pitch probability profile they had been exposed to and a tone sequence generated from another pitch probability profile at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile which they had been exposed to. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference to be consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile which they had been exposed to suggests that the statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge. PMID:25838257

  10. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  11. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors, in the case of DNA where N is 4 and the components of the probability vector are the frequency of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method of iterating Newton's method on solutions of a two point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non

  12. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
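
A minimal sketch of the count-to-probability conversion described above, assuming a Weibull law in the small-event count (the parameter names and values below are ours, not the authors'; in practice they would be fit to the regional frequency-size statistics):

```python
import math

def large_event_probability(n_small, scale, shape):
    """Convert the number of small events observed since the last large
    event into a conditional probability of the next large event, via a
    Weibull law in the count variable n_small."""
    return 1.0 - math.exp(-((n_small / scale) ** shape))
```

The probability starts at zero immediately after a large event and rises monotonically as small events accumulate, which is the qualitative behavior the method relies on.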

  13. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. PMID:22609725

  14. Effects of Neutrino Decay on Oscillation Probabilities

    NASA Astrophysics Data System (ADS)

    Leonard, Kayla; de Gouvêa, André

    2016-01-01

    It is now well accepted that neutrinos oscillate as a quantum mechanical result of a misalignment between their mass-eigenstates and the flavor-eigenstates. We study neutrino decay—the idea that there may be new, light states that the three Standard Model flavors may be able to decay into. We consider what effects this neutrino decay would have on the observed oscillation probabilities. The Hamiltonian governs how the states change with time, so we use it to calculate an oscillation amplitude, and from that, the oscillation probability. We simplify the theoretical probabilities using results from experimental data, such as the neutrino mixing angles and mass differences. By exploring what values of the decay parameters are physically allowable, we can begin to understand just how large the decay parameters can be. We compare the probabilities in the case of no neutrino decay and in the case of maximum neutrino decay to determine how much of an effect neutrino decay could have on observations, and discuss the ability of future experiments to detect these differences. We also examine neutrino decay in the realm of CP invariance, and find that it is a new source of CP violation. Our work indicates that there is a difference in the oscillation probabilities between particle transitions and their corresponding antiparticle transitions. If neutrino decay were proven true, it could be an important factor in understanding leptogenesis and the particle-antiparticle asymmetry present in our Universe.

  15. Laboratory-tutorial activities for teaching probability

    NASA Astrophysics Data System (ADS)

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-12-01

    We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  16. Reconstructing the prior probabilities of allelic phylogenies.

    PubMed Central

    Golding, G Brian

    2002-01-01

    In general when a phylogeny is reconstructed from DNA or protein sequence data, it makes use only of the probabilities of obtaining some phylogeny given a collection of data. It is also possible to determine the prior probabilities of different phylogenies. This information can be of use in analyzing the biological causes for the observed divergence of sampled taxa. Unusually "rare" topologies for a given data set may be indicative of different biological forces acting. A recursive algorithm is presented that calculates the prior probabilities of a phylogeny for different allelic samples and for different phylogenies. This method is a straightforward extension of Ewens' sample distribution. The probability of obtaining each possible sample according to Ewens' distribution is further subdivided into each of the possible phylogenetic topologies. These probabilities depend not only on the identity of the alleles and on 4Nμ (four times the effective population size times the neutral mutation rate) but also on the phylogenetic relationships among the alleles. Illustrations of the algorithm are given to demonstrate how different phylogenies are favored under different conditions. PMID:12072482
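
The Ewens sample distribution that the algorithm subdivides can be computed directly. A minimal sketch (the standard Ewens sampling formula only, not the paper's recursive subdivision across topologies):

```python
import math

def ewens_probability(counts, theta):
    """Ewens sampling formula: probability of the allelic configuration in
    which counts[j-1] allele types each appear exactly j times in the
    sample, with theta = 4*N*mu."""
    n = sum(j * a for j, a in enumerate(counts, start=1))   # sample size
    rising = math.prod(theta + i for i in range(n))         # rising factorial theta^(n)
    prob = math.factorial(n) / rising
    for j, a in enumerate(counts, start=1):
        prob *= theta ** a / (j ** a * math.factorial(a))
    return prob
```

For a sample of two genes, the two possible configurations (two distinct alleles, or one allele seen twice) have probabilities θ/(θ+1) and 1/(θ+1), which sum to one.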

  17. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
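
A toy sketch of the conditional-sampling idea, using our own illustrative failure set rather than the authors' application: the failure event is enclosed in a bounding set whose probability is known analytically, and all samples are drawn conditionally inside that set.

```python
import random, statistics

def failure_probability(n_samples, seed=0):
    """Estimate P(F) for the toy failure set F = {x1 > 2 and x1 + x2 > 4.5},
    with x1, x2 independent standard normals. The bounding set B = {x1 > 2}
    has analytic probability P(B); sampling only inside B gives
    P(F) = P(B) * P(F | B) without wasting samples outside B."""
    rng = random.Random(seed)
    nd = statistics.NormalDist()
    cdf2 = nd.cdf(2.0)
    p_b = 1.0 - cdf2                              # analytic probability of B
    hits = 0
    for _ in range(n_samples):
        u = cdf2 + (1.0 - cdf2) * rng.random()    # uniform on (cdf2, 1)
        x1 = nd.inv_cdf(u)                        # x1 drawn conditionally on x1 > 2
        x2 = rng.gauss(0.0, 1.0)
        if x1 + x2 > 4.5:                         # failure criterion, checked inside B
            hits += 1
    return p_b * hits / n_samples
```

A crude Monte Carlo estimate of the same probability would need far more samples, since almost all unconditional draws fall outside the bounding set.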

  18. Sampling Quantum Nonlocal Correlations with High Probability

    NASA Astrophysics Data System (ADS)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩)_{i,j=1}^n, where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the previous vectors are sampled according to the Haar measure in the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.
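
A minimal sketch of the sampling model (our own code, not the authors'): Haar-uniform unit vectors in R^m can be obtained by normalizing standard Gaussian vectors, and γ is then their matrix of inner products.

```python
import math, random

def sample_correlation_matrix(n, m, seed=0):
    """Sample gamma_ij = <u_i, v_j> with u_i, v_j Haar-uniform on the unit
    sphere of R^m. Normalizing a standard Gaussian vector yields the
    Haar (rotation-invariant) distribution on the sphere."""
    rng = random.Random(seed)

    def unit_vector():
        v = [rng.gauss(0.0, 1.0) for _ in range(m)]
        norm = math.sqrt(sum(x * x for x in v))
        return [x / norm for x in v]

    us = [unit_vector() for _ in range(n)]
    vs = [unit_vector() for _ in range(n)]
    return [[sum(a * b for a, b in zip(u, v)) for v in vs] for u in us]
```

Every entry lies in [-1, 1] by Cauchy-Schwarz, so each sample is a valid quantum correlation matrix of the stated form; the paper's question is for which α = m/n it is typically nonlocal.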

  19. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z²: x + y even, 0 ≤ y ≤ n, −y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and a site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, characterized by a critical exponent β_g. An analysis based on the Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.

  20. Classical and Quantum Probability for Biologists - Introduction

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei.

    2010-01-01

    The aim of this review (oriented to biologists looking for applications of QM) is to provide a detailed comparative analysis of classical (Kolmogorovian) and quantum (Dirac-von Neumann) models. We will stress differences in the definition of conditional probability and as a consequence in the structures of matrices of transition probabilities, especially the condition of double stochasticity which arises naturally in QM. One of the most fundamental differences between two models is deformation of the classical formula of total probability (FTP) which plays an important role in statistics and decision making. An additional term appears in the QM-version of FTP - so called interference term. Finally, we discuss Bell's inequality and show that the common viewpoint that its violation induces either nonlocality or "death of realism" has not been completely justified. For us it is merely a sign of non-Kolmogorovianity of probabilistic data collected in a few experiments with incompatible setups of measurement devices.
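
The deformation of the classical formula of total probability (FTP) mentioned above can be written explicitly for a dichotomous condition A (this is the standard quantum-like FTP in Khrennikov's framework, with θ the interference phase):

```latex
% Classical FTP for a dichotomous condition A:
P(B) = P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2)

% Quantum-like FTP: an additional interference term appears,
P(B) = P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2)
     + 2\cos\theta\,\sqrt{P(A_1)\,P(B \mid A_1)\,P(A_2)\,P(B \mid A_2)}
```

When cos θ = 0 the classical formula is recovered; a nonzero interference term signals non-Kolmogorovian probabilistic data.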

  1. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (pd) for fluctuating and non-fluctuating targets is deduced. A comparison of the pd for the EBPSK-MODEM system and a pulse radar receiver is also made, and some results are plotted. Moreover, the probability curves of the system with several modulation parameters are analysed. When the modulation parameter is at least 6, the detection performance of the EBPSK-MODEM system exceeds that of the traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.

  2. Pointwise probability reinforcements for robust statistical inference.

    PubMed

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. PMID:24300550

  3. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  4. Monte Carlo simulation of scenario probability distributions

    SciTech Connect

    Glaser, R.

    1996-10-23

    Suppose a scenario of interest can be represented as a series of events. A final result R may be viewed then as the intersection of three events, A, B, and C. The probability of the result P(R) in this case is the product P(R) = P(A) P(B | A) P(C | A ∩ B). An expert may be reluctant to estimate P(R) as a whole yet agree to supply his notions of the component probabilities in the form of prior distributions. Each component prior distribution may be viewed as the stochastic characterization of the expert's uncertainty regarding the true value of the component probability. Mathematically, the component probabilities are treated as independent random variables and P(R) as their product; the induced prior distribution for P(R) is determined which characterizes the expert's uncertainty regarding P(R). It may be both convenient and adequate to approximate the desired distribution by Monte Carlo simulation. Software has been written for this task that allows a variety of component priors that experts with good engineering judgment might feel comfortable with. The priors are mostly based on so-called likelihood classes. The software permits an expert to choose for a given component event probability one of six types of prior distributions, and the expert specifies the parameter value(s) for that prior. Each prior is unimodal. The expert essentially decides where the mode is, how the probability is distributed in the vicinity of the mode, and how rapidly it attenuates away. Limiting and degenerate applications allow the expert to be vague or precise.
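
A minimal sketch of this simulation approach (the Beta priors below are our illustrative stand-ins for the six prior types the software offers; the shape parameters are arbitrary):

```python
import random

def simulate_result_prior(n_draws, seed=0):
    """Monte Carlo construction of the induced prior on
    P(R) = P(A) * P(B|A) * P(C|A n B): each component probability is drawn
    from its own prior distribution and the draws are multiplied."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        p_a = rng.betavariate(8, 2)   # illustrative prior on P(A)
        p_b = rng.betavariate(5, 5)   # illustrative prior on P(B | A)
        p_c = rng.betavariate(2, 8)   # illustrative prior on P(C | A n B)
        draws.append(p_a * p_b * p_c)
    draws.sort()
    return draws   # sorted sample from the induced prior; read off quantiles
```

By independence, the mean of the induced prior is the product of the component means (here 0.8 × 0.5 × 0.2 = 0.08), and the sorted sample characterizes the expert's overall uncertainty about P(R).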

  5. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. PMID:26621989

  6. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property known for the two-qubit state in terms of specific inequalities for the correlation function is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single-qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form, with an intertwining kernel calculated explicitly in tomographic probability terms.

  7. Practical algorithmic probability: an image inpainting example

    NASA Astrophysics Data System (ADS)

    Potapov, Alexey; Scherbakov, Oleg; Zhdanov, Innokentii

    2013-12-01

    The possibility of practical application of algorithmic probability is analyzed using the example of an image inpainting problem that corresponds precisely to the prediction problem. Such consideration is fruitful both for the theory of universal prediction and for practical image inpainting methods. Efficient application of algorithmic probability implies that its computation is essentially optimized for some specific data representation. In this paper, we considered one image representation, namely the spectral representation, for which an image inpainting algorithm is proposed based on the spectrum entropy criterion. This algorithm showed promising results in spite of the very simple representation. The same approach can be used to introduce an ALP-based criterion for more powerful image representations.

  8. Flood frequency: expected and unexpected probabilities

    USGS Publications Warehouse

    Thomas, D.M.

    1976-01-01

    Flood-frequency curves may be defined either with or without an 'expected probability' adjustment, and the two curves differ in the way that they attempt to average the time-sampling uncertainties. A curve with no adjustment is shown to estimate a median value of both discharge and frequency of occurrence, while an expected probability curve is shown to estimate a mean frequency of flood years. The attributes and constraints of the two types of curves for various uses are discussed.

  9. Electric quadrupole transition probabilities for atomic lithium

    SciTech Connect

    Çelik, Gültekin; Gökçe, Yasin; Yıldız, Murat

    2014-05-15

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  10. Non-Gaussian Photon Probability Distribution

    SciTech Connect

    Solomon, Benjamin T.

    2010-01-28

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma (mGAMMA) distribution (whose parameters are alpha = r and beta = r/sqrt(u)) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact P_i, the probabilistic function, and the ability to interact A_i, the electromagnetic function. Splitting the probability function P_i from the electromagnetic function A_i enables the investigation of the photon behavior from a purely probabilistic P_i perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function P_i and the ability to interact A_i, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon P_i of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al.'s (2006) microwave cloaking, and Oulton et al.'s (2008) sub

  11. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    NASA Astrophysics Data System (ADS)

    Vourdas, A.

    2014-08-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2), to the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  12. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    SciTech Connect

    Vourdas, A.

    2014-08-15

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2), to the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  13. Quantum temporal probabilities in tunneling systems

    NASA Astrophysics Data System (ADS)

    Anastopoulos, Charis; Savvidou, Ntina

    2013-09-01

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines 'classical' time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems.

  14. Probability in Action: The Red Traffic Light

    ERIC Educational Resources Information Center

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  15. Simplicity and Probability in Causal Explanation

    ERIC Educational Resources Information Center

    Lombrozo, Tania

    2007-01-01

    What makes some explanations better than others? This paper explores the roles of simplicity and probability in evaluating competing causal explanations. Four experiments investigate the hypothesis that simpler explanations are judged both better and more likely to be true. In all experiments, simplicity is quantified as the number of causes…

  16. Exploring Concepts in Probability: Using Graphics Calculators

    ERIC Educational Resources Information Center

    Ghosh, Jonaki

    2004-01-01

    This article describes a project in which certain key concepts in probability were explored using graphics calculators with year 10 students. The lessons were conducted in the regular classroom where students were provided with a Casio CFX 9850 GB PLUS graphics calculator with which they were familiar from year 9. The participants in the…

  17. The Smart Potential behind Probability Matching

    ERIC Educational Resources Information Center

    Gaissmaier, Wolfgang; Schooler, Lael J.

    2008-01-01

    Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…

  18. Monte Carlo, Probability, Algebra, and Pi.

    ERIC Educational Resources Information Center

    Hinders, Duane C.

    1981-01-01

    The uses of random number generators are illustrated in three ways: (1) the solution of a probability problem using a coin; (2) the solution of a system of simultaneous linear equations using a die; and (3) the approximation of pi using darts. (MP)
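
The third illustration above, approximating pi with darts, is a classic Monte Carlo exercise. A minimal present-day sketch (the sample size and seed are arbitrary choices, not from the article):

```python
import random

random.seed(1)
n = 1_000_000

# Throw "darts" uniformly at the unit square; count hits inside the
# quarter circle of radius 1. The hit fraction estimates pi/4.
hits = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))
pi_est = 4 * hits / n
print(pi_est)
```

With a million darts the estimate is typically within a few thousandths of pi.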

  19. Probability & Statistics: Modular Learning Exercises. Student Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  20. Technique for Evaluating Multiple Probability Occurrences /TEMPO/

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1970-01-01

    Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.

  1. Spatial Probability Cuing and Right Hemisphere Damage

    ERIC Educational Resources Information Center

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  2. Assessing Schematic Knowledge of Introductory Probability Theory

    ERIC Educational Resources Information Center

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  3. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  4. Phonotactic Probability Effects in Children Who Stutter

    ERIC Educational Resources Information Center

    Anderson, Julie D.; Byrd, Courtney T.

    2008-01-01

    Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…

  5. Estimating the Probability of Negative Events

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  6. Large Deviations: Advanced Probability for Undergrads

    ERIC Educational Resources Information Center

    Rolls, David A.

    2007-01-01

    In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…

  7. Probability & Perception: The Representativeness Heuristic in Action

    ERIC Educational Resources Information Center

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
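
The die-rolling activity can also be run in software rather than Excel. This hypothetical sketch tallies two specific ordered sequences, showing they occur about equally often (≈ n/36 each), even though a repeated pair may "feel" rarer than a mixed one:

```python
import random
from collections import Counter

random.seed(0)
n = 60_000

# Roll a die twice, n times, and tally each ordered sequence of outcomes.
counts = Counter((random.randint(1, 6), random.randint(1, 6)) for _ in range(n))

# Every specific ordered sequence has probability 1/36, so each tally ≈ n/36.
print(counts[(6, 6)], counts[(6, 3)])
```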

  8. Conceptual Variation and Coordination in Probability Reasoning

    ERIC Educational Resources Information Center

    Nilsson, Per

    2009-01-01

    This study investigates students' conceptual variation and coordination among theoretical and experimental interpretations of probability. In the analysis we follow how Swedish students (12-13 years old) interact with a dice game, specifically designed to offer the students opportunities to elaborate on the logic of sample space,…

  9. Teaching Mathematics with Technology: Probability Simulations.

    ERIC Educational Resources Information Center

    Bright, George W.

    1989-01-01

    Discussed are the use of probability simulations in a mathematics classroom. Computer simulations using regular dice and special dice are described. Sample programs used to generate 100 rolls of a pair of dice in BASIC and Logo languages are provided. (YP)
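
The BASIC and Logo programs themselves are not reproduced in this record; a present-day equivalent generating 100 rolls of a pair of dice (with a hypothetical text histogram of the totals) might look like:

```python
import random

random.seed(7)

# 100 rolls of a pair of dice, then a text histogram of the totals 2..12.
rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100)]
for total in range(2, 13):
    print(f"{total:2d}: {'*' * rolls.count(total)}")
```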

  10. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  11. Confusion between Odds and Probability, a Pandemic?

    ERIC Educational Resources Information Center

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  12. Posterior Probabilities for a Consensus Ordering.

    ERIC Educational Resources Information Center

    Fligner, Michael A.; Verducci, Joseph S.

    1990-01-01

    The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)

  13. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.

  14. Learning a Probability Distribution Efficiently and Reliably

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
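
The record does not give the CDF-Inversion Algorithm's details; the sketch below only illustrates the general idea, under the assumption that it resembles estimating an empirical distribution over a finite set and then sampling from it by inverting the cumulative distribution. All names and values are illustrative:

```python
import random
from bisect import bisect_left
from collections import Counter
from itertools import accumulate

random.seed(3)
true_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # unknown target distribution

# Learn: estimate the distribution over the finite set from samples.
data = random.choices(list(true_probs), weights=true_probs.values(), k=10_000)
est = {x: c / len(data) for x, c in sorted(Counter(data).items())}

# Invert: draw from the learned distribution via its cumulative CDF.
items = list(est)
cdf = list(accumulate(est.values()))

def draw():
    r = random.random()
    return items[min(bisect_left(cdf, r), len(items) - 1)]

print(est)
```

Accuracy and confidence for the learned estimate would, in the paper's setting, be controlled by choosing the sample size k accordingly.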

  15. Five-Parameter Bivariate Probability Distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.

  16. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  17. On the bound of first excursion probability

    NASA Technical Reports Server (NTRS)

    Yang, J. N.

    1969-01-01

    Method has been developed to improve the lower bound of the first excursion probability that can apply to the problem with either constant or time-dependent barriers. The method requires knowledge of the joint density function of the random process at two arbitrary instants.

  18. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  19. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  20. Quantum temporal probabilities in tunneling systems

    SciTech Connect

    Anastopoulos, Charis; Savvidou, Ntina

    2013-09-15

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines ‘classical’ time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems. -- Highlights: •Present a general methodology for deriving temporal probabilities in tunneling systems. •Treatment applies to relativistic particles interacting through quantum fields. •Derive a new expression for tunneling time. •Identify new time parameters relevant to tunneling. •Propose a resolution of the superluminality paradox in tunneling.

  1. The albedo effect on neutron transmission probability.

    PubMed

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight and after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
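
The statistical-weight method mentioned in the abstract (multiplying the weight by the scattering probability Ps at each collision instead of killing the history) can be illustrated with a much-simplified 1-D slab model. This is an illustrative sketch, not the authors' setup: it omits the vacuum channel, the albedo treatment, and exponential biasing, and uses a thinner hypothetical slab of 2 mean free paths for speed:

```python
import math
import random

random.seed(5)

def transmission(thickness_mfp, p_scatter, n_histories=50_000):
    """Monte Carlo estimate of neutron transmission through a 1-D slab
    with isotropic scattering. At each collision, instead of killing the
    history on absorption, the statistical weight is multiplied by Ps."""
    total = 0.0
    for _ in range(n_histories):
        x, mu, w = 0.0, 1.0, 1.0           # position (mfp), direction cosine, weight
        while w > 1e-6:
            x += mu * -math.log(1.0 - random.random())  # exponential free path
            if x >= thickness_mfp:
                total += w                  # scored as transmitted
                break
            if x < 0:
                break                       # escaped back out the source face
            w *= p_scatter                  # statistical-weight correction
            mu = random.uniform(-1.0, 1.0)  # isotropic scattering
    return total / n_histories

# Purely absorbing slab: transmission should approach exp(-thickness).
print(transmission(2.0, 0.0))
```

Setting Ps = 0 reduces the estimate to the uncollided flux exp(-thickness), a handy sanity check; increasing Ps adds positive scattered contributions.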

  2. Neural representation of probabilities for Bayesian inference.

    PubMed

    Rich, Dylan; Cazettes, Fanny; Wang, Yunyan; Peña, José Luis; Fischer, Brian J

    2015-04-01

    Bayesian models are often successful in describing perception and behavior, but the neural representation of probabilities remains in question. There are several distinct proposals for the neural representation of probabilities, but they have not been directly compared in an example system. Here we consider three models: a non-uniform population code where the stimulus-driven activity and distribution of preferred stimuli in the population represent a likelihood function and a prior, respectively; the sampling hypothesis which proposes that the stimulus-driven activity over time represents a posterior probability and that the spontaneous activity represents a prior; and the class of models which propose that a population of neurons represents a posterior probability in a distributed code. It has been shown that the non-uniform population code model matches the representation of auditory space generated in the owl's external nucleus of the inferior colliculus (ICx). However, the alternative models have not been tested, nor have the three models been directly compared in any system. Here we tested the three models in the owl's ICx. We found that spontaneous firing rate and the average stimulus-driven response of these neurons were not consistent with predictions of the sampling hypothesis. We also found that neural activity in ICx under varying levels of sensory noise did not reflect a posterior probability. On the other hand, the responses of ICx neurons were consistent with the non-uniform population code model. We further show that Bayesian inference can be implemented in the non-uniform population code model using one spike per neuron when the population is large and is thus able to support the rapid inference that is necessary for sound localization. PMID:25561333

  3. Killeen's Probability of Replication and Predictive Probabilities: How to Compute, Use, and Interpret Them

    ERIC Educational Resources Information Center

    Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques

    2010-01-01

    P. R. Killeen's (2005a) probability of replication ("p[subscript rep]") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p[subscript rep]" is now routinely reported in "Psychological Science" and has also begun to appear in other journals. However, there is…

  4. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    ERIC Educational Resources Information Center

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  5. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    ERIC Educational Resources Information Center

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  6. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  7. From data to probability densities without histograms

    NASA Astrophysics Data System (ADS)

    Berg, Bernd A.; Harris, Robert C.

    2008-09-01

    When one deals with data drawn from continuous variables, a histogram is often inadequate to display their probability density. It deals inefficiently with statistical noise, and bin sizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which overcomes this problem. Error bars on the estimated probability density are calculated using a jackknife method. We give several examples and provide computer code reproducing them. You may want to look at the corresponding figures 4 to 9 first. Program summary: Program title: cdf_to_pd Catalogue identifier: AEBC_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2758 No. of bytes in distributed program, including test data, etc.: 18 594 Distribution format: tar.gz Programming language: Fortran 77 Computer: Any capable of compiling and executing Fortran code Operating system: Any capable of compiling and executing Fortran code Classification: 4.14, 9 Nature of problem: When one deals with data drawn from continuous variables, a histogram is often inadequate to display the probability density. It deals inefficiently with statistical noise, and bin sizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Solution method: Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which
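
    The core idea, a smooth density from the empirical CDF via a truncated Fourier series rather than a histogram, can be sketched as follows. This is a simplified orthogonal-series estimator on [0, 1], not the published cdf_to_pd Fortran code, and it omits the Kolmogorov-test term selection and the jackknife error bars:

```python
import numpy as np

def fourier_pdf(data, n_terms=8):
    """Smooth PDF estimate on [0, 1] built from the data via a truncated
    cosine series (the derivative of a smoothed empirical CDF)."""
    x = np.asarray(data, dtype=float)
    # Orthogonal-series coefficients estimated directly from the sample.
    coefs = [2.0 * np.cos(n * np.pi * x).mean() for n in range(1, n_terms + 1)]

    def pdf(t):
        t = np.asarray(t, dtype=float)
        out = np.ones_like(t)
        for n, c in enumerate(coefs, start=1):
            out += c * np.cos(n * np.pi * t)
        return out

    return pdf

rng = np.random.default_rng(1)
sample = rng.beta(2.0, 5.0, size=20_000)       # continuous data on [0, 1]
pdf = fourier_pdf(sample)

grid = np.linspace(0.0, 1.0, 201)
est = pdf(grid)
# Trapezoidal check that the estimate integrates to ~1 over [0, 1].
approx_area = ((est[:-1] + est[1:]) / 2.0).sum() * (grid[1] - grid[0])
```

    In the full method, only coefficients that pass a significance test against the statistical-noise level would be retained, which suppresses the ringing a fixed truncation can produce.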

  8. Using a Fluorescent Cytosine Analogue tC[superscript o] To Probe the Effect of the Y567 to Ala Substitution on the Preinsertion Steps of dNMP Incorporation by RB69 DNA Polymerase

    SciTech Connect

    Xia, Shuangluo; Beckman, Jeff; Wang, Jimin; Konigsberg, William H.

    2012-10-10

    Residues in the nascent base pair binding pocket (NBP) of bacteriophage RB69 DNA polymerase (RB69pol) are responsible for base discrimination. Replacing Tyr567 with Ala leads to greater flexibility in the NBP, increasing the probability of misincorporation. We used the fluorescent cytosine analogue, 1,3-diaza-2-oxophenoxazine (tC^o), to identify preinsertion step(s) altered by NBP flexibility. When tC^o is the templating base in a wild-type (wt) RB69pol ternary complex, its fluorescence is quenched only in the presence of dGTP. However, with the RB69pol Y567A mutant, the fluorescence of tC^o is also quenched in the presence of dATP. We determined the crystal structure of the dATP/tC^o-containing ternary complex of the RB69pol Y567A mutant at 1.9 Å resolution and found that the incoming dATP formed two hydrogen bonds with an imino-tautomerized form of tC^o. Stabilization of the dATP/tC^o base pair involved movement of the tC^o backbone sugar into the DNA minor groove and required tilting of the tC^o tricyclic ring to prevent a steric clash with L561. This structure, together with the pre-steady-state kinetic parameters and dNTP binding affinity, estimated from equilibrium fluorescence titrations, suggested that the flexibility of the NBP, provided by the Y567 to Ala substitution, led to a more favorable forward isomerization step resulting in an increase in dNTP binding affinity.

  9. Nuclear data uncertainties: I, Basic concepts of probability

    SciTech Connect

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
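
    As a concrete instance of Bayes' theorem of the kind the report illustrates, consider the standard diagnostic-screening calculation (all numbers here are illustrative, not from the report):

```python
# Bayes' theorem in the classic diagnostic-screening form:
# prior P(D) = 0.01, sensitivity P(+|D) = 0.95, false-positive rate
# P(+|not D) = 0.05.
prior = 0.01
p_pos_given_d = 0.95
p_pos_given_not_d = 0.05

# Total probability of a positive result (the evidence term).
p_pos = p_pos_given_d * prior + p_pos_given_not_d * (1 - prior)

# Posterior probability of the condition given a positive result.
posterior = p_pos_given_d * prior / p_pos
print(round(posterior, 3))   # -> 0.161
```

    Despite the high sensitivity, the low prior keeps the posterior near 16%, the kind of counterintuitive result that motivates a careful treatment of conditional probability in data evaluation.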

  10. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
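
    The resampling idea can be sketched in a few lines: a Monte Carlo Pc estimate with hypothetical conjunction-plane numbers, repeated under scaled covariances to expose the spread that a single point estimate hides. This is a toy of the approach, not the operational computation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative 2-D conjunction-plane setup (all values hypothetical):
# nominal relative miss vector, combined position covariance, and
# combined hard-body radius.
miss = np.array([120.0, 40.0])           # metres
cov = np.array([[4000.0, 500.0],
                [500.0, 1500.0]])        # m^2
hbr = 20.0                               # metres

def pc_monte_carlo(miss, cov, hbr, n=200_000):
    """Fraction of sampled relative positions inside the hard-body disc."""
    pts = rng.multivariate_normal(miss, cov, size=n)
    return float(np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr))

pc_nominal = pc_monte_carlo(miss, cov, hbr)

# Crude uncertainty sweep: scaling the covariance mimics uncertainty in
# the input state-estimate covariances; the spread of the resulting Pc
# values is what a single point estimate hides.
pcs = [pc_monte_carlo(miss, s * cov, hbr) for s in (0.5, 1.0, 2.0)]
```

    In this geometry Pc grows with the covariance scale, but for closer nominal misses the trend can reverse, which is exactly why a density of Pc values is more informative than one number.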

  11. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector mi = (mi1, mi2, ..., mi10) denote the credit ratings of the ten companies in the i-th quarter. The vector mi+1 in the next quarter is modelled as dependent on the vector mi via a conditional distribution derived from a 20-dimensional power-normal mixture distribution. The transition probability Pkl(i, j) of getting mi+1,j = l given that mij = k is then computed from the conditional distribution. It is found that the variation of the transition probability Pkl(i, j) as i varies gives an indication of the possible transition of the credit rating of the j-th company in the near future.
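
    For contrast with the paper's power-normal mixture approach, the simplest baseline is the empirical (maximum-likelihood) transition matrix counted directly from a rating history. The history below uses made-up integer grades 0..3, purely for illustration:

```python
import numpy as np

# Hypothetical quarterly rating history for one company (grades 0..3).
history = [0, 0, 1, 1, 1, 2, 1, 1, 0, 0, 1, 2, 3, 2, 2, 1]
n_states = 4

# Count observed one-step transitions.
counts = np.zeros((n_states, n_states))
for a, b in zip(history[:-1], history[1:]):
    counts[a, b] += 1.0

# Row-normalize; a row with no observations would default to uniform.
row_sums = counts.sum(axis=1, keepdims=True)
trans = np.where(row_sums > 0, counts / np.maximum(row_sums, 1.0), 1.0 / n_states)
```

    Each row of `trans` is the estimated distribution of next-quarter ratings given the current one; the paper's contribution is to let this distribution vary with the quarter i and couple the ten companies through the joint mixture model.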

  12. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance it is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
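
    The covariance-combination and coordinate-transformation steps can be sketched as follows (all numbers illustrative; the paper proceeds from the transformed geometry to an analytical conflict probability):

```python
import numpy as np

# Illustrative horizontal prediction-error covariances (m^2) for the pair.
cov_a = np.array([[900.0, 100.0],
                  [100.0, 400.0]])
cov_b = np.array([[600.0, -50.0],
                  [-50.0, 300.0]])
cov_rel = cov_a + cov_b                  # single equivalent covariance
rel_mean = np.array([800.0, 300.0])      # predicted relative position, m

# Whitening transform: if x ~ N(mean, C) with C = L L^T, then
# z = L^{-1} (x - mean) is standard normal, so the conflict condition
# becomes a computation in standard-normal coordinates.
L = np.linalg.cholesky(cov_rel)
z_origin = np.linalg.solve(L, -rel_mean)   # collision point in z-space
mahalanobis = float(np.linalg.norm(z_origin))
# A larger Mahalanobis distance of the origin (zero separation) from the
# predicted relative position implies a lower conflict probability.
```
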

  13. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems in contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated at about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of expected occurrence. Regretfully, most state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic) or, conversely, delicately designed (e.g. STEP, ETAS) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers, and to inappropriate decisions. As a result, the populations of seismic regions continue to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  14. Approximate probability distributions of the master equation

    NASA Astrophysics Data System (ADS)

    Thomas, Philipp; Grima, Ramon

    2015-07-01

    Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and increasingly oscillatory as the truncation order grows. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.

  15. Volcano shapes, entropies, and eruption probabilities

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
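
    The Gibbs-Shannon comparison between a peaked edifice and a flat volcanic field can be made concrete with a small sketch (the bin heights are hypothetical, not measured profiles):

```python
import numpy as np

def shannon_entropy(p):
    """Gibbs-Shannon entropy (bits) of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

# Hypothetical eruption-frequency distributions over five spatial bins:
peaked_cone = np.array([0.05, 0.15, 0.6, 0.15, 0.05])   # steep edifice
flat_field = np.full(5, 0.2)                            # flat volcanic zone

# The uniform distribution maximizes the uncertainty about where the
# next eruption occurs, consistent with the flat-field case above.
print(shannon_entropy(peaked_cone), shannon_entropy(flat_field))
```

    The flat field attains the maximum log2(5) ≈ 2.32 bits, while the peaked cone concentrates probability and so carries less uncertainty about the next vent location.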

  16. Continuum ionization transition probabilities of atomic oxygen

    NASA Technical Reports Server (NTRS)

    Samson, J. R.; Petrosky, V. E.

    1973-01-01

    The technique of photoelectron spectroscopy was used to obtain the relative continuum transition probabilities of atomic oxygen at 584 A for transitions from the 3P ground state into the 4S, 2D, and 2P states of the ion. Transition probability ratios for the 2D and 2P states relative to the 4S state of the ion are 1.57 ± 0.14 and 0.82 ± 0.07, respectively. In addition, transitions from the excited O2(a 1 Delta g) state into the O2(+)(2 Phi u and 2 Delta g) states were observed. The adiabatic ionization potential of O2(+)(2 Delta g) was measured as 18.803 ± 0.006 eV.

  17. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  18. Computational methods for probability of instability calculations

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the roots of the characteristic equation or Routh-Hurwitz test functions are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
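
    A minimal illustration of the root-based instability criterion, using plain Monte Carlo in place of the paper's efficient reliability-analysis and importance-sampling methods, for a single-degree-of-freedom system with uncertain damping (all parameter values are assumed):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy system  x'' + c x' + k x = 0  with uncertain damping c.
n = 2000
c = rng.normal(0.05, 0.10, n)        # damping: mean 0.05, sd 0.10 (assumed)
k = 4.0                              # stiffness, fixed

# Instability criterion: some root of s^2 + c s + k = 0 has Re(s) > 0.
unstable = np.array([np.roots([1.0, ci, k]).real.max() > 0 for ci in c])
p_unstable = float(unstable.mean())

# For this system the Routh-Hurwitz conditions reduce to c > 0 (with
# k > 0 fixed), so instability is exactly the event c < 0 -- a cross-check.
p_exact = float(np.mean(c < 0))
```

    Plain sampling needs many evaluations to resolve small failure probabilities, which is precisely what motivates the reliability-analysis and importance-sampling machinery in the paper.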

  19. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-09-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu and Rohrlich box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, it allows straightforward generalization to more complicated systems.

  20. Probability and Statistics in Aerospace Engineering

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  1. Investigation of Flood Inundation Probability in Taiwan

    NASA Astrophysics Data System (ADS)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

    Taiwan lies in the path of typhoons from the northeast Pacific Ocean and is situated in a tropical-subtropical transition zone; as a result, rainfall is abundant all year round, especially in summer and autumn. For flood inundation analysis in Taiwan there are many uncertainties in hydrological, hydraulic, and land-surface topography characteristics that can change flood inundation behaviour. According to the 7th work item of Article 22 of the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from worsening, the investigation and analysis of disaster potential, degree of hazard, and situation simulation must proceed by scientific approaches. However, existing flood potential analysis uses a deterministic approach to define flood inundation without considering data uncertainties. This research incorporates data uncertainty into flood inundation maps, giving a flood probability for each grid cell, which can serve as a basis for emergency evacuation when typhoons arrive and extremely torrential rain begins. The Hebauyu watershed of Chiayi County is selected as the demonstration area. Owing to the uncertainties of the data used, a sensitivity analysis is first conducted using Latin hypercube sampling (LHS). The LHS data sets are then input into an integrated numerical model, developed here to assess flood inundation hazards in coastal lowlands, based on an extension of a 1-D river routing model and a 2-D inundation routing model. Finally, the probability of flood inundation is calculated and flood inundation probability maps are obtained. These probability maps can replace the older flood potential maps when new hydraulic infrastructure is planned in the future.
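
    Latin hypercube sampling, used above for the sensitivity analysis, can be sketched in a few lines; the stratified unit-cube samples would then be rescaled to the physical ranges of the uncertain inputs before being fed to the flood model (the input names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

def latin_hypercube(n_samples, n_dims):
    """Basic LHS on the unit hypercube: each dimension is split into
    n_samples equal strata and every stratum is hit exactly once."""
    # One uniform point inside each stratum, per dimension...
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    # ...then decouple the dimensions by permuting each column.
    for d in range(n_dims):
        u[:, d] = u[rng.permutation(n_samples), d]
    return u

# e.g. 100 design points over 3 uncertain inputs (say channel roughness,
# rainfall depth, topographic error); each column is later rescaled to
# its physical range before the model runs.
samples = latin_hypercube(100, 3)
```

    Compared with plain random sampling, LHS guarantees even marginal coverage of each input, which is why it is a common choice for expensive hydraulic-model sensitivity studies.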

  2. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a javascript object notation format.

  3. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-04-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu and Rohrlich box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, it allows straightforward generalization to more complicated systems.

  4. SureTrak Probability of Impact Display

    NASA Technical Reports Server (NTRS)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  5. Continuum ionization transition probabilities of atomic oxygen

    NASA Technical Reports Server (NTRS)

    Samson, J. A. R.; Petrosky, V. E.

    1974-01-01

    The technique of photoelectron spectroscopy was employed in the investigation. Atomic oxygen was produced in a microwave discharge operating at a power of 40 W and at a pressure of approximately 20 mtorr. The photoelectron spectrum of the oxygen with and without the discharge is shown. The atomic states can be clearly seen. In connection with the measurement of the probability for transitions into the various ionic states, the analyzer collection efficiency was determined as a function of electron energy.

  6. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collision between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell and cells that do not undergo the threshold number of collisions are expected to survive. PMID:24681395
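
    The collision-threshold hypothesis maps naturally onto a Poisson model: if the number of bacterium-micelle collisions grows with filter thickness, survival is the probability of experiencing fewer than the lethal number of collisions. A sketch with assumed parameter values (not fitted to the paper's data):

```python
from math import exp, factorial

def survival_probability(mean_collisions, threshold):
    """P(N < threshold) for N ~ Poisson(mean_collisions): the cell
    survives if it takes up less than the lethal biocide dose."""
    return sum(exp(-mean_collisions) * mean_collisions ** k / factorial(k)
               for k in range(threshold))

lethal_hits = 5        # collisions needed for a lethal dose (assumed)
per_layer = 3.0        # mean collisions per filter layer (assumed)

# Survival drops steeply as layers (i.e. mean collisions) increase,
# mirroring the thickness dependence observed for the biocidal paper.
for layers in (1, 2, 4):
    print(layers, survival_probability(per_layer * layers, lethal_hits))
```
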

  7. Detection probabilities in fuel cycle oriented safeguards

    SciTech Connect

    Canty, J.J.; Stein, G.; Avenhaus, R. )

    1987-01-01

    An intensified discussion of evaluation criteria for International Atomic Energy Agency (IAEA) safeguards effectiveness is currently under way. Considerations basic to the establishment of such criteria are derived from the model agreement INFCIRC/153 and include threshold amounts, strategic significance, conversion times, required assurances, cost-effectiveness, and nonintrusiveness. In addition to these aspects, the extent to which fuel cycle characteristics are taken into account in safeguards implementations (Article 81c of INFCIRC/153) will be reflected in the criteria. The effectiveness of safeguards implemented under given manpower constraints is evaluated. As the significant quantity and timeliness criteria have established themselves within the safeguards community, these are taken as fixed. Detection probabilities, on the other hand, still provide a certain degree of freedom in interpretation. The problem of randomization of inspection activities across a fuel cycle, or portions thereof, is formalized as a two-person zero-sum game, the payoff function of which is the detection probability achieved by the inspectorate. It is argued, from the point of view of risk of detection, that fuel cycle-independent, minimally accepted threshold criteria for such detection probabilities cannot and should not be applied.

  8. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  9. Classical probabilities for Majorana and Weyl spinors

    SciTech Connect

    Wetterich, C.

    2011-08-15

    Highlights: Map of classical statistical Ising model to fermionic quantum field theory. Lattice-regularized real Grassmann functional integral for single Weyl spinor. Emerging complex structure characteristic for quantum physics. A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_tau(t) for the Ising states tau. The time-dependent probability distribution of a generalized Ising model obtains as p_tau(t) = q_tau^2(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  10. Chemisorptive electron emission versus sticking probability

    NASA Astrophysics Data System (ADS)

    Böttcher, Artur; Niehus, Horst

    2001-07-01

    The chemisorption of N2O on thin Cs films has been studied by monitoring the time evolution of the sticking probability as well as the kinetics of the low-energy electron emission. By combining the data sets, two time domains become distinguishable: the initial chemisorption stage is characterized by a high sticking probability (0.1 < S < 1), whereas the late stage proceeds with a sticking probability of less than 0.01. Such evident anticoincidence between the exoemission and the chemisorption excludes the model of surface harpooning as the elementary process responsible for the electron emission in the late chemisorption stage. A long-term emission decay has also been observed after turning off the flux of chemisorbing molecules. A model is proposed that attributes both the late chemisorptive and the nonchemisorptive electron emission to the relaxation of a narrow state originated from an oxygen vacancy in the Cs oxide layer terminating the surface. The presence of such a state has been confirmed by metastable de-excitation spectroscopy [MDS, He*(2 1S)].