Sample records for Naranjo probability scale

  1. Evaluation of the Naranjo Adverse Drug Reactions Probability Scale in causality assessment of drug-induced liver injury.

    PubMed

    García-Cortés, M; Lucena, M I; Pachkoria, K; Borraz, Y; Hidalgo, R; Andrade, R J

    2008-05-01

    Causality assessment in hepatotoxicity is challenging. The current standard liver-specific Council for International Organizations of Medical Sciences/Roussel Uclaf Causality Assessment Method (CIOMS/RUCAM) scale is complex and difficult to implement in daily practice. The Naranjo Adverse Drug Reactions Probability Scale is a simple and widely used nonspecific scale that has not been specifically evaluated in drug-induced liver injury. Our aim was to evaluate the accuracy and reproducibility of the Naranjo Adverse Drug Reactions Probability Scale in the diagnosis of hepatotoxicity against the standard liver-specific CIOMS/RUCAM scale. Two hundred and twenty-five cases of suspected hepatotoxicity submitted to a national registry were evaluated by two independent observers and assessed for between-observer and between-scale differences using percentages of agreement and the weighted kappa (κw) test. A total of 249 ratings were generated. Between-observer agreement was 45% with a κw value of 0.17 for the Naranjo Adverse Drug Reactions Probability Scale, whereas agreement was higher with the CIOMS/RUCAM scale (72%, κw = 0.71). Concordance between the two scales was 24% (κw = 0.15). The Naranjo Adverse Drug Reactions Probability Scale had low sensitivity (54%) and poor negative predictive value (29%), and showed limited capability to distinguish between adjacent categories of probability. The Naranjo scale lacks validity and reproducibility in the attribution of causality in hepatotoxicity.
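
    The agreement statistics reported here (percent agreement and weighted kappa) are straightforward to recompute for any pair of raters. A minimal Python sketch, assuming hypothetical ratings and linear weights; the abstract does not state which weighting scheme the study used:

      # Weighted kappa for two raters assigning ordinal causality categories.
      # The ratings below are hypothetical; linear weighting is an assumption,
      # since the study does not specify linear vs. quadratic weights.
      from sklearn.metrics import cohen_kappa_score

      CATEGORIES = ["unlikely", "possible", "probable", "definite"]

      # Each rater's category assignments encoded as ordinal integers 0..3.
      rater_a = [2, 3, 1, 2, 0, 3, 2, 1, 1, 2]
      rater_b = [2, 2, 1, 3, 1, 3, 2, 2, 1, 1]

      kappa_w = cohen_kappa_score(rater_a, rater_b, weights="linear")
      pct_agree = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
      print(f"percent agreement: {pct_agree:.0%}, weighted kappa: {kappa_w:.2f}")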

  2. Conducted energy devices: pilot analysis of (non-)attributability of death using a modified Naranjo algorithm.

    PubMed

    Fox, Anthony W; Payne-James, J Jason

    2012-11-30

    Alleged fatalities associated with conducted energy devices (CEDs) resemble alleged serious adverse events (SAEs) after the use of pharmaceutical products: both types of case arise rarely, in complex (if not unique) combinations of circumstances, frequently with multiple concomitant putative aetiologies for the injury, and after the suspected product has previously been well designed and tested. Attribution (or otherwise) of SAEs to pharmaceutical products is often assessed by use of the Naranjo algorithm. The purpose of this study was to investigate whether an adapted Naranjo algorithm could be used to assess alleged CED-associated fatalities. Unique cases had four independent identifiers. Prospectively, 7 of the 10 Naranjo algorithm questions were chosen as being potentially applicable to CED use. These had a maximum score of 9, and the associated ordinal probability scale (doubtful, possible, probable, and definite) was retained by mapping the integer scores in linear proportion. An arbitrary requirement was a database sufficiency of ≥ 50%, i.e., at least half of (n unique cases × 7 answerable questions); a pilot sample (n = 29 unique cases) suggested feasibility (see below). One hundred and seventy-five unique cases were found, with a data sufficiency of 56.8%. Modified Naranjo algorithm scores had an unequally bimodal distribution. CED-attributability was suggested in 21 (12% of 175) cases. Substantial numbers of concomitant conditions existed among cases with low algorithm scores, all being potentially lethal under field conditions even without CED exposure. The number of CED-administered shocks sustained was unrelated to CED-attributability of fatality. Two of the Naranjo questions (regarding dechallenge and the effects of challenge with a non-identical but similar agent) proved non-contributory. An algorithmic approach to assessment of CED-associated fatality seems feasible. By these pharmacovigilance standards, some published case fatality rates attributable to CED
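
    The abstract states only that the four Naranjo categories were retained "by linear proportion to the integer scores" on the reduced 0-9 range. A sketch of one plausible mapping, rescaling the standard full-scale bands (definite ≥ 9, probable 5-8, possible 1-4, doubtful ≤ 0, out of a maximum of 13); the rescaled cut-points are our assumption, not the paper's stated thresholds:

      # Sketch of the category mapping described above: the four Naranjo
      # categories are kept, with cut-points rescaled in linear proportion
      # from the full scale (max 13) to the 7-question scale (max 9).
      # The exact rescaled thresholds are our assumption.

      STANDARD_MAX = 13  # maximum score of the full 10-question Naranjo scale
      MODIFIED_MAX = 9   # maximum score of the 7 retained questions

      # Standard lower bounds: definite >= 9, probable >= 5, possible >= 1.
      BOUNDS = {"definite": 9, "probable": 5, "possible": 1}

      def categorize(score: int) -> str:
          for name, lo in BOUNDS.items():
              if score >= lo * MODIFIED_MAX / STANDARD_MAX:
                  return name
          return "doubtful"

      for s in range(0, MODIFIED_MAX + 1):
          print(s, categorize(s))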

  3. Probable levetiracetam-related serum alkaline phosphatase elevation

    PubMed Central

    2012-01-01

    Background: Levetiracetam (LEV) is an antiepileptic drug with a favorable tolerability and safety profile and little or no effect on liver function. Case presentation: We report an epileptic pediatric patient who developed a significant elevation in serum alkaline phosphatase (ALP) level during LEV monotherapy. Surprisingly, the serum ALP level decreased to normal after LEV discontinuation. The Naranjo Adverse Drug Reaction Probability Scale score was 6, indicating that LEV was a probable cause of the increased serum ALP. Conclusions: The potential for LEV-associated ALP elevation should be considered, and caution exercised, when levetiracetam is prescribed to epilepsy patients, especially pediatric patients. PMID:22994584

  4. Probable Association of Tachyarrhythmia With Nebulized Albuterol in a Child With Previously Subclinical Wolff-Parkinson-White Syndrome

    PubMed Central

    Kroesen, Michiel; Maseland, Machiel; Smal, Jaime; Reimer, Annet; van Setten, Petra

    2012-01-01

    We present the case of a 2-year-old asthmatic boy with atrioventricular (AV)-reentry tachycardia following albuterol inhalation, who was later diagnosed with Wolff-Parkinson-White (WPW) syndrome. The Naranjo adverse drug reaction probability scale score for this adverse event was 7, indicating that the association between his AV-reentry tachycardia and inhalation of albuterol is probable. To our knowledge, this is the first case report that shows the potential arrhythmogenic effects of albuterol in a child with WPW syndrome. We urge clinicians to be aware of this potentially life-threatening adverse effect and to closely monitor these patients when they need beta-adrenergic drugs in case of emergency. Furthermore, this report highlights the dilemma regarding the safe treatment of pediatric patients with both asthma and WPW syndrome. PMID:23118663

  5. A case of probable labetalol induced hyperkalaemia in pre-eclampsia.

    PubMed

    Thomas, Binny; Abdul Rouf, P V; El Kassem, Wessam; Al Hail, Moza; Stewart, Derek; Tharannum, Asma; Ahmed, Afif; Al Saadi, Muna

    2014-12-01

    Hyperkalaemia can cause altered cardiac electrical conduction, resulting in death. We describe the case of a 23-year-old pregnant patient who presented with severe epigastric pain and vomiting. She was severely pre-eclamptic and received initial treatment with intravenous labetalol, and the decision was taken to deliver. She quickly became hyperkalaemic (serum potassium level 6.4 mmol/L), so labetalol was discontinued and intravenous hydralazine commenced. Post-surgery, her potassium levels were normal, but due to rapidly rising blood pressure labetalol was recommenced, again resulting in elevated potassium levels. Labetalol was discontinued, hydralazine prescribed, and potassium levels normalised. The adverse reaction was classified as 'probable' for labetalol using the Naranjo Adverse Drug Reaction scale. This is the first reported case of labetalol-induced hyperkalaemia in pregnancy, with life-threatening consequences, and hence all health professionals should be alert to this potential effect.

  6. Probable hypoglycemic adverse drug reaction associated with prickly pear cactus, glipizide, and metformin in a patient with type 2 diabetes mellitus.

    PubMed

    Sobieraj, Diana M; Freyer, Craig W

    2010-01-01

    To report a case of an adverse drug reaction (ADR) in a patient with type 2 diabetes mellitus taking prickly pear cactus (PPC), glipizide, and metformin. A 58-year-old Mexican male with type 2 diabetes mellitus being treated with metformin 1000 mg twice daily and extended-release glipizide 10 mg daily was referred to the pharmacist for medication education. He denied taking herbal supplements or experiencing hypoglycemia. Two hemoglobin A1c values (6.8% and 6.7%) obtained over the past year demonstrated glycemic control, which was supported by his reported fasting blood glucose readings of 113-132 mg/dL. One month later, the patient reported 4 hypoglycemic events with blood glucose readings of 49-68 mg/dL, which resulted in discontinuation of glipizide. One month later, the patient denied any further hypoglycemia. During medication reconciliation he reported consuming crude PPC pads daily for 2 months for glucose control. The literature suggests that PPC lowers blood glucose levels in patients with type 2 diabetes mellitus, although few published data describe ADRs from combining PPC with other agents used to treat type 2 diabetes mellitus. A literature search of MEDLINE (through December 2009) using the search terms diabetes mellitus, prickly pear cactus, nopal, opuntia, metformin, glipizide, glyburide, glimepiride, and sulfonylurea revealed no case reports of the described ADR. One case report describing the blood glucose-lowering effect of PPC in a patient concurrently taking oral antihyperglycemics documented an episode of hypoglycemia, although the Naranjo probability scale was not applied. One patient survey found the most common drug-herb interaction in the given population to be between PPC and antihyperglycemic agents, resulting in hypoglycemia. In our case, use of the Naranjo probability scale suggests the ADR to be probable. The mechanism may be the additive glucose lowering of the 3 agents consumed concurrently by the

  7. Somnambulism due to probable interaction of valproic acid and zolpidem.

    PubMed

    Sattar, S Pirzada; Ramaswamy, Sriram; Bhatia, Subhash C; Petty, Frederick

    2003-10-01

    To report a case of somnambulism due to a probable interaction between valproic acid and zolpidem in a patient with no prior personal or family history of somnambulism. A 47-year-old white man with a history of bipolar disorder was being maintained on citalopram 40 mg once daily and zolpidem 5 mg at bedtime. During treatment, he developed manic symptoms and was started on adjunctive valproic acid therapy. Soon after this, he developed episodes of somnambulism, which stopped when valproic acid was discontinued. On rechallenge with valproic acid, somnambulism returned. To our knowledge, this is the first report in the literature describing a probable interaction between valproic acid and zolpidem leading to somnambulism. Even though valproic acid has been associated with sleep changes, there are no published reports of somnambulism with this agent. Zolpidem has been associated with somnambulism, but our patient did not experience this when he was on zolpidem monotherapy. However, within 2 days of starting adjunctive valproic acid, sleepwalking occurred. It stopped after valproic acid was withdrawn. On rechallenge with valproic acid, sleepwalking recurred. However, when zolpidem was discontinued and valproic acid was continued, somnambulism did not occur. An assessment on the Naranjo probability scale suggests probable pharmacokinetic or pharmacodynamic interactions between the 2 medications. Valproic acid and zolpidem are generally safe medications that are commonly prescribed and often used together. No interactions have been previously reported with combined use of valproic acid and zolpidem. This case suggests a probable interaction between these 2 agents that can have a serious consequence, somnambulism. This could be frightening to patients and put them in danger. Recognition of such interactions that place patients at risk for potentially serious adverse events is imperative for appropriate care.

  8. Probable fenofibrate-induced acute generalized exanthematous pustulosis.

    PubMed

    Power, Anna E; Graudins, Linda V; McLean, Catriona A; Hopper, Ingrid

    2015-12-01

    The case of a patient who experienced a severe adverse reaction requiring emergency treatment after a single dose of fenofibrate is described. A 58-year-old woman with type 1 diabetes was hospitalized for treatment of an extensive blistering rash on the buttocks and trunk accompanied by fever, hypotension, tachycardia, neutrophilia, impaired renal function, and liver enzyme abnormalities. She reported that two days previously she had developed fever and vomiting four hours after taking her first dose of fenofibrate (145 mg). The patient required vasopressor support and was initially treated with broad-spectrum antibiotics for 3 days and a course of immune globulin. On hospital day 4, histopathology returned results consistent with acute generalized exanthematous pustulosis (AGEP), and the patient was subsequently treated with topical steroids. Gradual resolution of AGEP was noted at the time of her discharge from the hospital on day 7 and at one-week follow-up. Analysis of the case using the adverse drug reaction probability scale of Naranjo et al. yielded a score of 5, indicating a probable association between fenofibrate use and AGEP development. AGEP is a predominantly drug-induced condition but is not typically associated with fenofibrate use. Cutaneous eruptions in AGEP are often accompanied by systemic symptoms (e.g., fever, leukocytosis), and the disorder can also be associated with impaired creatinine clearance and elevated aminotransferase levels. A woman with type 1 diabetes developed AGEP after taking a single dose of fenofibrate. Her cutaneous symptoms began to resolve within days of discontinuation of fenofibrate use. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  9. Finite-size scaling of survival probability in branching processes

    NASA Astrophysics Data System (ADS)

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro

    2015-04-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2y e^y / (e^y - 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
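
    The scaling function can be checked numerically. A short sketch; the limiting values in the comments follow directly from the formula:

      # Numerical check of the scaling function G(y) = 2*y*exp(y)/(exp(y)-1)
      # quoted above: it tends to 2 as y -> 0 and grows like 2*y for large y.
      import numpy as np

      def G(y):
          y = np.asarray(y, dtype=float)
          # expm1 keeps the ratio numerically stable near y = 0.
          return 2.0 * y * np.exp(y) / np.expm1(y)

      print(G([1e-8, 0.1, 1.0, 5.0]))   # ~[2.00, 2.10, 3.16, 10.07]
      print(G(50.0) / 50.0)             # ~2.0: linear growth at large y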

  10. Scale-Invariant Transition Probabilities in Free Word Association Trajectories

    PubMed Central

    Costa, Martin Elias; Bonomo, Flavia; Sigman, Mariano

    2009-01-01

    Free word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that the memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution. PMID:19826622

  11. Evidence of scaling of void probability in nucleus-nucleus interactions at few GeV energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Dipak; Biswas, Biswanath; Deb, Argha

    1997-11-01

    The rapidity gap probability in the ^24Mg-AgBr interaction at 4.5 GeV/c per nucleon has been studied in detail. The data reveal scaling behavior of the void probability in the central rapidity domain, which confirms the validity of the linked-pair approximation for the N-particle cumulant correlation functions. This scaling behavior appears to be similar to the void probability in the Perseus-Pisces supercluster region of galaxies. © 1997 The American Physical Society

  12. Approximation of the ruin probability using the scaled Laplace transform inversion

    PubMed Central

    Mnatsakanov, Robert M.; Sarkisian, Khachatur; Hakobyan, Artak

    2015-01-01

    The problem of recovering the ruin probability in the classical risk model based on the scaled Laplace transform inversion is studied. It is shown how to overcome the problem of evaluating the ruin probability at large values of an initial surplus process. Comparisons of proposed approximations with the ones based on the Laplace transform inversions using a fixed Talbot algorithm as well as on the ones using the Trefethen–Weideman–Schmelzer and maximum entropy methods are presented via a simulation study. PMID:26752796
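
    As background for the comparisons described here: in the classical (Cramér-Lundberg) risk model with exponential claims, the ruin probability has a closed form, which is a standard benchmark for inversion-based approximations. A sketch with parameter names of our choosing:

      # Closed-form ruin probability in the Cramer-Lundberg model with
      # exponential claims -- a standard benchmark for Laplace-inversion
      # approximations like those studied in the paper. Parameter names ours.
      import math

      def ruin_prob(u, mean_claim=1.0, loading=0.2):
          """psi(u) = exp(-theta*u / (mu*(1+theta))) / (1+theta) for initial
          surplus u, exponential claims with mean mu, premium loading theta."""
          theta = loading
          return math.exp(-theta * u / (mean_claim * (1 + theta))) / (1 + theta)

      for u in (0, 1, 5, 10, 50):
          print(u, round(ruin_prob(u), 6))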

  13. Scaling properties and universality of first-passage-time probabilities in financial markets

    NASA Astrophysics Data System (ADS)

    Perelló, Josep; Gutiérrez-Roig, Mario; Masoliver, Jaume

    2011-12-01

    Financial markets provide an ideal frame for the study of crossing or first-passage time events of non-Gaussian correlated dynamics, mainly because large data sets are available. Tick-by-tick data of six futures markets are herein considered, resulting in fat-tailed first-passage time probabilities. Scaling the return by its standard deviation collapses the probabilities of all markets examined—and also for different time horizons—into single curves, suggesting that first-passage statistics are market independent (at least for high-frequency data). On the other hand, a very closely related quantity, the survival probability, shows, away from the center and tails of the distribution, a hyperbolic t^(-1/2) decay typical of Markovian dynamics, despite the existence of memory in markets. Modifications of the Weibull and Student distributions are good candidates for the phenomenological description of first-passage time properties under certain regimes. The scaling strategies shown may be useful for risk control and algorithmic trading.
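
    The t^(-1/2) survival decay cited as the Markovian benchmark is easy to reproduce by simulation. A sketch for an unbiased Gaussian random walk (not market data; the crossing level and sample sizes are arbitrary):

      # Monte Carlo sketch: the survival probability (level not yet crossed)
      # of an unbiased random walk decays like t^(-1/2) -- the Markovian
      # benchmark the paper compares markets against.
      import numpy as np

      rng = np.random.default_rng(0)
      n_walks, n_steps, level = 5_000, 2_000, 5.0

      steps = rng.standard_normal((n_walks, n_steps))
      paths = np.cumsum(steps, axis=1)
      crossed = np.maximum.accumulate(paths, axis=1) >= level
      survival = 1.0 - crossed.mean(axis=0)   # fraction not yet crossed

      t = np.array([100, 400, 1600])
      print(survival[t])                      # roughly halves as t quadruples
      print(survival[t] * np.sqrt(t))         # roughly constant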

  14. Covariate-adjusted Spearman's rank correlation with probability-scale residuals.

    PubMed

    Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E

    2018-06-01

    It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
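
    The partial estimator described here, the correlation of PSRs from models of X on Z and of Y on Z, can be sketched directly. For a continuous variable with fitted conditional CDF F, the PSR is P(X* < x) - P(X* > x) = 2F(x|Z) - 1. The Gaussian linear working models and simulated data below are our simplification; the paper favors semiparametric cumulative probability models:

      # Sketch of the partial Spearman estimator: fit models of X on Z and
      # Y on Z, form probability-scale residuals PSR = 2*F(x|Z) - 1, then
      # take the Pearson correlation of the two PSR vectors. Gaussian linear
      # working models are an assumption made for brevity.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 500
      z = rng.normal(size=n)
      x = 0.8 * z + rng.normal(size=n)
      y = 0.8 * z + 0.5 * x + rng.normal(size=n)

      def psr_linear(v, z):
          """PSR of v given z under a Gaussian linear working model."""
          slope, intercept, *_ = stats.linregress(z, v)
          resid = v - (intercept + slope * z)
          return 2.0 * stats.norm.cdf(resid / resid.std(ddof=2)) - 1.0

      partial_spearman = np.corrcoef(psr_linear(x, z), psr_linear(y, z))[0, 1]
      print(round(partial_spearman, 3))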

  15. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near-mean-field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
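
    The count-to-probability conversion can be sketched under one natural reading (our assumption, not necessarily the paper's exact construction): treat the count of small events between large ones as Weibull distributed, and obtain the conditional probability of a large event within the next dn small events from the Weibull survival function. The shape and scale values are illustrative, not fitted:

      # Sketch of the counting method described above, under our reading:
      # the number of small events between large ones is Weibull distributed,
      # and the current count n converts to a conditional probability of a
      # large event within the next dn small events. Shape k and scale b
      # are illustrative values, not results from the paper.
      import math

      def weibull_survival(n, b, k):
          return math.exp(-((n / b) ** k))

      def conditional_prob(n, dn, b=100.0, k=1.4):
          """P(next large event within dn more small events | n elapsed)."""
          return 1.0 - weibull_survival(n + dn, b, k) / weibull_survival(n, b, k)

      for n in (0, 50, 100, 200):
          print(n, round(conditional_prob(n, dn=25), 3))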

  16. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    PubMed

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of pests in orchards can provide important information for designing monitoring schemes and establishing better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated, by using probability kriging. Adults of B. minax were captured over two successive occurrences in a small-scale citrus orchard by using food bait traps, which were placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of adult occurrence, and the adult population aggregated with high probability within a band less than 100 m wide on either side of the orchard-woods boundary. The sequential probability kriged maps showed that the adults were estimated with higher probability in the marginal zone, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots, and their distance to the forest edge, should be considered to enhance control of B. minax in small-scale orchards.

  17. Vertex evoked potentials in a rating-scale detection task: Relation to signal probability

    NASA Technical Reports Server (NTRS)

    Squires, K. C.; Squires, N. K.; Hillyard, S. A.

    1974-01-01

    Vertex evoked potentials were recorded from human subjects performing in an auditory detection task with rating scale responses. Three values of a priori probability of signal presentation were tested. The amplitudes of the N1 and P3 components of the vertex potential associated with correct detections of the signal were found to be systematically related to the strictness of the response criterion and independent of variations in a priori signal probability. No similar evoked potential components were found associated with signal absent judgements (misses and correct rejections) regardless of the confidence level of the judgement or signal probability. These results strongly support the contention that the form of the vertex evoked response is closely correlated with the subject's psychophysical decision regarding the presence or absence of a threshold level signal.

  18. Void probability as a function of the void's shape and scale-invariant models

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1991-01-01

    The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
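
    For reference, the plain negative binomial model that this model extends has a closed-form void probability, a standard counts-in-cells result (notation ours: \bar{N} is the mean count in a cell of volume V, and \bar{\xi} is the volume-averaged two-point correlation function):

      P_0(V) = \left[ 1 + \bar{N}(V)\,\bar{\xi}(V) \right]^{-1/\bar{\xi}(V)}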

  19. A Case of Hepatotoxicity Induced by Adulterated "Tiger King", a Chinese Herbal Medicine Containing Sildenafil.

    PubMed

    Nissan, Ran; Poperno, Alina; Stein, Gideon Y; Shapira, Barak; Fuchs, Shmuel; Berkovitz, Ronny; Hess, Zipora; Arieli, Mickey

    2016-01-01

    Detection of phosphodiesterase type 5 (PDE-5) inhibitors and their analogues in "100% natural" or "herbal" supplements has been described in numerous reports. However, few reports have been published on actual harm caused by counterfeit erectile dysfunction herbal supplements. We describe the case of a 65-year-old male admitted to a tertiary hospital with acute liver toxicity, possibly induced by the adulterated "Chinese herbal" supplement "Tiger King" for sexual enhancement. Chemical analysis of the tablets discovered the presence of therapeutic doses of sildenafil with no other herbal components. Other medications were excluded as potential causes of the hepatic impairment. According to the Naranjo adverse drug reaction scale and the Roussel Uclaf Causality Assessment Method (RUCAM), the probability of an association between the hepatotoxicity and sildenafil was "possible" and "probable", respectively (Naranjo score of 4, RUCAM score of 7). Within three days of admission, the patient's clinical status and liver function improved without any specific treatment. His liver function tests normalized 30 days post discharge. Further pharmacovigilance actions should be taken by regulatory authorities and pharmaceutical companies in order to determine the relation between sildenafil and hepatotoxicity. This case emphasizes the importance of raising public awareness of the potential dangers of "Tiger King" in particular, and of other counterfeit medications or herbal supplements of unknown origin.

  20. Benzocaine-induced methemoglobinemia in two patients: interdisciplinary collaboration, management, and near misses.

    PubMed

    Throm, Melinda J; Stevens, Margie Dale; Hansen, Carol

    2007-08-01

    Methemoglobin, a form of hemoglobin that does not bind oxygen, is produced when iron in red blood cells is oxidized from the ferrous state to the ferric state. Methemoglobinemia develops in the presence of oxidizing agents, such as benzocaine-containing topical anesthetic sprays, and it is characterized by cyanosis. If untreated, methemoglobinemia may prove lethal. We describe two patients who developed methemoglobinemia after they were administered benzocaine-containing topical anesthetic sprays. Using the Naranjo adverse drug reaction probability scale, the relationship between the administration of the benzocaine-containing spray and the development of methemoglobinemia was probable (score of 7) in both patients. Collaboration among health care providers is necessary to efficiently recognize, treat, and manage this condition.

  21. Restrictive Cardiomyopathy Associated With Long-Term Use of Hydroxychloroquine for Systemic Lupus Erythematosus.

    PubMed

    Sabato, Leah A; Mendes, Lisa A; Cox, Zachary L

    2017-10-01

    Hydroxychloroquine (HQ) is commonly prescribed for autoimmune diseases such as systemic lupus erythematosus. We report a case of a 75-year-old female presenting with de novo decompensated heart failure and restrictive cardiomyopathy (left ventricular ejection fraction: 40%-45%) after treatment with HQ for more than 11 years. Hydroxychloroquine was discontinued, and follow-up echocardiogram 57 days after discontinuation showed normalization of her left ventricular ejection fraction. A score of 7 on the Naranjo Adverse Drug Reaction Probability Scale indicates that HQ is a probable cause of this patient's cardiomyopathy. An adverse drug effect due to HQ should be considered in treated patients who present with restrictive cardiomyopathy. Discontinuation may allow for partial or complete reversal of the cardiomyopathy.

  22. Adverse Drug Reactions Related to Drug Administration in Hospitalized Patients.

    PubMed

    Gallelli, Luca; Siniscalchi, Antonio; Palleria, Caterina; Mumoli, Laura; Staltari, Orietta; Squillace, Aida; Maida, Francesca; Russo, Emilio; Gratteri, Santo; De Sarro, Giovambattista

    2017-01-01

    Drug treatment may be related to the development of adverse drug reactions (ADRs). In this paper, we evaluated the ADRs in patients admitted to Catanzaro Hospital. After obtaining approval from the local Ethics Committee, we performed a retrospective study on clinical records from March 01, 2013 to April 30, 2015. The association between drug and ADR, or between drug and drug-drug interactions (DDIs), was evaluated using the Naranjo probability scale and the Drug Interaction Probability Scale (DIPS), respectively. During the study period, we analyzed 2870 clinical records containing a total of 11,138 prescriptions, and we documented the development of 770 ADRs. The time of hospitalization was significantly longer (P < 0.05) in women with ADRs (12.6 ± 1.2 days) than in men (11.8 ± 0.83 days). Using the Naranjo score, we documented a probable association in 78% of these reactions, while DIPS revealed that about 22% of ADRs were related to DDIs. Patients with ADRs received 3052 of the 11,138 prescriptions (27.4%), a mean of 6.1 ± 0.29 drugs per patient, significantly higher (P < 0.01) than in patients not experiencing ADRs (mean of 3.4 ± 0.13 drugs). About 19% of ADRs were not diagnosed and were treated as new diseases. Our results indicate that drug administration induces the development of ADRs also during hospitalization, particularly in elderly women. Moreover, some ADRs were under-diagnosed; it is therefore important to motivate healthcare professionals to report ADRs in order to optimize patient safety. Copyright© Bentham Science Publishers.

  23. Landscape- and local-scale habitat influences on occupancy and detection probability of stream-dwelling crayfish: Implications for conservation

    USGS Publications Warehouse

    Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Nolen, Matthew S.; Wagner, Brian K.

    2017-01-01

    Crayfish are ecologically important in freshwater systems worldwide and are imperiled in North America and globally. We sought to examine landscape- to local-scale environmental variables related to occupancy and detection probability of a suite of stream-dwelling crayfish species. We used a quantitative kick-seine method to sample crayfish presence at 102 perennial stream sites, with eight surveys per site. We modeled occupancy (psi) and detection probability (p) as functions of local- and landscape-scale environmental covariates. We developed a set of a priori candidate models for each species and ranked models using (Q)AICc. Detection probabilities and occupancy estimates differed among crayfish species, with Orconectes eupunctus, O. marchandi, and Cambarus hubbsi being relatively rare (psi < 0.20) with moderate (0.46-0.60) to high (0.81) detection probability, and O. punctimanus and O. ozarkae being relatively common (psi > 0.60) with high detection probability (0.81). Detection probability was often related to the local habitat variables current velocity, depth, or substrate size. Important environmental variables for crayfish occupancy were species dependent but were mainly landscape variables such as stream order, geology, slope, topography, and land use. Landscape variables strongly influenced crayfish occupancy and should be considered in future studies and conservation plans.
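
    The psi and p estimates reported here rest on the standard single-season occupancy likelihood. A minimal sketch with made-up detection histories (not the study's data):

      # Sketch of the single-season occupancy likelihood behind psi and p:
      # a site with at least one detection contributes
      # psi * prod(p^y * (1-p)^(1-y)); an all-zero history contributes
      # psi * (1-p)^K + (1 - psi). Detection histories here are made up.
      import numpy as np
      from scipy.optimize import minimize

      histories = np.array([  # rows = sites, columns = 8 surveys (1 = detected)
          [0, 1, 0, 0, 1, 0, 0, 0],
          [0, 0, 0, 0, 0, 0, 0, 0],
          [1, 1, 0, 1, 0, 0, 1, 0],
          [0, 0, 0, 0, 0, 0, 0, 0],
          [0, 0, 1, 0, 0, 0, 0, 0],
      ])

      def neg_log_lik(params):
          psi, p = 1 / (1 + np.exp(-params))  # logit scale keeps (0, 1)
          k = histories.shape[1]
          det = histories.sum(axis=1)
          occupied = psi * p**det * (1 - p) ** (k - det)
          never = np.where(det == 0, 1 - psi, 0.0)
          return -np.log(occupied + never).sum()

      fit = minimize(neg_log_lik, x0=[0.0, 0.0])
      psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
      print(f"psi = {psi_hat:.2f}, p = {p_hat:.2f}")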

  24. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominate the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and in defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies) where the true nature of slope probability distribution was not discernible at baselines smaller than a kilometer[2,3,5]. Accordingly, historical modeling of lunar surface slopes probability distributions for applications such as in scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution[1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon[7,8], slopes in lunar craters can now be obtained at baselines as low as 6-meters allowing unprecedented multi-scale (multiple baselines) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines[9]. In this work, we extend the analysis from a probability distribution modeling point-of-view with NAC DEMs to characterize the slope statistics for the floors and walls for the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was

  25. A review and assessment of drug-induced parotitis.

    PubMed

    Brooks, Krista G; Thompson, Dennis F

    2012-12-01

    To review the current literature on drug-induced parotitis. Literature was accessed through MEDLINE/PubMed (1980-May 2012), using the search terms sialadenitis/chemically induced and parotitis/chemically induced. EMBASE (1980-May 2012) was searched using the terms parotitis/diagnosis, sialadenitis/side effect, and parotitis/side effect. International Pharmaceutical Abstracts (1970-May 2012) was searched using the search terms parotitis and sialadenitis. All searches were limited to articles on humans written in English. Inclusion criteria were published letters, case reports, reviews, and clinical trials involving drugs that may be associated with parotitis. Articles pertaining to parotitis induced by iodine-containing drugs were excluded. References of all relevant articles were reviewed for additional citations. Review articles, clinical trials, background data, and case reports of drug-induced parotitis were collected and case reports were assessed for causality. Parotitis is an uncommon adverse effect; however, signs and symptoms of parotitis have been noted in case reports as an adverse drug reaction related to various medications. Assessing causality of an adverse drug reaction such as parotitis is challenging. To help determine the probability of causality for these events, algorithms such as the Naranjo probability scale have been developed. Eighty-four case reports of drug-induced parotitis from 40 different drugs were reviewed using a modified Naranjo probability scale that included criteria specific for parotitis. Medications that met the criteria for establishing causality included l-asparaginase with 7 case reports, clozapine with 13 case reports, and phenylbutazone with 13 case reports. Drug-induced parotitis is a rare adverse drug reaction. Based on the quantitative and qualitative evidence collected from the case reports, medications that are associated with drug-induced parotitis include l-asparaginase, clozapine, and phenylbutazone. Many other

  26. Bayesian probabilities for Mw 9.0+ earthquakes in the Aleutian Islands from a regionally scaled global rate

    NASA Astrophysics Data System (ADS)

    Butler, Rhett; Frazer, L. Neil; Templeton, William J.

    2016-05-01

    We use the global rate of Mw ≥ 9.0 earthquakes, and standard Bayesian procedures, to estimate the probability of such mega events in the Aleutian Islands, where they pose a significant risk to Hawaii. We find that the probability of such an earthquake along the Aleutian island arc is 6.5% to 12% over the next 50 years (50% credibility interval) and that the annualized risk to Hawai'i is about $30M. Our method (the regionally scaled global rate method, or RSGR) is to scale the global rate of Mw 9.0+ events in proportion to the fraction of global subduction (units of area per year) that takes place in the Aleutians. The RSGR method assumes that Mw 9.0+ events are a Poisson process with a rate that is both globally and regionally stationary on the time scale of centuries, and it follows the principle of Burbidge et al. (2008), who used the product of fault length and convergence rate, i.e., the area being subducted per annum, to scale the Poisson rate for the GSS to sections of the Indonesian subduction zone. Before applying RSGR to the Aleutians, we first apply it to five other regions of the global subduction system where its rate predictions can be compared with those from paleotsunami, paleoseismic, and geoarcheology data. To obtain regional rates from paleodata, we give a closed-form solution for the probability density function of the Poisson rate when event count and observation time are both uncertain.
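
    The RSGR idea can be sketched end to end: a Gamma posterior on the global rate from a global event count, scaled by the regional share of subducted area, then converted to a 50-year Poisson probability. The global count, observation window, and Aleutian area fraction below are illustrative assumptions, not the paper's inputs:

      # Sketch of the regionally scaled global rate (RSGR) idea. The count,
      # window, and area fraction are illustrative, not the paper's values.
      import numpy as np

      rng = np.random.default_rng(2)
      n_global, t_years = 5, 100          # assumed: 5 Mw >= 9 events in 100 yr
      area_fraction = 0.04                # assumed Aleutian share of subduction

      # Jeffreys-style Gamma(n + 0.5, t) posterior for the global Poisson rate.
      global_rate = rng.gamma(shape=n_global + 0.5, scale=1.0 / t_years,
                              size=100_000)
      regional_rate = area_fraction * global_rate

      p50 = 1.0 - np.exp(-50.0 * regional_rate)   # P(at least one in 50 yr)
      lo, mid, hi = np.percentile(p50, [25, 50, 75])
      print(f"50-yr probability: median {mid:.1%}, 50% CI [{lo:.1%}, {hi:.1%}]")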

  27. Multi-scale evaluation of the environmental controls on burn probability in a southern Sierra Nevada landscape

    Treesearch

    Sean A. Parks; Marc-Andre Parisien; Carol Miller

    2011-01-01

    We examined the scale-dependent relationship between spatial fire likelihood or burn probability (BP) and some key environmental controls in the southern Sierra Nevada, California, USA. Continuous BP estimates were generated using a fire simulation model. The correspondence between BP (dependent variable) and elevation, ignition density, fuels and aspect was evaluated...

  28. Skin rash during treatment with generic itraconazole.

    PubMed

    De Vuono, Antonio; Palleria, Caterina; Scicchitano, Francesca; Squillace, Aida; De Sarro, Giovambattista; Gallelli, Luca

    2014-04-01

    Generic drugs have the same active substance, the same pharmaceutical form, and the same therapeutic indications as, and a similar bioequivalence to, the reference (branded) medicinal product. Although similar efficacy is postulated, some cases of clinical inefficacy during treatment with generic formulations have been reported. Here, we describe a woman with onychomycosis who developed a skin rash during treatment with a generic formulation of itraconazole. Drug administration and its re-challenge confirmed the association between itraconazole and the skin rash. Both the Naranjo probability scale and the World Health Organization causality assessment scale documented a probable association between generic itraconazole and the skin rash. Switching from the generic formulation to the branded one induced an improvement of symptoms. Since we were unable to evaluate the role of each excipient in the development of the skin rash, we cannot rule out their involvement. However, more data are necessary to better define the similarities and differences between branded and generic formulations.

  29. Skin rash during treatment with generic itraconazole

    PubMed Central

    De Vuono, Antonio; Palleria, Caterina; Scicchitano, Francesca; Squillace, Aida; De Sarro, Giovambattista; Gallelli, Luca

    2014-01-01

    Generic drugs have the same active substance, the same pharmaceutical form, and the same therapeutic indications as, and a similar bioequivalence to, the reference (branded) medicinal product. Although similar efficacy is postulated, some cases of clinical inefficacy during treatment with generic formulations have been reported. Here, we describe a woman with onychomycosis who developed a skin rash during treatment with a generic formulation of itraconazole. Drug administration and its re-challenge confirmed the association between itraconazole and the skin rash. Both the Naranjo probability scale and the World Health Organization causality assessment scale documented a probable association between generic itraconazole and the skin rash. Switching from the generic formulation to the branded one induced an improvement of symptoms. Since we were unable to evaluate the role of each excipient in the development of the skin rash, we cannot rule out their involvement. However, more data are necessary to better define the similarities and differences between branded and generic formulations. PMID:24799820

  30. The two-parametric scaling and new temporal asymptotic of survival probability of diffusing particle in the medium with traps.

    PubMed

    Arkhincheev, V E

    2017-03-01

    A new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by directly solving the diffusion equation in the field. It is shown that at long times this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.

  31. The two-parametric scaling and new temporal asymptotic of survival probability of diffusing particle in the medium with traps

    NASA Astrophysics Data System (ADS)

    Arkhincheev, V. E.

    2017-03-01

    A new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by directly solving the diffusion equation in the field. It is shown that at long times this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.

  32. Effects of scale of movement, detection probability, and true population density on common methods of estimating population density

    DOE PAGES

    Keiter, David A.; Davis, Amy J.; Rhodes, Olin E.; ...

    2017-08-25

    Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. For this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.

  33. Effects of scale of movement, detection probability, and true population density on common methods of estimating population density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, David A.; Davis, Amy J.; Rhodes, Olin E.

    Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. For this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.

  34. Development and Inter-Rater Reliability of the Liverpool Adverse Drug Reaction Causality Assessment Tool

    PubMed Central

    Gallagher, Ruairi M.; Kirkham, Jamie J.; Mason, Jennifer R.; Bird, Kim A.; Williamson, Paula R.; Nunn, Anthony J.; Turner, Mark A.; Smyth, Rosalind L.; Pirmohamed, Munir

    2011-01-01

    Aim: To develop and test a new adverse drug reaction (ADR) causality assessment tool (CAT). Methods: Seven assessors compared a new CAT, formulated by an expert focus group, with the Naranjo CAT in 80 cases from a prospective observational study and 37 published ADR case reports (819 causality assessments in total). Main outcome measures: Utilisation of causality categories, measure of disagreements, and inter-rater reliability (IRR). Results: The Liverpool ADR CAT, using 40 cases from an observational study, assigned causality categories of 1 unlikely, 62 possible, 92 probable and 125 definite (1, 62, 92, 125) with 'moderate' IRR (kappa 0.48), compared to Naranjo (0, 100, 172, 8) with 'moderate' IRR (kappa 0.45). In a further 40 cases, the Liverpool tool (0, 66, 81, 133) showed 'good' IRR (kappa 0.6) while Naranjo (1, 90, 185, 4) remained 'moderate'. Conclusion: The Liverpool tool assigns the full range of causality categories and shows good IRR. Further assessment by different investigators in different settings is needed to fully assess the utility of this tool. PMID:22194808

  35. Assessment of private security guards by Suicide Probability Scale and Brief Symptom Inventory.

    PubMed

    Dogan, Bulent; Canturk, Gurol; Canturk, Nergis; Guney, Sevgi; Özcan, Ebru

    2016-01-01

    The aim of the present study was to investigate suicide probability and relevant sociodemographic features, and to provide information for preventing suicide, in private security guards, who work under stressful conditions with continuous exposure to negative and traumatic life events. 200 private security guards and 200 staff members of Ankara University participated in the study. A sociodemographic information questionnaire, the Suicide Probability Scale (SPS) and the Brief Symptom Inventory (BSI) were used to collect the data. Gender, marital status, income, religious beliefs, experiencing a life-threatening situation, history of a suicide attempt, smoking, and not having a chronic disease caused statistically significant differences in SPS scores between the private security guards and the controls. Moreover, there was a statistically significant positive correlation between the total scores of the SPS subscales and the total scores of the BSI. Like police officers and gendarmes, private security guards are at high risk of attempting and committing suicide because they work in stressful settings and suffer from secondary trauma. They should be made aware of their suicide risk and receive regular psychiatric screening.

  36. [Acute psychotic episode and rhabdomyolysis after lovastatin ingestion].

    PubMed

    Caamaño, Beatriz H; Díaz, Jairo M González; Bracho, Daniel Guerrero; Herrera, Harold; Samur, Manuel Castro

    2012-09-01

    Statins are the most prescribed drugs worldwide, given the benefit and safety they offer. However, they can cause severe neurological, gastrointestinal, renal and muscular side effects. Objective: to describe the clinical course of a female patient with an adverse drug reaction to lovastatin. Methods: case report and literature review. Results: a 52-year-old woman developed sudden psychosis and rhabdomyolysis secondary to lovastatin, which resolved after the drug was discontinued. The causal relationship was corroborated with a score of 6 (probable ADR) on the Naranjo scale. The simultaneous manifestation of psychosis and rhabdomyolysis represents an atypical and unique case following lovastatin ingestion. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  37. Loss of Eyebrows and Eyelashes During Concomitant Treatment with Sitagliptin and Metformin.

    PubMed

    Succurro, Elena; Palleria, Caterina; Ruffo, Mariafrancesca; Serra, Raffaele; Arturi, Franco; Gallelli, Luca

    2017-01-01

    The fixed-dose combination of sitagliptin 50 mg and metformin 850 mg (Janumet®) is indicated for the treatment of type 2 diabetes mellitus, in addition to diet and exercise, to improve glycemic control in patients treated with metformin alone. We report a 69-year-old man with type 2 diabetes who developed sudden loss of eyebrows and eyelashes about 4 months after starting Janumet®. Clinical and laboratory findings excluded systemic or skin diseases able to induce these manifestations, while the Naranjo probability scale documented a possible association between the drug and the adverse drug reaction. Copyright© Bentham Science Publishers.

  38. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. With this aim, the present paper focuses on Weibull mixtures to characterize the probability density function (PDF) of aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values, and wind power curtailment.
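
    Given each candidate mixture's maximized log-likelihood, the two selection criteria reduce to one-line formulas. A sketch with hypothetical log-likelihood values; note that a k-component Weibull mixture has 3k - 1 free parameters (a weight, scale, and shape per component, with weights summing to one):

      # AIC/BIC model selection for candidate Weibull mixtures. The
      # log-likelihood values below are hypothetical, not from the paper.
      import math

      n_obs = 8760  # e.g., one year of hourly wind-power data (assumed)
      loglik = {1: -12950.0, 2: -12620.0, 3: -12588.0, 4: -12580.0}

      for k, ll in loglik.items():
          n_par = 3 * k - 1
          aic = 2 * n_par - 2 * ll
          bic = n_par * math.log(n_obs) - 2 * ll
          print(f"{k} components: AIC = {aic:.1f}, BIC = {bic:.1f}")
      # AIC tolerates extra components more readily than BIC, whose
      # log(n) penalty typically selects the more parsimonious mixture.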

  39. Void probability as a function of the void's shape and scale-invariant models [in studies of spatial galactic distribution]

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  40. Amisulpride and symptomatic bradycardia: a case report.

    PubMed

    Huang, Li-Chung; Huang, Li-Yen; Tseng, Shih-Yen; Hou, Yuh-Ming; Hsiao, Cheng-Cheng

    2015-01-01

    Amisulpride is a second-generation antipsychotic agent indicated for the treatment of schizophrenia and other major psychotic illnesses. Amisulpride-induced bradycardia is a rare condition of unknown etiology and mechanism; asymptomatic bradycardia has been associated with amisulpride in only two previous cases. In our case, the association was rated as "probable" on the Naranjo adverse drug reaction probability scale. We report the case of a 45-year-old male patient who developed symptomatic bradycardia during use of amisulpride (400-800 mg/day), which dramatically improved after complete discontinuation of amisulpride. His psychiatric condition remained relatively stable, without bradycardia, after administration of another antipsychotic agent, risperidone (3 mg/day). This is the first case report of symptomatic bradycardia associated with the use of amisulpride. Although bradycardia is a rare adverse reaction to antipsychotics, this finding may alert psychiatrists and physicians to this side effect. Further study is needed to clarify the role of antipsychotics in causing symptomatic bradycardia. Copyright © 2015 Elsevier Inc. All rights reserved.

  41. A metoprolol-terbinafine combination induced bradycardia.

    PubMed

    Bebawi, Emmanuel; Jouni, Suhail S; Tessier, Andrée-Anne; Frenette, Anne Julie; Brindamour, Dave; Doré, Maxime

    2015-09-01

    To report a case of sinus bradycardia induced by a metoprolol-terbinafine drug-drug interaction, and its management. A 63-year-old Caucasian man on metoprolol 200 mg/day for stable coronary artery disease was prescribed a 90-day course of oral terbinafine 250 mg/day for onychomycosis. On the 49th day of terbinafine therapy, he was brought to the emergency room for a decline in his general health status, confusion, and falls. The electrocardiogram revealed a 37 beats/min sinus bradycardia. A score of 7 on the Naranjo adverse drug reaction probability scale indicates a probable relationship between the patient's sinus bradycardia and the interaction between metoprolol and terbinafine. The heart rate first improved with a decrease in the dose of metoprolol; metoprolol was subsequently changed to bisoprolol and the heart rate remained normal. By inhibiting cytochrome P450 2D6, terbinafine decreased metoprolol's clearance, leading to metoprolol accumulation, which resulted in clinically significant sinus bradycardia.

  2. Nightmare and Abnormal Dreams: Rare Side Effects of Metformin?

    PubMed Central

    Yanto, Theo Audi; Kosasih, Felicia Nathania

    2018-01-01

    Background Metformin is widely known as an antidiabetic agent with significant gastrointestinal side effects, but nightmares and abnormal dreams as adverse reactions to it are not well reported. Case Presentation Herein we present the case of a 56-year-old male patient with no known history of recurrent nightmares or sleep disorder, who experienced a nightmare and abnormal dreams directly after consumption of 750 mg extended-release metformin. He reported his dream as an unpleasant experience that awakened him at night with negative feelings. The nightmare lasted only one night, but his dreams every night thereafter seemed abnormal. The dreams were vivid and indescribable. The abnormal dreams disappeared soon after the drug was discontinued and recurred on rechallenge. The case was assessed using the Naranjo Adverse Drug Reaction (ADR) probability scale and the causality was rated as probable. Conclusion Metformin might be the underlying cause of the nightmare and abnormal dreams in this patient. More studies are needed to confirm the association and causality of these findings. PMID:29581904

  3. Azithromycin-induced rash in a patient of infectious mononucleosis - a case report with review of literature.

    PubMed

    Banerjee, Indranil; Mondal, Somnath; Sen, Sukanta; Tripathi, Santanu Kumar; Banerjee, Gautam

    2014-08-01

    Antibiotic-induced skin rash in the setting of infectious mononucleosis is often encountered in clinical practice. However, macrolides like azithromycin are considered relatively safe, and to date only two cases of azithromycin-induced rash in the setting of infectious mononucleosis have been reported. The following report illustrates the case of a 23-year-old man suffering from infectious mononucleosis who exhibited a generalized cutaneous rash following treatment with azithromycin. Using the Naranjo ADR probability scale, this case of acute-onset rash following azithromycin administration was found to be in the 'probable' category. The mechanism of antibiotic-induced rash in patients suffering from infectious mononucleosis is incompletely understood. It has been suggested that the rash could result from virus-mediated immunomodulation or from altered drug metabolism. The report calls for cautious use of antibiotics in the setting of suspected viral infections like infectious mononucleosis, as injudicious use might increase the risk of deleterious skin reactions and increase the cost of healthcare.

  4. High throughput nonparametric probability density estimation.

    PubMed

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
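
    The published estimator is not reproduced here, but the order-statistics idea it builds on is easy to demonstrate: under a correct trial CDF F, the sorted transformed values u(k) = F(x(k)) follow Beta(k, n+1-k), so standardized ("scaled quantile") residuals stay small, while a misspecified F produces the atypical fluctuations the scoring function is designed to flag. A Python sketch under those assumptions:

        import numpy as np
        from scipy.stats import norm

        def scaled_quantile_residuals(x, trial_cdf):
            # sorted u = F(x) should behave like Beta(k, n+1-k) order statistics
            u = np.sort(trial_cdf(x))
            n = len(u)
            k = np.arange(1, n + 1)
            mean = k / (n + 1.0)
            var = k * (n + 1.0 - k) / ((n + 1.0) ** 2 * (n + 2.0))
            return (u - mean) / np.sqrt(var)

        rng = np.random.default_rng(1)
        x = rng.normal(size=500)
        good = scaled_quantile_residuals(x, norm.cdf)                       # correct F
        bad = scaled_quantile_residuals(x, lambda t: norm.cdf(t, scale=2))  # wrong scale
        print(np.abs(good).max(), np.abs(bad).max())  # misspecified F scores far worse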

  5. High throughput nonparametric probability density estimation

    PubMed Central

    Farmer, Jenny

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference. PMID:29750803

  6. Defining Probability in Sex Offender Risk Assessment.

    PubMed

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  7. Ginkgo biloba induced mood dysregulation: a case report.

    PubMed

    Rho, Seung Sun; Woo, Young Sup; Bahk, Won-Myong

    2018-01-15

    Impairment of cognitive function, as well as negative symptoms, is a major factor in the decline of patients' functioning in the chronic stages of schizophrenia. However, there are as yet no definite treatment options that can effectively reduce this impairment. We report a case of mood dysregulation associated with the use of Ginkgo biloba in a patient with schizophrenia. After Ginkgo biloba was given, the patient experienced a cluster of mood dysregulation symptoms including irritability, difficulty controlling anger, agitation, and restlessness. We rated the possibility as "probable" according to the Naranjo scale, considering the circumstantial evidence. This case suggests that Ginkgo biloba may have caused mood dysregulation in this patient. Although it is generally accepted as safe, more attention should be given to this adverse effect when treating patients with Ginkgo biloba.

  8. Life-threatening complications of ibogaine: three case reports.

    PubMed

    Paling, F P; Andrews, L M; Valk, G D; Blom, H J

    2012-11-01

    Ibogaine is a naturally occurring psychoactive alkaloid extracted from the roots of the Tabernanthe iboga plant, which in alternative medicine is used to treat drug dependency. However, this emerging therapy, widely advocated online, can be dangerous due to its potentially lethal adverse effects. We present three cases in which toxic side effects were noted. We used the Naranjo scale to estimate the probability of a causal relationship between these effects and ibogaine. Findings in these three cases are suggestive of a causal relationship between the use of ibogaine and serious respiratory and cardiac problems (including lengthening of the QT interval). In our opinion it is of great importance that clinicians are aware of these potentially serious side effects and realise that widespread online marketing practices will give many more people access to ibogaine.

  9. Study of adverse drug reactions in patients with diabetes attending a tertiary care hospital in New Delhi, India.

    PubMed

    Singh, Abhishank; Dwivedi, Shridhar

    2017-02-01

    The present prospective observational study was carried out in a tertiary care hospital in New Delhi, India from May 2014 to June 2015 to report adverse drug reactions (ADRs) in patients with type 2 diabetes mellitus (T2DM) using antidiabetic drugs. A total of 220 patients (121 males, 99 females) were enrolled. ADRs were recorded on the prescribed form. Causality and severity assessments were done using Naranjo's probability scale and the modified Hartwig and Siegel severity scale, respectively. Commonly prescribed drugs were biguanides, peptide hormone, and sulphonylureas. A total of 26 ADRs were recorded (16 in males and 10 in females), most commonly related to the endocrine and gastrointestinal systems. Severity assessment showed seven (26.9%) ADRs as moderate and 19 (73.1%) as mild; no severe reactions were observed. More information on prescribed drugs and their side effects is required for ensuring patient safety.

  10. Concise calculation of the scaling function, exponents, and probability functional of the Edwards-Wilkinson equation with correlated noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Y.; Pang, N.; Halpin-Healy, T.

    1994-12-01

    The linear Langevin equation proposed by Edwards and Wilkinson [Proc. R. Soc. London A 381, 17 (1982)] is solved in closed form for noise of arbitrary space and time correlation. Furthermore, the temporal development of the full probability functional describing the height fluctuations is derived exactly, exhibiting an interesting evolution between two distinct Gaussian forms. We determine explicitly the dynamic scaling function for the interfacial width for any given initial condition, isolate the early-time behavior, and discover an invariance that was unsuspected in this problem of arbitrary spatiotemporal noise.

  11. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
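
    A hedged Monte Carlo sketch of the central claim (not the article's own calculation): if the threshold is set from estimated normal parameters at a nominal exceedance probability, the realized failure frequency comes out above nominal, and the excess does not depend on the true location or scale. All numbers below are illustrative assumptions.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        eps, n, reps = 0.01, 30, 20000     # nominal failure probability, sample size, trials
        z = norm.ppf(1 - eps)
        failures = 0
        for _ in range(reps):
            sample = rng.normal(loc=3.0, scale=1.5, size=n)  # true parameters are arbitrary
            threshold = sample.mean() + z * sample.std(ddof=1)
            failures += rng.normal(loc=3.0, scale=1.5) > threshold
        print(f"nominal: {eps}, realized: {failures / reps:.4f}")  # realized > nominal

    Changing loc and scale leaves the realized frequency unchanged, since the studentized exceedance event is pivotal in a location-scale family; only the sample size n shrinks the gap to nominal.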

  12. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  13. Rivaroxaban-induced chest wall spontaneous expanding hematoma.

    PubMed

    Salemis, Nikolaos S

    2017-03-22

    Rivaroxaban is an oral direct Factor Xa inhibitor approved in the European Union and the United States for the single-drug treatment of several thromboembolic diseases in adults. It has been evaluated in large phase III clinical trials and has been found to have efficacy and safety similar to standard therapy. Herein is described a very rare case of a rivaroxaban-induced spontaneous expanding chest wall hematoma, requiring surgical intervention, in a breast cancer patient. Use of the Naranjo adverse drug reaction probability scale indicated a probable relationship (score of 7) between the patient's development of hematoma and treatment with rivaroxaban. Physicians should be cautious when prescribing rivaroxaban to groups of patients with increased bleeding risk, such as patients with impaired renal or hepatic function, hypertension, coronary heart disease, heart failure, or certain types of cancer, and patients receiving concomitant medications that may alter the pharmacokinetic or pharmacodynamic parameters of rivaroxaban. Anticoagulant treatment should be tailored to each individual patient, weighing the bleeding risk against the risk of recurrent thrombosis.

  14. Hepatotoxicity induced by methimazole in a previously healthy patient.

    PubMed

    Gallelli, Luca; Staltari, Orietta; Palleria, Caterina; De Sarro, Giovambattista; Ferraro, Maria

    2009-09-01

    We report a case of hepatotoxicity induced by methimazole treatment in a patient with hyperthyroidism. A 54-year-old man presented with palpitations, excessive sweating, weakness, heat intolerance, and weight loss. On physical examination, his blood pressure was 140/90 mmHg and his heart rate was a regular 100 beats/min. He had mild tremors and left exophthalmos. Laboratory tests revealed a significant increase in serum thyroid hormone levels with a decrease in thyroid stimulating hormone levels. A diagnosis of hyperthyroidism was made and he began treatment with methimazole (30 mg/day). Fourteen days later, he returned with scleral icterus, followed by dark urine and abdominal pain in the right upper quadrant. Laboratory examinations and liver biopsy led to a diagnosis of cholestatic hepatitis secondary to methimazole use. Methimazole was promptly withdrawn and cholestyramine, ursodeoxycholic acid, and chlorpheniramine were given. After five days, the abdominal pain resolved and laboratory parameters returned to normal. The Naranjo probability scale indicated a probable relationship between the hepatotoxicity and methimazole therapy. In conclusion, physicians should be aware of the risk of hepatotoxicity related to methimazole.

  15. Clozapine-induced systemic lupus erythematosus.

    PubMed

    Rami, Abu Fanne; Barkan, Daniel; Mevorach, Dror; Leitersdorf, Eran; Caraco, Yoseph

    2006-05-01

    To report a case of classic clozapine-induced systemic lupus erythematosus that also developed on rechallenge. A 32-year-old white woman diagnosed with schizophrenia presented in 1996 with clinical characteristics and laboratory markers consistent with drug-induced lupus (DIL). Clozapine, started 1 year prior, was withdrawn, with complete biological and clinical remission within 3 months. In 2004, 1 week after rechallenge with clozapine for uncontrolled schizophrenia, the patient developed clinical and biological signs and symptoms consistent with the diagnosis of DIL. Again, discontinuation of clozapine was followed by full remission within 2-3 months. DIL was first described more than 50 years ago, with multiple drugs implicated in the causation. Clozapine-induced lupus was reported recently, but does not meet the usual criteria for a diagnosis of DIL. We report a classic case of clozapine-induced lupus that, according to the Naranjo probability scale, demonstrates a highly probable relationship between DIL and clozapine. DIL demands a high index of suspicion for diagnosis. Although clozapine has an extensive safety profile, DIL must be considered as one of its serious adverse effects.

  16. Nonconvulsive status epilepticus due to ifosfamide.

    PubMed

    Kilickap, Saadettin; Cakar, Mustafa; Onal, Ibrahim K; Tufan, Abdurrahman; Akoglu, Hadim; Aksoy, Sercan; Erman, Mustafa; Tekuzman, Gulten

    2006-02-01

    To report 2 cases of nonconvulsive status epilepticus (NCSE) following infusion of ifosfamide. Two patients who received ifosfamide-containing chemotherapy developed NCSE. One woman received ifosfamide 1000 mg/m2 (1 h infusion on days 1-5); confusion, lethargy, and speech deterioration developed on day 3. The second patient developed similar symptoms on day 3 of treatment with 2500 mg/m2. Both patients responded to intravenous administration of diazepam 10 mg and were given levetiracetam as maintenance therapy. The severity and presentation of central nervous system toxicity due to ifosfamide varies greatly and involves a spectrum ranging from subclinical electroencephalogram changes to coma. NCSE, an epileptic disorder in which typical convulsive activity is absent, has previously been reported in only 4 patients receiving ifosfamide. Levetiracetam may be used for maintenance antiepileptic therapy after diazepam administration. Among the many presentations of ifosfamide neurotoxicity, clinicians should consider NCSE as a possible explanation for changes in consciousness in a patient receiving this agent. An objective causality assessment by use of the Naranjo probability scale revealed that NCSE due to ifosfamide was probable.

  17. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies

    PubMed Central

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A.; Zaykin, Dmitri V.

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to be deemed significant. However, a small P-value is not equivalent to small chances of a spurious finding and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of POFIG method via analysis of GWAS associations with Crohn's disease. PMID:25955023

  18. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.

    PubMed

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to be deemed significant. However, a small P-value is not equivalent to small chances of a spurious finding and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of POFIG method via analysis of GWAS associations with Crohn's disease.
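
    The POFIG formula itself is given in the paper and is not reproduced here. For comparison, the FPRP discussed above has a widely published closed form, FPRP = alpha(1 - pi) / (alpha(1 - pi) + (1 - beta)pi), where pi is the prior probability of a true association, alpha the significance level, and 1 - beta the power. A minimal Python sketch with illustrative numbers:

        def fprp(alpha, power, prior):
            """False Positive Report Probability: P(no true association | significant)."""
            return alpha * (1 - prior) / (alpha * (1 - prior) + power * prior)

        # illustrative GWAS-style numbers: stringent alpha, low prior odds of association
        print(f"{fprp(alpha=5e-8, power=0.8, prior=1e-4):.6f}")

    Note this is a property of the whole rejection region below alpha, which is exactly the limitation of FPRP that POFIG is designed to overcome.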

  19. Failure probability analysis of optical grid

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems become widely applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing application failure probability can be compared, so that the different failure-probability requirements of different clients can be satisfied. When an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the failure probability requirement and improve network resource utilization, achieving a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
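
    The MDSA scheduling algorithm is not reproduced here; the sketch below only illustrates the task-based quantification step under an independence assumption: an application that needs every task to succeed fails with probability 1 - Π(1 - p_i), and backing a task up on an independent resource replaces its failure probability p with p². The task probabilities are hypothetical.

        from math import prod

        def app_failure_prob(task_fail, backed_up=frozenset()):
            # independence assumed; a backed-up task fails only if both copies fail
            eff = [p * p if i in backed_up else p for i, p in enumerate(task_fail)]
            return 1.0 - prod(1.0 - p for p in eff)

        tasks = [0.02, 0.05, 0.01, 0.03]
        print(f"no backup:        {app_failure_prob(tasks):.4f}")
        print(f"task 1 backed up: {app_failure_prob(tasks, {1}):.4f}")

    Backing up the riskiest task gives the largest reduction per unit of spare resource, which is the kind of trade-off a multi-objective scheduler must balance against completion time and utilization.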

  20. Small scale photo probability sampling and vegetation classification in southeast Arizona as an ecological base for resource inventory. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Johnson, J. R. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The broad scale vegetation classification was developed for a 3,200 sq mile area in southeastern Arizona. The 31 vegetation types were derived from association tables which contained information taken at about 500 ground sites. The classification provided an information base that was suitable for use with small scale photography. A procedure was developed and tested for objectively comparing photo images. The procedure consisted of two parts, image groupability testing and image complexity testing. The Apollo and ERTS photos were compared for relative suitability as first stage stratification bases in two stage proportional probability sampling. High altitude photography was used in common at the second stage.

  1. Evaluation of adverse reactions to contrast media in the hospital

    PubMed Central

    Ryu, J-H; Kim, E-Y

    2013-01-01

    Objective: To determine and analyse the characteristics of contrast media adverse reactions (CM-ARs) reported in a hospital. Methods: A retrospective review of CM-ARs from the electronic spontaneous adverse drug reaction (ADR) report system between January 2011 and August 2012 was conducted. CM-ARs were evaluated in terms of causality, severity, preventability and affected organs. Also, agreement and correlation among the tools used to evaluate CM-ARs were analysed. Results: The overall reaction rate was 1.5% (n = 286). In total, 269 CM-ARs were identified. For ADR causality, 96.7% (n = 260) and 98.5% (n = 265) were evaluated as “probable” ADR using the Naranjo probability scale and the World Health Organization–Uppsala Monitoring Centre causality categories, whereas 98.1% (n = 264) were evaluated as “certain” with Korean algorithm v. II. Of these, 91.4% (n = 246) were mild in severity and 96.7% (n = 260) were unpreventable. Most patients (n = 233, 86.7%) could be managed with observation and/or simple treatment. The most frequent reaction (n = 383, 79.5%) was dermatological. Spearman's correlation coefficient was 0.667 (p < 0.01), and the agreement was 98.1% between the Naranjo scale and the World Health Organization–Uppsala Monitoring Centre categories. No relationship was seen between CM-AR severity and gender or between in- and outpatients. Conclusion: In our study, most CM-ARs were mild and managed with simple treatment. However, as the number of patients undergoing CT procedures continues to increase, it is essential to identify and observe patients at risk for CM-ARs to prevent severe ADRs. Advances in knowledge: Continuous careful review of reporting and treatment protocols of CM-ARs is needed to prevent morbidity and mortality. PMID:24191123
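
    A hedged sketch of the between-tool comparison reported above, computing percentage agreement and Spearman correlation on ordinal causality ratings; the ratings below are hypothetical, not the study data.

        import numpy as np
        from scipy.stats import spearmanr

        # hypothetical ordinal causality ratings (0=doubtful ... 3=definite/certain)
        naranjo = np.array([2, 2, 3, 2, 1, 2, 2, 3, 2, 2])
        who_umc = np.array([2, 2, 3, 2, 2, 2, 2, 3, 2, 2])

        agreement = 100.0 * np.mean(naranjo == who_umc)
        rho, p = spearmanr(naranjo, who_umc)
        print(f"agreement = {agreement:.1f}%, Spearman rho = {rho:.2f} (p = {p:.3g})")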

  2. Nonparametric probability density estimation by optimization theoretic techniques

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1976-01-01

    Two nonparametric probability density estimators are considered. The first is the kernel estimator. The problem of choosing the kernel scaling factor based solely on a random sample is addressed. An interactive mode is discussed and an algorithm proposed to choose the scaling factor automatically. The second nonparametric probability estimate uses penalty function techniques with the maximum likelihood criterion. A discrete maximum penalized likelihood estimator is proposed and is shown to be consistent in the mean square error. A numerical implementation technique for the discrete solution is discussed and examples displayed. An extensive simulation study compares the integrated mean square error of the discrete and kernel estimators. The robustness of the discrete estimator is demonstrated graphically.
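
    The paper's interactive algorithm is not reproduced here; a standard automatic way to choose the kernel scaling factor from the sample alone, in the same spirit, is leave-one-out log-likelihood cross-validation. A Python sketch under that substitution:

        import numpy as np

        def loo_log_likelihood(x, h):
            # leave-one-out log-likelihood of a Gaussian kernel estimate with bandwidth h
            n = len(x)
            d = (x[:, None] - x[None, :]) / h
            K = np.exp(-0.5 * d ** 2) / np.sqrt(2 * np.pi)
            np.fill_diagonal(K, 0.0)               # drop each point's own contribution
            f_loo = K.sum(axis=1) / ((n - 1) * h)
            return np.log(f_loo + 1e-300).sum()

        rng = np.random.default_rng(3)
        x = rng.normal(size=300)
        grid = np.linspace(0.05, 1.0, 40)
        best = max(grid, key=lambda h: loo_log_likelihood(x, h))
        print(f"selected bandwidth: {best:.3f}")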

  3. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
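
    A hedged simulation of the descriptive-estimation side of this model: if each retrieved instance is read with a flip probability d, the expected frequentist estimate of a true probability p is (1 - 2d)p + d, regressing estimates toward the center of the scale. The parameterization and values below are assumptions for illustration, not the authors' fitted model.

        import numpy as np

        rng = np.random.default_rng(4)

        def mean_noisy_estimate(p, d, n_samples=1000, n_people=2000):
            # each remembered instance is read with flip probability d (XOR of two Bernoullis)
            draws = rng.random((n_people, n_samples)) < p
            flips = rng.random((n_people, n_samples)) < d
            return np.mean(draws ^ flips)

        d = 0.15
        for p in (0.1, 0.5, 0.9):
            sim = mean_noisy_estimate(p, d)
            print(f"p={p}: simulated {sim:.3f}, predicted {(1 - 2 * d) * p + d:.3f}")

    Low probabilities are overestimated and high probabilities underestimated, exactly the regressive pattern the abstract describes for descriptive estimation.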

  4. A rare case of methimazole-induced cholestatic jaundice in an elderly man of Asian ethnicity with hyperthyroidism: A case report.

    PubMed

    Ji, Hongjian; Yue, Feng; Song, Jianxiang; Zhou, Xiaohua

    2017-12-01

    Methimazole is an antithyroid drug that is widely used for the treatment of hyperthyroidism. As an inhibitor of the enzyme thyroperoxidase, methimazole is generally well tolerated. However, there have been increasing reports of methimazole-induced liver damage, although assessment of this effect has been limited by the absence of objective diagnosis of the liver condition or by inappropriate use of the Naranjo scale. We present the case of an elderly man with hyperthyroidism, gastritis, and epilepsy who developed liver damage after administration of multiple drugs. Considering the low sensitivity of the Naranjo scale in detecting rare reactions associated with liver damage, we used the Roussel-Uclaf Causality Assessment Method scale, with a finding of methimazole-induced cholestatic jaundice. The patient's liver enzyme levels improved after discontinuation of methimazole. Our case underlines the possible hepatotoxicity associated with the use of methimazole. A review of the literature confirmed a selective hepatotoxicity risk in individuals of Asian ethnicity, which has not been identified in Caucasian or Black populations. Physicians should be aware of the risk of hepatotoxicity when prescribing oral methimazole to patients of Asian ethnicity.

  5. Origin of probabilities and their application to the multiverse

    NASA Astrophysics Data System (ADS)

    Albrecht, Andreas; Phillips, Daniel

    2014-12-01

    We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of purely classical probabilities to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation.

  6. Herbal hepatotoxicity: Challenges and pitfalls of causality assessment methods

    PubMed Central

    Teschke, Rolf; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2013-01-01

    The diagnosis of herbal hepatotoxicity or herb-induced liver injury (HILI) represents a particular clinical and regulatory challenge, with major pitfalls for the causality evaluation. On the day HILI is suspected in a patient, physicians should start assessing the quality of the herbal product used, optimizing the clinical data for completeness, and applying the Council for International Organizations of Medical Sciences (CIOMS) scale for initial causality assessment. This scale is structured, quantitative, liver specific, and validated for hepatotoxicity cases. Its items provide individual scores, which together yield causality levels of highly probable, probable, possible, unlikely, and excluded. Once completed with additional information, including raw data, this scale with all its items should be reported to regulatory agencies and manufacturers for further evaluation. The CIOMS scale is preferred as the tool for assessing causality in hepatotoxicity cases, compared to numerous other causality assessment methods, which are inferior on various grounds. Among these disputed methods are the Maria and Victorino scale, an insufficiently qualified, shortened version of the CIOMS scale, as well as various liver-unspecific methods such as the ad hoc causality approach, the Naranjo scale, the World Health Organization (WHO) method, and the Karch and Lasagna method. An expert panel is required for the Drug Induced Liver Injury Network method, the WHO method, and other approaches based on expert opinion, which provide retrospective analyses with a long delay and thereby prevent a timely assessment of the illness in question by the physician. In conclusion, HILI causality assessment is challenging and is best achieved by the liver-specific CIOMS scale, avoiding pitfalls commonly observed with other approaches. PMID:23704820

  7. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia.

    PubMed

    Mavromoustakos, Elena; Clark, Gavin I; Rock, Adam J

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying-phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, the Probability Scale (measuring perceived probability of flying-negative, general-negative, and general-positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying-phobic group estimated the probability of flying-negative and general-negative events occurring as significantly higher than non-flying phobics did. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying-negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes, but the nature of this relationship remains to be determined.

  8. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia

    PubMed Central

    Mavromoustakos, Elena; Clark, Gavin I.; Rock, Adam J.

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying-phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, the Probability Scale (measuring perceived probability of flying-negative, general-negative, and general-positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying-phobic group estimated the probability of flying-negative and general-negative events occurring as significantly higher than non-flying phobics did. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying-negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes, but the nature of this relationship remains to be determined. PMID:27557054

  9. Negative probability of random multiplier in turbulence

    NASA Astrophysics Data System (ADS)

    Bai, Xuan; Su, Weidong

    2017-11-01

    The random multiplicative process (RMP), which has been studied for over 50 years, is a convenient phenomenological ansatz for the turbulence cascade. In the RMP, the fluctuation at a large scale is statistically mapped to the one at a small scale by the linear action of an independent random multiplier (RM). Simple as it is, the RMP is powerful, since all of the known scaling laws can be included in this model. So far as we know, however, a direct extraction of the probability density function (PDF) of the RM has been absent, because the deconvolution involved is ill-posed. Nevertheless, with progress in the study of inverse problems, the situation has changed. By using some new regularization techniques, we recover for the first time the PDFs of the RMs in some turbulent flows. All the consistent results from various methods point to a striking observation: the PDFs can attain negative values in some intervals, which can also be justified by some properties of infinitely divisible distributions. Despite the conceptual unconventionality, the present study illustrates the implications of negative probability in turbulence in several aspects, with emphasis on its role in describing the interaction between fluctuations at different scales. This work is supported by the NSFC (No. 11221062 and No. 11521091).

  10. Propylene Glycol-Related Delirium After Esmolol Infusion.

    PubMed

    Kapitein, Berber S; Biesmans, Renee S C G; van der Sijs, Heleen S I; de Wildt, Saskia S N

    2014-07-01

    Excipients used in oral or intravenous preparations may cause serious adverse events. We present the case of a 15-year-old boy with hypertrophic cardiomyopathy. In the pediatric intensive care unit, he received high doses of continuous intravenous esmolol (range = 20-400 µg/kg/min) for cardiac rhythm control. After a few days he developed a delirium that did not respond to high doses of antipsychotics or to discontinuation of benzodiazepines. We eventually realized that the IV esmolol formulation contained high doses of propylene glycol and ethanol, which may accumulate after prolonged infusion and cause intoxication. Intoxication with propylene glycol can cause neuropsychiatric symptoms. The boy's propylene glycol plasma concentration was approximately 4 g/L, whereas clinical symptoms arise at concentrations above 1 to 1.44 g/L. Application of the Naranjo adverse drug reaction probability scale suggested a probable relationship (score 6) between the propylene glycol infusion and the delirium. After discontinuation of esmolol, the delirium disappeared spontaneously. This is the first case describing excipient toxicity of esmolol, with an objective causality assessment revealing a probable relationship between the adverse event (delirium) and esmolol. Although excipient toxicity is a well-known adverse drug reaction, this case stresses the importance of easily available information for, and education of, physicians. © The Author(s) 2014.

  11. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  12. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    USGS Publications Warehouse

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

    Temporal and spatial variability in streams results in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions, with effort similar to that of more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data for variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.
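
    A hedged sketch of how a modeled capture probability converts catch into an absolute abundance estimate (the standard C/p adjustment); the logistic form matches the general approach described above, but the covariates and coefficients below are hypothetical, not the published estimates.

        import numpy as np

        def capture_probability(covariates, beta):
            # logit-linear capture probability; beta values here are hypothetical
            return 1.0 / (1.0 + np.exp(-(beta[0] + covariates @ beta[1:])))

        # hypothetical standardized reach covariates: discharge, depth, water clarity
        x = np.array([0.4, -1.1, 0.7])
        beta = np.array([-0.2, -0.5, -0.8, 0.3])
        p_hat = capture_probability(x, beta)
        catch = 37
        print(f"p = {p_hat:.2f}, absolute abundance estimate = {catch / p_hat:.0f}")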

  13. Multifractals embedded in short time series: An unbiased estimation of probability moment

    NASA Astrophysics Data System (ADS)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it can provide an unbiased estimation of probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools displays significant performance advantages over the other methods. The factorial-moment-based estimation can correctly evaluate scaling behaviors over a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima has unacceptable fluctuations. Besides the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order has various other uses, such as detecting nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.
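
    The continuous-order estimator is not reproduced here, but the integer-order case shows why factorial moments remove the bias: for multinomial counts n_i out of N, E[n_i(n_i - 1)] = N(N - 1)p_i², so Σ n_i(n_i - 1)/(N(N - 1)) is an unbiased estimate of the second probability moment Σ p_i², while plugging in p̂_i = n_i/N is biased upward. A Python check with assumed probabilities:

        import numpy as np

        rng = np.random.default_rng(5)
        p = np.array([0.5, 0.3, 0.2])
        true_m2 = np.sum(p ** 2)                    # second probability moment = 0.38

        N, naive, factorial = 200, [], []
        for _ in range(5000):
            n = rng.multinomial(N, p)
            naive.append(np.sum((n / N) ** 2))      # plug-in estimate, biased upward
            factorial.append(np.sum(n * (n - 1)) / (N * (N - 1)))  # unbiased
        print(true_m2, np.mean(naive), np.mean(factorial))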

  14. Landslide Hazard from Coupled Inherent and Dynamic Probabilities

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.; Nudurupati, S. S.

    2015-12-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We sought to unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach couples an empirical inherent landslide probability, based on a frequency ratio analysis, with a numerical dynamic probability, generated by combining subsurface water recharge and surface runoff from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model. Landslide hazard mapping is advanced by combining static and dynamic models of stability into a probabilistic measure of geohazard prediction in both space and time. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex in northern Washington State.

  15. Faster computation of exact RNA shape probabilities.

    PubMed

    Janssen, Stefan; Giegerich, Robert

    2010-03-01

    Abstract shape analysis allows efficient computation of a representative sample of low-energy foldings of an RNA molecule. More comprehensive information is obtained by computing shape probabilities, accumulating the Boltzmann probabilities of all structures within each abstract shape. Such information is superior to free energies because it is independent of sequence length and base composition. However, up to this point, computation of shape probabilities has evaluated all shapes simultaneously and comes at a computational cost that is exponential in the length of the sequence. We devise an approach called RapidShapes that computes the shapes above a specified probability threshold T by generating a list of promising shapes and constructing specialized folding programs for each shape to compute its share of Boltzmann probability. This aims at a heuristic improvement of runtime, while still computing exact probability values. Evaluating this approach and several substrategies, we find that only a small proportion of shapes have to be actually computed. For an RNA sequence of length 400, this leads, depending on the threshold, to a 10- to 138-fold speed-up compared with the previous complete method. Thus, probabilistic shape analysis has become feasible in medium-scale applications, such as the screening of RNA transcripts in a bacterial genome. RapidShapes is available via http://bibiserv.cebitec.uni-bielefeld.de/rnashapes

  16. Ertapenem-associated neurotoxicity in the Spinal Cord Injury (SCI) population: a case series.

    PubMed

    Patel, Ursula C; Fowler, Mallory A

    2017-09-06

    Context: Ertapenem, a broad-spectrum carbapenem antibiotic, is often used in Spinal Cord Injury (SCI) patients due to increased risk factors for multi-drug resistant (MDR) infections in this population. Neurotoxicity, specifically seizures, due to ertapenem is a known adverse effect and has been described previously. Other manifestations such as delirium and visual hallucinations have rarely been reported, and no literature, to the best of our knowledge, specifically describes these effects solely in the SCI population. Findings: Four cases of mental status changes and hallucinations in SCI patients attributed to ertapenem therapy are described. Onset of symptoms began between one and six days following initiation of ertapenem and resolved between two and 42 days following discontinuation. Based on the Naranjo probability scale, a probable relationship exists between the adverse events and ertapenem for three of the four cases. Possible overestimation of renal function and hypoalbuminemia may be contributing factors to the noted adverse reactions. Conclusion/Clinical Relevance: The cases described highlight the importance of recognizing ertapenem-associated hallucinations in SCI patients. This population is particularly vulnerable due to risk factors for MDR infections necessitating ertapenem use, possible overestimation of renal function, and a high prevalence of hypoalbuminemia.

  17. Transient central diabetes insipidus induced by ketamine infusion.

    PubMed

    Hatab, Sarah Z; Singh, Arun; Felner, Eric I; Kamat, Pradip

    2014-12-01

    Report a case of central diabetes insipidus (DI) associated with ketamine infusion. A 2-year-old girl with long-chain 3-hydroxyacyl-CoA dehydrogenase deficiency and stable hypertrophic cardiomyopathy was admitted to the pediatric intensive care with pneumonia. She subsequently developed respiratory failure and required intubation. Continuous ketamine infusion was used for the sedation and facilitation of mechanical ventilation. Shortly after infusion of ketamine, the patient developed DI and responded appropriately to vasopressin. The Naranjo adverse drug reaction probability scale indicated a probable relationship between the development of central DI and ketamine. The most likely mechanism involves ketamine's antagonist action on N-methyl-d-aspartate receptors, resulting in inhibition of glutamate-stimulated arginine vasopressin release from the neurohypophysis. This is the second case report of ketamine-induced central DI and the only report in children. Clinicians who sedate children with continuous ketamine infusions should monitor patients for developing signs and symptoms of DI by measuring serum sodium and urine output prior to, during, and after ketamine infusion in order to make a timely diagnosis of this potentially serious complication. © The Author(s) 2014.

  18. Flushing and pruritus secondary to prescription fish oil ingestion in a patient with allergy to fish.

    PubMed

    Howard-Thompson, Amanda; Dutton, Anna; Hoover, Robert; Goodfred, Jennifer

    2014-12-01

    A brand of fish oil capsules contains omega-3 fatty acids obtained from several fish sources. Although the manufacturer calls for caution in patients with fish hypersensitivity, insufficient data is available to make a definitive recommendation regarding its use in this population. A patient with documented seafood allergy presented to the emergency department 4 days after the initiation of prescription brand name fish oil capsules complaining of chest tightness, shortness of breath, tingling of upper extremities, flushing, and pruritus that was minimally relieved by excessive nonprescription diphenhydramine administration. During subsequent follow-up, the patient reported that all symptoms had resolved within 5 days of discontinuing the medication and 3 days of disposing of her pillbox and all medications that had come in contact with the fish oil capsules. Due to the patient's allergic history, timing of onset/offset of the reaction, laboratory evidence, and the use of the Naranjo probability scale, prescription fish oil capsules were deemed the probable cause of this patient's pruritus and flushing of the face and trunk. Practitioners and patients should always ensure they have an updated list of allergies within the patient's medical record that includes medications as well as foods and food additives.

  19. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that motivated students gained from the probability workshop, in that their performance in the probability topic showed a positive improvement compared with before the workshop. In addition, there was a significant difference in students' performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  20. Azithromycin-Induced Rash in a Patient of Infectious Mononucleosis – A Case Report with Review of Literature

    PubMed Central

    Mondal, Somnath; Sen, Sukanta; Tripathi, Santanu Kumar; Banerjee, Gautam

    2014-01-01

    Antibiotic-induced skin rash in the setting of infectious mononucleosis is often encountered in clinical practice. However, macrolides like azithromycin are considered relatively safe, and to date only two cases of azithromycin-induced rash in the setting of infectious mononucleosis have been reported. The following report illustrates the case of a 23-year-old man suffering from infectious mononucleosis who exhibited a generalized cutaneous rash following treatment with azithromycin. Using the Naranjo ADR probability scale, this case of acute-onset rash following azithromycin administration was found to be in the 'probable' category. The mechanism of antibiotic-induced rash in patients suffering from infectious mononucleosis is incompletely understood. It has been suggested that the rash could result from virus-mediated immunomodulation or from altered drug metabolism. The report calls for cautious use of antibiotics in the setting of suspected viral infections like infectious mononucleosis, as injudicious use might increase the risk of deleterious skin reactions and increase the cost of healthcare. PMID:25302218

  1. Relationship between childhood trauma and suicide probability in obsessive-compulsive disorder.

    PubMed

    Ay, Rukiye; Erbay, Lale Gonenir

    2018-03-01

    The aim of this study was to assess the relationship between childhood trauma and the probability of suicide in obsessive-compulsive disorder (OCD). Sixty-seven patients diagnosed with OCD among those admitted to the Malatya Training and Research Hospital psychiatry outpatient clinic were included in the study. The research data were collected using the Yale-Brown Obsessive Compulsive Scale (YBOCS), the Beck Depression Scale (BDS) and Beck Anxiety Scale (BAS), the Childhood Trauma Questionnaire-28 (CTQ-28), and the Suicide Probability Scale (SPS). A CTQ score ≥ 35 was detected in 36 of the 67 patients included in the study. Aggressive (p = 0.003), sexual (p = 0.007), and religious (p = 0.023) obsessions and ritualistic (p < 0.001) compulsions were significantly higher in the group with CTQ ≥ 35. A mild correlation was detected between the SPS score and the CTQ scores; the correlation remained even when the effect of the BAS and BDS scores was excluded. In conclusion, childhood traumas were found to be associated with obsessive symptoms, and in the group with childhood trauma, increased suicide probability was detected independently of depression and anxiety. Copyright © 2017. Published by Elsevier B.V.

  2. Research on quantitative relationship between NIIRS and the probabilities of discrimination

    NASA Astrophysics Data System (ADS)

    Bai, Honggang

    2011-08-01

    There are a large number of electro-optical (EO) and infrared (IR) sensors used on military platforms, including ground vehicles, low-altitude air vehicles, high-altitude air vehicles, and satellite systems. Ground vehicle and low-altitude air vehicle (rotary- and fixed-wing aircraft) sensors typically use the probabilities of discrimination (detection, recognition, and identification) as design requirements and system performance indicators. High-altitude air vehicle and satellite sensors have traditionally used the National Imagery Interpretability Rating Scale (NIIRS) as the performance measure for guidance in design and as the measure of system performance. Recently, there has been a large effort to make strategic sensor information available to tactical forces and to make target acquisition information usable by strategic systems. In this paper, the two techniques for sensor design, the probabilities of discrimination and NIIRS, are presented separately. For typical infrared remote sensor design parameters, the probability of recognition and the NIIRS rating are given as functions of the range R for two targets of different size, the Standard NATO Target and the M1 Abrams, based on algorithms for predicting field performance and NIIRS. For four targets of different size (the Standard NATO Target, the M1 Abrams, the F-15, and the B-52), the conversion from NIIRS to the probabilities of discrimination is derived and calculated, and the similarities and differences between NIIRS and the probabilities of discrimination are analyzed on the basis of the calculations. Comparisons with preliminary calculation results show that conversion between NIIRS and the probabilities of discrimination is feasible, although more validation experiments are needed.

  3. The ranking probability approach and its usage in design and analysis of large-scale studies.

    PubMed

    Kuo, Chia-Ling; Zaykin, Dmitri

    2013-01-01

    In experiments with many statistical tests there is a need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal [Formula: see text]-level such as 0.05 is adjusted by the number of tests, [Formula: see text], i.e., as 0.05/[Formula: see text]. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed [Formula: see text]-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability [Formula: see text] is controlled, defined as the probability of making at least [Formula: see text] correct rejections while rejecting hypotheses with the [Formula: see text] smallest P-values. The two approaches are statistically related. The probability that the smallest P-value is a true signal (i.e., [Formula: see text]) is equal to the power at the level [Formula: see text], to an excellent approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when [Formula: see text] is very large and [Formula: see text] is small. Precision is largely restored when three values with their respective abundances are used instead of a single typical effect size value.
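
    A minimal Monte Carlo sketch in Python (not the authors' code; the simulate function, its defaults, and the one-sided z-test setup are illustrative assumptions) comparing the traditional "at least one correct rejection at alpha/m" power with the ranking probability described above:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        def simulate(m=1000, k=20, effect=3.0, alpha=0.05, j=10, reps=2000):
            # Compare power (>= 1 correct rejection at the Bonferroni level
            # alpha/m) with P(>= 1 true signal among the j smallest P-values).
            bonf_hits = 0
            rank_hits = 0
            for _ in range(reps):
                z = rng.standard_normal(m)
                z[:k] += effect                   # the first k tests are true signals
                p = norm.sf(z)                    # one-sided P-values
                bonf_hits += (p[:k] < alpha / m).any()
                smallest_j = np.argsort(p)[:j]    # indices of the j smallest P-values
                rank_hits += (smallest_j < k).any()
            return bonf_hits / reps, rank_hits / reps

        print(simulate())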

  4. Rate and reaction probability of the surface reaction between ozone and dihydromyrcenol measured in a bench scale reactor and a room-sized chamber

    NASA Astrophysics Data System (ADS)

    Shu, Shi; Morrison, Glenn C.

    2012-02-01

    Low volatility terpenoids emitted from consumer products can react with ozone on surfaces and may significantly alter concentrations of ozone, terpenoids and reaction products in indoor air. We measured the reaction probability and a second-order surface-specific reaction rate for the ozonation of dihydromyrcenol, a representative indoor terpenoid, adsorbed onto polyvinylchloride (PVC), glass, and latex paint coated spheres. The reaction probability ranged from (0.06-8.97) × 10^-5 and was very sensitive to humidity, substrate and mass adsorbed. The average surface reaction probability is about 10 times greater than that for the gas-phase reaction. The second-order surface-specific rate coefficient ranged from (0.32-7.05) × 10^-15 cm^4 s^-1 molecule^-1 and was much less sensitive to humidity, substrate, or mass adsorbed. We also measured the ozone deposition velocity due to adsorbed dihydromyrcenol on painted drywall in a room-sized chamber. Based on this measurement, we calculated the rate coefficient ((0.42-1.6) × 10^-15 cm^4 molecule^-1 s^-1), which was consistent with that derived from bench-scale experiments for the latex paint under similar conditions. We predict that more than 95% of dihydromyrcenol oxidation takes place on indoor surfaces, rather than in building air.

  5. Gravity and count probabilities in an expanding universe

    NASA Technical Reports Server (NTRS)

    Bouchet, Francois R.; Hernquist, Lars

    1992-01-01

    The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.
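
    A toy Python sketch of counts-in-cells statistics (the uniform random positions and grid size are stand-in assumptions, not the simulation data): bin particle positions into cubic cells, then compute the moments and the void probability discussed above:

        import numpy as np

        rng = np.random.default_rng(1)
        pos = rng.random((100_000, 3))          # stand-in for N-body particle positions in a unit box
        ncell = 16                              # cells per dimension
        idx = np.floor(pos * ncell).astype(int)
        flat = np.ravel_multi_index(idx.T, (ncell,) * 3)
        counts = np.bincount(flat, minlength=ncell**3)

        mean, var = counts.mean(), counts.var()
        skew = ((counts - mean) ** 3).mean() / var ** 1.5
        kurt = ((counts - mean) ** 4).mean() / var ** 2 - 3
        p_void = (counts == 0).mean()           # void probability P(N = 0)
        print(mean, var, skew, kurt, p_void)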

  6. Inadvertent exaggerated anticoagulation following use of bismuth subsalicylate in an enterally fed patient receiving warfarin therapy.

    PubMed

    Bingham, Angela L; Brown, Rex O; Dickerson, Roland N

    2013-12-01

    We report a case of an inadvertent increase in the international normalized ratio (INR) after the addition of bismuth subsalicylate for the treatment of diarrhea in an enterally fed patient receiving warfarin therapy. A 56-year-old Caucasian female presented to the trauma intensive care unit (ICU) with multiple lower extremity fractures. Warfarin was initiated for deep vein thrombosis prophylaxis due to the patient's inability to ambulate. The target INR was 2-3. Continuous intragastric enteral feeding was withheld 1 hour before and 1 hour after intragastric administration of warfarin. Bismuth subsalicylate 30 mL every 4 hours was prescribed for diarrhea. Within 3 days after starting bismuth subsalicylate therapy, the patient's INR increased from 2.56 to 3.54 and minor bleeding was noted from the patient's tracheostomy site. No significant change in warfarin dosage, variability in vitamin K intake, or addition of medications that potentially alter warfarin metabolism was present during the unexpected rise in INR. When the bismuth subsalicylate was discontinued, the patient's INR stabilized within the target range on the same warfarin dose given at the time of the supratherapeutic INR. Salicylate displaces warfarin from plasma protein binding sites and may result in a significant increase in INR secondary to redistribution of warfarin to the free active form. Evaluation of this case report using the Drug Interaction Probability Scale and Naranjo Adverse Drug Reaction Probability Scale yielded scores consistent with a probable adverse drug interaction. Bismuth subsalicylate exaggerates warfarin's anticoagulant response, and its concurrent use during warfarin therapy should be avoided.

  7. Disordered eating attitudes, alexithymia and suicide probability among Turkish high school girls.

    PubMed

    Alpaslan, Ahmet Hamdi; Soylu, Nusret; Avci, Kadriye; Coşkun, Kerem Şenol; Kocak, Uğur; Taş, Hanife Uzel

    2015-03-30

    We aimed to examine the association between disordered eating attitudes (DEAs), alexithymia, and suicide probability among adolescent females and to explore a potential link between alexithymia and suicide probability in subjects with DEAs. A total of 381 female students completed the Eating Attitude Test (EAT-26), the Toronto Alexithymia Scale (TAS-20), and the Suicide Probability Scale (SPS). It was found that 13.2% (n=52) of the subjects had DEAs. Results indicated that the total TAS-20 score and the scores of the Difficulty in Identifying Feelings (DIF) and Difficulty in Describing Feelings (DDF) subscales were significantly higher in the DEA group than in the non-DEA group (p<0.05). Additionally, the total SPS score (p<0.001) and the Hopelessness (p=0.001), Suicide Ideation (p<0.001), and Hostility (p=0.003) subscale scores of the SPS were significantly higher in the alexithymic DEA group than in the non-alexithymic DEA group. To control for the potential effect of depression, SPS subscales were used as covariates in an ANCOVA; the Negative Self-Evaluation subscale yielded a statistically significant difference between groups, while the other subscales did not. These results indicate that DEAs are a relatively frequent phenomenon among female students in Turkey and that the presence of alexithymia is associated with increased suicide probability in adolescents with DEAs. The results should be evaluated taking into account that depressive symptomatology was not assessed using a depression scale. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Factors influencing reporting and harvest probabilities in North American geese

    USGS Publications Warehouse

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI: 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
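
    A hedged sketch of the reward-band logic (illustrative numbers, not the study's data; the function name is hypothetical), assuming the highest-value reward bands are reported with probability near 1, so that the reporting probability of standard bands is the ratio of direct recovery rates:

        def reporting_probability(recoveries_std, banded_std, recoveries_100, banded_100):
            # Direct recovery rates for standard ($0) and $100 reward bands;
            # $100 bands are assumed to be reported with probability ~1.
            f_std = recoveries_std / banded_std
            f_100 = recoveries_100 / banded_100
            return f_std / f_100                 # estimated reporting probability

        # e.g. 60 of 1000 standard bands vs 82 of 1000 reward bands recovered:
        print(round(reporting_probability(60, 1000, 82, 1000), 2))  # ~0.73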

  9. Probability assessment with response times and confidence in perception and knowledge.

    PubMed

    Petrusic, William M; Baranski, Joseph V

    2009-02-01

    In both a perceptual and a general knowledge comparison task, participants categorized the time they took to decide, selecting one of six categories ordered from "Slow" to "Fast". Subsequently, they rated confidence on a six-category scale ranging from "50%" to "100%". Participants were able to accurately scale their response times, thus enabling the treatment of the response time (RT) categories as potential confidence categories. Probability assessment analyses of RTs revealed indices of over/underconfidence, calibration, and resolution, each subject to the "hard-easy" effect, comparable to those obtained with the actual confidence ratings. However, in both the perceptual and knowledge domains, resolution (i.e., the ability to use the confidence categories to distinguish correct from incorrect decisions) was significantly better with confidence ratings than with RT categorization. Generally, comparable results were obtained with scaling of the objective RTs, although subjective categorization of RTs provided probability assessment indices superior to those obtained from objective RTs. Taken together, the findings do not support the view that confidence arises from a scaling of decision time.

  10. Quantum probability, choice in large worlds, and the statistical structure of reality.

    PubMed

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  11. Probability distributions of the electroencephalogram envelope of preterm infants.

    PubMed

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
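
    A minimal Python sketch of the analysis steps on a synthetic signal (not the study's EEG data): extract the amplitude envelope with the Hilbert transform, then compare lognormal and gamma fits by log-likelihood:

        import numpy as np
        from scipy.signal import hilbert
        from scipy import stats

        rng = np.random.default_rng(2)
        # Amplitude-modulated noise standing in for an EEG trace
        eeg = rng.standard_normal(10_000) * (1 + 0.5 * np.sin(np.linspace(0, 20, 10_000)))
        envelope = np.abs(hilbert(eeg))          # instantaneous amplitude

        ln_params = stats.lognorm.fit(envelope, floc=0)
        g_params = stats.gamma.fit(envelope, floc=0)

        ll_ln = stats.lognorm.logpdf(envelope, *ln_params).sum()
        ll_g = stats.gamma.logpdf(envelope, *g_params).sum()
        print("lognormal" if ll_ln > ll_g else "gamma", ll_ln, ll_g)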

  12. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  13. Soft Soil Tire Model Development and Experimental Testing

    DTIC Science & Technology

    2012-09-01

    Sandu, Corina; Pinto, Eduardo; Naranjo, Scott (all Virginia Tech); Jayakumar, Paramsothy; Ross, Brant

  14. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
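
    A toy Python sketch of the ad hoc weighted estimator (hypothetical inputs; in the actual method the per-group vital-rate estimates come from capture-recapture models rather than the raw means used here):

        import numpy as np

        def weighted_extinction(det_freq, extinct, threshold):
            # det_freq: detection frequency per species; extinct: 0/1 flags.
            det_freq, extinct = np.asarray(det_freq), np.asarray(extinct)
            low = det_freq <= threshold              # partition by detection frequency
            groups = [extinct[low], extinct[~low]]
            rates = [g.mean() for g in groups]       # per-group extinction estimates
            weights = [g.size for g in groups]
            return np.average(rates, weights=weights)

        print(weighted_extinction([1, 2, 2, 8, 9, 10], [1, 1, 0, 0, 0, 0], threshold=3))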

  15. Analysis of SET pulses propagation probabilities in sequential circuits

    NASA Astrophysics Data System (ADS)

    Cai, Shuo; Yu, Fei; Yang, Yiqun

    2018-05-01

    As the feature size of CMOS transistors scales down, single event transients (SETs) have become an important consideration in designing logic circuits. Much research has been done on analyzing the impact of SETs; however, it is difficult to take the numerous contributing factors into account. We present a new approach for analyzing SET pulse propagation probabilities (SPPs). It considers all masking effects and uses SET pulse propagation probability matrices (SPPMs) to represent the SPPs in the current cycle. Based on matrix union operations, the SPPs in consecutive cycles can be calculated. Experimental results show that our approach is practicable and efficient.

  16. Evaluation of carotid plaque echogenicity based on the integral of the cumulative probability distribution using gray-scale ultrasound images.

    PubMed

    Huang, Xiaowei; Zhang, Yanling; Meng, Long; Abbott, Derek; Qian, Ming; Wong, Kelvin K L; Zheng, Rongqing; Zheng, Hairong; Niu, Lili

    2017-01-01

    Carotid plaque echogenicity is associated with the risk of cardiovascular events. The gray-scale median (GSM) of the ultrasound image of carotid plaques has been widely used as an objective method for evaluation of plaque echogenicity in patients with atherosclerosis. We proposed a computer-aided method to evaluate plaque echogenicity and compared its efficiency with GSM. One hundred and twenty-five carotid plaques (43 echo-rich, 35 intermediate, 47 echolucent) were collected from 72 patients in this study. The cumulative probability distribution curves were obtained based on statistics of the pixels in the gray-level images of plaques. The area under the cumulative probability distribution curve (AUCPDC) was calculated as its integral value to evaluate plaque echogenicity. The classification accuracy for the three types of plaques is 78.4% (kappa value, κ = 0.673) when the AUCPDC is used for classifier training, whereas it is 64.8% (κ = 0.460) with GSM. Receiver operating characteristic curves were produced to test the effectiveness of AUCPDC and GSM for the identification of echolucent plaques. The area under the curve (AUC) was 0.817 when AUCPDC was used for training the classifier, which is higher than that achieved using GSM (AUC = 0.746). Compared with GSM, the AUCPDC showed a borderline association with coronary heart disease (Spearman r = 0.234, p = 0.050). Our experimental results suggest that AUCPDC analysis is a promising method for evaluation of plaque echogenicity and predicting cardiovascular events in patients with plaques.
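
    A short numpy sketch of the AUCPDC feature (the normalization and the synthetic patch are assumptions, not the authors' exact code): integrate the empirical cumulative distribution of pixel gray levels over the gray-level range; darker (echolucent-looking) plaques give larger values:

        import numpy as np

        def aucpdc(gray_pixels, levels=256):
            # Area under the cumulative probability distribution of integer
            # gray levels, normalized so the result lies in [0, 1].
            hist = np.bincount(gray_pixels.ravel(), minlength=levels) / gray_pixels.size
            cdf = np.cumsum(hist)               # cumulative probability per gray level
            return cdf.sum() / levels

        rng = np.random.default_rng(3)
        plaque = rng.integers(0, 120, size=(64, 64))   # dark stand-in patch
        print(aucpdc(plaque))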

  17. Studying the effects of fuel treatment based on burn probability on a boreal forest landscape.

    PubMed

    Liu, Zhihua; Yang, Jian; He, Hong S

    2013-01-30

    Fuel treatment is assumed to be a primary tactic to mitigate intense and damaging wildfires. However, how to place treatment units across a landscape and assess its effectiveness is difficult for landscape-scale fuel management planning. In this study, we used a spatially explicit simulation model (LANDIS) to conduct wildfire risk assessments and optimize the placement of fuel treatments at the landscape scale. We first calculated a baseline burn probability map from empirical data (fuel, topography, weather, and fire ignition and size data) to assess fire risk. We then prioritized landscape-scale fuel treatment based on maps of burn probability and fuel loads (calculated from the interactions among tree composition, stand age, and disturbance history), and compared their effects on reducing fire risk. The burn probability map described the likelihood of burning on a given location; the fuel load map described the probability that a high fuel load will accumulate on a given location. Fuel treatment based on the burn probability map specified that stands with high burn probability be treated first, while fuel treatment based on the fuel load map specified that stands with high fuel loads be treated first. Our results indicated that fuel treatment based on burn probability greatly reduced the burned area and number of fires of different intensities. Fuel treatment based on burn probability also produced more dispersed and smaller high-risk fire patches and therefore can improve efficiency of subsequent fire suppression. The strength of our approach is that more model components (e.g., succession, fuel, and harvest) can be linked into LANDIS to map the spatially explicit wildfire risk and its dynamics to fuel management, vegetation dynamics, and harvesting. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  19. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  20. Characteristics and clinical implications of the pharmacokinetic profile of ibuprofen in patients with knee osteoarthritis.

    PubMed

    Gallelli, L; Galasso, O; Urzino, A; Saccà, S; Falcone, D; Palleria, C; Longo, P; Corigliano, A; Terracciano, R; Savino, R; Gasparini, G; De Sarro, G; Southworth, S R

    2012-12-01

    Ibuprofen is a non-selective cyclo-oxygenase (COX)-1/COX-2 inhibitor used to treat pain conditions and inflammation. Limited data have been published concerning the pharmacokinetic profile and clinical effects of ibuprofen in patients with osteoarthritis (OA). In this paper we compared the pharmacokinetic and clinical profiles of ibuprofen (at dosages from 800 mg/day to 1800 mg/day) administered to patients affected by severe knee OA. Ibuprofen was administered for 7 days to patients who were scheduled to undergo knee arthroplasty due to OA. After 7 days, the ibuprofen concentration in plasma and synovial fluid was measured by both high-performance liquid chromatography (HPLC)-UV and gas chromatography-mass spectrometry (GC/MS), while clinical effects were evaluated through both visual analogue scale (VAS) and Western Ontario and McMaster Universities (WOMAC) scores. The Naranjo scale and the WHO causality assessment scale were used to estimate the probability of adverse drug reactions (ADRs). The severity of ADRs was assessed by the modified Hartwig and Siegel scale. Ibuprofen showed a dose-dependent diffusion into both plasma and synovial fluid, which was related to the reduction of pain intensity and improvement of health status, without the development of ADRs. Ibuprofen at higher dosages can be expected to provide better control of OA symptoms as a result of higher tissue distribution.

  1. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  2. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vourdas, A.

    2014-08-15

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1) and P(H_2) onto the subspaces H_1 and H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
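
    The generalized additivity relation itself is not reproduced in the abstract; for orientation, the familiar Boolean (Kolmogorov) identity that it generalizes, with union and intersection replaced in the lattice by subspace join and meet, reads:

        % Two-event additivity for Kolmogorov probabilities; its lattice
        % analogue (join/meet of subspaces) is what the paper reports as
        % violated by quantum probabilities.
        p(H_1 \vee H_2) + p(H_1 \wedge H_2) = p(H_1) + p(H_2)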

  3. Modeling spatial variation in avian survival and residency probabilities

    USGS Publications Warehouse

    Saracco, James F.; Royle, J. Andrew; DeSante, David F.; Gardner, Beth

    2010-01-01

    The importance of understanding spatial variation in processes driving animal population dynamics is widely recognized. Yet little attention has been paid to spatial modeling of vital rates. Here we describe a hierarchical spatial autoregressive model to provide spatially explicit year-specific estimates of apparent survival (phi) and residency (pi) probabilities from capture-recapture data. We apply the model to data collected on a declining bird species, Wood Thrush (Hylocichla mustelina), as part of a broad-scale bird-banding network, the Monitoring Avian Productivity and Survivorship (MAPS) program. The Wood Thrush analysis showed variability in both phi and pi among years and across space. Spatial heterogeneity in residency probability was particularly striking, suggesting the importance of understanding the role of transients in local populations. We found broad-scale spatial patterning in Wood Thrush phi and pi that lend insight into population trends and can direct conservation and research. The spatial model developed here represents a significant advance over approaches to investigating spatial pattern in vital rates that aggregate data at coarse spatial scales and do not explicitly incorporate spatial information in the model. Further development and application of hierarchical capture-recapture models offers the opportunity to more fully investigate spatiotemporal variation in the processes that drive population changes.

  4. Severe Hypertriglyceridemia Induced by Sirolimus Treated With Medical Management Without Plasmapheresis: A Case Report.

    PubMed

    Kido, Kazuhiko; Evans, Rickey A; Gopinath, Anil; Flynn, Jeremy D

    2018-02-01

    Hypertriglyceridemia and hyperlipidemia are the most remarkable metabolic complications seen with long-term sirolimus therapy. We report the case of a 36-year-old woman status post bilateral lung transplantation on a maintenance immunosuppression regimen of sirolimus, tacrolimus, and prednisone who presented with status migrainosus, chest pain, abdominal discomfort, and triglyceride levels greater than 4425 mg/dL. In previously reported cases of severe hypertriglyceridemia that developed on maintenance sirolimus therapy, plasmapheresis has been utilized as an early strategy to rapidly lower triglycerides in order to minimize the risk of acute complications such as pancreatitis, but our case was managed medically without plasmapheresis. The most recent triglyceride level was down to 520 mg/dL 2 months after discontinuation of sirolimus. We estimate the probability of this reaction to sirolimus as probable based on a score of 5 points on the Naranjo scale. This is the first case report to our knowledge that highlights the sole use of oral lipid-lowering agents to treat severe hypertriglyceridemia secondary to sirolimus without the use of plasmapheresis. Sirolimus-induced severe hypertriglyceridemia can be managed with oral lipid-lowering agents without plasmapheresis. Clinicians need to be aware of the importance of baseline and regular triglyceride monitoring in patients on sirolimus.

  5. Lactic Acidosis with Chloramphenicol Treatment in a Child with Cystic Fibrosis.

    PubMed

    Goyer, Isabelle; Iseppon, Massimiliano; Thibault, Céline; Abaji, Rachid; Krajinovic, Maja; Autmizguine, Julie

    2017-01-30

    Children with cystic fibrosis are commonly colonized with multi-resistant bacteria. In such patients, infectious exacerbations may require salvage therapy with uncommonly used antimicrobials, including chloramphenicol. Chloramphenicol is rarely used nowadays because of its associated severe adverse events. We describe the case of a 15-year-old female with terminal cystic fibrosis who required intravenous (IV) chloramphenicol treatment for a Burkholderia cepacia (B. cepacia) exacerbation. The child subsequently developed lactic acidosis and secondary respiratory compensation, adding to her baseline respiratory distress. Based on the Naranjo scale, the probability of chloramphenicol being the cause of the hyperlactatemia and associated respiratory distress was rated as probable, as the adverse effects resolved upon discontinuation of the drug. Subsequent genotyping for a mitochondrial polymorphism (G3010A) confirmed a possible susceptibility to lactic acidosis from mitochondrial RNA-inhibiting agents such as chloramphenicol. Hyperlactatemia is a rare but life-threatening adverse effect that has been previously reported with chloramphenicol exposure but is not generally considered. Clinicians should be aware of this potentially life-threatening but reversible adverse event. Lactate should be monitored during chloramphenicol therapy, and the drug should be discontinued as soon as this complication is suspected, especially in patients with low respiratory reserve. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.

  6. Antidepressant-selective gynecomastia.

    PubMed

    Kaufman, Kenneth R; Podolsky, Dina; Greenman, Danielle; Madraswala, Rehman

    2013-01-01

    To describe what we believe is the first reported case of synergistic gynecomastia during treatment of depressive and anxiety disorders when sertraline was added to a stable medication regimen including duloxetine, rosuvastatin, and amlodipine. A 67-year-old male with major depression, dysthymia, obsessive-compulsive disorder, social anxiety, hypertension, diabetes, and hyperlipidemia presented with new-onset gynecomastia and breast tenderness. Mammography revealed bilateral gynecomastia (fibroglandular tissue posterior to the nipples bilaterally) without suspicious mass, calcification, or other abnormalities. These new symptoms developed after sertraline was added to his stable medication regimen (duloxetine, alprazolam, rosuvastatin, metoprolol, amlodipine, hydrochlorothiazide/triamterene, metformin, and sitagliptin). The symptoms were dose-dependent, with gynecomastia and breast tenderness becoming more severe as sertraline was titrated from 25 mg/day to 50 mg/day and then to 75 mg/day. When sertraline was discontinued, the gynecomastia and breast tenderness rapidly resolved. Mammoplasia and gynecomastia are associated with altered dopamine neurotransmission and/or perturbations in sexual hormones. These adverse effects may be medication induced. Selective serotonin reuptake inhibitors (sertraline), serotonin-norepinephrine reuptake inhibitors (duloxetine), rosuvastatin, and amlodipine have all been reported to cause these adverse effects. This case was unique, since the patient had previously been on both sertraline and duloxetine as independent psychotropics without the development of gynecomastia. In the context of an additive drug adverse effect, the probability of sertraline being the precipitant drug was rated as probable by both the Naranjo probability scale and the Horn drug interaction probability scale. Gynecomastia is associated with antidepressants and other medications but is rarely addressed. Gynecomastia may be antidepressant selective or may be the result of

  7. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
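
    A toy Python sketch of the conditional-clade idea (the data representation and helper names are hypothetical, not the author's software): estimate a tree's posterior probability as a product of conditional clade frequencies over the posterior sample, instead of its raw sample relative frequency:

        from collections import Counter

        clade_counts = Counter()   # clade -> number of sampled trees containing it
        split_counts = Counter()   # (clade, children) -> trees with that split

        def record_tree(splits):
            # splits: one (clade, children) pair per internal node of a sampled
            # tree; clade is a frozenset of taxa, children a frozenset of the
            # two child clades.
            for clade, children in splits:
                clade_counts[clade] += 1
                split_counts[(clade, children)] += 1

        def tree_probability(splits):
            # Product of conditional clade frequencies over the tree's splits.
            p = 1.0
            for clade, children in splits:
                if clade_counts[clade] == 0:
                    return 0.0               # clade never sampled: no estimate
                p *= split_counts[(clade, children)] / clade_counts[clade]
            return p

        A, B, C, D = (frozenset([x]) for x in "ABCD")
        root = A | B | C | D
        t1 = [(root, frozenset([A | B, C | D])),
              (A | B, frozenset([A, B])),
              (C | D, frozenset([C, D]))]
        record_tree(t1)
        print(tree_probability(t1))          # 1.0 after a single sampled tree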

  8. Probability Distribution of Turbulent Kinetic Energy Dissipation Rate in Ocean: Observations and Approximations

    NASA Astrophysics Data System (ADS)

    Lozovatsky, I.; Fernando, H. J. S.; Planella-Morato, J.; Liu, Zhiyu; Lee, J.-H.; Jinadasa, S. U. P.

    2017-10-01

    The probability distribution of turbulent kinetic energy dissipation rate in the stratified ocean usually deviates from the classic lognormal distribution that has been formulated for and often observed in unstratified homogeneous layers of atmospheric and oceanic turbulence. Our measurements of vertical profiles of micro-scale shear, collected in the East China Sea, the northern Bay of Bengal, to the south and east of Sri Lanka, and in the Gulf Stream region, show that the probability distributions of the scale-averaged dissipation rate ε_r in the pycnoclines (r ≈ 1.4 m is the averaging scale) can be successfully modeled by the Burr (type XII) probability distribution. In weakly stratified boundary layers, a lognormal distribution of ε_r is preferable, although the Burr is an acceptable alternative. The skewness Sk_ε and the kurtosis K_ε of the dissipation rate appear to be well correlated over a wide range of Sk_ε and K_ε variability.
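
    A minimal Python sketch (synthetic, mean-normalized data standing in for the measured dissipation rates): fit Burr XII and lognormal distributions and compare them by Kolmogorov-Smirnov distance:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        # Stand-in for dissipation-rate samples, normalized by the sample mean
        eps = stats.lognorm.rvs(1.2, size=5000, random_state=rng)
        eps /= eps.mean()

        burr_params = stats.burr12.fit(eps, floc=0)
        ln_params = stats.lognorm.fit(eps, floc=0)

        ks_burr = stats.kstest(eps, 'burr12', args=burr_params).statistic
        ks_ln = stats.kstest(eps, 'lognorm', args=ln_params).statistic
        print(f"Burr XII KS = {ks_burr:.3f}, lognormal KS = {ks_ln:.3f}")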

  9. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
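
    One classic instance (a sketch, not taken from the article): the probability that a random permutation has no fixed point (a "derangement") tends to 1/e:

        import math
        import random

        def derangement_fraction(n=10, trials=100_000, seed=5):
            rng = random.Random(seed)
            hits = 0
            for _ in range(trials):
                perm = list(range(n))
                rng.shuffle(perm)
                hits += all(perm[i] != i for i in range(n))
            return hits / trials

        print(derangement_fraction(), 1 / math.e)   # both close to 0.3679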

  10. Probability Elicitation Under Severe Time Pressure: A Rank-Based Method.

    PubMed

    Jaspersen, Johannes G; Montibeller, Gilberto

    2015-07-01

    Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for the Environment, Farming and Rural Affairs): the prioritization of animal health threats. © 2015 Society for Risk Analysis.
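
    The article's algorithm approximates probabilities from the expert's ranking via maximum entropy; as a simpler, plainly swapped-in stand-in with the same flavor, this sketch converts a ranking into rank-order-centroid (ROC) weights, the centroid of the simplex of probability vectors consistent with the ranking:

        def roc_weights(n):
            # ROC weight for rank i (1 = judged most probable):
            # (1/n) * sum_{j >= i} 1/j; the weights sum to 1.
            return [sum(1.0 / j for j in range(i, n + 1)) / n
                    for i in range(1, n + 1)]

        print(roc_weights(4))   # [0.5208..., 0.2708..., 0.1458..., 0.0625]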

  11. Probability machines: consistent probability estimation using nonparametric learning machines.

    PubMed

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
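
    A minimal scikit-learn sketch of the probability-machine idea on synthetic data (the paper's own examples use the appendicitis and Pima Indians diabetes data sets): regress the 0/1 response with a random forest and read the predictions as individual probability estimates:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(6)
        X = rng.standard_normal((2000, 5))
        true_p = 1 / (1 + np.exp(-X[:, 0] + 0.5 * X[:, 1]))   # underlying P(Y=1|X)
        y = rng.binomial(1, true_p)                           # binary response

        rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=20,
                                   random_state=0)
        rf.fit(X, y.astype(float))
        p_hat = rf.predict(X)                # estimated individual probabilities
        print(np.corrcoef(p_hat, true_p)[0, 1])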

  12. Primary Accretion and Turbulent Cascades: Scale-Dependence of Particle Concentration Multiplier Probability Distribution Functions

    NASA Astrophysics Data System (ADS)

    Cuzzi, Jeffrey N.; Weston, B.; Shariff, K.

    2013-10-01

    Primitive bodies with 10s-100s of km diameter (or even larger) may form directly from small nebula constituents, bypassing the step-by-step “incremental growth” that faces a variety of barriers at cm, m, and even 1-10 km sizes. In the scenario of Cuzzi et al (Icarus 2010 and LPSC 2012; see also Chambers Icarus 2010), the immediate precursors of 10-100 km diameter asteroid formation are dense clumps of chondrule- (mm-) size objects. These predictions utilize a so-called cascade model, which is popular in turbulence studies. One of its usual assumptions is that certain statistical properties of the process (the so-called multiplier pdfs p(m)) are scale-independent within a cascade of energy from large eddy scales to smaller scales. In similar analyses, Pan et al (2011 ApJ) found discrepancies with the results of Cuzzi and coworkers; one possibility was that p(m) for particle concentration is not scale-independent. To assess the situation we have analyzed recent 3D direct numerical simulations of particles in turbulence covering a much wider range of scales than analyzed by either Cuzzi and coworkers or Pan and coworkers (see Bec et al 2010, J. Fluid Mech. 646, 527). We calculated p(m) at scales ranging from 45-1024η, where η is the Kolmogorov scale, both for particles with a range of stopping times spanning the optimum value and for energy dissipation in the fluid. For comparison, the p(m) for dissipation have been observed to be scale-independent in atmospheric flows (at much larger Reynolds number) for scales of at least 30-3000η. We found that, in the numerical simulations, the multiplier distributions for both particle concentration and fluid dissipation are as expected at scales of tens of η, but both become narrower and less intermittent at larger scales. This is consistent with observations of atmospheric flows showing scale independence to >3000η if scale-free behavior is established only after some number 10 of large-scale bifurcations (at scales perhaps

  13. The estimation of tree posterior probabilities using conditional clade probability distributions.

    PubMed

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  14. On the consideration of scaling properties of extreme rainfall in Madrid (Spain) for developing a generalized intensity-duration-frequency equation and assessing probable maximum precipitation estimates

    NASA Astrophysics Data System (ADS)

    Casas-Castillo, M. Carmen; Rodríguez-Solà, Raúl; Navarro, Xavier; Russo, Beniamino; Lastra, Antonio; González, Paula; Redaño, Angel

    2018-01-01

    The fractal behavior of extreme rainfall intensities registered between 1940 and 2012 by the Retiro Observatory of Madrid (Spain) has been examined, and a simple scaling regime ranging from 25 min to 3 days of duration has been identified. Thus, an intensity-duration-frequency (IDF) master equation of the location has been constructed in terms of the simple scaling formulation. The scaling behavior of probable maximum precipitation (PMP) for durations between 5 min and 24 h has also been verified. For the statistical estimation of the PMP, an envelope curve of the frequency factor (k_m) based on a total of 10,194 station-years of annual maximum rainfall from 258 stations in Spain has been developed. This curve could be useful to estimate suitable values of PMP at any point of the Iberian Peninsula from basic statistical parameters (mean and standard deviation) of its rainfall series.

  15. Statistic inversion of multi-zone transition probability models for aquifer characterization in alluvial fans

    DOE PAGES

    Zhu, Lin; Dai, Zhenxue; Gong, Huili; ...

    2015-06-12

    Understanding the heterogeneity arising from the complex architecture of sedimentary sequences in alluvial fans is challenging. This study develops a statistical inverse framework in a multi-zone transition probability approach for characterizing the heterogeneity in alluvial fans. An analytical solution of the transition probability matrix is used to define the statistical relationships among different hydrofacies and their mean lengths, integral scales, and volumetric proportions. A statistical inversion is conducted to identify the multi-zone transition probability models and estimate the optimal statistical parameters using the modified Gauss–Newton–Levenberg–Marquardt method. The Jacobian matrix is computed by the sensitivity equation method, which results in an accurate inverse solution with quantification of parameter uncertainty. We use the Chaobai River alluvial fan in the Beijing Plain, China, as an example for elucidating the methodology of alluvial fan characterization. The alluvial fan is divided into three sediment zones. In each zone, the explicit mathematical formulations of the transition probability models are constructed with optimized different integral scales and volumetric proportions. The hydrofacies distributions in the three zones are simulated sequentially by the multi-zone transition probability-based indicator simulations. Finally, the result of this study provides the heterogeneous structure of the alluvial fan for further study of flow and transport simulations.
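
    A hedged Python sketch of the continuous-lag Markov chain formulation commonly used in transition-probability geostatistics (assumed here for illustration, since the abstract does not reproduce the model equations; the facies parameters are hypothetical): the matrix of transition probabilities at lag h is the matrix exponential of a rate matrix whose diagonal encodes mean facies lengths:

        import numpy as np
        from scipy.linalg import expm

        mean_len = np.array([5.0, 2.0, 3.0])    # hypothetical mean lengths of 3 hydrofacies (m)
        prop = np.array([0.5, 0.2, 0.3])        # hypothetical volumetric proportions

        R = np.zeros((3, 3))
        for j in range(3):
            R[j, j] = -1.0 / mean_len[j]        # diagonal: -1 / mean length
            others = [k for k in range(3) if k != j]
            w = prop[others] / prop[others].sum()
            R[j, others] = w / mean_len[j]      # off-diagonals split by proportion

        def transition_probabilities(h):
            # t_jk(h): probability of facies k at lag h given facies j at 0;
            # rows of expm(R*h) sum to 1 because rows of R sum to 0.
            return expm(R * h)

        print(transition_probabilities(1.0))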

  16. The Torino Impact Hazard Scale

    NASA Astrophysics Data System (ADS)

    Binzel, Richard P.

    2000-04-01

    Newly discovered asteroids and comets have inherent uncertainties in their orbit determinations owing to the natural limits of positional measurement precision and the finite lengths of orbital arcs over which determinations are made. For some objects making predictable future close approaches to the Earth, orbital uncertainties may be such that a collision with the Earth cannot be ruled out. Careful and responsible communication between astronomers and the public is required for reporting these predictions, and a 0-10 point hazard scale, reported inseparably with the date of close encounter, is recommended as a simple and efficient tool for this purpose. The goal of this scale, endorsed as the Torino Impact Hazard Scale, is to place into context the level of public concern that is warranted for any close encounter event within the next century. Concomitant reporting of the close encounter date further conveys the sense of urgency that is warranted. The Torino Scale value for a close approach event is based upon both the collision probability and the estimated kinetic energy (collision consequence), where the scale value can change as probability and energy estimates are refined by further data. On the scale, Category 1 corresponds to collision probabilities that are comparable to the current annual chance for any given size impactor. Categories 8-10 correspond to certain (probability >99%) collisions having increasingly dire consequences. While close approaches falling in Category 0 may be no cause for noteworthy public concern, there remains a professional responsibility to further refine orbital parameters for such objects, and a figure of merit is suggested for evaluating them. Because impact predictions represent a multi-dimensional problem, there is no unique or perfect translation into a one-dimensional system such as the Torino Scale. These limitations are discussed.

  17. Antituberculosis Drug-Induced Fixed Drug Eruption: A Case Report.

    PubMed

    Vaghela, Jitendra H; Nimbark, Vivek; Barvaliya, Manish; Mehta, Hita; Chavada, Bhavesh

    2018-05-21

    A fixed drug eruption (FDE) was caused by a fixed-dose combination (FDC) of antituberculosis drugs in the form of tablet Forecox® (rifampicin [rifampin] 225 mg + isoniazid 150 mg + pyrazinamide 750 mg + ethambutol 400 mg) in a 40-year-old male patient with a history of drug allergy. The patient developed the FDE after taking the third dose of tablet Forecox® for pulmonary tuberculosis. Tablet Forecox® was withdrawn, and the patient recovered from the reaction after 15 days of treatment for the FDE. As per the World Health Organization-Uppsala Monitoring Centre (WHO-UMC) and Naranjo causality assessment criteria, the association between the reaction and tablet Forecox® was possible and probable, respectively. The reaction was of moderate severity (Level 4b) according to the Modified Hartwig and Siegel scale. As there is an increased risk of allergic reaction in patients with a history of drug allergy, FDCs should not be used in such patients, in order to avoid complexity in identifying the culprit drug.

  18. Temporal scaling in information propagation.

    PubMed

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-18

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.
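
    A short Python sketch (synthetic latencies, not the social-media data) of the reported power law: the propagation probability decays as p(dt) proportional to dt**(-alpha), and alpha is recovered by a log-log least-squares fit:

        import numpy as np

        rng = np.random.default_rng(7)
        dt = np.logspace(0, 3, 30)                       # latencies, e.g. in hours
        p = 0.2 * dt ** -0.8 * np.exp(rng.normal(0, 0.05, dt.size))  # noisy power law

        alpha, log_c = np.polyfit(np.log(dt), np.log(p), 1)
        print(f"fitted exponent: {alpha:.2f}")           # close to -0.8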

  19. Temporal scaling in information propagation

    NASA Astrophysics Data System (ADS)

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-01

    For the study of information propagation, one fundamental problem is uncovering universal laws governing the dynamics of information propagation. This problem, from the microscopic perspective, is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of information and the interactions between individuals. Despite the fact that the temporal effect of attractiveness is widely studied, temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using the dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability a message propagates between two individuals decays with the length of time latency since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.

  20. Ladar range image denoising by a nonlocal probability statistics algorithm

    NASA Astrophysics Data System (ADS)

    Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi

    2013-01-01

    According to the characteristics of range images from coherent ladar, and building on nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM performs denoising using the mean of the conditional probability distribution function (PDF), whereas NLPS uses the maximum of the marginal PDF. In the algorithm, similar blocks are identified by block matching and grouped. Pixels in the group are analyzed by probability statistics, and the gray value with maximum probability is used as the estimated value of the current pixel. Simulated range images of coherent ladar with different carrier-to-noise ratios and a real coherent-ladar range image with 8 gray-scales are denoised by this algorithm, and the results are compared with those of the median filter, multitemplate order mean filter, NLM, median nonlocal mean filter and its variant incorporating anatomical side information, and unsupervised information-theoretic adaptive filter. The range-abnormality noise and Gaussian noise in coherent-ladar range images are effectively suppressed by NLPS.
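
    A toy sketch of the NLPS idea as described above (ours, not the authors' code): find similar blocks by block matching, pool the matched center pixels, and output the gray value of maximum marginal probability rather than the NLM average. Patch, search-window, and group sizes are arbitrary assumptions.

        import numpy as np

        def nlps_denoise(img, patch=3, search=7, n_similar=16):
            # For each pixel: match blocks inside the search window, keep the
            # n_similar closest, and output the gray value with the maximum
            # marginal probability in the group (NLM would output the mean).
            r, s = patch // 2, search // 2
            pad = r + s
            padded = np.pad(img, pad, mode='reflect')
            out = np.empty_like(img)
            h, w = img.shape
            for i in range(h):
                for j in range(w):
                    ci, cj = i + pad, j + pad
                    ref = padded[ci - r:ci + r + 1, cj - r:cj + r + 1]
                    candidates = []
                    for di in range(-s, s + 1):
                        for dj in range(-s, s + 1):
                            blk = padded[ci + di - r:ci + di + r + 1,
                                         cj + dj - r:cj + dj + r + 1]
                            dist = np.sum((blk.astype(float) - ref.astype(float)) ** 2)
                            candidates.append((dist, padded[ci + di, cj + dj]))
                    candidates.sort(key=lambda t: t[0])
                    group = np.array([v for _, v in candidates[:n_similar]])
                    vals, counts = np.unique(group, return_counts=True)
                    out[i, j] = vals[np.argmax(counts)]   # mode of the group
            return out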

  1. Landslide Hazard Probability Derived from Inherent and Dynamic Determinants

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan

    2016-04-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach combines an empirical inherent landslide probability with a numerical dynamic probability, generated by combining routed recharge from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model run in a Monte Carlo simulation. Landslide hazard mapping is advanced by adjusting the dynamic model of stability with an empirically-based scalar representing the inherent stability of the landscape, creating a probabilistic quantitative measure of geohazard prediction at a 30-m resolution. Climatology, soil, and topography control the dynamic nature of hillslope stability and the empirical information further improves the discriminating ability of the integrated model. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex, a rugged terrain with nearly 2,700 m (9,000 ft) of vertical relief, covering 2757 sq km (1064 sq mi) in northern Washington State, U.S.A.
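
    The dynamic probability described above comes from a Monte Carlo slope-stability run. The sketch below uses a generic infinite-slope factor-of-safety model with made-up parameter ranges, purely to illustrate the "probability of failure, adjusted by an inherent-stability scalar" construction; it is not the study's actual model.

        import numpy as np

        rng = np.random.default_rng(0)

        def failure_probability(slope_deg, wetness, n=10_000):
            # Monte Carlo probability of failure from a generic infinite-slope
            # factor-of-safety model; the parameter ranges are hypothetical.
            theta = np.radians(slope_deg)
            phi = np.radians(rng.uniform(28, 40, n))   # friction angle (deg, converted)
            c = rng.uniform(0, 8e3, n)                 # cohesion, Pa
            gamma, gamma_w, z = 18e3, 9.8e3, 1.5       # unit weights (N/m^3), depth (m)
            fs = (c + (gamma - wetness * gamma_w) * z * np.cos(theta) ** 2 * np.tan(phi)) \
                 / (gamma * z * np.sin(theta) * np.cos(theta))
            return np.mean(fs < 1.0)

        # Adjusting the dynamic probability with an empirical inherent-stability
        # scalar s in [0, 1], per the paper's construction:
        s = 0.7                                        # hypothetical inherent scalar
        print(s * failure_probability(slope_deg=38, wetness=0.9))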

  2. Measures, Probability and Holography in Cosmology

    NASA Astrophysics Data System (ADS)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology, but even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second, related project extends the CEP to universes with curvature. We find that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  3. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  4. Bayes factor and posterior probability: Complementary statistical evidence to p-value.

    PubMed

    Lin, Ruitao; Yin, Guosheng

    2015-09-01

    As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although the smaller the p-value, the more significant the statistical test, it is difficult to interpret the p-value on a probability scale and to quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strongly the data support the null. We compare the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value provides only one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
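
    The null-versus-alternative bookkeeping in the abstract reduces to a one-line posterior calculation. A minimal sketch (the Bayes factor value below is illustrative, not taken from the trial):

        def posterior_null(bf10, prior_null=0.5):
            # Posterior probability of H0 given BF10 = P(data | H1) / P(data | H0)
            # and a prior probability on the null.
            prior_alt = 1.0 - prior_null
            return 1.0 / (1.0 + (prior_alt / prior_null) * bf10)

        # With equal prior odds, data favoring H1 by a Bayes factor of 4 still
        # leave a 20% posterior probability that the null is true.
        print(posterior_null(bf10=4.0))   # 0.2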

  5. Power-law tail probabilities of drainage areas in river basins

    USGS Publications Warehouse

    Veitzer, S.A.; Troutman, B.M.; Gupta, V.K.

    2003-01-01

    The significance of power-law tail probabilities of drainage areas in river basins was discussed. Convergence to a power law was not observed for all underlying distributions, but only for a large class of statistical distributions with specific limiting properties. The article also discussed the scaling properties of topologic and geometric network properties in river basins.

  6. Photon Recollision Probability: a Useful Concept for Cross Scale Consistency Check between Leaf Area Index and Foliage Clumping Products

    NASA Astrophysics Data System (ADS)

    Pisek, J.

    2017-12-01

    Clumping index (CI) is a measure of foliage aggregation relative to a random distribution of leaves in space. CI is an important factor for the correct quantification of true leaf area index (LAI). Global and regional scale CI maps have been generated from various multi-angle sensors based on an empirical relationship with the normalized difference between hotspot and darkspot (NDHD) index (Chen et al., 2005). Ryu et al. (2011) suggested that accurate calculation of radiative transfer in a canopy, important for controlling gross primary productivity (GPP) and evapotranspiration (ET) (Baldocchi and Harley, 1995), should be possible by integrating CI with incoming solar irradiance and LAI from MODIS land and atmosphere products. It should be noted that the MODIS LAI/FPAR product uses internal non-empirical, stochastic equations for the parameterization of foliage clumping. This raises the question of whether integrating the MODIS LAI product with empirically based CI maps introduces inconsistencies. Here, the consistency is examined independently through the 'recollision probability theory' or 'p-theory' (Knyazikhin et al., 1998), along with raw LAI-2000/2200 Plant Canopy Analyzer (PCA) data from more than 30 sites surveyed across a range of vegetation types. The theory predicts that the amount of radiation scattered by a canopy should depend only on the wavelength and the spectrally invariant canopy structural parameter p, which is linked to foliage clumping (Stenberg et al., 2016). Results indicate that integration of the MODIS LAI product with empirically based CI maps is feasible. Importantly, it is shown for the first time that p values can be obtained for any location solely from Earth Observation data. This is very relevant for future applications of the photon recollision probability concept for global and local monitoring of vegetation using Earth Observation data.

  7. Modeling the effect of reward amount on probability discounting.

    PubMed

    Myerson, Joel; Green, Leonard; Morris, Joshua

    2011-03-01

    The present study with college students examined the effect of amount on the discounting of probabilistic monetary rewards. A hyperboloid function accurately described the discounting of hypothetical rewards ranging in amount from $20 to $10,000,000. The degree of discounting increased continuously with amount of probabilistic reward. This effect of amount was not due to changes in the rate parameter of the discounting function, but rather was due to increases in the exponent. These results stand in contrast to those observed with the discounting of delayed monetary rewards, in which the degree of discounting decreases with reward amount due to amount-dependent decreases in the rate parameter. Taken together, this pattern of results suggests that delay and probability discounting reflect different underlying mechanisms. That is, the fact that the exponent in the delay discounting function is independent of amount is consistent with a psychophysical scaling interpretation, whereas the finding that the exponent of the probability-discounting function is amount-dependent is inconsistent with such an interpretation. Instead, the present results are consistent with the idea that the probability-discounting function is itself the product of a value function and a weighting function. This idea was first suggested by Kahneman and Tversky (1979), although their prospect theory does not predict amount effects like those observed. The effect of amount on probability discounting was parsimoniously incorporated into our hyperboloid discounting function by assuming that the exponent was proportional to the amount raised to a power. The amount-dependent exponent of the probability-discounting function may be viewed as reflecting the effect of amount on the weighting of the probability with which the reward will be received.
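
    A sketch of the model form described above, with illustrative (not fitted) parameters: a hyperboloid in the odds against receipt, whose exponent grows with amount raised to a power.

        import numpy as np

        def discounted_value(amount, p, h=1.0, k=0.8, b=0.05):
            # Hyperboloid probability discounting with an amount-dependent
            # exponent: V = A / (1 + h * theta) ** s(A), where theta = (1 - p) / p
            # is the odds against receipt and s(A) = k * A ** b.
            theta = (1.0 - p) / p
            s = k * amount ** b
            return amount / (1.0 + h * theta) ** s

        # Relative value V/A falls faster for larger amounts: steeper
        # discounting of $10,000 than of $20 at the same probability.
        for A in (20, 10_000):
            print(A, discounted_value(A, p=0.5) / A)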

  8. Simple and compact expressions for neutrino oscillation probabilities in matter

    DOE PAGES

    Minakata, Hisakazu; Parke, Stephen J.

    2016-01-29

    We reformulate perturbation theory for neutrino oscillations in matter with an expansion parameter related to the ratio of the solar to the atmospheric Δm² scales. Unlike previous works, we use a renormalized basis in which certain first-order effects are taken into account in the zeroth-order Hamiltonian. Using this perturbation theory, we derive extremely compact expressions for the neutrino oscillation probabilities in matter. We find, for example, that the ν e disappearance probability at this order is of a simple two-flavor form with an appropriately identified mixing angle and Δm². Furthermore, despite the exceptional simplicity of their forms, they accommodate all-order effects of θ13 and the matter potential.

  9. The Finite-Size Scaling Relation for the Order-Parameter Probability Distribution of the Six-Dimensional Ising Model

    NASA Astrophysics Data System (ADS)

    Merdan, Ziya; Karakuş, Özlem

    2016-11-01

    The six-dimensional Ising model with nearest-neighbor pair interactions has been simulated and verified numerically on the Creutz cellular automaton using five-bit demons near the infinite-lattice critical temperature, for linear dimensions L = 4, 6, 8, 10. The order-parameter probability distribution for the six-dimensional Ising model has been calculated at the critical temperature. The constants of the analytical function have been estimated by fitting to the probability function obtained numerically at the finite-size critical point.

  10. Camera-Model Identification Using Markovian Transition Probability Matrix

    NASA Astrophysics Data System (ADS)

    Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei

    Detecting the (brands and) models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of the Y and Cb components of JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by the image-formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are used directly as features for classification. Multi-class support vector machines (SVMs) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
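
    A minimal sketch of one of the four directional feature sets (horizontal direction only; the threshold T = 4 is an assumption, and the paper's exact preprocessing of the JPEG 2-D arrays is not reproduced):

        import numpy as np

        def transition_prob_features(y, T=4):
            # Markov transition probability matrix of the thresholded
            # horizontal difference array of a luminance (or Cb) 2-D array.
            d = y[:, :-1].astype(int) - y[:, 1:].astype(int)   # difference array
            d = np.clip(d, -T, T)                              # threshold to [-T, T]
            pairs_from = d[:, :-1].ravel()
            pairs_to = d[:, 1:].ravel()
            size = 2 * T + 1
            M = np.zeros((size, size))
            for u, v in zip(pairs_from, pairs_to):
                M[u + T, v + T] += 1
            row_sums = M.sum(axis=1, keepdims=True)
            M = np.divide(M, row_sums, out=np.zeros_like(M), where=row_sums > 0)
            return M.ravel()   # (2T+1)^2 features, fed to a multi-class SVM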

  11. Questioning the Relevance of Model-Based Probability Statements on Extreme Weather and Future Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2007-12-01

    We question the relevance of climate-model-based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales smaller than continental and temporal averages shorter than seasonal. Scientific assessment of higher-resolution space- and time-scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically, we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect; nevertheless, the space and time scales on which they provide decision-support-relevant information are expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics-based state-of-the-art models are expected to pass if their output is to be judged decision-support relevant. Probabilistic similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions. Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically

  12. Capecitabine-induced ventricular fibrillation arrest: Possible Kounis syndrome.

    PubMed

    Kido, Kazuhiko; Adams, Val R; Morehead, Richard S; Flannery, Alexander H

    2016-04-01

    We report a case of capecitabine-induced ventricular fibrillation arrest, possibly secondary to type I Kounis syndrome. A 47-year-old man with a history of T3N1 moderately differentiated adenocarcinoma of the colon, status post sigmoid resection, was started on adjuvant capecitabine approximately five months prior to presenting with cardiac arrest secondary to ventricular fibrillation. An electrocardiogram (EKG) revealed ST-segment elevation in the lateral leads, and the patient was taken emergently to the cardiac catheterization laboratory. The catheterization revealed no angiographically significant stenosis, and coronary artery disease was ruled out. After ruling out other causes of cardiac arrest, the working diagnosis was capecitabine-induced ventricular fibrillation arrest. As such, an inflammatory work-up was sent to evaluate for the possibility of capecitabine hypersensitivity, or Kounis syndrome; this is the first documented report in the literature to do so when evaluating Kounis syndrome. Immunoglobulin E (IgE), tryptase, and C-reactive protein were normal, but histamine, interleukin (IL)-6, and IL-10 were elevated. The histamine elevation supports the suspicion that our patient had type I Kounis syndrome. The Naranjo adverse drug reaction probability scale indicated a probable adverse effect due to capecitabine, with a score of seven points. A case of capecitabine-induced ventricular fibrillation arrest is reported, with a potential for type I Kounis syndrome as an underlying pathology supported by the immunologic work-up. © The Author(s) 2014.
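
    For reference, the published Naranjo cut-offs map a total score to a causality category; a seven-point case such as this one falls in the "probable" band. A minimal helper:

        def naranjo_category(total_score):
            # Published Naranjo cut-offs: >= 9 definite, 5-8 probable,
            # 1-4 possible, <= 0 doubtful.
            if total_score >= 9:
                return "definite"
            if total_score >= 5:
                return "probable"
            if total_score >= 1:
                return "possible"
            return "doubtful"

        print(naranjo_category(7))   # the capecitabine case above: "probable"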

  13. Decision making generalized by a cumulative probability weighting function

    NASA Astrophysics Data System (ADS)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments have confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). In this article we obtain a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and it is supported by phenomenological models.

  14. Risks and probabilities of breast cancer: short-term versus lifetime probabilities.

    PubMed Central

    Bryant, H E; Brasher, P M

    1994-01-01

    OBJECTIVE: To calculate age-specific short-term and lifetime probabilities of breast cancer among a cohort of Canadian women. DESIGN: Double decrement life table. SETTING: Alberta. SUBJECTS: Women with first invasive breast cancers registered with the Alberta Cancer Registry between 1985 and 1987. MAIN OUTCOME MEASURES: Lifetime probability of breast cancer from birth and for women at various ages; short-term (up to 10 years) probability of breast cancer for women at various ages. RESULTS: The lifetime probability of breast cancer is 10.17% at birth and peaks at 10.34% at age 25 years, after which it decreases owing to a decline in the number of years over which breast cancer risk will be experienced. However, the probability of manifesting breast cancer in the next year increases steadily from the age of 30 onward, reaching 0.36% at 85 years. The probability of manifesting the disease within the next 10 years peaks at 2.97% at age 70 and decreases thereafter, again owing to declining probabilities of surviving the interval. CONCLUSIONS: Given that the incidence of breast cancer among Albertan women during the study period was similar to the national average, we conclude that currently more than 1 in 10 women in Canada can expect to have breast cancer at some point during their life. However, risk varies considerably over a woman's lifetime, with most risk concentrated after age 49. On the basis of the shorter-term age-specific risks that we present, the clinician can put breast cancer risk into perspective for younger women and heighten awareness among women aged 50 years or more. PMID:8287343
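
    The double-decrement construction can be sketched in a few lines: each year, a woman either manifests a first breast cancer, dies of another cause, or survives free of both; summing the first branch over ages gives the lifetime probability. The rates below are placeholders, not the Alberta data.

        def lifetime_probability(incidence, other_mortality):
            # Double-decrement life table: probability of a first breast
            # cancer, in competition with death from other causes. Inputs are
            # lists of per-year age-specific probabilities, indexed by age.
            alive_free, total = 1.0, 0.0
            for q_bc, q_d in zip(incidence, other_mortality):
                total += alive_free * q_bc
                alive_free *= (1.0 - q_bc) * (1.0 - q_d)
            return total

        # Placeholder flat rates from birth to age 85 (illustrative only).
        print(lifetime_probability([0.0012] * 85, [0.008] * 85))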

  15. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time, and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed, and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
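
    The quantity being solved for is the integral of the relative-position Gaussian over the combined hard-body circle in the encounter plane. A brute-force numerical version of that integral (our sketch, not the report's closed form; the grid resolution and the example numbers are arbitrary):

        import numpy as np

        def collision_probability(miss, cov, hard_body_radius, n=400):
            # Integrate the 2-D Gaussian of the relative position (mean =
            # nominal miss vector, covariance = combined position covariance
            # projected into the encounter plane) over the hard-body circle.
            r = hard_body_radius
            xs = np.linspace(-r, r, n)
            ys = np.linspace(-r, r, n)
            X, Y = np.meshgrid(xs, ys)
            inside = X ** 2 + Y ** 2 <= r ** 2
            P = np.linalg.inv(cov)
            dx, dy = X - miss[0], Y - miss[1]
            dens = np.exp(-0.5 * (P[0, 0] * dx ** 2 + 2 * P[0, 1] * dx * dy
                                  + P[1, 1] * dy ** 2))
            dens /= 2.0 * np.pi * np.sqrt(np.linalg.det(cov))
            cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
            return float(np.sum(dens[inside]) * cell)

        # Hypothetical numbers: 20 m combined radius, 300 m nominal miss,
        # kilometre-scale position uncertainty.
        print(collision_probability(miss=(300.0, 0.0),
                                    cov=np.diag([1e6, 4e5]),
                                    hard_body_radius=20.0))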

  16. Guide star probabilities

    NASA Technical Reports Server (NTRS)

    Soneira, R. M.; Bahcall, J. N.

    1981-01-01

    Probabilities are calculated for acquiring suitable guide stars (GS) with the fine guidance system (FGS) of the space telescope. A number of the considerations and techniques described are also relevant for other space astronomy missions. The constraints of the FGS are reviewed. The available data on bright star densities are summarized and a previous error in the literature is corrected. Separate analytic and Monte Carlo calculations of the probabilities are described. A simulation of space telescope pointing is carried out using the Weistrop north galactic pole catalog of bright stars. Sufficient information is presented so that the probabilities of acquisition can be estimated as a function of position in the sky. The probability of acquiring suitable guide stars is greatly increased if the FGS can allow an appreciable difference between the (bright) primary GS limiting magnitude and the (fainter) secondary GS limiting magnitude.

  17. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as early as the 1600s, French mathematicians used the rules of probability to place and win bets. Since then, the knowledge of probability has evolved significantly and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability are reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention is focused on the normal distribution, the most relevant distribution in statistical analysis.

  18. Probability of satellite collision

    NASA Technical Reports Server (NTRS)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  19. Improving estimates of tree mortality probability using potential growth rate

    USGS Publications Warehouse

    Das, Adrian J.; Stephenson, Nathan L.

    2015-01-01

    Tree growth rate is frequently used to estimate mortality probability. Yet, growth metrics can vary in form, and the justification for using one over another is rarely clear. We tested whether a growth index (GI) that scales the realized diameter growth rate against the potential diameter growth rate (PDGR) would give better estimates of mortality probability than other measures. We also tested whether PDGR, being a function of tree size, might better correlate with the baseline mortality probability than direct measurements of size such as diameter or basal area. Using a long-term dataset from the Sierra Nevada, California, U.S.A., as well as existing species-specific estimates of PDGR, we developed growth–mortality models for four common species. For three of the four species, models that included GI, PDGR, or a combination of GI and PDGR were substantially better than models without them. For the fourth species, the models including GI and PDGR performed roughly as well as a model that included only the diameter growth rate. Our results suggest that using PDGR can improve our ability to estimate tree survival probability. However, in the absence of PDGR estimates, the diameter growth rate was the best empirical predictor of mortality, in contrast to assumptions often made in the literature.
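
    A sketch of the growth-index construction paired with a logistic growth-mortality model; the coefficients are invented for illustration, not the fitted Sierra Nevada values.

        import numpy as np

        def growth_index(dgr, pdgr):
            # GI scales the realized diameter growth rate against the
            # species' potential diameter growth rate (PDGR) at that size.
            return dgr / pdgr

        def annual_mortality_prob(dgr, pdgr, b0=-4.0, b1=-2.5):
            # Illustrative logistic growth-mortality model with GI as the
            # predictor; higher GI (growing nearer potential) lowers risk.
            gi = growth_index(dgr, pdgr)
            return 1.0 / (1.0 + np.exp(-(b0 + b1 * gi)))

        print(annual_mortality_prob(dgr=0.1, pdgr=0.4))   # hypothetical tree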

  20. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  1. Domestic wells have high probability of pumping septic tank leachate

    NASA Astrophysics Data System (ADS)

    Horn, J. E.; Harter, T.

    2011-06-01

    Onsite wastewater treatment systems such as septic systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic system and a private drinking water well. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. Particularly in areas with small lots, and thus a high density of septic systems, these typically shallow wells are prone to contamination by septic system leachate. Typically, mass balance approaches are used to determine a maximum septic system density that would prevent contamination of the aquifer. In this study, we estimate the probability of a well pumping, in part, septic system leachate. A detailed groundwater flow and transport model is used to calculate the capture zone of a typical drinking water well. A spatial probability analysis is performed to assess the probability that a capture zone overlaps with a septic system drainfield, depending on aquifer properties and lot and drainfield size. We show that a high septic system density leads to a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We conclude that mass balance calculations applied at a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and for those that are harmful even at low concentrations.

  2. Drug rash with eosinophilia and systemic symptoms syndrome associated with use of phenytoin, divalproex sodium, and phenobarbital.

    PubMed

    Brizendine, Christina E; Naik, Paras J

    2013-03-15

    A probable case of drug reaction with eosinophilia and systemic symptoms (DRESS) associated with consecutive use of three medications for seizure control is reported. A 36-year-old woman was treated at a community hospital for a mild fever (37.9°C) and diffuse raised maculopapular rash with erythema. Three weeks previously, she had been diagnosed with a seizure disorder and initiated on phenytoin (dose unknown) at that time; about two weeks later, she developed a rash, prompting a switch from phenytoin to extended-release divalproex sodium 250 mg orally twice daily. During the week after discontinuation of phenytoin, the rash was improving, but about five days after the initiation of divalproex therapy, she had worsening rash and pruritus requiring urgent treatment; the divalproex was discontinued, and phenobarbital 30 mg three times daily was initiated for continued seizure control. Despite the discontinuation of phenytoin and divalproex, the patient's hepatic function worsened over five days, and phenobarbital therapy was discontinued. With continued deterioration of the patient's condition to fulminant hepatic failure, a transfer to a liver transplant facility was arranged. The use of the adverse reaction probability scale of Naranjo et al. in this case yielded a score of 8, indicating a probable relationship between DRESS and the serial use of phenytoin, divalproex, and phenobarbital. After receiving phenytoin for treatment of seizure disorder, a 36-year-old woman developed a fever and maculopapular rash with erythema. This reaction continued even after drug therapy was switched to extended-release divalproex and then phenobarbital. The patient's liver function deteriorated despite discontinuation of all seizure medications.

  3. ERP Correlates of Verbal and Numerical Probabilities in Risky Choices: A Two-Stage Probability Processing View

    PubMed Central

    Li, Shu; Du, Xue-Lei; Li, Qi; Xuan, Yan-Hua; Wang, Yun; Rao, Li-Lin

    2016-01-01

    Two kinds of probability expressions, verbal and numerical, have been used to characterize the uncertainty that people face. However, the question of whether verbal and numerical probabilities are cognitively processed in a similar manner remains unresolved. From a levels-of-processing perspective, verbal and numerical probabilities may be processed differently during early sensory processing but similarly in later semantic-associated operations. This event-related potential (ERP) study investigated the neural processing of verbal and numerical probabilities in risky choices. The results showed that verbal probability and numerical probability elicited different N1 amplitudes but that verbal and numerical probabilities elicited similar N2 and P3 waveforms in response to different levels of probability (high to low). These results are consistent with a levels-of-processing framework and suggest some internal consistency between the cognitive processing of verbal and numerical probabilities in risky choices. Our findings shed light on possible mechanisms underlying probability expression and may provide neural evidence to support the translation of verbal to numerical probabilities (or vice versa). PMID:26834612

  4. REGIONAL-SCALE FISH ECOLOGY IN NORTHEASTERN USA LAKES USING A PROBABILITY-BASED SURVEY DESIGN

    EPA Science Inventory

    Historically, most fish ecology has been done at local scales. As these data accumulate, the need to set this knowledge into landscape, regional, and historical context grows. There are important broad-scale issues (e.g., non-point source pollution, biodiversity loss, alien spe...

  5. An interaction between levodopa and enteral nutrition resulting in neuroleptic malignant-like syndrome and prolonged ICU stay.

    PubMed

    Bonnici, André; Ruiner, Carola-Ellen; St-Laurent, Lyne; Hornstein, David

    2010-09-01

    To describe a probable interaction between enteral feeds and levodopa leading to neuroleptic malignant-like syndrome (NMLS) in a polytrauma patient with Parkinson's disease (PD). A 63-year-old morbidly obese male polytrauma patient with PD and type 2 diabetes mellitus was admitted to our intensive care unit postoperatively. Enteral feeds were administered per nasogastric tube and provided 0.88 g/kg/day of protein based on ideal body weight (IBW). His PD medications (pramipexole, entacapone, and immediate-release levodopa/carbidopa 100 mg/25 mg, 1.5 tablets 4 times daily) were administered via nasogastric tube. To achieve better glycemic control, his enteral feeds were changed to a formula that provided 1.8 g/kg/day of protein based on IBW. In the following 24 hours, the patient's mental status deteriorated and he was reintubated. He developed a high fever (40.5 degrees C), leukocytosis, elevated serum creatine kinase (CK) (480-1801 units/L), and acute renal impairment. His enteral nutrition was changed to decrease protein intake to 1.0 g/kg/day based on IBW, and he was given bromocriptine 5 mg 3 times daily via nasogastric tube. Within 24 hours, the patient's mental status improved, his temperature and CK decreased, and his renal function began to improve; the values returned to baseline levels on the 18th day of admission. Withdrawal or dose reduction of levodopa in patients with PD has been reported to precipitate NMLS, which is potentially fatal. Because dietary protein can decrease the absorption of levodopa, a potential for an interaction between levodopa and enteral feedings exists, although published reports of such an interaction are limited. In this patient, the likelihood that a drug-nutrient interaction occurred between levodopa and enteral feedings is considered probable based on the Naranjo probability scale and the Horn Drug Interaction Probability Scale. Health-care professionals should be aware of the interaction between levodopa and protein

  6. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  7. Probability density function of non-reactive solute concentration in heterogeneous porous formations.

    PubMed

    Bellin, Alberto; Tonina, Daniele

    2007-10-30

    Available models of solute transport in heterogeneous formations fail to provide a complete characterization of the predicted concentration. This is a serious drawback, especially in risk analysis, where confidence intervals and probabilities of exceeding threshold values are required. Our contribution to filling this gap of knowledge is a probability distribution model for the local concentration of conservative tracers migrating in heterogeneous aquifers. Our model accounts for dilution, mechanical mixing within the sampling volume, and spreading due to formation heterogeneity. It is developed by modeling local concentration dynamics with an Ito Stochastic Differential Equation (SDE) that, under the hypothesis of statistical stationarity, leads to the Beta probability distribution function (pdf) for the solute concentration. This model shows large flexibility in capturing the smoothing effect of the sampling volume and the associated reduction of the probability of exceeding large concentrations. Furthermore, it is fully characterized by the first two moments of the solute concentration, which are the same pieces of information required for standard geostatistical techniques employing Normal or Log-Normal distributions. Additionally, we show that in the absence of pore-scale dispersion and for point concentrations the pdf model converges to the binary distribution of [Dagan, G., 1982. Stochastic modeling of groundwater flow by unconditional and conditional probabilities, 2, The solute transport. Water Resour. Res. 18 (4), 835-848.], while it approaches the Normal distribution for sampling volumes much larger than the characteristic scale of the aquifer heterogeneity. Furthermore, we demonstrate that the same model, with the spatial moments replacing the statistical moments, can be applied to estimate the proportion of the plume volume where solute concentrations are above or below critical thresholds. Application of this model to point and vertically averaged bromide
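
    Because the model is fully characterized by the first two concentration moments, exceedance probabilities follow from a method-of-moments fit of the Beta pdf. A sketch (concentration normalized to [0, 1]; the numbers are illustrative):

        from scipy import stats

        def beta_from_moments(mean, var):
            # Method-of-moments Beta parameters for a concentration
            # normalized to [0, 1] (e.g. C / C0);
            # requires var < mean * (1 - mean).
            nu = mean * (1.0 - mean) / var - 1.0
            return mean * nu, (1.0 - mean) * nu

        def prob_exceeding(threshold, mean, var):
            # Probability that the local concentration exceeds a critical value.
            a, b = beta_from_moments(mean, var)
            return stats.beta.sf(threshold, a, b)

        print(prob_exceeding(0.5, mean=0.2, var=0.02))   # illustrative numbers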

  8. Multi-scale occupancy estimation and modelling using multiple detection methods

    USGS Publications Warehouse

    Nichols, James D.; Bailey, Larissa L.; O'Connell, Allan F.; Talancy, Neil W.; Grant, Evan H. Campbell; Gilbert, Andrew T.; Annand, Elizabeth M.; Husband, Thomas P.; Hines, James E.

    2008-01-01

    Occupancy estimation and modelling based on detection–nondetection data provide an effective way of exploring change in a species' distribution across time and space in cases where the species is not always detected with certainty. Today, many monitoring programmes target multiple species, or life stages within a species, requiring the use of multiple detection methods. When multiple methods or devices are used at the same sample sites, animals can be detected by more than one method. We develop occupancy models for multiple detection methods that permit simultaneous use of data from all methods for inference about method-specific detection probabilities. Moreover, the approach permits estimation of occupancy at two spatial scales: the larger scale corresponds to species' use of a sample unit, whereas the smaller scale corresponds to presence of the species at the local sample station or site. We apply the models to data collected on two different vertebrate species: striped skunks Mephitis mephitis and red salamanders Pseudotriton ruber. For striped skunks, large-scale occupancy estimates were consistent between two sampling seasons. Small-scale occupancy probabilities were slightly lower in the late winter/spring when skunks tend to conserve energy, and movements are limited to males in search of females for breeding. There was strong evidence of method-specific detection probabilities for skunks. As anticipated, large- and small-scale occupancy areas completely overlapped for red salamanders. The analyses provided weak evidence of method-specific detection probabilities for this species. Synthesis and applications. Increasingly, many studies are utilizing multiple detection methods at sampling locations. The modelling approach presented here makes efficient use of detections from multiple methods to estimate occupancy probabilities at two spatial scales and to compare detection probabilities associated with different detection methods. The models can be
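
    A simplified version of the two-scale likelihood for one sample unit, assuming independent stations and methods (notation ours: psi for large-scale use, theta for small-scale presence, p[m] for method-specific detection; the published model is more general):

        import numpy as np

        def detection_history_prob(history, psi, theta, p):
            # Probability of one sample unit's detection history under a
            # two-scale occupancy model: psi = large-scale use of the unit,
            # theta = small-scale presence at a station, p[m] = detection
            # probability of method m. `history` is a (stations x methods)
            # 0/1 NumPy array.
            p = np.asarray(p, dtype=float)
            prob_if_used = 1.0
            for station in history:
                d_occ = np.prod(np.where(station == 1, p, 1.0 - p))
                d_unocc = 0.0 if station.any() else 1.0
                prob_if_used *= theta * d_occ + (1.0 - theta) * d_unocc
            never_detected = 0.0 if history.any() else 1.0
            return psi * prob_if_used + (1.0 - psi) * never_detected

        history = np.array([[0, 1], [0, 0], [1, 1]])   # 3 stations x 2 methods
        print(detection_history_prob(history, psi=0.8, theta=0.6, p=[0.3, 0.5]))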

  9. Rejecting probability summation for radial frequency patterns, not so Quick!

    PubMed

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than what would be possible if the parts were detected independently and performance improved with an increasing number of cycles only through probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT) rather than Signal Detection Theory (SDT). We conducted rating-scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments, finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross-validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF pattern. Copyright © 2016 Elsevier Ltd. All rights reserved.
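
    Under SDT, probability summation is usually implemented as a max rule. A Monte Carlo sketch for a two-interval task (the number of monitored mechanisms and all other settings are arbitrary choices, not the paper's fitted values):

        import numpy as np

        rng = np.random.default_rng(1)

        def pc_max_rule(dprime, n_cycles, n_monitored=60, trials=100_000):
            # SDT probability summation as a max rule in a two-interval task:
            # the observer monitors n_monitored independent mechanisms and
            # picks the interval with the larger maximum response; n_cycles
            # mechanisms carry signal at strength d' in the signal interval.
            sig = rng.normal(0.0, 1.0, (trials, n_monitored))
            sig[:, :n_cycles] += dprime
            noise = rng.normal(0.0, 1.0, (trials, n_monitored))
            return np.mean(sig.max(axis=1) > noise.max(axis=1))

        # Proportion correct grows with the number of signal-carrying cycles.
        for k in (1, 2, 4, 8):
            print(k, pc_max_rule(dprime=1.5, n_cycles=k))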

  10. Dynamic Encoding of Speech Sequence Probability in Human Temporal Cortex

    PubMed Central

    Leonard, Matthew K.; Bouchard, Kristofer E.; Tang, Claire

    2015-01-01

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. PMID:25948269

  11. Trait mindfulness, reasons for living and general symptom severity as predictors of suicide probability in males with substance abuse or dependence.

    PubMed

    Mohammadkhani, Parvaneh; Khanipour, Hamid; Azadmehr, Hedieh; Mobramm, Ardeshir; Naseri, Esmaeil

    2015-01-01

    The aim of this study was to evaluate suicide probability in Iranian males with substance abuse or dependence disorder and to investigate predictors of suicide probability based on trait mindfulness, reasons for living, and severity of general psychiatric symptoms. Participants were 324 individuals with substance abuse or dependence in an outpatient setting and in prison. The Reasons for Living Questionnaire, the Mindfulness Attention Awareness Scale, and the Suicide Probability Scale were used as instruments. The sample was selected by convenience sampling. Data were analyzed using SPSS and AMOS. The lifetime prevalence of suicide attempt was 35% in the outpatient setting and 42% in the prison setting. Suicide probability in the prison setting was significantly higher than in the outpatient setting (p<0.001). The severity of general symptoms strongly correlated with suicide probability. Trait mindfulness, not reasons-for-living beliefs, had a mediating effect in the relationship between the severity of general symptoms and suicide probability. Fear of social disapproval, survival and coping beliefs, and child-related concerns significantly predicted suicide probability (p<0.001). It could be suggested that trait mindfulness was more effective in preventing suicide probability than beliefs about reasons for living in individuals with substance abuse or dependence disorders. The severity of general symptoms should be regarded as an important risk factor for suicide probability.

  12. Decision analysis with approximate probabilities

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas

    1992-01-01

    This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied. This is due to the fact that some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decisionmaking using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.
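
    The two rounding levels are easy to enumerate; counting the feasible rounded triples shows how much the sum-to-one constraint shrinks the probability grid (a small sketch, ours):

        import itertools

        # Level 1: each of the three state probabilities is separately rounded
        # to the nearest tenth, so a rounded triple may sum to 0.9, 1.0 or 1.1.
        separately_rounded = [t for t in itertools.product(range(11), repeat=3)
                              if 9 <= sum(t) <= 11]
        # Level 2: rounding constrained so the triple sums to exactly 1.0.
        constrained = [tuple(x / 10 for x in t)
                       for t in separately_rounded if sum(t) == 10]
        print(len(separately_rounded), len(constrained))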

  13. Probability of assertive behaviour, interpersonal anxiety and self-efficacy of South African registered dietitians.

    PubMed

    Paterson, Marie; Green, J M; Basson, C J; Ross, F

    2002-02-01

    There is little information on the probability of assertive behaviour, interpersonal anxiety and self-efficacy in the literature regarding dietitians. The objective of this study was to establish baseline information of these attributes and the factors affecting them. Questionnaires collecting biographical information and self-assessment psychometric scales measuring levels of probability of assertiveness, interpersonal anxiety and self-efficacy were mailed to 350 subjects, who comprised a random sample of dietitians registered with the Health Professions Council of South Africa. Forty-one per cent (n=145) of the sample responded. Self-assessment inventory results were compared to test levels of probability of assertive behaviour, interpersonal anxiety and self-efficacy. The inventory results were compared with the biographical findings to establish statistical relationships between the variables. The hypotheses were formulated before data collection. Dietitians had acceptable levels of probability of assertive behaviour and interpersonal anxiety. The probability of assertive behaviour was significantly lower than the level noted in the literature and was negatively related to interpersonal anxiety and positively related to self-efficacy.

  14. Excluding joint probabilities from quantum theory

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  15. Levofloxacin-induced tendinopathy of the hip.

    PubMed

    Ganske, Corrine M; Horning, Kristin K

    2012-05-01

    To describe what we believe to be the first reported possible case of tendinopathy of the hip in a patient receiving levofloxacin. A 58-year-old male with recurrent otitis media was admitted for left lateral hip pain of 10 on a scale of 10. He had started a 5-day course of levofloxacin 750 mg/day 10 days before he began experiencing pain. He also took simvastatin 20 mg/day and walked 90 minutes each day. He was treated with oxycodone with acetaminophen and physical therapy. His pain had improved significantly at a 10-day recheck. Fluoroquinolone-induced tendinopathy has been well-reported in the literature, but most cases involve pefloxacin and affect the Achilles tendon. Only 11 cases of tendinopathy have been reported with levofloxacin based on a MEDLINE search (1966-December 2011). This is the first known case reported that involved tendinopathy of the hip believed to be caused by fluoroquinolones. The Naranjo probability scale revealed a possible adverse reaction of levofloxacin-induced tendinopathy of the hip. Contributing factors likely included the high dose of levofloxacin, concomitant use of a statin, and strenuous physical activity. Health care professionals should be aware of the possibility of tendinopathy of the hip in patients who receive fluoroquinolones. Thorough history for possible risk factors should be obtained. Patients on fluoroquinolones at risk for tendinopathy should be counseled to avoid strenuous physical activity.

  16. Subjective Probabilities in Household Surveys

    PubMed Central

    Hurd, Michael D.

    2011-01-01

    Subjective probabilities are now collected on a number of large household surveys with the objective of providing data to better understand intertemporal decision making. Comparison of subjective probabilities with actual outcomes shows that the probabilities have substantial predictive power in situations where individuals have considerable private information, such as survival and retirement. In contrast, the subjective probability of a stock market gain varies greatly across individuals even though no one has private information and the outcome is the same for everyone. An explanation is that there is considerable variation in accessing and processing information. Further, the subjective probability of a stock market gain is considerably lower than historical averages, providing an explanation for the relatively low frequency of stock holding. An important research objective will be to understand how individuals form their subjective probabilities. PMID:21643535

  17. Probability concepts in quality risk management.

    PubMed

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  18. The perception of probability.

    PubMed

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  19. Knot probability of polygons subjected to a force: a Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Janse van Rensburg, E. J.; Orlandini, E.; Tesi, M. C.; Whittington, S. G.

    2008-01-01

    We use Monte Carlo methods to study the knot probability of lattice polygons on the cubic lattice in the presence of an external force f. The force is coupled to the span of the polygons along a lattice direction, say the z-direction. If the force is negative, polygons are squeezed (the compressive regime), while positive forces tend to stretch the polygons along the z-direction (the tensile regime). For sufficiently large positive forces we verify that the Pincus scaling law in the force-extension curve holds. At a fixed number of edges n the knot probability is a decreasing function of the force. For a fixed force the knot probability approaches unity as 1 - exp(-α0(f)n + o(n)), where α0(f) is positive and a decreasing function of f. We also examine the average of the absolute value of the writhe and we verify the square root growth law (known for f = 0) for all values of f.

  20. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
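
    As a quick illustration of the formula above, a minimal Python sketch of Prelec's weighting function follows; the value alpha = 0.65 is an arbitrary illustrative choice, not a parameter from the paper.

        import math

        def prelec_weight(p: float, alpha: float = 0.65) -> float:
            """Prelec (1998) weighting: w(p) = exp(-(-ln p)**alpha), 0 < alpha < 1."""
            if p <= 0.0:
                return 0.0
            if p >= 1.0:
                return 1.0
            return math.exp(-((-math.log(p)) ** alpha))

        # The fixed points hold for any alpha: w(0) = 0, w(1/e) = 1/e, w(1) = 1.
        assert abs(prelec_weight(1 / math.e) - 1 / math.e) < 1e-12
        for p in (0.01, 0.1, 0.5, 0.9, 0.99):
            # Small probabilities are overweighted, large ones underweighted.
            print(f"w({p:.2f}) = {prelec_weight(p):.3f}")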

  1. BIODEGRADATION PROBABILITY PROGRAM (BIODEG)

    EPA Science Inventory

    The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...

  2. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    NASA Technical Reports Server (NTRS)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
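
    As a rough sketch of the distributional idea (not the authors' Bayesian mixed model, which also borrows strength across events), the Python code below writes down the log-likelihood of a simple zero-inflated Beta model; the parameter names pi, a, b and the toy data are illustrative assumptions.

        import numpy as np
        from scipy import stats

        def zib_loglik(params, y):
            """Log-likelihood of a zero-inflated Beta model.

            params = (pi, a, b): pi is the probability of an exact zero,
            (a, b) are Beta shape parameters for the continuous part on (0, 1).
            y holds scaled, bounded values, with exact zeros standing in for
            'effectively zero' collision probabilities.
            """
            pi, a, b = params
            if not (0 < pi < 1 and a > 0 and b > 0):
                return -np.inf
            y = np.asarray(y)
            zeros = y == 0
            ll = zeros.sum() * np.log(pi)                    # point mass at zero
            ll += (~zeros).sum() * np.log1p(-pi)             # weight of continuous part
            ll += stats.beta.logpdf(y[~zeros], a, b).sum()   # Beta density on (0, 1)
            return ll

        # Toy usage: a mixture of 30% exact zeros and Beta(2, 5) draws.
        rng = np.random.default_rng(0)
        y = np.where(rng.random(500) < 0.3, 0.0, rng.beta(2, 5, 500))
        print(zib_loglik((0.3, 2.0, 5.0), y))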

  3. Definition of the Neutrosophic Probability

    NASA Astrophysics Data System (ADS)

    Smarandache, Florentin

    2014-03-01

    Neutrosophic probability (or likelihood) [1995] is a particular case of the neutrosophic measure. It is an estimation that an event (different from indeterminacy) occurs, together with an estimation that some indeterminacy may occur, and an estimation that the event does not occur. Classical probability deals with fair dice, coins, roulettes, spinners, decks of cards, and random walks, while neutrosophic probability deals with unfair or imperfect such objects and processes. For example, if we toss a regular die on an irregular surface which has cracks, then it is possible for the die to get stuck on one of its edges or vertices in a crack (an indeterminate outcome). The sample space in this case is {1, 2, 3, 4, 5, 6, indeterminacy}. So the probability of getting, for example, 1 is less than 1/6, since there are seven outcomes. Neutrosophic probability is a generalization of classical probability because, when the chance of indeterminacy of a stochastic process is zero, the two probabilities coincide. The neutrosophic probability that an event A occurs is NP(A) = (ch(A), ch(indet A), ch(Ā)) = (T, I, F), where T, I, F are subsets of [0,1]; T is the chance that A occurs, denoted ch(A); I is the indeterminate chance related to A, ch(indet A); and F is the chance that A does not occur, ch(Ā). So NP is a generalization of imprecise probability as well. If T, I, and F are crisp numbers, then ⁻0 ≤ T + I + F ≤ 3⁺. We used the same notations (T, I, F) as in neutrosophic logic and sets.

  4. Why does Japan use the probability method to set design flood?

    NASA Astrophysics Data System (ADS)

    Nakamura, S.; Oki, T.

    2015-12-01

    Design flood is the hypothetical flood used to draw up a flood prevention plan. In Japan, a probability method based on precipitation data is used to define the scale of the design flood: for the Tone River, the biggest river in Japan, it is 1 in 200 years; for the Shinano River, 1 in 150 years; and so on. How to set a reasonable and acceptable design flood in a changing world is an important socio-hydrological issue. The method used to set the design flood varies among countries. The probability method is also used in the Netherlands, but there the base data are water level or discharge data and the probability is 1 in 1250 years (in the fresh water section). On the other hand, the USA and China apply the maximum flood method, which sets the design flood based on the historical or probable maximum flood. This raises the questions: why does the method vary among countries, and why does Japan use the probability method? The purpose of this study is to clarify, based on the literature, the historical process by which the probability method was developed in Japan. In the late 19th century, the concept of "discharge" and modern river engineering were imported by Dutch engineers, and modern flood prevention plans were developed in Japan. In these plans, the design floods were set based on the historical maximum method. The historical maximum method was used until World War 2, but it was changed to the probability method after the war because of limitations of the historical maximum method under the specific socio-economic situation: (1) budget limitations due to the war and the GHQ occupation, and (2) the historical floods (the Makurazaki typhoon in 1945, the Kathleen typhoon in 1947, the Ione typhoon in 1948, and so on) that attacked Japan, broke the records of historical maximum discharge in main rivers, and made the flood prevention projects difficult to complete. Japanese hydrologists then imported hydrological probability statistics from the West to take account of

  5. Investigation of the relationship between suicide probability in inpatients and their psychological symptoms and coping strategies.

    PubMed

    Avci, Dilek; Sabanciogullar, Selma; Yilmaz, Feride T

    2016-10-01

    To investigate the relationship between suicide probability and psychological symptoms and coping strategies in hospitalized patients with physical illness. This cross-sectional study was conducted from April to June 2014 in Bandirma State Hospital, Balikesir, Turkey. The sample consisted of 470 inpatients who met the inclusion criteria and agreed to participate in the study. The data were collected with the Personal Information Form, Suicide Probability Scale, Brief Symptom Inventory, and Ways of Coping with Stress Inventory. In the study, 74.7% of patients were at moderate risk for suicide, whereas 20.4% were at high risk. According to the stepwise multiple linear regression analysis, sub-dimensions of the Ways of Coping with Stress Inventory and the Brief Symptom Inventory were significant predictors of suicide probability. The majority of the patients with physical illness were at risk for suicide. Individuals who had psychological symptoms and used maladaptive coping strategies obtained significantly higher suicide probability scores.

  6. Continental-scale simulation of burn probabilities, flame lengths, and fire size distribution for the United States

    Treesearch

    Mark A. Finney; Charles W. McHugh; Isaac Grenfell; Karin L. Riley

    2010-01-01

    Components of a quantitative risk assessment were produced by simulation of burn probabilities and fire behavior variation for 134 fire planning units (FPUs) across the continental U.S. The system uses fire growth simulation of ignitions modeled from relationships between large fire occurrence and the fire danger index Energy Release Component (ERC). Simulations of 10,...

  7. Methemoglobinemia secondary to topical benzocaine use in a lung transplant patient.

    PubMed

    LeClaire, Aimée C; Mullett, Timothy W; Jahania, M Salik; Flynn, Jeremy D

    2005-02-01

    To report a case of methemoglobinemia secondary to the administration of topical benzocaine spray in an anemic patient who had previously undergone a lung transplant. A 40-year-old white man with a past medical history significant for lung transplant acutely decompensated following oropharyngeal administration of topical benzocaine spray. Subsequent blood analysis revealed a methemoglobin concentration of 51.2%. Following the administration of a single dose of methylene blue 2 mg/kg intravenously, the patient's respiratory status dramatically improved and stabilized. Methemoglobinemia is a rare but potentially fatal condition that may be either acquired or congenital; however, the disorder is most commonly acquired secondary to exposure to oxidizing chemicals, which are often routinely prescribed medications, including benzocaine. Benzocaine can react with hemoglobin to form methemoglobin at a rate that exceeds reduction capabilities, which may result in oxygenation difficulty and respiratory distress. In severe or symptomatic methemoglobinemia, the treatment of choice is methylene blue. Application of the Naranjo probability scale established a highly probable relationship between topical benzocaine spray and methemoglobinemia and associated respiratory compromise. The risks of palliative use of topical benzocaine in patients with preexisting disorders that compromise oxygen delivery may outweigh any benefit. In our patient, anemia and lung disease increased his risk for clinically significant adverse respiratory events secondary to deficiencies or interferences in oxygen delivery. Topical benzocaine should be administered with caution and careful monitoring in such patient populations.

  8. Worsening psychosis induced by varenicline in a hospitalized psychiatric patient.

    PubMed

    DiPaula, Bethany A; Thomas, Michele D

    2009-07-01

    Varenicline is a novel treatment for smoking cessation; however, the agent has not been well studied in a population with severe mental illness. Varenicline can reportedly cause neuropsychiatric adverse effects, some resulting in hospitalizations and/or suicides. We describe a case of clinician-observed, worsening psychotic symptoms in a patient with chronic mental illness who was receiving varenicline. A 45-year-old woman with bipolar disorder, mixed type with psychotic features, was admitted to a psychiatric hospital due to acute decompensation after she discontinued her drug therapy. Because of the facility's smoke-free policy, the patient was not permitted to smoke cigarettes during her hospitalization. Over the next several weeks, her condition was stabilized with psychotropic drugs. Her symptoms improved, and plans were made for her discharge. Varenicline was prescribed to manage her nicotine cravings. After 2 days of treatment, staff members noted worsening of the patient's psychotic symptoms and agitation. Varenicline was discontinued, the patient's mental status returned to baseline, and she was subsequently discharged. Use of the Naranjo adverse drug reaction probability scale indicated a probable relationship (score of 7) between the patient's worsening psychosis and her varenicline therapy. This case report provides valuable support of previously published cases that demonstrate the risk of exacerbation of psychotic symptoms with varenicline use in patients with severe mental illness. With proper assessment and management of varenicline-induced neuropsychiatric effects, health care professionals can play an important role in helping to prevent and manage worsening psychiatric symptoms.
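
    For readers unfamiliar with the scale, the conventional mapping from a total Naranjo score to a causality category (definite >= 9, probable 5-8, possible 1-4, doubtful <= 0) can be written as a small lookup; this sketch encodes the standard published cutoffs, not anything specific to this case report.

        def naranjo_category(score: int) -> str:
            """Map a total Naranjo score to its conventional causality category."""
            if score >= 9:
                return "definite"
            if score >= 5:
                return "probable"
            if score >= 1:
                return "possible"
            return "doubtful"

        print(naranjo_category(7))  # "probable", matching the varenicline case above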

  9. Cluster membership probability: polarimetric approach

    NASA Astrophysics Data System (ADS)

    Medhi, Biman J.; Tamura, Motohide

    2013-04-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q (per cent) and u (per cent) for the proper-motion member stars depends on the interstellar and intracluster differential reddening in the open cluster. It is found that this method could be used to estimate the cluster membership probability if we have additional polarimetric and photometric information for a star to identify it as a probable member/non-member of a particular cluster, such as the maximum wavelength value (λmax), the unit weight error of the fit (σ1), the dispersion in the polarimetric position angles (ε̄), reddening (E(B - V)) or the differential intracluster reddening (ΔE(B - V)). This method could also be used to estimate the membership probability of known member stars having no membership probability, as well as to resolve disagreements about membership among different proper-motion surveys.

  10. Wildland fire probabilities estimated from weather model-deduced monthly mean fire danger indices

    Treesearch

    Haiganoush K. Preisler; Shyh-Chin Chen; Francis Fujioka; John W. Benoit; Anthony L. Westerling

    2008-01-01

    The National Fire Danger Rating System indices deduced from a regional simulation weather model were used to estimate probabilities and numbers of large fire events on monthly and 1-degree grid scales. The weather model simulations and forecasts are ongoing experimental products from the Experimental Climate Prediction Center at the Scripps Institution of Oceanography...

  11. Clinical and histologic features of acute-onset erythroderma in dogs with gastrointestinal disease: 18 cases (2005-2015).

    PubMed

    Cain, Christine L; Bradley, Charles W; Mauldin, Elizabeth A

    2017-12-15

    OBJECTIVE To describe the clinical and histologic features of acute erythroderma in dogs with gastrointestinal disease. DESIGN Retrospective case series. ANIMALS 18 dogs with erythroderma and gastrointestinal disease. PROCEDURES Medical records and biopsy specimens were reviewed. Information collected from medical records included signalment, clinical signs, physical examination and diagnostic test results, treatment, and outcome. The Naranjo algorithm was used to estimate the probability of an adverse drug reaction for each dog. RESULTS All dogs had an acute onset of erythematous macules or generalized erythroderma. Histologic features of skin biopsy specimens had 3 patterns representing a progressive spectrum of inflammation. Most dogs had vomiting (n = 17) and hematochezia (10). Signs of gastrointestinal disease became evident before, after, or concurrent with the onset of skin lesions in 10, 3, and 5 dogs, respectively. Inflammatory bowel disease, pancreatitis, and adverse food reaction were diagnosed in 5, 3, and 3 dogs, respectively. The cause of the gastrointestinal signs was not identified for 8 dogs. Eight dogs had a Naranjo score consistent with a possible adverse drug reaction. Treatment of skin lesions included drug withdrawal (n = 15), antihistamines (16), and corticosteroids (14). Signs of gastrointestinal disease and skin lesions resolved at a mean of 4.6 days and 20.8 days, respectively, after onset. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated acute erythroderma may be associated with > 1 gastrointestinal disease or an adverse drug reaction in some dogs. Recognition of the clinical and histologic features of this syndrome is essential for accurate diagnosis.

  12. Predicting the cosmological constant with the scale-factor cutoff measure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Simone, Andrea; Guth, Alan H.; Salem, Michael P.

    2008-09-15

    It is well known that anthropic selection from a landscape with a flat prior distribution of cosmological constant Λ gives a reasonable fit to observation. However, a realistic model of the multiverse has a physical volume that diverges with time, and the predicted distribution of Λ depends on how the spacetime volume is regulated. A very promising method of regulation uses a scale-factor cutoff, which avoids a number of serious problems that arise in other approaches. In particular, the scale-factor cutoff avoids the 'youngness problem' (high probability of living in a much younger universe) and the 'Q and G catastrophes' (high probability for the primordial density contrast Q and gravitational constant G to have extremely large or small values). We apply the scale-factor cutoff measure to the probability distribution of Λ, considering both positive and negative values. The results are in good agreement with observation. In particular, the scale-factor cutoff strongly suppresses the probability for values of Λ that are more than about 10 times the observed value. We also discuss qualitatively the prediction for the density parameter Ω, indicating that with this measure there is a possibility of detectable negative curvature.

  13. Teachers' Understandings of Probability

    ERIC Educational Resources Information Center

    Liu, Yan; Thompson, Patrick

    2007-01-01

    Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…

  14. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  15. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
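
    As a concrete, hedged illustration of a maximum entropy assignment under a constraint (a generic textbook construction, not the paper's quantum setting), the sketch below finds the MaxEnt distribution over discrete outcomes whose mean is fixed; the Gibbs form p_i ∝ exp(-λ x_i) is the known analytic solution, and λ is found numerically.

        import numpy as np
        from scipy.optimize import brentq

        def maxent_distribution(x, mean_target):
            """Maximum-entropy distribution over outcomes x with a fixed mean."""
            x = np.asarray(x, dtype=float)

            def mean_gap(lam):
                w = np.exp(-lam * x)            # unnormalized Gibbs weights
                return (w @ x) / w.sum() - mean_target

            lam = brentq(mean_gap, -50.0, 50.0)  # solve the mean constraint
            p = np.exp(-lam * x)
            return p / p.sum()

        # Toy example: die faces 1..6 constrained to average 4.5 (Jaynes' loaded die).
        p = maxent_distribution(np.arange(1, 7), 4.5)
        print(p.round(4), (p * np.arange(1, 7)).sum())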

  16. Mathematical Analysis of Vehicle Delivery Scale of Bike-Sharing Rental Nodes

    NASA Astrophysics Data System (ADS)

    Zhai, Y.; Liu, J.; Liu, L.

    2018-04-01

    Aiming at the lack of scientific, reasonable judgment of vehicle delivery scale and the insufficient optimization of scheduling decisions, and based on features of bike-sharing usage, this paper analyses the applicability of a discrete-time, discrete-state Markov chain and proves its properties to be irreducible, aperiodic and positive recurrent. Based on the above analysis, the paper concludes that the limit-state (steady-state) probability of the bike-sharing Markov chain exists and is independent of the initial probability distribution. The paper then analyses the difficulty of estimating the transition probability matrix parameters and of solving the linear equation group in the traditional solution algorithm for the bike-sharing Markov chain. To improve feasibility, this paper proposes a "virtual two-node vehicle scale solution" algorithm, which treats all the nodes besides the node to be solved as a single virtual node, and provides the transition probability matrix, the steady-state linear equation group, and the computational methods for the steady-state scale, steady-state arrival time and scheduling decision of the node to be solved. Finally, the paper evaluates the rationality and accuracy of the steady-state probability of the proposed algorithm by comparing it with the traditional algorithm. By solving the steady-state scale of the nodes one by one, the proposed algorithm is shown to be highly feasible, because it lowers the computational difficulty and reduces the number of statistics required, which will help bike-sharing companies optimize the scale and scheduling of nodes.
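
    Independently of the paper's virtual two-node reduction, the steady-state probabilities it relies on can be computed directly for a small chain by solving πP = π together with Σπ = 1; the 3-node transition matrix below is invented purely for illustration.

        import numpy as np

        def stationary_distribution(P):
            """Steady-state probabilities pi of an irreducible, aperiodic Markov chain.

            Solves pi @ P = pi together with sum(pi) = 1 as one linear system.
            """
            n = P.shape[0]
            A = np.vstack([P.T - np.eye(n), np.ones(n)])  # (pi P - pi = 0) and sum = 1
            b = np.zeros(n + 1)
            b[-1] = 1.0
            pi, *_ = np.linalg.lstsq(A, b, rcond=None)
            return pi

        # Toy 3-node example: rows are rental nodes, entries are ride transition shares.
        P = np.array([[0.5, 0.3, 0.2],
                      [0.2, 0.6, 0.2],
                      [0.3, 0.3, 0.4]])
        print(stationary_distribution(P))  # limiting share of bikes at each node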

  17. Scale Dependence of Spatiotemporal Intermittence of Rain

    NASA Technical Reports Server (NTRS)

    Kundu, Prasun K.; Siddani, Ravi K.

    2011-01-01

    It is a common experience that rainfall is intermittent in space and time. This is reflected by the fact that the statistics of area- and/or time-averaged rain rate are described by a mixed distribution with a nonzero probability of having the sharp value zero. In this paper we have explored the dependence of the probability of zero rain on the averaging space and time scales in large multiyear data sets based on radar and rain gauge observations. A stretched exponential formula fits the observed scale dependence of the zero-rain probability. The proposed formula makes it apparent that the space-time support of the rain field is not quite a set of measure zero, as is sometimes supposed. We also give an explanation of the observed behavior in terms of a simple probabilistic model based on the premise that the rainfall process has an intrinsic memory.
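
    The abstract does not quote the fitted expression; purely for orientation, a generic stretched-exponential dependence of the zero-rain probability p_0 on an averaging scale s has the form below, where the scale constant s_0 and the exponent β are illustrative symbols assumed here, not the paper's fitted values:

        p_0(s) = \exp\!\left[ -\left( \frac{s}{s_0} \right)^{\beta} \right], \qquad 0 < \beta < 1.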

  18. Probability and Quantum Paradigms: the Interplay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kracklauer, A. F.

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  19. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where the time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.

  20. Solving probability reasoning based on DNA strand displacement and probability modules.

    PubMed

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Dual Diagnosis and Suicide Probability in Poly-Drug Users.

    PubMed

    Youssef, Ismail M; Fahmy, Magda T; Haggag, Wafaa L; Mohamed, Khalid A; Baalash, Amany A

    2016-02-01

    To determine the frequency of suicidal thoughts and suicide probability among poly-substance abusers in a Saudi population, and to examine the relation between dual diagnosis and suicidal thoughts. Case control study. Al-Baha Psychiatric Hospital, Saudi Arabia, from May 2011 to June 2012. Participants were 239 subjects, aged 18 - 45 years. We reviewed 122 individuals who fulfilled the DSM-IV-TR criteria of substance abuse for two or more substances, and their data were compared with data collected from 117 control persons. Suicidal cases were highly prevalent among poly-substance abusers (64.75%). Amphetamine and cannabis were the most abused substances (87.7% and 70.49%, respectively). A statistically significant association with suicidality was found for longer duration of substance abuse (p < 0.001), use of alcohol (p = 0.001), amphetamine (p = 0.007), volatile substances (p = 0.034), and the presence of comorbid psychiatric disorders (dual diagnosis) such as substance-induced mood disorder (p = 0.001), schizo-affective disorder (p = 0.017), major depressive disorder (p = 0.001), and antisocial (p = 0.016) and borderline (p = 0.005) personality disorders. Suicidal cases showed significantly higher scores (p < 0.001) on the suicide probability scale and higher scores on the Beck depression inventory (p < 0.001). Abusing certain substances for a long duration, in addition to comorbid psychiatric disorders, especially those with a disturbed-mood element, may trigger suicidal thoughts in poly-substance abusers. Depression and suicide probability are common consequences of substance abuse.

  2. Multiscale Characterization of the Probability Density Functions of Velocity and Temperature Increment Fields

    NASA Astrophysics Data System (ADS)

    DeMarco, Adam Ward

    The turbulent motions within the atmospheric boundary layer exist over a wide range of spatial and temporal scales and are very difficult to characterize. Thus, to explore the behavior of such complex flow environments, it is customary to examine their properties from a statistical perspective. Utilizing the probability density functions of velocity and temperature increments, Δu and ΔT, respectively, this work investigates their multiscale behavior to uncover unique traits that have yet to be thoroughly studied. Utilizing diverse datasets, including idealized wind tunnel experiments, atmospheric turbulence field measurements, multi-year ABL tower observations, and mesoscale model simulations, this study reveals remarkable similarities (and some differences) between the small- and larger-scale components of the increment probability density functions. This comprehensive analysis also utilizes a set of statistical distributions to showcase their ability to capture features of the velocity and temperature increments' probability density functions (pdfs) across multiscale atmospheric motions. An approach is proposed for estimating these pdfs using the maximum likelihood estimation (MLE) technique, which has not previously been applied to atmospheric data. Using this technique, we reveal the ability to estimate higher-order moments accurately with a limited sample size, which has been a persistent concern for atmospheric turbulence research. With the use of robust goodness-of-fit (GoF) metrics, we quantitatively assess the accuracy of the distributions against the diverse datasets. Through this analysis, it is shown that the normal inverse Gaussian (NIG) distribution is a prime candidate for estimating the increment pdf fields. Therefore, using the NIG model and its parameters, we display the variations in the increments over a range of scales, revealing some unique scale-dependent qualities under various stability and flow conditions.
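
    As a minimal sketch of the MLE step described above (using SciPy's generic maximum likelihood fitter rather than the author's pipeline), one can fit the NIG distribution to a sample standing in for velocity increments; all parameter values here are synthetic assumptions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Stand-in sample for velocity increments delta_u at one separation scale;
        # real ABL data would replace this synthetic draw.
        sample = stats.norminvgauss(a=1.5, b=0.3, loc=0.0, scale=1.0).rvs(5000, random_state=rng)

        # Maximum likelihood fit of the NIG distribution (SciPy's generic MLE).
        a, b, loc, scale = stats.norminvgauss.fit(sample)
        print(f"a={a:.3f}, b={b:.3f}, loc={loc:.3f}, scale={scale:.3f}")

        # Moments implied by the fitted parameters (mean, variance, skew, kurtosis).
        print(stats.norminvgauss.stats(a, b, loc=loc, scale=scale, moments="mvsk"))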

  3. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Extrapolating regional probability of drying of headwater streams using discrete observations and gauging networks

    NASA Astrophysics Data System (ADS)

    Beaufort, Aurélien; Lamouroux, Nicolas; Pella, Hervé; Datry, Thibault; Sauquet, Eric

    2018-05-01

    Headwater streams represent a substantial proportion of river systems and many of them have intermittent flows due to their upstream position in the network. These intermittent rivers and ephemeral streams have recently seen a marked increase in interest, especially to assess the impact of drying on aquatic ecosystems. The objective of this paper is to quantify how discrete (in space and time) field observations of flow intermittence help to extrapolate over time the daily probability of drying (defined at the regional scale). Two empirical models based on linear or logistic regressions have been developed to predict the daily probability of intermittence at the regional scale across France. Explanatory variables were derived from available daily discharge and groundwater-level data of a dense gauging/piezometer network, and models were calibrated using discrete series of field observations of flow intermittence. The robustness of the models was tested using an independent, dense regional dataset of intermittence observations and observations of the year 2017 excluded from the calibration. The resulting models were used to extrapolate the daily regional probability of drying in France: (i) over the period 2011-2017 to identify the regions most affected by flow intermittence; (ii) over the period 1989-2017, using a reduced input dataset, to analyse temporal variability of flow intermittence at the national level. The two empirical regression models performed equally well between 2011 and 2017. The accuracy of predictions depended on the number of continuous gauging/piezometer stations and intermittence observations available to calibrate the regressions. Regions with the highest performance were located in sedimentary plains, where the monitoring network was dense and where the regional probability of drying was the highest. Conversely, the worst performances were obtained in mountainous regions. Finally, temporal projections (1989-2016) suggested the highest
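
    A minimal sketch of the logistic-regression variant, assuming standardized discharge and groundwater-level anomalies as predictors (the variable names, coefficients and synthetic data are invented for illustration, not the paper's calibrated model):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical predictors for one region and day: standardized anomalies of
        # gauged low-flow discharge and groundwater level.
        rng = np.random.default_rng(2)
        X = rng.normal(size=(400, 2))                    # [discharge_anom, gw_level_anom]
        logit = -0.5 - 2.0 * X[:, 0] - 1.2 * X[:, 1]     # drier -> higher odds of drying
        y = rng.random(400) < 1 / (1 + np.exp(-logit))   # observed intermittence (0/1)

        model = LogisticRegression().fit(X, y)
        # Daily regional probability of drying for a new day with low flows and
        # low groundwater levels:
        print(model.predict_proba([[-1.5, -1.0]])[0, 1])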

  5. A probable probability distribution of a series of nonequilibrium states in a simple system out of equilibrium

    NASA Astrophysics Data System (ADS)

    Gao, Haixia; Li, Ting; Xiao, Changming

    2016-05-01

    When a simple system is in a nonequilibrium state, it will shift toward its equilibrium state, and in this process it passes through a series of nonequilibrium states. With the assistance of Bayesian statistics and the hyperensemble, a probable probability distribution of these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The equilibrium state has the largest probability, and the farther a nonequilibrium state is from the equilibrium one, the smaller its probability will be; the same conclusion also holds in the multi-state space. Furthermore, if the probability stands for the relative time the corresponding nonequilibrium state can stay, then the velocity at which a nonequilibrium state returns to equilibrium can be determined through the reciprocal of the derivative of this probability. This tells us that the farther the state is from equilibrium, the faster the return velocity will be; if the system is near its equilibrium state, the velocity tends to become smaller and smaller, finally tending to 0 when the system reaches the equilibrium state.

  6. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  7. Azathioprine-Induced Warfarin Resistance

    PubMed Central

    Vazquez, Sara R; Rondina, Matthew T; Pendleton, Robert C

    2011-01-01

    OBJECTIVE To describe a case of azathioprine-induced warfarin resistance, present a literature review on warfarin–azathioprine interactions, and provide recommendations on appropriate management of this clinically significant interaction. CASE SUMMARY A 29-year-old female with Cogan’s syndrome experienced thrombosis of the left internal carotid artery. She was treated with an average weekly warfarin dose of 39 mg (5.5 mg daily) prior to beginning azathioprine therapy. Three weeks following initiation of azathioprine 150 mg daily, the international normalized ratio (INR) decreased from 1.9 (prior to the medication change) to 1.0 without any change in the warfarin dose or other relevant factors. Over several weeks, the patient’s warfarin dose was titrated up to 112 mg weekly (16 mg daily) to achieve an INR of 2.5 (a 188%, or 2.9-fold dose increase). Because of elevated liver enzyme levels, the azathioprine dosage was decreased to 100 mg daily. Within 2 weeks following that decrease, warfarin requirements decreased to 105 mg weekly (15 mg daily). DISCUSSION Azathioprine was the probable causative agent of warfarin resistance according to the Naranjo probability scale, and a possible causative agent according to the Drug Interaction Probability Scale. A literature search (PubMed, 1966–December 2007) revealed 8 case reports of this drug interaction and 2 cases involving a similar effect with 6-mercaptopurine, the active metabolite of azathioprine. The exact mechanism of the interaction remains unknown. Previously published case reports point to a rapid onset and offset of the warfarin–azathioprine interaction and a dose-dependent increase of at least 2.5-fold in warfarin dose requirement with the initiation of azathioprine 75–200 mg daily. CONCLUSIONS This case report and several others point toward azathioprine as a clinically significant inducer of warfarin resistance. Providers should anticipate the need for higher warfarin doses, warfarin dose adjustment

  8. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  9. THE SEMIGROUP OF METRIC MEASURE SPACES AND ITS INFINITELY DIVISIBLE PROBABILITY MEASURES

    PubMed Central

    EVANS, STEVEN N.; MOLCHANOV, ILYA

    2015-01-01

    A metric measure space is a complete, separable metric space equipped with a probability measure that has full support. Two such spaces are equivalent if they are isometric as metric spaces via an isometry that maps the probability measure on the first space to the probability measure on the second. The resulting set of equivalence classes can be metrized with the Gromov–Prohorov metric of Greven, Pfaffelhuber and Winter. We consider the natural binary operation ⊞ on this space that takes two metric measure spaces and forms their Cartesian product equipped with the sum of the two metrics and the product of the two probability measures. We show that the metric measure spaces equipped with this operation form a cancellative, commutative, Polish semigroup with a translation invariant metric. There is an explicit family of continuous semicharacters that is extremely useful for, inter alia, establishing that there are no infinitely divisible elements and that each element has a unique factorization into prime elements. We investigate the interaction between the semigroup structure and the natural action of the positive real numbers on this space that arises from scaling the metric. For example, we show that for any given positive real numbers a, b, c the trivial space is the only space X that satisfies aX ⊞ bX = cX. We establish that there is no analogue of the law of large numbers: if X1, X2, … is an identically distributed independent sequence of random spaces, then no subsequence of (1/n) ⊞_{k=1}^{n} X_k converges in distribution unless each X_k is almost surely equal to the trivial space. We characterize the infinitely divisible probability measures and the Lévy processes on this semigroup, characterize the stable probability measures and establish a counterpart of the LePage representation for the latter class. PMID:28065980

  10. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  11. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    ERIC Educational Resources Information Center

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  12. The investigation of the relationship between probability of suicide and reasons for living in psychiatric inpatients.

    PubMed

    Eskiyurt, Reyhan; Ozkan, Birgul

    2017-01-01

    This study was carried out to determine the suicide probability and reasons for living of inpatients hospitalized at the psychiatry clinic and to analyze the relationship between them. The sample of the study consisted of 192 patients who were hospitalized in psychiatric clinics between February and May 2016 and who agreed to participate in the study. Data were collected with a personal information form, the suicide probability scale (SPS), the reasons for living inventory (RFL), and Beck's depression inventory (BDI). The stepwise regression method was used to determine the factors that predict suicide probability. As a result of the analyses, the median score on the SPS was 76.0, the median score on the RFL was 137.0, and the median score on the BDI was 13.5; patients with a high probability of suicide had fewer reasons for living, and their depression levels were very high. Stepwise regression analysis determined that suicidal ideation, reasons for living, maltreatment, education level, age, and income status were predictors of suicide probability (F = 61.125; P < 0.001). It was found that patients hospitalized in the psychiatric clinic have a high suicide probability and that reasons for living are strong predictors of suicide probability, in accordance with the literature.

  13. You Say "Probable" and I Say "Likely": Improving Interpersonal Communication With Verbal Probability Phrases

    ERIC Educational Resources Information Center

    Karelitz, Tzur M.; Budescu, David V.

    2004-01-01

    When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…

  14. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    ERIC Educational Resources Information Center

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
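
    For reference, the standard textbook identities that the article re-derives via truth tables are the definition of conditional probability and Bayes' rule:

        P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad
        P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}.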

  15. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
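
    A simulation-based stand-in for such simultaneous intervals (not the authors' exact construction) can be obtained by widening symmetric order-statistic bands until the desired joint coverage is reached:

        import numpy as np

        def normal_plot_envelope(n, alpha=0.05, n_sim=5000, seed=3):
            """Approximate simultaneous 1-alpha envelope for the n order statistics
            of a standard normal sample, usable as bands on a normal probability plot.

            Bands are symmetric order-statistic quantiles, widened step by step until
            a fraction 1-alpha of simulated normal samples lies entirely inside them.
            """
            rng = np.random.default_rng(seed)
            sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
            for q in np.linspace(0.025, 0.0001, 100):  # narrow -> wide bands
                lo = np.quantile(sims, q, axis=0)
                hi = np.quantile(sims, 1.0 - q, axis=0)
                if np.mean(np.all((sims >= lo) & (sims <= hi), axis=1)) >= 1 - alpha:
                    return lo, hi  # narrowest bands achieving simultaneous coverage
            return lo, hi

        lo, hi = normal_plot_envelope(n=30)
        print(lo[:3], hi[:3])  # envelope for the three smallest order statistics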

  16. Lower survival probabilities for adult Florida manatees in years with intense coastal storms

    USGS Publications Warehouse

    Langtimm, C.A.; Beck, C.A.

    2003-01-01

    The endangered Florida manatee (Trichechus manatus latirostris) inhabits the subtropical waters of the southeastern United States, where hurricanes are a regular occurrence. Using mark-resighting statistical models, we analyzed 19 years of photo-identification data and detected significant annual variation in adult survival for a subpopulation in northwest Florida where human impact is low. That variation coincided with years when intense hurricanes (Category 3 or greater on the Saffir-Simpson Hurricane Scale) and a major winter storm occurred in the northern Gulf of Mexico. Mean survival probability during years with no or low intensity storms was 0.972 (approximate 95% confidence interval = 0.961-0.980) but dropped to 0.936 (0.864-0.971) in 1985 with Hurricanes Elena, Kate, and Juan; to 0.909 (0.837-0.951) in 1993 with the March "Storm of the Century"; and to 0.817 (0.735-0.878) in 1995 with Hurricanes Opal, Erin, and Allison. These drops in survival probability were not catastrophic in magnitude and were detected because of the use of state-of-the-art statistical techniques and the quality of the data. Because individuals of this small population range extensively along the north Gulf coast of Florida, it was possible to resolve storm effects on a regional scale rather than the site-specific local scale common to studies of more sedentary species. This is the first empirical evidence in support of storm effects on manatee survival and suggests a cause-effect relationship. The decreases in survival could be due to direct mortality, indirect mortality, and/or emigration from the region as a consequence of storms. Future impacts to the population by a single catastrophic hurricane, or series of smaller hurricanes, could increase the probability of extinction. With the advent in 1995 of a new 25- to 50-yr cycle of greater hurricane activity, and longer term change possible with global climate change, it becomes all the more important to reduce mortality and injury

  17. Economic Choices Reveal Probability Distortion in Macaque Monkeys

    PubMed Central

    Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-01-01

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. PMID:25698750

  18. Transition probabilities in neutron-rich 84,86Se

    NASA Astrophysics Data System (ADS)

    Litzinger, J.; Blazhev, A.; Dewald, A.; Didierjean, F.; Duchêne, G.; Fransen, C.; Lozeva, R.; Sieja, K.; Verney, D.; de Angelis, G.; Bazzacco, D.; Birkenbach, B.; Bottoni, S.; Bracco, A.; Braunroth, T.; Cederwall, B.; Corradi, L.; Crespi, F. C. L.; Désesquelles, P.; Eberth, J.; Ellinger, E.; Farnea, E.; Fioretto, E.; Gernhäuser, R.; Goasduff, A.; Görgen, A.; Gottardo, A.; Grebosz, J.; Hackstein, M.; Hess, H.; Ibrahim, F.; Jolie, J.; Jungclaus, A.; Kolos, K.; Korten, W.; Leoni, S.; Lunardi, S.; Maj, A.; Menegazzo, R.; Mengoni, D.; Michelagnoli, C.; Mijatovic, T.; Million, B.; Möller, O.; Modamio, V.; Montagnoli, G.; Montanari, D.; Morales, A. I.; Napoli, D. R.; Niikura, M.; Pollarolo, G.; Pullia, A.; Quintana, B.; Recchia, F.; Reiter, P.; Rosso, D.; Sahin, E.; Salsac, M. D.; Scarlassara, F.; Söderström, P.-A.; Stefanini, A. M.; Stezowski, O.; Szilner, S.; Theisen, Ch.; Valiente Dobón, J. J.; Vandone, V.; Vogt, A.

    2015-12-01

    Reduced quadrupole transition probabilities for low-lying transitions in neutron-rich 84,86Se are investigated with a recoil distance Doppler shift (RDDS) experiment. The experiment was performed at the Istituto Nazionale di Fisica Nucleare (INFN) Laboratori Nazionali di Legnaro using the Cologne Plunger device for the RDDS technique and the AGATA Demonstrator array for the γ-ray detection, coupled to the PRISMA magnetic spectrometer for an event-by-event particle identification. In 86Se the level lifetime of the yrast 2_1^+ state and an upper limit for the lifetime of the 4_1^+ state are determined for the first time. The results for 86Se are in agreement with previously reported predictions of large-scale shell-model calculations using the Ni78-I and Ni78-II effective interactions. In addition, intrinsic shape parameters of the lowest yrast states in 86Se are calculated. In semimagic 84Se the level lifetimes of the yrast 4_1^+ and 6_1^+ states are determined for the first time. Large-scale shell-model calculations using the effective interactions Ni78-II, JUN45, jj4b, and jj4pna are performed. The calculations describe B(E2; 2_1^+ → 0_1^+) and B(E2; 6_1^+ → 4_1^+) fairly well and point out problems in reproducing the experimental B(E2; 4_1^+ → 2_1^+).

  19. Growing optimal scale-free networks via likelihood

    NASA Astrophysics Data System (ADS)

    Small, Michael; Li, Yingying; Stemler, Thomas; Judd, Kevin

    2015-04-01

    Preferential attachment, by which new nodes attach to existing nodes with probability proportional to the existing nodes' degree, has become the standard growth model for scale-free networks, where the asymptotic probability of a node having degree k is proportional to k^(-γ). However, the motivation for this model is entirely ad hoc. We use exact likelihood arguments and show that the optimal way to build a scale-free network is to attach most new links to nodes of low degree. Curiously, this leads to a scale-free network with a single dominant hub: a starlike structure we call a superstar network. Asymptotically, the optimal strategy is to attach each new node to one of the nodes of degree k with probability proportional to (k+1)^γ / (N + ζ(γ)) (in an N-node network): a stronger bias toward high-degree nodes than exhibited by standard preferential attachment. Our algorithm generates optimally scale-free networks (the superstar networks) as well as randomly sampling the space of all scale-free networks with a given degree exponent γ. We generate viable realizations with finite N for 1 < γ < 2 as well as γ > 2. We observe an apparently discontinuous transition at γ ≈ 2 between so-called superstar networks and more treelike realizations. Gradually increasing γ further leads to reemergence of a superstar hub. To quantify these structural features, we derive a new analytic expression for the expected degree exponent of a pure preferential attachment process and introduce alternative measures of network entropy. Our approach is generic and can also be applied to an arbitrary degree distribution.
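
    To see the superstar effect qualitatively, the toy growth loop below attaches each new node with probability proportional to (k+1)^γ, the stronger-than-linear bias described above; it is an illustrative simplification, not the authors' exact likelihood-optimal sampler.

        import numpy as np

        def grow_biased_network(n_nodes, gamma=2.5, seed=4):
            """Toy growth model: each new node attaches to an existing node chosen
            with probability proportional to (k + 1)**gamma, a stronger bias toward
            high-degree nodes than linear preferential attachment."""
            rng = np.random.default_rng(seed)
            degree = np.zeros(n_nodes, dtype=int)
            degree[0] = degree[1] = 1          # seed edge between nodes 0 and 1
            edges = [(0, 1)]
            for new in range(2, n_nodes):
                w = (degree[:new] + 1.0) ** gamma
                target = rng.choice(new, p=w / w.sum())
                edges.append((new, target))
                degree[new] += 1
                degree[target] += 1
            return degree, edges

        deg, _ = grow_biased_network(2000)
        # A single dominant hub emerges, while most nodes keep low degree.
        print("max degree (hub):", deg.max(), "median degree:", int(np.median(deg)))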

  20. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  1. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users’ privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructing from sequential rules for spatial-temporal k-anonymity dataset. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former achieves the probabilities of arriving at target locations along simple paths those include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and correctness and flexibility of our proposed algorithm have been verified. PMID:27508502
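    The core matrix step is compact enough to show directly; the sketch below, with made-up counts for three anonymized cells, row-normalizes a single-step transition matrix and raises it to the power n, as the paper's stationary-process assumption licenses.

    ```python
    # Row-normalize single-step transition counts, then compute n-step
    # transition probabilities by matrix power (illustrative numbers).
    import numpy as np

    counts = np.array([[0., 4., 1.],
                       [2., 0., 3.],
                       [1., 1., 0.]])
    P = counts / counts.sum(axis=1, keepdims=True)   # normalized transition matrix
    n = 3
    Pn = np.linalg.matrix_power(P, n)                # n-step transition probabilities
    print(f"P(cell 0 -> cell 2 in {n} steps) = {Pn[0, 2]:.3f}")
    ```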

  2. Pre-Service Teachers' Conceptions of Probability

    ERIC Educational Resources Information Center

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  3. Delayed seizure associated with paclitaxel-Cremophor EL in a patient with early-stage breast cancer.

    PubMed

    O'Connor, Tracey L; Kossoff, Ellen

    2009-08-01

    Paclitaxel, a microtubule stabilizer, is an effective agent for treating cancer of the breast, ovary, head and neck, and lung. Because paclitaxel is insoluble in water, it is formulated with the micelle-forming Cremophor EL. Neurologic toxicity is well described with both the drug and this carrier, with most toxicities manifesting as peripheral neuropathy, motor neuropathy, autonomic neuropathy, and myopathy. Toxic effects on the central nervous system, such as seizures or encephalopathy, have been rarely reported; however, the seizures reported were closely related to the time of infusion. We describe a 41-year-old woman with no history of seizures who was treated with paclitaxel for breast cancer. Four days after the drug was infused, she developed a generalized tonic-clonic seizure that could not be attributed to other causes. The patient was treated with phenytoin and was able to complete her adjuvant chemotherapy with nab-paclitaxel without further events. Her condition was neurologically stable without phenytoin for the next 6 months. Use of the Naranjo adverse drug reaction probability scale indicated a possible association (score of 3) between the delayed seizure and paclitaxel or its solvent, Cremophor EL. Clinicians should be aware of the potential for seizure activity in patients who receive paclitaxel formulated with Cremophor EL.

  4. Atypical Fracture of the Sternum After Long-Term Alendronate Plus Cholecalciferol Treatment: A Case Report.

    PubMed

    Martín Arias, Luis H; García Ortega, Pilar; Sáinz Gil, María; Navarro García, Ester; Treceño Lobato, Carlos; Delgado Armas, Virginia

    2017-12-01

    A 55-year-old woman developed an atraumatic sternum fracture during treatment with alendronate for osteoporosis. The woman received alendronate 70 mg in combination with cholecalciferol 5600 IU once weekly, as well as nonsteroidal anti-inflammatory drugs. After 4 years of treatment, following a dorsal flexion with no direct thoracic trauma, the patient suffered a fracture of the sternum, with an X-ray revealing a sternal body fracture. This fracture was transverse, noncomminuted and without displacement. Magnetic resonance imaging, carried out to rule out the presence of either a pathological fracture or a fracture resulting from osteoporotic fragility, showed a triple sternal fracture involving the body as well as the upper and lower manubrium of the sternum. This fracture presented the features of an atypical femur fracture, except for the location. The alendronate and cholecalciferol combination was discontinued and denosumab was prescribed. After the withdrawal of alendronate, the patient showed clinical improvement, with a decrease in pain, and is currently having routine checkups. The causality algorithm of the Spanish Pharmacovigilance System gives a score of 5, indicating a possible relationship between the patient's sternum fracture and her use of the suspect drug, while the Naranjo scale gives a score of 6 (probable).

  5. Sunitinib-related fulminant hepatic failure: case report and review of the literature.

    PubMed

    Mueller, Eric W; Rockey, Michelle L; Rashkin, Mitchell C

    2008-08-01

    Drug-induced hepatotoxicity is an infrequent but life-threatening complication. Sunitinib is a multitargeted receptor tyrosine kinase inhibitor approved for treatment of renal cell carcinoma and gastrointestinal stromal tumor. However, results from preapproval clinical trials suggest an equivocal hepatic risk profile for sunitinib. We describe a 75-year-old woman with renal cell carcinoma who was admitted to the intensive care unit after experiencing fulminant hepatic failure during sunitinib therapy. The patient's hepatic and renal chemistries had been within normal limits throughout four previous cycles of sunitinib therapy spanning 9 months. After the fifth cycle, she complained of a 3-day history of severe diarrhea and dehydration. Her abnormal laboratory test results included the following: total bilirubin level 5.9 mg/dl, aspartate aminotransferase level 3872 U/L, alanine aminotransferase level 3332 U/L, ammonia level 897 microg/dl, and an international normalized ratio of 4.8. Use of the Naranjo adverse drug reaction probability scale indicated a possible relationship between sunitinib and hepatotoxicity. Supportive care including aggressive intravenous hydration and reversal of coagulopathy was successful. The patient was discharged home on hospital day 7 without apparent longstanding sequelae. Clinicians should be aware of this possible adverse effect of sunitinib, and continued pharmacovigilance is imperative to accurately quantify the possible risk of sunitinib-related hepatotoxicity.

  6. The importance of monitoring adverse drug reactions in pediatric patients: the results of a national surveillance program in Italy.

    PubMed

    Carnovale, Carla; Brusadelli, Tatiana; Zuccotti, GianVincenzo; Beretta, Silvia; Sullo, Maria Giuseppa; Capuano, Annalisa; Rossi, Francesco; Moschini, Martina; Mugelli, Alessandro; Vannacci, Alfredo; Laterza, Marcella; Clementi, Emilio; Radice, Sonia

    2014-09-01

    To gain information on the safety of drugs used in pediatrics through a 4-year post-marketing active pharmacovigilance program. The program sampled the Italian population and was termed 'Monitoring of the Adverse Effects in Pediatric population' (MEAP). Adverse drug reactions (ADRs) were collected for individuals aged 0-17 years treated in hospitals and territorial health services in Lombardy, Tuscany, Apulia and Campania, regions located so as to provide an appropriate sampling of the population. ADRs were evaluated using the Adverse Drug Reaction Probability Scale (Naranjo) and analyzed with respect to time, age, sex, category of ADR, seriousness, suspected medicines, type of reporter and off-label use. We collected and analyzed 3539 ADR reports. Vaccines, antineoplastic and psychotropic drugs were the pharmacotherapeutic subgroups most frequently involved. Seventeen percent of reported ADRs were serious; among these, fever, vomiting and angioedema were the most frequently reported. Eight percent of ADRs were associated with off-label use, and 10% were unknown ADRs. Analysis of these revealed possible strategies for therapy optimization. The MEAP project demonstrated that active post-marketing pharmacovigilance programs are a valid strategy to increase awareness of pediatric pharmacology, reduce underreporting and provide information on drug actions in pediatrics. This information enhances drug therapy optimization in pediatric patients.

  7. Economic choices reveal probability distortion in macaque monkeys.

    PubMed

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
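    A common way to write down such an inverted-S weighting function is the one-parameter Prelec form; the sketch below is an illustrative choice of mine, not necessarily the parametric model fitted in the paper.

    ```python
    # Prelec weighting w(p) = exp(-(-ln p)^alpha); alpha < 1 gives the
    # inverted-S shape (alpha = 0.65 is an arbitrary illustrative value).
    import numpy as np

    def prelec(p, alpha=0.65):
        return np.exp(-(-np.log(p)) ** alpha)

    for p in (0.05, 0.50, 0.95):
        # ~0.13, ~0.45, ~0.87: small p overweighted, large p underweighted
        print(f"w({p:.2f}) = {prelec(p):.2f}")
    ```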

  8. Estimating the Exceedance Probability of the Reservoir Inflow Based on the Long-Term Weather Outlooks

    NASA Astrophysics Data System (ADS)

    Huang, Q. Z.; Hsu, S. Y.; Li, M. H.

    2016-12-01

    Long-term streamflow prediction is important not only for estimating the water storage of a reservoir but also for surface water intakes, which supply water for households, agriculture, and industry. Climatological streamflow forecasts have traditionally been used to calculate the exceedance probability curve of streamflow for water resource management. In this study, we propose a stochastic approach to predicting the exceedance probability curve of long-term streamflow from the seasonal weather outlook of the Central Weather Bureau (CWB), Taiwan. The approach incorporates a statistical downscaling weather generator and a catchment-scale hydrological model to convert the monthly outlook into daily rainfall and temperature series and to simulate the streamflow conditioned on the outlook. Moreover, we apply Bayes' theorem to derive a method for calculating the exceedance probability curve of the reservoir inflow that accounts for the seasonal weather outlook and its imperfection. The results show that our approach yields exceedance probability curves reflecting the three-month weather outlook and its accuracy. We also show how improvements in the weather outlook affect the predicted exceedance probability curves of the streamflow. Our approach should be useful for seasonal planning and management of water resources and the associated risk assessment.
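    The final product of such an ensemble approach is an exceedance probability curve; a minimal sketch with synthetic inflows (the lognormal ensemble below stands in for the weather-generator-plus-hydrological-model chain) looks like this:

    ```python
    # Empirical exceedance probabilities from an ensemble of simulated inflows.
    import numpy as np

    rng = np.random.default_rng(1)
    inflows = rng.lognormal(mean=3.0, sigma=0.5, size=1000)   # synthetic seasonal inflows

    sorted_q = np.sort(inflows)[::-1]                                 # largest first
    p_exceed = np.arange(1, sorted_q.size + 1) / (sorted_q.size + 1)  # Weibull plotting positions
    # (sorted_q, p_exceed) trace the exceedance probability curve; a point estimate:
    print(f"P(inflow > 25) ~= {(inflows > 25.0).mean():.3f}")
    ```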

  9. Striatal activity is modulated by target probability.

    PubMed

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower-probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  10. Time-dependent earthquake probabilities

    USGS Publications Warehouse

    Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.

    2005-01-01

    We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have developed a framework based on a simple, generalized rate change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault. In the first application, the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (probability density function, or PDF) that describes the population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
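    The basic conditional-probability calculation behind such estimates is worth making explicit; the sketch below assumes a lognormal recurrence-time PDF purely for illustration (the abstract is agnostic about the specific distribution).

    ```python
    # P(failure in (t, t+dt] | no failure up to t) = [F(t+dt) - F(t)] / [1 - F(t)]
    # for a recurrence-time distribution with CDF F (lognormal assumed here).
    from scipy.stats import lognorm

    recur = lognorm(s=0.5, scale=150.0)     # median recurrence 150 yr, illustrative
    t, dt = 120.0, 30.0                     # years since the last event, forecast window
    p_cond = (recur.cdf(t + dt) - recur.cdf(t)) / recur.sf(t)
    print(f"P(failure in next {dt:.0f} yr | quiet for {t:.0f} yr) = {p_cond:.2f}")
    ```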

  11. Prevalence of probable Attention-Deficit/Hyperactivity Disorder symptoms: result from a Spanish sample of children.

    PubMed

    Cerrillo-Urbina, Alberto José; García-Hermoso, Antonio; Martínez-Vizcaíno, Vicente; Pardo-Guijarro, María Jesús; Ruiz-Hermosa, Abel; Sánchez-López, Mairena

    2018-03-15

    The aims of our study were to: (i) determine the prevalence of children aged 4 to 6 years with probable Attention-Deficit/Hyperactivity Disorder (ADHD) symptoms in the Spanish population; and (ii) analyse the association of probable ADHD symptoms with sex, age, type of school, origin (native or foreign) and socio-economic status in these children. This cross-sectional study included 1189 children (4 to 6 years old) from 21 primary schools in 19 towns of the Ciudad Real and Cuenca provinces, Castilla-La Mancha region, Spain. The ADHD Rating Scale-IV for parents and teachers was administered to determine the probability of ADHD. The 90th percentile cut-off was used to establish the prevalence of the inattention, hyperactivity/impulsivity and combined subtypes. The prevalence of children with probable ADHD symptoms was 5.4% (2.6% inattention subtype symptoms, 1.5% hyperactivity/impulsivity subtype symptoms, and 1.3% combined subtype symptoms). Children aged 4 to 5 years showed a higher prevalence of probable ADHD symptoms of the inattention subtype, and of all subtypes in total, than children aged 6 years, and children with low socio-economic status showed a higher prevalence of probable ADHD symptoms (each subtype and all of them in total) than those with medium and high socio-economic status. Early diagnosis and an understanding of the predictors of probable ADHD are needed to direct appropriate identification and intervention efforts. These screening efforts should be especially addressed to vulnerable groups, particularly low socio-economic status families and younger children.

  12. Probability and Statistics: A Prelude.

    ERIC Educational Resources Information Center

    Goodman, A. F.; Blischke, W. R.

    Probability and statistics have become indispensable to scientific, technical, and management progress. They serve as essential dialects of mathematics, the classical language of science, and as instruments necessary for intelligent generation and analysis of information. A prelude to probability and statistics is presented by examination of the…

  13. 'Twisted plywood' structure and mineralization in the scales of a primitive living fish Amia calva.

    PubMed

    Meunier, F J

    1981-01-01

    The basal plate of the scales of Amia calva is composed of a regular double twisted plywood, as in Latimeria and Dipnoan scales. However, the progressive rotation of the fibril direction is left-handed in Amia and right-handed in the 'Sarcopterygians', so the similarity between these peculiar plywoods is probably the result of convergence. The basal plate of Amia scales is incompletely mineralized. There are numerous calcified ovoid corpuscles which look very much like the Mandl's corpuscles of Teleost scales. The mineralization probably progresses essentially by the fusion of these corpuscles, as in Teleost scales, and would be inotropic rather than spheritic.

  14. Probability interpretations of intraclass reliabilities.

    PubMed

    Ellis, Jules L

    2013-11-20

    Research where many organizations are rated by different samples of individuals such as clients, patients, or employees frequently uses reliabilities computed from intraclass correlations. Consumers of statistical information, such as patients and policy makers, may not have sufficient background for deciding which levels of reliability are acceptable. It is shown that the reliability is related to various probabilities that may be easier to understand, for example, the proportion of organizations that will be classed significantly above (or below) the mean and the probability that an organization is classed correctly given that it is classed significantly above (or below) the mean. One can view these probabilities as the amount of information of the classification and the correctness of the classification. These probabilities have an inverse relationship: given a reliability, one can 'buy' correctness at the cost of informativeness and conversely. This article discusses how this can be used to make judgments about the required level of reliabilities. Copyright © 2013 John Wiley & Sons, Ltd.

  15. Memory disorders in probable Alzheimer's disease: the role of hippocampal atrophy as shown with MRI.

    PubMed Central

    Deweer, B; Lehéricy, S; Pillon, B; Baulac, M; Chiras, J; Marsault, C; Agid, Y; Dubois, B

    1995-01-01

    Magnetic resonance-based volumetric measures of the hippocampal formation, amygdala (A), and caudate nucleus (CN), normalised for total intracranial volume (TIV), were analysed in relation to measures of cognitive deterioration and specific features of memory function in 18 patients with probable Alzheimer's disease. Neuropsychological examination included the mini mental state examination (MMSE), the Mattis dementia rating scale (DRS), tests of executive functions, assessment of language abilities and praxis, the Wechsler memory scale (WMS), the California verbal learning test (CVLT) and the Grober and Buschke test. The volume of the hippocampal formation (HF/TIV) was correlated with specific memory variables: memory quotient and paired associates of the WMS; intrusions and discriminability at recognition for the Grober and Buschke test. By contrast, except for intrusions, no correlations were found between memory variables and the volume of the amygdala (A/TIV). No correlations were found between the volume of the caudate nuclei (CN/TIV) and any neuropsychological score. The volume of the hippocampal formation was therefore selectively related to quantitative and qualitative aspects of memory performance in patients with probable Alzheimer's disease. PMID:7745409

  16. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high-risk and potentially high-risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process in which the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a
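    A stripped-down version of the Monte Carlo step described above might look as follows; all numbers are invented for illustration, and an operational tool would propagate full state vectors rather than sampling the close-approach geometry directly.

    ```python
    # Sample the relative position at closest approach from the combined position
    # covariance and count how often the miss distance falls below a threshold.
    import numpy as np

    rng = np.random.default_rng(42)
    mean_miss = np.array([120.0, -40.0, 15.0])   # nominal relative position, meters (illustrative)
    cov = np.diag([80.0, 60.0, 20.0]) ** 2       # combined position covariance, m^2 (illustrative)
    threshold = 50.0                             # combined hard-body radius, meters

    samples = rng.multivariate_normal(mean_miss, cov, size=100_000)
    p_collision = (np.linalg.norm(samples, axis=1) < threshold).mean()
    print(f"Monte Carlo collision probability: {p_collision:.2e}")
    ```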

  17. UT Biomedical Informatics Lab (BMIL) probability wheel

    NASA Astrophysics Data System (ADS)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
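    The underlying idea is simple to reproduce; a toy two-slice wheel (my own illustration, not the BMIL app) can be rendered in a few lines:

    ```python
    # Draw a two-slice probability wheel whose slice sizes encode p and 1 - p.
    import matplotlib.pyplot as plt

    def draw_wheel(p: float) -> None:
        fig, ax = plt.subplots()
        ax.pie([p, 1.0 - p], labels=["outcome", "complement"],
               colors=["tab:blue", "tab:orange"], startangle=90)
        ax.set_title(f"p = {p:.2f}")
        plt.show()

    draw_wheel(0.30)   # a user would adjust p until the wheel matches their belief
    ```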

  18. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  19. Domestic wells have high probability of pumping septic tank leachate

    NASA Astrophysics Data System (ADS)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  20. The investigation of the relationship between probability of suicide and reasons for living in psychiatric inpatients

    PubMed Central

    Eskiyurt, Reyhan; Ozkan, Birgul

    2017-01-01

    Aim: This study was carried out to determine the suicide probability and reasons for living of inpatients hospitalized in a psychiatry clinic and to analyze the relationship between them. Materials and Methods: The sample of the study consisted of 192 patients who were hospitalized in psychiatric clinics between February and May 2016 and who agreed to participate in the study. Data were collected with a personal information form, the suicide probability scale (SPS), the reasons for living inventory (RFL), and Beck's depression inventory (BDI). The stepwise regression method was used to determine the factors that predict suicide probability. Results: The median score on the SPS was 76.0, the median score on the RFL was 137.0, and the median score on the BDI was 13.5; patients with a high probability of suicide had fewer reasons for living, and their depression levels were very high. Stepwise regression analysis determined that suicidal ideation, reasons for living, maltreatment, education level, age, and income status were predictors of suicide probability (F = 61.125; P < 0.001). Discussion: In accordance with the literature, patients hospitalized in the psychiatric clinic had a high suicide probability, and reasons for living were strong predictors of suicide probability. PMID:29497185

  1. [Biometric bases: basic concepts of probability calculation].

    PubMed

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  2. Fatal acute pulmonary injury associated with everolimus.

    PubMed

    Depuydt, Pieter; Nollet, Joke; Benoit, Dominique; Praet, Marleen; Caes, Frank

    2012-03-01

    To report a case of fatal alveolar hemorrhage associated with the use of everolimus in a patient who underwent a solid organ transplant. In a 71-year-old cardiac transplant patient, cyclosporine was replaced with everolimus because of worsening renal function. Over the following weeks, the patient developed nonproductive cough and increasing dyspnea. His condition deteriorated to acute respiratory failure with hemoptysis, requiring hospital admission. Bilateral patchy alveolar infiltrates were apparent on chest X-ray and computed tomography. Cardiac failure was ruled out and empiric antimicrobial therapy was initiated. Additional extensive workup could not document opportunistic infection. Everolimus was discontinued and high-dose corticosteroid therapy was initiated. Despite this, the patient required invasive mechanical ventilation and died because of refractory massive hemoptysis. Autopsy revealed diffuse alveolar hemorrhage. Everolimus is a mammalian target of rapamycin inhibitor approved for use as an immunosuppressant and antineoplastic agent. Its main advantage over calcineurin inhibitors (tacrolimus and cyclosporine) is a distinct safety profile. Although it has become clear that everolimus induces pulmonary toxicity more frequently than initially thought, most published cases thus far represented mild and reversible disease, and none was fatal. Here, we report a case of pulmonary toxicity developing over weeks following the introduction of everolimus, in which a fatal outcome could not be prevented by drug withdrawal and corticosteroid treatment. The association of everolimus and this syndrome was probable according to the Naranjo probability scale. This case indicates that with the increasing use of everolimus, clinicians should be aware of the rare, but life-threatening manifestation of pulmonary toxicity.

  3. Anaphylactic reaction to a dietary supplement containing willow bark.

    PubMed

    Boullata, Joseph I; McDonnell, Patrick J; Oliva, Cynthia D

    2003-06-01

    To report a case of anaphylaxis resulting from the use of a willow bark-containing dietary supplement in a patient with a history of an aspirin allergy. A 25-year-old white woman presented to the emergency department of a community teaching hospital with anaphylaxis requiring epinephrine, diphenhydramine, methylprednisolone, and volume resuscitation to which she responded favorably. Medication history revealed that she had ingested 2 capsules of Stacker 2 (NVE Pharmaceuticals, Newton, NJ), a dietary supplement promoted for weight loss, prior to experiencing her initial symptoms. Among other active ingredients, this product contains willow bark. Of significance is that this patient also reported a history of allergy to acetylsalicylic acid. No other causes for anaphylaxis were identified. She continued to receive routine supportive care and the remaining hospital course was uncomplicated. Dietary supplements, including herbal products, are used by many individuals who consider them to be inherently safe despite limited regulatory oversight by the Food and Drug Administration. While there may be value to specific botanical ingredients, a potential for adverse effects also exists. The popular product consumed by our patient is used for weight loss and contains willow bark, a source of salicylates. Based on the Naranjo probability scale, it is probable that this case of anaphylaxis was due to this dietary supplement. The use of any willow bark-containing dietary supplement may present a risk of anaphylactic reaction to patients with a history of allergy to salicylates. Clinicians need to recognize the potential for adverse effects from dietary supplements.

  4. Calibrating random forests for probability estimation.

    PubMed

    Dankowski, Theresa; Ziegler, Andreas

    2016-09-30

    Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
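    In the same spirit, though not the authors' terminal-node construction, a simple logistic re-calibration of random-forest probabilities on data from a "new center" can be sketched as follows (synthetic data; all names are illustrative):

    ```python
    # Fit a random forest at one center, then re-calibrate its predicted
    # probabilities for a new center with a logistic model on the logits.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, random_state=0)
    X_old, X_new, y_old, y_new = train_test_split(X, y, test_size=0.5, random_state=0)

    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_old, y_old)
    p_raw = rf.predict_proba(X_new)[:, 1].clip(1e-6, 1 - 1e-6)

    logit = np.log(p_raw / (1.0 - p_raw)).reshape(-1, 1)
    recal = LogisticRegression().fit(logit, y_new)       # re-calibration map
    p_cal = recal.predict_proba(logit)[:, 1]             # updated probabilities
    print(p_cal[:5].round(3))
    ```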

  5. The probability density function (PDF) of Lagrangian Turbulence

    NASA Astrophysics Data System (ADS)

    Birnir, B.

    2012-12-01

    The statistical theory of Lagrangian turbulence is derived from the stochastic Navier-Stokes equation. Assuming that the noise in fully developed turbulence is a generic noise determined by the general theorems in probability, the central limit theorem and the large deviation principle, we are able to formulate and solve the Kolmogorov-Hopf equation for the invariant measure of the stochastic Navier-Stokes equations. The intermittency corrections to the scaling exponents of the structure functions require a multiplicative noise (multiplying the fluid velocity) in the stochastic Navier-Stokes equation. We let this multiplicative noise consist of a simple (Poisson) jump process and then show how the Feynman-Kac formula produces the log-Poissonian processes found by She and Leveque, Waymire and Dubrulle. These log-Poissonian processes give the intermittency corrections that agree with modern direct Navier-Stokes simulations (DNS) and experiments. The probability density function (PDF) plays a key role when direct Navier-Stokes simulations or experimental results are compared to theory. The statistical theory of turbulence, including the scaling of the structure functions of turbulence, is determined by the invariant measure of the Navier-Stokes equation, and the PDFs for the various statistics (one-point, two-point, N-point) can be obtained by taking the trace of the corresponding invariant measures. Hopf derived in 1952 a functional equation for the characteristic function (Fourier transform) of the invariant measure. In distinction to the nonlinear Navier-Stokes equation, this is a linear functional differential equation. The PDFs obtained from the invariant measures for the velocity differences (two-point statistics) are shown to be the four-parameter generalized hyperbolic distributions found by Barndorff-Nielsen. These PDFs have heavy tails and a convex peak at the origin. A suitable projection of the Kolmogorov-Hopf equations is the

  6. Identifying Chinese Microblog Users With High Suicide Probability Using Internet-Based Profile and Linguistic Features: Classification Model.

    PubMed

    Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul Sf; Zhu, Tingshao

    2015-01-01

    Traditional offline assessment of suicide probability is time consuming and has difficulty convincing at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has an advantage in its efficiency and its potential to reach hidden individuals, yet little research has focused on this specific field. The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying microblog users in China with high suicide probability through profile and linguistic features extracted from Internet-based data. Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores in the participant sample, were labeled as high-risk individuals. Profile and linguistic features were fed into the two machine learning algorithms (SLR and RF) to train models that identify high-risk individuals in overall suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross validation, in which both the training set and the test set were generated under a stratified random sampling rule from the whole sample. Three classic performance metrics (Precision, Recall, F1 measure) and a specifically defined metric, "Screening Efficiency", were adopted to evaluate model effectiveness. Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening Efficiency of most models varied from 1/4 to 1/2. Precision of the models was generally below 30%. Individuals in China with high suicide
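    The evaluation protocol described above is straightforward to sketch; the code below uses synthetic stand-in features and scores (all names and numbers are my assumptions, not the study's data).

    ```python
    # Label users scoring >= 1 SD above the mean as high-risk, then evaluate a
    # classifier with stratified 5-fold cross-validation.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold, cross_validate

    rng = np.random.default_rng(0)
    X = rng.standard_normal((909, 20))                      # stand-in profile/linguistic features
    sps = X[:, :3].sum(axis=1) + rng.standard_normal(909)   # stand-in SPS total score
    y = (sps >= sps.mean() + sps.std()).astype(int)         # high-risk: >= 1 SD above the mean

    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    scores = cross_validate(RandomForestClassifier(random_state=0), X, y,
                            cv=cv, scoring=("precision", "recall", "f1"))
    print({k: v.mean().round(2) for k, v in scores.items() if k.startswith("test_")})
    ```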

  7. Multi-Scale/Multi-Functional Probabilistic Composite Fatigue

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A multi-level (multi-scale/multi-functional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.

  8. Training Teachers to Teach Probability

    ERIC Educational Resources Information Center

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  9. Beliefs about women's vibrator use: results from a nationally representative probability survey in the United States.

    PubMed

    Herbenick, Debra; Reece, Michael; Schick, Vanessa; Jozkowski, Kristen N; Middelstadt, Susan E; Sanders, Stephanie A; Dodge, Brian S; Ghassemi, Annahita; Fortenberry, J Dennis

    2011-01-01

    Women's vibrator use is common in the United States, although little is known about beliefs about its use. Elicitation surveys and interviews informed the development of a 10-item scale, the Beliefs About Women's Vibrator Use Scale, which was administered to a nationally representative probability sample of adults ages 18 to 60 years. Most women and men held high positive and low negative beliefs about women's vibrator use. Women with positive beliefs reported higher Female Sexual Function Index scores related to arousal, lubrication, orgasm, satisfaction, and pain (indicating less pain).

  10. The high order dispersion analysis based on first-passage-time probability in financial markets

    NASA Astrophysics Data System (ADS)

    Liu, Chenggong; Shang, Pengjian; Feng, Guochen

    2017-04-01

    The study of first-passage-time (FPT) events in financial time series has attracted broad research interest recently, as it can provide reference points for risk management and investment. In this paper, a new measure, high order dispersion (HOD), is developed based on FPT probability to explore financial time series. The tick-by-tick data of three Chinese stock markets and three American stock markets are investigated. We classify the financial markets successfully by analyzing the scaling properties of the FPT probabilities of the six stock markets and employing the HOD method to compare the differences among FPT decay curves. Applying the HOD method, we conclude that long-range correlation, a fat-tailed broad probability density function, and its coupling with nonlinearity mainly lead to the multifractality of financial time series. Furthermore, we use the fluctuation function of multifractal detrended fluctuation analysis (MF-DFA) to distinguish markets and obtain results consistent with the HOD method, whereas the HOD method is capable of differentiating the stock markets effectively within the same region. We believe that such explorations are relevant for a better understanding of financial market mechanisms.

  11. Probability based models for estimation of wildfire risk

    Treesearch

    Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit

    2004-01-01

    We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...

  12. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  13. Long-term effect of mass chemotherapy, transmission and risk factors for Schistosoma mansoni infection in very low endemic communities of Venezuela.

    PubMed

    Hofstede, Stefanie N; Tami, Adriana; van Liere, Genevieve A F S; Ballén, Diana; Incani, Renzo N

    2014-12-01

    The prevalence of Schistosoma mansoni infection in Venezuela has changed from high to low due mostly to successful control activities, including mass chemotherapy and molluscicide applications. This study examined the impact of mass chemotherapy on S. mansoni transmission and risk factors for infection 12 years after administration of praziquantel in Venezuela. Two relatively isolated rural communities were studied, one with snail control (Manuare) and the second without (Los Naranjos). A cross-sectional survey of randomly selected households included 226 (Manuare) and 192 (Los Naranjos) consenting participants. S. mansoni prevalence was determined using a combination of coprological (Kato-Katz) and serological (circumoval precipitin test, alkaline phosphatase immunoassay and Western blot) tests. Data on epidemiological and socioeconomic risk factors were obtained through individual structured interviews. Univariate analysis and multivariate logistic regression models identified independent risk factors for infection. Water sites were examined for the presence of Biomphalaria glabrata snails. Only one participant was positive by coprology. The overall prevalences according to the combined tests were 32.7% in Manuare and 26.6% in Los Naranjos. Lower prevalences (12.7% in Manuare and 13.2% in Los Naranjos) were found in children <12 years of age representing those born after mass chemotherapy. Social demographic variables associated with infection in both communities were older age (>25 years), contact with specific water sites, and being a farmer/non-specialised worker. Mass treatment with praziquantel applied once to endemic communities led to an important and long-lasting sustained reduction of S. mansoni infections independent of the application of snail control. A degree of low active transmission of S. mansoni persisted in the treated areas which was associated with similar factors in both communities. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. The Problem with Probability: Why rare hazards feel even rarer

    NASA Astrophysics Data System (ADS)

    Thompson, K. J.

    2013-12-01

    Even as scientists improve the accuracy of their forecasts for large-scale events like natural hazards and climate change, a gap remains between the confidence the scientific community has in those estimates and the skepticism with which the lay public tends to view statements of uncertainty. Beyond the challenges of helping the public to understand probabilistic forecasts lies yet another barrier to effective communication: the fact that even when humans can estimate or state the correct probability of a rare event, we tend to distort that probability in our minds, acting as if the likelihood is higher or lower than we know it to be. A half century of empirical research in psychology and economics leaves us with a clear view of the ways that people interpret stated, or described, probabilities, e.g., "There is a 6% chance of a Northridge-sized earthquake occurring in your area in the next 10 years." In the past decade, the focus of cognitive scientists has turned to the other method humans use to learn probabilities: intuitively estimating the chances of a rare event by assessing our personal experience with various outcomes. While it is well understood that described probabilities are over-weighted when they are small (e.g., a 5% chance might be treated more like a 10% or 12% chance), it appears that in many cases, experienced rare probabilities are in fact under-weighted. This distortion is not an under-estimation, and therefore cannot be prevented by reminding people of the described probability. This paper discusses the mechanisms and effects of this difference in the way probability is used when a number is provided, as opposed to when the frequency of a rare event is intuited. In addition to recommendations based on the current state of research on the way people appear to make decisions from experience, suggestions are made for how to present probabilistic information to best take advantage of people's tendencies to either amplify risk or ignore it, as well

  15. Regional Permafrost Probability Modelling in the northwestern Cordillera, 59°N - 61°N, Canada

    NASA Astrophysics Data System (ADS)

    Bonnaventure, P. P.; Lewkowicz, A. G.

    2010-12-01

    and Radburn, 1992; Heginbottom et al., 1995) but is several orders of magnitude more detailed. It also exhibits some significant differences, including the presence of an area of valley-floor continuous permafrost around Beaver Creek near the Alaskan border in the west, as well as higher probabilities of permafrost in the central parts of the region near the boundaries of the sporadic and extensive discontinuous zones. In addition, parts of the northernmost portion of the region would be classified as sporadic discontinuous permafrost because of inversions in the terrestrial surface lapse rate which cause permafrost probabilities to decrease with elevation through the forest. These model predictions are expected to be of direct use for infrastructure planning and northern development and can serve as a benchmark for future studies of permafrost distribution in the Yukon. References Heginbottom JR, Dubreuil MA and Haker PT. 1995. Canada Permafrost. (1:7,500,000 scale). In The National Atlas of Canada, 5th Edition, sheet MCR 4177. Ottawa: Natural Resources Canada. Heginbottom, J.A. and Radburn, L.K. 1992. Permafrost and ground ice conditions of northwestern Canada; Geological Survey of Canada, Map 1691A, scale 1:1,000,000. Digitized by S. Smith, Geological Survey of Canada.

  16. Tsunami probability in the Caribbean Region

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2008-01-01

    We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerically modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30% regionally. © Birkhäuser 2008.
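    Under the Poissonian model quoted above, a per-cell mean runup rate converts directly into an exceedance probability over an exposure window; a one-line illustration with a made-up rate:

    ```python
    # P(at least one runup > 0.5 m in 30 years) under a Poisson model.
    import numpy as np

    rate = 0.005                          # runups per year in one coastal cell (illustrative)
    p_30yr = 1.0 - np.exp(-rate * 30.0)
    print(f"{p_30yr:.1%}")                # ~13.9%
    ```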

  17. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, Brent M.; Karlinger, Michael R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank-transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on the average every 4.5 years.
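    A hedged sketch of the Monte Carlo idea, using an equicorrelated Gaussian copula for the inter-site dependence (rho is an arbitrary illustrative value, not an estimate from the Washington data):

    ```python
    # Estimate the RFP: the chance that at least one of m sites exceeds its
    # at-site T-year level in a given year, under constant correlation rho.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    m, T, rho, n_years = 193, 100, 0.4, 20_000

    common = rng.standard_normal((n_years, 1))              # shared regional component
    idio = rng.standard_normal((n_years, m))                # site-specific component
    z = np.sqrt(rho) * common + np.sqrt(1.0 - rho) * idio   # equicorrelated annual maxima (copula scale)

    level = norm.ppf(1.0 - 1.0 / T)                         # at-site T-year level
    rfp = (z > level).any(axis=1).mean()
    print(f"RFP ~= {rfp:.3f}, i.e. roughly one regional {T}-year flood every {1.0 / rfp:.1f} years")
    ```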

  18. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  20. Transition probability spaces in loop quantum gravity

    NASA Astrophysics Data System (ADS)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  1. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  2. Computation of the Complex Probability Function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trainer, Amelia Jo; Ledwith, Patrick John

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements to the Gauss-Hermite quadrature for the complex probability function.
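
    As a concrete illustration of the quadrature discussed in this entry, the short Python sketch below approximates the complex probability function w(z) = (i/π) ∫ exp(-t²)/(z - t) dt (valid for Im z > 0) with n-point Gauss-Hermite nodes and weights. The function name and the choice n = 40 are illustrative assumptions, not from the report, and accuracy degrades close to the real axis.

      import numpy as np
      from numpy.polynomial.hermite import hermgauss

      def faddeeva_gh(z, n=40):
          # Gauss-Hermite nodes t_i and weights w_i for the weight function exp(-t^2)
          t, w = hermgauss(n)
          # w(z) ~= (i/pi) * sum_i w_i / (z - t_i), valid for Im(z) > 0
          return (1j / np.pi) * np.sum(w / (z - t))

      print(faddeeva_gh(1.0 + 0.5j))  # cross-check against scipy.special.wofz, if available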

  3. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only...of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of

  4. Does clinical pretest probability influence image quality and diagnostic accuracy in dual-source coronary CT angiography?

    PubMed

    Thomas, Christoph; Brodoefel, Harald; Tsiflikas, Ilias; Bruckner, Friederike; Reimann, Anja; Ketelsen, Dominik; Drosch, Tanja; Claussen, Claus D; Kopp, Andreas; Heuschmid, Martin; Burgstahler, Christof

    2010-02-01

    To prospectively evaluate the influence of the clinical pretest probability, assessed by the Morise score, on image quality and diagnostic accuracy in coronary dual-source computed tomography angiography (DSCTA). In 61 patients, DSCTA and invasive coronary angiography were performed. Subjective image quality and accuracy for stenosis detection (>50%) of DSCTA, with invasive coronary angiography as the gold standard, were evaluated. The influence of pretest probability on image quality and accuracy was assessed by logistic regression and chi-square testing. Correlations of image quality and accuracy with the Morise score were determined using linear regression. Thirty-eight patients were categorized into the high, 21 into the intermediate, and 2 into the low probability group. Accuracies for the detection of significant stenoses were 0.94, 0.97, and 1.00, respectively. Logistic regressions and chi-square tests showed statistically significant correlations between the Morise score and image quality (P < .0001 and P < .001) and accuracy (P = .0049 and P = .027). Linear regression revealed a cutoff Morise score of 16 for good image quality, and a cutoff for barely diagnostic image quality beyond the upper end of the Morise scale. Pretest probability is a weak predictor of image quality and diagnostic accuracy in coronary DSCTA. A sufficient image quality for diagnostic images can be reached at all pretest probabilities. Therefore, coronary DSCTA might also be suitable for patients with a high pretest probability. Copyright 2010 AUR. Published by Elsevier Inc. All rights reserved.

  5. Generalized probabilistic scale space for image restoration.

    PubMed

    Wong, Alexander; Mishra, Akshaya K

    2010-10-01

    A novel generalized sampling-based probabilistic scale space theory is proposed for image restoration. We explore extending the definition of scale space to better account for both noise and observation models, which is important for producing accurately restored images. A new class of scale-space realizations based on sampling and probability theory is introduced to realize this extended definition in the context of image restoration. Experimental results using 2-D images show that generalized sampling-based probabilistic scale-space theory can be used to produce more accurate restored images when compared with state-of-the-art scale-space formulations, particularly under situations characterized by low signal-to-noise ratios and image degradation.

  6. Risk and protective factors of dissocial behavior in a probability sample.

    PubMed

    Moral de la Rubia, José; Ortiz Morales, Humberto

    2012-07-01

    The aims of this study were to identify risk and protective factors for dissocial behavior, bearing in mind that self-reports of dissocial behavior are biased by impression management. A probability sample of adolescents living in two neighborhoods with high rates of gangs and offenses (112 men and 86 women) was collected. The 27-item Dissocial Behavior Scale (ECODI27; Pacheco & Moral, 2010), Balanced Inventory of Desirable Responding, version 6 (BIDR-6; Paulhus, 1991), Sensation Seeking Scale, form V (SSS-V; Zuckerman, Eysenck, & Eysenck, 1978), Parent-Adolescent Communication Scale (PACS; Barnes & Olson, 1982), 30-item Rathus Assertiveness Schedule (RAS; Rathus, 1973), Interpersonal Reactivity Index (IRI; Davis, 1983) and a social relationship questionnaire (SRQ) were applied. Binary logistic regression was used for the data analysis. A third of the participants showed dissocial behavior. Belonging to a gang in school (among schooled adolescents) or to a gang outside of school and work (total sample), as well as disinhibition, were risk factors; being a woman, perspective taking, and open communication with the father were protective factors. School-leaving was a differential aspect. We emphasize the need for intervention on these variables.

  7. Measuring public opinion on alcohol policy: a factor analytic study of a US probability sample.

    PubMed

    Latimer, William W; Harwood, Eileen M; Newcomb, Michael D; Wagenaar, Alexander C

    2003-03-01

    Public opinion has been one factor affecting change in policies designed to reduce underage alcohol use. Extant research, however, has been criticized for using single survey items of unknown reliability to define adult attitudes on alcohol policy issues. The present investigation addresses a critical gap in the literature by deriving scales on public attitudes, knowledge, and concerns pertinent to alcohol policies designed to reduce underage drinking using a US probability sample survey of 7021 adults. Five attitudinal scales were derived from exploratory and confirmatory factor analyses addressing policies to: (1) regulate alcohol marketing, (2) regulate alcohol consumption in public places, (3) regulate alcohol distribution, (4) increase alcohol taxes, and (5) regulate youth access. The scales exhibited acceptable psychometric properties and were largely consistent with a rational framework which guided the survey construction.

  8. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  9. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    ERIC Educational Resources Information Center

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  10. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. The scale insects (Hemiptera: Coccoidea) of the Maltese Archipelago.

    PubMed

    Mifsud, David; Mazzeo, Gaetana; Russo, Agatino; Watson, Gillian W

    2014-09-25

    Past works on scale insects (Hemiptera: Coccoidea) from the Maltese Archipelago are reviewed. Based on the literature and contemporary collections, a total of 93 species of scale insects belonging to 12 scale insect families are here reported (Aclerdidae 1 species; Asterolecaniidae 4; Coccidae 17; Diaspididae 46; Eriococcidae 5; Kermesidae 1; Margarodidae 1; Micrococcidae 1; Monophlebidae 2; Pseudococcidae 11; Putoidae 2 and Rhizoecidae 2). Of these, 17 species represent new distribution records. Ten species are excluded from the scale insect fauna of the Maltese Islands. Of the 93 species present, only 29 (31.18%) are probably indigenous and the rest (68.82%) represent established introductions from elsewhere. More than 65% of the indigenous species are typically Mediterranean in distribution, with a few species having a mainly European chorotype. A quarter of the established aliens originate from Eurasia, followed by an East Asian/Oriental component (20.31%); European (14.06%); Neotropical (14.06%); cryptogenic (14.06%); African (7.81%) and Australasian (4.70%). Movement of live fruit trees and ornamental plants into the Maltese Archipelago from nearby countries is probably the main route of entry for alien scale insects into the country. Some possible future introductions are discussed.

  12. Quantum probability assignment limited by relativistic causality.

    PubMed

    Han, Yeong Deok; Choi, Taeseung

    2016-03-14

    Quantum theory exhibits nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. For a shared quantum state, correlation manifests itself in the standard quantum framework through joint probability distributions obtained by applying state reduction and the probability assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule; as a result, the amount of nonlocality in the quantum correlation will change. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment.

  13. Optimizing probability of detection point estimate demonstration

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2017-04-01

    The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. Traditionally, the largest flaw size in the set is considered to be a conservative estimate of the flaw size with minimum 90% probability and 95% confidence. This flaw size is denoted as α90/95PE. The paper investigates the relationship between the range of flaw sizes and α90, i.e., the 90% probability flaw size, to provide a desired PPD. The range of flaw sizes is expressed as a proportion of the standard deviation of the probability density distribution. The difference between the median or average of the 29 flaws and α90 is also expressed as a proportion of the standard deviation of the probability density distribution. In general, it is concluded that, if probability of detection increases with flaw size, the average of the 29 flaw sizes will always be larger than or equal to α90 and is an acceptable measure of α90/95PE. If the NDE technique has sufficient sensitivity and signal-to-noise ratio, then the 29-flaw set can be optimized to meet the requirements of minimum required PPD, maximum allowable POF, flaw size tolerance about the mean flaw size, and flaw size detectability. The paper provides a procedure for optimizing flaw sizes in the point estimate demonstration flaw set.
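
    The binomial arithmetic behind the 29-flaw demonstration can be checked in a few lines; a minimal sketch with assumed POD values, not the paper's optimization procedure:

      # Passing a 29-of-29 detection demonstration has probability p**29
      # when the true POD at the demonstrated flaw size is p.
      for p in (0.90, 0.95, 0.98):
          print(p, p**29)
      # p = 0.90 gives ~0.047 < 0.05: a flaw size with only 90% POD passes
      # less than 5% of the time, which is the 90/95 point-estimate criterion.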

  14. Teaching Probability: A Socio-Constructivist Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  15. A genetic scale of reading frame coding.

    PubMed

    Michel, Christian J

    2014-08-21

    The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% on average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% on average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% on average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% on average). Otherwise, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with an RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights into the origin and evolution of the genetic code. Copyright © 2014 Elsevier

  16. Probable or improbable universe? Correlating electroweak vacuum instability with the scale of inflation

    DOE PAGES

    Hook, Anson; Kearney, John; Shakya, Bibhushan; ...

    2015-01-13

    Measurements of the Higgs boson and top quark masses indicate that the Standard Model Higgs potential becomes unstable around Λ_I ~ 10^11 GeV. This instability is cosmologically relevant since quantum fluctuations during inflation can easily destabilize the electroweak vacuum if the Hubble parameter during inflation is larger than Λ_I (as preferred by the recent BICEP2 measurement). Here, we perform a careful study of the evolution of the Higgs field during inflation, obtaining different results from those currently in the literature. We consider both tunneling via a Coleman-de Luccia or Hawking-Moss instanton, valid when the scale of inflation is below the instability scale, as well as a statistical treatment via the Fokker-Planck equation appropriate in the opposite regime. We show that a better understanding of the post-inflation evolution of the unstable AdS vacuum regions is crucial for determining the eventual fate of the universe. If these AdS regions devour all of space, a universe like ours is indeed extremely unlikely without new physics to stabilize the Higgs potential; however, if these regions crunch, our universe survives, but inflation must last a few e-folds longer to compensate for the lost AdS regions. Lastly, we examine the effects of generic Planck-suppressed corrections to the Higgs potential, which can be sufficient to stabilize the electroweak vacuum during inflation.

  17. Trial type probability modulates the cost of antisaccades

    PubMed Central

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  18. Identifying Chinese Microblog Users With High Suicide Probability Using Internet-Based Profile and Linguistic Features: Classification Model

    PubMed Central

    Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul SF

    2015-01-01

    Background: Traditional offline assessment of suicide probability is time consuming, and it is difficult to convince at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has an advantage in its efficiency and potential to reach out to hidden individuals, yet little research has focused on this specific field. Objective: The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying microblog users in China with high suicide possibility through profile and linguistic features extracted from Internet-based data. Methods: Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores in the participant sample, were labeled as high-risk individuals, respectively. Profile and linguistic features were fed into two machine learning algorithms (SLR and RF) to train models that aim to identify high-risk individuals in general suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross validation, in which both the training set and the test set were generated under the stratified random sampling rule from the whole sample. Three classic performance metrics (Precision, Recall, F1 measure) and a specifically defined metric, "Screening Efficiency", were adopted to evaluate model effectiveness. Results: Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening Efficiency of most models varied from 1/4 to 1/2. Precision of the models was generally below 30
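
    A minimal sketch of the modeling setup described in this entry, using scikit-learn with synthetic stand-ins for the profile and linguistic features (the feature matrix, label rate, and metric choice here are assumptions, not the study's data):

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import StratifiedKFold, cross_val_score

      # Stand-in for 909 users x (profile + linguistic) features, ~16% high-risk labels
      X, y = make_classification(n_samples=909, n_features=20, weights=[0.84],
                                 random_state=0)

      cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
      for name, clf in [("SLR", LogisticRegression(max_iter=1000)),
                        ("RF", RandomForestClassifier(n_estimators=200, random_state=0))]:
          print(name, cross_val_score(clf, X, y, cv=cv, scoring="f1").mean())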

  19. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  20. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    47 CFR, Telecommunication, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1623 Probability calculation. (a) All calculations shall be...

  1. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  2. Probability sampling in legal cases: Kansas cellphone users

    NASA Astrophysics Data System (ADS)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  3. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    PubMed

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on the ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03mg/kg) increased risk taking in rats on the ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
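
    For one common choice of mixing distribution, a beta distribution on p, the zero-inflated binomial mixture likelihood described above has a closed form (the beta-binomial). The sketch below is a generic illustration under that assumption, not code from the paper:

      import numpy as np
      from scipy.stats import betabinom

      def zib_negloglik(params, y, J):
          # psi: occupancy probability; a, b: beta mixing distribution for p
          psi, a, b = params
          # Marginal likelihood per site: occupied (beta-binomial detection counts)
          # or unoccupied (only possible when y == 0)
          site_lik = psi * betabinom.pmf(y, J, a, b) + (1 - psi) * (y == 0)
          return -np.sum(np.log(site_lik))

      y = np.array([0, 0, 2, 5, 0, 1])   # detections per site over J = 5 visits (toy data)
      print(zib_negloglik((0.6, 2.0, 3.0), y, J=5))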

  5. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  6. Microstructure-Sensitive Extreme Value Probabilities for High Cycle Fatigue of Ni-Base Superalloy IN100 (Preprint)

    DTIC Science & Technology

    2009-03-01

    transition fatigue regimes; however, microplasticity (i.e., heterogeneous plasticity at the scale of microstructure) is relevant to understanding fatigue...and Socie [57] considered the effect of microplastic...considers the local stress state as affected by intergranular interactions and microplasticity. For the calculations given below, the volumes over which

  7. Convergence of Transition Probability Matrix in CLV-Markov Models

    NASA Astrophysics Data System (ADS)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its behavior far into the future, which is derived from a property of the n-step transition probability matrix: the convergence of the n-step transition matrix as n tends to infinity. Mathematically, finding the convergence of the transition probability matrix means finding the limit of the transition matrix raised to the power n as n tends to infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of a transition probability matrix is the limiting-distribution process. In this paper, the convergence of the transition probability matrix is instead found using a simple concept of linear algebra: diagonalizing the matrix. This method has a higher level of complexity because it requires diagonalizing the matrix, but it has the advantage of yielding a general form for the nth power of the transition probability matrix, which is useful for examining the transition matrix before it becomes stationary. Example cases are taken from a customer-lifetime-value model using an MCM, called the CLV-Markov model. Several models are examined, via their transition probability matrices, to find their convergent forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
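
    A minimal numerical sketch of the diagonalization approach described above (the 2-state matrix is hypothetical): write P = V D V⁻¹, so that Pⁿ = V Dⁿ V⁻¹, whose limit as n grows gives the stationary form.

      import numpy as np

      P = np.array([[0.9, 0.1],
                    [0.4, 0.6]])        # toy transition probability matrix

      lam, V = np.linalg.eig(P)         # P = V diag(lam) V^{-1}
      Vinv = np.linalg.inv(V)

      def P_power(n):
          # General form of P^n obtained from the diagonalization
          return (V @ np.diag(lam**n) @ Vinv).real

      print(P_power(50))                    # rows approach the stationary distribution [0.8, 0.2]
      print(np.linalg.matrix_power(P, 50))  # cross-check by direct powering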

  8. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  9. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.

    PubMed

    Lau, Jey Han; Clark, Alexander; Lappin, Shalom

    2017-07-01

    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgments are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
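
    One acceptability measure of the kind described, SLOR (syntactic log-odds ratio), subtracts the unigram log probability from the language model's log probability and divides by sentence length, removing the length and lexical-frequency confounds. A sketch with hypothetical inputs; the log probabilities would be supplied by an external language model:

      def slor(logp_lm, logp_unigram, n_tokens):
          # SLOR: (log P_LM(s) - log P_unigram(s)) / |s|
          return (logp_lm - logp_unigram) / n_tokens

      # Hypothetical 8-token sentence
      print(slor(logp_lm=-35.2, logp_unigram=-52.0, n_tokens=8))  # 2.1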

  10. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  11. Methods of scaling threshold color difference using printed samples

    NASA Astrophysics Data System (ADS)

    Huang, Min; Cui, Guihua; Liu, Haoxue; Luo, M. Ronnier

    2012-01-01

    A series of printed samples, on a semi-gloss paper substrate and with color differences of threshold magnitude, was prepared for scaling the visual color difference and evaluating the performance of different methods. The probabilities of perceptibility were normalized to Z-scores, and the different color differences were scaled against the Z-scores. The visual color differences were thereby obtained and checked with the STRESS factor. The results indicated that only the scales changed; the relative scales between pairs in the data were preserved.

  12. Flood protection diversification to reduce probabilities of extreme losses.

    PubMed

    Zhou, Qian; Lambert, James H; Karvetski, Christopher W; Keisler, Jeffrey M; Linkov, Igor

    2012-11-01

    Recent catastrophic losses because of floods require developing resilient approaches to flood risk protection. This article assesses how diversification of a system of coastal protections might decrease the probabilities of extreme flood losses. The study compares the performance of portfolios each consisting of four types of flood protection assets in a large region of dike rings. A parametric analysis suggests conditions in which diversifications of the types of included flood protection assets decrease extreme flood losses. Increased return periods of extreme losses are associated with portfolios where the asset types have low correlations of economic risk. The effort highlights the importance of understanding correlations across asset types in planning for large-scale flood protection. It allows explicit integration of climate change scenarios in developing flood mitigation strategy. © 2012 Society for Risk Analysis.

  13. Galactic-scale civilization

    NASA Technical Reports Server (NTRS)

    Kuiper, T. B. H.

    1980-01-01

    Evolutionary arguments are presented in favor of the existence of civilization on a galactic scale. Patterns of physical, chemical, biological, social and cultural evolution leading to increasing levels of complexity are pointed out and explained thermodynamically in terms of the maximization of free energy dissipation in the environment of the organized system. The possibility of the evolution of a global and then a galactic human civilization is considered, and probabilities that the galaxy is presently in its colonization state and that life could have evolved to its present state on earth are discussed. Fermi's paradox of the absence of extraterrestrials in light of the probability of their existence is noted, and a variety of possible explanations is indicated. Finally, it is argued that although mankind may be the first occurrence of intelligence in the galaxy, it is unjustified to presume that this is so.

  14. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are used to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible.

  15. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
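
    The Sagan-Coleman formula referenced above is a simple product; a worked example with purely illustrative numbers:

      # P(contamination) = expected viable microbes released x P(growth)
      N_release = 300     # expected viable organisms released (assumed value)
      P_growth = 1e-6     # probability a released organism grows (assumed value)
      print(N_release * P_growth)   # 3e-04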

  16. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We…

  17. Obtaining Accurate Probabilities Using Classifier Calibration

    ERIC Educational Resources Information Center

    Pakdaman Naeini, Mahdi

    2016-01-01

    Learning probabilistic classification and prediction models that generate accurate probabilities is essential in many prediction and decision-making tasks in machine learning and data mining. One way to achieve this goal is to post-process the output of classification models to obtain more accurate probabilities. These post-processing methods are…
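
    A common post-processing approach of the kind this work studies can be sketched with scikit-learn's calibration wrapper; this is a generic illustration on synthetic data, not the dissertation's method:

      from sklearn.calibration import CalibratedClassifierCV
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import brier_score_loss
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=3000, random_state=0)
      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

      raw = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
      cal = CalibratedClassifierCV(RandomForestClassifier(random_state=0),
                                   method="isotonic", cv=5).fit(Xtr, ytr)

      # Lower Brier score indicates more accurate probabilities
      for name, m in [("raw", raw), ("calibrated", cal)]:
          print(name, brier_score_loss(yte, m.predict_proba(Xte)[:, 1]))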

  18. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  19. What is preexisting strength? Predicting free association probabilities, similarity ratings, and cued recall probabilities.

    PubMed

    Nelson, Douglas L; Dyrdal, Gunvor M; Goodmon, Leilani B

    2005-08-01

    Measuring lexical knowledge poses a challenge to the study of the influence of preexisting knowledge on the retrieval of new memories. Many tasks focus on word pairs, but words are embedded in associative networks, so how should preexisting pair strength be measured? It has been measured by free association, similarity ratings, and co-occurrence statistics. Researchers interpret free association response probabilities as unbiased estimates of forward cue-to-target strength. In Study 1, analyses of large free association and extralist cued recall databases indicate that this interpretation is incorrect. Competitor and backward strengths bias free association probabilities, and as with other recall tasks, preexisting strength is described by a ratio rule. In Study 2, associative similarity ratings are predicted by forward and backward, but not by competitor, strength. Preexisting strength is not a unitary construct, because its measurement varies with method. Furthermore, free association probabilities predict extralist cued recall better than do ratings and co-occurrence statistics. The measure that most closely matches the criterion task may provide the best estimate of the identity of preexisting strength.

  20. Arsenic concentrations, related environmental factors, and the predicted probability of elevated arsenic in groundwater in Pennsylvania

    USGS Publications Warehouse

    Gross, Eliza L.; Low, Dennis J.

    2013-01-01

    Logistic regression models were created to predict and map the probability of elevated arsenic concentrations in groundwater statewide in Pennsylvania and in three intrastate regions to further improve predictions for those three regions (glacial aquifer system, Gettysburg Basin, Newark Basin). Although the Pennsylvania and regional predictive models retained some different variables, they have common characteristics that can be grouped by (1) geologic and soils variables describing arsenic sources and mobilizers, (2) geochemical variables describing the geochemical environment of the groundwater, and (3) locally specific variables that are unique to each of the three regions studied and not applicable to statewide analysis. Maps of Pennsylvania and the three intrastate regions were produced that illustrate that areas most at risk are those with geology and soils capable of functioning as an arsenic source or mobilizer and geochemical groundwater conditions able to facilitate redox reactions. The models have limitations because they may not characterize areas that have localized controls on arsenic mobility. The probability maps associated with this report are intended for regional-scale use and may not be accurate for use at the field scale or when considering individual wells.

  1. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  2. Lattice Theory, Measures and Probability

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2007-11-01

    In this tutorial, I will discuss the concepts behind generalizing ordering to measuring and apply these ideas to the derivation of probability theory. The fundamental concept is that anything that can be ordered can be measured. Since we are in the business of making statements about the world around us, we focus on ordering logical statements according to implication. This results in a Boolean lattice, which is related to the fact that the corresponding logical operations form a Boolean algebra. The concept of logical implication can be generalized to degrees of implication by generalizing the zeta function of the lattice. The rules of probability theory arise naturally as a set of constraint equations. Through this construction we are able to neatly connect the concepts of order, structure, algebra, and calculus. The meaning of probability is inherited from the meaning of the ordering relation, implication, rather than being imposed in an ad hoc manner at the start.

  3. A Discrete Probability Function Method for the Equation of Radiative Transfer

    NASA Technical Reports Server (NTRS)

    Sivathanu, Y. R.; Gore, J. P.

    1993-01-01

    A discrete probability function (DPF) method for the equation of radiative transfer is derived. The DPF is defined as the integral of the probability density function (PDF) over a discrete interval. The derivation allows the evaluation of the PDF of intensities leaving desired radiation paths including turbulence-radiation interactions without the use of computer intensive stochastic methods. The DPF method has a distinct advantage over conventional PDF methods since the creation of a partial differential equation from the equation of transfer is avoided. Further, convergence of all moments of intensity is guaranteed at the basic level of simulation unlike the stochastic method where the number of realizations for convergence of higher order moments increases rapidly. The DPF method is described for a representative path with approximately integral-length scale-sized spatial discretization. The results show good agreement with measurements in a propylene/air flame except for the effects of intermittency resulting from highly correlated realizations. The method can be extended to the treatment of spatial correlations as described in the Appendix. However, information regarding spatial correlations in turbulent flames is needed prior to the execution of this extension.
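
    Restating the stated definition in symbols (notation chosen here for illustration): if f(I) is the PDF of intensities leaving a radiation path, the DPF over the i-th discrete intensity interval [I_i, I_{i+1}) is

      P_i = \int_{I_i}^{I_{i+1}} f(I)\,\mathrm{d}I, \qquad \sum_i P_i = 1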

  4. Correlation between the clinical pretest probability score and the lung ventilation and perfusion scan probability.

    PubMed

    Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin

    2013-10-01

    The aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with the lung ventilation and perfusion (VQ) scan. A retrospective analysis of 510 patients who had a lung VQ scan between 2008 and 2010 was included in the study. Out of the 510 studies, the numbers of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. A total of 103 patients underwent a computed tomography pulmonary angiography (CTPA) scan, in which 21 (20%) had a positive scan, 81 (79%) had a negative scan, and one (1%) had an equivocal result. The rates of PE in the normal, low-probability, and high-probability scan categories were 2 (9.5%), 10 (47.5%), and 9 (43%), respectively. There was a very low correlation (Pearson correlation coefficient r = 0.20) between the clinical PTP score and the lung VQ scan. The area under the curve (AUC) of the clinical PTP score was 52% when compared with the CTPA results. However, the accuracy of the lung VQ scan was better (AUC = 74%) when compared with the CTPA scan. The clinical PTP score is unreliable on its own; however, it may still aid in the interpretation of the lung VQ scan. The accuracy of the lung VQ scan was better in the assessment of underlying pulmonary embolism (PE).

  5. Correlation between the clinical pretest probability score and the lung ventilation and perfusion scan probability

    PubMed Central

    Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin

    2013-01-01

    Purpose: The aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with the lung ventilation and perfusion (VQ) scan. Materials and Methods: A retrospective analysis of 510 patients who had a lung VQ scan between 2008 and 2010 was included in the study. Out of the 510 studies, the numbers of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. Results: A total of 103 patients underwent a computed tomography pulmonary angiography (CTPA) scan, in which 21 (20%) had a positive scan, 81 (79%) had a negative scan, and one (1%) had an equivocal result. The rates of PE in the normal, low-probability, and high-probability scan categories were 2 (9.5%), 10 (47.5%), and 9 (43%), respectively. There was a very low correlation (Pearson correlation coefficient r = 0.20) between the clinical PTP score and the lung VQ scan. The area under the curve (AUC) of the clinical PTP score was 52% when compared with the CTPA results. However, the accuracy of the lung VQ scan was better (AUC = 74%) when compared with the CTPA scan. Conclusion: The clinical PTP score is unreliable on its own; however, it may still aid in the interpretation of the lung VQ scan. The accuracy of the lung VQ scan was better in the assessment of underlying pulmonary embolism (PE). PMID:24379532

  6. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parke, Stephen J.; Denton, Peter B.; Minakata, Hisakazu

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  7. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  8. Probability, arrow of time and decoherence

    NASA Astrophysics Data System (ADS)

    Bacciagaluppi, Guido

    This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.

  9. Fine-temporal forecasting of outbreak probability and severity: Ross River virus in Western Australia.

    PubMed

    Koolhof, I S; Bettiol, S; Carver, S

    2017-10-01

    Health warnings of mosquito-borne disease risk require forecasts that are accurate at fine-temporal resolutions (weekly scales); however, most forecasting is coarse (monthly). We use environmental and Ross River virus (RRV) surveillance to predict weekly outbreak probabilities and incidence spanning tropical, semi-arid, and Mediterranean regions of Western Australia (1991-2014). Hurdle and linear models were used to predict outbreak probabilities and incidence respectively, using time-lagged environmental variables. Forecast accuracy was assessed by model fit and cross-validation. Residual RRV notification data were also examined against mitigation expenditure for one site, Mandurah 2007-2014. Models were predictive of RRV activity, except at one site (Capel). Minimum temperature was an important predictor of RRV outbreaks and incidence at all predicted sites. Precipitation was more likely to cause outbreaks and greater incidence among tropical and semi-arid sites. While variable, mitigation expenditure coincided positively with increased RRV incidence (r² = 0.21). Our research demonstrates capacity to accurately predict mosquito-borne disease outbreaks and incidence at fine-temporal resolutions. We apply our findings, developing a user-friendly tool enabling managers to easily adopt this research to forecast region-specific RRV outbreaks and incidence. Approaches here may be of value to fine-scale forecasting of RRV in other areas of Australia, and other mosquito-borne diseases.

  10. Subgrid-scale stresses and scalar fluxes constructed by the multi-scale turnover Lagrangian map

    NASA Astrophysics Data System (ADS)

    AL-Bairmani, Sukaina; Li, Yi; Rosales, Carlos; Xie, Zheng-tong

    2017-04-01

    The multi-scale turnover Lagrangian map (MTLM) [C. Rosales and C. Meneveau, "Anomalous scaling and intermittency in three-dimensional synthetic turbulence," Phys. Rev. E 78, 016313 (2008)] uses nested multi-scale Lagrangian advection of fluid particles to distort a Gaussian velocity field and, as a result, generate non-Gaussian synthetic velocity fields. Passive scalar fields can be generated with the procedure when the fluid particles carry a scalar property [C. Rosales, "Synthetic three-dimensional turbulent passive scalar fields via the minimal Lagrangian map," Phys. Fluids 23, 075106 (2011)]. The synthetic fields have been shown to possess highly realistic statistics characterizing small scale intermittency, geometrical structures, and vortex dynamics. In this paper, we present a study of the synthetic fields using the filtering approach. This approach, which has not been pursued so far, provides insights on the potential applications of the synthetic fields in large eddy simulations and subgrid-scale (SGS) modelling. The MTLM method is first generalized to model scalar fields produced by an imposed linear mean profile. We then calculate the subgrid-scale stress, SGS scalar flux, SGS scalar variance, as well as related quantities from the synthetic fields. Comparison with direct numerical simulations (DNSs) shows that the synthetic fields reproduce the probability distributions of the SGS energy and scalar dissipation rather well. Related geometrical statistics also display close agreement with DNS results. The synthetic fields slightly under-estimate the mean SGS energy dissipation and slightly over-predict the mean SGS scalar variance dissipation. In general, the synthetic fields tend to slightly under-estimate the probability of large fluctuations for most quantities we have examined. Small scale anisotropy in the scalar field originated from the imposed mean gradient is captured. The sensitivity of the synthetic fields on the input spectra is assessed by
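
    A one-dimensional analogue (illustrative only, not the MTLM procedure itself) of the filtering diagnostic applied to such synthetic fields: under a box filter, the subgrid-scale stress of a signal u is tau = bar(u·u) - bar(u)·bar(u):

```python
# 1D analogue of the SGS diagnostic: filter a rough signal and compute
# the residual stress tau = bar(u*u) - bar(u)^2 under a box filter.
import numpy as np

def box_filter(u: np.ndarray, width: int) -> np.ndarray:
    kernel = np.ones(width) / width
    return np.convolve(u, kernel, mode="same")

rng = np.random.default_rng(1)
u = np.cumsum(rng.normal(size=4096))   # rough synthetic "velocity" signal
u -= u.mean()

width = 32
tau = box_filter(u * u, width) - box_filter(u, width) ** 2
print(f"mean SGS stress: {tau.mean():.3f}, rms: {tau.std():.3f}")
```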

  11. Electrofishing capture probability of smallmouth bass in streams

    USGS Publications Warehouse

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
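
    The adjustment described above ("dividing the number of individuals sampled by the probability of capture") can be illustrated in a few lines; the per-pass capture probability and count below are assumptions for illustration, not the study's estimates:

```python
# Abundance estimate from a raw count and a capture probability:
# N_hat = count / P(captured in at least one of n passes).

def cumulative_capture_prob(p_single_pass: float, n_passes: int) -> float:
    """Probability of capture in at least one of n electrofishing passes."""
    return 1.0 - (1.0 - p_single_pass) ** n_passes

count = 42       # smallmouth bass sampled (illustrative)
p_pass = 0.35    # assumed per-pass capture probability
p_cum = cumulative_capture_prob(p_pass, n_passes=3)
print(f"cumulative capture probability: {p_cum:.2f}")
print(f"abundance estimate: {count / p_cum:.0f}")
```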

  12. A short note on probability in clinical medicine.

    PubMed

    Upshur, Ross E G

    2013-06-01

    Probability claims are ubiquitous in clinical medicine, yet exactly how clinical events relate to interpretations of probability has not been well explored. This brief essay examines the major interpretations of probability and how these interpretations may account for the probabilistic nature of clinical events. It is argued that there are significant problems with the unquestioned application of interpretations of probability to clinical events. The essay concludes by suggesting other avenues to understand uncertainty in clinical medicine. © 2013 John Wiley & Sons Ltd.

  13. Risk estimation using probability machines

    PubMed Central

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  14. Risk estimation using probability machines.

    PubMed

    Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D

    2014-03-01

    Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from.
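
    A compact sketch of the "probability machine" idea from the two records above: a nonparametric learner's class probabilities are compared with a correctly specified logistic model on logistic-generated data. This uses scikit-learn's random forest as the learning machine; the settings and data are illustrative:

```python
# Probability machine sketch: nonparametric estimation of P(Y=1 | X)
# with a random forest, versus a correctly specified logistic model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 2000
X = rng.normal(size=(n, 2))
true_p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))
y = rng.random(n) < true_p

rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20,
                            random_state=0).fit(X, y)
lr = LogisticRegression().fit(X, y)

x0 = np.array([[0.5, -0.5]])
print("random forest P(y=1|x):", round(rf.predict_proba(x0)[0, 1], 3))
print("logistic      P(y=1|x):", round(lr.predict_proba(x0)[0, 1], 3))
```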

  15. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
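
    A toy sketch of the bounding-set idea, with made-up geometry: the failure region is enclosed in a set whose probability is analytic, Monte Carlo samples are drawn only inside that set, and P(fail) = P(fail | bound) × P(bound):

```python
# Conditional-sampling sketch: uncertain parameters are uniform on the
# unit square, failure is a small disc, and the bounding box around the
# disc has an analytically known probability (its area).
import numpy as np

rng = np.random.default_rng(3)
center, r = np.array([0.8, 0.8]), 0.05

def fails(points: np.ndarray) -> np.ndarray:
    return np.sum((points - center) ** 2, axis=1) < r ** 2

lo, hi = center - r, center + r
p_bound = np.prod(hi - lo)                 # P(sample falls in bounding box)

m = 10_000
conditional = rng.uniform(lo, hi, size=(m, 2))   # samples inside the box only
p_fail = fails(conditional).mean() * p_bound
print(f"estimated P(failure) = {p_fail:.5f} (exact = {np.pi * r**2:.5f})")
```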

  16. Causal inference, probability theory, and graphical insights.

    PubMed

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for an instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
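
    A worked numeric example (illustrative counts, not from the article) of the Simpson's paradox situation the BK-Plot is designed to display: treatment looks worse in the pooled data yet better within each stratum of a binary confounder:

```python
# Simpson's paradox with a binary confounder: per-stratum success rates
# favor treatment, while the pooled rates favor control.
strata = {
    # stratum: (treated successes, treated n, control successes, control n)
    "low risk":  (81, 87, 234, 270),
    "high risk": (192, 263, 55, 80),
}
t_s = t_n = c_s = c_n = 0
for name, (ts, tn, cs, cn) in strata.items():
    print(f"{name:9s}: treated {ts/tn:.2f} vs control {cs/cn:.2f}")
    t_s, t_n, c_s, c_n = t_s + ts, t_n + tn, c_s + cs, c_n + cn
print(f"pooled   : treated {t_s/t_n:.2f} vs control {c_s/c_n:.2f}")
```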

  17. Fixation Probability in a Haploid-Diploid Population.

    PubMed

    Bessho, Kazuhiro; Otto, Sarah P

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.

  18. Fixation Probability in a Haploid-Diploid Population

    PubMed Central

    Bessho, Kazuhiro; Otto, Sarah P.

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright–Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. PMID:27866168
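
    The two approximations named in these records have compact closed forms in the simple fully haploid case; a sketch follows. The formulas below are the textbook versions, and the haploid-diploid corrections derived in the paper are not reproduced here:

```python
# Branching-process vs diffusion approximations for the fixation
# probability of a single new mutant with selection coefficient s.
import math

def fixation_branching(s: float) -> float:
    """Branching-process approximation: ~2s for small beneficial s."""
    return 2 * s

def fixation_diffusion(s: float, N: int) -> float:
    """Kimura's diffusion approximation for one mutant in N haploids."""
    p0 = 1.0 / N
    return (1 - math.exp(-2 * N * s * p0)) / (1 - math.exp(-2 * N * s))

s, N = 0.01, 1000
print(f"branching: {fixation_branching(s):.4f}")    # 0.0200
print(f"diffusion: {fixation_diffusion(s, N):.4f}")  # ~0.0198
```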

  19. Immediate hypersensitivity to moxifloxacin with tolerance to ciprofloxacin: report of three cases and review of the literature.

    PubMed

    Chang, Brenda; Knowles, Sandra R; Weber, Elizabeth

    2010-04-01

    To report 3 cases of immediate hypersensitivity reactions to moxifloxacin in patients who tolerated ciprofloxacin. A 71-year-old man, a 44-year-old woman, and a 70-year-old woman with a history of a moxifloxacin reaction developed an immediate hypersensitivity reaction upon oral challenge with moxifloxacin in our Drug Safety Clinic. The reaction was mainly characterized by pruritus and urticaria, although dyspnea and hypotension were noted in the first and second patient, respectively. Two of the patients had negative oral challenge tests with ciprofloxacin and all 3 patients tolerated full treatment courses of oral ciprofloxacin. In all 3 cases, use of the Naranjo probability scale indicated a highly probable adverse drug reaction. Moxifloxacin, similar to other fluoroquinolones, can cause immediate hypersensitivity reactions. Previous publications have reported both cross-reactivity and a lack of cross-reactivity among various fluoroquinolones. The 3 patients discussed demonstrated a lack of cross-reactivity between moxifloxacin and ciprofloxacin since they tolerated oral challenge tests and full treatment courses of ciprofloxacin. Moxifloxacin has unique side chains at positions 7 and 8 on its bicyclic ring structure. Antigenic specificity to particular side chains at positions 7 and 8 on the bicyclic ring structure of moxifloxacin may explain this lack of cross-reactivity. Higher reporting rates of anaphylaxis to moxifloxacin compared to other fluoroquinolones may also be related to side chain specificity, although definitive evidence for this is lacking. Based on our experience, patients who develop immediate hypersensitivity reactions to moxifloxacin may receive ciprofloxacin therapy in an appropriately monitored setting if they have previously tolerated full treatment courses of ciprofloxacin. Research into whether there is a specific side chain reaction unique to moxifloxacin is warranted.

  20. Energy drink-induced acute kidney injury.

    PubMed

    Greene, Elisa; Oman, Kristy; Lefler, Mary

    2014-10-01

    To report a case of acute renal failure possibly induced by Red Bull. A 40-year-old man presented with various complaints, including a recent hypoglycemic episode. Assessment revealed that serum creatinine was elevated at 5.5 mg/dL, from a baseline of 0.9 mg/dL. An interview revealed a 2- to 3-week history of daily ingestion of 100 to 120 oz of Red Bull energy drink. Resolution of renal dysfunction occurred within 2 days of discontinuation of Red Bull and persisted through 10 months of follow-up. Rechallenge was not attempted. Energy-drink-induced renal failure has been reported infrequently. We identified 2 case reports via a search of MEDLINE, one of which occurred in combination with alcohol and the other of which was not available in English. According to the Food and Drug Administration's (FDA's) Center for Food Safety and Applied Nutrition Adverse Event Reporting System, between 2004 and 2012 the FDA received 166 reports of adverse events associated with energy drink consumption. Only 3 of the 166 (1.8%) described renal failure, and none were reported with Red Bull specifically. A defined mechanism for injury is unknown. Assessment of the Naranjo adverse drug reaction probability scale indicates a probable relationship between the development of acute renal failure and Red Bull ingestion in our patient. Acute kidney injury has rarely been reported with energy drink consumption. Our report describes the first English language report of acute renal failure occurring in the context of ingestion of large quantities of energy drink without concomitant alcohol. © The Author(s) 2014.

  1. Interaction between levodopa and enteral nutrition.

    PubMed

    Cooper, Mandelin K; Brock, David G; McDaniel, Cara M

    2008-03-01

    To report and discuss a drug-nutrient interaction involving levodopa and protein in enteral nutrition. A 77-year-old male with Parkinson's disease was admitted to an intensive care unit for an intracerebral hemorrhage. To provide nutritional support, an oral gastric tube was placed and continuous enteral nutrition was initiated, with 1.4 g/kg of protein administered daily. The following medications were continued during hospitalization: immediate-release carbidopa/levodopa 25 mg/100 mg, with 1.5 tablets administered 4 times daily; pramipexole 1.5 mg 3 times daily; and entacapone 200 mg 4 times daily. Despite this drug therapy, the patient developed severe rigidity. A review of the literature revealed a potential interaction between levodopa and protein intake. To resolve this interaction, the amount of protein in the enteral nutrition was decreased to 0.9 g/kg/day and the nutritional administration was changed from continuous enteral feeding to bolus feeding, with levodopa given between boluses. After these adjustments, the patient showed marked improvement of parkinsonian symptoms. The drug-nutrient interaction between protein and levodopa in outpatient settings has been reported widely in the literature; however, this interaction has not been previously reported with continuous enteral nutrition. Decreased parkinsonian symptom control, despite adherence to an established medication regimen, together with dramatic improvement observed after manipulation of enteral nutrition delivery and content, strongly suggest interference with levodopa absorption. Use of the Naranjo probability scale supports a probable interaction between the protein content in tube feeds and levodopa, resulting in decreased levodopa efficacy. Clinicians should be cognizant of the potential drug-nutrient interaction between levodopa and enteral nutrition.

  2. Development of Hypercalcemia in a Patient Receiving Peginterferon alfa-2a Therapy for Polycythemia Vera.

    PubMed

    Karne, Sheetal; Mainor, Candace B; Baer, Maria R

    2016-06-01

    Peginterferon alfa-2a (PEG-IFN alfa-2a) is commonly used to treat hepatitis C virus infection and is also being used increasingly to treat myeloproliferative neoplasms including polycythemia vera. Sarcoidosis associated with IFN therapy for treatment of hepatitis C is well described, with hypercalcemia occurring as a rare manifestation. We describe a 25-year-old man with polycythemia vera who became resistant to hydroxyurea after 6 years of treatment, requiring therapeutic phlebotomy procedures with increasing frequency for elevated hemoglobin and hematocrit levels. PEG-IFN alfa-2a was then initiated at 90 μg subcutaneously once/week and was progressively increased to 180 μg/week over the next 11 months, with normalization of his hemoglobin and hematocrit. The patient then developed hypercalcemia with low parathyroid hormone, parathyroid hormone-related protein, and 25-hydroxyvitamin D levels, and high 1,25-dihydroxyvitamin D and angiotensin-converting enzyme levels, without other evidence of sarcoidosis. PEG-IFN alfa-2a was discontinued, treatment with intravenous fluids and zoledronic acid was initiated, and the hypercalcemia resolved 10 weeks later. Use of the Naranjo Adverse Drug Reaction Probability Scale indicated a probable relationship (score of 7) between the patient's development of hypercalcemia and PEG-IFN alfa-2a therapy; the relationship could not be considered as definite because the patient was not rechallenged with the drug. To our knowledge, this is the first case report of IFN-induced hypercalcemia without other manifestations of sarcoidosis. Practitioners should be aware of hypercalcemia as a potential complication of PEG-IFN alfa-2a therapy, as well as its protracted time course, in patients with myeloproliferative neoplasms. © 2016 Pharmacotherapy Publications, Inc.
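
    Since several records in this set rely on the Naranjo scale, a minimal sketch of its scoring logic may help: the ten item scores are totaled and mapped onto the standard causality categories (≥9 definite, 5-8 probable, 1-4 possible, ≤0 doubtful). The example item scores are hypothetical, chosen only to reproduce a total of 7 ("probable") as in the record above:

```python
# Naranjo adverse drug reaction probability scale: total the ten item
# scores supplied by the assessor and map them to a causality category.

def naranjo_category(item_scores: list[int]) -> tuple[int, str]:
    total = sum(item_scores)
    if total >= 9:
        category = "definite"
    elif total >= 5:
        category = "probable"
    elif total >= 1:
        category = "possible"
    else:
        category = "doubtful"
    return total, category

# Hypothetical case-report scoring for the 10 Naranjo items.
print(naranjo_category([1, 2, 1, 0, 2, 0, 0, 0, 0, 1]))  # (7, 'probable')
```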

  3. Chance, determinism and the classical theory of probability.

    PubMed

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Series approximation to probability densities

    NASA Astrophysics Data System (ADS)

    Cohen, L.

    2018-04-01

    One of the historical and fundamental uses of the Edgeworth and Gram-Charlier series is to "correct" a Gaussian density when it is determined that the probability density under consideration has moments that do not correspond to the Gaussian [5, 6]. There is a fundamental difficulty with these methods in that if the series are truncated, then the resulting approximate density is not manifestly positive. The aim of this paper is to attempt to expand a probability density so that if it is truncated it will still be manifestly positive.

  5. Match probabilities in a finite, subdivided population

    PubMed Central

    Malaspinas, Anna-Sapfo; Slatkin, Montgomery; Song, Yun S.

    2011-01-01

    We generalize a recently introduced graphical framework to compute the probability that haplotypes or genotypes of two individuals drawn from a finite, subdivided population match. As in the previous work, we assume an infinite-alleles model. We focus on the case of a population divided into two subpopulations, but the underlying framework can be applied to a general model of population subdivision. We examine the effect of population subdivision on the match probabilities and the accuracy of the product rule which approximates multi-locus match probabilities as a product of one-locus match probabilities. We quantify the deviation from predictions of the product rule by R, the ratio of the multi-locus match probability to the product of the one-locus match probabilities. We carry out the computation for two loci and find that ignoring subdivision can lead to underestimation of the match probabilities if the population under consideration actually has subdivision structure and the individuals originate from the same subpopulation. On the other hand, under a given model of population subdivision, we find that the ratio R for two loci is only slightly greater than 1 for a large range of symmetric and asymmetric migration rates. Keeping in mind that the infinite-alleles model is not the appropriate mutation model for STR loci, we conclude that, for two loci and biologically reasonable parameter values, population subdivision may lead to results that disfavor innocent suspects because of an increase in identity-by-descent in finite populations. On the other hand, for the same range of parameters, population subdivision does not lead to a substantial increase in linkage disequilibrium between loci. Those results are consistent with established practice. PMID:21266180

  6. Students' Understanding of Conditional Probability on Entering University

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…

  7. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…

  8. Analgesic efficacy and bioavailability of ketorolac in postoperative pain: a probability analysis.

    PubMed

    Pérez-Urizar, J; Granados-Soto, V; Castañeda-Hernández, G; Hong, E; González, C; Martínez, J L; Flores-Murrieta, F J

    2000-01-01

    The analgesic efficacy and bioavailability of 30 mg intramuscular ketorolac was studied in 24 patients with severe or very severe postoperative pain. Pain and pain relief were determined by a five-point verbal rating scale and data were submitted to a probability analysis. Ketorolac plasma levels were determined by high-performance liquid chromatography. Two patients chose not to finish the study; 22 patients completed the study achieving at least good pain relief. Of these 22 patients, 13 reached complete pain relief. Ketorolac was rapidly absorbed. Nevertheless, pain relief increased gradually, showing a considerable delay relative to plasma concentrations. Analysis of the probability-time curves revealed that 25% of the patients obtained moderate pain relief at 7 min after ketorolac administration, 50% at 11 min, 75% at 29 min, and 95% at 60 min. Good pain relief was achieved in 25, 50, and 75% of the patients at 1.1, 1.8, and 2.7 h, respectively. Complete pain relief was achieved in 25% and 50% of the patients at 2.6 h and 3.7 h, respectively. The probability of exhibiting an acceptable pain relief in responsive patients for more than 5 h was 0.97. No serious side effects were detected. Results show that 30 mg intramuscular ketorolac is an adequate treatment for postoperative pain in the Mexican population. Therefore, the use of higher doses is not justified. Due to the gradual onset of analgesia, administration of additional analgesic medication before 1 h is not recommended.

  9. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    ERIC Educational Resources Information Center

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  10. Transition Probabilities for Hydrogen-Like Atoms

    NASA Astrophysics Data System (ADS)

    Jitrik, Oliverio; Bunge, Carlos F.

    2004-12-01

    E1, M1, E2, M2, E3, and M3 transition probabilities for hydrogen-like atoms are calculated with point-nucleus Dirac eigenfunctions for Z=1-118 and up to large quantum numbers l=25 and n=26, increasing existing data more than a thousandfold. A critical evaluation of the accuracy shows a higher reliability with respect to previous works. Tables for hydrogen containing a subset of the results are given explicitly, listing the states involved in each transition, wavelength, term energies, statistical weights, transition probabilities, oscillator strengths, and line strengths. The complete results, including 1 863 574 distinct transition probabilities, lifetimes, and branching fractions are available at http://www.fisica.unam.mx/research/tables/spectra/1el

  11. Fixation probability on clique-based graphs

    NASA Astrophysics Data System (ADS)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
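
    A minimal Monte-Carlo sketch of the quantity being estimated above: mutant fixation in the Moran birth-death process on a graph. A complete graph stands in for the paper's clique-based families, whose adjacency lists would simply replace the one built here:

```python
# Moran process on a graph: a parent is chosen proportional to fitness,
# its offspring replaces a uniformly chosen neighbor; repeat until the
# mutant lineage fixes or goes extinct.
import random

def fixation_probability(neighbors, fitness, trials=5000, seed=0):
    rng = random.Random(seed)
    n = len(neighbors)
    fixed = 0
    for _ in range(trials):
        mutants = {rng.randrange(n)}          # one random initial mutant
        while 0 < len(mutants) < n:
            weights = [fitness if v in mutants else 1.0 for v in range(n)]
            parent = rng.choices(range(n), weights=weights)[0]
            child = rng.choice(neighbors[parent])
            if parent in mutants:
                mutants.add(child)
            else:
                mutants.discard(child)
        fixed += len(mutants) == n
    return fixed / trials

n, r = 8, 1.5   # graph size and mutant fitness (illustrative)
complete = {v: [u for u in range(n) if u != v] for v in range(n)}
print(f"estimated fixation probability: {fixation_probability(complete, r):.3f}")
```

    For the complete graph this estimate should approach the classical Moran value (1 - 1/r)/(1 - 1/r^n) ≈ 0.347, a useful correctness check before substituting clique-based adjacency structures.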

  12. Analytical expressions for the closure probability of a stiff wormlike chain for finite capture radius.

    PubMed

    Guérin, T

    2017-08-01

    Estimating the probability that two monomers of the same polymer chain are close together is a key ingredient to characterize intramolecular reactions and polymer looping. In the case of stiff wormlike polymers (rigid fluctuating elastic rods), for which end-to-end encounters are rare events, we derive an explicit analytical formula for the probability η(r_c) that the distance between the chain extremities is smaller than some capture radius r_c. The formula is asymptotically exact in the limit of stiff chains, and it leads to the identification of two distinct scaling regimes for the closure factor, originating from a strong variation of the fluctuations of the chain orientation at closure. Our theory is compatible with existing analytical results from the literature that cover the cases of a vanishing capture radius and of nearly fully extended chains.

  13. N-tag probability law of the symmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Poncet, Alexis; Bénichou, Olivier; Démery, Vincent; Oshanin, Gleb

    2018-06-01

    The symmetric exclusion process (SEP), in which particles hop symmetrically on a discrete line with hard-core constraints, is a paradigmatic model of subdiffusion in confined systems. This anomalous behavior is a direct consequence of strong spatial correlations induced by the requirement that the particles cannot overtake each other. Even if this fact has been recognized qualitatively for a long time, up to now there has been no full quantitative determination of these correlations. Here we study the joint probability distribution of an arbitrary number of tagged particles in the SEP. We determine analytically its large-time limit for an arbitrary density of particles, and its full dynamics in the high-density limit. In this limit, we obtain the time-dependent large deviation function of the problem and unveil a universal scaling form shared by the cumulants.

  14. Knotting probability of a shaken ball-chain.

    PubMed

    Hickford, J; Jones, R; du Pont, S Courrech; Eggers, J

    2006-11-01

    We study the formation of knots on a macroscopic ball chain, which is shaken on a horizontal plate at 12 times the acceleration of gravity. We find that above a certain critical length, the knotting probability is independent of chain length, while the time to shake out a knot increases rapidly with chain length. The probability of finding a knot after a certain time is the result of the balance of these two processes. In particular, the knotting probability tends to a constant for long chains.

  15. Crash probability estimation via quantifying driver hazard perception.

    PubMed

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation in this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
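
    A toy sketch of the kinematic quantity underlying the proposed measure: time-to-collision (TTC) of a following vehicle, the gap divided by the closing speed. The paper's "driver risk response time" additionally folds in the driver's braking response, which is not modeled here, and all values are illustrative:

```python
# Time-to-collision for a following vehicle; infinite if not closing.
def time_to_collision(gap_m: float, v_follow_ms: float, v_lead_ms: float) -> float:
    closing = v_follow_ms - v_lead_ms
    return float("inf") if closing <= 0 else gap_m / closing

print(f"TTC = {time_to_collision(20.0, 20.0, 12.0):.1f} s")  # 2.5 s
```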

  16. Unmanned aerial vehicles for surveying marine fauna: assessing detection probability.

    PubMed

    Hodgson, Amanda; Peel, David; Kelly, Natalie

    2017-06-01

    Aerial surveys are conducted for various fauna to assess abundance, distribution, and habitat use over large spatial scales. They are traditionally conducted using light aircraft with observers recording sightings in real time. Unmanned Aerial Vehicles (UAVs) offer an alternative with many potential advantages, including eliminating human risk. To be effective, this emerging platform needs to provide detection rates of animals comparable to traditional methods. UAVs can also acquire new types of information, and this new data requires a reevaluation of traditional analyses used in aerial surveys; including estimating the probability of detecting animals. We conducted 17 replicate UAV surveys of humpback whales (Megaptera novaeangliae) while simultaneously obtaining a 'census' of the population from land-based observations, to assess UAV detection probability. The ScanEagle UAV, carrying a digital SLR camera, continuously captured images (with 75% overlap) along transects covering the visual range of land-based observers. We also used ScanEagle to conduct focal follows of whale pods (n = 12, mean duration = 40 min), to assess a new method of estimating availability. A comparison of the whale detections from the UAV to the land-based census provided an estimated UAV detection probability of 0.33 (CV = 0.25; incorporating both availability and perception biases), which was not affected by environmental covariates (Beaufort sea state, glare, and cloud cover). According to our focal follows, the mean availability was 0.63 (CV = 0.37), with pods including mother/calf pairs having a higher availability (0.86, CV = 0.20) than those without (0.59, CV = 0.38). The follows also revealed (and provided a potential correction for) a downward bias in group size estimates from the UAV surveys, which resulted from asynchronous diving within whale pods, and a relatively short observation window of 9 s. We have shown that UAVs are an effective alternative to
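
    A back-of-envelope sketch of the decomposition such surveys use, taking the numbers quoted in the abstract: the overall detection probability is the product of availability and perception probabilities, and abundance is corrected by dividing counts by the overall detection probability. The implied perception value and the raw count below are derived or assumed here, not figures reported by the study:

```python
# Detection probability decomposition and abundance correction.
p_detect = 0.33       # UAV detections / land-based census (from abstract)
p_available = 0.63    # availability from focal follows (from abstract)
p_perception = p_detect / p_available   # implied, not directly reported
print(f"implied perception probability: {p_perception:.2f}")

raw_count = 40        # hypothetical whales counted in imagery
print(f"corrected abundance: {raw_count / p_detect:.0f}")
```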

  17. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors, in the case of DNA where N is 4 and the components of the probability vector are the frequency of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method of iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non

  18. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Probability shapes perceptual precision: A study in orientation estimation.

    PubMed

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." © 2015 APA, all rights reserved.

  20. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  1. Anomalous scaling in an age-dependent branching model.

    PubMed

    Keller-Schmidt, Stephanie; Tuğrul, Murat; Eguíluz, Víctor M; Hernández-García, Emilio; Klemm, Konstantin

    2015-02-01

    We introduce a one-parametric family of tree growth models, in which branching probabilities decrease with branch age τ as τ^(-α). Depending on the exponent α, the scaling of tree depth with tree size n displays a transition between the logarithmic scaling of random trees and an algebraic growth. At the transition (α=1) tree depth grows as (log n)². This anomalous scaling is in good agreement with the trend observed in evolution of biological species, thus providing a theoretical support for age-dependent speciation and associating it to the occurrence of a critical point.

  2. Finite-size scaling for discontinuous nonequilibrium phase transitions

    NASA Astrophysics Data System (ADS)

    de Oliveira, Marcelo M.; da Luz, M. G. E.; Fiore, Carlos E.

    2018-06-01

    A finite-size scaling theory, originally developed only for transitions to absorbing states [Phys. Rev. E 92, 062126 (2015), 10.1103/PhysRevE.92.062126], is extended to distinct sorts of discontinuous nonequilibrium phase transitions. Expressions for quantities such as response functions, reduced cumulants, and equal area probability distributions are derived from phenomenological arguments. Irrespective of system details, all these quantities scale with the volume, establishing the dependence on size. The approach generality is illustrated through the analysis of different models. The present results are a relevant step in trying to unify the scaling behavior description of nonequilibrium transition processes.

  3. Probability of Detection Study on Impact Damage to Honeycomb Composite Structure using Thermographic Inspection

    NASA Technical Reports Server (NTRS)

    Hodge, Andrew J.; Walker, James L., II

    2008-01-01

    A probability of detection study was performed for the detection of impact damage using flash heating infrared thermography on a full scale honeycomb composite structure. The honeycomb structure was an intertank structure from a previous NASA technology demonstration program. The intertank was fabricated from IM7/8552 carbon fiber/epoxy facesheets and aluminum honeycomb core. The intertank was impacted in multiple locations with a range of impact energies utilizing a spherical indenter. In a single blind study, the intertank was inspected with thermography before and after impact damage was incurred. Following thermographic inspection, several impact sites were sectioned from the intertank and cross-sectioned for microscopic comparisons of NDE detection and actual damage incurred. The study concluded that thermographic inspection was a good method of detecting delamination damage incurred by impact. The 90/95 confidence level on the probability of detection was close to the impact energy at which delaminations were first observed through cross-sectional analysis.

  4. Probability theory, not the very guide of life.

    PubMed

    Juslin, Peter; Nilsson, Håkan; Winman, Anders

    2009-10-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.
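
    The article's central claim can be reproduced in a short simulation: with noisy component probabilities, a linear additive combination of P(A) and P(B) can approach the accuracy of the normative multiplicative rule for conjunctions. The noise level and additive weights below are assumptions of this sketch (the weights are the least-squares linear fit of ab on the unit square), not the authors' simulation settings:

```python
# Additive vs multiplicative integration of noisy probability knowledge.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
p_a, p_b = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
truth = p_a * p_b                                   # independent conjunction

noise = 0.15
na = np.clip(p_a + rng.normal(0, noise, n), 0, 1)   # noisy knowledge of P(A)
nb = np.clip(p_b + rng.normal(0, noise, n), 0, 1)

mult = na * nb                       # normative multiplicative integration
add = 0.5 * (na + nb) - 0.25         # linear additive integration

print("RMSE multiplicative:", round(float(np.sqrt(np.mean((mult - truth) ** 2))), 4))
print("RMSE linear additive:", round(float(np.sqrt(np.mean((add - truth) ** 2))), 4))
```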

  5. A discussion on the origin of quantum probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holik, Federico; Sáenz, Manuel

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox's method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases.

  6. Mapping the Relative Probability of Common Toad Occurrence in Terrestrial Lowland Farm Habitat in the United Kingdom.

    PubMed

    Salazar, Rosie D; Montgomery, Robert A; Thresher, Sarah E; Macdonald, David W

    2016-01-01

    The common toad (Bufo bufo) is of increasing conservation concern in the United Kingdom (UK) due to dramatic population declines occurring in the past century. Many of these population declines coincided with reductions in both terrestrial and aquatic habitat availability and quality and have been primarily attributed to the effect of agricultural land conversion (of natural and semi-natural habitats to arable and pasture fields) and pond drainage. However, there is little evidence available to link habitat availability with common toad population declines, especially when examined at a broad landscape scale. Assessing such patterns of population declines at the landscape scale, for instance, require an understanding of how this species uses terrestrial habitat. We intensively studied the terrestrial resource selection of a large population of common toads in Oxfordshire, England, UK. Adult common toads were fitted with passive integrated transponder (PIT) tags to allow detection in the terrestrial environment using a portable PIT antenna once toads left the pond and before going into hibernation (April/May-October 2012 and 2013). We developed a population-level resource selection function (RSF) to assess the relative probability of toad occurrence in the terrestrial environment by collecting location data for 90 recaptured toads. The predicted relative probability of toad occurrence for this population was greatest in wooded habitat near to water bodies; relative probability of occurrence declined dramatically > 50 m from these habitats. Toads also tended to select habitat near to their breeding pond and toad occurrence was negatively related to urban environments.

  7. The impossibility of probabilities

    NASA Astrophysics Data System (ADS)

    Zimmerman, Peter D.

    2017-11-01

    This paper discusses the problem of assigning probabilities to the likelihood of nuclear terrorism events, in particular examining the limitations of using Bayesian priors for this purpose. It suggests an alternate approach to analyzing the threat of nuclear terrorism.

  8. Spatial probability of soil water repellency in an abandoned agricultural field in Lithuania

    NASA Astrophysics Data System (ADS)

    Pereira, Paulo; Misiūnė, Ieva

    2015-04-01

    Water repellency is a natural soil property with implications for infiltration, erosion and plant growth. It depends on soil texture, type and amount of organic matter, fungi, microorganisms, and vegetation cover (Doerr et al., 2000). Human activities such as agriculture can have implications for soil water repellency (SWR) due to tillage and the addition of organic compounds and fertilizers (Blanco-Canqui and Lal, 2009; Gonzalez-Penaloza et al., 2012). It is also assumed that SWR has a high small-scale variability (Doerr et al., 2000). The aim of this work is to study the spatial probability of SWR in an abandoned field, testing several geostatistical methods: Ordinary Kriging (OK), Simple Kriging (SK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area is located near the Vilnius urban area (54°49' N, 25°22' E, 104 m a.s.l.) in Lithuania (Pereira and Oliva, 2013). An experimental plot of 21 m² (7 m x 3 m) was designed. Inside this area, SWR was measured every 50 cm using the water drop penetration time (WDPT) test (Wessel, 1998). A total of 105 points were measured. The probability of SWR was classified from 0 (no probability) to 1 (high probability). The accuracy of each method was assessed with cross validation; the best interpolation method was the one with the lowest Root Mean Square Error (RMSE). The results showed that the most accurate probability method was SK (RMSE=0.436), followed by DK (RMSE=0.437), IK (RMSE=0.448), PK (RMSE=0.452) and OK (RMSE=0.537). Significant differences were identified among probability tests (Kruskal-Wallis test = 199.76, p<0.001). On average, the probability of SWR was highest with OK (0.58±0.08), followed by PK (0.49±0.18), SK (0.32±0.16), DK (0.32±0.15) and IK (0.31±0.16). The most accurate probability methods predicted a lower probability of SWR in the studied plot. The spatial distribution of SWR differed according to the tested technique. Simple Kriging, DK, IK and PK methods
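
    The cross-validation criterion used above to rank the kriging variants can be sketched generically: leave-one-out prediction scored by RMSE, lower being better. Inverse-distance weighting stands in for the kriging predictors here, since a kriging fit would also require a variogram model; the plot geometry matches the abstract but the repellency values are random:

```python
# Leave-one-out RMSE for a spatial interpolator (IDW as a stand-in).
import numpy as np

def idw_predict(xy_train, z_train, xy_test, power=2.0):
    d = np.linalg.norm(xy_train - xy_test, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * z_train) / np.sum(w)

def loo_rmse(xy, z):
    idx = np.arange(len(z))
    errors = [idw_predict(xy[idx != i], z[idx != i], xy[i]) - z[i]
              for i in idx]
    return float(np.sqrt(np.mean(np.square(errors))))

rng = np.random.default_rng(5)
xy = rng.uniform([0, 0], [7, 3], size=(105, 2))  # 105 points, 7 m x 3 m plot
z = (rng.random(105) < 0.4).astype(float)        # binary repellency indicator
print(f"LOO RMSE = {loo_rmse(xy, z):.3f}")
```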

  9. Surprisingly rational: probability theory plus noise explains biases in judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.
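
    The "probability theory plus noise" account lends itself to a few-line simulation: judgments are read out from noisy recall of events, so the expected judged probability is d + (1 - 2d)p, compressing extremes toward 0.5 (conservatism). The noise rate and sample sizes below are illustrative, not the article's fitted values:

```python
# Probability theory plus noise: random read-out flips with rate d.
import numpy as np

rng = np.random.default_rng(6)
d = 0.1   # probability that a recalled event is read out incorrectly

def judged(p_true: float, n_samples: int = 100) -> float:
    events = rng.random(n_samples) < p_true   # sampled event occurrences
    flips = rng.random(n_samples) < d         # read-out noise
    return float(np.mean(events ^ flips))     # proportion judged "true"

for p in (0.05, 0.25, 0.50, 0.75, 0.95):
    est = np.mean([judged(p) for _ in range(2000)])
    print(f"true {p:.2f} -> mean judged {est:.3f} (theory {d + (1 - 2*d)*p:.3f})")
```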

  10. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    USGS Publications Warehouse

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
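
    The annual probabilities quoted above follow from the exponential (memoryless) recurrence model: with mean repose time μ between eruptions, the probability of at least one event in the next Δt years is 1 - exp(-Δt/μ). The mean repose below is chosen to roughly reproduce the reported 1.4×10⁻⁴, not taken from the paper's chronology:

```python
# Annual eruption probability under an exponential recurrence model.
import math

def eruption_probability(mean_repose_years: float, horizon_years: float = 1.0) -> float:
    return 1.0 - math.exp(-horizon_years / mean_repose_years)

mu = 7000.0   # assumed mean interval between eruptions (years)
print(f"P(eruption within 1 yr):   {eruption_probability(mu):.1e}")
print(f"P(eruption within 100 yr): {eruption_probability(mu, 100):.1e}")
```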

  11. Characteristic length of the knotting probability revisited

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-09-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA.

  12. Probability Analysis of the Wave-Slamming Pressure Values of the Horizontal Deck with Elastic Support

    NASA Astrophysics Data System (ADS)

    Zuo, Weiguang; Liu, Ming; Fan, Tianhui; Wang, Pengtao

    2018-06-01

    This paper presents the probability distribution of the slamming pressure from an experimental study of regular wave slamming on an elastically supported horizontal deck. The time series of the slamming pressure during the wave impact were first obtained through statistical analyses on experimental data. The exceeding probability distribution of the maximum slamming pressure peak and distribution parameters were analyzed, and the results show that the exceeding probability distribution of the maximum slamming pressure peak accords with the three-parameter Weibull distribution. Furthermore, the range and relationships of the distribution parameters were studied. The sum of the location parameter D and the scale parameter L was approximately equal to 1.0, and the exceeding probability was more than 36.79% when the random peak was equal to the sample average during the wave impact. The variation of the distribution parameters and slamming pressure under different model conditions were comprehensively presented, and the parameter values of the Weibull distribution of wave-slamming pressure peaks were different due to different test models. The parameter values were found to decrease due to the increased stiffness of the elastic support. The damage criterion of the structure model caused by the wave impact was initially discussed, and the structure model was destroyed when the average slamming time was greater than a certain value during the duration of the wave impact. The conclusions of the experimental study were then described.
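
    The exceedance (survival) function of the three-parameter Weibull distribution fitted here is P(X > x) = exp(−((x − location)/scale)^shape) for x above the location parameter. A minimal sketch with hypothetical parameter values (not the paper's fits), showing why the exceedance at the sample average stays above 1/e ≈ 36.79% when the shape parameter is at least 1:

    ```python
    import math

    def weibull3_exceedance(x, shape, loc, scale):
        """P(X > x) for a three-parameter Weibull distribution."""
        if x <= loc:
            return 1.0
        return math.exp(-((x - loc) / scale) ** shape)

    # Hypothetical parameters for illustration only:
    shape, loc, scale = 1.2, 0.1, 0.9
    x_mean = loc + scale * math.gamma(1 + 1 / shape)   # distribution mean
    print(f"exceedance at the mean: "
          f"{weibull3_exceedance(x_mean, shape, loc, scale):.4f}")
    # exp(-Gamma(1 + 1/shape)**shape) >= 1/e ≈ 0.3679 whenever shape >= 1.
    ```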

  13. Features and prevalence of patients with probable adult attention deficit hyperactivity disorder who request treatment for cocaine use disorders.

    PubMed

    Pérez de Los Cobos, José; Siñol, Núria; Puerta, Carmen; Cantillano, Vanessa; López Zurita, Cristina; Trujols, Joan

    2011-01-30

    To characterize those patients with probable adult attention deficit hyperactivity disorder (ADHD) who ask for treatment of cocaine use disorders; to estimate the prevalence of probable adult ADHD among these patients. This is a cross-sectional and multi-center study performed at outpatient resources of 12 addiction treatment centers in Spain. Participants were treatment-seeking primary cocaine abusers recruited consecutively at one center and through convenience sampling at the other centers. Assessments included semi-structured clinical interview focused on Diagnostic and Statistical Manual of Mental Disorders, fourth edition (DSM-IV) ADHD criteria adapted to adulthood, and the Wender-Utah Rating Scale (WURS) for screening childhood history of ADHD according to patients. Probable adult ADHD was diagnosed when patients met DSM-IV criteria of ADHD in adulthood and scored WURS>32. All participants were diagnosed with current cocaine dependence (n=190) or abuse (n=15). Patients with probable adult ADHD, compared with patients having no lifetime ADHD, were more frequently male, reported higher impulsivity, and began to use nicotine, alcohol, cannabis, or cocaine earlier. Before starting the current treatment, patients with probable adult ADHD also showed higher cocaine craving for the previous day, less frequent cocaine abstinence throughout the previous week, and higher use of cocaine and tobacco during the previous month. Impulsivity and male gender were the only independent risk factors of probable adult ADHD in a logistic regression analysis. The prevalence of probable adult ADHD was 20.5% in the sub-sample of patients consecutively recruited (n=78). A diagnosis of probable adult ADHD strongly distinguishes among treatment-seeking cocaine primary abusers regarding past and current key aspects of their addictive disorder; one-fifth of these patients present with probable adult ADHD. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.

  14. Predicting redox conditions in groundwater at a regional scale

    USGS Publications Warehouse

    Tesoriero, Anthony J.; Terziotti, Silvia; Abrams, Daniel B.

    2015-01-01

    Defining the oxic-suboxic interface is often critical for determining pathways for nitrate transport in groundwater and to streams at the local scale. Defining this interface on a regional scale is complicated by the spatial variability of reaction rates. The probability of oxic groundwater in the Chesapeake Bay watershed was predicted by relating dissolved O2 concentrations in groundwater samples to indicators of residence time and/or electron donor availability using logistic regression. Variables that describe surficial geology, position in the flow system, and soil drainage were important predictors of oxic water. The probability of encountering oxic groundwater at a 30 m depth and the depth to the bottom of the oxic layer were predicted for the Chesapeake Bay watershed. The influence of depth to the bottom of the oxic layer on stream nitrate concentrations and time lags (i.e., time period between land application of nitrogen and its effect on streams) are illustrated using model simulations for hypothetical basins. Regional maps of the probability of oxic groundwater should prove useful as indicators of groundwater susceptibility and stream susceptibility to contaminant sources derived from groundwater.
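
    The core computation described here, relating predictor variables to the probability of a binary outcome (oxic or not) via logistic regression, can be sketched as below. The predictor names and training data are synthetic stand-ins, not the study's dataset:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for the paper's predictors: a surficial-geology
    # permeability index, depth below the water table (m), a drainage score.
    n = 500
    X = np.column_stack([rng.normal(0, 1, n),      # geology index
                         rng.uniform(0, 60, n),    # depth (m)
                         rng.normal(0, 1, n)])     # soil-drainage score
    logit = 1.5 * X[:, 0] - 0.08 * X[:, 1] + 0.7 * X[:, 2]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = oxic (DO above a cutoff)

    model = LogisticRegression().fit(X, y)
    print("P(oxic at 30 m):", model.predict_proba([[0.5, 30.0, -0.2]])[0, 1])
    ```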

  15. A study of the vacancy loop formation probability in Ni-Cu and Ag-Pd alloys. [50-keV Kr sup + ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smalinskas, K.; Chen, Gengsheng; Haworth, J.

    1992-04-01

    The molten-zone model of vacancy loop formation from a displacement cascade predicts that the loop formation probability should scale with the melting temperature. To investigate this possibility the vacancy loop formation probability has been determined in a series of Cu-Ni and Ag-Pd alloys. The irradiations were performed at room temperature with 50 keV Kr+ ions and the resulting damage structure was examined by using transmission electron microscopy. In the Cu-Ni alloy series, the change in loop formation probability with increasing Ni concentration was complex, and at low- and high- nickel concentrations, the defect yield did not change in the predictedmore » manner. The defect yield was higher in the Cu-rich alloys than in the Ni-rich alloys. In the Ag-Pd alloy the change in the loop formation probability followed more closely the change in melting temperature, but no simple relationship was determined.« less

  16. Establishment probability in newly founded populations.

    PubMed

    Gusset, Markus; Müller, Michael S; Grimm, Volker

    2012-06-20

    Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where −ln(1 − P_0(t)) is plotted against time t. This plot is based on the equation P_0(t) = 1 − c_1 e^(−ω_1 t), which relates the probability of extinction by time t, P_0(t), to two constants: c_1 describes the probability of a newly founded population reaching the established phase, whereas ω_1 describes the population's probability of extinction per short time interval once established. For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the "Wissel plot" with the y-axis, which is −ln(c_1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
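
    A minimal sketch of the "Wissel plot" procedure, using simulated extinction times as a stand-in for the wild-dog model output; the constants c_1 and ω_1 are recovered from the intercept and slope of the linear part:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    c1, omega1 = 0.8, 0.01                  # assumed establishment prob. and rate
    n = 5000
    early = rng.random(n) >= c1             # populations that fail to establish
    t_ext = np.where(early,
                     rng.exponential(2.0, n),          # quick early extinctions
                     rng.exponential(1 / omega1, n))   # established, slow decline

    t = np.linspace(50, 300, 26)            # the linear part of the plot
    P0 = np.array([(t_ext <= ti).mean() for ti in t])
    y = -np.log(1 - P0)                     # Wissel-plot ordinate
    slope, intercept = np.polyfit(t, y, 1)
    print(f"omega_1 ≈ {slope:.4f},  c_1 ≈ {np.exp(-intercept):.3f}")
    # Here c_1 < 1, so the intercept -ln(c_1) is positive; per the abstract,
    # the established phase is reached when the extrapolated intercept is negative.
    ```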

  17. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR 1316.10, Food and Drugs; Drug Enforcement Administration, Department of Justice; Administrative Functions, Practices, and Procedures; Administrative Inspections; § 1316.10 Administrative probable cause. If the judge...

  18. WPE: A Mathematical Microworld for Learning Probability

    ERIC Educational Resources Information Center

    Kiew, Su Ding; Sam, Hong Kian

    2006-01-01

    In this study, the researchers developed the Web-based Probability Explorer (WPE), a mathematical microworld and investigated the effectiveness of the microworld's constructivist learning environment in enhancing the learning of probability and improving students' attitudes toward mathematics. This study also determined the students' satisfaction…

  19. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax) and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
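
    For the Onemax case the distribution can be written down exactly: with current fitness f on a string of length n, the fitness after mutation is f − X + Y with X ~ Bin(f, p) and Y ~ Bin(n − f, p). A short sketch of that computation (our illustration of the idea, not the paper's Krawtchouk-polynomial machinery):

    ```python
    from math import comb

    def onemax_mutation_pmf(n, f, p):
        """Exact pmf of Onemax fitness after uniform bit-flip mutation.
        n: string length, f: current fitness, p: per-bit flip probability."""
        pmf = {}
        for x in range(f + 1):                        # ones flipped to zeros
            px = comb(f, x) * p**x * (1 - p)**(f - x)
            for y in range(n - f + 1):                # zeros flipped to ones
                py = comb(n - f, y) * p**y * (1 - p)**(n - f - y)
                pmf[f - x + y] = pmf.get(f - x + y, 0.0) + px * py
        return pmf

    pmf = onemax_mutation_pmf(n=20, f=15, p=0.05)
    print({k: round(v, 4) for k, v in sorted(pmf.items()) if v > 1e-3})
    # Each probability is a polynomial in p, as the paper proves in general.
    ```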

  20. Probability calculations for three-part mineral resource assessments

    USGS Publications Warehouse

    Ellefsen, Karl J.

    2017-06-27

    Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations that are performed with computer software that is newly implemented. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely the assumptions inherent in the probability calculations. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely findings regarding the precision of the computations, the computation time, and the sensitivity of the calculation results to the input.
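
    The probability calculation at the heart of such an assessment can be sketched as a Monte Carlo aggregation: draw a deposit count from an elicited probability mass function, draw a tonnage for each deposit, and repeat. The inputs below are hypothetical (not USGS values), and the sketch makes explicit one of the checkable assumptions the abstract alludes to (tonnages independent of the count):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical elicited pmf for the number of undiscovered deposits,
    # and a hypothetical lognormal tonnage model:
    counts = np.array([0, 1, 2, 3, 4])
    count_probs = np.array([0.2, 0.3, 0.25, 0.15, 0.1])
    mu, sigma = np.log(5e5), 1.0

    n_trials = 100_000
    totals = np.empty(n_trials)
    for i in range(n_trials):
        n_dep = rng.choice(counts, p=count_probs)
        totals[i] = rng.lognormal(mu, sigma, n_dep).sum()  # tonnage per deposit

    print("P(total resource > 1 Mt):", (totals > 1e6).mean())
    print("median total tonnage (t):", round(np.median(totals)))
    ```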

  1. Non-native salmonids affect amphibian occupancy at multiple spatial scales

    USGS Publications Warehouse

    Pilliod, David S.; Hossack, Blake R.; Bahls, Peter F.; Bull, Evelyn L.; Corn, Paul Stephen; Hokit, Grant; Maxell, Bryce A.; Munger, James C.; Wyrick, Aimee

    2010-01-01

    Aim The introduction of non-native species into aquatic environments has been linked with local extinctions and altered distributions of native species. We investigated the effect of non-native salmonids on the occupancy of two native amphibians, the long-toed salamander (Ambystoma macrodactylum) and Columbia spotted frog (Rana luteiventris), across three spatial scales: water bodies, small catchments and large catchments. Location Mountain lakes at ≥ 1500 m elevation were surveyed across the northern Rocky Mountains, USA. Methods We surveyed 2267 water bodies for amphibian occupancy (based on evidence of reproduction) and fish presence between 1986 and 2002 and modelled the probability of amphibian occupancy at each spatial scale in relation to habitat availability and quality and fish presence. Results After accounting for habitat features, we estimated that A. macrodactylum was 2.3 times more likely to breed in fishless water bodies than in water bodies with fish. Ambystoma macrodactylum also was more likely to occupy small catchments where none of the water bodies contained fish than in catchments where at least one water body contained fish. However, the probability of salamander occupancy in small catchments was also influenced by habitat availability (i.e. the number of water bodies within a catchment) and suitability of remaining fishless water bodies. We found no relationship between fish presence and salamander occupancy at the large-catchment scale, probably because of increased habitat availability. In contrast to A. macrodactylum, we found no relationship between fish presence and R. luteiventris occupancy at any scale. Main conclusions Our results suggest that the negative effects of non-native salmonids can extend beyond the boundaries of individual water bodies and increase A. macrodactylum extinction risk at landscape scales. We suspect that niche overlap between non-native fish and A. macrodactylum at higher elevations in the northern Rocky

  2. Phonotactic Probability Effects in Children Who Stutter

    PubMed Central

    Anderson, Julie D.; Byrd, Courtney T.

    2008-01-01

    Purpose The purpose of this study was to examine the influence of phonotactic probability, the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS), as well as to determine whether it has an effect on the type of stuttered disfluency produced. Method A 500+ word language sample was obtained from 19 CWS. Each stuttered word was randomly paired with a fluently produced word that closely matched it in grammatical class, word length, familiarity, word and neighborhood frequency, and neighborhood density. Phonotactic probability values were obtained for the stuttered and fluent words from an online database. Results Phonotactic probability did not have a significant influence on the overall susceptibility of words to stuttering, but it did impact the type of stuttered disfluency produced. Specifically, single-syllable word repetitions were significantly lower in phonotactic probability than fluently produced words, part-word repetitions, and sound prolongations. Conclusions In general, the differential impact of phonotactic probability on the type of stuttering-like disfluency produced by young CWS provides some support for the notion that different disfluency types may originate in the disruption of different levels of processing. PMID:18658056

  3. Independent events in elementary probability theory

    NASA Astrophysics Data System (ADS)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1, E_2, …, E_n are jointly independent then any two events A and B built in finitely many steps from two disjoint subsets of E_1, E_2, …, E_n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.

  4. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (Pd) for fluctuating and non-fluctuating targets has been deduced. Also, a comparison of the Pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is at least 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided for illustrating the performance.
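
    While the paper's Pd equations are specific to the impacting filter, the underlying threshold decision can be illustrated with generic Gaussian statistics: for a non-fluctuating amplitude peak A in noise of standard deviation σ, Pd = Q((T − A)/σ) and Pfa = Q(T/σ) for threshold T. A sketch under those (assumed) Gaussian conditions:

    ```python
    from math import erf, sqrt

    def Q(x):
        """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
        return 0.5 * (1 - erf(x / sqrt(2)))

    sigma, T = 1.0, 3.0                      # noise level and decision threshold
    print(f"false-alarm probability: {Q(T / sigma):.4f}")
    for A in [2.0, 4.0, 6.0]:                # amplitude peaks after the filter
        print(f"A = {A}: Pd = {Q((T - A) / sigma):.3f}")
    ```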

  5. Uncertainty and probability for branching selves

    NASA Astrophysics Data System (ADS)

    Lewis, Peter J.

    Everettian accounts of quantum mechanics entail that people branch; every possible result of a measurement actually occurs, and I have one successor for each result. Is there room for probability in such an account? The prima facie answer is no; there are no ontic chances here, and no ignorance about what will happen. But since any adequate quantum mechanical theory must make probabilistic predictions, much recent philosophical labor has gone into trying to construct an account of probability for branching selves. One popular strategy involves arguing that branching selves introduce a new kind of subjective uncertainty. I argue here that the variants of this strategy in the literature all fail, either because the uncertainty is spurious, or because it is in the wrong place to yield probabilistic predictions. I conclude that uncertainty cannot be the ground for probability in Everettian quantum mechanics.

  6. Predicting probability of occurrence and factors affecting distribution and abundance of three Ozark endemic crayfish species at multiple spatial scales

    USGS Publications Warehouse

    Nolen, Matthew S.; Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Wagner, Brian K.

    2014-01-01

    We found that a range of environmental variables were important in predicting crayfish distribution and abundance at multiple spatial scales, and that their importance was species-, response-variable-, and scale-dependent. We would encourage others to examine the influence of spatial scale on species distribution and abundance patterns.

  7. Multistage variable probability forest volume inventory. [the Defiance Unit of the Navajo Nation

    NASA Technical Reports Server (NTRS)

    Anderson, J. E. (Principal Investigator)

    1979-01-01

    An inventory scheme based on the use of computer-processed LANDSAT MSS data was developed. Output from the inventory scheme provides an estimate of the standing net saw timber volume of a major timber species on a selected forested area of the Navajo Nation. Such estimates are based on the values of parameters currently used for scaled sawlog conversion to mill output. The multistage variable probability sampling appears capable of producing estimates which compare favorably with those produced using conventional techniques. In addition, the reduction in time, manpower, and overall costs lends it to numerous applications.

  8. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
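
    The computation such a tool performs can be illustrated in a few lines: estimate P(response | stressor exceeds a threshold) from paired observations. The data below are synthetic and purely illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic paired observations: stressor level and an impairment flag.
    stressor = rng.lognormal(0, 1, 1000)
    p_impair = 1 / (1 + np.exp(-2 * (np.log(stressor) - 0.5)))
    impaired = rng.random(1000) < p_impair

    def conditional_probability(threshold):
        """P(impairment | stressor >= threshold)."""
        exceed = stressor >= threshold
        return impaired[exceed].mean() if exceed.any() else float("nan")

    for thr in [0.5, 1.0, 2.0, 4.0]:
        print(f"P(impaired | stressor >= {thr}): {conditional_probability(thr):.2f}")
    ```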

  9. Scaling of load in communications networks.

    PubMed

    Narayan, Onuttom; Saniee, Iraj

    2010-09-01

    We show that the load at each node in a preferential attachment network scales as a power of the degree of the node. For a network whose degree distribution is p(k) ∼ k^(−γ), we show that the load is l(k) ∼ k^η with η = γ − 1, implying that the probability distribution for the load is p(l) ∼ 1/l² independent of γ. The results are obtained through scaling arguments supported by finite size scaling studies. They contradict earlier claims, but are in agreement with the exact solution for the special case of tree graphs. Results are also presented for real communications networks at the IP layer, using the latest available data. Our analysis of the data shows relatively poor power-law degree distributions as compared to the scaling of the load versus degree. This emphasizes the importance of the load in network analysis.
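
    The claimed scaling l(k) ∼ k^(γ−1) can be checked numerically on a small preferential attachment graph, using betweenness centrality as the load proxy; a sketch with networkx (illustrative parameters, and a far smaller graph than any real network):

    ```python
    import networkx as nx
    import numpy as np

    G = nx.barabasi_albert_graph(n=2000, m=3, seed=42)
    load = nx.betweenness_centrality(G)      # shortest-path load at each node
    deg = dict(G.degree())

    # Fit log(load) against log(degree), skipping zero-load peripheral nodes.
    k = np.array([deg[v] for v in G if load[v] > 0])
    l = np.array([load[v] for v in G if load[v] > 0])
    eta, _ = np.polyfit(np.log(k), np.log(l), 1)
    print(f"fitted exponent eta ≈ {eta:.2f}")  # BA graphs have gamma = 3, so eta ≈ 2
    ```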

  10. Imprecise Probability Methods for Weapons UQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Vander Wiel, Scott Alan

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  11. Quantum probability and Hilbert's sixth problem

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2018-04-01

    With the birth of quantum mechanics, the two disciplines that Hilbert proposed to axiomatize, probability and mechanics, became entangled and a new probabilistic model arose in addition to the classical one. Thus, to meet Hilbert's challenge, an axiomatization should account deductively for the basic features of all three disciplines. This goal was achieved within the framework of quantum probability. The present paper surveys the quantum probabilistic axiomatization. This article is part of the themed issue `Hilbert's sixth problem'.

  12. Quantum temporal probabilities in tunneling systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anastopoulos, Charis, E-mail: anastop@physics.upatras.gr; Savvidou, Ntina, E-mail: ksavvidou@physics.upatras.gr

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines 'classical' time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems. Highlights: • Present a general methodology for deriving temporal probabilities in tunneling systems. • Treatment applies to relativistic particles interacting through quantum fields. • Derive a new expression for tunneling time. • Identify new time parameters relevant to tunneling. • Propose a resolution of the superluminality paradox in tunneling.

  13. Exact Derivation of a Finite-Size Scaling Law and Corrections to Scaling in the Geometric Galton-Watson Process

    PubMed Central

    Corral, Álvaro; Garcia-Millan, Rosalba; Font-Clos, Francesc

    2016-01-01

    The theory of finite-size scaling explains how the singular behavior of thermodynamic quantities in the critical point of a phase transition emerges when the size of the system becomes infinite. Usually, this theory is presented in a phenomenological way. Here, we exactly demonstrate the existence of a finite-size scaling law for the Galton-Watson branching processes when the number of offsprings of each individual follows either a geometric distribution or a generalized geometric distribution. We also derive the corrections to scaling and the limits of validity of the finite-size scaling law away from the critical point. A mapping between branching processes and random walks allows us to establish that these results also hold for the latter case, for which the order parameter turns out to be the probability of hitting a distant boundary. PMID:27584596
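
    The object of study here, a Galton-Watson process with geometric offspring, is simple to simulate, and simulation makes the behavior near criticality visible. A minimal sketch (ours, with arbitrary parameters):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def survival_probability(mean_offspring, generations, trials=5000):
        """Monte Carlo P(population alive after `generations`) for a
        Galton-Watson process with geometric offspring on {0, 1, 2, ...}."""
        p = 1 / (1 + mean_offspring)         # parameter of the geometric law
        alive = 0
        for _ in range(trials):
            z = 1
            for _ in range(generations):
                if z == 0:
                    break
                # numpy's geometric is on {1, 2, ...}; shift to {0, 1, ...}
                z = rng.geometric(p, size=z).sum() - z
            alive += z > 0
        return alive / trials

    for m in [0.9, 1.0, 1.1]:                # sub-, exactly-, and super-critical
        print(f"mean offspring {m}: P(survive 50 gens) ≈ "
              f"{survival_probability(m, 50):.3f}")
    ```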

  14. Probability-based hazard avoidance guidance for planetary landing

    NASA Astrophysics Data System (ADS)

    Yuan, Xu; Yu, Zhengshi; Cui, Pingyuan; Xu, Rui; Zhu, Shengying; Cao, Menglong; Luan, Enjie

    2018-03-01

    Future landing and sample return missions on planets and small bodies will seek landing sites with high scientific value, which may be located in hazardous terrains. Autonomous landing in such hazardous terrains and highly uncertain planetary environments is particularly challenging. Onboard hazard avoidance ability is indispensable, and the algorithms must be robust to uncertainties. In this paper, a novel probability-based hazard avoidance guidance method is developed for landing in hazardous terrains on planets or small bodies. By regarding the lander state as probabilistic, the proposed guidance algorithm exploits information on the uncertainty of lander position and calculates the probability of collision with each hazard. The collision probability serves as an accurate safety index, which quantifies the impact of uncertainties on the lander safety. Based on the collision probability evaluation, the state uncertainty of the lander is explicitly taken into account in the derivation of the hazard avoidance guidance law, which contributes to enhancing the robustness to the uncertain dynamics of planetary landing. The proposed probability-based method derives fully analytic expressions and does not require off-line trajectory generation. Therefore, it is appropriate for real-time implementation. The performance of the probability-based guidance law is investigated via a set of simulations, and the effectiveness and robustness under uncertainties are demonstrated.
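
    The key quantity in this guidance law, the probability of collision with a hazard given a probabilistic lander state, is easy to approximate by Monte Carlo even though the paper derives analytic expressions. A sketch with hypothetical numbers:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def collision_probability(mean, cov, hazard_center, hazard_radius, n=200_000):
        """Monte Carlo P(touchdown inside a circular hazard) for a lander
        whose horizontal position is Gaussian with the given mean and covariance."""
        pts = rng.multivariate_normal(mean, cov, size=n)
        return (np.linalg.norm(pts - hazard_center, axis=1) < hazard_radius).mean()

    # Hypothetical numbers (metres) for illustration:
    mean = np.array([0.0, 0.0])                        # predicted touchdown point
    cov = np.array([[25.0, 5.0], [5.0, 16.0]])         # navigation uncertainty
    p = collision_probability(mean, cov,
                              hazard_center=np.array([8.0, 3.0]), hazard_radius=4.0)
    print(f"collision probability for this hazard: {p:.4f}")
    # A guidance law can then steer to reduce the summed probability over hazards.
    ```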

  15. Introducing Disjoint and Independent Events in Probability.

    ERIC Educational Resources Information Center

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  16. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Ecosystem-scale plant hydraulic strategies inferred from remotely-sensed soil moisture

    NASA Astrophysics Data System (ADS)

    Bassiouni, M.; Good, S. P.; Higgins, C. W.

    2017-12-01

    Characterizing plant hydraulic strategies at the ecosystem scale is important to improve estimates of evapotranspiration and to understand ecosystem productivity and resilience. However, quantifying plant hydraulic traits beyond the species level is a challenge. The probability density function of soil moisture observations provides key information about the soil moisture states at which evapotranspiration is reduced by water stress. Here, an inverse Bayesian approach is applied to a standard bucket model of soil column hydrology forced with stochastic precipitation inputs. Through this approach, we are able to determine the soil moisture thresholds at which stomata are open or closed that are most consistent with observed soil moisture probability density functions. This research utilizes remotely-sensed soil moisture data to explore global patterns of ecosystem-scale plant hydraulic strategies. Results are complementary to literature values of measured hydraulic traits of various species in different climates and previous estimates of ecosystem-scale plant isohydricity. The presented approach provides a novel relation between plant physiological behavior and soil-water dynamics.

  18. Encephalopathy in an infant with infantile spasms: possible role of valproate toxicity

    PubMed Central

    Sivathanu, Shobhana; Sampath, Sowmya; Veerasamy, Madhubala; Sunderkumar, Satheeshkumar

    2014-01-01

    An infant presented with global developmental delay and infantile spasms. EEG was suggestive of hypsarrhythmia. She was started on sodium valproate, clonazepam and adrenocorticotropic hormone injection. After an initial improvement the child developed vomiting, altered sensorium and increase in frequency of seizures suggestive of encephalopathy. Valproate-induced hyperammonaemia or hepatic encephalopathy was considered and the drug was withheld following which there was a dramatic improvement. Paradoxically, the liver function tests and serum ammonia were normal. However, a complete reversal of encephalopathy, on withdrawal of the drug, strongly suggested an adverse drug reaction (ADR) due to valproic acid. Marginal elevation of serum valproic acid prompted us to use the Naranjo ADR probability score to confirm the diagnosis. This case highlights the fact that valproate toxicity can manifest with normal liver function and serum ammonia levels. This is the youngest reported case with this rare form of valproate-induced encephalopathy. PMID:24810446
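
    Scoring a case on the Naranjo scale itself is mechanical once the ten item answers are fixed: the item scores are summed and the total mapped to a causality category (≥9 definite, 5-8 probable, 1-4 possible, ≤0 doubtful). A sketch, with the item answers below chosen as a plausible hypothetical rather than this case report's actual scoring:

    ```python
    def naranjo_category(item_scores):
        """Sum the ten Naranjo item scores and map the total to the
        standard causality category."""
        total = sum(item_scores)
        if total >= 9:
            return total, "definite"
        if total >= 5:
            return total, "probable"
        if total >= 1:
            return total, "possible"
        return total, "doubtful"

    # Hypothetical answers for a case like this one: prior reports (+1), event
    # after the drug (+2), improvement on withdrawal (+1), no rechallenge (0),
    # no alternative cause (+2), raised drug level (+1), others scored 0.
    scores = [1, 2, 1, 0, 2, 0, 1, 0, 0, 0]
    print(naranjo_category(scores))   # -> (7, 'probable')
    ```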

  19. Skill of Ensemble Seasonal Probability Forecasts

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk

    2010-05-01

    In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead-time of 14 months. The nature of this skill is discussed, and possible applications are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations is discussed, and it is demonstrated that the inclusion of "zero-skill" models in the long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
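
    Kernel dressing and climatology blending, the two post-processing steps named here, amount to a simple mixture density. A minimal sketch assuming Gaussian kernels and a Gaussian climatology, with made-up member values:

    ```python
    import numpy as np

    def dressed_blend_pdf(x, members, width, climatology, alpha):
        """p(x) = alpha * Gaussian-kernel mixture over ensemble members
        + (1 - alpha) * climatological density."""
        z = (x - members[:, None]) / width
        kernel = np.exp(-0.5 * z**2).sum(axis=0) / (
            len(members) * width * np.sqrt(2 * np.pi))
        return alpha * kernel + (1 - alpha) * climatology(x)

    # Made-up ensemble of temperature anomalies (degC), for illustration:
    members = np.array([0.6, 0.9, 1.1, 0.4, 0.8, 1.3, 0.7])
    clim = lambda x: np.exp(-0.5 * (x / 0.8)**2) / (0.8 * np.sqrt(2 * np.pi))
    x = np.linspace(-2.0, 3.0, 6)
    print(dressed_blend_pdf(x, members, width=0.3, climatology=clim, alpha=0.7))
    ```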

  20. On probability-possibility transformations

    NASA Technical Reports Server (NTRS)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  1. Predators, environment and host characteristics influence the probability of infection by an invasive castrating parasite.

    PubMed

    Gehman, Alyssa-Lois M; Grabowski, Jonathan H; Hughes, A Randall; Kimbro, David L; Piehler, Michael F; Byers, James E

    2017-01-01

    Not all hosts, communities or environments are equally hospitable for parasites. Direct and indirect interactions between parasites and their predators, competitors and the environment can influence variability in host exposure, susceptibility and subsequent infection, and these influences may vary across spatial scales. To determine the relative influences of abiotic, biotic and host characteristics on probability of infection across both local and estuary scales, we surveyed the oyster reef-dwelling mud crab Eurypanopeus depressus and its parasite Loxothylacus panopaei, an invasive castrating rhizocephalan, in a hierarchical design across >900 km of the southeastern USA. We quantified the density of hosts, predators of the parasite and host, the host's oyster reef habitat, and environmental variables that might affect the parasite either directly or indirectly on oyster reefs within 10 estuaries throughout this biogeographic range. Our analyses revealed that both between and within estuary-scale variation and host characteristics influenced L. panopaei prevalence. Several additional biotic and abiotic factors were positive predictors of infection, including predator abundance and the depth of water inundation over reefs at high tide. We demonstrate that in addition to host characteristics, biotic and abiotic community-level variables both serve as large-scale indicators of parasite dynamics.

  2. Probability: A Matter of Life and Death

    ERIC Educational Resources Information Center

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
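
    The basic life-table computation is a short recursion: given one-year death probabilities q_x, the expectancy at each age follows by working backwards, crediting survivors a full year and decedents half a year on average. A toy sketch (a real table runs to age 100+):

    ```python
    def life_expectancies(qx):
        """Life expectancy at each age from one-year death probabilities q_x,
        within the span of the (possibly truncated) table."""
        n = len(qx)
        ex = [0.0] * n
        for age in range(n - 1, -1, -1):
            nxt = ex[age + 1] if age + 1 < n else 0.0
            ex[age] = (1 - qx[age]) * (1.0 + nxt) + qx[age] * 0.5
        return ex

    qx = [0.005, 0.001, 0.001, 0.001, 0.002]   # toy values for ages 0-4
    print([round(e, 2) for e in life_expectancies(qx)])
    ```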

  3. Tobramycin-induced hepatotoxicity.

    PubMed

    Nisly, Sarah A; Ray, Shaunta' M; Moye, Robert A

    2007-12-01

    To report a case of tobramycin-induced hepatotoxicity. A 20-year-old female was hospitalized for treatment of Pseudomonas aeruginosa bacteremia and osteomyelitis. Empiric intravenous antibiotic therapy with piperacillin/tazobactam, vancomycin, and ciprofloxacin was started, and based on the results of culture and sensitivity testing, was changed to intravenous ceftazidime and tobramycin 70 mg every 8 hours on hospital day 3. Liver enzyme levels then increased over days 3-6. Tests for hepatitis A, B, and C were all nonreactive, and HIV testing was negative. On day 8, therapy was changed from ceftazidime to piperacillin/tazobactam and the tobramycin dose was increased to 100 mg every 8 hours. Due to a continued increase in total bilirubin, aspartate aminotransferase, and alanine aminotransferase, piperacillin/tazobactam was discontinued and aztreonam was started on day 10. All antibiotics were stopped on day 12 and the elevated liver parameters began to decrease. Aztreonam and ciprofloxacin were restarted on day 16, and most laboratory test results returned to baseline levels by day 19; total bilirubin and alkaline phosphatase decreased to lower than baseline values. This case illustrates a possible occurrence of tobramycin-induced hepatotoxicity. Liver enzymes rose when tobramycin therapy was initiated, markedly increased when the tobramycin dose was increased, then resolved upon discontinuation of therapy. Other medication-related causes were ruled out by temporal relationship or rechallenge (aztreonam). Use of the Naranjo probability scale indicated a possible relationship between hepatotoxicity and tobramycin therapy. Other adverse reaction scales specific for evaluation of drug-induced liver disease were also used. Both the Council for International Organizations of Medical Sciences and Maria and Victorino scales indicated a probable likelihood of tobramycin-induced hepatotoxicity. This patient was not rechallenged with tobramycin due to the highly suggestive

  4. Universal scaling laws in metro area election results.

    PubMed

    Bokányi, Eszter; Szállási, Zoltán; Vattay, Gábor

    2018-01-01

    We explain the anomaly of election results between large cities and rural areas in terms of urban scaling in the 1948-2016 US elections and in the 2016 EU referendum of the UK. The scaling curves are all universal and depend on a single parameter only, and one of the parties always shows superlinear scaling and drives the process, while the sublinear exponent of the other party is merely the consequence of probability conservation. Based on the recently developed model of urban scaling, we give a microscopic model of voter behavior in which we replace diversity characterizing humans in creative aspects with social diversity and tolerance. The model can also predict new political developments such as the fragmentation of the left and the immigration paradox.

  5. On the nonlinearity of spatial scales in extreme weather attribution statements

    NASA Astrophysics Data System (ADS)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos

    2018-04-01

    In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event—some of these factors often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.

  6. On the nonlinearity of spatial scales in extreme weather attribution statements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah

    In the context of continuing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event—some of these factors often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.

  7. On the nonlinearity of spatial scales in extreme weather attribution statements

    DOE PAGES

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; ...

    2017-06-17

    In the context of continuing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event—some of these factors often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.

  8. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    PubMed

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  9. An Evaluation of the High-Probability Instruction Sequence with and without Programmed Reinforcement for Compliance with High-Probability Instructions

    ERIC Educational Resources Information Center

    Zuluaga, Carlos A.; Normand, Matthew P.

    2008-01-01

    We assessed the effects of reinforcement and no reinforcement for compliance to high-probability (high-p) instructions on compliance to low-probability (low-p) instructions using a reversal design. For both participants, compliance with the low-p instruction increased only when compliance with high-p instructions was followed by reinforcement.…

  10. Zero field reversal probability in thermally assisted magnetization reversal

    NASA Astrophysics Data System (ADS)

    Prasetya, E. B.; Utari; Purnama, B.

    2017-11-01

    This paper discusses the zero-field reversal probability in thermally assisted magnetization reversal (TAMR). The appearance of a reversal probability at zero field was investigated through micromagnetic simulation by solving the stochastic Landau-Lifshitz-Gilbert (LLG) equation. A perpendicular-anisotropy magnetic dot of 50×50×20 nm³ is considered as a single-cell storage element of magnetic random access memory (MRAM). Thermally assisted magnetization reversal was performed by cooling during the writing process from near the Curie point to room temperature, over 20 runs with different randomly magnetized initial states. The results show that the reversal probability under zero magnetic field decreased with increasing energy barrier. A zero-field switching probability of 55% was attained for an energy barrier of 60 k_B T, which corresponds to a switching field of 150 Oe, and the reversal probability became zero at an energy barrier of 2348 k_B T.

  11. Path probability of stochastic motion: A functional approach

    NASA Astrophysics Data System (ADS)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock price in finance.

  12. Rethinking the learning of belief network probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musick, R.

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
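
    The "rote learning" being improved upon is just conditional-probability-table estimation from counts. A minimal sketch of that baseline with Laplace smoothing (our illustration; the paper's point is that learned models can replace it):

    ```python
    from collections import Counter
    from itertools import product

    def learn_cpt(data, child, parents, arity, alpha=1.0):
        """Estimate P(child | parents) from records (dicts of variable -> value)
        by smoothed counting."""
        counts = Counter((tuple(r[p] for p in parents), r[child]) for r in data)
        cpt = {}
        for pa in product(*(range(arity[p]) for p in parents)):
            denom = (sum(counts[(pa, v)] for v in range(arity[child]))
                     + alpha * arity[child])
            for v in range(arity[child]):
                cpt[(pa, v)] = (counts[(pa, v)] + alpha) / denom
        return cpt

    # Toy data for a binary node C with binary parents A and B:
    data = [{"A": 0, "B": 1, "C": 1}, {"A": 1, "B": 1, "C": 0},
            {"A": 0, "B": 0, "C": 0}, {"A": 0, "B": 1, "C": 1}]
    cpt = learn_cpt(data, "C", ("A", "B"), {"A": 2, "B": 2, "C": 2})
    print(cpt[((0, 1), 1)])   # P(C = 1 | A = 0, B = 1) = 0.75 after smoothing
    ```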

  13. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z²: x + y even, 0 ≤ y ≤ n, −y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and a site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β_g. An analysis based on the Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.

  14. Critically Evaluated Energy Levels, Spectral Lines, Transition Probabilities, and Intensities of Neutral Vanadium (V i)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saloman, Edward B.; Kramida, Alexander

    2017-08-01

    The energy levels, observed spectral lines, and transition probabilities of the neutral vanadium atom, V I, have been compiled. Also included are values for some forbidden lines that may be of interest to the astrophysical community. Experimental Landé g-factors and leading percentage compositions for the levels are included where available, as well as wavelengths calculated from the energy levels (Ritz wavelengths). Wavelengths are reported for 3985 transitions, and 549 energy levels are determined. The observed relative intensities normalized to a common scale are provided.

  15. Nonparametric Estimation of the Probability of Ruin.

    DTIC Science & Technology

    1985-02-01

    Nonparametric Estimation of the Probability of Ruin. Edward W. Frees, Mathematics Research Center, University of Wisconsin-Madison, February 1985 (MRC Technical Summary Report).

  16. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    NASA Astrophysics Data System (ADS)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  17. Activity-dependent regulation of release probability at excitatory hippocampal synapses: a crucial role of FMRP in neurotransmission

    PubMed Central

    Wang, Xiao-Sheng; Peng, Chun-Zi; Cai, Wei-Jun; Xia, Jian; Jin, Daozhong; Dai, Yuqiao; Luo, Xue-Gang; Klyachko, Vitaly A.; Deng, Pan-Yue

    2014-01-01

    Transcriptional silencing of the Fmr1 gene encoding fragile X mental retardation protein (FMRP) causes Fragile X Syndrome (FXS), the most common form of inherited intellectual disability and the leading genetic cause of autism. FMRP has been suggested to play important roles in regulating neurotransmission and short-term synaptic plasticity at excitatory hippocampal and cortical synapses. However, the origins and the mechanisms of these FMRP actions remain incompletely understood, and the role of FMRP in regulating synaptic release probability and presynaptic function remains debated. Here we used variance-mean analysis and peak scaled nonstationary variance analysis to examine changes in both pre- and postsynaptic parameters during repetitive activity at excitatory CA3-CA1 hippocampal synapses in a mouse model of FXS. Our analyses revealed that loss of FMRP did not affect the basal release probability or basal synaptic transmission, but caused an abnormally elevated release probability specifically during repetitive activity. These abnormalities were not accompanied by changes in EPSC kinetics, quantal size or postsynaptic AMPA receptor conductance. Our results thus indicate that FMRP regulates neurotransmission at excitatory hippocampal synapses specifically during repetitive activity via modulation of release probability in a presynaptic manner. Our study suggests that FMRP function in regulating neurotransmitter release is an activity-dependent phenomenon that may contribute to the pathophysiology of FXS. PMID:24646437
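
    Variance-mean analysis, the first method named here, fits the binomial-release parabola var = qI − I²/N to the mean current I and variance measured at several release probabilities, yielding the quantal size q and the number of release sites N. A sketch on synthetic data (illustrative values, not the paper's recordings):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic EPSC amplitudes from a binomial release model with quantal
    # size q = 10 pA and N = 20 sites, at four release probabilities:
    q_true, N_true = 10.0, 20
    means, variances = [], []
    for p_r in [0.2, 0.4, 0.6, 0.8]:
        amps = q_true * rng.binomial(N_true, p_r, size=400)
        means.append(amps.mean())
        variances.append(amps.var())

    # Fit var = q*I - I**2/N by least squares in the parameters (q, 1/N).
    A = np.column_stack([means, -np.square(means)])
    q_hat, invN = np.linalg.lstsq(A, np.array(variances), rcond=None)[0]
    print(f"estimated q ≈ {q_hat:.1f} pA, N ≈ {1 / invN:.0f} sites")
    ```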

  18. Probability misjudgment, cognitive ability, and belief in the paranormal.

    PubMed

    Musch, Jochen; Ehrenberg, Katja

    2002-05-01

    According to the probability misjudgment account of paranormal belief (Blackmore & Troscianko, 1985), believers in the paranormal tend to wrongly attribute remarkable coincidences to paranormal causes rather than chance. Previous studies have shown that belief in the paranormal is indeed positively related to error rates in probabilistic reasoning. General cognitive ability could account for a relationship between these two variables without assuming a causal role of probabilistic reasoning in the forming of paranormal beliefs, however. To test this alternative explanation, a belief in the paranormal scale (BPS) and a battery of probabilistic reasoning tasks were administered to 123 university students. Confirming previous findings, a significant correlation between BPS scores and error rates in probabilistic reasoning was observed. This relationship disappeared, however, when cognitive ability as measured by final examination grades was controlled for. Lower cognitive ability correlated substantially with belief in the paranormal. This finding suggests that differences in general cognitive performance rather than specific probabilistic reasoning skills provide the basis for paranormal beliefs.

  19. Fine-scale characteristics of interplanetary sector boundaries

    NASA Technical Reports Server (NTRS)

    Behannon, K. W.; Neubauer, F. M.; Barnstoff, H.

    1980-01-01

    The structure of the interplanetary sector boundaries observed by Helios 1 within sector transition regions was studied. Such regions consist of intermediate (nonspiral) average field orientations in some cases, as well as a number of large angle directional discontinuities (DD's) on the fine scale (time scales of 1 hour or less). Such DD's are found to be more similar to tangential than rotational discontinuities, to be oriented on average more nearly perpendicular than parallel to the ecliptic plane, to be accompanied usually by a large dip (approximately 80%) in B and, with a most probable thickness of 3 x 10(4) km, to be significantly thicker than those previously studied. It is hypothesized that the observed structures represent multiple traversals of the global heliospheric current sheet due to local fluctuations in the position of the sheet. There is evidence that such fluctuations are sometimes produced by wavelike motions or surface corrugations of scale length 0.05-0.1 AU superimposed on the large scale structure.

  20. Mapping fire probability and severity in a Mediterranean area using different weather and fuel moisture scenarios

    NASA Astrophysics Data System (ADS)

    Arca, B.; Salis, M.; Bacciu, V.; Duce, P.; Pellizzaro, G.; Ventura, A.; Spano, D.

    2009-04-01

    Although in many countries lightning is the main cause of ignition, in the Mediterranean Basin forest fires are predominantly ignited by arson or by human negligence. The fire season peaks coincide with extreme weather conditions (mainly strong winds, hot temperatures, low atmospheric water vapour content) and high tourist presence. Many studies have reported that in the Mediterranean Basin the projected impacts of climate change will cause greater weather variability and extreme weather conditions, with drier and hotter summers and heat waves. At the long-term scale, climate change could affect the fuel load and the dead/live fuel ratio, and therefore could change vegetation flammability. At the short-term scale, the increase of extreme weather events could directly affect fuel water status, and it could increase large fire occurrence. In this context, detecting the areas characterized by both high probability of large fire occurrence and high fire severity could represent an important component of fire management planning. In this work we compared several fire probability and severity maps (fire occurrence, rate of spread, fireline intensity, flame length) obtained for a study area located in North Sardinia, Italy, using the FlamMap simulator (USDA Forest Service, Missoula). FlamMap computes the potential fire behaviour characteristics over a defined landscape for given weather, wind and fuel moisture data. Different weather and fuel moisture scenarios were tested to predict the potential impact of climate change on fire parameters. The study area, characterized by a mosaic of urban areas, protected areas, and other areas subject to anthropogenic disturbances, is mainly composed of fire-prone Mediterranean maquis. The input themes needed to run FlamMap were provided as 10-m grids; the wind data, obtained using a computational fluid-dynamic model, were provided as a gridded file with a resolution of 50 m. The analysis revealed high fire probability and severity in

  1. Public attitudes toward stuttering in Turkey: probability versus convenience sampling.

    PubMed

    Ozdemir, R Sertan; St Louis, Kenneth O; Topbaş, Seyhun

    2011-12-01

    A Turkish translation of the Public Opinion Survey of Human Attributes-Stuttering (POSHA-S) was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. A convenience sample of adults in Eskişehir, Turkey was compared with two replicates of a school-based, probability cluster sampling scheme. The two replicates of the probability sampling scheme yielded similar demographic samples, both of which were different from the convenience sample. Components of subscores on the POSHA-S were significantly different in more than half of the comparisons between convenience and probability samples, indicating important differences in public attitudes. If POSHA-S users intend to generalize to specific geographic areas, results of this study indicate that probability sampling is a better research strategy than convenience sampling. The reader will be able to: (1) discuss the difference between convenience sampling and probability sampling; (2) describe a school-based probability sampling scheme; and (3) describe differences in POSHA-S results from convenience sampling versus probability sampling. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Multiple Streaming and the Probability Distribution of Density in Redshift Space

    NASA Astrophysics Data System (ADS)

    Hui, Lam; Kofman, Lev; Shandarin, Sergei F.

    2000-07-01

    We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude (σ_l ≲ 1). Moreover, unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated.

  3. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
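
    A minimal sketch of the kind of update rule the abstract describes, assuming a tabular state space and a delta rule driven by the state prediction error; the learning rate and uniform initialization below are illustrative choices, not taken from the paper:

      import numpy as np

      def learn_transition_matrix(state_seq, n_states, alpha=0.1):
          """Estimate P(s'|s) online with a delta rule driven by the
          state prediction error (observed outcome minus current estimate)."""
          P = np.full((n_states, n_states), 1.0 / n_states)  # uniform prior
          for s, s_next in zip(state_seq[:-1], state_seq[1:]):
              target = np.zeros(n_states)
              target[s_next] = 1.0                # observed next state
              P[s] += alpha * (target - P[s])     # prediction-error update
          return P  # rows stay normalized: the update is a convex mix

      # Usage: a two-state chain; the estimate approaches the true matrix
      rng = np.random.default_rng(0)
      true_P = np.array([[0.3, 0.7], [0.4, 0.6]])
      seq = [0]
      for _ in range(20000):
          seq.append(rng.choice(2, p=true_P[seq[-1]]))
      print(learn_transition_matrix(seq, 2))

    With an appropriately decaying learning rate the estimate converges exactly, which matches the abstract's caveat that complete learning requires an appropriate learning rate.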

  4. Probability in the Many-Worlds Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Vaidman, Lev

    It is argued that, although in the Many-Worlds Interpretation of quantum mechanics there is no "probability" for an outcome of a quantum experiment in the usual sense, we can understand why we have an illusion of probability. The explanation involves: (a) A "sleeping pill" gedanken experiment which makes correspondence between an illegitimate question: "What is the probability of an outcome of a quantum measurement?" with a legitimate question: "What is the probability that `I' am in the world corresponding to that outcome?"; (b) A gedanken experiment which splits the world into several worlds which are identical according to some symmetry condition; and (c) Relativistic causality, which together with (b) explain the Born rule of standard quantum mechanics. The Quantum Sleeping Beauty controversy and "caring measure" replacing probability measure are discussed.

  5. The transition probability and the probability for the left-most particle's position of the q-totally asymmetric zero range process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korhonen, Marko; Lee, Eunghyun

    2014-01-15

    We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required to use the Bethe ansatz, and the resulting model is the q-boson model by Sasamoto and Wadati [“Exact results for one-dimensional totally asymmetric diffusion models,” J. Phys. A 31, 6057–6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin [“Macdonald processes,” Probab. Theory Relat. Fields (to be published)]. We find the explicit formula of the transition probability of the q-TAZRP via the Bethe ansatz. By using the transition probability we find the probability distribution of the left-most particle's position at time t. To find the probability for the left-most particle's position we find a new identity corresponding to the identity for the asymmetric simple exclusion process by Tracy and Widom [“Integral formulas for the asymmetric simple exclusion process,” Commun. Math. Phys. 279, 815–844 (2008)]. For the initial state in which all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by the contour integral of a determinant.

  6. Identification of probabilities.

    PubMed

    Vitányi, Paul M B; Chater, Nick

    2017-02-01

    Within psychology, neuroscience and artificial intelligence, there has been increasing interest in the proposal that the brain builds probabilistic models of sensory and linguistic input: that is, to infer a probabilistic model from a sample. The practical problems of such inference are substantial: the brain has limited data and restricted computational resources. But there is a more fundamental question: is the problem of inferring a probabilistic model from a sample possible even in principle? We explore this question and find some surprisingly positive and general results. First, for a broad class of probability distributions characterized by computability restrictions, we specify a learning algorithm that will almost surely identify a probability distribution in the limit given a finite i.i.d. sample of sufficient but unknown length. This is similarly shown to hold for sequences generated by a broad class of Markov chains, subject to computability assumptions. The technical tool is the strong law of large numbers. Second, for a large class of dependent sequences, we specify an algorithm which identifies in the limit a computable measure for which the sequence is typical, in the sense of Martin-Löf (there may be more than one such measure). The technical tool is the theory of Kolmogorov complexity. We analyze the associated predictions in both cases. We also briefly consider special cases, including language learning, and wider theoretical implications for psychology.
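
    The i.i.d. case leans on the strong law of large numbers; a toy sketch of that ingredient (empirical frequencies over a finite alphabet, not the paper's algorithm for general computable distributions) follows:

      import numpy as np

      rng = np.random.default_rng(1)
      true_p = np.array([0.5, 0.3, 0.2])     # "unknown" distribution, 3 symbols
      sample = rng.choice(3, size=100_000, p=true_p)

      # By the strong law of large numbers the empirical frequencies
      # converge to true_p almost surely as the sample length grows.
      for n in (100, 1_000, 100_000):
          print(n, np.bincount(sample[:n], minlength=3) / n)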

  7. Switching probability of all-perpendicular spin valve nanopillars

    NASA Astrophysics Data System (ADS)

    Tzoufras, M.

    2018-05-01

    In all-perpendicular spin valve nanopillars the probability density of the free-layer magnetization is independent of the azimuthal angle and its evolution equation simplifies considerably compared to the general, nonaxisymmetric geometry. Expansion of the time-dependent probability density to Legendre polynomials enables analytical integration of the evolution equation and yields a compact expression for the practically relevant switching probability. This approach is valid when the free layer behaves as a single-domain magnetic particle and it can be readily applied to fitting experimental data.
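
    A sketch of the Legendre-expansion step for an axisymmetric density over x = cos(theta), assuming NumPy; the toy density and truncation order are illustrative and not the solution of the free-layer evolution equation:

      import numpy as np
      from numpy.polynomial import legendre

      # Expand an axisymmetric density rho(x), x = cos(theta), in Legendre
      # polynomials; orthogonality gives c_l = (2l+1)/2 * integral(rho * P_l).
      x, dx = np.linspace(-1.0, 1.0, 4001, retstep=True)
      rho = np.exp(3.0 * x)
      rho /= (rho * dx).sum()                 # toy normalized density

      coeffs = [(2 * l + 1) / 2 * (rho * legendre.Legendre.basis(l)(x) * dx).sum()
                for l in range(8)]
      rho_approx = legendre.legval(x, coeffs)
      print(np.abs(rho - rho_approx).max())   # truncation error of the expansion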

  8. Epidemic Threshold in Structured Scale-Free Networks

    NASA Astrophysics Data System (ADS)

    Eguíluz, Víctor M.; Klemm, Konstantin

    2002-08-01

    We analyze the spreading of viruses in scale-free networks with high clustering and degree correlations, as found in the Internet graph. For the susceptible-infected-susceptible model of epidemics the prevalence undergoes a phase transition at a finite threshold of the transmission probability. Comparing with the absence of a finite threshold in networks with purely random wiring, our result suggests that high clustering (modularity) and degree correlations protect scale-free networks against the spreading of viruses. We introduce and verify a quantitative description of the epidemic threshold based on the connectivity of the neighborhoods of the hubs.
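
    For intuition, a minimal discrete-time SIS simulation, assuming the networkx library and an unclustered Barabási-Albert graph; the clustered, degree-correlated topologies the paper actually analyzes would need a different generator:

      import networkx as nx
      import numpy as np

      def sis_prevalence(G, beta, mu=1.0, steps=200, seed=0):
          """Discrete-time SIS: each infected node transmits along each edge
          with probability beta per step and recovers with probability mu."""
          rng = np.random.default_rng(seed)
          infected = set(int(v) for v in
                         rng.choice(G.number_of_nodes(), size=10, replace=False))
          for _ in range(steps):
              new_inf = set()
              for i in infected:
                  for j in G.neighbors(i):
                      if j not in infected and rng.random() < beta:
                          new_inf.add(j)
              infected = {i for i in infected if rng.random() >= mu} | new_inf
              if not infected:
                  break
          return len(infected) / G.number_of_nodes()

      G = nx.barabasi_albert_graph(5000, m=3, seed=0)
      for beta in (0.02, 0.05, 0.1, 0.2):
          print(beta, sis_prevalence(G, beta))  # prevalence vs transmission prob.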

  9. Modelling of spatial contaminant probabilities of occurrence of chlorinated hydrocarbons in an urban aquifer.

    PubMed

    Greis, Tillman; Helmholz, Kathrin; Schöniger, Hans Matthias; Haarstrick, Andreas

    2012-06-01

    In this study, a 3D urban groundwater model is presented which serves for the calculation of multispecies contaminant transport in the subsurface at the regional scale. The total model consists of two submodels, a groundwater flow model and a reactive transport model, and is validated against field data. The model equations are solved by the finite element method. A sensitivity analysis is carried out to perform parameter identification of flow, transport and reaction processes. Building on the latter, stochastic variation of flow, transport, and reaction input parameters and Monte Carlo simulation are used to calculate probabilities of pollutant occurrence in the domain. These probabilities could help determine future spots of contamination and their measure of damages. Application and validation are demonstrated for a contaminated site in Braunschweig (Germany), where a vast plume of chlorinated ethenes pollutes the groundwater. With respect to field application, the methods used for modelling prove to be feasible and helpful tools for assessing monitored natural attenuation (MNA) and the risk that might be reduced by remediation actions.
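
    Generically, the probability-of-occurrence step is a Monte Carlo exceedance count: draw uncertain inputs, rerun the transport model, and tally threshold crossings. Everything below (the stand-in model, parameter distributions, and threshold) is hypothetical rather than the calibrated Braunschweig model:

      import numpy as np

      rng = np.random.default_rng(42)

      def concentration(k_decay, velocity, dispersivity):
          """Hypothetical stand-in for one run of a calibrated flow-and-
          reactive-transport model evaluated at a receptor point."""
          return 50.0 * np.exp(-k_decay * 100.0 / velocity) * (1.0 + dispersivity)

      threshold = 10.0          # illustrative regulatory limit, e.g. ug/L
      n_runs = 10_000
      hits = 0
      for _ in range(n_runs):
          k = rng.lognormal(mean=np.log(0.01), sigma=0.5)   # decay rate
          v = rng.lognormal(mean=np.log(0.5), sigma=0.3)    # seepage velocity
          a = rng.uniform(0.0, 0.2)                         # dispersivity factor
          hits += concentration(k, v, a) > threshold
      print("P(exceedance) =", hits / n_runs)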

  10. Multifractal analysis with the probability density function at the three-dimensional Anderson transition.

    PubMed

    Rodriguez, Alberto; Vasquez, Louella J; Römer, Rudolf A

    2009-03-13

    The probability density function (PDF) for critical wave function amplitudes is studied in the three-dimensional Anderson model. We present a formal expression between the PDF and the multifractal spectrum f(alpha) in which the role of finite-size corrections is properly analyzed. We show the non-Gaussian nature and the existence of a symmetry relation in the PDF. From the PDF, we extract information about f(alpha) at criticality such as the presence of negative fractal dimensions and the possible existence of termination points. A PDF-based multifractal analysis is shown to be a valid alternative to the standard approach based on the scaling of inverse participation ratios.

  11. Transition probabilities of Br II

    NASA Technical Reports Server (NTRS)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  12. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    PubMed

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use ± impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
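
    The literature-derived reference values come from the odds-likelihood form of Bayes' theorem; a sketch with illustrative numbers (not taken from the study's vignettes):

      def post_test_probability(pre_test_p, likelihood_ratio):
          """Bayes' theorem in odds form: post-odds = pre-odds * LR."""
          pre_odds = pre_test_p / (1.0 - pre_test_p)
          post_odds = pre_odds * likelihood_ratio
          return post_odds / (1.0 + post_odds)

      # Illustrative: pre-test probability 10%; a positive test with
      # sensitivity 0.9 and specificity 0.8 gives LR+ = 0.9 / (1 - 0.8).
      print(post_test_probability(0.10, 0.9 / (1.0 - 0.8)))   # ~ 0.33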

  13. Prevalence rate, predictors and long-term course of probable posttraumatic stress disorder after major trauma: a prospective cohort study

    PubMed Central

    2012-01-01

    Background: Among trauma patients relatively high prevalence rates of posttraumatic stress disorder (PTSD) have been found. To identify opportunities for prevention and early treatment, predictors and course of PTSD need to be investigated. Long-term follow-up studies of injury patients may help gain more insight into the course of PTSD and subgroups at risk for PTSD. The aim of our long-term prospective cohort study was to assess the prevalence rate and predictors, including pre-hospital trauma care (assistance of physician-staffed Emergency Medical Services (EMS) at the scene of the accident), of probable PTSD in a sample of major trauma patients at one and two years after injury. The second aim was to assess the long-term course of probable PTSD following injury. Methods: A prospective cohort study was conducted of 332 major trauma patients with an Injury Severity Score (ISS) of 16 or higher. We used data from the hospital trauma registry and self-assessment surveys that included the Impact of Event Scale (IES) to measure probable PTSD symptoms. An IES score of 35 or higher was used as an indication of the presence of probable PTSD. Results: One year after injury, measurements of 226 major trauma patients were obtained (response rate 68%). Of these patients 23% had an IES score of 35 or higher, indicating probable PTSD. At two years after trauma the prevalence rate of probable PTSD was 20%. Female gender and co-morbid disease were strong predictors of probable PTSD one year following injury, whereas minor to moderate head injury and injury of the extremities (AIS less than 3) were strong predictors of this disorder at two-year follow-up. Of the patients with probable PTSD at one-year follow-up, 79% had persistent PTSD symptoms a year later. Conclusions: Up to two years after injury, probable PTSD is highly prevalent in a population of patients with major trauma. The majority of patients suffered from prolonged effects of PTSD, underlining the importance of prevention

  14. Ditching Tests of a 1/18-Scale Model of the Lockheed Constellation Airplane

    NASA Technical Reports Server (NTRS)

    Fisher, Lloyd J.; Morris, Garland J.

    1948-01-01

    Tests were made of a 1/18-scale dynamically similar model of the Lockheed Constellation airplane to investigate its ditching characteristics and proper ditching technique. Scale-strength bottoms were used to reproduce probable damage to the fuselage. The model was landed in calm water at the Langley tank no. 2 monorail. Various landing attitudes, speeds, and fuselage configurations were simulated. The behavior of the model was determined from visual observations, by recording the longitudinal decelerations, and by taking motion pictures of the ditchings. Data are presented in tabular form, sequence photographs, and time-history deceleration curves. It was concluded that the airplane should be ditched at a medium nose-high landing attitude with the landing flaps full down. The airplane will probably make a deep run with heavy spray and may even dive slightly. The fuselage will be damaged and leak substantially but in calm water probably will not flood rapidly. Maximum longitudinal decelerations in a calm-water ditching will be about 4g.

  15. Statin-Associated Muscle-Related Adverse Effects: A Case Series of 354 Patients

    PubMed Central

    Cham, Stephanie; Evans, Marcella A.; Denenberg, Julie O.; Golomb, Beatrice A.

    2016-01-01

    Study Objective: To characterize the properties and natural history of 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitor (statin)-associated muscle-related adverse effects (MAEs). Design: Patient-targeted postmarketing adverse-effect surveillance approach coupling survey design with an open-ended narrative. Setting: University-affiliated health care system. Subjects: Three hundred fifty-four patients (age range 34–86 yrs) who self-reported muscle-related problems associated with statin therapy. Measurements and Main Results: Patients with perceived statin-associated MAEs completed a survey assessing statin drugs and dosages; characteristics of the MAEs; time course of onset, resolution, or recurrence; and impact on quality of life (QOL). Cases were assessed for putative drug adverse-effect causality by using the Naranjo adverse drug reaction probability scale criteria and were evaluated for inclusion in groups for which mortality benefit with statins has been shown. Patients reported muscle pain (93%), fatigue (88%), and weakness (85%). Three hundred patients (85%) met literature criteria for probable or definite drug adverse-effect causality. Ninety-four percent of atorvastatin usages (240/255) generated MAEs versus 61% of lovastatin usages (38/62, p<0.0001). Higher potency statins reproduced MAEs in 100% of 39 rechallenges versus 73% (29/40) with lower potency rechallenges (p<0.01). Time course of onset after statin initiation varied (median 14 wks); some MAEs occurred after long-term symptom-free use. Recurrence with rechallenge had a significantly shorter latency to onset (median 2 wks). The MAEs adversely affected all assessed functional and QOL domains. Most patients with probable or definite MAEs were in categories for which available randomized controlled trial evidence shows no trend to all-cause mortality benefit with statin therapy. Conclusion: This study complements available information on the properties and natural history of statin

  16. Statin-associated muscle-related adverse effects: a case series of 354 patients.

    PubMed

    Cham, Stephanie; Evans, Marcella A; Denenberg, Julie O; Golomb, Beatrice A

    2010-06-01

    To characterize the properties and natural history of 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitor (statin)-associated muscle-related adverse effects (MAEs). Patient-targeted postmarketing adverse-effect surveillance approach coupling survey design with an open-ended narrative. University-affiliated health care system. Three hundred fifty-four patients (age range 34-86 yrs) who self-reported muscle-related problems associated with statin therapy. Patients with perceived statin-associated MAEs completed a survey assessing statin drugs and dosages; characteristics of the MAEs; time course of onset, resolution, or recurrence; and impact on quality of life (QOL). Cases were assessed for putative drug adverse-effect causality by using the Naranjo adverse drug reaction probability scale criteria and were evaluated for inclusion in groups for which mortality benefit with statins has been shown. Patients reported muscle pain (93%), fatigue (88%), and weakness (85%). Three hundred patients (85%) met literature criteria for probable or definite drug adverse-effect causality. Ninety-four percent of atorvastatin usages (240/255) generated MAEs versus 61% of lovastatin usages (38/62, p<0.0001). Higher potency statins reproduced MAEs in 100% of 39 rechallenges versus 73% (29/40) with lower potency rechallenges (p<0.01). Time course of onset after statin initiation varied (median 14 wks); some MAEs occurred after long-term symptom-free use. Recurrence with rechallenge had a significantly shorter latency to onset (median 2 wks). The MAEs adversely affected all assessed functional and QOL domains. Most patients with probable or definite MAEs were in categories for which available randomized controlled trial evidence shows no trend to all-cause mortality benefit with statin therapy. This study complements available information on the properties and natural history of statin-associated MAEs, affirming dose dependence and strong QOL impact. The data indicating a dose

  17. Acute NSAID-related transmural duodenitis and extensive duodenal ulceration.

    PubMed

    Hashash, Jana G; Atweh, Lamya A; Saliba, Teddy; Chakhachiro, Zaher; Al-Kutoubi, Aghiad; Tawil, Ayman; Barada, Kassem A

    2007-11-01

    A 40-year-old previously healthy white man presented to the emergency department at American University of Beirut Medical Center, Beirut, Lebanon, with severe upper abdominal pain of 36-hour duration. The pain started a few hours after the intake of a single tablet of tiaprofenic acid and became more intense after the intake of another tablet 24 hours later. He had no other symptoms. He had no prior upper gastrointestinal (GI) symptoms, ulcer disease, steroidal or nonsteroidal anti-inflammatory drug use, or ethanol intake. Physical examination revealed mild upper abdominal tenderness. Complete blood count, amylase, lipase, and liver function tests were unremarkable. Computed tomography of the abdomen showed marked thickening of the duodenal wall with surrounding mesenteric streaking. Upper GI endoscopy revealed extensive ulceration involving the duodenal bulb, apex, and proximal D2, as well as a few gastric erosions. Histopathologic examination of duodenal biopsy samples showed extensive epithelial cell necrosis and infiltration of the lamina propria with neutrophils and eosinophils. The patient responded well to rabeprazole 20 mg BID and remains well 5 months later. We performed a literature search of PubMed for all English-language articles published between January 1970 and present (June 2007) using the key words tiaprofenic acid, nonsteroidal anti-inflammatory drugs, NSAID, duodenitis, duodenal erosion, duodenal ulcer, gastritis, gastric erosion, gastric ulcer, or peptic ulcer. We reviewed all randomized controlled trials involving NSAIDs found using PubMed, with a focus on their GI adverse effects. Based on the PubMed search, there were no published reports of acute transmural duodenitis and complicated duodenal ulcers associated with short-term exposure to tiaprofenic acid or other NSAIDs. The Naranjo adverse drug reaction (ADR) probability scale was used and a score of 6 was obtained, indicating a probable ADR from tiaprofenic acid use. We report a patient

  18. Hemorrhagic cystitis in a patient receiving conventional doses of dacarbazine for metastatic malignant melanoma: case report and review of the literature.

    PubMed

    Mohammadianpanah, Mohammad; Shirazi, Mehdi; Mosalaei, Ahmad; Omidvari, Shapour; Ahmadloo, Niloofar

    2007-06-01

    Hemorrhagic cystitis is a potentially life-threatening complication in patients receiving cancer therapy. This urologic emergency is commonly associated with the chemotherapeutic use of oxazaphosphorine alkylating agents. This report describes a case of hemorrhagic cystitis associated with dacarbazine treatment. A 63-year-old man with asymptomatic metastatic malignant melanoma received 3 cycles of dacarbazine (600-850 mg/m2) monochemotherapy, each 3 weeks apart. Two weeks after the third cycle, he presented with gross hematuria and mild dysuria. Physical examination revealed no significant finding. Hematuria was confirmed by urinalysis, and urinary infection was excluded by repeated urine cultures. Ultrasonography revealed diffuse bladder wall thickening with no discrete mass or ulceration. Cystoscopy findings revealed generalized inflammation and edema of the mucosa of the bladder, confirming the diagnosis of hemorrhagic cystitis. The patient's gross hematuria continued for 2 weeks and then completely resolved with supportive care. Two weeks after complete resolution, the patient experienced 2 transient episodes of gross hematuria that lasted a few hours and subsided spontaneously. Dacarbazine is currently considered the standard first-line treatment in patients with advanced malignant melanoma. At standard prescribed doses (a single dose of 850-1000 mg/m2 or 250 mg/m2 for 5 days per cycle), dacarbazine is a reasonably well tolerated chemotherapeutic drug; nausea, vomiting, and myelosuppression are the most common adverse effects. Association of dacarbazine with hemorrhagic cystitis has not been reported previously (in a PubMed literature search from 1950-2006), and only 1 case report associates temozolomide (an analog of dacarbazine) with hemorrhagic cystitis. Based on the Naranjo adverse drug reactions probability scale, an objective assessment revealed dacarbazine to be a probable cause of hemorrhagic cystitis in this case. This case report suggests that

  19. Survival probabilities at spherical frontiers.

    PubMed

    Lavrentovich, Maxim O; Nelson, David R

    2015-06-01

    Motivated by tumor growth and spatial population genetics, we study the interplay between evolutionary and spatial dynamics at the surfaces of three-dimensional, spherical range expansions. We consider range expansion radii that grow with an arbitrary power law in time: R(t) = R0(1 + t/t*)^Θ, where Θ is a growth exponent, R0 is the initial radius, and t* is a characteristic time for the growth to be affected by the inflating geometry. We vary the parameters t* and Θ to capture a variety of possible growth regimes. Guided by recent results for two-dimensional inflating range expansions, we identify key dimensionless parameters that describe the survival probability of a mutant cell with a small selective advantage arising at the population frontier. Using analytical techniques, we calculate this probability for arbitrary Θ. We compare our results to simulations of linearly inflating expansions (Θ = 1, spherical Fisher-Kolmogorov-Petrovsky-Piscunov waves) and treadmilling populations (Θ = 0, with cells in the interior removed by apoptosis or a similar process). We find that mutations at linearly inflating fronts have survival probabilities enhanced by factors of 100 or more relative to mutations at treadmilling population frontiers. We also discuss the special properties of "marginally inflating" (Θ = 1/2) expansions. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
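
    The article develops these connections in Excel; the same chain, from Bernoulli sums through the binomial mean and variance to the central limit theorem, can be sketched in Python:

      import numpy as np

      rng = np.random.default_rng(3)
      n, p, trials = 50, 0.3, 100_000

      # A binomial variable is a sum of n independent Bernoulli(p) variables,
      # so its mean is n*p and its variance is n*p*(1-p).
      samples = rng.binomial(n, p, size=trials)
      print(samples.mean(), n * p)              # both ~ 15.0
      print(samples.var(), n * p * (1 - p))     # both ~ 10.5

      # Central limit theorem: the standardized sum is approximately N(0, 1),
      # so about 95% of draws fall within 1.96 standard deviations.
      z = (samples - n * p) / np.sqrt(n * p * (1 - p))
      print((np.abs(z) < 1.96).mean())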

  1. Anytime synthetic projection: Maximizing the probability of goal satisfaction

    NASA Technical Reports Server (NTRS)

    Drummond, Mark; Bresina, John L.

    1990-01-01

    A projection algorithm is presented for incremental control rule synthesis. The algorithm synthesizes an initial set of goal achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic. This set of control rules has a certain probability of satisfying the given goal. The probability is incrementally increased by synthesizing additional control rules to handle 'error' situations the execution system is likely to encounter when following the initial control rules. By using situation probabilities, the algorithm achieves a computationally effective balance between the limited robustness of triangle tables and the absolute robustness of universal plans.

  2. Universality of local dissipation scales in buoyancy-driven turbulence.

    PubMed

    Zhou, Quan; Xia, Ke-Qing

    2010-03-26

    We report an experimental investigation of the local dissipation scale field eta in turbulent thermal convection. Our results reveal two types of universality of eta. The first one is that, for the same flow, the probability density functions (PDFs) of eta are insensitive to turbulent intensity and large-scale inhomogeneity and anisotropy of the system. The second is that the small-scale dissipation dynamics in buoyancy-driven turbulence can be described by the same models developed for homogeneous and isotropic turbulence. However, the exact functional form of the PDF of the local dissipation scale is not universal with respect to different types of flows, but depends on the integral-scale velocity boundary condition, which is found to have an exponential, rather than Gaussian, distribution in turbulent Rayleigh-Bénard convection.

  3. Universal scaling laws in metro area election results

    PubMed Central

    Szállási, Zoltán; Vattay, Gábor

    2018-01-01

    We explain the anomaly of election results between large cities and rural areas in terms of urban scaling in the 1948–2016 US elections and in the 2016 EU referendum of the UK. The scaling curves are all universal and depend on a single parameter only, and one of the parties always shows superlinear scaling and drives the process, while the sublinear exponent of the other party is merely the consequence of probability conservation. Based on the recently developed model of urban scaling, we give a microscopic model of voter behavior in which we replace diversity characterizing humans in creative aspects with social diversity and tolerance. The model can also predict new political developments such as the fragmentation of the left and the immigration paradox. PMID:29470518

  4. Traits Without Borders: Integrating Functional Diversity Across Scales.

    PubMed

    Carmona, Carlos P; de Bello, Francesco; Mason, Norman W H; Lepš, Jan

    2016-05-01

    Owing to the conceptual complexity of functional diversity (FD), a multitude of different methods are available for measuring it, with most being operational at only a small range of spatial scales. This causes uncertainty in ecological interpretations and limits the potential to generalize findings across studies or compare patterns across scales. We solve this problem by providing a unified framework expanding on and integrating existing approaches. The framework, based on trait probability density (TPD), is the first to fully implement the Hutchinsonian concept of the niche as a probabilistic hypervolume in estimating FD. This novel approach could revolutionize FD-based research by allowing quantification of the various FD components from organismal to macroecological scales, and allowing seamless transitions between scales. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Off the scale: a new species of fish-scale gecko (Squamata: Gekkonidae: Geckolepis) with exceptionally large scales

    PubMed Central

    Daza, Juan D.; Köhler, Jörn; Vences, Miguel; Glaw, Frank

    2017-01-01

    The gecko genus Geckolepis, endemic to Madagascar and the Comoro archipelago, is taxonomically challenging. One reason is its members' ability to autotomize a large portion of their scales when grasped or touched, most likely to escape predation. Based on an integrative taxonomic approach including external morphology, morphometrics, genetics, pholidosis, and osteology, we here describe the first new species from this genus in 75 years: Geckolepis megalepis sp. nov. from the limestone karst of Ankarana in northern Madagascar. The new species has the largest known body scales of any gecko (both relatively and absolutely), which come off with exceptional ease. We provide a detailed description of the skeleton of the genus Geckolepis based on micro-Computed Tomography (micro-CT) analysis of the new species, the holotype of G. maculata, the recently resurrected G. humbloti, and a specimen belonging to an operational taxonomic unit (OTU) recently suggested to represent G. maculata. Geckolepis is characterized by highly mineralized, imbricated scales, paired frontals, and unfused subolfactory processes of the frontals, among other features. We identify diagnostic characters in the osteology of these geckos that help define our new species and show that the OTU assigned to G. maculata is probably not conspecific with it, leaving the taxonomic identity of this species unclear. We discuss possible reasons for the extremely enlarged scales of G. megalepis in the context of an anti-predator defence mechanism, and the future of Geckolepis taxonomy. PMID:28194313

  6. Tuned by experience: How orientation probability modulates early perceptual processing.

    PubMed

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-09-01

    Probable stimuli are more often and more quickly detected. While stimulus probability is known to affect decision-making, it can also be explained as a perceptual phenomenon. Using spatial gratings, we have previously shown that probable orientations are also more precisely estimated, even while participants remained naive to the manipulation. We conducted an electrophysiological study to investigate the effect that probability has on perception and visual-evoked potentials. In line with previous studies on oddballs and stimulus prevalence, low-probability orientations were associated with a greater late positive 'P300' component which might be related to either surprise or decision-making. However, the early 'C1' component, thought to reflect V1 processing, was dampened for high-probability orientations while later P1 and N1 components were unaffected. Exploratory analyses revealed a participant-level correlation between C1 and P300 amplitudes, suggesting a link between perceptual processing and decision-making. We discuss how these probability effects could be indicative of sharpening of neurons preferring the probable orientations, due either to perceptual learning, or to feature-based attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Theoretical Analysis of Rain Attenuation Probability

    NASA Astrophysics Data System (ADS)

    Roy, Surendra Kr.; Jha, Santosh Kr.; Jha, Lallan

    2007-07-01

    Satellite communication technologies are now highly developed, and high-quality, distance-independent services have expanded over a very wide area. As for the system design of the Hokkaido integrated telecommunications (HIT) network, it must first overcome outages of satellite links due to rain attenuation in Ka frequency bands. In this paper, a theoretical analysis of rain attenuation probability on a slant path is presented. The proposed formula is based on the Weibull distribution and incorporates recent ITU-R recommendations concerning the necessary rain rate and rain height inputs. The error behaviour of the model was tested against the rain attenuation prediction model recommended by ITU-R for a large number of experiments at different probability levels. The novel slant-path rain attenuation prediction model, compared to the ITU-R one, exhibits a similar behaviour at low time percentages and a better root-mean-square error performance for probability levels above 0.02%. The set of presented models has the advantage of implementation with little complexity and is considered useful for educational and back-of-the-envelope computations.
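
    A sketch of the Weibull exceedance idea underlying such models, P(A > a) = exp(-(a/scale)^shape); the parameter values below are placeholders, since the paper derives its coefficients from ITU-R rain-rate and rain-height inputs for the actual slant path:

      import numpy as np

      def exceedance_probability(a_dB, scale_dB, shape):
          """Weibull model: fraction of time rain attenuation exceeds a_dB."""
          return np.exp(-((a_dB / scale_dB) ** shape))

      # Placeholder parameters; the paper fits them from ITU-R inputs.
      for a in (1.0, 3.0, 5.0, 10.0):
          print(a, "dB:", exceedance_probability(a, scale_dB=2.0, shape=0.8))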

  8. Scaling behavior for random walks with memory of the largest distance from the origin

    NASA Astrophysics Data System (ADS)

    Serva, Maurizio

    2013-11-01

    We study a one-dimensional random walk with memory. The behavior of the walker is modified with respect to the simple symmetric random walk only when he or she is at the maximum distance ever reached from his or her starting point (home). In this case, having the choice to move farther or to move closer, the walker decides with different probabilities. If the probability of a forward step is higher than the probability of a backward step, the walker is bold, otherwise he or she is timorous. We investigate the asymptotic properties of this bold-timorous random walk, showing that the scaling behavior varies continuously from subdiffusive (timorous) to superdiffusive (bold). The scaling exponents are fully determined with a new mathematical approach based on a decomposition of the dynamics in active journeys (the walker is at the maximum distance) and lazy journeys (the walker is not at the maximum distance).
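
    A minimal simulation of the walk as described, with the bias applied only at the running maximum distance; the probability values are illustrative:

      import numpy as np

      def walk(n_steps, p_forward, seed=0):
          """Random walk biased only at its running maximum distance from
          home: step away with probability p_forward, else step back."""
          rng = np.random.default_rng(seed)
          x, max_d = 0, 0
          for _ in range(n_steps):
              if abs(x) == max_d and max_d > 0:
                  away = rng.random() < p_forward      # bold if p_forward > 1/2
                  x += int(np.sign(x)) * (1 if away else -1)
              else:
                  x += rng.choice((-1, 1))             # simple symmetric step
              max_d = max(max_d, abs(x))
          return max_d

      # Timorous (0.3) vs symmetric (0.5) vs bold (0.7) maximum excursions
      for p in (0.3, 0.5, 0.7):
          print(p, walk(100_000, p))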

  9. Probability Quantization for Multiplication-Free Binary Arithmetic Coding

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A method has been developed to improve on Witten's binary arithmetic coding procedure of tracking a high value and a low value. The new method approximates the probability of the less probable symbol, which improves the worst-case coding efficiency.
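
    Not the paper's scheme, but a generic illustration of the trade-off it addresses: quantizing the less-probable-symbol probability to a power of two (so arithmetic-coder interval updates reduce to shifts instead of multiplies) costs coding efficiency, measured here as cross-entropy overhead:

      import numpy as np

      def redundancy(p):
          """Overhead (bits/symbol) of coding a binary source with the
          less-probable-symbol probability p approximated by q = 2**-k."""
          k = max(1, int(np.round(-np.log2(p))))
          q = 2.0 ** -k
          H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))   # entropy
          L = -(p * np.log2(q) + (1 - p) * np.log2(1 - q))   # actual rate
          return L - H

      for p in (0.4, 0.3, 0.2, 0.1, 0.05):
          print(p, round(redundancy(p), 4))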

  10. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  11. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies
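
    The rank-agreement figure quoted above is a mean pairwise Spearman correlation; a sketch with synthetic stand-in rankings, assuming scipy (random rankings give a mean near zero, unlike real consensus):

      import numpy as np
      from scipy.stats import spearmanr

      # 25 synthetic experts each "rank" 20 UXO items (random stand-in data;
      # the study reports a mean pairwise correlation of 0.41 for real experts).
      rng = np.random.default_rng(5)
      rankings = np.argsort(rng.random((25, 20)), axis=1)

      rhos = [spearmanr(rankings[i], rankings[j])[0]
              for i in range(25) for j in range(i + 1, 25)]
      print(np.mean(rhos))   # near 0: random rankings share no signal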

  12. Effect of the human chorionic gonadotropin diet on patient outcomes.

    PubMed

    Goodbar, Nancy H; Foushee, Jaime A; Eagerton, David H; Haynes, Katie B; Johnson, Amanda A

    2013-05-01

    To report a case of left lower extremity deep vein thrombosis (DVT) and bilateral pulmonary embolisms in a patient who initiated the human chorionic gonadotropin (HCG) diet 2 weeks prior to presentation. A 64-year-old white female presented with leg swelling and shortness of breath. Lower extremity ultrasound revealed left leg DVT, and a computed tomography angiogram revealed bilateral pulmonary embolisms. A complete history and physical examination were unremarkable for any risk factors for acute thrombosis, with the exception of the initiation of the HCG diet approximately 2 weeks prior to presentation; the patient was taking 20 sublingual drops of HCG twice daily. Results of her hypercoagulable workup were negative. Upon admission, therapy was started with enoxaparin 120 mg subcutaneously twice daily and warfarin 5 mg orally once daily. According to the Naranjo probability scale, initiation of the HCG diet was a probable cause of our patient's adverse effects. The HCG diet has very few efficacy studies and no significant safety studies associated with its use. Six relevant studies were identified for assessment of efficacy, and only 1 was associated with a significant weight reduction in the HCG diet study population. All of these studies evaluated the use of the HCG diet via injections of the hormone and significant calorie restriction, which is known as the Simeons method. Currently marketed HCG products include sublingual drops, lozenges, and pellets, but none of these methods has an evidence-based efficacy and safety standard. As popularity of the HCG diet continues to increase, so do the potential adverse events associated with the management of weight loss via an unproven strategy. Patient safety information regarding this dieting strategy should be recognized by medical professionals.

  13. Clinical Application and Pharmacodynamic Monitoring of Apixaban in a Patient with End-Stage Renal Disease Requiring Chronic Hemodialysis.

    PubMed

    Kufel, Wesley D; Zayac, Adam S; Lehmann, David F; Miller, Christopher D

    2016-11-01

    Despite prescribing guidance, limited data exist to describe the use of apixaban in patients with end-stage renal disease (ESRD) requiring hemodialysis (HD). Current apixaban dosing recommendations for this patient population are based largely on a single-dose pharmacokinetic study of eight patients. We describe the clinical application and pharmacodynamic monitoring of apixaban in a 62-year-old 156-kg African-American woman with nonvalvular atrial fibrillation and ESRD requiring hemodialysis who developed calciphylaxis while receiving warfarin therapy. Based on a multidisciplinary clinical judgment decision due to concern for drug accumulation after multiple doses in patients with ESRD receiving HD, she was anticoagulated with apixaban 2.5 mg twice/day, as opposed to 5 mg twice/day as recommended by the package insert. Antifactor Xa monitoring was used, and resultant peak and trough apixaban concentrations were above the upper limit of detection for our clinical laboratory (more than 2.00 IU/ml). On day 7 of her hospitalization, the patient developed gastrointestinal bleeding, and apixaban was discontinued; no further clinical signs of bleeding occurred during her subsequent hospitalization course. Use of the Naranjo Adverse Drug Reaction Probability Scale indicated a probable relationship (score of 6) between apixaban exposure and the manifestation of gastrointestinal bleeding. The patient ultimately died 44 days after the acute bleeding event; however, coagulation concerns were not implicated in the patient's death. To our knowledge, this is the first case report that describes apixaban use and associated antifactor Xa monitoring in a patient with ESRD receiving HD, and it provides concern for current apixaban dosing recommendations in this patient population. Further pharmacokinetic and clinical data are likely necessary to better characterize apixaban use in these patients to optimize safety and efficacy. © 2016 Pharmacotherapy Publications, Inc.

  14. Myxedema coma associated with combination aripiprazole and sertraline therapy.

    PubMed

    Church, Chelsea O; Callen, Erin C

    2009-12-01

    To describe a case of myxedema coma (MC) associated with combination aripiprazole and sertraline therapy. A 41-year-old male presented to the emergency department with confusion, right-sided numbness and tingling, slurred speech, dizziness, and facial edema. His blood pressure was 160/113 mm Hg, with a pulse of 56 beats/min and temperature of 35.4 degrees C. Initial abnormal laboratory values included creatine kinase (CK) 439 U/L; serum creatinine 1.6 mg/dL; aspartate aminotransferase 85 U/L; and alanine aminotransferase 35 U/L. Repeat cardiac markers revealed an elevated CK level of 3573 U/L with a CK-MB of 24 ng/mL. Thyroid function tests showed thyroid-stimulating hormone 126.4 microIU/mL and free thyroxine 0.29 ng/dL. Home medications of unknown duration were sertraline 200 mg and aripiprazole 20 mg daily. He was admitted to the intensive care unit and initially treated with intravenous levothyroxine and dexamethasone. By hospital day 4, the patient was clinically stable and discharged to home. Myxedema coma, the most significant form of hypothyroidism (HT), is a rare but potentially fatal condition. The known precipitating causes of MC were ruled out in this patient, which left his home medications as the likely cause. Cases of HT caused by certain atypical antipsychotics and antidepressants are found in the literature, but none was reported with aripiprazole therapy. There are also no reported cases of sertraline or aripiprazole inducing MC. Use of the Naranjo probability scale indicates that the combination of aripiprazole and sertraline was a probable inducer of MC in this patient. Due to the widespread use of psychotropic medications, clinicians should be reminded of the rare, yet life-threatening, occurrence of MC when treating patients, especially with combination therapies such as sertraline and aripiprazole.

  15. Fanconi's syndrome and nephrogenic diabetes insipidus in an adult treated with ifosfamide.

    PubMed

    Ingemi, Amanda I; Bota, Vasile M; Peguero, Anyeri; Charpentier, Margaret

    2012-01-01

    Fanconi's syndrome is a serious condition characterized by type II proximal renal tubular dysfunction, with urinary loss of glucose, amino acids, phosphate, bicarbonate, and potassium. Ifosfamide-induced Fanconi's syndrome is reported in about 1.4-5% of children being treated for solid tumors, yet only a few cases have been reported in adults. We describe a 54-year-old man who came to the hospital with symptoms of neutropenic fever 4 days after his fourth cycle of ifosfamide and doxorubicin treatment for recurrent sarcoma with metastases to the lung. During admission, he was noted to have severe renal tubular dysfunction; ifosfamide-induced nephrogenic diabetes insipidus and Fanconi's syndrome were suspected. He received supportive therapy that resulted in incomplete resolution of signs and symptoms. The patient was discharged after a 5-day hospital stay when his white blood cell count increased from 0.1 to 2.5 × 10(3)/mm(3) and his fever had resolved. Use of the Naranjo adverse drug reaction probability scale indicated a probable relationship (score of 7) between the patient's development of diabetes insipidus and Fanconi's syndrome and his use of ifosfamide. This dual diagnosis of diabetes insipidus and Fanconi's syndrome in an adult makes this case unusual, as well as therapeutically challenging. We conducted a review of the existing literature regarding ifosfamide-induced Fanconi's syndrome and describe the proposed mechanisms and therapeutic options. This case suggests that patients treated with ifosfamide should be monitored closely for renal function to identify, and perhaps prevent, these rare adverse events. Preliminary animal models show promise for adding N-acetylcysteine to ifosfamide treatment, but more research is necessary before using this drug as a therapeutic option. © 2012 Pharmacotherapy Publications, Inc.

  16. Severe apnea in an infant exposed to lamotrigine in breast milk.

    PubMed

    Nordmo, Elisabet; Aronsen, Lena; Wasland, Kristin; Småbrekke, Lars; Vorren, Solveig

    2009-11-01

    To report a case of severe apnea in an infant exposed to lamotrigine through breast-feeding. A 16-day-old infant developed several mild episodes of apnea that culminated in a severe cyanotic episode requiring resuscitation. A thorough examination at the hospital gave no evidence of underlying diseases that could explain the reaction. The mother had used lamotrigine in increasing doses throughout pregnancy, and at the time of the apneic episodes, she used 850 mg/day. The infant was fully breast-fed, and the neonatal lamotrigine serum concentration was 4.87 microg/mL at the time of admission. Breast-feeding was terminated, and the infant fully recovered. Although there are several reports on extensive passage of lamotrigine into breast milk, this is the first published report of a serious adverse reaction in a breast-fed infant. Lamotrigine clearance increases throughout pregnancy, and maternal dose increases are often necessary to maintain therapeutic effect. After delivery, clearance rapidly returns to preconception levels, enhancing the risk of adverse reactions in both mothers and breast-fed infants if the dose is not sufficiently reduced. In this case, the dose was slowly reduced after delivery, and the maternal lamotrigine serum concentration more than doubled in the week before the neonatal apneic episodes. A high lamotrigine concentration was detected in the breast milk, and the neonatal lamotrigine serum concentration was in the upper therapeutic range. The neonatal lamotrigine elimination half-life was approximately twice that seen in adults. The Naranjo probability scale indicated a probable relationship between apnea and exposure to lamotrigine through breast-feeding in this infant. Infants can be exposed to clinically relevant doses of lamotrigine through breast-feeding. Individual risk/benefit assessment is important, and close monitoring of both mother and child is advisable, especially during the first 3 weeks postpartum.

  17. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, this article outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
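
    A minimal sketch of this technique in Python (the article itself uses Visual Basic): because the integral of f over [a, b] equals (b - a) times the expected value of f at a uniformly random point in [a, b], a sample mean of f at random points approximates the integral.

        import random

        def mc_integrate(f, a, b, n=100_000):
            """Approximate the integral of f over [a, b] using the identity
            integral = (b - a) * E[f(U)], with U uniform on [a, b]."""
            total = sum(f(random.uniform(a, b)) for _ in range(n))
            return (b - a) * total / n

        # Example: the integral of x**2 over [0, 1] is exactly 1/3.
        print(mc_integrate(lambda x: x * x, 0.0, 1.0))  # ~0.333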

  18. SureTrak Probability of Impact Display

    NASA Technical Reports Server (NTRS)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  19. Estimating site occupancy and detection probability parameters for meso- and large mammals in a coastal ecosystem

    USGS Publications Warehouse

    O'Connell, Allan F.; Talancy, Neil W.; Bailey, Larissa L.; Sauer, John R.; Cook, Robert; Gilbert, Andrew T.

    2006-01-01

    Large-scale, multispecies monitoring programs are widely used to assess changes in wildlife populations but they often assume constant detectability when documenting species occurrence. This assumption is rarely met in practice because animal populations vary across time and space. As a result, detectability of a species can be influenced by a number of physical, biological, or anthropogenic factors (e.g., weather, seasonality, topography, biological rhythms, sampling methods). To evaluate some of these influences, we estimated site occupancy rates using species-specific detection probabilities for meso- and large terrestrial mammal species on Cape Cod, Massachusetts, USA. We used model selection to assess the influence of different sampling methods and major environmental factors on our ability to detect individual species. Remote cameras detected the most species (9), followed by cubby boxes (7) and hair traps (4) over a 13-month period. Estimated site occupancy rates were similar among sampling methods for most species when detection probabilities exceeded 0.15, but we question estimates obtained from methods with detection probabilities between 0.05 and 0.15, and we consider methods with lower probabilities unacceptable for occupancy estimation and inference. Estimated detection probabilities can be used to accommodate variation in sampling methods, which allows for comparison of monitoring programs using different protocols. Vegetation and seasonality produced species-specific differences in detectability and occupancy, but differences were not consistent within or among species, which suggests that our results should be considered in the context of local habitat features and life history traits for the target species. We believe that site occupancy is a useful state variable and suggest that monitoring programs for mammals using occupancy data consider detectability prior to making inferences about species distributions or population change.
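
    A small illustration of the cumulative-detection arithmetic behind that caution, assuming independent surveys with a constant per-survey detection probability p: the chance of detecting a species that is actually present at least once in k surveys is 1 - (1 - p)^k, which remains well below certainty over realistic survey counts when p is under 0.15.

        def p_detect_at_least_once(p, k):
            """Probability of at least one detection in k independent surveys,
            each with per-survey detection probability p, given presence."""
            return 1.0 - (1.0 - p) ** k

        for p in (0.05, 0.15, 0.30):
            print(p, [round(p_detect_at_least_once(p, k), 2) for k in (5, 10, 15)])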

  20. A logistic regression equation for estimating the probability of a stream in Vermont having intermittent flow

    USGS Publications Warehouse

    Olson, Scott A.; Brouillette, Michael C.

    2006-01-01

    A logistic regression equation was developed for estimating the probability of a stream flowing intermittently at unregulated, rural stream sites in Vermont. These determinations can be used for a wide variety of regulatory and planning efforts at the Federal, State, regional, county and town levels, including such applications as assessing fish and wildlife habitats, wetlands classifications, recreational opportunities, water-supply potential, waste-assimilation capacities, and sediment transport. The equation will be used to create a derived product for the Vermont Hydrography Dataset having the streamflow characteristic of 'intermittent' or 'perennial.' The Vermont Hydrography Dataset is Vermont's implementation of the National Hydrography Dataset and was created at a scale of 1:5,000 based on statewide digital orthophotos. The equation was developed by relating field-verified perennial or intermittent status of a stream site during normal summer low-streamflow conditions in the summer of 2005 to selected basin characteristics of naturally flowing streams in Vermont. The database used to develop the equation included 682 stream sites with drainage areas ranging from 0.05 to 5.0 square miles. When the 682 sites were observed, 126 were intermittent (had no flow at the time of the observation) and 556 were perennial (had flowing water at the time of the observation). The results of the logistic regression analysis indicate that the probability of a stream having intermittent flow in Vermont is a function of drainage area, elevation of the site, the ratio of basin relief to basin perimeter, and the areal percentage of well- and moderately well-drained soils in the basin. Using a probability cutpoint (a lower probability indicates the site has perennial flow and a higher probability indicates the site has intermittent flow) of 0.5, the logistic regression equation correctly predicted the perennial or intermittent status of 116 test sites 85 percent of the time.
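
    A sketch of how such an equation is applied, in Python; the coefficient values below are invented placeholders for illustration, not the published regression coefficients, and the predictor scaling is likewise assumed.

        import math

        def p_intermittent(drainage_area, elevation, relief_to_perimeter,
                           pct_drained_soils, coefs=(-1.0, -0.8, 0.002, 3.0, 0.01)):
            """Logistic regression of the form described in the report:
            p = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2 + b3*x3 + b4*x4))).
            Coefficients here are invented placeholders."""
            b0, b1, b2, b3, b4 = coefs
            z = (b0 + b1 * drainage_area + b2 * elevation
                 + b3 * relief_to_perimeter + b4 * pct_drained_soils)
            return 1.0 / (1.0 + math.exp(-z))

        # Classify with the report's 0.5 cutpoint: higher p -> intermittent.
        p = p_intermittent(0.5, 400.0, 0.05, 60.0)
        print(round(p, 3), "intermittent" if p >= 0.5 else "perennial")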

  1. Scaling, soil moisture and evapotranspiration in runoff models

    NASA Technical Reports Server (NTRS)

    Wood, Eric F.

    1993-01-01

    The effects of small-scale heterogeneity in land surface characteristics on the large-scale fluxes of water and energy in the land-atmosphere system have become a central focus of many climatology research experiments. The acquisition of high-resolution land surface data through remote sensing and intensive land-climatology field experiments (like HAPEX and FIFE) has made it possible to investigate the interactions between microscale land-atmosphere processes and macroscale models. One essential research question is how to account for the small-scale heterogeneities and whether 'effective' parameters can be used in the macroscale models. To address this question of scaling, the probability distribution for evaporation is derived, illustrating the conditions under which scaling should work. A correction algorithm that may be appropriate for the land parameterization of a GCM is derived using a second-order linearization scheme. The performance of the algorithm is evaluated.

  2. Anomalous scaling of passive scalars in rotating flows.

    PubMed

    Rodriguez Imazio, P; Mininni, P D

    2011-06-01

    We present results of direct numerical simulations of passive scalar advection and diffusion in turbulent rotating flows. Scaling laws and the development of anisotropy are studied in spectral space, and in real space using an axisymmetric decomposition of velocity and passive scalar structure functions. The passive scalar is more anisotropic than the velocity field, and its power spectrum follows a spectral law consistent with ~k^(-3/2). This scaling is explained with phenomenological arguments that consider the effect of rotation. Intermittency is characterized using scaling exponents and probability density functions of velocity and passive scalar increments. In the presence of rotation, intermittency in the velocity field decreases more noticeably than in the passive scalar. The scaling exponents show good agreement with Kraichnan's prediction for passive scalar intermittency in two dimensions, after correcting for the observed scaling of the second-order exponent.

  3. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  4. Multiple regression and inverse moments improve the characterization of the spatial scaling behavior of daily streamflows in the Southeast United States

    USGS Publications Warehouse

    Farmer, William H.; Over, Thomas M.; Vogel, Richard M.

    2015-01-01

    Understanding the spatial structure of daily streamflow is essential for managing freshwater resources, especially in poorly-gaged regions. Spatial scaling assumptions are common in flood frequency prediction (e.g., index-flood method) and the prediction of continuous streamflow at ungaged sites (e.g., drainage-area ratio), with simple scaling by drainage area being the most common assumption. In this study, scaling analyses of daily streamflow from 173 streamgages in the southeastern US resulted in three important findings. First, the use of only positive integer moment orders, as has been done in most previous studies, captures only the probabilistic and spatial scaling behavior of flows above an exceedance probability near the median; negative moment orders (inverse moments) are needed for lower streamflows. Second, assessing scaling by using drainage area alone is shown to result in a high degree of omitted-variable bias, masking the true spatial scaling behavior. Multiple regression is shown to mitigate this bias, controlling for regional heterogeneity of basin attributes, especially those correlated with drainage area. Previous univariate scaling analyses have neglected the scaling of low-flow events and may have produced biased estimates of the spatial scaling exponent. Third, the multiple regression results show that mean flows scale with an exponent of one, low flows scale with spatial scaling exponents greater than one, and high flows scale with exponents less than one. The relationship between scaling exponents and exceedance probabilities may be a fundamental signature of regional streamflow. This signature may improve our understanding of the physical processes generating streamflow at different exceedance probabilities.
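
    A sketch, with synthetic numbers, of the univariate scaling regression this study builds on: under simple scaling, a flow statistic varies as a power of drainage area, so the exponent can be estimated by ordinary least squares in log-log space.

        import numpy as np

        # Synthetic sites: drainage areas (km^2) and a flow statistic (m^3/s);
        # values are invented for illustration only.
        area = np.array([10.0, 50.0, 120.0, 300.0, 800.0, 2000.0])
        flow = np.array([0.8, 3.5, 9.0, 21.0, 60.0, 140.0])

        # Simple scaling assumes flow ~ c * area**theta.
        theta, log_c = np.polyfit(np.log(area), np.log(flow), 1)
        print("estimated scaling exponent:", round(theta, 2))  # near 1 here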

  5. A backward Monte Carlo method for efficient computation of runaway probabilities in runaway electron simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Guannan; Del-Castillo-Negrete, Diego

    2017-10-01

    Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the probability density functions (PDFs) of RE. Despite the simplifications involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than the brute-force MC methods, which can significantly reduce the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as the direct MC code, which paves the way for conducting large-scale RE simulation. This work is supported by DOE FES and ASCR under the Contract Numbers ERKJ320 and ERAT377.
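
    For contrast with the backward method, here is a minimal sketch of the brute-force forward Monte Carlo estimate the abstract describes as costly, for a toy drift-diffusion process standing in for the bounce-averaged dynamics (the dynamics and parameters are invented for illustration):

        import math
        import random

        def runaway_probability(x0, drift, sigma, x_run, t_max, dt=0.01, n=2000):
            """Fraction of forward Euler-Maruyama paths of
            dX = drift(X) dt + sigma dW that reach X >= x_run before t_max."""
            hits = 0
            for _ in range(n):
                x, t = x0, 0.0
                while t < t_max:
                    x += drift(x) * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
                    t += dt
                    if x >= x_run:
                        hits += 1
                        break
            return hits / n

        print(runaway_probability(0.5, lambda x: 0.2 * x, 0.3, 2.0, 5.0))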

  6. The Statistics of Urban Scaling and Their Connection to Zipf’s Law

    PubMed Central

    Gomez-Lievano, Andres; Youn, HyeJin; Bettencourt, Luís M. A.

    2012-01-01

    Urban scaling relations characterizing how diverse properties of cities vary on average with their population size have recently been shown to be a general quantitative property of many urban systems around the world. However, in previous studies the statistics of urban indicators were not analyzed in detail, raising important questions about the full characterization of urban properties and how scaling relations may emerge in these larger contexts. Here, we build a self-consistent statistical framework that characterizes the joint probability distributions of urban indicators and city population sizes across an urban system. To develop this framework empirically we use one of the most granular and stochastic urban indicators available, specifically measuring homicides in cities of Brazil, Colombia and Mexico, three nations with high and fast changing rates of violent crime. We use these data to derive the conditional probability of the number of homicides per year given the population size of a city. To do this we use Bayes’ rule together with the estimated conditional probability of city size given their number of homicides and the distribution of total homicides. We then show that scaling laws emerge as expectation values of these conditional statistics. Knowledge of these distributions implies, in turn, a relationship between scaling and population size distribution exponents that can be used to predict Zipf’s exponent from urban indicator statistics. Our results also suggest how a general statistical theory of urban indicators may be constructed from the stochastic dynamics of social interaction processes in cities. PMID:22815745
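
    A toy discrete version of the Bayes' rule step described above, with invented numbers: from P(N | H) and the marginal P(H), recover P(H | N), then read off the scaling relation as the conditional expectation E[H | N].

        import numpy as np

        H = np.array([0, 1, 2, 3])              # homicide counts (toy support)
        P_H = np.array([0.4, 0.3, 0.2, 0.1])    # marginal of counts (assumed)
        # P(N | H): each row is a distribution over three city-size bins.
        P_N_given_H = np.array([[0.7, 0.2, 0.1],
                                [0.5, 0.3, 0.2],
                                [0.3, 0.4, 0.3],
                                [0.1, 0.4, 0.5]])
        P_N = P_H @ P_N_given_H                 # marginal over city sizes
        P_H_given_N = (P_N_given_H * P_H[:, None]) / P_N   # Bayes' rule
        print(P_H_given_N.sum(axis=0))          # each column sums to 1
        print(H @ P_H_given_N)                  # E[H | N] for each size bin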

  7. A well-scaling natural orbital theory

    DOE PAGES

    Gebauer, Ralph; Cohen, Morrel H.; Car, Roberto

    2016-11-01

    Here, we introduce an energy functional for ground-state electronic structure calculations. Its variables are the natural spin-orbitals of singlet many-body wave functions and their joint occupation probabilities deriving from controlled approximations to the two-particle density matrix that yield algebraic scaling in general, and Hartree–Fock scaling in its seniority-zero version. Results from the latter version for small molecular systems are compared with those of highly accurate quantum-chemical computations. The energies lie above full configuration interaction calculations, close to doubly occupied configuration interaction calculations. Their accuracy is considerably greater than that obtained from current density-functional theory approximations and from current functionals of the one-particle density matrix.

  8. A well-scaling natural orbital theory

    PubMed Central

    Gebauer, Ralph; Cohen, Morrel H.; Car, Roberto

    2016-01-01

    We introduce an energy functional for ground-state electronic structure calculations. Its variables are the natural spin-orbitals of singlet many-body wave functions and their joint occupation probabilities deriving from controlled approximations to the two-particle density matrix that yield algebraic scaling in general, and Hartree–Fock scaling in its seniority-zero version. Results from the latter version for small molecular systems are compared with those of highly accurate quantum-chemical computations. The energies lie above full configuration interaction calculations, close to doubly occupied configuration interaction calculations. Their accuracy is considerably greater than that obtained from current density-functional theory approximations and from current functionals of the one-particle density matrix. PMID:27803328

  9. Speciation has a spatial scale that depends on levels of gene flow.

    PubMed

    Kisel, Yael; Barraclough, Timothy G

    2010-03-01

    Area is generally assumed to affect speciation rates, but work on the spatial context of speciation has focused mostly on patterns of range overlap between emerging species rather than on questions of geographical scale. A variety of geographical theories of speciation predict that the probability of speciation occurring within a given region should (1) increase with the size of the region and (2) increase as the spatial extent of intraspecific gene flow becomes smaller. Using a survey of speciation events on isolated oceanic islands for a broad range of taxa, we find evidence for both predictions. The probability of in situ speciation scales with island area in bats, carnivorous mammals, birds, flowering plants, lizards, butterflies and moths, and snails. Ferns are an exception to these findings, but they exhibit high frequencies of polyploid and hybrid speciation, which are expected to be scale independent. Furthermore, the minimum island size for speciation correlates across groups with the strength of intraspecific gene flow, as is estimated from a meta-analysis of published population genetic studies. These results indicate a general geographical model of speciation rates that are dependent on both area and gene flow. The spatial scale of population divergence is an important but neglected determinant of broad-scale diversity patterns.

  10. Hydrogeologic unit flow characterization using transition probability geostatistics.

    PubMed

    Jones, Norman L; Walker, Justin R; Carle, Steven F

    2005-01-01

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
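
    The vertical (1-D) idea underlying the method can be sketched as a Markov chain over facies categories; the transition matrix below is invented for illustration, and the actual technique extends such juxtapositional statistics to 3-D conditional simulation.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy upward transition-probability matrix among three materials
        # (0 = sand, 1 = silt, 2 = clay); each row sums to 1.
        T = np.array([[0.80, 0.15, 0.05],
                      [0.10, 0.70, 0.20],
                      [0.05, 0.25, 0.70]])

        def simulate_column(n_cells, start=0):
            """Simulate a 1-D facies column cell by cell as a Markov chain."""
            column = [start]
            for _ in range(n_cells - 1):
                column.append(int(rng.choice(3, p=T[column[-1]])))
            return column

        print(simulate_column(20))  # a synthetic facies sequence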

  11. Preservice Elementary Teachers and the Fundamentals of Probability

    ERIC Educational Resources Information Center

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  12. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her with...

  13. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her with...

  14. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her with...

  15. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her with...

  16. 49 CFR 209.105 - Notice of probable violation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION RAILROAD SAFETY ENFORCEMENT PROCEDURES Hazardous Materials Penalties Civil Penalties § 209.105 Notice of probable violation. (a) FRA, through the Chief Counsel, begins a civil penalty proceeding by serving a notice of probable violation on a person charging him or her with...

  17. Use of a trigger tool to detect adverse drug reactions in an emergency department.

    PubMed

    de Almeida, Silvana Maria; Romualdo, Aruana; de Abreu Ferraresi, Andressa; Zelezoglo, Giovana Roberta; Marra, Alexandre R; Edmond, Michael B

    2017-11-15

    Although there are systems for reporting adverse drug reactions (ADRs), these safety events remain underreported. The low-cost, low-tech trigger tool method is based on the detection of events through clues, and it seems to increase the detection of adverse events compared to traditional methodologies. This study seeks to estimate the prevalence of adverse drug reactions in patients seeking care in the emergency department. This retrospective study, covering January to December 2014, applied the Institute for Healthcare Improvement (IHI) trigger tool methodology to patients treated at the emergency room of a tertiary care hospital. The estimated prevalence of adverse reactions in patients presenting to the emergency department was 2.3% [95% CI 1.3% to 3.3%]; 28.6% of cases required hospitalization at an average cost of US$ 5698.44. The most common triggers were hydrocortisone (57% of the cases), diphenhydramine (14%) and fexofenadine (14%). Anti-infectives (19%), cardiovascular agents (14%), and musculoskeletal drugs (14%) were the most common causes of adverse reactions. According to the Naranjo Scale, 71% were classified as possible and 29% as probable. There was no association between adverse reactions and age or sex in the present study. Use of the trigger tool identified a prevalence of adverse reactions of 2.3% in the emergency department and proved to be a viable method that can provide a better understanding of adverse drug reactions in this patient population.

  18. Oil spill contamination probability in the southeastern Levantine basin.

    PubMed

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Collision probability at low altitudes resulting from elliptical orbits

    NASA Technical Reports Server (NTRS)

    Kessler, Donald J.

    1990-01-01

    The probability of collision between a spacecraft and another object is calculated for various altitude and orbit conditions, and factors affecting the probability are discussed. It is shown that a collision can only occur when the spacecraft is located at an altitude which is between the perigee and apogee altitudes of the object and that the probability per unit time is largest when the orbit of the object is nearly circular. However, at low altitudes, the atmospheric drag causes changes with time of the perigee and the apogee, such that circular orbits have a much shorter lifetime than many of the elliptical orbits. Thus, when the collision probability is integrated over the lifetime of the orbiting object, some elliptical orbits are found to have much higher total collision probability than circular orbits. Rocket bodies used to boost payloads from low earth orbit to geosynchronous orbit are an example of objects in these elliptical orbits.

  20. Probability distributions for multimeric systems.

    PubMed

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous, and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising the sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  1. A collision probability analysis of the double-heterogeneity problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hebert, A.

    1993-10-01

    A practical collision probability model is presented for the description of geometries with many levels of heterogeneity. Regular regions of the macrogeometry are assumed to contain a stochastic mixture of spherical grains or cylindrical tubes. Simple expressions for the collision probabilities in the global geometry are obtained as a function of the collision probabilities in the macro- and microgeometries. This model was successfully implemented in the collision probability kernel of the APOLLO-1, APOLLO-2, and DRAGON lattice codes for the description of a broad range of reactor physics problems. Resonance self-shielding and depletion calculations in the microgeometries are possible because each microregion is explicitly represented.

  2. Misconceptions in Rational Numbers, Probability, Algebra, and Geometry

    ERIC Educational Resources Information Center

    Rakes, Christopher R.

    2010-01-01

    In this study, the author examined the relationship of probability misconceptions to algebra, geometry, and rational number misconceptions and investigated the potential of probability instruction as an intervention to address misconceptions in all 4 content areas. Through a review of literature, 5 fundamental concepts were identified that, if…

  3. A "Virtual Spin" on the Teaching of Probability

    ERIC Educational Resources Information Center

    Beck, Shari A.; Huse, Vanessa E.

    2007-01-01

    This article, which describes integrating virtual manipulatives with the teaching of probability at the elementary level, puts a "virtual spin" on the teaching of probability to provide more opportunities for students to experience successful learning. The traditional use of concrete manipulatives is enhanced with virtual coins and spinners from…

  4. Probability Theory, Not the Very Guide of Life

    ERIC Educational Resources Information Center

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  5. Reinforcement Probability Modulates Temporal Memory Selection and Integration Processes

    PubMed Central

    Matell, Matthew S.; Kurti, Allison N.

    2013-01-01

    We have previously shown that rats trained in a mixed-interval peak procedure (tone = 4s, light = 12s) respond in a scalar manner at a time in between the trained peak times when presented with the stimulus compound (Swanton & Matell, 2011). In our previous work, the two component cues were reinforced with different probabilities (short = 20%, long = 80%) to equate response rates, and we found that the compound peak time was biased toward the cue with the higher reinforcement probability. Here, we examined the influence that different reinforcement probabilities have on the temporal location and shape of the compound response function. We found that the time of peak responding shifted as a function of the relative reinforcement probability of the component cues, becoming earlier as the relative likelihood of reinforcement associated with the short cue increased. However, as the relative probabilities of the component cues grew dissimilar, the compound peak became non-scalar, suggesting that the temporal control of behavior shifted from a process of integration to one of selection. As our previous work has utilized durations and reinforcement probabilities more discrepant than those used here, these data suggest that the processes underlying the integration/selection decision for time are based on cue value. PMID:23896560

  6. Fine flow structures in the transition region small-scale loops

    NASA Astrophysics Data System (ADS)

    Yan, L.; Peter, H.; He, J.; Wei, Y.

    2016-12-01

    Observations and models have suggested that transition-region EUV emission from the quiet Sun is contributed by very small-scale loops that have not been resolved. Recently, observations from IRIS have revealed loops of this kind. Based on high-resolution spectral and imaging observations from IRIS, much more detailed work is needed to reveal the fine flow features in these loops and to help us understand loop heating. Here, we present a detailed statistical study of the spatial and temporal evolution of Si IV line profiles of small-scale loops and report the spectral features: there is a transition from blue (red) wing enhancement dominant to red (blue) wing enhancement dominant along the cross-section of the loop, which is independent of time. This feature emerges as the loop appears and vanishes as the loop fades from view. This is probably the signature of helical flow along the loop. The result suggests that the brightening of this kind of loop is probably due to current dissipation heating in a twisted magnetic flux tube.

  7. Cytologic diagnosis: expression of probability by clinical pathologists.

    PubMed

    Christopher, Mary M; Hotz, Christine S

    2004-01-01

    Clinical pathologists use descriptive terms or modifiers to express the probability or likelihood of a cytologic diagnosis. Words are imprecise in meaning, however, and may be used and interpreted differently by pathologists and clinicians. The goals of this study were to 1) assess the frequency of use of 18 modifiers, 2) determine the probability of a positive diagnosis implied by the modifiers, 3) identify preferred modifiers for different levels of probability, 4) ascertain the importance of factors that affect expression of diagnostic certainty, and 5) evaluate differences based on gender, employment, and experience. We surveyed 202 clinical pathologists who were board-certified by the American College of Veterinary Pathologists (Clinical Pathology). Surveys were distributed in October 2001 and returned by e-mail, fax, or surface mail over a 2-month period. Results were analyzed by parametric and nonparametric tests. Survey response rate was 47.5% (n = 96) and primarily included clinical pathologists at veterinary schools (n = 58) and diagnostic laboratories (n = 31). Eleven of 18 terms were used "often" or "sometimes" by ≥ 50% of respondents. Broad variability was found in the probability assigned to each term, especially those with median values of 75 to 90%. Preferred modifiers for 7 numerical probabilities ranging from 0 to 100% included 68 unique terms; however, a set of 10 terms was used by ≥ 50% of respondents. Cellularity and quality of the sample, experience of the pathologist, and implications of the diagnosis were the most important factors affecting the expression of probability. Because of wide discrepancy in the implied likelihood of a diagnosis using words, defined terminology and controlled vocabulary may be useful in improving communication and the quality of data in cytology reporting.

  8. Multiple Streaming and the Probability Distribution of Density in Redshift Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hui, Lam; Kofman, Lev; Shandarin, Sergei F.

    2000-07-01

    We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude (σ_l ≲ 1). Moreover, unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S_3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated. © 2000 The American Astronomical Society.

  9. High probability neurotransmitter release sites represent an energy efficient design

    PubMed Central

    Lu, Zhongmin; Chouhan, Amit K.; Borycz, Jolanta A.; Lu, Zhiyuan; Rossano, Adam J; Brain, Keith L.; Zhou, You; Meinertzhagen, Ian A.; Macleod, Gregory T.

    2016-01-01

    Nerve terminals contain multiple sites specialized for the release of neurotransmitters. Release usually occurs with low probability, a design thought to confer many advantages. High probability release sites are not uncommon but their advantages are not well understood. Here we test the hypothesis that high probability release sites represent an energy efficient design. We examined release site probabilities and energy efficiency at the terminals of two glutamatergic motor neurons synapsing on the same muscle fiber in Drosophila larvae. Through electrophysiological and ultrastructural measurements we calculated release site probabilities to differ considerably between terminals (0.33 vs. 0.11). We estimated the energy required to release and recycle glutamate from the same measurements. The energy required to remove calcium and sodium ions subsequent to nerve excitation was estimated through microfluorimetric and morphological measurements. We calculated energy efficiency as the number of glutamate molecules released per ATP molecule hydrolyzed, and high probability release site terminals were found to be more efficient (0.13 vs. 0.06). Our analytical model indicates that energy efficiency is optimal (~0.15) at high release site probabilities (~0.76). As limitations in energy supply constrain neural function, high probability release sites might ameliorate such constraints by demanding less energy. Energy efficiency can be viewed as one aspect of nerve terminal function, in balance with others, because high efficiency terminals depress significantly during episodic bursts of activity. PMID:27593375

  10. Depression and anxiety in patients with rheumatoid arthritis: prevalence rates based on a comparison of the Depression, Anxiety and Stress Scale (DASS) and the hospital, Anxiety and Depression Scale (HADS)

    PubMed Central

    2012-01-01

    Background: While it is recognised that depression is prevalent in Rheumatoid Arthritis (RA), recent studies have also highlighted significant levels of anxiety in RA patients. This study compared two commonly used scales, the Depression Anxiety and Stress Scale (DASS) and the Hospital Anxiety and Depression Scale (HADS), in relation to their measurement range and cut points to consider the relative prevalence of both constructs, and if prevalence rates may be due to scale-specific case definition. Methods: Patients meeting the criteria for RA were recruited in Leeds, UK and Sydney, Australia and asked to complete a survey that included both scales. The data was analysed using the Rasch measurement model. Results: A total of 169 RA patients were assessed, with a repeat subsample, resulting in 323 cases for analysis. Both scales met Rasch model expectations. Using the 'possible+probable' cut point from the HADS, 58.3% had neither anxiety nor depression; 13.5% had anxiety only; 6.4% depression only and 21.8% had both 'possible+probable' anxiety and depression. Cut points for depression were comparable across the two scales while a lower cut point for anxiety in the DASS was required to equate prevalence. Conclusions: This study provides further support for high prevalence of depression and anxiety in RA. It also shows that while these two scales provide a good indication of possible depression and anxiety, the estimates of prevalence so derived could vary, particularly for anxiety. These findings are discussed in terms of comparisons across studies and selection of scales for clinical use. PMID:22269280

  11. Depression and anxiety in patients with rheumatoid arthritis: prevalence rates based on a comparison of the Depression, Anxiety and Stress Scale (DASS) and the hospital, Anxiety and Depression Scale (HADS).

    PubMed

    Covic, Tanya; Cumming, Steven R; Pallant, Julie F; Manolios, Nick; Emery, Paul; Conaghan, Philip G; Tennant, Alan

    2012-01-24

    While it is recognised that depression is prevalent in Rheumatoid Arthritis (RA), recent studies have also highlighted significant levels of anxiety in RA patients. This study compared two commonly used scales, the Depression Anxiety and Stress Scale (DASS) and the Hospital Anxiety and Depression Scale (HADS), in relation to their measurement range and cut points to consider the relative prevalence of both constructs, and if prevalence rates may be due to scale-specific case definition. Patients meeting the criteria for RA were recruited in Leeds, UK and Sydney, Australia and asked to complete a survey that included both scales. The data was analysed using the Rasch measurement model. A total of 169 RA patients were assessed, with a repeat subsample, resulting in 323 cases for analysis. Both scales met Rasch model expectations. Using the 'possible+probable' cut point from the HADS, 58.3% had neither anxiety nor depression; 13.5% had anxiety only; 6.4% depression only and 21.8% had both 'possible+probable' anxiety and depression. Cut points for depression were comparable across the two scales while a lower cut point for anxiety in the DASS was required to equate prevalence. This study provides further support for high prevalence of depression and anxiety in RA. It also shows that while these two scales provide a good indication of possible depression and anxiety, the estimates of prevalence so derived could vary, particularly for anxiety. These findings are discussed in terms of comparisons across studies and selection of scales for clinical use.

  12. Probable Gastrointestinal Toxicity of Kombucha Tea

    PubMed Central

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  13. Probabilistic Cloning of Three Real States with Optimal Success Probabilities

    NASA Astrophysics Data System (ADS)

    Rui, Pin-shu

    2017-06-01

    We investigate the probabilistic quantum cloning (PQC) of three real states with average probability distribution. To get the analytic forms of the optimal success probabilities we assume that the three states have only two pairwise inner products. Based on the optimal success probabilities, we derive the explicit form of 1 → 2 PQC for cloning three real states. The unitary operation needed in the PQC process is worked out too. The optimal success probabilities are also generalized to the M → N PQC case.

  14. A Quantum Theoretical Explanation for Probability Judgment Errors

    ERIC Educational Resources Information Center

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  15. Prizes in Cereal Boxes: An Application of Probability.

    ERIC Educational Resources Information Center

    Litwiller, Bonnie H.; Duncan, David R.

    1992-01-01

    Presents four cases of real-world probabilistic situations to promote more effective teaching of probability. Calculates the probability of obtaining six of six different prizes successively in six, seven, eight, and nine boxes of cereal, generalizes the problem to n boxes of cereal, and offers suggestions to extend the problem. (MDH)
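
    The first of these calculations can be reproduced directly: by inclusion-exclusion (the coupon-collector argument), the probability of obtaining all six distinct prizes in n boxes is the sum over k of (-1)^k C(6,k) ((6-k)/6)^n.

        from math import comb

        def p_complete_set(n_boxes, n_prizes=6):
            """Probability that n_boxes boxes, each holding one of n_prizes
            equally likely prizes, together contain every prize at least once."""
            return sum((-1) ** k * comb(n_prizes, k)
                       * ((n_prizes - k) / n_prizes) ** n_boxes
                       for k in range(n_prizes + 1))

        for n in (6, 7, 8, 9):
            print(n, round(p_complete_set(n), 4))  # 0.0154, 0.054, 0.114, 0.189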

  16. Severe Secondary Polycythemia in a Female-to-Male Transgender Patient While Using Lifelong Hormonal Therapy: A Patient's Perspective.

    PubMed

    Ederveen, Ellen G T; van Hunsel, Florence P A M; Wondergem, Marielle J; van Puijenbroek, Eugène P

    2018-02-02

    After a registered drug is available on the market and used in everyday circumstances, hitherto unknown adverse drug reactions (ADRs) may occur. Furthermore, the patient can experience a previously unknown course of a known ADR. Voluntary reports by patients play an important role in gaining knowledge about ADRs in daily practice. The Netherlands Pharmacovigilance Centre Lareb received a report from a 55-year-old female-to-male transgender patient who experienced secondary polycythemia while using lifelong testosterone therapy. The onset age of the symptoms was 38 years. The symptoms appeared gradually and after approximately 1 year it was clear that the patient's hemoglobin and hematocrit had started to increase. A Naranjo assessment score of 6 was obtained, indicating a probable relationship between the patient's polycythemia and use of the suspect drug. Polycythemia is a known ADR in testosterone treatment, but little attention has been paid to the possible severity and complications of these symptoms as well as the impact on the patient's well-being.

  17. Gambling disorder: a side effect of an off-label prescription of baclofen-literature review.

    PubMed

    Guillou-Landreat, Morgane; Victorri Vigneau, Caroline; Gerardin, Marie

    2017-01-10

    The use of high-dose baclofen emerged in 2008 in the treatment of alcohol-use disorders. Its prescription is still off-label in France, but recent trials have suggested the value of high doses for alcohol dependence, so an increase in its use is to be expected. However, we still have few data on the adverse effects of high-dose baclofen prescription, especially in complex addictive disorders. We present the case of a 32-year-old man who sought treatment for gambling disorders (GDs). He had complex addictive disorders, including alcohol-use disorders and GDs. He developed a severe GD after treatment with a high dose of baclofen; the maximum dose was 160 mg/day, prescribed for his alcohol-use disorders. According to the Naranjo algorithm, the score was +7, indicating that the gambling problem was probably attributable to baclofen. We discuss this case with reference to the literature. 2017 BMJ Publishing Group Ltd.

  18. A Balanced Approach to Adaptive Probability Density Estimation.

    PubMed

    Kovacs, Julio A; Helmick, Cailee; Wriggers, Willy

    2017-01-01

    Our development of a Fast (Mutual) Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it on molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.

  19. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
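
    A tiny worked instance of the Equation, with invented truth-table frequencies: given counts over the four logical cases, the predicted evaluation of "If p then q" is the conditional probability P(q | p).

        # Counts of the four truth-table cells: pq, p&not-q, not-p&q, not-p&not-q.
        n_pq, n_p_notq, n_notp_q, n_notp_notq = 30, 10, 25, 35
        p_q_given_p = n_pq / (n_pq + n_p_notq)
        print(p_q_given_p)  # 0.75 = predicted P(if p then q) under the Equation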

  20. Estimating soil moisture exceedance probability from antecedent rainfall

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Kalansky, J.; Stock, J. D.; Collins, B. D.

    2016-12-01

    The first storms of the rainy season in coastal California, USA, add moisture to soils but rarely trigger landslides. Previous workers proposed that antecedent rainfall, the cumulative seasonal rain from October 1 onwards, had to exceed specific amounts in order to trigger landsliding. Recent monitoring of soil moisture upslope of historic landslides in the San Francisco Bay Area shows that storms can cause positive pressure heads once soil moisture values exceed a threshold of volumetric water content (VWC). We propose that antecedent rainfall could be used to estimate the probability that VWC exceeds this threshold. A major challenge to estimating the probability of exceedance is that rain gauge records are frequently incomplete. We developed a stochastic model to impute (infill) missing hourly precipitation data. This model uses nearest neighbor-based conditional resampling of the gauge record using data from nearby rain gauges. Using co-located VWC measurements, imputed data can be used to estimate the probability that VWC exceeds a specific threshold for a given antecedent rainfall. The stochastic imputation model can also provide an estimate of uncertainty in the exceedance probability curve. Here we demonstrate the method using soil moisture and precipitation data from several sites located throughout Northern California. Results show a significant variability between sites in the sensitivity of VWC exceedance probability to antecedent rainfall.
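
    A sketch of the final estimation step, with synthetic data standing in for the co-located gauge and soil-moisture records: bin observations by antecedent rainfall and take the empirical fraction with VWC above the threshold (the stochastic imputation would supply missing rainfall hours and, through repeated imputations, the uncertainty band).

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic paired records: antecedent rainfall (mm) and volumetric
        # water content (VWC); the relationship is invented for illustration.
        antecedent = rng.uniform(0.0, 500.0, 1000)
        vwc = 0.15 + 0.0005 * antecedent + rng.normal(0.0, 0.03, 1000)

        def exceedance_probability(rain, threshold=0.30, halfwidth=25.0):
            """Empirical P(VWC > threshold) among observations whose
            antecedent rainfall lies within +/- halfwidth mm of `rain`."""
            near = np.abs(antecedent - rain) <= halfwidth
            return float((vwc[near] > threshold).mean())

        for r in (100.0, 200.0, 300.0, 400.0):
            print(r, round(exceedance_probability(r), 2))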

  1. Probabilistic simulation of multi-scale composite behavior

    NASA Technical Reports Server (NTRS)

    Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.

    1993-01-01

    A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
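
    The core idea (propagate constituent scatter up to a composite-scale property) can be sketched with a one-line micromechanics stand-in; the rule of mixtures and the scatter values below are assumptions for illustration, far simpler than the micromechanics inside PICAN.

        import numpy as np

        rng = np.random.default_rng(2)

        n = 100_000
        Ef = rng.normal(230.0, 10.0, n)   # fiber modulus, GPa (assumed scatter)
        Em = rng.normal(3.5, 0.3, n)      # matrix modulus, GPa (assumed scatter)
        Vf = rng.normal(0.60, 0.02, n)    # fiber volume fraction (process variable)

        E1 = Vf * Ef + (1.0 - Vf) * Em    # rule of mixtures: ply longitudinal modulus
        print(round(E1.mean(), 1), round(E1.std(), 1))  # probabilistic property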

  2. Dissipative tunnelling by means of scaled trajectories

    NASA Astrophysics Data System (ADS)

    Mousavi, S. V.; Miret-Artés, S.

    2018-06-01

    Dissipative quantum tunnelling through an inverted parabolic barrier is considered in the presence of an electric field. A Schrödinger-Langevin or Kostin quantum-classical transition wave equation is applied, resulting in a scaled differential equation of motion. A Gaussian wave packet solution to the resulting scaled Kostin nonlinear equation is assumed and compared to the same solution for the scaled linear Caldirola-Kanai equation. The resulting scaled trajectories are obtained for different dynamical regimes and friction cases, showing the gradual decoherence process in this open dynamics. Theoretical results show that the transmission probabilities are always higher in the Kostin approach than in the Caldirola-Kanai approach, whether or not an external electric field is present. This discrepancy should be understood as an effect of the environment: the corresponding open dynamics should be governed by nonlinear quantum equations, whereas the second approach derives from an effective Hamiltonian within a linear theory.

  3. A study of complex scaling transformation using the Wigner representation of wavefunctions.

    PubMed

    Kaprálová-Ždánská, Petra Ruth

    2011-05-28

    The complex scaling operator exp(-θ ̂x̂p/ℏ), being a foundation of the complex scaling method for resonances, is studied in the Wigner phase-space representation. It is shown that the complex scaling operator behaves similarly to the squeezing operator, rotating and amplifying Wigner quasi-probability distributions of the respective wavefunctions. It is disclosed that the distorting effect of the complex scaling transformation is correlated with increased numerical errors of computed resonance energies and widths. The behavior of the numerical error is demonstrated for a computation of CO(2+) vibronic resonances. © 2011 American Institute of Physics

  4. False Positive Probabilities for all Kepler Objects of Interest: 1284 Newly Validated Planets and 428 Likely False Positives

    NASA Astrophysics Data System (ADS)

    Morton, Timothy D.; Bryson, Stephen T.; Coughlin, Jeffrey L.; Rowe, Jason F.; Ravichandran, Ganesh; Petigura, Erik A.; Haas, Michael R.; Batalha, Natalie M.

    2016-05-01

    We present astrophysical false positive probability calculations for every Kepler Object of Interest (KOI)—the first large-scale demonstration of a fully automated transiting planet validation procedure. Out of 7056 KOIs, we determine that 1935 have probabilities <1% of being astrophysical false positives, and thus may be considered validated planets. Of these, 1284 have not yet been validated or confirmed by other methods. In addition, we identify 428 KOIs that are likely to be false positives, but have not yet been identified as such, though some of these may be a result of unidentified transit timing variations. A side product of these calculations is full stellar property posterior samplings for every host star, modeled as single, binary, and triple systems. These calculations use vespa, a publicly available Python package that can easily be applied to any transiting exoplanet candidate.

  5. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    USGS Publications Warehouse

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
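
    The qualitative mechanism can be sketched as follows; the functional forms here (a constant foreshock rate against a background augmented by a modified-Omori aftershock rate) are assumptions for illustration, not the paper's actual equations.

        def foreshock_probability(t_days, rate_foreshock=0.01, rate_background=0.05,
                                  K=1.0, c=0.1, p=1.0):
            """Chance that an earthquake near the fault at time t after a prior
            mainshock is a foreshock: aftershocks inflate the non-foreshock
            rate early on, and the dilution fades as they decay."""
            aftershock_rate = K / (t_days + c) ** p   # modified Omori law
            return rate_foreshock / (rate_foreshock + rate_background + aftershock_rate)

        for t in (1, 10, 100, 1000):
            print(t, round(foreshock_probability(t), 3))  # rises as aftershocks decay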

  6. Knot probabilities in random diagrams

    NASA Astrophysics Data System (ADS)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10-crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of 'tree-like' diagrams, which are unknots for any assignment of over/under information at crossings. The data show a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf's law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.

  7. Probability matching and strategy availability.

    PubMed

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
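
    The gap between the two strategies is simple arithmetic: for a binary outcome whose more common event occurs with probability p, matching is correct with probability p^2 + (1 - p)^2, while maximizing is correct with probability p. A quick check:

      # Expected accuracy of probability matching versus maximizing.
      def matching_accuracy(p):
          return p * p + (1 - p) * (1 - p)

      for p in (0.5, 0.6, 0.7, 0.8, 0.9):
          print(f"p = {p:.1f}: matching = {matching_accuracy(p):.2f}, maximizing = {p:.2f}")
      # At p = 0.7, matching is right on 58% of trials versus 70% for
      # maximizing, which is why matching is the inferior strategy.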

  8. Using areas of known occupancy to identify sources of variation in detection probability of raptors: taking time lowers replication effort for surveys.

    PubMed

    Murn, Campbell; Holloway, Graham J

    2016-10-01

    Species occurring at low density can be difficult to detect and, if not properly accounted for, imperfect detection will lead to inaccurate estimates of occupancy. Understanding sources of variation in detection probability and how they can be managed is a key part of monitoring. We used sightings data of a low-density and elusive raptor (white-headed vulture, Trigonoceps occipitalis) in areas of known occupancy (breeding territories) in a likelihood-based modelling approach to calculate detection probability and the factors affecting it. Because occupancy was known a priori to be 100%, we fixed the model occupancy parameter to 1.0 and focused on identifying sources of variation in detection probability. Using detection histories from 359 territory visits, we assessed nine covariates in 29 candidate models. The model with the highest support indicated that observer speed during a survey, combined with temporal covariates such as time of year and length of time within a territory, had the highest influence on the detection probability. Averaged detection probability was 0.207 (s.e. 0.033), and based on this the mean number of visits required to determine with 95% confidence that white-headed vultures are absent from a breeding area is 13 (95% CI: 9-20). Topographical and habitat covariates contributed little to the best models and had little effect on detection probability. We highlight that the low detection probabilities of some species mean that emphasizing habitat covariates could lead to spurious results in occupancy models that do not also incorporate temporal components. While variation in detection probability is complex and influenced by effects at both temporal and spatial scales, temporal covariates can and should be controlled as part of robust survey methods. Our results emphasize the importance of accounting for detection probability in occupancy studies, particularly during presence/absence studies for species such as raptors that are widespread and
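
    The replication figure quoted above follows from the standard absence-confirmation calculation: with per-visit detection probability p, one needs n visits such that (1 - p)^n <= 0.05. A short check that reproduces the paper's estimate of 13 visits:

      import math

      # Visits needed to assert absence at the given confidence level.
      def visits_for_absence(p, conf=0.95):
          return math.ceil(math.log(1 - conf) / math.log(1 - p))

      print(visits_for_absence(0.207))          # 13, matching the paper
      print(visits_for_absence(0.207 - 0.033))  # 16 at one s.e. lower
      print(visits_for_absence(0.207 + 0.033))  # 11 at one s.e. higher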

  9. Sufficient Statistics for Divergence and the Probability of Misclassification

    NASA Technical Reports Server (NTRS)

    Quirein, J.

    1972-01-01

    One particular aspect of the feature selection problem is considered, namely that which results from the transformation x = Bz, where B is a k × n matrix of rank k and k ≤ n. It is shown that, in general, such a transformation results in a loss of information. In terms of the divergence, this is equivalent to the fact that the average divergence computed using the variable x is less than or equal to the average divergence computed using the variable z. A loss of information in terms of the probability of misclassification is shown to be equivalent to the fact that the probability of misclassification computed using variable x is greater than or equal to the probability of misclassification computed using variable z. First, the necessary facts relating k-dimensional and n-dimensional integrals are derived. Then the mentioned results about the divergence and probability of misclassification are derived. Finally, it is shown that if no information is lost (in x = Bz) as measured by the divergence, then no information is lost as measured by the probability of misclassification.

  10. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... who have given information upon which revocation may be based) at a postponed probable cause hearing... attendance, unless good cause is found for not allowing confrontation. Whenever a probable cause hearing is...

  11. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    PubMed Central

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Abstract Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870

  12. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    PubMed

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.

  13. Density dependence, spatial scale and patterning in sessile biota.

    PubMed

    Gascoigne, Joanna C; Beadman, Helen A; Saurel, Camille; Kaiser, Michel J

    2005-09-01

    Sessile biota can compete with or facilitate each other, and the interaction of facilitation and competition at different spatial scales is key to developing spatial patchiness and patterning. We examined density and scale dependence in a patterned, soft sediment mussel bed. We followed mussel growth and density at two spatial scales separated by four orders of magnitude. In summer, competition was important at both scales. In winter, there was net facilitation at the small scale with no evidence of density dependence at the large scale. The mechanism for facilitation is probably density-dependent protection from wave dislodgement. Intraspecific interactions in soft sediment mussel beds thus vary both temporally and spatially. Our data support the idea that pattern formation in ecological systems arises from competition at large scales and facilitation at smaller scales, so far only shown in vegetation systems. The data, and a simple, heuristic model, also suggest that facilitative interactions in sessile biota are mediated by physical stress, and that interactions change in strength and sign along a spatial or temporal gradient of physical stress.

  14. Global warming precipitation accumulation increases above the current-climate cutoff scale

    PubMed Central

    Sahany, Sandeep; Stechmann, Samuel N.; Bernstein, Diana N.

    2017-01-01

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff. PMID:28115693
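
    A minimal numerical illustration of this mechanism, assuming the stated power-law-times-cutoff form p(s) ~ s^(-tau) * exp(-s/sL); the exponent and cutoff values below are illustrative, not fitted model output:

      import math

      # Accumulation pdf with a power-law range and an exponential cutoff.
      def pdf(s, tau, sL):
          return s ** (-tau) * math.exp(-s / sL)

      tau = 1.5
      sL_current, sL_warmed = 100.0, 140.0   # cutoff extends under warming
      for s in (50, 100, 200, 400, 800):
          ratio = pdf(s, tau, sL_warmed) / pdf(s, tau, sL_current)
          print(f"s = {s:4d}: probability ratio = {ratio:5.1f}")
      # The ratio grows roughly exponentially with event size, so the
      # largest accumulations see the largest relative increases.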

  15. Global warming precipitation accumulation increases above the current-climate cutoff scale

    NASA Astrophysics Data System (ADS)

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.; Bernstein, Diana N.

    2017-02-01

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  16. Global warming precipitation accumulation increases above the current-climate cutoff scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  17. Global warming precipitation accumulation increases above the current-climate cutoff scale.

    PubMed

    Neelin, J David; Sahany, Sandeep; Stechmann, Samuel N; Bernstein, Diana N

    2017-02-07

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  18. Global warming precipitation accumulation increases above the current-climate cutoff scale

    DOE PAGES

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.; ...

    2017-01-23

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  19. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  20. Future southcentral US wildfire probability due to climate change

    USGS Publications Warehouse

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate are one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. Greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the fire probability response (+, −) may reverse direction (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  1. Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.

    PubMed

    Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark

    2016-03-01

    Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.

  2. Probability of survival during accidental immersion in cold water.

    PubMed

    Wissler, Eugene H

    2003-01-01

    Estimating the probability of survival during accidental immersion in cold water presents formidable challenges for both theoreticians and empiricists. A number of theoretical models have been developed assuming that death occurs when the central body temperature, computed using a mathematical model, falls to a certain level. This paper describes a different theoretical approach to estimating the probability of survival. The human thermal model developed by Wissler is used to compute the central temperature during immersion in cold water. Simultaneously, a survival probability function is computed by solving a differential equation that defines how the probability of survival decreases with increasing time. The survival equation assumes that the probability of occurrence of a fatal event increases as the victim's central temperature decreases. Generally accepted views of the medical consequences of hypothermia and published reports of various accidents provide information useful for defining a "fatality function" that increases exponentially with decreasing central temperature. The particular function suggested in this paper yields a relationship between immersion time for 10% probability of survival and water temperature that agrees very well with Molnar's empirical observations based on World War II data. The method presented in this paper circumvents a serious difficulty with most previous models--that one's ability to survive immersion in cold water is determined almost exclusively by the ability to maintain a high level of shivering metabolism.
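
    A toy version of this construction, substituting a crude exponential cooling curve for the Wissler thermal model and assuming a fatality rate that grows exponentially as the central temperature falls; all constants are hypothetical:

      import math

      def central_temp(t_hr, T0=37.0, T_water=5.0, k=0.08):
          """Crude exponential cooling toward the water temperature (deg C)."""
          return T_water + (T0 - T_water) * math.exp(-k * t_hr)

      def survival_probability(t_end_hr, h0=1e-4, alpha=0.9, dt=0.01):
          """Integrate dP/dt = -h(Tc) * P with h = h0 * exp(alpha*(37 - Tc))."""
          P, t = 1.0, 0.0
          while t < t_end_hr:
              h = h0 * math.exp(alpha * (37.0 - central_temp(t)))
              P *= math.exp(-h * dt)
              t += dt
          return P

      for hours in (1, 2, 4, 8):
          print(f"{hours} h: P(survival) = {survival_probability(hours):.3f}")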

  3. Electron number probability distributions for correlated wave functions.

    PubMed

    Francisco, E; Martín Pendás, A; Blanco, M A

    2007-03-07

    Efficient formulas for computing the probability of finding exactly an integer number of electrons in an arbitrarily chosen volume are only known for single-determinant wave functions [E. Cances et al., Theor. Chem. Acc. 111, 373 (2004)]. In this article, an algebraic method is presented that extends these formulas to the case of multideterminant wave functions and any number of disjoint volumes. The derived expressions are applied to compute the probabilities within the atomic domains derived from the space partitioning based on the quantum theory of atoms in molecules. Results for a series of test molecules are presented, paying particular attention to the effects of electron correlation and of some numerical approximations on the computed probabilities.

  4. An Evaluation of a Progressive High-Probability Instructional Sequence Combined with Low-Probability Demand Fading in the Treatment of Food Selectivity

    ERIC Educational Resources Information Center

    Penrod, Becky; Gardella, Laura; Fernand, Jonathan

    2012-01-01

    Few studies have examined the effects of the high-probability instructional sequence in the treatment of food selectivity, and results of these studies have been mixed (e.g., Dawson et al., 2003; Patel et al., 2007). The present study extended previous research on the high-probability instructional sequence by combining this procedure with…

  5. Characteristics of the first child predict the parents' probability of having another child.

    PubMed

    Jokela, Markus

    2010-07-01

    In a sample of 7,695 families in the prospective, nationally representative British Millennium Cohort Study, this study examined whether characteristics of the 1st-born child predicted parents' timing and probability of having another child within 5 years after the 1st child's birth. Infant temperament was assessed with the Carey Infant Temperament Scale (Carey, 1972; Carey & McDevitt, 1978) at age 9 months, childhood socioemotional and behavioral characteristics with the Strengths and Difficulties Questionnaire (Goodman, 2001), and childhood cognitive ability with the Bracken School Readiness Assessment (Bracken, 2002) test at age 3 years. Survival analysis modeling indicated that the 1st child's low reactivity to novelty in infancy, high prosociality, low conduct problems, and high cognitive ability in childhood were associated with increased probability of parents having another child. Except for reactivity to novelty, these associations became stronger with time. High emotional symptoms were also positively associated with childbearing, but this was likely to reflect reverse causality-that is, the effect of sibling birth on the 1st child's adjustment. The results suggest that child effects, particularly those related to the child's cognitive ability, adaptability to novelty, and prosocial behavior, may be relevant to parents' future childbearing. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  6. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
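
    The paper derives an analytical solution; the sketch below checks the same quantity by Monte Carlo instead, combining two hypothetical trajectory-error covariances into a single relative-position Gaussian and counting samples that fall inside an assumed 5 nmi protected zone:

      import numpy as np

      rng = np.random.default_rng(0)
      cov_a = np.array([[4.0, 0.5], [0.5, 1.0]])   # ownship error (nmi^2)
      cov_b = np.array([[3.0, 0.0], [0.0, 2.0]])   # intruder error (nmi^2)
      cov_rel = cov_a + cov_b                      # equivalent relative covariance

      mean_miss = np.array([6.0, 2.0])             # predicted miss vector (nmi)
      samples = rng.multivariate_normal(mean_miss, cov_rel, size=200_000)
      conflict = np.linalg.norm(samples, axis=1) < 5.0
      print(f"Estimated conflict probability: {conflict.mean():.4f}")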

  7. Fisher classifier and its probability of error estimation

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
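
    A minimal two-class Fisher construction on synthetic data, included only to fix notation; it reports the apparent (resubstitution) error, whereas the paper derives computationally efficient leave-one-out expressions:

      import numpy as np

      def fisher_direction(X1, X2):
          # Within-class scatter matrix, then w = Sw^(-1) (mu1 - mu2).
          Sw = np.cov(X1.T, bias=True) * len(X1) + np.cov(X2.T, bias=True) * len(X2)
          return np.linalg.solve(Sw, X1.mean(0) - X2.mean(0))

      rng = np.random.default_rng(1)
      X1 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
      X2 = rng.normal([2.0, 1.0], 1.0, size=(100, 2))

      w = fisher_direction(X1, X2)
      threshold = 0.5 * (X1.mean(0) + X2.mean(0)) @ w   # midpoint threshold

      errors = sum(x @ w < threshold for x in X1) + sum(x @ w >= threshold for x in X2)
      print(f"Apparent error rate: {errors / 200:.3f}")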

  8. Nurse Family Partnership: Comparing Costs per Family in Randomized Trials Versus Scale-Up.

    PubMed

    Miller, Ted R; Hendrie, Delia

    2015-12-01

    The literature that addresses cost differences between randomized trials and full-scale replications is quite sparse. This paper examines how costs differed among three randomized trials and six statewide scale-ups of Nurse Family Partnership (NFP) intensive home visitation to low-income first-time mothers. A literature review provided data on pertinent trials. At our request, six well-established programs reported their total expenditures. We adjusted the costs to national prices based on mean hourly wages for registered nurses and then inflated them to 2010 dollars. A centralized data system provided utilization. Replications had fewer home visits per family than trials (25 vs. 31, p = .05), lower costs per client ($8860 vs. $12,398, p = .01), and lower costs per visit ($354 vs. $400, p = .30). Sample size limited the significance of these differences. In this type of labor-intensive program, costs probably were lower in scale-up than in randomized trials. Key cost drivers were attrition and the stable caseload size possible in an ongoing program. Our estimates reveal a wide variation in cost per visit across six state programs, which suggests that those planning replications should not expect a simple rule to guide cost estimations for scale-ups. Nevertheless, NFP replications probably achieved some economies of scale.

  9. Note on a modified return period scale for upper-truncated unbounded flood distributions

    NASA Astrophysics Data System (ADS)

    Bardsley, Earl

    2017-01-01

    Probability distributions unbounded to the right often give good fits to annual discharge maxima. However, all hydrological processes are in reality constrained by physical upper limits, though not necessarily well defined. A result of this contradiction is that for sufficiently small exceedance probabilities the unbounded distributions anticipate flood magnitudes which are impossibly large. This raises the question of whether displayed return period scales should, as is current practice, have some given number of years, such as 500 years, as the terminating rightmost tick-mark. This carries the implication that the scale might be extended indefinitely to the right with a corresponding indefinite increase in flood magnitude. An alternative, suggested here, is to introduce a sufficiently high upper truncation point to the flood distribution and modify the return period scale accordingly. The rightmost tick-mark then becomes infinity, corresponding to the upper truncation point discharge. The truncation point is likely to be set above any physical upper bound, and the return period scale will change only slightly over all practical return periods of operational interest. The rightmost infinity tick-mark is therefore proposed, not as an operational measure, but rather to signal in flood plots that the return period scale does not extend indefinitely to the right.
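
    A sketch of the proposed scale, assuming a unit Gumbel annual-maximum distribution truncated at an upper bound x_c: the modified return period T*(x) = 1/(1 - F(x)/F(x_c)) diverges as x approaches x_c, giving the rightmost infinity tick-mark, while barely changing at practical return periods:

      import math

      def gumbel_cdf(x):
          return math.exp(-math.exp(-x))

      def truncated_return_period(x, xc=10.0):
          F_trunc = gumbel_cdf(x) / gumbel_cdf(xc)   # renormalised on (-inf, xc]
          return 1.0 / (1.0 - F_trunc)

      for x in (3.0, 5.0, 7.0, 9.0, 9.9):
          print(f"x = {x:4.1f}: T* = {truncated_return_period(x):14.1f} yr")
      # At x = 3 the modified scale gives T* of about 20.6 yr, essentially
      # the untruncated value; T* grows without bound as x nears xc.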

  10. Pointwise probability reinforcements for robust statistical inference.

    PubMed

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Scale criticality in estimating ecosystem carbon dynamics

    USGS Publications Warehouse

    Zhao, Shuqing; Liu, Shuguang

    2014-01-01

    Scaling is central to ecology and Earth system sciences. However, the importance of scale (i.e. resolution and extent) for understanding carbon dynamics across scales is poorly understood and quantified. We simulated carbon dynamics under a wide range of combinations of resolution (nine spatial resolutions of 250 m, 500 m, 1 km, 2 km, 5 km, 10 km, 20 km, 50 km, and 100 km) and extent (57 geospatial extents ranging from 108 to 1,247,034 km²) in the southeastern United States to explore the existence of scale dependence of the simulated regional carbon balance. Results clearly show the existence of a critical threshold resolution for estimating carbon sequestration within a given extent and an error limit. Furthermore, an invariant power law scaling relationship was found between the critical resolution and the spatial extent, as the critical resolution is proportional to A^n (where n is a constant and A is the extent). Scale criticality and the power law relationship might be driven by the power law probability distributions of land surface and ecological quantities, including disturbances, at landscape to regional scales. The current overwhelming practices without considering scale criticality might have largely contributed to difficulties in balancing carbon budgets at regional and global scales.

  12. Scale relativity and hierarchical structuring of planetary systems

    NASA Astrophysics Data System (ADS)

    Galopeau, P. H. M.; Nottale, L.; da Rocha, D.; Tran Minh, N.

    2003-04-01

    The theory of scale relativity, applied to macroscopic gravitational systems like planetary systems, allows one to predict quantization laws of several key parameters characterizing those systems (distance between planets and central star, obliquity, eccentricity...) which are organized in a hierarchical way. In the framework of the scale relativity approach, one demonstrates that the motion (at relatively large time-scales) of the bodies in planetary systems, described in terms of fractal geodesic trajectories, is governed by a Schrödinger-like equation. Preferential orbits are predicted in terms of probability density peaks with semi-major axis given by: a_n = GMn^2/w^2 (M is the mass of the central star and w is a velocity close to 144 km s^-1 in the case of our inner solar system and of the presently observed exoplanets). The velocity of the planet orbiting at this distance satisfies the relation v_n = w/n. Moreover, the mass distribution of the planets in our solar system can be accounted for in this model. These predictions are in good agreement with the observed values of the actual orbital parameters. Furthermore, the exoplanets which have been recently discovered around nearby stars also follow the same law in terms of the same constant in a highly significant statistical way. The theory of scale relativity also predicts structures for the obliquities and inclinations of the planets and satellites: the probability density of their distribution between 0 and π is expected to display peaks at particular angles θ_k = kπ/n. A statistical agreement is obtained for our solar system with n = 7. Another prediction concerns the distribution of the planets' eccentricities e. The theory foresees a quantization law e = k/n, where k is an integer and n is the quantum number that characterizes semi-major axes. The presently known exoplanet eccentricities are compatible with this theoretical prediction. Finally, although all these planetary systems may look very
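
    The semi-major-axis law is easy to check numerically. Using w = 144 km/s and the quantum-number assignments usual in the scale relativity literature (Mercury n = 3, Venus n = 4, Earth n = 5, Mars n = 6), the agreement is approximate:

      # Predicted orbits from a_n = G*M*n^2 / w^2 for the inner solar system.
      GM_SUN = 1.327e20        # m^3 s^-2
      AU = 1.496e11            # m
      w = 144e3                # m/s

      for name, n, a_obs in [("Mercury", 3, 0.387), ("Venus", 4, 0.723),
                             ("Earth", 5, 1.000), ("Mars", 6, 1.524)]:
          a_pred = GM_SUN * n ** 2 / w ** 2 / AU
          print(f"{name:8s} n={n}: predicted {a_pred:.3f} AU, observed {a_obs:.3f} AU")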

  13. Representation of Odds in Terms of Frequencies Reduces Probability Discounting

    ERIC Educational Resources Information Center

    Yi, Richard; Bickel, Warren K.

    2005-01-01

    In studies of probability discounting, the reduction in the value of an outcome as a result of its degree of uncertainty is calculated. Decision making studies suggest two issues with probability that may play a role in data obtained in probability discounting studies. The first issue involves the reduction of risk aversion via subdivision of…

  14. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  15. The Probable Prevalence and Sociodemographic Characteristics of Specific Learning Disorder in Primary School Children in Edirne.

    PubMed

    Görker, Işık; Bozatli, Leyla; Korkmazlar, Ümran; Yücel Karadağ, Meltem; Ceylan, Cansın; Söğüt, Ceren; Aykutlu, Hasan Cem; Subay, Büşra; Turan, Nesrin

    2017-12-01

    The aim of this study was to investigate the probable prevalence of Specific Learning Disorder (SLD) in primary school children in Edirne City and its relationship with their sociodemographic characteristics. The sample of our study was composed of 2,174 children who were educated in primary schools in the second, third, and fourth grades in the academic year 2013-2014 in Edirne City. The teachers and parents of these children were given the Specific Learning Difficulties Symptom Scale, the Learning Disabilities Symptoms Checklist (teacher and parent forms), and sociodemographic data forms to fill in. Binary logistic regression analysis was used to assess the risk factors for SLD. Our study revealed that the probable prevalence of SLD was 13.6%; 17% for boys and 10.4% for girls. Reading impairment was 3.6%, writing impairment was 6.9%, and mathematics impairment was 6.5%. We determined that consanguineous marriage, low income, and a history of neonatal jaundice were risk factors for SLD; caesarean birth, developmental delay in walking, and a history of neonatal jaundice were risk factors for mathematics impairment. A parental history of learning difficulties was a risk factor for SLD and its subtypes. Our findings were consistent with other study results on the prevalence of SLD. The relationships between the probable prevalence rates and sociodemographic data are discussed.

  16. The Probable Prevalence and Sociodemographic Characteristics of Specific Learning Disorder in Primary School Children in Edirne

    PubMed Central

    GÖRKER, Işık; BOZATLI, Leyla; KORKMAZLAR, Ümran; YÜCEL KARADAĞ, Meltem; CEYLAN, Cansın; SÖĞÜT, Ceren; AYKUTLU, Hasan Cem; SUBAY, Büşra; TURAN, Nesrin

    2017-01-01

    Introduction The aim of this study was to investigate the probable prevalence of Specific Learning Disorder (SLD) in primary school children in Edirne City and its relationship with their sociodemographic characteristics. Methods The sample of our study was composed of 2,174 children who were educated in primary schools in the second, third, and fourth grades in the academic year 2013–2014 in Edirne City. The teachers and parents of these children were given the Specific Learning Difficulties Symptom Scale, the Learning Disabilities Symptoms Checklist (teacher and parent forms), and sociodemographic data forms to fill in. Binary logistic regression analysis was used to assess the risk factors for SLD. Results Our study revealed that the probable prevalence of SLD was 13.6%; 17% for boys and 10.4% for girls. Reading impairment was 3.6%, writing impairment was 6.9%, and mathematics impairment was 6.5%. We determined that consanguineous marriage, low income, and a history of neonatal jaundice were risk factors for SLD; caesarean birth, developmental delay in walking, and a history of neonatal jaundice were risk factors for mathematics impairment. A parental history of learning difficulties was a risk factor for SLD and its subtypes. Conclusion Our findings were consistent with other study results on the prevalence of SLD. The relationships between the probable prevalence rates and sociodemographic data are discussed. PMID:29321709

  17. Conditional Probabilities and Collapse in Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as would be obtained using the postulate of collapse.
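
    For reference, the collapse postulate assigns the conditional (Lüders) state below, where P_a projects onto the first measurement's outcome and Q_b onto a subsequent one; the paper's point is that its conditional-probability construction does not in general reproduce this state:

      \rho' = \frac{P_a \,\rho\, P_a}{\operatorname{Tr}(P_a \,\rho\, P_a)},
      \qquad
      p(b \mid a) = \operatorname{Tr}(Q_b \,\rho')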

  18. Nonstationary envelope process and first excursion probability.

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1972-01-01

    The definition of the stationary random envelope proposed by Cramér and Leadbetter is extended to the envelope of a nonstationary random process possessing an evolutionary power spectral density. The density function, the joint density function, the moment function, and the level-crossing rate of the nonstationary envelope process are derived. Based on the envelope statistics, approximate solutions to the first excursion probability of nonstationary random processes are obtained. In particular, applications of the first excursion probability to earthquake engineering problems are demonstrated in detail.

  19. On the probability of cure for heavy-ion radiotherapy

    NASA Astrophysics Data System (ADS)

    Hanin, Leonid; Zaider, Marco

    2014-07-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
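
    The arithmetic connecting cure probability to clonogen number is Poissonian: if N clonogenic cells each survive the full course independently with probability S, then TCP = exp(-N*S), so an observed cure rate bounds N from above. A sketch with purely illustrative numbers:

      import math

      def tcp(N, S):
          """Poisson tumour control probability for N clonogens."""
          return math.exp(-N * S)

      def max_clonogens(tcp_observed, S):
          """Largest pre-treatment N compatible with an observed cure rate."""
          return -math.log(tcp_observed) / S

      S = 1e-6   # assumed per-cell survival after the full course
      for cure in (0.5, 0.8, 0.95):
          print(f"TCP = {cure:.2f} -> N <= {max_clonogens(cure, S):.2e} cells")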

  20. Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability

    DTIC Science & Technology

    2015-07-01

    12th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP12), Vancouver, Canada, July 12-15, 2015. Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability. Marwan M. Harajli, Graduate Student, Dept. of Civil and Environ… …criterion is usually the failure probability. In this paper, we examine the buffered failure probability as an attractive alternative to the failure